From the safe distance of a composer with an armchair appreciation for science, I'm something of a fan of the neuroscience of music. It's really exciting stuff, and my impression is that we're only beginning to learn about it. Here are some aspects of music that I'd like — as a composer — neuroscience to tell us more about; as music is the temporal art par excellence, it's not surprising that they have everything to do with how we process events in time:
1. The two irreversible arrows of music: in pitch and in time. Inverting a sequence of pitches creates only a weak equivalence, and one which deteriorates as the line moves toward the registral extremes. And in time, the reversal of a sequence (last in, first out) is likewise only weakly equivalent.
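To put the two operations side by side — this is a toy sketch of my own, using MIDI note numbers and an arbitrary axis of inversion purely for convenience:

```python
# Melodic inversion and retrograde as operations on MIDI pitch numbers.
# Illustrative sketch only; the motif and axis are arbitrary choices.

def invert(pitches, axis=None):
    """Reflect each pitch around an axis (default: the first pitch).
    Interval sizes are preserved; their directions are flipped."""
    if axis is None:
        axis = pitches[0]
    return [2 * axis - p for p in pitches]

def retrograde(pitches):
    """Reverse the order of the sequence: last in, first out."""
    return list(reversed(pitches))

motif = [60, 62, 64, 60]          # C4, D4, E4, C4
print(invert(motif))              # [60, 58, 56, 60] — contour upside down
print(retrograde(motif))          # [60, 64, 62, 60] — played back to front
```

Both results are note-for-note derivable from the original, yet neither sounds like a simple restatement of it — which is exactly the "weak equivalence" at issue.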
2. The perceptual "borderlands" between parameters: between pitch and timbre, between form and rhythm, or between rhythm and pitch, including such phenomena as interference beating*.
3. The "chunking" of musical memory: apparently we take in music in pieces which are then reassembled, mentally, into a continuity. (Which is closely related to:)
4. The gaps in timing between the mental anticipation of a physical music-producing action, the execution of that action, the perception and analysis of the resulting sound, and any motor-feedback processing related to the action. (When, exactly, does the music take place? What does "now" mean in music when all of these events occur, in objective external time, at different points? How the hell do musicians ever reach a unison attack?)
5. What other sensory mechanisms contribute, or might potentially contribute, to an enhanced perception of acoustic events? Research so far has pointed to some limits in the perceptual apparatus, and this has led to some practical applications — from transmission of speech over a narrow band to noise reduction to compression and sampling rates. But as a musician I'm far more interested in expansion than in limitation. For example, I know that a considerable part of "hearing" music, for me, comes from the sense of touch as a complement to what goes through the ears. AFAIC, fine tuning in a live ensemble is often easier by feel than by audition, through the reduction of beats more felt than heard. Musicians with hearing losses know this well, but could it also explain things like La Monte Young's massive sine wave complexes (which don't resolve to simple 5-limit harmonies but share a common, if sub-audio, difference tone)? The physical placement of sounds in space is another fascinating topic. How about echolocation? Some humans — particularly the visually impaired — have become virtuoso echolocators, and I think I've detected the same in crying babies. What a wonderful extra resource for musicians!
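To make the tuning-by-beats arithmetic concrete — another toy illustration of my own, with frequencies picked purely for convenience — the beating falls straight out of trigonometry:

```python
import math

def beats_per_second(f1, f2):
    """Two summed sine tones are heard as one tone near the mean frequency,
    amplitude-modulated at |f1 - f2|: the beating used in ensemble tuning."""
    return abs(f1 - f2)

# Two instruments a few Hz apart beat slowly enough to count — or to feel.
print(beats_per_second(440.0, 443.0))  # 3.0 beats per second

# Behind this is the identity sin(a) + sin(b) = 2 sin((a+b)/2) cos((a-b)/2):
# the cosine term is an envelope whose magnitude pulses |f1 - f2| times a second.
f1, f2 = 440.0, 443.0
t = 1.0 / 6.0  # an instant where the 3 Hz envelope passes through zero
total = math.sin(2 * math.pi * f1 * t) + math.sin(2 * math.pi * f2 * t)
assert abs(total) < 1e-9  # the two tones cancel exactly at the envelope's null
```

Bring the two frequencies together and the beat rate slides toward zero — which is why tuning by eliminating beats works, whether the beats arrive through the ears or through the fingertips.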
* One of the most interesting results with regard to beating that I've heard about lately is this, a paper on Waves, Beats and Expectancy in speech by Eric Keller. The ways in which speech and music piggy-back on an overlapping set of neural organs is another fascinating topic. While beating in musical contexts is familiar enough, that the phenomenon is shared with speech was a surprise to me. And while this is wildly premature and underinformed speculation on my part, wouldn't it be so cool to find a neurological basis, in beating, for poetic and musical metre?