Whistled Language Engages Both Hemispheres
Spoken and written language, sign language, and click consonants are all processed mainly in the left hemisphere, though the right hemisphere does contribute, for example by processing pitch and the prosody of language. Play "ba" to the left ear and "da" to the right ear through headphones, and most people will hear "da," because input from the right ear has privileged access to the language-dominant left hemisphere. A 2005 study of whistled language "speakers" in the Canary Islands showed that they used the same areas in the left and right hemispheres when listening to spoken and whistled communications, whereas non-whistlers processed the whistles in non-language areas. Nature, Vol 433, 31-32. A recent study of whistled language in the mountains of northeast Turkey added an assessment of comprehension by presenting different whistled syllables to the left and right ears. The subjects showed the typical left-hemisphere dominance for speech, but processed whistles equally well in both hemispheres, a possibly unique finding in language research. Current Biology, Vol 25, R706-R708. (You can hear examples of the Canary Island whistled language and of click consonants in the Student Resources for this chapter.)

New Hearing Aid Avoids Cocktail Party Effect
The cocktail party effect is annoying enough for most of us, but it is even worse for people who wear hearing aids. Most hearing aids use directional microphones oriented to the front, which helps, but not all conversation takes place face-to-face. To solve the problem, researchers at Germany's University of Oldenburg have invented a hearing aid that detects which sounds the person is listening to and amplifies those. It does this by matching the incoming sounds with the wearer's EEG, monitored by 10 electrodes embedded in a flexible C-shaped strip that wraps around the back of the ear. To make the device practical, the developers will have to miniaturize the matchbox-sized amplifier and move its processing to a cell phone via Bluetooth or to the cloud, rather than the current desktop computer. But those problems don't detract from the significance of the neuro-technological accomplishment this device represents. New Scientist, May 2018, p 10.
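
The article doesn't spell out the algorithm, but a common approach to EEG-based auditory attention decoding is to compare each candidate sound source's amplitude envelope with a signal derived from the wearer's EEG and boost whichever source matches best. The sketch below illustrates that general idea in Python; the function names, the simple moving-average envelope, and the Pearson-correlation scoring are illustrative assumptions, not the Oldenburg device's actual processing.

```python
# Illustrative sketch of EEG-guided source selection for a hearing aid.
# This is NOT the Oldenburg team's algorithm; it assumes the device
# compares each sound source's amplitude envelope with an envelope
# reconstructed from the wearer's EEG and amplifies the best match.
import numpy as np

def envelope(signal, fs, win_ms=50):
    """Amplitude envelope: rectify, then smooth with a moving average."""
    win = max(1, int(fs * win_ms / 1000))
    kernel = np.ones(win) / win
    return np.convolve(np.abs(signal), kernel, mode="same")

def select_attended(sources, eeg_envelope, fs):
    """Return the index of the source whose envelope best matches the EEG."""
    scores = []
    for src in sources:
        env = envelope(src, fs)
        # Pearson correlation between audio envelope and EEG-derived envelope
        scores.append(np.corrcoef(env, eeg_envelope)[0, 1])
    return int(np.argmax(scores))

def amplify_attended(sources, eeg_envelope, fs, gain=4.0):
    """Boost the attended source and mix it with the unattended ones."""
    idx = select_attended(sources, eeg_envelope, fs)
    mix = sum(src * (gain if i == idx else 1.0) for i, src in enumerate(sources))
    return mix / (gain + len(sources) - 1)

# Toy demo with synthetic data: two "talkers" and an EEG-derived envelope
# that happens to track talker 1.
if __name__ == "__main__":
    fs = 8000
    t = np.arange(0, 2.0, 1 / fs)
    talker0 = np.sin(2 * np.pi * 220 * t) * (1 + np.sin(2 * np.pi * 3 * t))
    talker1 = np.sin(2 * np.pi * 330 * t) * (1 + np.sin(2 * np.pi * 5 * t))
    eeg_env = envelope(talker1, fs) + 0.1 * np.random.randn(t.size)
    print("Attended talker:", select_attended([talker0, talker1], eeg_env, fs))
```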

Why the Left Hemisphere is Better at Language
We have long known that the planum temporale is larger in the left hemisphere in most people (see p 260 of the text), but we were limited in studying the complexity of that structure until recently, when MRI procedures were developed that could measure the density of interconnections among neurons. Using this neurite orientation dispersion and density imaging along with EEG measurements that detected how fast subjects processed speech, biopsychologists were able to determine that the fast processors had an extraordinary number of densely packed neural synapses in the left planum temporale. The researchers concluded that this greater physical connectivity is crucial to left-hemisphere superiority in speech processing. Science Advances, DOI: 10.1126/sciadv.aar6830.

Much of Auditory Organization is Innate
Experience is important in determining the functionality of brain auditory mechanisms, as evidenced by the lower success of cochlear implants received after childhood. However, there is new evidence that cortical architecture is the same in deaf individuals as in hearing individuals; both have the same pattern of connectivity that provides the tonotopic organization seen in the hearing. This apparently is driven by genetics and by spontaneous firing of the inner hair cells prior to hearing onset. (Note the similarity to the spontaneous waves of excitation in the fetal retina, which select which synapses will survive; see p 69 of the text.) Though experience is necessary to refine both the auditory and the visual systems, the organizational structure is built in, providing the opportunity for therapeutic intervention in deafness and blindness. Scientific Reports, 6: 29375. DOI: 10.1038/srep29375.