These videos illustrate several stages in the development of the COCOHA solution. Each shows only part of the functionality, and all involve bulky laboratory equipment that may seem remote from what a user might want on their ear. They are intended as proofs of concept, demonstrating how each of the main hurdles can be addressed. In a final product, all of these elements would be integrated within an attractive and ergonomic device.

This video from Eriksholm/Oticon provides an introduction to the motivation behind COCOHA.

The next video demonstrates real-time cognitive control based on brain signals. The listener hears two voices coming from loudspeakers. When the listener focuses on one voice, that voice is amplified and the other is attenuated. Attention switches several times between voices, as indicated by the pointer. The gain applied to each voice is derived, via a real-time decoder, from brain signals recorded by EEG (electrode cap). Our ultimate goal is to include this form of control within a hearing aid, so that a user can seamlessly attend to any desired voice and hear it clearly.
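The decoding step described above can be sketched in a few lines. This is not the project's actual decoder; it is a minimal illustration of stimulus-reconstruction attention decoding, assuming a linear "backward" model fitted on calibration data, with synthetic envelopes and simulated EEG standing in for real recordings. The softmax gain mapping at the end is likewise an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

def smooth(x, k=25):
    """Crude envelope smoothing (moving average)."""
    return np.convolve(x, np.ones(k) / k, mode="same")

# Synthetic speech envelopes for two talkers; in a real system these
# would come from the two separated audio streams.
n, n_ch = 4000, 16
env_a = smooth(np.abs(rng.standard_normal(n)))
env_b = smooth(np.abs(rng.standard_normal(n)))

# Simulated EEG: each channel is a noisy weighted copy of the *attended*
# envelope (talker A), a toy stand-in for cortical envelope tracking.
mix = rng.standard_normal(n_ch)
eeg = np.outer(env_a, mix) + 0.5 * rng.standard_normal((n, n_ch))

# Calibration: fit a linear backward decoder (EEG -> envelope) by ridge
# regression on the first half of the data.
half = n // 2
X, y = eeg[:half], env_a[:half]
lam = 1e-3
W = np.linalg.solve(X.T @ X + lam * np.eye(n_ch), X.T @ y)

# Decoding: reconstruct the envelope from fresh EEG and correlate it with
# each candidate voice; the better-correlated voice is deemed attended.
recon = eeg[half:] @ W
r_a = np.corrcoef(recon, env_a[half:])[0, 1]
r_b = np.corrcoef(recon, env_b[half:])[0, 1]

# Map correlations to per-voice gains (softmax), standing in for the
# amplification/attenuation shown in the video.
g = np.exp([5 * r_a, 5 * r_b])
gain_a, gain_b = g / g.sum()
attended = "A" if r_a > r_b else "B"
```

In practice the decoder is trained per listener on a calibration session, and the correlations are computed over sliding windows so the gains can track attention switches in real time.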


This video demonstrates acoustic analysis (voice separation) based on the 4-microphone WHISPER audio platform (the microphones are taped to the table), combined with a beamforming algorithm that steers processing of the microphone signals to isolate each voice. The separated voices are then streamed to the cognitive control device, which selects one or the other voice based on the listener's attention (as in the previous video). The WHISPER platform is designed to exchange data wirelessly with other platforms (to benefit from a wider microphone array) and to stream sound to the cognitive control device associated with the hearing aid. The purpose of this video is to demonstrate that high-quality sound can be recovered using an ad hoc array of microphones.
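The principle behind the beamforming step can be illustrated with a minimal delay-and-sum sketch. This is not the WHISPER algorithm itself: the integer-sample delays below assume a hypothetical microphone geometry, and white noise stands in for speech. Aligning the microphones on the target's arrival delays makes the target sum coherently while the interferer is smeared and attenuated.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 16000
n = fs  # one second of signal
s1 = rng.standard_normal(n)  # target voice (white noise as a stand-in)
s2 = rng.standard_normal(n)  # interfering voice

# Per-microphone integer-sample arrival delays for each source
# (assumed geometry: the two talkers sit on opposite sides of the array).
d1 = [0, 2, 4, 6]   # target delays at the 4 mics
d2 = [6, 4, 2, 0]   # interferer delays at the 4 mics

def delay(x, d):
    """Delay a signal by d samples (zero-padded at the start)."""
    return np.concatenate([np.zeros(d), x[:len(x) - d]])

def advance(x, d):
    """Advance a signal by d samples (zero-padded at the end)."""
    return np.concatenate([x[d:], np.zeros(d)])

# Each microphone records the sum of both sources at its own delays.
mics = [delay(s1, a) + delay(s2, b) for a, b in zip(d1, d2)]

# Delay-and-sum: undo the target's delays, then average across mics.
# The target adds in phase; the interferer's copies land at four
# different lags and largely cancel.
beam = np.mean([advance(m, d) for m, d in zip(mics, d1)], axis=0)
```

With four microphones and incoherent interference this simple beamformer already buys roughly 6 dB of target-to-interferer improvement; the real system steers toward each voice in turn to produce the separated streams.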


The following two videos show elements of Oticon's hearing-aid prototype, including the audio platform and the cognitive control platform, and an experiment with real-time attentional decoding (selecting between the voices of Emmanuel Macron and Queen Elizabeth II).


The following video from Oticon/Eriksholm shows how hearing-impaired users feel about cognitive control. Here, steering is performed using an eye-tracking device. Eye tracking is a potential alternative to brain control (if we cannot get that to work well enough) or a complement to it (to make it work better or faster).


The following two videos show how an eye-gaze control signal can be derived from electrical signals recorded in the ear canal (Ear-EOG). The advantage of this form of control is that the necessary signals can be measured unobtrusively from electrodes placed within the ear canal (for example, on an in-ear hearing aid). Similar electrodes can be used to record brain signals, making it easier to envision hybrid brain/gaze-based control.
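The physical basis of Ear-EOG can be sketched as follows. The eye acts as a corneo-retinal dipole, so a horizontal gaze shift produces opposite-polarity potential changes at the two ears; subtracting the two electrode signals yields a signal roughly proportional to horizontal gaze angle. This is a toy simulation, not the project's processing chain: the 10 µV/degree sensitivity, the noise level, and the simple moving-average filter are all assumptions for illustration.

```python
import numpy as np

fs = 250                      # sample rate (Hz), assumed
t = np.arange(0, 4, 1 / fs)

# Simulated gaze trajectory: look left, centre, then right (degrees).
gaze = np.where(t < 1.5, -20.0, np.where(t < 2.5, 0.0, 20.0))

# The corneo-retinal dipole yields opposite-polarity potentials at the
# two ear-canal electrodes; k is an assumed sensitivity in µV/degree.
k = 10.0
rng = np.random.default_rng(2)
left  = -k * gaze / 2 + 5 * rng.standard_normal(t.size)  # µV + noise
right = +k * gaze / 2 + 5 * rng.standard_normal(t.size)

# Horizontal EOG = right minus left; rescale to degrees and smooth.
eog = (right - left) / k
est = np.convolve(eog, np.ones(25) / 25, mode="same")
```

The recovered `est` tracks the stepwise gaze trajectory despite the electrode noise, which is what makes a scaled ear-to-ear difference usable as a steering signal; a real device would also need to handle electrode drift and blink artifacts.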