Post-doc position in visually-driven speech enhancement for cognitively-inspired multi-modal hearing-aid devices (Ricard Marxer )


Subject: Post-doc position in visually-driven speech enhancement for cognitively-inspired multi-modal hearing-aid devices
From:    Ricard Marxer  <ricardmp@xxxxxxxx>
Date:    Mon, 13 Jun 2016 14:09:11 +0100
List-Archive:<http://lists.mcgill.ca/scripts/wa.exe?LIST=AUDITORY>

Postdoctoral Research Fellow
University of Stirling

Location: Stirling
Salary: £31,656 to £37,768 per annum (Grade 7)
Hours: Full Time
Contract Type: Contract / Temporary
Placed on: 16th May 2016
Closes: 26th June 2016
Job Ref: SCH00631

Post Details

Fixed Term Contract: expected dates 01 August 2016 to 30 September 2018.
Closing date: midnight on 26 June 2016.

The Post

The UK Engineering and Physical Sciences Research Council (EPSRC) funded project "Towards visually-driven speech enhancement for cognitively-inspired multi-modal hearing-aid devices (AV-COGHEAR)" aims to develop a new generation of hearing-aid technology that extracts speech from noise by using a camera to see what the talker is saying. Our proposed approach is consistent with normal hearing: listeners naturally combine information from both their ears and eyes, using their eyes to help them hear. When listening to speech, the eyes follow the movements of the face and mouth, and a sophisticated, multi-stage process uses this information to separate speech from noise and fill in any gaps. Our proposed cognitively-inspired multi-modal approach will act in much the same way: it will exploit visual information from a camera and develop novel causal algorithms for intelligently combining audio and visual Big Data, in order to improve speech quality and intelligibility in real-world noisy environments.

The postdoctoral research fellow will be part of a multi-institute team, led by Stirling, comprising speech researchers, psychologists, and industry partners, and will focus heavily on cognitively-inspired signal and image processing.
For more information visit: http://www.jobs.ac.uk/job/ANR200/postdoctoral-research-fellow/


This message came from the mail archive
/var/www/html/postings/2016/
maintained by:
DAn Ellis <dpwe@ee.columbia.edu>
Electrical Engineering Dept., Columbia University