| Location: | Edinburgh |
|---|---|
| Salary: | £34,308 to £42,155 (salary dependent on experience and qualifications) (Grade 7) |
| Hours: | Full Time |
| Contract Type: | Fixed-Term/Contract |
| Placed On: | 7th December 2022 |
| Closes: | Midnight on 3rd January 2023 |
| Job Ref: | 2636 |
Heriot-Watt University has established a reputation for world-class teaching and leading-edge, relevant research, which has made it one of the top UK universities for innovation, business and industry.
The School of Engineering & Physical Sciences has an international research reputation and close connection with the professional and industrial world of science, engineering and technology.
Job Summary
We are seeking to recruit a highly motivated Postdoctoral Research Associate (PDRA) to work as part of a team on this project, supporting the development of IoT communications links and machine learning.
The research project COG-MHEAR: Towards cognitively inspired 5G-IoT enabled, multi-modal Hearing Aids, funded by the UK research council EPSRC, involves six academic partners including the University of Edinburgh, and industrial partners including Bell-Labs, Alcatel-Lucent Technologies (USA) and Sonova AG. The research will explore and validate a new concept of hearing aid intended to transform hearing care by 2050: we aim to completely re-think the way hearing aids (HAs) are designed.

Our transformative approach draws, for the first time, on the cognitive principles of normal hearing. Listeners naturally combine information from both their ears and eyes: we use our eyes to help us hear. We will create "multi-modal" aids which not only amplify sounds but contextually use simultaneously collected information from a range of sensors to improve speech intelligibility. Adding these new sensors, and the processing required to make sense of the data they produce, will place a significant additional power and miniaturisation burden on the HA device. We will need to make our sophisticated visual and sound processing algorithms operate with minimum power and minimum delay, and we will achieve this through dedicated hardware implementations that accelerate the key processing steps. In the long term, we aim for all processing to be done in the HA itself, keeping data local to the person for privacy.
Applicants must have a PhD degree (or have submitted a PhD thesis) in electrical, electronic or computer engineering, computer science, or a related discipline, and have excellent knowledge and a strong background in signal processing for wireless communications/networks (in particular physical/cross-layer technologies), MIMO signal processing, interference management, machine learning, beamforming, precoding, channel estimation, IoT, user interface technologies, mathematics and optimisation, evidenced by high-quality research publications. Applicants must also have excellent written, oral presentation and interpersonal skills, and software programming skills (MATLAB and/or C/C++/Java/Python) appropriate for wireless communications simulation and testbed (USRP) work. It is desirable that applicants have some industrial experience in telecommunications, machine learning, networks, programming or testbed work.
For all criteria, further details and how to apply:
enzj.fa.em3.oraclecloud.com/hcmUI/CandidateExperience/en/sites/CX/job/2636/?utm_medium=jobshare
Heriot-Watt University is committed to securing equality of opportunity in employment and to the creation of an environment in which individuals are selected, trained, promoted, appraised and otherwise treated on the sole basis of their relevant merits and abilities. Equality and diversity are all about maximising potential and creating a culture of inclusion for all.
Heriot-Watt University values diversity across our University community and welcomes applications from all sectors of society, particularly from underrepresented groups. For more information, please see our website www.hw.ac.uk/uk/services/equality-diversity.htm and also our award-winning work in Disability Inclusive Science Careers disc.hw.ac.uk.