BCU CEBE Dean’s Scholarship 2023/4

Birmingham City University - College of Computing

Qualification Type: PhD
Location: Birmingham
Funding for: UK Students, EU Students, International Students
Funding amount: £19,237 - please see advert
Hours: Full Time
Placed On: 10th April 2024
Closes: 30th April 2024

Celebrating our outstanding research achievements in Computing, Engineering and the Built Environment, as recognised in the 2021 Research Excellence Framework, we are excited to announce that the Associate Dean of Research is inviting applications for five prestigious PhD scholarships within the College of Computing, commencing in September 2024.

Applications are welcomed from both UK and International candidates.

How to Apply 

To apply, please complete the project proposal form, ensuring that you quote the project reference, and complete an online application, where you will be required to upload your proposal as a PDF document in place of a personal statement.

You will also be required to upload two references, at least one of which must be an academic reference, and your entry qualification(s) (Bachelor's/Master's certificate(s) and transcript(s)).

Project Title: HearingXR: Accessible AR/VR for d/Deaf and Hard-of-Hearing Users

Project Lead: Dr Wenge Xu Wenge.Xu@bcu.ac.uk
www.bcu.ac.uk/computing/about-us/our-staff/wenge-xu

Reference: Hearing XR          

Project Description

Over the past few years, there has been significant growth in the adoption of Augmented Reality (AR) and Virtual Reality (VR) devices. These devices have been used for various purposes, such as training, education, and leisure; however, these applications have not been designed with accessibility in mind and introduce several challenges for d/Deaf and hard-of-hearing users. As one example, audio plays a significant role in current AR/VR applications, where it is widely used for pinpointing target locations and navigating the environment. However, d/Deaf and hard-of-hearing users have difficulty noticing or identifying the audio source used for navigation. In addition, current applications lack sensory substitution systems (e.g., visual prompts) to support their interaction with immersive technologies.

These challenges can be addressed by employing a user-centred design approach with our target users and related stakeholders. This PhD project aims to (1) understand d/Deaf and hard-of-hearing users' requirements for accessible AR/VR systems and co-design accessible features with them and related stakeholders, (2) conduct a set of experimental studies with d/Deaf and hard-of-hearing users to validate the designs, and (3) provide insights into the design of accessible AR/VR experiences for d/Deaf and hard-of-hearing users.

The ultimate objective of the project is to produce an AR/VR plugin that contains several accessible features, developed based on d/Deaf and hard-of-hearing users' needs, to help AR/VR designers and developers build accessible AR/VR applications.
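
Purely as an illustration of the kind of sensory substitution described above, and not a prescribed design, the minimal Python sketch below maps a spatial sound event to a visual prompt (an on-screen arrow direction plus a caption) relative to the listener's facing direction. All names, structures, and thresholds here are hypothetical.

  import math
  from dataclasses import dataclass

  @dataclass
  class SoundEvent:
      """A hypothetical spatial audio cue: world position in metres plus a label for captioning."""
      x: float
      y: float
      z: float
      label: str

  def visual_prompt(listener_pos, listener_yaw_deg, event):
      """Substitute an audio cue with a visual prompt: a relative bearing plus a caption.
      listener_pos is (x, y, z); listener_yaw_deg is the direction the listener faces, in degrees.
      Returns a dict that an AR/VR UI layer could render as an arrow and a subtitle."""
      dx = event.x - listener_pos[0]
      dy = event.y - listener_pos[1]
      dz = event.z - listener_pos[2]
      bearing = math.degrees(math.atan2(dx, dz))                  # world-space bearing to the sound
      relative = (bearing - listener_yaw_deg + 180) % 360 - 180   # bearing relative to gaze, -180..180
      distance = math.sqrt(dx * dx + dy * dy + dz * dz)
      return {
          "arrow_angle_deg": relative,                  # rotate an on-screen arrow by this amount
          "caption": f"{event.label} ({distance:.1f} m)",
          "urgency": "high" if distance < 3.0 else "normal",
      }

  # Example: a door knock behind and to the left of the listener.
  print(visual_prompt((0.0, 1.6, 0.0), 0.0, SoundEvent(-2.0, 1.0, -2.0, "Door knock")))

In a Unity/Unreal Engine plugin, a mapping of this kind would run per frame against the engine's audio events and drive a HUD widget rather than printing a dictionary.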

Anticipated Findings and Contribution to Knowledge 

The candidate will first conduct observation studies of d/Deaf and hard-of-hearing users' experiences with existing AR/VR applications, followed by interviews with the participants. This will identify the challenges in existing applications that need to be addressed and produce a set of user requirements. Based on these requirements, the candidate will then design several features that make AR/VR applications accessible to d/Deaf and hard-of-hearing users. These features will be developed in Unity/Unreal Engine and tested in a set of experimental studies with the target users to explore their feasibility.

The findings of these experiments will be published as research papers. The following deliverables will be produced:

  • D1: A list of challenges the d/Deaf and hard-of-hearing communities currently face with the existing applications.
  • D2: A set of user requirements from the d/Deaf and hard-of-hearing community to make AR/VR systems accessible. 
  • D3: A Unity/Unreal Engine plugin that contains several accessibility features designed for d/Deaf and hard-of-hearing users. This plugin will help AR/VR developers make their AR/VR applications accessible.
  • D4: A set of accessible AR/VR design guidelines for AR/VR designers to better design their applications for d/Deaf and hard-of-hearing users.

Project Title: Real-Time Interpolation of Impulse Responses in Dynamic Virtual Environments

Project Lead: Dr Carlo Harvey Carlo.Harvey@bcu.ac.uk
www.bcu.ac.uk/computing/about-us/our-staff/carlo-harvey

Reference: RAVE

Project Description

This research project aims to improve the audio experience in video games, simulations, and other virtual environments. Currently, when you move from one environment to another, the audio cues and filters often don't match, causing a jarring and unrealistic transition. Imagine playing a game where the sound changes abruptly as you move from a forest to a cave, ruining the immersive experience.

The goal of this project is to develop a solution that can seamlessly adjust the audio to match the environment you're in. It will use advanced techniques to analyse the surroundings, including objects, walls, and other materials, and apply the appropriate audio effects like echoes and reflections to make it sound more realistic. For example, if you're walking in a virtual forest and enter a cave, the audio will smoothly adapt to make you feel like you're truly inside the cave, with the appropriate echoes and reverberations.
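
As a purely illustrative baseline, assuming synthetic impulse responses and the NumPy/SciPy stack (neither of which the advert specifies), the sketch below blends a "forest" and a "cave" impulse response with a simple linear crossfade and convolves a dry signal through the blend; the project itself targets far more sophisticated, scene-aware interpolation than this.

  import numpy as np
  from scipy.signal import fftconvolve

  sr = 48_000  # sample rate (Hz)

  def synthetic_ir(rt60, length_s=1.0, seed=0):
      """A toy exponentially decaying noise burst standing in for a measured impulse response.
      rt60 is the reverberation time in seconds (longer = more cave-like)."""
      rng = np.random.default_rng(seed)
      n = int(length_s * sr)
      t = np.arange(n) / sr
      return rng.standard_normal(n) * np.exp(-6.9 * t / rt60)

  ir_forest = synthetic_ir(rt60=0.3, seed=1)   # short, dry-ish response
  ir_cave = synthetic_ir(rt60=2.5, seed=2)     # long, highly reverberant response

  def render_transition(dry, alpha):
      """Render a dry signal through a blend of the two environments.
      alpha = 0.0 -> fully 'forest', alpha = 1.0 -> fully 'cave'.
      A real-time system would update alpha every audio block as the listener moves."""
      ir = (1.0 - alpha) * ir_forest + alpha * ir_cave   # naive linear IR interpolation
      return fftconvolve(dry, ir)[: len(dry)]

  # Example: one second of a dry test tone, rendered halfway through the transition.
  dry = np.sin(2 * np.pi * 440 * np.arange(sr) / sr)
  wet = render_transition(dry, alpha=0.5)
  print(wet.shape)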

This research is important because audio plays a crucial role in creating an immersive virtual experience. By improving the audio transitions, the project aims to make games and simulations more engaging, realistic, and enjoyable. Imagine being fully immersed in a virtual world where the audio matches what you see, making it feel more real and exciting.

This project involves collaboration between different fields, including computer science, audio engineering, and human-computer interaction. By bringing together expertise from these areas, the team hope to develop innovative techniques using artificial intelligence and advanced algorithms to enhance the audio experience in real-time.

Anticipated Findings and Contribution to Knowledge

The anticipated research findings will contribute new knowledge by developing novel techniques and approaches for real-time interpolation of impulse responses (IRs) using neural networks in dynamic 3D scenes. The research aims to address the challenge of providing accurate and immersive audio experiences in virtual environments whose acoustic filters change as users transition between different spatial environments.

The anticipated findings will demonstrate the feasibility and effectiveness of using neural networks to dynamically adjust audio cues and filters in real time, accounting for dynamic objects, occlusions, reflections, and semantic materials within the 3D space. This will enable seamless and realistic audio rendering as users navigate virtual environments.
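
To make the shape of that idea concrete, the sketch below defines a deliberately small, untrained PyTorch network that maps a compact scene descriptor (listener and source positions plus a few scene features) to an impulse response; the inputs, architecture, and sizes are illustrative assumptions rather than the project's design.

  import torch
  import torch.nn as nn

  IR_LENGTH = 4_096  # number of IR samples the network predicts (placeholder size)

  class IRInterpolator(nn.Module):
      """A small stand-in for the kind of network the project might explore: it maps a
      compact scene descriptor to an impulse response. Inputs and sizes are illustrative."""

      def __init__(self, descriptor_dim: int = 9):
          super().__init__()
          # descriptor: e.g. listener xyz, source xyz, and a few scene/material features
          self.net = nn.Sequential(
              nn.Linear(descriptor_dim, 256),
              nn.ReLU(),
              nn.Linear(256, 256),
              nn.ReLU(),
              nn.Linear(256, IR_LENGTH),
          )

      def forward(self, descriptor: torch.Tensor) -> torch.Tensor:
          return self.net(descriptor)

  model = IRInterpolator()
  # One hypothetical scene descriptor: listener at (0, 1.6, 0), source at (3, 1.2, -4), 3 scene features.
  descriptor = torch.tensor([[0.0, 1.6, 0.0, 3.0, 1.2, -4.0, 0.2, 0.7, 0.1]])
  predicted_ir = model(descriptor)  # untrained output, shape (1, IR_LENGTH)
  print(predicted_ir.shape)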

Expected outcomes:

  • Development of a novel neural network-based framework for real-time interpolation and synthesis of IRs in dynamic 3D scenes.
  • Demonstration of improved efficiency and accuracy compared to existing interpolation methods.
  • Insights into the trade-offs between computational complexity, network architectures, and audio quality.
  • Contribution to the field of spatial audio rendering for virtual reality, augmented reality, and interactive gaming applications.
  • Publication of research findings in peer-reviewed conferences and journals.
  • Open-source implementation and availability of the developed framework for the research community.

To categorise the impact of the anticipated findings, we consider the application domains. In gaming, the realistic and dynamic audio rendering enabled by this research can greatly enhance the immersive experience for players. It can provide more accurate spatial audio cues, enabling players to locate and identify sounds within the virtual environment, improving gameplay and immersion. In simulations and training scenarios, realistic audio rendering is crucial for creating immersive and effective training experiences.

By accurately representing the acoustic characteristics of different environments and simulating realistic audio interactions, this research can enhance the training effectiveness and help users develop critical skills in a virtual setting. Furthermore, the impact of this research extends to areas such as virtual architectural walkthroughs, virtual meetings and collaborations, and virtual reality-based therapies. By providing realistic and dynamic audio rendering, users can experience a more immersive and engaging environment, enhancing the effectiveness of architectural presentations, remote collaborations, teleconferencing, metaverse co-working and therapeutic interventions.

Project Title: Smart Building Process Connectivity (SBPC): Enhancing Operational Efficiency through Data-Driven Algorithms

Project Lead:
Dr Gerald Feldman Gerald.Feldman@bcu.ac.uk
https://www.bcu.ac.uk/computing/about-us/our-staff/gerald-feldman

Reference: SBPC

Project Description

This research aims to enhance operational efficiency within smart buildings using data-driven algorithms. The research includes:

  • A comprehensive review of process connectivity, smart building frameworks, operational and maintenance data, and stakeholder requirements.
  • Development of simulation models through use cases to demonstrate the concept.
  • Use of the collected simulation data to drive novel data-driven algorithms.
  • Application of the algorithms to operational optimisation, efficiency, and predictive maintenance in smart buildings.

Anticipated Findings and Contribution to Knowledge

The anticipated findings and contributions of this project are as follows:

  • An improved process connectivity model for smart buildings, informed by an in-depth review of related work covering process connectivity, smart building frameworks, operational and maintenance data, and stakeholder requirements.
  • Utilisation of simulation tools (e.g., MATLAB/Simulink, Network Simulator 2, Python) to develop a simulation model for smart buildings based on insights from the literature review.
  • Development of novel data-driven algorithms for (a) optimising operational efficiency and (b) predictive maintenance of smart buildings (a minimal illustrative sketch follows this list).
  • Evaluation of system performance, demonstrating the efficiency of the integrated system against operational and non-operational requirements (e.g., energy and stakeholder requirements).
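
The advert does not prescribe any particular algorithm; purely as a hypothetical illustration of a data-driven maintenance trigger, the Python sketch below simulates a slowly drifting sensor reading for an air-handling unit and raises an alert when its rolling mean departs from a commissioning baseline. The signal, thresholds, and drift model are all invented for illustration.

  import numpy as np

  rng = np.random.default_rng(42)

  # Simulated vibration readings from a hypothetical air-handling unit, sampled hourly.
  # A slow drift is injected after hour 600 to mimic developing bearing wear.
  hours = 1_000
  readings = 1.0 + 0.05 * rng.standard_normal(hours)
  readings[600:] += np.linspace(0.0, 0.6, hours - 600)

  def maintenance_alerts(readings, baseline_hours=200, window=48, threshold=4.0):
      """Flag hours where the trailing `window`-hour mean drifts more than `threshold`
      baseline standard deviations above the commissioning baseline -- a minimal
      data-driven predictive-maintenance trigger."""
      base_mean = readings[:baseline_hours].mean()
      base_std = readings[:baseline_hours].std()
      alerts = []
      for t in range(baseline_hours + window, len(readings)):
          rolling_mean = readings[t - window:t].mean()
          if rolling_mean > base_mean + threshold * base_std:
              alerts.append(t)
      return alerts

  alerts = maintenance_alerts(readings)
  print(f"First maintenance alert at hour {alerts[0]}" if alerts else "No alerts")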

Funding

Based on the UK Research and Innovation rates for 2024-2025 and in line with Research Council values, this funding model includes a 36-month, fully funded PhD studentship comprising a tax-free stipend of £19,237 per year (2024/5), paid monthly, and a Full Time Fee Scholarship irrespective of your fee status, subject to satisfactory progression within your PhD research.

Closing date

23:59 on Tuesday 30th April 2024, for a start date of 2nd September 2024.

Location

Faculty of Computing, Engineering and the Built Environment

STEAMhouse, City Centre Campus
Belmont Row, Birmingham B4 7RQ
