Funding for: UK Students, EU Students
Placed On: 6th June 2019
Closes: 8th July 2019
Supervisor: Professor Marta Kwiatkowska
Start Date: October 2019
The Department of Computer Science at the University of Oxford is offering two fully funded DPhil studentships in the Automated Verification theme under the supervision of Professor Marta Kwiatkowska, to commence in October 2019 or as soon as possible thereafter.
Project Description: The FUN2MODEL project (www.fun2model.org) aims to make advances towards provably robust 'strong' Artificial Intelligence. In contrast to 'narrow' AI perception tasks realised by deep learning, which are limited to learning data associations, and sometimes referred to as function-based, 'strong' AI aims to match human intelligence and requires model-based reasoning about causality and 'what if' scenarios, incorporation of cognitive aspects such as beliefs and goals, and probabilistic reasoning frameworks that combine logic with statistical machine learning.
The objectives of FUN2MODEL are to develop novel probabilistic verification and synthesis techniques to guarantee safety, robustness and fairness for complex decisions based on machine learning, formulate a comprehensive, compositional game-based modelling framework for reasoning about systems of autonomous agents and their interactions, and evaluate the techniques on a variety of case studies.
Studentship 1 Project Description: Fairness and bias in multi-agent interactions
Fairness and absence of bias in algorithmic decisions are critical to their acceptance in society, but have been lacking in recently deployed AI software, for example Microsoft’s chatbot Tay. As a result, a variety of definitions of algorithmic fairness and corresponding verification approaches have been developed. However, these do not capture the influence of the cognitive and affective aspects of complex decisions made by autonomous agents, such as preferences and emotional state, which are essential to achieve effective collaboration between human and artificial agents. This project aims to develop a probabilistic, Bayesian framework based on causal inference for reasoning about fairness and bias in multi-agent collaborations, together with demonstrator case studies and associated software tools.
Studentship 2 Project Description: Causal reasoning about accountability and blame
While deep learning is able to discern data associations, Bayesian networks are capable of reasoning about counterfactual and interventional scenarios, for example “What if the car had swerved when the child stepped on to the road?”. However, in order to model realistic human behaviours, Bayesian priors and inference must additionally account for cognitive goals and intentions, such as inference of intent for the pedestrian. This project aims to develop a framework for probabilistic causal reasoning with cognitive aspects to study accountability and blame in autonomous scenarios, together with demonstrator case studies and associated software tools.
The successful applicants will join the internationally leading research group of Professor Marta Kwiatkowska, who has an extensive track record in probabilistic verification and pioneering research on safety verification for neural networks and trust in human-robot collaborations. The group has presented its work at leading conferences in concurrency, verification, AI, and robotics (notably CAV, CONCUR, TACAS, IJCAI, AAAI and ICRA). The successful applicants will have the opportunity to collaborate with other members of the FUN2MODEL project, and to utilise the methods, models and software developed as part of the project towards their own research objectives.
More information about Professor Kwiatkowska’s research and the PRISM model checker can be found here:
The project will provide the student with a stipend of at least £15,600 per annum for 3.5 years. The project will also cover university and college fees at the home/EU level (international students will need additional funding), travel to conferences and workshops, and provision for a laptop computer.
Applicants must satisfy the usual requirements for studying for a doctorate at Oxford. Candidates must also have good writing, communication and presentation skills (see the University's web pages on the DPhil in Computer Science for details).
You should apply online by 8th July 2019, quoting studentship reference CS-MK-2019 via
We expect to invite shortlisted applicants to interview in week commencing 22nd July 2019.
For further information about the project or for informal discussions about suitability, please contact Marta Kwiatkowska (email@example.com). For further information about the studentship or the application process please e-mail Computer Science Graduate Admissions firstname.lastname@example.org.