Research Fellow in Robotics and Machine Learning

University of Leeds - School of Computing

Are you interested in making robots more capable of performing complex manipulation, particularly in cluttered environments? Would you like to work on an EPSRC-funded project under its Human-Like Computing initiative? Do you have the knowledge and experience to take data from humans manipulating objects in virtual environments and apply machine learning methods to extract rules that embody the strategies humans use to reach their goals? Would you like the challenge of then implementing these rules on a robot to test their efficacy?

This is one of two roles on an 18-month feasibility study entitled “Human-like physics understanding for autonomous robots”, which will investigate whether data on how humans manipulate objects in cluttered environments can be used to improve robots’ ability to do the same. State-of-the-art robot motion and manipulation planners use low-level probabilistic methods, often based on random sampling. This approach has two drawbacks: (1) it restricts robots to planning their motion at the bottom-most geometric level, and without any top-down guidance this results in the limited object manipulation ability displayed by today’s intelligent robots; (2) it produces randomised motion that is not legible to humans, which limits robots’ ability to collaborate with people. By incorporating human-like decision making into robot planning, we aim to overcome these limitations and produce a fundamental step-change in the sophistication of these robots.
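For context on the sampling-based planners referred to above, the sketch below shows a minimal RRT-style planner in a toy 2-D workspace. It is purely illustrative: the workspace, obstacle model, and parameters are assumptions for this example and are not part of the project itself, but it shows why such planners produce randomised, geometrically low-level motion.

```python
# Minimal sketch of a sampling-based (RRT-style) planner in a 2-D workspace.
# Illustrative only: the workspace, obstacles, and parameters are assumptions
# for this example, not part of the project described above.
import math
import random

OBSTACLES = [((4.0, 4.0), 1.5)]            # (centre, radius) discs
START, GOAL = (1.0, 1.0), (9.0, 9.0)
STEP, GOAL_TOL, MAX_ITERS = 0.5, 0.5, 5000

def collision_free(p):
    return all(math.dist(p, c) > r for c, r in OBSTACLES)

def steer(a, b):
    # move from a towards b by at most STEP
    d = math.dist(a, b)
    if d <= STEP:
        return b
    t = STEP / d
    return (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))

def rrt():
    parents = {START: None}
    for _ in range(MAX_ITERS):
        sample = (random.uniform(0, 10), random.uniform(0, 10))
        nearest = min(parents, key=lambda n: math.dist(n, sample))
        new = steer(nearest, sample)
        if not collision_free(new):
            continue
        parents[new] = nearest
        if math.dist(new, GOAL) < GOAL_TOL:
            path, node = [], new
            while node is not None:          # walk back to the start
                path.append(node)
                node = parents[node]
            return path[::-1]
    return None                              # no path found within budget

print(rrt())
```

Because every extension of the tree is driven by random samples, each run yields a different, often meandering path; this is the kind of motion the project aims to augment with top-down, human-like guidance.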

An example task under consideration is how to reach something at the back of a crowded fridge shelf; similar challenges arise in commercial settings, e.g. the Amazon Picking Challenge. We will start by exploring how humans perform such tasks in a VR setting, which will allow us to vary the task parametrically and extract data easily. We then plan to use symbolic machine learning techniques to extract rules, expressed using qualitative spatial representations, that capture the tacit human knowledge gained ontogenetically and phylogenetically. Finally, we plan to test the learned model in a robotic setting.
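As a rough illustration of the pipeline described above, the sketch below converts logged object poses (such as those recorded in a VR manipulation session) into qualitative spatial facts of the kind a symbolic rule learner could consume. The relation vocabulary, thresholds, and object names are assumptions for this example, not the project's actual representation.

```python
# Illustrative sketch: turning object poses into qualitative spatial relations.
# The relations, thresholds, and scene are assumptions made for this example.
from dataclasses import dataclass

@dataclass
class Obj:
    name: str
    x: float       # lateral position on the shelf (m)
    depth: float   # distance from the shelf front (m)

def relations(objects, margin=0.05):
    """Return qualitative facts such as ('behind', 'milk', 'jam')."""
    facts = []
    for a in objects:
        for b in objects:
            if a.name == b.name:
                continue
            if a.depth > b.depth + margin:
                facts.append(("behind", a.name, b.name))
            if a.x < b.x - margin:
                facts.append(("left_of", a.name, b.name))
    return facts

shelf = [Obj("milk", 0.10, 0.40), Obj("jam", 0.12, 0.10), Obj("butter", 0.30, 0.15)]
for fact in relations(shelf):
    print(fact)
```

From many such labelled episodes, a symbolic learner (for instance an inductive logic programming system) could induce rules along the lines of "move_aside(B) :- behind(Target, B)", i.e. objects occluding the target are moved out of the way first; the exact rule language used in the project would be determined during the study.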

You will also contribute to a second EPSRC project entitled “Multi-Robot Manipulation Planning for Forceful Manufacturing Tasks”. The goal of this project is to enable robots to perform simple manufacturing tasks. To do this, a robot team will need to decide how to grasp the workpieces and how to move in order to perform these operations. For a planning algorithm to make such decisions, it must satisfy geometric collision constraints, forceful stability constraints, and sequential temporal constraints simultaneously.
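To give a flavour of the joint constraint reasoning this involves, the sketch below keeps only those grasp assignments for two robots that simultaneously satisfy a collision check and a force-stability check. Everything here (the robots, grasp options, and constraint rules) is an illustrative assumption, not the project's planner; a real planner would additionally reason over the sequencing of operations.

```python
# Illustrative sketch of jointly checking constraints over candidate grasps.
# All data structures and constraint rules are assumptions for this example.
from itertools import product

GRASPS = {"robot_a": ["side", "top"], "robot_b": ["side", "top"]}

def collision_free(ga, gb):
    # assume the two arms collide if both approach from the side
    return not (ga == "side" and gb == "side")

def force_stable(ga, gb):
    # assume at least one top grasp is needed to resist the applied force
    return "top" in (ga, gb)

def feasible_assignments():
    return [(ga, gb)
            for ga, gb in product(GRASPS["robot_a"], GRASPS["robot_b"])
            if collision_free(ga, gb) and force_stable(ga, gb)]

print(feasible_assignments())
```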

To explore the post further, please contact:

Prof. Tony Cohn, School of Computing, Tel: +44 (0)113 343 5482 or email: a.g.cohn@leeds.ac.uk

Dr Mehmet Dogar, School of Computing, Tel: +44 (0) 113 343 5777 or email: m.r.dogar@leeds.ac.uk

Location: Leeds - Main Campus
Faculty/Service: Faculty of Engineering
School/Institute: School of Computing
Category: Research
Salary: £32,548 to £38,833 p.a. Due to funding restrictions, an appointment will not be made above £36,613 p.a.
Contract Type: Fixed Term until 31 March 2020 (grant funded)
