Dexterous Humanoid Robot Motion Learning

An ideal interface for humanoid robot control would be inexpensive, person-independent and easy to use, would require no wearable equipment and, most importantly, would reliably achieve the task goal. This project aims to develop a novel goal-directed online learning method and to build a human-robot interactive demonstration system that enables people to easily control and interact with humanoid robots using natural body gestures. The robot platforms include the Baxter and Sawyer robots.

Intelligent Human-Robot Interaction

This project aims to improve human-robot interaction via body and hand gestures using state-of-the-art sensors. It will enable a robot to recognise human body gestures in real time, especially complex hand gestures, and to identify the user's intention, which could be of great help in healthcare and service robot applications.
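One simple way to picture gesture recognition from sensed hand data is template matching: compare an observed feature vector (e.g. finger flexion angles) against stored gesture templates. The sketch below is purely illustrative; the gesture names, feature values and nearest-neighbour rule are assumptions, not the project's actual method.

```python
import math

# Hypothetical template feature vectors: five finger flexion angles (radians).
GESTURE_TEMPLATES = {
    "open_hand": [0.1, 0.1, 0.1, 0.1, 0.1],
    "fist":      [1.5, 1.5, 1.5, 1.5, 1.4],
    "point":     [0.1, 1.5, 1.5, 1.5, 1.4],
}

def recognise_gesture(features):
    """Return the template label closest (Euclidean) to the observed vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(GESTURE_TEMPLATES, key=lambda g: dist(features, GESTURE_TEMPLATES[g]))

print(recognise_gesture([0.2, 1.4, 1.6, 1.5, 1.3]))  # -> point
```

A real system would replace the hand-picked templates with features learned from sensor data, but the matching step keeps this shape.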

DREAM Project

DREAM is an EC-funded project that will deliver the next generation of robot-enhanced therapy (RET). It develops clinically relevant interactive capacities for social robots that can operate autonomously for limited periods under the supervision of a psychotherapist. DREAM will also provide policy guidelines to govern the ethically compliant deployment of supervised-autonomy RET. The core of the DREAM RET robot is its cognitive model, which interprets sensory data (body movement and emotion appearance cues), uses these perceptions to assess the child's behaviour by learning to map them to therapist-specific behavioural classes, and then learns to map these child behaviours to appropriate robot actions as specified by the therapists. More details are available on the official website.
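The cognitive model described above is a two-stage mapping: perceived cues are mapped to a behavioural class, and the class is mapped to a therapist-specified action. A minimal sketch of that structure is below; the cue names, class labels, rules and actions are all invented for illustration and stand in for learned components.

```python
# Stage 1: map sensory cues to a behaviour class.
# A trivial rule-based stand-in for a trained classifier.
def classify_behaviour(cues):
    if cues["gaze_on_robot"] and cues["movement_energy"] < 0.3:
        return "engaged_calm"
    if cues["movement_energy"] > 0.7:
        return "agitated"
    return "disengaged"

# Stage 2: therapist-specified mapping from behaviour class to robot action.
ACTION_POLICY = {
    "engaged_calm": "continue_task",
    "agitated": "pause_and_soothe",
    "disengaged": "prompt_attention",
}

def select_action(cues):
    return ACTION_POLICY[classify_behaviour(cues)]

print(select_action({"gaze_on_robot": True, "movement_energy": 0.2}))
# -> continue_task
```

Keeping the two stages separate mirrors the project's design: the perception-to-class mapping is learned, while the class-to-action policy remains under the therapist's control.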

Human Hand Motion Analysis With Multisensory Information 

The different properties involved in human hand motions, such as hand position, velocity, force and their changes over time, provide rich sensory information for building computational models of these motions. Integrating this multisensory information is essential for human hand motion analysis. The analysis framework consists of the following modules:
  • Motion capturing module: uses different sensors to convert the sensory information into digital signals recognisable by computers.
  • Preprocessing module: synchronises and filters the raw digital data and segments them into individual tasks.
  • Knowledge base module: stores the human hand motion primitives, manipulation scenarios and the correlations among the different sensory modalities.
  • Identification module: uses clustering and machine learning methods to train the motion models and to recognise new or test sensory data.
  • Desired trajectory generation module: generates the desired trajectories for different applications based on the human analysis framework.
  • Applications: robotic hands, prosthetic hands, animated hands, human-computer interaction and so on.
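The modules above form a pipeline, and their chaining can be sketched as follows. Everything here is a hypothetical placeholder: the signal values, the moving-average filter, the threshold segmentation and the two motion primitives stand in for real sensing, preprocessing and learned models.

```python
def capture():
    """Motion capturing module: stand-in for a raw sensor readout."""
    return [0.0, 0.0, 0.9, 1.1, 1.0, 0.0, 0.0]

def preprocess(signal, window=3):
    """Preprocessing module: smooth with a moving average, then keep the
    samples above a threshold as one segmented task episode."""
    padded = signal[:1] * (window - 1) + signal
    smoothed = [sum(padded[i:i + window]) / window for i in range(len(signal))]
    return [x for x in smoothed if x > 0.3]

def identify(segment, primitives):
    """Identification module: match the segment's mean activity against
    motion primitives stored in the knowledge base."""
    mean = sum(segment) / len(segment)
    return min(primitives, key=lambda p: abs(primitives[p] - mean))

# Knowledge base module: illustrative motion primitives.
KNOWLEDGE_BASE = {"grasp": 1.0, "release": 0.1}

segment = preprocess(capture())
print(identify(segment, KNOWLEDGE_BASE))  # -> grasp
```

In the full framework the identified motion would then feed the desired trajectory generation module for a target application such as a robotic or prosthetic hand.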

Mobile, Autonomous and Affordable System to Increase Security in Large Unpredictable Environment

Civil installations such as power plants are often located in wide and remote areas. In the future, the number of small distributed facilities will increase as a direct result of new European environmental policies aimed at increasing societies' resilience to climate change. However, the protection of fragmented assets will be difficult to achieve and will require portable security systems that are affordable to those in charge of their management. The BASYLIS project (supported by the European Seventh Framework Programme) aims to address these issues by developing a low-cost smart sensing platform that can automatically and effectively detect a range of security threats in complex environments. The principal obstacles to early threat detection in wide areas are of two types: functional (e.g. false-alarm rate) and ethical (e.g. privacy). Both problems are exacerbated when either the installations or the environments are dynamic. Potential solutions are unaffordable to most of the potential users.

In Work Package 6, our objectives are to develop:

  • Multitracker: this software module will integrate all the sensor alarms, including the COTS ones, through the COTS Integration Board.
  • Behavioural Analysis: this module automates the identification and classification of suspicious behaviour of objects tracked by the multitracker.
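The relationship between the two modules can be sketched as fusing per-sensor alarms into per-object tracks, then flagging objects on a simple rule. The field names, the "confirmed by two independent sensors" rule and the data are assumptions made for illustration, not the BASYLIS design.

```python
from collections import defaultdict

def fuse_alarms(alarms):
    """Multitracker stand-in: group alarm events by the object they refer to."""
    tracks = defaultdict(list)
    for alarm in alarms:
        tracks[alarm["object_id"]].append(alarm["sensor"])
    return dict(tracks)

def classify_tracks(tracks, min_sensors=2):
    """Behavioural-analysis stand-in: flag an object as suspicious only when
    several independent sensors confirm it, reducing the false-alarm rate."""
    return [obj for obj, sensors in tracks.items()
            if len(set(sensors)) >= min_sensors]

alarms = [
    {"object_id": "A", "sensor": "radar"},
    {"object_id": "A", "sensor": "camera"},
    {"object_id": "B", "sensor": "radar"},
]
print(classify_tracks(fuse_alarms(alarms)))  # -> ['A']
```

Requiring agreement across sensors is one common way to trade detection sensitivity against the false-alarm rate mentioned above.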

For details of the project, please refer to the project webpage.


A Software Package for Controlling Prosthetic Hands via EMG-Based User Manipulation Intention

This project aims to develop a real-time platform for transferring human manipulation skills to prosthetic hands. The project is funded by the Higher Education Innovation Fund 4 (HEIF4).
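A common first step in decoding manipulation intention from EMG is to rectify and smooth the raw signal into an envelope, then threshold it to choose a hand command. The sketch below assumes invented signal values, a moving-average envelope and an arbitrary threshold; it is not the project's actual decoding pipeline.

```python
def envelope(emg, window=4):
    """Rectify the raw EMG samples and smooth them with a moving average."""
    rect = [abs(x) for x in emg]
    return [sum(rect[max(0, i - window + 1):i + 1]) /
            len(rect[max(0, i - window + 1):i + 1])
            for i in range(len(rect))]

def decode_intention(emg, close_threshold=0.5):
    """Map the latest envelope value to a prosthetic hand command."""
    env = envelope(emg)
    return "close_hand" if env[-1] > close_threshold else "open_hand"

burst = [0.1, -0.8, 0.9, -0.7, 0.8]     # strong muscle activation
rest  = [0.05, -0.1, 0.08, -0.06, 0.04]  # baseline activity
print(decode_intention(burst), decode_intention(rest))
# -> close_hand open_hand
```

Real-time use would run this over a sliding window of the incoming EMG stream; richer intention decoding would replace the threshold with a trained classifier over multiple channels.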