University College London MotionInput v2.0 supporting DirectX: Touchless Computing Interactions


Sheena Visram, Co-Supervisor for UCL MotionInput, leads on healthcare applications of touchless computing and researches the adoption of emerging technologies at Great Ormond Street Hospital for Children’s DRIVE centre and the University College London Interaction Centre (UCLIC).


Paper publication:

https://arxiv.org/abs/2108.04357

 

Project website, video demos and community registration:

www.motioninput.com

 

Academic Project Co-Supervisors at University College London:

Prof Dean Mohamedally, Prof Graham Roberts, Dr Nicholas Gold, Prof Neil Sebire, Prof Yvonne Rogers, Prof Joseph Connor (UCL)

 

In Collaboration with Microsoft:

Lee Stott, Honorary Associate Professor in Software Systems Engineering (UCL)

 

The COVID-19 pandemic has renewed interest in touchless computing in healthcare. This is largely motivated by infection prevention and control, and by a growing interest in consulting with and remotely monitoring patients in their own homes. Previous efforts to implement touchless computing have been limited by the expense and installation of new hardware, by software requirements and by the assembly of bespoke solutions. I have been leading University College London (UCL)’s MotionInput v2.0 supporting DirectX project with Great Ormond Street Hospital for Children, where our team was set the task of creating a low-cost, scalable method for enabling touchless computing for existing and new software, using standard webcams.

 

Project approach

UCL’s MotionInput v2.0 supporting DirectX is a modular framework with four modules of customised gesture tracking from a webcam, built on open-source libraries: hand gestures, head movement, eye tracking and exercise recognition. These are connected by a common Graphical User Interface (GUI) to enable Windows-based interactions with all existing PC software managed by DirectX, including games, creativity, office and especially healthcare applications. Human motion gestures, such as hand and body motions, head and facial movement and eye tracking, are mapped from an RGB webcam video stream to input commands for existing applications and games. The framework combines published open-source libraries in a federated way, with all video processing performed locally on the user’s machine.
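To give a flavour of the core mapping from webcam frames to input commands, below is a minimal sketch of hand-driven cursor control. It assumes MediaPipe Hands for tracking and pyautogui for injecting mouse input; the post only refers to open-source libraries in general, so this particular stack is illustrative rather than a statement of the project’s exact implementation.

```python
import cv2
import mediapipe as mp
import pyautogui

pyautogui.FAILSAFE = False  # allow the cursor to reach screen corners
screen_w, screen_h = pyautogui.size()
hands = mp.solutions.hands.Hands(max_num_hands=1)

cap = cv2.VideoCapture(0)  # any standard RGB webcam
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.flip(frame, 1)  # mirror the image so movement feels natural
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        # Landmark 8 is the index fingertip; coordinates are normalised 0..1.
        tip = results.multi_hand_landmarks[0].landmark[8]
        pyautogui.moveTo(tip.x * screen_w, tip.y * screen_h)
    cv2.imshow("hand tracking sketch", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # press Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```

Because the cursor is driven by ordinary operating-system mouse events, any DirectX-managed Windows application receives the input without modification, which is what makes the approach work with existing software.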

 

[Figure: System architecture of the prototype software]

 

Our solution

MotionInput v2.0 is designed to work across a broad range of contexts and with users of differing capabilities. Users can execute cursor commands, moving the cursor with their hands, eyes or head direction, and perform mouse clicks with hand pinching, eye blinking or mouth opening gestures. They can scroll with a hand or head movement. Physical exercises such as squatting, jumping, cycling, air punches and kicks are also detected and measured as inputs for DirectX. New contributions include an idle state for activating and deactivating gestures, a dynamic area of interest, and depth variation captured from a 2D webcam.
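As an illustration of how a click gesture and the idle state could fit together, here is a short sketch of pinch-to-click. It assumes MediaPipe-style normalised hand landmarks (thumb tip = 4, index fingertip = 8); the threshold and idle toggle are illustrative assumptions, not the project’s actual parameters.

```python
import math
import pyautogui

PINCH_THRESHOLD = 0.05   # normalised landmark distance; tune per camera/user
gestures_active = True   # the idle state arms or disarms all gestures
pinched = False          # edge-detect so one pinch fires exactly one click

def on_hand_landmarks(landmarks):
    """Call once per frame with the 21 detected hand landmarks."""
    global pinched
    if not gestures_active:
        return  # idle state: gestures are deactivated
    thumb, index = landmarks[4], landmarks[8]
    dist = math.hypot(thumb.x - index.x, thumb.y - index.y)
    if dist < PINCH_THRESHOLD and not pinched:
        pinched = True
        pyautogui.click()  # fire on the pinch's rising edge
    elif dist >= PINCH_THRESHOLD:
        pinched = False    # pinch released; ready for the next click
```

Edge-detecting the pinch, rather than clicking on every frame below the threshold, keeps a single held pinch from registering as a stream of clicks.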

It is with great pleasure that I showcase some of the features of this year’s version of the software, built with UCL’s Industry Exchange Network (UCL IXN) in collaboration with Microsoft and Great Ormond Street Hospital for Children’s DRIVE Unit. I am so proud of the ingenuity and achievements of this year’s team of undergraduate and Master’s in Computer Science students: Ashild Kummen, Guanlin Li, Ali Hassan, Teodora Ganeva, Qianying Lu, Robert Shaw, Chenuka Ratwatte, Yang Zou, Lu Han and Emil Almazov.

 

Featured healthcare applications

  • Healthcare staff can navigate complex user interfaces, such as electronic health records and radiology images, touch-free using mid-air hand gestures. This includes free drawing on DICOM scans, and touchless panning and rotation of 3D imaging and anatomy. There are numerous occasions when a clinician’s hands are occupied by clinical tasks, or when it is important to keep peripheral computer interfaces free from contamination.

[Student project demo videos: see www.motioninput.com]

  • Patients with motor impairments can use head movements or eye tracking to interact with their computer for both functional and creative tasks. This might include navigating web browsers, scrolling through PDF files, and creative expression through painting and playing music. A sketch of head-driven pointing follows below.
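Head-driven pointing can be reduced to mapping the offset of a facial landmark from the centre of the frame onto screen coordinates. The sketch below assumes MediaPipe Face Mesh normalised landmarks (index 1 sits on the nose tip) and a mirrored frame as in the earlier sketch; the gain and dead-zone values are illustrative assumptions.

```python
import pyautogui

screen_w, screen_h = pyautogui.size()
GAIN = 2.5        # amplify small head movements around the frame centre
DEAD_ZONE = 0.02  # ignore offsets smaller than this (normalised units)

def on_face_landmarks(landmarks):
    """Call once per frame with the detected face landmarks."""
    dx = landmarks[1].x - 0.5  # nose-tip offset from the frame centre
    dy = landmarks[1].y - 0.5
    if abs(dx) < DEAD_ZONE and abs(dy) < DEAD_ZONE:
        return  # inside the dead zone: hold the cursor still
    x = (0.5 + GAIN * dx) * screen_w
    y = (0.5 + GAIN * dy) * screen_h
    pyautogui.moveTo(max(0, min(screen_w - 1, x)),
                     max(0, min(screen_h - 1, y)))
```

A dead zone helps filter small involuntary movements while deliberate head turns still move the cursor.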

[Student project demo videos: see www.motioninput.com]

  • Full-body rehabilitative and repetitive hand exercises can be performed from home and encouraged by playing existing Windows games, as well as Kinect games with skeletal tracking capabilities. This might be after hospital treatment, to build strength after a medical event, to improve motor skills, or as part of targeted conditioning guided by physiotherapists. A sketch of exercise detection follows below.
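Exercise recognition follows the same pattern: pose landmarks are reduced to a simple geometric test, and a recognised repetition is translated into a key press the game already understands. The sketch below assumes MediaPipe Pose landmarks (left hip = 23, left knee = 25) and pyautogui; the thresholds and the chosen key are illustrative assumptions, not the project’s actual bindings.

```python
import pyautogui

standing = True  # track posture so each squat fires exactly once

def on_pose_landmarks(landmarks):
    """Call once per frame with the 33 detected pose landmarks."""
    global standing
    hip_y, knee_y = landmarks[23].y, landmarks[25].y
    # y grows downward in image coordinates, so when the user squats
    # the hip drops towards knee height and this gap shrinks.
    gap = knee_y - hip_y
    if standing and gap < 0.10:        # hip near knee height: in a squat
        standing = False
        pyautogui.press("space")       # e.g. jump or duck in the game
    elif not standing and gap > 0.20:  # hip back up: standing again
        standing = True
```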

[Student project demo videos: see www.motioninput.com]

We have launched our community beta test programme for public and enterprise sectors to participate in testing and refining this technology with their existing Windows software.

 

Register at www.motioninput.com to join our Community Interest group, with access to our beta testing partnership programmes via UCL:

  • Healthcare and clinical applications testing with Great Ormond Street Hospital for Children’s GOSH DRIVE
  • Education outreach with the UCL Institute of Education
  • Accessibility opportunities with the UCL GDI Hub
  • Retail and large event spaces testing in partnership with NTTDATA
  • Esports PC games playtesting with Sparq Esports

 

We look forward to meeting organisations interested in setting up their own touchless computing test programmes and initiatives with us, so please get in touch via the community interest link above!

 
