deepHand: Deep Learning the Finger and Hand Articulations

deepHand is an improved natural user interface that uses a single camera and convolutional neural networks to robustly track hand position and finger joint angles, allowing for more intuitive gestures with less hardware.
Technology No. 2016-RAMA-67373

As virtual and augmented reality devices become more commonplace, gesture-based natural user interfaces (NUIs) seem to be the future. NUIs can provide an immersive and natural computing experience, especially when compared with traditional input devices such as mice and keyboards. Yet even though NUIs improve on traditional input devices, they remain limited by their hand-tracking capabilities as well as by their extensive hardware requirements.

Researchers at Purdue University have developed a greatly improved NUI known as deepHand, which robustly determines hand position using a single camera, even when the hand is partially hidden from view. Unlike current hand-tracking software, deepHand identifies and tracks the finger joint angles rather than the positions of the finger joints themselves, which makes the system considerably more robust and invariant to changes in camera orientation. Using convolutional neural networks (CNNs), it continuously tracks the hand even when it is partially covered or briefly moves out of view. CNNs, loosely inspired by the biological visual cortex, apply layers of overlapping learned filters to an image; in deepHand, these learned features are used to estimate the joint angles. deepHand allows more intuitive hand gestures while using less hardware, opening the door to immersive human-computer interactions.
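To make the joint-angle idea concrete, below is a minimal sketch of a CNN that regresses finger joint angles from a single-camera frame. It is written in PyTorch for illustration only; the layer sizes, the 128x128 input, and the 20-angle hand model are assumptions, not the published deepHand architecture.

    import torch
    import torch.nn as nn

    class JointAngleRegressor(nn.Module):
        """Regresses finger joint angles (not joint positions) from one RGB frame."""

        def __init__(self, num_angles: int = 20):  # 20-angle hand model is an assumption
            super().__init__()
            # Stacked convolutions build layered, overlapping feature maps,
            # loosely analogous to receptive fields in the visual cortex.
            self.features = nn.Sequential(
                nn.Conv2d(3, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
                nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.head = nn.Linear(128, num_angles)  # one output per joint angle

        def forward(self, frame: torch.Tensor) -> torch.Tensor:
            x = self.features(frame).flatten(1)
            return self.head(x)  # predicted joint angles, e.g., in radians

    # Usage: one 128x128 single-camera frame in, 20 predicted joint angles out.
    model = JointAngleRegressor()
    angles = model(torch.randn(1, 3, 128, 128))
    print(angles.shape)  # torch.Size([1, 20])

Regressing angles rather than pixel positions keeps the output in the hand's own coordinate frame, which is one way to see why such an approach can be less sensitive to camera orientation.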

Advantages:

-Natural gesture-based inputs

-Uses a single camera

-Permits hand-tracking under occlusion

Potential Applications:

-Gaming and animation

-Immersive human-computer interaction

-Healthcare

-Education

TRL: 6

Intellectual Property:

Provisional Patent, 2015-12-15, United States

Utility Patent, 2016-12-15, United States

Divisional Patent, 2019-06-10, United States

Continuation Patent, 2019-12-09, United States

Continuation (Gov. Funding), 2020-11-30, United States

Continuation Patent, 2022-06-13, United States

Keywords: Natural User Interface (NUI), deepHand, gesture-based inputs, hand-tracking software, convolutional neural networks (CNNs), immersive human-computer interaction, virtual reality (VR), augmented reality (AR), robust hand pose estimation, single camera tracking, Algorithm, Computer Technology, Mechanical Engineering

Authors:

Joon Hee Choi
Chiho Choi
Karthik Ramani
Ayan Sinha

Supporting Documents:

Product brochure: deepHand: Deep Learning the Finger and Hand Articulations.pdf