Display Technology
3D gesture recognition in a single chip
Andreas Guete looks at how an innovative technique called E-Field is being used to offer contact-free position tracking, gesture recognition and approach detection
The advent of modern desktop operating systems such as Microsoft's Windows and Apple's Mac OS created a need for more than just a keyboard. While the mouse became the default pointing device for desktop machines, portable notebook computers have relied on something more integrated. Originally the options were a trackball (essentially a mouse turned on its back) or an isometric joystick, but by the early 1990s these somewhat clumsy options were replaced by the more intuitive touchpad. Since then, touchpads have evolved to offer multi-touch detection and capacitive sensing, but they still require a considerable amount of space while only detecting in two dimensions.
Using an innovative technique called E-Field, which uses near-field RF technology, Microchip has developed GestIC, a 3D gesture control technology that overcomes the limitations of existing touchpad solutions.
Integrated into a single chip, the MGC3130 creates an electrical field that can detect in 1D (presence), 2D (touchpad) and 3D (gesture) to provide contact-free position tracking, gesture recognition and approach detection for a range of applications (Figure 1).
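To make the position-to-gesture idea concrete, the sketch below shows one simplified way a host application might turn a stream of tracked hand positions into approach and swipe events. It is a minimal illustration with made-up thresholds and event names, not Microchip's actual Gesture Library:

```python
# Simplified illustration (not Microchip's Gesture Library): classifying a
# stream of (x, y, z) hand positions into the detection levels the article
# describes - approach (presence) and gesture (movement through the field).

APPROACH_Z = 80.0   # hypothetical approach threshold, in mm
SWIPE_DX = 60.0     # hypothetical minimum x travel for a swipe, in mm

def classify(positions):
    """Return a list of events for a chronological list of (x, y, z) samples."""
    events = []
    if any(z < APPROACH_Z for _, _, z in positions):
        events.append("approach")       # something entered the sensing field
    dx = positions[-1][0] - positions[0][0]
    if dx > SWIPE_DX:
        events.append("swipe-right")    # net movement along x
    elif dx < -SWIPE_DX:
        events.append("swipe-left")
    return events

# A hand approaching and moving left to right across the sensing area:
samples = [(0, 40, 120), (20, 40, 70), (50, 40, 60), (90, 40, 65)]
print(classify(samples))                # ['approach', 'swipe-right']
```

A real implementation runs on the chip's SPU against five electrode signals at up to 200 updates per second; this host-side sketch only shows the shape of the position-to-event mapping.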
Figure 1

Components in Electronics | 22 September 2014

Technology overview
The electrical near-field sensing technology employed by GestIC uses a carrier frequency of around 100kHz, which equates to a wavelength of about 3km. However, in its implementation the distance between the electrodes is much smaller than the wavelength, which allows the magnetic field component and radiated energy to remain at practically zero. The result is a pure quasi-static E-Field that doesn't create interference in the radio spectrum used by other devices, such as mobile phones. Furthermore, unlike other sensing technologies sometimes employed to create 3D user interfaces - such as optical, ultrasound or infrared - E-Fields are not affected by light, sound, skin colour, humidity or other ambient parameters.

Any movement within the E-Field is detected and translated into X and Y co-ordinates; moving objects can, therefore, describe gestures which can be recognised. The device can control up to five receiving electrodes to acquire positional data, which is then processed by the on-chip Signal Processing Unit (SPU) and, when used with Microchip's free Gesture Library software (Figure 2), allows the MGC3130 to detect and decode a wide range of gestures. With a positional update rate of up to 200 per second, the five electrodes allow a vast number of data points to be acquired. Integrating the SPU and Gesture Library on-chip means a truly real-time gesture recognition system can be realised in a single chip. The low-power modes allow an 'always on' solution even in battery-powered devices; even in the lowest power mode, approach detection remains enabled, allowing maximum power savings without sacrificing responsiveness or functionality. Because position data is acquired in
three dimensions (X, Y and Z) and processed on-chip to deliver real-time response, users experience an interface that is more immersive, more intuitive and more responsive than existing alternatives. For developers, interfacing to the MGC3130 is also simple, using a 4-pin interface that supports SPI, I2C and more. Existing 'free space' detection systems
rely predominantly on optical sensors. However, these face a number of challenges, such as angle-dependent detection and limited coverage. In addition, the average power of optical-based systems can exceed 500mW for even rudimentary 3D functionality, while a 50 frames/second update rate presents real-time issues. The E-Field solution, on the other hand, requires less than one tenth of that power and offers significantly higher update rates.
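The figures quoted above can be sanity-checked with some quick arithmetic; this is a rough sketch using the article's approximate values, not measured data:

```python
# Back-of-envelope checks of the figures quoted in the article
# (all inputs are the article's approximate numbers, not measurements).

c = 299_792_458              # speed of light, m/s
f = 100e3                    # E-Field carrier frequency, ~100 kHz
wavelength_km = c / f / 1000
print(f"carrier wavelength ~ {wavelength_km:.1f} km")   # ~3.0 km, as stated

optical_power_mw = 500       # rudimentary optical 3D sensing, per the article
efield_power_mw = optical_power_mw / 10
print(f"implied E-Field power budget < {efield_power_mw:.0f} mW")

optical_fps = 50             # optical update rate, frames/second
efield_updates = 200         # E-Field positional updates per second
print(f"update-rate advantage: {efield_updates / optical_fps:.0f}x")
```

The first check confirms the 100kHz-to-3km relationship (wavelength = c/f); the electrode spacing being millimetres to centimetres against a 3km wavelength is why the field stays quasi-static.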
www.cieonline.co.uk