According to a report by ABI Research, the dominance of touchscreen user interfaces will decline over the next five years as new devices and sensors enter the market.
The new product form factors will enable new user interfaces like voice, gesture, eye-tracking, and neural.
ABI Research’s latest report examines popular user interface (UI) methods as well as the natural sensory technologies transitioning from research labs into future product solutions.
“Touch got mobile device usability to where it is today, but touch will become one of many interfaces for future devices as well as for new and future markets,” says senior practice director Jeff Orr.
The report finds that hand and facial gesture recognition will see the greatest growth in future smartphone and tablet shipments, with CAGRs of 30 percent and 43 percent respectively from 2014 to 2019.
ABI Research says the impact of UI innovation in mobile devices will be felt across a wide range of consumer electronics (CE) applications, including in the car and in the home.
The report suggests that as mobile applications integrate more technology, the UI must be kept simple enough to remain intuitive.
“Packing a mobile device with sensors goes little beyond being a novelty,” adds Orr.
Key components have also evolved from single-function elements into multi-sensor, single-chip packages.
This has not only benefited the handheld form factor, but has also been the premise for the leading commercially available wearable devices.