Computer scientists are developing new adaptive mobile technology that could enable blind and visually impaired people to ‘see’ through their smartphone or tablet.
A team from the University of Lincoln in Britain plans to use the colour and depth sensor technology inside new smartphones and tablets to enable 3D mapping and localisation, navigation and object recognition.
The team will then develop the best interface to relay that information to users — whether through vibrations, sounds or the spoken word.
“If people were able to use technology embedded in devices such as smartphones, it would not require them to wear extra equipment which could make them feel self-conscious,” said project lead Nicola Bellotto.
The researchers aim to develop a system that will recognise visual clues in the environment. This data would be detected through the device camera and used to identify the type of room as the user moves around the space.
A key aspect of the system will be its capacity to adapt to individual users’ experiences, modifying the guidance it provides as the machine ‘learns’ from its landscape and from the human interaction.
So the more accustomed the user becomes to the technology, the quicker and easier it would be for the system to identify the environment.
“There are also existing smartphone apps that are able to, for example, recognise an object or speak text to describe places.
But we aim to create a system that understands how people observe and recognise particular features of their environment,” Bellotto explained.