Modular AI Wheelchairs Can Watch for Obstacles, Incorporate Head Tracking


When I was young, I subscribed to a gaming magazine published by Sierra On-Line. One issue of the magazine had a brief article about how the father of a young man had modified a joystick to allow his son (who I believe was a quadriplegic) to play and beat the game by moving only his head. I was only nine or ten years old at the time, but the story struck a chord with me. It convinced me that one of the most valuable attributes of modern technology was its usefulness to disabled people. It could help them access experiences and activities that physical or neurological issues might otherwise prevent them from enjoying.

Thirty years later, we’ve come a long way from serial port joysticks with modified inputs. Exoskeletons are slowly moving towards reality. Microsoft has done commendable work designing its new Xbox Adaptive Controller. And in the UK, Dr. Konstantinos Sirlantzis, Senior Lecturer in Intelligent Systems at the University of Kent, is working on a series of AI-enabled wheelchairs with sophisticated obstacle tracking and eye-based guidance.

As PCMag reports, Dr. Sirlantzis is developing a modular system that allows specific components to be integrated into existing wheelchairs, in lieu of building a single one-size-fits-all robotic wheelchair. While it’s a vastly different market, this is actually similar to the approach Microsoft took with its Adaptive Controller: because it was physically impossible to build one controller that could address every scenario, Microsoft focused on providing a solid basic set of capabilities with modular support for a huge range of additional devices. In this case, Dr. Sirlantzis has developed a range of tracking features, including iris, head, and nose tracking, “depending on the user’s changing abilities over time and condition progression.”

The AI integration is intended to provide a wealth of secondary data so the wheelchair can assess the overall state of its user. The goal is to create a system that can provide additional autonomy when the person is fatigued, as well as gather real-time health data and send notifications should he or she need assistance. According to Sirlantzis, the wheelchair should be sophisticated enough to detect obstacles in real time and automatically route around them, with no need for user intervention.
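To make the idea of automatic rerouting concrete, here is a minimal sketch of the kind of reactive obstacle-avoidance policy such a controller might layer on top of user input. This is purely illustrative: the function name, thresholds, and sensor layout are hypothetical assumptions, not details of Dr. Sirlantzis’ actual system, which the article does not describe at this level.

```python
def steer_around_obstacles(ranges, safe_distance=1.0):
    """Hypothetical reactive avoidance policy for an assistive wheelchair.

    ranges: distance readings in meters from a range sensor sweeping
            left to right across the chair's forward arc.
    Returns a (forward_speed, turn) command: positive turn veers right.
    """
    n = len(ranges)
    nearest_left = min(ranges[: n // 2])    # closest obstacle on the left half
    nearest_right = min(ranges[n // 2 :])   # closest obstacle on the right half

    if min(nearest_left, nearest_right) >= safe_distance:
        return (1.0, 0.0)  # path is clear: continue straight ahead

    # An obstacle is too close: slow down and turn toward the more open side.
    turn = 0.5 if nearest_left < nearest_right else -0.5
    return (0.3, turn)
```

A real system would replace this single-scan heuristic with mapping and path planning, but even this simple policy captures the core behavior described above: keep moving when the path is clear, and veer away from obstructions without waiting for user input.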

Other features include the ability to “call” a wheelchair to a specific location, though this doesn’t yet include actual autonomous 3D mapping; that capability is still in development.

The lab working on the autonomous efforts is using the Robot Operating System, coded in C or C++ with a little Python. The organization will present its work at NAIDEX 2019 on March 26-27, billed as “[t]he most established professional and public event dedicated to independent living for people with a disability or impairment.” For more details and videos of the system in action, read PCMag’s full interview with Dr. Sirlantzis.
