Researchers at Stanford University have developed a robotic ‘smart’ cane that guides people with visual impairments using technology originally developed for autonomous vehicles.
Most sensor canes use ultrasound to notify a user that an object is directly in front of or above them.
But the team at Stanford’s Intelligent Systems Laboratory equipped their augmented cane with a LIDAR sensor, a laser-based technology used in some self-driving cars to measure the distance of nearby obstacles.
The cane also incorporates smartphone-style GPS, accelerometers, magnetometers, and gyroscopes to monitor a user’s position, orientation, speed and direction.
A motorized, omnidirectional wheel on the bottom tip maintains contact with the ground and gently tugs and nudges users around impediments.
But the cane uses AI-generated algorithms to steer the user toward a goal — like a coffee shop or subway — rather than just away from an obstacle.
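The paper doesn't spell out the steering algorithm, but one common way to combine "pull toward a goal" with "push away from obstacles" is a potential-field planner. The sketch below is a minimal, hypothetical illustration of that idea, not the Stanford team's actual code: the goal attracts, nearby obstacles repel, and the combined vector is the direction the motorized wheel would nudge the user.

```python
import math

def steer(position, goal, obstacles, repel_radius=1.5):
    """Illustrative potential-field steering: blend attraction to the
    goal with repulsion from obstacles (all 2-D (x, y) points, metres)
    into a unit direction for the wheel to nudge toward."""
    # Attractive pull: unit vector from the user toward the goal.
    gx, gy = goal[0] - position[0], goal[1] - position[1]
    dist = math.hypot(gx, gy) or 1e-9
    vx, vy = gx / dist, gy / dist

    # Repulsive push from each obstacle inside the safety radius,
    # growing stronger the closer the obstacle is.
    for ox, oy in obstacles:
        dx, dy = position[0] - ox, position[1] - oy
        d = math.hypot(dx, dy)
        if 0 < d < repel_radius:
            weight = (repel_radius - d) / repel_radius
            vx += weight * dx / d
            vy += weight * dy / d

    norm = math.hypot(vx, vy) or 1e-9
    return vx / norm, vy / norm
```

With no obstacles the result points straight at the goal; an obstacle just off the path bends the heading away from it while still making forward progress, which matches the "gentle guidance" behavior described above.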
‘We want the humans to be in control but provide them with the right level of gentle guidance to get them where they want to go as safely and efficiently as possible,’ Mykel Kochenderfer, an aeronautics professor and expert in aircraft collision-avoidance systems, said in a statement.
Weighing just three pounds, the augmented cane can be made at home using free open-source software, a downloadable parts list, and DIY soldering instructions, keeping the price to roughly $400.
Other high-tech canes can weigh nearly 50 pounds and cost more than $6,000.
‘We wanted something more user-friendly than just a white cane with sensors,’ Patrick Slade, of Stanford’s Intelligent Systems Laboratory, said in the statement.
‘Something that can not only tell you there’s an object in your way, but tell you what that object is and then help you navigate around it.’
The researchers tested their prototype with both volunteers from the Palo Alto Vista Center for the Blind and Visually Impaired and sighted participants who were blindfolded.
They were all asked to complete everyday navigation tasks, like walking down a hallway, moving from one place to another outside, and sidestepping obstacles.
The cane increased the walking speed for blind participants by roughly 20 percent over using a standard folding cane alone.
For sighted people wearing blindfolds, the results were more impressive, increasing their speed by more than a third.
They’re making the device simple, affordable and open-source ‘for ease of replication and cost,’ Kochenderfer said.
‘Anyone can go and download all the code, bill of materials, and electronic schematics — all for free.’
Kochenderfer says more experiments and engineering are needed to make the augmented cane ready for everyday use, and he hopes an industry partner will step forward.
The results of the trials appeared in the journal Science Robotics.
An estimated 285 million people are visually impaired worldwide, according to the World Health Organization, with 90 percent in developing countries — including 12 million in India alone.
In the last decade or so, the traditional white folding cane has gotten a boost from technology: The SmartCane from Indian start-up AssisTech helps visually impaired users avoid hazards both on the ground and in the air using ultrasound to detect obstacles up to 10 feet away.
Attaching to the top of a standard folding white cane, it emits ultrasound waves and vibrates when they bounce back, warning of an obstacle ahead.
Different patterns and intensities let the user know how far away an object is, and its sensors can detect an item as small as one inch.
‘Blindness is not just a medical condition but possesses the larger dimensions of social exclusion, stigma, and neglect,’ SmartCane developer Rohan Paul said in 2014.
‘Blind people are often taken by surprise by over-hanging branches, protruding air-conditioners, and parked vehicles while navigating through unfamiliar terrain.’
Thomas Panek, who is blind, asked Google to create an app that would help him run independently; he uses the resulting Project Guideline app via a phone attached to a harness.
Another device, the UltraCane, uses similar technology: It comes equipped with a dual-range, narrow-beam ultrasound system that projects a hazard protection area in front of the user.
Entrepreneurial engineer Paul Clark has even devised the UltraBike, an ultrasound sensor that can be fitted on the handlebars of any standard bicycle to give the rider constant directional feedback about obstacles ahead and at each side.
When something comes near, buttons beneath the appropriate thumb begin to vibrate.
In 2020, Google announced it was testing a new app that would allow blind people to run on their own without a guide dog or human assistant.
Project Guideline uses a phone’s camera to track a guideline on a course and then sends audio cues to the user via bone-conducting headphones. If the runner strays too far from the center, the sound will get louder on whichever side they’re favoring.
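That drift-correcting audio cue amounts to panning a tone between the left and right channels based on how far the runner has strayed from the line's center. The sketch below is a hypothetical illustration of that logic; the gain curve is an assumption, not Google's actual Project Guideline tuning.

```python
def audio_cue(offset_m, max_offset_m=1.0, base_gain=0.2):
    """Illustrative stereo cue: as the runner drifts from the guideline's
    center, the channel on the side they are favoring gets louder.
    offset_m < 0 means drifting left; > 0 means drifting right.
    Returns (left_gain, right_gain) in the range 0.0..1.0."""
    # Clamp the drift to the calibrated maximum offset.
    drift = max(-1.0, min(1.0, offset_m / max_offset_m))
    # Raise only the channel on the side being favored.
    left = base_gain + (1.0 - base_gain) * max(0.0, -drift)
    right = base_gain + (1.0 - base_gain) * max(0.0, drift)
    return round(left, 2), round(right, 2)
```

Centered on the line, both channels stay at the quiet base level; veering half a metre to one side roughly triples the volume in that ear, prompting the runner to correct course.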