Over the last few weeks, I’ve been working on my newest project – a device for the visually impaired that makes it possible to sense one’s surroundings without seeing or touching them. The result is a device that I have already field-tested while on vacation. It’s no walk in the park to walk through the park relying only on sensors tethered to a miniature computer and controlled by software you programmed yourself, but it does work. This first of a series of devices is now available as open hardware controlled by open source software, for your nonviewing pleasure. Ladies and gentlemen, I present to you the UltraPilot Mark I.
I don’t want to post redundantly, so I’ll start out with a link to the project on GitHub. There you can find the code for a new device that senses your surroundings without using your eyes, along with pictures and instructions on how to build it.
If you’re asking yourself how I came to be building this device, you’re not alone – it’s not exactly my area of expertise. In fact, I didn’t set out to build it at all. It started as a side effect of my planning for Apis electronica, my dissertation project.
A main goal of my dissertation is the automation of unmanned aerial vehicle mapping, which means that the drone needs to be able to avoid obstacles in the air and make navigation decisions in flight. Using visual data to do so is a fine thing, but for a system to be robust enough to actually use in the field, the critical navigation has to keep running even if something happens to one set of sensors (i.e. the camera(s)). You might also want the drone to fly in situations where visual information isn’t the most reliable source – in fog, in low light, and so on. So at a minimum, you need a second set of sensors that work on a different principle and are mounted elsewhere on the drone itself.
This got me thinking about ultrasound. Ultrasound is a pretty reliable ranging method, and it works independently of the lighting situation. One problem, though, is that it has a pretty limited range. Using ultrasound alone, the drone would only be able to recognize obstacles from up close, and it would have to navigate around them using only the information it could gather in its immediate vicinity. As I designed the navigation algorithm, I realized that the drone would be moving much like a visually impaired person – if that person were able to move freely in all three dimensions.
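To give a sense of the numbers involved: an ultrasonic sensor emits a pulse and times the echo, so distance is just half the round-trip time multiplied by the speed of sound. A minimal sketch of that conversion (the 4 m figure is the typical upper limit of common hobbyist sensors such as the HC-SR04, not a measurement from my device):

```python
# Ultrasonic ranging: distance = (round-trip echo time * speed of sound) / 2.

SPEED_OF_SOUND_M_S = 343.0  # dry air at roughly 20 degrees C

def echo_to_distance_m(round_trip_s: float) -> float:
    """Convert an echo round-trip time in seconds to a distance in metres."""
    return round_trip_s * SPEED_OF_SOUND_M_S / 2.0

# An obstacle at ~4 m (about the limit of cheap hobbyist sensors)
# corresponds to a round trip of only ~23 ms:
print(round(echo_to_distance_m(0.0233), 2))  # → 4.0
```

That short range is exactly why a drone using only ultrasound would have to navigate from its immediate vicinity.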
I called a friend who is blind and asked how she does it – whether there were any tricks I could borrow for a navigation algorithm, and what remote sensing options blind people have. I was shocked to discover that, although there are a number of portable navigation devices for the visually impaired, almost all of them require additional tools. I looked it up, and she was right: there just don’t seem to be many such devices on the market, so even a blind person who uses an electronic aid probably still needs a cane or another form of assistance.
So I built a proof-of-concept device – an ultrasonic sensor connected to a Raspberry Pi, which measured distances in the direction the sensor was pointed and reported them back to the user through a vibrotactor. After showing it to a few blind acquaintances and using it myself in a variety of circumstances, I was satisfied that the device had a lot of potential.
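The core of such a loop is just a mapping from measured distance to vibration strength. Here is a minimal sketch of one plausible mapping – the thresholds, the linear curve, and the idea of driving the vibrotactor with a PWM duty cycle are my illustrative assumptions, not the parameters actually used in the UltraPilot code:

```python
# Sketch: map a measured distance to a vibration intensity in [0.0, 1.0].
# On a Raspberry Pi, this intensity could drive the vibrotactor as a PWM
# duty cycle; the GPIO wiring is omitted here. All thresholds are
# hypothetical values for illustration.

MIN_DIST_M = 0.2   # at or below this distance: vibrate at full strength
MAX_DIST_M = 4.0   # at or beyond this distance: no vibration

def distance_to_intensity(distance_m: float) -> float:
    """Closer obstacles produce stronger vibration, falling off linearly."""
    if distance_m <= MIN_DIST_M:
        return 1.0
    if distance_m >= MAX_DIST_M:
        return 0.0
    return (MAX_DIST_M - distance_m) / (MAX_DIST_M - MIN_DIST_M)

# Halfway through the working range gives half-strength vibration:
print(distance_to_intensity(2.1))  # → 0.5
```

In a real device the loop would repeatedly trigger the sensor, convert the echo time to a distance, and update the motor's duty cycle – but even this simple monotone mapping is enough to walk toward or away from an obstacle by feel.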
I was able to navigate without using my eyes through an unfamiliar city to the train station and back, as well as to various places around town. It was hard – but then again, walking around blindfolded is hard no matter what kind of aid you use. I have gained new respect for all my visually impaired friends yet again!
But it’s still not finished. I now have a lot of ideas for improving it, and I’m hopeful that the UltraPilot will someday make a tangible difference in people’s lives. This is just the proof of concept, but it already shows that a truly useful electronic aid for the blind can be produced on a low budget.
Videos and updates on the planned Mark II and Mark III to come.