Up till now I’ve been operating under the assumption that my dissertation would be about modeling power production from renewable sources. As interesting as that topic is, a rather unconventional idea I had last week just hasn’t let me go. It would mean a complete change of direction. Yesterday I had the chance to drive to Marburg and talk it over with one of my mentors, and he’s sold on the idea. To give in to the sweet temptation of pirate-speak, he said aye, robot 🙂
I was really happy that my professor liked the idea, because, frankly, it’s more than a hop and a skip away from the previous topic, and at least a good stone’s throw away from what people normally research at our institute. But it involves remote sensing and flying robots, so honestly, how could anyone not think that’s cool? Plus, as one of the project’s mentors, I’m pretty sure he’s banking on the chance to fly the thing once or twice.
The new project is one that I’ve been planning for quite some time, albeit without a dissertation in mind. I’m calling it EyeRobot, because it will be about visual remote sensing with robots, and it will be part of the Apis electronica library.
But Daniel, you still haven’t said what Apis electronica is!
Apis electronica is a library of piloting software intended for use with unmanned aerial vehicles. I am aware that there are a lot of good piloting libraries out there, but to my knowledge they’re all made for individual flight platforms. Apis’ goal is not to replace them, but rather to provide a unified API that gets out of the user’s way, translating their control requests into commands the drone they’re working with understands so that they don’t have to worry about how that drone flies.
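To make the idea of a unified API concrete, here is a minimal sketch of how such a translation layer could look. All names (`DroneAdapter`, `Pilot`, and so on) are hypothetical illustrations of the adapter pattern, not the actual Apis electronica interface:

```python
from abc import ABC, abstractmethod

class DroneAdapter(ABC):
    """Hypothetical per-platform backend: translates generic control
    requests into the command set of one specific drone."""

    @abstractmethod
    def takeoff(self, altitude_m: float) -> None: ...

    @abstractmethod
    def goto(self, lat: float, lon: float, altitude_m: float) -> None: ...

class QuadcopterAdapter(DroneAdapter):
    """Example backend that simply records the low-level commands
    it would send to the flight controller."""
    def __init__(self):
        self.sent = []
    def takeoff(self, altitude_m):
        self.sent.append(f"TAKEOFF {altitude_m}")
    def goto(self, lat, lon, altitude_m):
        self.sent.append(f"GOTO {lat} {lon} {altitude_m}")

class Pilot:
    """User-facing API: callers issue high-level requests and never
    touch the platform-specific command set."""
    def __init__(self, adapter: DroneAdapter):
        self.adapter = adapter
    def survey_point(self, lat, lon, altitude_m):
        self.adapter.takeoff(altitude_m)
        self.adapter.goto(lat, lon, altitude_m)

drone = QuadcopterAdapter()
Pilot(drone).survey_point(50.81, 8.77, 30.0)
print(drone.sent)  # ['TAKEOFF 30.0', 'GOTO 50.81 8.77 30.0']
```

Supporting a new flight platform would then only mean writing a new adapter; everything built on top of `Pilot` keeps working unchanged.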
That sounds cool. So how does EyeRobot fit into the picture?
EyeRobot will take care of the remote sensing part. Technologically, we’re ready to move beyond using satellites and airplanes for aerial surveying and to begin including smaller, more flexible, unmanned vehicles. However, these smaller platforms bring a lot of challenges. For example, they will often fly where little information has been collected previously. The sensor accuracy will in many cases be lower than on larger platforms. The viewing angle will be more difficult to determine, especially if you need it georeferenced, which in turn makes it harder to merge separate scans. The currently short flight times of small drones make this last issue especially relevant.
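A toy calculation shows why the viewing angle matters so much for georeferencing. This is only an illustrative flat-terrain model (the function name and simplifications are my own; real pipelines must account for roll, yaw, lens geometry, and terrain elevation):

```python
import math

def ground_offset(altitude_m: float, pitch_deg: float) -> float:
    """Horizontal distance from the drone's nadir point to the spot the
    camera centre sees, assuming perfectly flat terrain. The pitch is
    measured from straight down (0 deg = looking at the nadir)."""
    return altitude_m * math.tan(math.radians(pitch_deg))

# At 100 m altitude, a camera tilted 45 degrees from nadir
# looks at a point 100 m ahead of the drone:
print(round(ground_offset(100.0, 45.0), 1))  # 100.0

# A mere 1-degree error in the measured pitch already shifts the
# georeferenced point by several metres:
print(round(ground_offset(100.0, 46.0) - ground_offset(100.0, 45.0), 2))
```

Errors of this size in every scan are exactly what makes merging separate passes over the same area difficult.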
I think the time has come for drones to view, analyze, and understand their environment, so I’m really excited about the research I’ll be doing over the next couple of years with the University of Marburg.