Pondering A Remote Robot Lab

Several years ago, I used to run a joint EPSRC & AHRB funded research network, the Creative Robotics Research Network (CRRN). The idea behind the network was to provide a forum for academics and practitioners with an interest in creative applications of robotics to share ideas, experience and knowledge.

We had a lot of fun with the network – the mailing list was active, we hosted several events, and on one network visit to a special effects company, I have a hazy memory of flamethrowers being involved… Erm…

Anyway, last weekend I went to a Raspberry Pi hackday organised by ex-IW resident Dr Lucy Rogers at Robin Hill – site of the Bestival, for any festival goers out there, and currently taking the form of the Electric Woods, an atmospheric woodland sound and light show with a great curry along the way. If you can get on to the Island for half term, make an evening of it…

The event was sponsored by Alec Dabell, owner of Vectis Ventures, who also run the Island’s theme park – Blackgang Chine. (If you’ve ever holidayed on the Island as a child or with kids of your own, you’ll know it… :-) The idea? To play with some tech that can be worked up for controlling Blackgang’s animatronic dinosaurs or the light shows at Robin Hill and Blackgang Chine, as well as learning something along the way. (IBM’s Andy Stanford-Clark, another Island resident, pitched in with a talk on LoRa, a low-power, long-range radio protocol for the internet of things, as well as being on hand to help out those of us getting to grips with Node-RED and MQTT for the first time ;-)
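(For anyone who hasn’t met MQTT before, it’s a lightweight publish/subscribe messaging protocol well suited to flinging short control messages around – switch this light on, make that dinosaur roar. As a flavour of just how lightweight it is, here’s a toy sketch that builds a QoS 0 PUBLISH packet by hand; the topic and payload are made up, and in practice you’d use Node-RED or a client library such as paho-mqtt rather than rolling your own.)

```python
# Toy sketch: hand-roll an MQTT 3.1.1 PUBLISH packet (QoS 0) to show how
# little is on the wire. Topic and payload are invented for illustration.

def encode_remaining_length(n):
    """MQTT's variable-length encoding: 7 bits per byte, MSB = continuation."""
    out = bytearray()
    while True:
        byte, n = n % 128, n // 128
        if n > 0:
            byte |= 0x80  # more length bytes follow
        out.append(byte)
        if n == 0:
            return bytes(out)

def publish_packet(topic, payload):
    """Fixed header (0x30 = PUBLISH, QoS 0), remaining length, topic, payload."""
    topic_bytes = topic.encode("utf-8")
    body = len(topic_bytes).to_bytes(2, "big") + topic_bytes + payload
    return bytes([0x30]) + encode_remaining_length(len(body)) + body

pkt = publish_packet("dinosaur/roar", b"on")
print(len(pkt), pkt.hex())  # the whole message is just 19 bytes
```

Compare that with an HTTP request carrying the same two-byte payload – the overhead difference is why MQTT turns up so often in internet-of-things kit.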

Here’s a clip from a previous event…

Also at the event was another ex-CRRN member, Mat Walker, with his latest creation: Ohbot.

Designed as a desktop “talking head” robot for educational use, the Arduino-controlled Ohbot has seven servos controlling the motion of its head, lips, eyes and eyelids, as well as colour LEDs in the eyes themselves.


Text-to-speech support also provides a good motivation for trying to get the lip synching to work properly. The Ohbot has a surprisingly expressive face, more so even than the remarkably similar one rendered in the simulator that comes as part of the programming environment. With an extra webcam, Ohbot can be programmed to move its head – and eyes – to follow you around the room…
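(As an aside, one crude way of faking lip synch is simply to map the loudness of each chunk of speech audio onto how far the lip servos open. The snippet below is purely illustrative – the function name and servo range are invented, not the actual Ohbot API.)

```python
# Illustrative only: scale an audio chunk's peak amplitude onto a made-up
# lip-servo range. Real lip synching (as in the Ohbot software) is more
# sophisticated, but this amplitude trick is a common first approximation.

def lip_position(amplitude, max_amplitude=32768, servo_range=(0, 10)):
    """Map a 16-bit audio peak amplitude onto a servo opening position."""
    lo, hi = servo_range
    frac = min(abs(amplitude) / max_amplitude, 1.0)
    return round(lo + frac * (hi - lo))

print(lip_position(0))      # silence keeps the mouth shut -> 0
print(lip_position(32768))  # a full-scale peak opens it wide -> 10
print(lip_position(8000))   # -> 2
```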

Needless to say, Ohbot got me thinking… And here’s how…

One of the things being developed in the OU at the moment is a remote engineering lab, part of the wider OpenSTEM lab. The engineering lab, which is being put together by uberhacker Tim Drysdale, should go live to second year equivalent OU engineering students in October next year (I think?) and third year equivalent students the year after.

The lab itself has multiple bays for different physical experiments, with several instances of each experiment so that several students can each have individual access to the same experiment at the same time.

One of the first experiments to be put together is a mechanical pendulum – students can log in to the apparatus, control the motion of the pendulum, and observe its behaviour in real time via a live video feed, as well as data traces from instrumentation applied to the apparatus. One of the things Tim has been working on is getting the latency of the control signals and the video feed right down – and it seems to be looking good.
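(If you want a feel for the sort of physics such an experiment exposes, here’s a toy small-angle pendulum simulation – nothing to do with Tim’s actual rig, all parameters invented – that steps the equation of motion and checks the result against the textbook period T = 2π√(L/g).)

```python
import math

# Toy pendulum: step the equation of motion with semi-implicit Euler and
# time a quarter swing. All parameters are invented for illustration.

def quarter_period(length_m=1.0, g=9.81, theta0=0.05, dt=1e-5):
    """Time for the bob to swing from +theta0 to its first zero crossing."""
    theta, omega, t = theta0, 0.0, 0.0
    while theta > 0:
        omega -= (g / length_m) * math.sin(theta) * dt  # angular acceleration
        theta += omega * dt
        t += dt
    return t

T_sim = 4 * quarter_period()
T_theory = 2 * math.pi * math.sqrt(1.0 / 9.81)  # small-angle approximation
print(f"simulated {T_sim:.3f}s vs theory {T_theory:.3f}s")
```

The fun of the remote lab, of course, is that students get to compare this sort of idealised model against data traces from the real, noisy apparatus.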


Another couple of courses in production at the OU at the moment are two first year equivalent computing courses. The first of these teaches students basic programming using Scratch (I have issues with this, but anyway…); Ohbot also uses a Blockly-style user interface, although it’s currently built just for Windows machines, I think?

Hmmm… as part of the Open Engineering Lab, the OU has bought three (?) Baxter robots, with the intention that students will be able to log in and programmatically control them in real time. I seem to recall there was also some discussion about whether we could run some Lego EV3 robots, perhaps even mobile ones. The problem with mobile robots, of course, is the “activity reset” problem. The remote experimentation lab activities need to run without technician support, which means they need to clear down in a safe way at the end of each student’s activity and reset themselves for the next student to log in to them. With mobile robots, this is an issue. But with Ohbot, it should be a doddle? (We’d probably have to rework the software, but that in turn may be something that could be done in collaboration with the Ohbot guys…)
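(To make the point concrete, here’s a sketch of what a clear-down cycle for a desktop robot bay might look like; the servo names and home positions are all invented. The point is that a fixed desktop head can always just drive back to a known pose, whereas a mobile robot might have ended up anywhere in the arena.)

```python
# Sketch of the "activity reset" idea: between student sessions the bay
# drives everything back to a known safe pose, with no technician needed.
# Servo names and home positions are invented for illustration.

HOME_POSE = {
    "head_turn": 5, "head_nod": 5,
    "eye_turn": 5, "eye_tilt": 5,
    "lid_blink": 0, "top_lip": 0, "bottom_lip": 0,
}

class RobotBay:
    def __init__(self):
        self.pose = dict(HOME_POSE)
        self.in_use = False

    def start_session(self):
        self.reset()       # always hand over a known state
        self.in_use = True

    def end_session(self):
        self.in_use = False
        self.reset()       # clear down safely, unattended

    def reset(self):
        # A desktop head just re-homes its servos; this is what makes the
        # reset problem easy compared with a free-roaming mobile robot.
        self.pose = dict(HOME_POSE)

bay = RobotBay()
bay.start_session()
bay.pose["head_turn"] = 9      # student waggles the head about...
bay.end_session()
print(bay.pose == HOME_POSE)   # -> True
```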

Keenly priced at under a couple of hundred squids, with sensors, I can easily imagine a shelf with eight or so Ohbot bays providing an interactive remote robot programming activity for our first year computing students, as well as our engineering students. The question is, can I persuade anyone else that this might be worth exploring..?