Sometime around 2002-3, when we first ran the short course “Robotics and the Meaning of Life”, my colleague Jon Rosewell developed a drag-and-drop, text based programming environment – RobotLab – that could be used to programme the behaviour of a Lego Mindstorms/RCX brick controlled robot, or a simulated robot in an inbuilt 2D simulator.
The environment was also used in various OU residential schools, although an update in 2016 saw us move from the old RCX bricks to Lego EV3 robots, and with it a move to the graphical Lego/LabVIEW EV3 programming environment.
RobotLab is still used in the introductory robotics unit of a level 1 undergrad OU course, a unit that is itself an update of the original Robotics and the Meaning of Life short course. And although the software is largely frozen – the screenshot below shows it running under Wineskin on my Mac – it continues to do the job admirably:
- the environment is drag and drop, to minimise errors, but uses a text based language (inspired originally by Lego scripting code, which it generated to control the actual RCX powered robots);
- the simulated robot could be configured by the user, with noise added to sensor inputs and motor outputs if required, and custom background images could be loaded into the simulator;
- a remote control panel could be used to control the behaviour of the real – or simulated – robot, providing simple tele-operation. A custom remote application for the residential school allowed a real robot to be controlled via the remote app, with a delay in the transmission of the signal that could be used to model the signal flight time to a robot on the moon! The RobotLab remote app provided a display showing the current power level of each motor, as well as the values of any sensor readings;
- the RobotLab environment allowed instructions to be stepped through one at a time, in order to support debugging;
- a data logging tool allowed real or simulated logged data to be “uploaded” and viewed as a time series line chart.
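As an aside, the signal-delay trick in the residential school remote app comes down to simple arithmetic: a one-way trip to the Moon at light speed takes roughly 1.3 seconds. A minimal Python sketch (the figures and function names here are mine, not RobotLab's):

```python
# Sketch of the "moon delay" idea: commands sent to a lunar robot
# arrive one signal flight time later. Distances are approximate.

MOON_DISTANCE_KM = 384_400        # mean Earth-Moon distance
LIGHT_SPEED_KM_S = 299_792.458    # speed of light

def one_way_delay_s(distance_km=MOON_DISTANCE_KM):
    """One-way signal flight time, in seconds."""
    return distance_km / LIGHT_SPEED_KM_S

def delayed_commands(commands, sent_at, delay_s=None):
    """Pair each command with its arrival time at the robot."""
    if delay_s is None:
        delay_s = one_way_delay_s()
    return [(t + delay_s, cmd) for t, cmd in zip(sent_at, commands)]

print(round(one_way_delay_s(), 2))  # ~1.28 s each way
```

A round trip (command out, telemetry back) doubles that, which is what makes "live" tele-operation of a lunar robot feel so laggy.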
Time moves on, however, and we’re now starting to think about revising the robotics material. We had started looking at an updated, HTML5 version of RobotLab last year, but that effort seems to have stalled. So I’ve started looking for an alternative.
Robot Operating System (ROS)
Following on from a capital infrastructure bid a couple of years ago, we managed to pick up a few Baxter robots that are intended to be used in the “real, remote experiment” OpenSTEM Lab. (Baxter is also demoed at the level 1 engineering residential school.) Baxter runs under ROS, the Robot Operating System, and can be programmed using Python. Both 2D and 3D simulators are available for ROS, which means we could go down the ROS route as the programming environment for any revised level 1 course.
At this point, it’s also worth saying that the level 1 course is something of a Frankenstein course, including introductory units on Linux and networking as well as robotics. If the course is rewritten, the current idea is to replace the Linux unit with one on operating systems more generally. This means we could try to create a set of units that complement each other, yet also stand alone as separate modules. For example, ROS works on a client-server model, which allows us to foreshadow, or counterpoint, ideas arising in the other units.
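The networked design here boils down to ROS nodes registering with a central master and then exchanging messages over named topics. The pattern itself can be shown in a few lines of plain Python; to be clear, this toy broker illustrates publish/subscribe in general, it is not the rospy API:

```python
# Toy illustration of the topic-based publish/subscribe pattern used by
# ROS nodes; plain Python, not rospy. Topic names below are invented.
from collections import defaultdict

class Broker:
    """Routes messages from publishers to topic subscribers."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Deliver to every callback registered on this topic;
        # a topic with no subscribers is silently a no-op.
        for callback in self.subscribers[topic]:
            callback(message)

broker = Broker()
log = []
broker.subscribe("/robot0/cmd_vel", log.append)
broker.publish("/robot0/cmd_vel", {"linear": 0.5, "angular": 0.0})
print(log)  # [{'linear': 0.5, 'angular': 0.0}]
```

The decoupling is the point: the node publishing velocity commands neither knows nor cares whether the subscriber is a real robot, a simulator, or both.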
The relative complexity of the ROS environment means that we can also use it to motivate the idea of using virtual machines for running scientific software with complex dependencies and rather involved installation processes. However, from a quick look at the ROS tools, they do look rather involved for a first intro course and I think would require quite a bit of wrapping to hide some of the complexity.
If you’re interested, here’s a quick rundown of what needs adding to a base Ubuntu 16.04 install:
```shell
sudo apt-get update
sudo apt-get install -y software-properties-common
sudo apt-add-repository multiverse
sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu $(lsb_release -sc) main" > /etc/apt/sources.list.d/ros-latest.list'
sudo apt-key adv --keyserver hkp://ha.pool.sks-keyservers.net:80 --recv-key 421C365BD9FF1F717815A3895523BAEEB01FA116
#If that address doesn't work:
sudo apt-key adv --keyserver hkp://pgp.mit.edu:80 --recv-key 421C365BD9FF1F717815A3895523BAEEB01FA116
sudo apt-get update
sudo apt-get install ros-kinetic-ros-base
sudo rosdep init
rosdep update
echo "source /opt/ros/kinetic/setup.bash" >> ~/.bashrc
sudo apt-get install -y ros-$ROS_DISTRO-stdr-simulator
sudo apt-get install -y ros-$ROS_DISTRO-teleop-twist-keyboard

##To run simulator:
#roslaunch stdr_launchers server_with_map_and_gui_plus_robot.launch
##To run keyboard remote:
#rosrun teleop_twist_keyboard teleop_twist_keyboard.py cmd_vel:=robot0/cmd_vel

#sudo sh -c 'echo "deb http://packages.osrfoundation.org/gazebo/ubuntu-stable `lsb_release -cs` main" > /etc/apt/sources.list.d/gazebo-stable.list'
#wget http://packages.osrfoundation.org/gazebo.key -O - | sudo apt-key add -
#sudo apt-get update
#sudo apt-get install gazebo7 libgazebo7-dev
#sudo apt-get -y install python-rosdep python-catkin-tools python-wstool
#mkdir -p ~/ws_baxter/src
#cd ~/ws_baxter/src
#wstool init .
#wstool merge https://raw.githubusercontent.com/vicariousinc/baxter_simulator/kinetic-gazebo7/baxter_simulator.rosinstall
#wstool update
#rosdep install -y --from-paths . --ignore-src --rosdistro kinetic --as-root=apt:false
#catkin config --extend /opt/ros/kinetic --cmake-args -DCMAKE_BUILD_TYPE=Release
#catkin build
##Need to modify bash.sh
#roslaunch baxter_gazebo baxter_world.launch
#rosrun baxter_examples joint_velocity_wobbler.py
##Simple keyboard-based control demo:
#rosrun baxter_examples joint_position_keyboard.py
```
The simulator and the keyboard remote need launching from separate terminal processes.
A rather more friendly environment we could work with is the Blockly-based RobertaLab. Scratch is already being used in one of the new first level courses to introduce basic programming, of a sort, so the blocks-style environment is one that students will see elsewhere, albeit in a rather simplistic fashion. (I’d argued for using BlockPy instead, but the decision had already been taken…)
In keeping with complementing the operating systems unit, we could use a Docker containerised version of RobertaLab to allow students to run RobertaLab on their own machines:
A couple of other points to note about RobertaLab. Firstly, it has a simulator, with several predefined background environments. (I’m not sure if new ones can be easily uploaded, but if not the repo can be forked and new ones added from source.)
As with BlockPy, there’s the ability to look at the Python script generated from the blocks view. With the EV3dev selection, Python code compatible with the ev3dev library is generated. Other programming environments are available too, each generating different code; for example, the EV3 leJOS selection will generate Java code.
This ability to see the code behind the blocks is a nice stepping stone towards text based programming, although, unlike BlockPy, I don’t think you can go back from the code to the blocks? The ability to generate code for different languages from the same blocks programme could also be used to make a powerful point, I think?
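That one-blocks-programme, many-target-languages idea can be sketched in a few lines. The mini block list and the two generators below are invented purely for illustration; they have nothing to do with RobertaLab's actual internals:

```python
# Toy blocks-to-code generation: one "blocks programme" (a list of
# (operation, argument) pairs), two target languages. Block names and
# the robot.drive() API are made up for this sketch.

def gen_python(blocks):
    lines = []
    for op, arg in blocks:
        if op == "drive":
            lines.append(f"robot.drive(speed={arg})")
        elif op == "wait":
            lines.append(f"time.sleep({arg})")
    return "\n".join(lines)

def gen_java(blocks):
    lines = []
    for op, arg in blocks:
        if op == "drive":
            lines.append(f"robot.drive({arg});")
        elif op == "wait":
            # Java's Thread.sleep takes milliseconds
            lines.append(f"Thread.sleep({int(arg * 1000)});")
    return "\n".join(lines)

programme = [("drive", 50), ("wait", 1.5), ("drive", 0)]
print(gen_python(programme))
print(gen_java(programme))
```

The powerful point falls out directly: the blocks capture the program's logic, and the surface syntax of any particular language is just a rendering of it.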
Whether or not we go with the robotics unit rewrite, I think I’ll try to clone the current RobotLab activities using the RobertaLab environment, and also add some extra value by commenting on the Python code that is generated.
By the by, here’s a clone of the Dockerfile used by exmatrikulator for building the RobertaLab container image:
```dockerfile
#Base image assumed - the FROM line isn't shown in the original
FROM ubuntu:16.04

RUN apt update && apt install -y avrdude avr-libc binutils-avr default-jdk gcc-arm-none-eabi gcc-avr gdb-avr git libssl-dev libusb-0.1-4 maven nbc phantomjs python-pip srecord unzip wget && apt-get clean
RUN pip install uflash
RUN wget -q https://github.com/OpenRoberta/robertalab/archive/master.zip && \
    unzip master.zip && \
    cd robertalab-master && \
    mvn clean install
RUN /robertalab-master/ora.sh --createemptydb OpenRobertaServer/db-2.2.0/openroberta-db
CMD ["/robertalab-master/ora.sh", "--start-from-git"]
```
In the original robotics unit, one of the activities looked at how a simple neural network could be trained using data collected by the robot. I’m not sure whether Dale Lane’s Machine Learning for Kids application, which also looks to be written using Blockly/Scratch, could be integrated with Open RobertaLab; but even if it can’t, it would perhaps make a nice complement both to the use of OpenRobertaLab to control a simple simulated robot, and to the use of Scratch in the other level 1 module as an environment for building simple games as a way of motivating the teaching of basic programming concepts.
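For reference, the kind of simple neural network that activity gestured at can be boiled down to a single perceptron; the "light sensor" readings and labels below are invented for illustration, not data from the actual unit:

```python
# A single perceptron trained on simulated light-sensor readings:
# low readings mean the robot is over a dark line (label 1),
# high readings mean it is off the line (label 0). Data is made up.

def train_perceptron(samples, labels, lr=0.1, epochs=20):
    """Classic perceptron update rule on one input feature."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if w * x + b > 0 else 0
            w += lr * (y - pred) * x
            b += lr * (y - pred)
    return w, b

readings = [10, 20, 30, 35, 60, 70, 80, 90]   # raw sensor values
labels   = [1, 1, 1, 1, 0, 0, 0, 0]           # 1 = over the line
w, b = train_perceptron([r / 100 for r in readings], labels)

def classify(reading):
    return 1 if w * (reading / 100) + b > 0 else 0

print([classify(r) for r in [25, 75]])  # [1, 0]
```

The training data could just as easily be logged from the simulated robot, which is what made the original activity such a neat bridge between robotics and machine learning.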
No-one’s interested in Jupyter notebooks, but folk seem to think Scratch is absolutely bloody lovely for teaching programming to novice ADULTS, so I might as well go with them in adopting that style of interface…