Keeping Track of TM129 Robotics Block Practical Activity Updates…

Not being the sort of person to bid for projects that require project plans, progress reports, end-of-project reports, administration, and funding that you have to spend on things that don’t actually help, I tend to use this blog to keep track of things I’ve done in the form of blog posts, as well as links to summary presentations of work in progress (there is never anything other than work in progress…).

But I’ve not really been blogging as much as I should, so here are a couple of links to presentations I gave last week relating to the TM129 update:

  • Introducing RoboLab: an integrated robot simulator and Jupyter notebook environment for teaching and learning basic robot programming: this presentation relates to the RoboLab environment I’ve been working on, which integrates a Javascript robot simulator based on ev3devsim, wrapped in a jupyter_proxy_widget, with Jupyter notebook-based instructional material. RoboLab makes heavy use of Jupyter magics to control the simulator, download programs to it, and retrieve logged sensor data from it. I think it’s interesting, but no-one else seems to. I had to learn a load of stuff along the way: Javascript, HTML and CSS not least among them.
  • Using Docker to deliver virtual computing environments (VCEs) to distance education students: this represents some sort of summary of my thinking around delivering virtualised software to students in Docker containers. We’ve actually been serving containers via Kubernetes on Azure since Spring 2018, using LTI-authed links from the Moodle VLE to launch temporary Jupyter notebook servers via an OU hosted JupyterHub server, and we shipped the TM351 VCE (virtual computing environment) to TM351 students this October, with TM129 students now starting to access their Dockerised VCE, which also bundles all the practical activity instructional notebooks. I believe the institution is looking to run a "pathfinder" project (?) on making containerised environments available to students in October 2021. #ffs Nice to know when your work is appreciated, not… At least someone will make some internal capital in promotion and bonus rounds from that groundbreaking pathfinder work, via OU internal achievement award recognition, in 2022.
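By way of illustration of the magic-based plumbing mentioned above (this is a sketch of the general pattern, not RoboLab’s actual code — the function name, message fields and defaults are all my own invention), a cell magic essentially just takes the text of a notebook cell and hands it off somewhere; in this case, it could package the cell’s robot program as a JSON message for the browser-side simulator widget:

```python
import json

# Hypothetical sketch: package the code from a notebook cell as a JSON
# message that a front-end simulator widget could consume. In a real
# widget this would be sent over the Jupyter comms channel; here we
# just build and inspect the payload.

def sim_magic(cell_code, robot="simple", background="grid"):
    """Package cell code as a 'download program' message for a simulator."""
    return json.dumps({
        "command": "load_program",   # tell the widget to load a new program
        "program": cell_code,        # the user's robot control code
        "config": {"robot": robot, "background": background},
    })

msg = sim_magic("motors.on_for_rotations(50, 50, 4)")
payload = json.loads(msg)
print(payload["command"])  # → load_program
```

The same channel can work in reverse, with the widget posting logged sensor readings back to the kernel for analysis in the notebook.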

The TM129 materials also bundle some neural network / MLP / CNN activities that I intend to write up in a similar way at some point (next week maybe; but the notes take f****g hours to write…). I think some of the twists in the way the material is presented are quite novel, but then, wtf do I know.

There are also bits and bobs I explored relating to embedding audio feedback into RoboLab, which I thought might aid accessibility as well as provide a richer experience for all users. You’d have thought I might be able to find someone, anyone, in the org who might be interested in bouncing more ideas around that (we talk up our accessibilitiness(?!)), or maybe in putting me straight about why it’s a really crappy and stupid thing to do, but could I find a single person willing to engage on that? Could I f**k…
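To give a flavour of the sort of thing I mean (a minimal sketch, not what RoboLab actually does — the mapping and all the names are assumptions of mine): one simple form of audio feedback is sonifying logged sensor values, say mapping a light sensor reading to the pitch of a short tone. The standard library is enough to build the WAV in memory, and in a notebook it could then be played back with something like IPython.display.Audio:

```python
import io
import math
import struct
import wave

# Illustrative sketch only: map a 0-100 sensor reading to the pitch of a
# short sine tone, returned as in-memory WAV bytes (mono, 16-bit).

def sensor_tone(sensor_value, duration=0.2, rate=8000):
    """Map a 0-100 sensor reading to a 220-880 Hz tone as WAV bytes."""
    freq = 220 + (sensor_value / 100) * 660  # low reading = low pitch
    n_frames = int(duration * rate)
    frames = b"".join(
        struct.pack("<h", int(32767 * math.sin(2 * math.pi * freq * i / rate)))
        for i in range(n_frames)
    )
    buf = io.BytesIO()
    with wave.open(buf, "wb") as w:
        w.setnchannels(1)   # mono
        w.setsampwidth(2)   # 16-bit samples
        w.setframerate(rate)
        w.writeframes(frames)
    return buf.getvalue()

tone = sensor_tone(75)  # a brighter reading gives a higher-pitched beep
```

A screen-reader user (or anyone not watching the simulator canvas) could then hear the robot crossing from light to dark, rather than having to read a sensor data chart after the run.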

In passing, I note I ranted about TM129 last year (Feb 2019) in I Just Try to Keep On Keeping On Looking at This Virtual(isation) Stuff…. Some things have moved on, some haven’t. I should probably do a reflective thing comparing that post with the things I ended up tinkering with as part of the TM129 Robotics block practical activity update, but then again, maybe I should go read a book or listen to some rally podcasts instead…

Author: Tony Hirst

I'm a Senior Lecturer at The Open University, with an interest in #opendata policy and practice, as well as general web tinkering...
