Imagining a Local Open Computing Lab Server (lOCL)

In the liminal space between sleep and wakefulness of earlier this morning, several pieces of things I’ve been pondering for years seemed to come together:

  • from years ago, digital application library shelves;
  • from months ago, a containerised Open Computing Lab (now very much deprecated; things have moved on…);
  • from the weekend, trying to get TM129 and TM351 software running in containers on a Raspberry Pi 400;
  • from yesterday, a quick sketch with Portainer.

And today, the jigsaw assembled itself in the form of a local Open Computing Lab environment.

This combines a centrally provided, consistently packaged approach to the delivery of self-contained virtualised computational environments (VCEs), in the form of Docker containerised services and applications accessed over HTTP, with a self-hosted environment (running locally or remotely) that provides discovery, retrieval and deployment of those VCEs.

At the moment, for TM351 (live in October 2020), students on the local install route (we also have a hosted offering) follow this model:

  • download and install Docker
  • from the command line, pull the TM351 VCE
  • from the command line, start up the VCE and then access it from the browser.
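As a sketch, that command-line route amounts to something like the following, wrapped as a shell function so the steps read as one unit. The image name, container name and port mapping are illustrative assumptions, not the official course values:

```shell
# Sketch of the TM351 local install route; image name, container name
# and host port are illustrative, not the official course values.
tm351_vce_up() {
  # Pull the VCE image from a Docker registry
  docker pull example/tm351-vce:current
  # Start it, mapping the Jupyter notebook port onto localhost:8351
  docker run -d --name tm351vce -p 8351:8888 example/tm351-vce:current
  # The notebook UI is then available at http://localhost:8351,
  # using the login token printed in the container logs:
  docker logs tm351vce
}
```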

In TM129, there is a slightly simpler route available:

  • download and install Docker
  • from the command line, pull the TM129 VCE
  • using the graphical Docker Dashboard (Windows and Mac), launch and manage the container.

The difference arises from the way a default Jupyter login token is set in the TM129 container, whereas a random token is generated in the TM351 VCE, which makes logging in trickier. At the moment, we can’t set the token (as an environment variable) via the Docker Dashboard, so getting into the TM351 container is fiddly. (Easily done, but another step that requires fiddly instructions.)
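If an environment variable can be passed in at start-up, the random-token problem goes away. Jupyter containers derived from the official jupyter/docker-stacks images honour a `JUPYTER_TOKEN` environment variable; whether the TM351 VCE does the same is an assumption here, as is the image name:

```shell
# Sketch: set a known Jupyter login token at container start-up.
# JUPYTER_TOKEN is honoured by jupyter/docker-stacks derived images;
# whether the TM351 VCE respects it, and the image name, are assumptions.
tm351_vce_with_token() {
  docker run -d --name tm351vce \
    -p 8351:8888 \
    -e JUPYTER_TOKEN="letmein" \
    example/tm351-vce:current
  # Log in at http://localhost:8351 with the token "letmein"
}
```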

As I was tinkering with the Raspberry Pi over the last few days, thinking about the easiest route to setting it up as a Docker Engine server that can be connected to over a network, @kleinee reminded me in a Twitter exchange of the open-source portainer application [repo], a browser-based UI for managing Docker environments.

Portainer offers several things:

  • the ability to connect to local or remote Docker Engines
  • the ability to manage Docker images, including pulling them from a specified repo (DockerHub, or a private repo)
  • the ability to manage containers (start, stop, inspect, view logs, etc); this includes the ability to set environment variables at start-up
  • the ability to run compositions
  • a JSON feed powered menu listing a curated set of images / compositions.
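That "JSON feed powered menu" is Portainer’s app templates feature. A minimal template entry (in the v1 template format) for a hypothetical course VCE looks something like this; all the values are illustrative:

```json
[
  {
    "type": 1,
    "title": "TM351 VCE",
    "description": "TM351 virtual computing environment",
    "categories": ["OU course VCEs"],
    "image": "example/tm351-vce:current",
    "ports": ["8351:8888/tcp"],
    "env": [
      {
        "name": "JUPYTER_TOKEN",
        "label": "Jupyter login token",
        "default": "letmein"
      }
    ]
  }
]
```

Pointing Portainer at a feed of such entries is what would give students a one-click, curated list of course VCEs.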

So how might this work?

  • download and install Docker
  • pull a lOCL portainer Docker image and set it running
  • login
  • connect to a Docker Engine; in the screenshot below, the Portainer application is running on my RPi
  • view the list of course VCEs
  • select one of the listed VCEs to run it using its predefined settings
  • customise container settings via advanced options
  • we can also create a container from scratch, including setting environment variables, start directories etc
  • we can specify volume mounts etc
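The "pull a lOCL portainer Docker image and set it running" step maps onto the standard Portainer CE (v2.0) deployment recipe; a lOCL version would presumably just swap in a custom image name:

```shell
# Standard Portainer CE deployment, wrapped as a function;
# a lOCL build would swap in its own image name.
portainer_up() {
  # Persistent volume for Portainer's own settings
  docker volume create portainer_data
  # Run Portainer against the local Docker Engine via its socket
  docker run -d --name portainer --restart always \
    -p 9000:9000 \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -v portainer_data:/data \
    portainer/portainer-ce
  # The UI is then available at http://localhost:9000
}
```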

All the above uses the default UI, with custom settings via a control panel to set the logo and specify the application template feed (the one I was using is here).

I’m also using the old portainer UI (I think) and need to try out the new one (v2.0).

So… next steps?

  • fork the portainer repo and produce a simplified version of it (or at least, perhaps use CSS display:none to hide some of the more confusing elements, perhaps toggled with a ‘simple/full UI’ button somewhere)
  • cross build the image for desktop machines (Win, Mac etc) and RPi
  • cross build TM351 and TM129 VCE images for desktop machines and RPi, and perhaps also some other demo containers, such as a minimal OU branded Jupyter notebook server, and perhaps an edge demo OU branded Jupyter server with lots of my extensions pre-installed. Maybe also package up an environment and teaching materials for the OpenLearn Learn to Code for Data Analysis course as a demo of how OpenLearn might be able to make use of this approach
  • instructions for set up on:
    • desktop computer (Mac, Win, Linux)
    • RPi
    • remote host (Digital Ocean is perhaps simplest; portainer does have a tab for setting up against Azure, but it seems to require finding all sorts of fiddly tokens)
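The cross-build steps could use Docker’s buildx tooling to produce a single multi-architecture image covering both desktop machines (amd64) and the RPi (arm); the builder and image names here are illustrative:

```shell
# Sketch of a multi-architecture image build with docker buildx;
# builder name and image tag are illustrative.
crossbuild_vce() {
  # Create (and select) a buildx builder instance
  docker buildx create --use --name locl-builder
  # Build for desktop and RPi architectures and push to a registry
  docker buildx build \
    --platform linux/amd64,linux/arm64,linux/arm/v7 \
    -t example/tm351-vce:current \
    --push .
}
```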

Two days, I reckon, to pull bits together (so four or five, when it doesn’t all "just" work ;-)

But is it worth it?

Author: Tony Hirst

I'm a Senior Lecturer at The Open University, with an interest in #opendata policy and practice, as well as general web tinkering...
