That is all.
That is how it works.
That is what the machines are learning to do.
So I think I have the bare bones of a
lOCL (local Open Computing Lab) thing’n’workflow running…
I’m also changing the name… to
VOCL — Virtual Open Computing Lab … which is an example of a VCL, Virtual Computing Lab, that runs VCEs, Virtual Computing Environments. I think…
If you are on Windows, Linux, or a Mac, or on a 32-bit Raspberry Pi, you should be able to do the following:
Next, we will install a universal browser-based management tool, portainer:
```
docker run -d -p 80:8000 -p 9000:9000 --name=portainer --restart=always -v /var/run/docker.sock:/var/run/docker.sock portainer/portainer-ce
```

On Windows, mounting the Docker Engine named pipe instead may be the way to go:

```
docker run -d -p 80:8000 -p 9000:9000 --name=portainer --restart=always -v \\.\pipe\docker_engine:\\.\pipe\docker_engine portainer/portainer-ce
```
On my to-do list is to customise portainer a bit and call it something else…
On first run, portainer will prompt you for an admin password (at least 8 characters).
You’ll then have to connect to a Docker Engine. Let’s use the local one we’re actually running the application with…
When you’re connected, select to use that local Docker Engine:
Once you’re in, grab the feed of lOCL containers (<s>https://raw.githubusercontent.com/ouseful-demos/templates/master/ou-templates.json</s>; I’ll be changing that URL sometime soon… UPDATE: now in the OpenComputingLab/locl-templates Github repo) and use it to feed the portainer templates listing:
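For reference, a minimal sketch of what one entry in that templates file might look like (the field values here are illustrative, not copied from the actual feed):

```json
{
  "version": "2",
  "templates": [
    {
      "type": 1,
      "title": "Example Jupyter VCE",
      "description": "Illustrative container template entry",
      "image": "oulocl/vce-jupyter",
      "ports": ["8888/tcp"]
    }
  ]
}
```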
From the App Templates, you should now be able to see a feed of example containers:
[desktop only] containers can only be run on desktop (amd64) processors, but the others should run on a desktop computer or on a Raspberry Pi using Docker on a 32-bit Raspberry Pi operating system.
Access the container from the Containers page:
By default, when you launch a container, it is opened on the domain 0.0.0.0. This can be changed to the actual required domain via the Endpoints configuration page. For example, my Raspberry Pi appears on raspberrypi.local, so if I’m running portainer against that local Docker endpoint, I can configure the path as follows:
>I should be able to generate Docker images for the 64 bit RPi O/S too, but need to get a new SD card… Feel free to chip in to help pay for bits and bobs — SD cards, cables, server hosting, an RPi 8GB and case, etc — or a quick virtual coffee along the way…
The magic that allows containers to be downloaded to Raspberry Pi devices or desktop machines is based on:

- Docker’s cross-build tooling (buildx), which allows you to build containers targeted to different processors;
- manifest lists, which mean you can just docker pull X and, depending on the hardware you’re running on, the appropriate image will be pulled down.
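As a rough sketch of what the cross-build step looks like, here is a dry run that just prints the command (example/vce-demo is a hypothetical image name, and you’d need Docker with buildx set up to actually run it):

```shell
# Target platforms: desktop (amd64) and 32-bit Raspberry Pi (arm/v7)
platforms="linux/amd64,linux/arm/v7"

# The buildx invocation that builds one image per platform and pushes
# them, plus a manifest list, under a single tag (dry run: echoed only)
cmd="docker buildx build --platform $platforms -t example/vce-demo:latest --push ."
echo "$cmd"
```

With the manifest list in place, a plain docker pull of the same tag on either sort of machine fetches the architecture-appropriate image.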
For more on cross built containers and multiple architecture support, see Multi-Platform Docker Builds. This describes the use of
manifest lists which let us pull down architecture appropriate images from the same Docker image name. See also Docker Multi-Architecture Images: Let docker figure the correct image to pull for you.
To cross-build the images, and automate the push to Docker Hub, along with an appropriate manifest list, I used a Github Action workflow using the recipe described here: Shipping containers to any platforms: multi-architectures Docker builds.
Here’s a quick summary of the images so far; generally, they either run just on desktop machines (specifically, these are amd64 images, but I think that’s the default for Docker images anyway? At least until folk start buying the new M1 Macs…) or on both desktop machines and a Raspberry Pi:
Jupyter notebook (
oulocl/vce-jupyter): a notebook server based on
andresvidal/jupyter-armv7l because it worked on RPi; this image runs on desktop and RPi computers. I guess I can now start iterating on it to make a solid base Jupyter server image. The image also bundles
sklearn. Packages like that seem to take forever to build using buildx, so I built wheels natively on an RPi and added them to the repo so the packages can be installed directly from the wheels. Python wheels are named according to a convention which bakes in things like the Python version and processor architecture that the wheel is compiled for.
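For example, a wheel filename unpacks as {name}-{version}-{python tag}-{abi tag}-{platform tag}.whl; the filename below is illustrative:

```shell
# Illustrative wheel filename for a package built on a 32-bit RPi
whl="scikit_learn-0.23.2-cp37-cp37m-linux_armv7l.whl"

# Peel the tags off the end of the name, right to left
base="${whl%.whl}"
plat="${base##*-}"; base="${base%-*}"      # platform tag
abitag="${base##*-}"; base="${base%-*}"    # ABI tag
pytag="${base##*-}"                        # Python implementation/version tag
echo "python=$pytag abi=$abitag platform=$plat"
# python=cp37 abi=cp37m platform=linux_armv7l
```

So pip on a 32-bit RPi will only pick up a wheel whose platform tag matches (here, linux_armv7l), which is why wheels built natively on the Pi can be installed directly.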
the OpenRefine container should run absolutely everywhere: it was built using support for a wide range of processor architectures;
the TM351 VCE image is the one we shipped to TM351 students in October; desktop machines only at the moment…
the TM129 Robotics image is the one we are starting to ship to TM129 students right now; it needs a rebuild because it’s a bit bloated, but I’m wary of doing that with students about to start; hopefully I’ll have a cleaner build for the February start;
the TM129 POC image is a test image to try to get the TM129 stuff running on an RPi; it seems to, but the container is full of all sorts of crap as I tried to get it to build the first time. I should now try to build a cleaner image, but I should really refactor the packages that bundle the TM129 software first because they distribute the installation weight and difficulty in the wrong way.
the Jupyter Postgres stack is a simple Docker Compose proof of concept that runs a Jupyter server in one container and a PostgreSQL server in a second, linked container. This is perhaps the best way to actually distribute the TM351 environment, rather than the monolithic bundle. At the moment, the Jupyter environment is way short of the TM351 environment in terms of installed Python packages etc., and the Postgres database is unseeded.
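A minimal sketch of what that sort of two-container stack definition looks like (the ports, Postgres version and password here are placeholder assumptions, not the actual TM351 configuration):

```yaml
version: "3"
services:
  jupyter:
    image: oulocl/vce-jupyter    # notebook server container
    ports:
      - "8888:8888"
    depends_on:
      - postgres
  postgres:
    image: postgres:12           # assumed version, for illustration only
    environment:
      POSTGRES_PASSWORD: example # placeholder credential
```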
TM351 also runs a Mongo database, but there are no recent or supported 32 bit Mongo databases any more so that will have to wait till I get a 64 bit O/S running on my RPi. A test demo with an old/legacy 32 bit Mongo image did work okay in a docker-compose portainer stack, and I could talk to it from the Jupyter notebook. It’s a bit of a pain because it means we won’t be able to have the same image running on 32 and 64 bit RPis. And TM351 requires a relatively recent version of Mongo (old versions lack some essential functionality…).
A simple http web terminal offering ssh access into a Raspberry Pi…
```
# In RPi terminal
sudo apt install libffi-dev
pip install webssh

# Run with:
wssh --fbidhttp=False --port=8000
```
Then, in a browser, go to raspberrypi.local:8000 and log in to the raspberrypi.local host with the default credentials (user raspberry), or whatever credentials you have set…
Insecure as anything, but a quick way to get ssh terminal access if you don’t have another terminal handy.
Next obvious steps would be to try to run the service in the background and ideally run it as a service. The security should probably also be tightened up.
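Running it as a service would presumably mean something like the following systemd unit, given here as a hypothetical sketch (the wssh invocation, user and flags are assumptions to check before use):

```ini
# /etc/systemd/system/webssh.service (hypothetical sketch)
[Unit]
Description=webssh browser-based terminal
After=network.target

[Service]
User=pi
ExecStart=/usr/bin/env wssh --fbidhttp=False --port=8000
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

It would then be enabled with something like sudo systemctl enable --now webssh.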
Note that another alternative is to run a Jupyter server, which will provide terminal access and some simple auth on the front end, though you’d be limited to running with the permissions associated with the notebook user.
[Ah, this looks like it has steps for getting a service defined, as well as creating a local SSL certificate: https://blog.51sec.org/2020/07/python-development-installation-on.html ]
PS see also webmin: https://sbcguides.com/install-webmin-on-raspberry-pi/ or https://thedreamingdad.com/install-webmin-raspberry-pi/ (although this takes up 300MB+ of space… )
That said, for webmin, Chrome will tell you to f***k off and won’t let you in because the SSL certs will be off… Because Google knows best and Google decides what mortals can and can’t see; and as with everyone else who works in ‘pootahs and IT, it’s not in their interest for folk to have an easy way in to accessing compute via anything other than regulated institutional or monetised services. I f****g hate people who block access to compute more than I f****g hate computers.
One of the things I’d been hoping to do last year was learn a few Island folklore tales for telling at Island Storytellers sessions. The Thing put paid to those events, of course, but as a sort of new year resolution, I’ve started digging.
There are a few well worn island tales that appear in pretty much every “tales of the Wight” collection, however it’s themed (smugglers, ghosts, legends, folklore, wrecks, etc) and I guess tales that people still tell within families, so to not just rehash every other story, I figure I need a new way in to some of them.
So I’ve started trying to work up a pattern that takes a place, a time, either a bit of law or a bit of lore, and one or more events as a basis for “researching” a story, from which I can generate:
a) the simple telling, which in many cases may appear on the surface to be a rehash of all the other tellings of the same story;
b) a deeper layer that colours each bit of the story for me and provides more hooks for how to remember it.
Using the place is important because it means I can start to anchor things in a memory palace based on the island. Using the date also provides an opportunity to hook things into the memory palace in a temporal layer that allows stories in the same time period to colour and link to each other, as well as stories in the same place to colour the place over time. At some point, I daresay characters may also become pieces in the memory palace.
As far as digging around the stories goes, I’ve started looking for primary and old-secondary resources. Primary in the form of original statutes, places and photos (I intend to visit each location as I pull the pieces together to help situate the story properly), court reports (if I can find them!) etc. And old-secondary sources in the form of old books that tell the now familiar, perhaps even then familiar, tales but from the different historical context of the time of writing.
So for example, there’s a wealth of old tourist guides to the Island, going back a couple of hundred years or so, including the following, which can all be found via the Internet Archive or Google Books:
Many of the above recount the same old, same old stories; but from a quick skim, there is often a slightly different emphasis or bit of colourful interpretation.
But the tours also include occasional new stories, and illustrations and fragments of primary material or commentary, and/or references to the same (which will hopefully give me new ratholes to chase down:-).
Many of them also appear to have a fondness for anecdotes about the weather, architecture, landscape, and people encountered, so I’m hopeful of finding some new to me stories in there too…
…such as why there was an Act of James I posted in the entrance to Godshill Church “which enacts that every female who unfortunately intrudes on the parish a second illegitimate child shall be liable to imprisonment and
hard labour in Bridewell for six months”…