Making Music and Embedding Sounds in Jupyter Notebooks

It’s looking as if the new level 1 courses won’t be making use of Jupyter notebooks (unless I can find a way of sneaking them in via the single unit I’ve put together!;-) but I still think they’re worth spending time exploring for course material production as well as presentation.

So to this end, as I read through the materials being drafted by others for the course, I’ll be looking for opportunities to do the quickest of quick demos to flag things that might be worth exploring more in future.

So here’s a quick example. One of the nice design features of TM112, the second of the two new first level courses, is that it incorporates some mini-project activities for students to work on across the course. One of the project themes relates to music, so I wondered what doing something musical in a Jupyter notebook might look like.

The first thing I tried was taking the outlines of one of the activities – generating an audio file using python and MIDI – to see how the embedding might work in a notebook context, without the faff of having to generate an audio file from python and then find a means of playing it:

[Image: midimusic]

Yep – that seems to work… Poking around music-related libraries, it seems we can also generate musical notation…

[Image: midimusic2]

In fact, we can also generate musical notation from a MIDI file too…

[Image: midimusic3]

(I assume the mappings are correct…)

So there may be opportunities there for creating simple audio files, along with the corresponding score, within the notebooks. Then any changes required to the audio file, as well as the score, can be effected in tandem.
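
For what it’s worth, here’s a minimal sketch of the sort of thing the screenshots show, using the music21 library (an assumption on my part – the notebooks linked at the end of this post show the actual code). Rendering the score via .show() relies on music21 being configured with an engraver such as MuseScore or Lilypond, and the example.mid filename is just illustrative:

from music21 import converter, note, stream

# Build a short sequence of notes programmatically...
s = stream.Stream()
for pitch in ['C4', 'E4', 'G4', 'C5']:
    s.append(note.Note(pitch, quarterLength=1))

s.show('midi')   # in a notebook, this should embed a playable rendering
s.show()         # render the corresponding score (needs MuseScore/Lilypond)

# ...or parse an existing MIDI file and render its notation
converter.parse('example.mid').show()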

I also had a quick go at generating audio files “from scratch” and then embedding the playable audio file:

 

[Image: audio]

That seems to work too…

We can also plot the waveform:

[Image: audio2]

This might be handy for a physics or electronics course?
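
Here’s a minimal sketch of that sort of thing – generating a tone with numpy, embedding it as a playable audio clip, and plotting the waveform (the frequency and duration are arbitrary):

%matplotlib inline
import numpy as np
import matplotlib.pyplot as plt
from IPython.display import Audio

rate = 44100                          # samples per second
t = np.linspace(0, 2, 2 * rate)       # two seconds' worth of sample times
signal = np.sin(2 * np.pi * 440 * t)  # 440 Hz sine wave

# Plot the first few hundred samples of the waveform
plt.plot(t[:500], signal[:500])
plt.xlabel('time (s)')
plt.ylabel('amplitude')

# Embed a playable audio player for the generated signal
Audio(signal, rate=rate)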

As well as providing an environment for creating “media-ful” teaching resources, the code could also provide the basis of interactive student explorations. I don’t have a demo of any widget-powered examples to hand in a musical context (maybe later!), but for now, if you do want to play with the notebooks that generated the above, you can do so on mybinder – http://mybinder.org/repo/psychemedia/ou-tm11n – in the midiMusic.ipynb and Audio.ipynb notebooks. The original notebooks are here: https://github.com/psychemedia/OU-TM11N
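
In lieu of that missing demo, here’s a minimal sketch of what a widget-powered musical exploration might look like – a slider that regenerates and re-embeds a tone at the chosen frequency (this uses ipywidgets and isn’t in the linked notebooks):

import numpy as np
from ipywidgets import interact
from IPython.display import Audio, display

def play_tone(freq=440):
    # Generate one second of a sine wave at the chosen frequency and embed a player
    rate = 44100
    t = np.linspace(0, 1, rate)
    display(Audio(np.sin(2 * np.pi * freq * t), rate=rate))

interact(play_tone, freq=(220, 880, 10))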

PS this looks like it could be handy: https://github.com/akaihola/jupyter_abc, a Jupyter notebook extension to render abc notation, h/t Antti Kaihola.

Accessible Jupyter Notebooks?

Pondering the extent to which Jupyter notebooks provide an accessible UI, I had a naive play with the Mac VoiceOver app run over Jupyter notebooks the other day: markdown cells were easy enough to convert to speech, but the code cells and their outputs are nested block elements which seemed to take a bit more navigation. Suffice to say, I really should learn how to use screen-reader software properly, because as it stands I can’t really tell how accessible the notebooks are…

A quick search around for accessibility related extensions turned up the jupyter-a11y: reader extension [code], which looks like it could be a handy crib. This extension will speak aloud the contents of a code cell or markdown cell, as well as navigational features such as whether you are in the cell at the top or the bottom of the page. I’m not sure it speaks aloud the output of a code cell though? But the code looks simple enough, so this might be worth a play with…

On the topic of reading aloud code cell outputs, I also started wondering whether it would be possible to generate “accessible” alt or longdesc text for matplotlib generated charts and add it to the image element inserted into the code cell output. This text could also be used to feed the reader extension’s narrator. (See also First Thoughts on Automatically Generating Accessible Text Descriptions of ggplot Charts in R for some quick examples of generating textual descriptions from matplotlib charts.)
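
As a very rough sketch of what I mean, we can pull some of the information off a matplotlib Axes object and compose a crude textual description from it (the describe_axes function and the wording of the description are made up for illustration):

import matplotlib.pyplot as plt

def describe_axes(ax):
    # Build a crude natural language description from whatever the chart exposes
    desc = "Line chart titled '{}' with x axis '{}' and y axis '{}'.".format(
        ax.get_title(), ax.get_xlabel(), ax.get_ylabel())
    for line in ax.get_lines():
        y = line.get_ydata()
        desc += " Series '{}' ranges from {} to {}.".format(
            line.get_label(), min(y), max(y))
    return desc

fig, ax = plt.subplots()
ax.plot([1, 2, 3], [2, 4, 3], label='example series')
ax.set(title='Example chart', xlabel='x', ylabel='y')
print(describe_axes(ax))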

Another way of complementing the jupyter-a11y reader extension might be to use the python pindent [code] tool to annotate the contents of code cells with accessible comments (such as comments that identify the end of if/else blocks and function definitions). Another advantage of having a pindent extension annotate the content of notebook python code cells is that it might help improve the readability of code for novices. So for example, we could have a notebook toolbar button that toggles pindent annotations on a selected code cell.
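
To give a flavour, here’s the sort of block-closing annotation pindent can produce when run with its “complete” option (the function itself is just a made-up example):

# pindent-style block closing comments (illustrative)
def classify(value):
    if value > 0:
        result = "positive"
    else:
        result = "non-positive"
    # end if
    return result
# end def classify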

For code read aloud by the reader extension, I wonder if it would be worth running the content of any (python) code cells through pindent first?

PS FWIW, here’s a related issue on Github.

PPS another tool that helps make python code a bit more accessible, in an active sense, in a Jupyter notebook is this pop-up variable inspector widget.

Querying Panama Papers Neo4j Database Container From a Linked Jupyter Notebook Container

A few weeks ago I posted some quick doodles showing, on the one hand, how to get the Panama Papers data into a simple SQLite database and, on the other, how to link a neo4j graph database to a Jupyter notebook server using Docker Compose.

As the original Panama Papers investigation used neo4j as its backend database, I thought putting the data into a neo4j container could give me the excuse I needed to start looking at neo4j.

Anyway, it seems as if someone has already pushed a neo4j Docker container image preseeded with the Panama Papers data, so here’s my quickstart.

To use it, you need to have Docker installed, download the docker-compose.yaml file and then run:

docker-compose up

If you do this from a command line launched from Kitematic, Kitematic should provide you with a link to the neo4j database, running on the Docker IP address and port 7474. Log in with the default credentials ( neo4j / neo4j ) and change the password to panamapapers (all lower case).

Download the quickstart notebook into the newly created notebooks directory, and you should be able to see it from the notebooks homepage on Docker IP address port 8890 (or again, just follow the link from Kitematic).

I’m still trying to find my way around both the py2neo Python wrapper and the neo4j Cypher query language, so the demo thus far is not that inspiring!
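
For what it’s worth, here’s a minimal sketch of connecting to the database from py2neo and running a simple Cypher query – note that the py2neo API has shifted between releases, so treat the constructor arguments as an assumption, and the neo4j hostname assumes the docker-compose service name:

from py2neo import Graph

# Connect to the linked neo4j container using the password set above
graph = Graph("http://neo4j:7474/db/data/", user="neo4j", password="panamapapers")

# Count nodes by label to get a feel for what's in the graph
query = """
MATCH (n)
RETURN labels(n) AS label, count(*) AS nodes
ORDER BY nodes DESC
"""
for record in graph.run(query):
    print(record)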

And I’m not sure when I’ll get a chance to look at it again…:-(

Using IPython on Lego EV3 Robots Running Ev3Dev

In part so I don’t lose the recipe, here are some notes for getting up and running with IPython on a Lego EV3 brick.

The command lines are prefixed to show whether we’re running them on the Mac or the brick…

To start with, we need to flash a microSD card with an image of the ev3dev operating system. The instructions are on the ev3dev site – Writing an SD Card Image Using Command Line Tools on OS X – but I repeat the ones for a Mac here for convenience.

  1. Download an image from the repository – I used the ev3dev-jessie-2015-12-30 image because the current development version is in a state of flux and the python bindings don’t necessarily work with it just at the moment…
  2. Assuming you have downloaded the file to your home Downloads directory (that is, ~/Downloads), launch a new terminal and run:
    [Mac] cd Downloads
    [Mac] unzip ev3-ev3dev-jessie-2015-12-30.img.zip
    [Mac] diskutil list
  3. Put the microSD card (at least 2GB, but no more than 16GB, I think? I used a 4GB microSD (HC) card) into an SD card adapter and pop it into the Mac SD card reader. Check that it’s there and what it’s called; (you’re looking for the new disk…):
    [Mac] diskutil list
  4. Now unmount the new disk corresponding to the SD card:
    [Mac] diskutil unmountDisk /dev/disk1s1
    If you don’t see the /dev/disk1 listing, check that the write protect slider on your SD card holder isn’t in the write protect position.
  5. We’re going to write some raw bits to the card (/dev/disk1 in my example), so we need to write to /dev/rdisk1 (note the r before the disk name). The write will take some time – 5 minutes or so – but if you’re twitchy, ctrl-T will show you a progress message. You’ll also need to enter your Mac password. (NOTE: if you use the wrong disk name, you can completely trash your computer. So be careful;-)
    [Mac] sudo dd if=~/Downloads/ev3-ev3dev-jessie-2015-12-30.img of=/dev/rdisk1 bs=4m
    GOTCHA: when I tried, I got a Permission Denied message at first. It seems that for some reason my Mac thought the SD card was write protected. On the SD card adapter is a small slider that sets the card to “locked” or “unlocked”. The SD card reader in the Mac has a mechanical switch that detects which position the slider is in and hence whether the card is locked. I don’t know if it was a problem with the card adapter or the Mac reader, but I took the card adapter out of the Mac, changed the slider setting, put it back in, and did the unmount and then sudo dd steps again. It still didn’t work, so I took the card out again, put the slider back the other way, and tried again. This time it did work.
  6. Once the copy is complete, take the SD card adapter out, remove the microSD card and put it in the EV3 brick. Start the EV3 brick in the normal way.

Hopefully, you should see the brick boot into the Brickman UI (it may take two or three minutes, including a period when the screen is blank and the orange lights are ticking for a bit…)

Navigate the brick settings to the Networks menu, select Wifi and scan for your home network. Connect to the network (the password settings will be saved, so you shouldn’t have to enter them again).

By default, the IP address of the brick should be set to 192.168.1.106. To make life easier, I set up passwordless ssh access to the brick. Using some guidance in part originally from here, I accessed the brick from my Mac terminal and set up an ssh folder. First, log in to the brick from the Mac terminal:

[Mac] ssh robot@192.168.1.106

When prompted, the password for the robot user is maker.


This should log you in to the brick. Run the following command to create an ssh directory into which the ssh key will be placed, and then exit the brick command line to go back to the Mac command line.

[Brick] install -d -m 700 ~/.ssh
[Brick] exit

Assuming you already have an ssh key on your Mac (create one with ssh-keygen if not), clear out any old host key stored for the brick’s IP address and then copy your public key over to the brick:

[Mac] ssh-keygen -R 192.168.1.106
[Mac] cat ~/.ssh/id_rsa.pub | ssh robot@192.168.1.106 'cat > .ssh/authorized_keys'
You will be prompted again for the password – maker – but next time you try to log in to the brick, you should be able to do so without having to enter the password. (Instead, the ssh key will be used to get you in.)


If you log in to the brick again – from the Mac command line, run:

[Mac] ssh robot@192.168.1.106

you should be able to run a simple Python test program. Attach a large motor to output port A on the brick. From the brick command line, run:

[Brick] python

to open up a Python command prompt, and then enter the following commands to use the preinstalled ev3dev-lang-python Python bindings to run the motor for a few seconds:

[Python] import ev3dev.ev3 as ev3
[Python] m = ev3.LargeMotor('outA')
[Python] m.run_timed(time_sp=3000, duty_cycle_sp=75)

Enter:

[Python] exit

to exit from the Python interpreter.

Now we’re going to install IPython. Run the following commands on the brick command line (update, but DO NOT upgrade the apt packages). If prompted for a password, it’s still maker:

[Brick] sudo apt-get update
[Brick] sudo apt-get install -y ipython

You should now be able to start an IPython interpreter on the brick:

[Brick] ipython

The Python code to test the motor should still work (hit return if you find you are stuck in a code block). Typing:

[Brick] exit

will take you out of the interpreter and back to the brick command line.

One of the nice things about IPython is that we can connect to it remotely. What this means is that I can start an IPython editor on my Mac, but connect it to an IPython process running on the brick. To do this, we need to install another package:

[Brick] sudo apt-get install -y python-zmq

Now you should be able to start an IPython process on the brick that we can connect to from the Mac:

[Brick] ipython kernel

The process will start running and you should see a message of the form:

To connect another client to this kernel, use:
--existing kernel-2716.json

This file contains connection information about the kernel.

Now open another terminal on the Mac (cmd-T in the terminal window should open a new Terminal tab) and let’s see if we can find where that file is. Actually, here’s a crib – in the second terminal window, go into the brick:

[Mac] ssh robot@192.168.1.106

And then on the brick command line in this second terminal window, show a listing of the directory that should contain the connection file:

[Brick] sudo ls /home/robot/.ipython/profile_default/security/

You should see the kernel-2716.json file (or whatever it’s exactly called) there. Exit the brick command line:

[Brick] exit

Still in the second terminal window, and now back on the Mac command line, copy the connection file from the brick to your current directory on the Mac:

[Mac] scp robot@192.168.1.106:/home/robot/.ipython/profile_default/security/kernel-2716.json ./

If you have IPython installed on your Mac, you should now be able to open an IPython interactive terminal on the Mac that is connected to the IPython kernel running on the brick:

[Mac] ipython console --existing ${PWD}/kernel-2716.json --ssh robot@192.168.1.106

You should be able to execute the Python motor test command as before (remember to import the necessary ev3dev.ev3 package first).

Actually, when I ran the ipython console command, the recent version of Jupyter on my Mac gave me a deprecation warning, which means I would have been better off running:

[Mac] jupyter console --existing ${PWD}/kernel-2716.json --ssh robot@192.168.1.106

So far so good – can we do any more with this?

Well, yes, a couple of things. When starting the IPython process on the brick, we could force the name and location of the connection file:

[Brick] ipython kernel -f /home/robot/connection-file.json

Then on the Mac we could directly copy over the connection file:

[Mac] scp robot@192.168.1.106:/home/robot/connection-file.json ./

Secondly, we can configure a Jupyter notebook server running on the Mac so that it will create a new IPython process on the brick for each new notebook.

Whilst you can configure this yourself, it’s possibly easier to make use of the remote_ikernel helper:

[Mac] pip3 install remote_ikernel
[Mac] remote_ikernel manage --add --kernel_cmd="ipython kernel -f {connection_file}" --name="Ev3dev" --interface=ssh --host=robot@192.168.1.106

Now you should be able to connect to a notebook run against an IPython kernel on the brick.

[Image: Home-jupyter]

Note that it may take a few seconds for the kernel to connect and the first cell to run – but from then on it should be quite responsive.

[Image: Untitled2-ipynb]

To show a list of available kernels for a particular jupyter server, run the following in a Jupyter code cell:

import jupyter_client
jupyter_client.kernelspec.find_kernel_specs()

PS for ad hoc use, I thought it might be useful to try setting up a quick wifi router that I could use to connect the brick to my laptop, in the form of an old Samsung Galaxy S3 Android phone (without a SIM card). Installing the Hotspot Control app provided a quick way of doing this… and it worked:-)

PPS for a more recent version of IPython, install it from pip.

If you installed IPython using the apt-get route, uninstall it with:

[Brick] sudo apt-get remove ipython

Install pip and some handy supporting tools that pip may well require at some point:

[Brick] sudo apt-get install build-essential python-dev

Running:

[Brick] sudo apt-get install python-pip

would install an old version of pip (pip --version shows 1.5.6), which could be removed using sudo apt-get remove python-setuptools.

To grab a more recent version, use:

[Brick] wget https://bootstrap.pypa.io/get-pip.py
[Brick] sudo -H python get-pip.py

which seems to take a long time to run with no obvious sign of progress, and then tidy up:

[Brick] rm get-pip.py

Just to be sure, then update it:

[Brick] sudo pip install --upgrade setuptools pip

which also seems to take forever. Then install IPython:

[Brick] sudo pip install ipython

I’m also going to see if I can give ipywidgets a go, although the install requirements look as if they’ll bring down a huge chunk of the Jupyter framework too, and I’m not sure that’s all necessary?

[Brick] sudo pip install ipywidgets

For a lighter install, try sudo pip install --no-deps ipywidgets to install ipywidgets without dependencies. The only required dependencies are ipython>=4.0.0, ipykernel>=4.2.2 and traitlets>=4.2.0.

The widgets didn’t seem to work at first, but it seems that following a recent update to the Jupyter server on the host, it needed a config kick before running jupyter notebook:

jupyter nbextension enable --py --sys-prefix widgetsnbextension
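
Once that’s done, here’s a minimal sketch of the sort of thing the widgets might be used for on the brick – a slider that runs the motor on output A at a chosen duty cycle (this assumes the same ev3dev-lang-python bindings as the motor test above):

import ev3dev.ev3 as ev3
from ipywidgets import interact

m = ev3.LargeMotor('outA')

def spin(duty_cycle=50):
    # Run the motor for one second at the selected duty cycle
    m.run_timed(time_sp=1000, duty_cycle_sp=duty_cycle)

interact(spin, duty_cycle=(10, 100, 5))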

PPS It seems to take a bit of time for the connection to the IPython server to be set up:


A timeout message appears quite quickly, but then I have to wait a few dozen seconds for the ssh port tunnels to be set up. Once this is done, you should be able to run things in a code cell. I usually start with print("Hello World!") ;-)

PPPS For plotting charts:

sudo apt-get install -y python-matplotlib

Could maybe even go the whole hog…

sudo apt-get install -y python-pandas
sudo pip install seaborn

PPPPS Here’s my current build file (under testing atm – it takes about an hour or so…) – ev3_ipy_build.sh, so:
[Mac] scp ev3_ipy_build.sh robot@192.168.1.106:
[Brick] chmod +x ev3_ipy_build.sh
[Brick] sudo ./ev3_ipy_build.sh

sudo apt-get update
sudo apt-get install -y build-essential python-dev
sudo apt-get install -y python-zmq python-matplotlib python-pandas

wget https://bootstrap.pypa.io/get-pip.py
sudo -H python get-pip.py
rm get-pip.py
sudo pip install --upgrade setuptools pip

sudo pip install ipython ipykernel traitlets seaborn
sudo pip install --no-deps ipywidgets

PPPPPS to clone the SD card on a Mac, insert the SD card and run:

[Mac] diskutil list
[Mac] sudo dd if=/dev/disk1 of=~/Desktop/my_ev3dev_image.dmg

The corresponding restore (in the process described at the start of this post) would use the my_ev3dev_image.dmg file rather than the ev3-ev3dev-jessie-2015-12-30.img image.

PPPPPPS Connecting to remote kernel on brick – start an IPython kernel on the brick:

[Brick] ipython kernel -f /home/robot/test.json

Copy the connection file over to the host:
[Mac] scp robot@192.168.1.106:/home/robot/test.json ./

Check the path you copied it to:
[Mac] pwd

For me, that returned /Users/ajh59.

Start a console on the host using the existing connection file – use a full, explicit path to the file. Also works with things like Spyder:

[Mac] jupyter console --existing /Users/ajh59/test.json --ssh robot@192.168.1.106

[Image: spyder-ssh-ev3]

Steps Towards Some Docker IPython Magic – Draft Magic to Call a Contentmine Container from a Jupyter Notebook Container

I haven’t written any magics for IPython before (and it probably shows!) but I started sketching out some magic for the Contentmine command-line container I described in Using Docker as a Personal Productivity Tool – Running Command Line Apps Bundled in Docker Containers.

What I’d like to explore is a more general way of calling command line functions accessed from arbitrary containers via a piece of generic magic, but I need to learn a few things along the way, such as handling arguments for a start!

The current approach provides crude magic for calling the contentmine functions included in a public contentmine container from a Jupyter notebook running inside a container. The commandline contentmine container is started from within the notebook container and uses volumes from the notebook container to pass files between the containers. The path to the directory mounted from the notebook is identified by a bit of jiggery pokery, as is the method for spotting what container the notebook is actually running in (I’m all ears if you know of a better way of doing either of these things?:-)

The magic has the form:

%getpapers /notebooks rhinocerous

to run the getpapers query (with fixed switch settings for now) and the search term rhinocerous; files are shared back from the contentmine container into the /notebooks folder of the Jupyter container.
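
By way of a rough sketch of how such a magic can be wired up (this is not the actual implementation – see the gist linked at the end of this post – and for simplicity it uses a host bind mount rather than the volumes-from approach described above; the getpapers switches are also illustrative):

import subprocess
from IPython.core.magic import register_line_magic

@register_line_magic
def getpapers(line):
    """Usage: %getpapers /notebooks rhinocerous"""
    path, query = line.split()
    # Run the contentmine container, mounting the given directory and
    # writing results into a folder named after the query term
    cmd = ["docker", "run", "--rm",
           "-v", "{}:/contentmine".format(path),
           "psychemedia/contentmine",
           "getpapers", "--query", query,
           "--outdir", "/contentmine/{}".format(query), "-x"]
    print(subprocess.check_output(cmd).decode())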

Other functions include:

%norma /notebooks rhinocerous
%cmine /notebooks rhinocerous

These functions are applied to files in the same folder as was created by the search term (rhinocerous).

The magic needs updating so that it will also work in a Jupyter notebook that is not running within a container – this should simply be a case of switching in a different directory path. The magics also need tweaking so we can pass parameters in. I’m not sure if more flexibility should also be allowed on specifying the path (we need to make sure that the paths for the mounted directories are the correct ones!)

What I’d like to work towards is some sort of line magic along the lines of:

%docker psychemedia/contentmine -mountdir /CALLING_CONTAINER_PATH -v ${MOUNTDIR}:/PATH COMMAND -ARGS etc

or cell magic:

%%docker psychemedia/contentmine -mountdir /CALLING_CONTAINER_PATH -v ${MOUNTDIR}:/PATH
COMMAND -ARGS etc
...
COMMAND -ARGS etc

Note that these go against the docker command line syntax – should they be closer to it?

The code, and a walked-through demo, are included in the notebook available via this gist, which should also be embedded below.


Thinking Around the Edges – Lego EV3 Robots Running Remote Jupyter Kernels

I’ve been pondering the use of Lego EV3 robots on the OU TXR120 residential school again, and after a couple of chats with OU colleagues Jane Bromley (who just won a sci-fi film pitch competition run by New Scientist) and Jon Rosewell (who leads on our level 1 robotics offerings), here’s where I’m at on “visioning” it…?!;-)

The setting for the residential school is 8 groups of 3 students per group, each group working with their own Lego EV3 robot to complete a set of challenges. Although we were hoping to have laptops from which the robots could be programmed wirelessly, it seems as if we might end up with wifi-less desktop machines with big screens (which will require tethering the robots to the desktop machine to programme them), though I could be wrong… and we may be able to work around that anyway (a quick trip to Maplin to buy some wifi dongles, for example, or some cables we can plug into a wifi routing switch). Having laptops would have been a boon for making the room a bit more flexible to work with (but many students have their own laptops…;-) and having a big screen that laptops could be mirror displayed against would possibly improve the student experience.

The Lego bricks come with their own firmware, but as described in Pondering New Ways of Programming Lego EV3 Mindstorms Bricks we can also put Linux on them, and make use of a python wrapper library to actually programme the bricks using Python.

In what follows, I’ll refer to the EV3 brick as the client and a laptop or desktop computer that connects to it as the host.

One of the easiest ways to access the Python on the brick is to connect the brick to the network and then ssh in to it – for example (use the correct IP address for the brick):

ssh root@192.168.1.106

Each time you ssh in, you have to provide a password (r00tme is the default on my brick), but passwordless ssh can be set up which makes things quicker (you don’t have to enter the password).

To set up passwordless ssh onto the brick, you need to check that an .ssh folder is available in the home directory (~) on the brick for the ssh key. To do this, ssh in to the EV3, cd ~ to make sure you’re in your home directory, then ls -a to list the directory contents. If there is no .ssh directory, create one (this post suggests install -d -m 700 ~/.ssh).

If you don’t have an ssh key set up on the host machine from which you want passwordless ssh access to the brick, follow the above link to find out how to create one. Once you do have ssh keys set up on the host, run something along the lines of the following, using the IP address of your connected EV3, to copy the key across to the EV3:

cat ~/.ssh/id_rsa.pub | ssh root@192.168.1.106 'cat > .ssh/authorized_keys'

You will need to provide the password to execute this command (r00tme for the version of Ev3dev I’m running).

You should now be able to ssh in without a password: ssh root@192.168.1.106

Rather than prompting for the password, ssh will use the key provided to log you in.

In my previous post, I described how it was possible to run an old IPython notebook server on the brick that can expose notebooks over the network, although it was a little slow. It’s also possible to ssh in to the brick and run an IPython terminal on the brick:

ssh root@192.168.1.106
ipython

A third way I’d have expected to work is to access a remote IPython kernel on the brick from an IPython console on my laptop: ssh into the EV3 brick, launch an IPython kernel, and pick up the location of the connection file. For example, if the command:

ssh root@192.168.1.106
ipython kernel

to run an IPython kernel on the EV3 responds with something like:

To connect another client to this kernel, use:
    --existing kernel-1129.json

Back on the laptop I’d expect to be able to run:

scp root@192.168.1.106:/root/.ipython/profile_default/security/kernel-1129.json ./

to grab a local copy of the connection file from the brick and copy it over to my host machine (which works), and then use it as an existing connection for a Jupyter console on the laptop host:

jupyter console --existing ./kernel-1129.json --ssh server

But that doesn’t work for some reason?

CRITICAL | Could not find existing kernel connection file ./kernel-1129.json

Whatever…

So far, so much noise – the important thing to take away from the above is to get the passwordless ssh set up, because it makes the following possible…

Recall the basic scenario – we want to run code on the brick from a computer. The bricks will run an IPython notebook, but it’s slow. Or we can ssh in to the brick, start an IPython process up, and run things from an IPython command line on the brick.

But what if we could run a Jupyter server on host, making notebooks available via a browser, but use a remote kernel running on the brick?

The remote-ikernel package makes setting up remote kernels that can be accessed from the Jupyter server relatively straightforward to do – install the package on your host machine (eg my laptop) and then run the remote-ikernel command with the settings required:

pip3 install remote_ikernel
remote_ikernel manage --add --kernel_cmd="ipython kernel -f {connection_file}" --name="Ev3dev" --interface=ssh --host=root@192.168.1.106

This creates a kernel.json file and places it in the correct location. (On my Mac this was in the directory /Users/MYUSER/Library/Jupyter/kernels/.)

The kernel.json file created for me (/Users/ajh59/Library/Jupyter/kernels/rik_ssh_root_192_168_1_106_ev3dev/kernel.json) was as follows:

{
  "argv": [
    "/usr/local/opt/python3/bin/python3.5",
    "-m",
    "remote_ikernel",
    "--interface",
    "ssh",
    "--host",
    "root@192.168.1.106",
    "--kernel_cmd",
    "ipython kernel -f {host_connection_file}",
    "{connection_file}"
  ],
  "display_name": "SSH root@192.168.1.106 Ev3dev",
  "remote_ikernel_argv": [
    "/usr/local/bin/remote_ikernel",
    "manage",
    "--add",
    "--kernel_cmd=ipython kernel -f {connection_file}",
    "--name=Ev3dev",
    "--interface=ssh",
    "--host=root@192.168.1.106"
  ]
}

Launching the notebook server using jupyter notebook in the normal way should result in the server picking up the remote kernel from the new kernel file.

To list the kernels available, I launched a Jupyter notebook with the normal (local) Python kernel and ran:

from jupyter_client.kernelspec import KernelSpecManager
list(KernelSpecManager().find_kernel_specs().keys())

My newly created one was there, albeit horribly named (the name was also the directory name the kernel.json file was created in): rik_ssh_root_192_168_1_106_ev3dev.

So here’s where we’re at now:

  • a single desktop or laptop computer with passwordless ssh access to a Lego EV3;
  • the desktop or laptop computer runs a Jupyter server;
  • Jupyter notebooks are available that will start up and run code on a remote IPython kernel on the brick.

[Image: Home1]

[Image: Untitled]

Where do we need to be? The most immediate need is to have something that works for residential school. This means we need 8 desktop computers and 8 EV3s.

One thing we’d need to do is find a way of marrying computers and EV3s. Specifically, we need to have known IP addresses associated with specific bricks (can we assign a known IP based on the MAC address of the wifi dongle attached to each brick?) and we need to make sure that passwordless ssh works between the computers and the EV3s. The easiest way of doing the latter would be to use the same ssh keypair on every machine, but this would mean student group A might be able to mistakenly connect to student group B’s machine.

One way round it might be to set up a Jupyterhub server that is responsible for managing connections. The Jupyterhub server is a multiuser hub that will set up a Jupyter server for each logged-in user. If we set up the Jupyterhub attached to the same wifi network as the EV3 bricks, with 8 users, one per student group, each user/student group could then presumably be assigned to one of the bricks. Students would log in to the Jupyterhub, and their Jupyter server would be associated with the remote_ikernel kernel.json file for one of the bricks. Anyone who can connect to the local wifi network can then log in to the Jupyterhub as one of the group users, launch a notebook, and run code on one of the bricks. This means that students could write notebooks from their own laptops, connected to the network. Wifi-less desktop machines provided for the residential school could presumably be added into the network via a local ethernet cable?
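
Under the known-IP assumption, registering one remote kernel per student group on the hub machine might look something like the following (the group names and IP addresses are made up for illustration):

import subprocess

# One EV3 brick per student group, each with a fixed, known IP address
bricks = {"group{}".format(i): "192.168.1.{}".format(100 + i) for i in range(1, 9)}

for group, ip in bricks.items():
    subprocess.check_call([
        "remote_ikernel", "manage", "--add",
        "--kernel_cmd=ipython kernel -f {connection_file}",
        "--name=Ev3dev {}".format(group),
        "--interface=ssh",
        "--host=root@{}".format(ip),
    ])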

So we’d have something like this:

[Diagram: robotlab_-_jupyter_pptx2]

Does that make sense?

Another Route to Jupyter Notebooks – Azure Machine Learning

In much the same way that the IBM DataScientist Workbench seeks to provide some level of integration between analysis tools such as Jupyter notebooks and data access and storage, Azure Machine Learning studio also provides a suite of tools for accessing and working with data in one location. Microsoft’s offering is new to me, but it crossed my radar with the announcement that they have added native R kernel support, as well as Python 2 and 3, to their Jupyter notebooks: Jupyter Notebooks with R in Azure ML Studio.

Guest workspaces are available for free (I’m not sure if this is once only, or whether you can keep going back?) but there is also a free workspace if you have a (free) Microsoft account.

[Image: Microsoft_Azure_Machine_Learning_Studio]

Once inside, you are provided with a range of tools – the one I’m interested in to begin with is the notebook (although the pipework/dataflow experiments environment also looks interesting):

[Image: Notebooks_-_Microsoft_Azure_Machine_Learning_Studio2]

Select a kernel:

[Image: Notebooks_-_Microsoft_Azure_Machine_Learning_Studio]

give your new notebook a name, and it will launch into a new browser tab:

[Image: test1]

You can also arrange notebooks within separate project folders. For example, create a project:

[Image: Projects_-_Microsoft_Azure_Machine_Learning_Studio]

and then add notebooks to it:

[Image: Projects_-_Microsoft_Azure_Machine_Learning_Studio2]

When creating a new notebook, you may have noted an option to View More in Gallery. The gallery includes examples of a range of project components, including example notebooks:

[Image: gallery.cortanaintelligence.com – Notebook category]

Thinking about things like the MyBinder app, which lets you launch a notebook in a container from a Github account, it would be nice to see additional buttons being made available to let folk run notebooks in Azure Machine Learning, or the Data Scientist Workbench.

It’s also worth noting how user tools – such as notebooks – seem to be being provided for free with a limited amount of compute and storage resource behind them as a way of recruiting users into platforms where they might then start to pay for more compute power.

From a course delivery perspective, I’m often unclear as to whether we can tell students to sign up for such services as part of a course or whether that breaks the service terms?  (Some providers, such as Wakari, make it clear that “[f]or classes, projects, and long-term use, we strongly encourage a paid plan or Wakari Enterprise. Special academic pricing is available.”) It seems unfair that we should require students to sign up for accounts on a “free” service in their own name as part of our offering for a couple of reasons at least: first, we have no control over what happens on the service; second, it seems that it’s a commercial transaction that should be formalised in some way, even if only to agree that we can (will?) send our students to that particular service exclusively. Another possibility is that we say students should make their own service available, whether by installing software themselves or finding an online provider for themselves.

On the other hand, trying to get online services provided at course scale in a timely fashion within an HEI seems to be all but impossible, at least until such a time as the indie edtech providers such as Reclaim Hosting start to move even more into end-user app provision either at the individual level, or affordable class level (with an SLA)…

See also: Seven Ways of Running IPython / Jupyter Notebooks.