
Link Sharing in Classrooms, Workshops and Conferences

Every so often, recent news reminds me of old OUseful experiments, some of which still seem to work, and some of which don’t (the latter usually as a result of link rot or third party service closures).

So, for example, a few weeks ago the Google Education blog announced a new Chrome extension – Share to Classroom (“Get your students on the same (web)page, instantly”).

First, it seems as if the extension lets someone at the front of the room share pages to class members’ screens inside the room. Second, folk in the class are also free to browse to other pages of their own finding and push suggestions of new pages to the person at the front, who can review them and then share them with everyone.

Anyway, in part it reminded me of one of my old hacks – the FeedShow Link Presenter. This was a much cruder affair, creating a frame-based page around an RSS feed of links referred to by the main presenter, which audience members could click forwards and back through; it also seems to have started exploring a “back with me” feature to sync everyone’s pages.

Presenters could go off piste (splashing page links not contained in the feed), but it looks as if these couldn’t be synced. (I’m not sure if I addressed that in a later revision.) Nor could audience members suggest links back to the main presenter.

The FeedShow Link Presenter had a Yahoo Pipes backend, and still seems to work; but with Pipes due to close on September 30th, it looks as if this experiment’s time will have come to an end…


Ho hum… (I guess I’ve stopped doing link based presentations anyway, but it was nice to be reminded of them:-)

Poking Around the VW Thing (Algorithmic Cheating in the Automobile Industry)

The news in recent days that VW had installed a software device into some of its diesel engines that identifies when the engine is being tested for emissions on a dynamometer (“dyno”) provides a nice demonstration of how “intelligent” software control systems can be used to identify particular operating environments and switch into an appropriate – or inappropriate – operational mode.

As I keep finding, press coverage of the events seems to offer less explanation and context than the original document that seems to have kicked off the recent publicity, specifically a letter from the US environmental protection agency to the Volkswagen Group of America:


(The document is a PDF of a scanned document; I had hoped to extract the text using a variant of this recipe – Getting Text Out Of Anything (docs, PDFs, Images) Using Apache Tika – running Tika on my own computer via Kitematic, but it doesn’t seem to extract text from the inlined image(s). Instead, I uploaded it to Google Drive, then opened it in Google Docs – you see pages from the original doc as an image, and below it the extracted text.)
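For anyone who wants to try the Tika route themselves, a minimal sketch of the recipe might look like the following – it assumes a Tika server is already running locally on its default port (9998), and the PDF filename is just a placeholder:

    # Sketch: push a PDF to a locally running Apache Tika server (e.g. a
    # docker container exposed on the default port 9998, launched via
    # Kitematic) and ask for plain text back. Filename is a placeholder.
    import requests

    with open("epa-vw-letter.pdf", "rb") as f:
        r = requests.put(
            "http://localhost:9998/tika",
            data=f,
            headers={"Accept": "text/plain"},
        )
    print(r.text)  # little comes back here: the PDF pages are scanned images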

Here are some of the bits that jumped out at me:

A defeat device is an AECD [Auxiliary Emission Control Device] “that reduces the effectiveness of the emission control system under conditions which may reasonably be expected to be encountered in normal vehicle operation and procedure use, unless: (1) Such conditions are substantially included in the Federal emission test procedure; (2) The need for the AECD is justified in terms of protecting the vehicle against damage or accident; (3) The AECD does not go beyond the requirements of engine starting; or (4) The AECD applies only for emergency vehicles …” 40 C.F.R. § 86.1803-01.

Motor vehicles equipped with defeat devices, … cannot be certified.

The CAA makes it a violation “for any person to manufacture or sell, or offer to sell, or install, any part or component intended for use with, or as part of any motor vehicle or motor vehicle engine, where a principal effect of the part or component is to bypass, defeat, or render inoperative any device or element of design installed on or in a motor vehicle or motor vehicle engine in compliance with regulations under this subchapter, and where the person knows or should know that such part or component is being offered for sale or installed for such use or put to such use.” CAA § 203(a)(3)(B), 42 U.S.C. § 7522(a)(3)(B); 40 C.F.R. § 86.1854-12(a)(3)(ii).

Each VW vehicle identified … has AECDs that were not described in the application for the COC that purportedly covers the vehicle. Specifically, VW manufactured and installed software in the electronic control module (ECM) of these vehicles that sensed when the vehicle was being tested for compliance with EPA emission standards. For ease of reference, the EPA is calling this the “switch.” The “switch” senses whether the vehicle is being tested or not based on various inputs including the position of the steering wheel, vehicle speed, the duration of the engine’s operation, and barometric pressure. These inputs precisely track the parameters of the federal test procedure used for emission testing for EPA certification purposes. During EPA emission testing, the vehicles ECM ran software which produced compliant emission results under an ECM calibration that VW referred to as the “dyno calibration” (referring to the equipment used in emissions testing, called a dynamometer). At all other times during normal vehicle operation, the “switch” was activated and the vehicle ECM software ran a separate “road calibration” which reduced the effectiveness of the emission control system (specifically the selective catalytic reduction or the lean NOx trap). As a result, emissions of NOx increased by a factor of 10 to 40 times above the EPA compliant levels, depending on the type of drive cycle (e.g., city, highway). … Over the course of the year following the publication of the WVU study [TH: see link below], VW continued to assert to CARB and the EPA that the increased emissions from these vehicles could be attributed to various technical issues and unexpected in-use conditions. VW issued a voluntary recall in December 2014 to address the issue. … When the testing showed only a limited benefit to the recall, CARB broadened the testing to pinpoint the exact technical nature of the vehicles’ poor performance, and to investigate why the vehicles’ onboard diagnostic system was not detecting the increased emissions. None of the potential technical issues suggested by VW explained the higher test results consistently confirmed during CARB’s testing. It became clear that CARB and the EPA would not approve certificates of conformity for VW’s 2016 model year diesel vehicles until VW could adequately explain the anomalous emissions and ensure the agencies that the 2016 model year vehicles would not have similar issues. Only then did VW admit it had designed and installed a defeat device in these vehicles in the form of a sophisticated software algorithm that detected when a vehicle was undergoing emissions testing.

VW knew or should have known that its “road calibration” and “switch” together bypass, defeat, or render inoperative elements of the vehicle design related to compliance with the CAA emission standards. This is apparent given the design of these defeat devices. As described above, the software was designed to track the parameters of the federal test procedure and cause emission control systems to underperform when the software determined that the vehicle was not undergoing the federal test procedure.

VW’s “road calibration” and “switch” are AECDs that were neither described nor justified in the applicable COC applications, and are illegal defeat devices.

The news also reminded me of another tech journalism brouhaha from earlier this year, around tractor manufacturer John Deere arguing that farmers don’t own their tractors, but instead purchase “an implied license for the life of the vehicle to operate the vehicle” (Wired: We Can’t Let John Deere Destroy the Very Idea of Ownership).

I didn’t really follow that story properly at the time, but it seems the news arose out of responses to a consultation by the US Copyright Office around the Digital Millennium Copyright Act (DMCA), and in particular a “Proposed Class 21: Vehicle software – diagnosis, repair, or modification” category (first round comments, second round comments) for the DMCA Section 1201 rulemaking: Exemptions to Prohibition Against Circumvention of Technological Measures Protecting Copyrighted Works.

[UPDATE: Decisions have now been made on what exemptions are allowable: Exemption to Prohibition on Circumvention of Copyright Protection Systems for Access Control Technologies (PDF)]

Here’s how the class was defined:

21. Proposed Class 21: Vehicle software – diagnosis, repair, or modification
This proposed class would allow circumvention of TPMs [technological protection measures] protecting computer programs that control the functioning of a motorized land vehicle, including personal automobiles, commercial motor vehicles, and agricultural machinery, for purposes of lawful diagnosis and repair, or aftermarket personalization, modification, or other improvement. Under the exemption as proposed, circumvention would be allowed when undertaken by or on behalf of the lawful owner of the vehicle.

Note the phrase “for purposes of lawful diagnosis and repair”…

I also note a related class:

22. Proposed Class 22: Vehicle software – security and safety research
This proposed class would allow circumvention of TPMs protecting computer programs that control the functioning of a motorized land vehicle for the purpose of researching the security or safety of such vehicles. Under the exemption as proposed, circumvention would be allowed when undertaken by or on behalf of the lawful owner of the vehicle.

(and in passing note Proposed Class 27: Software – networked medical devices…).

Looking at some of the supporting documents, it’s interesting to see how the lobbying lined up. For example, from the Senior Director of Environmental Affairs for the Alliance of Automobile Manufacturers:

The proponents state that an exemption is needed for three activities related to vehicles – diagnosis, repair, and modification. In my limited time, I will explain why, for the first two activities – diagnosis and repair – there is no need to circumvent access controls on Electronic Control Units (ECUs). Then, I will address why tampering with ECUs to “modify” vehicle performance undermines national regulatory goals for clean air, fuel efficiency, and auto safety, and why the Copyright Office should care about that.

1. Diagnosis/repair
The arguments put forward by the proponents of this exemption are unfounded. State and federal regulations, combined with the Right to Repair MOU and the 2002 “Dorgan letter,” guarantee all independent repair shops and individual consumers access to all the information and tools needed to diagnose and repair Model Year 1996 or newer cars. This information and these tools are already accessible online, through a thriving and competitive aftermarket. Every piece of information and every tool used to diagnose and repair vehicles at franchised dealers is available to every consumer and every independent repair shop in America. This has been the case for the past 12 years. Moreover, all of these regulations and agreements require automakers to provide the information and tools at a “fair and reasonable price.” No one in the last 12 years has disputed this fact, in any of the various avenues for review provided, including U.S. EPA, the California Air Resources Board, and joint manufacturer-aftermarket organizations.

There is absolutely no need to hack through technological protection measures and copy ECU software to diagnose and repair vehicles.

2. Modification
The regulations and agreements discussed above do not apply to information needed to “modify” engine and vehicle software. We strongly support a competitive marketplace in the tools and information people need so their cars continue to perform as designed, in compliance with all regulatory requirements. But helping people take their cars out of compliance with those requirements is something we certainly do not want to encourage. That, in essence, is what proponents of exemption #21 are calling for, in asserting a right to hack into vehicle software for purposes of “modification.” In the design and operation of ECUs in today’s automobiles, manufacturers must achieve a delicate balance among many competing regulatory demands, notably emissions (air pollution); fuel economy; and of course, vehicle safety. If the calibrations are out of balance, the car may be taken out of compliance. This is so likely to occur with many of the modifications that the proponents want to make that you could almost say that noncompliance is their goal, or at least an inevitable side effect.

Manufacturer John Deere suggested that:

1. The purpose and character of the use frustrate compliance with federal public safety and environmental regulations

The first fair use factor weighs against a finding of fair use because the purpose and character of the use will encourage non-compliance with environmental regulations and will interfere with the ability of manufacturers to identify and resolve software problems, conduct recalls, review warranty claims, and provide software upgrade versions.

And General Motors seemed to take a similar line:

TPMs also ensure that vehicles meet federally mandated safety and emissions standards. For example, circumvention of certain emissions-oriented TPMs, such as seed/key access control mechanisms, could be a violation of federal law. Notably, the Clean Air Act (“CAA”) prohibits “tampering” with vehicles or vehicle engines once they have been certified in a certain configuration by the Environmental Protection Agency (“EPA”) for introduction into U.S. commerce. “Tampering” includes “rendering inoperative” integrated design elements to modify vehicle and/or engine performance without complying with emissions regulations. In addition, the Motor Vehicle Safety Act (“MVSA”) prohibits the introduction into U.S. commerce of vehicles that do not comply with the Federal Motor Vehicle Safety Standards, and prohibits manufacturers, dealers, distributors, or motor vehicle repair businesses from knowingly making inoperative any part of a device or element of design installed on or in a motor vehicle in compliance with an applicable motor vehicle standard.14

Further, tampering with these systems would not be obvious to a subsequent owner or driver of a vehicle that has been tampered with. If a vehicle’s airbag systems, including any malfunction indicator lights, have been disabled (whether deliberately or inadvertently), a subsequent vehicle owner’s safety will be in jeopardy without warning. Further, if a vehicle’s emissions systems have been tampered with, a subsequent owner would have no way of knowing this has occurred. For tampering that the subsequent owner eventually discovers, manufacturer warranties do not cover the repair of damage caused by the tampering, placing the repair cost on the subsequent owner. For good cause, federal environmental and safety regulations regarding motor vehicles establish a well-recognized overall policy against allowing tampering with in-vehicle electronic systems designed for safety and emissions control.

While so-called “tinkerers” and enthusiasts may wish to modify their vehicle software for personal needs, granting greater access to vehicle software for purposes of modification fails to consider the overall concerns surrounding regulatory compliance and safety and the overall impact on safety and the environment. … Thus, the current prohibition ensures the distribution of safe and secure vehicle software within an overall vehicle security strategy implemented by car manufacturers that does not restrict vehicle owners’ ability to diagnose, modify or repair their cars.

The arguments from the auto lobby therefore go along the lines of “folk can’t mess with the code because they’ll try to break the law” – as opposed to, say, the manufacturers systematically breaking the law, or folk trying to find out why a car performs nothing like its declared figures. And I’m sure there are no elements of the industry wanting to prevent folk from looking at the code lest they find that it has “test circumvention” code baked into it by the actual manufacturers…

What the VW case throws up, perhaps, is the need for a clear route by which investigators are allowed to check the compliance behaviour of such algorithms, not just in formal tests but also in everyday road tests that are unannounced to the engine management system.

And that doesn’t necessarily require on-the-road tests in a real vehicle. If the controller is a piece of software acting on digitised sensor inputs to produce a particular set of control outputs, the controller can be exercised on a digital testbench or test harness against test inputs covering a variety of conditions captured from real-world data logging. This is something I think I need to read up more about… this could be a quick way in to the very basics: National Instruments: Building Flexible, Cost-Effective ECU Test Systems White Paper. Something like this could also be relevant: Gehring, J. and Schütte, H., “A Hardware-in-the-Loop Test Bench for the Validation of Complex ECU Networks”, SAE Technical Paper 2002-01-0801, 2002, doi:10.4271/2002-01-0801 (though the OU Library fails to get me immediate access to this resource… :-().
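By way of illustration only – this isn’t VW’s code, and the controller logic and drive-cycle traces below are entirely invented – a crude software-in-the-loop check might replay test-style and road-style input traces through the control software and compare the calibration modes it selects:

    # Toy software-in-the-loop harness (illustrative only): replay logged
    # drive cycles through a controller function and compare the
    # calibrations it chooses. Controller logic and traces are invented.

    def controller(speed_kmh, steering_deg, runtime_s):
        """A toy ECU calibration chooser that spots a dyno-like cycle
        (no steering input, bounded runtime) and switches mode."""
        if steering_deg == 0 and runtime_s < 1800:
            return "dyno calibration"   # full emissions control
        return "road calibration"       # reduced emissions control

    # One trace mimicking a fixed federal test cycle, one mimicking
    # everyday driving (speed varying, steering wheel in use).
    dyno_trace = [(50, 0, t) for t in range(0, 600, 10)]
    road_trace = [(50 + (t % 30), 5, t) for t in range(0, 600, 10)]

    def modes(trace):
        return {controller(*sample) for sample in trace}

    print("dyno cycle modes:", modes(dyno_trace))  # {'dyno calibration'}
    print("road cycle modes:", modes(road_trace))  # {'road calibration'}
    # Any divergence between the two mode sets is just the sort of thing
    # an off-cycle compliance harness could flag for closer inspection.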

PS In passing, I just spotted this: Auto Parts Distributor Pleads Guilty to Manufacturing and Selling Pirated Mercedes-Benz Software – it seems that Mercedes-Benz distributes “a portable tablet-type computer that contains proprietary software created by Mercedes-Benz to diagnose and repair its automobiles and that requires a code or ‘license key’ to access [it]”, and that a company had admitted to obtaining “without authorization, … [the] Mercedes-Benz SDS software and updates, modified and duplicated the software, and installed the software on laptop computers (which served as the SDS units)”. So a simple act of software copyright/license infringement, perhaps, relating to offboard testing and diagnostic tools. But it’s another piece in the jigsaw, for example, when it comes to engineering software that can perform diagnostics.

PPS Via @mhawksey, a link to the relevant West Virginia University test report – In-use emissions testing of light-duty diesel vehicles in the U.S. – and Martin’s observation that there are several references to Volkswagen’s new 2.0l TDI engine for the most stringent emission standards – Part 2 (reference [31] in the paper: Hadler, J., Rudolph, F., Dorenkamp, R., Kosters, M., Mannigel, D., and Veldten, B., “Volkswagen’s New 2.0l TDI Engine for the Most Stringent Emission Standards – Part 2”, MTZ Worldwide, Vol. 69, June 2008), which the OU Library at least doesn’t subscribe to… :-(

“Interestingly” the report concluded:

In summary, real-world NOx emissions were found to exceed the US-EPA Tier2-Bin5 standard (at full useful life) by a factor of 15 to 35 for the LNT equipped Vehicle A, by a factor of 5 to 20 for the urea-SCR fitted Vehicle B (same engine as Vehicle A) and at or below the standard for Vehicle C with exception of rural-up/downhill driving conditions, over five predefined test routes. Generally, distance-specific NOx emissions were observed to be highest for rural-up/downhill and lowest for high-speed highway driving conditions with relatively flat terrain. Interestingly, NOx emissions factors for Vehicles A and B were below the US-EPA Tier2-Bin5 standard for the weighted average over the FTP-75 cycle during chassis dynamometer testing at CARB’s El Monte facility, with 0.022g/km ±0.006g/km (±1σ, 2 repeats) and 0.016g/km ±0.002g/km (±1σ, 3 repeats), respectively.

It also seems that the researchers spotted what might be happening – and so were able to explain the apparently anomalous results they were getting – with help from reference 31: “The probability of this explanation is additionally supported by a detailed description of the after-treatment control strategy for Vehicle A presented elsewhere [31]”.

PPPS I guess one way of following the case might be track the lawsuits, such as this class action complaint against VW filed in California (/via @funnymonkey) or this one, also in San Francisco, or Tennessee, or Georgia, or another district in Georgia, or Santa Barbara, or Illinois, or Virginia, and presumably the list goes on… (I wonder if any of those complaints are actually informative/well-researched in terms of their content? And whether they are all just variations on a similar theme? In which case, they could be an interesting basis for a comparative text analysis?)

In passing, I also note that suits have previously – but still recently – been filed against VW, amongst others, regarding misleading fuel consumption claims, for example in the EU.

Even Though RSS Never Went Away, Could It Be Coming Back as a Facebook Sinker?

Long time readers will know I was – am – a huge fan of RSS and Atom, simple feed based protocols for syndicating content and attachment links, even going so far as to write a manifesto of a sort at one point (We Ignore RSS at OUr Peril).

This blog, and the earlier archived version of it, are full of reports and recipes around various RSS experiments and doodles, although in more recent years I haven’t really been using RSS as a creative medium that much, if at all.

But today I noticed this on the official Facebook developer blog: Publishing Instant Articles Directly From Your Content Management System [Instant Article docs]. Or more specifically, this:

When publishers get started with Instant Articles, they provide an RSS feed of their articles to Facebook, a format that most Content Management Systems already support. Once this RSS feed is set up, Instant Articles automatically loads new stories as soon as they are published to the publisher’s website and apps. Updates and corrections are also automatically captured via the RSS feed so that breaking news remains up to date.

So… Facebook will use RSS to sync content into Facebook from publishers’ CMSs.
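The mechanics of that sort of feed-driven sync are simple enough – here’s a rough sketch using the Python feedparser library, with a placeholder feed URL:

    # Sketch: poll a publisher's RSS feed and spot new or updated articles
    # by entry id/updated timestamp. Uses feedparser; URL is a placeholder.
    import time
    import feedparser

    FEED_URL = "https://example.com/articles/feed"  # placeholder
    seen = {}  # entry id -> last seen 'updated' value

    while True:  # runs forever, as a polling sketch
        for entry in feedparser.parse(FEED_URL).entries:
            key = entry.get("id", entry.link)
            updated = entry.get("updated", "")
            if seen.get(key) != updated:
                seen[key] = updated
                print("new/updated article:", entry.title, entry.link)
        time.sleep(300)  # poll again in five minutes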

Depending on the agreement Facebook has with the publishers, it may require that those feeds are private, rather than public, feeds that sink the content directly into Facebook.

But I wonder, will it also start sinking content from other independent publishers into the Facebook platform via those open feeds, providing even less reason for Facebook users to go elsewhere as it drops bits of content from the open web into closed, personal Facebook News Feeds? Hmmm…

There seems to be another sort of a grab for attention going on too:

Each Instant Article is associated with the URL where the web version is hosted on the publisher’s website. This means that Instant Articles are open and compatible with all of the ways that people share links around the web today:

  • When a friend or page you follow shares a link in your News Feed, we check to see if there is an Instant Article associated with that URL. If so, you will see it as an Instant Article. If not, it will open on the web browser.
  • When you share an Instant Article on Facebook or using email, SMS, or Twitter, you are sharing the link to the publisher website so anyone can open the article no matter what platform they use.

Associating each Instant Article with a URL makes it easy for publishers to adopt Instant Articles without changing their publishing workflows and means that people can read and share articles without thinking about the platform or technology behind the scenes.

Something like this maybe?


Which is to say, this?


Or maybe not. Maybe there is some enlightened self-interest in this, and perhaps Facebook will see a reason to start letting its content out via open syndication formats like RSS.

Or maybe RSS will end up sinking the Facebook platform, by allowing Facebook users to go off the platform but still accept content from it?

Whatever the case, as Facebook becomes a set of social platform companies rather than a single platform company, I wonder: will it have an open standard, feed based syndication bus to help content flow within and around those companies? Even if that content is locked inside the confines of a Facebook-parent-company-as-web attention wall?

PS So the ‘related content’ feature on my WordPress blog associates this post with an earlier one: Is Facebook Stifling the Free Flow of Information?, which it seems was lamenting an earlier decision by Facebook to disable the import of content into Facebook using RSS…?! What goes around, comes around, it seems?!

Tweetable Bullet Points

Reading through a Pew Research blog post just now announcing the highlights of a new report (Libraries at the Crossroads), I spotted various Twitter icons around the document, not unlike the sort of speech-bubble comment icons used in paragraph-level commenting systems.


Nice – built-in opportunities for social amplification of particular bullet points:-)

PS thinks: hmm, maybe I could build something similar into auto-generated copy for contexts like this?

PPS via @ned_potter, the suggestion that maybe they’re making use of this WordPress plugin: ClickToTweet. Hmmm… And via @mhawksey, this plugin: Inline Tweet Sharer.

“Student for Life” – A Lifelong Learning Relationship With Your University… Or Linked In?

I’ve posted several times over the years wondering why universities don’t try to reimagine themselves as lifelong educational service providers, seeing undergrad degrees as the starting point for a lifelong learning relationship with their first-degree alumni.

A paragraph in a recent Educause Review article, Data, Technology, and the Great Unbundling of Higher Education (via @Downes [commentary]), caught my eye this morning:

In an era of unbundling, when colleges and universities need to move from selling degrees to selling EaaS subscriptions, the winners will be those that can turn their students into “students for life” — providing the right educational programs and experiences at the right time.

On a quick read, there’s a lot in the article I don’t like, even though it sounds eminently reasonable, positing a competency based “full-stack model to higher education” in which providers will (1) develop and deliver specific high-quality educational experiences that produce graduates with capabilities that specific employers desperately want; (2) work with students to solve financing problems; and (3) connect students with employers during and following the educational experience and make sure students get a job.

The disruption that HE faces, then, is not one about course delivery, but rather one about life-as-career management?

What if … the software that will disrupt higher education isn’t courseware at all? What if the software is, instead, an online marketplace? Uber (market cap $40 billion) owns no vehicles. Airbnb (market cap $10 billion) owns no hotel rooms. What they do have are marketplaces with consumer-friendly interfaces. By positioning their interfaces between millions of consumers and sophisticated supply systems, Uber and Airbnb have significantly changed consumer behavior and disrupted these supply systems.

Is there a similar marketplace in the higher education arena? There is, and it has 40 million college students and recent graduates on its platform. It is called LinkedIn.

Competency marketplaces will profile the competencies (or capabilities) of students and job seekers, allow them to identify the requirements of employers, evaluate the gap, and follow the educational path that gets them to their destination quickly and cost-effectively. Although this may sound like science fiction, the gap between the demands of labor markets and the outputs of our educational system is both a complex sociopolitical challenge and a data problem that software, like LinkedIn, is in the process of solving. …

(I’m not sure if I don’t like the article because I disagree with it, or because it imagines a future I’d rather not see play out: the idea that learners don’t develop a longstanding relationship with a particular university – and consequently miss out on the social and cultural memories and relationships that develop therein – instead taking occasional offerings from a wide variety of providers and having their long term relationship with someone like LinkedIn, feels to me like something will be lost. Martin Weller captures some of it, I think, in his reflection yesterday on Product and process in higher ed, in terms of how the “who knows?!” answer to the “what job are you going to do with that?” question about a particular degree becomes a nonsense answer, because the point of the degree has become just that: getting a particular job. Rather than taking a degree to widen your options, the degree becomes a way of narrowing them down?! Maybe the first degree should be about setting yourself up to become a specialist over the course of occasional and extended education over a lifetime? UPDATE: related, this quote from an article on the “death of Twitter”: When a technology is used to shrink people’s possibilities, more than to expand them, it cannot create value for them. And so people will simply tune it out, ignore it, walk away from it if they can. In the sense that universities are a technology… hmmm…)

Furthermore, I get twitchy about this being another example of a situation where it’s tradable personal data that’s the valuable bargaining chip:

To avoid marginalization, colleges and universities need to insist that individuals own their competencies. Ensuring that ownership lies with the individual could make the competency profile portable and could facilitate movement across marketplaces, as well as to higher education institutions.

(As for how competencies are recognised, and fraud avoided in terms of folk claiming a competency that hasn’t been formally qualified, I’ve pondered this before, eg in the context of Time to build trust with an open achievements API?, or this idea for a Qualification Verification Service. It seems to me that universities don’t see it as their business to prove that folk hold the qualifications or certificates they’ve been awarded – which presumably means that if it does become yet another part of the EaaS marketplace, it’ll be purely corporate commercial interests that manage it.)
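Just to sketch the sort of thing such a verification service might do – purely hypothetical, with invented names and fields – the awarding body could sign each award claim, so that anyone holding the verification key can check that a claim hasn’t been tampered with:

    # Purely hypothetical sketch of a qualification verification scheme:
    # the awarding body signs a claim; a verifier checks the signature.
    # (A real service would use public-key signatures rather than a
    # shared secret; all names and fields here are invented.)
    import hashlib, hmac, json

    AWARDING_BODY_KEY = b"university-secret-key"  # invented for illustration

    def sign_award(claim):
        payload = json.dumps(claim, sort_keys=True).encode()
        return hmac.new(AWARDING_BODY_KEY, payload, hashlib.sha256).hexdigest()

    def verify_award(claim, signature):
        return hmac.compare_digest(sign_award(claim), signature)

    claim = {"holder": "A. Student", "award": "BSc Open Degree", "year": 2015}
    sig = sign_award(claim)
    print(verify_award(claim, sig))                   # True
    print(verify_award(dict(claim, year=2016), sig))  # False - tampered claim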

We’ll see….

Getting Your Own Space on the Web…

To a certain extent, we can all be web publishers now: social media lets us share words, pictures and videos; online office suites allow us to publish documents and spreadsheets; code repositories allow us to share code; other sites let you publish specific sorts of applications; and so on.

So where do initiatives like a domain of one’s own come in – initiatives which (originally, at least) provide members of a university, staff and students alike, with a web domain and web hosting of their own?

One answer is that they provide a place on the web for you to call your own. With a domain name registered (and nothing else – no server requirements, no applications to install) you can set up an email address that you and you alone own, and use it to forward mail sent to that address to any other address. You can also use your domain as a forwarding address or alias for other locations on the web. My domain forwards traffic to a blog hosted on WordPress.com (I pay WordPress for the privilege of linking to my site there with my domain address); another domain I registered acts as an alias to a site hosted elsewhere.

The problem with using my domains like this is that I can only forward traffic to sites that other people operate – and what I can do on those sites is limited by those providers. WordPress is a powerful web publishing platform, but the hosted service only offers a locked-down experience, with no allowance for customising the site using your own plugins. If I paid for my own hosting and ran my own WordPress server, the site could be a lot richer. But then I would in turn have to administer the site for myself, running updates, being responsible – ultimately – for the security and resource provisioning of the site myself.

Taking the step towards hosting your own site is a big one for many people (I’m too lazy to host my own sites, for example…). But initiatives like Reclaim Hosting, and more recently OU Create (from the University of Oklahoma, not the UK OU), originally inspired by a desire to provide a personal playspace in which students could explore their own digital creativity and give them a home on the web in which they could run their own applications, eased the pain for many: the host could be trusted, was helpful, and was affordable.

The Oklahoma create space allows students to register a subdomain or custom domain and associate it with what is presumably a university-hosted serverspace into which users can install their own applications.

So it seems to me we can tease apart two things:

  • firstly, the ability to own a bit of the web’s “namespace” by registering your own domain;
  • secondly, the ability to own a bit of the web’s “functionality space”: running your own applications that other people can connect to and make use of; this might be running your own blogging platform, possibly customised using your own, or third party, extensions, or it might be running one or more custom applications you have developed yourself.

But what if you don’t want the responsibility of running, and maintaining, your own applications day in, day out? What if you only want to share an application on the web for a short period of time? What if you want to be able to “show and tell” an application for a particular class, and then put it back on the shelf, available to use again but not always running? Or what if you want to access an application that might be difficult to install, or isn’t available for your computer? Or you’re running a netbook or tablet, and the application you want isn’t available as an app, just as a piece of “traditionally installed software”?

I’ve started to think that docker style containers may offer a way of doing this. I’ve previously posted a couple of examples of how to run RStudio or OpenRefine via docker containers using a cloud host. How much nicer it would be if I could run such containers on a (sub)domain of my own running via a university host…

Which is to say – I don’t necessarily want a full hosting solution on a domain of my own, at least not to start with, but I do want to be able to add my own bits of functionality to the web, if only for short periods of time. That is, what I’d quite like is a convenient place to “publish” (in the sense of “run”) my own containerised apps, and then rip them down. And then, perhaps at a later date, take them away and run them on my own fully hosted domain.
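To make that concrete, here’s a minimal sketch of the run-it-then-rip-it-down pattern using the Docker SDK for Python; the OpenRefine image name and port mapping are assumptions based on my earlier recipes, not a tested configuration:

    # Sketch: fire up a containerised app for a session, then tear it down.
    # Uses the Docker SDK for Python (pip install docker); the image name
    # and port below are assumptions, not a tested configuration.
    import docker

    client = docker.from_env()  # talks to whichever docker host is configured

    refine = client.containers.run(
        "psychemedia/ou-tm351-openrefine",  # assumed image name
        ports={"3333/tcp": 3333},           # OpenRefine's default port
        name="openrefine-demo",
        detach=True,
    )
    print("OpenRefine up at http://localhost:3333")

    # ...and when the class or demo is over, put it back on the shelf:
    refine.stop()
    refine.remove()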

Seven Graphical Interfaces to Docker

From playing with docker over the last few weeks, I think it’s worth pursuing as a technology for deploying educational software to online and distance education students, not least because it offers the possibility of using containers as app runners that can run an app on your own desktop, or via the cloud.

The command line is probably something of a blocker to users who expect GUI tools, such as a one-click graphical installer, or double click to start an app, so I had a quick scout round for graphical user interfaces in and around the docker ecosystem.

I chose the following apps because they are directed more at the end user – launching prebuilt apps, and putting together simple app compositions. There are some GUI tools aimed at devops folk to help with monitoring clusters and running containers, but that’s out of scope for me at the moment…

1. Kitematic

Kitematic is a desktop app (Mac and Windows) that makes it one-click easy to download images from docker hub and run associated containers within a local docker VM (currently running via boot2docker?).

I’ve blogged about Kitematic several times, but to briefly summarise: Kitematic allows you to launch and configure individual containers, as well as providing easy access to a boot2docker command line (which can be used to run docker-compose scripts, for example). Simply locate an image on the public docker hub, download it and fire up an associated container.


Where a mount point is defined to allow sharing between the container and the host, you can simply select the desktop folder you want to mount into the container.

At the moment, Kitematic doesn’t seem to support docker-compose in a graphical way, or allow users to deploy containers to a remote host.

2. Panamax

Panamax is a browser-rendered graphical environment for pulling together image compositions, although it currently needs to be started from the command line. Once the application is up and running, you can search for images or templates:


Templates seem to correspond to fig/docker-compose style assemblages, with Panamax providing an environment for running pre-existing ones or putting together new ones. I think the Panamax folk ran a competition some time ago to try to encourage folk to submit public templates, but that doesn’t seem to have gained much traction.


Panamax supports deployment locally or to a remote web host.


When I first came across docker, I found Panamax really exciting because of the way it provided support for linking containers. Now I just wish Kitematic would offer some graphical support for docker-compose that would let me drag different images onto a canvas, create a container placeholder each time I do, and then wire the containers together. Underneath, it’d just build a docker-compose file.

The public project files are useful – it’d be great to see more sharing of generally useful docker-compose scripts and associated quick-start tutorials (eg WordPress Quickstart With Docker).
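By way of illustration of the kind of wiring a docker-compose file – or the hoped-for Kitematic canvas – captures, here’s the same idea done programmatically with the Docker SDK for Python: a WordPress container talking to a MySQL container over a shared network. The environment variables follow the official wordpress and mysql images; treat it as a sketch rather than a production setup:

    # Sketch: the wiring a docker-compose file captures, done by hand with
    # the Docker SDK for Python (pip install docker). Demo values only.
    import docker

    client = docker.from_env()
    client.networks.create("wp-demo")  # containers resolve each other by name

    db = client.containers.run(
        "mysql:5.7",
        environment={"MYSQL_ROOT_PASSWORD": "example"},  # demo password only
        network="wp-demo",
        name="db",
        detach=True,
    )

    wp = client.containers.run(
        "wordpress",
        environment={
            "WORDPRESS_DB_HOST": "db",          # the mysql container's name
            "WORDPRESS_DB_PASSWORD": "example",
        },
        ports={"80/tcp": 8080},  # then browse to http://localhost:8080
        network="wp-demo",
        detach=True,
    )
    # A docker-compose file encodes exactly this - images, environment,
    # ports and the network between the containers - declaratively.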

3. lorry.io

Lorry.io is a graphical tool for building docker-compose files, but it doesn’t have the drag, drop and wire-together features I’d like to see. Lorry.io is published by CenturyLink, who also publish Panamax (lorry.io is the newer development, I think?).


Lorry.io lets you specify your own images or build files, find images on Docker Hub, and configure well-formed docker-compose YAML scripts from auto-populated drop-down menu selections that are sensitive to the current state of the configuration.

4. docker ui

docker ui is a simple container app that provides an interface, via the browser, into a currently running docker VM. As such, it allows you to browse the installed images and the state of any containers.


Kitematic offers a similar sort of functionality in a slightly friendlier way. See additional screenshots here.

5. tutum Cloud

I’ve blogged about tutum a couple of times before – it was the first service that I could actually use to get containers running in the cloud: all I had to do was create a Digital Ocean account, pop some credit onto it, and then I could link directly to it from tutum and launch containers on Digital Ocean straight from the tutum online UI.


I’d love to see some of the cloud deployment aspects of tutum make it into Kitematic…

See also things like

6. docker Compose UI

The docker compose UI looks as if it provides a browser based interface to manage deployed container compositions, akin to some of the dashboards provided by online hosts.


I couldn’t get it to work… I get the feeling it’s like the docker ui but with better support for managing all the containers associated with a particular docker-compose file.

7. ImageLayers

Okay – I said I was going to avoid devops tools, but this is another example of the sort of thing that may be handy when trying to put a composition of several containers together, because it might help identify layers that can be shared across different images. ImageLayers looks like it pokes through the Dockerfile of one or more containers and shows you the layers that get built.


I’m not sure if you can point it at a docker-compose file and let it automatically pull out the layers from identified sources (images, or build sources)?