Visual Controls for Spreadsheets

Some time ago, Paul Walk remarked that “Yahoo Pipes [might] do for web development what the spreadsheet did for non-web development before it (Microsoft Excel has been described as the most widely used Integrated Development Environment)”. After seeing how Google spreadsheets could be used as part of a quick online mashup at the recent Mashed Library event, Paul revised this observation along the lines of “the online spreadsheet [might] do for web development what the spreadsheet did for non-web development before it”.

In An Ad Hoc Youtube Playlist Player Gadget, Via Google Spreadsheets, I showed how a Google gadget can be used as a container for arbitrary Javascript code that processes the contents of one or more Google spreadsheet cells. Combined with the ability to pull XML content from a remote location into a spreadsheet in real time, this suggests that there is a lot more life in the spreadsheet than one might previously have thought.

So in a spirit of “what if” I wonder whether there is an opportunity for spreadsheets to take the next step towards being a development platform for the web in the following ways:

  • by offering support for a visual controls API (cf. the Google Visualization API), which would provide a set of visual controls – sliders, calendar widgets and so on – that could directly change the state of a spreadsheet cell. I don’t know if the Google spreadsheet gadgets have helper functions that already support the ability to write, or change, cell values, but the Google GData spreadsheet API does support updates (e.g. updating cells and Updating rows). Just like the visualization API lets you visually chart the contents of a set of cells, a visual controls API could provide visual interfaces for writing and updating cell values. So if anyone from the Lazyweb is listening, any chance of a trivial demo showing how to use something like a YUI slider widget within a Google spreadsheet gadget to update a spreadsheet cell? (There’s a very rough sketch of the sort of thing I mean after this list.) Or maybe a video type, that would take the URL of a media file, or the splash page URL for a video on something like Youtube, and automatically create a player/popup player for the video if you select it? Or similarly, an audio player for an MP3 file? Or a slideshow widget for a set of image file cells?
  • “Rich typed” cells; for example, Pamela Fox showed how to use a map gadget in Google spreadsheets to geocode some spreadsheet location cells (Geocoding with Google Spreadsheets (and Gadgets)), so how would it be if we could define a location type cell which actually had a couple of other cells associated with it in “another dimension” that were automatically populated with latitude and longitude values, based on a geocoding of the location entered into the “location type” cell?
  • “real cell relative” addressing; I don’t really know much about spreadsheets, so I don’t know whether such a facility already exists, but is it possible to “really relatively reference” one cell from another? For example, could I create a formula along the lines of ={-1,-1}*{-1,0} that would take a cell “left one and up one” ({-1,-1}) and multiply it by the contents of the cell “left one” ({-1,0})? So e.g. if I paste the formula into C3, it performs the calculation B2*B3? (See the note after this list.)
  • Rich typed cells could go further, and automatically pop up an appropriate visual control if the cell is typed that way (as a “slider controlled value”, for example); and a date type cell might launch a calendar control when you try to edit it?
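
On the first of those points, here’s a minimal, untested sketch of the sort of thing I have in mind: a YUI 2 slider whose value gets written back into a single spreadsheet cell via the GData cells feed. To be clear, this is my own guesswork rather than anything blessed by Google or Yahoo – the spreadsheet key, worksheet id (“od6” is just the usual default), target cell and auth token are all placeholders, the authentication plumbing is glossed over entirely, and inside a real gadget you would presumably use the gadget API’s own request mechanism rather than a raw cross-domain XMLHttpRequest:

// Untested sketch: wire a YUI 2 slider to a spreadsheet cell update via the
// Google Spreadsheets (GData) cells feed. SPREADSHEET_KEY, "od6", the target
// cell and AUTH_TOKEN are placeholders; obtaining a valid auth token is not shown.
// Assumes page markup like <div id="sliderbg"><div id="sliderthumb"></div></div>
// and the YUI 2 dom, event, dragdrop and slider scripts already loaded.
var slider = YAHOO.widget.Slider.getHorizSlider("sliderbg", "sliderthumb", 0, 200, 1);

// PUT a new inputValue into cell R{row}C{col} of the given worksheet.
function updateCell(key, worksheetId, row, col, value, authToken) {
  var cellUrl = "http://spreadsheets.google.com/feeds/cells/" + key + "/" +
                worksheetId + "/private/full/R" + row + "C" + col;
  var entry =
    '<entry xmlns="http://www.w3.org/2005/Atom"' +
    ' xmlns:gs="http://schemas.google.com/spreadsheets/2006">' +
    '<id>' + cellUrl + '</id>' +
    '<gs:cell row="' + row + '" col="' + col + '" inputValue="' + value + '"/>' +
    '</entry>';
  var xhr = new XMLHttpRequest();
  xhr.open("PUT", cellUrl, true);
  xhr.setRequestHeader("Content-Type", "application/atom+xml");
  xhr.setRequestHeader("Authorization", "GoogleLogin auth=" + authToken);
  xhr.setRequestHeader("If-Match", "*"); // don't fuss about edit versions
  xhr.send(entry);
}

// Write the slider position into cell B2 (row 2, column 2) when a drag ends.
slider.subscribe("slideEnd", function () {
  updateCell("SPREADSHEET_KEY", "od6", 2, 2, slider.getValue(), "AUTH_TOKEN");
});

And on the “real cell relative addressing” point, I suspect something close already exists via R1C1-style references: pasting =INDIRECT("R[-1]C[-1]", FALSE)*INDIRECT("R[0]C[-1]", FALSE) into C3 should, if I’ve got my offsets right, calculate B2*B3 in Excel – and I think Google spreadsheets has an INDIRECT function too, though I haven’t checked whether it accepts R1C1-style references.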

PS for my thoughts on reinventing email, see Sending Wikimail Messages in Gmail ;-)

An Ad Hoc Youtube Playlist Player Gadget, Via Google Spreadsheets

A tweet from Keir Clarke from Google Maps Mania last week tipped me off to this post from maps evangelist Pamela Fox – Geocoding with Google Spreadsheets (and Gadgets) – in which she demonstrates how to improve the spreadsheets to maps workflow using a Google spreadsheet gadget.

I’d actually been thinking about using a mapplet in the maps environment (rather than a gadget in the spreadsheets environment) to do something similar, so it was great to see how someone else had set about tackling the matter :-)

Anyway, a quick look through the spreadsheet gadgets tutorial convinced me it should be easy enough to create a gadget that could act as a video playlist player for a set of Youtube movie URLs listed in a spreadsheet. I already had some gadget code that I guessed might be reusable, and it turned out it was (Google Gadgets – RSS Feed Powered YouTube Playlist Player).
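
Just for reference, the boilerplate a spreadsheet gadget uses to get at the highlighted cells follows the standard Visualization API pattern, something along these lines (an untested sketch rather than my actual gadget code; it assumes the Google jsapi loader is included and a div with id “playlist” exists in the gadget HTML, and the buildPlaylist function here is just a stand-in for the player side of things):

// Rough sketch of how a spreadsheet gadget reads the range of cells it has been
// pointed at (handed to it via the _table_query_url user pref).
var prefs = new _IG_Prefs();
var gadgetHelper = null;

function initGadget() {
  gadgetHelper = new google.visualization.GadgetHelper();
  // Build a query against the spreadsheet range the user highlighted.
  var query = gadgetHelper.createQueryFromPrefs(prefs);
  query.send(handleQueryResponse);
}

function handleQueryResponse(response) {
  if (!gadgetHelper.validateResponse(response)) return;
  var data = response.getDataTable();
  var urls = [];
  for (var row = 0; row < data.getNumberOfRows(); row++) {
    urls.push(data.getValue(row, 0)); // one Youtube movie URL per row
  }
  buildPlaylist(urls);
}

// Stand-in for the player: the real gadget builds a Youtube playlist player
// from the list of movie URLs; here we just dump them into the page.
function buildPlaylist(urls) {
  document.getElementById("playlist").innerHTML = urls.join("<br/>");
}

google.load("visualization", "1");
google.setOnLoadCallback(initGadget);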

Here’s a demo:

Highlight the list of cells and include the custom gadget URL for the player: http://hosting.gmodules.com/ig/gadgets/file/100510412849522254945/videoPlaylistSpreadsheetNew.xml?nocache.

As to where the list of videos came from? I scraped them from a webpage that included lots of embedded videos (i.e. a webpage that was essentially an ad hoc video playlist). A quick peek at the source of a candidate page showed me where I could find the URLs:

If we now load this page into a Google spreadsheet using the =importXML formula (not the =importHTML formula), we can use an XPath expression to pull out all the movie URLs from the page – that is, the value attributes of the <param name="movie"> elements sitting inside each embedded <object>.

Here’s the expression you need:
//object/param[@name='movie']/@value
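
So putting that together, the formula in the spreadsheet looks something like this (the page URL here is just a placeholder for whatever page you want to scrape):

=importXML("http://example.com/video-playlist-page.html", "//object/param[@name='movie']/@value")

If I remember correctly, importXML writes one matched value per row, so the scraped movie URLs fill the column below the cell containing the formula.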

For a couple of examples, see this demo of how to scrape a list of Youtube movies from a webpage using Google spreadsheets and view them in a Google gadget.

Speedmash and Mashalong

Last week I attended the very enjoyable Mashed Library event, which was pulled together by Owen Stephens (review here).

My own contribution was in part a follow-on to the APIs session I attended at CETIS08 – a quick demo of how to use Yahoo Pipes and Google spreadsheets as rapid mashing tools. I had intended to script what I was going to do quite carefully, but an extended dinner at Sagar (which I can heartily recommend :-) put paid to that, and the “script” I did put together just got left by the wayside…

However, I’ve started thinking that a proper demo session, lasting one to two hours, with 2-4 hrs playtime to follow, might be a Good Thing to do… (The timings would make for either a half day or full day session, with breaks etc.)

So just to scribble down a few thoughts and neologisms that cropped up last week, here’s what such an event might involve, drawing on cookery programmes to help guide the format:

Owen’s observation that the flavour of the Mashed Library hackathon was heavily influenced by the “presentations” was well made; so maybe it makes sense to try building a programme around pushing a certain small set of tools and APIs, effectively offering “micro-training” in them to start with, and then exploring their potential use in the hands-on sessions? It might also mean we could get the tools’n’API providers to offer a bit of sponsorship, e.g. in terms of covering the catering costs?

So, whaddya think? Worth a try in the New Year? If you think it might work, prove your commitment by coming up with a T-shirt design for the event, were it to take place ;-)

PS hmm, all these cookery references remind me of the How Do I Cook? custom search engine. Have you tried searching it yet?

PPS I guess I should also point out the JISC Developer Happiness Days event that is booked for early next year. Have you signed up yet?;-)

So What Do You Think You’re Doing, Sonny?

A tweet from @benjamindyer alerted me to a trial being run in Portsmouth where “behavioural analytics” are being deployed on the city’s CCTV footage in order to “alert a CCTV operator to a potential crime in the making” (Portsmouth gets crime-predicting CCTV).

I have to say this reminded me, in equal measure, of Philip Kerr’s A Philosophical Investigation and the film Minority Report, both of which explore, in different ways, the idea of “precrime”, or at least the likelihood of a crime occurring, although I suspect the behavioural video analysis still has some way to go before it is reliable…!

When I chased the “crime predicting CCTV” story a little, it took me to Smart CCTV, the company behind the system being used in Portsmouth.

And seeing those screenshots, I wondered – wouldn’t this make for a brilliant bit of digital storytelling, in which the story is a machine interpretation of life going on, presented via a series of automatically generated, behavioural analysis subtitles, as we follow an unlikely suspect via the CCTV network?

See also: CCTV hacked by video artists, Red Road, Video Number Plate Recognition (VNPR) systems, etc. etc.

PS if you live in Portsmouth, you might as well give up on the idea of privacy. For example, add in a bit of Path Intelligence, “the only automated measurement technology that can continuously monitor the path that your shoppers or passengers take” which is (or at least, was) running in Portsmouth’s Gunwharf Quays shopping area (Shops track customers via mobile phone), and, err, erm… who knows?!

PPS it’s just so easy to feed paranoia, isn’t it? Gullible Twitter users hand over their usernames and passwords – did you get your Twitterank yet?! ;-)

Steps Towards Making Augmented Reality a Reality?

I’ve been a fan of the potential of augmented reality for some time (see Introducing Augmented Reality – Blending Real and Digital Worlds for some examples why…) but there have so far always been a couple of major stumbling blocks in the way of actually playing with this stuff. One has been the need to download and install the AR application itself; the other has been to get a hard copy, or print out, of the registration images that are used as the base for the digital overlay.

So when I saw this demo of a browser based Flash Augmented Reality application (via TechCrunch), I realised that the application installation barrier could soon be about to crumble… (though there is still potentially a compute power issue – the image registration and tracking is computationally expensive, which means the Flash app is not yet as reliable as a compiled, downloaded application).

The issue of having to print out the registration image still remains, however.

[Cue sideways glance to camera, and TV presenter mode;-)] Or does it?

Because it struck me that I have a portable, programmable image service to hand – my iPod touch. So maybe I could just display the registration image on that, and show it to my laptop…

http://interactive.digitalpictures.com.au/?p=392

(A copy of the registration image is at http://is.gd/9ABh if you want to give it a go. The application code itself can be found at FLARToolkit.)

It also strikes me that maybe training the AR package on an image shown in an actual iPhone would be another way to go – making use of the iPhone/iPod Touch itself to help frame the image? (My iPod touch has a well defined black border around the edge of the screen after all…)

So here then we have another way of using two media in sympathy with each other to enrich an act of communication (cf. Printing Out Online Course Materials With Embedded Movie Links and Dual View Media Channels).

Finally, browsing the comments in the TechCrunch post, I found this link demoing an ARToolkit app for the iPhone:

So it looks like a magic lens app for the iPhone might not be so far away?

And if you or a friend has a second large-screen smartphone (or ebook reader) to hand, you can use it as “magic paper” to render any required registration image or set of images, as shown above! ;-)

PS see also Wikitude (here), an Android app that will overlay a camera view with information about points of interest.

Are you keeping up with all this? I’m not…

PS see also AR virtual pet game for iPhone.

OU Goes Social with “Platform”

Earlier this week, the OU quietly opened up its new social site – Platform – with a mailing going out today to inform students and alumni about its availability…

…and at first sight, it’s looking really good:

As a distance learning institution, our students potentially miss out on the sense of community that you get as a student in a traditional university, although we work hard at engaging students in online forums at a course level, and the students’ association (OUSA) tries to support general interest groups, again with online forums. At a regional and local level, course tutorials offer students a chance to meet face to face (although there is an increasing number of wholly online courses), and our students also take it on themselves to create their own local groups, Facebook groups, and so on.

So I’m guessing that one of the functions of the Platform site is to help develop the wider community feeling that membership of a university provides, alongside the course cohort communities.

But more than that – the site is open to anyone, whether or not they are a current student or part of the OU alumni. And there’s no hard sell…

So what’s on Platform?

The front page is a general news page that also currently includes a couple of “interactive” features, specifically a poll and a Youtube video from one of the OU View channels on Youtube (The Open University, OU Life or OU Learn). (I assume that the polls, and maybe the video, will change on a regular basis?)

There’s also what looks like a “learning fact of the day” panel that provides a link to an actual “course sales” page in a reasonably un-intrusive way.

Just in passing, it’s worth comparing this panel with the OU “Learning Fact of the Day” widget, which actually links through to an OpenLearn course from which the fact was pulled, rather than driving the viewer to a page on the course selling catalogue.

Something that is not obviously on the site is a schedule of OU/BBC programmes, or even an OU/BBC iPlayer channel? Maybe that’s because the positioning of this site relative to the open2.net site is not fully clear yet? Certainly I could see Platform cannibalising open2’s traffic if Platform started publicising OU/BBC programmes? But Open2 is looking rather tired… (That said, things are happening on that site. For example, it is starting to include extra video features around our broadcast TV programmes, as the Barristers wraparound site shows (if you can manage to navigate round it to actually find the content, that is ;-), and commenting around the programme pages is slowly starting to take off – see for example the comments around the James May’s Big Ideas: Man-Machine programme.)

But back to the Platform site…

The News tab links to a set of news stories created, I guess, by OU staff (at the moment?). And I’m guessing there’ll be a mix of text stories as well as audio packages. (Though I do take issue with calling linked-to audio a “podcast”, I have to admit ;-)

Two more things to note about that audio link: firstly, it’s a link rather than an embedded player plus a link – clicking the link opened a player in a new window in my browser. That’s a shame… it would have been much neater if there was an embedded player there. Secondly, here’s where it’s pointing to: http://podcast.open.ac.uk/feeds/platform/20081124T124715_is_reality_tv_ruining_music.mp3 – that is, the OU podcast site (which is (a) still not out of testing/really launched yet, and (b) not the OU iTunesU site). (I’m not sure how much the content from those sites will overlap.) And from a little tweet I heard a week or two ago, the podcast site actually uses Amazon S3 for storage and delivery…

A few other things to notice about the News pages – ratings, tagging and comments are all available… (I’m not sure what the moderation policy is, e.g. whether Platform staffers are actively moderating (= not scalable/sustainable in the long run, if the site takes off?) or relying on a lazier “report this post” approach. The same goes for the tags – e.g. if people use inappropriate or offensive tags, can these be moderated or deleted?)

The Blogs area links to a set of blogs on different topics. At the moment it looks like they’ve commissioned people to write posts for the Platform blogs (Open2 uses a similar sort of approach for their topic blogs), so it’ll be interesting to see how that plays out. Certainly I don’t fully engage with writing posts for the Open2 Science and Technology blog, for a variety of reasons (I don’t like the blog engine they use; posts need to go through an editorial process that strips out movies and maps in case of rights issues, but lets through typos that I can’t go in and change once the post is published; the traffic is lousy compared to the views I can get posting here on OUseful.info; etc. etc.).

Each blog appears to have its own RSS feed, which is good (I haven’t checked which feed type they went for… it would be nice to think it was Atom).

The call to action around the feed – “Get Updates” – is well chosen, I think, and it’s nice that feed autodiscovery is enabled. I have to admit that the feed URL looks a bit odd, though… http://www.open.ac.uk/platform/blogs/alumni/%2A/%2A/feed. Hmm… (%2A renders as * if you hover over the URL in the browser status window)

The Campus area looks to be an attempt to bring something of the OU campus alive, with voices and tales from people who work there. (I’m guessing this part of the site will feed from the OUlife Youtube channel, and maybe the research channel when it launches?).

If anywhere, this is the page on the Platform site that looks most like a place that links out to other OU web properties on the “main” OU website. In which case, I guess it’s really an info point? And in many respects it’s the thing that is closest to a traditional university homepage (although, err, Where is the Open University Homepage??).

The Join In area is where the forums can be found (also linked to as “Forums” from the front page, I think?).

The Timeout area is where the games are… ;-) The OU actually has quite a long history of releasing games (e.g. here’s a round-up I did a couple of years ago: OU Online Games and Interactives), but the explosion in casual game formats and libraries means that they must be far easier (=quicker and cheaper) to make now, as well as being more acceptable, maybe?

Finally, it’s worth mentioning that the commenting and “joining in” features require you to log in. There are two huge things happening here. Firstly, to log in to the site, you don’t need to be a member of the OU (that is, you don’t need to be staff, student, or alumni). Secondly, you can – if you want – log in with an OpenID:

The OU has actually been running an experimental OU OpenID server for some time, which allows anyone with OU credentials to use those credentials as an OpenID, but as far as I know this is one of the first production services running on the open.ac.uk domain that lets users in with an OpenID – although take note here: the OpenID doesn’t let you in to any OU authenticated areas, it’s just for Platform. (I’m not sure if Cloudworks or Cohere do OpenID yet?)

Although there’s little customisation you can do by virtue of registering – the benefits arise from being able to comment and join in the forums – the site design certainly has the look and feel of a site that might, one day, let you drag and drop panels around and rearrange the page furniture, webtop fashion. (Or maybe we need to clarify the widget strategy first?!)

As yet, there’s no link to the Platform site from the Open University homepage, so it’ll be interesting to see how the relationship between the OU homepage and the Platform homepage evolves over the coming weeks and months (and also how the relationship between Platform and open2 is managed?).

How the relationship between Platform and the new generation of departmental websites evolves over time will also be interesting to watch. For example, my own Communication and Systems Department homepage is experimenting with “voices from the department” via a range of blog and audio content, and the team responsible are also looking for ways to make the site a destination site around communication related technologies (hence the “Gadgets” area):

Hmm – maybe I should offer to do a “speedmash” or “half hour hack” area for them?;-)

And finally, for a review of some “older” OU 2.0 services, Brian Kelly did a write up some time ago: The Open University’s Portfolio Of Web 2.0 Services. You can find links to most of them here: /use – From us, to you, and back again.

PS in case you’re wondering, I think I’m correct in saying that the OU Platform site is built on Drupal…

PPS Brilliant job folks – it’ll be interesting to see how people engage with it…

My CETIS 2008 Presentations

I’ve just spent a most enjoyable couple of days at the CETIS 2008 event in Birmingham, where I participated in a couple of sessions on the future of the VLE, and on HE APIs.

Just for the record, here’s the presentation I gave in the VLE session (“Web 2.ools and the VLE“):

(Mark Stiles was kind enough to say how he liked the slides, and in particular the way in which the pictures weren’t about anything at all…;-)

[Transcript of liveblog/tweeting: I was on around 3pm]

And here are the presentations I didn’t give in the APIs session – “APIs Wot I Play Wiv“:

And “What I’d Like From JISC APIs“:

Instead, I ran through the Data Scraping Wikipedia with Google Spreadsheets mashup; somewhere along the way, the idea of a “speed mashup” was introduced… this is maybe something I’ll try out at the Mashed Library event tomorrow….. err, later today… One thing that did come out of the session for me is that maybe there really is an opportunity for some sort of roadshow/masterclass around the very idea of mashups, with some quick and effective mashup demos along the way (which are, apparently, quite “intimidating” compared to what you can and can’t do with educational system APIs… ;-)

There are a few more notes – and some blatant self-promotion – on the CETIS08 APIs session wiki. Note to self: play with the PROD project discovery API.