Updating Google Calendars from a Google Spreadsheet

I got a request today along the lines of:

We’re in the process of creating a master calendar of events spreadsheet relevant to [various things]. These [various things] will all then have their own Google calendar so they can be looked at individually, embedded etc and everyone could of course have access to all and view them all via their personal Google calendar, turn different calendars on or off, sync with Outlook etc. etc.

X said “wouldn’t it be great if we made the master spreadsheet with Google docs and it could somehow automate and complete the calendars”.

Sigh…;-) So – is it possible?

I’ve only had a quick play so far with Google Apps script, but yes, it seems to be possible…

Take one spreadsheet, liberally sprinkled with event name, description, start and end times, an optional location, and maybe even a tag or two (not shown):

The time related columns I specified as a date type using the “Data Validation…” form from the Tools menu:

Now take one Google apps script:

function caltest1() {
  var sheet = SpreadsheetApp.getActiveSheet();
  var startRow = 2;  // First row of data to process
  var numRows = 2;   // Number of rows to process
  var dataRange = sheet.getRange(startRow, 1, numRows, 5);
  var data = dataRange.getValues();
  var cal = CalendarApp.getDefaultCalendar();
  for (i in data) {
    var row = data[i];
    var title = row[0];   // First column
    var desc = row[1];    // Second column
    var tstart = row[2];  // Start time
    var tstop = row[3];   // End time
    var loc = row[4];     // Location
    //cal.createEvent(title, new Date("March 3, 2010 08:00:00"), new Date("March 3, 2010 09:00:00"), {description:desc,location:loc});
    cal.createEvent(title, tstart, tstop, {description:desc, location:loc});
  }
}

(This was largely a copy and paste from the Sending emails from a Spreadsheet tutorial, which I’d half-skimmed a week or two ago and remembered contained a how-to for bulk mailing from a spreadsheet, plus the Apps Script Calendar class documentation.)

And here’s the result of running the function:

The email tutorial adds a bit of gloss that allows a further column to contain state information about whether an email has already been sent; we could do something similar to specify whether or not an event has been automatically added to the calendar, and if not, add it when the function is run.
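A sketch of how that check might look on the data array (function name and column layout are my own, not from the tutorial):

```javascript
// Sketch: given the 2D array returned by getValues(), with a status column
// holding "Added" for rows already processed, pick out just the rows that
// still need a calendar event created for them.
function rowsNeedingEvents(data, statusCol) {
  var needed = [];
  for (var i = 0; i < data.length; i++) {
    if (data[i][statusCol] !== "Added") {
      needed.push({index: i, row: data[i]});
    }
  }
  return needed;
}
```

The calling code would loop over the returned rows, create the events, and write "Added" back into the status column for each.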

Because it can be a pain having to go into the script editor to run the function, it’s easier to just create a menu option for it:

Here’s how:

function onOpen() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var menuEntries = [ {name: "Shove stuff in calendar", functionName: "caltest2"} ];
  ss.addMenu("OUseful", menuEntries);
}

I had a little play to see if I could trivially get an RSS feed into the spreadsheet using an =importFeed() formula, and use the details from that to populate the calendar, but for some reason the feed importer function didn’t appear to be working?:-( When I tried using CSV data from a Yahoo RSS2CSV proxy pipe via a =importData() formula, the test function I’d written didn’t appear to recognise the date format…

PS Arghh… the test formula assumes a Date type is being passed to it… Doh!

Hack round importFeed still not working by grabbing a CSV version of the feed into the spreadsheet:

Tweak the calendar event creation formula:

cal.createEvent(title, new Date(tstart), new Date(tstop), {description:desc});
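That tweak can be generalised a little; here’s a defensive helper (my own sketch, not from the original script) that copes with a cell value that is either already a Date (the Date-typed columns) or a date string (as with the =importData() route):

```javascript
// Sketch: normalise a cell value to a Date, whether the column was set up
// with the Date data type (getValues() returns Date objects) or the cell
// just holds a parseable date string.
function ensureDate(value) {
  return (value instanceof Date) ? value : new Date(value);
}
```

Then the event creation line becomes `cal.createEvent(title, ensureDate(tstart), ensureDate(tstop), {description:desc});` and works in both cases.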

Run the function:

Heh heh :-)

PS it’s also possible to move content from a Google Calendar to a Google spreadsheet, as Grabbing Google Calendar Event Details into a Spreadsheet shows…

PPS it strikes me that the spreadsheets2calendar route provides one way of generating an iCal feed from a list of event times held in a spreadsheet, by popping the events into a Calendar and then making the most of its output formats? A bit like using Yahoo pipes as a quick’n’easy KML generator?
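For reference, here’s roughly what the calendar would be generating for us on the iCal side; a minimal sketch of my own (a real RFC 5545 feed also needs UID and DTSTAMP lines and text escaping):

```javascript
// Sketch: render a list of {title, start, end} events as a minimal
// iCalendar (ICS) text block, of the kind a Google Calendar iCal feed serves.
function toICal(events) {
  function fmt(d) { // UTC timestamp in iCal basic format, e.g. 20100303T080000Z
    function p(n) { return (n < 10 ? "0" : "") + n; }
    return d.getUTCFullYear() + p(d.getUTCMonth() + 1) + p(d.getUTCDate()) +
      "T" + p(d.getUTCHours()) + p(d.getUTCMinutes()) + p(d.getUTCSeconds()) + "Z";
  }
  var lines = ["BEGIN:VCALENDAR", "VERSION:2.0"];
  for (var i = 0; i < events.length; i++) {
    lines.push("BEGIN:VEVENT");
    lines.push("SUMMARY:" + events[i].title);
    lines.push("DTSTART:" + fmt(events[i].start));
    lines.push("DTEND:" + fmt(events[i].end));
    lines.push("END:VEVENT");
  }
  lines.push("END:VCALENDAR");
  return lines.join("\r\n");
}
```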

[UPDATE: related, via @mhawksey – Using Google Apps Script for a event booking system (Spreadsheet to Calendar & Site | Form to Spreadsheet, Email and possible Contacts)]

Re: not creating duplicate entries, check out this thread

Grabbing Google Calendar Event Details into a Spreadsheet

A comment to Updating Google Calendars from a Google Spreadsheet, where I showed how to get events described in a Google spreadsheet into a Google Calendar wondered if the other way around is possible? … It would be great for logging hours for projects or declaring working hours ;-)

Yes:-) Twenty seconds ago, this spreadsheet was empty:

A quick run of a script and I populated it from my default calendar using Google Apps script. Once again, all I did was peek through the docs and pull out the fragments I needed: Here’s the script:

function caltest3(){
  // The code below will retrieve events between 2 dates for the user's default calendar and
  // display the events in the current spreadsheet
  var cal = CalendarApp.getDefaultCalendar();
  var sheet = SpreadsheetApp.getActiveSheet();
  var events = cal.getEvents(new Date("March 8, 2010"), new Date("March 14, 2010"));
  for (var i=0;i<events.length;i++) {
    var details=[[events[i].getTitle(), events[i].getDescription(), events[i].getStartTime(), events[i].getEndTime()]];
    var row=i+1;
    var range=sheet.getRange(row,1,1,4);
    range.setValues(details);
  }
}
So now not only can we use a spreadsheet as a database we can also use a calendar in a similar way, and if necessary, link them all together?

Maintaining a Google Calendar from a Google Spreadsheet, Reprise

In the post Updating Google Calendars from a Google Spreadsheet, I described a recipe for adding events to a Google Calendar from a Google Spreadsheet using Google Apps Script. After a quick chat with the person who was compiling a spreadsheet they wanted to use to populate a set of calendars, I revisited the script to make a few tweaks and hopefully increase its usability.

So here’s a glimpse of the spreadsheet they’re using to list dates for various campaigns and channels where related activity might occur. Firstly, we have some columns relating to the event or activity, and the dates on which they occur. The first column (added to calendar) is a control switch that identifies that the calendar details have been updated for that event:

Within the spreadsheet, I set the two date columns to have the Date type (from the Tools menu, set the Data Validation option to Date). I’m not sure how the spreadsheet is (correctly) identifying the US date format (MM-DD-YY) – maybe from a US timezone as a global setting for the spreadsheet?

As well as various other admin columns, there are columns relating to whether or not a channel will be used to support a particular event:

From what I could ascertain, the way the spreadsheet is supposed to work goes along these lines: someone adds details of an event and the associated channels for the event to the spreadsheet. “Add” in a channel column says that the event is to be added to that channel’s calendar. When the updating script is run, for each event it checks control column A to see whether the event has already been added to the various channel calendars; if it hasn’t, it checks the channel columns, and if a channel column is set to “Add” the event details are added to that channel’s calendar.

So – how do I need to modify the original script? Firstly, the original script used the default calendar. In this case, we need a separate calendar for each channel, so in Google Apps I created one calendar per channel:

We can grab a calendar by name from a spreadsheet apps script using a call of the form:

 var cal_broadcast=CalendarApp.openByName("broadcastDemo");

When the script runs, we need to grab the appropriate range of cells from the spreadsheet to see which calendars to update. For testing purposes, I only grabbed a few rows…

var startRow = 2;  // First row of data to process
var numRows = 4;   // Number of rows to process
var dataRange = sheet.getRange(startRow, 1, numRows, 26);
var data = dataRange.getValues();

for (i in data) {
  var row = data[i];
  var title = row[1];
  var desc = row[15];
  var added = row[col_added];  // Check to see if details for this event have been added to the calendar(s)
  var tstart = row[2]; // start time - I have defined the column in the spreadsheet as a Date type
  var tstop = row[3];  // stop time - I have defined the column in the spreadsheet as a Date type
  var broadcast = row[col_broadcast]; // is this event one to "Add" to the broadcast calendar?
  var itunes = row[col_itunes];       // is this event one to "Add" to the itunes calendar? etc
  var youtube = row[col_youtube];
  if (added != "Added") { // the calendar(s) have not been updated for this event
    if (broadcast == "Add") {
      cal_broadcast.createEvent(title, tstart, tstop, {description:desc}); // add the event to the "broadcast" calendar
    }
    if (itunes == "Add") {
      cal_itunes.createEvent(title, tstart, tstop, {description:desc});
    }
    // etc for each channel
    var v = parseInt(i) + 2; // +2 is an offset to do with the numbering of rows and the "blank" header row 0
    sheet.getRange(v, 1, 1, 1).setValue("Added"); // record the fact that we have updated the calendars for this event
  }
}
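That +2 offset is worth making explicit in a tiny helper (a sketch of my own): getValues() indexes rows from 0, the sheet numbers rows from 1, and the header occupies row 1, so the first data row is row 2.

```javascript
// Sketch: map an index into the getValues() data array back onto the
// corresponding sheet row number, given the first data row (here, 2).
function sheetRowForDataIndex(i, startRow) {
  return parseInt(i, 10) + startRow; // i may be a string when using for..in
}
```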

In order to identify which columns to use for the broadcast, itunes, etc. channels, I went defensive. (The following bit of code comes before the previous snippet; what it does is look at each column heading and then set the column number for each channel based on its name. I should probably use a similar technique to identify the start/stop date columns. This approach accommodates later changes to the spreadsheet, such as the insertion of additional columns or the reordering of existing ones.)

var ss=SpreadsheetApp.getActiveSpreadsheet();
var sheet=SpreadsheetApp.setActiveSheet(ss.getSheets()[0]); //need a routine to set active sheet by name?
//go defensive
var col_added, col_broadcast, col_itunes, col_youtube;
var maxcols=sheet.getMaxColumns();
for (var j=1;j<=maxcols;j++){
  var header= sheet.getRange(1, j, 1, 1).getValue();
  switch (header) {
    case "Added to Google (Y/N/Hold)": col_added=j-1; break;
    case "Broadcast": col_broadcast=j-1; break;
    case "iTunes": col_itunes=j-1; break;
    case "YouTube": col_youtube=j-1; break;
  }
}
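A more general alternative to the switch statement might be to build a name-to-index map from the header row in one go (a sketch of my own, not part of the original script), so that new channel columns don’t need new case clauses:

```javascript
// Sketch: turn a header row (as returned by getValues()[0]) into a map
// from column name to zero-based column index.
function headerIndices(headers) {
  var cols = {};
  for (var j = 0; j < headers.length; j++) {
    if (headers[j] !== "") cols[headers[j]] = j;
  }
  return cols;
}
```

Usage would be something like `var cols = headerIndices(sheet.getRange(1,1,1,maxcols).getValues()[0]);` followed by `var col_broadcast = cols["Broadcast"];` and so on.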

Running the combined function thus searches the spreadsheet for the appropriate channel columns and control column, checks the control column for each event entry to ensure that the event hasn’t been added to the selected calendars, and then adds the event to the appropriate channel calendars if required.

Playing with the script, it seemed a little bit clunky, so I tweaked it to update the channel cells with the word “Added” if it had been set to Add, and the calendar had been updated:

if (broadcast=="Add") {
  cal_broadcast.createEvent(title, tstart, tstop, {description:desc});
  dataRange.getCell(parseInt(i)+1, col_broadcast+1).setValue('Added'); // Replace "Add" with "Added"; +1 is offset for sheet numbering
}

It also struck me that if a channel’s setting was updated to “Add” after the event row had already been flagged as “Added”, that channel’s calendar would never get updated. So I created a variant of the updating function that runs on a per-channel basis and adds a calendar entry for an event if its channel column is set to “Add”, without checking the control column:

function caltestAddtoCal_broadcast(){ caltestAddtoCal("broadcast"); }

function caltestAddtoCal(addCal){
  // ...same set-up and row loop as before, but keyed on the channel column rather than the control column...
  if (addCal!="") {
    if ((addCal=="broadcast")&&(broadcast=="Add")) {
      cal_broadcast.createEvent(title, tstart, tstop, {description:desc});
      dataRange.getCell(parseInt(i)+1, col_broadcast+1).setValue('Added'); //+1 is offset for sheet numbering
    }
    // ... etc for each channel
  }
}

What this means is that a channel controller can update entries in their calendar by running the script just for that channel, setting “Add” against any event they want adding to the calendar; the list of “Added” entries shows which events have already been added to that calendar:

Having doodled a script that sort of works, it’s now time to hack it around so it looks a little more elegant. Which means refactoring… sigh… and another reprise in a day or two, I guess…?!

Writing 2D Data Arrays to a Google Spreadsheet from Google Apps Script Making an HTTP POST Request for CSV Data

Just a quick post to log another Google Apps script how to – this time how to grab a 2D array of CSV data from a URL and then paste it into a spreadsheet. For the trivial case, we could just as easily do this with an =importData() formula, but I’m still working out what bits of glue might be useful on the Apps script front…;-)

So – the demo function will take a blank spreadsheet and do this to it:

That is, shove some data from a query run over a data.gov.uk SPARQL endpoint into it.

Here’s the script – it shows how to HTTP POST a query from Apps Script to the SPARQLproxy endpoint, grab the results back as CSV, then convert them from CSV into a 2D array that can be pasted into the spreadsheet as a range.

function dgtest(){

  var s = SpreadsheetApp.getActiveSpreadsheet();
  var ss = s.getActiveSheet();
  var sc = ss.getActiveCell();

  //I really need to tweak this so we can enter a SPARQL query and then generate the args string from it...
  var args="query=prefix+sch-ont%3A+%3Chttp%3A%2F%2Feducation.data.gov.uk%2Fdef%2Fschool%2F%3E%0D%0ASELECT+%3Fschool+%3Fname+%3Fdate+%3Fdistrict+WHERE+{%0D%0A%3Fschool+a+sch-ont%3ASchool%3B%0D%0Asch-ont%3AestablishmentName+%3Fname%3B%0D%0Asch-ont%3AopenDate+%3Fdate%3B%0D%0Asch-ont%3AdistrictAdministrative+%3Fdistrict.%0D%0A}+ORDER+BY+DESC%28%3Fdate%29+LIMIT+15&output=csv&callback=&tqx=&service-uri=http%3A%2F%2Fservices.data.gov.uk%2Feducation%2Fsparql";

  var x=UrlFetchApp.fetch('http://data-gov.tw.rpi.edu/ws/sparqlproxy.php',{method: 'post', payload: args});
  var ret=x.getContentText();
  ret = CSVToArray( ret, "," );

  var arr = [];
  for (var i=0;i < ret.length-1;i++) {
    var c = [];
    for (var j=0; j< ret[0].length;j++){
      c.push(ret[i][j]);
    }
    arr.push(c);
  }

  var destinationRange = ss.getRange(1, 1, i, j);
  destinationRange.setValues(arr);
}

The CSVToArray() function is one I found on Stack Overflow.
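I won’t reproduce the Stack Overflow version here, but a minimal sketch of my own of what such a function does (it handles double-quoted fields with doubled-quote escapes, though not every CSV corner case):

```javascript
// Sketch: parse CSV text into a 2D array, honouring quoted fields.
function csvToArray(text, delim) {
  delim = delim || ",";
  var rows = [[]], field = "", inQuotes = false;
  for (var i = 0; i < text.length; i++) {
    var c = text.charAt(i);
    if (inQuotes) {
      if (c === '"' && text.charAt(i + 1) === '"') { field += '"'; i++; } // escaped quote
      else if (c === '"') inQuotes = false;
      else field += c;
    }
    else if (c === '"') inQuotes = true;
    else if (c === delim) { rows[rows.length - 1].push(field); field = ""; }
    else if (c === "\n") { rows[rows.length - 1].push(field); field = ""; rows.push([]); }
    else if (c !== "\r") field += c;
  }
  rows[rows.length - 1].push(field);
  return rows;
}
```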

Screenscraping With Google Spreadsheets App Script and the =importHTML Formula

Exciting news: Google Apps script is now available for all Google spreadsheet users…, so it seems I was timely in starting to get to grips with playing with this stuff…;-)

So what can we do with Google Apps Script? I’ve been posting a few ideas already, but here’s something I was working on last night – a script that automates table screenscraping using the Google spreadsheet =importHTML() formula.

Here’s the setting: Lichfield council lists details of the last round of council elections on a summary results page:

and a set of individual pages detailing comprehensive results from each ward (example).

What I wanted was a big table listing all the results by ward.

It’s easy enough to pull these results into a Google Spreadsheet using the =importHTML(“URL”,”table”,N) formula (where N is the number of the table in the page), but pulling results in for all 26 wards and then processing them separately would be laborious, so here’s what I did…

First of all, pull in the summary table to get a list of all the wards:

The results by ward page all live on a commonly structured URL (e.g. http://www.lichfielddc.gov.uk/site/custom_scripts/electiondetail.php?ward=2&category=1) with the ward parameter identifying the ward. This URL isn’t picked up by the table scraper, so I had to do a bit of hand finishing, adding an ID column to which I added the appropriate ward number of each ward as used in the ward results URLs.
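Given the ward number, the customised formula the script needs to write into each sheet can be built up as a string along these lines (a sketch; the table index argument is an assumption here, and should be whatever table number the results occupy on the ward page):

```javascript
// Sketch: build the =importHTML() formula string for one ward, using the
// URL pattern of the Lichfield ward results pages.
function wardImportFormula(wardID, tableNum) {
  var url = "http://www.lichfielddc.gov.uk/site/custom_scripts/electiondetail.php?ward=" +
    wardID + "&category=1";
  return '=importHTML("' + url + '","table",' + tableNum + ')';
}
```

The resulting string would then be dropped into a cell with the Range setFormula() method.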

So, having got a sheet that listed the wards, and a (hand added) identifier for each ward, I could write a script that would create a separate sheet for each ward and add an appropriately customised =importHTML() formula to it:

function addSheets() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var sheet = ss.getSheetByName("Wards");
  var wardNameID=[];
  var wardIDName=[];
  if (sheet != null) {
    var col_ward=getColNumber(sheet,"Ward");
    var col_ID=getColNumber(sheet,"ID");
    var startRow = 2;  // First row of data to process
    var numRows = getMaxRows(sheet);
    var maxcols= getMaxColumns(sheet);
    var dataRange = sheet.getRange(startRow, 1, numRows, maxcols);
    var data = dataRange.getValues();
    for (i in data) {
      var row = data[i];
      var wardName=row[col_ward];
      var wardID=row[col_ID];
      //Browser.msgBox(wardName+" "+wardID);
      wardNameID[wardName]=wardID;
      wardIDName[wardID]=wardName;
    }
  }
  for (var i=2;i<numRows;i++){//28
    var sname=wardIDName[i];
    ss.insertSheet(sname, 1);
    sheet = ss.getSheetByName(sname);
    addTableImporter(sheet, i);
  }
}

This function uses a couple of utility functions – getMaxRows() and getMaxColumns() – which I superstitiously added to get the number of populated rows/columns from a sheet:

function getMaxRows(sheet){
  var maxrows = sheet.getMaxRows(); var max = maxrows;
  for (var j=1;j<maxrows+1;j++){
    var header= sheet.getRange(j, 1, 1, 1).getValue();
    if (header==""){
      max=j; j=maxrows+1;
    }
  }
  return max;
}

function getMaxColumns(sheet){
  var maxcols=sheet.getMaxColumns(); var max=maxcols;
  for (var j=1;j<maxcols+1;j++){
    var header= sheet.getRange(1, j, 1, 1).getValue();
    if (header==""){
      max=j; j=maxcols+1;
    }
  }
  return max;
}

and the more generally useful getColNumber(), which gets the number of a column in the sheet given its header:

function getColNumber(sheet,colName){
  var colNum=-1;
  var maxcols=sheet.getMaxColumns();
  for (var j=1;j<maxcols+1;j++){
    var header= sheet.getRange(1, j, 1, 1).getValue();
    if (header==colName) colNum=j-1;
  }
  return colNum;
}

The addTableImporter() function is the one that adds the =importHTML() formula to each sheet as required:

function addTableImporter(ss,n) {
  var sc = ss.getActiveCell();

  // build the ward results URL and drop the corresponding importHTML formula
  // into cell A1; the table index (1 here) should be set to the number of the
  // results table on the ward page
  var url = "http://www.lichfielddc.gov.uk/site/custom_scripts/electiondetail.php?ward=" + n + "&category=1";
  var destinationRange = ss.getRange(1, 1, 1, 1);
  destinationRange.setFormula('=importHTML("' + url + '","table",1)');
}


Running the addSheets() function creates one sheet per ward, imports the appropriate table, and names the sheet as the name of the Ward.

We’re now in a position to pull together a monolithic table that aggregates data from all the wards:

function aggregator(){
  // for each ward sheet, load in data by column
  // dataRow=[wardName, candidate, party, votes]
  // if votes=="n/a", votes=100;
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var table=[];
  var writeRow=[];
  var bigsheet = ss.getSheetByName("Aggregate");
  var wards=getWardNames();
  for (var i=0;i<wards.length;i++){
    //get sheet
    var sheet = ss.getSheetByName(wards[i]);
    if (sheet!=null){
      var maxrows=getMaxRows(sheet);
      for (var j=2;j<maxrows+1;j++){
        var candidate=sheet.getRange(j, 1, 1, 1).getValue();
        var party=sheet.getRange(j, 2, 1, 1).getValue();
        var vote = sheet.getRange(j, 3, 1, 1).getValue(); if (typeof vote != 'number') vote=100;
        if (candidate!=""){
          writeRow=[wards[i], candidate, party, vote];
          table.push(writeRow);
        }
      }
    }
  }
  var destinationRange = bigsheet.getRange(1, 1, table.length, 4);
  destinationRange.setValues(table);
}

Again, I make use of a utility function, this time to grab the names of the wards from the ward spreadsheet:

function getWardNames(){
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var sheet = ss.getSheetByName("Wards");
  var wardNames=[];
  if (sheet != null) {
    var col_ward=getColNumber(sheet,"Ward");
    var startRow = 2;  // First row of data to process
    var numRows = getMaxRows(sheet);
    var dataRange = sheet.getRange(startRow, col_ward+1, numRows, 1); // grab just the Ward column
    var data = dataRange.getValues();
    for (i in data) {
      var row = data[i];
      if (row[0]!="") wardNames.push(row[0]);
    }
  }
  return wardNames;
}

Running the aggregator function pulls all the data into one sheet:

Finally, for completeness, I added a routine that will delete the separate ward sheets:

function deleteWardSheets(){
  var ss= SpreadsheetApp.getActiveSpreadsheet();
  var wardNames=getWardNames();
  for (var i=0;i<wardNames.length;i++){
    var sheet = ss.getSheetByName(wardNames[i]);
    if (sheet != null) ss.deleteSheet(sheet);
  }
}

Supposedly, it’s possible to share scripts by submitting them to an Apps script gallery:

On submitting a script, it appears to get passed into an approvals process, so I’m not sure what happens next…?

Anyway – the code is all provided above if you want to try it out. Remember, what it’s doing is looking at a list of Ward names and identifiers, creating a sheet for each Lichfield local council ward, importing the election results table for each ward from the Lichfield site into the appropriate sheet, then providing another function that generates an aggregated datatable containing all the results, annotated with the name of the appropriate ward.

I was wondering whether this could all be done from a single function call, but I got twitchy that if I called a sheet before the imported data had loaded, everything would break.

The take home from me about this is that with a little bit of creativity you can mix and match spreadsheet formula and Javascript functions. (I don’t know if spreadsheet formula can be accessed directly from the apps script?) That is, you can run calculations either in Javascript/Apps script, or if it’s more convenient to use a spreadsheet formula, you can…
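One direction definitely works: a plain function defined in Apps Script can be called from a cell as a custom formula (e.g. =VATINCLUSIVE(A1)). A toy sketch of my own by way of illustration (the function name and the rate are made up for the example):

```javascript
// Toy example of a script function usable as a cell formula: any plain
// function defined in the script project can be called from a cell,
// e.g. =VATINCLUSIVE(A1). Here, gross up a net price by UK VAT at 17.5%.
function VATINCLUSIVE(net) {
  return Math.round(net * 117.5) / 100;
}
```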

Steps Towards a Volcanic Ash Advisory Google Maps Mashup Using Met Office Data


Hi Tony,
In looking for authoritative news on the cloud about to tr-ash my holiday, I found this advisory:


There’s a string of coordinates which appear to mark the extent of the cloud at “FL200” (20000 feet?). E.g. N6343 W01935 is 63.43N 19.35W.

It looks ripe for your skills, and if you could get a map and a how to …

Don’t you realise it’s a race weekend – and if I tinker on such a weekend, I tinker F1data?!;-)

Okay – so here’s a half hour hack (I timed it)…

The advisory data from the Met Office looks like this:

Rather than mess around with any scraping, I just copied and pasted some of the data into a text editor (I use TextWrangler on a Mac) and ran a few regular expressions over it. Starting at the bottom of the list:

– remove end of line characters and replace them with a space;
– remove the “-” and any white space around it and replace it with a return (new line) character;
– replace the space with a tab character.

The result of the regular expression processing is a two column tab separated list of co-ordinates.
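The three steps above can be sketched as JavaScript string replaces (run here over a made-up fragment rather than the real advisory text):

```javascript
// Sketch of the text-editor regex pipeline: newlines to spaces, "-"
// separators to newlines, then remaining spaces to tabs, giving a
// two-column tab-separated list of coordinate pairs.
function advisoryToTSV(text) {
  return text
    .replace(/\r?\n/g, " ")     // 1. remove end-of-line characters, replace with a space
    .replace(/\s*-\s*/g, "\n")  // 2. "-" and surrounding white space -> new line
    .replace(/ /g, "\t");       // 3. remaining spaces -> tabs
}
```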

These can then be copied and pasted into a spreadsheet. For rapid prototyping purposes, I pasted the data into a Google spreadsheet because I know I can get a access to a CSV output from there, and I also know that I can get access to a map widget that will plot markers given lat/long data in that environment.

BIG OOPS… The next step was to decode the position data. How to read a Volcanic Ash Advisory gave a couple of clues, and a quick test suggested the data is direction, degrees, minutes, seconds concatenated. I’m guessing that in the general case the degrees are always two digits, the minutes one or two digits and the seconds zero or two digits, but in the data I looked at the length was always 5 characters for latitude (dDDMM) and 6 for longitude (dDDDMM), so I just worked with those fixed lengths.

Most of the mapping tools I use require lat/long co-ordinates in a decimal format, so a quick check of Stack Overflow turned up a function to convert from degrees, minutes, seconds to the decimal version (Converting latitude and longitude to decimal values).

UPDATE: I think the data format is actually just a decimal format to 2 dp. So the above and code below is all, in this case, nonsense… but I’m going to leave the content here anyway…

I then used a variant of this code to hack a formula using Google Apps script to run the conversion:

Using the apps script function I had defined as a spreadsheet formula, I could convert the lat/long data in the format provided by the advisory note into the more typical digital representation. Highlighting the converted lat/long data and Inserting a Google Maps Gadget gave me a mapped preview over the data.

Here’s the code (needs generalising, bearing in mind the 1/2 digits for minutes; I had a problem with substring (not sure what – I only ever got one character out), which is why I used substr for the parsing):


*** BROKEN CODE - left in for reference purposes only ***

function ConvertDMSToDD_BROKEN_STUPID(degrees, minutes, seconds, direction) {
    var dd = degrees + minutes/60 + seconds/(60*60);

    if (direction == "S" || direction == "W") {
        dd = dd * -1;
    } // Don't do anything for N or E
    return dd;
}

function ParseDMS(input) {
  var parts = [];
  if ((input.substr(0,1)=="N")||(input.substr(0,1)=="S")){
    // latitude: dDDMM (no seconds in the data I saw)
    parts = [parseInt(input.substr(1,2),10), parseInt(input.substr(3,2),10), 0, input.substr(0,1)];
  } else {
    // longitude: dDDDMM
    parts = [parseInt(input.substr(1,3),10), parseInt(input.substr(4,2),10), 0, input.substr(0,1)];
  }
  var coord = ConvertDMSToDD_BROKEN_STUPID(parts[0], parts[1], parts[2], parts[3]);
  return coord;
}

You can see the [UPDATE: corrected] spreadsheet here: Volcanic Ash Advisory mapping demo.

I replaced the lat/long conversion formula with one that:
– treats the digit string as a decimal value to 2 dp (i.e. inserts a decimal point before the last two digits);
– adds a minus sign for W/S.

Here’s the corrected formula:

function ParseDMS(input) {
  // e.g. "N6343" -> 63.43, "W01935" -> -19.35
  var parts = [input.substr(0,1), input.substr(1)];
  var coord = parseFloat(parts[1])/100; // decimal point goes before the last two digits
  if ((parts[0]=='S')||(parts[0]=='W')) coord=coord*-1;
  return coord;
}

So what would be next?

– tidy up the lat/long conversion code so that it works in general case;
– find a way of pulling data into the spreadsheet live.

It would also be interesting to try to take into account altitude data, and then view the data in 3D in something like Google Earth. A heat map view rather than separate markers might also be fun to do (e.g. using HeatMapAPI).

But for now, back to the weekend…

onFormSubmit – Raising Web Scale Events in Google Spreadsheets

What happens if you want to actually do something with a particular response from a web survey form at the time it is submitted, other than just collect it?

One of the handy things about Google Spreadsheets is the ability to create interactive web survey forms that can collect data that is then posted into a corresponding spreadsheet. Around the time of the Google I/O event, several event related features were released as part of Google Apps script, the javascript scripting framework that supports an increasing number of Google apps. And by “event” I don’t mean something like the upcoming Isle of Wight Festival – I mean computational events, that can be used to trigger other computational actions…

One of the new events is onFormSubmit, which I finally got round to playing with last night. Here’s my “Hello World” example:

So here’s the code:

//Test function for Google Apps Script onFormSubmit
//Two sheets in a single spreadsheet doc
//First sheet corresponds to form
//Second sheet just displays one of the elements from the most recent form submission
//The function testOnSub() has a trigger associated with it: 'From spreadsheet', 'On form submit'
function testOnSub() {
  var ss = SpreadsheetApp.openById(SPREADSHEET_KEY);
  var sheet=ss.getSheets()[1];
  var form=ss.getSheets()[0];
  var lr=form.getLastRow();
  var el=form.getRange(lr,2,1,1).getValue();
  var t=el;
  sheet.getRange(1,1,1,1).setValue(t); // display the grabbed element in the second sheet
}

Here’s how to set it…

Google apps script - spreadsheet events

What next? Earlier this week, I watched a compelling presentation from @progrium, based around the following slide deck:

Among the handy tools demonstrated (I loved the idea of clickhooks (src), clickable links with webhook callback actions) was a webhook debugging tool, postbin. What this tool does is just capture and redisplay stuff that is posted to it… which makes it ideal for a quick demo…

So for example, suppose I have a Google form set up, and I want to perform a particular action using a third party webservice on some element contained in the form submission, or maybe only on certain items according to what information was submitted via the form, as soon as the form is submitted. Here’s one way of doing that (code on gisthub):

// Simple spreadsheet, with first sheet containing form submission responses
// when the form is submitted:
// 1) grab the latest response,
// 2) post it to a third party service via an HTTP POST
function testWebhook() {
  var ss = SpreadsheetApp.openById(SPREADSHEET_ID);
  var form=ss.getSheets()[0];
  var lr=form.getLastRow();
  var el=form.getRange(lr,2,1,1).getValue();
  var t=el;
  //The following escape palaver is gleaned from a Google help forum...
  var p="val1="+encodeURIComponent(t).replace(/%20/g, "+")+"&val2="+encodeURIComponent(form.getRange(lr,3,1,1).getValue()).replace(/%20/g, "+");

  // Here's where we do the callback...
  var x=UrlFetchApp.fetch('http://www.postbin.org/YOURPASTEBINID',{method: 'post', payload: p});
}
Attach the on form submit trigger event to the function, and here’s the response when we submit a form:

Postbin response from Google spreadsheet onFormSubmit callback

Clever, eh?

So what does this mean? It means that I can set up a Google Survey form and as soon as anyone posts a submission, I can process it, either within the Google Apps environment using Google Apps script, or using third party services that accept an HTTP post input.

As Jeff Lindsay suggests, the evented web is increasingly a reality…

Google Books Library Shelves

It’s been some time since I last had a look at the “My Library” service in Google Books, but with the announcement of Google eBooks store (currently US only, except for out-of-copyright free downloads) I popped over to my Google Books account to see whether anything else had changed…

One of the little known (I think?) features of Google Books is the “My Library” personalisation, which allows you to create a collection of books and search over them. Searching your library finds all the books in your library collection that contain the search phrase; if a preview of a book is available, the results include deep links into the book at the point(s) at which the search terms appear:

Search within a book on google books

I’ve previously commented on the My Library aspect of Google Books in the context of its possible use by libraries for providing a full-text search option over books in their collection (e.g. Complementing the OPAC With a Full Text Search Book Catalogue where I describe the use of the service by Wiltshire Heritage Library (example) and the Penn State University Press booksearch (example)).

(At the moment I don’t think you can get statistics back on the searches carried out on a My Library profile, though Google books can do stats for publishers e.g. Google Books for Publishers).

Anyway – one of the problems I originally had with My Library was that you could only maintain a single collection. But it seems that it’s now possible to create separate collections by tagging books in your Library onto “shelves”:

Google Books - My Library

(Shelves appeared at the start of 2010, it seems: Updated Books Home Page and My Library.)

So what immediately comes to mind is that if you’re running several courses, you could add the books used in the course to a My Library shelf, and then publish a link to a search context for that shelf to give a full text searchable version of the books on the list (assuming they’ve been scanned by the Goog, of course). Where previews are available, deep links into books will be available as part of the search results.

I haven’t really populated any shelves yet, but here’s the idea:

Google books - My library search

I haven’t explored the Book Search Data API yet, but it does seem to offer the ability to search over a particular user’s public library, as well as retrieve lists of books from the library. API options also exist for adding books to a library, though the API seems to only support adding labels, rather than updating shelves (or maybe legacy handlers map labelled books onto shelves?). With a bit of digging, it might be possible to find a route to automate the creation of a library shelf from a list of books. (Hmmm, maybe I should try this with the OU Set books list?!;-)

Google Books shelves thus seem to provide a way of creating different lists of books within a single user library, although I’m not sure if there is a limit on the number of books contained within a shelf, or in the library as a whole. Another nice feature is that it’s possible to select a shelf-based filter to just display books from a particular shelf (click on the label in the left-hand sidebar to filter by shelf); this search facet also seems to be passed through to a bookmarkable URL for the filtered search via the as_coll argument (I think?). (Which is to say: you can share a link for a search within a particular shelf in a particular user’s library.)

I’m not sure if Google Books is available through Google Apps for Education, but it could be a useful component of a full text book search context around books on a reading list?

PS As Google Scholar appears to be improving its coverage, it strikes me that the Goog still doesn’t offer a Google service for building searchable reference lists, although it does let you configure links that will bookmark a reference into a reference manager service for you:

Google scholar citation linker

Here’s how the links are displayed:

Google scholar results

Given you can build weblink search contexts using Google custom search engines, full text book search contexts using the Books My Library service, search over content from bundled feeds in Google Reader and even run things like video search by user on Youtube*, the Goog must surely be looking to offer a collection building and searching over service for Google Scholar? So I wonder… could Google end up taking over a service like CiteULike or Mendeley to complement and bootstrap personalisation of their Google Scholar offering? Or would they just build their own (cut down) version of these services?

* Hmm… I wonder if there’s a Youtube API switch that lets you search playlists? It’s definitely possible to get a playlist feed out…

PPS the Goog is also lacking a way of exposing all these personal search contexts to a logged in user through the same interface. If it were down to me, I’d start to expose them in the left hand sidebar of Google websearch, so I’m guessing this will be a labs/experimental service in the new year, if it isn’t already so…

Google search tools


Visual UI Editor For Google Apps Script

One of the things that’s still on my to-do list is to get round to playing more with Google Apps script, particularly the ability to create web-hookable services around spreadsheets and generate custom user interfaces to Apps Script powered applications.

Here’s something I think I’ll need to add to the list, as brilliantly spotted by @mhawksey: an experimental visual editor for creating forms in Google Apps Script:

Visual UI builder in Google Apps script

Here’s a preview of it in action:

Google Apps UI builder

And again from Martin, here’s a link to the developer forum group discussion about how to use it in its current experimental state…

I have to pop out now, so no time to play, but I thought it worth a mention… (Martin will probably have a post up about it before long;-)

PS also of note, as another step on the route to how open peer appraisal and peer-supported CPD might work out, check out Martin’s draft application for the ALT Learning Technologist of the Year Award 2011.

PPS in passing, via @schmerg, an HTML5 visual editor for browser based UIs: Maqetta

Google Visualisation API Controls Support Interactive Data Queries Within a Web Page

The only way I can keep up with updates to Google warez at the moment is to feed off tips, tricks and noticings shared by @mhawksey. Yesterday, Martin pointed out to me a couple of new controls offered by the Google visualization API – interactive dashboard controls (documentation), and an in-page chart editor.

What the interactive components let you do is download a dataset from a Google spreadsheet and then dynamically filter the data within the page.

So for example, over on the F1Datajunkie blog I’ve been posting links to spreadsheets containing timing data from recent Formula One races. What I can now do is run a query on one of the spreadsheets to pull down particular data elements into the web page, and then filter the results within the page using a dynamic control. An example should make that clear; unfortunately, I can’t embed a live demo in this hosted WordPress blog page :-(

I’ve posted a copy of the code used to generate that example as gist here: Google Dynamic Chart control, feeding off Google Spreadsheet/visualisation API query

Here’s the key code snippet – the ControlWrapper populates the control using the unique data elements found in a specified column (by label) within the downloaded dataset, and is then bound to a chart type which updates when the control is changed:

  var data = response.getDataTable();

  var namePicker = new google.visualization.ControlWrapper({
    'controlType': 'CategoryFilter',
    'containerId': 'filter_div',
    'options': {
      'filterColumnLabel': 'driver',
      'ui': {
        'labelStacking': 'vertical',
        'allowTyping': false,
        'allowMultiple': false
      }
    }
  });

  var laptimeChart = new google.visualization.ChartWrapper({
    'chartType': 'LineChart',
    'containerId': 'chart_div',
    'options': {
      'width': 800,
      'height': 800
    }
  });

  var dashboard = new google.visualization.Dashboard(document.getElementById('dashboard_div')).
    bind(namePicker, laptimeChart).
    draw(data);

As well as drop down lists, there is a number range slider control that can be used to set the minimum and maximum values of a numerical filter, and a string filter that lets you filter data within a column using a particular term (it doesn’t seem to support Boolean search operators, though…). Read more about the controls here: Google visualisation API chart controls

Something else I hadn’t noticed before: sort events applied to tables can also be used to trigger the sorting of data within a chart, which means you can offer interactions akin to some of those found on Many Eyes.

Whilst looking through the Google APIs interactive playground, I also noticed a couple of other in-page data shaping tools that I hadn’t noticed before: group and join

Group lets you group rows in a table and present an aggregated view of them:

That is, if you have data loaded into a datatable in a web page, you can locally produce summary reports based on that data using the supported group operation?
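The Visualization API provides this as google.visualization.data.group, which runs in the browser against a DataTable; to show the underlying idea without the browser machinery, here’s a plain JavaScript sketch (function and variable names are mine) that groups rows on a key column and sums a value column:

```javascript
// Plain JavaScript sketch of the row-grouping idea. Each row is a
// [key, value] pair; we group on the key column and sum the values,
// returning one [key, aggregatedValue] row per distinct key.
function groupRows(rows) {
  var totals = {};
  rows.forEach(function (row) {
    var key = row[0], value = row[1];
    totals[key] = (totals[key] || 0) + value;
  });
  return Object.keys(totals).map(function (key) {
    return [key, totals[key]];
  });
}
```

So, for example, grouping lap time rows keyed by driver would give one summary row per driver, all computed locally in the page.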

There’s also a join operation that allows you to merge data from two datatables where there is a common column (or at least, common entries in a given column) between the two tables:

What the join command means is that you can merge data from separate queries onto one or more Google spreadsheets within the page.
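Again, the in-page version is google.visualization.data.join on two DataTables; here’s a plain JavaScript sketch of the idea (names are mine), doing an inner join of two sets of [key, value] rows on the key column:

```javascript
// Plain JavaScript sketch of an inner join on the first (key) column.
// For each left row whose key also appears in the right table, emit a
// merged [key, leftValue, rightValue] row.
function joinRows(left, right) {
  var rightByKey = {};
  right.forEach(function (row) { rightByKey[row[0]] = row[1]; });
  var joined = [];
  left.forEach(function (row) {
    if (row[0] in rightByKey) {
      joined.push([row[0], row[1], rightByKey[row[0]]]);
    }
  });
  return joined;
}
```

This is what makes it possible to stitch together, say, lap times from one spreadsheet query and team names from another, keyed on driver.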

With all these programming components in place, the Google visualisation API now provides comprehensive support for doing all sorts of interactive visualisations within the page. (I’m not sure of any other libraries that offer quite so many tools for wrangling data in the page? The YUI datatable supports sorting and filtering, but I think that’s about it for data manipulation?)

I guess it also means that you can start to treat a web page as a database containing one or more datatables within it, along with tool support/function calls that allow you to work that database and display the results in a variety of visual ways?! And more than that, you can use interactive graphical components to construct dynamic queries onto the data in a visual way?!

PS here are a couple of other ways of using a Google spreadsheet as a database:
Using Google Spreadsheets as a Database with the Google Visualisation API Query Language
Using Google Spreadsheets Like a Database – The QUERY Formula
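For reference, the query language approach linked above boils down to passing a query string to a spreadsheet’s visualization API data source endpoint via the tq parameter. Here’s a minimal sketch of building such a URL; the endpoint path shown reflects the spreadsheets.google.com/tq pattern of the time and may have changed since, so treat it as an assumption – the tq encoding is the important part:

```javascript
// Sketch: build a Google Visualization API data source URL for a
// spreadsheet, with the query language statement URL-encoded into
// the tq parameter. Endpoint path is an assumption from the era.
function gvizQueryUrl(spreadsheetKey, tq) {
  return "http://spreadsheets.google.com/tq?key=" +
         encodeURIComponent(spreadsheetKey) +
         "&tq=" + encodeURIComponent(tq);
}
```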