OU Library Home Page – Normalised Click Density

The OU Library website has been running Google Analytics for ages, but from what I can tell they haven’t done a huge amount with the results in terms of making the analytics actionable and using them to improve the site design. (I’d love for someone to correct me with a blog post or two about how analytics have been used to improve site performance. If anyone would like to publish such a post, I’ll happily give you a guest slot here on OUseful.info… :-)

(As a bit of background, see Library Analytics, (Part 1), Library Analytics, (Part 2), Library Analytics, (Part 3), Library Analytics, (Part 4), Library Analytics, (Part 5), Library Analytics, (Part 6), Library Analytics, (Part 7) and Library Analytics, (Part 8))

Anyway, here’s the Library homepage (August 2009):

And here are the two real OU Library homepages:

(See also: Where is the Open University Homepage?;-)

And here’s the OU Library homepage as a treemap, where the block size shows where the traffic goes (as recorded over the last month) as a percentage of all traffic to the OU Library homepage.

OU Library homepage - normalised click density

So if each click was equally valuable, and each pixel on the screen was equally valuable, then that’s how the screen area should be allocated… (Hmm – that could be, err, interesting – an adaptive homepage where there’s one block element per link, and a treemap algorithm that allocates the area each block has when the page is rendered? Heh heh :-)
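For what it’s worth, the adaptive-homepage idea above could be sketched in a few lines of code: take a set of links with their click counts, and carve the page rectangle up so each link’s block gets screen area in proportion to its share of clicks. Here’s a minimal, hypothetical sketch using a simple slice-and-dice treemap layout (the link names and click counts below are made up for illustration, not real OU Library analytics):

```python
def treemap(links, x, y, w, h, vertical=True):
    """Return {link: (x, y, width, height)} rectangles whose areas are
    proportional to each link's click count, using a slice-and-dice
    treemap layout that alternates split direction at each level."""
    items = list(links.items())
    if len(items) == 1:
        # Base case: one link fills the whole remaining rectangle
        name, _ = items[0]
        return {name: (x, y, w, h)}
    total = sum(clicks for _, clicks in items)
    # Split the link list into two groups of roughly equal click weight,
    # keeping at least one link on each side of the split
    acc, i = items[0][1], 1
    while i < len(items) - 1 and acc + items[i][1] <= total / 2:
        acc += items[i][1]
        i += 1
    frac = acc / total  # share of the area owed to the first group
    out = {}
    if vertical:
        # Split the rectangle left/right, then alternate direction
        out.update(treemap(dict(items[:i]), x, y, w * frac, h, False))
        out.update(treemap(dict(items[i:]), x + w * frac, y,
                           w * (1 - frac), h, False))
    else:
        # Split the rectangle top/bottom, then alternate direction
        out.update(treemap(dict(items[:i]), x, y, w, h * frac, True))
        out.update(treemap(dict(items[i:]), x, y + h * frac,
                           w, h * (1 - frac), True))
    return out

# Illustrative click shares (invented numbers), laid out on a 1000x600 page
clicks = {"Search": 50, "eResources": 30, "Help": 20}
for link, rect in treemap(clicks, 0, 0, 1000, 600).items():
    print(link, rect)
```

A page renderer could then turn each rectangle into an absolutely positioned block element; production treemap layouts (e.g. the squarified variant) also try to keep blocks close to square so the labels stay readable.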

I did think about showing a heatmap of where on the homepage the clicks were made, but I figure I’ve probably already upset the Library folk enough by now. I also considered doing a treemap showing the relative proportions of different keywords on Google that drove traffic to the OU Library homepage, but I figure that may be commercially sensitive in terms of bidding for Adsense keywords…