Archive for the ‘OBU’ Category
Another day, another OU web play… Realising that the OU’s iTunesU presence has its downsides (specifically, having to use iTunes), you can now get hold of OU “enhanced podcasts” from the OU Podcasts (beta) site.
The architecture of the site borrows heavily from the OU Youtube presence, offering Learn, Research and Life options (even if they aren’t populated yet).
I’m not sure whether there is duplication (or even triplication) of content across the Podcast, iTunesU and Youtube sites, but then again – would it matter if it was? And I’m not sure if there is a pipeline that allows content to be “deposited” once behind the firewall, then published on the podcast, Youtube and/or iTunesU sites in one go, as required (can anyone from any of the respective project teams comment on how the publishing process works, and whether there is any particular content strategy in place, or is content being grabbed and posted on the sites howsoever it can?!;-)
The pages for actual “programme” elements contain an embedded (though not shareable or embeddable?) player, along with subscription feeds for the topic area the “programme” is assigned to.
The programme page has a rather redundant “Permalink for this page” (err – it’s in the browser address bar?), and there doesn’t appear to be a link to the actual audio file, which might be useful going forward, but there is a range of topic/channel podcast subscription feeds.
I don’t think the podcast page resyndicates audio content from the open2.net site, podcast feeds from OU/BBC Radio programmes, or archived (Real Player, bleurghhh:-() content co-produced by the OU that is still available on the BBC website. (For examples, see the far from complete RadiOBU player.)
Design wise, I wonder how well this sort of page design would cope as a container for OU/BBC TV content? Maybe I should try to steal elements of the CSS stylesheet to tart up the OU/BBC 7 day catch-up service?! (Or maybe one of the podcast team fancy a quick doodle on the side?;-)
The URL design looks neat enough, too, taking the form: http://podcast.open.ac.uk/oulearn/arts-and-humanities/history-of-art/ (that is, http://podcast.open.ac.uk/oulearn/TOPIC/PROGRAMME/).
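URLs structured like that are easy to pick apart programmatically, too. As a minimal sketch (the channel/topic/programme labels are my own naming, not anything documented by the podcast site):

```python
from urllib.parse import urlparse

def parse_podcast_url(url):
    """Split an OU podcast URL of the (assumed) form
    /CHANNEL/TOPIC/PROGRAMME/ into its three path components."""
    parts = [p for p in urlparse(url).path.split("/") if p]
    if len(parts) != 3:
        raise ValueError("unexpected URL structure: %r" % url)
    channel, topic, programme = parts
    return {"channel": channel, "topic": topic, "programme": programme}

print(parse_podcast_url(
    "http://podcast.open.ac.uk/oulearn/arts-and-humanities/history-of-art/"))
# → {'channel': 'oulearn', 'topic': 'arts-and-humanities', 'programme': 'history-of-art'}
```

Hackable URLs like this also mean you can guess your way around the site, or generate topic links mechanically.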
The eagle-eyed amongst you may notice that there is an option (for OU Staff?) to Login, which leads to the option to “Join [the] hosting service”:
So while it doesn’t look like there is much benefit to logging in at the moment, it seems as though there is a possibility that the site will be offering hosting for individually produced podcasts (using Amazon S3, I believe…) in the near future?
I’m not sure where individually produced podcasts would live on the podcasts site, though? In the appropriate topic area?
Once again, great job folks… :-) [Disclaimer: I have nothing to do with the OU podcasts site.]
PS A couple more minor quibbles, just because…;-) The favicon is a KMI favicon, which doesn’t really fit, IMHO. And the release of the Podcasts site has not (yet) been mentioned on the /use site, which looks increasingly “stale” (there’s no mention of Platform there, either…).
Although there doesn’t appear to be an opportunity for Faculties or Departments to have a presence, as such, on the site (unless they provide content for topic areas?), I wonder whether the podcast site back end could actually be used as a content delivery service for Departmental content (e.g. the content on the Department of Communication and Systems website).
Just because, I had a little dig around to see what representation the OU might have made, not least because of our involvement with traditional broadcast via the OU relationship with the BBC… (Note to self: check whether these BBC Commissioning – Open University – Rights Guidelines are current?)
Here’s a link to the OU’s response to Phase One of the Ofcom PSB review from the middle of last year: OU Response to Ofcom PSB Review Phase 1 (PDF) (read it on Scribd (maybe?)). (I couldn’t find a response for phase 2?)
If you want to know what it said… well, you’ll just have to read the response yourself (it’s not too long). One thing I did find particularly interesting, though, was that there was no response to the question “8i) What do you think is the appropriate public service role for Channel 4 in the short, medium and long term? What do you think of Channel 4′s proposed vision?”.
Given that the OU was part of a consortium that (unsuccessfully) bid to take over the running of Teachers TV last year, I’d have thought we might have an interest in who was involved in PSB in a wider sense (and what relationship the OU might have with them?).
And given one of the apparently mooted options for the future of Channel 4 is some sort of Channel 4 partnership with BBC Worldwide, what if part of that option suggested that the OU pays Channel 4, rather than the BBC, to produce and broadcast OU programmes?!
And as for contributions to the Carter report? I couldn’t find any public responses – though with one of the anticipated sections of Digital Britain covering the questions of intellectual property rights and their enforcement on the internet, there could be a potential “new revenue stream” for the OU exploiting our rights clearance experience, particularly as other universities seek to publish their teaching materials on the web?
PS As a quick refresher, here’s a quote from the OU charter about broadcast: “The objects of the University shall be the advancement and dissemination of learning and knowledge by teaching and research by a diversity of means such as broadcasting and technological devices appropriate to higher education, …“.
Followers of the OU’s new release feed (or my Twitter feed;-) probably know that The Open University and Digital Planet have joined forces to produce a series of specials (one every two months or so) over the coming year.
The first special, on the geo-web, those bits of the web that provide an intersection between digital and physical space, aired last Tuesday; (there’s a brief pre-emptive write up on Platform: Taking the travel bug to Nepal with Digital Planet).
If you haven’t listened to it yet, it’s available for one more day from the BBC World Service Digital Planet webpage – so if you hurry, you’ll still be able to download (or stream) a copy…
(Digital Planet actually goes out weekly, so you can also find a podcast feed for the series there…)
The topics covered in the programme included a feature on location awareness and Streetview integration from Google’s mobile mapping division, which taught me something new – Streetview’s crazy use of compass and accelerometer data!
The programme also contained some good discussion on privacy issues and a package on geocaching, featuring the OU’s very own Gill Clough, who you can see chatting to Digital Planet presenter Gareth Mitchell (with producer Pam Rutherford in the background!) here: The OU and Digital Planet: Gareth and Gill Go Geocaching (*video exclusive*).
Now if only they’d got Bill Thompson traipsing around in the mud, rather than sitting comfortably in the studio, too…;-)
As the OU’s academic contact on the episode, I got to bounce ideas around with the producer and presenter in the couple of weeks before the programme aired, as well as casting an eye over a draft of the script during the weekend before the Monday afternoon recording.
I’ve also been working sideways, as it were, with the producers of the Open2 website that supports the programme: Digital Planet on open2.net
One of the things we wanted to start exploring with the website was how we might start to engage with Digital Planet’s global audience.
And our first offering is: the interactive Open2 Digital Planet listeners’ map, which allows listeners to add a marker to the map showing where in the world they listen to the programme from…
Along with placing a marker, listeners can also tell us how they listen to the programme, and link to a photo of themselves if they wish:
As we’re running a lazy, user-flagging moderation system, links are added to user marker bubbles to report content, if necessary…
You might also have noticed a Twitter feed – this is currently aggregating tweets tagged with #digitalplanet, and we’ll hopefully find some novel ways of using it – and appropriately licensed photos tagged digitalplanet on flickr – to support future programmes…
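One obvious way of using those two sources together would be to merge the tagged tweets and photos into a single reverse-chronological “activity stream” around the programme. Here’s a minimal sketch – the item dictionaries below are illustrative stand-ins, not the actual shapes of the Twitter search or Flickr API responses:

```python
from datetime import datetime

# Hypothetical items, as they might look after fetching and normalising
# results from the Twitter search and Flickr tag feeds.
tweets = [
    {"text": "Listening to #digitalplanet in Lagos", "when": "2009-06-02T19:05:00"},
]
photos = [
    {"title": "Studio shot", "when": "2009-06-01T11:30:00"},
]

def merge_tagged_items(*feeds):
    """Merge several tag-filtered feeds into one newest-first list."""
    items = [item for feed in feeds for item in feed]
    return sorted(items,
                  key=lambda i: datetime.fromisoformat(i["when"]),
                  reverse=True)

for item in merge_tagged_items(tweets, photos):
    print(item["when"], item.get("text") or item.get("title"))
```

The same pattern would extend to any other tag-based feed we wanted to mix in later.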
And finally – one package that aired that I haven’t mentioned yet was all about some travel bugs that are making their way to Nepal (Geocaching schools project: travel bugs to Nepal)… With a bit of luck, I intend to get a post up about that on the Open2.net Science and Technology blog over the next week or so… and maybe even travel bug tracking map on the open2.net Digital Planet site (if we can get a feed from geocaching.com, that is…)?! :-)
Followers of my twitterstream (@psychemedia) may have picked up on the fact that this week’s episode of the BBC World Service programme Digital Planet was also our second co-produced episode of this weekly series (the first being a geo-web special earlier this year).
As with other OU/BBC co-pros, we at the OU get to comment on various aspects of the pre-production of the programme (helping identify possible story packages, and identifying key themes) as well as commenting on draft scripts. (If you follow presenter Gareth Mitchell’s tweets (@garethm), you’ll know that happens over the weekend before the Monday afternoon recording of the programme (as well as on Monday morning itself!).)
Unlike most other co-pros, though, we wanted our relationship with the Digital Planet team to extend onto the web, providing a supporting package around the programme, as well as a legacy (Digital Planet programmes are available for at least 7 days after transmission as a podcast, but unlike many BBC (rather than BBC World Service) programmes, a public archive of previous programmes is not available).
We also wanted to try to come up with content that could engage Digital Planet’s international audience in an interactive way. So for example, we provided the Digital Planet listeners’ map to support the geo special, to allow listeners to place a marker to show where they’re listening from:
…as well as how they’re listening:
(Note to self – if we can get a live feed of marker locations and comments, we can maybe post some live mashups…;-)
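If a live feed of markers did become available, repackaging it for mashups could be as simple as emitting GeoJSON. A rough sketch, assuming marker records looked something like the dictionaries below (the field names are my guesses, not the map backend’s actual schema):

```python
import json

# Hypothetical marker records from a listeners' map feed.
markers = [
    {"lat": 27.7, "lon": 85.3, "comment": "Listening in Kathmandu"},
    {"lat": 52.0, "lon": -0.7, "comment": "Podcast on the commute"},
]

def markers_to_geojson(markers):
    """Repackage marker records as a GeoJSON FeatureCollection,
    ready to drop into most web mapping libraries."""
    return {
        "type": "FeatureCollection",
        "features": [
            {
                "type": "Feature",
                # GeoJSON coordinates are [longitude, latitude]
                "geometry": {"type": "Point",
                             "coordinates": [m["lon"], m["lat"]]},
                "properties": {"comment": m["comment"]},
            }
            for m in markers
        ],
    }

print(json.dumps(markers_to_geojson(markers), indent=2))
```

A feed in that shape could be re-plotted on pretty much any map widget without further processing.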
For more on the first episode, and the listeners’ map, check out Exploring the GeoWeb with Digital Planet.
So, what have we done for the second special? Well the theme this time was “DIY Technology”, with packages from the recent Maker Faire in Newcastle, the story of Microsoft’s Photosynth (did you know there’s an unofficial but accepted iPhone version?!), and a feature all about the fascinating world of font design (the programme will be around for a day or two yet via the podcast feed – you can reach it from the Open2.net Digital Planet pages).
Partly because ace developer Simon Budgen managed to pull so many things together, we’ve actually got a mini-site for the supporting materials for this episode – Digital Planet DIY Technology Special, on open2.net – so what will you find there?
First up, something font-tastic: there’s an opportunity to see the font Gareth made from his own handwriting – Gareth New Roman – as well as a download link for the font. If you think you can do better, there are links to some online tools to get you started designing your own font.
Secondly, there’s our Digital Planet – Photosynth page, which contains a link to a Photosynth of the studio using photos taken during the recording of the programme (unfortunately, we couldn’t embed the Silverlight version of the player in the page because BBC Future Media Standards and Guidelines prevent it:-().
The page also includes a clip from the programme that I topped and tailed with a biddly-dee biddly-dee bong, a programme intro, and a closing credit, which had been on my “things we need to demonstrate” list for ages. (There’s a rights story behind that clip too, that I’ll maybe tell one day…;-)
The third (count ‘em) page we got up wrapped the Maker Faire package (Digital Planet at the Maker Faire). Rather than embedding another audio clip taken from the programme, this page actually embeds (using the embed code anyone can use) a BBC video report from the Faire:
As if that wasn’t enough, the page actually embeds a couple of other things too – firstly, an image feed pulling appropriately tagged images in from Flickr:
(Note the reactive/responsive moderation policy we have in place…)
..and also some Youtube videos that describe how to make some “LED Throwies” that were featured in the Maker Faire package:
So that’s the ‘official’ mini-site. But there’s more…
Over on the Open2 Science and Technology blog, I posted an article about Arduino, a simple electronics development board – ideal for tinkering with. And embedded in that post, another clip from Digital Planet (without the top and tailing credits this time – just a simple cut from the programme):
Just by the by, there are a couple of other things to note about the blog post: first, the embedded image is one we grabbed from Wikipedia; and secondly, we managed to sneak a tease for a forthcoming OU course in there…
So that’s it, right? Well, not quite – over on Platform, the OU’s social site that’s open to anyone (not just members of the OU family), there’s another blog post – reinforcing the font package and mixing in a couple more font related things for people to do. (Note that the Platform post is not linked to from the open2 pages (the open2 blog post is) – it’s there solely as another entry point to the open2 pages.)
And finally, just for the record, here’s a note on schedules… I started chatting to the BBC production team about four weeks ago, bouncing around ideas for the programme. A couple of phone calls and a couple of email exchanges firmed things up, and then I got copies of the pre-recorded packages last Friday. Gareth circulated a draft studio script on Saturday, and I sent it back with comments Sunday. Another draft arrived Monday morning (along with a chaser phone call!), and I sent final comments back before the mid-afternoon recording. Gareth sent a copy of his font to Simon on Sunday (Monday?) and the site went up in draft form (no mini-site at that point) on Monday. Gareth uploaded the studio photos to flickr and let us have them Monday, and on Monday evening I pulled them into a Photosynth that was linked to on the Tuesday. The Platform post appeared Wednesday and the embedded audio clips, Sci/Tech blog and mini-site navigation were in place today (Thursday), having been edited on Tuesday and Wednesday (using Audacity, as it happens…).
Huge thanks to the DP production team (Angela Sain, Rami Tzabar, Michelle Martin) and of course Gareth and billt, and esp. Simon B for pulling the open2 site together so reactively:-)
As ever, great fun… and a wonderful production schedule to work to!
One of the, err, side projects I’ve been looking at with a couple of people from the OBU involves bouncing around a few ideas about how we might “wrap” coverage of Formula One races with some open educational resources.
So with the first race of the new season over, I thought I’d have a quick play with some of the results data…
First off, where to get the results info? An API source doesn’t seem to be available anywhere that I’ve found as a free service, but the FIA media centre do publish a lot of the data (albeit in a PDF format): F1 Media Centre – Melbourne Grand Prix, 2009.
To get the data into an appropriate form required a little bit of processing (for example, recasting the race lap chart to provide the ranking per lap ordered by driver) but as ever, most of the charts fell out easily enough (although a couple more issues were raised – like being able to specify the minimum y-axis range value on a bar chart, for example).
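The “recasting” step is essentially a cumulative-sum-and-rank operation: from per-lap times for each driver, work out every driver’s race position at the end of each lap. A minimal sketch with made-up lap times (not the actual Melbourne data):

```python
# Illustrative lap times in seconds, one list per driver.
lap_times = {
    "HAM": [92.1, 91.8, 92.5],
    "BUT": [91.9, 92.2, 91.7],
}

def positions_per_lap(lap_times):
    """Return {lap_number: [drivers ordered by race position]},
    ranking drivers by cumulative race time at the end of each lap."""
    n_laps = len(next(iter(lap_times.values())))
    totals = {driver: 0.0 for driver in lap_times}
    chart = {}
    for lap in range(n_laps):
        for driver, times in lap_times.items():
            totals[driver] += times[lap]
        chart[lap + 1] = sorted(totals, key=totals.get)
    return chart

print(positions_per_lap(lap_times))
# → {1: ['BUT', 'HAM'], 2: ['HAM', 'BUT'], 3: ['BUT', 'HAM']}
```

Feeding a table in that per-lap-ranking form into a charting tool gives you the “race history” style charts directly.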
Anyway, you can find the charts linked to from here: Australia Lap Times visualisation.
In the meantime, here are some examples (click through to reach the interactive original).
First up, a scatter plot to compare lap times for each driver across the race:
Secondly, a line chart to compare time series lap times across different drivers:
This bar chart view lets you compare the lap times for each driver over a subset of laps:
A “traditional” drivers standings chart for each lap:
Finally, this bar chart can be run as an animation (sort of) to show the rank of each driver for each lap during the race:
There are a few more data sets (e.g. pitting behaviour) that I haven’t had a look at yet, but if and when I do, I will link to them from the Australia Lap Times visualisation page on Many Eyes Wikified.
PS If you’re really into thinking about the data, maybe you’d like to help me think around how to improve the “Pit stop strategist” spreadsheet I started messing around with too?! ;-)
PPS It’s now time for the 2010 season, and this year, there’s some McLaren car telemetry data to play with. For example, here’s a video preview of my interactive McLaren data explorer.
Digging around looking for stats and data relating to the first Formula One Grand Prix of the new season, I came across some interesting looking technical info on the Race Car Engineering website (Formula 1 2009: Round 1 Australia tech data), as well as details of the starting weights of the vehicles following qualification (post-qualifying car weights).
Assuming that the weight of the fuel is the post-qualifying weight of the car minus the minimum weight of the car, it should be possible to have a guess at when the teams are planning their first pit stop. So I started doodling a spreadsheet that could be used to try and work out fuel’n’pitting strategies (albeit very simplistically).
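The first guess the spreadsheet makes can be boiled down to a few lines. Note that the fuel consumption figure below is an assumption for illustration, not an official number (the 605 kg minimum weight is the 2009 figure, including the driver):

```python
MIN_CAR_WEIGHT_KG = 605    # 2009 minimum car weight, driver included
FUEL_BURN_KG_PER_LAP = 2.4  # assumed consumption for this circuit

def estimated_first_stop(post_quali_weight_kg):
    """Guess the latest lap a car can reach before its first pit stop:
    fuel load = post-qualifying weight minus the minimum car weight."""
    fuel_kg = post_quali_weight_kg - MIN_CAR_WEIGHT_KG
    return int(fuel_kg / FUEL_BURN_KG_PER_LAP)

print(estimated_first_stop(655))  # 50 kg of fuel → around lap 20
```

Obviously this ignores the fuel burnt on the formation lap, differing consumption between cars, and so on – which is exactly the sort of refinement the spreadsheet is for.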
If you’re interested, you can find it here: Race Day Strategist Spreadsheet:
I’ve made a few assumptions about how to calculate how far the fuel will take a car, so if you can tell me if/where I’ve made any mistakes/errors/bad assumptions, please post a comment.
I’ve tried to make the working clear, where possible:
I also put together a ‘quick calculator’ that could be used to play-along-a-strategist while watching the race.
All the formulae were made up on the fly (“hmm, this could be interesting?”), so when I get a chance, I’ll do a little reading to find out how other people have addressed the issue. (I’ve already found links for a couple of things I probably ought to read: Practice Work – Optimization of F1 – PIT STOP TACTICS (which may contain some interesting ideas) and the rather more involved Planning Formula One race strategies using discrete-event simulation (subscription required, so OU folks should be okay through the OU library).) If there are any other things you think I should add to the list, please pop a reference to them in the comments.
This spreadsheet could obviously go much further – addressing other pit stop timing delays, tyre considerations etc. Being able to pull in live timing data – e.g. time intervals between the car of interest and other vehicles – and predict car lap times would also add a little more intrigue when trying to decide whether or not to pit.
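The simplest version of that live-timing question is just a comparison against the total pit-lane time loss. A sketch (the 22-second loss figure is an assumption, and real strategy calls would have to account for traffic, tyre warm-up and so on):

```python
PIT_LOSS_S = 22.0  # assumed total time lost to a pit stop, in seconds

def safe_to_pit(gap_to_car_behind_s, pit_loss_s=PIT_LOSS_S):
    """True if the gap to the car behind covers the pit stop time loss,
    i.e. we would rejoin the track still ahead of them."""
    return gap_to_car_behind_s > pit_loss_s

print(safe_to_pit(25.0))  # True: we rejoin still ahead
print(safe_to_pit(18.0))  # False: we'd come out behind
```

Hooked up to a live interval feed, even this crude test would make watching the pit window open a lot more interesting.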
But it’s a start, and it got me asking a few questions that might not otherwise have come to mind ;-)
All I need to do now is work in the visual angle, maybe taking a little inspiration from Visualising Lap Time Data – Australian Grand Prix, 2009…
Way back when, the Library piloted a video search engine – DiVA – that would search over some of the video material that had been produced specifically for OU courses (Course Content Image Search) and possibly over some of the content the OU had co-produced with the BBC.
Recently, of course, the OU has got into co-producing flagship programmes for BBC1 and BBC2, as well as the lesser channels, but as far as I know, there is no easy way for us to search over this material (the best way used to be the now deprecated BBC Catalogue search).
As the BBC programme catalogue adds entries, this will become increasingly valuable for resource discovery, and it will also be interesting to see how Box of Broadcasts plays out, too.
For using video in courses, there are three main issues: 1) discovery of the clip; 2) rights clearance; 3) actually getting the video embedded in the VLE.
In an ideal world, I’d quite like to be able to go to an institutional version of Youtube, enter the search terms and get a video clip. This is already possible in the Youtube universe, of course…. For example, I want to use a clip from a James May programme that the OU co-produced, so the easiest way I could think of saying “this is the clip I want” was to search for james may motion capture on Youtube and grab the top result:
Overall time to go from thinking “I’d like this clip” to getting an embed code for it (albeit a copyright infringing one)? Less than a minute.
I have just started the process of trying to get an official version of the clip (start time: 14.00 Weds April 15th, 2009…) so it’ll be interesting to see whether I can get this clip in the VLE in time for when it’s actually needed at the start of June. Indeed, I’m not even sure I sent the email to the right person, so maybe I’ve only false started on actually finding out how to get this clip?!
When it comes to referring students to complete programmes, I’m not sure what the best approach is?
My ad hoc approach would be to try to find out whether a programme was likely to be broadcast on the BBC somewhere during the presentation of the course, and if it was, telling students to find it on iPlayer.
I’d possibly also look for links to what I needed from a BBC Programmes catalogue listing, the BBC World Service documentaries archive, the BBC Four interviews archive (deprecated), the BBC Learning Zone class clips website, or the BBC Archive and so on. (If they were no good, I’d end up on Redux….).
…and that’s just the BBC of course: the other UK terrestrial channels (or at least, ITV and Channel 4) now happily stream catch-up services on the web, as well as making some of their content available (at least in Channel 4’s case) to services such as Joost (e.g. Channel 4 shows on Joost).
I’m not sure whether it’d also be useful to start compiling lists of links to BBC programme pages for OU co-pro programmes, because there’s nothing that obviously fulfills that role on Open2.net. (The closest I have at the moment is the OU/BBC iPlayer catch-up mashups here and here).