OUseful.Info, the blog…

Trying to find useful things to do with emerging technologies in open education

Posts Tagged ‘youtube’

The Learning Journey Starts Here: Youtube.edu and OpenLearn Resource Linkage

A week or two ago, while I was mulling over the OU’s OULearn pages on Youtube, colleague Bernie Clark pointed out to me how hit or miss the links from the OU clip descriptions could be:

Via @lauradee, I see that the OU has a new offering on YouTube.com/edu that is far more supportive of links to related content, links that can represent the start of a learning journey through OU educational – and commentary – content on the OU website.

Here’s a way in to the first bit of OU content that seems to have appeared:

This links through to a playlist page with a couple of different sorts of opportunity for linking to resources collated at the “Course materials” or “Lecture materials” level:

(The language gives something away, I think, about the expectation of what sort of content is likely to be uploaded here…)

So here, for example, are links at the level of the course/playlist:

And here are links associated with each lecture, erm, clip:

In this first example, several types of content are being linked to, although from the link itself it’s not immediately obvious what sort of resource a link points to? For example, some of the links lead through to course units on OpenLearn/Learning Zone:

Others link through to “articles” posted on the OpenLearn “news” site (I’m not ever really sure how to refer to that site, or the content posts that appear on it?)

The placing of content links into the Assignments and Others tabs seems a little arbitrary to me from this single example, but I suspect that once a few more lists have been posted, some sort of feeling will emerge about what sorts of resources should go where (i.e. what folk might expect from “Assignment” or “Other” resource links). If there’s enough traffic generated through these links, a bit of A/B testing might even be in order relating to the positioning of links within tabs and the behaviour of students once they click through (assuming you can track which link they clicked through, of course…)?

The transcript link is unambiguous though! And, in this case at least, it resolves to a PDF hosted somewhere on the OU podcasts/media filestore:

(I’m not sure if caption files are also available?)

Anyway – it’ll be interesting to hear back about whether this enriched linking experience drives more traffic to the OpenLearn resources, as well as whether the positioning of links in the different tab areas has any effect on engagement with materials following a click…

And as far as the linkage itself goes, I’m wondering: how are the links to OpenLearn course units and articles generated/identified, and are those links captured in one of the data.open.ac.uk stores? Or is the process that manages what resource links get associated with lists and list items on Youtube/edu one that doesn’t leave (or readily support the automated creation of) public data traces?

PS How much (if any) of the linked resource goodness is grabbable via the Youtube API, I wonder? If anyone finds out before me, please post details in the comments below :-)
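For what it’s worth, one obvious place to start looking would be the clip descriptions themselves, which the Youtube gdata API does expose. Here’s a minimal Python sketch along those lines – the video ID is a placeholder, and the JSON field names are from memory, so treat the details as assumptions rather than gospel:

import json
import re
import urllib2

VIDEO_ID = "VIDEO_ID_GOES_HERE"  # placeholder - swap in a real clip ID
url = "http://gdata.youtube.com/feeds/api/videos/%s?v=2&alt=json" % VIDEO_ID

# Pull the video entry and dig out the description text
data = json.load(urllib2.urlopen(url))
description = data["entry"]["media$group"]["media$description"]["$t"]

# Crude pass over the description looking for anything that points at open.ac.uk
links = re.findall(r"http://\S*open\.ac\.uk\S*", description)
print(links)

Whether the “Course materials”/“Lecture materials” links are exposed anywhere in the API response is, of course, exactly the open question.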

Written by Tony Hirst

April 27, 2012 at 1:53 pm

Confluence in My Feed Reader – The Side Effects of Presenting

Don’tcha just love it when complementary posts happen along within a day or two of each other? Earlier this week, Martin posted on the topic of Academic output as collateral damage, suggesting that “you can view higher education as a long tail content production system. And if you are producing this stuff as a by-product of what you do anyway then a host of new possibilities open up. You can embrace unpredictability”.

And then today, other Martin comes along with a post – Presentation: Twitter for in-class voting and more for ESTICT SIG – linking to a recording of a presentation he gave yesterday, one that includes twitter backchannel captions from the presentation, tweeted both by the presentation itself and by the (potentially extended/remote) audience.

Brilliant… I love it… I’m pretty much lost for words…

Just… awesome…

What we have here, then, is the opening salvo in a presentation capture and amplification strategy where the side effects of the presentation create a legacy in several different dimensions – an audio-visual record, for after the fact; a presentation that announces its own state to a potentially remote Twitter audience, and that in turn can drive backchannel activity; a recording of the backchannel, overlaid as captions on the video recording; and a search index that provides timecoded results from a search based on the backchannel and the tweets broadcast by the presentation itself. (If nothing else, capturing just the tweets from the presentation provides a way of deep searching in time into the presentation.)

Amazing… just amazing…

Written by Tony Hirst

April 30, 2010 at 1:16 pm

Searching the Backchannel – Martin Bean, OU VC, Twitter Captioned at JISC10

Other Martin’s been at it again, this time posting JISC10 Conference Keynotes with Twitter Subtitles.

The OU’s VC, Martin Bean, gave the opening keynote, and I have to admit it really did make me feel that the OU is the best place for me to be working at the moment :-)

… though maybe after embedding that, my days are numbered…? Err…

Anyway, I feel like I’ve not really been keeping up with other Martin’s efforts, so here’s a quick hack as a placemarker/waypoint in one of the directions I think the captioning could go – deep search linking into video streams (where deep linking is possible).

Rather than search the content, we’re going to filter captions for a particular video, in this case the twitter caption file from Martin (other, other Martin?!) Bean’s #JISC10 opening keynote. The pipework is simple – grab the URL of the caption file and a “search” term, parse the captions into a feed with one item per caption, then filter on the caption content. I added a little Regular Expression block just to give a hint as to how you might generate a deeplink into content based around the start time of the caption:

Filter-based caption search

You can find the pipe here: Twitter caption search
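If pipes aren’t your thing, the same idea is easy enough to sketch in a few lines of Python, assuming the Twitter captions are available as a simple SRT-style file (the filename and search term below are just placeholders):

def srt_entries(path):
    """Yield (start_time, text) pairs from a simple SRT-style caption file."""
    for block in open(path).read().split("\n\n"):
        lines = block.strip().split("\n")
        if len(lines) < 3:
            continue
        start = lines[1].split(" --> ")[0]          # e.g. 00:01:23,000
        yield start, " ".join(lines[2:])

def to_seconds(ts):
    h, m, s = ts.replace(",", ".").split(":")
    return int(h) * 3600 + int(m) * 60 + int(float(s))

# Filter the captions on a search term and emit a deep-link style offset
for start, text in srt_entries("jisc10_keynote_twitter.srt"):   # placeholder filename
    if "library" in text.lower():                               # placeholder search term
        print("#t=%ds  %s  %s" % (to_seconds(start), start, text))

The “#t=” bit is just there as a hint at how a caption’s start time might be turned into a deep link on a platform that supports timecoded links.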

One thing to note is that it may take some time for someone to tweet what a speaker has said. If we had a transcript caption file (i.e. a timecoded transcript of the presentation) we might be able to work out the “mean time to tweet” for a particular event/twitterer, in which case we could backdate timestamps to guess the actual point in the video that a person was tweeting about. (I looked at using auto-generated transcript files from Youtube to trial this, but at the current time they’re rubbish. That said, voice search on my phone was rubbish a year ago, but by Christmas it was working pretty well, so the Goog’s algorithms learn quickly, especially where error signals are available. So bear in mind that if you do post videos to Youtube and you can upload a caption file, then as well as helping viewers you’ll also be helping train Google’s auto-transcription service, because it’ll be able to compare the result of auto-transcription with your captions file… If you’re the Goog, there are machine learning/supervised learning cribs everywhere!)
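Just to make the backdating idea concrete, here’s a trivial sketch (all the numbers are made up for the sake of the example):

from datetime import datetime, timedelta

# If we know when the video/presentation started and have an estimate of the
# "mean time to tweet", we can shift a tweet's timestamp back to guess the
# point in the video it refers to.
video_start = datetime(2010, 4, 12, 9, 30, 0)        # when the keynote began
mean_time_to_tweet = timedelta(seconds=40)            # guessed hearing-to-tweeting lag

tweet_time = datetime(2010, 4, 12, 9, 42, 25)         # timestamp on the tweet
guessed_offset = (tweet_time - mean_time_to_tweet) - video_start
print("Guessed video offset: %d seconds" % guessed_offset.total_seconds())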

(Just by the by, I also wonder if we could colour code the captions, using a different colour for tweets that refer to the content of an earlier tweet/backchannel comment rather than to the foreground content of the speaker?)

Unfortunately, caption files on Youtube, which does support deep time links into videos, only appear to be available to video owners (Youtube API: Captions), so I can’t do a demo with Youtube content… and I really should be doing other things, so I don’t have the time right now to look at what would be required for deeplinking elsewhere… :-(

PS The captioner tool can be found here: http://www.rsc-ne-scotland.org.uk/mashe/ititle/

Martin Hawksey, whose work this is, has described the evolution of the app in a series of posts here: http://www.rsc-ne-scotland.org.uk/mashe/?s=twitter+subtitles

Written by Tony Hirst

April 19, 2010 at 12:59 pm

Posted in Pipework, Search


Watching YouTube Videos on Boxee via DeliTV

One of the easiest ways to get started with DeliTV is to use it to watch video feed subscriptions from YouTube.

With DeliTV, you can bookmark the following sorts of Youtube content and then view it in a DeliTV Channel:

Bookmarked YouTube page, and the resulting DeliTV subscription:

- User homepage/channel (e.g. the Teachers’ TV channel or the Guardian newspaper channel) – recently uploaded videos for that user
- Playlist page (e.g. the T151: 3D Geo-World Demos playlist) – playlist feed
- Video page (e.g. The Machine is Us/ing Us (Final Version)) – single video
- [NEW] Search results page (e.g. a search for “formula one”) – search results containing the 20 most relevant videos

Here is the example channel bookmarked to a demo DeliTV channel guide: delitv_ytdemo:

(You can of course grab a copy of any of these bookmarks into your own delicious account.)

We can now bookmark this channel guide so that it appears in a DeliTV multiplex. In the following example, I’m bookmarking it to my main delitv feed, and also to the boxeetest5 multiplex.

Here’s the result in my boxeetest5 feed:

DeliTV

And here’s a view of the delitv_ytdemo channel guide:

DeliTV channel guide

This is what the bookmarked user/channel produces – the recent uploads listing for that user/channel:

DeliTV - Youtube user/channel recent uploads

And here’s the playlist guide:

DeliTV - Youtube playlist feed

Remember, with DeliTV you don’t need to bookmark the actual Youtube feed – just bookmark the user/channel, playlist or video page to Delicious, and DeliTV will do the rest for you…
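Behind the scenes, then, DeliTV is essentially rewriting the bookmarked page URL into the corresponding Youtube gdata feed. Here’s a rough Python sketch of that sort of mapping – to be clear, this is illustrative only, not the actual DeliTV code, and the URL patterns/regexes are approximations:

import re

def youtube_feed_for(bookmarked_url):
    # User/channel page -> recent uploads feed for that user
    m = re.search(r"youtube\.com/user/([\w-]+)", bookmarked_url)
    if m:
        return ("rss://gdata.youtube.com/feeds/base/users/%s/uploads"
                "?alt=rss&v=2&orderby=published" % m.group(1))
    # Playlist page -> playlist feed
    m = re.search(r"[?&](?:list|p)=([\w-]+)", bookmarked_url)
    if m:
        return "rss://gdata.youtube.com/feeds/api/playlists/%s?alt=rss&v=2" % m.group(1)
    # Video page -> single video
    m = re.search(r"[?&]v=([\w-]+)", bookmarked_url)
    if m:
        return "rss://gdata.youtube.com/feeds/api/videos/%s?alt=rss&v=2" % m.group(1)
    # Search results page -> the 20 most relevant videos for that query
    m = re.search(r"search_query=([^&]+)", bookmarked_url)
    if m:
        return ("rss://gdata.youtube.com/feeds/api/videos"
                "?q=%s&max-results=20&alt=rss&v=2" % m.group(1))
    return None

print(youtube_feed_for("http://www.youtube.com/user/abertayTV"))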

To learn how to subscribe to your own DeliTV channel, see Deli TV – Personally Programmed Social Television Channels on Boxee: Prototype

PS A new feature, currently in testing, lets you bookmark a search results page. Whilst it is possible to generate searches for playlists or users/channels as well as videos, DeliTV currently returns just the 20 most relevant Youtube videos when a Youtube search results page is bookmarked.

Written by Tony Hirst

September 19, 2009 at 3:27 pm

Posted in Anything you want, OBU


UK HEI Boxee Channel

A week or so ago, Liz Azyan posted a list of UK HEI Youtube channels. The result isn’t quite as polished as @liamgh et al’s OU Boxee app, but I picked up on a couple of suggestions Liam made over a pint last night about simply subscribing to an RSS feed in Boxee to roll my own UK HEI Youtube Boxee channel thing…

So here are the institutional channels:

and here’s a peek inside one of them:

This lets me watch the most recent uploads to all (?) the UK HEIs’ Youtube channels, organised by institution, via a lean-back TV interface.

(You might be able to submenu the institutional channels/streams according to playlists they have specified, as well as tidying up things like icons/logos, maybe, but this was a 10 minute hack, rather than a half hour hack, ok?!;-)

Here’s the recipe…

1. Grab the table from Liz’s web page and create a feed from it:

2. Generate the feed URIs for the most recent uploads to each channel (in the form required by Boxee – e.g. rss://gdata.youtube.com/feeds/base/users/abertayTV/uploads?alt=rss&v=2&orderby=published):

Filter out stuff that isn’t a feed and complete the pipe:
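For anyone who’d rather script it than pipe it, the feed URI generation step is only a few lines of Python – a rough sketch, with a couple of channel usernames standing in for the table scraped from Liz’s page:

# The pipe's job in plain Python: take a list of Youtube channel usernames
# (standing in here for the table scraped from Liz's page) and turn each one
# into an uploads feed URI in the form Boxee expects.
channels = ["abertayTV", "OUlearn"]   # illustrative usernames only

feed_template = ("rss://gdata.youtube.com/feeds/base/users/%s/uploads"
                 "?alt=rss&v=2&orderby=published")

for name in channels:
    print(feed_template % name)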

We can now grab the RSS feed from the pipe in the normal way and subscribe to it via a personal account on the Boxee website.

If you now launch the Boxee app, select:

- Video:

- Internet

- Video Feeds (My Feeds)

- the UK HEI Youtube Videos Channel

And from there, you should be able to browse – and play – the recent uploads to all the UK HEI Youtube channels that Liz has listed.

Note that I had a niggle with my Boxee player – I could hear the audio but not see the video for any of the Youtube videos when I tried to play them. If anyone else tries out this channel and gets the same problem, please let me know and I’ll see if it’s a feed problem. Otherwise, I’ll assume it’s a local glitch…

Here’s the RSS feed URI again: “UK HEI Youtube Channels on Boxee” RSS feed

PS Out of interest, if I had bid to do this as a #jiscri project, how much should I have asked for?

- planning: 10 mins chatting with Liam over a pint yesterday
- design: <5 mins looking up Youtube API/URI patterns
- implementation: <5 mins creating the Yahoo pipe
- configuration: <5 mins subscribing to the pipe feed in Boxee
- testing: <5 mins seeing if it worked in Boxee (which it doesn’t, properly, but I’m blaming that on a local problem and trusting that it does actually work… err…?!;-)

Okay, so all told it was maybe a sub-20 minute hack rather than a 5 minute one?

- documentation (i.e. this blog post): 30-45 mins, incl. grabbing screenshots.

And I’m on holiday today…

Written by Tony Hirst

August 28, 2009 at 2:44 pm

Posted in Tinkering

