Posts Tagged ‘youtube’
Mulling over the OU’s OULearn pages on Youtube a week or two ago, colleague Bernie Clark pointed out to me how the links from the OU clip descriptions could be rather hit or miss:
Via @lauradee, I see that the OU has a new offering on YouTube.com/edu that is far more supportive of links to related content, links that can represent the start of a learning journey through OU educational – and commentary – content on the OU website.
Here’s a way in to the first bit of OU content that seems to have appeared:
This links through to a playlist page with a couple of different sorts of opportunity for linking to resources collated at the “Course materials” or “Lecture materials” level:
(The language gives something away, I think, about the expectation of what sort of content is likely to be uploaded here…)
So here, for example, are links at the level of the course/playlist:
And here are links associated with each lecture, erm, clip:
In this first example, several types of content are being linked to, although from the link itself it’s not immediately obvious what sort of resource a link points to. For example, some of the links lead through to course units on OpenLearn/Learning Zone:
Others link through to “articles” posted on the OpenLearn “news” site (I’m never really sure how to refer to that site, or the content posts that appear on it).
The placing of content links into the Assignments and Others tabs seems a little arbitrary to me from this single example, but I suspect that once a few more lists have been posted, some sort of feeling will emerge about what sorts of resources should go where (i.e. what folk might expect by “Assignment” or “Other” resource links). If there’s enough traffic generated through these links, a bit of A/B testing might even be in order relating to the positioning of links within tabs and the behaviour of students once they click through (assuming you can track which link they clicked through, of course…)?
The transcript link is unambiguous though! And, in this case at least, resolves to a PDF hosted somewhere on the OU podcasts/media filestore:
(I’m not sure if caption files are also available?)
Anyway – it’ll be interesting to hear back about whether this enriched linking experience drives more traffic to the OpenLearn resources, as well as whether the positioning of links in the different tab areas has any effect on engagement with materials following a click…
And as far as the linkage itself goes, I’m wondering: how are the links to OpenLearn course units and articles generated/identified, and are those links captured in one of the data.open.ac.uk stores? Or is the process that manages what resource links get associated with lists and list items on Youtube/edu one that doesn’t leave (or readily support the automated creation of) public data traces?
PS How much (if any) of the linked resource goodness is grabbable via the Youtube API, I wonder? If anyone finds out before me, please post details in the comments below:-)
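If the resource links simply live in each clip’s description text, then a first stab at grabbing them via the API might just be a URL regex run over the description field of each video entry. A quick sketch – the sample description text here is made up, not copied from a real OULearn clip:

```python
import re

def extract_links(description):
    """Pull http(s) URLs out of a clip's description text."""
    return re.findall(r'https?://[^\s<>"]+', description)

# Made-up description text of the kind shown in the screengrabs above
desc = ("Course materials: http://openlearn.open.ac.uk/course/view.php?id=1234 "
        "Transcript: http://podcast.open.ac.uk/transcript.pdf")
print(extract_links(desc))
```

Of course, if the links are held in some separate playlist-level metadata rather than the description, this gets you nothing; that’s exactly the question above.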
One of the easiest ways to get started with DeliTV is to use it to watch video feed subscriptions from YouTube.
With DeliTV, you can bookmark the following sorts of Youtube content and then view it in a DeliTV Channel:
| Bookmarked YouTube page | Resulting DeliTV subscription |
|---|---|
| User/channel page, e.g. Teachers’ TV channel | Recently uploaded videos for that user |
| Playlist page, e.g. T151: 3D Geo-World Demos | Playlist feed |
| Video page, e.g. The Machine is Us/ing Us (Final Version) | Single video |
| [NEW] Search results page, e.g. search for “formula one” | Search results containing the 20 most relevant videos |
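For the curious, the page-to-feed mapping above might be sketched something like this; the GData feed endpoints are from the old v2 API, and the dispatch rules are my guesses at what DeliTV does rather than its actual code:

```python
from urllib.parse import urlparse, parse_qs, quote_plus

GDATA = "http://gdata.youtube.com/feeds/base"

def delitv_feed(url):
    """Map a bookmarked Youtube page URL to a GData feed URL (sketch;
    the dispatch rules here are guesses at what DeliTV does)."""
    parsed = urlparse(url)
    qs = parse_qs(parsed.query)
    if parsed.path.startswith("/user/"):           # user/channel page
        user = parsed.path.split("/")[2]
        return "%s/users/%s/uploads?alt=rss" % (GDATA, user)
    if parsed.path == "/watch":                    # single video page
        return "%s/videos/%s?alt=rss" % (GDATA, qs["v"][0])
    if parsed.path.startswith("/view_play_list"):  # playlist page
        return "%s/playlists/%s?alt=rss" % (GDATA, qs["p"][0])
    if parsed.path == "/results":                  # search results page
        q = quote_plus(qs["search_query"][0])
        return ("%s/videos?q=%s&max-results=20&orderby=relevance&alt=rss"
                % (GDATA, q))
    return None
```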
Here is the example channel bookmarked to a demo DeliTV channel guide: delitv_ytdemo:
(You can of course grab a copy of any of these bookmarks into your own delicious account.)
We can now bookmark this channel guide so that it appears in a DeliTV multiplex. In the following example, I’m bookmarking it to my main delitv feed, and also to the boxeetest5 multiplex.
Here’s the result in my boxeetest5 feed:
And here’s a view of the delitv_ytdemo channel guide:
This is what the bookmarked user/channel produces – the recent uploads listing for that user/channel:
And here’s the playlist guide:
Remember, with DeliTV you don’t need to bookmark the actual Youtube feed – just bookmark the user/channel, playlist or video page to Delicious, and DeliTV will do the rest for you…
To learn how to subscribe to your own DeliTV channel, see Deli TV – Personally Programmed Social Television Channels on Boxee: Prototype
PS a new feature, currently in testing, lets you bookmark a search results page. Whilst it is possible to generate searches for playlists or users/channels as well as videos, DeliTV currently returns just the 20 most relevant Youtube videos when a Youtube search results page is bookmarked.
A week or so ago, Liz Azyan posted a list of UK HEI Youtube channels. The result isn’t quite as polished as @liamgh et al’s OU Boxee app, but I picked up on a couple of suggestions Liam made over a pint last night about simply subscribing to an RSS feed in Boxee to roll my own UK HEI Youtube Boxee channel thing…
So here are the institutional channels:
and here’s a peek inside one of them:
This lets me watch the most recent uploads to all (?) the UK HEIs’ Youtube channels, organised by institution, via a lean-back TV interface.
(You might be able to submenu the institutional channels/streams according to playlists they have specified, as well as tidying up things like icons/logos, maybe, but this was a 10 minute hack, rather than a half hour hack, ok?!;-)
Here’s the recipe…
1. Grab the table from Liz’s web page and create a feed from it:
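(If you wanted to do that step outside Pipes, a quick-and-dirty regex over the page HTML would do; the markup pattern here is an assumption about how Liz’s table links to the channels:)

```python
import re

def channel_names(html):
    """Find Youtube user/channel names linked from a page's HTML
    (the markup pattern here is an assumption about Liz's table)."""
    return re.findall(r'youtube\.com/user/([A-Za-z0-9_-]+)', html)

html = '<td><a href="http://www.youtube.com/user/abertayTV">Abertay</a></td>'
print(channel_names(html))
```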
2. Generate the feed URIs for the most recent uploads to each channel (in the form required by Boxee – e.g. rss://gdata.youtube.com/feeds/base/users/abertayTV/uploads?alt=rss&v=2&orderby=published):
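That URI construction step is just string templating; outside Pipes it might look like this (abertayTV is the example above, the second name is purely for illustration):

```python
def uploads_feed(channel):
    """Build the Boxee-friendly RSS URI for a channel's recent uploads."""
    return ("rss://gdata.youtube.com/feeds/base/users/%s/uploads"
            "?alt=rss&v=2&orderby=published" % channel)

# abertayTV appears above; the other name is just for illustration
for c in ["abertayTV", "OpenUniversity"]:
    print(uploads_feed(c))
```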
3. Filter out stuff that isn’t a feed and complete the pipe:
If you now launch the Boxee app, select:
- Video Feeds (My Feeds)
- the UK HEI Youtube Videos Channel
And from there, you should be able to browse – and play – the recent uploads to all the UK HEI Youtube channels that Liz has listed.
Note that I had a niggle with my Boxee player – I could hear the audio but not see the video for any of the Youtube videos when I tried to play them. If anyone else tries out this channel and gets the same problem, please let me know and I’ll see if it’s a feed problem. Otherwise, I’ll assume it’s a local glitch…
Here’s the RSS feed URI again: “UK HEI Youtube Channels on Boxee” RSS feed
PS out of interest, if I had bid to do this as a #jiscri project, how much should I have asked for?
- planning: 10 mins chatting with Liam over a pint yesterday;
- design: <5 mins looking up Youtube API/URI patterns
- implementation: <5 mins creating Yahoo pipe
- configuration: <5 mins subscribing to the pipe feed in Boxee
- testing: <5 mins seeing if it worked in Boxee (which it doesn’t, properly, but I’m blaming that on a local problem and trusting that it does actually work… err…?!;-)
Okay, so all told it was maybe a sub-20 minute hack rather than a 5 minute one?
- documentation: (i.e. blog post) 30-45 mins, incl grabbing screenshots.
And I’m on holiday today…
In what I intend to be the last post in the series for a while – maybe – here’s a quick hack around the Guardian Open-Platform Content API that shows how to “annotate” a particular set of articles (specifically, recent video game reviews) with video trailers for the corresponding game pulled in from Youtube, and then render the whole caboodle in a Grazr widget.
So let’s begin…
Take one Guardian API query:
Using a URI of the form http://api.guardianapis.com/content/search?filter=/global/reviews&filter=/technology/games&api_key=MYSECRETACTIVATEDAPIKEY, we construct the query and call the webservice:
Here’s a typical result:
As with the football results map pipe, the link text looks like a good candidate for search query text. So let’s tweak it for use as a search on somewhere like Youtube:
I want to be able to let the user choose trailer or review videos, which is why I cleared the search string of the word “review” first, before adding the user preference back into the string.
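I don’t have the pipe internals to hand, but the string munging amounts to something like this (the headline is a made-up example, not a real Guardian result):

```python
def video_query(review_title, preference="trailer"):
    """Clean 'review' out of a review headline, then append the
    user's preferred video type ('trailer' or 'review')."""
    cleaned = review_title.replace("review", "").strip()
    return "%s %s" % (cleaned, preference)

print(video_query("Fallout 3 review", "trailer"))
```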
Now run a Youtube GData/API search using an old pipe block I found laying around (Youtube video search pipe) and grab the top result:
Now I happen to know that if you give a Grazr widget a feed with enclosure.url attributes that point to a flash file, it will embed the flash resource for you:
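So the pipe just needs to emit items whose enclosures point at a flash file for each video. A rough sketch of such an item – the /v/ URL is the old Youtube flash-player embed convention, and the exact item structure here is illustrative rather than the pipe’s actual output:

```python
def rss_item(title, video_id):
    """A minimal RSS item whose enclosure points at a flash file,
    which is the pattern Grazr spots and embeds a player for."""
    swf = "http://www.youtube.com/v/%s" % video_id  # old flash embed URL
    return ('<item><title>%s</title>'
            '<enclosure url="%s" type="application/x-shockwave-flash"/>'
            '</item>' % (title, swf))

print(rss_item("Game trailer", "tBmFzF8szpo"))
```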
So now we can take the RSS output of the pipe and pop it into a Guardian game review with video previews Grazr widget:
(The widget layout is customised as described in part in Setting the 3 Pane divider positions.)
If you want to grab the widget and embed it your own webpages, it’s easy enough to do so (although not on hosted WordPress blogs). Simply click on “Share” and select “Customize” from the widget menu bar:
Then you can customise the widget and grab an embed code, or a link to a full screen view:
Good, eh? ;-)
PS Grazr (which is actually an OPML viewer as well as an RSS viewer) embeds other stuff too. For example, here it is as a slideshare viewer; and here it is showing how to automatically generate embedded players for Youtube videos and MP3 audio files using the delicious Feed Enhancer – Auto-Enclosures pipe. (If you’re into a bit of hackery, it’ll carry Scribd iPaper too: Embed Scribd iPaper in Grazr Demo. I’m guessing you should be able to get it to embed arbitrary flash games as well?)
Pretty much all the sentiment I’ve picked from my post on Twitter Powered Subtitles for Conference Audio/Videos on Youtube is that the process for generating the subtitles is too complicated, so I’ve had a go at simplifying it:
The workflow is now as follows. Suppose you have a recording of an event that people were tweeting through using a particular hashtag, and you want to annotate the recording using the tweets made at the time as subtitles.
- Go to Twitter advanced search and search for the particular hashtag;
- Tweak the number of results on the page and the date setting (if necessary):
- If you only want tweets FROM a particular person, limit the search that way too:
- If the results you want to convert to subtitles are on “older” search results pages, navigate to the required results page;
- When you have a results page containing the tweets you want to convert to subtitles, grab the URL of that results page and copy it into the subtitler form at http://ouseful.open.ac.uk/twitterSubtitles.php
- Optionally, if you want to specify the tweet that you want to be the first subtitle, copy its URL (that is, the URL that is pointed to by the View tweet link for that tweet) into the form:
- Optionally again, if you want to specify the tweet on the results page that you want to be the last subtitle, grab its URL and paste it into the form.
- Generate the subtitles:
- Save the page as a text file with the suffix .sub:
- You can now upload the .sub subtitle file to Youtube.
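Under the hood, the subtitler has to map each tweet’s absolute timestamp onto a video-relative time and write out subtitle blocks. A minimal sketch of that conversion – the SubViewer-style .sub layout here is my guess at roughly what Youtube accepts, not necessarily what the PHP script emits:

```python
from datetime import datetime

def to_sub(tweets):
    """Render a list of (datetime, text) tweets as SubViewer-style .sub
    subtitles, timed relative to the first tweet; each subtitle stays up
    until the next tweet (or 5 seconds, for the last one)."""
    t0 = tweets[0][0]

    def fmt(secs):
        return "%02d:%02d:%05.2f" % (secs // 3600, (secs % 3600) // 60, secs % 60)

    blocks = []
    for i, (t, text) in enumerate(tweets):
        start = (t - t0).total_seconds()
        end = ((tweets[i + 1][0] - t0).total_seconds()
               if i + 1 < len(tweets) else start + 5)
        blocks.append("%s,%s\n%s\n" % (fmt(start), fmt(end), text))
    return "\n".join(blocks)
```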
So hopefully, that’s a little easier? (Note that there is a also a bookmarklet on the subtitler page that will create the subtitle file directly from a Twitter advanced search results page.)
PS here are some more thoughts about ways in which the subtitler might develop: Twitter Powered Youtube Subtitles, Reprise: Anytime Commenting.
PPS Wouldn’t it be good if CoverItLive offered an exportable subtitle file from previous events? In the meantime, does anyone know if it’s possible to get an RSS feed of posts from previous CoverItLive event commentaries?
PPPS See also: Twitterprompter?, discussing several possible use cases for Twitter in a live presentation environment.
One of the things that attracts me to serialised feeds (as well as confusing the hell out of me) is the possibility of letting people subscribe to, and add, comments in “relative time”…
… that is, as well as viewing the content via a serialised feed, the comments feed should also be serialised (with timestamps for each comment calculated relative to the time at which the person commenting started receiving the serialised feed).
Applying this to the idea of tweeted Youtube movie subtitles (Twitter Powered Subtitles for Conference Audio/Videos on Youtube) in which every tweet made during a presentation at or around that presentation becomes a subtitle on a recording of that presentation, it strikes me that a similar model is possible.
That is, different individuals could watch a Youtube video at different times, tweeting along as they do so, and then these tweets could be aggregated according to relative timestamps to provide a single, combined set of subtitles.
So how might this work in practice? Here’s a thought experiment run through…
Firstly, it’d probably be convenient to set up a twitter account to send the tweets to (say @example, for example).
Create a tag for the video – this could be something like #yt:tBmFzF8szpo for the video at http://www.youtube.com/watch?v=tBmFzF8szpo.
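For reference, deriving that tag from a watch page URL is a one-liner sort of affair:

```python
from urllib.parse import urlparse, parse_qs

def video_tag(url):
    """Derive the suggested #yt: hashtag from a Youtube watch URL."""
    video_id = parse_qs(urlparse(url).query)["v"][0]
    return "#yt:%s" % video_id

print(video_tag("http://www.youtube.com/watch?v=tBmFzF8szpo"))
```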
(Alan Levine reminded me about flickr machine tags earlier today, which are maybe also worth considering in this respect, e.g. as a source of inspiration for a tagging convention?)
Grab a ctrl-C copy of the phrase @example #yt:tBmFzF8szpo for quick pasting into a new tweet, and then start watching the video, tweeting along as you do so…
To generate your subtitle feed, you can then do a search based on Tweets from your username (which would be @psychemedia in my case) to e.g. @example, with hashtag #yt:tBmFzF8szpo, and maybe also using a date range.
(You could augment the Yahoo pipe I used in the twitter subtitle generator proof of concept to remove the hashtag when generating the feed used for subtitling?)
The actual subtitle file generator could then pull in several different subtitle feeds from separate people, relativise their timestamps relative to the time of the first tweet (which could maybe use a keyword, too – such as “START”: @example START #yt:tBmFzF8szpo;-) and then produce an aggregated subtitle feed.
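The relativise-and-merge step might be sketched like this; representing each viewer’s feed as a list of (timestamp, text) pairs is my assumption, not an existing format:

```python
from datetime import datetime

def relativise(feed):
    """Offset each tweet in one viewer's feed against that viewer's
    first (START) tweet, giving video-relative timestamps."""
    t0 = feed[0][0]
    return [(t - t0, text) for t, text in feed]

def aggregate(feeds):
    """Merge several viewers' feeds into one subtitle stream ordered
    by relative time."""
    merged = [item for feed in feeds for item in relativise(feed)]
    return sorted(merged)
```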
As more people watched the video (maybe including the subtitles to date), their feeds could be added to the aggregating subtitle file generator, and the subtitle file updated/refreshed.
Individuals could even rewatch the video and create new feeds for themselves to join in the emerging conversation…
(Okay, so it’s maybe slower than just reading through the comments, having to replay the video in real time to read the tweets, but this is a sort of thought experiment, right, albeit one that can be implemented quite easily…;-)
PS In one of the comments to Show and Translate YouTube Captions Matt Cutts gave an example of a URL that “will search for the word “china” in videos with closed captions” [ http://www.youtube.com/results?closed_captions=1&search_query=china ] (although I’m not sure how well it works?).
So I’m thinking – if live tweets from an event can be associated with a video of an event (maybe because the video is posted with a link to a (now out of date!) upcoming record for that event in order to anchor it in time) then being able to search the tweets as captions/subtitles provides a crib for deeplink searching into the video? (But then, I guess the Goog is looking at audio indexing anyway?)
PPS I just came across another tool for adding subtitles to Youtube videos, as well as videos from other online video sites – overstream.net:
It’s worth looking at, maybe?
PPPS see also Omnisio, a recent Google acquisition that offers “select clips from videos you find on YouTube and other video sites, and easily post them on your profile page or blog. Even better, you and your friends can add comments directly in the video!”.
And there’s more: “With Omnisio you make and share your own shows by assembling clips from different videos.” Roll on the remixes :-)