There seem to have been a lot of posts recently about URL shorteners/minifiers, such as this or this, which linked back to On URL shorteners by delicious founder Joshua Schachter. I’m not sure if Brian Kelly has done a risk assessment post about it yet, though? ;-)
So what are we to do in the case of URL shorteners going down, or disappearing from the web?
How about this?
When you publish a page, do a lookup against the most popular URL shortener sites to grab the shortened URLs for that page from those services, and hard-code those URLs into the page as metadata.
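As a rough sketch of what that lookup-and-annotate step might look like (assuming is.gd’s simple create API, which returns the bare short URL as plain text; other services have similar endpoints, and the `rel="shortlink"` convention here is just one candidate for the metadata):

```python
# Sketch: look up a page's short URL from a shortener and emit it as metadata.
# The is.gd API endpoint and the rel="shortlink" tag are assumptions for
# illustration; swap in whichever services and metadata convention you prefer.
import urllib.parse
import urllib.request

def fetch_short_url(long_url, api="https://is.gd/create.php"):
    """Ask the shortener for (or to create) the short URL for long_url."""
    query = urllib.parse.urlencode({"format": "simple", "url": long_url})
    with urllib.request.urlopen(f"{api}?{query}") as resp:
        return resp.read().decode("utf-8").strip()

def shortlink_meta(short_url):
    """Hard-code the short URL into the page head as metadata."""
    return f'<link rel="shortlink" href="{short_url}" />'

# Building the tag itself needs no network call:
print(shortlink_meta("https://is.gd/example"))
# -> <link rel="shortlink" href="https://is.gd/example" />
```

The publishing script would call `fetch_short_url` once per shortener at publish time and drop the resulting `<link>` elements into the page head.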
Then if a particular URL shortener service goes down, there’s a fallback position available: use the web search engines to track down your page via that metadata, as long as they index it?
PS it also strikes me that if a URL service were to go down, it’d be in e.g. Google’s interests to buy up their databases in the closing down fire sale…
PPS annotating every page would potentially overload the URL shortening services, I suspect, so I wonder: maybe page publishers should only inject the metadata into a page if they see incoming referrer traffic to that page from a URL shortening service? So for example, if the server sees incoming traffic to a page from is.gd, it grabs the is.gd short URL for the page and adds it to the page metadata? This is not a million miles away from a short URL trackback service? (Cf. also e.g. things like Tweetback.)
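The referrer-triggered version above might be sketched like this (the domain list and helper names are illustrative, not any real server’s API; the short-URL lookup is stubbed out where a real implementation would call the service):

```python
# Sketch: only annotate a page with a service's short URL once referrer
# traffic from that shortener is actually seen. Domain list is illustrative.
from urllib.parse import urlparse

SHORTENER_DOMAINS = {"is.gd", "bit.ly", "tinyurl.com", "tr.im"}

def shortener_referrer(referrer):
    """Return the shortener's domain if the referrer is a known shortener."""
    host = urlparse(referrer).hostname or ""
    return host if host in SHORTENER_DOMAINS else None

def maybe_annotate(page_meta, referrer, lookup_short_url):
    """On a hit referred from a shortener, fetch and store its short URL."""
    service = shortener_referrer(referrer)
    if service and service not in page_meta:
        page_meta[service] = lookup_short_url(service)
    return page_meta

# Example with a stubbed lookup (a real one would hit the service's API):
meta = maybe_annotate({}, "http://is.gd/abc123",
                      lambda svc: f"http://{svc}/abc123")
print(meta)  # {'is.gd': 'http://is.gd/abc123'}
```

Because the lookup only fires on the first referred hit per service, each page generates at most one API call per shortener rather than one per publish.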
PPPS via Downes: Short URL Auto-Discovery (looks like it’s being offered as an RFC).