With the internet of things still trying to find its way, I wonder why more folk aren’t talking about participatory surveillance.
For years, websites have been gifting third parties the information that you have visited them (Personal Declarations on Your Behalf – Why Visiting One Website Might Tell Another You Were There), but as more people are instrumenting themselves, the opportunities for mesh-network-based surveillance are ever more apparent.
Take something like TrackR, for example: a small, coin-sized Bluetooth device that you attach to your key fob or keep in your wallet:
The TrackR is a Bluetooth device that connects to an app running on your phone. The phone app can monitor the distance between the phone and device by analyzing the power level of the received signal. This link can be used to ring the TrackR device or have the TrackR device ring the phone.
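That “analyzing the power level of the received signal” trick is essentially the log-distance path loss model. A minimal Python sketch, with illustrative (assumed) calibration values rather than TrackR’s actual ones:

```python
def estimate_distance(rssi_dbm, tx_power_dbm=-59, path_loss_exponent=2.0):
    """Rough distance (metres) from a received BLE signal strength.

    tx_power_dbm is the expected RSSI at 1 metre (around -59 dBm is a
    common calibration value for BLE beacons); path_loss_exponent is
    about 2 in free space, higher indoors. Both defaults are
    illustrative assumptions.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))
```

In practice RSSI is noisy enough that apps report coarse “warmer/colder” bands rather than a precise range, but the maths behind the proximity display is no more than this.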
The other essential part is an app you run permanently on your phone that listens out for TrackR devices. Not just yours, but anyone’s. And when it detects one, it posts its location to a central server:
[thetrackr] Crowd GPS is an alternative to traditional GPS and revolutionizes the possibilities of what can be tracked. Unlike traditional GPS, Crowd GPS uses the power of the existing cell phones all around us to help locate lost items. The technology works by having the TrackR device broadcast a unique ID over Bluetooth Low Energy when lost. Other users’ phones can detect this wireless signal in the background (without the user being aware). When the signal is detected, the phone records the current GPS location, sends a message to the TrackR server, and the TrackR server will then update the item’s last known location in its database. It’s a way that TrackR is enabling you to automatically keep track of the location of all your items effortlessly.
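The reporting loop being described is simple enough to sketch. The function and field names below are made up for illustration (the real TrackR protocol isn’t public), but the flow — overhear a beacon ID, stamp it with your own GPS fix, post it to a central server — is as quoted above:

```python
import json
import time

def build_sighting_report(beacon_id, lat, lon):
    """Stamp an overheard beacon ID with the listener's GPS fix.

    Field names are hypothetical, for illustration only.
    """
    return json.dumps({
        "beacon_id": beacon_id,
        "lat": lat,
        "lon": lon,
        "timestamp": int(time.time()),
    })

def on_ble_advert(beacon_id, get_gps_fix, post_to_server):
    # Called each time the background scanner hears a beacon advert;
    # note that the phone's owner need never know a report was sent.
    lat, lon = get_gps_fix()
    post_to_server(build_sighting_report(beacon_id, lat, lon))
```

The point worth noticing is that the phone doing the reporting gives away its own location too: every “sighting” is also a record of where the reporting handset was at that moment.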
And if you don’t trust the TrackR folk, other alternatives are available, such as Tile:
The Tile app allows you to anonymously enlist the help of our entire community in your search. It works both ways — if you’re running the app in the background and come within range of someone’s lost item, we’ll let the owner know where it is.
This sort of participatory surveillance can be used to track stolen items too, such as cars. The TRACKER mesh network (which I’ve posted about before: Geographical Rights Management, Mesh based Surveillance, Trickle-Down and Over-Reach) uses tracking devices and receivers fitted to vehicles to locate other similarly fitted vehicles as they pass by them:
TRACKER Locate or TRACKER Plant fitted vehicles listen out for the reply codes being sent out by stolen SVR fitted vehicles. When the TRACKER Locate or TRACKER Plant unit passes a stolen vehicle, it picks up its reply code and sends the position to the TRACKER Control Room.
That’s not the only way fitted vehicles can be used to track each other. A more general way is to fit your car with a dashboard camera, then use ANPR (automatic number plate recognition) to identify and track other vehicles on the road. And yes, there is an app for logging anti-social or dangerous driving acts the camera sees, as described in a recent IEEE Spectrum article on The AI dashcam app that wants to rate every driver in the world. It’s called the Nexar app, and as their website proudly describes:
Nexar enables you to use your mobile telephone to record the actions of other drivers, including the license plates, types and models of the cars being recorded, as well as signs and other surrounding road objects. When you open our App and begin driving, video footage will be recorded. …
If you experience a notable traffic incident recorded through your use of the App (such as someone cutting you off or causing an accident), you can alert Nexar that we should review the video capturing the event. We may also utilize auto-detection, including through the use of “machine vision” and “sensor fusion” to identify traffic law violations (such as a car in the middle of an intersection despite a red stop light). Such auto-detected events will appear in your history. Finally, time-lapse images will automatically be uploaded.
Upon learning of a traffic incident (from you directly or through auto-detection of events), we will analyze the video to identify any well-established traffic law violations, such as vehicle accidents. Our analysis will also take into account road conditions, topography and other local factors. If such a violation occurred, it will be used to assign a rating to the license plate number of the responsible driver. You and others using our App who have subsequent contact with that vehicle will be alerted of the rating (but not the nature of the underlying incidents that contributed to the other driver’s rating).
And of course, this is a social thing we can all participate in:
Nexar connects you to a network of dashcams, through which you will start getting real-time warnings to dangers on the road
It’s not creepy though, because they don’t try to relate number plates to actual people:
Please note that although Nexar will receive, through video from App users, license plate numbers of the observed vehicles, we will not know the recorded drivers’ names or attempt to link license plate numbers to individuals by accessing state motor vehicle records or other means. Nor will we utilize facial recognition software or other technology to identify drivers whose conduct has been recorded.
So that’s all right then…
But be warned:
Auto-detection also includes monitoring of your own driving behavior.
so you’ll be holding yourself to account too…
Folk used to be able to go to large public places and spaces to be anonymous. Now it seems that the more populated the place, the more likely you are to be located, timestamped and identified.
Whether or not William Gibson actually said – either exactly, or approximately – “The future is already here. It’s just not evenly distributed yet” – it’s undoubtedly the case that many of the technologies that will come to influence our lives in the near future have already been invented, they just haven’t been fully tested, regulated, insured against or officially approved yet.
So to get an idea about what’s upcoming, one thing we can do is track the regulators and testing agencies, as well as new offerings from the insurers, such as the Driverless Car Insurance from Adrian Flux:
Our new driverless policy will cover you against:
- Loss or damage to your car caused by hacking or attempted hacking of its operating system or other software
- Updates and patches to your car’s operating system, firewall, and mapping and navigation systems that have not been successfully installed within 24 hours of you being notified by the manufacturer
- Satellite failure or outages that affect your car’s navigation systems
- Failure of the manufacturer’s software or failure of any other authorised in-car software
- Loss or damage caused by failing when able to use manual override to avoid an accident in the event of a software or mechanical failure
Getting on for fifteen years ago now, the UK Health and Safety Executive commissioned a report on The future health and safety implications of global positioning satellite and machine automation, looking at the health and safety implications of automated machinery, particularly in a quarrying context – the sort of thing introduced by Rio Tinto’s “Mine of the Future” in 2008. (The HSE also have a report from 2004 that, among other things, considers risks associated with autonomous underwater vehicles: Risk implications in site characterisation and analysis for offshore engineering and design. Which reminds me, when does the Unmanned Warrior exercise take place?)
Another place we might look is registers of clinical trials. So for example, how are robots being tested in UK Clinical Trials?
Or how about software related clinical trials?
Hmm.. thinks.. I wonder: is “software” being prescribed in the UK? If so, it should be recorded in the GP prescribing opendata… But as what, I wonder?!
PS One for the librarians out there – where else should I be looking? Tracking legislation and government codes of practice is one source (eg as per Regulating Autonomous Vehicles: Land, Sea and Air…). But what other sources are there?
I haven’t played with my ESP / Twitter mapping code for a bit, but I dug it out again last night for a quick play, and to see how much of it I could reuse if I moved from Mongo to a neo4j / graph database backend (an excuse, in part, to learn a bit more about neo4j, but also because I think it would be easier to write interesting queries over something properly represented as a graph).
One of my favourite maps shows the folk most commonly followed by followers of a person, or set of people, on Twitter. But there are other ways of doing this two-step projection, and I think they describe different things:
- common friends of your followers: this is “people like me” from the perspective of someone’s audience; if lots of folk follow you on Twitter because you interest them, you represent a shared interest of those people. If lots of those folk follow other individuals in common, that’s maybe because the interest they share with respect to you also applies to other folk they follow in common; other folk somehow like you. Alternatively, it may be that there are “affiliated” interests: lots of folk follow a particular golfer because they share an interest in golf, but maybe lots of them also follow a particular brand of whisky because of an interest in the thirteenth hole; so maybe the golfer should try to tie up with the whisky brand. These common friends of your followers are also your competitors, in the sense that they too are trying to gain the attention of your followers.
- common followers of your followers: birds of a feather flock together (homophily); if folk share an interest in you, and they are all followed by someone who doesn’t follow you, perhaps someone who shares their interests, then maybe those common followers (who don’t follow you) of your followers are a place to grow your audience? You also have a route to those people (via your followers). And there are easy to identify metrics for any campaign, such as the rate at which you convert folk who follow your followers but not you into folk who do follow you.
- common friends of your friends: you can’t choose your followers (although you can block folk to exclude them from your follower list) but you do choose your friends (that is, people you follow). Your friends influence you by virtue of the fact you see what they say. If you’re choosing friends as folk that you want to influence in turn, then by mapping who their common friends are (that is, who they commonly follow), you can see who influences them. If they don’t follow folk like you, but you want to gain their attention, you need to gain the attention of the folk they follow.
- common followers of your friends: you follow folk because of your particular interests; if other folk follow the same people as you, perhaps they share the same interests; which means they may be your competitors, or they may be potential collaborators. You might also be able to use them to find other folk to follow (that is, look for the folk your friends’ followers follow that you don’t currently follow). You might also be able to use this group to find new possible followers – from the folk who follow them but don’t follow you.
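All four projections boil down to the same counting pattern – take one hop out from the target, then tally who appears at the second hop. A minimal sketch (the names are mine, not from my original code) using a plain Counter:

```python
from collections import Counter

def two_step_projection(first_hop, second_hop_of):
    """Count how often each account appears one hop beyond first_hop.

    first_hop: the target's followers (or friends);
    second_hop_of: mapping from account to the set of that account's
    friends (or followers). Swapping which sets you feed in gives each
    of the four projections listed above.
    """
    counts = Counter()
    for account in first_hop:
        counts.update(second_hop_of.get(account, set()))
    return counts

# Toy example: three followers of a golfer, all of whom also follow a
# whisky brand -- the "common friends of your followers" projection.
followers = ["a", "b", "c"]
friends_of = {"a": {"whisky", "news"}, "b": {"whisky"}, "c": {"whisky"}}
```

In a graph database the same thing is a single two-hop pattern match with an aggregation on the far node, which is exactly why a neo4j backend appeals – the query is the picture.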
I keep meaning to formalise this stuff… hmmm…
My festival coping strategy of avoiding the Main Stage (with one exception), along with using Clashfinder a few days before to sketch out some sort of schedule seemed to work pretty well this year, so here’s a quick playlist of what I enjoyed…
Thursday was camp-pitching day, and whilst the sight of Status Quo from the (out)side of the Big Top suggested they could still cut it, settling in with Ska’d for Life (though still outside the tent) seemed a better way to round off the day…
I wish I’d taken a pen, because I’m not sure what I saw Friday afternoon, although I do remember walking late into Blossoms and thinking I should have stuck my head in when I’d walked past the Big Top at the start of their set:
With a gap in my schedule till the evening, I gave that tent a bit more of a go in the form of Black Violin, who had a nice edge to them, at least, when they were doing the violin thing…
I’d been looking forward to an early evening slot by islanders Bully Bones / @BullyBonesMusic, so was sad to see it hampered by feedback issues (the sound engineer seemed out of his depth when it came to debugging and was no help to the band) – I think they need to get their lines and plugs checked, though, because this could – should – have been a great start to the evening.
Ne’er mind, though, because Barnsley band Hands Off Gretel / @HandsOffGretel more than made up for it… (Playing in Huddersfield this week if you get a chance… I’m wondering whether a visit back to family home is in order…!)
And props to them for taking some merchandise… I can’t remember the last time I bought a band T-shirt (erm, not strictly true: it was last Friday…), and the CD’s playing now…
Popping in to the Quay Arts Kashmir tent to make some more charitable donations by proxy via their real ale bar, I hadn’t expected to see any farmer rock, but Paul Middleton & the Angst Band had me in stitches. Exactly the sort of thing I’d have expected to see at Festival at the Edge in years past, though I don’t think I ever did… Absolutely the best way to spend an afternoon…
Then decision time – only, not really… Iggy had to go by the wayside, unfortunately, because my festival faves The Orders / @The_OrdersUK, who seemed to have spent the afternoon flyering (or at least, got someone to do their flyering):
were up for their first set of the weekend:
Then it was a quick dash, set list ephemera in hand….
…to catch the end of The Buzzcocks, who I’ve never seen before, but whose songs were as familiar as familiar could be!
Then it was time for a sit down…
Then my one proper trip to the Main Stage for the short opening set of the day by The Orders / @The_OrdersUK, which included the following (which is a bit rockier live and seemed to go down pretty well…)
I’m looking forward to seeing them climb the Festival bill over the coming years… Book ’em now, if you can… (and I’ll maybe cover their ferry fare if you’re on a tight budget…)
I’d had Cabbage / @AhCabbage down on my “to watch” list, but spotted they were doing a short set in advance of the one I’d highlighted, which didn’t require such a walk, yet did provide the additional benefit that if they were any good I could watch them twice.
I watched them twice. My second flailing mayhem dance of the weekend. Just insane…
And now, buzzy ears, still buzzin’…
A couple of articles on bias in the justice system recently caught my eye that show different models of engagement around data analysis in a particular topic area:
- Hester, Rhys, and Todd K. Hartman, “Conditional Race Disparities in Criminal Sentencing: A Test of the Liberation Hypothesis from a Non-Guidelines State”, Journal of Quantitative Criminology, pp. 1–24 – an academically published, peer-reviewed article that will cost you £30 to look at.
- “Uncovering Big Bias with Big Data”, by David Colarusso, May 31st, 2016, The Lawyerist blog – a recreational data blog post.
The blog post comes complete with links to a github repo containing a Jupyter notebook describing the analysis. The data is not provided, for data protection/privacy compliance, although a link to the original source of the data, and a brief description of it, is provided (but not a deep link to the actual data?). I’m not sure if any data associated with the academic paper is publicly or openly available, or whether any of the analysis scripts are (see below – analysis scripts are available).
The blog post is open to comments (there are a few) and the author has responded to some of them. The academic paper’s authors made themselves available via a Reddit AMA (good on them:-): Racial Bias AMA (h/t @gravityvictims/Alastair McCloskey for the tip).
The Reddit AMA notes that an ungated (i.e., not behind a paywall) version of the research is available on SSRN and from Dr. Hartman’s website. The official publication was first online on 29 February 2016; the SSRN version is dated as posted November 6, 2014, last revised January 4, 2016. The SSRN version includes a small amount of Stata code at the end (the “official” version of the paper doesn’t?), but I’m not sure what data it applies to or whether it’s linked to from the data (I only skimmed the paper). Todd Hartman’s website includes a copy of the published paper and a link to the replication files (7z compressed, so how many folk will be able to easily open that, I wonder?!).
So Stata, R and data files. Good stuff. But from just the paper homepage on the Springer Journal site, I wouldn’t have got that?
Of course, the Springer paper reference gets academic brownie points.
PS By the by, in the UK the Ministry of Justice’s Justice Data Lab publishes regular reports on who’s using their data. For example: Justice Data Lab statistics: June 2016.
Sometime last year, the VC-sponsored RoboRace autonomous car race series was announced as a supporting race for Formula E for the 2016–17 season.
According to the Nvidia blog, the first RoboRace cars will be powered by an Nvidia Drive PX 2 computer (more: PCWorld – The specs and story behind the autonomous Robocar and its Nvidia Drive PX 2 brains). (Specs for the PX 2 don’t appear to be on the Nvidia Automotive Solutions webpages yet?)
I haven’t seen any announcements regarding the teams yet, so if you haven’t had an invite already (nor me!;-), you’re probably not on the list. I did see this ad a few weeks ago though…
Here’s a list of folk currently claiming to be associated with Roborace on LinkedIn.
However, that doesn’t necessarily mean you can’t get into the autonomous racing thing…
For example, you could always sign up for the International Autonomous Robot Racing Challenge (IARRC), or have a go at building your own version of Georgia Tech’s AutoRally autonomous robot rally car (the Georgia Tech folk have made their code available…).
Or how about giving your wheels a day out with a self-driving car trackday?
One thing that piqued my attention in this Medium post on The First Autonomous Track Day: An interview with creator and racer Joshua Schachter was the name Joshua Schachter. Hmm… is that the self-same Joshua Schachter who created the Delicious social-bookmarking website?