Community Detection? (And Is Your Phone a Cookie?)

A few months ago, I noticed that the Google geolocation service would return a lat/long location marker when provided with the MAC address of a wifi router (Using Google to Look Up Where You Live via the Physical Location of Your Wifi Router [code]) and in various other posts I’ve commented on how communities of bluetooth users can track each other’s devices (eg Participatory Surveillance – Who’s Been Tracking You Today?).
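For reference, the sort of lookup involved is little more than a single POST against the Google Geolocation API. Here's a sketch, assuming you have an API key with that service enabled; the MAC address below is a made-up placeholder, and in practice the API may want more than one access point before it will return a fix:

import requests

API_KEY = "YOUR_API_KEY"  # placeholder - a Google API key with the Geolocation API enabled
resp = requests.post(
    "https://www.googleapis.com/geolocation/v1/geolocate",
    params={"key": API_KEY},
    json={
        "considerIp": False,  # don't fall back to locating the requesting IP address
        "wifiAccessPoints": [{"macAddress": "00:11:22:33:44:55"}],  # dummy router MAC
    },
)
print(resp.json())  # e.g. {"location": {"lat": ..., "lng": ...}, "accuracy": ...}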

Which got me wondering… are there any apps out there that let me detect the MAC address of Bluetooth devices in my vicinity, and is there anyone aggregating the data, perhaps as a quid pro quo for making such an app available?

Seems like the answer is yes, and yes…

For example, John Abraham’s Bluetooth 4.0 Scanner [Android] app will let you [scan] for Bluetooth devices… The information recorded includes: device name, location, RSSI signal strength, MAC address, and MAC address vendor lookup.

In a spirit of sharing, the Bluetooth 4.0 Scanner app “supports the earthping.com project – crowdsourced Bluetooth database. Users are also reporting usage to find their lost Bluetooth devices”.

So when you run the app to check the presence of Bluetooth devices in your own vicinity, you also gift the location of those devices – along with their MAC addresses – to a global database – earthping. Good stuff…not.

We’re all familiar (at least in the UK) with surveillance cameras everywhere, and as object recognition and reconciliation tools improve it seems as if tracking targets across multiple camera views will become a thing, as demonstrated by the FX Pal Dynamic Object Tracking System (DOTS) for “office surveillance”.

It’s also increasingly the case that street furniture is appearing that captures the addresses of our electronic devices as we pass them. For example, in New York, LinkNYC “is a first-of-its-kind communications network that will replace over 7,500 pay phones across the five boroughs with new structures called Links. Each Link will provide superfast, free public Wi-Fi, phone calls, device charging and a tablet for Internet browsing, access to city services, maps and directions”. The points will also allow passers-by to ‘view public service announcements and more relevant advertising on two 55” HD displays’ – which is to say they track everything that passes, try to profile anyone who goes online via the service, and then deliver targeted advertising to exactly the sort of people passing each Link.

LinkNYC is completely free because it’s funded through advertising. Its groundbreaking digital OOH advertising network not only provides brands with a rich, context-aware platform to reach New Yorkers and visitors, but will generate more than a half billion dollars in revenue for New York City.

So I wondered just what sorts of digital info we leak as we walk down the street. Via Tracking people via WiFi (even when not connected), I learn that devices operate in one of two modes: a listening beacon mode, where they essentially listen out for access points, but at a high battery cost; or a lower energy ping mode, where they announce themselves (along with their MAC address) to anyone who’s listening.

If you want to track passers-by, many of whom will be pinging their credentials to anyone who’s listening, you can set up things like wifi routers in monitor mode to listen out for – and log – such pings. Edward Keeble describes how to do it in the post Passive WiFi Tracking.
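Keeble’s write-up uses scapy to sniff WiFi probe requests; a minimal sketch of the general idea looks something like the following (it assumes a wireless card already switched into monitor mode as wlan0mon, and needs to run with root privileges):

from scapy.all import sniff, Dot11, Dot11Elt, Dot11ProbeReq

def log_probe(pkt):
    # probe requests are the "pings" a device sends out while looking for known networks
    if pkt.haslayer(Dot11ProbeReq):
        mac = pkt[Dot11].addr2  # MAC address of the probing device
        ssid = ""
        elt = pkt.getlayer(Dot11Elt)
        if elt is not None and elt.ID == 0:  # information element 0 carries the SSID
            ssid = elt.info.decode(errors="ignore")
        print(mac, ssid or "(broadcast probe)")

# assumes an interface already in monitor mode, eg wlan0mon
sniff(iface="wlan0mon", prn=log_probe, store=False)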

If you’d rather not hack together such a device yourself, you can always buy something off the shelf to log the MAC addresses of passers-by, eg something like Libelium’s Meshlium Scanner [datasheet – PDF]. So for example:

  • Meshlium Scanner AP – It allows to detect (sic) Smartphones (iPhone, Android) and in general any device which works with WiFi or Bluetooth interfaces. This model can receive and store data from Waspmotes with GPRS, 3G or WiFi, sending via HTTP protocol. The collected data can be send (sic) to the Internet by using the Ethernet.
  • Meshlium Scanner 3G/GPRS-AP – It allows to detect (sic) Smartphones (iPhone, Android) and in general any device which works with WiFi or Bluetooth interfaces. This model can receive and store data from Waspmotes with GPRS, 3G or WiFi, sending via HTTP protocol. The collected data can be send (sic) to the Internet by using the Ethernet, and 3G/GPRS connectivity
  • Meshlium Scanner XBee/LoRa -AP – It allows to detect (sic) Smartphones (iPhone, Android) and in general any device which works with WiFi or Bluetooth interfaces. It can also capture the sensor data which comes from the Wireless Sensor Network (WSN) made with Waspmote sensor devices. The collected data can be send (sic) to the Internet by using the Ethernet and WiFi connectivity.

So have any councils started installing that sort of device I wonder? And if so, on what grounds?

On the ad-tracking/marketing front, I’m also wondering whether there are extensions to cookie matching services that can match MAC addresses to cookies?

PS you know that unique tat you’ve got?! FBI Develops tattoo tracking technology!

PPS capturing data from wifi and bluetooth devices is easy enough, but how about listening out for mobile phones as phones? Seems that’s possible too, though perhaps not off-the-shelf for your everyday consumer…? What you need, apparently, is an IMSI catcher such as the Harris Corp Stingray. Examples of use here and here.

See also: Tin Foil Hats or Baseball Caps? Why Your Face is a Cookie and Your Data is midata and We Are Watching, and You Will be Counted.

Accessible Jupyter Notebooks?

Pondering the extent to which Jupyter notebooks provide an accessible UI, I had a naive play with the Mac VoiceOver screen reader over some Jupyter notebooks the other day: markdown cells were easy enough to convert to speech, but the code cells and their outputs are nested block elements which seemed to take a bit more navigation. Suffice to say, I really need to learn how to use VoiceOver – and screen reader software more generally – properly, because as it stands I can’t really tell how accessible the notebooks are…

A quick search around for accessibility related extensions turned up the jupyter-a11y: reader extension [code], which looks like it could be a handy crib. This extension will speak aloud the contents of a code cell or markdown cell, as well as navigational features such as whether you are in the cell at the top or the bottom of the page. I’m not sure it speaks aloud the output of a code cell though? But the code looks simple enough, so this might be worth a play with…
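As a quick way of playing with the general idea from inside a notebook – this is just a sketch of the approach, not how jupyter-a11y itself is implemented – you can poke the browser’s SpeechSynthesis API from a code cell via IPython’s Javascript display machinery:

from IPython.display import Javascript, display

def speak(text):
    # hand a snippet of text to the browser's built-in text-to-speech engine
    display(Javascript(
        f"window.speechSynthesis.speak(new SpeechSynthesisUtterance({text!r}))"
    ))

speak("The code cell produced a three by four data frame.")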

On the topic of reading aloud code cell outputs, I also started wondering whether it would be possible to generate “accessible” alt or longdesc text for matplotlib generated charts and add that text to the image element inserted into the code cell output. The same text could also be used to feed the reader extension’s narrator. (See also First Thoughts on Automatically Generating Accessible Text Descriptions of ggplot Charts in R for some quick examples of generating textual descriptions from matplotlib charts.)
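A crude first pass at that might just interrogate the Axes object for its title, axis labels and data ranges and stitch them into a sentence or two; the helper below is purely illustrative:

import matplotlib.pyplot as plt

def describe_axes(ax):
    # build a simple alt-text style description from a matplotlib Axes object
    parts = []
    if ax.get_title():
        parts.append(f"Chart titled '{ax.get_title()}'.")
    parts.append(f"x-axis: {ax.get_xlabel() or 'unlabelled'}; "
                 f"y-axis: {ax.get_ylabel() or 'unlabelled'}.")
    for line in ax.get_lines():
        y = line.get_ydata()
        if len(y):
            parts.append(f"Series '{line.get_label()}' ranges from {min(y):.2f} to {max(y):.2f}.")
    return " ".join(parts)

fig, ax = plt.subplots()
ax.plot([1, 2, 3], [2, 4, 9], label="example")
ax.set_title("Example chart")
ax.set_xlabel("x"); ax.set_ylabel("y")
print(describe_axes(ax))

The resulting string could then be dropped into the alt attribute of the embedded chart image, as well as being handed to the reader.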

Another way of complementing the jupyter-a11y reader extension might be to use the python pindent [code] tool to annotate the contents of code cells with accessible comments (such as comments that identify the end of if/else blocks, and function definitions). Another advantage of having a pindent extension to annotate the content of notebook python code cells is that it might help improve the readability of code for novices. So for example, we could have a notebook toolbar button that will toggle pindent annotations on a selected code cell.

For code read aloud by the reader extension, I wonder if it would be worth running the content of any (python) code cells through pindent first?
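As a sketch of what that might look like – assuming pindent.py, which ships in CPython’s Tools/scripts directory, is available alongside the notebook server – the contents of a code cell could be piped through pindent’s “complete” mode before being handed to the reader:

import subprocess

cell_source = """\
def check(x):
    if x > 0:
        return "positive"
    else:
        return "non-positive"
"""

# -c is pindent's "complete" mode: it adds block-closing comments such as "# end if"
annotated = subprocess.run(
    ["python", "pindent.py", "-c"],
    input=cell_source, capture_output=True, text=True,
).stdout
print(annotated)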

PS FWIW, here’s a related issue on Github.

PPS another tool that helps make python code a bit more accessible, in an active sense, in a Jupyter notebook is this pop-up variable inspector widget.

Simple Live Timing Data Scraper…

A couple of weeks ago, I noticed an F1 live timing site with an easy to hit endpoint… here’s the Mac commandline script I used to grab the timing info, once every five seconds or so…

mkdir f1_silverstone
# initial 15 minute delay, then poll the timing endpoint every 5 seconds or so,
# writing each response to its own numbered file in the f1_silverstone directory
i=1; sleep 900; while true ; do curl http://www.livesportstreaming24.com/live.php >> f1_silverstone/f1output_race_${i}.txt ; i=$((i+1)); sleep 5 ; done

Now I just need to think what I’m going to do with the data! Maybe an opportunity to revisit this thing and try out some realtime dashboard widget toys?

Mediated/Augmented Reality (Un)Course Notes, Part I

Pokemon Go seems to have hit the news this week – though I’m sure for anyone off social media last week and back to it next week, the whole thing will have completely passed them by – demonstrating that augmented reality apps really haven’t moved on much at all over the last five years or so.

But notwithstanding that, I’ve been trying to make sense of a whole range of mediated reality technologies for myself as prep for a very short unit on technologies and techniques on that topic.

Here’s what I’ve done to date, over on the Digital Worlds uncourse blog. This stuff isn’t official OU course material, it’s just my own personal learning diary of related stuff (technical term!;-)

More to come over the next couple of weeks or so. If you want to comment, and perhaps influence the direction of my meanderings, please feel free to do that here or on the relevant post.

An evolving feed of the posts is available in chronological order and in reverse chronological order.

Dogfooding… and Creating (Learning) for a Purpose

“Eating your own dogfood”, aka dogfooding, refers to the practice of a company testing its own products by using them internally. At a research day held by Somerset College, a quote in a talk by Lorna Sheppard on Len Deighton’s cookbooks (yes, that Len Deighton…) from a 2014 Observer magazine article (Len Deighton’s Observer cookstrips, Michael Caine and the 1960s) caught my attention:

[G]enerally, you stand a better chance of succeeding in something if whatever you create, you also like to consume.

Implicit in this is the idea that you are also creating for a purpose.

In the OU engineering residential school currently running at the University of Bath, one of the four day-long activities the students engage with is a robotics activity using Lego EV3 robots, where at each stage we try to build in a reason for adding another programming construct or learning how to work with a new sensor. That is, we try to motivate the learning by making it purposeful.

The day is structured around a series of challenges that allow students to develop familiarity with programming a Lego EV3 robot, adding sensors to it, logging data from the sensors and then interpreting the data. The activities are contextualised by comparing the work done on the Lego EV3s with the behaviour of a Roomba robot vacuum cleaner – by the end of the morning, students will have programmed their robot to perform the majority of the Roomba’s control functions, including finding its way home to a homing beacon, as well as responding to touch (bumper), colour (line stopper) and proximity (infra-red and ultrasonic) sensors.

The day concludes with a challenge, where an autonomous robot must enter – and return from – a closed tunnel network, using sensors to collect data about the internal structure of the tunnel, as well as identifying the location of a casualty who has an infra-red emergency beacon with them.

[Photo: the covered tunnel network used for the final challenge]

(The lids are placed on the tunnels so the students can’t see inside.)

As well as the partition walls (which are relocated each time the challenge is run, so I’m not giving anything away!), pipework and cables (aka coloured tape) also run through the tunnel and may be mapped by the students using a downward facing light sensor.


The casualty is actually a small wooden artist’s mannequin – the cuddly teddy we used to use does not respond well to the ultrasound sensor the students use to map the tunnel.


The data logged by the students include motor rotation data to track the robot’s progress, ultrasonic sensor data to map the walls, infra-red sensor data to find the emergency beacon, and light sensor data to identify the cables/pipework.

The data collected looks something like this:

[Chart: logged sensor data from the final challenge]

The challenge is then to map the (unseen by the students) tunnel network, and tell the robot’s story from the data.
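By way of illustration, the sort of post-processing involved might look something like the following sketch – the CSV filename, column names and wheel diameter are all made-up placeholders rather than the actual datalog format the students work with:

import math
import pandas as pd
import matplotlib.pyplot as plt

WHEEL_DIAMETER_CM = 5.6  # assumed EV3 wheel size - check against the actual robot

log = pd.read_csv("final_challenge_log.csv")  # hypothetical export of the logged data

# convert cumulative motor rotation (degrees) into distance travelled along the tunnel
log["distance_cm"] = log["motor_degrees"] / 360 * math.pi * WHEEL_DIAMETER_CM

fig, ax = plt.subplots()
ax.plot(log["distance_cm"], log["ultrasonic_cm"], label="side wall distance (ultrasonic)")
ax.plot(log["distance_cm"], log["light_reflect"], label="downward light sensor")
ax.set_xlabel("distance along tunnel (cm)")
ax.legend()
plt.show()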

The result is a narrative that describes the robot’s progress, and a map showing the internal structure of the tunnel:

[Photo: a student-produced map of the tunnel network]

If time allows, this can then be used as the basis for programming the robot to complete a rescue mission!

The strategies used by the students to log the data, and control the robot to send it into the tunnel and retrieve it safely again, are based on what they learned completing the earlier challenges set throughout the day.

The Internet of Thinking Things – Intelligence at the Edge

Via F1 journalist James Allen’s blog (Insight: Inside McLaren’s Secretive F1 Operations Room, “Mission Control”), I learn that the wheel hub of McLaren’s latest MP4-31 Formula One car hacks its own data. According to McLaren boss, Ron Dennis:

Each wheel hub has its own processing power, we don’t even take data from the sensors that surround the wheel [that measure] brake temperatures, brake wear, tyre pressures, G-Forces – all of this gets processed actually in the wheel hub – it doesn’t even get transmitted to the central ECU, the Electronic Control Unit.

If driver locks a brake or the wheel throws itself out of balance, we’re monitoring the vibration that creates against a model that says, “if the driver continues with this level of vibration the suspension will fail”, or the opposite, “we can cope with this vibration”.

With artificial intelligence and machine learning modeling now available as a commodity service, at least for connected devices, it’ll be interesting to see what the future holds for intelligence at the edge – sensors that don’t just return data (“something moved”, from a security sensor, say), but that return information (“I just saw a male, 6′, blue trousers, green top, leaving room 27 and going to the water cooler; it looked like… etc etc.”).

Of course, if you’re happy with your sensors just applying a model, rather than building one – which appears to be the case for the MP4-31 wheel hub – it seems that you can already do that at the 8-bit level using deep learning, as described by Pete Warden in How to Quantize Neural Networks with TensorFlow.
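The basic min/max trick Warden describes is simple enough to sketch in a few lines of numpy – a toy illustration of the idea rather than TensorFlow’s actual implementation: store each float weight as an 8-bit integer together with the range it was squashed from, and expand it back out when you need it.

import numpy as np

def quantize(weights):
    # map the observed float range [lo, hi] linearly onto the 256 available uint8 levels
    lo, hi = float(weights.min()), float(weights.max())
    scale = (hi - lo) / 255 if hi > lo else 1.0
    q = np.round((weights - lo) / scale).astype(np.uint8)
    return q, lo, scale

def dequantize(q, lo, scale):
    # recover approximate float weights from the 8-bit representation
    return q.astype(np.float32) * scale + lo

w = np.random.randn(4, 4).astype(np.float32)
q, lo, scale = quantize(w)
print(np.abs(w - dequantize(q, lo, scale)).max())  # small reconstruction error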

By the by, if you want to have a quick play with a TensorFlow learner, check out the TensorFlow Neural Network Playground. Or how about training a visual recognition system with IBM’s Visual Recognition Demo?