
Patently Imagined Futures (or, what’s Facebook been getting up to recently?)

One of the blogs on my “must read” list is Bill Slawski’s SEO by the Sea, which regularly comments on a wide variety of search-related patents obtained by Google, both recent and historical, and what they might mean…

The US patent system is completely dysfunctional, of course, acting as a way of preventing innovative competition in a manner that I think probably wasn’t intended by its framers, but it does provide an insight into some of the crazy bar-talk ideas that Silicon Valley types thought they might just go and try out on millions of people, or perhaps already are trying out.

As an example, here are a couple of patents from Facebook that recently crossed my radar.


Images uploaded by users of a social networking system are analyzed to determine signatures of cameras used to capture the images. A camera signature comprises features extracted from images that characterize the camera used for capturing the image, for example, faulty pixel positions in the camera and metadata available in files storing the images. Associations between users and cameras are inferred based on actions relating users with the cameras, for example, users uploading images, users being tagged in images captured with a camera, and the like. Associations between users of the social networking system related via cameras are inferred. These associations are used beneficially for the social networking system, for example, for recommending potential connections to a user, recommending events and groups to users, identifying multiple user accounts created by the same user, detecting fraudulent accounts, and determining affinity between users.

Which is to say: traces of the flaws in a particular camera that are passed through to each photograph are distinctive enough to uniquely identify that camera. (I note that academic research picked up on by Bruce Schneier demonstrated this getting on for a decade ago: Digital Cameras Have Unique Fingerprints.) So when a photo is uploaded to Facebook, Facebook can associate it with a particular camera. And by looking at who uploads the photos, a particular camera – as identified by the camera signature baked into a photograph – can be associated with a particular person. Another form of participatory surveillance, methinks.
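As a toy illustration of the sort of signal involved (this is my sketch of the general idea, not Facebook’s actual method), “hot” or stuck pixels that show up at the same positions across a user’s uploads are one such flaw. A minimal Python sketch, assuming Pillow and numpy are to hand:

```python
# Toy sketch: fingerprint a camera by the "hot"/stuck pixels that show up
# at the same positions across several of a user's photos. Not Facebook's
# actual method - just an illustration of the kind of signal the patent describes.
import numpy as np
from PIL import Image

def hot_pixel_signature(image_paths, threshold=250):
    """Return the (row, col) positions that are near-saturated in every image."""
    common = None
    for path in image_paths:
        grey = np.asarray(Image.open(path).convert("L"))
        hot = set(zip(*np.where(grey >= threshold)))
        common = hot if common is None else common & hot
    return frozenset(common or set())

def signature_similarity(sig_a, sig_b):
    """Jaccard overlap: two photo sets from the same sensor should score high."""
    if not sig_a or not sig_b:
        return 0.0
    return len(sig_a & sig_b) / len(sig_a | sig_b)
```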

Note that this is different to the various camera settings that get baked into photograph metadata (you know, that “administrative” data stuff that folk would have you believe doesn’t really reveal anything about the content of a communication…). I’m not sure to what extent that data helps narrow down the identity of a particular camera, particularly when associated with other bits of info in a data mosaic, but it doesn’t take that many bits of data to uniquely identify a device. Like your web-browser’s settings, for example, that are revealed to webservers of sites you visit through browser metadata, and uniquely identify your browser. (See eg this paper from the EFF – How Unique Is Your Web Browser? [PDF] – and the associated test site: test your browser’s uniqueness.) And if your camera’s also a phone, there’ll be a wealth of other bits of metadata that let you associate camera with phone, and so on.
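(If you want to see what that “administrative” metadata actually looks like, a quick and dirty peek with Pillow does the job – the fields below are just a few common EXIF tags, the filename is hypothetical, and of course many services strip this block on upload:)

```python
# Quick peek at the EXIF block "baked into" a photo (if the service hasn't
# stripped it). Field names are standard EXIF tags from the 0th IFD.
from PIL import Image, ExifTags

def exif_summary(path):
    exif = Image.open(path).getexif()
    named = {ExifTags.TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}
    keys = ["Make", "Model", "Software", "DateTime"]  # a few device-narrowing fields
    return {k: named[k] for k in keys if k in named}

print(exif_summary("holiday_snap.jpg"))  # hypothetical filename
```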

Facebook’s face recognition algorithms can also work out who’s in an image, so more relationships and associations there. If kids aren’t being taught about graph theory in school from a very young age, they should be… (So for example, here’s a nice story about what you can do with edges: SELECTION AND RANKING OF COMMENTS FOR PRESENTATION TO SOCIAL NETWORKING SYSTEM USERS. Here’s a completely impenetrable one: SYSTEMS, METHODS, AND APPARATUSES FOR IMPLEMENTING AN INTERFACE TO VIEW AND EXPLORE SOCIALLY RELEVANT CONCEPTS OF AN ENTITY GRAPH.)
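Going back to the camera patent, the graph inference it describes is bread-and-butter bipartite projection stuff – users linked to camera signatures, then projected onto user–user associations. A minimal sketch with networkx (the names and edges are made up, obviously):

```python
# Users linked to camera signatures (by uploading, being tagged, etc.), then
# projected onto user-user associations - the bipartite graph trick the patent leans on.
import networkx as nx
from networkx.algorithms import bipartite

B = nx.Graph()
B.add_edges_from([          # hypothetical (user, camera signature) edges
    ("alice", "cam_sig_1"), ("bob", "cam_sig_1"),
    ("bob", "cam_sig_2"), ("carol", "cam_sig_2"),
])
users = {"alice", "bob", "carol"}

# Users who share a camera end up connected; edge weight = number of shared cameras.
user_graph = bipartite.weighted_projected_graph(B, users)
print(list(user_graph.edges(data=True)))
```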

Here’s another one – hinting at Facebook’s role as a future publisher:


An online publisher provides content items such as advertisements to users. To enable publishers to provide content items to users who meet targeting criteria of the content items, an exchange server aggregates data about the users. The exchange server receives user data from two or more sources, including a social networking system and one or more other service providers. To protect the user’s privacy, the social networking system and the service providers may provide the user data to the exchange server without identifying the user. The exchange server tracks each unique user of the social networking system and the service providers using a common identifier, enabling the exchange server to aggregate the users’ data. The exchange server then applies the aggregated user data to select content items for the users, either directly or via a publisher.

I don’t really see what’s clever about this – using an ad serving engine to serve content – even though Business Insider try to talk it up (Facebook just filed a fascinating patent that could seriously hurt Google’s ad revenue). I pondered something related to this way back when, but never really followed it through: Contextual Content Server, Courtesy of Google? (2008), Contextual Content Delivery on Higher Ed Websites Using Ad Servers (2010), or Using AdServers Across Networked Organisations (2014). Note also this remark on the University of Bedfordshire using Google Banner Ads as On-Campus Signage (2011).
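(For what it’s worth, the mechanism the patent describes boils down to something like this toy sketch – pseudonymous records from different sources aggregated under a common identifier, then matched against the targeting criteria of content items; nothing clever at all:)

```python
# Toy version of the exchange in the patent: pseudonymous records from several
# sources are aggregated under a common identifier, then matched against the
# targeting criteria of content items.
from collections import defaultdict

records = [  # hypothetical, already de-identified user data keyed by a shared id
    {"uid": "u42", "source": "social", "attrs": {"age_band": "25-34", "likes_cycling": True}},
    {"uid": "u42", "source": "retail", "attrs": {"recent_purchase": "bike lights"}},
]

profiles = defaultdict(dict)
for record in records:
    profiles[record["uid"]].update(record["attrs"])

content_items = [
    {"id": "ad-001", "criteria": {"likes_cycling": True}},
    {"id": "ad-002", "criteria": {"age_band": "55+"}},
]

def select_content(uid):
    profile = profiles[uid]
    return [item["id"] for item in content_items
            if all(profile.get(k) == v for k, v in item["criteria"].items())]

print(select_content("u42"))  # ['ad-001']
```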

(By the by, I also note that Google has a complementary service where it makes content recommendations relating to content on your own site via AdSense widgets: Google Matched Content.)

PS not totally unrelated, perhaps, a recent essay by Bruce Schneier on the need to regulate the emerging automatic face recognition industry: Automatic Face Recognition and Surveillance.

So It Seems I’m Now A Senior Lecturer…

…although I haven’t actually seen the letter yet; an announcement from our HoD about the latest successful promotions went round earlier today, so I’m hoping that counts…!

It took considerable persuading to get me to put a case in (again…), but thanks to everyone (particularly David, Karen and Mark, and Andy from attempts past) who put the hours in improving the multiple revisions of the case as it progressed through the OU promotions process and supporting me along the way – as well as to those Deans and HoDs past who’ve allowed me to get away with what I’ve been doing over the last few years;-)

If anyone wants to see a copy of the case I made, I’m happy to let you have a copy…

Anyway… is this now the time to go traditional, stop blogging, and start working on getting the money in and preparing one or two journal publications a year for an academic readership in the tens?!


We Are Watching, and You Will be Counted

Two or three weeks ago, whilst in Cardiff, I noticed one of these things for the first time:


It counts the number of cyclists who pass by it and is a great example of the sort of thing that could perhaps be added to a “data walk”, along with several other examples of data-revealing street furniture as described by Leigh Dodds in Data and information in the city.

It looks like it could be made by a company called Falco – perhaps this Falco Cycle Counter CB650 (“[a]lready installed for Cardiff County Council as well as in Copenhagen and Nijmegen”)? (Falco also make another, cheaper one, the CB400.)

From the blurb:

The purpose of the Falco Cycle Counter is to show the number of cyclists on a bicycle path. It shows the number of cyclists per day and year. At the top of the Counter there is a clock indicating time and date. On the reverse it is possible to show city map or other information, alternatively for a two-way bicycle path it is possible to have display on both side of the unit. Already installed for Cardiff County Council as well as in Copenhagen and Nijmegen, three very strong cycling areas, the cycle counter is already proving to be an effective tool in managing cycle traffic.

As with many of these sorts of exhibit, it can phone home:

When configured as a Cycle Counter, the GTC can provide a number of functions depending on the configuration of the Counter. It is equipped with a modem for a SIM card use which provides a platform for mobile data to be exported to a central data collection system.

This makes possible a range of “on-… services”, for example: [g]enerates individual ‘buy-in’ from local people via a website and web feed plus optional Twitter RSS enabling them to follow the progress of their own counter personally.

I was reminded of this appliance (and should really have blogged it sooner) by a post today from Pete Warden – Semantic Sensors – in which he remarked on spotting an article about “people counters” in San Francisco that count passing foot traffic.

In that case, the counters seem to be provided by a company called Springboard who offer a range of counting services using a camera based counting system: a small counting device … mounted on either a building or lighting/CCTV column, a virtual zone is defined and pedestrians and cars who travel through the zone are recorded.

Visitor numbers are recorded using the very latest counting software based on “target specific tracking”. Data is audited each day by Springboard and uploaded daily to an internet server where it is permanently stored.

Target specific tracking software monitors flows by employing a wide range of characteristics to determine a target to identify and track.

Here’s an example of how it works:

As Pete Warden remarked, [t]raditionally we’ve always thought about cameras as devices to capture pictures for humans to watch. People counters only use images as an intermediate stage in their data pipeline, their real output is just the coordinates of nearby pedestrians.

He goes on:

Right now this is a very niche application, because the systems cost $2,100 each. What happens when something similar costs $2, or even 20 cents? And how about combining that price point with rapidly-improving computer vision, allowing far more information to be derived from images?

Those trends are why I think we’re going to see a lot of “Semantic Sensors” emerging. These will be tiny, cheap, all-in-one modules that capture raw noisy data from the real world, have built-in AI for analysis, and only output a few high-level signals.

For all of these applications, the images involved are just an implementation detail, they can be immediately discarded. From a systems view, they’re just black boxes that output data about the local environment.
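To make Warden’s point concrete, here’s my own rough sketch of that sort of pipeline using OpenCV’s stock HOG pedestrian detector – far cruder than whatever Springboard actually run, and the video filename is hypothetical, but the shape is the same: frames go in, only counts and coordinates come out.

```python
# "Semantic sensor" shape: frames are only an intermediate stage; the output is
# just pedestrian counts and bounding boxes. Uses OpenCV's stock HOG detector,
# which is far cruder than a commercial counter, but the pipeline is the point.
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def count_pedestrians(frame):
    rects, _weights = hog.detectMultiScale(frame, winStride=(8, 8))
    return len(rects), [tuple(int(v) for v in r) for r in rects]

cap = cv2.VideoCapture("street.mp4")  # hypothetical video file (or a camera index)
ok, frame = cap.read()
if ok:
    count, boxes = count_pedestrians(frame)
    print(count, boxes)  # the frame itself can now be discarded
cap.release()
```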

Using cameras to count footfall appears to be nothing new – for example, the Leeds Data Mill openly publish Leeds City Centre footfall data collected by the council from “8 cameras located at various locations around the city centre [which monitor] numbers of people walking past. These cameras calculate numbers on an hourly basis”. I’ve also briefly mentioned several examples regarding the deployment of related technologies before, for example The Curse of Our Time – Tracking, Tracking Everywhere.
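(Working with that sort of open footfall data is straightforward enough – a hedged pandas sketch, where the filename and column names are my guesses rather than the actual schema:)

```python
# Hypothetical local copy of an hourly footfall dataset; the filename and
# column names ("Date", "Location", "Count") are guesses, not the real schema.
import pandas as pd

df = pd.read_csv("leeds-city-centre-footfall.csv", parse_dates=["Date"])
daily = (df.groupby(["Location", df["Date"].dt.date])["Count"]
           .sum()
           .reset_index(name="daily_footfall"))
print(daily.head())
```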

From my own local experience, it seems cameras are also being used to gather evidence about possible “bad behaviour” by motorists. Out walking the dog recently, I noticed a camera I hadn’t spotted before:


It’s situated at the start of a footpath, hedged on both sides, that runs along the road, although the mounting suggests that it doesn’t have a field of view down the path. Asking in the local shop, it seems the camera was mounted to investigate complaints about traffic accelerating off the mini-roundabout and cutting up pedestrians about to use the zebra crossing:


(I haven’t found any public consultations about mounting this camera, and should really ask a question, or even make an FOI request, to clarify by what process the decision was made to install this camera, when it was installed, for how long, for what purpose, and whether it could be used for other ancillary purposes.)

On a slightly different note, I also note from earlier this year that Amazon acquired internet of things platform operator 2lemetry, “an IoT version of Enterprise Application Integration (EAI) middleware solutions, providing device connectivity at scale, cross-communication, data brokering and storage”, apparently. In part, this made me think of an enterprise version of Pachube, as was (now Xively?).

So is Amazon going to pitch against Google (in the form of Nest), or maybe Apple, perhaps building “home things” services around a home hub server? After all, they already have a connected, listening voice appliance for the home in the form of the voice-controlled Amazon Echo (a bit like a standalone Siri, Cortana or Google Now). (Note to self: check out the Amazon Alexa Voice Services Developer Kit some time…)

As Pete Warden concluded, “it seems obvious to me that machine vision is becoming a commodity”. What might we expect as and when listening and voice services also become a commodity?

Related: a recent article from the Guardian posing the question What happens when you ask to see CCTV footage? (as is your right, via a subject access request under the Data Protection Act) picks up on a recently posted paper by OU academic Keith Spiller: Experiences of accessing CCTV data: the urban topologies of subject access requests.

Link Sharing in Classrooms, Workshops and Conferences

Every so often, I get reminded of old OUseful experiments by recent news, some of which still seem to work, and some of which don’t (the latter usually as a result of link rot or third party service closures).

So for example, a few weeks ago, the Google Education blog announced a new Chrome extension – Share to Classroom (“Get your students on the same (web)page, instantly”).

First, it seems as if the extension lets someone at the front of the room push pages to class members’ screens inside the room. Second, folk in the class are also free to browse to other pages of their own finding and push suggestions of new pages to the person at the front, who can review them and then share them with everyone.

Anyway, in part it reminded me of one of my old hacks – the FeedShow Link Presenter. This was a much cruder affair, creating a frame based page that accepted an RSS feed of links referred to by the main presenter that audience members could click forwards and back through, but that also seems to have started exploring a “back with me” feature to sync everyone’s pages.

Presenters could go off piste (splashing page links not contained in the feed), but it looks as if these couldn’t be synced. (I’m not sure if I addressed that in a later revision.) Nor could audience members suggest links back to the main presenter.
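For the curious, the core of that sort of link presenter is tiny – parse an RSS feed of links and generate a page per item for the audience to step through. Something along these lines (feedparser, hypothetical feed URL):

```python
# Parse an RSS feed of links and write out one simple page per item for the
# audience to step through - the FeedShow presenter boiled right down.
import feedparser

feed = feedparser.parse("http://example.com/presentation-links.rss")  # hypothetical feed

for i, entry in enumerate(feed.entries, start=1):
    page = (
        f"<html><body><h1>{i}/{len(feed.entries)}: {entry.title}</h1>"
        f'<iframe src="{entry.link}" width="100%" height="90%"></iframe>'
        "</body></html>"
    )
    with open(f"slide_{i}.html", "w") as f:
        f.write(page)
```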

The FeedShow link presenter had a Yahoo Pipes backend, and still seems to work; but with Pipes due to close on September 30th, it looks as if this experiment’s time will have come to an end…


Ho hum… (I guess I’ve stopped doing link based presentations anyway, but it was nice to be reminded of them:-)

Poking Around the VW Thing (Algorithmic Cheating in the Automobile Industry)

The news in recent days that VW had installed a software device in some of its diesel engines that identifies when the engine is being tested for emissions on a dynamometer (“dyno”) provides a nice demonstration of how “intelligent” software control systems can be used to identify particular operating environments and switch into an appropriate – or inappropriate – operational mode.

As I keep finding, press coverage of the events offers less explanation and context than the original document that kicked off the recent publicity, specifically a letter from the US Environmental Protection Agency to the Volkswagen Group of America:


(The document is a PDF of a scanned document; I had hoped to extract the text using a variant of this recipe, running Tika on my own computer via Kitematic, Getting Text Out Of Anything (docs, PDFs, Images) Using Apache Tika, but it doesn’t seem to extract text from the inlined image(s). Instead, I uploaded it to Google Drive, then opened it in Google Docs – you see pages from the original doc as images, with the extracted text below.)
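For reference, the Tika route looks roughly like this via the tika Python package (which talks to a local Tika server behind the scenes); for a scanned, image-only PDF it will only return text if Tesseract OCR is wired in, which presumably is why it drew a blank here. The filename is hypothetical:

```python
# The Tika route, roughly, via the tika Python package (it talks to a local
# Tika server behind the scenes). For a scanned, image-only PDF the content
# comes back empty unless Tesseract OCR is available to Tika.
from tika import parser

parsed = parser.from_file("epa-vw-notice-of-violation.pdf")  # hypothetical filename
print(parsed["metadata"].get("Content-Type"))
print((parsed["content"] or "").strip()[:500])
```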

Here are some of the bits that jumped out at me:

A defeat device is an AECD [Auxiliary Emission Control Device] “that reduces the effectiveness of the emission control system under conditions which may reasonably be expected to be encountered in normal vehicle operation and procedure use, unless: (1) Such conditions are substantially included in the Federal emission test procedure; (2) The need for the AECD is justified in terms of protecting the vehicle against damage or accident; (3) The AECD does not go beyond the requirements of engine starting; or (4) The AECD applies only for emergency vehicles …” 40 C.F.R. § 86.1803-01.

Motor vehicles equipped with defeat devices, … cannot be certified.

The CAA makes it a violation “for any person to manufacture or sell, or offer to sell, or install, any part or component intended for use with, or as part of any motor vehicle or motor vehicle engine, where a principal effect of the part or component is to bypass, defeat, or render inoperative any device or element of design installed on or in a motor vehicle or motor vehicle engine in compliance with regulations under this subchapter, and where the person knows or should know that such part or component is being offered for sale or installed for such use or put to such use.” CAA § 203(a)(3)(B), 42 U.S.C. § 7522(a)(3)(B); 40 C.F.R. § 86.1854-12(a)(3)(ii).

Each VW vehicle identified … has AECDs that were not described in the application for the COC that purportedly covers the vehicle. Specifically, VW manufactured and installed software in the electronic control module (ECM) of these vehicles that sensed when the vehicle was being tested for compliance with EPA emission standards. For ease of reference, the EPA is calling this the “switch.” The “switch” senses whether the vehicle is being tested or not based on various inputs including the position of the steering wheel, vehicle speed, the duration of the engine’s operation, and barometric pressure. These inputs precisely track the parameters of the federal test procedure used for emission testing for EPA certification purposes. During EPA emission testing, the vehicles ECM ran software which produced compliant emission results under an ECM calibration that VW referred to as the “dyno calibration” (referring to the equipment used in emissions testing, called a dynamometer). At all other times during normal vehicle operation, the “switch” was activated and the vehicle ECM software ran a separate “road calibration” which reduced the effectiveness of the emission control system (specifically the selective catalytic reduction or the lean NOx trap). As a result, emissions of NOx increased by a factor of 10 to 40 times above the EPA compliant levels, depending on the type of drive cycle (e.g., city, highway). … Over the course of the year following the publication of the WVU study [TH: see link below], VW continued to assert to CARB and the EPA that the increased emissions from these vehicles could be attributed to various technical issues and unexpected in-use conditions. VW issued a voluntary recall in December 2014 to address the issue. … When the testing showed only a limited benefit to the recall, CARB broadened the testing to pinpoint the exact technical nature of the vehicles’ poor performance, and to investigate why the vehicles’ onboard diagnostic system was not detecting the increased emissions. None of the potential technical issues suggested by VW explained the higher test results consistently confirmed during CARB’s testing. It became clear that CARB and the EPA would not approve certificates of conformity for VW’s 2016 model year diesel vehicles until VW could adequately explain the anomalous emissions and ensure the agencies that the 2016 model year vehicles would not have similar issues. Only then did VW admit it had designed and installed a defeat device in these vehicles in the form of a sophisticated software algorithm that detected when a vehicle was undergoing emissions testing.

VW knew or should have known that its “road calibration” and “switch” together bypass, defeat, or render inoperative elements of the vehicle design related to compliance with the CAA emission standards. This is apparent given the design of these defeat devices. As described above, the software was designed to track the parameters of the federal test procedure and cause emission control systems to underperform when the software determined that the vehicle was not undergoing the federal test procedure.

VW’s “road calibration” and “switch” are AECDs’ that were neither described nor justified in the applicable COC applications, and are illegal defeat devices.
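To make the mechanism the letter describes a little more concrete, here’s a purely illustrative toy – emphatically not VW’s actual code – of a controller that selects a calibration depending on whether its inputs look like a standard test cycle:

```python
# Purely illustrative toy - emphatically NOT VW's actual code - of a controller
# that switches calibration when its inputs look like a standard test cycle,
# along the lines the EPA letter describes (steering position, speed, duration...).
def looks_like_dyno(steering_angle_deg, elapsed_s, speed_kmh):
    # On a dynamometer the steering wheel barely moves while the vehicle "drives".
    return abs(steering_angle_deg) < 1.0 and speed_kmh > 0 and elapsed_s < 1900

def select_calibration(sample):
    if looks_like_dyno(**sample):
        return "dyno_calibration"  # emission controls fully active
    return "road_calibration"      # the mode the letter says cut NOx control effectiveness

print(select_calibration({"steering_angle_deg": 0.3, "elapsed_s": 600, "speed_kmh": 40}))
print(select_calibration({"steering_angle_deg": 12.0, "elapsed_s": 600, "speed_kmh": 70}))
```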

The news also reminded me of another tech journalism brouhaha from earlier this year around tractor manufacturer John Deere arguing that farmers don’t own their tractors, but instead purchase “an implied license for the life of the vehicle to operate the vehicle” (Wired, We Can’t Let John Deere Destroy the Very Idea of Ownership).

I didn’t really follow that story properly at the time, but it seems the news arose out of responses to a consultation by the US Copyright Office around the Digital Millennium Copyright Act (DMCA), and in particular a “Proposed Class 21: Vehicle software – diagnosis, repair, or modification” category (first round comments, second round comments) for the DMCA Section 1201: Exemptions to Prohibition Against Circumvention of Technological Measures Protecting Copyrighted Works.

Here’s how the class was defined:

21. Proposed Class 21: Vehicle software – diagnosis, repair, or modification
This proposed class would allow circumvention of TPMs [technological protection measures] protecting computer programs that control the functioning of a motorized land vehicle, including personal automobiles, commercial motor vehicles, and agricultural machinery, for purposes of lawful diagnosis and repair, or aftermarket personalization, modification, or other improvement. Under the exemption as proposed, circumvention would be allowed when undertaken by or on behalf of the lawful owner of the vehicle.

Note the phrase “for purposes of lawful diagnosis and repair”…

I also note a related class:

22. Proposed Class 22: Vehicle software – security and safety research
This proposed class would allow circumvention of TPMs protecting computer programs that control the functioning of a motorized land vehicle for the purpose of researching the security or safety of such vehicles. Under the exemption as proposed, circumvention would be allowed when undertaken by or on behalf of the lawful owner of the vehicle.

(and in passing note Proposed Class 27: Software – networked medical devices…).

Looking at some of the supporting documents, it’s interesting to see how the industry lobby responded. For example, from the Senior Director of Environmental Affairs for the Alliance of Automobile Manufacturers:

The proponents state that an exemption is needed for three activities related to vehicles – diagnosis, repair, and modification. In my limited time, I will explain why, for the first two activities – diagnosis and repair – there is no need to circumvent access controls on Electronic Control Units (ECUs). Then, I will address why tampering with ECUs to “modify” vehicle performance undermines national regulatory goals for clean air, fuel efficiency, and auto safety, and why the Copyright Office should care about that.

1. Diagnosis/repair
The arguments put forward by the proponents of this exemption are unfounded. State and federal regulations, combined with the Right to Repair MOU and the 2002 “Dorgan letter,” guarantee all independent repair shops and individual consumers access to all the information and tools needed to diagnose and repair Model Year 1996 or newer cars. This information and these tools are already accessible online, through a thriving and competitive aftermarket. Every piece of information and every tool used to diagnose and repair vehicles at franchised dealers is available to every consumer and every independent repair shop in America. This has been the case for the past 12 years. Moreover, all of these regulations and agreements require automakers to provide the information and tools at a “fair and reasonable price.” No one in the last 12 years has disputed this fact, in any of the various avenues for review provided, including U.S. EPA, the California Air Resources Board, and joint manufacturer-aftermarket organizations.

There is absolutely no need to hack through technological protection measures and copy ECU software to diagnose and repair vehicles.

2. Modification
The regulations and agreements discussed above do not apply to information needed to “modify” engine and vehicle software. We strongly support a competitive marketplace in the tools and information people need so their cars continue to perform as designed, in compliance with all regulatory requirements. But helping people take their cars out of compliance with those requirements is something we certainly do not want to encourage. That, in essence, is what proponents of exemption #21 are calling for, in asserting a right to hack into vehicle software for purposes of “modification.” In the design and operation of ECUs in today’s automobiles, manufacturers must achieve a delicate balance among many competing regulatory demands, notably emissions (air pollution); fuel economy; and of course, vehicle safety. If the calibrations are out of balance, the car may be taken out of compliance. This is so likely to occur with many of the modifications that the proponents want to make that you could almost say that noncompliance is their goal, or at least an inevitable side effect.

Manufacturer John Deere suggested that:

1. The purpose and character of the use frustrate compliance with federal public safety and environmental regulations

The first fair use factor weighs against a finding of fair use because the purpose and character of the use will encourage non-compliance with environmental regulations and will interfere with the ability of manufacturers to identify and resolve software problems, conduct recalls, review warranty claims, and provide software upgrade versions.

And General Motors seem to take a similar line:

TPMs also ensure that vehicles meet federally mandated safety and emissions standards. For example, circumvention of certain emissions-oriented TPMs, such as seed/key access control mechanisms, could be a violation of federal law. Notably, the Clean Air Act (“CAA”) prohibits “tampering” with vehicles or vehicle engines once they have been certified in a certain configuration by the Environmental Protection Agency (“EPA”) for introduction into U.S. commerce. “Tampering” includes “rendering inoperative” integrated design elements to modify vehicle and/or engine performance without complying with emissions regulations. In addition, the Motor Vehicle Safety Act (“MVSA”) prohibits the introduction into U.S. commerce of vehicles that do not comply with the Federal Motor Vehicle Safety Standards, and prohibits manufacturers, dealers, distributors, or motor vehicle repair businesses from knowingly making inoperative any part of a device or element of design installed on or in a motor vehicle in compliance with an applicable motor vehicle standard.14

Further, tampering with these systems would not be obvious to a subsequent owner or driver of a vehicle that has been tampered with. If a vehicle’s airbag systems, including any malfunction indicator lights, have been disabled (whether deliberately or inadvertently), a subsequent vehicle owner’s safety will be in jeopardy without warning. Further, if a vehicle’s emissions systems have been tampered with, a subsequent owner would have no way of knowing this has occurred. For tampering that the subsequent owner eventually discovers, manufacturer warranties do not cover the repair of damage caused by the tampering, placing the repair cost on the subsequent owner. For good cause, federal environmental and safety regulations regarding motor vehicles establish a well-recognized overall policy against allowing tampering with in-vehicle electronic systems designed for safety and emissions control.

While so-called “tinkerers” and enthusiasts may wish to modify their vehicle software for personal needs, granting greater access to vehicle software for purposes of modification fails to consider the overall concerns surrounding regulatory compliance and safety and the overall impact on safety and the environment. … Thus, the current prohibition ensures the distribution of safe and secure vehicle software within an overall vehicle security strategy implemented by car manufacturers that does not restrict vehicle owners’ ability to diagnose, modify or repair their cars.

The arguments from the auto lobby therefore go along the lines of “folk can’t mess with the code because they’ll try to break the law” – as opposed to, say, the manufacturers systematically breaking the law, or folk trying to find out why a car performs nothing like its apparently declared figures. And I’m sure there are no elements of the industry wanting to prevent folk from looking at the code lest they find that it has “test circumvention” code baked into it by the actual manufacturers…

What the VW case throws up, perhaps, is the need for a clear route by which investigators are allowed to check on the compliance behaviour of various algorithms, not just in formal tests but also in everyday road tests that are unannounced to the engine management system.

And that doesn’t necessarily require on-the-road tests in a real vehicle. If the controller is a piece of software acting on digitised sensor inputs to produce a particular set of control algorithm outputs, the controller can be tested on a digital testbench or test harness against various test inputs covering a variety of input conditions captured from real world data logging. This is something I think I need to read up more about… this could be a quick way in to the very basics: National Instruments: Building Flexible, Cost-Effective ECU Test Systems White Paper. Something like this could also be relevant: Gehring, J. and Schütte, H., “A Hardware-in-the-Loop Test Bench for the Validation of Complex ECU Networks”, SAE Technical Paper 2002-01-0801, 2002, doi:10.4271/2002-01-0801 (though the OU Library fails to get me immediate access to this resource…:-(.
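In the meantime, the basic idea of a software-in-the-loop test is simple enough to sketch: replay logged input traces through the control function and see which mode it selects for each. A toy harness (everything here is made up for illustration):

```python
# Minimal software-in-the-loop sketch: replay logged input traces through the
# control function and record which calibration it selects for each trace.
# Real ECU test benches are far more involved; everything here is made up.
def run_testbench(controller, traces):
    return {name: {controller(sample) for sample in samples}
            for name, samples in traces.items()}

traces = {  # hypothetical logged sensor snapshots
    "ftp75_dyno_log": [{"steering_angle_deg": 0.2, "elapsed_s": t, "speed_kmh": 40}
                       for t in range(0, 1800, 60)],
    "everyday_road_log": [{"steering_angle_deg": 15.0, "elapsed_s": t, "speed_kmh": 70}
                          for t in range(0, 1800, 60)],
}

def controller(sample):  # stand-in for the ECU logic under test
    test_like = abs(sample["steering_angle_deg"]) < 1.0
    return "dyno_calibration" if test_like else "road_calibration"

# A compliant controller behaves identically across both traces; a "switch"
# shows up as different calibrations for test-like vs road-like inputs.
print(run_testbench(controller, traces))
```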

PS In passing, I just spotted this: Auto Parts Distributor Pleads Guilty to Manufacturing and Selling Pirated Mercedes-Benz Software – it seems that Mercedes-Benz distributes “a portable tablet-type computer that contains proprietary software created by to diagnose and repair its automobiles and that requires a code or ‘license key’ to access [it]”, and that a company had admitted to obtaining “without authorization, … [the] Mercedes-Benz SDS software and updates, modified and duplicated the software, and installed the software on laptop computers (which served as the SDS units)”. So a simple act of software copyright/license infringement, perhaps, relating to offboard testing and diagnostic tools. But it’s another piece in the jigsaw, for example, when it comes to engineering software that can perform diagnostics.

PPS via @mhawksey, a link to the relevant West Virginia University test report – In-use emissions testing of light-duty diesel vehicles in the U.S. – and noting Martin’s observation that there are several references to Volkswagen’s new 2.0l TDI engine for the most stringent emission standards – Part 2 (reference [31] in the paper: Hadler, J., Rudolph, F., Dorenkamp, R., Kosters, M., Mannigel, D., and Veldten, B., “Volkswagen’s New 2.0l TDI Engine for the Most Stringent Emission Standards – Part 2”, MTZ Worldwide, Vol. 69, June 2008), which the OU Library at least doesn’t subscribe to:-(…

“Interestingly” the report concluded:

In summary, real-world NOx emissions were found to exceed the US-EPA Tier2-Bin5 standard (at full useful life) by a factor of 15 to 35 for the LNT equipped Vehicle A, by a factor of 5 to 20 for the urea-SCR fitted Vehicle B (same engine as Vehicle A) and at or below the standard for Vehicle C with exception of rural-up/downhill driving conditions, over five predefined test routes. Generally, distance-specific NOx emissions were observed to be highest for rural-up/downhill and lowest for high-speed highway driving conditions with relatively flat terrain. Interestingly, NOx emissions factors for Vehicles A and B were below the US-EPA Tier2-Bin5 standard for the weighted average over the FTP-75 cycle during chassis dynamometer testing at CARB’s El Monte facility, with 0.022g/km ±0.006g/km (±1σ, 2 repeats) and 0.016g/km ±0.002g/km (±1σ, 3 repeats), respectively.

It also seems that the researchers spotted what might be happening to explain the apparently anomalous results they were getting with help from reference 31: “The probability of this explanation is additionally supported by a detailed description of the after-treatment control strategy for Vehicle A presented elsewhere [31]“.

PPPS I guess one way of following the case might be to track the lawsuits, such as this class action complaint against VW filed in California (/via @funnymonkey), or this one, also in San Francisco, or Tennessee, or Georgia, or another district in Georgia, or Santa Barbara, or Illinois, or Virginia, and presumably the list goes on… (I wonder if any of those complaints are actually informative/well-researched in terms of their content? And whether they are all just variations on a similar theme? In which case, they could be an interesting basis for a comparative text analysis?)
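(If I did want to run that comparison, a first pass might be as simple as TF-IDF plus cosine similarity over the extracted text of each filing – a hedged sketch, assuming the complaint PDFs have already been reduced to plain text files in a folder:)

```python
# TF-IDF + cosine similarity over the extracted text of each filing quickly
# shows which complaints are near-duplicates of one another. Assumes the PDFs
# have already been reduced to plain text files in a (hypothetical) folder.
from pathlib import Path
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

paths = sorted(Path("vw_complaints").glob("*.txt"))  # hypothetical folder
texts = [p.read_text() for p in paths]

matrix = TfidfVectorizer(stop_words="english").fit_transform(texts)
similarity = cosine_similarity(matrix)

for path, row in zip(paths, similarity):
    print(path.name, [round(score, 2) for score in row])
```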

In passing, I also note that suits have previously – but still recently – been filed against VW, amongst others, regarding misleading fuel consumption claims, for example in the EU.

Even Though RSS Never Went Away, Could It Be Coming Back as a Facebook Sinker?

Long time readers will know I was – am – a huge fan of RSS and Atom, simple feed based protocols for syndicating content and attachment links, even going so far as to write a manifesto of a sort at one point (We Ignore RSS at OUr Peril).

This blog, and the earlier archived version of it, are full of reports and recipes around various RSS experiments and doodles, although in more recent years I haven’t really been using RSS as a creative medium that much, if at all.

But today I noticed this on the official Facebook developer blog: Publishing Instant Articles Directly From Your Content Management System [Instant Article docs]. Or more specifically, this:

When publishers get started with Instant Articles, they provide an RSS feed of their articles to Facebook, a format that most Content Management Systems already support. Once this RSS feed is set up, Instant Articles automatically loads new stories as soon as they are published to the publisher’s website and apps. Updates and corrections are also automatically captured via the RSS feed so that breaking news remains up to date.

So… Facebook will use RSS to synch content into Facebook from publishers’ CMSs.
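Mechanically, there’s nothing exotic on the publisher side – it’s just feed polling: new items turn up in the CMS feed, corrections show up as updated entries under the same id. A minimal sketch with feedparser (hypothetical feed URL):

```python
# Publisher-side mechanics are just feed polling: new items appear in the CMS
# feed, corrections show up as updated entries under the same id.
import feedparser

seen = {}  # entry id -> last seen "updated" timestamp

def poll(feed_url):
    new, updated = [], []
    for entry in feedparser.parse(feed_url).entries:
        key = entry.get("id", entry.get("link"))
        stamp = entry.get("updated", entry.get("published", ""))
        if key not in seen:
            new.append(entry.title)
        elif seen[key] != stamp:
            updated.append(entry.title)
        seen[key] = stamp
    return new, updated

print(poll("http://example.com/cms-articles.rss"))  # hypothetical feed
```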

Depending on the agreement Facebook has with the publishers, it may require that those feeds are private, rather than public, feeds that sink the content directly into Facebook.

But I wonder, will it also start sinking content from other independent publishers into the Facebook platform via those open feeds, providing even less reason for Facebook users to go elsewhere as it drops bits of content from the open web into closed, personal Facebook News Feeds? Hmmm…

There seems to be another sort of a grab for attention going on too:

Each Instant Article is associated with the URL where the web version is hosted on the publisher’s website. This means that Instant Articles are open and compatible with all of the ways that people share links around the web today:

  • When a friend or page you follow shares a link in your News Feed, we check to see if there is an Instant Article associated with that URL. If so, you will see it as an Instant Article. If not, it will open on the web browser.
  • When you share an Instant Article on Facebook or using email, SMS, or Twitter, you are sharing the link to the publisher website so anyone can open the article no matter what platform they use.

Associating each Instant Article with a URL makes it easy for publishers to adopt Instant Articles without changing their publishing workflows and means that people can read and share articles without thinking about the platform or technology behind the scenes.

Something like this maybe?


Which is to say, this?


Or maybe not. Maybe there is some enlightened self-interest in this, and perhaps Facebook will see a reason to start letting its content out via open syndication formats, like RSS.

Or maybe RSS will end up sinking the Facebook platform, by allowing Facebook users to go off the platform but still accept content from it?

Whatever the case, as Facebook becomes a set of social platform companies rather than a single platform company, I wonder: will it have an open standard, feed based syndication bus to help content flow within and around those companies? Even if that content is locked inside the confines of a Facebook-parent-company-as-web attention wall?

PS So the ‘related content’ feature on my WordPress blog associates this post with an earlier one: Is Facebook Stifling the Free Flow of Information?, which it seems was lamenting an earlier decision by Facebook to disable the import of content into Facebook using RSS…?! What goes around, comes around, it seems?!

Tweetable Bullet Points

Reading through a Pew Research blog post announcing the highlights of a new report just now (Libraries at the Crossroads), I spotted various Twitter icons around the document, not unlike the sort of speech-bubble comment icons used in paragraph-level commenting systems.


Nice – built in opportunities for social amplification of particular bullet points:-)

PS thinks: hmm, maybe I could build something similar into auto-generated copy for contexts like this?
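(The mechanics are simple enough – auto-generating “tweet this” links really just means wrapping each bullet in a Twitter web intent URL, something like:)

```python
# Auto-generating "tweet this" links for bullet points just means wrapping each
# one in a Twitter web intent URL.
from urllib.parse import quote

def tweetable(bullet, url):
    intent = "https://twitter.com/intent/tweet?text=" + quote(f"{bullet} {url}")
    return f'<li>{bullet} <a href="{intent}">[tweet this]</a></li>'

bullets = ["An example bullet point from the report summary."]  # placeholder copy
print("\n".join(tweetable(b, "http://example.com/report") for b in bullets))
```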

PPS via @ned_potter, the suggestion that maybe they’re making use of this WordPress plugin: ClickToTweet. Hmmm… And via @mhawksey, this plugin: Inline Tweet Sharer.