Wednesday, May 25, 2016

Cory Doctorow on our movement towards digital alchemy

Cory Doctorow gave a terrific 13-minute talk at a recent O'Reilly conference on why it is time to stop and do something about DRM - a world in which everything [the internet of things] is made of computers is not just anti-competitive, it is a nightmare for privacy and security.

In ancient times alchemists were a bit like scientists, Cory tells us. They used something like the scientific method to attempt to understand the laws of nature. When they could not reproduce results they blamed demons and superstition - the demons, the alchemists believed, changed the rules of nature in order to prevent anyone understanding the universe. Alchemists also kept what they found secret, rather than sharing their work and engaging in peer review and attempts to recreate results, as scientists do. The result was lots of alchemists finding out for themselves that mercury was poisonous. The Enlightenment demonstrated that sharing knowledge worked in the public good.

O'Reilly have released 3 minutes of the talk freely for sharing on YouTube.



Cory, as ever, has lots of engaging tales to tell. One of the most serious was perhaps that of a security researcher who has diabetes. He discovered a flaw in an insulin pump: the built-in wifi would let a stranger take control of the pump from 30 feet away and kill the diabetic using it. But he couldn't publish the details, because doing so would mean revealing DRM circumvention instructions - a felony in the US, punishable by up to 5 years in jail. He could refuse to use the pump himself, but he could not tell others why or how it was unsafe.

Other researchers have shown it is possible to take remote control of a Chrysler Jeep over the internet, since it carries a Sprint network SIM. 1.4 million of these vehicles in the US could be driven from anywhere on the internet, with acceleration, brakes and steering all controllable remotely.

MP3 players and smartphones - and the hearing damage that comes with a lifetime of loud audio piped straight into the ear - are likely improving the long-term commercial prospects of hearing aid vendors. But future hearing aids will be computers: general purpose technology in your head, programmed and controlled not by the user but by the manufacturer or supplier. They will know what we hear and when we hear it, filter out what their controllers don't want us to hear, or even make us hear things, depending on how they are configured. And they will have DRM. As will skyscrapers with the seismic dampers that keep them from falling down, and smart thermostats that let power companies reach into our homes, turn the temperature down and stop us turning it back up again.

Yet when we make it illegal to crack or circumvent DRM, we make it attractive for commercial enterprises to engage in exactly this kind of nuisance behaviour, especially if they are making hardware on razor-thin margins.

Cory concluded by noting that this stuff - DRM - designed in an effort to maximise revenues for the entertainment business, is driving us towards a dystopian future in which we will "be Huxleyed into the full Orwell". It's a new age of alchemy, a demon-haunted world in which we are not allowed to understand the technology and the internet of things we live amongst. Do take a small break and listen to the whole talk yourself. It's 13 minutes well spent.

Monday, May 23, 2016

The tyranny of the algorithm yet again...

I was reminded yet again, over the weekend, how easy it is to get branded persona non grata in the age of the judgmental algorithm.

A search for "Isis Close" via any popular search engine throws up a significant collection of such streets, primarily in the Thames Valley region, in places like Long Hanborough, Aylesbury, Oxford, Abingdon and Putney, amongst others.

That these place names exist won't be a surprise to anyone familiar with English limnology - the study of rivers and inland waters. As Wikipedia helpfully tells us, "The Isis is the name given to the part of the River Thames above Iffley Lock which flows through the university city of Oxford". In at least one local primary school I'm familiar with, the classes are called Windrush, Cherwell, Isis and Thames.

Unfortunately for those who live in an Isis Close, Street, Road or other equivalent, the label Isis has been appropriated by or conferred upon (by the media) a bunch of murderous extremists in the Middle East. So the word "Isis" has become somewhat toxic in the West.

Now PayPal have decided that they are not prepared to facilitate payments for goods to be delivered to any address which includes the word "Isis".

An Isis street resident ran into some unexpected difficulties when attempting to buy a small quantity of haberdashery on the internet using her PayPal account. The transaction would not process. Puzzled, she eventually got irritated enough to brave the 24/7 customer support telephone-tag labyrinth. The short version of the answer from the real person she finally got through to was that PayPal have blacklisted addresses which include the name "Isis". They will not process payments for goods to be delivered to an Isis-related address, whatever state of privileged respectability the residents of such properties may have earned or inherited in their lifetimes to this point.

Who knows whether the "avoid Isis" rule was added by a low-level techie, was a policy decision within PayPal driven by risk-averse lawyers, or arose from some other process. Whatever the route, the result is that people with an Isis address have been tagged with a "do not touch" label on the internet.
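To see how an ordinary street address gets caught, here is a minimal sketch of the kind of naive, case-insensitive substring screen such a rule might amount to. The watchlist contents, function name and matching behaviour are assumptions for illustration only - nothing here is PayPal's actual code.

```python
# Hypothetical sketch of a naive address screening rule (assumed - not PayPal's real logic).
# A case-insensitive substring match against a watchlist of "risky" terms.

WATCHLIST = ["isis"]  # assumed entry; real sanctions/risk lists are far larger


def flag_address(delivery_address: str) -> bool:
    """Return True if any watchlist term appears anywhere in the address."""
    address = delivery_address.lower()
    return any(term in address for term in WATCHLIST)


# An ordinary Oxfordshire street name trips the same rule as anything genuinely suspect:
print(flag_address("3 Isis Close, Long Hanborough, Oxfordshire"))  # True  - payment blocked
print(flag_address("12 Cherwell Road, Oxford"))                    # False - payment allowed
```

A word-boundary match, a whitelist of known street names or a human review step would tell a Thames-side address apart from the organisation, but a crude substring screen is cheap to deploy and the cost of its false positives falls on the customer rather than the company.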

We rarely understand that the poor, the mentally or physically disabled, ethnic and religious minorities, and every other marginalised group that privileged society discriminates against already live in a dystopian world - a world full of legal, environmental, economic and societal strictures and attitudes that make life difficult. We rarely understand, either, how easy it is, in a world full of big data wielded by commerce and governments alike, to be cast out from our world of privilege into that of the marginalised.

So what are residents of "Isis" addresses to do? Interestingly enough, PayPal are not so wedded to their dissociation from "Isis" that they are banning this particular brand of humanity from holding PayPal accounts. The marginalised are still potentially profitable fodder. The organisation is merely concerned with not facilitating transactions which result in goods or services being delivered to those addresses. So Isis addressees could:
  • have their orders delivered to an alternative address for collection
  • apply to the local council to have the name of their street changed
  • move house
The first is a serious inconvenience; the latter two are rather extreme. Even then, given the tendency of unwelcome labels to hang around people on the internet and in big databases, there's no guarantee that a change of street name or address, drastic as they are, would entirely rid the people concerned of the tyranny of the algorithm.

Being unable to buy a sewing kit on the internet might seem like a minor middle-class inconvenience, but metadata - details like someone's address - matters. It's only two years since the former US National Security Agency and CIA chief, General Michael Hayden, went on record with his "We kill people based on metadata" comment. Metadata is used to categorise and to discriminate, and we have no control over what commercial organisations, governments, other economic actors and indeed criminals do with it.

With the UK Investigatory Powers Bill heading for the statute books, and the government taking little or no notice of the serious criticism it has received, this state of affairs is not likely to improve any time soon. Could I remind the reader of this simple engineer's perspective on the meaning of s78 alone of this large and complex law - it looks a bit like this:
I leave you to decide whether the government and its institutions will be able or willing to use this unimaginably gigantic collection of data in the public interest and without doing too much collateral damage to the sifted and categorised populace along the way.

Update: a stark example of where the tyranny of the algorithm does real damage - software used across the US to predict future criminals is biased against black people.