Notes from #ORGCon 2013

The last conversation I had before leaving ORGCon for the day was with Alec Muffet, Adriana Lukas, Glynn Wintle and others about whether ORG was a Liberal Democrat party conspiracy, or was in some way permanently leftist. This is a concern that has been shared with me privately, and one important enough to gather opinion on. The consensus was that it wasn’t a Liberal Democrat conspiracy and had significant involvement from right-wingers (by which I mean pro-market and social liberals), including vouched-for members of that persuasion.

Tim Wu’s view on what we ought to do

The assurances received are reassuring, but there is a case to answer. Ex-Green figurehead Jim Killock did, after all, agree to open the conference and – in his words – “set the scene” with a talk by Tim Wu. I had frankly never heard of the chap before his talk today, but was persuaded that freedom is not to be found on the road he is travelling.

He’s a knowledgeable man with a firm grasp of technical history, and his talk was educational. He is rightly concerned for free speech and privacy on the internet, but his solutions are not going to work. He wants to bring corporations under the control of the state through anti-trust and monopolies legislation and net-neutrality regulations, as if the state were a perfectly incorruptible God capable of stewarding the internet better than the people who own it now (which is a very large number of companies). Bizarrely, he suggested a US network operator (AT&T if I recall correctly) was pressured into accepting PRISM through fear of what might happen to a pair of mergers that were before the US anti-trust authorities, but did not acknowledge the obvious corollaries of that story and again endorsed the use of anti-trust law in the Q&A.

Snooper’s charter is still not dead

The session on the Snooper’s Charter was basically an info dump, delivered in under an hour. The dump included the state of the legislation now and the “problems” facing legislators, that is, the types of data they can’t yet legally get. An investigative journalist told his story and established the ages of various measures (which are not new). Discussion then continued, somewhat inaudibly, on some of the various problems with stopping the passage of the bill. I concluded a better way to get up to date on this area was to read through the report, which was helpfully available in the exhibition area.

Data protection revolution

The session on the Data Protection Bill was better paced and structured. The lady from Privacy International claimed that the EU regulation would be a tame evolution of current law, which may be true to some extent, but the level of protection prescribed in law seemed to be very high. She listed:

  • The regulation sought to ensure a strong right to privacy that was enforced (against businesses).
  • A right to data-portability, which means businesses would be mandated to provide (securely, we assume) a data-export function where all the information they hold on you can be downloaded in a form that would allow it to be uploaded to a competitor.
  • A ban on automated profiling, which means a ban on making decisions about people using software algorithms that might cause them to be treated differently from other customers.
  • A right to be forgotten about.
  • A requirement that privacy protection is on by default.
  • A requirement that privacy protection is designed into systems before customers interact with them.
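Mechanically, the data-portability item above amounts to little more than a mandated export/import path. Here is a minimal sketch, assuming a JSON interchange format and hypothetical field names (the regulation does not specify a format; everything below is illustrative):

```python
import json

def export_user_data(record):
    """Serialise everything a service holds on a user into a portable,
    machine-readable blob that a competitor could ingest."""
    return json.dumps(record, indent=2, sort_keys=True)

def import_user_data(blob):
    """The receiving service parses the same format back into a record."""
    return json.loads(blob)

# Hypothetical profile; field names are illustrative only.
profile = {
    "name": "A. Customer",
    "email": "customer@example.com",
    "order_history": [{"item": "shoes", "price": 120.0}],
}
blob = export_user_data(profile)
assert import_user_data(blob) == profile  # round-trips losslessly
```

The hard part in practice is not the serialisation but agreeing a schema that competitors can actually consume; JSON here is only a stand-in.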

Privacy campaigners are obviously very keen for that shopping list to get passed, as it represents all of the features of their preferred way for consumers to deal with companies. Corporations seem to be a bit concerned that the regulations are very prescriptive and want more flexibility. That sets the scene for a face-off in the EU, and some 4000 amendments have been tabled. Privacy campaigners are lobbying for their interests to be represented by MEPs and are tracking how MEPs vote, sorting “good” from “bad”.

The deputy information commissioner seemed more reasonable. He mentioned, for example, that a right to be forgotten that entitled consumers to have their data erased entirely would be difficult to achieve technically and would have to be relaxed. The word he used was “unrealistic”; quite so: every developer knows that you do not lightly delete database records. He also mentioned that impact assessments were a potential problem and that “one size fits all” regulations may be onerous to some (I have heard that an impact assessment might need to be conducted for every change to regulated software). Conversely, there may be a problem of fragmentation in the data protection regime, with courts, police and public sector bodies subject to a separate regime. One regime could be hard enough, given the great deal of detail in it, from the documentation to be filed to the qualifications of the staff that corporations would be required to retain.
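The commissioner’s point about literal erasure being “unrealistic” is easy to illustrate. A common compromise is the soft delete: scrub the personal fields and flag the row, rather than issue a DELETE that would orphan every record referencing it. A minimal sketch with an in-memory SQLite database and a hypothetical schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, erased INTEGER DEFAULT 0)")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER REFERENCES users(id))")
conn.execute("INSERT INTO users (id, name) VALUES (1, 'A. Customer')")
conn.execute("INSERT INTO orders (id, user_id) VALUES (100, 1)")

def forget_user(db, user_id):
    """'Erase' a user by scrubbing personal data and flagging the row,
    so historical orders still reference a (now anonymous) user."""
    db.execute("UPDATE users SET name = NULL, erased = 1 WHERE id = ?", (user_id,))

forget_user(conn, 1)
# Personal data gone, but referential integrity preserved:
assert conn.execute("SELECT name, erased FROM users WHERE id = 1").fetchone() == (None, 1)
assert conn.execute("SELECT COUNT(*) FROM orders WHERE user_id = 1").fetchone()[0] == 1
```

Backups, replicas and audit logs make even this an understatement of the problem, which is presumably what “unrealistic” was gesturing at.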

Another item mentioned by the deputy information commissioner was that there is a stiff deadline for the regulations to pass. If they don’t get through by March/April 2014 then they will have to be abandoned while elections are held, and may not be picked up again.


Kassey Chappelle spoke on behalf of Vodafone (video bookmark). She claimed there is a level of under-investment in privacy by corporations [not according to that Microsoft advert 🙂 ] and that Vodafone had tried to do more. They had conducted a health check of privacy in the company and found a spaghetti mess of policies and procedures agreed on a product-by-product basis in a range of jurisdictions, all agreed between product managers and lawyers. She claimed, fairly I suppose, that lawyers are trained to provide technically correct answers about what to do in the face of the law, and that the user experience and the user’s desires of the product are secondary. Consent was being captured in various places, usually by just showing a massive block of legalese. It was not meaningful to users and was a mess to administer.


The alternative approach outlined for Vodafone gave the marketing and product development functions control over privacy and made privacy a feature of their business model and user experience rather than a tick-box add-on considered only at the end. This was privacy by design, as intended for the new law, but brought into the business proposition and (it was claimed) into the culture of the company; at least, that was the declared intent.

Chappelle called this model “accountability” and wanted legislation that gave companies the responsibility to deal with consumers in an accountable way, and placed the burden of getting audits of that behaviour onto the corporations. Of course, a large corporation like Vodafone has no problem paying for such processes, but this is not the case for many start-ups; although of course a start-up also has full control of what promises it chooses to be accountable for.


At the end of the conference, EFF founder John Perry Barlow addressed the audience for the keynote. His was to be an interactive keynote, and he pointed out that every seat contained a microphone so that questions could be asked. This proved interesting.

Barlow’s vision was of a world in which any person, anywhere on Earth (a big enough vision for the next few decades), would be able to access any knowledge known to the human race as quickly as their particular mind was able to assimilate it – that is, without restrictions of access and intellectual property regimes. He saw the world changing fast as “religions”, such as Christianity or The United States of America, suffered from the shock of new information flows and transparency.

For actual religions, he said, this made religious epistemology – the idea that what is right is contained in a particular book – unsustainable, as young minds free to explore the Web would more readily find superior answers that undermined religious texts.

For governments, their secrets were no longer sustainable. They would have to act with integrity (I use the Randian word, Barlow may not have) and act as they presented themselves to act (without breach between principle and action). State authority would also be undermined, and governance would have to be more “horizontal”. Here I think Barlow meant distributed, or heterarchical, but you could also say that this would inevitably be market driven (markets, the ultimate free heterarchy). Barlow may have meant something else here, but I cannot think what.

In corporations, the desire to hire blandly acceptable, normal-looking people will be undermined. At the moment, if you had a tattooed face – Barlow speculated – you would probably not get the job at IBM. In future you might. The reason he gave is that the young today have no trouble asserting their individuality publicly, so quirky lifestyles will be known openly and will be visible to everyone in the future workplace. He thought this would undermine the expectation that there is some kind of normal person.

Toward the end of the main talk he said something optimistic about our technology and tools. He said such tools are not malignant: what they do is what we intend for them, as long as we give thought to what that intent is. Tim Wu would have interjected that we have a duty to supervise the tools as well, but did not. Barlow did say, however, that we may find we need new ways to get things done – new ways to solve societal problems – and would have to accept that.

Broadcast breakdown

The end of the talk was encouraged to be a two-way session. He said he had spent his life trying to destroy broadcast media and did not want to become one. The microphones came out, and back-and-forth segments were interspersed with longer responses from Barlow. It was during this period that Barlow mentioned he had been working in Iceland, Ireland, New Zealand and Ecuador to find new ways to privately store and communicate data. He talked of various ways to build alternatives to “bit transport” (internet connections) that did not involve blowing up the Internet, as Tim Wu suggested, but rather tunnelling and layering over it. White spaces in radio spectrum were considered important.

There was a challenge made to Barlow – who, remember, is a founder of one US digital civil liberties group, the EFF – to bring the 4th Amendment to protect the UK. The idea that PRISM was enabling massive surveillance of UK citizens as their traffic passed through the US had this audience understandably riled, but it was as if the failure of the NSA to adhere to America’s constitution while acting in America was somehow Barlow’s problem. His earlier caution that the NSA do care about the constitution, and are also incompetent, was not considered protection enough, and so asking the audience to get some perspective on that riled them further. In response to this bizarre anger Barlow did promise, and did repeat, that his organisation would prefer the US constitution to be interpreted as applying to foreign citizens, but frankly I don’t think this is entirely his problem! I found one part of his response quite compelling: he said there is no world government to enforce a privacy standard, so if we want one then we – technologists, programmers – need to make it “practically and through technology architecture”, because no one else will.

Nor, actually, was the next little challenge. Barlow was asked whether Julian Assange – still hiding in the Ecuadorian embassy – was the deserving recipient of Swedish due process for the rape allegation he was indicted for, or whether he was the victim of a US conspiracy to get him. This question put Barlow in the awkward position of having to comment on the reality of the rape itself (an analysis complicated by claims of ambiguity around consent). I cannot see how Barlow could have answered such a question satisfactorily, and indeed he did not, using the words “ungentlemanly conduct”. There was a little anger in the room and some people walked out early (without saying why), but the Twittersphere erupted with abhorrence and condemned Barlow as if he had said rape was okay. In fact, he had been diplomatic towards Assange in the face of imperfect knowledge, and before a trial, and had been put on the spot by the questioner.

It was all very undignified but, bizarrely, only on Twitter.

When rights collide?

One of the angles not covered in the Old Holborn fallout is that of privacy.

For the uninitiated: firstly, where have you been? And secondly, the story is roughly as follows:

  1. Opinionated anonymous tweeter and blogger says some things that some people in Liverpool find highly offensive.
  2. People in Liverpool do some research and find out the name, address, workplace, phone number etc. of the tweeter, publish it, and there is some harassment: phone calls, threats, emails, and calls to his employer.
  3. The tweeter’s account is suspended – probably by the tweeter.
  4. The police are called – probably by both parties, but for different reasons.
  5. The tweeter comes back online.

The issue of general free speech has been the subject of much debate. But there is a specific angle which may split Libertarians: Is it acceptable to publicise someone’s name, address and other personal details – if they haven’t given permission, and especially if they have let it be known that they wish to be anonymous?

Let’s get rid of the easy answer first:

Where it’s someone “official” who has been provided with the personal information, they should not give it out – at least not without a court order. This applies to the government, ISPs, utility companies etc. And where such information is collected from these organisations illegitimately through hacking, social engineering, or because someone knows someone who works there, again it is patently wrong.

One of the above may have been true in this case. But some tweets have indicated that someone knew someone who had a sister who went to the tweeter’s wedding. By knowing the name and the spouse’s name they could do a little research, review old tweets for clues, do some Googling, some searching on LinkedIn and put together the full identity picture.

So at the end of the day, with the exception of the friend’s friend’s sister, it was all put together using public domain information.

We libertarians may rush to the libertarian gods for answers – the Rothbards, the Rands etc – but they pre-date the internet and aren’t very helpful. We can refer to rights and discuss whether our personal information is our property and therefore whether we need to give permission before it is published. But every day we may give our name to casual acquaintances and tell them where we live, and what our job is. Do we need to hand each of them a legal notice telling them that they must not pass this on without express permission?

It’s funny how even some of the most ardent free speech advocates come down on the side of privacy – even though what is being disclosed is factual. Of course there are problems. Personal information can be used by ne’er-do-wells to hunt someone down and commit violence against them. But one of the challenges of being a free-speech purist is that there can be unfortunate consequences of unfettered speech.

Someone argued with me today that such information was not a part of speech, it was data, and therefore free speech arguments do not apply to it. I countered that free speech involves context, and that part of the context of any utterance is the identity of the speaker.

I would also argue that the risk of “exposure” is a control over the excesses of free speech. In a face to face situation, the control over offensive free speech is the risk that the offended party may initiate violence. This of course is wrong, but of scant comfort when you’ve got a bloody nose and the aggressor has scarpered. With online speech, there is no bloody nose – unless you’re outed to your foes and one of them has the balls to come calling.

As ever, there is no perfect solution. It’s about balancing conflicting “rights”. Free speech is a natural right. I’m not convinced, however, that rights exist in our personal data, except where data is given with the specific expectation that it will be held in confidence. The risk of exposure for an anonymous blogger may help to curtail the most offensive excesses of their output.

But really, the main lesson is this: if you know you’re going to offend people online, and those people might come after you, you need to keep your real-life and online personae totally separate – even from friends and family. If you wish to keep such personae apart from the “authorities”, and from those who might be able to steal information from the authorities, then you’ll need to learn about Tor, proxy servers and other electronic anonymisation and obfuscation tools too.

  • James Rigby
  • Lives in Wickford, Essex
  • Information Security Consultant.

And doesn’t care who knows it. At least not when I’m not offending anyone.

Having your cake, and eating my cookies


It is obvious that web users are tracked by advertisers; we all notice occasionally when one advert follows us around the Internet. If I let my fiancée use my laptop, I would not be surprised to be followed around the web by adverts for Russell and Bromley shoes. Yes, this is annoying, even intrusive, but this tracking, and the “cookies” that allow it, increase advertising revenues for publishers. They are part of a new economic order that has emerged without anyone pointing a gun at anyone; they are part of the emergent implicit compromise that is our culture and our economy online. We know it happens, we know people profit from it, and we accept it in our own self-interest, or because life is too short to worry.
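The mechanism itself is simple enough to sketch. On a first visit the tracker issues a unique ID in a cookie; on every later request the browser sends it back, letting the tracker stitch page views into one profile. Everything below (the names, the profile store) is illustrative, not any real ad network’s code:

```python
import uuid
from http.cookies import SimpleCookie

profiles = {}  # tracker's server-side store: tracking id -> list of page views

def handle_request(cookie_header, page):
    """Return (tracking_id, Set-Cookie header or None) for one request."""
    jar = SimpleCookie(cookie_header or "")
    if "track_id" in jar:
        tid, set_cookie = jar["track_id"].value, None   # returning visitor
    else:
        tid = uuid.uuid4().hex                          # first visit: mint an ID
        set_cookie = f"track_id={tid}; Path=/"          # ask the browser to keep it
    profiles.setdefault(tid, []).append(page)           # grow the profile
    return tid, set_cookie

tid, header = handle_request(None, "/shoes")            # new visitor, cookie issued
tid2, none = handle_request(f"track_id={tid}", "/bags") # same browser, recognised
assert tid == tid2 and profiles[tid] == ["/shoes", "/bags"] and none is None
```

When the cookie is set by an ad network embedded on many sites (a third-party cookie), the same recognition works across all of them, which is how one advert manages to follow you around.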

The institution of “the web”, which is not one institution at all, offers an immeasurable amount of knowledge and connects us together as a culture in a way which is unique and precious. It is shepherded, as the natural world is shepherded, by owners who each have rights to their small part of it. Users own terminal equipment and lease copper wires that connect them to networks owned by ISPs, which connect them to servers owned by companies, universities, institutions, and individuals. When a user chooses to connect to a service, he sends a message from his terminal equipment out across that network to the service’s server. We must not forget that this is a voluntary act and is, for the most part, initiated by the user.

I want to explore another voluntary act to draw a useful comparison. If I have a problem and I ask for help, I will naturally want to choose who I ask. The question might be very personal, or might reveal my intent in a way that is unhelpful to my interests. If it is particularly sensitive I might check over my shoulder before I begin to speak, and I might naturally wish to keep track of whom I have spoken to and what I have revealed, to make sure that I haven’t revealed too much to one person. Occasionally, I might reveal one piece of information, then decline to offer a certain second piece of information knowing that it is possible to infer something sensitive, for example, when the inferred fact would breach a confidence. Some people get really complicated about this and look out for occasions when confiding in person A changes their behaviour in a way that offers a clue to person B, who is looking out for it, and thereby the secret is revealed without A ever breaching the confidence to B. Frankly, I think life is too short to worry about that kind of thing, but I know people do invest time in it, especially if they have made decisions in life that they need to lie about.

The point of talking about these scenarios is to highlight one thing about them: no one ever said that deciding who to trust and what to trust them with is the responsibility of the person receiving the information. In real life, we take responsibility for deciding whom to trust with information and for the inferences that the recipient might make. We fret about acting responsibly with what we are told, and maintaining a good impression, which means acting to avoid certain inferences being made (especially untrue ones).

Accessing a website is very much like asking someone for help (a request is made, an action might be performed, and information flows back) but some people seem to believe that users online cannot take the same level of responsibility and demand health warnings that indicate when the listener is alert to connecting together facts and making inferences. I think they are wrong.

“But Simon!” you scream, “online companies operating big databases can remember more about you than even you could remember. You are powerless to keep track of what you reveal.” This is a matter of degree, and scale, but is not a fundamental difference. It arises because the economics of keeping track of the facts (requests, questions, actions performed) is much more attractive for one party than for the other; this is especially true of advertising companies, but applies in e-commerce too, where retailers push products based on order history rather than browsing history. The ad revenue, or future orders, offer a strong incentive to service operators to keep a detailed track, but no strong incentive exists for users. Users are left with a vague fear about what might be inferred, but rationally decide that life is too short to worry.

I said companies are incentivised to do tracking by improved ad and order revenue, but that says nothing about the cost. As ever, the market supplies cost-effective solutions by coordinating the division of labour. Specialist firms have sprung up to do customer tracking and product recommendation effectively and cheaply for the whole array of e-commerce and media websites. Profiling software for advertising firms is lucrative, so advertising brokerages run their own versions, and the division of labour is instead between the advertising brokerage (e.g. Google AdSense) and the service operator (e.g. a blog, like this one) rather than between service operator and software firm; but the division of labour is still at work.

The EU’s Cookie Directive is an irritating legislative solution to the vague sense of fear experienced by users. Rather than taking responsibility for what they voluntarily reveal, they commission a solution from politicians that is based on the implied threat of imprisonment. The law obligates websites to implement silly yet expensive pop-up health warnings whilst offering little clarity about what those pop-ups must look like. Implementing those pop-ups for an enterprise website will require a fusion of legal, business and technical knowledge that will keep a large group of very well paid people occupied in interminable meetings for a very long time, and then occupied on engineering and impact-analysis tasks for a long time afterwards.

It’s wrong of users to impose those laws when a better voluntarist solution exists. I said earlier that users have little incentive to do the work of keeping track of all the trivial facts they reveal, and agreed that they cannot win against companies with massive databases and professionally produced specialist software. All true, but users’ vague sense of fear is also an incentive for users to act to protect their privacy. The market in technology online offers a process of dividing labour in a way that matches those weak, distributed incentives with the intellectual effort of people able to do something about them; and it already has done. For example, the Do Not Track specification, implemented in Firefox and other browsers, is a means to tell service operators that the issue bothers you, leaving it up to them to meet that preference and balance their other priorities. Identical to any other request from a customer, this is good, honest process. Evidon is a company with a browser plugin called Ghostery which produces a little database of the tracking that has happened to you, and puts you in control of the process in an accessible way. In fact, all web browsers have included features of this kind to some extent, and even the advertising networks that can afford it have begun to offer transparency and access for users to see what has been inferred about them.
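For a flavour of how lightweight Do Not Track is compared to the Cookie Directive’s pop-ups: the browser sends a single `DNT: 1` header, and an operator that chooses to honour it simply skips its tracking path. The handler below is a hypothetical sketch, not any particular server’s implementation; only the header name and value come from the specification:

```python
def should_track(headers):
    """Honour Do Not Track: the browser signals opt-out with the header DNT: 1."""
    return headers.get("DNT") != "1"

def handle(headers, profiles, visitor_id, page):
    """Record the page view only when the visitor has not opted out."""
    if should_track(headers):
        profiles.setdefault(visitor_id, []).append(page)

store = {}
handle({"DNT": "1"}, store, "v1", "/a")  # opted out: nothing stored
handle({}, store, "v2", "/a")            # no preference expressed: tracked
assert "v1" not in store and store["v2"] == ["/a"]
```

The catch, of course, is the same one the post makes: honouring the header is voluntary, a customer request rather than a legal command.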

The invention of these technical solutions is a sign the market is adjusting to the new problems and incentives it now faces. Users, in their home lives, are adopting counteracting technical solutions to address the technology that their peers deploy in their work lives, and there is plenty more scope for this. I worry that by running to the Government, users have forcibly cut off the revenue supporting the online services they use, and when they are ready to eat their cake, they won’t have paid for it.