[Content Filter Applied – For Your Own Protection]

I was away over the weekend. As a result, I didn’t see what had been hitting the news until I got home late yesterday evening. And what did I see? David Cameron making his latest asinine contribution to Global Warming purely by dint of opening his mouth. Here’s the speech that outlines this.

There was an attempt to create a faint hint of menace in what was said to the Internet companies yesterday regarding content filtering. The Prime Minister seemingly tried to adopt the tone of a couple of heavies sloping into a new landlord’s pub and advising him that complying with “local customs” might be in his best interests, however financially inconvenient it might be.

What it actually sounded like was him proposing a law to stop international drug dealers from using the Exchange and Mart to sell big bags of lovely heroin. It’s likely to be about as effective.

Part of the problem is that Mr Cameron seems rather too quick to turn his ear towards the execrable Claire Perry MP (for those not aware of her, think of a latter-day Mary Whitehouse, but without even her charm and humour), whose efforts at opt-out filtering have struck many with any knowledge of the online world as fundamentally misunderstanding its whole mechanism and culture. There are a number of problems with what is being proposed, but let’s run through the major ones as I see them:

  1. The Prime Minister says:

    “The internet is not just where we buy, sell and socialise – it is where crimes happen and where people can get hurt and it is where children and young people learn about the world, each other, and themselves. The fact is that the growth of the internet as an unregulated space has thrown up two major challenges when it comes to protecting our children.”

    Well, this would almost sound plausible if the Internet were an unregulated space, but it is not. Crimes happen there, and because crimes happen there, those who commit them can be prosecuted. Sometimes deciding upon jurisdiction is difficult, but it is not impossible. These behaviours are only criminal because we have already decided to label them so. The biggest problem is not deciding what the crime is, but acting on its commission. We have learned that material related to illegal activity within the relevant jurisdictions will be released to the authorities given sufficient due cause, so this should not be an obstacle to prosecuting those responsible. Further, it is the publisher of the content who is liable and who should be pursued. And the search engines are not the publishers.

  2. The Prime Minister says:

    “And I mean ‘we’ collectively – government, parents, internet providers and platforms, educators and charities.”

    True, but I rather think that the proposals, as they are laid out, fundamentally unbalance the relationship between areas of responsibility and power. I am the father of a young daughter. I believe it is my responsibility to teach her what is and is not right to look at, and how to be aware of the dangers of internet usage, in precisely the same way as it was (and remains) my responsibility to teach her about talking to strange people or crossing the road safely. Her school can help, of course, but ultimately, outside of the classroom, the power and the responsibility belong to me.

  3. The Prime Minister says:

    “The police and CEOP – that is the Child Exploitation and Online Protection centre – are already doing a good job in clamping down on the uploading and hosting of this material in the UK.

    Indeed, they have together cut the total amount of known child abuse content hosted in the UK from 18 per cent of the global total in 1996 to less than 1 per cent today.

    They are also doing well on disrupting the so-called ‘hidden internet’, where people can share illegal files and on peer-to-peer sharing of images through photo-sharing sites or networks away from the mainstream internet.

    Once CEOP becomes a part of the National Crime Agency, that will further increase their ability to investigate behind pay walls to shine a light on the hidden internet and to drive prosecutions of those who are found to use it.”

    Well, this is nice. Or would be, if CEOP were not having its funding squeezed. And I wouldn’t be overly enthused about the NCA either, given the performance of SOCA.

    He says that “They are also doing well on disrupting the so-called ‘hidden internet’”. This is very difficult to substantiate, given the nature of the dark web and other, non-web internet activities. The whole point is that these activities are clandestine and very difficult to detect for anyone outside the target groups. It is hard to know even the full extent of such a region, so where did this “doing well” line come from? “Trying their hardest”, maybe, but that is not the same thing at all.

    This argument also presupposes that most of the activity under consideration is going on over the web. This is a dangerous and foolish supposition. It’s difficult to see what Google or Bing could do, for example, to stop groups swapping material via IRC over encrypted channels on a VPN; the sketch below shows quite how far from the web such traffic sits. Because the Internet is more than just the web. Much more. And it certainly cannot be policed in the way that the Prime Minister’s faint echoes of George Dixon are meant to reassure us it can.
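    To make that point concrete, here is a minimal sketch (the server and channel names are hypothetical) of what an IRC session actually is: a few lines of text over a TLS-wrapped TCP connection. Nothing in it is a web page, so there is nothing for a web search engine to index, filter, or blacklist.

    ```python
    # Minimal sketch of IRC over TLS -- the server and channel names are
    # hypothetical. IRC is plain lines of text over a TCP socket; no web
    # pages are involved, so no web search engine ever sees this traffic.
    import socket
    import ssl

    HOST = "irc.example.net"  # hypothetical server
    PORT = 6697               # conventional port for IRC over TLS

    context = ssl.create_default_context()
    with socket.create_connection((HOST, PORT)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=HOST) as sock:
            # Register a nickname, join a channel, read the server's reply.
            sock.sendall(b"NICK example\r\n")
            sock.sendall(b"USER example 0 * :example\r\n")
            sock.sendall(b"JOIN #example\r\n")
            print(sock.recv(4096).decode(errors="replace"))
    ```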

  4. The Prime Minister says:

    “So the search engines themselves have a purely reactive position.

    When they’re prompted to take something down, they act.

    Otherwise, they don’t.

    And if an illegal image hasn’t been reported – it can still be returned in searches.

    In other words, the search engines are not doing enough to take responsibility.”

    But just a few moments earlier, his speech said that they [the search companies] are taking steps to help connect the dots in searching for such material. The search engines are accused of being reactive. This should not be a surprise, because that’s precisely what they are: they are designed to be reactive. The IWF database works primarily because it is human-reported and followed up. Automated processes to screen for such content are neither reliable enough nor quick enough in real time to be entirely effective. What would be acceptable false positive and false negative rates for such screening? So the government did the only sensible thing it could with CEOP and cut its funding.
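    The reactive model is easy to see in miniature. Lists like the IWF’s work by matching content against entries a human has already found, verified and reported. Here is a minimal sketch of that kind of hash-list check (the file path and digest are hypothetical): note that it can only ever match byte-for-byte copies of material that has already been reported.

    ```python
    # A minimal sketch of hash-list matching; the file path and digest are
    # hypothetical. The check is reactive by design: it can only match
    # content a human has already found, verified, and reported.
    import hashlib

    def file_sha256(path: str) -> str:
        """Return the SHA-256 hex digest of a file, read in chunks."""
        digest = hashlib.sha256()
        with open(path, "rb") as fh:
            for chunk in iter(lambda: fh.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # Hypothetical blocklist of digests of known, human-verified material.
    known_hashes = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def is_known(path: str) -> bool:
        # One changed byte in the file and the match silently fails --
        # exactly the false-negative problem raised above.
        return file_sha256(path) in known_hashes
    ```

    Perceptual hashing can survive recompression and resizing, but it trades those false negatives for false positives, which brings us straight back to the question of what error rates would be acceptable.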

    Search engine systems are algorithmic and are designed to index content that site owners have asked them to spider (via registering the domain with a search engine, then adding mark-up and configuration options for the spidering software to find and read). But it’s just as easy for systems administrators to configure their servers to refuse connections from search engines, or even to misreport their contents. There is little the search engine companies can do about this.
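    The main mechanism here is the Robots Exclusion Protocol: a well-behaved crawler fetches a site’s robots.txt file and obeys it before requesting anything else. The sketch below (the site name is hypothetical) uses Python’s standard library to show the crawler’s side of that bargain; the crucial point is that the rules are written by the site’s administrator, not by the search engine.

    ```python
    # The crawler's side of the Robots Exclusion Protocol, using the
    # standard library. The site name is hypothetical. One line added to
    # robots.txt by the site's administrator makes whole sections of a
    # site invisible to any well-behaved search engine.
    from urllib import robotparser

    parser = robotparser.RobotFileParser()
    parser.set_url("https://www.example.org/robots.txt")
    parser.read()  # fetch and parse the site's robots.txt

    # Ask whether a given user agent may fetch a given path.
    print(parser.can_fetch("Googlebot", "https://www.example.org/private/"))
    print(parser.can_fetch("*", "https://www.example.org/index.html"))
    ```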

  5. The Prime Minister says:

    “So here’s an example.

    If someone is typing in ‘child’ and ‘sex’ there should come up a list of options:

    ‘Do you mean child sex education?’

    ‘Do you mean child gender?’

    What should not be returned is a list of pathways into illegal images which have yet to be identified by CEOP or reported to the IWF.

    Then there are some searches which are so abhorrent and where there can be no doubt whatsoever about the sick and malevolent intent of the searcher that there should be no search results returned at all.

    Put simply – there needs to be a list of terms – a black list – which offer up no direct search returns.

    So I have a very clear message for Google, Bing, Yahoo and the rest.

    You have a duty to act on this – and it is a moral duty.”

    Does the Prime Minister realise exactly how technically difficult the problem of “What should not be returned is a list of pathways into illegal images which have yet to be identified by CEOP or reported to the IWF” actually is? Most of the major search engines judge relevance largely by statistical methods – looking for clusters of common words that occur together in specific places. The semantic awareness of these applications, though it appears impressive, is actually quite limited, and even more so when one considers non-textual data. These things are not impossible, but they are costly: costly in terms of time and processing power. Even then they are not foolproof, which means some level of human intervention is needed as a check. And given the sheer amount of data this applies to, checking it all proactively is a practically impossible task. The toy example below shows why word statistics alone cannot distinguish intent.
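    To illustrate (this is a toy, nothing like a production ranking system): a simple bag-of-words relevance score will rate perfectly innocent documents as highly relevant to the very query the Prime Minister uses as his example, because counting word co-occurrence measures vocabulary overlap, not the searcher’s intent.

    ```python
    # A toy bag-of-words cosine similarity -- nothing like a production
    # ranking system, but it shows the core problem: word statistics
    # measure vocabulary overlap, not the searcher's intent.
    from collections import Counter
    from math import sqrt

    def cosine(a: Counter, b: Counter) -> float:
        """Cosine similarity between two word-count vectors."""
        dot = sum(a[w] * b[w] for w in a)
        norm = sqrt(sum(v * v for v in a.values())) * \
               sqrt(sum(v * v for v in b.values()))
        return dot / norm if norm else 0.0

    query = Counter("child sex".split())
    doc_a = Counter("advice on child sex education for parents".split())
    doc_b = Counter("child gender and sex differences in development".split())

    # Both entirely innocent documents score well against the query;
    # the counts alone say nothing about what the searcher meant.
    print(cosine(query, doc_a))  # ~0.53
    print(cosine(query, doc_b))  # ~0.53
    ```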

But then, the entire tone of the speech bothers me anyway. Part of me still thinks of this as a rather paternalistic hand on the shoulder, telling us exactly what our best interests are, without letting us decide that for ourselves. It bothers me that a party supposedly so committed to libertarian ideals in much of its social policy in its election manifesto has suddenly decided to play a much more authoritarian game in the light of the small number of high-profile cases that have appeared in an increasingly hysterical media. The cognitive dissonance experienced by some at the Daily Mail must be fairly staggering: on the one hand they campaign to “ban this filth”, while quite happily splashing pictures of under-age girls in various states of undress. It makes for fairly queasy reading.

When the option comes, as it one day may, I will opt for no filtering. And I will do this simply because I feel I should be the one deciding what I should and should not see. Those who choose to break the law should be prosecuted under its full force; they will receive little sympathy from me if they are. But the introduction of filtering of this type is open to future abuse, and is a situation we should be very careful about allowing at all. We do it at our peril.


One thought on “[Content Filter Applied – For Your Own Protection]”

  1. There is of course the problem that, if the major search engines did start blocking such terms, the people responsible for producing and distributing such content would just set up their own independent search engine (many already exist for similar sub-cultures) and there would be nothing the major search engines (or indeed the ISPs) could do about them – and many of them are hosted in countries with no laws against such content.

    There are of course many people who reckon that this is just the first step to outright government censorship of the Internet, and we all know how well that works in other countries.
