NetHui 2011 - Session 4 - Forbidden knowledge

posted 28 Jun 2011, 22:06 by Unknown user   [ updated 28 Jun 2011, 22:22 ]

Forbidden knowledge: name suppression, filtering and censorship


  • Google's mission is to organise all the world's information and make it universally accessible and useful.

  • Trends in censorship – in 2002 about four countries were actively censoring the internet; by 2011 there are around 40. Google has services in around 150 countries, about 25 of which censor, filter or block content (YouTube especially).

  • Example – automatic filtering of suspected child abuse sites is generally accepted, but once the filtering technology is in place it is often then expanded to block other unpopular sites, such as WikiLeaks.

  • This "scope creep" is commonplace in other areas, for instance using language from copyright law (which is a civil offence) and putting it straight into criminal law (for enforcing suppression orders online).

  • There is a tendency for government to simply declare that ISPs should be responsible, but ISPs often have neither the resources nor the expertise to decide which sites should be blocked.

  • More nuanced filtering software can, for instance, block the images on a web page while still allowing the rest of the site to be viewed (a minimal sketch of this content-type approach appears after these notes).

  • Users need to understand that they can generally be tracked doing whatever they do online. One argument is that proactive monitoring of undesirable behaviour is more effective than blocking certain websites: the site is left unblocked, but everyone who accesses it, and what they view, is logged, and those viewing illegal material (e.g. child abuse images) can then be prosecuted.

  • Filtering breaks expectations of how the internet works; blocking access to some parts of it strikes at the heart of the online business model. Governments should not be disrupting the engineering the internet is built on, and tracking down child pornographers is more effective than merely blocking their access to websites.

  • Filtering is seen differently depending on what is filtered and how: blocking a website entirely is seen as much worse than Google simply choosing not to index it (so it can't be searched for, but can still be reached by clicking a link or typing in the URL).

  • Filtering can give a false sense of security: parents feel there is no need to monitor their children's internet use because they expect the filter to catch all undesirable content.

  • Anyone with name suppression has the right to ask any ISP or website to remove the name if it has been disclosed. The problem arises where the hosting company cannot be identified, or is in an overseas jurisdiction where court orders cannot be enforced. Suppression orders may also prevent victims from being named, yet often do not prevent identification of the offender, the area the victim lives in, the school they attend and so on, which can indirectly identify the victim despite the order – especially in small communities (such as much of NZ).

  • Google's results are themselves "filtered" in the sense that they are personalised: they depend on where the user lives, what they have previously searched for, what kinds of websites they frequent, and so on. This does not block access to any page; pages Google judges less relevant to you simply rank lower. There is no denial of access – access is just made easier to information Google predicts you will find useful – and user preferences can be changed to alter how Google weights your rankings (see the ranking sketch after these notes).

  • One unintended consequence – PhD students studying "spam" have found themselves unable to gather data because of automatic spam filters at several stages (e.g. at both the ISP and the email provider), with no opt-out and no listing of what was blocked.

  • Google will index most legal content but blocks content that is illegal in the relevant jurisdiction: child abuse material is blocked everywhere, while other material is blocked only where the relevant country bans it (e.g. all pornography in many Middle Eastern countries). In short, things that are not allowed in general are also not allowed on the internet. One argument against filtering, however, is that it presumes everyone is likely to do wrong unless the government watches them the whole time. A system that gives people the opportunity to "make mistakes" (i.e. access forbidden content) and punishes only those who choose to do so seems more consistent with the underlying values of a democratic society.

  • The idea behind name suppression is to make the suppressed name or information as hard as possible to find. Judges acknowledge that in practice the internet may make name suppression outdated, but if a suppression order means that finding the name involves laboriously trawling through un-indexed blog posts rather than simply typing it into Google, the order is still broadly achieving its aim. On the other hand, a judge acknowledged that other methods may now be needed to ensure a fair trial, such as questioning jurors about what (if anything) they already know of a case before any evidence is heard, so that those who know things or hold beliefs that would prejudice a fair trial can be excluded.

  • Google anonymises search query logs after six months, and users can view and wipe their own search history if they don't want targeted ads or "shaped" results (see the anonymisation sketch after these notes). There is also a general policy of ranking newer information higher, which is why running the same search a few weeks later may return completely different results; in theory the old results are still available, but they may have moved so far down the rankings that they are very hard to find.

  • Is the "right to be forgotten" the new privacy? If an unfiltered web is also transparent and permanent, this makes the concept of privacy much more limited, and the issue of embarrassing information resurfacing many years after it was originally posted is far easier and more likely to occur when everything is indexed online. Particular issue with the increasingly younger ages that today's digital natives are signing up to social networking sites, forums etc - is it fair for adults to be penalised for things they said or did many years ago, when they were well below the legal age of adulthood? 
