Something must be done – III

House of Commons postern, via the Commons site.

The two earlier posts (here and here), to which this is the third, related to harmful use of the internet, especially as it affects children, while another series of posts (here, here and here) related to the regulation of video games. In the same vein (but coming to it late – apologies) is a report published last month by the UK’s House of Commons Select Committee on Culture, Media and Sport, entitled Harmful content on the Internet and in video games. There is a balanced comment by Simon Walden on the Guardian blogs (see also BBC | OUT-Law | The Register | Times Online). Commenting on the Report, Light Blue Touchpaper says:

You will discern a certain amount of enthusiasm for blocking, and for a “something must be done” approach. However, in coming to their conclusions, they do not, in my view, seem to have listened too hard to the evidence, or sought out expertise elsewhere in the world …

Among the Report’s main conclusions and recommendations:

  • We agree that any approach to the protection of children from online dangers should be based on the probability of risk. We believe that incontrovertible evidence of harm is not necessarily required in order to justify a restriction of access to certain types of content in any medium. … It is sensible that parents set boundaries for their children’s online activities, but a totally risk-averse culture in parenting will not equip children to face dangers which they will inevitably encounter as they grow older. …
  • This is a good, balanced position, and it drew largely positive comments. For example, the Society for Computers and Law said of the report that “its message is a sensible one”; and womensgrid concluded that “Child safety should be of utmost priority … and any costs or technical limitations should be considered second to protecting children when it comes to the Internet”.

  • We strongly recommend that terms and conditions which guide consumers on the types of content which are acceptable on a site should be prominent. It should be made more difficult for users to avoid seeing and reading the conditions of use: as a consequence, it would become more difficult for users to claim ignorance of terms and conditions if they upload inappropriate content. …
  • This has not received as much commentary in the material linked to here as the comments about Google and YouTube, but it is in fact much more important from the perspective of the protection of consumers online. Too often, terms and conditions are ignored: they sit in a separate window that is never scrolled through, and the prominent “I agree” or “I accept” button is clicked automatically. Anything that alerts consumers to what they are agreeing to or accepting when they click that button is a Good Thing.

  • We believe that there is a need for agreed minimum standards across industry on take-down times in order to increase consumer confidence. …
  • There are many problems with the inconsistencies of various legal regimes (especially between defamation and copyright) when it comes to monitoring and taking down inappropriate, infringing or illegal material. The Committee recommended streamlining them (see Information Overlord and UKLiberty*). Again, from the perspective of consumers, this is the most practical of the standardising suggestions.

  • We await the announcement by the Ministry of Justice on whether the law might be strengthened to help prevent the use of the Internet to encourage suicide.
  • On this issue, in response to much over-reaction, a lot of good sense has been written by Andres Guadamuz on his fine TechnoLlama blog here, here and here.

  • … We recommend that social networking sites should have a default setting restricting access and that users should be required to take a deliberate decision to make their personal information more widely available. …
  • This is an important recommendation. Daithí has written much on social networking privacy issues relating to Facebook; the issue has made a splash in the mainstream (see, for example, Daniel Solove writing recently in Scientific American); and the International Working Group on Data Protection in Telecommunications (IWGDPT) issued some important recommendations (pdf) on the issue earlier this year. A helpful discussion is Susan B Barnes, “A Privacy Paradox: Social Networking in the United States” (2006) 11(9) First Monday, arguing that we “need to be more proactive about educating each other and protecting our privacy on the Internet”.

  • We believe that leaving individual companies in the Internet services sector to regulate themselves in the protection of users from potential harm has resulted in a piecemeal approach which we find unsatisfactory. … Instead, we propose a tighter form of self-regulation, applied across the industry and led by the industry. We therefore call on the industry to establish a self-regulatory body which would agree minimum standards based upon the recommendations of the UK Council for Child Internet Safety, monitor their effectiveness, publish performance statistics and adjudicate on complaints. …
  • The UK Council for Child Internet Safety was proposed by the Byron Review (blogged here); it is due to begin its work next month; and many of the recommendations in the Report are directed specifically to it. This tighter form of self-regulation almost amounts to co-regulation (on which concepts, see here; hat tip Chris), and it is the way that government is responding to the internet at the moment: the implication is that if the industry does not comply, resistance will be futile and a statutory regulatory body will be set up instead.

  • We recognise the concerns that the hybrid system for games classification proposed by Dr Byron may not command confidence in the games industry and would not provide significantly greater clarity for consumers. We believe that, ideally, a single classification system should be adopted. While either of the systems operated by the BBFC [link] and by PEGI [link] would be workable in principle, we believe that the widespread recognition of the BBFC’s classification categories in the UK and their statutory backing offer significant advantages which the PEGI system lacks. We therefore agree that the BBFC should continue to rate games …
  • I think this is too insular. The Pan-European Game Information (PEGI) age rating system was developed by the Interactive Software Federation of Europe (ISFE) in 2003 to help parents make informed decisions on buying interactive games. As games – online as well as hard media – are increasingly available from a range of suppliers not only in the UK, but also from elsewhere in the EU as well as in the US, it makes sense for there to be a single European port of call for software companies. This recommendation, if implemented, will unnecessarily fragment the regulatory structure.

All in all, the report has some important recommendations, even if it is a bit stodgy and not entirely unmoored from the “something must be done” perspective. But the basic position must be that a little common sense goes a long way, online as in the real world, and we mustn’t over-react simply out of fear of the unknown.




    * Please see the first two comments below.