The Intermediary Service Providers provisions of the E‑Commerce Directive, its UK implementation, and relevant cases, from the point of view of UK ISPs

Introduction

The Directive on Electronic Commerce[1] was passed on 8th June 2000 and was required to be transposed by 17th January 2002. It established various legal principles for electronic commerce throughout the European Community, of which Articles 12 to 15 addressed the liability of intermediary service providers.

This essay considers these provisions from the viewpoint of a typical UK Internet Service Provider (throughout, unqualified references to the directive mean only these four articles). It examines the original directive, the transposition into UK law, and the practical effects of the provisions on the day-to-day operations of UK ISPs, and looks at what little case law exists in this area.

ISPs are looking for certainty in the law so that they can simply carry on with their business. However, as I show, they are instead forced to speculate about how their activities are affected by this legislation.

The provisions of the directive

The relevant provisions of the directive are best described as "anti-liability" provisions: they specify three classes of situation where an intermediary service provider is exempt from all civil and criminal liability (but not court orders or other administrative actions relating to specific infringements). In summary[2]:

  1. "Mere conduit" (article 12): a provider that transmits information provided by a recipient of the service, or provides access to a network, is not liable for the information transmitted, provided it does not initiate the transmission, select the receiver, or select or modify the information.
  2. "Caching" (article 13): a provider is not liable for the automatic, intermediate and temporary storage of such information, performed solely to make its onward transmission to other recipients more efficient, subject to conditions including not modifying the information and acting expeditiously once it obtains actual knowledge that the information has been removed at its source or that a court or authority has ordered its removal.
  3. Hosting (article 14): a provider storing information at the request of a recipient is not liable for it provided it has no actual knowledge of unlawful activity or information (and, as regards damages, no awareness of facts from which this is apparent) or, on obtaining such knowledge, acts expeditiously to remove the material or disable access to it.

In addition, article 15 forbids Member States from imposing a general duty to seek out unlawful material or monitor the material transmitted and/or stored.

Lack of horizontal effect

The directive is supposed to be "horizontal"; that is, the specific subject matter of the content that would create liability is irrelevant. It matters not whether it is a libellous statement, a breach of copyright, or criminals conspiring. However, this is not quite the case: Article 1 contains certain exemptions, such as gambling[3]. One objective of the directive is that ISPs should not need to examine every communication pro-actively in search of unlawful material; apart from anything else, the sheer scale makes this impractical. Excluding gambling content from the liability provisions destroys the spirit of the provisions, if not the letter, since ISPs would still have to check everything in case it relates to gambling and creates a liability for them. However, the DTI was of the opinion that the "limitations on intermediary service providers' liability [apply] to information about these activities"[4] and, therefore, that the directive is indeed horizontal.

Problems with the UK transpositions

The UK transposition of the directive contains a range of problems. To start with, despite the deadline of 17th January 2002 set by article 22.1, the corresponding UK regulations[5] did not come into force until 21st August[6], over seven months later; during this time ISPs did not have the protection the directive should have given them[7]. This appears to be entirely due to government tardiness: the DTI consultation was not launched until August 2001 and ended on 2nd November, while the draft regulations were only published in March 2002 with a second consultation that ran until May[8]. Publication of the final Regulations was further delayed until 1st August.

Secondly, the actual regulations - but not the draft ones - limit their effect to existing legislation:

3(2) These Regulations shall not apply in relation to any Act passed on or after the date these Regulations are made or in relation to the exercise of a power to legislate after that date.[9]

This limitation, which has become known as "the prospective effect issue", does not correspond to anything in the directive and has the effect of making the UK transposition far from horizontal, expanding the issue with gambling content described above to cover a wide range of topics. A significant concern of ISPs is that, where they are aware that their networks are used in the commission of an offence or tort - an obvious example being the breach of copyright via "peer-to-peer" systems - they might be liable for aiding it. When the lack of prospective effect was raised with the DTI, their response[10] was that all future legislation would contain wording to ensure that the relevant effects of the directive applied; it would all be "e-commerce-ready".

However, when we examine the record, things turn out to be very different: only four laws even attempt the task. Firstly, two statutory instruments[11] exempt themselves from regulation 3(2) by providing:

The Electronic Commerce (EC Directive) Regulations 2002 shall apply to this Order notwithstanding Regulation 3(2) of those Regulations.[12]

Next, s.34 of the Serious Crime Act 2007 forbids a "serious crime prevention order" from imposing liabilities or an obligation to monitor that would be excluded by the directive, doing so by a direct reference to it[13]. Finally, s.166A of the Criminal Justice and Public Order Act 1994[14] attempts to provide defences for the offence of "ticket touting" through very different wording:

(3) A service provider is not capable of being guilty of an offence under section 166 in respect of anything done in the course of providing so much of an information society service as consists in -
 (a) the transmission in a communication network of information falling within subsection (4), or
 (b) the storage of information provided by a recipient of the service,
except where subsection (5) applies.

(4) Information falls within this subsection if -
 (a) it is provided by a recipient of the service; and
 (b) it is the subject of automatic, intermediate and temporary storage which is solely for the purpose of making the onward transmission of the information to other recipients of the service at their request more efficient.

(5) This subsection applies at any time in relation to information if -
 (a) the service provider knew when that information was provided that it contained material contravening section 166; or
 (b) that information is stored at that time (whether as mentioned in subsection (3)(b) or (4)) in consequence of the service provider's failure expeditiously to remove the information, or to disable access to it, upon obtaining actual knowledge that the information contained material contravening section 166.

Unfortunately this wording is badly flawed. Subsection (3)(a) appears to relate to "mere conduit" but, when combined with the "and" in (4), it turns out only to provide a defence for the transmission of material when that material is also cached: paragraph (4)(b) requires the transmitted information to be the subject of caching-style storage, so plain transmission is left unprotected. Furthermore, applying (5) means that the Article 14 rules on knowledge apply rather than the Article 12 or 13 rules. It also fails to bring in regulations 20 to 22, which set out the burden of proof in criminal matters and the relevant tests for "actual knowledge".

The other "solution" has been to use statutory instruments to apply the defences separately. This has been done seven times: three of them[15] use the "nothwithstanding" wording to extend the regulations to four additional Acts and 12 statutory instruments. The other four, however, add specific defences to specific legislation and, furthermore, do so in three more different ways. Confining examples to the "mere conduit" defence, regulation 17 copies article 12 of the directive with only minor wording changes. However, the corresponding defence for the Financial Services and Markets Act 2000 (Financial Promotion) Order 2001[16] ignores this and instead refers directly to the directive[17]. Again the matters addressed in regulations 20 to 22 are omitted. It also excludes temporary storage as part of the "mere conduit" defence (since this is not in article 12.1) while bringing agents of the ISP under the umbrella of the "hosting" defence (they are excluded by article 14.2[18]) - arguably this last is ultra vires for such secondary legislation. Finally, it only applies the directive to certain issues (the "financial promotion restriction"), not the entire Act.

The defences for the Adoption and Children Act 2002[19] make some textual changes that, hopefully, do not affect the meaning[20] and, again, apply the defences only to one specific offence[21]. The drafters assumed that this offence is the only one that can be committed by an intermediary, even though other examples were raised[22].

The two most recent instances apply the directive to the Public Order Act 1986[23] and the Terrorism Act 2006[24]. Both use identical wording referring to "a relevant offence" - once again, only certain offences are covered by these defences. They also add a new condition:

(3) Paragraph (1)(b) does not apply if the information is information to which regulation 6 applies.

(Regulation 6 is the related "caching" defence.) This means that where material might fall under both defences, an ISP can rely only on the latter, weaker, one, whereas the directive and the original transposition allow it to rely on either.

A further concern about prospective effect is how it interacts with legislation passed before the key date but amended afterwards. For example, s.160 of the Criminal Justice Act 1988 criminalises possession of indecent photographs of children under 16, but in 2004[25] the age was raised to 18. Does the hosting defence apply to indecent images of 17-year-olds or not? Some of the relevant statutory instruments[26] apply the defences to older legislation amended in 2003, which suggests that it is the date of the last change that matters.

Overall, the present approach makes it difficult for ISPs to determine when they are or are not protected, means that they cannot use the directive to guide their actions, and places the UK in breach of its EU obligations. Prospective effect has been raised regularly in discussions between industry and officials ever since the regulations were published. Officials often state that a solution is "in the works", but at the time of writing nothing has materialised.[27]

"Mere conduit" activities

A significant portion of the business of an ISP is covered by the "mere conduit" defence of article 12. For example, when someone looks at a web page hosted outside their ISP's network, the various communications involved are all covered by this defence. For these purposes, it is generally assumed that "does not select or modify the information contained in the transmission" applies to the actual content and not the exact data, so that purely technical changes to packets[28] are not relevant. This is not explicit in article 12 but is supported by recital 43[29], which excludes "manipulations of a technical nature". Doubtless a court would take this interpretation when applying regulation 17 or 18 in a case, since any other approach would mean they could never apply[30], contradicting the policy that legislation should be read so as to have an effect.
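
To make the point in note 30 concrete, the following sketch (in Python, with invented field names rather than those of any real networking library, and omitting details such as recomputing the header checksum) shows the per-hop handling every router performs: the "time to live" is decremented, so every packet is altered in the technical sense, while the information actually being transmitted, the payload, is untouched. That is precisely the distinction recital 43 draws.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Packet:
    src: str        # originating address
    dst: str        # destination address
    ttl: int        # "time to live", decremented at every hop
    payload: bytes  # the information contained in the transmission

def forward(packet: Packet) -> Optional[Packet]:
    """Forward a packet one hop, as any router must."""
    if packet.ttl <= 1:
        return None  # discarded: this prevents packets circulating for ever
    # Only the technical envelope changes; the payload is passed on unaltered.
    return Packet(packet.src, packet.dst, packet.ttl - 1, packet.payload)

hop = forward(Packet("203.0.113.5", "198.51.100.7", ttl=64, payload=b"GET / HTTP/1.1"))
assert hop is not None and hop.ttl == 63 and hop.payload == b"GET / HTTP/1.1"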

One significant phenomenon of the last few years is the unlawful sharing of copyright material through "peer-to-peer" networks. These systems transfer material directly from one user to another and nothing is stored (except transient storage of individual IP packets) on ISP systems. ISPs regularly receive complaints from rightsholders that their customers are taking part in such sharing. The evidence provided tends to be scanty at best and impossible for the ISP to verify. Therefore ISPs will rely on the "mere conduit" defence and not take any action in response to these complaints. This position is currently under attack in the Irish courts[31] and it has previously been suggested that ISPs should lose this protection[32].

More significantly, there has recently been an important decision in the Belgian courts which has caused the industry to reassess this area. In SABAM v Scarlet[33], the court found the defendant ISP liable for copyright infringement by allowing peer-to-peer file sharing after it had been notified, and required the ISP to install monitoring and blocking systems that would prevent such sharing. In this decision, the court ignored article 12 because the case concerned a "cease-and-desist" injunction rather than damages; article 12.3 explicitly allows for these. More interestingly, it also decided that article 15, preventing the imposition of a general duty to search for unlawful material, did not apply here. The novel theory[34] is that the purpose of article 15 is to prevent ISPs being liable for failing to search out unlawful activity and, since a cease-and-desist order does not imply fault on the part of the ISP, placing this requirement on the ISP does not amount to imposing a liability on them. The problem with this theory is that, once the monitoring system is in place, the ISP may well have constructive knowledge of infringements, harming their defence in subsequent cases. Furthermore, it is not supported by the plain words of article 15.

This case appears badly decided, partly because the defendant was a small company unable to afford to assemble a good defence. Scarlet is now part of Tiscali and commentators expect these issues to be argued more fully on appeal, giving a definitive interpretation.

The one area where ISPs typically do break the principles of the mere conduit defence is filtering systems, whether for child abuse images, spam, or viruses. These systems examine outgoing requests or incoming traffic and, where certain criteria are matched, block or reject the material, thus breaching the conditions of article 12[35]. However, if an ISP were to block one message containing a virus, an untested question is how far the resulting liability stretches. Possibilities include liability for:

  1. any harm done by blocking that message;
  2. any message passing through that virus filter;
  3. any transmission made using the service or product that utilises that filter;
  4. any transmission made using any service the ISP provides (e.g. where the ISP has more than one type of account and only some types are filtered).

These filtering systems tend to be installed either out of self-defence - sufficient unfiltered spam can make an ISP's service unusable - or, in the case of the blocking of child abuse images, through political pressure[36]. To date the liability issue has not been tested and remains uncertain; BT's position[37], for example, is that if such a system were found to create a significant risk they would disable it. The topic was also raised in the Scarlet case, where the judgement stated that filtering and blocking were not "selection"[38].
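
As an illustration of the kind of examination-and-blocking step described above (a minimal sketch only, not any ISP's actual system; the signature pattern is invented), a filter compares each message against known patterns and refuses to pass anything that matches. It is precisely this act of selecting what to carry that sits uneasily with the conditions of article 12.

SIGNATURES = [b"EXAMPLE-VIRUS-SIGNATURE"]   # purely illustrative pattern, not a real signature

def filter_message(message: bytes) -> bool:
    """Return True if the message may be delivered, False if it is to be blocked."""
    return not any(signature in message for signature in SIGNATURES)

assert filter_message(b"an ordinary newsletter")
assert not filter_message(b"attachment containing EXAMPLE-VIRUS-SIGNATURE")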

Email

Outgoing email fits squarely within the article 12 regime: the ISP merely transmits the message from customer to destination. If the email is stored temporarily on a "smarthost"[39], this meets the "automatic, intermediate and transient storage" test of article 12.2.
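
The following sketch (simplified, and not real mail software; the addresses are invented) illustrates the store-and-forward step a smarthost performs, as described in note 39: a message is held only until it can be delivered onwards or must be retried, which is the kind of storage article 12.2 contemplates.

import collections

queue = collections.deque()                 # messages awaiting onward delivery

def submit(message):
    """Accept a message from a customer; transient storage begins here."""
    queue.append(message)

def delivery_run(deliver):
    """Attempt delivery of everything queued, retaining only what must be retried."""
    for _ in range(len(queue)):
        message = queue.popleft()
        if not deliver(message):            # e.g. the destination said "try again later"
            queue.append(message)           # kept no longer than reasonably necessary

submit({"to": "bob@example.net", "body": "hello"})
delivery_run(lambda message: True)          # pretend the destination accepted the message
assert not queue                            # nothing remains once transmission is complete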

However, incoming email is more complicated and commonly falls into one of three cases.

  1. Some customers have email sent directly to their own systems using the same processes as outgoing mail. This also falls under article 12.
  2. A customer's email is sent to a mailbox[40] on the ISP's systems and the customer downloads and deletes the emails from time to time. This is covered by article 12, except it could be argued that, since the email might be stored for days or weeks[41], the storage is not "transient". If so, the protection would move to article 14.
  3. The email is sent to a mailbox and the customer reads messages there, retaining or deleting them at her whim (this is the approach followed by "webmail" systems). This is clearly covered by article 14 rather than article 12. This means that, for example, if an ISP became aware that a customer had received a libellous email, they could be asked to remove it or else become liable for it. I am not aware of any UK ISP receiving such a request.

In both directions, email systems insert a "Received" header to allow the route taken by an email to be traced. As before[42], this is a technical change that does not invalidate the "mere conduit" status.
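
A sketch of what that tracing looks like in practice: each relay prepends its own "Received" header, so reading the headers from the bottom up reconstructs the route. The code below uses Python's standard library email parser; the host names, addresses, and timestamps are invented for illustration.

import email

raw_message = (
    "Received: from mail.example.net (192.0.2.10) by smarthost.example-isp.net; Fri, 4 Apr 2008 10:15:02 +0100\n"
    "Received: from customer.example.org (198.51.100.23) by mail.example.net; Fri, 4 Apr 2008 10:14:57 +0100\n"
    "From: alice@example.org\n"
    "To: bob@example.net\n"
    "Subject: test\n"
    "\n"
    "Hello\n"
)

message = email.message_from_string(raw_message)

# The oldest hop is the last "Received" header, so reverse them to list the route in sending order.
for hop in reversed(message.get_all("Received", [])):
    print(hop.split(";")[0].strip())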

Usenet

Usenet was possibly at one time the biggest single service on the Internet. It has sometimes been described as a sort of bulletin board, but is in fact rather different. Usenet is based around thousands of servers[43] connected via a peer-to-peer network using NNTP[44]. Total statistics are hard to find, but as an example BT's servers are currently handling about 6 million articles, averaging around 280 kilobytes each, daily[45].

A Usenet article is written by a user with newsreading software. The article is assigned to one or more "newsgroups" by listing them in the article header (there are currently around 40,000 such groups[46]) and is then offered to the news server the user normally uses, which will allocate a unique identifier to the article[47]; this server might be one operated by their ISP, a commercial service such as Supernews[48], or the server operated by Google under the name "Google Groups"[49]. This initial server will then offer the article to other servers (its "newspeers"), which take it and in turn offer it to further servers, in what is known as the "flood fill" algorithm (the unique identifier allows each server to determine whether it already has a copy and therefore reject the offer). Individual users then download articles in the newsgroups they are interested in from their local news server. Not all servers store articles in every group[50], and servers "expire", or delete, articles to make room for new ones, but apart from this an identical copy[51] of every Usenet article is available on every Usenet server in the world.
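
A minimal sketch of the "flood fill" distribution just described (the class and method names are invented for illustration; a real server exchanges the NNTP commands of RFC 3977 rather than calling Python methods): the unique message identifier lets each server reject offers of articles it already holds, so propagation terminates while every server still receives one copy.

class NewsServer:
    def __init__(self, name):
        self.name = name
        self.articles = {}   # message identifier -> article text (kept until expiry)
        self.peers = []      # neighbouring servers ("newspeers")

    def offer(self, message_id, article):
        """Handle an article offered by a posting user or by a peer."""
        if message_id in self.articles:
            return False                     # already seen: reject the offer
        self.articles[message_id] = article  # store a copy
        for peer in self.peers:
            peer.offer(message_id, article)  # offer onwards; duplicates die out here
        return True

# Three servers connected in a ring: the article still reaches each one exactly once.
a, b, c = NewsServer("a"), NewsServer("b"), NewsServer("c")
a.peers, b.peers, c.peers = [b, c], [c, a], [a, b]
a.offer("<unique-id@news.example.com>", "Newsgroups: misc.test\n\nhello world")
assert all("<unique-id@news.example.com>" in server.articles for server in (a, b, c))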

The natural question for an ISP to ask is how to analyse the directive in respect of Usenet. There are two obvious approaches to take. The first is to argue that articles are stored on each server and are therefore "hosted" material covered by article 14. As discussed below, this is the course taken when the topic has, albeit marginally, come before the courts. The question with this approach is whether it fits the facts. Article 14 applies to a service "that consists of storage of information provided by a recipient of the service". If someone in Brazil writes an article and sends it to their local Usenet server, they will expect it to be copied to other servers, but have they actually requested it to be stored on the UK server, particularly given that the definition of "information society service" includes "at the individual request of a recipient of services"[52]?

The alternative analysis is to consider Usenet as "caching" under article 13[53]:

the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information's onward transmission to other recipients of the service upon their request,

The purpose of Usenet is to make articles available to users around the world. The storage on servers is just part of that purpose, because at the time Usenet was created it was thought impractical to have every user fetch every article they want from the author's computer or her ISP's server. Therefore it meets the test of "for the sole purpose of making more efficient ..." and of being intermediate storage. Usenet servers also normally work unattended and certainly do not require administrators to select individual articles to be forwarded, thus meeting the "automatic" test. Finally, the fact that articles are expired to make way for new ones satisfies the "temporary" test. Thus it would appear that Usenet fits article 13.

What is the effect of this? Under article 14, when the host becomes aware of the unlawfulness of a stored item, they must expeditiously remove it or disable access to it, or else they become liable for it. Under article 13, however, this test is changed to:

obtaining actual knowledge of the fact that the information at the initial source of the transmission has been removed from the network, or access to it has been disabled, or that a court or an administrative authority has ordered such removal or disablement[54]

The last part of this would require a specific written notice, which leaves the first part. I argue that "removed from the network" happens only when the author of the article specifically issues a "cancel" article - a special form of article which informs each server that it reaches that the previous article should be removed[55]. Usenet servers can carry out this action automatically, though this is sometimes disabled because of the prevalence of forged cancellations. However, for an ISP to rely on the article 13 defence they would need to honour cancellations in order to meet:

(c) the provider complies with rules regarding the updating of the information, specified in a manner widely recognised and used by industry;[56]

There would be no difficulty in meeting requirements (a), (b), and (d)[57].
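
To show how requirement (c) might be satisfied in practice, here is a minimal, self-contained sketch of automatic handling of a "cancel" instruction (see note 55). A real server would parse the Control header of an incoming NNTP article and would normally apply policy checks against forged cancellations; both are omitted here, and the names are invented for illustration.

def apply_cancel(article_store, control_header):
    """Remove the article named in a 'cancel <message-id>' control line."""
    if control_header.startswith("cancel "):
        message_id = control_header.split(None, 1)[1].strip()
        article_store.pop(message_id, None)  # the "updating" step relied on in the text

store = {"<unique-id@news.example.com>": "Newsgroups: misc.test\n\nhello world"}
apply_cancel(store, "cancel <unique-id@news.example.com>")
assert "<unique-id@news.example.com>" not in store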

Of course, there is nothing in the directive that requires any given situation to be limited to one of the three relevant articles. Thus a Usenet server operator could rely on both articles 13 and 14 or, more precisely, whichever is more advantageous for a given set of facts.

Godfrey v Demon

While material on Usenet has been involved in a number of cases, there appear to only have been two that address the legal situation of a Usenet server operator. These are the pre-directive case of Godfrey v Demon[58] and the more recent Bunt v Tilley[59].

Godfrey notified Demon that a specific Usenet article was defamatory of him and requested that they remove it from the server. Demon failed to do so. Godfrey therefore sued for libel in respect of the period between receipt of the notice and the normal expiry some 10 days later. In an action to strike out their defence that they were not publishers, Morland J concluded that Demon were publishers within the meaning of the Defamation Act 1996 but, until put on notice, could rely on the defence of "innocent dissemination" within s.1 of that Act. He also wrote:

I do not accept Mr Barca's argument that the defendants were merely owners of an electronic device through which postings were transmitted. The defendants chose to store "soc.culture.thai" postings within their computers. Such postings could be accessed on that newsgroup. The defendants could obliterate and indeed did so about a fortnight after receipt.[60]

The clear implication here is that he saw Usenet server management as an active task rather than an automated process, whereas ISPs believe the direct opposite.

Demon subsequently apologised and paid damages while taking the view that the law had not kept up with technology[61]. Godfrey had two effects on ISP practice. Firstly, there was an outbreak of complaints about defamatory Usenet articles; in general, ISPs chose to remove the articles rather than attempt to determine whether there was a potential liability. Secondly, ISPs became concerned about "repeat offenders": customers who write a defamatory article that is complained about but then write another one. Does the knowledge of the first article mean that, in the case of the second one, the ISP cannot claim that "he did not know, and had no reason to believe, that what he did caused or contributed to the publication of a defamatory statement"[62]? The practical approach taken was to require the customer to sign a statement promising not to make defamatory statements again before their ability to post Usenet articles was restored.

So how would Godfrey be decided today? If the court accepted that Usenet was actually a form of caching, then Demon would be protected by article 13 and Godfrey would have had to ask for an injunction to have the article removed from the original server[63]. If the court rejected this argument, Demon would have received "actual knowledge", negating article 14, and so the case would have continued along the same lines as it actually did.

Bunt v Tilley

This is the only UK case to directly address these provisions of the directive. Bunt sued three defendants over allegedly defamatory material they placed on the Internet but, more significantly to this essay, he also sued three ISPs over their role. Eady J decided that none of the three were publishers within the meaning of the Defamation Act, but then proceeded to discuss the application of the directive at length[64]. He first dismisses an intriguing theory from the claimant:

"Simple logic dictates that to be an INTERMEDIARY service provider one must be a service provider who is BOTH customer of an upstream service provider and supplier to a downstream service provider". Yet it is not a question of logic but of definition. No such restrictive definition appears in the regulations. Nor would it accord with the declared policy underlying them.

and then analyses, at paragraphs 40 to 45, what kinds of body and process the directive aims to protect.

He then moves to regulations 17 and 18 on "mere conduit" and caching; his analysis of the areas I discuss above does not conflict with mine. At paragraph 54 he discusses a proposition from the claimant:

An ISP providing a standard domestic consumer or SOHO Internet access package to a customer has no possible hope of successfully arguing that it is a mere 'conduit' and therefore immune from that Law, as all content originates from within their own network, instead of merely passing through it in 'Via' fashion from one network to another.

This appears to be predicated on the assumption that, in this situation, a customer's machine is part of the ISP's network, presumably because a single machine cannot be a network. The claim is dismissed "because it flies in the face of the fundamental policy underlying the regulations", but it is in any case wrong: it is quite possible for such connections to support several machines on a home network, and in Internet engineering terms the same protocols are used whether there is one machine or many.

At paragraph 58 Eady J turns to the relationship between the directive and regulations on one hand, and s.1 of the Defamation Act on the other. He cites Collins[65] as pointing out "that there may be circumstances in which the application of the s.1 defence and of Regulation 17 would lead to inconsistent outcomes". The further analysis in the judgement does not address this inconsistency but, rather, starts with the s.1 defence (which succeeds). This is the wrong way round. Regulations 17 to 19 all say that "the service provider (if he otherwise would) shall not be liable" when the appropriate conditions apply. Therefore the correct analysis starts by determining whether these regulations do in fact apply to the situation. For two of the three ISP defendants it seems clear, from the facts set out at paragraphs 26 to 32, that regulation 17 applied. Therefore, even if s.1 imposed liability on them, regulation 17 would negate that liability.

At paragraph 68 the judgement moves to the last ISP defendant, whose involvement is that they run a Usenet server which carried the allegedly defamatory material. Unfortunately, no attempt was made to determine which type of defence covers Usenet:

It was made clear in the evidence served on behalf of BT that, although Regulations 17 and 18 had originally been relied upon, this was no longer being pursued. It is accepted that BT hosts Usenet newsgroups on its servers and that newsgroup postings are stored for a period of time, usually amounting to a few weeks, to enable BT's users to access them.[66]

However, the judgement does provide some useful pointers, both in relation to Usenet and more generally. It discusses the issue of notice: regulation 22 makes the amount of detail in a notice relevant to the interpretation of the term "actual knowledge". A court should take account of how well the notice gives "details of the unlawful nature of the activity or information in question", and the judgement states that:

As I have already observed, in order to be able to characterise something as "unlawful" a person would need to know something of the strength or weakness of available defences.[67]

More interestingly, the judgement also looks at inchoate offences. The claimant suggested that BT were responsible as "accessories" when a Usenet posting constituted unlawful harassment. This was dismissed because the facts did not support harassment, but the judgement did not deny that an ISP could be an accessory; rather, it pointed out that "the provisions of Regulation 19 would prevent any claim for damages, whether in respect of harassment or any other wrongful act"[68]. This is important when we look back at the "prospective effect" issue - the various statutory instruments discussed above are selective in which offences they cover precisely because the officials who drafted them believed that no other offences could be committed over the Internet and ignored the question of inchoate offences[69].

Regrettably, although one ISP raised the point of "repeat offenders" discussed above, anticipating:

a further argument from the Claimant to the effect that the email of 1 May 2005 provided BT with "reason to believe" that further postings would be made by the same person using its services[70]

this was not addressed further because the notice failed to identify the person.

Conclusion

The intermediary liability provisions of the directive give ISPs significant protection, shielding them from liability for the vast volumes of material passing through their systems every day which they have no practical way of checking. While the basic provisions are clear, interpreting them in terms of ISP activities is less easy. The UK transposition is seriously flawed, varying the protection according to the specific content. The lack of case law in this area does not help; what little there is does not always address the issues that are key from the point of view of ISPs, nor are cases always decided in the way that might be expected from the plain text.


Footnotes

[1] Directive 2000/31/EC, Official Journal L 178, 17/07/2000 p. 0001-0016

[2] See appendix 1 for the full text of articles 12 to 15.

[3] Article 1.5(d), last bullet: "gambling activities which involve wagering a stake with monetary value in games of chance, including lotteries and betting transactions."

[4] See the guidance referred to in note 8, section 3.2(d)(iii).

[5] The Electronic Commerce (EC Directive) Regulations 2002 S.I.2002 No. 2013, regulations 17 to 22. A copy of these regulations is included in appendix 2.

[6] Regulation 16, not relevant to this essay, came into force on 23rd October 2002.

[7] Most ISPs are not individuals and therefore Francovich and Bonifaci v Italy (C-6 & 9/90) does not assist. An individual who was an ISP could sue the UK government for any damage caused by late or wrong transposition, but this would only be possible after they had in turn been sued for something where they should have had protection.

[8] Neither consultation is available on the BERR website. Using the Wayback Machine (http://www.archive.org) it is possible to obtain the first consultation, at http://web.archive.org/web/20011019040224/http://www.dti.gov.uk/cii/ecommerce/europeanpolicy/ecommerce_directive.shtml, and the guidance on the second one at http://web.archive.org/web/20020811003016/http://www.dti.gov.uk/cii/ecommerce/europeanpolicy/ecommerce_directive.shtml. I am grateful to a colleague for providing me with a copy of the draft regulations.

[9] This wording is that following amendment by the Electronic Commerce (EC Directive) (Extension) Regulations 2004 (No. 1178), which added the second "in relation to". The notes to those regulations describe this as a defect in the drafting.

[10] Verbal statements at meetings attended by DTI officials and myself.

[11] The Price Marking Order 2004 (No.102) and the Uranium Enrichment Technology (Prohibition on Disclosure) Regulations 2004 (No.1818).

[12] This is the wording from the Price Marking Order. In the Uranium Technology Regulations, the words "this Order" are replaced by "these Regulations".

[13] "(5) A serious crime prevention order may not include terms which impose liabilities on service providers of intermediary services so far as the imposition of those liabilities would result in a contravention of Article 12, 13 or 14 of the E-Commerce Directive (various protections for service providers of intermediary services).
(6) A serious crime prevention order may not include terms which impose a general obligation on service providers of intermediary services covered by Articles 12, 13 and 14 of the E-Commerce Directive -
 (a) to monitor the information which they transmit or store when providing those services; or
 (b) actively to seek facts or circumstances indicating illegal activity when providing those services."

[14] S.166A is created by s.53(6) of the Violent Crime Reduction Act 2006.

[15] The Electronic Commerce (EC Directive) (Extension) Regulations 2003 (No. 115), the Electronic Commerce (EC Directive) (Extension) (No. 2) Regulations 2003 (No. 2500), and the Electronic Commerce (EC Directive) (Extension) Regulations 2004 (No. 1178).

[16] Regulation 18A of the Financial Services and Markets Act 2000 (Financial Promotion) Order 2001 (No.1335) as previously amended, added by Regulation 5(2) of the Financial Services and Markets Act 2000 (Financial Promotion) (Amendment) (Electronic Commerce Directive) Order 2002 (No. 2157).

[17] "18A. The financial promotion restriction does not apply to an electronic commerce communication in circumstances where -
 (a) the making of the communication constitutes the provision of an information society service of a kind falling within paragraph 1 of Article 12, 13 or 14 of the electronic commerce directive ( "mere conduit", "caching" and "hosting"); and
 (b) the conditions mentioned in the paragraph in question, to the extent that they are applicable at the time of, or prior to, the making of the communication, are or have been met at that time."

[18] "Paragraph 1 shall not apply when the recipient of the service is acting under the authority or the control of the provider."

[19] Regulation 9 of the Electronic Commerce Directive (Adoption and Children Act 2002) Regulations 2005 (No. 3222).

[20] E.g. "The acts of ... access referred to in paragraph (1) include ..." becomes "For the purposes of paragraph (1) above, the acts of ... access include ...".

[21] Specifically "shall not be guilty of an offence under section 124 of the Act (offence of breaching restriction under section 123)".

[22] For example, the possibility of aiding and abetting offences were raised at a meeting attended by Department for Education and Skills officials and myself.

[23] Regulation 5 of the Electronic Commerce Directive (Racial and Religious Hatred Act 2006) Regulations 2007 (No. 2497).

[24] Regulation 5 of the Electronic Commerce Directive (Terrorism Act 2006) Regulations 2007 (No. 1550).

[25] S.45 Sexual Offences Act 2003, brought into effect on 1st May 2004 by the Sexual Offences Act 2003 (Commencement) Order 2004 (No.874 (C.38)).

[26] The Electronic Commerce (EC Directive) (Extension) (No. 2) Regulations 2003 (No. 2500) and the Electronic Commerce Directive (Racial and Religious Hatred Act 2006) Regulations 2007 (No. 2497).

[27] At the time of writing the latest position was that BERR felt that the "way that is most compatible and effective within the prevailing legal environment [...] is to tailor make provisions where possible. We are though seeking to establish a form of prospective effect to ensure that the liability provisions apply to legislation where it may not be possible for some reason to tailor make provisions or were [sic] they may be overlooked through error" - letter from Baroness Vadera to the ISPA 2008-04-07.

[28] For example, an IP packet may be fragmented into smaller pieces if it is too large for a particular network segment with the final recipient reassembling them. Certain IP options require additional data to be added into the packet by the routers it passes through, and Network Address Translation (NAT) devices will change the IP addresses in both the packet headers and content to match those seen by the recipient. Also see note 30.

[29] "(43) A service provider can benefit from the exemptions for "mere conduit" and for "caching" when he is in no way involved with the information transmitted; this requires among other things that he does not modify the information that he transmits; this requirement does not cover manipulations of a technical nature which take place in the course of the transmission as they do not alter the integrity of the information contained in the transmission."

[30] All IP packets have a "time to live" field which is decremented by every router the packet passes through; when this reaches 0 the packet is discarded. This prevents packets circulating around the Internet for ever. Thus every ISP must alter - in the technical sense - every single communication passing through it.

[31] See, for example, http://www.ireland.com/newspaper/breaking/2008/0310/breaking61.htm.

[32] The Gowers Review of Intellectual Property, paras 5.94 to 5.96. Available at http://www.hm-treasury.gov.uk./media/6/E/pbr06_gowers_report_755.pdf.

[33] Formally SCRL Societe Belge des Auteurs, Compositeurs et Editeurs v SA Scarlet, [2007] E.C.D.R. 19.

[34] Judgement paras 27 and 28.

[35] See, for example, commentary by Olswang at http://www.olswang.com/newsarticle.asp?sid=101&aid=615.

[36] Hansard, 15th May 2006, column 715W, available online at http://www.publications.parliament.uk/pa/cm200506/cmhansrd/vo060515/text/60515w0013.htm.

[37] Statement by BT staff at a meeting with the Internet Services Providers Association.

[38] Judgement (see note 33) para 36: "That the mere fact that the technical filtering instrument would let works of SABAM's repertoire pass does not imply that the works were selected by Scarlet. That indeed the fact that certain content cannot be blocked does not imply that this content has been selected by the intermediary service provider or that it has targeted the information to deliver it to its subscribers; that the blocking measure is merely of a technical and automatic nature and has no active contribution to the filtering."

[39] An email "smarthost" accepts an email from a customer and deals with the technicalities of delivering it to the correct destination. For example, it will retry following a "temporarily unable to accept" response from the destination, try alternative servers for the recipient, and send out multiple copies of a message marked for delivery to several recipients.

[40] A file or set of files on the ISP's systems dedicated to holding email for that customer.

[41] As an example, Demon Internet deletes incoming messages automatically after 30 days or so.

[42] See text above relating to note 28.

[43] On 5th April 2008 the statistics at http://www.top1000.org/full.current.txt list 12,597 server identifiers that appeared in over 536 million Usenet articles passing through the 322 servers listed at http://www.top1000.org/participants.current.txt. This will certainly be an underestimate of the total number of Usenet servers.

[44] Network News Transfer Protocol, defined by IETF RFC 3977 (authored by myself).

[45] Statistics available at http://news-peer1.bt.net/. On 4th April 2008 the three main BT servers received 6,398,571 articles totalling 1,820,993,146,926 bytes. The graph at http://en.wikipedia.org/wiki/Usenet suggests the total global traffic is about twice this.

[46] For various reasons the exact number of groups is difficult or impossible to determine. On 5th April 2008 the Usenet server operated by THUS plc recognised 38,919 groups; while this server carries most "global" groups, it excludes some national or local groups relating to places outside the UK; see also note 50.

[47] The author's software might do this instead. The identifiers are normally made unique by including the date and time and the host name of the server within them.

[48] See http://www.supernews.com.

[49] See http://groups.google.com or http://groups.google.co.uk.

[50] Some newsgroups are only of national or local interest and therefore not offered to servers outside that area. Other servers may ignore newsgroups containing video or audio files because of the risk of copyright infringement, or newsgroups containing pornographic image files. A "full feed" is now too large for other than specialist commercial services.

[51] Identical, that is, apart from the header line showing the route the article took to reach the server this copy resides on. Errors in transmission can also cause copies to diverge.

[52] See article 1 of Directive 98/34/EC as modified by Directive 98/48/EC.

[53] To the best of my knowledge nobody has previously attempted this alternative analysis.

[54] Article 13.1(e).

[55] See IETF RFC 1036 for details of the Cancel article. There are also "Supersedes" (sic) and "Also-control" headers which have a similar effect; these are documented in "Son-of-1036", available at http://www.eyrie.org/~eagle/usefor/other/son-of-1036.

[56] Article 13.1(c).

[57] In relation to (a), the only modification done is a technical one similar to that discussed for email in the text relating to note 42. For (b) and (d) there are no standard mechanisms for imposing access conditions or providing data on the use of articles.

[58] 1999 WL 33285490, [1999] E.M.L.R. 542, and [2001] QB 201. As an employee of Demon at the time, I have also seen unpublished case material.

[59] [2006] EWHC 407 (QB) and 2006 WL 584578.

[60] [1999] E.M.L.R 542 at 550.

[61] E.g. see Demon's subsequent press release at http://www.demon.net/aboutus/pressroom/2000/pr006.html.

[62] Defamation Act 1996 s.1(1)(c).

[63] From memory, this server was in the USA.

[64] [2006] EWHC 407 (QB), paragraphs 38 to 76.

[65] Matthew Collins, The Law of Defamation and the Internet, 2nd edition, pub. 2005 Oxford University Press. ISBN 0199281823.

[66] Judgement para 68, emphasis added.

[67] Judgement para 72.

[68] Judgement para 76.

[69] See text above relating to note 22.

[70] Judgement para 67.


Previous publication

The issue of "prospective effect" has been one of the author's "hobby-horses" for many years. Appendix 3 gives the text of a paper I sent to the DTI and various other interested parties in 2004.


Online sources

Except where specifically dated, all online sources were successfully accessed on some date between 16th March and 6th April 2008.


Table of cases

Bunt v Tilley, [2006] EWHC 407 (QB) and 2006 WL 584578.

Godfrey v Demon Internet Ltd, 1999 WL 33285490, [1999] E.M.L.R. 542, and [2001] QB 201.

SCRL Societe Belge des Auteurs, Compositeurs et Editeurs v SA Scarlet, [2007] E.C.D.R. 19.


Further reading

L.Edwards, The Problem of Intermediate Service Provider Liability, in L.Edwards The New Legal Framework for E-Commerce in Europe, pub. 2005 by Hart Publishing. ISBN 1841134511.

M.Collins, The Law of Defamation and the Internet, 2nd edition, pub. 2005 Oxford University Press. ISBN 0199281823.

P.Milmo et al., Gatley on Libel and Slander, 10th edition, pub. 2007 Sweet & Maxwell. ISBN 9780421946101.


Appendix 1 - the relevant provisions of the Directive

Section 4: Liability of intermediary service providers

Article 12

"Mere conduit"

1. Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network, Member States shall ensure that the service provider is not liable for the information transmitted, on condition that the provider:
(a) does not initiate the transmission;
(b) does not select the receiver of the transmission; and
(c) does not select or modify the information contained in the transmission.

2. The acts of transmission and of provision of access referred to in paragraph 1 include the automatic, intermediate and transient storage of the information transmitted in so far as this takes place for the sole purpose of carrying out the transmission in the communication network, and provided that the information is not stored for any period longer than is reasonably necessary for the transmission.

3. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.

Article 13

"Caching"

1. Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, Member States shall ensure that the service provider is not liable for the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information's onward transmission to other recipients of the service upon their request, on condition that:
(a) the provider does not modify the information;
(b) the provider complies with conditions on access to the information;
(c) the provider complies with rules regarding the updating of the information, specified in a manner widely recognised and used by industry;
(d) the provider does not interfere with the lawful use of technology, widely recognised and used by industry, to obtain data on the use of the information; and
(e) the provider acts expeditiously to remove or to disable access to the information it has stored upon obtaining actual knowledge of the fact that the information at the initial source of the transmission has been removed from the network, or access to it has been disabled, or that a court or an administrative authority has ordered such removal or disablement.

2. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.

Article 14

Hosting

1. Where an information society service is provided that consists of the storage of information provided by a recipient of the service, Member States shall ensure that the service provider is not liable for the information stored at the request of a recipient of the service, on condition that:
(a) the provider does not have actual knowledge of illegal activity or information and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or information is apparent; or
(b) the provider, upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the information.

2. Paragraph 1 shall not apply when the recipient of the service is acting under the authority or the control of the provider.

3. This Article shall not affect the possibility for a court or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement, nor does it affect the possibility for Member States of establishing procedures governing the removal or disabling of access to information.

Article 15

No general obligation to monitor

1. Member States shall not impose a general obligation on providers, when providing the services covered by Articles 12, 13 and 14, to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating illegal activity.

2. Member States may establish obligations for information society service providers promptly to inform the competent public authorities of alleged illegal activities undertaken or information provided by recipients of their service or obligations to communicate to the competent authorities, at their request, information enabling the identification of recipients of their service with whom they have storage agreements.


Appendix 2 - the relevant provisions of the Regulations

Mere conduit

17. - (1) Where an information society service is provided which consists of the transmission in a communication network of information provided by a recipient of the service or the provision of access to a communication network, the service provider (if he otherwise would) shall not be liable for damages or for any other pecuniary remedy or for any criminal sanction as a result of that transmission where the service provider -
 (a) did not initiate the transmission;
 (b) did not select the receiver of the transmission; and
 (c) did not select or modify the information contained in the transmission.

(2) The acts of transmission and of provision of access referred to in paragraph (1) include the automatic, intermediate and transient storage of the information transmitted where:
 (a) this takes place for the sole purpose of carrying out the transmission in the communication network, and
 (b) the information is not stored for any period longer than is reasonably necessary for the transmission.

Caching

18. Where an information society service is provided which consists of the transmission in a communication network of information provided by a recipient of the service, the service provider (if he otherwise would) shall not be liable for damages or for any other pecuniary remedy or for any criminal sanction as a result of that transmission where -
 (a) the information is the subject of automatic, intermediate and temporary storage where that storage is for the sole purpose of making more efficient onward transmission of the information to other recipients of the service upon their request, and
 (b) the service provider -
  (i) does not modify the information;
  (ii) complies with conditions on access to the information;
  (iii) complies with any rules regarding the updating of the information, specified in a manner widely recognised and used by industry;
  (iv) does not interfere with the lawful use of technology, widely recognised and used by industry, to obtain data on the use of the information; and
  (v) acts expeditiously to remove or to disable access to the information he has stored upon obtaining actual knowledge of the fact that the information at the initial source of the transmission has been removed from the network, or access to it has been disabled, or that a court or an administrative authority has ordered such removal or disablement.

Hosting

19. Where an information society service is provided which consists of the storage of information provided by a recipient of the service, the service provider (if he otherwise would) shall not be liable for damages or for any other pecuniary remedy or for any criminal sanction as a result of that storage where -
 (a) the service provider -
  (i) does not have actual knowledge of unlawful activity or information and, where a claim for damages is made, is not aware of facts or circumstances from which it would have been apparent to the service provider that the activity or information was unlawful; or
  (ii) upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the information, and
 (b) the recipient of the service was not acting under the authority or the control of the service provider.

Protection of rights

20. - (1) Nothing in regulations 17, 18 and 19 shall -
 (a) prevent a person agreeing different contractual terms; or
 (b) affect the rights of any party to apply to a court for relief to prevent or stop infringement of any rights.

(2) Any power of an administrative authority to prevent or stop infringement of any rights shall continue to apply notwithstanding regulations 17, 18 and 19.

Defence in Criminal Proceedings: burden of proof

21. - (1) This regulation applies where a service provider charged with an offence in criminal proceedings arising out of any transmission, provision of access or storage falling within regulation 17, 18 or 19 relies on a defence under any of regulations 17, 18 and 19.

(2) Where evidence is adduced which is sufficient to raise an issue with respect to that defence, the court or jury shall assume that the defence is satisfied unless the prosecution proves beyond reasonable doubt that it is not.

Notice for the purposes of actual knowledge

22. In determining whether a service provider has actual knowledge for the purposes of regulations 18(b)(v) and 19(a)(i), a court shall take into account all matters which appear to it in the particular circumstances to be relevant and, among other things, shall have regard to -
 (a) whether a service provider has received a notice through a means of contact made available in accordance with regulation 6(1)(c), and
 (b) the extent to which any notice includes -
  (i) the full name and address of the sender of the notice;
  (ii) details of the location of the information in question; and
  (iii) details of the unlawful nature of the activity or information in question.


Appendix 3 - my previous paper on "prospective effect"

This paper has been included unchanged from its 2004 version.

The Communications Bill and the Sexual Offences Bill

Problems with ISP liability

2003-05-13, updated 2004-01-14[1]

Summary

The Electronic Commerce Directive introduced three new legal protections for ISPs. When transposed into UK legislation, these protections were only applied to laws passed before July 2002.

The absence of equivalent protection in the Communications Act 2003 and the Sexual Offences Act 2003 puts the UK in breach of the Directive while, at the same time, leaving the ISP industry at serious legal risk.

The Directive

The Electronic Commerce Directive[2] introduced three new legal protections for Internet Service Providers. These are known as the "mere conduit", "caching", and "hosting" protections, and they state that ISPs are not liable - in either civil or criminal law - for the existence or contents of user communications and publications, subject only to certain reasonable principles. In more detail:

The Regulations

The Directive was brought into UK law as the Electronic Commerce (EC Directive) Regulations 2002[3]. Regulations 17 to 19 of this S.I. implement Articles 12 to 14 of the Directive effectively unchanged, while regulations 20 to 22 add extra provisions.

Were this the entirety of the relevant regulations there would be no problem. However, regulation 3(2) then states:

These Regulations shall not apply in relation to any Act passed on or after the date these Regulations are made or in exercise of a power to legislate after that date.

The reasoning behind this wording is not clear to the casual reader, because there is no corresponding wording in the Directive. However, the effect is that every piece of legislation from now on that could impact on the Internet needs to have wording to apply these same provisions, or else there is left a loophole which requires to be filled later. For example, both the Tobacco Advertising and Promotion Act 2002 and the Copyright (Visually Impaired Persons) Act 2002 failed to make such provisions and, as a result, a separate Statutory Instrument[4] had to be introduced just to override Regulation 3(2). A second such Statutory Instrument[5] similarly overrides Regulation 3(2) in respect of various new copyright regulations.

The Communications Bill

The specific issue with the Communications Act is section 127. This transposes similar words from the Telecommunications Act 1984:

(1) A person is guilty of an offence if he -
 (a) sends by means of a public electronic communications network a message or other matter that is grossly offensive or of an indecent, obscene or menacing character; or
 (b) causes any such message or matter to be so sent.

There are no obvious defences for ISPs in this area and there is no indication that the provisions of the Directive apply. This therefore leaves ISPs open to prosecution when their users transmit menacing messages through their systems or publish obscene material on their websites.

Indecent material

The word "indecent" in this section introduces a secondary issue. The Telecommunications Act - at least at the time it was written - applied basically to one-to-one communications. The only indecent communications that would have been prosecuted under it would be where the content had been addressed specifically at the complaining party, such as in an "indecent phone call".

However, communications over electronic networks are far more varied and, as is well known, it is far easier for someone to come across unexpected content. Equally well known is that a significant portion of Internet content is pornographic[6]. And herein lies the problem. It is clear from other areas of legislation that the majority of "adult" material on the Internet would be classified as "indecent"[7]. In effect, this wording makes it illegal for UK residents to look at adult web sites!

The Sexual Offences Act

The situation with this Act is rather more complicated. Section 45 amends the Protection of Children Act 1978 by modifying the definition of child pornography to include pictures of 16 and 17 year olds. There are certain exemptions, but in essence they apply to possession by the person appearing in the photograph or on his or her instructions, and not to anyone else.

Once again the concern is that ISPs would be liable where such material was transmitted through or hosted on their systems. Indeed, we could be[8] in the somewhat ridiculous situation whereby an ISP is protected when inadvertently handling a picture of the rape of a 4 year old, but could be prosecuted over a picture of a person who, at the time, was 17 years and 51 weeks old and legal to photograph.

Solutions

One obvious solution is to add the specific defences of the Directive to each piece of legislation as it goes through Parliament. But, not only is this easy to forget - after all, it seems to have been forgotten in every case so far - it also risks having a slightly different effect to the Regulations (particularly if there are amendments in the future).

More practical solutions are either:

However, it is worth reconsidering whether this approach is scalable in the long run. As more legislation - including secondary legislation - is brought into force, the risk increases that something is overlooked. It would appear far preferable for Regulations 17 to 22 to be given prospective effect, perhaps by restricting the effect of Regulation 3(2).


[1] At the time this paper was first written, the Communications Act and Sexual Offences Act were still in the Bill process. This (2004) update changes appropriate references from Bills to Acts, corrects section numbers where they were altered in the Parliamentary process, adds a reference to a further relevant S.I., and replaces text proposing changes to the then Bills.

[2] Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market ; O.J. L178, 17.7.2000, p.1.

[3] S.I. 2002 No. 2013, made on 30th July 2002 and brought into force on 21st August 2002 except for Regulation 16 (which is not relevant to this paper).

[4] The Electronic Commerce (EC Directive) (Extension) Regulations 2003, S.I.2003 No. 115.

[5] The Electronic Commerce (EC Directive) (Extension)(No.2) Regulations 2003, S.I.2003 No. 2500.

[6] Estimates vary wildly but, for example, a 1998 report by Datamation estimated it at almost 70% of the paid content market and 10% of the total economic activity on the Internet.

[7] The term "indecent" has no specific legal definition. But, for example, it is part of the legal definition of child pornography and (from past work with the Internet Watch Foundation) the author is aware that much illegal child pornography is no different from typical "top shelf magazine" pictures other than the age of the subjects. The conclusion that adult pornography is covered by the term "indecent" is inescapable.

[8] Since the Act amends older legislation, it is unclear whether the amended law is treated by Regulation 3(2) as being before the significant date or after it. The Extension No.2 Regulations address a similar situation, which would make it appear that the DTI believes the amended law is treated as being dated afterwards.

