Submission

on the Electronic Commerce consultation

of the DTI

Introduction

This is a response to the DTI paper "Building Confidence in Electronic Commerce" URN 99/642, issued on 1999-03-05. It is submitted by Clive Douglas Woon Feather, who (in the spirit of the consultation) can be reached at the electronic mail address <clive@davros.org>.

Disclaimer

This submission contains only my own opinions. It is not necessarily the view of my employer, of any other body that I represent from time to time, or of any other person or body.

This submission may be published by the Government in full but not in part. I will be publishing it myself on the Internet at <http://www.davros.org/legal/ecommsub.html>. Private individuals may make copies and take extracts for fair use - including adverse criticism - provided that any extracts are made in a fair manner, identify the fact that they are extracts, and are not misleading. Unacknowledged extracts may also be used in other submissions on the DTI paper provided that I am notified after the event.

Author

I am currently the Director of Software Development for Demon Internet Ltd., a major Internet Service Provider in the UK and part of Scottish Telecom. Demon has been providing dial-up access to the Internet since 1992. I am also the Regulation Officer for the London Internet Exchange (on secondment); LINX comprises 67 major ISPs all of whom either do business in the UK or have a commercial interest in Internet users based in the UK. Finally, I am Chair of the Management Board of the Internet Watch Foundation.

In my various roles I have been involved in discussions with ACPO and others on policing and the Internet. In addition, my father was a police officer for 30 years, which has given me some insight into policing issues from "the sharp end".

I have had an informal interest in cryptography for over 25 years. I have broken some minor (and by today's standards simple) cyphers; in one case I broke the encryption on material seized by the police, but the resulting information was not of use in the ongoing investigation.

General

I generally welcome the Government's intention to legislate in the area of Electronic Commerce, and I understand the issues that they have needed to consider in drawing up this consultation paper.

However, as well as the detailed comments below, there are a number of areas where the consultation as a whole must be criticized.

Firstly, a period of only 27 days (8 of them weekends and another lost in the logistics of obtaining the paper and delivering a reply) is far too short for such a major topic. It makes it almost impossible for any group of people, whether part of a company or otherwise, to calmly consider the proposals made and their responses. It is also far shorter than Cabinet Office guidelines, which suggest 8 weeks.

It has been suggested that the consultation period is unimportant because this is merely a revision to existing proposals. This is at best misguided, because there have been radical changes that need proper consultation. In any case, the point about time to consider a response applies to the consultation paper actually issued, not to what it was expected to contain.

Some people have claimed that the consultation needs to be hurried so that a Bill can be presented to Parliament in April. I believe it is far better to have a well-thought-out Bill next session rather than a badly flawed one this time. Furthermore, should such a Bill appear in April, it will be an indication that the responses to this consultation exercise have not properly been considered. The last consultation took almost a year just to produce an analysis of the responses without rethinking policy at all; can the Government really work 20 times as fast this time?

Finally, if there were such a deadline for a new Bill, it behoves the DTI to produce its consultation in good time to allow a proper response period followed by a sensible amount of time to consider the results. It is not the fault of those responding.

Secondly, the consultation paper presents a mix of commerce issues - properly the province of the DTI - and law-enforcement issues - properly the province of the Home Office. Back when the now-discredited idea of Key Escrow was in vogue, there might have been some vague justification for this. However, since this is no longer the case there is no good reason for not splitting the consultation - and any Bill - into two totally separate areas. Law Enforcement access to encrypted material is nothing to do with "Building Confidence in Electronic Commerce", and any attempt to pretend otherwise is wrong. Again, should this split require extra time, it would be better to have two well-designed Bills next year rather than one flawed one now.

Thirdly, the consultation paper shows clear evidence of bad editing. The most charitable view is that this is because it was produced in a hurry after a major policy change. If so, it should have been delayed until it could be done properly. We have waited 3 years for this paper; a few weeks more would not have hurt. A more cynical view is that the "flawed" areas would allow an easy reversal of the policy change. I hope that this is not the case.

Finally, though the paper goes to some effort to explain what public key cryptography is and what electronic signatures are, it singularly fails in many places (paragraph 89 is a weak exception) to distinguish between "communication secrecy" - cryptography used to secure communications between two parties - and "storage secrecy" - cryptography used to ensure the privacy of stored material. In some cases it takes arguments that are valid for one and then uses them as justification for the other.

For example, paragraph 36 talks about key escrow and key recovery. It states: "The loss of an encryption key, whether through negligence, or because an employee has left etc., could be very damaging". In the case of storage secrecy this is very true. Assuming that secure cryptography had been used, it would not be possible to recover the contents of any material stored using this key. Therefore arrangements should be made to allow the key to be recovered, for example by storing a copy with the company's solicitor in a sealed envelope. [I see no reason, however, why this requires any more of a government licence than arranging for her to store my house deeds does.]

However, there is absolutely no reason why this applies to communication secrecy. File copies of messages that I send are stored using my storage key, not my communications key. If I lose the latter, it means that I cannot read any message sent to me. In this case I simply generate a new key, publish it in the same way as the old one, and ask the authors of any correspondence encrypted with the old key to resend it. This may cause me some inconvenience, but it is my own fault for forgetting the key. The designers of communications protocols have known for centuries that the correct approach is to expect an acknowledgement to a message and to resend it if one is not forthcoming. There is no need to reinvent a square wheel here.

I hope that this confusion between two very different matters is purely accidental and not an attempt to mislead.

Legal recognition of Electronic Instruments

I applaud the proposal to allow "writing" to include electronic messages and for "signed" to include electronic signatures. In this day and age it is important not to rule out modern means of communication. I also accept that there are special situations (such as the registration of births) where electronic mechanisms are not yet appropriate.

However, I feel very uncomfortable with the proposal to use secondary legislation to determine these matters. One reason is that it would leave matters in an inconsistent state for some time, possibly even for years. Another reason, and perhaps more significant, is a statement made by DTI officials that it would be open to government departments to individually decide whether they would submit such secondary legislation to accept electronic signatures and, if so, what restrictions to put on them (up to and including a requirement to use escrowed keys). This would make a complete mockery of the broad proposals to be "technology neutral".

I would therefore propose that the correct path is for primary legislation to make electronic signatures and writing acceptable wherever the law uses the terms "signed" and "written" (or their equivalents), except for a number of clearly outlined cases such as those mentioned. In addition, the Act should include powers to create secondary legislation - to exclude other legislation from this rule - for a fixed period of say three months, with the new rule not coming into effect until the end of that time.

Placing the "burden of proof" on the laws to be excluded will ensure that only essential differences are allowed to remain. If three months is not long enough to determine the cases where the law really needs to distinguish different kinds of technology, it is an indication that this whole area is being legislated in a hurry and without proper thought.

Paragraphs 20 and 21 suggest that signatures certified by licensed CAs would have a rebuttable presumption of "correctness", while other signatures would not. I disagree strongly with this suggestion. Firstly, paper signatures are not normally certified or notarized. If the term "certification authority" had been replaced by the more accurate "electronic notary" this would be clearer. There is a rebuttable presumption in law that all paper signatures are valid, and this should apply equally well to electronic signatures whether or not they have been notarized by another party. The advantage of using a licensed CA should be merely that the procedures it uses would have been pre-vetted. In particular, a court should be able to take notice that a given signature scheme had been accepted by a previous court and not require it to be analysed unless there is good reason to believe that the signature is invalid.

Other possible legislative changes

Since this is the first attempt to address Internet issues in UK legislation, I welcome the suggestion that it is not limited to the narrow issues of electronic signatures and law enforcement matters.

Paragraph 26 discusses the use of electronic mail rather than paper in a number of business areas. While these changes would be useful, I consider them to be relatively minor - for example, the costs involved in producing a company's annual report are such that the cost of delivery by post is relatively small by comparison. I feel that the issues I raise below are much more important than this, and I would be unhappy if they were ignored for lack of time while transmission of documents was included.

Unsolicited e-mail (SPAM)

Firstly, I wish it to be noted that "spamming" is not limited to commercial communications. Indeed, it is often the non-commercial spams that are most annoying. For this reason, and because "e-mail" is a bad term to use in respect of newsgroups, LINX has introduced the term "unsolicited bulk messaging" (UBM) which is more descriptive, because the two essential characteristics are that it is unsolicited and that it is sent in bulk.

Whilst one's initial reaction might be to welcome some kind of legislation against UBM, this must be tempered with caution. The vast majority of UBM originates from outside the UK, and thus legislation would make little difference. Paragraph 31 suggests four possible actions that could be taken. Of these:

Liability of ISPs

In my opinion, the single biggest threat to electronic commerce in the UK is the liability of ISPs for material they carry but do not originate.

The most recent attempt by Parliament to legislate in this area is the Defamation Act 1996. This created a defence of "innocent dissemination" provided that the ISP could show:

(a) [it] was not the author, editor or publisher of the statement complained of,
(b) [it] took reasonable care in relation to its publication, and
(c) [it] did not know, and had no reason to believe, that what [it] did caused or contributed to the publication of a defamatory statement.

This matter has recently been tested in the courts in Godfrey v Demon, as part of a judgement by the Hon. Mr Justice Morland.

Much of the material handled by ISPs consists of Usenet articles (temporary items, with approximately 1 million new items appearing every day) and web pages (with a typical ISP hosting hundreds of thousands or millions of such pages). It is clear to all that this material meets test (a). However, the cited case concerned a situation that involved parts (b) and (c) of the test. The ISP was apparently notified in a rather informal manner that the article was alleged to be defamatory, and this notification was not acknowledged. Nonetheless the judge took the view that this was sufficient to negate these parts of the test from that moment on.

The major effect of this judgement is that ISPs are required to take every notification received and examine the relevant material. An ISP has no knowledge of the facts, no practical way to find them out in the timescales involved, and no way to determine whether a particular item is defamatory. Nothing in the judgement appears to exclude notification sent by email, or even written in red crayon on tissue paper. The result is that any person can make an unbounded number of vexatious claims and the ISP needs to waste time investigating each one or risk a lawsuit. The alternative is to always assume that the material is defamatory and remove it, thus quashing free speech and open debate. In the meantime, the originator of the material - if a customer - may be in a position to claim damages against the ISP for suppression of their work. This leaves the ISP with no safe position to be in.

In the field of electronic commerce it is perhaps more important to consider matters of copyright, trademark infringement, false advertising and so on. It is unclear what level of responsibility an ISP has in these matters, but it would not be unreasonable to assume that a future lawsuit would see the "innocent dissemination" test as a reasonable one. The way is now open for business rivals to make claims about each other's web sites that the ISP has no way to verify, yet is potentially liable for as soon as they are made. [For example, I am aware of a case where two companies in the same business were customers of the same ISP. Both used the same photographs on their web sites. Each claimed to be the holder of the copyright in the photographs and that the other was breaching it, but neither provided any useful evidence of this to the ISP. The ISP risked being sued on copyright grounds if it did nothing, and on damage to business grounds if it did anything.] Such a position is invidious to the ISP. It also makes businesses unhappy to move into electronic commerce, because they can never know if their web site will suddenly vanish after a spurious complaint.

I am not suggesting that ISPs should be absolved of all liability for the material that they retransmit. I consider the most significant issue to be that of when an ISP "knows" that it is carrying problematic material. As discussed above, the current situation is far too weak, because an apparent "crank" letter can be enough to leave the ISP liable. The test needs to be much stronger, and in particular it needs to move the problem of judging the material from the ISP to some competent authority such as a court. I therefore propose that there should be a liability regime that meets the following requirements.

  1. This regime would only apply where test (a) of the Defamation Act would be passed, but would apply to all liability for material, not just defamation.
  2. There should be a mechanism whereby a complainant can obtain a formal order (a "removal order") against an ISP to have material removed from the latter's public-facing systems so that it is no longer being "published". This mechanism might require a court order, or it might simply require the complainant to register the order with a court office. Failure to obey a removal order, properly served, would be punishable.
  3. The ISP would not be liable for direct or consequential damages that result because it obeyed a removal order, nor for innocent dissemination before receipt of the order or during the time it reasonably takes to obey it.
  4. There should be a mechanism whereby the author of the material could appeal against the order. If this appeal is disputed, it would be for a court to decide the issues and whether the order stands in the meantime.
  5. The ISP would have a duty to reveal to the court any information it has concerning the identity of the author. If it has the ability to reliably identify the author, the ISP would also be required to pass the author a copy of the removal order and to relay any other communications unless and until the author consents to their name and address being disclosed.
  6. The author of the material would have the right to seek damages if a removal order were not followed up with a lawsuit in a reasonable time - say 1 month. Mechanisms to allow either party to make amends similar to those in the Defamation Act might be worthwhile.
  7. Deliberately obtaining a removal order in a fraudulent manner or where there are no reasonable grounds to support it would be an offence. The ISP should also have a right to sue the complainant in this case.
  8. An ISP would have the right to seek damages if a removal order were taken out against it and not against other parties in the UK closer to the source of the material or, where identifiable, the author. In other words, the complainant must be making a genuine attempt to block the material at source and not just pick on an ISP with "deep pockets".
  9. If the complainant cannot identify the original author of the material, this would not be grounds to act against the ISP or any other party.

This mechanism need not be specific to electronic commerce. For example, it could equally well apply where a physical noticeboard is provided and not actively monitored, and could apply to newsagents and other bodies that carry material without normally being aware of its contents. But without such a mechanism electronic commerce will remain risky because of the ease with which spurious complaints can disrupt it.

Licensing regime

Those experienced with communications security are aware that only "end-to-end" security is reliable, because any other system requires placing trust in a third party. Despite the name, such "trusted third parties" actually introduce mistrust, because their only purpose is to provide a way for information to leak when neither party wishes this. I therefore see a licensing regime for such bodies as simply encouraging people to believe, falsely, that their data would be more secure when in fact it would be less secure.

Certification Authorities or, to give them a more accurate name, Digital Notaries, are a somewhat different matter. As with a traditional notary, their purpose is to certify that a document produced by a person is a true copy of that document and that the notary has checked that the person is who they claim to be. The only significant difference is that an electronic signature, once certified, can be securely attached to any document without the notary needing to see the process taking place. I do not disagree with the idea of a licensing process to ensure that such digital notaries are competent and also carry appropriate liability, but I do not see it as essential, and it should be possible for any person to provide such a service without needing a licence.

In the same way as end-to-end cryptography is the only safe form, keys can only be safe if it is known who holds them. Therefore I would strongly recommend that licensed CAs be forbidden from generating the key pairs for their customers. Licensed CAs should only be permitted to certify a public key where the customer makes an appropriate declaration that no other person has had access to their private key.
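To make the point concrete, the following toy sketch shows how a customer can generate a key pair entirely on their own machine, so that only the public key (plus the declaration) ever reaches the CA. The parameters are deliberately tiny textbook values with no real security, and all names here are my own illustration, not a proposal for any specific protocol:

```python
# Toy illustration only (NOT secure): real keys would be generated by a
# vetted library using primes of hundreds of digits.

def toy_rsa_keypair():
    p, q = 61, 53             # tiny fixed primes, for illustration only
    n = p * q                 # modulus, part of the public key
    phi = (p - 1) * (q - 1)
    e = 17                    # public exponent
    d = pow(e, -1, phi)       # private exponent; never leaves the customer
    return (n, e), (n, d)

public_key, private_key = toy_rsa_keypair()

# Only the public key and a declaration are submitted for certification;
# the private key stays on the customer's own machine.
submission = {
    "public_key": public_key,
    "declaration": "No other person has had access to the private key.",
}

# Sanity check: a signature made with the private exponent verifies
# under the public exponent.
message = 42
signature = pow(message, private_key[1], private_key[0])
assert pow(signature, public_key[1], public_key[0]) == message
```

The point is structural rather than cryptographic: the CA certifies the key pair without ever being in a position to misuse, leak, or be compelled to surrender the private half.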

I disagree with the proposals of paragraph 39. Trading Standards law is sufficient to ensure that bodies do not misrepresent their licensing status. Any restrictions on the provision of mixed services would be either onerous or easy to get around.

The Licensing Authority

Despite specific questions asked during public meetings, I have been unable to receive a credible explanation of why Oftel should be the licensing authority. The Data Protection Registrar (or Data Commissioner) has had 15 years' experience in dealing with computers, the security of data, and the enforcement of mechanisms to ensure that computer data is handled in a proper manner. By contrast, Oftel has little credibility with the service provider industry or with consumers. There seems no plausible reason to choose them.

I disagree with the proposals in paragraph 41 to delegate licensing powers to industry, especially since it is industry that would be being licensed. To do so is to violate all the good practices of regulatory authority, such as independence and transparency.

Liability

Paragraph 46 asks whether there should be a duty of care imposed on holders of private signature keys. This would be a good idea, but any liability arising from this duty should only apply to actions carried out after the key holder becomes aware that the key has been lost or compromised and (in the case of actions by a third party) after they have had a reasonable opportunity to notify the CA.

Far more important is that, should any other person obtain access to a private key they should have a specific duty to inform the key owner as soon as practical. In the case of law enforcement officials this would start as soon as the immediate need for secrecy - if any - was over, and there should be a rebuttable presumption in law that law enforcement officials do not require access to signature keys.

Law Enforcement Interests

As stated above, the issues addressed in this part of the paper bear no connection with the rest of the electronic commerce debate. Since they do relate to topics such as interception of communications, and since there is a review of the Interception of Communications Act in the near future, they would be far better left until then. It seems a waste of Parliamentary time to legislate proposals only to need to modify them in the next Parliamentary session.

The examples in paragraph 50 (other than the fourth) describe problems caused by encryption of stored material. These problems are no different, in principle, to those that happen when the police find a locked safe or where the secret files are buried at the end of the garden. It is claimed that a safe can be broken into while cryptography can be secure, but equally, safes can be designed to destroy their contents if opened in the wrong way.

Such examples do not involve communications secrecy. Therefore they are not relevant to interception matters. The consultation document should have made this fact clear.

The fourth example in paragraph 50 talks about "cryptoviral extortion". I have asked representatives of the Home Office what this is supposed to mean - the term being unfamiliar to myself and various other experts - and what the threat is, but to date they have been unable to explain. [Note that, due to the ridiculously short consultation period, a reply arriving after this submission is sent to the DTI is quite possible but totally useless.] In the absence of such an explanation I have to assume that it does not describe a credible threat.

Paragraph 58 talks about new legal powers to provide access to encryption keys for communications secrecy. The only way that such keys can be obtained is from the communicating parties or because they have been escrowed. As I have shown above, there is no benefit - and much danger - in escrowing communications keys. Therefore one would only escrow such keys in order to allow the police and security services to intercept one's communications. I have difficulty believing that such keys would yield any genuine intelligence data (as opposed to deliberate false leads). If one party to the communication is willing to cooperate with the interceptor, then the key can be obtained from that party and no powers are required. Therefore I conclude that either no such powers are needed or the consultation paper is failing to provide an important part of the reasoning. Indeed, as made clear in paragraph 87, law enforcement agencies do not need such keys to be escrowed.

Paragraphs 59 to 63 discuss the seizure of encrypted material under various powers. While I accept the general principles discussed in them, the statement in paragraph 52 that the proposals "do not extend the intrusive surveillance powers of the law enforcement [...] agencies" is simply false. As I discuss below, the proposals given in the paper do widen these powers. If the Government is genuine in its stated aims, the proposals need to be modified along the lines I give below.

[Incidentally, the first part of that bullet point is blatantly wrong as well: if the proposals were technology neutral then they would include a power to require a safe owner to hand over the key or combination, rather than just to open the safe, and they would encourage householders to leave a copy of their keys and their alarm code with some unknown "trusted third party".]

Legislative proposals

The first major flaw in the proposals is that they allow the person issuing "written notice" to decide whether the material shall be decrypted or the key revealed. Since the term "authorised officer" is not explained, I am left to assume that this refers to the ordinary police constable or detective constable working on the case. Such an officer is unlikely to appreciate the issues of personal privacy that apply to keys and will simply go for the most obvious option.

Serving police officers who have dealt with encrypted material tell me that they are completely uninterested in encryption keys. I have discussed a number of scenarios with them and they want two things:
- access to the actual stored data
- clear evidence that the decryption is correct.
The latter is, of course, the more important point. Luckily there are a number of technologies that allow this. Here are three examples.

  1. The encryption system can use two levels of key: a document key specific to each file and a master key. The master key is used to encrypt the document key, and the document key to encrypt the document. The encrypted document key is placed at the start of the encrypted document. [An analogy is PGP, where the public/private pair is used to encrypt a session key specific to the message, the rest of the message is encrypted with the session key, and the encrypted session key is included in the message.]

    When the police require access to the material, the owner can use his master key - while keeping it secret - to extract the various document keys. The police can then use the document keys to decrypt the files. The police know that the decryption is correct, while the owner has no need to reveal his master key.

  2. The encryption system could be asymmetric, involving a public and a private key. The police could re-encrypt a putative decryption; if it is genuine, the resulting encrypted document would be the same.

  3. The properties of the encryption system might make it possible to verify the decryption to a reasonable level of probability (say one in a million or one in a billion). For an example of how this would work, consider a system that encrypts data in 16-byte blocks. If two 16-byte blocks of cyphertext are the same, then so will the corresponding plaintext blocks be. It would be possible to run an analysis on files to confirm this. If two blocks are identical in one file but not the other, there is immediate proof that the decryption is false. If a number of such tests are passed and the decrypted document makes sense, there is strong evidence that the decryption is genuine.
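The first of these schemes can be sketched in code. The sketch below is purely structural: the stand-in "cipher" (XOR with a SHA-256 counter-mode keystream) and all the function names are my own illustrative choices, not any real product, and would not be secure in practice.

```python
import hashlib
import os

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Stand-in cipher: XOR with a SHA-256 counter-mode keystream.
    Illustration only; a real system would use a vetted cipher."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def encrypt_document(master_key: bytes, document: bytes) -> bytes:
    """A fresh document key encrypts the document; the master key
    encrypts the document key, which is prepended to the result."""
    doc_key = os.urandom(32)
    wrapped_key = keystream_xor(master_key, doc_key)
    return wrapped_key + keystream_xor(doc_key, document)

def extract_document_key(master_key: bytes, blob: bytes) -> bytes:
    """The owner, in private, recovers the document key; only this
    key - not the master key - need be handed over."""
    return keystream_xor(master_key, blob[:32])

def decrypt_with_document_key(doc_key: bytes, blob: bytes) -> bytes:
    """With the document key alone, this one file can be read."""
    return keystream_xor(doc_key, blob[32:])

master = b"owner's master key"
blob = encrypt_document(master, b"the seized material")
doc_key = extract_document_key(master, blob)
assert decrypt_with_document_key(doc_key, blob) == b"the seized material"
```

Exactly as with PGP's session keys, the master key never needs to leave the owner's hands; only the per-document keys for the seized files are disclosed.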

If the above techniques - or others - are used then there is no need for the police to have the true encryption key. In these circumstances, the only reason to have the key disclosed is so that other material - not declared to the owner - can be decrypted. Even if the police only intend to do this for material legitimately seized, the Government must be aware that "bad apples" do exist and there is no public confidence that the key will be kept secure or only used properly. Therefore keys should only be compelled when there is no other way to obtain the material.
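The third technique above - verification without any key at all - can also be sketched. This assumes, purely for illustration, a cipher that encrypts 16-byte blocks deterministically, so that identical ciphertext blocks must correspond to identical plaintext blocks; the names are hypothetical.

```python
BLOCK = 16  # assumed block size, for illustration

def repeated_blocks(data: bytes) -> dict:
    """Map each block value to the positions at which it occurs,
    keeping only values that occur more than once."""
    positions = {}
    for i in range(0, len(data) - len(data) % BLOCK, BLOCK):
        positions.setdefault(data[i:i + BLOCK], []).append(i // BLOCK)
    return {blk: pos for blk, pos in positions.items() if len(pos) > 1}

def decryption_consistent(plaintext: bytes, ciphertext: bytes) -> bool:
    """If two ciphertext blocks are identical but the corresponding
    plaintext blocks differ, the putative decryption must be false."""
    for pos in repeated_blocks(ciphertext).values():
        blocks = {plaintext[p * BLOCK:(p + 1) * BLOCK] for p in pos}
        if len(blocks) > 1:
            return False
    return True

ct = b"A" * 16 + b"B" * 16 + b"A" * 16   # blocks 0 and 2 identical
assert decryption_consistent(b"x" * 16 + b"y" * 16 + b"x" * 16, ct)
assert not decryption_consistent(b"x" * 16 + b"y" * 16 + b"z" * 16, ct)
```

A test of this kind can only ever disprove a decryption or raise confidence in it, but that statistical confidence is exactly what the police officers described above say they need.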

It would also be possible to set up the decryption software and the files to be decrypted on a "clean" system and have the decryption supervised by an independent expert who would ensure that any temporary copy of the key was destroyed. Again, there would be no reason to hand over the key itself.

For all these reasons, when a case requires material to be decrypted, the initial power should be to require that the material be decrypted and evidence provided that the decryption is correct. It must not include a requirement that the key be handed over. Only if the owner of the material is unable to provide such proof should there be a requirement to hand over the key. If the legislation does not provide this sequence of steps, it can only be in order to widen the powers of law enforcement, contrary to the claims in the paper.

As alluded to above, there is no indication as to who can issue a "written notice", leaving me to conclude that any police officer can do so. I feel that this power ought to be reserved to the courts. The seizure of material would normally require a court order and thus there is no reason not to go back to a court for an order to decrypt data. Even where the data has been seized under s18 PACE, the reasons that these powers were granted are not applicable to examination of the data. It has been suggested that there may not be time to obtain a court order in some cases. This is not a believable argument - good forensic practice will require any computer material to be copied before examination, which takes time - nor is it good grounds for riding roughshod over basic human rights. Finally, it is very likely to be contrary to the Human Rights Act, which requires persons to be secure in their possessions; the exemptions for due legal process are normally taken to apply only to a court and not to mere police powers.

Even if the above arguments are not persuasive, I would state that the powers should be limited to an officer of the rank of Superintendent or above. An order to reveal a key, as opposed to an order to decrypt, should only be granted by a court and should allow the owner of the key an opportunity to be heard.

Safeguards

Mention of a Code of Practice in paragraph 73 would be rather more credible if the existing Code of Practice on Government consultations had been adhered to.

Unless a breach of the Code of Practice is a criminal offence, it offers no real safeguard against the legitimate concerns of the key owner. Once a key is revealed the damage is permanent. There needs to be more deterrent than a mere "breach of Code of Practice" to prevent unnecessary harm.

Oversight and complaints

Paragraph 75 suggests an oversight mechanism similar to that for IOCA. The latter has seen fewer than 10 complaints about interception warrants. This, of course, is because it is almost impossible to determine if one is the target of such a warrant.

Unless there is an automatic requirement to notify the owner of a key that it has been revealed at the earliest possible opportunity commensurate with good policing, this mechanism will be equally nugatory.

Key Escrow and Key Recovery

Paragraphs 80 to 82 continue to confuse communications security with storage secrecy. The suggestion in paragraph 81 that key escrow of communications keys has benefits to the user is little short of fraudulent. [It must be communications keys that are being referred to since the context of the statement is interception of communications.] Secure communications have been available to the general public for at least 75 years and, arguably, well over 2000. The Government will need to live with this fact.

Annex A - Licensing Criteria

I find myself uncomfortable with the use of secondary legislation in this area. As will be seen below, this is a somewhat technical topic and apparently minor details can have a disproportionate effect. It would be far better to take the time to determine at least the basic criteria under which licences should be granted and include them in the primary legislation. For example, there should be a general statement that no licensing requirement should weaken the security of data protected - directly or indirectly - by the licensed party.

General Licensing Criteria

These criteria appear to be based on the assumption that all licensed bodies would be UK limited companies. This seems to be unreasonable. As stated above, a "Certification Authority" is effectively a notary, and it should be open to Public Notaries and other solicitors to take such a role. Even if it were legitimate for these people to form a limited company, the only possible effect would be to restrict their liability to the potential detriment of their customers.

There are proposed requirements that a provider must show it is financially viable and has a valid business plan. For many start-up businesses it might not be possible to demonstrate the former until the business has its licence, and I find myself completely unable to see how Oftel is competent to judge business plans. Nor do I see why a provider needs to be a viable business. Provided that it meets the liability requirement below, it could be run as a hobby or as a loss-making unit of another business. To suggest otherwise is an irrational prejudice, particularly since many unconventional business models (such as shareware) have been shown to work on the Internet.

The most important criterion is liability. If the legislation places a certain liability on licensed bodies, then an applicant should be required to show that they can meet any credible claim on that liability, through insurance or otherwise. If this is met, then the financial viability of the company is irrelevant. If it is not, then even the best and prettiest business plan will not save it.

The phrase "the licensee would be expected to make provision for real-time communication with those responsible for running the organisation" also deserves comment. This requirement is of possible benefit in the case of a body holding escrowed keys where there is a need to obtain them within minutes - something that most people would call unrealistic - but makes no sense as a general licensing criterion unless there is a plan to introduce mandatory escrow at some later date.

Finally, the sentence on key generation seems even less relevant as a general licensing requirement. Many (perhaps all) licensed bodies will not be generating keys at all. Once again, this appears related to mandatory escrow.

Licensing Criteria for Certification Authorities

Most of the criteria appear reasonable, but once again there is a joker in the pack. The requirement that "the certificate must not be used to validate a key being used to secure the confidentiality of information" makes a mockery of the whole process. When I communicate with another person using public key cryptography, I need to ensure that the key being advertised as hers is actually hers and not a third party's. This is normally done by checking the signatures attached to the advertised key: her own self-signature, together with countersignatures from people whose keys I already trust.

If, however, she is not allowed to sign her own keys, this whole mechanism collapses. Such a requirement exists either in error (another sign of the unseemly haste in which this paper was written) or deliberately to prevent practical electronic commerce.

It is nobody's business but my own how my key pair is generated, as I will never reveal the private part to the CA. Therefore they have no need to know the process nor does the licensing authority have grounds to restrict it or require me to use specific equipment. Any such requirement would simply make licensed CAs unattractive to me.

Conditions on a TTP for the provision of a Confidentiality Service

As stated above, any access to a confidentiality key should be through a court order. Unless "a period specified in the licensing conditions" is measured in days (and this is a case where an apparently small detail actually makes a lot of difference), the requirements for disclosure of keys will directly contradict both good practice in computer security and the immediately following requirement for procedures to determine the authenticity of the request.

Of course, if such requests were digitally signed and there was a requirement that the relevant law enforcement private keys were escrowed with the service, it would concentrate minds wonderfully on both the security of and the necessity for such a system. Consumers might also be more willing to trust a body that the Government trusts.
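A minimal sketch of such a check, assuming a hypothetical request format of a request body plus an issue timestamp (toy RSA parameters again, insecure and purely illustrative): the key holder verifies the authority's signature and rejects stale requests, which is only workable if the permitted response period is measured in days rather than minutes.

```python
import hashlib
import time

# Toy RSA key pair for the requesting authority (illustrative only).
p, q = 101, 113
n = p * q
e = 11
d = pow(e, -1, (p - 1) * (q - 1))    # Python 3.8+ modular inverse

def _digest(data: bytes) -> int:
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % n

def sign_request(body: bytes, issued_at: float):
    """The authority signs the request body together with its timestamp."""
    sig = pow(_digest(body + repr(issued_at).encode()), d, n)
    return body, issued_at, sig

def accept_request(body, issued_at, sig, max_age_days=7, now=None):
    """Release a key only for a fresh, correctly signed request."""
    now = time.time() if now is None else now
    fresh = 0 <= now - issued_at <= max_age_days * 86400
    valid = pow(sig, e, n) == _digest(body + repr(issued_at).encode())
    return fresh and valid

body, ts, sig = sign_request(b"disclose key K-42", 1_000_000.0)
assert accept_request(body, ts, sig, now=1_000_000.0 + 3600)            # fresh
assert not accept_request(body, ts, sig, now=1_000_000.0 + 10 * 86400)  # stale
```

Signing the timestamp along with the body prevents an intercepted request from simply being replayed outside its permitted window.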

These comments apply to the conditions on Key Recovery Agents as well.

Submitted by email 1999-03-31

