Public Key Encryption and Internet Security

Public key encryption

Public key encryption, sometimes called asymmetric encryption, was invented in the late 1970s in response to the problem of sharing information between parties without the risk of the keys used to encrypt that information being compromised.

In private key encryption, or symmetric encryption, both the sender and the recipient of encrypted information must be in possession of the same private key: the sender needs it to encrypt the information at source, and the recipient needs it to decrypt, or unlock, the information upon receipt.

The major problem with private key encryption is often not with the strength of the encryption keys themselves, but in the exchange of these private keys between the sender and recipient.

If the private key should become compromised, then anybody in possession of the private key and the algorithm used with it can read any and all communications between the sender and recipient without detection.

For example, in World War 2, Station X – the UK Government’s communications intercept centre established at Bletchley Park in 1939 – concentrated on compromising the private keys used to encrypt German messages. The keys were often sent just before the start of the message itself.

The schemes used by the German Navy – Naval Enigma – caused Station X some of the greatest problems. In addition to making the encryption key “stronger” by adding a fourth wheel to the usual 3-wheel Enigma encryption machine, the German Navy also used code books to remove the need for full encryption keys to be sent as part of a message. Code books were printed in faint, water-soluble ink so that, in the event a ship was captured by the Allies, the radio operator only had to throw the code book into water for it to become useless – and for the Naval encryption system to remain secure.

It was only when one of these code books and a Naval Enigma machine were captured by the British in 1941 that the Allies were able to listen in on German Naval communications and anticipate certain attacks.

Public key encryption solves the problems inherent in private key encryption by having two mathematically related keys: a public key and a complementary private key. The public key is used to encrypt a message and, upon receipt, the receiver uses their private key to decrypt the message. It is extremely difficult, if not impossible, to determine the value of one key from the other.
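The relationship between the two keys can be illustrated with a toy RSA example. The numbers below are tiny textbook values chosen for readability – real keys are thousands of bits long, and nothing here should be mistaken for a secure implementation:

```python
# Toy RSA key pair built from two small primes (illustration only).
p, q = 61, 53
n = p * q                    # 3233: the modulus, part of both keys
phi = (p - 1) * (q - 1)      # 3120
e = 17                       # public exponent
d = pow(e, -1, phi)          # 2753: private exponent, modular inverse of e

message = 65                         # a small integer "message"
ciphertext = pow(message, e, n)      # encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)    # decrypt with the private key (d, n)

assert recovered == message          # only the private key unlocks it
```

Knowing `(e, n)` lets anyone encrypt, but recovering `d` from them requires factoring `n` – trivial for these toy numbers, infeasible at real key sizes.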

Secure applications

Before the World Wide Web became consumer-oriented, Internet security was application specific – in other words, if an organisation wanted to transfer information securely, or authenticate information over the Internet, it had to implement its own encryption scheme, use an existing encryption tool such as PGP (Pretty Good Privacy), or use a library that implements a well-known encryption system such as the Data Encryption Standard, DES, or RC4.

The development of Secure Sockets Layer, SSL – the de facto standard for secure web communications – dates back to late 1993, when the National Center for Supercomputing Applications, NCSA released its web browser, Mosaic, and its web server, httpd. Mosaic and httpd were among the earliest implementations of the HTTP standard and incorporated support for fill-in forms and server-side scripting through the Common Gateway Interface, CGI.

As more sophisticated applications were developed using Mosaic and httpd, various groups began to develop secure protocols for both information transferred from the server to the browser; and for information submitted through the browser to the server.

In later versions of Mosaic and httpd, hooks to the program PGP were introduced to support the Privacy Enhanced Mail (PEM) standard, and an experimental version of the Common Client Interface (CCI) – the client-side equivalent of the Common Gateway Interface (CGI) – was also used in combination with PGP to secure both client-server and server-client communications.

At around the same time, an American company called Enterprise Integration Technologies, EIT developed S-HTTP – a superset of HTTP that allowed messages to be secured in a variety of ways, including encryption and digital signatures. In April 1994, the National Center for Supercomputing Applications and two companies – RSA, the owners of the RSA encryption system, and EIT – announced their intention to develop a secure version of Mosaic to enable “buyers and sellers to meet spontaneously and transact business”.

Also in April 1994, Netscape began developing its web browser for the mass market. With a clear understanding of the growth of the Internet – at the time, roughly 25 million users – Netscape understood the need for secure Internet transactions to facilitate electronic commerce and began designing Secure Sockets Layer, SSL as an open, secure communications protocol.

SSL differs from other secure protocols in that it was developed as a transport-level protocol. This means that rather than providing security for a specific application such as a web browser, SSL was designed as a layer that sits between an application and the network protocol TCP/IP and secures all communications between the application client and application server – without the software developer having to consider issues such as, for example, how to negotiate the encryption system in use, and how to exchange keys.
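This transport-level design survives in modern TLS libraries. Python’s standard-library ssl module, for instance, lets an application open an ordinary TCP socket, wrap it, and then read and write as usual while the library handles cipher negotiation and key exchange. A minimal sketch (the `open_secure` helper is an invented name for illustration):

```python
import socket
import ssl

# Sensible defaults: verify the server certificate and check the hostname.
context = ssl.create_default_context()

def open_secure(host: str, port: int = 443):
    """Open a TCP connection, then insert the TLS layer beneath the app."""
    sock = socket.create_connection((host, port))
    # wrap_socket performs the handshake; the caller then uses the returned
    # object exactly like a plain socket (send/recv), with encryption applied.
    return context.wrap_socket(sock, server_hostname=host)
```

The application code above never mentions ciphers or keys – exactly the separation of concerns SSL was designed to provide.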


In late 1994, the first implementation of SSL, version 2.0, was released. SSL 2.0 laid the foundation for a good, general-purpose secure application transport protocol, but its use in applications involving substantial risk or funds transfer was limited by a number of shortcomings.

In response to these shortcomings, Microsoft, in association with Visa, released its own encryption layer, called Private Communications Technology, or PCT. PCT was intended as an alternative to, and an enhanced version of, SSL 2.0 and was submitted as a candidate standard to the Internet Engineering Task Force, IETF – the body responsible for developing Internet standards.

Netscape released SSL 3.0 in late 1995, incorporating features from both SSL 2.0 and PCT. Since then, the IETF has assumed responsibility for SSL, renaming it Transport Layer Security, or TLS, to avoid showing a preference for either company. SSL 3.0 is currently the industry standard for secure communications and provides support for all three major functions essential for secure electronic transactions: mutual authentication, data encryption and data integrity.

Mutual authentication

Mutual authentication is the term given to the process of establishing trust between the client and server through digital certificates. A digital certificate is a package of information issued by a trusted third party called a certification authority, CA.

The certification authority signs this package digitally using its own private key. Using the certification authority’s public key, the client application – in most cases, a web browser – can confirm the source of the certificate and, provided the source is reputable, you can be confident that the server you are sending your private details to is the intended recipient.
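The signing and verification steps can be sketched using the same toy RSA numbers as before. A real CA uses large keys and structured X.509 certificates; the certificate contents and key values below are invented purely for illustration:

```python
import hashlib

# Toy CA key pair: (e, n) is published, (d, n) is kept secret by the CA.
n, e, d = 3233, 17, 2753

cert_data = b"CN=example.test; pubkey=toy-server-key"
digest = int.from_bytes(hashlib.sha256(cert_data).digest(), "big") % n

# The CA "signs" by encrypting the digest with its PRIVATE key.
signature = pow(digest, d, n)

# A browser recomputes the digest and checks it against the signature
# using the CA's PUBLIC key. A match proves the CA issued this certificate.
check = int.from_bytes(hashlib.sha256(cert_data).digest(), "big") % n
assert pow(signature, e, n) == check
```

Any change to `cert_data` after signing changes the digest, so the comparison fails – which is what makes a tampered certificate detectable.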

Similarly, many organisations, such as the UK’s Inland Revenue, are beginning to implement their own public key infrastructures (PKI) to enable client authentication. Client authentication enables an organisation you deal with to establish confidence – beyond the usual username / password combination – that it can accept information from you, or send information to you, that may be confidential or legally binding.

Data encryption

Data encryption is the process of obscuring information sent between the client and server during a secure electronic conversation. Data encryption ensures that anybody who manages to intercept, or listen in on, a secure conversation is unable to determine precisely what is being transmitted.

SSL provides strong data encryption between the client and server through private key, or symmetric, encryption using a pair of session keys: one for each direction of data communication – client-server and server-client. The session keys, which typically last only for a single encrypted conversation, are generated during what is called a key-exchange handshake, which takes place between the client and server before any private information is transmitted.

For the client and server to ensure privacy, SSL must implement a secure handshake protocol that protects the session key material exchanged between the client and the server at the beginning of the session. This is where public key encryption comes in.

After the client and server have negotiated the encryption and compression schemes for use during the session, the client shares a “pre-master secret” with the server. This is a 48-byte value generated by the client using a secure random number generator, which is then encrypted using the server’s public key. The client then sends the encrypted pre-master secret to the server.

Upon receipt, the server uses its private key to recover the 48-byte value generated by the client from the encrypted message. This 48-byte value is then processed using a “one-way function” – a function for which any given input always produces the same output, but where the original input value cannot be derived from the output.
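These two values can be sketched with SHA-256 standing in as the one-way function – SSL 3.0’s actual construction combines MD5 and SHA-1 in a specific pattern, so this is an illustration of the idea rather than the real derivation:

```python
import hashlib
import os

# The client draws 48 bytes from a cryptographically secure random source.
pre_master_secret = os.urandom(48)

# One-way function: same input always gives the same output, but the
# input cannot be recovered from the output.
master_secret = hashlib.sha256(pre_master_secret).digest()

# Deterministic: both ends compute the identical master secret.
assert hashlib.sha256(pre_master_secret).digest() == master_secret
```

Because both sides hold the same pre-master secret after the public-key exchange, both can compute the same master secret without it ever crossing the network.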

The new value generated by the one-way function, within the context of SSL, is called a “master secret”, from which both the client and the server can generate the four further keys needed for secure data communication: the client-server encryption key; the server-client encryption key; and the two corresponding client and server message authentication code keys, called MAC secrets, used for checking that a secure message has not been tampered with during transmission.

The keys are derived from the master secret – rather than using the master secret itself – to ensure that no information that could be used to derive the master secret is ever transmitted over the network: if the master secret were ever compromised, unlocking all communications between the client and server would be a simple process.

Data integrity

Data Integrity, in the context of an electronic conversation between a sender and a recipient, is the ability for the recipient of information to:

  • test the accuracy of information transmitted by the sender to ensure that changes have not been made to the message during transit, either intentionally or not;
  • establish confidence that the information was actually sent by the sender and not by a third party; and
  • determine whether information has been delayed in transit, or “replayed” from an earlier transaction by a third party.

SSL ensures data integrity using message authentication codes. Information transmitted over an SSL connection is broken up into fragments. For each fragment, a message authentication code, or MAC, is generated by a one-way function in combination with the appropriate MAC secret – for client-server or server-client communication – generated during the initial key-exchange handshake.

The MAC is then sent along with the encrypted data fragment to the receiver. The receiver decrypts the data fragment, generates its own MAC using the appropriate client-server or server-client MAC secret, and then compares the MAC it has just calculated with the MAC generated by the sender. If the two MACs are the same, the recipient can be confident that the message has not been corrupted or modified during transit, and that it was sent by the sender and not a third party.
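The check can be sketched with HMAC-SHA256 standing in for SSL 3.0’s own MAC construction; the secret and fragment contents are invented for illustration:

```python
import hashlib
import hmac

mac_secret = b"shared-mac-secret-from-handshake"   # illustrative value
fragment = b"transfer 100 pounds to account 42"

# Sender computes a MAC over the fragment and transmits both.
sent_mac = hmac.new(mac_secret, fragment, hashlib.sha256).digest()

# Receiver recomputes the MAC over the decrypted fragment and compares
# using a constant-time comparison.
received_mac = hmac.new(mac_secret, fragment, hashlib.sha256).digest()
assert hmac.compare_digest(sent_mac, received_mac)      # fragment intact

# Any tampering with the fragment produces a different MAC.
tampered = b"transfer 999 pounds to account 66"
forged_mac = hmac.new(mac_secret, tampered, hashlib.sha256).digest()
assert not hmac.compare_digest(sent_mac, forged_mac)    # tampering detected
```

A third party who does not hold the MAC secret cannot produce a matching MAC for a modified fragment, which is what ties integrity to authenticity.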

The problem of identifying whether information has been delayed in transit, or replayed from an earlier transaction, is resolved in two ways. First, the use of session keys means that each discrete electronic conversation uses different encryption keys, so messages from one conversation are rendered useless in a later conversation. Second, sequence numbers encoded in the MACs sent with each fragment ensure that if any part of a message is delayed, or replayed during the same conversation, the connection is terminated, alerting both the recipient and sender to possible interception.
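Mixing a sequence number into the MAC input is enough to make a replayed fragment detectable. In this hedged sketch (SSL 3.0’s actual MAC input differs in detail), the receiver tracks the expected sequence number itself, so an attacker cannot simply replay an old fragment with its old MAC:

```python
import hashlib
import hmac

mac_secret = b"per-direction-mac-secret"   # illustrative value

def fragment_mac(seq: int, fragment: bytes) -> bytes:
    """MAC over the sequence number and fragment together."""
    return hmac.new(mac_secret, seq.to_bytes(8, "big") + fragment,
                    hashlib.sha256).digest()

fragment = b"place order number 7"
mac_at_seq_3 = fragment_mac(3, fragment)       # sent as fragment number 3

# If the same fragment and MAC are replayed later, the receiver -- now
# expecting sequence number 4 -- computes a different MAC, the comparison
# fails, and the connection is dropped.
assert fragment_mac(4, fragment) != mac_at_seq_3
```

Because the sequence number is never transmitted separately, a replayed fragment cannot be "renumbered" without also forging a new MAC.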


Using public and private key encryption schemes in combination, SSL – or TLS, as it is now called – represents a solid foundation for the development of secure applications on the Internet and is a key enabling technology for e-business applications beyond simply acquiring username / password combinations and credit card details.