The basic security mechanics are confidentiality, integrity, and availability. In the wireless world, as in many others, authentication, authorization, and access control must also be achieved. Availability is often viewed as a quality-of-service (QoS) feature rather than a security issue. Cryptography is, in some sense, the mechanism for achieving these security goals. The cryptographic topics of interest to the wireless world are digital signatures, encryption, and key management.
The following list provides the formal definitions of the basic security mechanics:
Definition: Confidentiality is the capability to send (and receive) data without divulging any part to unauthorized entities during the transmission of the data.
Mechanisms: Encryption, both symmetric and asymmetric.
Definition: Integrity is the capability to send (and receive) data such that unauthorized entities cannot change any part of the exchanged data without the sender and receiver detecting the change. If only integrity mechanisms are in place, data can still be changed, but the integrity check will detect the tampering.
Mechanisms: Digital signatures using one-way hash functions.
Definition: Here, availability is defined as the capability to receive and send data. For example, a system under a denial-of-service (DoS) attack is not able to receive or send data.
Mechanisms: Availability mechanisms are mostly defense mechanisms that detect various forms of DoS attacks and guard against them.
Definition: Authentication establishes the identity of the sender or receiver of information. Any integrity check or confidential information is often meaningless if the identity of the sending or receiving party is not properly established.
Mechanisms: Multiple levels and protocols, such as 802.1X, RADIUS, PAP/CHAP, MS-CHAP, and so on.
Definition: Authorization is tightly coupled with authentication in most network resource access requirements. Authorization establishes what you are allowed to do after you have identified yourself. (It is also called access control, capabilities, and permissions.) It can be argued that authorization does not always require a priori authentication. However, in this book, authentication and authorization are tightly coupled; authorization usually follows any authentication procedure.
Mechanisms: Multiple levels and protocols.
Definition: Access control is the capability to control the access of entities to resources based on various properties: attributes, authentication, policies, and so on.
Mechanisms: At the access point (AP) based on authentication or knowledge of the WEP key.
Definition: Encryption is the capability to transform data (plain text) into meaningless bytes (cipher text) based on some algorithm. Decryption is the act of turning the meaningless bytes back into meaningful data.
Mechanisms: The wireless domain employs mechanisms such as WEP, CKIP, and TKIP.
The main issue related to authentication and authorization in the wireless space is the robustness of the methods used in verifying an entity's identity. The second issue is maintaining the confidentiality of the "wire" and connection and keeping it bulletproof. In the wireless case, the wire is the air, so the problem of confidentiality becomes more difficult because anybody could potentially be a passive listener to the airwaves. The relevant point is that, in the WLAN space, encryption is needed if you are to trust the authentication.
Definition: A key is a digital code that can be used to encrypt, decrypt, and sign information. Some keys are kept private, and others are shared and must be distributed in a secure manner. Key management refers to the process of generating, distributing, changing, and revoking keys for the purposes previously mentioned.
Mechanisms: The challenge in the wireless area is key distribution that is secure, scalable, and automated.
Next, look at each of the mechanics in a little more detail.
Confidentiality is achieved using data encryption. Encryption can be done using either the symmetric key paradigm or the asymmetric key paradigm.
Symmetric key encryption, often referred to as secret key encryption, uses a common key and the same cryptographic algorithm to scramble and unscramble a message.
Symmetric key encryption and decryption are mathematically inexpensive compared to asymmetric key; therefore, they have a major performance advantage. For any bulk encryption, the preferred method is symmetric encryption.
Figure 2-1 shows two users, Alice and Bob, who want to communicate securely with each other. Both Alice and Bob have to agree on the same cryptographic algorithm to use for encrypting and decrypting data. They also have to agree on a common key, the secret key, to use with their chosen encryption/decryption algorithm. There are negotiation protocols to arrive at mutually agreeable algorithms and keys.
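The shared-key arrangement just described can be sketched in a few lines. The following is a toy construction (an XOR keystream derived from SHA-256), not a real secret key algorithm such as DES or AES, and the key and message values are hypothetical:

```python
import hashlib

# Toy symmetric cipher: both parties share one secret key and one
# algorithm. The keystream is derived from SHA-256 in counter mode;
# this is an illustration, NOT a secure cipher.
def keystream(key, length):
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def crypt(key, data):
    # XOR is its own inverse, so the same routine encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

secret_key = b"shared-secret"                  # agreed on out of band
cipher_text = crypt(secret_key, b"hello Bob")  # Alice encrypts
assert crypt(secret_key, cipher_text) == b"hello Bob"  # Bob decrypts
```

The essential point is symmetry: the same key and the same routine serve both directions, which is what makes the secure distribution of that one key so important.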
The symmetric key algorithms fall into two categories:
Block ciphers: Operate on 64-bit message blocks. Even though most block ciphers operate on a 64-bit block, it is not an absolute requirement.
Stream ciphers: Operate on a stream of data, which basically means that they operate on one byte at a time.
One point to note is that you don't get to choose which method to employ for a given algorithm. The "classic" WLAN security method, WEP, employs RC4, which is a stream cipher, whereas the newer WLAN security algorithms use the Advanced Encryption Standard (AES), which is a block cipher. It is informative to understand both methods and to learn from the vulnerabilities of the classic WLAN security methods, so let's look closely at block ciphers and their associated challenges first.
In block ciphers, it is necessary to break up larger messages into 64-bit blocks and somehow chain them together. Four common chaining mechanisms, called modes, exist; each mode defines a method of combining the plain text (the unencrypted message), the secret key, and the cipher text (the encrypted text) to generate the stream of cipher text that is actually transmitted to the recipient. These four modes are as follows:
Electronic codebook (ECB)
Cipher block chaining (CBC)
Cipher feedback (CFB)
Output feedback (OFB)
The ECB chaining mechanism encodes each 64-bit block independently but uses the same key. The result is that the same plain text will always result in the same cipher text. This weakness can be exploited in multiple ways. For example, if a snooper knows the plain text and the corresponding cipher text, that person can at least understand some parts of a message. Another vulnerability is the opportunity for an eavesdropper to analyze and perform pattern matching. A much simpler vulnerability is that an eavesdropper can recognize a change of information (when the cipher text changes) and make inferences without knowing the contents. For example, consider someone snooping a certain employee's automatic payroll transactions to a bank. Assuming that the amount is the same for each paycheck, each ECB-encoded cipher text message would appear the same. However, if the cipher text changes, the snooper could conclude that the payroll recipient received a raise and perhaps was promoted.
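The ECB weakness described above is easy to demonstrate. The "block cipher" here is a stand-in (a keyed hash truncated to 8 bytes), and the payroll values are hypothetical, but the leak it illustrates applies to any real cipher run in ECB mode:

```python
import hashlib

# ECB has no chaining, so identical plain text blocks always produce
# identical cipher text blocks under the same key. The block cipher is
# a stand-in: a keyed hash truncated to 8 bytes.
def toy_ecb_encrypt(key, block):
    return hashlib.sha256(key + block).digest()[:8]

key = b"payroll-key"
march = toy_ecb_encrypt(key, b"$2500.00")
april = toy_ecb_encrypt(key, b"$2500.00")
may = toy_ecb_encrypt(key, b"$3000.00")   # the employee got a raise

assert march == april   # a snooper sees the SAME cipher text each month...
assert march != may     # ...and can tell the moment the amount changes
```

The snooper never learns the amounts, yet still extracts information, which is exactly the inference attack the text describes.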
Remember, in the wireless world, encryption has one and only one function: to prevent an eavesdropper from reading (or, for that matter, making any intelligent inferences of) the data passing.
Because an eavesdropper can capture packets and analyze them later, achieving confidentiality is much more difficult. In practice, the goal is confidentiality for a limited time; the strength of the encryption used depends on how long the data should remain unreadable.
Another vulnerability is information leakage through pattern matching and through recognizing a change in otherwise fixed data. An eavesdropper can capture WLAN packets, look for changes, and draw inferences about the information. Because WLAN traffic is so easy to collect, achieving confidentiality in the WLAN world is more difficult than in a wired environment.
The remaining three modes (CBC, CFB, and OFB) have inherent properties that add an element of randomness to the encrypted messages. If you send the same plain text block through one of these three modes, you get back different cipher text blocks each time. Most secret key algorithms use one of these four modes to provide additional security for the transmitted data.
In CBC, each plain text block is XORed with the previous cipher text block before encryption. This still leaves the first block vulnerable, and for that, CBC uses an initialization vector (IV). An IV is an encrypted block of random data used as the first 64-bit block to begin the chaining process.
The CFB mode uses the cipher text of the preceding block rather than the plain text.
The OFB mode is similar to the CFB, but the XORed block is generated randomly and is therefore independent of the preceding plain text.
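The chaining idea behind CBC can be sketched as follows. The block encryption step is again a toy keyed hash rather than a real cipher such as DES, but the sketch shows how chaining plus a random IV hides repeated plain text blocks:

```python
import hashlib
import os

# Toy CBC-style chaining: each plain text block is XORed with the
# previous cipher text block (the IV for the first block) before the
# block encryption step. The block "cipher" is a keyed hash stand-in.
BLOCK = 8

def toy_block_encrypt(key, block):
    return hashlib.sha256(key + block).digest()[:BLOCK]

def cbc_encrypt(key, iv, blocks):
    prev, out = iv, []
    for b in blocks:
        mixed = bytes(x ^ y for x, y in zip(b, prev))  # chain step
        prev = toy_block_encrypt(key, mixed)
        out.append(prev)
    return out

key = b"k"
blocks = [b"AAAAAAAA", b"AAAAAAAA"]       # identical plain text blocks
ct = cbc_encrypt(key, os.urandom(BLOCK), blocks)

assert ct[0] != ct[1]   # chaining hides the repetition within a message
# A fresh random IV also makes repeated MESSAGES encrypt differently.
assert cbc_encrypt(key, os.urandom(BLOCK), blocks) != ct
```

Contrast this with the ECB behavior: the two identical input blocks now produce different cipher text, and re-sending the whole message with a new IV produces different cipher text again.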
The wireless world uses a stream cipher with an IV to achieve the randomness. In the case of Wired Equivalent Privacy (WEP), the IV is sent as plain text with the encrypted data. As you will see later, one of the weaknesses in wireless security deals with the implementation of the encryption algorithm and how the IV is handled.
The following are some of the more common symmetric key algorithms used today:
Advanced Encryption Standard (AES): AES was developed as part of the U.S. Department of Commerce's effort to develop the next-generation encryption standard. You can find more information at the National Institute of Standards and Technology website: http://www.nist.gov/public_affairs/releases/g00-176.htm. AES is currently considered the strongest of the common encryption algorithms, so all new implementations are moving toward it. For example, the IEEE 802.11i specification requires AES encryption.
Data Encryption Standard (DES): DES is the most widely used encryption scheme today. It operates on 64-bit message blocks. The algorithm uses a series of steps to transform 64 input bits into 64 output bits. In its standard form, the algorithm uses 64-bit keys, of which 56 bits are chosen randomly. The remaining 8 bits are parity bits (one for each 7-bit block of the 56-bit random value). DES is widely employed in many commercial applications today and can be used in all four modes: ECB, CBC, CFB, and OFB. Generally, however, DES operates in either CBC mode or CFB mode.
The standard DES key length is 64 bits. Due to export regulations, you will also see 40-bit DES as a standard. In 40-bit DES, all but 40 bits of the key are disclosed by the implementation of the communications mechanism. For example, you can implement 40-bit DES by prefacing each message with the same 24 bits of the DES key used to encrypt the data. 40-bit DES exists solely as an artifact of U.S. government export controls; there is no technical reason not to use standard DES at all times.
3DES (read "triple DES"): 3DES is an alternative to DES that preserves the existing investment in software but makes a brute-force attack more difficult. 3DES takes a 64-bit block of data and performs the encrypt, decrypt, and encrypt operations. 3DES can use one, two, or three different keys. The advantage of using one key is that, with the exception of the additional processing time required, 3DES with one key is the same as standard DES (for backward compatibility). 3DES is defined only in ECB mode, mainly for performance reasons: it compromises speed for the sake of a more secure algorithm. Both the DES and 3DES algorithms are in the public domain and are freely available.
Rivest Cipher 4 (RC4): RC4 is a proprietary algorithm invented by Ron Rivest and marketed by RSA Data Security. It is often used with a 128-bit key, although its key size can vary. RC4 is a stream cipher. It is unpatented but is protected as a trade secret; however, it was leaked to the Internet in September 1994. Because the U.S. government allows export of implementations using secret key lengths of 40 bits or less, some implementations use a very short key length. WEP uses the RC4 algorithm.
International Data Encryption Algorithm (IDEA): IDEA was developed to replace DES. It also operates on 64-bit message blocks but uses a 128-bit key. As with DES, IDEA can operate in all four modes: ECB, CBC, CFB, and OFB. IDEA was designed to be efficient in both hardware and software implementations. It is a patented algorithm and requires a license for commercial use.
Symmetric key encryption is most often used for data confidentiality because most symmetric key algorithms have been designed to be implemented in hardware and have been optimized for encrypting large amounts of data at one time. Challenges with symmetric key encryption include the following:
Changing the secret keys frequently to avoid the risk of compromising the keys
Securely generating the secret keys
Securely distributing the secret keys
Asymmetric encryption is commonly used to facilitate distribution of symmetric keys. A commonly used mechanism to derive and exchange secret keys securely is the Diffie-Hellman algorithm. This algorithm is explained in the "Key Management" section later in this chapter.
Asymmetric encryption is often referred to as public key encryption. It can use either the same algorithm or different but complementary algorithms to scramble and unscramble data. Two different but related key values are required: a public key and a private key. If plain text is encrypted using the public key, it can be decrypted only by using the private key (and vice versa).
The most common uses of public key algorithms are data confidentiality and sender authentication. Figure 2-2 shows how data integrity and confidentiality are provided using public key encryption.
The following steps must take place if Alice and Bob are to have confidential data exchange:
Both Alice and Bob create their individual public/private key pairs.
Alice and Bob exchange their public keys.
Alice writes a message to Bob and uses his public key to encrypt her message. Then she sends the encrypted data to Bob over the Internet.
Bob uses his private key to decrypt the message.
Bob writes a reply, encrypts the reply with Alice's public key, and sends the encrypted reply over the Internet to Alice.
Alice uses her private key to decrypt the reply.
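The six steps above can be traced with a toy RSA key pair. The primes 61 and 53 and the message value are illustrative only; real keys are far larger, and real systems add padding:

```python
# Toy RSA walk-through of the confidential exchange. The tiny primes
# just make the math visible; they provide no real security.
p, q = 61, 53
n = p * q              # 3233: the modulus in Bob's public key
e = 17                 # Bob's public exponent
d = 2753               # Bob's private exponent (kept secret)

message = 65           # Alice's plain text, encoded as an integer < n

cipher = pow(message, e, n)      # Alice encrypts with Bob's PUBLIC key
recovered = pow(cipher, d, n)    # only Bob's PRIVATE key recovers it

assert recovered == message
```

Anyone can perform the encryption step, because the public key is public; only the holder of the private exponent can reverse it, which is exactly the confidentiality property the steps rely on.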
Data confidentiality is ensured when Alice sends the initial message because only Bob can decrypt it with his private key. Data integrity is also preserved because, to modify the message meaningfully, an attacker would need Bob's private key. Data integrity and confidentiality are ensured for the reply as well because only Alice has access to her private key; she is the only one who can decrypt or meaningfully modify the reply.
However, this exchange is not very reassuring because it is easy for a third party to pretend to be Alice and send a message to Bob encrypted with Bob's public key. The public key is, after all, widely available. Verification that it was Alice who sent the initial message is important.
Figure 2-3 shows how public key cryptography resolves this problem and provides for sender authentication and nonrepudiation.
The following steps have to take place if Alice and Bob are to have an authenticated data exchange:
Both Alice and Bob create their public/private key pairs.
Alice and Bob exchange their public keys.
Alice writes a message for Bob, uses her private key to encrypt the message, and then sends the encrypted data over the Internet to Bob.
Bob uses Alice's public key to decrypt the message.
Bob writes a reply, encrypts the reply with his private key, and sends the encrypted reply over the Internet to Alice.
Alice uses Bob's public key to decrypt the reply.
Here the encryption is done using the private keys. An authenticated exchange is ensured because only Bob and Alice have access to their respective private keys. Bob and Alice also meet the requirement of nonrepudiation; that is, they cannot later deny sending a given message if their keys have not been compromised. This, of course, lends itself to a hot debate about how honest Bob and Alice are; they can deny sending messages by simply stating that their private keys have been compromised.
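A minimal sketch of this private-key-first exchange, using toy RSA numbers (illustrative only): Alice applies her private key, and anyone holding her public key can recover the message and thereby attribute it to her.

```python
# Authenticated exchange sketch: the private-key operation comes first.
# The key values are toy numbers for illustration only.
n, e, d = 3233, 17, 2753   # Alice's toy RSA key pair

message = 123
sealed = pow(message, d, n)      # Alice's PRIVATE-key operation
opened = pow(sealed, e, n)       # Bob applies Alice's PUBLIC key

assert opened == message         # only Alice's private key could produce `sealed`
```

Note that this provides origin authentication but no confidentiality: anyone with the public key can open the message.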
This scenario assumes a secure out-of-band exchange of the public keys between Alice and Bob. This would be practical for a one-time point-to-point exchange between two entities but is not scalable for a large number of entities.
If you want to use public key cryptography to perform an authenticated exchange and to ensure data integrity and confidentiality, double encryption needs to occur. Alice would first encrypt her confidential message to Bob with his public key and then encrypt the result again with her private key. Anyone could remove the outer layer by using Alice's public key (which authenticates her as the sender), but only Bob could decrypt the inner cipher text with his private key.
A crucial aspect of asymmetric encryption is that the private key must be kept private. If the private key is compromised, an evil attacker can impersonate you and send and receive your messages.
The mechanisms used to generate these public/private key pairs are complex, but they result in the generation of two large random numbers, one of which becomes the public key and the other the private key. Because these numbers and their product must adhere to stringent mathematical criteria to preserve the uniqueness of each public/private key pair, generating these numbers is fairly processor intensive.
Key pairs are not guaranteed to be unique by any mathematical criteria. However, the math ensures that no weak keys are generated.
Public key encryption algorithms require much more computation than symmetric algorithms and are not as amenable to hardware (chip) offload. Because of these performance constraints, public key encryption algorithms are rarely used for bulk data confidentiality. Instead, they are typically used in applications involving authentication using digital signatures and key management. Public keys are also used to encrypt session keys (symmetric keys) so that they can be exchanged or sent across a public network without being compromised.
Some of the more common public key algorithms are the Ron Rivest, Adi Shamir, and Leonard Adleman (RSA) algorithm and the El Gamal algorithm.
No discussion is complete without mentioning the relative strengths, weaknesses, and performance aspects of encryption algorithms.
One issue occurs when choosing between block and stream ciphers. The general approach is that, because hardware processes data a bit at a time, stream ciphers are more efficient for hardware-based encryption. For software-based encryption, a block cipher in CBC or a similar chaining mode is suitable.
A related issue is comparing the symmetric and asymmetric key operations. Symmetric encryption is much faster than asymmetric key encryption, and as a result, for encrypting any reasonable amount of data, symmetric key is better. So, usually, a session symmetric key is generated that is exchanged securely. Key exchange protocols use the asymmetric key for authentication and the exchange of the session symmetric keys.
The next question is key length and the strength of the algorithms. The strength of a key is basically the time, effort, and resources required to "break" it. Strength is relative to current technology, especially processing power. In fact, nonrepudiation mechanisms, such as the ETSI specifications, include provisions to re-encrypt data that is to be kept for long-term arbitration at a later time, when the strength of the original encryption has been diminished by technological advances.
The number of bits required in a key to ensure secure encryption in a given environment can be controversial. The longer the keyspace (the range of possible values of the key), the more difficult it is to break the key in a brute-force attack. In a brute-force attack, you apply all combinations of a key to the algorithm until you succeed in deciphering the message. Table 2-1 shows the number of keys that must be tried to exhaust all possibilities, given a specified key length.
Key Length (in Bits)    Number of Combinations
56                      2^56 = 7.205759403793 * 10^16
64                      2^64 = 1.844674407371 * 10^19
112                     2^112 = 5.192296858535 * 10^33
128                     2^128 = 3.402823669209 * 10^38
A natural inclination is to use the longest available key, which makes the key more difficult to break. However, the longer the key, the more computationally expensive the encryption and decryption process can be. The goal is to make breaking a key "cost" more than the worth of the information that the key is protecting.
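The entries in Table 2-1 are simply powers of 2 and can be reproduced directly:

```python
# Reproducing Table 2-1: a brute-force attack on a k-bit key must try
# up to 2**k combinations.
for bits in (56, 64, 112, 128):
    print(f"{bits:>3}-bit key -> {2**bits:.6e} combinations")

assert 2**56 == 72057594037927936   # matches 7.205759... * 10^16
```

Each additional bit doubles the work, which is why moving from 56-bit to 128-bit keys changes the attack cost by a factor of about 4.7 * 10^21.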
If confidential messages are to be exchanged on an international level, you must understand the current government policies and regulations. Many countries have controversial import and export regulations for encryption products based on the length of the key. The U.S. export controls on cryptography have a lot of nuances, so beware.
Another important issue is the initialization vector. WEP uses the RC4 algorithm with a 24-bit IV, which is sent in clear text. The 24-bit IV gives around 16 million combinations, so in theory one would have to capture millions of packets before seeing an IV reused. In fact, because of the birthday paradox, researchers have shown that IV collisions are likely after around 5000 packets. This aspect and other vulnerabilities are discussed in Chapter 6, "Wireless Vulnerabilities."
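The "around 5000 packets" figure follows from the birthday paradox. A quick sketch of the standard approximation (this assumes IVs are drawn uniformly at random; real WEP implementations often behaved worse):

```python
import math

# Birthday-paradox estimate of WEP IV reuse: with only 2**24 possible
# IVs, the probability that SOME pair among n packets repeats an IV is
# approximately 1 - exp(-n*(n-1) / (2 * 2**24)).
def iv_collision_probability(n, iv_bits=24):
    return 1.0 - math.exp(-n * (n - 1) / (2 * 2 ** iv_bits))

assert iv_collision_probability(5000) > 0.5    # better-than-even odds
assert iv_collision_probability(12000) > 0.98  # near certainty
```

So even though there are ~16 million IV values, an eavesdropper has a better-than-even chance of seeing a repeat after only a few thousand packets.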
Integrity mechanisms are aimed at detecting any changes to a set of bytes. The next two sections look at the integrity mechanisms using hash functions and digital signatures. Digital signatures use the hash function mechanism and encrypt the resultant hash.
A hash function takes an input message of arbitrary length and outputs fixed-length code. The fixed-length output is called the hash, or the message digest, of the original input message.
If an algorithm is to be considered cryptographically suitable (that is, secure) for a hash function, it must exhibit the following properties:
It must be consistent; that is, the same input must always create the same output.
It must be random?or give the appearance of randomness?to prevent guessing of the original message.
It must be unique; that is, it should be nearly impossible to find two messages that produce the same message digest.
It must be one way; that is, if you are given the output, it must be extremely difficult, if not impossible, to ascertain the input message.
One-way hash functions typically are used to provide a fingerprint of a message or file. Much like a human fingerprint, a hash fingerprint is, for practical purposes, unique and thereby helps prove the integrity and authenticity of the message.
Let's take a look at how hash functions are used. Use Figure 2-4 to clarify this discussion. Alice and Bob are using a one-way hash function to verify that no one has tampered with the contents of the message during transit.
The following steps have to take place if Alice and Bob are to keep the integrity of their data:
Alice writes a message and uses the message as input to a one-way hash function.
The result of the hash function is appended as the fingerprint to the message that is sent to Bob.
Bob separates the message and the appended fingerprint and uses the message as input to the same one-way hash function that Alice used.
If the hashes match, Bob can be assured that the message was not tampered with.
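The fingerprint steps above map directly onto a standard hash library. This sketch uses SHA-256; the particular hash function and message are illustrative choices:

```python
import hashlib

# The Alice/Bob fingerprint steps: hash the message, append the
# fingerprint, and let the receiver recompute and compare.
message = b"wire funds at noon"
fingerprint = hashlib.sha256(message).hexdigest()   # Alice appends this

# Bob recomputes the hash over the received message and compares.
assert hashlib.sha256(message).hexdigest() == fingerprint

# Any in-transit tampering changes the recomputed hash.
assert hashlib.sha256(b"wire funds at ONE").hexdigest() != fingerprint
```

As the next paragraph explains, this alone is not enough: an attacker who can alter the message can also recompute and substitute the fingerprint, which is why the hash must ultimately be protected by a key.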
The problem with this simplistic approach is that the fingerprint itself could be tampered with, and it is subject to man-in-the-middle (MitM) attacks.
An MitM attack refers to an entity listening to a believed-to-be-secure communication and impersonating either the sender or receiver. This entity intercepts the message from the sender, adds its own content, and finally substitutes the correct hash for the altered message. The receiver, who is unaware of the middle entity, verifies the hash (which, of course, would be correct) and comes to the conclusion that the altered message was sent by the sender. This deception works because the hash itself is not protected.
To effectively use hash functions as fingerprints, you can combine them with public key technology to provide digital signatures, which are discussed in the next section.
Common hash functions include the following:
Message Digest 4 (MD4) algorithm
Message Digest 5 (MD5) algorithm
Secure Hash Algorithm (SHA)
MD4 and MD5 were designed by Ron Rivest of MIT. SHA was developed by the National Institute of Standards and Technology (NIST). MD5 and SHA are the hash functions used most often in current security product implementations; both are based on MD4. MD5 processes its input in 512-bit blocks and produces a 128-bit message digest. SHA also processes its input in 512-bit blocks but produces a 160-bit message digest. SHA is more processor intensive and might run a little more slowly than MD5.
A digital signature is an encrypted message digest that is appended to a document. It can be used to confirm the identity of the sender and the integrity of the document. Digital signatures are based on a combination of public key encryption and one-way secure hash function algorithms. Figure 2-5 shows an example of how to create a digital signature.
The following steps must be followed for Bob to create a digital signature:
Bob creates a public/private key pair.
Bob gives his public key to Alice.
Bob writes a message for Alice and uses the document as input to a one-way hash function.
Bob encrypts the output of the hash algorithm?the message digest?with his private key, resulting in the digital signature.
The combination of the document and the digital signature is the message that Bob sends to Alice. Figure 2-6 shows the verification of the digital signature.
On the receiving side, these are the steps that Alice follows to verify that the message is indeed from Bob (that is, to verify the digital signature):
Alice separates the received message into the original document and the digital signature.
Alice uses Bob's public key to decrypt the digital signature, which results in the original message digest.
Alice takes the original document and uses it as input to the same hash function that Bob used, which results in a message digest.
Alice compares both of the message digests to see whether they match.
If Alice's calculation of the message digest matches Bob's decrypted message digest, the integrity of the document and the authentication of the sender are proven.
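The signing and verification steps can be combined into one sketch. The toy RSA numbers and the trick of reducing the digest into the toy modulus are illustrative only; real signature schemes pad and sign a full-size digest:

```python
import hashlib

# Digital signature sketch: hash the document, then apply the signer's
# private-key operation to the digest (Figures 2-5 and 2-6).
n, e, d = 3233, 17, 2753          # Bob's toy RSA key pair

def digest_int(doc):
    # Reduce the SHA-256 digest into the toy modulus range
    # (illustration only; not a real signature encoding).
    return int.from_bytes(hashlib.sha256(doc).digest(), "big") % n

document = b"routing update"
signature = pow(digest_int(document), d, n)   # Bob signs with his private key

# Alice verifies with Bob's public key: "decrypt" the signature and
# compare it against the digest she computes herself.
assert pow(signature, e, n) == digest_int(document)
```

Signing the digest rather than the document itself is what makes the scheme practical: the expensive private-key operation runs over a short fixed-length value regardless of document size.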
The initial public key exchange must be performed in a trusted manner to preserve security. This is critical and is the fundamental reason for digital certificates. A digital certificate is a message that is digitally signed with the private key of a trusted third party, stating that a specific public key belongs to someone or something with a specified name and set of attributes. If the initial public key exchange wasn't performed in a trusted manner, someone could easily impersonate a given entity. As an example, if A is a spy and A knows through a wiretap that you're going to send your public key to someone via e-mail, then A could block your message and substitute A's public key instead. That would buy A the capability to forge messages. In the real world, if you send your public key to someone, what are the odds that somebody else cares? Unless secrets or money depend on this, why would anyone care? And if you're worried about that, you can always call the person and (re-)establish trust. These are some of the questions and challenges for establishing trust.
Digital signatures do not provide confidentiality of message contents. However, it is frequently more imperative to produce proof of the originator of a message than to conceal the contents of a message. It is plausible that you might want authentication and integrity of messages without confidentiality, such as when routing updates are passed in a core network. The routing contents might not be confidential, but it is important to verify that the originator of the routing update is a trusted source. An additional example of the importance of authenticating the originator of a message is in online commerce and banking transactions, for which proof of origin is imperative before acting on any transactions.
Some of the more common public key digital signature algorithms are the Ron Rivest, Adi Shamir, and Leonard Adleman (RSA) algorithm and the Digital Signature Standard (DSS) algorithm. DSS was proposed by NIST and is based on the El Gamal public key algorithm. Compared to RSA, DSS is faster for key generation and has about the same performance for generating signatures but is much slower for signature verification.
Key management is a difficult problem in secure communications, largely due to social rather than technical factors. In the wireless world, scalability and manageability are two important factors. Current wireless technologies use symmetric key encryption. For a small number of access points (APs) and clients, it is reasonable to create a key and manually enter it. However, in most wide-scale corporations, this mechanism is awkward and outdated.
A common method used to create secret session keys in a distributed manner is the Diffie-Hellman algorithm. The Diffie-Hellman algorithm provides a way for two parties to establish a shared secret key that only those two parties know, even though they are communicating over an insecure channel. This secret key is then used to encrypt data using their favorite secret key encryption algorithm. Figure 2-7 shows how the Diffie-Hellman algorithm works.
The following steps are used in the Diffie-Hellman algorithm:
Alice initiates the exchange and transmits two large numbers (p and q) to Bob.
Alice chooses a random large integer XA and computes the following equation:
YA = q^XA mod p
Bob chooses a random large integer XB and computes this equation:
YB = q^XB mod p
Alice sends YA to Bob. Bob sends YB to Alice.
Alice computes the following equation:
Z = (YB)^XA mod p
Bob computes this equation:
Z = (YA)^XB mod p
The resulting shared secret key is as follows:
Z = q^(XA * XB) mod p
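The exchange can be traced end to end with deliberately tiny numbers (p = 23 and q = 5 are toy values; real exchanges use primes hundreds of digits long):

```python
# Diffie-Hellman with toy numbers, following the steps above.
p, q = 23, 5           # public: modulus and generator (sent in the clear)
XA, XB = 6, 15         # private: Alice's and Bob's secret random values

YA = pow(q, XA, p)     # Alice computes YA and sends it to Bob
YB = pow(q, XB, p)     # Bob computes YB and sends it to Alice

Z_alice = pow(YB, XA, p)   # Alice's computation
Z_bob = pow(YA, XB, p)     # Bob's computation

# Both arrive at the same shared secret without ever transmitting it.
assert Z_alice == Z_bob == pow(q, XA * XB, p)
```

An eavesdropper sees p, q, YA, and YB, but recovering the shared secret from those values requires solving the discrete logarithm problem discussed next.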
The security of Diffie-Hellman relies on two difficult mathematical problems:
Any eavesdropper has to compute a discrete logarithm to recover XA and XB. (That is, the eavesdropper has to figure out XA from seeing q^XA or figure out XB from seeing q^XB.)
Any eavesdropper has to work with very large numbers; numbers on the order of 100 to 200 digits can be considered large. The modulus p should be a large prime, and (p-1)/2 should also be prime.
Key management and bootstrap of trust are broad domains that are still evolving.
For public key algorithms, creating the public/private key pairs is complex. The pairs adhere to stringent rules as defined by varying public key algorithms to ensure the uniqueness of each public/private key pair. Uniqueness is "statistically" guaranteed; that is, the odds of two identical keys being generated independently are astronomical. The complexity associated with generating public/private key pairs is the creation of sets of parameters that meet the needs of the algorithm (for example, prime numbers for RSA and many other algorithms). Just the method for generating or finding large prime numbers is computationally hard.
It is ideal for the end user (the person or thing being identified by the key) to generate the key pair himself. The private key should never leave the end user's possession. In corporate environments in which this might not be practical or in which key escrow is required, different rules apply. But all technical solutions should attempt self-generation as the first goal of a design architecture so that the private key is known only to the entity creating the key pair.
The problem is how you can distribute the public keys in a secure manner and how you can trust the entity that gives you the key. For a small number of wireless entities, it might be manageable to call each other or to meet face to face and exchange public keys. A more scalable approach is to use digital certificates to distribute public keys. Digital certificates require the use of a trusted third party: the certificate authority.
A digital certificate is a digitally signed message that typically is used to attest to the validity of a public key of an entity. Certificates require a common format and are largely based on the ITU-T X.509 standard today. (ITU-T stands for International Telecommunication Union-Telecommunication Standardization Sector, the standards body responsible for many common standards, such as the V.32, V.42, and V.90 series for data communication over telephone networks using modems, the X series for data communications, and the H series for audio-visual multimedia systems.) The general format of an X.509 V3 certificate includes the following elements:
Serial number of the certificate
Issuer algorithm information
Issuer of certificate
Valid to/from date
Subject's public key
Public key algorithm information of the subject of the certificate
Digital signature of the issuing authority
Digital certificates are a way to prove the validity of an entity's public key and might well be the future mechanism to provide single login capabilities in today's corporate networks. However, this technology is still in its infancy as far as deployment is concerned. Much of the format of certificates has been defined, but there is still the need to ensure that certificates are valid, are manageable, and have consistent semantic interpretation. By semantic, we mean answers to questions, such as what is and what is not to be trusted and what level of trust and security the certificate implies.
Some people disagree with the comment about digital certificates being in their infancy (among them one of the reviewers). SSL/TLS, the most widely used cryptographic standard, uses digital certificates. This is a mature technology, but enterprise deployments for system or user authentication using digital certificates are still not widely deployed.
As noted, the certificate authority (CA) is the trusted third party that vouches for the validity of the certificate. It is up to the CA to enroll certificates, distribute certificates, and remove (revoke) certificates when the information they contain becomes invalid. Figure 2-8 shows how Bob can obtain Alice's public key in a trusted manner using a CA.
Assume that Alice has a valid certificate stored in the CA and that Bob has securely obtained the CA's public key. The steps that Bob follows to obtain Alice's public key in a reliable manner are as follows:
Bob requests Alice's digital certificate from the CA.
The CA sends Alice's certificate, which is signed by the CA's private key.
Bob receives the certificate and verifies the CA's signature.
Because Alice's certificate contains her public key, Bob now has a "notarized" version of Alice's public key.
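The CA exchange above can be sketched with a toy RSA construction. All names and key values here are hypothetical, and reducing the digest into the toy modulus stands in for a real X.509 signature encoding:

```python
import hashlib

# Toy CA sketch: the CA signs a (subject, public key) binding with its
# private key; Bob verifies using only the CA's public key.
CA_N, CA_E, CA_D = 3233, 17, 2753     # toy CA key pair (illustrative)

def digest_int(data):
    # Reduce the SHA-256 digest into the toy modulus (illustration only).
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % CA_N

alice_cert = b"subject=Alice;public_key=0x1234"         # hypothetical body
ca_signature = pow(digest_int(alice_cert), CA_D, CA_N)  # done once, by the CA

# Bob verifies the CA's signature and thereby trusts Alice's public key,
# even though he has never exchanged keys with Alice directly.
assert pow(ca_signature, CA_E, CA_N) == digest_int(alice_cert)
```

The leverage is that Bob needs only one key obtained securely out of band (the CA's public key) to verify any number of certified public keys.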
This scheme relies on the CA's public key being distributed to users in a secure way. Most likely, this occurs using an out-of-band mechanism. There is still much debate over who should maintain CAs on the Internet. Many organizations (including financial institutions, government agencies, and application vendors) have expressed interest in offering certificate services. In all cases, it's a decision based on trust. Some corporations might want to control their own certificate infrastructure, and others might choose to outsource the control to a trusted third party. There is also the issue of cost: third-party CAs' prices are too high for many enterprises.
Now that you have looked into some details of the confidentiality and integrity mechanics and at some methods to exchange keys, let's change gears and look at the next level: authentication and identity.