Cryptography

Originally published July 1998
by Carlo Kopp
© 1998, 2005 Carlo Kopp


The science of cryptology has experienced a boom in the last decade, largely due to the massive and unprecedented growth in the area of electronic commerce. Historically, cryptographic techniques have been the domain of governments and the military, and the discipline has a long and very colourful history in these areas.

It is worth noting that many critical battles during WW2 were won to a large degree due to the superior cryptanalytical ability of the winning side. Knowing the other party's secrets always provides a key advantage. Until recent times, cryptographic techniques were of limited importance in the commercial world, since most transactions were performed on paper.

This is no longer true, and therefore cryptology has become a critical aspect of modern data transmission. This will become increasingly true with further growth in networking, the wider use of wireless networking, and the further proliferation of public domain operating systems and tools. Until recently, it could be safely assumed that your machine-to-machine traffic was not easily accessible to third parties, whether it ran over a local LAN or a long haul direct link, and it was therefore reasonably safe to carry unencrypted traffic within your network. The move away from expensive privately owned networks to cheaper common carriers of IP traffic has destroyed this fundamental underlying assumption, for two reasons.

The first reason is that you can never be sure that some machine along your virtual link is not sniffing your packets. The second is that de facto universal Internet connectivity means you can never be entirely sure that a third party has not found a hole in your firewalling and installed a sniffer somewhere inside your network. With the almost universal adoption of 10 and 100 Base T twisted pair networks, and the ongoing growth in wireless techniques, a further issue arises: third parties passively sniffing your intentional (wireless) or unintentional (twisted pair radiation) radio frequency emissions.

Suitable receivers can in both instances recover a substantial proportion of, if not all of, your digital traffic without either physical or network access to your private network. Needless to say, reassembling traffic to extract useful data and passwords is not a technically insurmountable task once the raw data has been logged. Clearly the security of transmitted data is very important today, and will become more important as current trends continue. This is as true of transaction related data and regular machine-to-machine traffic as it is of maintaining privacy and authenticating oneself in a networked and digital documentation environment.

Privacy, Authentication and Cryptology

The privacy issue in the transmission and handling of digital data is trivially obvious. We want to be in the position where only those people whom we choose can read the data or document we send. This is the classical problem in cryptology: we wish to deny a third party knowledge of a message being sent to another. Our cryptographers seek to encode the message in such a manner that a third party's cryptanalyst cannot easily, if at all, decode it.

This is indeed the classical cryptology problem, one which has kept minds busy for at least two thousand years. The authentication issue has historically been less important, but is now becoming of major importance. Digital transaction processing over networks, i.e. sending money over networks, requires both high levels of security for credit card numbers and a solid measure of authentication, so that the party accepting the transaction knows the originator is genuine.

Moreover, authentication is also becoming a major issue with digitised works of art, such as pictures and sound. Purloining of unlicensed copies of such material is now very common, given the W3 and the proliferation of readily available image processing tools such as Photoshop and the GIMP. The embedding of hidden authentication or signature information in such material is termed steganography.

Cryptographic techniques play a central role in providing both privacy and authentication of data, and are therefore a subject with which every site administrator and programmer should have at least basic familiarity, since they will be encountered with increasing frequency over time.

Basic Issues in Cryptography

In the simplest of terms, the central idea behind all cryptographic techniques is that of an algorithm which can translate a message into an unreadable form with relative computational efficiency, yet which, without some secret information, requires significant if not practically unbounded computational effort to translate back. The secret information used to encode and decode the message is termed the key, and features in virtually any discussion of cryptology. Cryptographic techniques mostly provide a degree or measure of security, typically proportional to the key size and the cleverness of the algorithm used. In practice this means that the security of most encoded messages is measured by the amount of time an opponent must burn up on a supercomputer, or a large array of machines, to crack the message.

Therefore, if you are serious about security, there is no substitute for a large key. Key management is another important issue in this context: if a third party can gain easy access to your key, then no matter how clever the cryptography you employ may be, it is quite useless. Serious data security must therefore encompass proper and secure schemes for key management.
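
To make the scale of the brute force problem concrete, the toy Python sketch below (the repeating-key XOR "cipher" and the sample message are purely illustrative, and deliberately weak) exhaustively searches a 16-bit key space. Every additional bit of key doubles the attacker's work, which is why key size dominates the cost of this style of attack.

    # Toy sketch: exhaustively searching an n-bit key space takes on the
    # order of 2**n trials; here n = 16, which is trivial, but n = 128 is not.
    import itertools

    def xor_cipher(data: bytes, key: bytes) -> bytes:
        # Repeating-key XOR: trivially weak, used only to illustrate brute force.
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    secret_key = b"\x5a\xc3"                    # a mere 16-bit key
    ciphertext = xor_cipher(b"attack at dawn", secret_key)

    # An attacker who knows the algorithm simply tries every possible key.
    for candidate in itertools.product(range(256), repeat=2):
        key = bytes(candidate)
        if xor_cipher(ciphertext, key) == b"attack at dawn":  # in practice, a statistical test
            print("key recovered:", key.hex(), "after at most 2**16 trials")
            break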

Classical cryptography is often termed secret-key or symmetric cryptography. In a symmetric cryptographic scheme, the sender and the receiver each hold an identical copy of the key, which has been provided or agreed upon via a secure channel, e.g. a person to person meeting, such that there is high confidence that no third party can know the key. Secret key cryptography is potentially insecure precisely because a key must be shared between two parties. In 1976 Whitfield Diffie and Martin Hellman published an important new idea, that of public-key or asymmetric cryptography. In public key cryptographic schemes, every user has two keys. One is termed the public key, and is published for all to read; the other is termed the private key, and is kept secret.

All messages sent across the network are encrypted using only the public key, and can only be decoded using the private key. The private key is never communicated or shared, and therefore, unless its owner's account has been compromised, it can be considered fairly secure. Public key cryptography is not so much a replacement for secret key cryptography as a supplement, intended for use together with it. Public key techniques are of much importance in situations where it is difficult to manage secret keys.
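
As a concrete illustration, the minimal sketch below uses the RSA implementation in the Python "cryptography" package (an assumption about available tooling; any comparable library would serve) to show the asymmetry: anyone may encrypt with the published key, but only the holder of the private key can decrypt.

    # Minimal public key sketch using the "cryptography" package.
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    # The recipient generates a key pair and publishes only the public half.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    # Anyone may encrypt with the public key ...
    ciphertext = public_key.encrypt(b"a short secret", oaep)

    # ... but only the private key holder can recover the plaintext.
    assert private_key.decrypt(ciphertext, oaep) == b"a short secret"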

Secret key techniques will remain in wide use in those situations where it is easy to manage secret keys in a centralised manner, and the centralised site is considered to be secure. In a distributed networking environment, where it is difficult to manage secret keys in a secure manner, public key techniques can be used to solve the key management problem. The biggest limitation of public key encryption techniques is that they are typically much slower to compute than secret key encryption techniques of comparable security.

Therefore, a popular scheme is the digital envelope, in which the message proper is encoded using a fast secret key technique, and the much shorter secret key is then encoded using a public key technique with the public key of the intended recipient; both are embedded in a single transmission. The recipient uses his or her private key to decrypt the secret key, and then uses the secret key to decrypt the body of the message. The digital envelope provides most of the computational speed advantage of the secret key methods, with the key management security advantages of the much slower public key schemes.
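
A minimal sketch of the digital envelope, again assuming the Python "cryptography" package, with its Fernet recipe standing in for the fast secret key scheme and RSA/OAEP for the public key scheme:

    # Digital envelope sketch: bulk data under a fast secret key,
    # the secret key itself sealed with the recipient's public key.
    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    recipient_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    recipient_public = recipient_private.public_key()

    # Sender: one-off secret key for the body, sealed with the public key.
    session_key = Fernet.generate_key()
    envelope = (recipient_public.encrypt(session_key, oaep),
                Fernet(session_key).encrypt(b"a long message ..." * 100))

    # Recipient: recover the secret key first, then the message body.
    recovered_key = recipient_private.decrypt(envelope[0], oaep)
    message = Fernet(recovered_key).decrypt(envelope[1])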

Another important issue which arises in this general context is that of data integrity in transmission. While protocols such as TCP provide a nominally clean (i.e. error free) channel, there is no guarantee that the applications communicating over this channel will not introduce errors into the message, a problem with which users of email and W3 browsers will be most familiar. The classical approach to this problem is to employ a checksum, cyclic redundancy check (CRC) or Forward Error Correction (FEC) code, appended to the main body of the message, to provide the recipient with a reliable integrity check on the body of the message.

Transmission errors (or tampering) are detected when the recipient regenerates the error control data and compares it with the received copy. If they differ, the message has been trashed en route. Importantly, such schemes are based upon the assumption of a non-malicious transmission environment, in which it is possible for all parties to use the agreed upon error control code (e.g. CRC), since errors can only result from systemic problems in transmission.
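
A minimal sketch of such a check, using the CRC-32 routine from Python's standard zlib module (the message is illustrative):

    # Checksum-based integrity test against non-malicious transmission errors.
    import zlib

    message = b"transfer 100 credits to account 42"
    crc = zlib.crc32(message)           # sender appends this to the message

    received = message                  # imagine this arrived over the network
    if zlib.crc32(received) == crc:
        print("integrity check passed")
    else:
        print("message corrupted en route")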

Once we assume that the environment is potentially malicious, i.e. that errors may be intentionally inserted into a message, the use of commonly agreed upon error control codes becomes a problem in itself, since a malicious player can alter a message and then simply recompute the error control data. Under such conditions, cryptographic techniques can provide a robust defence of data integrity, since compromising the data content without detection becomes much more difficult.
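
One widely used cryptographic defence is a keyed message authentication code. The sketch below uses HMAC from the Python standard library (the key and messages are illustrative); without the shared secret key, a malicious party cannot recompute a valid tag for a tampered message.

    # Keyed integrity check: unlike a plain CRC, the tag cannot be forged
    # without knowledge of the shared secret key.
    import hashlib, hmac, secrets

    shared_key = secrets.token_bytes(32)    # agreed between sender and receiver

    message = b"transfer 100 credits to account 42"
    tag = hmac.new(shared_key, message, hashlib.sha256).digest()

    # A tampered message fails verification, since the attacker cannot
    # regenerate a matching tag without the key.
    tampered = b"transfer 999 credits to account 66"
    assert hmac.compare_digest(tag, hmac.new(shared_key, message, hashlib.sha256).digest())
    assert not hmac.compare_digest(tag, hmac.new(shared_key, tampered, hashlib.sha256).digest())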

Compression of data in messages must be performed before any encryption, since any effective encryption technique will convert the message into a format which has statistical properties very similar to random or pseudo-random data, and therefore will not compress usefully.
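
This is easily demonstrated. The sketch below (assuming the Python "cryptography" package) compresses the same data before and after encryption:

    # Ciphertext is statistically close to random and gains little from
    # compression; compress first, then encrypt.
    import zlib
    from cryptography.fernet import Fernet

    plaintext = b"the quick brown fox jumps over the lazy dog " * 200
    ciphertext = Fernet(Fernet.generate_key()).encrypt(plaintext)

    print(len(zlib.compress(plaintext)), "bytes: compressed plaintext")
    print(len(zlib.compress(ciphertext)), "bytes: compressed ciphertext")
    # The repetitive plaintext shrinks dramatically; the ciphertext barely
    # shrinks at all (what little it does is the base64 coding Fernet applies).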

Cracking Codes and Countermeasures

Cryptanalysis, more commonly known as "code cracking", is the science of defeating another party's encryption techniques. Historically, its best known practitioners were the UK's GCHQ, the US National Security Agency (NSA) and their Soviet counterparts, organisations well endowed with mainframes and later supercomputers, and staffed by very sharp mathematicians and linguists. In Australia, this role is performed by the Defence Signals Directorate (DSD), which provides the government's principal source of expertise in this area. An interesting side note is that the first computers were developed and paid for specifically for the purpose of cracking other people's codes. If we wish to break another party's code, there are numerous well documented approaches which can be followed, and no doubt many more which are less well documented:

  • compromise the other party's algorithm and key by theft, bribery, eavesdropping or other such nefarious means. Many examples are documented, such as the theft of the Enigma, or the eavesdropping of Soviet embassies during the Cold War. The drawback of such methods is not only the potential for getting caught, but also the potential for the other party to discover the compromised code/key and start feeding bogus messages to deceive.
  • crack the code by brute force computation. This approach relies on techniques which typically perform exhaustive searches of the key space, and is essentially a game where the player with the most money to spend on supercomputers gets there first. Its disadvantage is that it is essentially guesswork, and breaking a well constructed cipher can take a very large amount of expensive time on a big machine (the toy sketch presented earlier gives the flavour).
  • crack the code by discovering some regularity or idiosyncrasy in its structure. This approach is based on finding weaknesses in another player's cipher, which reflect regularity in the structure of the encrypted message in a manner which can be detected by a cryptanalyst.

The cryptology community frequently stages public challenges in cracking would-be "unbreakable" codes, and a recent development in this context has been toolsets which allow a large array of diverse networked machines to work together on cracking a code. With compute power becoming quite cheap, and large numbers of machines being networked, this is reflected in an increasing trend for key sizes in most encryption schemes to creep upward with time, matching the growth in available computing performance. Another interesting side effect of these issues is that governments have generally been very reluctant to allow citizens and corporations the use of highly secure encryption schemes, since particularly effective codes would deny governments the ability to surveil the population.

The problem in this context is that both sides have a good case to argue. Why should the government have the right to read my email as it pleases? Can I trust civil servant X not to use this information to damage me? On the other hand, why should private citizen or corporation Y have the right to conceal its messages from law enforcement and intelligence agencies, should these messages serve criminal or espionage related purposes?

This issue has been the subject of much controversy in the US, for very good reason, and governments do have cause for concern. Some very good encryption tools are now becoming very widely available, and while this clearly benefits Joe Average by increasing his personal privacy, it does raise serious concerns about their use for dishonest purposes.

This issue will continue to be one of major concern for both sides in the debate, and given the difficulty in enforcing the use of weak ciphers by the public and corporations, the odds are that governments will lose in the long run. Technology has in this instance struck yet again, and is in the process of destroying a monopoly long held by governments.

How does one strengthen one's encryption techniques to make them more difficult to crack? The theoretically most sound encryption method, generally regarded as providing perfect secrecy, is the One Time Pad, first proposed by Vernam in 1926. The one time pad scheme is based on the idea of XOR-ing a message with a secret, genuinely random key of the same size as the message.

Provided the key is truly random in content, unknown to the eavesdropper/cracker, and used only once, the scheme is considered essentially impossible to crack. The drawback of this approach, however, is that keys as long as the traffic they protect must be exchanged between users and guaranteed to be secure. In practice this makes the one time pad a highly cumbersome technique to use, especially in a networked environment.
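
A minimal sketch of the scheme in Python (secrets.token_bytes is a cryptographically strong generator standing in here for a source of true randomness):

    # One time pad: XOR the message with a random, secret, single-use key
    # of equal length.
    import secrets

    message = b"attack at dawn"
    pad = secrets.token_bytes(len(message))   # must never be reused

    ciphertext = bytes(m ^ k for m, k in zip(message, pad))
    recovered = bytes(c ^ k for c, k in zip(ciphertext, pad))
    assert recovered == message
    # Security collapses if the pad is reused: the XOR of two ciphertexts
    # under the same pad equals the XOR of the two plaintexts.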

The next approach which can be applied is to use a known encryption algorithm with a very large key. If the key is large enough, and held securely, a code cracker will simply not be able to defeat the code in a useful timescale. Needless to say, the issue of legally permissible key sizes has been a central point in the debate between users of ciphers and government regulators.

A disadvantage of very large key sizes is that they generally result in considerably longer times to encrypt and decrypt messages. Another approach to enhancing the security of encryption is multiple encryption, where a message is encrypted with a given key and algorithm, and the process is repeated several times over, so that a code cracker must defeat several consecutive encryptions. An interesting example of this approach was practised by the US military during WW2, which employed obscure indigenous languages such as Navajo to first encode messages, which were then encrypted using conventional ciphers. On the basis of published material, this scheme was never broken.
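
A minimal sketch of the layered idea, again assuming the Python "cryptography" package and, for brevity, reusing the same algorithm with two independent keys (real designs would typically vary the algorithms as well):

    # Multiple encryption: an attacker must now defeat both layers.
    from cryptography.fernet import Fernet

    inner = Fernet(Fernet.generate_key())
    outer = Fernet(Fernet.generate_key())

    ciphertext = outer.encrypt(inner.encrypt(b"a doubly protected message"))
    plaintext = inner.decrypt(outer.decrypt(ciphertext))
    assert plaintext == b"a doubly protected message"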

Proper key management is another important means of making life difficult for an opponent bent upon beating your encryption. Proper key management means that private or secret keys must be:

  • kept secure from access by unauthorised parties;
  • distributed in a secure manner, such that they are not compromised;
  • frequently changed, so that an opponent cannot use statistical techniques to discover patterns, and so that a compromised key invalidates security only for messages encoded with that key alone (a rotation sketch follows this list);
  • properly archived once disused, so that they are not inadvertently reused at a future time.
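
As one illustration of the key change discipline, the MultiFernet recipe in the Python "cryptography" package (an assumption about available tooling) supports exactly this style of rotation:

    # Key rotation: new traffic uses the newest key, while old ciphertexts
    # remain readable and can be re-encrypted under the new key.
    from cryptography.fernet import Fernet, MultiFernet

    old_key = Fernet.generate_key()
    token = Fernet(old_key).encrypt(b"recorded under the old key")

    new_key = Fernet.generate_key()
    ring = MultiFernet([Fernet(new_key), Fernet(old_key)])  # newest first

    rotated = ring.rotate(token)        # now encrypted under new_key
    assert ring.decrypt(rotated) == b"recorded under the old key"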

The compromise of a private/secret key can have dire consequences: not only can messages be read, but the originator of a message can be impersonated. This is a major issue with networked transactions, since a third party could in theory make perfectly legitimate-looking purchases against your credit card, and you would have much difficulty proving otherwise.

Summary

Clearly data security is an issue of increasing importance, and an area in which, in many respects, there are few trivial answers. The technology is now available to make decryption of messages sent by private citizens and corporate users quite difficult, even for parties such as governments, which have traditionally had the means to defeat commonly available schemes.

For the system administrator or IT manager, cryptographic know-how will become a major issue, since it will impinge upon an increasing amount of an organisation's computing and networking activity in coming years. Organisations will have to become much more literate in this area than is currently the case, and procedures for proper key management and security will have to be adopted and enforced.

The general issue of keeping unwanted visitors off your systems will be compounded by the potential for such visitors to steal users' keys for the purpose of illegitimate transactions in a heavily networked environment of digital commerce. Increasingly, we can expect to see a trend toward encrypting all important traffic over internal and external network connections, to ensure that third parties cannot glean sensitive information, let alone keys and passwords.

Given the now universal availability of various legitimate and less legitimate sniffing tools, in the longer term encryption of traffic will become an essential part of networking. There can be no doubt that we live in interesting times. (Readers interested in more detail are directed to Welsh's "Codes and Cryptography" and the excellent RSA Labs FAQ document.) A follow-on feature will look more closely at commonly used encryption schemes and tools.
