A Glossary of Cryptographic Algorithms

Cryptography at its very core is math. Pure, simple, undiluted math. Math created the algorithms that are the basis for all encryption. And encryption is the basis for privacy and security on the internet. So, we love math. Even if it is a tad complicated. With that being said, algorithms have to be built to work against computers. As computers get smarter, algorithms become weaker and we must therefore look at new solutions. This is how cryptography evolves to beat the bad guys.

So how is it done? First you need to build a cryptosystem that is both confidential and authentic. This cryptosystem is responsible for creating the key(s) that will be used to encrypt and then decrypt the data or message. A number of encryption algorithms have been created over the years to generate these keys, some of which have since been deprecated as computing power has increased.

Before going through some of the main and most popular algorithms known in cryptography, it might be a good idea to recap on a couple of terms you will probably come across a lot during this article.

Brute Force Attack

A brute force attack is a trial-and-error method of obtaining the key that protects an encrypted packet of data by systematically trying every possible value (a dictionary attack is a related variant that tries only likely candidates). The trial and error is done by a computer, so the higher the computational power, the more “tries” it can make in a short space of time. As computing power and performance increase, the ability to find the key increases, unless you increase the length of the key so that a higher number of possibilities exists.
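To make that concrete, here is a minimal sketch in Python of a brute-force search against a deliberately tiny, made-up 16-bit “cipher” (a simple XOR construction invented for this illustration, not a real algorithm). With only 65,536 possible keys the loop finishes instantly; every extra key bit doubles the work.

    # Toy brute-force key search against a hypothetical 16-bit XOR "cipher".
    # Real ciphers use 128-bit or larger keys, making this loop infeasible.
    def toy_encrypt(key: int, data: bytes) -> bytes:
        # XOR each byte with one of the two key bytes (NOT a real cipher)
        return bytes(b ^ ((key >> (8 * (i % 2))) & 0xFF) for i, b in enumerate(data))

    secret_key = 0xBEEF                          # 16 bits: 65,536 possibilities
    ciphertext = toy_encrypt(secret_key, b"attack at dawn")

    # The attacker simply tries every key until the plaintext appears
    # (XOR is self-inverse, so encrypting the ciphertext decrypts it).
    for candidate in range(2**16):
        if toy_encrypt(candidate, ciphertext) == b"attack at dawn":
            print(f"key found: {candidate:#06x}")    # prints 0xbeef
            break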

Encryption Key Sizes

Key size or key length refers to the number of bits in a key used by a cryptographic algorithm. Only the correct key can decrypt a ciphertext (output) back into plaintext (input). As CPU power advances, the computational time required to brute force an encryption key gets shorter and shorter, so keys have had to become longer. For many years US export restrictions capped keys at 40 bits, but today we are seeing key lengths of up to 4096 bits in cryptography.
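A quick back-of-the-envelope calculation in Python shows why length matters so much; each added bit doubles the number of keys an attacker must try:

    # Each extra bit doubles the key space.
    print(2**40)            # 1,099,511,627,776 keys - searchable in practice
    print(2**128)           # ~3.4e38 keys - far beyond classical brute force
    print(2**128 // 2**40)  # a 128-bit key space is 2^88 times larger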

Block Sizes

Some algorithms use “block ciphers”, which encrypt and decrypt data in blocks (fixed-length groups of bits). There is a relationship between block size and the amount of data that can be encrypted without duplicating blocks; the full explanation is beyond the scope of this post, but the key takeaway is that the current recommendation is to use at least 128-bit blocks.
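For the curious, the rule of thumb behind that recommendation is the “birthday bound”: with n-bit blocks, repeated blocks become likely after roughly 2^(n/2) blocks are encrypted under a single key. A short Python sketch of the arithmetic:

    # Approximate data limits before block collisions become likely.
    for block_bits in (64, 128):
        blocks = 2 ** (block_bits // 2)          # birthday bound
        gib = blocks * (block_bits // 8) / 2**30
        print(f"{block_bits}-bit blocks: ~{gib:,.0f} GiB per key")
    # 64-bit blocks:  ~32 GiB (the weakness behind the Sweet32 attack)
    # 128-bit blocks: ~274,877,906,944 GiB (not a practical concern)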

Symmetric Key Algorithms

A symmetric key algorithm (also known as a secret key algorithm) uses the concept of a key and lock to encrypt plaintext and decrypt ciphertext data. The same “key” is used to both encrypt and decrypt the file. Symmetric algorithms are sub-classified into stream ciphers and block ciphers. A stream cipher combines plaintext digits with a pseudo-random cipher digit stream. A block cipher encrypts a fixed number of bits as a single unit, processing each block through multiple rounds and padding the plaintext so that its length is a multiple of the block size.
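As an illustration of that padding step, here is a minimal sketch of PKCS#7-style padding, a common scheme for bringing plaintext up to a multiple of the block size (shown standalone here; real code would use a vetted library):

    # PKCS#7-style padding: every padding byte stores the pad length,
    # so the receiver can strip it unambiguously after decryption.
    def pkcs7_pad(data: bytes, block_size: int = 16) -> bytes:
        pad_len = block_size - (len(data) % block_size)
        return data + bytes([pad_len]) * pad_len

    def pkcs7_unpad(data: bytes) -> bytes:
        return data[:-data[-1]]

    padded = pkcs7_pad(b"hello")      # 5 bytes -> 16 bytes (11 bytes of 0x0b)
    assert pkcs7_unpad(padded) == b"hello"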

The algorithm itself is not kept a secret and the sender and receiver of communication must both have copies of the secret key in a secure place. The use of the same key is also one of the drawbacks of symmetric key cryptography because if someone can get hold of the key, they can decrypt your data.

DES

The Data Encryption Standard or DES was, and probably still is, one of the more well-known algorithms of the modern cryptographic era. Today it is widely considered insecure. DES was developed in the 1970s by IBM and was later submitted to the National Bureau of Standards (NBS) and National Security Agency (NSA). The NSA's involvement in the design sparked controversial rumours of backdoors, creating widespread scrutiny. It wasn't until 1976 that DES was approved as a cryptographic standard and published as a Federal Information Processing Standard (FIPS).

In the 1990s, searching the 72 quadrillion possible keys of a 56-bit DES key seemed highly improbable. That would have been true for one computer, but in 1997 a group of computer scientists led by Rocke Verser used thousands of volunteer computers to crack DES in 96 days, making him and his team the winners of the $10,000 DES Challenge.

DES was later fortified with updated variants called double-DES and triple-DES, which simply layer the cipher so that each data block passes through it two or three times. Triple-DES is still used in some places, but AES (see below) has since become the new standard.

In Use Today? – DES and double-DES are no longer in use, but triple-DES with three keys is still a recommended algorithm in NIST SP 800-57. It is much slower than AES but is still seen in the electronic payment industry. It is also used in Microsoft OneNote and Outlook 2007 to protect user content and system data, and in Mozilla Firefox and Thunderbird, in CBC mode, to encrypt website authentication login credentials when a master password is used.

AES

The Advanced Encryption Standard or AES was established by the National Institute of Standards and Technology (NIST) in 2001. AES has been adopted by the US government with key lengths of 128, 192 and 256 bits.
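To show what using AES looks like in practice, here is a small sketch using the third-party Python `cryptography` package (an assumption on our part: install it with `pip install cryptography`). AES-GCM is a widely used mode that provides both confidentiality and integrity, and the same key encrypts and decrypts:

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)   # 128 and 192 are also accepted
    aesgcm = AESGCM(key)
    nonce = os.urandom(12)                      # 96-bit nonce; never reuse per key

    ciphertext = aesgcm.encrypt(nonce, b"secret message", None)
    assert aesgcm.decrypt(nonce, ciphertext, None) == b"secret message"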

AES has a fairly simple mathematical framework that some argue makes it susceptible to attack. The theoretical XSL attack was announced in 2002, and since then security researchers such as Bruce Schneier have written about attacks on parts of the algorithm. However, it is important to note that even in Bruce Schneier's article, he states that there is no reason to panic just yet, since the attacks only break 11 of the full 14 rounds of AES-256, taking 2^70 time. They also require the attacker to have access to plaintexts encrypted with multiple related keys, which still gives AES a high safety margin, just not as high as previously thought.

In Use Today? – Yes! AES is still in wide use today thanks to its performance and its ability to run on a wide range of hardware, from smart cards to high-performance computers.

MARS

After DES was found to be weak, NIST ran an open call known as the Advanced Encryption Standard process from 1997 to 2000 to find a new and improved block cipher. MARS was one of the finalists, making it that far thanks to its layered, compartmentalized approach aimed at resisting future advances in cryptanalysis and CPU power.

MARS supports 128-bit blocks and variable key sizes, using a number of core and mixing rounds to provide strong resistance to cryptographic attack. Critics suggested, however, that subkeys with long runs of ones and zeroes could open the door to an easy and effective attack on MARS.

In Use Today? – No. 21 of 23 rounds of MARS were broken by Bruce Schneier and John Kelsey in 2004.

IDEA

The International Data Encryption Algorithm (IDEA), originally called the Improved Proposed Encryption Standard (IPES), was designed by Xuejia Lai and James Massey of ETH Zurich under a research contract with the Hasler Foundation (now Ascom Tech AG) and was first discussed in 1991. IDEA was a minor revision of the Proposed Encryption Standard (PES), intended as a replacement for DES.

IDEA is now patent-free and thus completely free for all uses, although the name itself is still trademarked. It operates on 64-bit blocks using a 128-bit key and is still an optional algorithm in the OpenPGP standard. Bruce Schneier spoke highly of the algorithm in 1996, before its patents made it difficult to adopt and its speed fell behind more modern designs.

IDEA's full 8.5-round algorithm was first broken in 2011 using a meet-in-the-middle attack, and independently in 2012 using a narrow-bicliques attack. In May 2005, MediaCrypt announced a successor to IDEA called IDEA NXT.

In Use Today? – Not widely, but it is now patent-free and available for use in applications of your choice.

RC2

Rivest's cipher, Ron's code or, more commonly, RC algorithms were invented by Ron Rivest. While they share the same family name, the algorithms are quite different, so we will treat each one separately.

RC2, which Ron Rivest created in 1987, is a 64-bit block cipher with variable key sizes and 18 rounds, arranged as a source-heavy unbalanced Feistel network (16 rounds of one type and two rounds of another).

In Use Today? – No. It is recommended that RC2 not be used, as it is vulnerable to a related-key attack using 2^34 chosen plaintexts.

RC4

RC4 was designed by Ron Rivest in 1987 and was initially a trade secret until it was posted to the Cypherpunks mailing list in 1994. Once it reached the sci.crypt newsgroup, it was quickly broken by Bob Jenkins. The algorithm has never been officially released by RSA Security, but it was used in a number of encryption protocols and standards, including WEP in 1997, WPA in 2003, SSL in 1995 and TLS in 1999, until RFC 7465 prohibited it in all versions of TLS in 2015.

In Use Today? – No. It is recommended that RC4 not be used.

RC5

Ron Rivest designed RC5 in 1994 to be variable on all fronts: block sizes of 32, 64 or 128 bits, key sizes from 0 to 2040 bits, and anywhere from 0 to 255 rounds. The originally suggested parameters were a 64-bit block, a 128-bit key and 12 rounds.

In Use Today? – Distributed.net is running brute-force attacks against RC5. It cracked the 56-bit key in 250 days and the 64-bit key in 1,757 days, and is still working on the 72-bit key, which arguably makes RC5 still safe to use.

RC6

RC6 was derived from RC5 by Ron Rivest and colleagues. It was designed to meet the requirements of the Advanced Encryption Standard competition and became one of the five finalists. It has a block size of 128 bits and supports key sizes of 128, 192 and 256 bits, and up to 2040 bits. Like RC5, RC6 uses data-dependent rotations, modular addition and XOR operations. The algorithm was not chosen for AES, partly because the RSA Security website suggested that it was not yet royalty-free.

In Use Today? – Leaked files from the NSA suggest that it was used in implant devices in 2016. Beyond that, RC6 may still be covered by two US patents, US 5724428 A and US 5835600 A, which were set to expire between 2015 and 2017.

Blowfish

Blowfish is a symmetric block cipher built by Bruce Schneier as a replacement for DES and IDEA. It takes variable key sizes from 32 bits to 448 bits, uses a 64-bit block size and 16 rounds, and was one of the first unpatented, license-free block ciphers (and remains so). Serge Vaudenay, a French cryptographer, found a way to use weak keys in a known-plaintext attack to break 14 of the 16 rounds.

Blowfish has also been criticized for being slow in certain applications and for being vulnerable to birthday attacks in HTTPS.

In Use Today? – No. It is now known to be vulnerable to the Sweet32 birthday attack and to plaintext attacks, yet some applications still use it, for example to encrypt passwords. Bruce Schneier now recommends Twofish instead.

Twofish

Twofish is Blowfish's successor, published in 1998. With a block size of 128 bits, key sizes up to 256 bits and 16 rounds, it became one of the five finalists of the Advanced Encryption Standard competition, but was not selected for standardization. It was a step up from Blowfish in that it could be implemented on hardware and smart cards as well as on large microprocessors.

Bruce Schneier and the team that created Twofish offered a $10,000 prize for anyone who could attack it during its first round of AES evaluation. In 1999, Sean Murphy and Fauzan Mirza won the prize for their paper, ‘An Observation on the Key Schedule of Twofish’.

In Use Today? – Yes. Still available to use patent-free.

Threefish

Threefish works on 256-bit, 512-bit and 1024-bit blocks with key sizes equal to the block size and up to 80 rounds. It was created in 2008 as part of the Skein hash function, one of the five finalists in NIST's SHA-3 hash function competition. Threefish was heralded for its speed; Threefish-512 can encrypt data at 6.1 clock cycles per byte on a 64-bit machine.

In Use Today? – Yes. Still available to use patent-free. However, in October 2010, an attack was published that could break 53 of 72 rounds in Threefish-256 and 57 of 72 rounds in Threefish-512, so it could still be risky to use Threefish.

Serpent

Serpent was also entered into the Advanced Encryption Standard competition and actually finished second to Rijndael (now known as AES). Serpent was designed in 1998 by Ross Anderson, Eli Biham and Lars Knudsen. It has a block size of 128 bits, supports key sizes of 128, 192 or 256 bits, and runs 32 rounds. Rijndael won over Serpent because the judges deemed its software implementations more efficient.

In 2001, Eli Biham, alongside Orr Dunkelman and Nathan Keller, was able to break 10 of the 32 rounds of Serpent-128 with 2^118 known plaintexts and 2^89 time, and Serpent-192/256 with 2^118 known plaintexts and 2^187 time. Other papers have since come closer, breaking up to 12 rounds, but still not close enough to consider the algorithm weak.

In Use Today? – Yes. Serpent is still in the public domain and while some attacks have managed to get through up to 12 rounds of the full 32, the time and energy needed for such an attack is still quite large.

Asymmetric Algorithms

Asymmetric cryptography is also known as public key cryptography and is based on the principle of having a pair of mathematically-related keys for encryption and decryption: a public key and a private key. The public key can be shared with anyone, while the private key must be kept secret. Anyone with the public key can encrypt a message, but only the holder of the private key can decrypt it. Security depends on the secrecy of the private key.

Diffie-Hellman

Diffie-Hellman is one of the first recorded examples of asymmetric cryptography, first conceptualized by Ralph Merkle and brought to fruition by Whitfield Diffie and Martin Hellman. Traditionally, secure encrypted communication required both parties to first exchange their keys over some secure physical channel. Diffie-Hellman eliminated the need for that exchange by introducing public keys: each party publishes a public value and combines its own private key with the other party's public value, and both arrive at the same shared secret without ever transmitting it.
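The underlying idea fits in a few lines. Below is a toy Python walkthrough with deliberately tiny numbers (real deployments use primes of 2048 bits or more, or elliptic-curve variants):

    # Toy Diffie-Hellman: both sides derive the same secret without sending it.
    p, g = 23, 5              # public parameters: a small prime and a generator

    a, b = 6, 15              # Alice's and Bob's private keys

    A = pow(g, a, p)          # Alice publishes g^a mod p = 8
    B = pow(g, b, p)          # Bob publishes   g^b mod p = 19

    # Each side combines its private key with the other's public value:
    assert pow(B, a, p) == pow(A, b, p) == 2   # shared secret, never transmitted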

Today, Diffie-Hellman on its own is no longer the standard choice, because it has been found to be vulnerable to several attacks when weak parameters are used. The Logjam attack, for example, downgrades connections to weak export-grade groups, enabling man-in-the-middle attacks in which the attacker can read and modify any data sent over the connection.

In Use Today? – Yes. For general PKI security and digital signing, NIST recommends RSA (see below) because Diffie-Hellman requires more CPU power and larger data exchanges for digital signing and SSL in particular. But Diffie-Hellman is still found in the public sphere today, for example in its elliptic curve variant (ECDH).

RSA

The Rivest-Shamir-Adleman algorithm, better known as RSA, is now the most widely used asymmetric cryptosystem on the web. RSA is based on the difficulty of factoring: multiplying two large prime numbers is easy, but working backwards from their product to the original primes is computationally hard, and gets harder as the primes get larger. The challenge of breaking RSA is known as the ‘RSA problem’.
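The arithmetic can be shown with the classic textbook example in Python, using primes far too small for real use (real keys use primes hundreds of digits long, plus padding):

    # Textbook RSA with toy numbers - for illustration only.
    p, q = 61, 53
    n = p * q                  # 3233: the public modulus (factoring n breaks RSA)
    phi = (p - 1) * (q - 1)    # 3120
    e = 17                     # public exponent
    d = pow(e, -1, phi)        # 2753: private exponent (needs p and q; Python 3.8+)

    m = 65                     # the "message" as a number
    c = pow(m, e, n)           # encrypt: 65^17 mod 3233 = 2790
    assert pow(c, d, n) == m   # decrypt with the private key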

RSA is a slow algorithm, so it is typically used to encrypt and decrypt the symmetric keys, which in turn encrypt and decrypt the actual communications. The symmetric keys perform the bulk of the work, while RSA creates a strong and secure channel.
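Here is a hedged sketch of that hybrid pattern, again assuming the third-party Python `cryptography` package: RSA-OAEP wraps a fresh AES key, and AES-GCM does the bulk encryption.

    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    # Encrypt the bulk data with a one-off symmetric session key.
    session_key = AESGCM.generate_key(bit_length=256)
    nonce = os.urandom(12)
    ciphertext = AESGCM(session_key).encrypt(nonce, b"the bulk data", None)

    # Use slow RSA only to protect the small session key.
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    wrapped_key = public_key.encrypt(session_key, oaep)

    # Receiver: unwrap the session key with RSA, then decrypt with AES.
    recovered_key = private_key.decrypt(wrapped_key, oaep)
    assert AESGCM(recovered_key).decrypt(nonce, ciphertext, None) == b"the bulk data"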

In 1998, Daniel Bleichenbacher described how he exploited a vulnerability in the PKCS#1 v1.5 padding scheme (used to format messages for RSA encryption). His attack was able to recover session keys and decrypt messages without ever obtaining the private key. As a result of his work, RSA Laboratories released new versions of PKCS#1 that are not vulnerable to the same attack. While further attacks on RSA have been attempted, the algorithm remains strong, arguably until quantum computers become mainstream.

In Use Today? – Yes. RSA is the most widely used asymmetric algorithm today.

Elliptic Curve

ECC stands for Elliptic Curve Cryptography, an approach to public key cryptography based on elliptic curves over finite fields. Where RSA rests on the difficulty of factoring, ECC rests on the elliptic curve discrete logarithm problem: given a point on the curve and a multiple of that point, it is computationally infeasible to recover the multiplier.

SSL/TLS Certificates most commonly use RSA keys and the recommended size of these keys keeps increasing (e.g. from 1024 bit to 2048 bit a few years ago) to maintain sufficient cryptographic strength. An alternative to RSA is ECC. Both key types share the same important property of being asymmetric algorithms (one key for encrypting and one key for decrypting). However, ECC can offer the same level of cryptographic strength at much smaller key sizes - offering improved security with reduced computational and storage requirements.
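As a rough illustration of those smaller keys, here is a sketch using the third-party Python `cryptography` package to create a P-256 key and sign a message; a 256-bit ECC key is commonly rated as comparable in strength to a roughly 3072-bit RSA key:

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec

    private_key = ec.generate_private_key(ec.SECP256R1())   # NIST P-256
    public_key = private_key.public_key()

    signature = private_key.sign(b"message", ec.ECDSA(hashes.SHA256()))
    public_key.verify(signature, b"message", ec.ECDSA(hashes.SHA256()))
    print("signature verified")   # verify() raises InvalidSignature on failure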

In Use Today? – Yes. NIST has recommended 15 elliptic curves that can be used as standards. Some argue ECC is weak because vulnerabilities have been found that allow certain types of attack, although there are ways to combat these. Other reasons for its slower adoption relate to a random number generator standardized by NIST, dubbed the Dual Elliptic Curve Deterministic Random Bit Generator, or Dual_EC_DRBG for short. Many believed that the generator (developed by the NSA) wasn't as random as you might think; it was later discontinued.

It’s All Math

As quantum computing comes hurtling towards us, many wonder what cryptography will be like. Some argue that our traditional approach of increasing key size to combat increased computing power will hit its limit. Others think that might not necessarily be the case. 

If there’s anything to take away from this, it’s that algorithms all have a “margin of safety” as Bruce Schneier put it. We must recognise that with enough computing power and time, it is possible to break an algorithm, but if we continue to work together and stay on top of computational performance, we can find new algorithms to replace the old ones.

If you think we’ve missed an algorithm in this post, feel free to tell us and we would be happy to include it. Keep your eyes peeled for a follow up blog on cryptographic hash functions including SHA and MD. 
