
Choosing Safe Key Sizes & Hashing Algorithms

A brief explanation

Algorithms, key size and digital certificates


On the surface, digital certificates are not as complicated as they are sometimes perceived to be. A trusted public body, such as a Certificate Authority (CA) like GlobalSign, verifies a specified body of evidence and produces an electronic identification for future presentation, indicating that the individual or organization has been authenticated.

The digital certificate contains information about who the certificate was issued to, as well as the certifying authority that issued it. Additionally, some certifying authorities may themselves be certified by a hierarchy of one or more certifying authorities, and this information is also part of the certificate chain. When, for example, a digital certificate is used to sign documents and software, this information is stored with the signed item in a secure and verifiable format, so that it can be displayed to a user to establish a trust relationship.

As is usually the case, on closer examination the mechanism is not quite that simple. In the case of digital certificates, a number of other factors come into play: what are the qualifications of the third party, what are its practices, and which cryptographic algorithms did it use to produce the digital certificate?

From a CISO's perspective, using digital certificates such as SSL certificates raises concerns that may impact the organization's operational environment. By using a certificate from a Certificate Authority, the individual or organization must fully trust the CA's practices.

This is especially true when it comes to deciding which cryptographic algorithms and key lengths are acceptable in this ever-changing industry. Thankfully, you do not need to be a cryptographer to make good decisions on this topic, but you will need a basic understanding of the history, of the advances recommended for future use, and of the algorithms offered by the Certificate Authorities operating in the security market today.


In recent history, the industry has relied on two algorithms within digital certificates. The first is an encryption algorithm called RSA; the second is a hash algorithm called SHA-1. Both are now considered weak due to advances in cryptanalysis.

RSA's strength and performance depend on the size of the key used with it: the larger the key, the stronger and slower it is. Advances in cryptanalysis have driven up the key sizes used with this algorithm, and with them the computing power needed to maintain the same effective strength. The problem is that every time we double the size of an RSA key, decryption operations with that key become six to seven times slower.
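To make the mechanics concrete, here is a toy sketch of textbook RSA in Python (3.8+), using tiny primes purely for illustration; real certificates use 2048-bit moduli, and real implementations add padding schemes this sketch omits. Decryption is the modular exponentiation with the large private exponent, which is exactly the operation whose cost grows steeply with key size:

```python
# Toy textbook RSA with tiny primes -- illustration only, never use in practice.
p, q = 61, 53
n = p * q                      # public modulus (real keys: a 2048-bit n)
phi = (p - 1) * (q - 1)        # Euler's totient of n
e = 17                         # public exponent, coprime with phi
d = pow(e, -1, phi)            # private exponent: modular inverse of e mod phi

message = 42
ciphertext = pow(message, e, n)    # encrypt: m^e mod n (fast, e is small)
recovered = pow(ciphertext, d, n)  # decrypt: c^d mod n (slow, d is large)

assert recovered == message
```

Doubling the bit length of n both lengthens d and makes each modular multiplication more expensive, which is where the six-to-seven-fold slowdown per doubling comes from.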

As a result, since January 2011 trustworthy Certificate Authorities have aimed to comply with NIST (National Institute of Standards and Technology) recommendations by ensuring that all new RSA certificates have keys 2048 bits in length or longer. GlobalSign was one of the first Certificate Authorities to implement 2048-bit key strength within its digital certificates, back in 1998, and other Certificate Authorities have since followed suit based on these new requirements.

Unfortunately, this ever-increasing key size requirement cannot continue forever, especially if we intend SSL to make up the majority of traffic on the Internet: the computational costs are simply too great.

Then there is SHA-1. A hash algorithm takes a variable amount of input and reduces it to a shorter, fixed-length output, the goal being to provide a unique identifier for that input. The important thing to understand is that hash algorithms are always susceptible to collisions, and advances in cryptanalysis have made it easier to create one. The problem here is that there is no parameter to tweak; the only way to address the issue is to change the algorithm used to produce the hash.
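The fixed-length property is easy to demonstrate with Python's standard hashlib module. SHA-1 always emits 160 bits, whatever the input size, which is why collisions are unavoidable in principle and why the only fix is switching algorithms:

```python
import hashlib

short_digest = hashlib.sha1(b"hello").hexdigest()
long_digest = hashlib.sha1(b"hello" * 100_000).hexdigest()

# Output length is fixed at 160 bits (40 hex characters) regardless of input,
# so distinct inputs must eventually share a digest (pigeonhole principle);
# cryptanalysis only makes such collisions cheaper to construct on purpose.
assert len(short_digest) == len(long_digest) == 40
```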


For the last decade or so there has been a slow and steady movement toward two new algorithms that address these advances: SHA-2 and ECC. ECC has the potential for significant performance benefits over RSA without reducing security, and SHA-2 offers several variants (SHA-224, SHA-256, SHA-384 and SHA-512) with progressively longer digest lengths, which both address the current risks and provide some longevity.
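As a quick sketch of what "progressively longer" means in practice, Python's hashlib exposes the SHA-2 family directly, and the digest size is each variant's defining parameter:

```python
import hashlib

# SHA-2 variants and their output lengths in bits.
for name in ("sha224", "sha256", "sha384", "sha512"):
    bits = hashlib.new(name, b"example").digest_size * 8
    print(f"{name}: {bits}-bit digest")
# Prints:
#   sha224: 224-bit digest
#   sha256: 256-bit digest
#   sha384: 384-bit digest
#   sha512: 512-bit digest
```

A longer digest shrinks the attacker's collision advantage, which is what gives the larger variants their longevity.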


The main goal in configuring SSL is to enable users to communicate securely over the Internet. Organizations and individuals need to do this with the fewest hassles, the lowest costs, and in compliance with any applicable standards.

Using Windows as an example, SHA-2 support was added to XP in Windows XP Service Pack 3, and ECC support arrived in Windows Vista. With these facts you can clearly see the adoption clock for these new algorithms. With XP still used by about 30% of the Internet today, full adoption of ECC and SHA-2 is unlikely for around five years. This leaves us with RSA 2048 and SHA-1, which, thankfully, are broadly considered sufficient for the next decade.

Performance may also be a concern. A 2048-bit RSA certificate used in SSL results in roughly a 10% CPU overhead: not huge, but something that should be taken into consideration. Compliance is another important factor when making a decision; whether it is the Payment Card Industry Data Security Standard (PCI DSS), the Federal Information Processing Standards (FIPS), or some other set of criteria you need to meet, it always needs to be taken into account.

Therefore, CISOs across organizations worldwide can rest assured that third-party providers using SHA-2 algorithms and 2048-bit RSA keys will be secure for the next ten or so years. When making the key decision of choosing a provider, however, it may also be worth considering when they adopted this level of security. Having implemented this standard of security over ten years before NIST's recommendation, GlobalSign is always striving to stay one step ahead within the industry.