So when looking at SaaS solutions, one of the things we consider is the strength of the SSL certificate and, when using a small provider, who the Certificate Authority is, as commercial authorities will provide insurance for a breach, which can go towards paying some of the clean-up costs (assuming the breach isn't the result of negligence).
So how do you evaluate SSL certificates in terms of robustness (i.e. cryptographic strength)? After all, some people talk about 128-bit certificates while others, such as Google, mention 2048 bits, which on the surface don't seem comparable.
The bit length relates to the cryptographic algorithm used, of which there are several, such as AES, 3DES and so on; a 128-bit figure typically refers to the symmetric session key (e.g. AES), while a 2048-bit figure refers to the asymmetric key (e.g. RSA) in the certificate itself, which is why the numbers look so different. Now, I'm no expert on this, so I won't presume to explain the pros and cons of the different algorithms; there are other resources on the web for that (such as this document).
The point I have been working towards is that NIST (National Institute of Standards and Technology), aside from being a good resource on security, has tables that recommend the size of the key used to build the certificate (the document is here, and tables 1 & 2 contain the key details; more here). The tables shown below take into account the algorithm (and are therefore a comparator on key size) but also recommend growth in key size over time.
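To make the comparison concrete, here is a small sketch of the comparable-strength idea from those NIST tables, expressed in Python. The figures (112-bit strength ≈ 2048-bit RSA, 128-bit ≈ 3072-bit RSA, and so on) come from NIST SP 800-57 Part 1; the dictionary and helper function names are my own illustration, not part of any standard API.

```python
# Comparable cryptographic strengths per NIST SP 800-57 Part 1 (Table 2):
# security strength in bits mapped to an example symmetric algorithm and
# the RSA modulus size giving roughly equivalent strength.
NIST_COMPARABLE_STRENGTHS = {
    112: ("3DES (three-key)", 2048),
    128: ("AES-128", 3072),
    192: ("AES-192", 7680),
    256: ("AES-256", 15360),
}

def min_rsa_bits(security_bits: int) -> int:
    """Smallest RSA modulus in the table meeting the requested strength."""
    for strength, (_symmetric, rsa_bits) in sorted(NIST_COMPARABLE_STRENGTHS.items()):
        if strength >= security_bits:
            return rsa_bits
    raise ValueError("requested strength is beyond the table")

print(min_rsa_bits(112))  # → 2048
print(min_rsa_bits(128))  # → 3072
```

This is why a "128-bit" AES figure and a "2048-bit" RSA figure can describe connections of broadly similar (though not identical) strength.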
So why grow a key size? Well, one of the factors driving key size is that as computing power increases, the time and effort needed to brute-force a key shrinks. Conversely, every time the key size increases, so does the effort needed to brute-force it.
This leads to a secondary consideration – the certificate's life, i.e. how long the certificate is valid for. This is in effect the greatest potential period of exposure, because someone may brute-force your certificate's key and then simply listen to the traffic, so you never know of the compromise. Obviously you can revoke the certificate at any time.
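As a rough sketch of checking that exposure window, the snippet below works out how many days of validity remain given a certificate's notAfter date. The date format assumed is the one Python's `ssl.getpeercert()` returns (e.g. `'Jun  1 12:00:00 2030 GMT'`); the function name is my own.

```python
from datetime import datetime, timezone

def days_until_expiry(not_after: str, now: datetime) -> int:
    """Days remaining before a certificate's notAfter date."""
    # Parse the notAfter string format used by ssl.getpeercert().
    expiry = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    expiry = expiry.replace(tzinfo=timezone.utc)
    return (expiry - now).days

now = datetime(2025, 6, 1, tzinfo=timezone.utc)
print(days_until_expiry("Jun  1 12:00:00 2026 GMT", now))  # → 365
```

A long remaining validity means a long window in which a cracked key stays useful to an attacker, which is the argument for shorter-lived certificates.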
Finally, remember that the need for, and level of, security should be informed by assessing the data being transferred (data in motion). Security should also be considered for data at rest, i.e. data being stored (data loss from a data store is likely to be far more damaging).