Three security researchers have discovered a variation to an old cryptographic attack that can be exploited to obtain the private encryption key necessary to decrypt sensitive HTTPS traffic under certain conditions.
Named ROBOT, which stands for Return Of Bleichenbacher's Oracle Threat, this new attack is a variation of the Bleichenbacher attack on the RSA algorithm discovered almost two decades ago.
Back in 1998, Daniel Bleichenbacher of Bell Laboratories discovered a bug in how TLS servers operate when server owners choose to encrypt server-client key exchanges with the RSA algorithm.
By default, before a client (browser) and a server start communicating via HTTPS, the client will choose a random session key and encrypt it with the server's publicly-advertised key. This encrypted session key is sent to the server, which uses its private key to decrypt the message and save a copy of the session key, which both sides later use to encrypt the traffic of that session.
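The message flow above can be sketched with textbook RSA and deliberately tiny numbers (no padding, not secure, purely to show who encrypts what):

```python
# Toy illustration of the RSA key exchange (textbook RSA, tiny primes --
# NOT secure, just to show the message flow described above).

# Server's RSA key pair: modulus n = p*q, public exponent e, private exponent d.
p, q = 61, 53
n = p * q                               # 3233
e = 17
d = pow(e, -1, (p - 1) * (q - 1))       # modular inverse (Python 3.8+)

# The client picks a random session key and encrypts it with the
# server's public key (n, e).
session_key = 42
ciphertext = pow(session_key, e, n)

# The server recovers the session key with its private exponent d.
recovered = pow(ciphertext, d, n)
assert recovered == session_key
```

In real TLS the "session key" is a 48-byte premaster secret and the modulus is 2048 bits or more, but the shape of the exchange is the same.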
Because raw RSA encryption is deterministic and malleable, the session key is first run through a padding system that adds a layer of random bits before it is encrypted.
Bleichenbacher discovered that if the session key was encrypted with the RSA algorithm and padded with the PKCS #1 v1.5 scheme, an attacker could send modified ciphertexts to the TLS server and ask if each one was valid. The server would respond with a simple "yes" or "no."
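The "yes/no" the server leaks is essentially a check on the PKCS #1 v1.5 padding format. A minimal sketch of that format and of the check a vulnerable server effectively exposes (function names are illustrative, not from any real library):

```python
import os

def pkcs1_v15_pad(message: bytes, k: int) -> bytes:
    # PKCS #1 v1.5 encryption padding for a k-byte RSA modulus:
    # 0x00 0x02 | at least 8 nonzero random bytes | 0x00 | message
    pad_len = k - 3 - len(message)
    padding = bytes(b % 255 + 1 for b in os.urandom(pad_len))  # nonzero bytes
    return b"\x00\x02" + padding + b"\x00" + message

def looks_valid(plaintext: bytes) -> bool:
    # Simplified version of the validity check a vulnerable server leaks:
    # answering this question for attacker-chosen ciphertexts is the "oracle."
    return len(plaintext) > 10 and plaintext[0:2] == b"\x00\x02"

good = pkcs1_v15_pad(b"session-key", 48)
assert looks_valid(good)
assert not looks_valid(b"\x00\x01" + good[2:])
```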
This meant that, by sending a large number of carefully crafted queries and using the server as a "padding oracle," an attacker could gradually recover the session key and decrypt all HTTPS messages exchanged between the TLS (HTTPS) server and the client (browser).
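The property that makes these queries useful is RSA's malleability: an attacker can multiply the hidden plaintext by a chosen factor without ever seeing it, then ask the oracle about the result. A toy demonstration with the same tiny textbook-RSA parameters (illustrative only):

```python
# RSA malleability, the property at the heart of Bleichenbacher's attack:
# multiplying a ciphertext by s^e mod n multiplies the hidden plaintext by s.
p, q, e = 61, 53, 17
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))

secret = 42                      # e.g. a session key the attacker wants
c = pow(secret, e, n)            # ciphertext captured off the wire

s = 7                            # attacker-chosen multiplier
c_modified = (c * pow(s, e, n)) % n      # crafted without knowing `secret`

# Decrypting the crafted ciphertext yields secret * s mod n. By observing
# which multipliers produce "valid padding" answers, the attacker narrows
# down `secret` step by step.
assert pow(c_modified, d, n) == (secret * s) % n
```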
Instead of replacing the insecure RSA algorithm, the designers of the TLS standard decided to add countermeasures to make the brute-force guessing process harder to carry out.
This was an incomplete and insufficient fix for the original Bleichenbacher attack, and since then, researchers have published new variations of the original Bleichenbacher attack in 2003, 2012, 2014, and 2015.
The most recent research on this topic was the DROWN attack, published back in March 2016, which affected a third of all HTTPS sites.
Today, a new Bleichenbacher variation named ROBOT came to light, which also relies on skirting the countermeasures put in place by TLS creators back in 1998 and later.
The problem, according to researchers, is that the TLS standard is very complex and many server equipment vendors fail to properly implement Section 7.4.7.1 of the TLS standard (RFC 5246), which defines the original Bleichenbacher attack countermeasures.
The three-man research team who found and reported the ROBOT attack says that companies like Cisco, Citrix, F5, and Radware offer products that are vulnerable to ROBOT attacks in certain configurations.
That configuration is one where the server owner encrypts the TLS session key with the RSA algorithm and uses the PKCS #1 v1.5 padding system.
Until patches arrive for vulnerable products, the ROBOT research team and US-CERT recommend that owners of vulnerable devices disable TLS session key RSA encryption (also known as RSA encryption mode) on their device. This won't be an issue, as most devices also support Elliptic Curve Diffie-Hellman (ECDH) key exchange as a more secure alternative to RSA.
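For servers behind nginx with OpenSSL, disabling RSA key exchange while keeping ECDHE might look like the snippet below (an illustrative sketch, not vendor guidance; `!kRSA` is the OpenSSL alias that excludes RSA key-exchange cipher suites):

```nginx
# Drop RSA key-exchange ("kRSA") cipher suites, keep ECDHE suites.
# Note: this disables RSA *key exchange*, not RSA *signatures*, so an
# RSA certificate continues to work.
ssl_protocols TLSv1.2 TLSv1.3;
ssl_ciphers 'ECDHE+AESGCM:ECDHE+CHACHA20:!kRSA:!aNULL';
```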
The ROBOT research team says that despite this being a variation of a 19-year-old attack, 27 of the Alexa Top 100 websites are vulnerable to the ROBOT attack. Vulnerable sites include Facebook and PayPal. The ROBOT attack scientific paper includes a case study of how the research team decrypted Facebook traffic.
Other sites outside of the Alexa Top 100 may be vulnerable. Researchers have released a Python script that can help server admins scan for vulnerable hosts, and have also added a ROBOT vulnerability checker on the ROBOT attack homepage.
As was the case with the DROWN flaw, this issue has huge repercussions, as an attacker could log HTTPS traffic and decrypt it at a later date.
It's also worth noting that an attacker does not obtain the TLS server's private key with a ROBOT attack, only the individual session keys for each client connection. This means that an attacker does not have access to a universal decryption key, but only to a one-time key for one HTTPS session. To decrypt large quantities of HTTPS traffic, ROBOT attacks require a large computational effort.