How to Learn About Security from the Apple SSL Bug

Secure Sockets Layer, better known as SSL, was introduced to the world back when Netscape still dominated the browser market. Version 2.0 was released to the public in 1995, in the hope of protecting the unprotected and, of course, keeping hackers from having easy access to eCommerce websites.

On February 21st, Apple released a security patch for its iOS software: version 7.0.6, which fixes a bug in the SSL implementation used to encrypt connections. SSL is the modern way of protecting people's privacy and the data they share across the network.

The Apple SSL Bug in this case would let a hacker, cracker, or anyone else with hacking on their mind access your mobile phone's data without your consent. The attacker could listen in on whatever someone is doing on their iPhone or iPad, and the victim would never find out about it. It's worth noting that it only worked on shared and public networks, where an attacker can sit between the device and the sites it talks to.

Here is the relevant excerpt from the original code that security researchers found and audited, the piece that made the whole security flaw possible.

static OSStatus
SSLVerifySignedServerKeyExchange(SSLContext *ctx, bool isRsa, SSLBuffer signedParams,
                                 uint8_t *signature, UInt16 signatureLen)
{
    OSStatus        err;
    ...

    if ((err = SSLHashSHA1.update(&hashCtx, &serverRandom)) != 0)
        goto fail;
    if ((err = SSLHashSHA1.update(&hashCtx, &signedParams)) != 0)
        goto fail;
        goto fail;
    if ((err = SSLHashSHA1.final(&hashCtx, &hashOut)) != 0)
        goto fail;
    ...

fail:
    SSLFreeBuffer(&signedHashes);
    SSLFreeBuffer(&hashCtx);
    return err;
}

The function you're looking at is called whenever an iPhone connects to a website over an SSL-secured connection. In short, it's supposed to verify that the key-exchange data coming in really was signed by the operator of the website you're connecting to. It's like two computers playing ping-pong.

Take a look at the two consecutive 'goto fail;' statements after the signedParams check and see whether you notice something strange. The second one is not guarded by any if: it always executes, so the program jumps straight to the fail label, skipping the final hash step and the signature verification that follows it. Because err is still 0 (success) at that point, the function reports that even a forged signature is valid, and an important authentication security check is bypassed.
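To see the mechanics in isolation, here is a stripped-down, self-contained analogue of the flaw. None of the names below are Apple's; update_hash() and verify_signature() are hypothetical stand-ins, and the sketch exists only to show how the duplicated, unconditional jump short-circuits the checks that follow it.

#include <stdio.h>

/* Hypothetical stand-ins for the real hash/signature checks; each
 * returns 0 on success and non-zero on failure. */
static int update_hash(void)      { return 0; }  /* always "succeeds"      */
static int verify_signature(void) { return 1; }  /* a FORGED signature!    */

/* A stripped-down analogue of the flawed verification routine.
 * The second "goto fail;" below is not part of the if statement, so it
 * always executes, and the signature check is never reached. */
static int verify_server(void)
{
    int err;

    if ((err = update_hash()) != 0)
        goto fail;
        goto fail;                            /* unconditional: always jumps */

    if ((err = verify_signature()) != 0)      /* never executed */
        goto fail;

fail:
    return err;                               /* still 0: forgery accepted */
}

int main(void)
{
    printf("verify_server() = %d (0 means \"signature OK\")\n", verify_server());
    return 0;
}

Compiled and run, this prints 0, i.e. "signature OK", even though verify_signature() would have reported a forgery; that is exactly how a man in the middle's forged key exchange could slip through.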

goto fail; – Audit Yourself Online

A website was built that lets you check your own browser and software against the bug, and it will clearly display whether you're safe or vulnerable to the exploit. The site is called 'goto fail;' and also links to a tutorial that explains the whole audit in a little more depth.

What Can We Learn?

It's clear that this was nothing more than a mistake on the developer's side, an honest typo that could have been avoided but wasn't. Perhaps it was for the best anyway: without the flaw we would never have seen a website like 'goto fail;', which has most likely helped more than just the victims of the Apple SSL Bug. It built a bridge for other software makers, and it helped people who weren't aware their devices could be hijacked so easily.

Security is complex, and finding little bugs like these does not come easily: someone has to spot the typo (in this case) and then work out how it could be used to attack users maliciously. It's not an easy task, but we can accept that mistakes like this will happen, and let them sharpen the focus and attention the next time a software update is released. I feel bad for whoever let this error slide into a production release, but that's life.
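Two simple habits can catch this kind of slip before it ships: always brace conditionals, and build with unreachable-code warnings or a static analyser turned on. Here is a hypothetical sketch, using the same stand-in names as above rather than Apple's code, of how mandatory braces defuse an accidentally duplicated goto.

#include <stdio.h>

static int update_hash(void)      { return 0; }
static int verify_signature(void) { return 1; }   /* simulate a bad signature */

static int verify_server(void)
{
    int err;

    /* With mandatory braces, an accidentally duplicated line stays inside
     * the conditional instead of becoming an unconditional jump, and the
     * dead second goto is easy for a compiler or analyser to flag. */
    if ((err = update_hash()) != 0) {
        goto fail;
        goto fail;                            /* duplicated, but now harmless */
    }

    if ((err = verify_signature()) != 0) {    /* still reached */
        goto fail;
    }

fail:
    return err;                               /* correctly non-zero on forgery */
}

int main(void)
{
    printf("verify_server() = %d\n", verify_server());   /* prints 1 */
    return 0;
}

Compiled and run, this prints 1, a failure, because the signature check is still reached; building the unbraced version with a warning such as clang's -Wunreachable-code, or running a static analyser over it, can also flag the dead check that the stray goto creates.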

It was an honest, public failure, and it doesn't make SSL any less appealing or secure. If anything, SSL will stay where it is, doing what it has always done; all you have to do is make sure your configuration is tuned properly. This bug caught many people's attention, showing how dangerous a flaw in widely deployed code can be, and how easily even a software company as big as Apple can be caught out. We need to reassess and look at our options thoroughly, to make sure we're not making the same mistakes twice.

Real tests, with real humans running them, might bring us a step closer to secure and peaceful code environments.

Image courtesy of HuffPost
