The right course of action

Responsible disclosure of software vulnerabilities is the best outcome for developers

Chris Schwarz

Short-term secrecy often creates the best outcomes for developers, but they deserve to be informed once the risk is mitigated.

A couple of security vulnerabilities came to light towards the end of last year that got me thinking about a difficult problem: who should be told about vulnerabilities and data breaches, and when?

The first was Uber’s decision to keep secret a huge data breach in which the information of 57 million drivers and users was stolen. Uber’s response was to pay the criminals $100,000 to delete the data. Uber’s new CEO said that the company had “received assurances” that the data had been deleted.

Uber decided not to report the breach to those affected and it remained secret until recently. The CISO who handled the incident was fired along with his deputy. It’s quite likely that Uber broke any number of international regulations concerning the disclosure of data leaks, but it is not entirely surprising that the company kept the breach secret. Data leaks are embarrassing to the company and harmful to the careers of those involved.

The second incident was the disclosure of a mind-bogglingly serious vulnerability in Apple’s macOS High Sierra. In brief, anyone with access to a Mac could gain root access by supplying the username “root” and a blank password in an authentication dialog. Even worse, if the Mac had screen sharing turned on, the vulnerability could be exploited remotely.
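To make the shape of the bug concrete: this is a simplified, hypothetical sketch in Python (not Apple’s actual code, which has not been published) of how an authentication routine that tries to “upgrade” credentials on login can accidentally enable a disabled account with whatever password the attacker typed.

```python
# Hypothetical sketch of an "enable on first auth" flaw.
# The account store, function names, and logic are invented for
# illustration; only the observable behavior matches the reports.

accounts = {"root": {"enabled": False, "password_hash": None}}

def hash_pw(pw: str) -> str:
    return "H:" + pw  # stand-in for a real password hash

def authenticate(user: str, password: str) -> bool:
    acct = accounts.get(user)
    if acct is None:
        return False
    if acct["password_hash"] is None:
        # BUG: instead of rejecting a disabled, passwordless account,
        # the credential-upgrade path stores the supplied password
        # and reports success.
        acct["password_hash"] = hash_pw(password)
        acct["enabled"] = True
        return True
    return acct["password_hash"] == hash_pw(password)

# The first attempt with a blank password silently enables root,
# and the blank password then persists for later logins.
print(authenticate("root", ""))  # True
print(authenticate("root", ""))  # True
```

The attacker never needs to know a secret: the flawed upgrade path turns the first login attempt into a password-setting operation.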

The developer who discovered the macOS vulnerability immediately disclosed it publicly on Twitter.

So we have two cases: in the first, a company keeps a serious breach secret. In the second, a developer immediately releases a vulnerability to the public. In both cases, it’s the developers who stand to suffer.

The right course of action

Data breaches put developers at risk of identity theft, web account takeovers, spamming, and a host of other problems. Whatever “assurances” Uber got from the criminals about destroying the data, it would be naive to trust them. The right course of action would have been to fix the vulnerability that led to the data theft and then disclose publicly.

In the macOS case, an argument could be made for temporary secrecy. Telling the world about such a huge vulnerability put millions of Apple users at risk. The responsible course of action would have been to inform Apple through the proper channels, let them fix the problem, and then go public.

Part of the problem is misaligned incentives. People who discover vulnerabilities frequently benefit from disclosing them. Companies that suffer data leaks or ship vulnerabilities benefit from keeping them secret.


The best option: Responsible disclosure

The best option is usually responsible disclosure: report the vulnerability to the vendor, give them time to fix the problem, and then go public. If the vendor refuses to fix the problem, the public still learns of the danger, but without being put at unnecessary risk by premature disclosure.

Google’s Project Zero is a good example of the process in action. When a vulnerability is discovered, the vendor gets 90 days to fix it. When it’s fixed, or if the vendor fails to fix it, the vulnerability is disclosed publicly so that users can make an informed decision.
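The mechanics of such a policy amount to simple deadline arithmetic. A minimal sketch, where the 90-day window is Project Zero’s published default and everything else (function name, dates) is invented for illustration:

```python
from datetime import date, timedelta
from typing import Optional

# Project Zero's published default disclosure window.
DISCLOSURE_WINDOW = timedelta(days=90)

def disclosure_date(reported: date, patched: Optional[date] = None) -> date:
    """Details go public at the earlier of the patch date or the end
    of the fixed window after the vendor is notified. An unpatched
    vulnerability is disclosed when the window expires regardless."""
    deadline = reported + DISCLOSURE_WINDOW
    if patched is not None and patched < deadline:
        return patched
    return deadline

# A vendor that patches on day 30 is disclosed at the patch date;
# a vendor that never patches is disclosed at the deadline.
print(disclosure_date(date(2018, 1, 1), date(2018, 1, 31)))  # 2018-01-31
print(disclosure_date(date(2018, 1, 1)))                     # 2018-04-01
```

The point of the fixed deadline is that it removes the vendor’s incentive to stall: secrecy is guaranteed only for as long as it takes to ship a fix.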

The way that researchers and developers handled the recent Spectre and Meltdown vulnerabilities is instructive. Spectre and Meltdown are critical data disclosure vulnerabilities caused by flaws in the way modern processors speculatively execute instructions. The vulnerabilities were discovered almost simultaneously by several research groups, all of whom informed chip manufacturers such as Intel, which coordinated with operating system developers to get patches ready before the vulnerabilities became public knowledge.

As it turned out, the existence of Spectre and Meltdown leaked earlier than expected because eagle-eyed Linux watchers spotted the patches as they were added to the open source kernel. Nevertheless, it’s an instructive example of how software and hardware vulnerabilities and the serious data leaks they cause can be handled responsibly to keep users safe.

Ultimately, the best interests of the developers should guide security disclosure decisions.


Chris Schwarz

Chris Schwarz is the CEO of Cyber Wurx, a premium colocation services provider with a world-class data center in Atlanta, Georgia that also specializes in Dedicated Server Hosting and VPS Hosting.
