British cloud security engineer Rob Dyke has spoken out about how ethically reporting a data leak landed him in legal trouble. In late February, Dyke reported open code repositories containing API keys, application code, usernames, passwords, and the URLs of embedded third-party items belonging to an open-source non-profit active in British healthcare. Originally thanked for his work, he now faces legal action from the organization he attempted to assist.
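Leaks of this kind, hardcoded keys and passwords sitting in public repositories, are typically found by combining pattern matching with an entropy check. The sketch below is a minimal illustration only, not the method Dyke or any particular tool used; the regex rules and the 3.5-bit entropy threshold are assumptions chosen for the example (production scanners such as gitleaks ship far larger rule sets).

```python
import math
import re

# Illustrative patterns only; real scanners maintain hundreds of rules.
PATTERNS = {
    "aws_access_key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "generic_secret": re.compile(
        r"(?i)(password|secret|api[_-]?key)\s*[:=]\s*['\"]?([^\s'\"]{8,})"
    ),
}

def shannon_entropy(s: str) -> float:
    """Bits of entropy per character; high values suggest random tokens."""
    if not s:
        return 0.0
    freq = {c: s.count(c) / len(s) for c in set(s)}
    return -sum(p * math.log2(p) for p in freq.values())

def scan_text(text: str, entropy_threshold: float = 3.5):
    """Return (rule_name, token) pairs for lines that look like leaked secrets."""
    findings = []
    for line in text.splitlines():
        for name, pattern in PATTERNS.items():
            for m in pattern.finditer(line):
                token = m.group(m.lastindex or 0)
                # The entropy gate filters placeholders like "changeme"
                # that match the generic pattern but are not real secrets.
                if name == "aws_access_key" or shannon_entropy(token) >= entropy_threshold:
                    findings.append((name, token))
    return findings
```

Run over a repository's files, a scanner like this flags both fixed-format credentials (the AWS key pattern) and free-form assignments whose values look random enough to be genuine secrets.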
Dyke recently tweeted: “I told @AppertaUK about passwords/keys/finace [sic] db they published on github. And this is what I get #cyberup”. The tweet came in response to a warning of potential legal action.
Casey Ellis, CTO and founder of Bugcrowd, tells Digital Journal that the incident illustrates the legal risks facing researchers who disclose vulnerabilities in good faith.
According to Ellis, ethical hacking should be celebrated rather than classed as criminal activity. He explains: “As cybersecurity leaders, we have an obligation to support the ethical hacker community as they defend the safety of the Internet, and to avoid however possible the chilling effects of unwarranted legal action against security researchers operating in good faith.”
Turning to the specific case, Ellis continues: “This incident highlights how legal action can not only outlaw, but also derail the efforts of those who help identify and resolve vulnerabilities that can potentially destroy an organization’s business.”
Ethical hacking represents an important part of the cybersecurity ecosystem, adds Ellis, noting: “Security researchers pride themselves on high integrity, as exhibited here by British cloud security engineer Rob Dyke. Their actions have helped thousands of organizations around the world discover and address vulnerabilities before adversaries could exploit them — preventing countless attacks that would undoubtedly prove detrimental to any organization’s digital operations.”
Ellis adds: “This incident highlights the glaring need for this input across all types of organizations, and just how long vulnerabilities within an organization’s system can remain undetected without the help of external security researchers like Dyke, as the insecure Apperta information that he discovered had been publicly leaked on GitHub for over two years.”
In terms of the future of the sector, Ellis outlines the seriousness of new legislation: “With 80 percent of security professionals fearing the UK’s Computer Misuse Act being used against them and their work, governments around the world must modernize their legislation for the greater good of their citizens’ digital safety.”
So, what is to be done? According to Ellis: “Organizations in the public and private sectors, such as Apperta, must not only learn to trust and work collaboratively with external security researchers, but to instantiate their own policies to set clear expectations and establish legal safe harbor for those operating in good faith.”
Furthermore, Ellis notes that this philosophy applies broadly: “Even organizations with in-house security teams can benefit from the help of external security researchers — specifically their ability to provide continuous, 24/7 security testing and monitoring, and to help the builders and defenders within an organization think just a little bit more like the breakers who’d potentially threaten it. Speed is the natural enemy of security, and the best way to improve an organization’s security posture and beat malicious adversaries is by thinking like one.”