Here’s a data security conundrum. The news that anonymous DNA sample data has been used to personally identify the original donor sounds, at first, like an information security problem.
In reality, it isn’t. A team of geneticists has shown there is a systematic weakness in the way this data is handled. It turns out that statistical analysis, combined with good old-fashioned Internet searching for identity clues, may be all it takes to render the strict controls around donor data completely powerless.
The law of unintended consequences always applies to new technology. If we give information away freely, we shouldn’t be surprised when someone finds a way to use it. Imagine what a geneticist could do with research like this study, which uses Facebook ‘likes’ to predict race, religion and sexual orientation.
This does raise the question of how to design security systems to protect data from threats (or developments in technology) that we don’t know about yet. This intractable problem is likely to remain with us for the foreseeable future, but one approach is to offer up your implementation for attack, and pay a bounty for positive results.
That sounds very much like the Google-sponsored “Pwnium 3” contest, where cash prizes of up to $150k were available for demonstrable exploits of Google’s Chrome OS. Google did manage to get some last-minute patching done just before the competition started, and (consequently?) there were no clear winners, with Chrome fending off all attacks.
Let’s wrap up this week’s somewhat sober assessment with a data-leak-of-the-week quote from this story about widely reported data breaches at various credit reference bureaus:
“The data leak this week is being called a juvenile prank and not necessarily the work of any sophisticated hacker”
We’re not entirely sure what the difference is, from the victim’s perspective, but it’s an interesting defence.