Even though Thursday marks the ninth annual Data Privacy Day, a moment to recognize the need for better worldwide protection of personal information, there's been no letup in devastating breaches.
In the US, more than 109 million people had their personally identifiable information (PII) exposed in just six of the many healthcare breaches in the past year. Even worse, breaches of government agencies and companies victimized tens of millions of people.
In the wake of these breaches, far too many people received messages that went something like this: "We take your privacy very seriously … so we are putting additional privacy protections in place."
But why were those protections not in place already? Haven't they heard of the underground market in stolen PII that has been thriving for over a decade? Don’t they realize how upsetting it is for someone to find out their personal information is now in the hands of strangers – strangers who are also criminals?
The reality is that too many of the systems that gather, store, and communicate personal data were never designed with privacy in mind. Developers may have given some thought to security, but security is not the same thing as privacy. Systems that handle personal information need to be built according to the principles of Privacy by Design, or PbD.
These principles have been well known and widely advocated for more than a decade now. PbD was originally discussed in the first report on "Privacy Enhancing Technologies" by a joint team of the Information and Privacy Commissioner of Ontario, Canada, and the Dutch Data Protection Authority in 1995.
Today, anyone who has been keeping an eye on European privacy regulation knows that policymakers there have adopted PbD standards as part of new data regulations. While US legislators are far behind their European counterparts when it comes to enshrining privacy protections in law, it would serve American businesses well to begin implementing Privacy by Design principles.
Wouldn't it have been great if that breached bank had two-factor authentication on its customer data server? How about if that hacked government agency had used better malware protection?
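For readers unfamiliar with how two-factor authentication works in practice, here is a minimal sketch of a time-based one-time password (TOTP) check, the mechanism behind most authenticator-app logins. It follows the algorithm defined in RFC 6238; the function name and parameters here are illustrative, not taken from any particular product.

```python
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32: str, for_time: int = None, digits: int = 6, step: int = 30) -> str:
    """Compute an RFC 6238 TOTP code from a base32-encoded shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    # Count 30-second intervals since the Unix epoch.
    now = time.time() if for_time is None else for_time
    counter = int(now // step)
    msg = struct.pack(">Q", counter)
    # HMAC-SHA1 of the counter, then dynamic truncation per the RFC.
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


# Server-side check: the code the user types must match the one
# derived from the secret shared at enrollment.
def verify(secret_b32: str, submitted_code: str) -> bool:
    return hmac.compare_digest(totp(secret_b32), submitted_code)
```

The point of the design: even if a password database leaks, an attacker without the per-user shared secret cannot produce a valid current code.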
What was electronic toymaker VTech, which suffered a massive data breach last year, thinking when it designed its systems? Even though its products generated vast amounts of personally identifiable data, it probably never considered Privacy by Design.
Are the commercial pressures really so great that it is worth risking a big hit to the company's valuation just to save on some coding, or reduce "friction" in the user experience? VTech stock was so badly hit by news of its weak privacy protection that trading of its shares on the Hong Kong exchange was temporarily halted.
According to PbD standards, companies should consult with a data protection officer "prior to the design, procurement, development, and setting-up of systems for the automated processing of personal data, in order to ensure the principles of privacy by design and privacy by default." All this before going live, not after a security breach has exposed an organization's lack of respect for user privacy.
Bolting on privacy protections after a breach – often likened to closing the barn door after the horse has bolted – is costly, clunky, and may not win back the trust of consumers.
Stephen Cobb is a global security researcher for Internet security software maker ESET. Follow Stephen on Twitter @zcobb.