By this year, we were supposed to have flying cars, self-tying shoes, and clinics that could make you look decades younger in a jiffy – that is, according to the predictions made in the 1980s hit film series "Back to the Future."
Instead, in 2015, we have a different set of high-tech devices: cars equipped with WiFi, smartwatches that track our every move, and baby monitors that stream live video from bedrooms. And all of them are potential avenues for hackers to break in or steal people's personal data.
Within five years, the number of connected devices worldwide is expected to more than double, to 50 billion. That leaves today's policymakers wrestling with a difficult question: How can they help put privacy and security safeguards in place to protect consumers?
On Wednesday, which happens to be "Back to the Future Day," Passcode hosted an event on the role of policymakers in securing the Internet of Things, featuring key speakers such as Federal Trade Commissioner Julie Brill and Senator Brian Schatz (D) of Hawaii. The full video of the event is available here. Here are some things we learned:
1. There's a danger in members of Congress being too reactionary to reports of hacking.
Generally speaking, members of Congress tend to "be a little too declaratory... especially in places they don't know what they're talking about," said Sen. Schatz. "Most members of Congress and the general public are only starting to understand what the Internet of Things is at all... we still have an educational process to go through."
So when members of Congress find a juicy news article about security researchers remotely hacking an Internet-connected Chrysler vehicle, or realize their own smartphone was hacked, Schatz said, there's a temptation for them to say: "There ought to be a law." But if lawmakers react by seeking to regulate a specific sector or product – say, automobiles specifically – and ignore the bigger policy questions, Schatz said, "you will not establish a meaningful policy framework."
2. Policymakers must consider creating economic incentives as more companies start selling connected devices.
As the Internet of Things expands, more companies will start selling connected devices – and they may not be as sophisticated when it comes to security as well-established firms that have been thinking about it for a long time. "That's of deep concern to us," Ms. Brill said, especially because, she added, 90 percent of data from connected devices is personal – and 70 percent of that is flowing over unencrypted networks.
Yet creating incentives for all firms to bake cybersecurity into their products could be difficult. Devices such as a "connected pen" or "piece of paper," Brill said, are not necessarily going to be easy to patch. And even if they don't store sensitive information themselves, she said, "they could be a threat vector to other devices."
3. Privacy policies will look totally different 20 years from now.
Right now, privacy policies are "longer than Hamlet," Brill said. But in 20 years, they could appear on consumers' screens, on car dashboards, or even as holograms on windshields. Consumers might have more sophisticated options for sharing data and protecting their privacy, she predicted, including, potentially, a choice about whether to share their driving data with their insurance companies.
4. But people's definitions of privacy may also change.
So could people's opinions about what information they want to keep to themselves, Schatz noted. Future generations may be less wary of giving away personal data online than people are now – and that's OK. "I don't think it's a tragedy my son will have a lower threshold for privacy," he said. "The First and Fourth Amendment rights are going to be evolving" in the Digital Age.
5. The US government is already working to improve Internet of Things security through business education initiatives and outreach to researchers – which have been well-received.
Programs such as the FTC's "Start with Security," aimed at promoting best security practices for startups and developers, are a good idea, said Tom Cross, chief technology officer of Drawbridge Networks. "Going out and educating developers about how to write better code is absolutely in the public interest, and maybe a worthwhile expenditure of the public's money."
The Department of Homeland Security's recent outreach to researchers investigating the security of critical infrastructure is also a good idea, he added. If Cross called up a company himself, employees might say, "'This guy is scary, he's got long hair,' and they'll call the police," he said. "But if DHS calls up, maybe they'll be a little more open-minded."
6. But some companies believe there should be limits on hackers seeking to test vulnerabilities in connected devices.
General Electric's chief privacy officer, Peter Lefkowitz, acknowledged the "tremendous value" of vulnerability disclosures from white hat hackers, but said such research should "not interfere with a company's ability to own and control its technology." What's more, he said, there's no one-size-fits-all model for how much risk each company will tolerate as it opens its doors to security researchers.