Passcode went on the road to South By Southwest Interactive, the annual gathering of digerati in Austin, Texas.
You won't be surprised to learn that panels and events relating to security and privacy were big draws for the more than 30,000 expected attendees – including four events hosted by Passcode.
We went deep on big data and discrimination, exploring the potential pitfalls that come with a reliance on algorithms to make decisions when it comes to everything from marketing to consumer credit scores. We discussed ways to balance digital security and privacy with some of the nation's top cybersecurity researchers, privacy advocates, and practitioners. We talked about the international differences in the privacy debate with experts from the US, Ireland, and Germany.
Here are some key things we learned during our trip:
Passcode hosted a SXSW panel featuring Nicole Wong, former deputy US chief technology officer; Sascha Meinrath, director of the X-Lab; and Mallory Duncan, senior vice president and general counsel at the National Retail Federation.
People are more concerned about inequity than privacy.
People are worried their data will be used to treat them unfairly, Ms. Wong said.
"They might say, 'Actually, I'm afraid something about that algorithm is going to foreclose opportunities for me. That I might not even see credit opportunities offered or academic options,'" Wong said. "That's not so much about privacy as it is about fairness and nondiscrimination."
The policy discussion about data collection in the context of civil rights is just beginning.
The regulations dealing with data collection for the last 30 years, Wong said, center on protecting people's privacy, including getting individual consent and minimizing the information collected.
"Those models don't actually solve the fairness problem," Wong said. "Even if I'm transparent about the data collected, it's not necessarily fair if I make a decision about that data."
We, as a society, are the algorithms – bias and all.
Harvard University researcher Latanya Sweeney conducted a study that found Google searches of traditionally black names yielded ads for background checks or arrest records even when none existed.
As disturbing as this sounds, Mr. Duncan said those were clearly algorithms that were "taught" to serve those ads by people's clicks. The public clicked on those ads just often enough, in other words, to train the algorithm to make more money for the webpage by displaying certain ads.
"We as a public, we're actually teaching the algorithm how to provide discriminatory reinforcement. It's not something that could be easily fixed in the initial design," said Duncan.
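Duncan's point, that click feedback rather than initial design trains the bias in, can be pictured as a toy ad-serving loop. This is a minimal sketch, not anything described by the panelists; the ad names and click rates are hypothetical and not taken from Sweeney's study:

```python
import random

class AdServer:
    """Toy ad optimizer that learns only from clicks."""

    def __init__(self, ads):
        # Track impressions and clicks per ad; no history to start.
        self.stats = {ad: {"shown": 0, "clicked": 0} for ad in ads}

    def click_rate(self, ad):
        s = self.stats[ad]
        return s["clicked"] / s["shown"] if s["shown"] else 0.0

    def pick_ad(self, explore_rate=0.1):
        # Occasionally explore; otherwise exploit the best click rate so far.
        if random.random() < explore_rate:
            return random.choice(list(self.stats))
        return max(self.stats, key=self.click_rate)

    def record(self, ad, clicked):
        self.stats[ad]["shown"] += 1
        if clicked:
            self.stats[ad]["clicked"] += 1

server = AdServer(["neutral_ad", "arrest_record_ad"])

# If the public clicks the suggestive ad even slightly more often,
# the optimizer learns to favor it. No one designed the bias in.
for _ in range(10000):
    ad = server.pick_ad()
    clicked = random.random() < (0.06 if ad == "arrest_record_ad" else 0.05)
    server.record(ad, clicked)
```

Even a small gap in click-through rates is enough for the exploit step to serve the suggestive ad more and more often, which is exactly the "discriminatory reinforcement" Duncan describes.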
Data collection can affect the lives of even the youngest humans.
"I feel data collection can be incredibly empowering, liberating, and helpful," Mr. Meinrath said, "but it can also be disadvantaging, invasive, disempowering in ways we have never yet seen in modern society."
Meinrath's young daughter is already inundated with requests for personal information online. "I'm teaching her to lie. What's your favorite color? Plaid. What's your favorite animal? Giant squid. Who's your best friend? Beelzebub," he said. "How do I prevent a four-year-old from having a profile that's going to follow her for the rest of her life? She cannot possibly understand the import of the data she's providing."
Data collectors say it might be too late to change.
Duncan supports targeted advertising, he said, since the goal of retailers has always been to get a customer to come into a store and see things she wants in hopes she will buy other merchandise. In essence, mass data collection tools are just a more sophisticated way of becoming a better salesman.
"It might be too late to adopt the kind of controls to affect the outcomes my fellow panelists talk about," Duncan said. "The systems may have already become too sophisticated in some respects."
Holding brands accountable for good data practices could be a better way to keep algorithms in check than regulation.
The collection of massive troves of data actually makes the world smaller, Duncan said.
"We are going back into a world in which we're a small town," Duncan said.
"In a small town, the dressmaker knew the bust size of Mrs. Jones. And Mrs. Jones trusted the dressmaker not to pass that information on to anybody else, because if word got out she was gossiping, her business disappeared."
Until we come up with more sophisticated ways to control this issue, he continued, brand equity "is probably the best we've got."
Privacy is a policy and technology challenge
Passcode hosted a dinner where we explored the new frontiers of security and privacy along with the Center for Identity at the University of Texas. The security firm Rapid7 sponsored the dinner.
Dan Kaufman, head of the Defense Advanced Research Projects Agency's Information Innovation Office, and Nuala O'Connor, chief of the Center for Democracy and Technology think tank, both saw a need for technology to help people retain their privacy online.
Mr. Kaufman described the recent launch of a DARPA program to research and develop the latest tools to protect people's privacy online, taking on what the agency describes as one of the most "vexing" problems facing the connected world.
The so-called Brandeis Project – named after former Supreme Court Justice Louis Brandeis, who helped develop the concept of a right to privacy – aims to give users a very simple way to express how they would like their data to be shared. For instance, as Kaufman said, people might be OK with providing their information to the Centers for Disease Control and Prevention to be contacted in the event of an Ebola outbreak, but not want that same data in the hands of the Internal Revenue Service.
Can the Pentagon's futuristic research arm, Kaufman asked, build mathematical models that ensure government data is used only in appropriate ways?
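One way to picture the kind of user-expressed sharing preference Kaufman describes is a simple per-recipient allow list. This is purely an illustrative sketch; the field and agency names are assumptions and not part of the actual Brandeis program:

```python
from dataclasses import dataclass, field

@dataclass
class SharingPolicy:
    """Maps each data field to the set of recipients the user has approved."""
    allowed: dict = field(default_factory=dict)

    def permit(self, data_field, recipient):
        # Record that the user approves sharing this field with this recipient.
        self.allowed.setdefault(data_field, set()).add(recipient)

    def may_share(self, data_field, recipient):
        # Deny by default: only recipients the user explicitly approved pass.
        return recipient in self.allowed.get(data_field, set())

policy = SharingPolicy()
policy.permit("contact_info", "CDC")  # OK for an Ebola outbreak alert

policy.may_share("contact_info", "CDC")  # -> True
policy.may_share("contact_info", "IRS")  # -> False: never approved
```

The deny-by-default check is the key design choice: the user states preferences once, and every recipient not explicitly approved is refused.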
Ms. O'Connor agreed that technology has to become part of the answer to the dilemmas facing the country as devices proliferate and more data is collected, but warned against mission creep within the government overtaking the goal of keeping information private.
To guarantee personal privacy, O'Connor argued, we need to move away from seeing data as solely a property right, and toward viewing data as an intrinsic human right.
Security practitioners wrestle with tough policy questions
Some of the country's leading cybersecurity pros and researchers joined a breakfast Passcode hosted in partnership with the Center for Identity at the University of Texas, sponsored by the security firm AlertLogic.
Adam Tyler, chief innovation officer of the security firm CSID, said criminal hackers are more protected than security pros seeking to protect data because of restrictions on hacking back.
"In the Target breach, we know where that data was exfiltrated – but if we hacked into that server and extracted it ourselves, we'd be breaking the law," Mr. Tyler said.
Daniel Weitzner, head of the new MIT Cybersecurity Policy Initiative, said cyberspace today is not the Wild West, a commonly used metaphor for the proliferation of hackers and crimes that somehow escape the law.
Instead, Mr. Weitzner said, cyberspace is more like 17th century piracy – and the way the British handled that problem could provide useful lessons for online security today.
Capitalizing on people's fear, uncertainty, and doubt, Denim Group principal John Dickson said, is not a sustainable business model for cybersecurity companies.
"[A strategy of] 'we'll scare the money out of their pockets' doesn't work," he said.
Convenience will trump privacy, until privacy is convenient
"People, even Americans, claim they really care about privacy – until it impinges on their convenience," said Michele Neylon, chief executive officer of Blacknight Solutions, speaking at Rebels Without a Clause, a SXSW event cohosted by the German software firm Open-Xchange and Passcode. "It's important for a lot of people for [privacy] to be simple and convenient."
To make privacy more convenient, people may need to have psychological incentives, said Matt McGlone, associate professor of Communication Studies and an affiliate of the Center for Identity at The University of Texas at Austin.
Often, companies ask users to take extra steps to secure their privacy online – rather than assuming their data will be kept private and asking if people would like to forgo that right.
"People don't like to give up things," Mr. McGlone said. "They are willing to forgo discounts, but not willing to give up things. If the assumption is that privacy is yours, people will pay more to get privacy back."
What we learned from other SXSW panels
Your technology is exposing you, even inside your house
"We are mapping your world," Carol Politi says. She's chief executive officer of TRX Systems, a company that aims to create maps of the inside of every building through sensors on users' mobile devices.
The technology was originally developed to locate firefighters and soldiers and is intended to make people safer. In those public safety and national defense contexts, privacy was a less clear-cut concern. Now that average users' mobile phones are also equipped with advanced sensors, the technology opens up a whole host of privacy questions.
Just think if you call 911 and the emergency responder can instantly locate what floor you're on and where. That seems like a smart use for indoor mapping. But people might not be comfortable with all the ways their data is being collected, Ms. Politi says in her panel at SXSW Interactive. "When it comes to firefighters, you might say, 'I'm good with this,' but when you think about mapping when it's me walking around my house, maybe I feel differently about it."
After all, she says, humans spend 90 percent of their lives indoors. Simply carrying a cellphone around the house could provide aggregate data over days, weeks, or months showing how far the building's floors extend, where its staircases are, where you usually congregate within your home, or what hours you're usually at work.
"What's OK? Creating a map of the public space I walk around? My own house or apartment? What about a friend's house or apartment when I'm visiting? A business I visit?" Politi says. Users are able to opt out of providing location data and other tracking built into their cellphones, but the expectation is that they will understand how to do that and what the consequences are if they don't. "I'm the one doing the mapping, but I still struggle with what are the expectations for users and venue owners."
Tech advances will give you control over your data
Data is becoming cheaper to store and access, said Facebook's chief privacy officer, Erin Egan.
Within a few years, Ms. Egan predicts, people will have more control of their information online and a better understanding of where it's being collected and disseminated – and thus will be able to manage their identities in a "more real way" than they do now. This could also be a global trend, even though privacy interests and expectations in the US, Europe, and other countries are not necessarily the same.
"Technology does not respect borders or jurisdictions," said Keith Enright, Google's senior privacy counsel. "If we're going to fast forward five years, one of the things I do think we will see is some increasing bridge building, connectivity, so we can start allowing users to be in control of their information. So as different privacy interests are all over the world... we will be able to empower data around them." The privacy officers spoke at the International Association of Privacy Professionals' panel, "What Keeps the Internet's Leading CPOs Up At Night."
The future of cybercrime is automated
Marc Goodman, author of the new book "Future Crimes," packed the room at SXSW for his talk on how criminals use technology in increasingly sophisticated ways. "Crime is becoming automated. It's software that commits crime; you don't need to be a master hacker anymore," said Mr. Goodman, a former Los Angeles police officer who has become an expert and consultant on advancements in cybercrime.
From Mexican drug gangs building private cellular networks to Russian hackers stealing millions from ATMs, Goodman presented an array of vexing problems for today's law enforcement agencies. But he said there's hope, too. "We are vulnerable to attack because we don't have trustworthy computing," said Goodman. "I'm not calling for perfect security, I'm calling for better security."
He said that more secure software, along with security products that average people can easily use, can help thwart criminal hackers. "We need to design security so that people can understand it," he said. What's more, he said, there need to be more incentives for researchers, technologists, coders, and inventors to come up with better security systems. "We need a Manhattan Project for cybersecurity."
The definition of privacy is freedom
Let's take a step back. With so much data collected about us online by companies and the government, how does the definition of privacy change in the Digital Age? "I think privacy is fundamentally about power and autonomy," says Bruce Schneier, renowned cryptographer and author of the recent book "Data and Goliath."
"It isn't 'if you have nothing to hide, you have nothing to fear.' That's a gross mischaracterization," he says. "[Privacy] is about how we present ourselves to the world. When we have privacy, we decide what we disclose and to whom." Pervasive surveillance, he adds, leads to conformity. "If we are being watched all the time, if we are being recorded all the time, we are less likely to be different and more likely to conform." Mr. Schneier is also a Passcode columnist.