The Failures of Protecting Consumer Privacy: Our Takeaways from FTC's PrivacyCon

By Emma Z.
March 08, 2018

Emma serves as the Director of Analysis at Terbium Labs, working on evaluating and contextualizing threats to customer data. She spends a lot of time reading forum drama on the dark web, writing regular expressions, and drinking LaCroix on the train between DC and Baltimore.

One could be forgiven for expecting the Federal Trade Commission’s third PrivacyCon to be disheartening. Data breaches continue to grow in size and scope; hardly a month goes by without a story about an inadequately secured database, a surveilling “smart” toy, or a session-replay script that steals passwords. As a kickoff to National Consumer Protection Week, this year’s PrivacyCon focused on the economic consequences of privacy and the economic incentives—or lack thereof—to protect users’ privacy.

The system of free-flowing data as a commodity supports a wealth of products and services that expose consumer data. As Princeton University’s Gunes Acar explained in a presentation about session-replay scripts, systems can expose information in ways a company would never expect. Session-replay scripts let websites capture a user’s exact browsing experience, recording everything from the movement of a cursor across the page to how long a user hovered over a button before clicking. As Acar revealed, because websites sometimes temporarily store information in formats accessible to the session-replay script, a researcher hoping to learn about a user’s website experience may also receive plaintext usernames and passwords, private health information, driver’s license numbers, or other sensitive data.
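The leak Acar describes is easy to reproduce in miniature. The sketch below (hypothetical field names, not any vendor’s actual recorder) models a naive session-replay serializer that snapshots every form field verbatim, alongside a redacting variant that masks sensitive inputs before the recording leaves the browser:

```python
# Minimal model of the session-replay leak: a recorder that serializes
# form state verbatim captures whatever the page holds at that moment,
# including credentials typed into inputs. Field names are hypothetical,
# for illustration only.

import json

# Fields a careful recorder would mask before transmission.
SENSITIVE_FIELDS = {"password", "ssn", "license_number"}

def record_naive(form_state: dict) -> str:
    """Snapshot every field as-is, the way a careless recorder would."""
    return json.dumps(form_state)

def record_redacted(form_state: dict) -> str:
    """Mask known-sensitive fields before the snapshot leaves the page."""
    masked = {
        name: "***" if name in SENSITIVE_FIELDS else value
        for name, value in form_state.items()
    }
    return json.dumps(masked)

if __name__ == "__main__":
    form = {"username": "alice", "password": "hunter2"}
    print(record_naive(form))     # plaintext password ends up in the replay
    print(record_redacted(form))  # credential masked before transmission
```

The naive version is how plaintext passwords end up in analytics backends: nothing in the serialization step distinguishes a cursor position from a credential, so the recorder has to be told explicitly what to exclude.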

Without actively considering what information a wide net may capture, companies put themselves on shaky legal ground by unknowingly collecting protected data. Presenters emphasized that, in some cases, the company or service provider may not even realize that their actions have compromised consumers’ security until a third party brings it to their attention. Facebook, for example, was unaware that bad actors could use its targeted advertising service to de-anonymize a list of email addresses by inferring their phone numbers en masse until a team from Northeastern University showed them how.

While consumers sorely need tools to navigate the wilds of digital privacy, they have shown a reluctance to spend money on them, especially if they do not believe those tools will make a difference. Toulouse School of Economics’ Ying Lei Toh found that when consumers have both the will and the means to punish a company for mishandling their data, they can force firms to be more cautious. When consumers lack leverage, or when the data has already been exposed, they continue patronizing the firm even after a breach, seeing no point in pursuing data protection. Given that much of many consumers’ personal information may already have been exposed in one of several major data breaches, how can consumers motivate companies to invest in better securing their data?

In many of the conference presentations, researchers determined that companies shared user information by default. Consumers are expected to either not participate in the technological ecosystem or to take proactive—and often unreasonably onerous—steps to discover the way companies expose and exploit their data. There are researchers and activists working to improve digital privacy hygiene, such as presenter Pardis Emami-Naeini, who discussed designing a digital privacy assistant app—shoring up consumers’ ability to respond to data overreach.

Consumers can, in theory, opt out of some or all of the products that compromise their digital privacy, though that is often easier said than done. When companies themselves are unaware of the risks their products pose, how can users be expected to know what steps to take to protect their privacy? One presentation explored the privacy implications of internet-enabled smart toys. The lead researcher, Emily McReynolds, concluded that one toy company failed to address children’s privacy considerations and even encouraged parents to share recordings of children interacting with their toys on social media. Parents technically gave their permission, but the company actively enabled, and even encouraged, this potential violation of the children’s privacy. Internet-enabled interactive toys have already suffered serious public security failures. Parents could consent to the privacy risks they were told about, but is it reasonable to expect them to anticipate that their, and their children’s, information would be leaked?

A company that assumes its technology protects consumer privacy by default is set up to fail, and it is users who bear the brunt of the damage those failures cause.
