January 12, 2017 - Research presented at the Federal Trade Commission’s second annual PrivacyCon event today looked at consumers’ expectations for privacy, their willingness to accept privacy risks in exchange for content, apps, and IoT (Internet of things) connections, and possible solutions. In opening remarks at the conference, FTC Chairwoman Edith Ramirez recalled some of the consumer technology she saw earlier this month on the show floor at CES, including autonomous vehicles, TVs “as thin as cellphones,” drones that can carry organs for transplant, drones outfitted for virtual reality, and smart trash cans that can scan the bar codes of discarded items to create a shopping list of items that need to be replaced.
All this technology, however, means that consumers’ data can be accessed by many players, including device makers, app makers, content publishers, software vendors for IoT devices, and advertising companies. The situation is “exacerbated when non-consumer-facing entities increasingly handle consumer data,” Chairwoman Ramirez added. And not only is consumer privacy at risk, she said; some of these devices, including autonomous vehicles, pose threats to health and safety if their security fails.
The research presented at PrivacyCon can inform the FTC’s work in three areas, the chairwoman said. It can identify potential areas for investigation and enforcement; it can provide data for FTC policy work and help identify areas where additional research is needed; and it can help the FTC “identify and develop solutions to the privacy and security challenges we are seeing in the marketplace,” she said.
During a wrap-up session at the end of the conference, FTC Consumer Protection Bureau Director Jessica Rich said, “We heard consumers are downloading ad blockers in record numbers,” but at the same time “we all know consumers don’t hesitate to use websites and apps that collect a lot of information, despite their stated concerns about privacy.”
Responding to the question of whether consumers care about privacy, Howard Beales, a former FTC Consumer Protection Bureau director now at George Washington University, said, “Some do, some don’t. Markets are really good at giving people what they want. It’s why we have an illegal drug problem.”
Mr. Beales said that “it isn’t reasonable to expect consumers to make instance by instance decisions” about privacy and whether to grant apps permission to access private information. As one of the researchers said earlier in the day, he noted, if a consumer had to give permission each time an app wanted access to information, “you’d be approving 231 requests per hour.”
Andrew Stivers, deputy director of the FTC’s Bureau of Economics, said, “I would echo much of what Howard said in terms of some people care, some people don’t. The best way to determine how much people care is to look at their decisions.” However, Mr. Stivers said, “your choice about privacy is connected to all your other decisions about privacy” and consumers may not see “the value of investing in privacy in one dimension when there are 16 other dimensions [whose privacy effects] I don’t know about or don’t understand.”
As a consumer, “I can’t assess whether companies are providing good privacy, because so much happens behind the scenes. It’s not even an experiential good [that consumers can assess based on their experience]. So Consumer Reports is stepping in and saying this is an expert good,” Deirdre Mulligan of the University of California, Berkeley said.
She suggested that security has “public good” aspects, in that people “can’t solve their own security problems” because “people are being attacked because of other people’s poor choices.”
During a session of presentations on mobile privacy, Sebastian Zimmeck of Carnegie Mellon University’s School of Computer Science explained that his and his co-authors’ work on automated analysis of privacy requirements for mobile apps found that about half of the more than 17,000 apps they looked at did not have a privacy policy, and that 71% of apps that process personally identifiable information (PII) did not have a privacy policy.
Zubair Shafiq of the University of Iowa, who, with his co-authors, is working on detecting and circumventing anti-ad-blockers “in the wild,” said that they found a “feeling by consumers that they have lost control of their information. They do care, but they have given up.”
Still, “ad blockers have become very popular,” Mr. Shafiq said, with 600 million around the world and more than 18% of U.S. Internet users using them.
In response, websites and the ad industry are developing anti-ad-blocking tools that detect ad blockers and force users to turn them off or to whitelist the site if they want to access it.
“The goal of our project is two-fold,” he said: measuring the prevalence of anti-ad-blocker tools and creating a “stealthy” ad blocker that could not be detected by anti-ad-blocking tools. – Lynn Stanton, lynn.stanton@wolterskluwer.com
Courtesy TRDaily