This September marks two years since the Federal Trade Commission ordered TRENDnet, a California-based maker of surveillance cameras and networking devices, to refrain from misrepresenting the security of its devices after feeds from hundreds of consumers’ cameras became public on the Internet.
According to the FTC, the company failed to use reasonable security to design and test software for its SecurView cameras. The omission allowed hackers to obtain feeds for roughly 700 cameras that showed babies asleep in their cribs, children playing, and adults coming and going.
The case, which TRENDnet settled by agreeing to strengthen digital security in its products and to implement a program that reduces risks to privacy, represented the first enforcement action by the FTC involving a consumer device that sends and receives data over the Internet, also known as the Internet of Things (IoT).
From mattresses that measure whether we toss and turn at night, to refrigerators that tell the grocer when it’s time to restock, to fitness trackers that encircle our wrists, the IoT represents a networking of everyday devices to improve—in theory, at least—how we live and work. The IoT includes meters that allow electric utilities to measure usage, monitors that give doctors access to our health data 24/7, and carpets and walls that detect when someone has fallen.
Though estimates vary, there are roughly 4.9 billion connected devices in the world, up 30 percent from 2014, according to Gartner, which projects 25 billion such devices by 2020. Data from mobile devices alone reached 2.5 exabytes per month last year (an exabyte is one billion gigabytes), up 69 percent from a year earlier, and is expected to exceed 24.3 exabytes per month by 2019, according to Cisco.
Or, as a character on the HBO series “Silicon Valley” exclaims: “Ninety-two percent of the world’s data has been created in the last two years alone!”
Devices can be difficult to secure. Seventy percent of the most common devices that constitute the IoT contain serious vulnerabilities, a study last year by Hewlett-Packard found. But what matters as much, if not more, is safeguarding the flood of data itself and ensuring that consumers know the terms of the exchange. Dominique Guinard, co-founder and chief technical officer of Evrythng, a maker of platforms that tie devices together, observed recently in AdvertisingAge:
“In the data-driven world of IoT, the data that gets shared is more personal and intimate than in the current digital economy. For example, consumers have the ability to trade protected data such as health and medical information through their bathroom scale, perhaps for a better health insurance premium. But what happens if a consumer is supposed to lose weight, and ends up gaining it instead? What control can consumers exert over access to their data, and what are the consequences?”
Guinard envisions contracts between consumers and manufacturers that adjust over time and address what happens when data becomes unfavorable to the consumer. The FTC has discussed similar approaches. In a report published last January, the agency presented the results of a workshop at which participants examined security for the IoT as measured against the Fair Information Practices, a code established in 1973 by the U.S. Department of Health, Education and Welfare, later adopted by the Organization for Economic Cooperation and Development, and a framework for thinking about privacy ever since.
At the workshop the FTC and participants focused on the application of four practices as they pertain to the IoT: security, data minimization, notice, and choice. Participants stressed the benefit of so-called security by design, which holds that companies build security into devices at the outset rather than as an afterthought. Minimization refers to companies imposing reasonable limits on collection and retention of data. Less is more, you might say.
Notice refers to how a company describes its privacy practices, including what information the company collects from consumers. Choice addresses the ability of consumers to specify how such information may be used, disclosed and shared.
The meaningfulness of both notice and choice turns in part on consumers’ expectations. Among scenarios posited by the FTC:
“Suppose a consumer buys a smart oven from ABC Vending, which is connected to an ABC Vending app that allows the consumer to remotely turn the oven on to the setting, ‘Bake at 400 degrees for one hour.’ If ABC Vending decides to use the consumer’s oven-usage information to improve the sensitivity of its temperature sensor or to recommend another of its products to the consumer, it need not offer the consumer a choice for these uses, which are consistent with its relationship with the consumer. On the other hand, if the oven manufacturer shares a consumer’s personal data with, for example, a data broker or an ad network, such sharing would be inconsistent with the context of the consumer’s relationship with the manufacturer, and the company should give the consumer a choice.”
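The FTC’s oven scenario amounts to a simple decision rule: uses consistent with the consumer relationship need no extra choice, while sharing with outside parties does. A minimal sketch of that rule, with all names (the use categories, the function) invented here for illustration rather than drawn from the report:

```python
# Illustrative sketch of the FTC's "context of the relationship" test.
# The categories below paraphrase the oven example; they are not an
# official taxonomy.
CONSISTENT_USES = {"improve_temperature_sensor", "recommend_own_product"}
THIRD_PARTY_SHARING = {"data_broker", "ad_network"}

def requires_consumer_choice(use: str) -> bool:
    """Return True if the company should offer the consumer a choice."""
    if use in CONSISTENT_USES:
        return False  # consistent with the consumer relationship
    if use in THIRD_PARTY_SHARING:
        return True   # inconsistent with context: offer a choice
    return True       # default conservatively to offering a choice

print(requires_consumer_choice("improve_temperature_sensor"))  # False
print(requires_consumer_choice("data_broker"))                 # True
```

The conservative default in the last branch reflects the report’s general tilt toward notice and choice when a use falls outside the consumer’s expectations.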
Technology may help. The Future of Privacy Forum, a Washington-based think tank that advocates for responsible data practices, suggested in comments to the FTC that companies tag data with permissible uses so that software can identify and flag unauthorized uses. Microsoft envisioned a manufacturer that offers several devices using a consumer’s preference for one device to set a default preference for the others.
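The Forum’s tagging idea is easy to picture in code: each record carries the set of uses the consumer has permitted, and software checks any proposed use against that set before proceeding. The sketch below is hypothetical, assuming nothing about any particular company’s implementation:

```python
# Hypothetical sketch of use-based data tagging: each record carries
# its permitted uses, and a checker flags anything not on the list.
from dataclasses import dataclass

@dataclass(frozen=True)
class TaggedRecord:
    value: dict
    permitted_uses: frozenset  # uses the consumer agreed to

def check_use(record: TaggedRecord, proposed_use: str) -> bool:
    """Return True if the proposed use is permitted; flag it otherwise."""
    if proposed_use in record.permitted_uses:
        return True
    print(f"FLAGGED: '{proposed_use}' is not permitted for this record")
    return False

reading = TaggedRecord(
    value={"oven_temp_f": 400},
    permitted_uses=frozenset({"product_improvement"}),
)
check_use(reading, "product_improvement")  # True
check_use(reading, "ad_network_sharing")   # False, and flagged
```

In practice the tags would have to travel with the data as it moves between systems, which is the hard part the Forum’s proposal leaves to implementers.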
As the proposals suggest, notice and choice can be a challenge to achieve when our appliances collect data while we go about our lives. But as the FTC observed, “giving consumers information and choices about their data… continues to be the most viable [approach] for the IoT in the foreseeable future.”