Our vacuums could sell our data to the highest bidder, but we don’t seem to care – CBC News

iRobot, the maker of the popular automated vacuum the Roomba, wants to sell the data it collects about people's houses to tech companies that create "smart" home tools: corporations like Amazon, Apple, or Google. As Reuters described, the data that iRobot might sell elsewhere is "of the spatial variety" – that is, information such as the distance between walls or furniture, the sort of details that might make a device that heats a room more efficient, or might allow a company to market items a home is missing to future customers. Is that weird?

Privacy advocates think so. Jim Killock, the executive director of the Open Rights Group, a U.K. online rights nonprofit, told the Guardian that iRobot’s decision to sell data is a “particularly creepy example of how our privacy can be undermined by companies that want to profit from the information that smart devices can generate about our homes and lives.” Companies ought to treat data about people’s homes as if it were personal data, he said, “and ensure that explicit consent is sought to gather and share this information.”

But even if it were treated as personal data, would anyone be more careful with it?

Reading the terms and conditions

The Roomba’s terms and conditions already carry a clause stating that owners allow collected data to be shared with “other parties in connection with any company transaction, such as a merger, sale of all or a portion of company assets or shares,” as well as in a few other instances. Do company assets include the data the vacuum cleaner collects? Probably. Is that enough of a hint to tell Roomba owners what might happen to the mapping data their vacuum cleaner will collect? Most consumers would likely say no.

Since the initial Reuters story surfaced, iRobot’s CEO, Colin Angle, has walked his language back. In a statement, Angle said “iRobot will never sell your data,” and emphasized that customers have control over their data. Which is true – but only to whatever extent the terms and conditions allow.

Still, the Roomba data case is a good reminder that the rules that govern many aspects of our lives, both online and – increasingly – off, are not, as Killock put it, “explicit.” The “creepy” implications are just that: implied. We are left largely unaware of what those implications might actually mean in practice – if we even bother to read user agreements in the first place (which we don’t). That needs to change.

How? The forward-facing language of…

