January 7, 2019
Since the UK’s vote on Brexit and the 2016 US presidential election, the world has become increasingly agitated about the power, and the potential for abuse, implicit in the vast global utilities Google, Facebook and Amazon. We are only now beginning to see a dim outline of the ways in which access to the behaviors and perceptions of billions of consumers worldwide can be used for shadowy purposes by targeting messages – both factual and fictional – to inflame and to terrorize. In response, public institutions such as the EU and the US Congress have been taking increasingly harsh tones with these quasi-monopolies, including, in the case of the EU, a series of ever-increasing financial penalties.
What this furor about secretive social media campaigns has highlighted anew is the fundamentally bizarre nature of the trade-off consumers make for access to these universal platforms. Consumers buy the convenience of instantaneous access to information, ecommerce and social connection in exchange for their historical and real-time personal data and networks, which the search engines and social platforms monetize into vast revenues by selling advertisers access to the insights acquired. Concerns about the equity of this “barter” system date to the dawn of the internet and social media age but have once again started to move center stage. This time, however, we believe that the critics will not restrict their policy recommendations to the titans of our online ecosystem but will expand their purview to include every company that uses data about its customers for commercial purposes. The discussion about how well corporations keep the data they hold about customers private, and how transparent they are about how they use that data, is an ongoing one. It is critical, therefore, that we also explore the ramifications of a shift in consumer expectations of how personal data are used and whether consumers believe they are compensated adequately for that use.
Efforts to devise an effective system by which users of social media and search engines could “own” and be compensated for the use of their data have a long heritage. The idea was given wider currency by virtual reality pioneer Jaron Lanier in his 2013 book, Who Owns the Future?. It was revived more recently, as Eduardo Porter described in The New York Times, in a new book, Radical Markets, by Eric Posner of the University of Chicago Law School and E. Glen Weyl of Microsoft. Posner and Weyl have developed an intriguing construct they call Data as Labor (DaL). They argue that, rather than treating our personal data as a by-product of our barter relationships with the Googles and Facebooks of the world, we should recognize that our data are in fact a form of labor, that all of us are its suppliers, and that we should be compensated for supplying it.
Initiatives based on this core idea are already beginning to move from the theoretical to the practical. In May 2018, activists in Amsterdam launched what they call a Data Labor Union, with the goal of electing leaders to negotiate directly with Google and Facebook over compensation for shared data and to serve as a channel for airing grievances about its use. Not surprisingly, pioneers in the field of blockchain technology almost immediately grasped how their technology could be deployed to give individuals control over their own data in ways that would make it easier to monetize. One current example is DataFund, which runs on the Ethereum and Swarm blockchain platforms. The concept is that individuals use the DataFund protocol to create their own data funds. These data funds not only protect and curate the individual’s data but also operate on the principle that organizations that use data can “lease” this information in exchange for financial value.
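The mechanics of such a data “lease” are easiest to see in miniature. The sketch below is purely illustrative, written in plain Python rather than on any blockchain, and every name in it (PersonalDataFund, grant_lease and so on) is our own invented stand-in, not DataFund’s actual protocol: an owner deposits data by category, grants a paid, time-limited lease on named categories to an organization, and can revoke it at will.

```python
import time
from dataclasses import dataclass

@dataclass
class Lease:
    organization: str
    fields: frozenset   # data categories the lessee may read
    fee: float          # payment made to the data owner
    expires_at: float   # Unix time after which access lapses

class PersonalDataFund:
    """Toy model of an individually owned "data fund": the owner curates
    records by category and grants revocable, paid, time-limited leases."""

    def __init__(self):
        self._records = {}   # category -> data
        self._leases = {}    # organization -> Lease
        self.earnings = 0.0  # fees collected by the owner

    def deposit(self, category, data):
        self._records[category] = data

    def grant_lease(self, organization, categories, fee, duration_s):
        self._leases[organization] = Lease(
            organization, frozenset(categories), fee,
            time.time() + duration_s)
        self.earnings += fee  # the owner is compensated up front

    def revoke(self, organization):
        self._leases.pop(organization, None)

    def read(self, organization, category):
        lease = self._leases.get(organization)
        if lease is None or time.time() > lease.expires_at:
            raise PermissionError("no active lease for " + organization)
        if category not in lease.fields:
            raise PermissionError("category not covered by lease")
        return self._records[category]
```

In a real deployment the ledger of leases and payments would live on-chain; the point of the toy is simply the asymmetry the DaL argument calls for: access flows only while the owner is being compensated, and the owner can withdraw consent at any time.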
Each version of this concept comes with a slightly different operating model. Datum, which offers a calculator to demonstrate how much value each data user derives from your individual data, is also built on a blockchain-based network, and its data tokens are traded on ten blockchain exchanges. The number of data owners is still small (10,000 users), but Datum has large ambitions. In a predictive flourish characteristic of blockchain visionaries, Datum founder Roger Haenni describes the company’s ambition as follows: “Datum is a decentralized global scale database with blockchain characteristics that allows anyone to securely store structured data. Datum returns data ownership to individuals and lets them monetize their data on their own terms. This lays the foundation for a new marketplace powered by a new asset class. Datum turns structured data into a tradeable commodity and is building the $120 Billion Data Economy”.
Concerns about the fair use of consumer data are not restricted to internet visionaries and libertarian academics. In her piece for the Financial Times, “Should Google and Amazon pay us for our data?”, Gillian Tett cites a recent poll by the Pew Research Center that suggests 91 per cent of Americans feel they have lost control of their personal data and 61 per cent want the government to introduce more controls (Tett, 2018). There is significant disagreement about what form such greater controls should take. Some have argued that Facebook and Google should be regulated as utilities, like power, transportation and communications. Others, such as Crawford, are convinced that regulating social media as a utility would actually make things worse, believing that it would simply muddy the waters and let existing communications utilities avoid genuine oversight.
In light of the enormous financial value that social media and search companies generate from individual consumer data, and the growing concerns about privacy and manipulation, it seems more likely than not that they will be regulated more severely in some form or other. Whether that involves these companies becoming the subject of anti-trust actions, as AT&T, IBM and Microsoft have variously been, or being forced to pay consumers for the use of their data, remains to be seen. We believe, however, that any of these courses of action will have implications not just for social media and search but for a wide range of other corporate uses of data. Any company, in fact, that uses data about individual consumers, whether to improve its service, predict market needs or cross-sell, should look closely at its current policies to ensure that it does not become as appealing a target as Google and Facebook are today.
The EU’s General Data Protection Regulation (GDPR) has already significantly advanced consumer privacy protections by requiring that companies disclose what they are doing with an EU citizen’s personal data and obtain explicit consent for a range of uses. However, we believe that as consumers become more familiar with all the ways their data are used, the calls for even more rigorous regulation are likely to grow stronger. The greatest potential for distrust or abuse, for example, is likely to come from the Internet of Things (IoT).
This relatively new data source has received a great deal of attention in the past three years, much of it rather breathless. McKinsey’s much-shared prediction states that the IoT sector will be contributing $11.1tn in economic impact by 2025, based, presumably, on the 20.4 billion connected devices that Gartner claims will be plugged in by 2020. Much of this value will be derived from the ways in which these data sources enable predictive maintenance of power stations and chemical refineries and warn of potential threats from corroded pipelines or nuclear power stations. Much attention has also been focused on the cyber security risks posed by our increasingly connected and online infrastructure systems. However, there has been little serious study of the impact of the IoT at the individual household level, where much of the potential reputation risk lies. These risks are directly proportional to the extraordinary power that connected domestic devices will give manufacturers. If that power is abused, it is likely to stall and even block the value that both parties – manufacturers and consumers – can derive from the IoT.
Today, many consumers who think of themselves as trendsetters feel positive about the use of data in their smart homes. According to a study by Columbia Business School, 75 per cent of people are more willing to share personal data with brands they trust and 80 per cent will share non-required data for loyalty points or other benefits. A report by the IAB states, furthermore, that 65 per cent of people with IoT devices are willing to receive advertising on them and 55 per cent would pay attention if they could get a coupon in return. In light of this enthusiasm, it is not surprising, as the manufacturing services company Jabil reports, that 99 per cent of manufacturers of smart devices plan to collect consumer data and use it. If handled appropriately and sensitively, the use of IoT data in the home has the potential to turn every manufacturer, whether of TV sets, vacuum cleaners or refrigerators, into a service provider, with all of the customer intimacy that this implies.
The reason for the heightened sensitivity should be clear. It is one thing to handle Facebook postings and Instagram stories with care, which is obviously important. Data about what goes on in people’s bedrooms, kitchens, bathrooms and playrooms rise to an altogether higher level of intimacy, and companies need to think through clearly what limits they should set on their use of this data. There are three areas of particular concern.
Data privacy and security. Too many of the early smart devices had poorly designed encryption and security, or very little of either. Horror stories about baby monitors and connected toys have made consumers highly sensitive to device security. In fact, according to a report by PwC, 75 per cent of US internet users would be willing to pay extra for heightened levels of security. As connected devices become ubiquitous, we believe standards in cyber security and data privacy in the IoT will converge, but companies will need to exercise real vigilance to make sure that they are industry leaders, or at least in the middle of the pack.
Transparency. As the range of possible communications and behaviors driven by IoT data increases, it will become ever more important to offer highly personalized options. The small print that now governs the contractual relationship between manufacturer and customer as it pertains to data privacy and usage is an entirely inadequate model. Manufacturers of smart devices need to reimagine how this aspect of the relationship is handled. It may even require an in-home visit, not from an installer but from a well-trained data privacy specialist who can take the home owner through all of the data usage options in real time, face to face. At the very least, companies that use bots to help educate users and sell them additional services should consider privacy bots that explain all of the options and risks.
Creepiness. The kind of information that smart home devices will increasingly provide to manufacturers raises new dangers for how that information is handled. The entertainment consumed, personal hygiene practices, consumption of food and drink, and personal conversations are all forms of data that these devices will capture. Clearly, the use of these kinds of data will need to be strictly controlled and explicitly addressed in company privacy policies. Particular care will need to be taken to ensure that any automated marketing messages and other algorithm-driven responses to the data are carefully gated by human “listeners” to ensure that they are contextually appropriate. To offer one gruesome example: the oven that suggests baking the Strudel it knows a particular home owner enjoys also needs to know that this recipe was a favorite of a child who was killed in a car accident the day before the message went out. Ironically, the more private data the machine has, the greater the need for careful human monitoring.
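The human-gating principle can be reduced to a simple routing rule. The sketch below is a hypothetical illustration, not any manufacturer’s system; the category names and the route_message function are invented for the example. Messages triggered only by low-sensitivity data go out automatically, while anything that touches a sensitive category is held for a human listener to approve or veto.

```python
# Illustrative sketch: gate algorithm-driven messages behind human review
# whenever they draw on sensitive smart-home data categories.

SENSITIVE_CATEGORIES = {"health", "conversations", "bedroom", "grief_risk"}

def route_message(message, trigger_categories, auto_queue, review_queue):
    """Place the message on the appropriate queue and report what happened.

    Returns "sent" if the message relied only on low-sensitivity data,
    "held" if a human listener must approve it before dispatch."""
    if set(trigger_categories) & SENSITIVE_CATEGORIES:
        review_queue.append(message)  # a human decides whether this is appropriate
        return "held"
    auto_queue.append(message)        # safe to dispatch automatically
    return "sent"
```

Under this rule, a coupon triggered purely by kitchen-appliance usage would go out automatically, while the Strudel suggestion above, had the system flagged any grief-related context, would sit in the review queue until a human cleared it.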
Despite the many risks involved in the transition to individual ownership of personal data and the spread of IoT devices, we believe that these technology transitions have the potential to create great competitive advantages for companies that handle them correctly. The darker alternative is that this bright future is compromised by poorly thought-through and poorly executed privacy management and data utilization. That truly would be killing the goose that laid the golden eggs.