Each innovation challenges the norms, codes, and values of the society in which it is embedded. The industrial revolution unleashed new forces of productivity but at the cost of inhumane working conditions, leading to the creation of unions, labor laws, and the foundations of the political party structures of modern democracies. Fossil fuels powered a special century of growth before pushing governments, companies, and civil society to phase them out to protect our health, ecology, and climate.
When innovations lead to disaster, it says much about the societal context. The Chernobyl nuclear disaster embodied the flaws of Soviet planning. The US opioid crisis, which turned an essential pain-relieving medication into a fatally addictive drug that has killed hundreds of thousands, reflects many of the fractures of modern America, from the lobbying power of the pharmaceutical industry and a fragmented health system to post-industrial economic decline.
The digital technology revolution is facing its own societal reckoning, as its benefits are eclipsed by the harmful practices and business models it has unleashed. Critics describe a surveillance capitalism that shapes consumer behavior and channels our choices to profit a tiny group of tech giants. Insurance companies harness data to exclude certain customers unfairly. Data is resold without consent. Flawed algorithms inform criminal sentencing and predict student grades.
Governments and international organizations have already responded to digital technology’s threats. Personal data protection has been enshrined as a legal right, most notably in the European Union’s General Data Protection Regulation (GDPR), a gold standard being replicated and adapted in other jurisdictions. Ethical frameworks governing the use of artificial intelligence have been drawn up in countries such as Canada and France, and by companies themselves.
But a chorus of critical voices, from industry whistle-blowers to economists, historians, and anthropologists, is calling for far-reaching reforms to ensure digital and data innovation adds true value to society. “Don’t be evil” may preclude a company from outright unethical actions, but what of an advertising-based business model that channels huge profits to companies while delivering only trivial societal benefits?
A net positive?
“I was very optimistic about the benefits of the internet and digital technology in the early days, and frankly, it has been hugely disappointing,” says Paul Romer, former chief economist of the World Bank and winner of the Nobel Prize for economics. “I don’t think anything has come out of the internet revolution that was actually a net positive for society, except maybe Wikipedia.”
Diane Coyle, economist and co-director of the Bennett Institute for Public Policy at the University of Cambridge, is also unconvinced that the digital revolution has delivered much tangible value. “If an economy is changing to make things better for almost everybody, I’m going to worry less about the distribution of benefits across people. But I don’t think we’re even at that starting point yet. The digital economy is making things better for a small number of people. You’ve got the obvious free services, but beyond that, it’s quite hard to know what the net benefit is.”
Coyle and Romer are two of a growing community of influential voices, which includes pioneers of the original digital revolution who are disenchanted with its current state. Tim Berners-Lee, who invented the web itself, has co-founded a new venture, Inrupt, to create a web of “shared benefit.” Virtual reality innovator Jaron Lanier thinks the early dream of free information was a mirage, and that making everything free in exchange for advertising would lead to a manipulative society.
Ahead of the Data Paradigm Forum 2020, an event convened by the Omidyar Foundation to define a policy agenda to widen the benefits of the digital revolution, we examine the dynamics of the data economy today–what went wrong, and how to fix it.
Oil, sunlight, capital, or labor? Defining data
To understand what data is, and how to govern it, metaphors and allegories can be helpful. The most popularly used analogy is that of oil. This captures data’s power to transform our economic and geopolitical order–and the importance of extraction and processing in giving it value. When Zoom’s market capitalization surpassed ExxonMobil’s, it was yet another indication of the tech sector “replacing” the old economic order.
But unlike oil, data does not get its value from the companies that extract and process it alone; many actors, from governments to citizens, contribute. Yet some companies sit on and monetize data as though they found it in the earth, when that data could be given more value through sharing and collaboration.
“Take the example of a car-share company,” says Coyle. “They have all this data about what journeys people make, what they are willing to pay for, the peaks and troughs. Shouldn’t some of that be shared with public transit authorities, rather than being the personal property of a company that is, after all, using all kinds of public investments?” Data, on this analysis, looks less like a natural resource and more like a utility or piece of infrastructure, such as a telephone or water network.
“Data as sunlight,” coined by Alphabet’s chief financial officer Ruth Porat, is another metaphor that captures its ubiquity and renewability. What it misses is that data, unlike sunlight, can be bottled and sold. When consumer genetic-testing company 23andMe gave pharmaceutical company GSK access to customer data as part of a research partnership, it was criticized by some for falling short of the informed-consent protocols that should govern such a transaction.
Tim O’Reilly, founder and CEO of O’Reilly Media, who helped popularize the “open source” and “Web 2.0” monikers, has no objection to consumers giving their data to digital platforms for a convenience like Google Maps, but wants to limit the kinds of rents that are extracted on top. “I give Google my location and I expect them to give me services for that. I don’t expect them to give that data to another company. Similarly, there is no reason why a bank or phone company should resell our data. It’s not their business model, it’s just a rent.”
Putting a value on data
Every data metaphor captures something of data’s character, but none envelops it entirely. Data has distinctive traits and dynamics that must be factored into any robust interpretive framework. These include its reusability: unlike fossil fuels, data can be reused endlessly, and in domains like artificial intelligence and machine learning its value grows with scale. Data also creates externalities, from positive ones, such as an individual’s car journey helping a service like Google Maps or Waze direct drivers away from congestion, to harmful ones, such as data shared with one person’s consent being used to infer insights about other consumers who share demographic or other attributes but never shared their own data.
The challenge is to define data on its own terms, with new tools and ideas that factor in its true nature. Critics believe the first step is a more accurate picture of where data gets its value, and of the difference between value and mere rent.
Mariana Mazzucato, professor in the economics of innovation and public value at University College London, thinks that profits in the digital technology era have become confused with value. She draws parallels to critiques of GDP as a misleading indicator when looked at through different lenses.
Feminist economists have argued that it fails to capture value-creating activities like domestic work and care, for instance, while environmentalists point out the absurdity of a metric under which pollution increases GDP, both through the output that produces it and through the money spent cleaning it up. Mazzucato explains, “On top of those arguments, you have a problem that profits get confused with rents. We see this in the financial sector, where a hedge fund trade or a transaction like a credit-default swap leads to a fee or a profit. We assume that equals value without asking a question: what are the market participants actually doing? Is anything productive happening here?”
Valuing data means understanding who participated in its creation. Governments, for instance, have been essential investors in both the technology and the data systems of today, from GPS to the internet itself. Yet they still do not see themselves as value creators. “Once you see that government funds technological progress, it becomes not just a regulator, administrator, or a market fixer, but an investor and a value creator,” she says.
Data’s value is also a product of the input and participation of digital users, often under cloudy consent protocols: granting platforms permission to access their data as part of a consumption experience, or performing labeling and digitization work through processes like reCAPTCHA. Indeed, Google faced a class-action lawsuit arguing that its reCAPTCHA program was, in effect, a vast unpaid labor enterprise through which the company transcribed troves of books, newspapers, and street imagery[5].
Viewing data through the lens of the participants involved in its value creation can, in turn, help distinguish sensible remedies from misguided ones. The dominant policy discussion, from Brussels to Washington, centers on antitrust and breaking up tech giants, but this misses the point that it is data practices, rather than size, that are at issue. “What’s important is not to make accusations against big tech,” says Mazzucato. “It is about getting into the nitty-gritty of how algorithms are being used. Are they creating great value for society, or increasing monopoly profits and ad revenues? Breaking up big tech is a simplistic notion, as opposed to breaking up their rent-seeking activities and letting their value-creating activities prosper.”
Romer believes a progressively increasing tax on revenue from targeted digital advertising could shift advertising-driven platforms to ad-free subscription models, removing the imperative to track consumers and creating a more transparent transaction. “At the moment, consumers have essentially no choice at all about many critical dimensions of their digital experience,” says Romer. “I cannot use online banking without being subject to a massive amount of tracking. I can’t buy an airline ticket without the airline sending some of my information to Facebook.” In effect, there is “no possibility of accessing digital services that doesn’t compel me to go along with this surveillance and advertising economy.”
Romer envisions a Netflix-like subscription service in which people pay a modest fee for utilities like search. “People get mad at me for this idea because they like free services. I’m sorry, but that’s not how economics works. If you think you are getting something for free, you don’t understand what’s going on.”
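Romer’s tax mechanism is easiest to see with numbers. The sketch below is purely illustrative: the revenue brackets and rates are hypothetical assumptions, not part of his proposal, but they show how a marginal-rate structure makes the average tax burden climb with targeted-ad revenue, tilting the economics of the largest ad-driven platforms toward subscriptions.

```python
# Illustrative sketch only: a progressively increasing tax on targeted digital
# ad revenue. The brackets and rates below are hypothetical, chosen purely to
# show how the average rate rises with scale; they are not Romer's figures.

BRACKETS = [
    (0, 0.00),               # first $1B of ad revenue: untaxed
    (1_000_000_000, 0.05),   # 5% marginal rate above $1B
    (10_000_000_000, 0.15),  # 15% above $10B
    (50_000_000_000, 0.30),  # 30% above $50B
]

def progressive_ad_tax(ad_revenue: float) -> float:
    """Tax owed on targeted-ad revenue under the hypothetical marginal brackets."""
    tax = 0.0
    for i, (threshold, rate) in enumerate(BRACKETS):
        upper = BRACKETS[i + 1][0] if i + 1 < len(BRACKETS) else float("inf")
        if ad_revenue > threshold:
            tax += (min(ad_revenue, upper) - threshold) * rate
        else:
            break
    return tax

if __name__ == "__main__":
    for revenue in (5e8, 5e9, 5e10, 1.5e11):
        tax = progressive_ad_tax(revenue)
        print(f"ad revenue ${revenue:,.0f}: tax ${tax:,.0f} "
              f"(average rate {tax / revenue:.1%})")
```

Under these made-up brackets, a platform with $500 million in ad revenue pays nothing, while one with $150 billion faces an average rate of roughly 24 percent, which is the kind of pressure Romer expects would make subscription models look attractive.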
The careful fix
Although there is growing agreement that “something must be done” about fairness in the digital ecosystem, getting the solution right requires creative and technical expertise in equal measure, plus experimentation. For instance, Google’s founders considered fees and paid subscriptions in the early days, but users were anchored to the idea of the internet as a free service. If the events of the last few years are anything to go by, from election meddling on social media platforms to disinformation, the old economist’s maxim holds: there’s no such thing as a free lunch.
Paying people for their data, or their contribution to it, does have some real-world examples, like Amazon Mechanical Turk, an outsourcing marketplace for basic data tasks. But it can also misfire. When Microsoft tried to introduce micro-payments for data, armies of bots sprang up to manipulate the process. Some economists are uncomfortable with the entire idea of attaching property rights to data, even when this is for the benefit of the user. For one thing, the value of individual data is tiny. One online calculator, produced by the Financial Times, valued Coyle’s at a mere £6. “I tried to make it as high as I could,” she recalls. “Claiming ownership misses the point: the value of data is relational and collective.”
In looking at ways to build a fairer data economy, we may also need to ask much bigger questions than what tech companies are doing. “Big tech is being scapegoated for fundamental flaws in our rogue capitalist system,” says O’Reilly. He believes we need to see corporate strategies as rational within the rules society has set, describing relentless profit-seeking as the “master algorithm” of our current world. “What does it mean when Facebook has to promote false content because it’s more engaging, when oil companies deny climate change, or when pharma companies lobby the FDA to say OxyContin is not addictive? It’s because the master algorithm of our economy is that, to quote Milton Friedman, the social responsibility of business is to increase profits. We’ve baked that into the system. We are calling out the tech industry but really, it is showing us our system.”
This article was written by Insights, the custom content arm of MIT Technology Review. It was not produced by MIT Technology Review’s editorial team.