The inevitability of tokenized data
We’re reaching the endgame of an inevitable showdown between big tech and regulators, with a key battleground around consumer data. In many ways, the fact that things have gotten to this point reflects that the market has not yet developed an alternative to the data paradigm that dominates today: Google and Facebook as sourcers and sellers of data, and Amazon as its host.
The tokenization and decentralization of data offer such an alternative. While the first generation of “utility” tokens was backed by nothing more than dreams, a new generation of tokens, tied explicitly to the value of data, will arise.
The conversation around data has reached a new inflection point.
Presidential candidate Sen. Elizabeth Warren has called for the breakup of technology giants, including Amazon and Facebook. In many ways, the move feels like an inevitable culmination of the last few years, in which public sentiment around the technology industry has shifted from overwhelmingly positive to increasingly skeptical.
One part of that growing skepticism has to do with the fact that when populist ideology rises, all institutions of power are subject to greater scrutiny. But when you home in on the specifics, it is clear that the issue underlying the loss of faith in technology companies is data: what is collected, how it is used, and who profits from it.
Facebook’s Cambridge Analytica scandal, in which a significant amount of user data was harvested and used by political operatives to sow discord and help get Trump elected in 2016, and Facebook CEO Mark Zuckerberg’s subsequent testimony in front of Congress, were a watershed moment in this loss of faith around data.
Those who dismissed consumer outrage by pointing out that barely anyone actually left the platform failed to recognize that the real impact was always more likely to look like this: providing political cover for a call to break up the company.
Of course, not every 2020 Democratic candidate for the Presidency agrees with Warren’s call. In a response to Warren, Andrew Yang, the upstart candidate who has made waves with his focus on Universal Basic Income and with appearances on Joe Rogan’s popular podcast, wrote: “Agree there are fundamental issues with big tech. But we need to expand our toolset. For example, we should share in the profits from the use of our data. Better than simply regulating. Need a new legal regime that doesn’t rely on consumer prices for anti-trust.”
While one could suggest that Yang is biased because he comes from the world of technology, he has also been more vocal and articulate about the coming threat of displacement from automation than any other candidate. His notion of a different economic arrangement around data, between the people who produce it and the platforms that use it (and sell advertising against it), is worth considering.
In fact, one could argue not only that this sort of heavy-handed regulatory approach to data is inevitable, but that it reflects a fundamental market failure in the way the economics of data are organized.
Data, it has been said, is the new oil. It is, in this analogy, the fuel by which the attention economy functions. Without data, there is no advertising; without advertising, there are none of the free services which have come to dominate our social lives.
Of course, the market for data has another aspect as well, which is where it lives. Investor (and former Facebook head of growth) Chamath Palihapitiya pointed out that 16% of the money he puts into companies goes directly into Amazon’s coffers for data hosting.
This fact shows that, while regulators (and, even more, Presidential candidates looking to score points with a populist base) might think that all of technology is aligned around preserving today’s status quo, there are in fact big financial motivations for something different.
Enter ‘decentralization.’
In his seminal essay “Why Decentralization Matters,” A16Z investor Chris Dixon explained how incentives diverge in networks. At the beginning of a network’s life, the network owners and participants share the same incentive: to grow the number of nodes in the network. Inevitably, however, a threshold is reached where pure growth in new participants is no longer achievable, and the network owner has to turn instead to extracting more from the existing participants.
Decentralization, in Dixon’s estimation, offers an alternative. In short, tokenization would allow all users to participate in the financial benefit and upside of the network, effectively eliminating the distinction between network owners and network users. When there is no distinct ownership class, there is no one who has the need (or power) to extract.
The essay was a brilliant articulation of an idealized state (reflected in its 50,000+ claps on Medium). In the ICO boom, however, things didn’t exactly work out the way Dixon had imagined.
The problem, on a fundamental level, was what the token actually was. In almost every case, the “utility tokens” were simply payment tokens: an alternative money just for that service. Their value relied on speculation that they could achieve a monetary premium allowing them to transcend utility for just that network, or that the network would grow so large that the token’s value could be sustained over time.
It’s not hard to understand why things were designed this way. For network builders, this sort of payment token allowed a totally non-dilutive form of capitalization that was global and instantaneous. For retail buyers, they offered a chance to participate in risk capital in a way they had been denied by accreditation laws.
At the end of the day, however, the simple truth was that these tokens weren’t backed by anything other than dreams.
When the market for these dream coins finally crashed, many decided to throw out the token baby with the ICO bathwater.
What if the crash prompted a question instead: what if the tokens in decentralized networks weren’t backed by nothing but dreams, but were instead backed by data? What if instead of dream coins, we had data coins?
Data is indeed the oil of the new economy. In the context of any given digital application, data is where the value resides: for the companies that are paid to host it; for the platforms that are able to sell advertising against it; and for the users who effectively trade their data for reduced-price services.
Data is, in other words, an asset. Like other assets, it can be tokenized and decentralized onto a public blockchain. It’s not hard to imagine a future in which every meaningful piece of data in the world is represented by a private key. Tying tokens to data explicitly creates a world of new options for reconfiguring how apps are built; a rough sketch of what that binding could look like follows.
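As a rough illustration of what “representing data with a private key” could mean in practice, here is a minimal Python sketch, assuming nothing more than content-addressing plus a locally generated key. The DataToken structure and its fields are hypothetical stand-ins for whatever on-chain record a real protocol would use, not part of any existing system.

```python
# A minimal sketch (hypothetical, not any specific protocol) of binding a piece
# of data to a key: the data is content-addressed by its hash, and control is
# represented by a private key held by whoever owns the record.
import hashlib
import secrets
from dataclasses import dataclass


@dataclass
class DataToken:
    content_hash: str   # SHA-256 of the underlying data (the "asset")
    owner_pubkey: str   # identifier derived from the owner's private key


def tokenize(data: bytes) -> tuple[DataToken, bytes]:
    """Return a token for `data` plus the private key that controls it."""
    private_key = secrets.token_bytes(32)
    # Real systems would derive an elliptic-curve public key; hashing the
    # private key here is just a stand-in to keep the sketch stdlib-only.
    owner_pubkey = hashlib.sha256(private_key).hexdigest()
    content_hash = hashlib.sha256(data).hexdigest()
    return DataToken(content_hash, owner_pubkey), private_key


if __name__ == "__main__":
    token, key = tokenize(b"a user's browsing history, say")
    print(token.content_hash, token.owner_pubkey)
```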
First, data tokenization could create an opportunity for nodes in a decentralized hosting network (i.e. a decentralized alternative to AWS) to effectively speculate on the future value of the data in the applications they host, creating a financial incentive beyond simple service provision. When third parties like Google want to crawl, query, or access the data, they would pay in the token representing the data (a datacoin), with fees flowing back to the miners securing and storing it as well as to the developers who acquire, structure, and label the data so that it is valuable to third parties, especially machine learning and AI-driven organizations. (A toy sketch of this fee split appears after this list.)
Second, app builders could not only harness the benefits of more fluid capitalization through tokens, but also easily experiment with new ways to arrange value flows, such as cutting users in on the value of their own data.
Third, users could start to have a tangible (and trackable) sense of the value of their data, exert market pressure on platforms to be included in the upside, and exert more control over where and how their data is used.
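Here is that toy sketch: a minimal Python illustration of how a single query fee, paid in a hypothetical datacoin, might be split between the nodes that store a dataset and the developers who curated it. The split percentages, function name, and participant labels are all assumptions for illustration, not part of any existing protocol.

```python
# Hypothetical sketch of a datacoin fee split between the nodes that store a
# dataset and the developers who structured it. Percentages are illustrative.
from collections import defaultdict

STORAGE_SHARE = 0.70    # assumed share for hosting/mining nodes
DEVELOPER_SHARE = 0.30  # assumed share for data curators/developers


def settle_query_fee(
    fee: float, storage_nodes: list[str], developers: list[str]
) -> dict[str, float]:
    """Split a query fee (denominated in datacoin) across participants."""
    payouts: defaultdict[str, float] = defaultdict(float)
    for node in storage_nodes:
        payouts[node] += fee * STORAGE_SHARE / len(storage_nodes)
    for dev in developers:
        payouts[dev] += fee * DEVELOPER_SHARE / len(developers)
    return dict(payouts)


if __name__ == "__main__":
    # e.g. a crawler pays 10 datacoin to query a hosted dataset
    print(settle_query_fee(10.0, ["node-a", "node-b"], ["dev-x"]))
```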
Tokenized data, in other words, could create a market mechanism to redistribute the balance of power in technology networks without resorting to ham-fisted (even if well-meaning) regulation like GDPR, or, even worse, the sort of break-up proposed by Warren.
Even after the implosion of the ICO phenomenon, there are many like Fred Wilson who believe that a shift to user control of data, facilitated by blockchains, is not just possible but inevitable.
Historically, technology has evolved from closed to open, back to closed, and then back to being open. We’re now in a closed phase where centralized apps and services own and control a vast majority of the access to data. Decentralized, p2p databases — public blockchains — will open up and tokenize data in a disruptive way that will change the flow of how value is captured and created on the internet.
Put simply, tokenized and open data can limit the control data monopolies have on future innovation while ushering in a new era of computing.
It’s how information can finally be set free.
from TechCrunch https://ift.tt/2Ho4IWi