Selling Your Data without Selling Your Soul: Privacy, Property, and the Platform Economy

With Sinclair Davidson

Executive summary: Humans have always sought to defend a zone of privacy around themselves—to protect their personal information, their intimate actions and relationships, and their thoughts and ideas from the scrutiny of others. However, it is now common to hear that, thanks to digital technologies, we have little expectation of privacy over our personal information.

Meanwhile, the economic value of personal information is rapidly growing as data becomes a key input to economic activity. A major driver of this change is the rise of a new form of business organization that has come to dominate the economy: the platform. Platforms that can accumulate and store data and information are likely to make that data and information more valuable.

Given the growing economic importance of data, digital privacy has come to the fore as a major public policy issue. Yet, there is considerable confusion in public debates over the meaning of privacy and why it has become a public policy concern. A poor foundational understanding of privacy is likely to result in poor policy outcomes, including excessive regulatory costs, misallocated resources, and a failure to achieve intended goals.

This paper explores how to build a right to privacy that gives individuals more control over their personal data, and with it a choice about how much of their privacy to protect. It makes the case that privacy is an economic right that has largely not emerged in modern economies.

Regulatory attempts to improve individual control over personal information, such as the European Union’s General Data Protection Regulation (GDPR), have unintended consequences and are unlikely to achieve their goals. The GDPR is a quasi-global attempt to institute privacy protections over personal data through regulation. As an attempt to introduce a form of ownership over personal data, it is unwieldy and complex. It supplants the ongoing social negotiation around the appropriate ownership of personal data and presents a hurdle to future innovation.

In contrast to top-down approaches like the GDPR, the common law provides a framework for the discovery and evolution of rules around privacy. Under a common law approach, problems such as privacy are solved on a case-by-case basis, drawing on and building up a stock of precedent that has more fidelity to real-world dilemmas than do planned regulatory frameworks.

New technologies such as distributed ledger technology—blockchain—and advances in zero-knowledge proofs likewise provide an opportunity for entrepreneurs to improve privacy without top-down regulation and law.

Privacy is key to individual liberty. Individuals require control over their own private information in order to live autonomous and flourishing lives. While free individuals expose information about themselves in the course of social and economic activity, public policy should strive to ensure they do so only with their own implied or explicit consent.

The ideal public policy setting is one in which individuals have property rights over personal information and can control and monetize their own data. The common law, thanks to its case-by-case, evolutionary nature, is more likely to provide a sustainable and adaptive framework by which we can approach data privacy questions.

Published by the Competitive Enterprise Institute

Submission on the final report of the Australian Competition and Consumer Commission’s Digital Platforms Inquiry

With Darcy Allen, Dirk Auer, Justin (Gus) Hurwitz, Aaron Lane, Geoffrey A. Manne, Julian Morris and Jason Potts

The emergence of “Big Tech” has caused some observers to claim that the world is entering a new gilded age. In the realm of competition policy, these fears have led to a flurry of reports in which it is asserted that the underenforcement of competition laws has enabled Big Tech firms to crush their rivals and cement their dominance of online markets. They then go on to call for the creation of novel presumptions that would move enforcement of competition policy further away from the effects-based analysis that has largely defined it since the mid-1970s.

Australia has been at the forefront of this competition policy rethink. In July of 2019, the Australian Competition and Consumer Commission (ACCC) concluded an almost two-year-long investigation into the effect of digital platforms on competition in media and advertising markets.

The ACCC Digital Platforms Inquiry Final Report spans a wide range of issues, from competition between platforms to their effect on traditional news outlets and consumers’ privacy. It ultimately puts forward a series of recommendations that would tilt the scale of enforcement in favor of the whims of regulators without regard to the adverse effects of such regulatory action, which may be worse than the diseases they are intended to cure.

Available in PDF here.

Some Economic Consequences of the GDPR

With Darcy Allen, Alastair Berg, Brendan Markey-Towler, and Jason Potts. Published in Economics Bulletin, Volume 39, Issue 2, pages 785-797. Originally a Medium post.

Abstract: The EU General Data Protection Regulation (GDPR) is a wide-ranging personal data protection regime of greater magnitude than any similar regulation previously seen in the EU, or elsewhere. In this paper, we outline how the GDPR impacts the value of data held by data collectors before proposing some potential unintended consequences. Given the distortions of the GDPR on data value, we propose that new complex financial products—essentially new data insurance markets—will emerge, potentially leading to further systemic risks. Finally, we examine how market-driven solutions to the data property rights problems the GDPR seeks to solve—particularly using blockchain technology as economic infrastructure for data rights—might be less distortionary.

Available here.

Submission to the Australian Competition and Consumer Commission’s Digital Platforms Inquiry

With Gus Hurwitz.

Executive summary: The analysis in the Australian Competition and Consumer Commission’s Preliminary Report for the Digital Platforms Inquiry is inadequate in several ways, most notably:

  • It mischaracterises the relationship between changes in the economics of media advertising and the rise of digital platforms such as Facebook and Google.
  • Its analysis of the dynamics of media diversity is misguided.
  • Its competition analysis assumes its results and makes unsupportable claims about the division of advertising markets.
  • It is recklessly unconcerned with the freedom of speech consequences of its recommendations.
  • It fails to recognise, and proposes to supplant, the ongoing social negotiation over data privacy.
  • It provides a poor analytic base on which to make policy recommendations, as it applies a static, rather than dynamic, approach to its analysis.

There is a real danger that if the policy recommendations outlined in the preliminary report were to be adopted, Australian consumers would be severely harmed.

Available here.

The Classical Liberal Case for Privacy in a World of Surveillance and Technological Change

Palgrave Macmillan, 2018

How should a free society protect privacy? Dramatic changes in national security law and surveillance, as well as technological changes from social media to smart cities mean that our ideas about privacy and its protection are being challenged like never before. In this interdisciplinary book, Chris Berg explores what classical liberal approaches to privacy can bring to current debates about surveillance, encryption and new financial technologies. Ultimately, he argues that the principles of classical liberalism – the rule of law, individual rights, property and entrepreneurial evolution – can help extend as well as critique contemporary philosophical theories of privacy.

Available from Palgrave Macmillan.

Some economic consequences of the GDPR

With Darcy Allen, Alastair Berg and Jason Potts.

At the end of May 2018, the most far-reaching data protection and privacy regime ever seen will come into effect. Although the General Data Protection Regulation (GDPR) is a European law, it will have a global impact. There are likely to be some unintended consequences of the GDPR.

As we outline in a recent working paper, the implementation of the GDPR opens the potential for new data markets in tradable (possibly securitised) financial instruments. People’s data is better protected through self-governance solutions, including the application of blockchain technology.

The GDPR is in effect a global regulation. It applies to any company which has a European customer, no matter where that company is based. Even offering the use of a European currency on your website, or having information in a European language, may be considered offering goods and services to an EU data subject for the purposes of the GDPR.

The remit of the regulation is as broad as its territorial scope. The rights of data subjects include data access, rectification, the right to withdraw consent, erasure and portability. Organisations using personal data in the course of business must abide by strict technical and organisational requirements. These restrictions include gaining explicit consent and justifying the collection of each individual piece of personal data. Many organisations must also appoint a Data Protection Officer (DPO) to monitor compliance with the 261-page document.

Organisations collect data from customers for a range of reasons, both commercial and regulatory — organisations need to know who they are dealing with. Banks will not lend money to someone they don’t know; they need a level of assurance about their customer’s willingness and ability to repay. Similarly, many organisations are forced to collect increasingly large amounts of personal data about their customers. Anti-money laundering and counter-terrorism financing (AML/CTF) legislation requires many institutions to monitor their customers’ activity on an ongoing basis. In addition, many organisations derive significant value from personal data. Consumers and organisations exchange data for services, much of which is voluntary and to their mutual benefit.

One of the most discussed aspects of the GDPR is the right to erasure — often referred to as the right to be forgotten. This allows data subjects to call on the state to compel companies that hold their personal data to delete it.

We propose that the right to erasure creates uncertainty over the value of data held by organisations. This creates an option on that data.

The right to erasure creates uncertainty over the value of the data to the data collector. At any point in time, the data subject may withdraw consent. During a transaction, or perhaps in return for some free service, a data subject may consent to have their personal data sold to a third party such as an advertiser or market researcher. Up until an (unknown) point in time — when the data subject may or may not withdraw consent to their data being used — that personal data holds positive value. This is in effect a put option on that data — the option to sell that data to a third party.

The value of such an option is derived from the value of the underlying asset — the data — which in turn depends on the continued consent by the data subject.
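
To make the intuition concrete, here is a minimal sketch, with purely hypothetical numbers of our own (not drawn from the paper), of how the ever-present possibility of consent withdrawal discounts the expected value of a data record to its collector:

```python
# Illustrative sketch: how a right to erasure discounts the value of personal
# data held by a collector. All figures are hypothetical.

def expected_data_value(resale_value, withdrawal_prob, periods, discount_rate=0.05):
    """Expected present value of a data record that earns revenue each period,
    but only while the data subject has not withdrawn consent.

    resale_value    -- revenue per period while consent holds (hypothetical)
    withdrawal_prob -- probability consent is withdrawn in any given period
    periods         -- horizon over which the data is commercially useful
    """
    value = 0.0
    survival = 1.0  # probability consent is still in place at period t
    for t in range(1, periods + 1):
        survival *= (1 - withdrawal_prob)
        value += survival * resale_value / (1 + discount_rate) ** t
    return value

# Without a right to erasure the record is worth the full discounted revenue
# stream; with, say, a 20% chance of withdrawal each year, much of that value
# evaporates.
print(expected_data_value(10.0, 0.0, 5))   # ~43.3
print(expected_data_value(10.0, 0.2, 5))   # ~23.8
```

The gap between those two figures is the value at risk that a data collector would want to hedge or sell on.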

Rational economic actors will respond in predictable ways to manage such risk. Data-Backed Securities (DBS) might allow organisations to convert unpredictable future revenue streams into a single payment. Collateralised Data Obligations (CDO) might allow data collectors to package personal data into tranches of varying risk of consent withdrawal. A secondary data derivatives market is thus created — one whose operation, and whose secondary effects, we can barely predict.

Such responses to regulatory intervention are not new. The Global Financial Crisis (GFC) was at least in part caused by complex and rarely understood financial instruments like Mortgage-Backed Securities (MBS) and Collateralised Debt Obligations (CDOs). These were developed in response to poorly designed capital requirements.

Similarly, global AML/CTF requirements faced by financial institutions have caused many firms to simply stop offering their products to certain individuals and even whole regions of the world. The unbanked and underbanked are all the poorer as a result.

What these two examples have in common is good intentions. Adequate capital requirements and preventing money laundering are good things, but good intentions are not enough. Secondary consequences should always be considered and discussed.

Self-governance alternatives, including the application of blockchain technology, should be considered. These alternatives use technology to allow individuals greater control over the personal data they share with the world.

Innovators developing self-sovereign identity solutions are attempting to provide a market based way for individuals to gain greater control over — and derive value from — their personal data. These solutions allow users to share just enough data for a transaction to go ahead. A bartender doesn’t need to know your name or address when you want a drink, they just need to know you are of legal age.
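
The following is a deliberately simplified sketch of that selective-disclosure idea, not any particular self-sovereign identity product; it uses an HMAC held by the issuer as a stand-in for a proper digital signature, whereas a real system would use public-key signatures or zero-knowledge proofs so the verifier never needs a shared secret:

```python
import hashlib, hmac, json, os

ISSUER_KEY = os.urandom(32)  # held by a trusted issuer, e.g. a licensing authority

def issue_credential(attributes: dict) -> dict:
    """Issuer vouches for each attribute separately, so the holder can later
    reveal any one attribute without exposing the rest."""
    credential = {}
    for name, value in attributes.items():
        salt = os.urandom(16)
        commitment = hashlib.sha256(salt + json.dumps(value).encode()).hexdigest()
        tag = hmac.new(ISSUER_KEY, f"{name}:{commitment}".encode(), "sha256").hexdigest()
        credential[name] = {"value": value, "salt": salt, "tag": tag}
    return credential

def present(credential: dict, attribute: str) -> dict:
    """Holder reveals a single attribute plus the data needed to verify it."""
    entry = credential[attribute]
    return {"name": attribute, "value": entry["value"],
            "salt": entry["salt"], "tag": entry["tag"]}

def verify(presentation: dict) -> bool:
    """Verifier (the bartender) checks that the issuer vouched for this one
    attribute; nothing else in the credential is ever seen."""
    commitment = hashlib.sha256(presentation["salt"] +
                                json.dumps(presentation["value"]).encode()).hexdigest()
    expected = hmac.new(ISSUER_KEY, f"{presentation['name']}:{commitment}".encode(),
                        "sha256").hexdigest()
    return hmac.compare_digest(expected, presentation["tag"])

cred = issue_credential({"name": "Alice", "address": "1 Example St", "over_18": True})
print(verify(present(cred, "over_18")))  # True: age confirmed, name and address never shown
```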

Past instances of regulatory intervention should make us sceptical that even well-meaning regulation will achieve its stated objectives without negative side effects. Self-sovereign identity and the use of blockchain technology are promising solutions to the challenges of data privacy.

Government must leave encryption alone, or it will endanger blockchain

With Sinclair Davidson and Jason Potts

If we could give Malcolm Turnbull one piece of economic advice right now – one piece of advice about how to protect the economy against a challenging and uncertain future – it would be this: don’t mess with encryption.

Earlier this month the government announced that it was going to “impose an obligation” on device manufacturers and service providers to provide law enforcement authorities access to encrypted information on the presentation of a warrant.

At the moment it’s unclear what exactly this means. Attorney-General George Brandis and Malcolm Turnbull have repeatedly denied they want a legislated “backdoor” into encrypted devices, but the loose way they’ve used that language suggests some sort of backdoor requirement is still a real possibility.

Hopefully we’ll discover more when the legislation is introduced in the August sitting weeks. Turnbull did say at the press conference “I’m not suggesting this is not without some difficulty”. The government may not have made any final decisions yet.

But before any legislation is introduced, the government needs to understand what is at stake as it strives against encryption.

Anything the government does to undermine the reliability of encryption could have deleterious consequences for what we believe will be the engine of economic growth in decades to come: the blockchain protocol.

The blockchain is the distributed and decentralised ledger that powers the Bitcoin cryptocurrency. Blockchain constitutes a suite of five technologies: cryptography, a database that can be added to but not altered, peer-to-peer networking, an application of game theory, and an algorithm for ensuring a consensus about what information is held on the ledger.

Taken separately, these are long established technologies and techniques – even mundane ones. But taken together, they constitute an entirely new tool for creating political, economic, and social relationships.
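
To illustrate just one of those ingredients, here is a minimal, purely illustrative sketch of the "database that can be added to but not altered": each record commits to the hash of the record before it, so any attempt to rewrite history is immediately detectable. (Peer-to-peer networking, game theory, and consensus are not shown.)

```python
import hashlib, json, time

def make_block(data, prev_hash):
    # Each block carries the hash of its predecessor and its own content hash.
    block = {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def verify_chain(chain):
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["hash"] != hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest():
            return False  # block contents were altered after the fact
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False  # the link to the previous block is broken
    return True

chain = [make_block("genesis", "0" * 64)]
chain.append(make_block({"from": "alice", "to": "bob", "amount": 5}, chain[-1]["hash"]))
chain.append(make_block({"from": "bob", "to": "carol", "amount": 2}, chain[-1]["hash"]))
print(verify_chain(chain))          # True
chain[1]["data"]["amount"] = 500    # try to rewrite history
print(verify_chain(chain))          # False: tampering is immediately detectable
```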

The possibilities far exceed digital currencies. Already banks and other financial institutions are trying to integrate blockchains into their business structures: blockchains drastically reduce the costs of tracking, recording, and verifying transactions. Almost anything a business or government organisation now does with a database can be done more efficiently, more reliably, and more cheaply with a blockchain – property registers, intellectual property, security and logistics, healthcare records, you name it.

But these much publicised blockchain applications are just a small taste of the technology’s possibility. “Smart” self-executing contracts and massively distributed organisational structures enabled by the blockchain will allow the creation of new forms of business structures and new ways to work together in every sector and every industry.

In fact, we think that the blockchain is so significant that it should be treated as its own category of human organisation. There are firms, there are markets, there are governments, and now there are blockchains.

But the blockchain revolution is not inevitable.

If there is one key technology in the blockchain, it is cryptography. Silicon Valley entrepreneurs are playing around with many different adaptations of the blockchain protocol, but this one thing is constant: the blockchain’s nested levels of encryption are built to ensure that once something is placed on the blockchain it is permanent, immutable, and accessible only to those who own it.

Blockchains only work because their users have absolute confidence that the system is secure.

Any legal restrictions, constraints or hurdles placed on encryption will be a barrier to the introduction of this remarkable new economic technology. In fact, any suggestion of future regulatory challenges to encryption will pull the handbrake on blockchain in Australia. In the wake of the banking, mining and carbon taxes, Australia already has a serious regime uncertainty problem.

Melbourne in particular is starting to see the growth of a small but promising financial technology industry of which blockchain is a central part. The Australian Financial Review reported earlier this week on the opening of a new fintech hub, Stone & Chalk, in the establishment heart of Collins St. What’s happening in Melbourne is exactly the sort of innovation-led economic growth that the Coalition government was talking about in the 2016 election.

But the government won’t be able to cash in on those innovation dividends if it threatens encryption: the simple and essential technology at the heart of the blockchain.

Medicare details available on dark web is just tip of data breach iceberg

Modern governments use a lot of data. A lot. Our social services are organised by massive databases. Health, welfare, education and the pension all require reams of information about identity, social needs, eligibility, and entitlement.

Our infrastructure is managed by massive databases holding information about traffic flows, public transport usage, communications networks, and population flows.

Our security is maintained by complex information systems managing defence assets, intelligence data, and capabilities and deployment information.

We should be thinking about these enormous data holdings when we read the news that thieves have been selling Medicare numbers linked to identities on the “dark web” – a mostly untraceable anonymous corner of the internet.

That last detail is what has made this such a scandal for the government, as Human Services Minister Alan Tudge and the Australian Federal Police have scrambled to identify the system’s weaknesses.

But the fact that the Medicare numbers are being sold is the only thing that makes this an unusual data security breach. Australian government databases are constantly being accessed by people who are not authorised to do so.

Here’s just a taste. Last year the Queensland Crime and Corruption Commission revealed it had laid 81 criminal charges and made 11 disciplinary recommendations in the space of 12 months for unauthorised access to confidential information by police. One of those charged was a police officer who had been trawling through crime databases looking for information about people he had met on a dating service. He was convicted of 50 charges of unauthorised access.

A Queensland police officer was disciplined in May this year for using the police database to share a woman’s address with her husband, who was subject to a restraining order.

The Victorian government’s police database was wrongly accessed 214 times between 2008 and 2013, by “hundreds” of officers.

Earlier this year 12 staff were fired from the Australian Taxation Office for accessing tax data on celebrities and people they knew.

We could go on. These of course are the instances we know about because they have been detected and reported on. There are undoubtedly others.

Governments manage a lot of data because we ask them to do a lot, and to do it well.

They run thousands of complex systems. Many of these systems have been jerry-rigged and adapted from earlier systems, a series of politicised, over-budget and under-delivering IT projects stacked on top of each other over decades.

But these repeated episodes of unauthorised access show that these complex systems are in dire need of reform.

It is clear that the “permission” structures on these government databases are deeply broken.

In the debate over mandatory data retention one of the big questions was whether law enforcement and regulatory agencies should have to obtain a warrant before accessing stored data. In the end the government decided no warrant was necessary – because warrants could only slow down investigations.

This is exactly the sort of loose permission structure that leads to abuse. Just two weeks after data retention officially came into effect this April, the Australian Federal Police admitted one of its members had illegally accessed the metadata of a journalist.

This breach was entirely predictable. Data retention opponents repeatedly predicted it.

Last week’s Medicare breach has been made possible because thousands and thousands of people – bureaucrats, health professionals, and so on – can access the Medicare database. Most police officers, bureaucrats, and health professionals are trustworthy. But it only takes a few bad actors to wreck a system built on trust.

Rather than leaving data access up to the discretion of thousands of people, we need stricter codified rules on data access. Government databases need to be restructured to prevent government employees from going on fishing expeditions through our data, not simply to penalise them after the fact.

The point isn’t to provide a legal or technological fix to the problem of unauthorised access. Rather, we should completely reimagine who owns the information that the government keeps on all of us. We ought to own and control our information, not the state.

New cryptographic technologies increasingly being applied to blockchain and cryptocurrency applications allow for even greater personal control over information. If applied, they would allow government agents to know only what they need to know.

And it would move us from a system of surveillance and big data, to one of personal disclosure and privacy.

In the past, economic reform was targeted at big sectors like banking, telecommunications, and trade.

As Australian governments evolve inevitably into complex information brokers, the next wave of reform will have to focus on data management.

If You’re Worried About Privacy, You Should Worry About The 2016 Census

If you blinked, you missed it. On December 18 last year, the Australian Bureau of Statistics announced that at the 2016 census in August it would, for the first time, retain all the names and addresses it has collected “to enable a richer and dynamic statistical picture of Australia”.

Keeping names and addresses, we were quietly told, would enable government planners to do more rigorous studies of social trends.

It is only now that the significance of the ABS’s change is spilling out into the press.

For the past 45 years, it has been the ABS’s practice to destroy that identifying information as soon as all other information on the census forms is transcribed – first onto magnetic tape, and now into vast digital data banks that allow statisticians to slice and dice at their whim.

In the 2001 census, the government first offered Australians a choice as to whether they would like their name-identified information kept. This year that opt-in system becomes compulsory. Your name will be kept whether you like it or not.

The risks to privacy are blindingly obvious. The safest way to protect data is not to collect it at all. The second safest way is to destroy that data after collection. There is no such thing as 100 per cent secure information. We know this from bitter experience. The last decade has seen a constant stream of unauthorised releases of apparently secure private information: the 2015 Ashley Madison hack being just the most embarrassing of these.

After all, privacy risks don’t only come from hackers and other rogues. Government departments have a poor record of protecting information from their own staff. The Department of Human Services admitted there were 63 episodes of unauthorised access to private files by its staff between July 2012 and March 2013. The South Australian Police Force accuses up to 100 of its own members of unauthorised access to police files every single year. ABS staff are no more or less virtuous than any other public employee.

The ABS argues that identification information will be stored safely and separately from the rest of the census data, creating a firewall that protects against individual identification. A spokesperson told Radio National last week that the ABS “never has and never will release information that is personally identifiable”.

There are a lot of unanswered questions here. But no matter what firewalls the ABS places around access and matching, it is a truism that any data that can be used usefully can also be used illegitimately.

And of course, what are considered legitimate and illegitimate uses of data can change over time. Rules written in 2016 could be changed in 2026. The data collected now might be used in a very different way down the track.

Identification retention could have practical consequences as well. A population that is rightly worried about the security of their information is less likely to answer the census either accurately or at all. Indeed, this has historically been the ABS’s big concern with keeping identification. They told a parliamentary committee in 1998 that the reduction in data quality from a reluctance to answer questions truthfully was not worth the trade-off.

A lower quality census would lead to lower quality government statistics across the board. A lot of things hang off the census. Census data guides electoral redistributions, Commonwealth grants, education funding and so on. Risking the integrity of all that in the hope that future data might be marginally more interesting to genealogical researchers and government planners seems like a terrible deal.

Although the ABS professes to have changed its mind on the risk of lower quality data, we can speculate that these concerns might be why it announced the new policy in the dead holiday season. The less publicity given to the change, the less likely Australians are to hear enough about the new census rules to be worried about their privacy.

While the Coalition’s support for traditional rights and freedoms has taken a battering over the past few years, overriding the ABS decision would go some way to reclaiming its liberal heritage.

After all, it was a Liberal Treasurer, Billy Snedden, who first mandated the destruction of names and addresses in census forms in 1971 in response to privacy concerns. And Cabinet records show the Fraser government – at the behest of treasurer John Howard – unhesitatingly and immediately rejecting a 1979 proposal by the law reform commission to retain census names and addresses.

The digitisation of absolutely everything has made privacy one of the central problems of the 21st century. If anything, Australians are more aware of the dangers of identity theft and information insecurity than they have been at any time in history.

As the ABS change shows, the debate over warrantless mandatory data retention was just the tip of the iceberg.

It is true that modern governments are data hungry. Planners and regulators want more and more information about the populations they govern.

But to the extent we have an interest in protecting ourselves against government excesses, we have an interest in denying governments carte blanche to collect information. We are not just data points in a planner’s spreadsheet. They work for us.

Communications Minister Malcolm Turnbull’s Metadata Move Will Aid Regulators, Not Security

The Abbott government has rightly focused on red tape reduction and deregulation.

But Communications Minister Malcolm Turnbull could well preside over one of the largest increases in the regulatory burden since the telecommunications market was liberalised two decades ago.

At the very moment when Turnbull seems to have cleaned up the mess that was the national broadband network, his mandatory data retention policy puts the entire competitive dynamic of the Australian telecommunications sector at stake.

Terrorism is a very real problem. The existence of the Islamic State in Iraq and Syria has heightened the terror threat. If there are serious gaps in our anti-terror law framework, they should be filled. The government has spent the past six months doing so.

However, the data retention bill the government has put forward – which requires telecommunications providers to store masses of data on their customers for no purpose other than that a law enforcement agency or regulator might want to look at it in the future – is not a targeted anti-terror law.

If data retention is just for terrorism, the government could legislate to ensure it was used only for terrorism. But from what we know, both the Australian Competition and Consumer Commission and the Australian Securities and Investments Commission are likely to get access to the new data.

Indeed, over the half a decade that data retention has been debated, its most fervent advocates have been economic regulators, not counter-terror agencies.

One draft data set (even as Parliament is set to vote on the bill, we still don’t know what the final data set to be retained will be) included a requirement to store records of “download volumes” for two years. What anti-terror benefit would that add? Download volumes would be useful in copyright infringement cases.

The threat data retention poses to privacy has been widely discussed. But data retention is, first and foremost, a new economic regulation. So let’s treat it as sceptically as we would any increase in the regulatory burden on business.

Prime Minister Tony Abbott has said that the cost of data retention would be around $300 to $400 million, or just 1 per cent of the total revenue of the telecommunications industry.

This is a very significant amount of money. Telcos are already some of the most highly regulated firms in the country.

Turnbull has suggested government will contribute substantially to the cost of implementing data retention. But whether we pay for data retention through internet bills or just general taxation, we’ll still pay for it.

This new burden could dramatically reshape the telecommunications sector. All else being equal, large firms, with their well-established regulatory teams, are able to comply with new regulation much more easily than small firms, which lack the economies of scale to absorb the costs.

The unfortunate result of burdensome regulation is to push smaller firms out of the market, reducing competition as they disappear. Less competition will, in the long run, result in higher prices.

In the case of data retention, it isn’t just size that matters, however. Some telcos – think of Telstra – have more complex networks, technologies and legacy systems, for which these new requirements might be disproportionately expensive.

Turnbull and Attorney-General George Brandis claim that mandatory data retention will require telcos to store no more data than some firms do already – just store it for a bit longer.

It’s not clear which firms they’re referring to. The entire industry has been up in arms about data retention. The proposed policy is not just a minor extension of existing practice.

Nevertheless, there’s a reason some telcos store data more than others. The smallest internet service providers survive by keeping their data storage and infrastructure costs as low as possible, hoping to pull customers away from the big firms with lower prices or better service.

For the law enforcement and regulatory agencies that have spent the past six years lobbying for data retention, regulatory compliance costs are an abstract second-order issue.

But for internet users and taxpayers, who will be charged higher prices by a declining number of internet service providers, the economic effect of mandatory data retention is a big deal.