The institutional economics of quantum computing

With Jason Potts, first published on Medium

What happens when quantum computing is added to the digital economy?

The economics of quantum computing starts from a simple observation: in a world where search is cheaper, more search will be consumed. Quantum computing offers potentially dramatic increases in the ability to search through data. To search an unstructured list of 1,000,000 entries, a ‘classical’ computer would take up to 1,000,000 steps. For a mature quantum computer, the search would require only about 1,000 steps.
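The quadratic speedup behind those numbers can be sketched as a query-count comparison (an illustrative toy, not a quantum simulation; constant factors in Grover's algorithm are omitted):

```python
import math

def classical_queries(n: int) -> int:
    # Unstructured search on a classical machine: the worst case
    # examines every one of the n entries.
    return n

def grover_queries(n: int) -> int:
    # Grover's algorithm needs on the order of sqrt(n) oracle queries;
    # constant factors are omitted for illustration.
    return math.isqrt(n)

n = 1_000_000
print(classical_queries(n), grover_queries(n))  # 1000000 1000
```

The gap widens with the size of the search space, which is why the economic effects discussed below scale with digitisation.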

Bova et al. (2021) describe this capability generally as a potential advantage at solving combinatorics problems. The goal in combinatorics problems is often to search through all possible arrangements of a set of items to find a specific arrangement that meets certain criteria. While the cost of error correction or quantum architecture might erode the advantage quantum computers have in search, this is more likely to be an engineering hurdle to be overcome than a permanent constraint.

Economics focuses on exchange. To our knowledge, no analysis of the economic impact of quantum computing has focused on its effect on the practice and process of exchange. Where estimates of the economic benefits of quantum computing exist, they have focused on the possibility that the technology might increase production through scientific discovery or by making production processes more efficient (for example, by solving optimisation problems). So what impact will more search have on exchange?

In economics, search is a transaction cost (Stigler 1961, Roth 1982) that raises the cost of mutually beneficial exchange. Buyers have to search for potential sellers and vice versa. Unsurprisingly, much economic organisation is structured around reducing search costs. Indeed, it is the reduction of search costs that structures the digital platform economy. Multi-sided markets like eBay match buyers with sellers at global scale, allowing for trades to occur that would not be possible otherwise due to the high cost of search.

Quantum computing offers a massive reduction in this form of transaction cost. And, all else being equal, we can expect that a massive reduction in search costs would have a correspondingly large effect on the structure of economic activity. For example, search costs are one reason that firms (and sub-firm economic agents like individuals) prefer to own resources rather than access them over the market. When you have your own asset, it is quicker to utilise it than to seek a market counterpart who will rent it to you.

Lowering search costs favours outsourcing over ownership (‘buy’ in the market, rather than ‘make’ in-house). Lower search costs also have a globalising effect: they allow economic actors to do more search, that is, to explore a wider space for potential exchange. This increases the size of the market, which (as Adam Smith tells us) increases specialisation and the gains from trade. In this way, quantum computing powers economic growth.

Typically, specialisation and globalisation increase the winner-take-all effect: outsized gains to economic actors at the top of their professions. A countervailing mechanism, however, is that cheaper search also widens the opportunities to undercut superstar actors. This suggests an important implication of greater search for global inequality: it becomes easier to identify resources outside a local area. That should reduce rents and result in more producers (i.e. workers) receiving the marginal product of their labour as determined by global prices, rather than local prices. In this way, quantum computing drives economic efficiency.

Quantum and the digital stack

Of course, other transaction costs (the cost of making the exchange, the cost of contract enforcement, and so on) can reduce the opportunities for faster search to disrupt existing patterns of economic activity. Here we argue that quantum computing is particularly effective in an environment of digital (or digitised) trade and production: the domain of the information economy.

The process of digitisation is the process of creating more economic objects and, through the use of distributed ledgers and digital twins, forming ever more precise property rights regimes. In Berg et al. (2018), we explored one implication of this explosion in objects with precisely defined property rights. We argued that increasingly precise and secure digital property rights over objects would allow artificially intelligent agents to trade assets on behalf of their users, facilitating barter-like exchanges and allowing a greater variety of assets to be used as ‘money’. Key to achieving this goal is deep search across a vast matrix of assets, where the optimal path between two assets has to be calculated according to the pre-defined preferences not only of the agents making the exchange, but of each of the holders of the assets that form the path.
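The "path between two assets" idea can be sketched as a shortest-path search over an exchange graph. The asset names and willingness-to-trade edges below are invented for illustration:

```python
from collections import deque

# Hypothetical graph: an edge A -> B means some holder is willing to
# swap asset A for asset B under their pre-defined preferences.
willing_swaps = {
    "concert-ticket": ["stablecoin", "loyalty-points"],
    "loyalty-points": ["game-skin"],
    "stablecoin": ["game-skin", "domain-name"],
    "game-skin": ["domain-name"],
}

def barter_path(start, goal):
    # Breadth-first search for the shortest chain of swaps from start to goal.
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in willing_swaps.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no chain of willing counterparties exists

print(barter_path("concert-ticket", "domain-name"))
# ['concert-ticket', 'stablecoin', 'domain-name']
```

At the scale of millions of assets and preference-constrained edges, this search becomes exactly the kind of combinatorics problem where quantum computing could offer an advantage.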

This illuminates one of the ways in which quantum computing interacts with the web3 tech stack. While some quantum computing researchers have identified the opportunity for quantum to be used in AI training, we see the opportunity for quantum to be used by AI agents to search for exchange with other AI agents: an exchange-theoretic, rather than production-centric, understanding of quantum's contribution to the economy. The massive technological change we are experiencing is both cumulative and non-sequential; rapid developments in other parts of the tech stack further drive demand for quantum compute. This is the digital quantum flywheel effect.

Compute as a commodity

Compute is a commodity and follows the rules of commodity economics. Just as buyers of coal or electricity are ultimately buying the energy embodied in those goods, buyers of compute are ultimately buying a reduction in the time it takes to perform a computational task (Davies 2004). There are computational tasks where classical computers are superior (either cheaper or faster), tasks where quantum computers are (or could be) superior, and tasks where both can satisfy demand. Users of compute should be indifferent as to the origin of the compute they consume; they simply have computational tasks they wish to perform subject to budget and time constraints. And they should be indifferent to the mixture of classical and quantum computing that best suits those needs and constraints.
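That indifference can be expressed as a simple cost-minimisation sketch. The provider names, prices, and runtimes below are hypothetical:

```python
# Hypothetical compute offers for a single task; all figures are invented.
offers = [
    {"name": "classical-cloud", "seconds": 3600, "cost": 10.0},
    {"name": "quantum-service", "seconds": 60, "cost": 250.0},
]

def choose_compute(offers, deadline_seconds, budget):
    # A user indifferent to architecture picks the cheapest offer that
    # satisfies both the time constraint and the budget constraint.
    feasible = [o for o in offers
                if o["seconds"] <= deadline_seconds and o["cost"] <= budget]
    return min(feasible, key=lambda o: o["cost"]) if feasible else None

# With a loose deadline, cheap classical compute wins; with a tight one,
# only the faster quantum offer is feasible.
print(choose_compute(offers, deadline_seconds=7200, budget=300.0)["name"])  # classical-cloud
print(choose_compute(offers, deadline_seconds=120, budget=300.0)["name"])   # quantum-service
```

The point is that demand attaches to the task's constraints, not to the architecture that serves it.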

This indifference between classical and quantum has significant consequences for how quantum computing is distributed between firms in the economy and, indeed, between geopolitical powers. At this stage in the development of quantum computing, the major open question is how large the space of computational tasks best suited to classical computing is relative to the space best suited to quantum computing.

For computational tasks where classical computers dominate, compute is already massively decentralised: not just across multiple large cloud services (AWS, Google, etc.) but in the devices on our desks and in our pockets. There is no barrier to competition in classical compute, nor any risk of one geopolitical actor dominating. Where bottlenecks in classical compute do emerge is in the production networks for semiconductor chips, a known problem with a known menu of policy stances and responses. Similarly, no such risk arises around computational tasks where classical and quantum systems are equally suited.

The salient question is whether a natural monopoly will arise in quantum compute. This could arise as a result of bottlenecks (say, of scarce minerals, or caused by market structure, as in the semiconductor chip industry), or as an outcome of competition in quantum computing development. One argument might be that, because quantum compute power scales exponentially with the number of qubits, a geopolitical or economic actor that establishes a lead in qubit deployment could maintain that lead indefinitely due to compounding effects. This would be a quantum takeoff analogous to the hypothesised ‘AI takeoff’ (see Bostrom 2014).
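The "compounding lead" intuition rests on the fact that an n-qubit register spans a 2^n-dimensional state space, so each added qubit doubles the space a machine can hold in superposition. A minimal illustration:

```python
def state_space_dimension(qubits: int) -> int:
    # An n-qubit register spans a 2**n-dimensional state space:
    # each extra qubit doubles the dimension.
    return 2 ** qubits

for n in (10, 20, 50):
    print(n, state_space_dimension(n))
```

Whether that raw scaling translates into a durable competitive lead is, of course, exactly what the paragraphs below dispute.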

Several factors weigh against this. The diversity of architectures for quantum computing being built suggests that the future is likely to be highly competitive, not merely between individual quantum compute systems but between classes of architectures (e.g. superconducting, ion trap, photonic). While quantum compute research and development is very high cost, it is proceeding widely and with significant geographical dispersion. There are at least eight distinct major systems or architectures for quantum computing, seven of which have successfully performed basic computational tasks such as the control of qubits (see the survey by Bremner et al. 2024).

Nor is there any obvious concern that first-mover advantage implies competitive lock-in. Quantum compute is quite unlike AI safety scenarios, where ‘superintelligence’ or ‘foom’ is hypothesised to lead to a single monopolistic AI as a result of the superintelligence using its capabilities to 1) develop itself exponentially and 2) act to prevent competitors emerging. Quantum computing is, and will be for the foreseeable future, a highly specialised toolset for particular tasks, not a general program that could pursue world domination either autonomously or under the direction of a bad actor.

One significant caveat to this analysis is that the capabilities of quantum compute might have disruptive downstream consequences for the economy. The exponential speedup at factoring offered by quantum computers could undermine much of the cryptography that protects global commerce, which underlines the need to develop and deploy post-quantum cryptography. We have argued elsewhere that signals of the emergence of quantum supremacy in code breaking will appear in the market prices of cryptocurrencies (Rohde et al. 2021). There is a significant risk-mitigation task ahead in adopting post-quantum cryptography. It is a particularly difficult task because, while the danger is concrete, the timeline for a quantum breakthrough is highly uncertain. Nonetheless, migrating between cryptographic standards is akin to many other cybersecurity mitigations that have been performed in the digital economy, and while challenging it should not be seen as existential.

Instead, the institutional economic view of quantum computing emphasises the possibilities of this new technology to radically grow the space for market exchange — particularly when we understand the possibility of quantum computing as co-developing alongside distributed ledgers, smart contracts (that is, decentralised digital assets) and artificial intelligence. Quantum computing lowers the cost and increases the performance of economic exchange across an exponentially growing ecosystem of digital property rights. It will be an important source of future economic value from better economic institutions.

References

Berg, Chris, Sinclair Davidson, and Jason Potts. ‘Beyond Money: Cryptocurrencies, Machine-Mediated Transactions and High-Frequency Hyperbarter’, 2018.

Bostrom, Nick. Superintelligence: Paths, Dangers, Strategies. Oxford: Oxford University Press, 2014.

Bova, Francesco, Avi Goldfarb, and Roger G. Melko. ‘Commercial Applications of Quantum Computing’. EPJ Quantum Technology 8, no. 1 (2021): 1–13. https://doi.org/10.1140/epjqt/s40507-021-00091-1.

Bremner, Michael, Simon Devitt, and Eser Zerenturk. ‘Quantum Algorithms and Applications’. Office of the NSW Chief Scientist & Engineer, March 2024.

Davies, Antony. ‘Computational Intermediation and the Evolution of Computation as a Commodity’. Applied Economics, 2004. https://www.tandfonline.com/doi/abs/10.1080/0003684042000247334.

Rohde, Peter P., Vijay Mohan, Sinclair Davidson, Chris Berg, Darcy W. E. Allen, Gavin Brennen, and Jason Potts. ‘Quantum Crypto-Economics: Blockchain Prediction Markets for the Evolution of Quantum Technology’, 2021.

Roth, Alvin E. ‘The Economics of Matching: Stability and Incentives’. Mathematics of Operations Research 7, no. 4 (1982): 617–28.

Stigler, George J. ‘The Economics of Information’. Journal of Political Economy 69, no. 3 (1961): 213–25.

Building a grammar of blockchain governance

With Darcy Allen, Sinclair Davidson, Trent MacDonald and Jason Potts. Originally a Medium post.

Blockchains are institutional technologies made of rules (e.g. consensus mechanisms, issuance schedules). Different rule combinations are entrepreneurially created to achieve particular objectives (e.g. security, composability). But the design of blockchains, like all institutions, must occur under ongoing uncertainty. Perhaps a protocol bug is discovered, a dapp is hacked, a treasury is stolen, or transaction volumes surge because of digital collectible cats. What then? Blockchain communities evolve and adapt. They must change their rules (e.g. protocol security upgrades, rolling back the chain) and make other collective decisions (e.g. changing parameters such as interest rates, voting for validators, or allocating treasury funds).

Blockchain governance mechanisms exist to aid decentralised evolution. Governance mechanisms include online forums, informal polls, formal improvement processes, and on-chain voting mechanisms. Each of these individual mechanisms — let alone their interactions — is poorly understood. They are often described through sometimes-useful but imperfect analogies to other institutional systems with deeper histories (e.g. representative democracy). This is not a robust way to design the decentralised digital economy. It is necessary to develop a shared language, and understanding, of blockchain governance. That is, a grammar of rules that can describe the entire possible scope of blockchain governance rules, and their relationships, in an analytically consistent way.

A starting point for the development of this shared language and understanding is a methodology and rule classification system developed by 2009 economics Nobel Laureate Elinor Ostrom to study other complex, nested institutional systems. We propose an empirical project that seeks conceptual clarity in blockchain governance rules and how they interact. We call this project Ostrom-Complete Governance.

The common approach to blockchain governance design has been highly experimental — relying very much on trial and error. This is a feature, not a bug. Blockchains are not only ecosystems that require governance; the technology itself can open new ways to make group decisions. While blockchains need governance, the technology can also disrupt governance. Through lower costs of institutional entrepreneurship, blockchains enable rapid testing of new types of governance — such as quadratic voting, commitment voting and conviction voting — that were previously too costly to implement at scale. We aren't just trying to govern fast-paced decentralised technology ecosystems; we are using that same technology for its own governance.
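Of the mechanisms just listed, quadratic voting has a particularly simple core rule that we can sketch (a toy illustration of the mechanism itself, not any specific protocol's implementation): casting v votes on one issue costs v² voice credits.

```python
def quadratic_cost(votes: int) -> int:
    # Under quadratic voting, casting v votes on a single issue costs
    # v**2 voice credits, making intense preferences expensive to express.
    return votes ** 2

def max_votes(budget: int) -> int:
    # The most votes a participant can afford on one issue from a
    # given credit budget.
    return int(budget ** 0.5)

print(quadratic_cost(3), max_votes(100))  # 9 10
```

The quadratic cost curve is what makes the mechanism interesting economically: it lets voters signal intensity of preference while pricing out brute-force vote buying.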

This experimental design challenge has been compounded by an ethos of, and commitment to, decentralisation. That decentralisation implies a wide range of stakeholders with different decision rights and inputs into collective choices. The lifecycle of a blockchain exacerbates this problem: as it bootstraps, a blockchain ecosystem can see a rapidly shifting stakeholder group with different incentives and desires. Different blockchain governance mechanisms are variously effective at different stages of blockchain development. Blockchains, and their governance, begin relatively centralised (with small teams of developers), but projects commonly attempt to credibly commit to rule changes towards a system of decentralised governance.

Many of these governance experiments and efforts have been developed through analogy or reference to existing organisational forms. We have sought to explain and design this curious new technology by looking at institutional forms we know well, such as representative democracy or corporate governance. Scholars have looked to existing familiar literature such as corporate governance, information technology governance, information governance, and of course political constitutional governance. But blockchains are not easily categorised as nation states, commons, clubs, or firms. They are a new institutional species that has features of each of these well-known institutional forms.

An analogising approach might be effective to design the very first experiments in blockchain governance. But as the industry matures, a new and more effective and robust approach is necessary. We now have vast empirical data of blockchain governance. We have hundreds, if not thousands, of blockchain governance mechanisms, and some evidence of their outcomes and effects. These are the empirical foundations for a deeper understanding of blockchain governance — one that embraces the institutional diversity of blockchain ecosystems, and dissects its parts using a rigorous and consistent methodology.

Embracing blockchain institutional diversity

Our understanding of blockchain governance should not flatten or obscure its complexity. Blockchains are polycentric systems, with many overlapping and nested centres of decision making. Even equally-weighted one-token-one-vote blockchain systems are nested within other processes, such as a GitHub proposal process and the subsequent execution of upgrades. It is a mistake to flatten these nested layers, or to assume some layers are static.

Economics Nobel Laureate Elinor Ostrom and her colleagues studied thousands of complex polycentric systems of community governance. Their focus was on understanding how groups come together to collectively manage shared resources (e.g. fisheries and irrigation systems) through systems of rules. This research program has since studied a wide range of commons, including culture, knowledge and innovation. The research has been somewhat popular with blockchain entrepreneurs, in particular through using the succinct design principles of robust commons (e.g. ‘clearly defined boundaries’ and ‘graduated sanctions’) to inform blockchain design. Commons design principles can help us to analyse blockchain governance — including whether blockchains are “Ostrom-Compliant” — or at least provide some points of reference to begin our search for better designs.

But beginning with the commons design principles has some limitations. It means we are once again beginning blockchain governance design by analogy (that blockchains are commons), rather than understanding blockchains as a novel institutional form. In some key respects blockchains resemble commons — perhaps we can understand, for instance, the security of the network as a common pool resource — but they also have features of states, firms, and clubs. We should therefore not expect that the design principles developed for common pool resources and common property regimes are directly transferable to blockchain governance.

Starting from Ostrom's design principles also means starting from the output of that research program, rather than applying the underlying methodology that led to that output. The principles were discovered through a meta-analysis of studies of thousands of different institutional rule systems. A deep, blockchain-specific understanding must likewise emerge from empirical analysis of existing systems.

We propose that while Ostrom’s design principles may not be applicable, a less-appreciated underlying methodology developed in her research is. In her empirical journey, Ostrom and colleagues at the Bloomington School developed a detailed methodological approach and rule classification system. While that system was developed to dissect the institutional complexity of the commons, it can also be used to study and achieve conceptual clarity in blockchain governance.

The Institutional Analysis and Development (IAD) framework, and the corresponding rule classification system, are an effective method for deep observation and classification of blockchain governance. Using this approach, we can understand blockchains as a series of nested and related ‘action arenas’ (e.g. a consensus process, a protocol upgrade, a DAO vote) where different actors engage, coordinate and compete under sets of rules. Each of these action arenas has different participants (e.g. token holders), different positions (e.g. delegated node), and different incentives (e.g. the risk of being slashed), which are constrained and enabled by rules.

Once we have identified the action arenas of a blockchain, we can start to dissect the rules of each arena. Ostrom's 2005 book, Understanding Institutional Diversity, provides a detailed classification of rules that we can use for blockchain governance, including:

  • position rules on what different positions participants can hold in a given governance choice (e.g. governance token holder, core developer, founder, investor)
  • boundary rules on how participants can or cannot take part in governance (e.g. staked tokens required to vote, transaction fees, delegated rights)
  • choice rules on the different options available to different positions (e.g. proposing an upgrade, voting yes or no, delegating or selling votes)
  • aggregation rules on how inputs to governance are aggregated into a collective choice (e.g. one-token-one-vote, quadratic voting, weighting for different classes of nodes).
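One minimal way to make this classification concrete is as a data structure. The arena name and rule labels below are hypothetical illustrations, not any real protocol's rules:

```python
from dataclasses import dataclass, field

@dataclass
class ActionArena:
    """A governance 'action arena' classified with Ostrom-style rule types.

    All field values used with this class are illustrative labels,
    not real protocol rules.
    """
    name: str
    position_rules: list = field(default_factory=list)     # roles participants can hold
    boundary_rules: list = field(default_factory=list)     # who may take part, and how
    choice_rules: list = field(default_factory=list)       # options open to each position
    aggregation_rules: list = field(default_factory=list)  # how inputs become a decision

dao_vote = ActionArena(
    name="treasury-allocation-vote",
    position_rules=["governance token holder", "core developer"],
    boundary_rules=["must stake tokens to vote"],
    choice_rules=["vote yes", "vote no", "delegate vote"],
    aggregation_rules=["one-token-one-vote", "quorum of 10%"],
)
print(dao_vote.name, len(dao_vote.choice_rules))  # treasury-allocation-vote 3
```

Classifying many arenas into a shared structure like this is what would make the empirical comparison proposed here machine-tractable.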

These rules matter because they change the way that participants interact (e.g. how or whether they vote) and therefore change the patterns that emerge from repeated governance processes (e.g. low voter turnout, voting deadlocks, wild token fluctuations). There have been some studies that have applied the broad IAD framework and commons research insights to blockchain governance, but there has been no deep empirical analysis of the rule systems of blockchains using the underlying classification system.

The opportunity

Today the key constraint in advancing blockchain governance is the lack of a standard language of rules with which to describe and map governance. In blockchain whitepapers, these rules are described in a vast array of formats, with different underlying meanings. That hinders our capacity to compare and analyse blockchain governance systems, but it can be remedied by applying and adopting the same foundational grammar. Developing a blockchain governance grammar is fundamentally an empirical exercise of observing and classifying blockchain ecosystems as they are, rather than imposing external design rules onto them. This approach doesn't rely on analogy to other institutions, and it is robust to new blockchain ecosystem-specific language and new experimental governance structures.

Rather than broadly describing classes of blockchain governance (e.g. proof-of-work versus proof-of-stake versus delegated proof-of-stake), our approach begins with a common set of rules. All consensus processes have sets of boundary rules (who can propose a block? how is the block-proposer selected?), choice rules (what decisions do block-proposers make, such as the ordering of transactions?), incentives (what is the cost of proposing a bad block? what is the reward for proposing a block?), and so on. For voting structures, we can also examine boundary rules (who can vote?), position rules (how can a voter get a governance token?), choice rules (can voters delegate? to whom?) and aggregation rules (are vote weights symmetrical? is there a quorum?).
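To make the shared-grammar point concrete, here is a minimal sketch that stores two consensus designs under the same rule types and compares them rule by rule. The entries are simplified illustrations, not protocol specifications:

```python
# Two consensus designs expressed in the same rule grammar.
consensus_rules = {
    "proof-of-work": {
        "boundary": "anyone with hash power may propose a block",
        "choice": "order transactions; search for a valid nonce",
        "incentive": "block reward; energy wasted on invalid blocks",
    },
    "proof-of-stake": {
        "boundary": "only validators who have staked tokens may propose",
        "choice": "order transactions; attest to others' blocks",
        "incentive": "block reward; stake slashed for misbehaviour",
    },
}

def compare(rule_type: str) -> dict:
    # Pull the same rule type out of every system. The benefit of a
    # shared grammar is that this comparison is well-defined.
    return {name: rules[rule_type] for name, rules in consensus_rules.items()}

for rule_type in ("boundary", "incentive"):
    print(rule_type, compare(rule_type))
```

Once hundreds of systems are encoded this way, the comparison becomes a query rather than a bespoke case study.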

We can begin to map and compare different blockchain governance systems utilising this common language. All blockchain governance has this underlying language, even if today that grammar isn’t explicitly discussed. The output of this exercise is not simply a series of detailed case studies of blockchain governance, it is detailed case studies in a consistent grammar. That grammar — an Ostrom-Complete Grammar — enables us to define and describe any possible blockchain governance structure. This can ultimately be leveraged to build new complete governance toolkits, as the basis for simulations, and to design and describe blockchain governance innovations.

What we think we know about defi

This essay follows an RMIT Blockchain Innovation Hub workshop on defi. Contributions by Darcy WE Allen, Chris Berg, Sinclair Davidson, Oleksii Konashevych, Aaron M Lane, Vijay Mohan, Elizabeth Morton, Kelsie Nabben, Marta Poblet, Jason Potts, and Ellie Rennie. Originally a Medium post.

The financial sector exists solely to smooth economic activity and trade. It is the network of organisations, markets, rules, and services that move capital around the global economy so it can be deployed to the most profitable use.

It has evolved as modern capitalism has evolved, spreading with the development of property rights and open markets. It has grown as firms and trade networks became globalised, and supercharged as the global economy became digitised.

Decentralised finance (defi) is trying to do all that. But just since 2019, and entirely on the internet.

Any business faces the question, “How do I get customers to pay for my product?” Similarly, consumers ask, “Where and how can I pay for the goods and services I want to buy?” For the decentralised digital economy, defi answers these questions. Defi provides the “inside” money necessary to facilitate transactions.

But what in traditional, centralised finance looks like banks, stock exchanges, insurance companies, regulations, payments systems, money printers, identity services, contracts, compliance, and dispute resolution systems — in defi it’s all compressed into code.

From a business perspective trade needs to occur in a trusted and safe environment. For the decentralised digital economy, that environment is blockchains and the dapps built on top.

And as we can see, defi doesn’t just finance individual trades or firms — it finances the trading environment, in the same way that taxes finance regulators and inflation finances central banks. If blockchain is economic infrastructure, defi is the funding system that develops, maintains and secures it.

These are heavy, important words for something that looks like a game. The cryptocurrency and blockchain space has always looked a little game-y, not least with its memes and “in-jokes”. The rise of defi has also had its own cartoonified vibe and it has been somewhat surreal to see millions of dollars of value pass through tokens called ‘YAMs’ and ‘SUSHI’.

Games are serious things though. A culture of gaming provides a point around which all participants can coordinate activity and experimentation: what we're seeing in defi is the creation of a massive multiplayer online innovation system. The “rules” of this game are minimal; there are no umpires and very little recourse. The goal is the creation and maintenance of decentralised financial products, and willing players can choose if, and to what extent, they participate.

Because there is real value at stake, the cost of a loss is high. Much defi is tested in production and the losses from scams, unethical behaviour, or poor and inadequately audited coding are frequent.

On the other side, participation in the game of defi is remarkably open. There are few barriers to entry except a small amount of capital that players are willing to place at risk. Once fiat has been converted into cryptocurrency, the limit on participation in decentralised finance isn’t regulatory or institutional — it is around knowledge. (Knowledge is a non-trivial barrier, excluding people who could be described as naive investors. This is important for regulatory purposes.)

This is starkly different from the centralised financial system, where non-professional participants typically have to go through layers of gatekeepers to experiment with financial products.

The basic economics of defi

The purpose of defi is to ensure the supply of an ‘inside money’ — that is, stablecoins — within decentralised digital platforms, and to provide tools to manage financial risks.

In the first instance defi is about consumer finance. It answers basic usability questions in the blockchain space: How do users of the platform pay native fees? Which digital money is deployed as a medium of exchange or unit of account on the platform?

In the second instance, defi concerns itself with the operation of consensus mechanisms — particularly proof-of-stake mechanisms and their variants. The problem here is how to capture financial trust in a staking coin and then how to use that trust to generate “trust” on a blockchain. Blockchains need mechanisms to value and reward these tokens. Given the potentially volatile nature of these tokens, risk management instruments must exist in order to efficiently allocate the underlying risk of the trading platform.

As we see it, the million-yam question is whether the use of these risk management tools undermines trust in the platform itself. It is here that governance is important.

Which governance functions should attach to staking tokens, and when should those functions be deployed? Should they be automated, or should voting mechanisms be used? If so, which voting mechanisms, and what level of consensus is appropriate for decision making?

Finally, defi addresses the existence of stablecoins and staking tokens from an investor perspective. Again, there are significant questions here that the defi space has barely touched. How do these instruments and assets fit into existing investment strategies? How will the tax function respond? How much of existing portfolio theory and asset pricing applies to these instruments and assets?

Of course, we already have a complex and highly evolved centralised financial system that can provide many of the services being built from the ground up in defi. So why bother with defi?

The most obvious reason is that the blockchain space has a philosophical interest in decentralisation as a value in and of itself. But decentralisation addresses real world problems.

First, centralised systems can have human-centric cybersecurity vulnerabilities. The Canadian exchange QuadrigaCX lost everything when the only person with access to the exchange's cryptographic keys died (lawyers representing account holders have requested that the body be exhumed to prove his death). Decentralised algorithmic systems have their own vulnerabilities (need we mention yams again?), but they are of a different character and, unlike human nature, they can be improved.

Second, centralised systems are exposed to regulation — for better or worse. For example, one of the arguments for Uniswap is that it is more decentralised than EtherDelta. EtherDelta was vulnerable both to hackers (its order book website was hacked) and to regulators (its designer was sued by the SEC).

Third, digital business models need digital instruments that can both complement and substitute for existing products. Chain validation instruments and the associated risk management tools presently have no real-world equivalents.

Fourth and finally, the ability to digitise, fractionalise, and monetise currently illiquid real-world assets will require a suite of instruments and digital institutions. Defi is the beginning of that process.

In this sense, the defi movement is building a set of financial products and services that look superficially familiar to the traditional financial system using a vastly different institutional framework — that is, with decentralisation as a priority and without the layers of regulation and legislation that shape centralised traditional finance.

Imagine trying to replicate the functional lifeforms of a carbon-based biochemical system in a silicon-based biochemical system. No matter how hard you tried, they'd look very different.

Defi has to build in some institutions that mimic or replicate the economic function provided by central banks, government-provided identity tech, and contract enforcement through police, lawyers and judges. It is the financial sector plus the institutions that the traditional finance sector relies on. So, initially, it’s going to look more expensive, relative to “finance”. But the social cost of the traditional finance sector is much larger — a full institutional accounting for finance would have to include those courts and regulations and policymakers and central banks that it relies on.

Thus defi and centralised finance look very different in practice. Consider exchanges. Traditional financial markets can either operate as organised exchanges (such as the New York Stock Exchange) or as over-the-counter (OTC peer-to-peer) markets. The characteristics of those types of market are set out below.

[Table: characteristics of organised exchanges versus over-the-counter markets]

Defi exchanges represent an attempt to combine the characteristics of both organised exchanges and over-the-counter markets. In the first instance, of course, they are decentralised markets governed by private rules and not (necessarily) public regulation. They aim to be peer-to-peer markets (including peer-to-algorithm markets in the case of AMMs).

But at the same time they aim to be anonymous (in this context meaning that privacy is maintained), transparent, highly liquid, and with less counterparty risk than a traditional OTC market.

Where is defi going?

Traditional finance has been developing for thousands of years. Along with secure private property rights and the rule of law, it is one of the basic technologies of capitalism. But of those three, traditional finance has the worst reputation. It has come to be associated with city bros, the “Wolf of Wall Street”, and the Global Financial Crisis. Luigi Zingales has influentially argued that the traditional finance system has outgrown the value it adds to society, in part because of the opportunities for political rent seeking.

This makes defi particularly interesting. Defi is for machines. Not people. It represents the automation of financial services.

A century ago agriculture dominated the labour force. The heavy labour needs of farming are one of the reasons we were poor back then. As we added machines to agriculture — as we let machines do the farming — we reduced the need to use valuable human resources. Defi offers the same thing for finance. Automation reduces labour inputs.

Automation of course has been increasingly common in financial systems since at least the 1990s. But it could only go so far. A large part of the reason that finance (and many sectors, including government and management) resisted technological change and capital investment was that, at the bottom, there had to be a human layer of trust. Now that we can automate trust through blockchains, we can move automation more deeply into the financial system.

Of course, this is in the future. Right now defi is building airplanes in 1902 and tractors in 1920. They’re hilariously bad and horses are still better. But that’s how innovation works. We’re observing the creation of the base tools for entrepreneurs to create value. Value-adding automated financial products and services come next.

What we’ve learned from working with Agoric

With Sinclair Davidson and Jason Potts. Originally a Medium post.

Since 2017 we (along with our colleague Joe Clark) have been working with Agoric, an innovative and exciting smart contract team, who are about to launch a token economy model we helped design.

At the RMIT Blockchain Innovation Hub we’ve long been thinking about how blockchain can drive markets deeper into firms, resolving the electronic markets hypothesis and giving us new opportunities for outsourcing corporate vertical integration.

What we’ve discovered from working with the Agoric team is the possibilities of driving markets down into machines. Mark Miller’s groundbreaking work with Eric Drexler explored how property rights and market exchange can be used within computational systems. Agoric starts economics where we start economics — with the institutional framework that secures property rights.

This has been one of the most intellectually stimulating collaborations of each of our careers, and has shaped much of how we think about the economics of frontier technologies.

We first met the Agoric team through Bill Tulloh at the Crypto Economics Security Conference at Blockchain @ Berkeley in 2017, just as we were forming the RMIT Blockchain Innovation Hub. CESC was the first serious attempt we were aware of to bring the blockchain industry and social science together — such as our disciplines of economics and political economy.

In our presentation to CESC, we applied some of Oliver Williamson’s thinking to understand the economic properties of tokens and cryptocurrencies.

Bill — who had thought along similar lines — came over to chat during a break. We met again at the 2018 Consensus Conference in New York. Bill introduced us to Mark Miller. What started out as a quick chat to say hello over breakfast turned into a long discussion about Friedrich Hayek, Don Lavoie, and market processes in computer science. Through Bill and Mark we then met Kate Sills and Dean Tribble.

It is true that economic thinking is everywhere in the blockchain and cryptocurrency community. There’s a lot of lay reasoning about Austrian economics, monetary policy, central banks, and inflation. These ideas have brought a lot of people into the cryptocurrency space. Some of the thinking that brought them here is good economics (we’re very passionate ourselves about how Austrian economics can inform the blockchain industry, as is our colleague Darcy Allen) but unfortunately a lot of it is not-so-good economics. Many developers are self-taught in economics, many have intuited it from first principles, and we have observed a combination of brilliant insight, economic fallacy, and knowledge gaps.

Developers, however, tend to be very good at game theory, if only because, unlike our colleagues in academia, the blockchain community is testing the assumptions of game theory and applying it in the real world in business models with real value at stake. Reality can be bracing. Only invest what you can afford to completely lose. This is still a highly experimental industry.

But economics has much, much more to contribute to our understanding of the blockchain economy than just Hayekian monetary theory and textbook game theory. Our friends at Agoric know this — they already had an economist in their team. They know and understand that it isn’t enough to have good code — to succeed, you need to have economically coherent code.

To that end, we have developed a new field of economics: institutional cryptoeconomics. In this field, we apply the transaction cost economics of Ronald Coase and Oliver Williamson to explore blockchain as an economic institution competing with and complementing the schema of firms, markets, states, clubs and the commons.

The economic foundation of our institutional cryptoeconomics is broad and solid. In addition to economics Nobel laureates like Hayek, Ronald Coase, and Oliver Williamson, we have incorporated the work of other laureates such as Herbert Simon, Douglass North, Elinor Ostrom, and Jean Tirole into our blockchain research. We’ve also drawn on should-have-been laureates such as Joseph Schumpeter, William Baumol, Armen Alchian, and Harold Demsetz. Economists such as Andrei Shleifer and Israel Kirzner could still win a Nobel.

Merton Miller — himself an economics laureate — once argued that there was nothing more practical than good theory. Our experience working with Agoric has convinced us of the value of very good theory. We have had plenty of help — actual practitioners trying to solve immediate real-world problems are hard task masters. Ideas cannot remain half-baked — they must be fully explained and articulated. Working with Agoric has been an intellectually intense, extended interactive academic seminar where ideas are taken from vague hunch to ‘how can this be implemented’ and back again. From whiteboard to business model.

As academics we have learned which ideas, models and tools are of immediate use and value in the blockchain world. There have been some surprises here. Whoever would have thought that Edgeworth boxes would have a practical real-world application? Or indifference curves? But here we are. When building an entire economic ecosystem — the Agoric economy — we have had to draw upon the full breadth of our economic training. We suspect that having an economics team on board will become an industry standard in the years to come.

We have benefited as educators too. Of course, explaining complex ideas to highly intelligent laypeople is a large part of our day job. The stakes, however, are much higher. The Agoric team aren’t seeking information to pass a class test. They are seeking information to pass a market test — that the market will grade. As another favourite economist of ours, Ludwig von Mises, explained, consumers are hard task masters.

Our own students particularly have benefited from our Agoric experience. We now have a deeper understanding of industry needs and thought in the blockchain space. We know which ideas interest them and which don’t. The Agoric team questioned us closely on some topics. Our students will know how to answer those questions.

It also turns out that financial engineering is far more important than we thought it would be when we first started working on blockchain economics. The work with Agoric has coincided with the defi boom — a richly anarchic and innovative movement within the blockchain space. As a consequence, the blockchain for business degree programs that we have launched at RMIT have huge dollops of finance in them.

We share with Agoric a vision of the future where technology leads to an improvement in human flourishing and an enhancement of our capacity to lead full lives.

In a new book published by the American Institute for Economic Research we’ve argued that blockchain and other frontier technologies offer us the tools to actively take back liberties we may have lost.

With Agoric, it is incredibly exciting to be able to actually build the economy of the future that we’ve been studying.

How to understand the credentialing industry

With Jason Potts. Originally a Medium post.

Let’s take a bird’s eye view of the Australian economy. What do we produce? In order: iron ore, coal, and credentials.

Tertiary education is Australia’s third-largest export industry. And Australia is the third-largest education exporter in the world, behind the US and UK.

The world’s skilled labour markets are dependent upon proof of identity, experience and skills, including education qualifications, trade certification and occupational licensing. The smooth operation of these markets relies on the technical infrastructure that supports those credentials: a continually updated, reliable, trusted and efficient public registry of qualifications and skills.

We call this intersection of the education sector and access points into global labour markets the credentialing industry.

We’re university academics, but the credentialing industry encompasses much more than universities. In fact, it’s about more than just education. A surprisingly large fraction of the economy supplies and delivers credential services.

  • High school education and equivalencies (completion certification)
  • University credentials (course credits, graduate certificates, diplomas, bachelors degrees, higher degrees)
  • Vocational education credentials and trade certifications (hairdressing, electrician, builder, etc.)
  • Industry and professional association based qualifications (e.g. accounting [CPA, ACA], law [the Bar], finance [FINSIA], etc.)
  • Proficiency qualifications (languages, driving, etc.)
  • Occupational licenses (surgeons, pilots, dentists, teaching, etc.)

Credentials prove skills and qualities, and trusted claims of skills and capabilities are an input into contracts and jobs. They are an institutional token that carries trusted information that facilitates transactions in almost every labour market, many service markets, and all professional markets in an economy.

As the economy becomes more complex, the workforce will need to be more highly skilled and globally oriented. This means that credentials will be a more important output (from the credentialing industry) and input (into labour markets).

Credentials are a key institution in a modern economy. The more complex and developed the economy, the more it depends on efficient and effective credential infrastructure and production.

These certifications benefit consumers, facilitating trust in professional and trade services, and employers, facilitating trusted information about skills and capabilities. The more complex the economy, the more important and valuable are credentials.

But what exactly is a credential? Credentials are a type of institutional technology that is produced by the education sector, by professional and trade associations, and by government, often jointly.

So from a public policy perspective, it can be hard to tell where the rules that govern the credential come from — are they from government regulation, or from the private imposition of standards by a professional association? Richard Wagner calls this sort of intermingling of public and private rules entangled political economy.

From a technology perspective, a credential is a bundle of:

  • Identity (who does it attach to)
  • Registry (what is the content of claims made)
  • Assessment and evaluation (how have they been verified, and by whom)
  • Storage, maintenance and recall (an effective transactional database)
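As a sketch, the four-part bundle above can be represented as a minimal record type. The field and class names here are our own illustration, not any credentialing standard:

```python
from dataclasses import dataclass, field

@dataclass
class Credential:
    """Illustrative sketch of a credential as a bundle of claims.
    Names are hypothetical; no particular standard is implied."""
    holder_id: str          # identity: who the credential attaches to
    claims: dict            # registry: the content of the claims made
    assessed_by: str        # assessment: who verified the claims
    assessment_method: str  # assessment: how the claims were verified
    history: list = field(default_factory=list)  # storage/maintenance: audit trail

    def amend(self, note: str) -> None:
        """Record a maintenance event (issuance, renewal, revalidation)."""
        self.history.append(note)

cred = Credential(
    holder_id="student-042",
    claims={"qualification": "Bachelor of Economics"},
    assessed_by="RMIT University",
    assessment_method="examination",
)
cred.amend("issued 2020")
```

Each field maps to one element of the bundle; the append-only `history` list hints at why the authors describe a credential as "essentially an entry in a ledger".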

A credential has institutional and technical properties:

  • Trust and transparency
  • Security and auditability
  • Transactional value in use

A credential is essentially an entry in a ledger. But centralized credential technology has limitations in all of the above dimensions. Blockchain technology presents an opportunity to revolutionise the credential sector by offering a more effective, scalable and secure platform for the production and use of credentials.

For Australia, innovation in blockchain credentials will benefit a major export industry, increasing administrative efficiency and facilitating the adoption of digital technology in tertiary education, as well as improving the functioning of labour markets in Australia and around the world, increasing the quality of job matching and lowering the cost of employment. We expect blockchain adoption in the credentialing industry to drive economic growth, exports, and jobs.

Multi-sided market collapse in the newspaper industry

Originally a Medium post. Published in the Spectator Australia on 9 July 2020 as ‘The death – and rebirth – of the newspaper?’

Everybody, whatever side of politics they are on, generally agrees that the media is one of the reasons that politics is so polarised right now.

Agreeing on why the media has driven this is a little harder. Yes, the newspaper and print industry has been disrupted, thanks to the internet. And yes, it seems like newspapers are more desperate for readers.

But underlying these surface level observations is the fact that newspapers are undergoing a fundamental structural shift between two organisational types — from platforms to factories.

Let’s call what’s happened to the newspaper industry multi-sided market collapse. Understanding the industry this way clarifies how today’s media environment is so different from that of the twentieth century — and offers a warning for other platform industries that face disruption in the future.

(I’m going to focus here on the newspaper industry, because the dynamics are most obvious there. But we can use this framework to understand how media economics affects media content in everything from talk radio to cable television.)

The basics

The twentieth century newspaper was a particular type of economic organisation: a platform that serviced a multi-sided market.

The idea of a multi-sided market platform was first developed in detail by Jean-Charles Rochet and Jean Tirole in 2003. It’s intuitive: we want to make trades with each other, and a platform helps match us together.

Platform economics is interesting because market participants want to use the platform that everybody else is using. We want to buy the video game console that has the most games — and developers want to design for the console that has the most users. We want to use the ridesharing app that has the most drivers — and drivers want to drive for the app with the most riders.

This desire to go where the crowd already is leads to some curious pricing structures. Platforms typically feature complex cross-subsidies. One side of the multi-sided market might be given access to the platform for free, or given heavy discounts, while the other faces high charges.
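One way to see why a platform might rationally price one side at zero is a toy model. All numbers below are invented for illustration: reader demand is linear, and 100 hypothetical advertisers each value the platform in proportion to its readership.

```python
# Toy two-sided market: a "newspaper" sets a reader price and an ad price.
# Readers generate eyeballs; advertisers' willingness to pay scales with
# readership. All parameters are illustrative assumptions, not estimates.

def readership(reader_price):
    """Linear reader demand: 100 readers at price 0, none at price 10."""
    return max(0.0, 100.0 - 10.0 * reader_price)

def ad_revenue(ad_price, readers):
    """100 potential advertisers with per-reader values v = 0.01 .. 1.00;
    an advertiser buys iff its total value v * readers covers the ad price."""
    buyers = sum(1 for k in range(1, 101) if (k / 100.0) * readers >= ad_price)
    return ad_price * buyers

def best_prices():
    """Grid-search the platform's joint pricing problem."""
    best = (0.0, 0.0, float("-inf"))  # (reader_price, ad_price, profit)
    for pr10 in range(0, 101):        # reader price 0.0 .. 10.0 in steps of 0.1
        pr = pr10 / 10.0
        r = readership(pr)
        for pa in range(0, 101):      # ad price 0 .. 100 in steps of 1
            profit = pr * r + ad_revenue(float(pa), r)
            if profit > best[2]:
                best = (pr, float(pa), profit)
    return best

reader_price, ad_price, profit = best_prices()
```

In this toy economy the profit-maximising reader price is zero: every reader lost to a subscription fee costs the platform more in foregone advertising revenue than the fee would recover. That is the cross-subsidy in miniature.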

For the traditional newspaper industry, the market participants are advertisers and readers. Readers want content, and advertisers want eyeballs. Revenue from advertising paid for the production of news content, which attracted readers, which attracted more advertisers, and so on.

[Figure: The newspaper as platform]

The cross-subsidies were straightforward. Advertisers were charged relatively large fees for access (very large in the case of full-page advertising, and relatively large in the case of classifieds). Readers were charged small fees (through either subscription or individual sales), or even no fees (such as the free newspaper model or free distribution locations like stadiums and railway stations).

The need to get as many readers as possible onto the platform didn’t just shape pricing — it shaped decisions about what content would be published.

Newspapers sought to cater for as wide an audience as possible. On the op-ed page newspapers would strive for a rough balance. They’d match one opinion piece from the ideological right with one opinion piece from the ideological left. Let’s call this liberal balancing theory — all voices get heard.

In the news pages they’d adopt a perspective that wouldn’t excessively upset any particular side of politics. Let’s call this median reader theory. The combination of these two approaches has given us the twentieth century model of journalistic objectivity, view-from-nowhere journalism, the idea of newspaper-as-public-square etc.

The collapse

The arrival of the internet disrupted the underlying newspaper business model.

Newspapers first sought to continue the existing model in an online world by offering their content for free supported by banner ads or cross subsidised by print sales.

However, much advertising — particularly but not only classified advertising — migrated to dedicated digital platforms. To be more specific, the advertising migrated to digital platforms that didn’t use journalism as a way to attract eyeballs.

Within the space of a decade, the cross-subsidies that sustained the newspaper business model evaporated. But the demand for journalism has not. Newspapers have responded to this reduction in revenue from advertising by increasing the cost to readers. Newspaper websites now charge for access. Newspaper subscription prices went up.

Journalism is now predominantly paid for by fees from the readers who demand that journalism, rather than indirectly through advertising. This shift represents a change from a platform servicing a multi-sided market to something that looks more like a production process servicing a single-sided market. Less an advertising platform, and more a journalism factory.

In other words, what we’ve seen in the newspaper industry is multi-sided market collapse. (I would prefer to call it deplatforming — but that word has already been taken.)

The adjustment

Now let’s think through what this means for newspaper content and journalism.

Higher subscription fees imply a smaller readership. This is less of a problem than it appears — newspapers no longer have the same need to deliver huge readership numbers to advertisers. Instead, newspapers need to convince readers to pay more for a product they used to get cheaply or even for free.

The strategy newspapers have pounced upon is specialisation. Newspapers now seek readers who are more emotionally invested in that particular newspaper brand. They’re the ones more likely to pay the higher subscription fees.

Ideology is a specialisation. Partisanship is a specialisation.

In other words, multi-sided market collapse explains the dominance of ideologically driven media outlets in the digital age.

It helps explain controversies like that which greeted the Tom Cotton opinion piece published in the New York Times in June 2020. Why should ideologically-motivated readers pay higher prices for content intended to appeal to their ideological opponents?

And if newspapers are no longer trying to appeal to the median reader, why should they continue producing bland ‘view-from-nowhere’ content? The news pages have become more passionate, more opinionated, more self-aware. Newspapers now focus on what their most dedicated readers actually want — not just what the median reader in the population will accept.

The future

Converting a business from a platform to a factory is hard. If, presented with this argument before the internet existed, you tried to make predictions about what would happen to the newspaper industry should its platform model collapse, you’d likely predict:

1. Lots of newspapers would fail to make the transition, resulting in massive business failure.

2. Lots of new media organisations would be established, structured around the new factory model.

Which is of course exactly what we have seen.

There are lots of implications of the idea of multi-sided market collapse. For one, it demonstrates clearly that a lot of our current debate about platform ‘monopolies’ like Facebook and Google is deeply confused about platform economics.

The multi-sided market collapse model shows that there has been no ‘expropriation’ of advertising from newspapers to digital platforms. Rather, as platform businesses, newspapers have been outcompeted. “Readers” (in this case, social media users and webpage searchers) and advertisers want the platforms they use to be as big as possible. Advertisers were attracted to newspapers because they were big platforms. Now advertising has migrated to different (digital) platforms. Nothing nefarious has occurred.

What does this mean for future technological disruption? If the analysis here is correct, it’s not obvious that new platform technologies like blockchain pose a threat to the new business model of journalism. They’re just not platforms anymore.

If we’re looking for blockchain use cases in journalism we should be thinking of them more along the lines of the factory/production process/supply chain model (focusing on provenance, track and trace) rather than the matching service performed by platforms.

Platforms are one of the dominant organisational structures of the digital economy. They rely on their ability to cross-subsidise one side of a market with another. And society invested heavily in newspapers as platforms — not just investments in terms of capital, but in cultural and political significance.

But when you work for a platform company it is easy to be confused about what your company’s competitive advantage actually is. In truth that advantage was not journalism, but matching. Newspapers were outcompeted by competitors that were better at matching.

The partisanship and fervour we’re seeing in media content right now is just the most visible symptom of an entire industry trying to restructure itself in real-time.

The digital consequences of the pandemic

With Darcy WE Allen, Sinclair Davidson, Aaron M Lane, and Jason Potts. Originally a Medium post.

The global policy response to the COVID-19 pandemic has been extraordinary. We’ve seen a massive increase in government spending and social welfare programs, heavy handed policing, and some less remarked on crisis deregulation.

But the long run effect of the pandemic will be even more substantial. COVID-19 is driving far deeper and more profound changes in the economy.

Some of these changes we can start to see already, but their full implications are still murky and distant. Nonetheless, as we argue in our book Unfreeze: How to Create a High Growth Economy After the Pandemic, the economy will not simply snap back into place. The post-COVID-19 economy will not look like the pre-COVID-19 economy.

Here we offer seven changes that have big consequences for policymakers, entrepreneurs, and employees.

1 — Digital acceleration

COVID-19 has massively accelerated the adoption of digital technology to facilitate work from home. But also shop from home, school from home, telehealth, and so on.

This digital shift is often remarked on but not well understood. Technology adoption normally follows a particular diffusion trajectory. Digital technologies that have significant scale effects must overcome behavioural and institutional resistance, and they can get stuck at take-off. This means that the productivity benefits from widespread technology adoption, especially infrastructural and production technology, can be very slow to realise.

COVID-19 arrived at a critical time in the history of technology — when a supercluster of digital technologies were forming, poised to disrupt the underlying infrastructure of the economy. This suite of digital platforms and technologies had been developing for the past several decades. But they had run into innovation constraints caused by adoption coordination problems and regulatory barriers.

In March 2020, many of these constraints suddenly vanished. The spread of online education and telemedicine, which had been until then a multi-decade process, occurred in a matter of weeks.

This was a massive, global, multisector, virtually-instantaneous coordinated adoption of digital technology. That’s utterly incredible — and perhaps unique in the history of technology adoption.

A major problem with platform technologies is driving coordinated adoption. The pandemic did in a few weeks what decades of government effort had failed to do. Long-run that is very good. But short-run it is highly disruptive.

2 — A need for massive entrepreneurial adjustment

In Unfreeze we argue that there is an urgent need for entrepreneurs to adapt to the post-COVID-19 world. Economies are made of connections, information, contracts, webs of value, relationships. When we try to restart the economy, much of this connective tissue will be gone.

The rapid technological acceleration driven by the crisis creates its own unique needs for adaptation. We’re already seeing the formation of new consumer preferences, new types of jobs, new types of business models with new cost and demand structures, new patterns of supply, and new regulatory and legal uncertainties.

But this implies that a significant amount of human capital and physical capital (built for industrial era technologies and business models) has rapidly devalued.

The first priority for entrepreneurs in the post-COVID-19 economy will be understanding how particular markets and jobs and administrative functions have changed. For example, many restaurants have moved to take-away only. Will consumers expect those new services to continue? Much of the white-collar economy has moved to work from home. Will employees demand that continues?

Entrepreneurial skills are essential during periods of rapid change. Entrepreneurship is not something that can be supplied by governments. But it can be inhibited. Policymakers have to make sure they are facilitating — not impeding — entrepreneurial adaptation to the accelerated digital adoption triggered by COVID-19.

3 — Decentralised production and innovation

One consequence of this sudden digital uptake is increased decentralisation. With the rapid adoption of work from home — not just the technologies but the social practices — we’ve seen a shift in the locus of much economic activity from offices into homes.

This shift has several implications. One, it facilitates greater co-production of value. More household resources, including especially local information, are being mixed into production.

Two, this also shifts the sites of innovation, facilitating greater household innovation and user innovation. More innovation occurs in the commons rather than in markets and organisations. This in turn increases the need for trusted decentralised networks and, in turn, increases the demand for and use of distributed innovation technology and institutions.

Three, distributed production will require more distributed dispute resolution mechanisms. Traditional courts have been slow to adapt to the digital environment and parties will be looking to more agile forms of alternative dispute resolution.

Four, because more production and innovation is occurring in households and in the commons, this means that it is harder to measure value creation and improvements in these non-market contexts. The non-market part of the economy will increase in apparent scale. So our industrial era measurements of economic activity (like GDP) will need to catch up with these new digital era realities of value creation.

This new institutional economic order will require a new economics to make sense of these new patterns of consumption and production, and new digital forms of capital and value creation.

4 — Powered-up economic evolution

The pandemic is a selection filter. The economic consequences of the policy response to the viral pandemic amount to a powerful evolutionary selection mechanism passing over the global economy and through each sector, acting as both precursor and mechanism of many of these changes.

This brutal selection mechanism is causing job losses, contract terminations or renegotiations, demand reductions, business closures and bankruptcy, fire sales, credit shrinkage, asset repricing, factor substitution, and other distinct forms of economic destruction that will play out over the coming months and years.

This hard evolutionary selection mechanism is also a filter. It will kill off some things disproportionately and let other things pass through. Most obviously, digitally enabled businesses and sectors will do better, because they are better adapted to the new environment. Bigger firms with better capitalisation (or better political connections) will do better, and smaller firms will be selected against.

In labour markets some positions are more vulnerable than others, particularly part-time workers or contractors. While many workers and firms are on temporary support through public sector subsidy of wages or quasi-partial nationalisations, a proportion of those positions or organisations being kept alive will die as soon as support is removed. There are many zombies already.

Similarly, there will be a lot of bad debt on company books (and thereby in banks) that will be realised in market revaluations over coming periods. These collapses will release resources for subsequent entrepreneurial reconstitution and reinvention.

But we should also expect consolidation of existing markets and resources among surviving players. This may actually result in higher growth and profits among large adaptive companies — particularly technology driven companies. So a period of global economic destruction is not inconsistent with a booming share market.

5 — The twilight of conventional macroeconomic policy

At the same time, COVID-19 looks to fundamentally break the standard monetary and fiscal policy levers that have been used to manage business cycles over the twentieth century.

From a public finance perspective, the magnitude of the committed policy actions is already unprecedented. The levels of public debt that are planned in order to deal with this crisis — the policies to subsidise wages, provide rent and income relief, bail out companies, etc. in order to avoid market catastrophe — are the largest ever experienced. Moreover, these actions are being taken during a massive collapse in tax receipts. The implications for public finance are catastrophic, with a huge increase in public debt, a vastly worse central bank balance sheet, and looming inflation.

The result is a policy challenge that far exceeds the capabilities of traditional monetary and fiscal levers. We will require institutional policy reforms to deal with the crisis. But institutional policy designed to free up the supply side of the economy, to lower the costs and constraints on businesses, is politically much harder to achieve.

Indeed, the limits of these policy levers reveal the extent to which government administration (e.g. of money, of asset and property registries, of identity, of regulation and governance) is still the foundation of a modern economy. The pandemic has brought into sharp relief the limits and constraints of this centralised public infrastructure and the technocratic foundations of the macroeconomic policy mechanisms built upon them.

The real alternative to conventional policy levers isn’t different policies (like quantitative easing, negative interest rates, or universal basic income) but better institutional technologies. We’ve been looking in the past few years at distributed digital technology (that is, blockchain) that offers a new administrative and governance base layer of the economy (see here, here, here, here and here to start).

A digital infrastructure base layer of industry utilities and digital platforms would provide a far more agile foundation for targeted economic policy and entrepreneurial adaptation.

6 — A new global trading order

One of the most powerful institutional forces over the past several centuries, and which has underpinned global economic prosperity in the industrial era, was the development of global trading infrastructure for commodities and capital. It was built around the Westphalian system of nation-state record-keeping and inter-nation-state treaty-based institutional governance (i.e. trade zones). But it has come to a virtual halt in the crisis.

In the short and medium term the global trading order will rebuild around a different order, namely provable health identity and data to facilitate the safe movement and interaction of people. Where that can safely happen, so can economic activity. Health zones can become the basis for trade zones. Australia and New Zealand are already talking about a “health bubble”. It would be easy to include other highly successful health economies — Taiwan, Japan, Germany, potentially Hong Kong and Singapore, some Pacific Island nations.

Green zones (or cordons sanitaires) have long been used in pandemics and have once again been proposed as a way to exit lockdown. As the health zone grows, so can the trade zone. Economic zones can then free ride on the decentralised identity and data infrastructure created to build a health zone. The result will be the redrawing of physical and network boundaries, even eliminating artificial economic borders, to create integrated trade zones.

7 — A new political order

The costs of COVID-19 do not fall evenly across the population. The health risks fall heavily on some groups (the elderly and those with co-morbidities), and the costs of economic lockdown fall on different groups and will be felt differently. The differential impact by sector, jobs, education, human capital investments or physical or financial capital write-downs shape how the costs are distributed across society.

The virus imposes huge private costs that will be in part socialised through political bargaining. The outcomes of these politically mediated bargains and transfers will shape politics for years to come.

But the pandemic also shifts some of the anchor points of political economy. The sudden growth of the welfare state, of unemployment insurance and wage-support, of healthcare provision and childcare, even of social housing are unlikely to be easily rolled back. So there will be a higher demand for social welfare safety nets.

But to pay for this, along with the urgent need to address the huge deterioration of public balance sheets, economic policy will need an aggressive pro-market agenda to unleash economic growth. Politically, this is a pivot to the centre with very ‘dry’ economic policy and ‘wet’ social policy — what was called ‘third way’ in the 1990s.

The counterpoint to that centre-pivot is that many of the high-cost political projects of both the right and the left will be abandoned. Reduced economic growth means we can afford fewer of the luxuries of advanced capitalism.

This is a vision of a new kind of social-digital capitalism to be built after the reset — from the government-led physical infrastructure of the industrial era, to a digital era built on private, open and communally developed technology platforms.

Finally

The economic consequences of the COVID-19 pandemic are currently discussed mostly in terms of the macro policy response to the economic destruction that the public health strategy necessitates. Hence the talk of V-shaped, U-shaped, L-shaped or W-shaped recoveries. In Unfreeze we wrote of the need for a square-root-shaped recovery — after the reopening, we’ll need a long period of high economic growth to return to the prosperity of 2019.

But here we’ve gone further. COVID-19 is driving structural evolutionary change in the economy. The accelerated adoption of digital economic infrastructure during the crisis will leave a lasting mark on the political and economic system of the future.

Cryoeconomics: how to unfreeze the economy

With Darcy Allen, Sinclair Davidson, Aaron Lane and Jason Potts. Originally a Medium post.

The Australian government, like many governments around the world, wants to freeze the economy while it tackles the coronavirus pandemic. This is what the Commonwealth’s JobKeeper payments and bailout packages are supposed to do: hold workers in place and keep employment relationships together until mandatory social distancing ends.

Easier said than done. We are in completely uncharted territory. We’ve never tried to freeze an economy before, let alone tried to thaw it out a few weeks or months later. That’s why our new project, cryoeconomics, looks at the economics of unfreezing an economy.

To understand why this will be so hard, think of an economy as a remarkably complex pattern of relationships. Those relationships are not only between employees and employers, but also between borrowers and lenders, between shareholders and companies, between landlords and tenants, between producers tied together on supply chains, and between brands and tastemakers and their fans.

The patterns that make up our economy weren’t designed from above. They evolved from the distributed decisions of consumers and producers, and are shaped by the complex interaction between the supply of goods and services and their demand.

The problem is that the patterns the government plans to freeze are not the patterns we will need when they finally let us thaw.

When the government decides to pull the economy out of hibernation, the world will look very different. As a simple example, it’s quite possible that many Australians, forced to stay home rather than eat out, discover they love to cook. This will influence the demand for restaurants at the end of the crisis. On the other hand, our pent-up desire for active social lives might get us out into the hospitality sector with some enthusiasm. There will be drastic changes because of global supply chain disruptions and government policies. These changes will be exacerbated by the fact that not all countries will be unfrozen at the same time.

The upshot is that the economy which the government is trying to hibernate is an economy designed for the needs and preferences of a society that has not suffered through a destructive pandemic.

Unfreezing the economy is going to be extremely disruptive. New patterns will have to be discovered. As soon as the JobKeeper payments end, many of the jobs that they have frozen in place will disappear. And despite the government’s efforts, many economic relationships will have been destroyed.

Yet there will also be new economic opportunities — new demands from consumers, and new expectations. Digital services and home delivery will no doubt be more popular than they were before.

These disruptions will be unpredictable — particularly if, as we expect, the return to work is gradual and staggered (perhaps according to health and age considerations or access to testing).

As we unfreeze, the problem facing the economy won’t primarily be how to stimulate an amorphous ‘demand’ (as many economists argue government should respond to a normal economic recession) but how to rapidly discover new economic patterns.

It is here that over-regulation is a major problem. So many of the laws and regulations imposed by the government assume the existence of particular economic patterns — particular ways of doing things. Those regulations can inhibit our ability to adjust to new circumstances.

In the global response to the crisis there has already been a lot of covert deregulation. The most obvious examples are around medical devices and testing. A number of regulatory agencies have stood down some rules temporarily to allow companies to respond to the crisis more flexibly. The Australian Prudential Regulation Authority is now willing to let banks hold less capital. The Australian Securities and Investments Commission has dropped some of its most intrusive corporate surveillance programs.

The deregulatory responses we’ve seen so far relate to how we can freeze the economy. A flexible regulatory environment is even more critical as we unfreeze. Anything that prevents businesses from adapting and rehiring staff according to the needs of the new economic pattern will keep us poorer, longer.

Today the government is focused on fighting the public health crisis. But having now turned a health crisis into an economic crisis, it must quickly put in place an adaptive regulatory environment to enable people and businesses to discover what a post-freeze economy looks like.

The rational crypto-expectations revolution

With Sinclair Davidson and Jason Potts. Originally a Medium post.

Will governments adopt their own cryptocurrencies? No.

Will cryptocurrencies affect government currencies? Yes.

In fact, cryptocurrencies will make fiat currency better for its users — for citizens, for businesses, for markets. Here’s why.

Why do we have fiat currency?

Governments provide fiat currencies to finance discretionary spending (through inflation), control the macroeconomy through monetary policy, and avoid the exchange rate risk they would have to bear if everybody paid taxes in different currencies.

As George Selgin, Larry White and others have shown, many historical societies had systems of private money — free banking — where the institution of money was provided by the market.

But for the most part, private monies have been displaced by fiat currencies, and live on as a historical curiosity.

We can explain this with an ‘institutional possibility frontier’, a framework first developed by Harvard economist Andrei Shleifer and his various co-authors. Shleifer and colleagues array social institutions along the frontier according to how they trade off the risks of disorder (that is, private fraud and theft) against the risks of dictatorship (that is, government expropriation, oppression, etc.).

As the graph shows, for money these risks are counterfeiting (disorder) and unexpected inflation (dictatorship). The free banking era taught us that private currencies are vulnerable to counterfeiting, but due to competitive market pressure, minimise the risk of inflation.

By contrast, fiat currencies are less susceptible to counterfeiting. Governments are a trusted third party that aggressively prosecutes currency fraud. The tradeoff, though, is that governments gain the power to inflate the currency.

The fact that fiat currencies are widely preferred around the world isn’t only because of fiat currency laws. It’s that citizens seem to be relatively happy with this tradeoff. They would rather bear the risk of inflation than the risk of counterfeiting.

One reason why this might be the case is because they can both diversify and hedge against the likelihood of inflation by holding assets such as gold, or foreign currency.

The dictatorship costs of fiat currency are apparently not as high as ‘hard money’ theorists imagine.
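The logic of the frontier can be made concrete with a toy calculation. This is a minimal sketch of our own, not from Shleifer's papers: the cost numbers are invented for illustration, standing in for the social cost of counterfeiting (disorder) and of unexpected inflation (dictatorship) under each monetary institution, and the preferred institution is simply the one that minimises the sum of the two.

```python
# Toy illustration of the institutional possibility frontier (IPF) for money.
# All numbers are invented: they represent hypothetical social costs per unit
# of transaction value under each monetary institution.

institutions = {
    # (disorder_cost, dictatorship_cost)
    "free banking":  (0.08, 0.01),  # high counterfeit risk; competition caps inflation
    "fiat currency": (0.02, 0.05),  # trusted third party; but government can inflate
}

def total_cost(costs):
    """Society weighs both risks; here we simply sum them."""
    disorder, dictatorship = costs
    return disorder + dictatorship

best = min(institutions, key=lambda name: total_cost(institutions[name]))
for name, costs in institutions.items():
    print(f"{name:13s} total social cost = {total_cost(costs):.2f}")
print("preferred institution:", best)
```

With these (invented) numbers fiat wins: its lower disorder cost outweighs its higher dictatorship cost, consistent with the observation that citizens would rather bear inflation risk than counterfeiting risk.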

Introducing cryptocurrencies

Cryptocurrencies significantly change this dynamic.

Cryptocurrencies are a form of private money that substantially, if not entirely, eliminate the risk of counterfeiting. Blockchains underpin cryptocurrency tokens as a secure, decentralised digital asset.

They’re not just an asset to diversify away from inflationary fiat currency, or a hedge to protect against unwanted dictatorship. Cryptocurrencies are a (near — and increasing) substitute for fiat currency.

This means that the disorder costs of private money drop dramatically.

In fact, the counterfeiting risk for mature cryptocurrencies like Bitcoin is currently lower than that for fiat currency. Fiat currency can still be counterfeited. A stable and secure blockchain eliminates the risk of counterfeiting entirely.

So why have fiat at all?

Here we see the rational crypto-expectations revolution. Our question is: what does a monetary and payments system look like when cryptocurrencies compete against fiat currencies?

And our argument is that fiat currencies will survive — even thrive! — but the threat of cryptocurrency adoption will make central bankers much, much more responsible and vigilant against inflation.

Recall that governments like fiat currency not only because of the power it gives them over the economy but because they prefer taxes to be remitted in a single denomination.

This is a transactions cost story of fiat currency — it makes interactions between citizens and the government easier if it is done with a trusted government money.

In the rational expectations model of economic behaviour, we map our expectations about the future state of the world from a rational assessment of past and current trends.

Cryptocurrencies will reduce government power over the economy through competitive pressure. To counter this, central bankers and politicians will rail against cryptocurrency. They will love the technology, but hate the cryptocurrency.

Those business models and practices that rely on modest inflation will find themselves struggling. The competitive threat that cryptocurrency imposes on government and rent-seekers will benefit everyone else.

It turns out that Bitcoin maximalists are wrong. Bitcoin won’t take over the world. But we need Bitcoin maximalists to keep on maximalising. The stability of the global macroeconomy may come to rely on the credible threat of a counterfeit-proof private money rapidly and near-costlessly substituting for fiat money under conditions of high inflation.

A hardness tether

Most discussion about the role of cryptocurrency in the monetary ecology has focused on how cryptocurrencies will interact with fiat. The Holy Grail is to create a cryptocurrency that is pegged to fiat — a so-called stablecoin (such as Tether or MakerDAO).

But our argument is that the evolution of the global monetary system will actually run the other way: the existence of hard (near zero inflation, near zero counterfeit) cryptocurrency will tether any viable fiat currency to its hardness. No viable fiat currency will be able to depart from the cryptocurrency hardness tether without experiencing degradation.

This in effect tethers fiscal policy — and the ability of politicians to engage in deficit spending in the expectation of monetising that debt through an inflation tax — to the hardness of cryptocurrency.

The existence of a viable cryptocurrency exit tethers monetary and fiscal policy to its algorithmic discipline. This may be the most profound macroeconomic effect of cryptocurrency, and it will be almost entirely invisible.
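The tether argument can be expressed as a stylised model. This is our own illustrative sketch, not part of the original post: holders abandon fiat for a counterfeit-proof, zero-inflation cryptocurrency once the inflation tax exceeds their cost of switching, so the credible exit option caps the seigniorage a government can extract. The 4% switching-cost figure is an assumption for illustration.

```python
# Stylised "hardness tether" model (illustrative assumptions throughout).
# Fiat holders exit to a zero-inflation cryptocurrency once the inflation
# tax exceeds the cost of switching, so seigniorage collapses past that point.

def fiat_money_demand(inflation_rate, switching_cost):
    """Share of balances kept in fiat: all of them below the exit
    threshold, none once inflation exceeds the cost of switching."""
    return 1.0 if inflation_rate <= switching_cost else 0.0

def inflation_revenue(inflation_rate, switching_cost, money_base=100.0):
    # Seigniorage = inflation tax on whatever money base remains in fiat.
    return inflation_rate * money_base * fiat_money_demand(inflation_rate, switching_cost)

switching_cost = 0.04  # assumption: exiting to crypto costs 4% of balances

# Revenue rises with inflation only up to the tether, then drops to zero.
for pi in (0.02, 0.04, 0.06):
    print(f"inflation {pi:.0%}: seigniorage = {inflation_revenue(pi, switching_cost):.1f}")
```

As cryptocurrencies mature, the switching cost falls, and with it the ceiling on sustainable inflation: that falling ceiling is the hardness tether.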

Cryptocurrency is to discretionary public spending what tax havens are to national corporate tax rates.

Blockchain is (now) a competitive industry

With Sinclair Davidson, Jason Potts and Ellie Rennie. Originally a Medium post.

With the anniversary of the Bitcoin whitepaper looming on October 31, it is remarkable how far and fast this industry has come since the whitepaper was anonymously posted to a cryptography mailing list just ten years ago. Ethereum, which gave us smart contracts and ICOs, was only launched in 2015. The Consensus conference, only in its fourth year, packed over 8,500 attendees into the New York midtown Hilton, with representatives from most major corporations and industries present.

Blockchain is quickly becoming mainstream. The industry is entering the phase of industrial competition — and this is happening on a global scale.

Consensus is the centerpiece of Blockchain Week in New York City, and the main global industry conference for cryptocurrency and blockchain technology. It is also increasingly a platform for major industry announcements. Two clusters of announcements in particular are propitious markers of where we’re up to in the development of the industry.

In politics, David Burt, Premier and Finance Minister of Bermuda, announced that his country’s Parliament had tabled the Digital Asset Business Act, staking a claim to be the world’s leading crypto-regulator. On Tuesday, Eva Kaili, Chair of the European Parliament’s Science and Technology Options Assessment panel, announced that the Blockchain Resolution had passed the European Parliament.

In enterprise, Fred Smith, CEO of FedEx, called blockchain the next big disruption in supply chains and logistics, with the potential to completely revolutionise the global trade system. Circle, a Goldman Sachs-backed crypto finance company, announced it will be issuing a fiat stablecoin, which is to say a crypto-version of the US dollar. And buried in the announcement by Kaleido — a blockchain business cloud — of a partnership with UnionBank i2i (a Philippine bank specialising in rural banking) was a joint partnership with Amazon Web Services.

These announcements indicate that we have entered a new industry phase. We have moved well beyond the first entrepreneurial phase of highly speculative market-making start-ups operating entirely in a disruptive mode, and are now at the onset of a second phase of industrial dynamics: industrial competition. While still incredibly young, the blockchain industry has, because of the speed and scale at which it has developed, already entered this competitive phase.

The Bermuda announcement is a competitive response to the innovative regulatory frameworks built by jurisdictions such as Singapore, Zug (CryptoValley), Estonia, Gibraltar, the Isle of Man, and other crypto-havens. It clearly signals that we are now in a phase of global regulatory competition, and that crypto-regulation and legislation in countries such as the US and Australia will be held in check by the competitive pressure of exit options, which prevents them from departing too far from the competitive equilibrium.

The announcement by Kaleido is in itself less significant than that of the AWS partnership, which signals the new shape of competition in cloud computing. Technology companies such as Microsoft, Oracle and IBM are competitively positioning themselves to provide foundational infrastructure services and standards in this new space, and Fred Smith’s pronouncement signals that the logistics industry is about to be competitively disrupted again.

The difference between the first and second phase of industrial dynamics is that in the first phase entrepreneurs are inventing new technology, disrupting existing markets, and seeking to create new business models. It’s a process of de-coordination of an existing economic order. But this is not generally well described as a competitive market process, usually because markets themselves are still forming, and uncertainty is very high. Cooperation in networks and innovation commons is the predominant institutional form.

Competition emerges when uncertainty begins to clear: the outlines of how the technology works and what it will be used for, which markets are affected and how, and which firms will be involved all come into view, and a speculative game turns into a strategic game because it becomes clear who the players are and what they are doing. Investment is no longer just for R&D and the discovery of new technology; it is strategic investment to compete for market share, and ideally for market dominance.

This is where we are up to now: the phase of global market competition. Further evidence of this is that the main concern of industry participants is global regulatory uncertainty, which is to say the rules of the competitive game.

Now, to be clear, crypto and blockchain are still experimental technologies. But we’re now past the early innovation phase — the start-up phase — and investment is now a C-suite concern and a parliamentary agenda item.

What does competition mean for Web 3.0?

So blockchain is being absorbed into the economy and global political system. But what does this mean for the future of the internet?

The other big question arising from the Consensus 2018 announcements was the extent to which the involvement of incumbent internet platforms, such as Microsoft and AWS, will affect the distributed nature of the emergent blockchain ecosystem.

Joseph Lubin, co-founder of Ethereum, argued that the technological foundations for a distributed future have been built and that the essential task now is to achieve scalability. Data storage is an important aspect of scalability that will be essential to the success of decentralised applications (dapps), and more radical solutions (such as the InterPlanetary File System, IPFS) are apparently not ready for widespread adoption.

The involvement of AWS in Kaleido enables enterprise participation in the Ethereum blockchain whilst ensuring that the data (including oracles) are housed securely. While numerous self-sovereign identity dapps are available (as displayed through Civic’s identity-checking beer vending machine at the conference), common standards are necessary for those providing verified information.

Microsoft’s partnership with Blockstack and Brigham Young University is a development towards these standards that is potentially significant for this new approach to online privacy.

Neither development necessarily threatens Web 3.0, but this is now being driven by a competitive logic of market forces.