Wulf A. Kaal**
Abstract
Decentralization cannot evolve in a vacuum. As decentralized technology becomes increasingly available, its application, adaptation, and evolution depend on society's acceptance and use of the technology. While improved decentralized incentive designs can accelerate adoption, design alone will not suffice to promote the decentralization of business and society. Several factors call into question society's engagement with decentralized technologies. Such factors can be seen as decentralization neutralizers. They are plentiful and undermine society's engagement with emerging decentralized technologies and the solutions those technologies may offer for society. The article discusses these decentralization neutralizers and points out possible solutions.
Key Words: Decentralization, Neutralizers, Digital Assets, Decentralized Finance, Blockchain, Start-up, Decentralized Commerce, Emerging Technology, Token Models, Incentive Design, Tokens, Distributed Ledger Technology, Decentralized Infrastructure
JEL Categories: K20, K23, K32, L43, L5, O31, O32
I. Introduction
Decentralization cannot evolve in a vacuum. As decentralized technology becomes increasingly available, its application, adaptation, and evolution depend on society's acceptance and use of the technology. While improved decentralized incentive designs can accelerate adoption, design alone will not suffice to promote the decentralization of business and society.
Several factors call into question society's engagement with decentralized technologies. Such factors can be seen as decentralization neutralizers. They are plentiful and undermine society's engagement with emerging decentralized technologies and the solutions those technologies may offer for society.
Analog habits remain strong. People raised in the analog age have been socialized into basic work-related and social habits that are very hard to change. For example, analog media such as vinyl records and magnetic tape have cult-like followings despite clear disadvantages, including imperfections that cause crackling and popping noises. Their followers consider such technological imperfections a more humane and natural form of human-technology interaction. Other examples include people's reliance on printing to get "serious work" done. While the content is the same, people somehow believe that their interaction with old technology formats ensures superior outputs. Psychologically, one may speculate, such reactionary perspectives stem from a feeling of control over content that seems otherwise uncontrollable with modern technology. Belief in the superiority of outdated technology is prevalent in society, even among intellectual elites.
Humans are analog beings with natural limitations. Humans' ability to store information differs from that of artificial intelligence, which rests on the precise and incorruptible memory of computers. Artificial intelligence has so far been digital but may morph into something entirely different. Even if AI, neuroscience, and collective intelligence eventually converge, it remains unclear whether advanced AI will be neuromorphic. Yet, emerging decentralized technology offers unprecedented programmable incentives in peer technologies that allow humans to become ever more connected and better collaborators. The coordination of human brain capacities, that is, collective intelligence, while inherently risky, may be a way for humans to become more than analog beings.
Many leading legacy businesses are actively involved in attempts to neutralize disruptive decentralized innovation. The smartphone sector provides a prominent example. It has demonstrated that 24/7 transactions are possible worldwide, bringing knowledge and information to the edges, in any language or currency, across borders, regardless of communication technologies. Smartphone technology thus created new markets with smaller players that compete with leading legacy businesses. Take, for example, what Skype did to the telecommunications market. The competition to win the resulting technology battle is intensifying at an alarming pace. In their attempts to neutralize decentralized competitors, legacy businesses are acquiring little-known companies with highly competitive decentralized technologies. Microsoft's acquisition of Skype provides a prominent example. Legacy businesses' attempts to control decentralized competition are still in their infancy. The effect of curtailing decentralized solutions by integrating them into centralized power structures may actually be the long-term incremental proliferation of decentralized solutions.
Some examples illustrate the impact of technology on industries and its accelerating pace. The travel industry, which moved from established brick-and-mortar businesses to fully online service providers, is one such example. Even formerly disruptive businesses, such as Amazon, whose platform aged for over a decade without much change other than added categories, are now forced to try to preserve their place in their respective industries. Amazon started to offer delivery as a service only after getting a lesson from more aggressive Chinese providers, suggesting that Amazon had fallen out of touch with contemporary technology offerings.
Governmental decentralization trends (discussed above) are counteracted by a political shift around the world. A perceived lack of legitimacy and an antiestablishment surge in politics, in combination with growing public concerns about inequality, stagnating wages, trade sanctions, immigration, the debt crisis, and China's rising power, have fueled a recent political shift. Political parties in Europe have started a gradual move away from markets and back to the state. They are emphasizing increased welfare spending, renationalization of formerly privatized public service providers, bigger pensions, and higher corporate taxes, among other measures aiming to reverse decades of pro-market policy. Political parties employ such measures in an attempt to stop the exodus of voters to populist and antiestablishment parties. Some analysts see this as a continuing countertrend to the political decentralization that began under Thatcher and Reagan, arguing that the zeitgeist of globalization and liberalization is over.
Decentralized systems and tribal structures evolved, morphed, and adapted as centralized systems were adopted by society. Decentralized systems evolved out of the naturally decentralized order of nature. Centralized solutions emerged in society to manage the seemingly disorganized and unpredictable chaos of nature. Without attempting a full account of anthropology and the move from decentralized to centralized human organization, humans living in organized groups could extend their life spans and live better lives through coordinated action vis-à-vis nature than humans who did not centrally organize their groups. The efficiencies of centralized organization facilitated higher likelihoods of survival. Over time, such centrally organized groups evolved into tribal structures. In an effort to protect their members and utilize scarce resources most efficiently, most tribes adopted forms of centralized, often patriarchy-based and patriarchy-enforced, order and associated centralized hierarchies. With the introduction of property rights and an associated economic order, tribes expanded into urbanized structures to, again, enhance the efficiency of production of scarce resources and their centralized distribution. Urbanized structures over time evolved into centrally controlled countries with centrally organized governments and centralized societal mechanics. With some notable exceptions,[1] it was only through the emergence of decentralized technology, in the 1990s with the internet and in 2008 with blockchain technology, that developed, centralized, and mostly Western first-world countries started to experiment with decentralized solutions for society.
In the early 2010s, groups of technologists and believers in technology-enabled decentralized networks, orders, and solutions began to consider technology alternatives for societal order. Given the perceived downsides of centrally coordinated and government-controlled hierarchies in society, including the negative effects on innovation, equity, and sustainability, the emerging decentralized systems became increasingly attractive, and their support groups proliferated around the world. The increasing mainstream acceptance of Bitcoin and the prospect of quick riches through investments in initial coin offerings broadened the scope of the decentralized community from technologists and idealists to include digital asset investors and institutional investors.
In the early 2020s, the characteristics of the existing pool of proponents of decentralized technologies have had inhibiting effects on the evolution of decentralization. The initial crypto community emerged in the 2010s, in the aftermath of the publication of Satoshi Nakamoto's Bitcoin whitepaper, with a strong libertarian focus. The idea of reforming society with decentralized technology remains strong in the crypto community. As a result, cooperation with existing centralized structures is often suboptimal. This is perhaps best illustrated by the early-2020s debate between proponents of centrally controlled, central-bank-issued digital currencies and proponents of more decentralized protocols and more decentralized issuers of cryptocurrencies. Rather than seeing central-bank-issued digital currencies as a pathway to mainstream acceptance of more decentralized cryptocurrencies, the decentralized issuers of digital assets and their community perceived their centralized counterparts as a threat.
Between 2010 and 2020, the prospect of quick profits attracted a clientele to the community that was more interested in investment opportunities than in decentralized solutions. With it came rampant fraud in the digital asset space, which created a chasm between the decentralized asset space and mainstream investors. Mainstream investors often perceived digital assets as too risky and likely to be involved in forms of fraud. The lack of regulatory certainty and regulatory recognition of cryptocurrencies exacerbated these trends. The singular focus on Bitcoin as a store of value and investment in large parts of the evolving decentralized community extends the negative effects for the foreseeable future.
Even those parts of the decentralized technology community that value decentralized technology solutions for society over investments and profits derived from cryptocurrencies are siloed and do not sufficiently engage with each other. Decentralized technology systems require interoperability for survivability. Yet, most platforms attempt to go it alone in an effort to become the next Bitcoin. As a result, decentralized community silos proliferate. The required decentralized technology infrastructure solutions cannot be built by one protocol alone. Cooperation is inevitably needed.
II. Degrees of Decentralization
Since the inception of decentralized networking technologies in the 2010s, researchers have debated the possibility of full and sustainable decentralization in computing systems.[2] Still, in the early 2020s, decentralized technologies facilitated suboptimal degrees of decentralization out of necessity. For example, Bitcoin's Proof-of-Work (PoW) consensus algorithm required multiple nodes to verify and facilitate transactions. In return for making their computing capacity available, Bitcoin miners were rewarded with a portion of the transaction fees as well as the opportunity to claim the reward for the creation of a Bitcoin block. The block reward, 12.5 bitcoins until the May 2020 halving and 6.25 bitcoins thereafter, was issued circa 144 times per day. The economic nature of these rewards incentivizes degrees of centralization through the collective action of nodes that share in the mining rewards as a group, so-called mining pools. Similarly, experimentation with other forms of consensus algorithms that promise to overcome the downsides of PoW, such as the high energy consumption of mining, including but not limited to Proof-of-Stake (PoS) and Delegated-Proof-of-Stake (DPoS), has routinely made compromises with regard to the degree of decentralization to attain other benefits, such as scaling and/or security.
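The reward figures discussed here follow directly from Bitcoin's fixed issuance schedule: a ten-minute block-interval target (roughly 144 blocks per day) and a block subsidy that started at 50 BTC and halves every 210,000 blocks. A minimal arithmetic sketch:

```python
# Back-of-the-envelope sketch of Bitcoin's issuance schedule: the block
# subsidy started at 50 BTC and halves every 210,000 blocks
# (approximately every four years).

BLOCKS_PER_HALVING = 210_000
INITIAL_SUBSIDY_BTC = 50.0

def block_subsidy(height: int) -> float:
    """Subsidy (in BTC) awarded for a block at the given height."""
    return INITIAL_SUBSIDY_BTC / (2 ** (height // BLOCKS_PER_HALVING))

# Ten-minute block target -> ~144 blocks per day on average.
blocks_per_day = 24 * 60 * 60 // 600

# Subsidy eras relevant to the discussion above:
print(block_subsidy(500_000))  # 12.5 (the 2016-2020 era)
print(block_subsidy(650_000))  # 6.25 (after the May 2020 halving)
print(blocks_per_day)          # 144
```

At 6.25 BTC per block, the schedule implies roughly 144 × 6.25 = 900 newly issued bitcoins per day; it is precisely this predictable reward stream that makes pooled mining economically attractive.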
The debate on the possible degree of decentralization has insufficiently taken into account the path dependencies created by existing decentralized technology. Early-stage decentralized technologies display clear limitations. For example, PoW, PoS, and DPoS, among others, create a community belief system around certain foundational technology features that in effect inhibits higher degrees of decentralization. As a result of the suboptimal degrees of decentralization in early-stage technology, community understanding of degrees of decentralization became distorted, perpetuating suboptimal technology decentralization while inhibiting higher degrees of decentralization in the technology infrastructure.
Even if full and sustainable decentralization may not be possible in computing systems, ever-increasing degrees of decentralization bring with them the promise of freeing human and machine networks and network governance from the age-old foundational problems of tyranny and corruption. Even smaller degrees of decentralization in network technology can have highly disruptive effects on then-existing systems in society. The most applicable and disruptive use cases of decentralized technologies evolve as the decentralized infrastructure of the technology evolves. As progress is made on the scalability of decentralized networks, use cases that necessitate higher throughput become increasingly available and disrupt existing industries that serve the same use cases with incumbent technology. Similarly, as progress is made on the security of decentralized networks, combined with higher degrees of scalability, other and new use cases become available and start to disrupt industries with incumbent technologies.
The decentralization of network technology is limited because it operates in an environment of centralized technology. The internet, while decentralized in its foundation, has a centralized order structure. Decentralized technologies use the centralized evolution of the internet and its technological infrastructure to create decentralized system solutions. Such decentralized systems exist within, and are confined by, the centralized technology of the internet and the centralized societal structures it facilitates.
III. S-Curve of Centralization
The S-Curve of centralization provides society with tangible temporary benefits. Centralization is effective for society and business as long as it provides added benefits. In the initial phase of expansion, the benefits of centralizing power, capital, and control rise faster than the costs of centralization. Because the benefits of centralized power, such as cost savings, efficiency, and economies of scale and scope, rise faster, they offset incremental downsides until the efficiency gains of centralized systems expire. In the following decline phase, the costs of centralization, e.g., corruption, rising prices, diseconomies of scale, diminishing returns, cronyism, and other overreaches, increase. Because of this increase, the benefits of centralization can no longer offset and outweigh the costs. As a result, centralized systems function less optimally and less efficiently.
The efficiency gains of centralization can perpetuate centralized tendencies in society. Centralized efficiencies are very powerful and create benefits for society and business. As new technologies and new forms of human interaction materialize, centralized design principles can again be applied and thrive until they collapse in the next iteration. In the process, centralized design principles become deeply engrained in societal processes, business, and associated technology, and the resulting centralized design efficiencies become embedded in the infrastructure of society, creating path dependencies that make decentralized solutions incompatible.
The S-Curve of centralization undermines decentralized solutions. Anthropologically, society evolved from somewhat decentralized tribal structures into more centralized structures that facilitated efficiency gains through urbanization and throughout the industrial revolution. In the process, centralized infrastructure elements, such as a legal system and financial services, among many others, became deeply engrained in society. Despite the superior decentralized technological solutions that have been emerging since the beginning of the 21st century, the centralized elements in business and society can be perpetuated via the efficiency gains in the S-curve of centralization. The advent of decentralized technology makes decentralized solutions for society not only technologically possible but also anthropologically necessary, as centralized societal solutions have become outdated and often unworkable.[3] Yet, legacy systems can be optimized with decentralized technology solutions that ultimately only perpetuate centralized system structures. For example, the malfunctioning proxy process for shareholder voting, where shareholder proxies may accidentally approve a merger proposal,[4] can be optimized with blockchain technology. In the process, the outdated and malfunctioning centralized legacy systems are perpetuated rather than replaced with better-performing decentralized systems.
IV. Path Dependencies
Historical preference or use explains the continued use of a practice or product. Path dependency occurs when less efficient and less beneficial means are used despite the availability of more efficient alternatives, because it may be operationally easier and, in the short term, more cost-effective to continue along a preestablished path. Institutions are notoriously path-dependent, change less than would be rationally expected, and constrain advancement in the process. Several factors help explain the existence of path dependencies. First and foremost, humans are boundedly rational and subject to transaction costs and opportunism.[5] Human policymakers make decisions based on their past experiences and the assumptions derived from such experiences. Cost implications may lead to short-termism and reluctance to commit to long-term, sustainable solutions for the common good. Similarly, an initial technology, design, concept, method, or innovation is often adopted as a standard for a given industry, group, or part of society, creating path dependence in its aftermath. Examples include the industrial revolution, the use of fossil fuels as primary energy sources, and analog systems, among many others.
The design of decentralized systems is subject to path dependencies that originate in the centralized past. Humans' past experience in centralized structures infiltrates and undermines attempts to create more decentralized systems. Any designer of any type of system is subject to subconscious processes the designer cannot change, channel, or influence. One aspect of the subconscious is its exposure to centralized thinking and philosophy in culture and during education and socialization. Attempts to design decentralized systems are thus subject to the totality of centralized human experience. As such, decentralized system designs are not only subject to centralization intended to enhance their features or address decentralized system shortcomings, such as scaling and throughput, among others, but may also incorporate unintentional, methodologically centralized design features. For example, centralization problems were prevalent in blockchains at the beginning of the 2020s, whether the system designers used Bitcoin's Proof-of-Work consensus algorithm or variations of Proof-of-Stake consensus algorithms, such as BitShares' Delegated Proof of Stake[6] or Ethereum's Casper.[7] In each of these cases, mining pools or block production cartels arose because lotteries, voting delegation, or economies of scale gave outsized rewards to powerful groups.
Unintentional, methodologically centralized design features in complex decentralized systems are inevitable. Complex systems are systems subject to interactions between the system and its environment or among the system's own parts. Because of these interactions and the associated morphing of the system, complexity abounds and often increases exponentially through feedback loops, spontaneous order or the lack thereof, and adaptation. Complex adaptive systems constantly evolve through the self-regulation of self-organizing group behaviors. By contrast, while humans give undue credence to complex concepts,[8] human brains naturally simplify complex systems for the sake of coherence. Therefore, it seems only natural that the designers of decentralized systems will unintentionally use centralized design features that increase coherence. Decentralized collaboration may be able to overcome these natural human limitations.
Biomimicry for decentralized design optimization is subject to limitations. Attempts to emulate nature's decentralized design[9] to enhance the resilience of human-made systems are typically disadvantaged ab initio, as nature's design is the result of millions of years of design experimentation. Because of this history of experimentation, nature's decentralized design is meticulous until proven otherwise. By contrast, human science is flawed until proven otherwise.[10] To emulate nature's decentralized experimentation design features, human-designed decentralized systems would need to build experimentation with workable designs into the system as a core feature. At the beginning of the 2020s, very few decentralized systems or design proposals incorporated such experimentation features.[11] Even if human-designed decentralized systems incorporate experimentation features at their design core, experimentation takes time. Truly decentralized human-designed systems may take decades of experimentation to iron out flaws and centralized impurities.
V. Constitutional Democracies
Constitutional democracies are the dominant form of organizing society.[12] In constitutional democracies, the government is limited by a constitution. For example, the Constitution of the United States is the supreme law of the country, originally comprising seven articles delineating the national framework of government. The Constitution delineates the checks and balances for the distribution of power in the government. It controls the elected representatives in the government and holds them accountable for their decisions and actions while in public office. For example, the Constitution protects the rights of minorities. An elected representative in a constitutional democracy cannot make decisions against a minority group even if doing so benefits the ethnic majority that helped elect the representative. The authority of the majority is limited by legal and institutional means so that the rights of individuals and minorities are respected.
Overreliance on the design of constitutional democracies limits experimentation with decentralized solutions for the organization of society. Because of the overwhelming success of constitutional democracies around the world and their ability to bring peace and equity to humanity, the model is widely seen as the only form of organization that has a lasting role for humanity. Yet, the reliance on a single legal document, the constitution, in combination with widespread voter apathy has created limitations. The constitution and the centralized structures established by the constitution can perhaps be adequately adjusted to unprecedented societal needs. However, the exclusive reliance on constitutional principles for the organization of society ignores and undermines the opportunities created by decentralized forms of organization.
VI. Democratic Legitimacy of Code
“Developers will be at the center of solving the world’s most pressing challenges. However, the real power comes when every developer can create together, collaborate, share code and build on each other’s work. In all walks of life, we see the power of communities, and this is true for software development and developers.” Satya Nadella (CEO, Microsoft).[13]
Evolving decentralized systems are increasingly instantiated via computer code. Code provides the fundamental architecture that sets the terms of life in a digital age. The deep architecture of the digital world cannot be fully understood unless one considers the nature of code and how it operates. Code determines how to protect personal information or how humans express themselves. Code determines whether access to information is open or whether specific information or space is zoned and siloed. Code affects who sees what, or who or what is monitored. It determines how machines communicate in the Internet of Things.[14]
“[The code] will present the greatest threat to both liberal and libertarian ideals, as well as their greatest promise. We can build, or architect, or code cyberspace to protect values that we believe are fundamental. Or we can build, or architect, or code cyberspace to allow those values to disappear. There is no middle ground. There is no choice that does not include some kind of building. Code is never found; it is only ever made, and only ever made by us.” Lessig (2006).[15]
Code is increasingly functioning as law. Through code, private actors increasingly influence global (e-)commerce. Private actors' coded legal arrangements create transnational rules and regulations. For example, with the advent of the digital age, code has progressively established itself as the predominant way to regulate the behavior of internet users and e-commerce. While code cannot yet emulate the ambiguity and flexibility of legal rules written in natural language, blockchain technology and smart contracting enable code to assume an even stronger role in coordinating and regulating people's interactions over the internet.
Given the proliferation of code as law, the democratic legitimacy of coded legal arrangements is increasingly called into question. Through code, private actors have assumed governance tasks that were traditionally the responsibility of democratically elected sovereign states and state actors. Yet, private coded legal arrangements may violate democratically determined social justice objectives and imperfectly reflect the makeup and demographic realities of modern societies. Arguably, the lack of democratic legitimacy of code is less of a concern because coded legal solutions are more restrictive than traditional law. Code forces users to comply with its restrictions ex ante. Users who engage with a given coded platform or smart contract are constrained by the coded environment from the beginning of the relationship; they cannot breach the technical coded rules even before they act. By contrast, traditional legal rules can only be enforced ex post, that is, after the predetermined rules have been violated. While it is possible that global democracy can be enhanced through private actors and their coordination of legal arrangements via code, coded private governance may also circumvent democratically legitimized governments. As such, coded private regulation may compete with national and international law. Thus, the democratic legitimacy of coded private governance remains an ongoing concern.
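The ex-ante character of coded rules can be pictured with a toy sketch (all names and parameters here are hypothetical illustrations, not any real platform's API): a coded constraint is checked before an action executes, so a non-compliant action never occurs at all, in contrast to a legal rule that is enforced only after a violation.

```python
# Toy illustration of ex-ante enforcement by code (hypothetical; not a
# real platform or smart-contract API). The "rule" is checked before
# the action, so a violating transfer simply never happens.

class CodedAccount:
    """An account whose transfer rule is enforced by code, not by courts."""

    def __init__(self, balance: float, transfer_limit: float):
        self.balance = balance
        self.transfer_limit = transfer_limit  # the coded rule

    def transfer(self, amount: float) -> bool:
        # Ex-ante enforcement: the constraint is evaluated *before* the
        # action takes effect, making a breach impossible rather than
        # merely punishable after the fact.
        if amount > self.transfer_limit or amount > self.balance:
            return False  # the non-compliant action never occurs
        self.balance -= amount
        return True

acct = CodedAccount(balance=100.0, transfer_limit=50.0)
print(acct.transfer(80.0))  # False -- blocked before it can happen
print(acct.transfer(40.0))  # True  -- compliant action proceeds
print(acct.balance)         # 60.0
```

The contrast with traditional law is in where the check sits: a court would adjudicate the 80.0 transfer after the fact; the code makes it unexecutable from the outset.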
Decentralized systems can provide coded solutions that help overcome the challenges presented by the lacking democratic legitimacy of code. Such decentralized coded solutions provide an incremental democratization process. For example, it is possible to create a decentralized legal precedent system through weighted directed acyclic graphs.[16] With decentralized precedent comes democratic legitimacy, as the constituents who voted for a code template, e.g., a decentralized legal solution, provide democratic legitimacy in the respective voting pool. As the vote-legitimized templates are increasingly used by others who reference the earlier vote for the template, democratic legitimacy increases incrementally through use as validation of the earlier voting pool result. While these decentralized democratic solutions are not the equivalent of centralized parliamentary democratic legitimacy, they can be seen as early renderings of a decentralized democratic society.
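The incremental-legitimacy mechanism can be pictured as follows (a toy sketch with hypothetical names and a simplified additive score, not the cited system's actual design): each code template is a node in a directed acyclic graph; its legitimacy starts with the votes it received directly and grows as later templates reference, and thereby revalidate, it.

```python
# Toy sketch of precedent as a weighted directed acyclic graph
# (illustrative only; template names and the additive legitimacy score
# are hypothetical). A template's legitimacy = its own direct votes plus
# the legitimacy of every later template that references it.

from collections import defaultdict

class PrecedentDAG:
    def __init__(self):
        self.votes = {}                      # template -> direct votes
        self.references = defaultdict(list)  # template -> later templates citing it

    def add_template(self, name: str, votes: int, references=()):
        self.votes[name] = votes
        for earlier in references:
            # a later template points back at the precedent it reuses
            self.references[earlier].append(name)

    def legitimacy(self, name: str) -> int:
        # Recursion terminates because the reference graph is acyclic:
        # templates can only cite templates that already exist.
        return self.votes[name] + sum(
            self.legitimacy(later) for later in self.references[name]
        )

dag = PrecedentDAG()
dag.add_template("escrow-v1", votes=10)
dag.add_template("escrow-v2", votes=5, references=["escrow-v1"])
dag.add_template("lease-v1", votes=3, references=["escrow-v1"])

print(dag.legitimacy("escrow-v1"))  # 18 -- grows as later templates reuse it
```

The design choice mirrors the argument in the text: legitimacy is not conferred once by a single vote but accretes as each later, independently voted-on template chooses to build on the precedent.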
VII. Territorial Integrity of Nation States
Conceptions of geographical borders for nation states undermine a decentralized organization of society. Nation states and their geographical borders were created by the cultural homogeneity of given tribal structures and their need to distinguish and protect themselves from other tribes. In decentralized forms of organization, tribal homogeneity is reorganized. That reorganization may be incompatible with existing tribal structures and their associated instantiations of government and geographical constraints.
Experimentation with decentralized organizational forms of society is curtailed by the principle of the territorial integrity of nation states. The territorial integrity of nation states is a foundational principle of international law.[17] It suggests that the forced imposition of a border change is an act that justifies war. Experimentation with decentralized forms of societal organization is limited because of the geographical constraints of the nation state and the protection of its territorial integrity. Protecting geographical borders under the principle of territorial integrity necessitates the continuous presence of a modern national defense and military. The presence and perceived threat of nation states' militaries facilitate the ongoing division of natural resources and the production of products and services within a given nation state, as well as the exclusion of others from the distribution of such resources. Society is less likely to experiment with decentralized forms of organization if nation states assert authority over natural resources.
Nation states undermine distributed value creation. As societies find alternative means of production and distributed value creation, nation states' protection of national resources creates obstacles to the expansion of distributed value creation. Distributed value creation depends on the use of networked intelligence and self-organization. As society moves from the industrial-age corporation, with traditional hierarchies tasked with the allocation of traditional physical and financial assets, to increasing self-organization and the creation and distribution of decentralized critical resources via networked intelligence, value creation increasingly transcends the geographical constraints of nation states. Nation states' attempts to assert sovereignty and oversight over distributed value creation are likely to curtail decentralized solutions for distributed value creation. Naturally, governments will try to tax increasing distributed value creation if they can assert national jurisdiction. Exercising this assumed authority inhibits distributed value creation.
Decentralized value creation can effectively coexist with the nation state. Practically, it is still difficult to imagine how society could coordinate its critical resources and physical infrastructure without a nation state and its functionalities and applications for society. Simple coordination of law and order and of physical infrastructure, such as streets and sanitation, among others, necessitates physically present centralized governance for the foreseeable future. In fact, these practical constraints may permanently undermine or significantly delay the comprehensive decentralized coordination of society's resources and distributed value creation. Accordingly, decentralized value creation is more a feature than a bug in the system, one that needs protection from the nation state. The coexistence of decentralized value creation and the nation state may ultimately be a question of degree and coordination. In other words, the nation state could make decentralized value creation a feature of its coordination function for society.
VIII. Dogmatism
Increasing dogmatism in the 21st century undermines the acceptance of decentralization. Society is faced with unprecedented challenges as the complexity of the world increases exponentially in the 21st century. For example, social welfare systems collapse and call for alternative solutions that are not afflicted by political dogma. Unfortunately, humans naturally crave certainty in an environment of exponentially growing complexity and associated uncertainty. As a result, humans often favor dogmatic approaches that create certainty. Dogma, the human belief in an incontrovertible absolute truth, often results in a failure to update one's perspective on a given area of reality. Political dogma is formed by culture, socialization, education, and many other factors. Dogmatic approaches to new problems are unlikely to present the much-needed solutions. Groupthink and other forms of dogma undermine society's collective intelligence and decentralized decision making.
Tools that facilitate output from humanity's collective intelligence and collective decentralized decision making are the most likely candidates to help humanity overcome the core challenges of the 21st century. Without a broad and constantly changing set of tools, humanity will attempt to solve new problems with existing design metrics, which can have catastrophic consequences.[18] Adequate tools require constant updating and adjusting of human assumptions, core beliefs, and their underlying evidence. Constant examination and validation of new data to experiment with and update the underlying model and understanding of the world is essential. As new information becomes available, previous conclusions about the world and its state of affairs have to be continuously updated in an effort to take into account any validated new data. Without constant questioning and corresponding updating, existing tools will inevitably begin to warp and misshape emerging data to force-fit their existing and previously functional models.
Decentralized decision-making tools help transcend traditional economic notions of capitalism and socialism. Conceptually, decentralization uses elements of profit generation and redistribution in a way that in effect combines capitalistic and socialistic ideas. Decentralization allows for the organization of society in economic structures that generate profits while at the same time redistributing resources.[19] Decentralized decision-making tools facilitate self-organization and with it networked intelligence that can grow organically and evolutionarily. As society moves from the industrial age production in centralized entities with traditional hierarchies tasked with the allocation of traditional physical and financial assets to increasing self-organization and the creation and distribution of critical resources via networked intelligence, distributed value creation becomes possible.
Widespread belief in centralized dogmatic approaches that can be traced back in principle to notions of socialism and capitalism removes the possibility of transcending those core notions. Dogmatic notions can often be traced back to conflicting views on socialism and capitalism. Human society and economic systems have revolved around political doctrines for organizing society and allocating resources for the last two centuries. First and foremost among those are the dichotomies of capitalism and socialism, which are seen by their proponents and their progeny as largely mutually exclusive. Society is organized around those notions and has formed power structures and political hierarchies to serve and maintain these ideas.
IX. Regulation
In the early 2020s, government-controlled regulation of the evolving digital asset space was perhaps the leading decentralization neutralizer. The Securities and Exchange Commission, among other regulatory agencies in the United States, concerned themselves with attempts to fit decentralized technology solutions and their digital assets into the existing regulatory infrastructure. Carve-outs and safe harbors were discussed but not seriously considered. The emphasis of regulatory initiatives in the 2020s was on securities tokens and legal ways to trade such tokens in the then existing securities law infrastructure. Government control of the industry was an indispensable aspect of the legal initiatives. Yet, decentralized technology solutions, at their core, negate external control, censorship, and oversight. Accordingly, the then-proposed approaches largely undermined the decentralized technology evolution. Without the ability to experiment in a legally protected environment, decentralized products and technology could not evolve. Consumer protection rightly trumped decentralized product experimentation in the early 2020s. Regulatory arbitrage provided some space to grow for the then emerging decentralized products.
In the early 2020s, decentralized legal infrastructure solutions were almost entirely missing. The incompatibility of decentralized technologies with the then-existing regulatory frameworks suggested that decentralized legal infrastructure solutions should have flourished. Yet, other important decentralized infrastructure products, such as a functional public blockchain, were still missing, which undermined decentralized infrastructure solutions. While some startups had been experimenting with ERC-20 tokens and forms of decentralized arbitration, such solutions lacked sufficient scale, decentralization, anonymity, and autonomy.
The regulatory infrastructure solutions for decentralized products and technology of the early 2020s largely followed the characteristics of the product issuers. Tokens issued by government entities (government coins) were compliant with and followed the legal environment established by the issuing government. Similarly, tokens issued by corporations in a given jurisdiction (corporate coins) were compliant with the legal guidance available at the time of their issuance in the given jurisdiction. More decentralized products and technology solutions typically did not fit into such regulatory solutions in a given jurisdiction. As a result, the more decentralized products, which were more censorship resistant, autonomous, and could not be controlled by regulatory agencies, were largely left in a regulatory vacuum that limited their expansion, reach, and evolution.
The regulation of government coins versus corporate coins versus people coins bifurcated the regulatory infrastructure for decentralized technologies. Government coins and corporate coins were able to develop and evolve with regulatory oversight. Governments, such as the People's Republic of China, were able to promote the use of the technology through the tokens they sponsored and force users and merchants in their jurisdiction to embrace the technology. The government-sponsored use of the technology, through government coins and corporate coins, also enabled a flourishing ecosystem that evolved around the government-sponsored decentralized technology solutions. In the case of corporate coins, just as in other industries such as the oil and gas industry, corporate influences supported and enabled regulatory approaches and solutions they could comply with, support, and control. Such corporate-driven legal solutions operated at the expense of decentralized products and technology with higher levels of decentralization that were not compliant with the centralized legal infrastructure for the cryptocurrency and blockchain industry. People coins and the innovations they created were largely subject to regulatory uncertainty and evolved much more slowly or not at all. As a result, evolving government coins and corporate coins created their own path dependencies and ingrained product deficiencies with suboptimal levels of decentralization.
X. Centralized Algorithmic Automation
Centralized algorithmic automation poses perhaps the greatest threat to decentralization. Centralized algorithmic automation describes the process of artificially intelligent systems taking over core functions in human society. The increasing availability of data and predictive analytics is likely to cause humans to consult centrally organized information in decision making. In the 2020s, algorithms that analyze big data from sensors, IoT devices, and humans started to provide unprecedented guidance to humanity. For example, sensors in medical devices, wearables, and even the basic mobile phone provided more and more data on human health and personal preferences, enabling humans to alter their habits, diet, and daily routines to improve health and live a more productive life.[20] Big data analytics using a variety of data sources, including social media preferences, were able to reliably predict an individual's preferences to a degree that significantly outcompeted the individual's peers and even spouses.[21]
Individual choice, free will, and the diversity of human experiences and actions are increasingly limited by the availability of big data and algorithmic data analytics. For example, the more DNA analytics become available, the less people will make decisions that contradict the analytics. Considerations of survivability and long-term health will determine choices based on the available DNA analytics. This was powerfully illustrated by Angelina Jolie's decision in 2013 to undergo a double mastectomy after her DNA analytics confirmed that she was carrying a mutation of the BRCA1 gene, which is associated with an almost ninety percent probability of developing breast cancer.[22] Similarly, knowing oneself well enough to make important life choices that enhance happiness, such as selecting a life partner or choosing a particular profession, used to involve an individual's experimentation with different preferences, meditation, philosophy, or even psychoanalysis, personality tests, and assessment centers, among others. By contrast, big data analytics can help to quantify the self and facilitate self-knowledge through numbers, as the self is arguably the totality of mathematical patterns that are otherwise too complex for humans to understand.[23] Humans will increasingly see themselves as the amalgamation of biochemical systems. Individuality, as defined in the early 2020s, will morph into an individual's understanding of the statistical significance of biochemical signals in a given setting or environment, not a human's inherent personality. Because the outcomes generated by predictive algorithmic analytics are so superior and avoid the catastrophic consequences of bounded human rationality, self-delusion, and self-deception, humanity is bound to embrace predictive analytics of the self wholeheartedly over time. Rational human choice facilitated by predictive human analytics compresses human experience and choice into a limited range.
Centralized algorithmic automation also has an equity aspect that affects society. Centralized algorithmic automation cements the existing socio-economic status of humans and classes in the societal order. As predictive algorithmic applications proliferate in society, the elites will have enhanced access to and control over their data and predictive analytics. This will enable the elites to make better choices than less well-off citizens. As humans increasingly become the totality of their respective data and data streams, the systems that organize human-generated data improve outcomes for their human users. For example, a health data monitoring system used by a human may screen biomarkers in real time and provide recommendations to its users. The system may observe high blood pressure and elevated cortisol and dopamine levels of a user in a meeting and may triangulate that data with the same data from previous meetings to predict the accuracy of decision making in the meeting. If the new biofeedback suggests a higher likelihood of suboptimal decision making, the user can decide to terminate the meeting and collect more information to make a better decision later. Human users who have access to such systems will make better decisions than users who have no access. The better a system is at analyzing user preferences, interacting with other systems, and making choices for users, the better positioned the user is in society.
Avatars are another good example to illustrate the possible inequities. Users who upload all their personal data, including personal preferences, health records, etc., to an algorithmic organizer that optimizes outcomes for its users can create an avatar of themselves. It is conceivable that the algorithmic organizers of such avatar users will, over time, connect with other users' algorithmic organizers, even without the knowledge of their users, to examine optimization of outcomes for their respective users. The better the avatar of a given user and the better the algorithmic organizer of that avatar, the better the outcomes for the respective user in real life. For example, employers may review the avatar of a user rather than a resume. Assuming that existing wealth remains a predictor of outcomes, the elites are more likely to have the best organized avatars with the best outcomes.
Increasing inequality among humans can also result from centralized algorithmic decision making. As certain data driven applications are becoming increasingly prevalent and used by humans, such applications can become the equivalent of a sovereign as any changes the application makes affects all citizens using the application. For example, a heavily used application such as Google may start to make disparate recommendations to similarly situated users, thus benefiting one user group with better predictions and outcomes over another user group.
Feedback effects between human knowledge and human use of data, and algorithmic utilization of that human input and optimization for further input, call into question the role of humans at large. For example, an algorithm that watches human inputs and the biometrics of human interaction with content can create feedback effects for human consumption of content, creating ever more data for analysis and extrapolation. At the same time, the algorithm can triangulate with big data and real-time data from other users interacting with the same content. As it learns and applies the human interaction and collective examination of the content, the algorithm can extrapolate the collective human psyche, awareness, and collective consciousness over time and in real time. More concretely, future algorithms can observe when and how Kindle readers interact with a certain text passage. The algorithm can observe what happens to humans' hormonal levels, pupil movements, etc., and provide feedback to the reader and the collective on their uses of and thoughts regarding the text passage. In other words, the algorithm can analyze, triangulate, and predict users' collective interaction with knowledge and associated decision making. In essence, the technology can not only predict an individual's interaction with ideas, knowledge, wisdom, and science, but can also analyze, internalize, and predict the collective human experience and all of humanity's prevailing future that pertains to such content.
Given these superior analytical and predictive skills, data and algorithmic oracles may in fact turn into a sovereign. Democratic processes as they existed in the early 2020s may no longer be necessary as the algorithmic solutions may become universally accepted as superior for the greater good of humanity. For example, centralized algorithmic automation can remove human choice from democratic institutions. Democratic elections may not be needed in an environment of centralized algorithmic automation. Because big data and algorithmic data analytics enable centralized algorithmically automated systems (such as Google, Facebook, and Amazon) to know individuals' political preferences better than the individuals themselves, exercising political will by way of human voting becomes redundant. Of course, that vision conflicts with the existing constitutional legal arrangements, which may delay such outcomes. Yet, because the big data and algorithmic data analytics-driven election choice removes humans' bounded rationality,[24] it may more accurately, and arguably more fairly, represent the totality of a given constituency's political will.
The above-mentioned effects of centralized algorithmic automation have the potential to undermine the evolution of decentralized systems. While decentralization offers defense systems against the effects of centralized algorithmic automation,[25] its evolution and societal acceptance may be threatened by the effects of centralized algorithmic automation, big data, and dataism. Centralized algorithmic automation enables a form of abdication of personal responsibility for individuals and society at large. Furthermore, society may get used to the conveniences, certainty, and guidance offered by centralized algorithmic automation. As a result, path dependencies may set in that undermine more innovative, secure, and equitable solutions offered by decentralized systems. An algorithmic future in which most human decisions are predicted or performed by algorithms creates a threat to human individuality. The resulting lack of diversity and unpredictability of human experiences threatens the diversity of nodes in decentralized systems. While centralized algorithmic automation undermines equity, individualism, individual choice, free will, and the diversity of human experiences, decentralized systems thrive when those conditions are protected and proliferate. The freer and more equal humans are, and the more access they have to information and data, the better for decentralized systems. Societal outcomes can be optimized through the efficient decentralized coordination of human inputs.
The possible effects of centralized algorithmic automation and control can perhaps best be illustrated by cases of centralized algorithmic control over uniquely human experiences. For example, in the not too distant future it may be possible to upload one's brain to permanently exist in the cloud of computing systems.[26] In 2020, an American startup called Nectome claimed that through a process called vitrifixation, it is possible to preserve the external and internal structure of human brains.[27] If Nectome should find ways to upload its instantiation of a human brain to the cloud and revive it, it would theoretically be possible to maintain uniquely human characteristics in perpetuity. If the control over a human brain is left to a centrally controlled server that is owned by a technology company, the individuals who uploaded their brains may forever be subject to the centralized entity's discretion, control, and guidance. If the centralized technology companies' business models of the early 2020s are a guide, a human existence in cyberspace would likely be subject to a subscription service and associated fees, requiring humans to pay for server storage and maintenance. In effect, the identity of the person would be encapsulated by the centralized server parameters. It seems almost intuitive that most individuals who wish to upload their identity, personality, and future thoughts in a digital format would prefer a committee of peers, e.g., similarly situated individuals who are connected in a network, to make decisions about their future rather than any centralized power structure. Decentralized networks, such as DAOs, provide that very possibility, if not with regard to hardware requirements, at least with regard to collective decision making and governance of such networks.
XI. Lack of Decentralized Governance Systems
Humanity has been debating governance designs since the inception of human organization. Institutional design and governance are a primary subject of study in law, ethics, economics, political science, computer science, philosophy, sociology, social psychology, cybernetics, and control theory, among others. Entire schools of thought, such as new institutional economics, evolved because of the insufficiency of governance solutions. Common core denominators of existing problems in governance design include the corruptive effects of governance designs that rely on fungible assets as well as the identity of actors in governance designs. These factors, among many others, contributed to the resulting inability to govern institutions effectively.
The lack of decentralized governance solutions affects the application of decentralized systems on multiple levels. First and foremost among those are decentralized networks, the market for digital assets, and decentralized autonomous organizations (DAOs), among others. Decentralized networks depend on dynamic governance. As blockchain protocols evolve in a given market, they require updates. The practice of hardforking, which was still prevalent in the early 2020s, created significant economic loss for such blockchains. Similarly, the evolution of the digital asset market necessitates certainty for market participants. Without standards and governance, certainty and the associated market confidence cannot develop. At the beginning of the 2020s, basic standards for the governance of digital assets were still missing. As a result, the digital asset market stagnated and DeFi (decentralized finance) remained in its infancy. Efforts to provide more legal certainty through regulatory carve-outs[28] were mostly tentative and lacked sufficient regulatory support or voting majorities in the regulatory agencies. DAO governance is a particularly useful example that illustrates the effects of lacking governance on the evolution of decentralized systems. At the beginning of the 2020s, the then-existing DAO governance design structures failed to take into account the historical precedent on governance. Most DAOs utilized centralized forms of master nodes to institute blockchain protocol and DAO upgrades.
Blockchain governance in the early 2020s necessitated chain forks with all their negative effects. Forking for protocol upgrades was common practice in the early 2020s.[29] While forks were sometimes merely used to test a process or upgrade,[30] forking was most often used to implement new characteristics for a digital asset or to create a fundamental protocol change.[31] The bifurcation of nodes in a given decentralized network can lead to significant economic loss, errors, confusion, and bugs. For example, the bifurcation of network nodes can result in the reemergence of the double spend problem that the previous network had overcome. Users running the pre-fork code consider the post-fork code invalid and cannot detect spending on the post-fork chain. Correspondingly, cryptocurrencies spent in a post-fork block could be spent again in a pre-fork block. Similarly, fork-related changes in protocol parameters, such as the block size or the difficulty of the cryptographic puzzle, can result in certain blocks being accepted by the post-fork protocol but rejected by the pre-fork versions of the protocol, which may result in the loss of funds. The economic loss associated with such parallel existence can be quite significant. Finally, the fork that created Bitcoin Cash[32] illustrates the risk of contention and the associated social and political turmoil post fork that necessitates blockchain reorganization. After the Bitcoin Cash fork, the Bitcoin community could not agree on the chain that provided the most survivable protocol. As a result of contention, two blockchains, i.e., Bitcoin and Bitcoin Cash, compete in perpetuity with the resulting social and economic loss for each chain.
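The validity split that drives these fork risks can be sketched in a few lines of Python. This is a hypothetical illustration, not any real client's code; the block-size parameters loosely echo the Bitcoin/Bitcoin Cash split, and the block height and sizes are invented for the example.

```python
# Sketch: how a fork-related parameter change (here, max block size) makes the
# same block valid for post-fork nodes but invalid for pre-fork nodes.
from dataclasses import dataclass

PRE_FORK_MAX_BLOCK_SIZE = 1_000_000   # e.g., Bitcoin's original 1 MB limit
POST_FORK_MAX_BLOCK_SIZE = 8_000_000  # e.g., Bitcoin Cash's enlarged limit

@dataclass
class Block:
    height: int
    size: int  # bytes

def is_valid(block: Block, max_size: int) -> bool:
    """Each node accepts a block only if it satisfies the node's own rules."""
    return block.size <= max_size

# A post-fork block that exceeds the old limit.
big_block = Block(height=478_559, size=1_900_000)

accepted_post_fork = is_valid(big_block, POST_FORK_MAX_BLOCK_SIZE)  # True
accepted_pre_fork = is_valid(big_block, PRE_FORK_MAX_BLOCK_SIZE)    # False

# Because pre-fork nodes reject this block, coins spent in it simply do not
# exist on the pre-fork chain and can be spent again there: the double spend
# problem reemerges across the two chains.
```

The point of the sketch is that neither node version is "wrong": each applies its own consensus rules consistently, and it is precisely this consistency that bifurcates the network.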
DAO governance lacked proper incentive designs at the beginning of the 2020s. Human nature and any effective machine derivatives of human engagement in institutional form require a duality of incentives in order to overcome attempts by rational and opportunistic internal and external constituents to game the governance design of a given DAO. The duality of incentives consists of (1) incentives for actors to improve their own utility, while at the same time (2) ensuring that actors' actions benefit the entirety of the institution and its constituents in the long run. DAO designs at the beginning of the 2020s did not effectively master this duality. Moreover, then-existing DAO designs did not effectively use non-fungible assets to overcome corruptive elements. When fungible assets are used as the dominant incentive design in the governance of DAOs with identifiable actors, rational and opportunistic internal constituents and external participants will typically attempt to corrupt the governance design of the DAO for their own gain. Similarly, the identity of actors in a DAO governance design typically creates corruptive elements. Merit identifiers other than individual identity remove the most corruptive influences. At the beginning of the 2020s, no then-existing DAO design had effectively designed and applied an anonymous merit identifier with non-fungible decentralized assets.
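A minimal sketch of this duality of incentives, assuming a hypothetical non-transferable reputation score that members stake on each contribution and that peers validate (the names, starting score, and payoff ratios are invented for illustration):

```python
# Sketch: non-fungible merit (reputation) staked on contributions, so that
# individual gain (1) is coupled to peer-validated benefit to the group (2).

class Member:
    def __init__(self, name: str):
        self.name = name
        self.reputation = 10.0  # non-fungible: cannot be bought or transferred

def review(member: Member, stake: float, upheld_by_peers: bool) -> None:
    """Member stakes reputation on a contribution; peers validate the outcome."""
    stake = min(stake, member.reputation)  # cannot stake more than one has
    if upheld_by_peers:
        member.reputation += stake * 0.5   # individual gain for good work...
    else:
        member.reputation -= stake         # ...but attempts to game are slashed

alice, bob = Member("alice"), Member("bob")
review(alice, stake=4.0, upheld_by_peers=True)   # alice gains reputation
review(bob, stake=4.0, upheld_by_peers=False)    # bob loses his stake
```

Because the score cannot be purchased with fungible assets, the usual corruption channel (buying influence) is closed, and because the score can stand in for identity, the design can in principle remain pseudonymous.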
On-chain voting mechanisms were still largely in their infancy in the early 2020s. Most on-chain voting mechanisms and governance designs resembled a plutocracy. Suboptimal voting outcomes in existing decentralized protocols were associated with the then-popular one-token-one-vote voting mechanisms. A one-token-one-vote design allocates more power to token holders who hold a significant share of the total supply of a given token. Majority token holders have more power than the rest of the token holders combined. These structures reintroduce many of the downsides and suboptimal incentive allocations of one-share-one-vote designs in legacy systems of the early 21st century.
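The plutocratic tendency of one-token-one-vote can be made concrete by comparing linear voting weight with a square-root (quadratic-style) alternative. The holdings below are invented for illustration, and the square-root rule is only one of several possible dampening designs:

```python
# Sketch: linear vs. square-root voting weight for a token distribution in
# which one "whale" holds 60% of the total supply.
import math

holdings = {"whale": 6_000_000, "a": 1_000_000, "b": 1_000_000,
            "c": 1_000_000, "d": 1_000_000}

linear = {holder: tokens for holder, tokens in holdings.items()}
quadratic = {holder: math.sqrt(tokens) for holder, tokens in holdings.items()}

def share(weights: dict, holder: str) -> float:
    """Fraction of total voting weight controlled by one holder."""
    return weights[holder] / sum(weights.values())

print(f"linear whale share:    {share(linear, 'whale'):.2f}")     # 0.60
print(f"quadratic whale share: {share(quadratic, 'whale'):.2f}")  # 0.38
```

Under one-token-one-vote the whale outvotes all other holders combined; under the square-root rule the same holdings no longer constitute a majority, which is the intuition behind quadratic-style voting proposals.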
XII. Decentralized Infrastructure Requirements
Decentralized infrastructure needs run deep. Emerging decentralized technology is subject to multiple layers of underdeveloped decentralized infrastructure. Even a fully developed and instantiated infrastructure product in one area may collapse without the supporting structures built by decentralized infrastructure products in other, yet inevitably related, areas. The interdependence of infrastructure development in complex decentralized systems and structures is perhaps best illustrated by the need for a truly decentralized public blockchain that is scalable and fully secure. Without it, experimentation with workable infrastructure products requires stand-alone, siloed approaches that are doomed ab initio, as they can rarely interoperate with other siloed infrastructure products. Interrelated infrastructure products are the key to decentralized technology's foundational and transformative impact. Such interrelated decentralized infrastructure requirements are comparable to the infrastructure requirements that enabled electricity to take hold in business and society. In light of such infrastructure requirements, emerging decentralized technology is a foundational technology whose transformational impact on society takes decades rather than years.
Usability is perhaps the most important decentralized technology infrastructure requirement of the early 2020s. Users cannot be expected to have specialized technology knowledge. Rather, users should be able to use decentralized technologies intuitively. Alas, in the early 2020s, decentralized technologies required users to know about their own security, self-custody of digital assets, and overall system operations. Without intuitive graphical user interfaces, decentralized systems may not be able to reach mainstream consumer adoption. If users have to discern and manage public and private keys to wallets, among other concerns, it will be very difficult to educate the public sufficiently to seamlessly adopt decentralized protocols in their daily lives.
In the early 2020s, core infrastructure products that had to be developed included: (1) truly decentralized consensus combined with higher levels of transaction throughput (the mining required to provide scarcity also slowed block creation, so that high-speed transactions could not be achieved for the foreseeable future); (2) evolutionary governance designs for blockchains that overcome the need for hardforking; (3) decentralized underwriting protocols that enable democratized access to insurance; (4) verification protocols for smart contracting; and (5) stable cryptocurrencies that maintain their value and can be used by consumers in daily transactions without the need to make an investment decision for every transaction, among other decentralized infrastructure needs. In other words, if consumers are required to consider the investment implications of using a digital asset for a consumer transaction, they are less likely to opt into digital assets for consumer transactions.
Finally, a core requirement for the emerging decentralized infrastructure of the early 2020s that is often overlooked by technologists pertains to the need for a decentralized human backstop in decentralized algorithmic systems. Without a decentralized human backstop to code, the immutability of the blockchain and its cryptographic security systems may not be able to create true transactional guarantees and trust between principals and agents in the integrity of their contractual relationship.
Bibliography
Buterin, Vitalik & Weyl, Glen (May 21, 2018). Liberation Through Radical Decentralization. Medium: Cryptocurrency, https://medium.com/@VitalikButerin/liberation-through-radical-decentralization-22fc4bedc2ac;
BitcoinCash, www.bitcoincash.com (accessed June 1, 2020)
Bitshares (n.d.). Delegated Proof of Stake (DPOS), https://how.bitshares.works/en/master/technology/dpos.html.
Bohl, Michael A., Martirosyan, Nikolay L., Killeen, Zachary W., Belykh, Evgenii, Zabramski, Joseph M., Spetzler, Robert F., Preul, Mark C. (March 2019). The History of Therapeutic Hypothermia and Its Use in Neurosurgery. Journal of Neurosurgery, 130, 1006–1020.
Bonneau, Joseph, Miller, Andrew, Clark, Jeremy, Narayanan, Arvind, Kroll, Joshua A., & Felten, Edward W. (2015). SoK: Research Perspectives and Challenges for Bitcoin and Cryptocurrencies, in 2015 IEEE Symposium on Security and Privacy, https://ieeexplore.ieee.org/document/7163021
Buterin, Vitalik (Aug 27, 2017). Incentives in Casper the Friendly Finality Gadget. Ethereum Foundation, https://github.com/ethereum/research/blob/master/papers/casper-economics/casper_economics_basic.pdf (accessed Dec. 7, 2017).
Calcaterra, Craig (May 24, 2018). On-Chain Governance of Decentralized Autonomous Organizations: Blockchain Organization Using Semada [Unpublished article], https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3188374 (accessed June 1, 2020).
Calcaterra, Craig & Kaal, Wulf A. (Jan. 18, 2018). Secure Proof of Stake Protocol [Unpublished Working Paper №18–10], https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3125827 (accessed June 1, 2020).
Dormehl, Luke (2015). The Formula: How Algorithms Solve All Our Problems…and Create More. WH Allen: London.
Ethereum (May 28, 2020). Developer Resources, https://ethereum.org/developers/ (accessed June 1, 2020).
Fried, Limor “Ladyada” (April 19, 2018) All the (Internet of) Things, Codecademy, https://www.codecademy.com/articles/all-the-internet-of-things (accessed June 1, 2020).
FS Blog (n.d.). Complexity Bias: Why We Prefer Complicated to Simple. FS.blog, https://fs.blog/2018/01/complexity-bias/.
In re Appraisal of Dell Inc., CV 9322-VCL, 2015 WL 4313206 (Del. Ch. July 13, 2015), as revised (July 30, 2015).
Jolie, Angelina (May 14, 2013). My Medical Choice. New York Times, A.25.
Kwon, Yujin, Liu, Jian, Kim, Minjeong, Song, Dawn, & Kim, Yongdae (2019). Impossibility of Full Decentralization in Permissionless Blockchains. Association for Computing Machinery, https://webee.technion.ac.il/people/ittay/aft19/aft19-final31.pdf.
Lessig, Lawrence (2006). Code. Basic Books: New York.
Mizrahi, Eli M., Patel, Vasishta M., Crawford, E. Stanley, Coselli, Joseph S., Hess, Kenneth R. (1989). Hypothermic-induced Electrocerebral Silence, Prolonged Circulatory Arrest, and Cerebral Protection During Cardiovascular Surgery. Electroencephalography and Clinical Neurophysiology, 72:1, 81–85.
Monero, https://www.getmonero.org/ (accessed Aug. 19, 2019).
Munger, Charlie (1994). A Lesson on Elementary Worldly Wisdom [Speech to USC Business School], transcript available at https://fs.blog/great-talks/a-lesson-on-worldly-wisdom/ (accessed June 1, 2020).
Nadella, Satya (June. 4, 2018). Microsoft + GitHub = Empowering Developers. Official Microsoft Blog, https://blogs.microsoft.com/blog/2018/06/04/microsoft-github-empowering-developers/ (accessed June 1, 2020).
Partz, Helen (Feb. 6, 2020). SEC’s Cryptomom Proposes Safe Harbor Framework for Token Projects. Cointelegraph, https://cointelegraph.com/news/secs-cryptomom-proposes-safe-harbor-framework-for-token-projects?fbclid=IwAR3lGU0aXyoXj0mgetDg2s1RrfFtmoGWyBhXFNqZjaD_uwPzYSp8nXpWvvM.
Percy, Andrew, Widman, Shannon, Rizzo, John A., Tranquilli, Maryann, Elefteriades, John A. (2009). Deep Hypothermic Circulatory Arrest in Patients with High Cognitive Needs: Full Preservation of Cognitive Abilities. The Annals of Thoracic Surgery, 87:1, 117–123.
Pinkstone, Joe (Feb. 19, 2018). Could You Live Forever? Humans Will Achieve Immortality Using AI and Genetic Engineering by 2050, Expert Claims. DailyMail.com, https://www.dailymail.co.uk/sciencetech/article-5408425/Human-beings-achieve-immortality-2050.html?fbclid=IwAR2NEKgeZMOlyAYoB3_NxQG4HftNTPsAtwFvCA9PglhHp7WjAbn759bqkF4 (accessed June 1, 2020).
Quantified Self, https://quantifiedself.com/ (accessed June 1, 2020).
Raichle, Marcus E. (1983). The Pathophysiology of Brain Ischemia. Annals of Neurology, 13:1, 2–10.
Taleb, Nassim Nicholas (2012). Antifragile: Things That Gain from Disorder. Random House Trade Paperbacks: New York, 349.
TechTarget: IoT Agenda (n.d.). Internet of Things (IoT), https://internetofthingsagenda.techtarget.com/definition/Internet-of-Things-IoT (accessed Aug. 28, 2018).
Wikipedia, List of Countries by System of Government, https://en.m.wikipedia.org/wiki/List_of_countries_by_system_of_government (accessed June 1, 2020).
Wu, Xianren, Drabek, Tomas, Tisherman, Samuel A. (2007). Emergency Preservation and Resuscitation with Profound Hypothermia, Oxygen, and Glucose Allows Reliable Neurological Recovery after 3 h of Cardiac Arrest from Rapid Exsanguination in Dogs. Journal of Cerebral Blood Flow & Metabolism, 28:2, 302–311.
Youyou, Wu, Kosinski, Michal, & Stillwell, David (2015). Computer-Based Personality Judgments Are More Accurate Than Those Made by Humans. PNAS, 112:4, 1036–1040.
** Professor of Law, University of St. Thomas School of Law (Minneapolis, USA). The author is grateful for ongoing discussions about decentralized technology solutions with Craig Calcaterra. The author is also grateful for outstanding research assistance from Hayley Howe and research librarian assistance from Nicole Kinn. The author did not receive any form of financial incentive for authoring this article.
[1] The Apache tribe, for example, maintained a decentralized tribal order through most of its history.
[2] Buterin, Vitalik & Weyl, Glen (May 21, 2018). Liberation Through Radical Decentralization. Medium: Cryptocurrency, https://medium.com/@VitalikButerin/liberation-through-radical-decentralization-22fc4bedc2ac; Bonneau, Joseph, Miller, Andrew, Clark, Jeremy, Narayanan, Arvind, Kroll, Joshua A., & Felten, Edward W. (2015). SoK: Research Perspectives and Challenges for Bitcoin and Cryptocurrencies, in 2015 IEEE Symposium on Security and Privacy, https://ieeexplore.ieee.org/document/7163021; Kwon, Yujin, Liu, Jian, Kim, Minjeong, Song, Dawn, & Kim, Yongdae (2019). Impossibility of Full Decentralization in Permissionless Blockchains. Association for Computer Machinery, https://webee.technion.ac.il/people/ittay/aft19/aft19-final31.pdf.
[3] See further [___].
[4] In re Appraisal of Dell Inc., CV 9322-VCL, 2015 WL 4313206 (Del. Ch. July 13, 2015), as revised (July 30, 2015).
[5] See [__]
[6] Bitshares (n.d.). Delegated Proof of Stake (DPOS), https://how.bitshares.works/en/master/technology/dpos.html.
[7] Buterin, Vitalik (Aug. 27, 2017). Incentives in Casper the Friendly Finality Gadget. Ethereum Foundation, https://github.com/ethereum/research/blob/master/papers/casper-economics/casper_economics_basic.pdf (accessed Dec. 7, 2017).
[8] FS Blog (n.d.). Complexity Bias: Why We Prefer Complicated to Simple. FS.blog, https://fs.blog/2018/01/complexity-bias/.
[9] See [____].
[10] Taleb, Nassim Nicholas (2012). Antifragile: Things That Gain from Disorder. Random House Trade Paperbacks: New York, 349.
[11] A noteworthy exception: Calcaterra, Craig & Kaal, Wulf A. (Jan. 18, 2018). Secure Proof of Stake Protocol [Unpublished Working Paper №18–10], https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3125827 (accessed June 1, 2020).
[12] Wikipedia, List of Countries by System of Government, https://en.m.wikipedia.org/wiki/List_of_countries_by_system_of_government (accessed June 1, 2020).
[13] Nadella, Satya (June. 4, 2018). Microsoft + GitHub = Empowering Developers. Official Microsoft Blog, https://blogs.microsoft.com/blog/2018/06/04/microsoft-github-empowering-developers/ (accessed June 1, 2020).
[14] TechTarget: IoT Agenda (n.d.) Internet of Things (IoT), https://internetofthingsagenda.techtarget.com/definition/Internet-of-Things-IoT (accessed Aug. 28, 2018); Fried, Limor “Ladyada” (April 19, 2018) All the (Internet of) Things, Codecademy, https://www.codecademy.com/articles/all-the-internet-of-things (accessed June 1, 2020).
[15] Lessig, Lawrence (2006). Code. Basic Books: New York.
[16] Calcaterra, Craig (May 24, 2018). On-Chain Governance of Decentralized Autonomous Organizations: Blockchain Organization Using Semada [Unpublished article], https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3188374 (accessed June 1, 2020).
[17] It is enshrined in Article 2(4) of the UN Charter and has been recognized as customary international law.
[18] “To the man with only a hammer, every problem looks like a nail.” Munger, Charlie (1994). A Lesson on Elementary Worldly Wisdom [Speech to USC Business School], transcript available at https://fs.blog/great-talks/a-lesson-on-worldly-wisdom/ (accessed June 1, 2020).
[19] See further [___].
[20] Dormehl, Luke (2015). The Formula: How Algorithms Solve All Our Problems…and Create More. WH Allen: London.
[21] Youyou, Wu, Kosinski, Michal, & Stillwell, David (2015). Computer-Based Personality Judgments Are More Accurate Than Those Made by Humans. PNAS, 112:4, 1036–1040.
[22] Jolie, Angelina (May 14, 2013). My Medical Choice. New York Times, A.25.
[23] Quantified Self, https://quantifiedself.com/ (accessed June 1, 2020).
[24] Kahneman’s cold water experiments suggest that humans listen to a narrating self in political decision making, follow a peak-end rule, forget the vast majority of political events during a given legislation period, focus exclusively on a few extreme outlier events, and give largely disproportionate weight in political decision making to the most recent events. See [___] 4, footnote 11.
[25] See further [___], Part [___].
[26] Pinkstone, Joe (Feb. 19, 2018). Could You Live Forever? Humans Will Achieve Immortality Using AI and Genetic Engineering by 2050, Expert Claims. DailyMail.com, https://www.dailymail.co.uk/sciencetech/article-5408425/Human-beings-achieve-immortality-2050.html?fbclid=IwAR2NEKgeZMOlyAYoB3_NxQG4HftNTPsAtwFvCA9PglhHp7WjAbn759bqkF4 (accessed June 1, 2020).
[27] Bohl, Michael A., Martirosyan, Nikolay L., Killeen, Zachary W., Belykh, Evgenii, Zabramski, Joseph M., Spetzler, Robert F., Preul, Mark C. (March 2019). The History of Therapeutic Hypothermia and Its Use in Neurosurgery. Journal of Neurosurgery, 130, 1006–1020; Mizrahi, Eli M., Patel, Vasishta M., Crawford, E. Stanley, Coselli, Joseph S., Hess, Kenneth R. (1989). Hypothermic-induced Electrocerebral Silence, Prolonged Circulatory Arrest, and Cerebral Protection During Cardiovascular Surgery. Electroencephalography and Clinical Neurophysiology, 72:1, 81–85; Percy, Andrew, Widman, Shannon, Rizzo, John A., Tranquilli, Maryann, Elefteriades, John A. (2009). Deep Hypothermic Circulatory Arrest in Patients with High Cognitive Needs: Full Preservation of Cognitive Abilities. The Annals of Thoracic Surgery, 87:1, 117–123; Raichle, Marcus E. (1983). The Pathophysiology of Brain Ischemia. Annals of Neurology, 13:1, 2–10; Wu, Xianren, Drabek, Tomas, Tisherman, Samuel A. (2007). Emergency Preservation and Resuscitation with Profound Hypothermia, Oxygen, and Glucose Allows Reliable Neurological Recovery after 3 h of Cardiac Arrest from Rapid Exsanguination in Dogs. Journal of Cerebral Blood Flow & Metabolism, 28:2, 302–311.
[28] Partz, Helen (Feb. 6, 2020). SEC’s Cryptomom Proposes Safe Harbor Framework for Token Projects. Cointelegraph, https://cointelegraph.com/news/secs-cryptomom-proposes-safe-harbor-framework-for-token-projects?fbclid=IwAR3lGU0aXyoXj0mgetDg2s1RrfFtmoGWyBhXFNqZjaD_uwPzYSp8nXpWvvM.
[29] Ethereum (May 28, 2020). Developer Resources, https://ethereum.org/developers/ (accessed June 1, 2020).
[30] (“An example of a hard fork that implemented a new feature is Monero. Monero underwent a planned hard fork in January 2017 to introduce Ring Confidential Transactions (RCT). RCT improved security as well as privacy for Monero users.”) Monero, https://www.getmonero.org/ (accessed Aug. 19, 2019).
[31] In the early 2020s, forks could be distinguished into hard forks and soft forks. Generally speaking, a fork results in previously uniform nodes that, after the fork, run different protocol rules and hold different data than nodes on the newer version. A soft fork is a backward-compatible protocol upgrade: it introduces a new, more restrictive protocol rule, e.g. a block size limit of 1 MB is lowered by the soft fork to 500 KB. Because the new rule is more restrictive rather than more expansive, blocks valid under it remain valid under the previous rule, so the upgrade is backward compatible. By contrast, a hard fork creates a more expansive ruleset and is therefore not backward compatible. For instance, an increase of the block size limit from 1 MB per block to 2 MB per block would not be backward compatible, because the previous protocol limited the block size to 1 MB and non-upgraded nodes would reject the larger blocks.
[32] BitcoinCash, www.bitcoincash.com (accessed June 1, 2020).
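The soft/hard fork distinction drawn in note 31 can be sketched in a few lines of code. The following is a purely illustrative sketch of the block-size example; all function names and the exact limits are hypothetical and do not correspond to any real client's API:

```python
# Illustrative sketch of backward compatibility in soft vs. hard forks,
# using the block-size example from note 31. All names are hypothetical.

OLD_LIMIT = 1_000_000  # original consensus rule: blocks up to 1 MB


def old_node_accepts(block_size: int) -> bool:
    """Validation rule run by nodes that never upgraded."""
    return block_size <= OLD_LIMIT


def soft_fork_accepts(block_size: int) -> bool:
    """Soft fork: tighten the limit to 500 KB (more restrictive)."""
    return block_size <= 500_000


def hard_fork_accepts(block_size: int) -> bool:
    """Hard fork: raise the limit to 2 MB (more expansive)."""
    return block_size <= 2_000_000


# Every block valid under the soft fork is also valid to old nodes,
# so upgraded and non-upgraded nodes stay on one chain (backward compatible).
assert all(old_node_accepts(s)
           for s in range(0, 500_001, 100_000)
           if soft_fork_accepts(s))

# A 1.5 MB block is valid under the hard fork but rejected by old nodes,
# so the network splits into two chains (not backward compatible).
big_block = 1_500_000
print(hard_fork_accepts(big_block), old_node_accepts(big_block))  # True False
```

The sketch makes the asymmetry concrete: tightening a rule leaves old validators satisfied, while loosening it produces blocks that old validators must reject, which is exactly why a hard fork such as Bitcoin Cash (note 32) results in a permanent chain split.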