How the Computational Capacity of Economies Explains Income


Most people think that information and computation are new things, when in fact they are as old as the Big Bang. In the beginning, there was the bit, as my MIT colleague Seth Lloyd likes to say. Only recently, however, have we learned to see the bits embodied in atoms, cells, society and the economy.

But what is information? Colloquially, people think of information as the messages we use to communicate the state of a system. But information, which is not the same as meaning, also includes the order embodied in these systems, not just the messages we use to describe them. Think of the order you destroy when you crash a car. A car crash does not destroy atoms -- it destroys the way in which those atoms are arranged. That change in order is, technically, a change in information.

Computation, on the other hand, is the use of energy to process information. It is the fundamental mechanism by which nature rearranges bits to produce order. Computation is everywhere, but in an economic context we can think of it as a more modern and more accurate interpretation of the idea of labor advanced originally by Adam Smith and Karl Marx.

Smith and Marx did not know about information or computation, so they described economies using the language of energy that dominated the nineteenth-century zeitgeist. The mechanical protagonists of the industrial revolution were machines that transformed heat into motion: engines for pumps, trains and cranes. These machines awed the nineteenth-century masses with their power -- masses that failed to see that what these machines were really doing was increasing humanity's ability to process information.

Processing information is the essence of all economic activities. It is not the privilege of the coder or the writer but what we do when we bake a cake, make a sandwich or manufacture a car. We compute when we take out the trash, do laundry or pair socks. All of these acts involve using energy to produce order -- whether we are grouping undesirable objects in a trashcan or using a washing machine to remove dirt from our shirts. All jobs are acts of computation, and the economy is a collective computer that involves all of us.

In Why Information Grows, I dedicate sixty thousand words to describing the nuances of our economic computers: the types of information we produce and the social networks we create to produce it. The most obvious punchline of the book is simple: economies are computers, and the capacity of an economy to generate income is a side effect of that economy's computational capacity. This helps explain international differences in income, as it implies that economies that embody vast computational capacities, such as those of South Korea, Taiwan or Silicon Valley, should be richer than those that struggle to compute, like the economies of many Latin American and Sub-Saharan African countries. More interestingly, it tells us that economies with a computational capacity that is larger than what we would expect given their income, such as those of China or India, grow faster than those with an income that is too high for their capacity to compute, like that of Greece.

Of course, you might be asking: how can we measure the computational capacity of economies? The answer is non-obvious but simple: we can measure it by looking at the types of products an economy makes. Powerful economic computers can run "sophisticated programs" that few other economies can run, like those required to produce aircraft engines or new pharmaceuticals. Simple economies can only run simple programs that are more ubiquitous -- and hence they often integrate into the global economy by exporting either the energy we use to run our collective computers or the atoms we use to embody our ideas.
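To make this measurement concrete, here is a minimal sketch, assuming a toy country-by-product export matrix. The countries, products and values are hypothetical, and the calculation is a simplified illustration in the spirit of economic-complexity measures, not the book's exact method. It computes two simple quantities: the diversity of each economy's export basket and the average ubiquity of the products in it. Diversified economies that export low-ubiquity products are the ones running the "sophisticated programs" described above.

```python
# Toy sketch: reading an economy's computational capacity off its products.
# M[c, p] = 1 if country c exports product p competitively.
# All names and values here are hypothetical.
import numpy as np

countries = ["Alpha", "Beta", "Gamma"]
products = ["aircraft engines", "pharmaceuticals", "garments", "crude oil"]

M = np.array([
    [1, 1, 1, 1],   # Alpha: a diversified economy that also makes rare products
    [0, 0, 1, 1],   # Beta: only common products
    [0, 0, 0, 1],   # Gamma: exports a single ubiquitous commodity
], dtype=float)

diversity = M.sum(axis=1)   # how many products each country makes
ubiquity = M.sum(axis=0)    # how many countries make each product

# Average ubiquity of each country's export basket: low values mean the
# country makes things that few other economies can also make.
avg_ubiquity = (M @ ubiquity) / diversity

for name, d, u in zip(countries, diversity, avg_ubiquity):
    print(f"{name}: diversity = {d:.0f}, average ubiquity of exports = {u:.2f}")
```

On this toy data, Alpha combines high diversity with low average ubiquity -- the signature of a powerful economic computer -- while Gamma shows the opposite pattern.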

But by understanding economies as computers we get much more than a predictive theory of future economic growth. We also get a framework that helps us incorporate institutional factors and historical processes into our descriptions of economic systems. After all, economies are computers embodied in social networks, implying that the mechanisms that limit our ability to create social networks will also affect our economies' computational capacities.

One example of this is the economic importance of trust. Building on the work of sociologists like Mark Granovetter and political scientists like Francis Fukuyama and Robert Putnam, I argue that societies with different levels of trust create networks of different sizes, which in turn gravitate towards different industrial sectors. Low-trust societies are societies where links are expensive; they therefore create relatively small networks that gravitate towards simpler industries, such as agriculture, mining and retail. High-trust societies can form social and professional links more easily and hence are better at creating the networks needed to embody the computational capacities required for complex productive activities, such as the manufacture of machinery or the discovery of pharmaceuticals.

Of course, trust is a social institution that evolves through a slow, path-dependent process shaped by historical factors. But the lesson we are looking to extract here is that these historical factors -- whatever they are -- affect the computational capacity of economies because they affect the structure and the size of the networks that embody an economy's computation. Ultimately, these differences in institutions, networks and computational capacities are expressed in the mix of products that different economies are able to make.

But there are other insights we gain by thinking of economies in the language of information and computation -- insights that are less technical but more poetic, and equally accurate. When we realize that all products are made of information, we learn that our world is not only tangible but also made of fiction -- literally, not metaphorically. The majority of the products we use, from the shoes we wear to the homes we live in, are objects that started as fiction, imagined before they were built. The physical order, or information, that we accumulate in our economy is not quite the same as the order produced by physical processes and biological systems, since it is order that originates as mental computations that we then re-embody in objects. As a species, we do not use information only to communicate messages; we also use it to create objects that endow us with fantastic capacities. We are the only species that does this.

My favorite example of augmentation is the story of Hugh Herr. Hugh is also a professor at the MIT Media Lab, but unlike mine, his research direction was defined at an early age, when he lost both of his legs on Mount Washington. Hugh did not lose his ability to walk, however: he now walks on robotic legs he created with his team at MIT. What is poetic about Hugh's accomplishment is that he is not just walking on robotic limbs but on solidified pieces of his own imagination.

So on the surface, Why Information Grows is a book about economic development interpreted through the lens of information and computation. Below the surface, however, it is not about economies but about the processes that enable and limit the growth of information on our planet.

To explain this deeper punchline, let us set economies aside for a second and think of biological cells. Cells are extremely successful biological computers. Yet they are also extremely limited in their computational capacities when we compare them to humans. Why? Because cells, just like all finite systems, embody a finite computational capacity -- one much smaller than ours. Yet despite their limitations, cells did not become the final stop in our planet's ability to generate information and computation. How did cells transcend their inherent limitation? By creating social structures that allowed them to distribute computation. Cells achieved multicellularity, and by discovering multicellularity, they were able to transcend their limited computational capacities.

And humans are like cells in that we embody a finite computational capacity. I call this finite capacity one personbyte. It implies that humans can only carry out complex computational processes, like those required to manufacture aircraft or to build telecommunication networks, by distributing computation across networks of humans. Ultimately, these networks of humans are the blessing that helps us generate new and complex forms of information. But because embodying computation in a network of humans is hard, these networks are also what limits the growth of computation and information in economies.

So below the surface, Why Information Grows is about the universal processes that both enable and limit the growth of information on our planet. These processes transcend the traditional distinctions between the natural and social sciences, as they focus on the universal need to re-embody computation in order to transcend the limitations we all share -- from cells to humans, and from humans to economies.
