It was 1964. RAND researcher Paul Baran began thinking about the optimal structure of the Internet. He envisioned a network of unmanned nodes that would act as switches, routing information from node to node to its final destination. Baran suggested there were three possible architectures for such a network: centralized, decentralized, and distributed.
While he felt the first two, centralized and decentralized, were vulnerable to attack, the third, a distributed or mesh-like structure, would be more resilient. His task of designing a “survivable” communications system led him to suggest that the Internet's design should have a distributed architecture.
Distributed networks are more resilient
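Baran's intuition can be illustrated with a toy simulation. This sketch is entirely my own, not from the book; the topologies and node counts are invented for illustration. It removes a single node from a centralized star and from a distributed mesh, then counts how many pairs of surviving nodes can still communicate.

```python
# Toy illustration of Baran's point: a distributed mesh survives a node
# failure that completely cripples a centralized star. (Hypothetical
# example, not from the book.)
from collections import deque

def connected_pairs(adj, removed):
    """Count pairs of surviving nodes that can still reach each other."""
    alive = [n for n in adj if n not in removed]
    pairs, seen = 0, set()
    for start in alive:
        if start in seen:
            continue
        # BFS to collect the connected component containing `start`.
        comp, queue = {start}, deque([start])
        while queue:
            node = queue.popleft()
            for nbr in adj[node]:
                if nbr in removed or nbr in comp:
                    continue
                comp.add(nbr)
                queue.append(nbr)
        seen |= comp
        k = len(comp)
        pairs += k * (k - 1) // 2
    return pairs

# Centralized: hub 0 connects to everyone else.
star = {0: [1, 2, 3, 4, 5]}
for i in range(1, 6):
    star[i] = [0]

# Distributed: a ring of six nodes with extra cross-links (a small mesh).
mesh = {i: [(i - 1) % 6, (i + 1) % 6, (i + 3) % 6] for i in range(6)}

print(connected_pairs(star, {0}))  # 0: with the hub gone, no one can talk
print(connected_pairs(mesh, {0}))  # 10: all 5 survivors still connected
```

Knock out the star's hub and every remaining node is stranded; knock out any single mesh node and the other five remain fully connected, which is exactly the "survivability" Baran was after.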
But because the analog communication system of the time could not handle messages broken down into “small packets of uniform size capable of traveling independently of one another along the network,” says Albert-László Barabási in Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life, his proposal was rejected.
AT&T's Jack Osterman quashed Baran's vision: “First, it can't possibly work, and if it did, damn if we are going to allow the creation of a competition to ourselves.” AT&T held the communications monopoly at the time.
Years later, the Advanced Research Projects Agency (ARPA), President Eisenhower's answer to the Soviet Union's launch of Sputnik, came up with the same vision independently of Baran's proposal. “By that time,” says Albert-László Barabási, “the Internet was well along its course of development.”
The Internet was designed as a network of routers that communicate with each other through protocols. Once it was out there it took on a life of its own, and it now looks more like an ecosystem, as in Baran's original vision, than a centrally driven structure.
The World Wide Web followed a development similar to the Internet's. Both basic infrastructures are still in place from their origins, even though neither was prepared for the myriad tools and services designed on top of them. Development happened before the design was fully in place, which is why many applications do not run as smoothly as they could have, says Barabási.
On February 7, 2000, a massive distributed denial-of-service (DDoS) attack crippled Yahoo.com and the leading online commerce sites. It was the work of a Canadian teenager. How did he do it? He took over the computers of universities and small companies that were vulnerable to attack and instructed them to bombard Yahoo with messages.
As recent headlines demonstrate, many organizations are still quite vulnerable to attacks. “Parasitic computing exploits [the Internet's] setup by forcing computers to perform computation at the command of a master host by merely engaging the computer in communication,” says Barabási.
But the nature of the Web is also fragmented. There are up to nineteen degrees of separation between the billions of documents online. For technical reasons, the links of the Web are directed; that is, they allow us to travel only in one direction. We cannot make round trips.
How information travels
Because its links are directed, the Web does not form one single homogeneous network. Instead, its documents break into four “continents”: a central core, an IN continent, an OUT continent, and the tendrils and islands disconnected from both. The IN and OUT continents are each about as large as the core. From IN we can reach the core but cannot travel back; from the core we can reach OUT easily but cannot return. OUT is populated by corporate sites: easy to reach, hard to get out of.
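The one-way traffic between these continents can be sketched with a toy directed graph. The page names below are invented for illustration; the point is only that following directed links lets you travel from IN into the core and from the core into OUT, but never back.

```python
# Minimal sketch (page names invented) of the Web's directed continents:
# IN reaches the core, the core reaches OUT, but directed links make the
# return trips impossible.
def reachable(edges, start):
    """All pages reachable from `start` by following directed links."""
    seen, stack = set(), [start]
    while stack:
        page = stack.pop()
        if page in seen:
            continue
        seen.add(page)
        stack.extend(edges.get(page, []))
    return seen

# A toy bow-tie: an IN page links into the core, core pages link to each
# other and out to an OUT page, and the OUT page links nowhere back.
edges = {
    "in_page":  ["core_a"],
    "core_a":   ["core_b", "out_page"],
    "core_b":   ["core_a"],
    "out_page": [],
}

print("core_a" in reachable(edges, "in_page"))   # True: IN reaches the core
print("out_page" in reachable(edges, "core_a"))  # True: core reaches OUT
print("core_a" in reachable(edges, "out_page"))  # False: no round trip
```

The asymmetry is entirely a product of link direction: the same four pages with undirected links would form one fully connected component.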
Technology applications have expanded well beyond the Web, yet many of the tools we use today still connect us to it as they link us to each other. The planet itself is becoming a vast computer made of billions of interconnected processors and sensors.
This raises the question many have been asking: as it continues to emulate real organisms, will this structure we think of as a computer become self-aware?
It's an interesting question given that most networks, from social networks to protein interaction networks, are undirected, yet the Web, like the food web, has directed links. Centrally planned models work well in predictable environments.
What makes an environment unpredictable? When the gap between what we know and what we need to learn to get where we want to be is large, it introduces complexity into the system. Take, for example, life in its most elemental components: molecules.
From the sequencing of the human genome we know the string of molecules that makes up our DNA. Molecular biology reduces living things to their smallest parts. But from early experience in both medicine and biology, we know that “the behavior of living systems can seldom be reduced to their molecular components,” says Barabási.
Instead, our bets need to focus on how different genes work together. How do messages travel within cells? What are the effects of potential reactions on the rest? To unravel the complexity of living organisms, biology needs to understand networks.
Value of network structure
In Linked, Albert-László Barabási says the structure of corporations is that of a tree: the CEO is the root, and the rest of the organization spreads from there through increasing levels of specialization and decreasing levels of responsibility.
Although the tree is a good metaphor for growing an idea into a sustainable system, the tree model as structure works best in a world of mass production, where information is (mostly) directed and each tree is about the growth and full development of one idea. When we shift our business model to growing many ideas and relying on information flowing both ways, many challenges arise.
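The cost of the tree structure shows up in how far a message has to travel. The sketch below is my own hypothetical example, not from the book: in a strict hierarchy, a message between two specialists in different branches must climb toward the CEO and back down, while a single informal peer link collapses that trip.

```python
# Hypothetical org chart: in a strict tree, information between two
# specialists routes up through the hierarchy; one informal peer link
# (a network shortcut) cuts the path dramatically.
from collections import deque

def hops(adj, src, dst):
    """Shortest number of links between two people (breadth-first search)."""
    dist, queue = {src: 0}, deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            return dist[node]
        for nbr in adj[node]:
            if nbr not in dist:
                dist[nbr] = dist[node] + 1
                queue.append(nbr)
    return None

# A small corporate tree: CEO -> two VPs -> two specialists each.
tree = {
    "ceo": ["vp1", "vp2"],
    "vp1": ["ceo", "s1", "s2"],
    "vp2": ["ceo", "s3", "s4"],
    "s1":  ["vp1"], "s2": ["vp1"],
    "s3":  ["vp2"], "s4": ["vp2"],
}

print(hops(tree, "s1", "s3"))  # 4: up through the CEO and back down

# Add one informal peer link between the two specialists.
networked = {k: list(v) for k, v in tree.items()}
networked["s1"].append("s3")
networked["s3"].append("s1")
print(hops(networked, "s1", "s3"))  # 1: direct contact
```

This is the structural case for the "informal networks" discussed below: a handful of cross-branch links turns a rigid tree into something closer to the resilient mesh Baran described.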
First among them is the illusion that the very structure that has taken the organization to where it is will continue to deliver the same results. The evolution of competitive pressure and market maturity has created a dynamic environment, which in turn delivers shocks to the rigid infrastructure that tries to serve it.
Successful organizations operate as close to networks as feasible based on entity type and size. In some cases, they let what other companies would consider “informal” networks emerge. Then based on their success, they rally around them.
But companies have by and large not restructured to support this more natural view of business. Instead, many have cut entire layers from the current structure, middle management, for example. Those organizations look like the tree to your right.
The trimming has shifted decisions to employees who previously played secondary roles. While these employees may have the ability to rise to the occasion, they have not had enough proximity to greater responsibilities and may still have little to no access to information flows because of organizational habits. This often stunts growth at both the individual and company level.
Companies have also been filling roles that are more specialized at a time when the ability to connect information relies on broader knowledge and experience.
Good connections help at every level of an environment. But only when we see and use them. Says Barabási:
“The diversity of networks in business and the economy is mind-boggling. There are policy networks, ownership networks, collaboration networks, organizational networks, network marketing-you name it. It would be impossible to integrate these diverse interactions into a single all-encompassing web. Yet no matter what organizational level we look at, the same robust and universal laws that govern nature's webs seem to greet us. The challenge is for economic and network research alike to put these laws into practice.”
But we must uncover their value and consider it in right proportion to the whole:
“Networks do not offer a miracle drug, a strategy that makes you invincible in any business environment. The truly important role networks play is in helping existing organizations adapt to rapidly changing market conditions. The very concept of network implies a multidimensional approach.”
Better than adapting, which is not always the best or most economical answer, is preparing for what comes next by continuously shaping the organization to remain resilient in its strength.