
Transcript of Helbing Systemic Risks in Society and Economics

    Systemic Risks in Society and Economics

    Dirk Helbing 1,2,3 *

    1 ETH Zurich, UNO D11, Universitätstr. 41, 8092 Zurich, Switzerland
    http://www.soms.ethz.ch, http://www.ccss.ethz.ch

    2 Santa Fe Institute, 1399 Hyde Park Road, Santa Fe, NM 87501, USA

    3 Collegium Budapest, Institute for Advanced Study, Szentháromság u. 2, 1014 Budapest, Hungary

    (Dated: November 18, 2009)

    * Electronic address: [email protected]

    This contribution presents a summary of sources and drivers of systemic risks in socio-economic systems and related governance issues. The analysis is based on the theory of complex systems and illustrated by numerous examples, including financial market instability. Typical misunderstandings regarding the behavior and functioning of socio-economic systems will be addressed, and some current threats to the stability of social and economic systems are pointed out.

    I. INTRODUCTION

    When studying systemic risks, i.e. risks that can trigger unexpected large-scale changes of a system or imply uncontrollable large-scale threats to it, scientific research has often focused on natural disasters such as earthquakes, tsunamis, hurricanes, volcano eruptions, or on failures of engineered systems such as blackouts of the electric power grid or nuclear accidents (as in Chernobyl). However, many major disasters hitting human societies relate to social problems [1-4]: these include famines and other shortages of resources, wars, climate change, and epidemics, some of which are related to population density and population growth. Financial instabilities and economic crises are further examples of systemic risks.

    Let us illustrate these risks by some numbers: World War I caused more than 15,000,000 victims, and World War II even 60,000,000 fatalities. The latter generated costs of 1,000 billion 1944 US$ and destroyed 1710 cities, 70,000 villages, 31,850 industrial establishments, 40,000 miles of railroad, 40,000 hospitals, and 84,000 schools. Moreover, the world has seen many costly wars ever since. The current financial and economic crisis triggered an estimated loss of 4-20 trillion US$.

    Climate change is expected to cause natural disasters, conflicts over water, food, and land, migration, and social and political instability. The related reduction of the world gross domestic product is expected to amount to 0.6 trillion US$ per year or more. Turning our attention to epidemics, the Spanish flu caused 20-40 million deaths, and SARS triggered losses of 100 billion US$.

    Considering these examples, one could in fact say "the major risks are social", but they are still poorly understood. In fact, we know much more about the origin of the universe and about elementary particles than about the working of our socio-economic system. This situation must be urgently changed (see Sec. V).

    It is obvious that mankind must be better prepared for the crises to come. A variety of factors is currently driving the world out of equilibrium: population growth, climate change, globalization, changes in the composition of populations, and the exploitation of natural resources are just some examples. As president of New York's Columbia University, Lee C. Bollinger formulated the problem as follows: "The forces affecting societies around the world ... are powerful and novel. The spread of global market systems ... are ... reshaping our world ..., raising profound questions. These questions call for the kinds of analyses and understandings that academic institutions are uniquely capable of providing. Too many policy failures are fundamentally failures of knowledge." [5]

    We certainly need to increase our capacity to gain a better understanding of socio-economic systems, conditions triggering instabilities, alternative system designs, ways to avoid or mitigate crises, and side effects of policy measures. This contribution will briefly summarize the current knowledge of how systemic risks emerge in society and give a variety of relevant examples.

    II. SOCIO-ECONOMIC SYSTEMS AS COMPLEX SYSTEMS

    An important aspect of social and economic systems is that they are complex systems (see Fig. 1) [6-38]. Other examples of complex systems are turbulent fluids, traffic flows, large supply chains, or ecological systems. The commonality of complex systems is that they are characterized by a large number of interacting (mutually coupled) system elements (such as individuals, companies, countries, cars, etc.) [7, 39-49]. These interactions are usually non-linear (see Sec. II A). Typically, this implies a rich system behavior [7]. In particular, such systems tend to behave dynamically rather than statically, and probabilistically rather than deterministically. As a consequence, complex systems can show surprising or even paradoxical behaviors. The slower-is-faster effect [50, 51], according to which delays can sometimes increase the efficiency of transport systems, may serve as an example.

    Moreover, complex systems are often hardly predictable and uncontrollable. While we are part of many complex systems (such as traffic flows, groups or crowds, financial markets, and other socio-economic systems), our perception of them is mostly oversimplified [52, 53] or biased [54-56]. In fact, they challenge our established ways of thinking and are currently a nightmare for decision-makers [52]. The following subsections will explain these points in more detail.

    [Figure 1, bottom panel: vehicle velocity (km/h) as a function of location (km) and time (h)]

    Figure 1: Top: A car is a complicated system, but its parts behave in a deterministic and predictable way. Therefore, a car is relatively easy to control. Bottom: Freeway traffic, in contrast, constitutes a complex system, as it involves the interaction of many independent driver-vehicle units with a largely autonomous behavior. Their interactions can lead to the self-organization of different kinds of traffic jams, the occurrence of which is hard to predict (after [57]). Besides the structural complexity represented by (a) and the dynamic complexity represented by (b), there is a third kind of complexity, namely (c) algorithmic complexity. It measures how the computer resources needed to simulate or optimize a system scale with system size. Note that this manuscript mainly focuses on dynamic complexity.

    A. Non-Linear Interactions and Power Laws

    Systems with a complex system dynamics are mostly characterized by non-linear interactions among the elements or entities constituting the system (be it particles, objects, or individuals). Non-linear interactions are typical for systems in which elements mutually adapt to each other. That is, the elements are influenced by their environment, but at the same time, they also have an impact on their environment.

    Non-linearity means that causes and effects are not proportional to each other. A typical case is a system that is hardly responsive to control attempts, or which shows sudden "regime shifts" when a "tipping point" is crossed [58-63] (see Fig. 2). Examples of this are sudden changes in public opinion (e.g. from smoking tolerance to smoking bans, from pro- to anti-war mood, from strict banking secrecy to transparency, or from buying pickup trucks to buying environment-friendly cars).

    Figure 2: Schematic illustration of one of the typical behaviors of complex systems: In regimes 1 and 2, a cause (such as a control attempt) has essentially no effect on the system, while at the tipping point, an abrupt (and often unexpected) transition to a different system behavior occurs. A recent example is the sudden large-scale erosion of Swiss banking secrecy, after UBS had handed over about 300 names of clients to a US authority.

    B. Power Laws and Heavy-Tail Distributions

    It is important to note that strong interactions among the system elements often change the statistical distributions characterizing their behavior. Rather than normal distributions, one typically finds (truncated) power laws or, more generally, so-called heavy-tail distributions [48, 49, 58] (see Fig. 3 and Sec. II D). These imply that extreme events occur much more frequently than expected. For example, the crash of the stock market on Black Monday was a 35-sigma event (where sigma stands for the standard deviation of the Dow Jones Index on a logarithmic scale). Other examples are the size distributions of floods, storms, earthquakes, or wars [1-4]. Obviously, the occurrence of the respective heavy-tail distributions is highly important for the insurance business and for the risk assessment of financial derivatives.
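    To get a feeling for how dramatically heavy tails change the likelihood of extreme events, the following minimal Python sketch (an illustration added here, not part of the original text; it assumes SciPy is available and uses a Student-t distribution with 3 degrees of freedom as a hypothetical stand-in for market returns) compares tail probabilities of a normal and a heavy-tailed law at the same number of standard deviations:

    # Tail probabilities: normal vs. heavy-tailed law (illustrative sketch).
    import math
    from scipy.stats import norm, t

    df = 3
    sigma_t = math.sqrt(df / (df - 2))        # standard deviation of a t(3) variable
    for k in (5, 10, 35):
        p_normal = norm.sf(k)                 # P(X > k standard deviations), Gaussian
        p_heavy = t.sf(k * sigma_t, df)       # same threshold for the t(3) law
        print(f"{k:2d} sigma: normal {p_normal:.3g}, heavy-tailed {p_heavy:.3g}")

    Under the Gaussian, a 35-sigma event is essentially impossible, while under the heavy-tailed law it is merely rare; risk models based on normal distributions can therefore grossly understate the probability of extreme events.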

    Figure 3: When system components interact strongly, the normally distributed behavior of separated system elements often becomes (approximately) power-law distributed. As a consequence, fluctuations of any size can occur in the system, and extreme events are much more frequent than expected. Note that power laws are typical for a system at a critical point, also known as tipping point.

    C. Network Interactions and Systemic Risks through Failure Cascades

    A typical case of non-linear interactions are network interactions, which are ubiquitous in socio-economic systems [64-79]. These imply feedback loops and vicious circles or induce (often undesired) side effects [32]. (For example, the introduction of cigarette taxes has promoted smuggling and other criminal activities.) Moreover, network interactions are often the reason for a cascading of failure events. Examples of this are epidemic spreading, the failure of the interbank market during a financial crisis, the spreading of traffic congestion, or the blackout of an electrical power system (see Fig. 4).

    Figure 4: Example of a blackout of the electrical power grid in Europe (from: EU project IRRIIS; E. Liuf (2007) Critical Infrastructure protection, R&D view). To allow for the transfer of a ship, one power line had to be temporarily disconnected in Northern Germany. This triggered an overload-related cascading effect [80], during which many power lines went out of operation. As a consequence, there were blackouts all over Europe (see black areas). The pattern illustrates how counter-intuitive and hardly predictable the behavior of complex systems with network interactions can be.

    Failure cascades (which are also called chain reactions, avalanche or domino effects) are the most common mechanism by which local risks can become systemic [81-84] (see Fig. 5). (A minimal cascade simulation is sketched after the list below.) Systemic failures are usually triggered by one of the following reasons:

    Figure 5: Top: Schematic illustration of a networked system which is hit by an over-critical perturbation (e.g. a natural disaster). The problem of feedback cycles is highlighted: they can have autocatalytic (escalation) effects and act like vicious circles. Bottom: Illustration of cascading effects in socio-economic systems, which may be triggered by the disruption (over-critical perturbation) of an anthropogenic system. A more detailed picture can be given for specific disasters. Note that the largest financial damage of most disasters is caused by such cascading effects, i.e. the systemic impact of an over-critical perturbation (after [85]).

    1. The parameters determining system stability are driven towards a so-called "critical point" or "tipping point", beyond which the system behavior becomes unstable (see Sec. II A). For example, the destabilization of the former German Democratic Republic (GDR) triggered off spontaneous demonstrations in Leipzig, Germany, in 1989, which eventually caused the re-unification of Germany. This peaceful revolution shows that systemic instability does not necessarily imply systemic malfunctions. It can also induce a transition to a better and more robust system state after a transient transformation period. Further examples of spontaneous transitions by systemic destabilization are discussed in Secs. II D, III, and IV A.

    2. The system is metastable (i.e. robust to small perturbations, which quickly disappear over time), but there occurs an over-critical perturbation (such as a natural disaster), which harms the system functionality so much that this has damaging effects on other parts of the system [84] (see Fig. 6).

    3. The system is metastable, but there is a coincidence of several perturbations in the network nodes or links such that their interaction happens to be over-critical and triggers off additional failures in other parts of the system [83]. In fact, disasters caused by "human error" [86, 87] are often based on a combination of several errors. In networked systems, the occurrence of this case is just a matter of time.
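    The cascade mechanism referred to above can be illustrated with a self-contained Python sketch (added here; the network size, threshold values, and seed nodes are arbitrary illustrative choices, not taken from the paper). Each node fails once a critical fraction of its neighbors has failed, so a single over-critical perturbation can percolate through the network:

    import random

    def random_graph(n, p, seed=1):
        """Erdős-Rényi-style random graph as an adjacency list."""
        rng = random.Random(seed)
        nbrs = {i: set() for i in range(n)}
        for i in range(n):
            for j in range(i + 1, n):
                if rng.random() < p:
                    nbrs[i].add(j)
                    nbrs[j].add(i)
        return nbrs

    def cascade(nbrs, seed_nodes, threshold):
        """Nodes fail when at least `threshold` of their neighbors have failed."""
        failed, changed = set(seed_nodes), True
        while changed:
            changed = False
            for node, neighbors in nbrs.items():
                if node in failed or not neighbors:
                    continue
                if sum(nb in failed for nb in neighbors) / len(neighbors) >= threshold:
                    failed.add(node)
                    changed = True
        return failed

    nbrs = random_graph(200, 0.03)
    for thr in (0.5, 0.35, 0.2):
        size = len(cascade(nbrs, seed_nodes={0, 1}, threshold=thr))
        print(f"failure threshold {thr}: cascade size {size} of 200")

    In such threshold models, lowering the robustness of the individual nodes hardly changes anything at first, until a critical value is crossed and the same local perturbation suddenly brings down large parts of the network, in line with the tipping-point behavior described in Sec. II A.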

    Figure 6: The most efficient disaster response strategy depends on many factors such as the network type (after [84]). Here, we have studied six different disaster response strategies for regular grids, scale-free networks, and Erdős-Rényi random networks. The best strategy is a function of the resources R available for disaster response management and the time delay tD before practical measures are taken. Obviously, there is no single strategy which always performs well. This makes disaster response challenging, calling for scientific support.

    D. Self-Organized or Self-Induced Criticality

    A system may get into a critical state not only by external influences that affect system stability. It is known that some endogenous processes can automatically drive the system towards a critical state, where avalanche or cascading effects of arbitrary size appear (reflecting the characteristic heavy-tail statistics at critical points, see Sec. II B). In such cases, the occurrence of extreme events is expected, and we speak of self-induced or self-organized criticality (SOC) [88, 89].

    It is likely that bankruptcy cascades can be understood in this way. The underlying mechanism is that a company or bank tries to make a better offer to customers or clients than the competing companies or banks do. This forces the competitors to make better offers as well. Eventually, the profit margins in a free market become so small that variations in the consumption rate can drive some companies or banks out of business, which creates economic problems for other companies or banks. Considering the interconnections between different companies or banks, this mechanism can cause bankruptcy cascades. Eventually, the number of competitors will be smaller, and as a consequence, they can charge higher prices. Therefore, their profits go up, which encourages new competitors to enter the market. In this way, competition increases again and automatically drives the system back to low profits and bankruptcies.
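    A classic toy model of self-organized criticality (see [88, 89]) is the Bak-Tang-Wiesenfeld sandpile, sketched below in Python (an added illustration; grid size and the number of grains are arbitrary choices). Grains are added one by one, sites topple when they exceed a threshold, and the pile organizes itself into a critical state with avalanches of all sizes:

    import random

    SIZE, THRESHOLD = 20, 4
    grid = [[0] * SIZE for _ in range(SIZE)]

    def add_grain(x, y):
        """Drop one grain and topple until stable; return the avalanche size."""
        grid[x][y] += 1
        toppled, unstable = 0, [(x, y)]
        while unstable:
            i, j = unstable.pop()
            if grid[i][j] < THRESHOLD:
                continue
            grid[i][j] -= THRESHOLD
            toppled += 1
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < SIZE and 0 <= nj < SIZE:   # grains fall off the edges
                    grid[ni][nj] += 1
                    unstable.append((ni, nj))
        return toppled

    rng = random.Random(0)
    sizes = [add_grain(rng.randrange(SIZE), rng.randrange(SIZE)) for _ in range(20000)]
    avalanches = [s for s in sizes if s > 0]
    print("avalanches:", len(avalanches), "largest:", max(avalanches))

    After a transient, most added grains cause nothing, while rare additions trigger system-wide avalanches; the avalanche-size statistics approximately follow a power law. The bankruptcy-cascade mechanism sketched above is expected to behave analogously.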

    Another example concerns safety standards [86, 87]. These are usually specified in such a way that normal perturbations would not cause serious harm or even systemic failures. As a consequence, most man-made systems are constructed in a way that makes them robust to small and moderate perturbations (in other words: metastable). However, the requirement of cost efficiency exerts pressure on decision-makers to restrict safety standards to what really appears to be needed, and not more. Consequently, if a large-scale failure has not occurred in a long time, decision-makers often conclude that the existing safety standards are higher than necessary and that there is some potential to reduce costs by decreasing them somewhat. Eventually, the standards are lowered so much that an over-critical perturbation occurs sooner or later, which causes a systemic failure. As a consequence, the safety standards will be increased again, and the process will start from the beginning.

    As a third example, let us discuss man-made systems with capacity limits such as traffic or logistic systems. These systems are often driven towards maximum efficiency, i.e. full usage of their capacity. However, when reaching this point of maximum efficiency, they also reach a tipping point, at which the system becomes dynamically unstable [90]. This is known, for example, from freeway and railway traffic. As a consequence, the system suffers an unexpected capacity drop due to optimization efforts, shortly after the maximum performance was reached.

    Similarly to freeway traffic, engineers also try to avoid the occurrence of congestion in urban traffic, which can be achieved by re-routing strategies. A closer analysis shows that this optimization leads again to a sudden breakdown of the flow, once the maximum throughput is reached [91]. One may, therefore, conclude that optimizing for the full usage of available system capacity implies the danger of an abrupt breakdown of the system performance, with potentially very harmful consequences. To avoid this problem, one must know the capacity of the system and avoid reaching it. This can be done by requiring sufficient safety margins to be respected.
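    A simple queueing calculation (added here for illustration; the service rate mu = 1 is an arbitrary choice) makes the danger of operating at full capacity explicit: in an M/M/1 queue with arrival rate lambda and service rate mu, the mean time in the system is W = 1/(mu - lambda), which diverges as the utilization lambda/mu approaches one.

    # Mean time in an M/M/1 queueing system: W = 1 / (mu - lambda).
    mu = 1.0
    for utilization in (0.5, 0.9, 0.95, 0.99, 0.999):
        lam = utilization * mu        # arrival rate at the given utilization
        w = 1.0 / (mu - lam)          # mean time a "customer" spends in the system
        print(f"utilization {utilization:6.1%}: mean time in system {w:8.1f}")

    The last few percent of capacity usage are thus bought at the price of extreme sensitivity to demand fluctuations, which is the queueing-theoretic counterpart of the capacity drop described above.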

    E. Limits of Predictability, Randomness, Turbulence and Chaos

    The large number of non-linearly coupled system components can lead to a complex dynamics (see Fig. 7). Well-known examples of this are the phenomena of turbulence [92] and chaos [42, 93], which make the dynamics of the system unpredictable after a certain time period. A typical example is weather forecasting.

    The large sensitivity to small perturbations is sometimes called the "butterfly effect", suggesting that (in a chaotically behaving system) the flight of a butterfly could significantly change the system behavior after a sufficiently long time. A further obstacle to predicting the behavior of many complex systems is a probabilistic or stochastic dynamics [94, 95], i.e. the importance of randomness.

    In socio-economic systems, there is furthermore a tendency towards self-fulfilling or self-destroying prophecy effects [96] (and it is hard to say which effect will finally dominate; see the current response of the population to the swine flu campaign). Stock markets show both effects: On the one hand, the self-fulfilling prophecy effect leads to herding behavior, which creates bubbles [97]. On the other hand, the competition for the highest possible returns eventually destroys any predictable gains (otherwise everybody could become rich without having to work, thereby creating a financial perpetuum mobile). Altogether, this competition creates a (more or less) efficient and unpredictable stock market. A generalization of this principle is known as Goodhart's law.
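    The "butterfly effect" mentioned above is easy to reproduce numerically. The following minimal Python sketch (an added illustration) iterates the chaotic logistic map x -> 4x(1 - x) from two initial conditions differing by one part in a billion; after a few dozen steps, the two trajectories are completely decorrelated, so long-term prediction is impossible despite fully deterministic dynamics:

    # Sensitivity to initial conditions in the chaotic logistic map.
    x, y = 0.400000000, 0.400000001    # two almost identical starting points
    for step in range(1, 61):
        x, y = 4 * x * (1 - x), 4 * y * (1 - y)
        if step % 10 == 0:
            print(f"step {step:2d}: x = {x:.6f}, y = {y:.6f}, gap = {abs(x - y):.2e}")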

    Figure 7: Illustration of various cases of non-linear dynamics that can occur in complex systems (from [98], p. 504). Deterministic chaos and turbulence constitute further and even more complicated cases of non-linear system dynamics.

    F. The Illusion of Control

    Besides the difficulty of predicting the future behavior of complex systems, there are other effects which make them difficult to control:

    Figure 8: When a complex system is changed (e.g. by external control attempts), its system parameters, stability, and dynamics may be affected. This figure illustrates the occurrence of a so-called "cusp catastrophe". It implies discontinuous transitions ("regime shifts") in the system dynamics.

    1. On the one hand, big changes may have small or no effects (see Fig. 2) and, when considering network interactions (see Sec. II C), even adverse effects. This reflects the principle of Le Chatelier, according to which a system tends to counteract external control attempts.

    2. On the other hand, if the system is close to a critical or "tipping point", even small changes may cause a sudden "regime shift", also known as "phase transition" or "catastrophe" (see Figs. 2 and 8). In other words, small changes can sometimes have a big impact, and often very unexpectedly so. However, there are typically some early warning signals for such critical transitions [99]. This includes the phenomenon of "critical slowing down", which means that it takes a long time to dampen out perturbations in the system, i.e. to drive the system back to equilibrium. (A numerical sketch of this warning signal follows after this list.)

    Another warning signal of potential regime shifts are "critical fluctuations", which normally obey a heavy-tail distribution (see Sec. II B). In other words, perturbations in the system tend to be larger than usual, a phenomenon which is also known as "flickering".

    3. Control attempts may also be obstructed by irreducible randomness, i.e. a degree of uncertainty or perturbation which cannot be eliminated (see Sec. II E).

    4. Delays are another typical problem that often causes a failure of control [100]. The underlying reason is that delays may create an unstable system behavior (also when people attempt to compensate for delays by anticipation). Typical examples are the breakdown of traffic flows and the occurrence of stop-and-go traffic, which result from delayed speed adjustments of drivers to variations in the vehicle speeds ahead. (A minimal simulation of delay-induced instability is sketched after this list.)

    Since many control attempts these days are based on the use of statistics, but compiling such statistics is time-consuming, delays may cause instabilities also in other areas of society. Business cycles, for example, may result from such delays as well (or may at least be intensified by them).

    5. Finally, there is the problem of "unknown unknowns" [101], i.e. hidden factors which influence the system behavior but have not been noticed before. By definition, they appear unexpectedly. Structural instabilities [39] may create such effects. The appearance of a new species in an ecosystem is a typical example. In economics, this role is played by innovations or new products, which happen to change the social or economic world. Well-known examples of this are the invention of contraceptives, computers, or mobile phones.
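    As announced in point 2 above, critical slowing down can be illustrated with a minimal Python sketch (added here; the AR(1) process and its parameters are a hypothetical stand-in for a system relaxing towards equilibrium). As the control parameter phi approaches the critical value 1, perturbations are damped out ever more slowly and the lag-1 autocorrelation of the time series rises towards one:

    import random

    def lag1_autocorrelation(series):
        """Sample lag-1 autocorrelation, a standard early-warning indicator."""
        n, mean = len(series), sum(series) / len(series)
        num = sum((series[i] - mean) * (series[i + 1] - mean) for i in range(n - 1))
        den = sum((v - mean) ** 2 for v in series)
        return num / den

    rng = random.Random(42)
    for phi in (0.5, 0.9, 0.99):        # phi -> 1: approaching the tipping point
        x, series = 0.0, []
        for _ in range(20000):
            x = phi * x + rng.gauss(0.0, 1.0)   # AR(1): slow recovery near criticality
            series.append(x)
        print(f"phi = {phi}: lag-1 autocorrelation ~ {lag1_autocorrelation(series):.3f}")

    The measured autocorrelation roughly equals phi, so a rising autocorrelation in empirical time series indicates that the system recovers ever more slowly from perturbations, one of the generic warning signals before a regime shift [99].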
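    Point 4 of the list can likewise be demonstrated in a few lines (again an added, stylized example): a negative feedback controller that reacts to the delayed system state, dx/dt = -k x(t - tau), is stable for small delays but develops growing oscillations once k*tau exceeds the critical value pi/2 (about 1.57):

    # Delayed negative feedback dx/dt = -k * x(t - tau), Euler integration.
    k, dt = 1.0, 0.01
    for tau in (0.5, 1.4, 2.0):          # stability is lost at k * tau = pi/2
        steps = int(tau / dt)
        history = [1.0] * (steps + 1)    # constant initial condition x = 1
        for _ in range(int(40 / dt)):    # simulate 40 time units
            history.append(history[-1] - dt * k * history[-1 - steps])
        print(f"tau = {tau}: |x| after 40 time units ~ {abs(history[-1]):.2e}")

    The same control gain k thus stabilizes the system for a short delay but destabilizes it for a long one, which is why delayed, statistics-based interventions can amplify rather than dampen fluctuations such as business cycles.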

    G. The Logic of Failure

    As a consequence of the above, complex systems cannot be controlled in the conventional way (like pressing a button or steering a car). Such control attempts will usually fail, as Dörner's book "The Logic of Failure" has impressively shown [52].

    A typical failure scenario is as follows: A decision-maker tries to change the social system. It turns out that the measure taken does not have any effect (see Fig. 2). Therefore, he or she decides to intensify the measure. The effect may still not be as expected. Hence, an even more forceful control attempt is made. As a consequence, the system undergoes a sudden regime shift (see Figs. 2 and 8) and organizes itself in a different way (but not necessarily in the desired way). The decision-maker now tries to re-gain control and counteracts the unexpected change. If the attempts to stabilize the system are delayed, this can even lead to an oscillatory or chaotic system dynamics.

    The right approach to influence complex systems is to support and strengthen the self-organization and self-control of the system by "mechanism design" (see Sec. IV A). This basically means that coordination and cooperation in a complex system will appear by themselves, if the interactions among the system elements are well chosen. That is, regulations should not specify what exactly the system elements should do, but set bounds to actions (define "rules of the game"), which give the system elements enough degrees of freedom to self-organize good solutions. If the interaction rules are suitable, such an approach will usually lead to a much more flexible and adaptive system behavior. Another advantage is systemic robustness, i.e. the ability to cope with challenges by external perturbations. Note, however, that everything depends on the interactions of the system elements. Unsuitable interactions can, for example, cause the system to behave in a dynamically unstable way or to get trapped in a suboptimal ("frustrated") state. Hence, finding the right interaction rules is a great challenge for decision-makers, and complex systems scientists are needed to address it properly.

    III. THE EXAMPLE OF FINANCIAL MARKET INSTABILITY

    One example of systemic risks that deserves more attention here is financial market instability [102-108]. The recent financial crisis shows very clearly how cascading effects can lead to an uncontrollable dynamics and a relatively sudden systemic crisis. What started with local problems concerning subprime mortgages eventually affected the mortgage companies, the home building industry, the financial markets, the US economy, and the world economy. This crisis has been explained in many ways. Widely discussed reasons include:

    • the deregulation of financial markets,

    • the explosive spread of derivatives (which reached a value of 15 times the gross product of the world),

    • the apparently riskless securitization of risky deals by credit default swaps,

    • lowered lending standards,

    • the opaqueness (intransparency) of financial derivatives,

    • the failure of rating agencies due to the complexity of the financial products,

    • bad risk models (neglecting, for example, correlations and the heavy-tail character of the fluctuations),

    • calibration of risk models with historical data not reflecting the actual situation,

    • insufficient net assets of banks,

    • low interest rates to fight previous crises,

    • the growth of over-capacities and other developments with pro-cyclical effects,

    • short-term incentive structures (bonus schemes) and the greed of investment bankers and managers.

    Less debated, but no less relevant, reasons are [109-111]:

    • The complexity of the financial system is larger than what is knowable. For example, many portfolios appear to contain too many different assets to support a reliable optimization with the amount of data available [112].

    • In the "arms race" between banks (and other agents) and the regulators, regulators are sometimes in the weaker position. Therefore, financial market instability may result from the fact that instability is beneficial for some interest groups: It requires an unstable market to allow some people to become very rich in a short time, since instability implies opportunities for good investments. When GDP grows slowly, good returns mainly result from financial bubbles.

    • The financial architecture has created a complex system, with a hard-to-predict and hard-to-control dynamics. Financial products (derivatives) were constructed in a multi-level way, very much like a house of cards.

    • The world-wide network interdependencies of all major banks have spread local risks all over the system to an extent that produced a systemic risk. They created a "global village" without any firewalls ("security breaks").

    • Delays in the adaptation of some markets build up disequilibria in the system with the potential of earthquake-like stress reliefs. As examples of this, one may take historical crashes in currency markets or recent drops in the values of certain AAA-rated stocks.

    • The financial and economic systems are organized in a way that allows for the occurrence of strong correlations. For example, when the strategies of companies all over the world become more and more similar (due to "group think" [113] or asking the same consultancy companies), a lack of variety (heterogeneity) results in the system. As a consequence, (more or less) either no company fails or many companies fail at the same time.

    • An important factor producing herding effects [114, 115] and bubbles is the continuous information feedback regarding the investment decisions of others [116]. In this connection, it is important to underline that repeated interaction between decision-makers supports consensus, but creates over-confidence (i.e. a false feeling of safety, despite misjudgements of reality). Therefore, it undermines the "wisdom of crowds" [117, 118]. This problem may be further intensified by the public media which, in the worst case, may even create a mass hysteria.

    • The price formation mechanism mixes material values and psychology in a single, one-dimensional quantity, the price. Therefore, the price dynamics is sensitive to factors such as trust, risk aversion, greed, and herding effects (the imitation of the behavior of others) [54-56, 119].

    • Stability of single banks does not imply that the banking system cannot enter a state of systemic instability. (Monetary value is a matter of trust, and therefore a single event such as the failure of Lehman Brothers could cause banks to be no longer willing to lend money to each other. This triggered a liquidity crisis so big that it would have caused the failure of the world financial system, if the central banks had not quickly provided huge amounts of liquidity.)

    • Lack of trust also reduces the lending of cheap money to troubled companies, which may drive them into bankruptcy, thereby increasing a bank's problems.

    • More generally, the economic system seems to have a tendency towards self-organized critical behavior (see Sec. II D).

    Many of the above factors have contributed to strong non-linear couplings in the system. Furthermore, strong network interdependencies have been created through the interbank markets and complex financial derivatives. These features are already expected to imply cascade-like effects and a heavy-tail statistics (see Sec. II B). This tendency is expected to be further amplified by anticipation attempts in fluctuating markets. However, even more dangerous than the occurrence of fluctuations in the markets is the occurrence of strong correlations. These can be promoted by economic cycles, herding effects, and the coupling of policies or regulation attempts to global risk indicators.

    The worldwide crisis in the automobile sector in 2009 and the "quant meltdown" in August 2007 are good examples of the occurrence of strong correlations. The latter may be understood as follows [120]: Returns of hedge funds largely depend on their leverage. Therefore, there is an evolutionary pressure towards high leverage, which can increase volatility. In case of huge price jumps, however, banks tend to demand their loans back. This decreases the leverage of the affected hedge funds and thereby their chances to perform well in the future. Hence, large system-wide leverage levels are prerequisites for collapses, and crises can emerge "virtually out of nothing", just through fluctuations. This example illustrates well how unsuitable risk-averse policies can create pro-cyclical effects, through which banks may harm their own interests.
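    The pro-cyclical margin-call mechanism just described can be caricatured in a few lines of Python (a deliberately stylized sketch with arbitrary parameters, not the model of Ref. [120]): a leveraged fund holds assets; when a random price drop pushes its leverage above the bank's limit, the fund is forced to sell into the falling market, and the fire sale depresses the price further:

    import random

    rng = random.Random(7)
    price, units, debt = 100.0, 0.5, 40.0    # fund holds 0.5 units, owes 40
    MAX_LEV = 6.0                            # margin constraint: assets / equity

    for day in range(1, 251):
        price *= 1 + rng.gauss(0, 0.02)      # exogenous price fluctuation
        equity = units * price - debt
        # Margin call: while too leveraged, sell assets into the falling market.
        while equity > 0 and units * price / equity > MAX_LEV:
            sold = 0.10 * units
            units -= sold
            debt -= sold * price             # sale proceeds repay the loan
            price *= 0.99                    # fire sale depresses the price
            equity = units * price - debt
        if equity <= 0:
            print(f"day {day}: margin spiral wipes out the fund at price {price:.1f}")
            break
    else:
        print(f"fund survives 250 days; final equity {units * price - debt:.1f}")

    In such toy runs, the fund is typically destroyed not by the initial price fluctuation itself, but by the self-reinforcing de-leveraging spiral it triggers, illustrating how risk-averse policies can produce pro-cyclical effects.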

    IV. MANAGING COMPLEXITY

    Having discussed the particular challenges of complex systems, one may be left with the impression that such systems are just too difficult for us to handle. However, in the past decades, a variety of scientific techniques have been developed to address these challenges. These include:

    • large-scale data mining,

    • network analysis,

    • systems dynamics,

    • scenario modeling,

    • sensitivity analysis,

    • non-equilibrium statistical physics,

    • non-linear dynamics and chaos theory,

    • systems theory and cybernetics,

    • catastrophe theory,

    • the statistics of extreme events,

    • the theory of critical phenomena and, maybe most prominently these days,

    • agent-based modeling [129-133].

    The methods developed by these fields allow us to better assess the sensitivity or robustness of systems and their dynamics, as will be briefly discussed in the following. They have also revealed that complex systems are not our "enemies". In fact, they possess a number of favorable properties, which can be used to our benefit.

    A. How to Profit from Complex Systems

    Understanding complex systems makes it easier to utilize their interesting properties, which however requires one to work with the system rather than against it [121-128]. For example, complex systems tend to show emergent (collective) properties, i.e. properties that the single system components do not have. This is, for example, relevant for the possibility of collective intelligence [134-136]. One may also benefit from the fact that complex systems tend to self-organize in a way which is adaptive to the environment and often robust and resource-efficient as well. This approach has, for example, been successfully applied to develop improved design principles for pedestrian facilities and other systems.

    Technical control approaches based on self-organization principles are becoming more and more available now. While previous traffic control on highways and in cities was based on a centralized optimization by supercomputers with expensive measurement and control infrastructures, currently developed approaches are based on decentralized coordination strategies (such as driver assistance systems or traffic lights that are flexibly controlled by local traffic flows).

    Centralized structures can reach a quick information exchange among remote parts of a system, but they become unstable beyond a certain critical size (as the collapse of political states and many unsuccessful mergers of companies show). In comparison, decentralized approaches are particularly suited to reach a flexible adjustment to local conditions and local coordination [137]. Some decentralized concepts for real-time control already exceed the performance of centralized ones, particularly in complex, hardly controllable, fluctuating environments, which require a quick and flexible response to the actual situation [138] (see Fig. 9). In fact, in a strongly varying world, strict stability and control are not possible anymore, or excessively expensive (as the public spending deficits show). Therefore, a paradigm shift towards more flexible, agile, adaptive systems is needed, possible, and overdue. The best solutions are probably based on suitable combinations of centralized and decentralized approaches.

    Figure 9: One advantage of centralized control is quick large-scale coordination. However, disadvantages result from the vulnerability of the network, a tendency of information overload, the risk of selecting the wrong control parameters, and delays in adaptive feedback control. Because of their greater flexibility to local conditions and greater robustness to perturbations, decentralized control approaches can perform better in complex systems with heterogeneous elements, a large degree of fluctuations, and short-term predictability (after [139]).

    In social systems, the principle of self-organization, which is also known as the principle of the "invisible hand", is ubiquitous. However, self-organization does not automatically lead to optimal results, and it may fail under extreme conditions (as is known, for example, from financial and traffic systems as well as dense pedestrian crowds).

    A particularly important example of self-control is the establishment of social norms, which are like social forces guiding the behavior of people. In this way, social order can be created and maintained even without centralized regulations such as enforced laws. Nevertheless, one must be aware that the principles on which social cooperation and norms are based (for example, repeated interaction, trust and reputation, or altruistic sanctioning of deviant behavior) are fragile. Simple computer simulations suggest, for example, that a change from repeated local interactions (between family members, friends, colleagues, and neighbors) to non-recurring interactions with changing partners from all over the world may cause a breakdown of human cooperation [140] (a minimal sketch of such a simulation follows Fig. 10 below). Therefore, the on-going globalization could potentially destabilize our social systems [141-143] (see Fig. 10), which largely build on norms and social cooperation. (Remember, for example, that the breakdown of the interbank market, which almost caused a collapse of the world financial system, was due to a breakdown of the network of trust.)

    Figure 10: Establishment of cooperation in a world with local interactions and local mobility (left) in comparison with the breakdown of cooperation in a world with global interactions and global mobility (right) (blue = cooperators, red = defectors/cheaters/free-riders) (after [140]). Note that the loss of solidarity results from a lack of neighborhood interactions, not from larger mobility.
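    The following Python sketch (an added toy version of such simulations, not the exact model of Ref. [140]; payoff values, lattice size, and update rule are illustrative choices) runs a prisoner's dilemma in which agents imitate their most successful interaction partner. With local neighborhoods, cooperator clusters can survive; when the partners are instead drawn randomly from the whole population, cooperation tends to collapse:

    import random

    N, ROUNDS = 30, 100
    T, R, P, S = 1.3, 1.0, 0.1, 0.0     # temptation > reward > punishment > sucker

    def payoff(a, b):
        """Payoff of an agent playing `a` against `b` (True = cooperate)."""
        if a and b: return R
        if a and not b: return S
        if not a and b: return T
        return P

    def run(local, seed=0):
        rng = random.Random(seed)
        coop = [[rng.random() < 0.6 for _ in range(N)] for _ in range(N)]
        def partners(i, j):
            if local:                   # the four adjacent lattice sites
                return [((i + di) % N, (j + dj) % N)
                        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]
            # "global village": four randomly chosen partners from anywhere
            return [(rng.randrange(N), rng.randrange(N)) for _ in range(4)]
        for _ in range(ROUNDS):
            nbrs = [[partners(i, j) for j in range(N)] for i in range(N)]
            score = [[sum(payoff(coop[i][j], coop[x][y]) for x, y in nbrs[i][j])
                      for j in range(N)] for i in range(N)]
            new = [row[:] for row in coop]
            for i in range(N):
                for j in range(N):      # imitate the best-scoring partner, if better
                    bx, by = max(nbrs[i][j], key=lambda xy: score[xy[0]][xy[1]])
                    if score[bx][by] > score[i][j]:
                        new[i][j] = coop[bx][by]
            coop = new
        return sum(sum(row) for row in coop) / N ** 2

    print(f"local interactions:  {run(local=True):.0%} cooperators")
    print(f"global interactions: {run(local=False):.0%} cooperators")

    With these illustrative parameters, one typically finds that cooperation survives in the local setting but erodes when interactions become global, in line with the simulation results of [140].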

    B. Reducing Network Vulnerability

    In Sec. II C, we have seen that systemic risks are mostly based on cascade spreading effects in networks. However, the vulnerability of networks to such spreading events can be reduced. The following measures are often quite effective:

    Figure 11: A networked system should be constructed in a way that allows its quick decomposition or decompartmentalization into weakly coupled (or, if necessary, even uncoupled) subnetworks. In this way, failure cascades all over the system (or large parts of it) can be avoided, and most parts of it can be protected from damage.

    • The network structure can often be improved by redundancy, i.e. the provision of alternatives, so that an over-critical perturbation would only occur if several nodes failed or several links broke simultaneously.

    • However, too much interconnectedness may be harmful, as it provides the infrastructure for the system-wide spreading of an unexpected problem. Therefore, it makes sense to limit the degree of connectedness and the size of networks (in order to avoid a "too big to fail" problem).

    • Alternatively, one can introduce "firewalls": Having several networks, each of them characterized by strong links, while the connections between the networks are weak, would make it possible to decouple the so-defined "supernetwork" into several subnetworks (see Fig. 11). This principle of decompartmentalization allows one to prevent the spreading of a problem over the whole system, if the disconnection strategy is well chosen. The principle of firewalls to protect computer systems from malicious intrusion, or the principle of electrical fuses to protect an electrical network from overload, could certainly be transferred to other networked systems such as the financial system. (A minimal sketch of this decoupling principle follows below.)

    • For similar reasons, heterogeneity (variety) among the nodes and/or links of a network (in terms of design principles and operation strategies) will normally increase its robustness.

    • When fighting failure cascades in networks, a quick response to over-critical perturbations is absolutely decisive. If the time delay of disaster response management is small, its effectiveness depends in a complicated way on the network structure, the amount of resources, and the strategy of distributing them in the network (see Fig. 6). In case of significant delays, cascade spreading can hardly be mitigated, even when large resources are invested.

    • A moderate level of fluctuations may be useful to destroy potentially harmful correlations (such as financial bubbles) in the system. Such fluctuations could be created by central banks (for the purpose of "bubble control") or by other regulators, depending on the system. Note, however, that a large degree of fluctuations can cause over-critical perturbations or coincidences of perturbations.

    • An unhealthy degree of volatility can be lowered by introducing conservation laws and/or frictional effects in the system. This is expected to dampen fluctuations and, thereby, to reduce the likelihood of events that may trigger systemic risks.

    Rather than applying these concepts permanently, it can make sense to use them adaptively, depending on the state of the system. When designing networked systems according to the above principles, one can certainly profit from the experience of physicists and engineers with other systems.
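    To illustrate the firewall principle from the list above, the following self-contained Python sketch (an added illustration; community sizes, link densities, and the transmission probability are arbitrary choices) builds two densely connected communities joined by a few bridge links and spreads failures along links with a fixed probability. Cutting the bridges confines the cascade to one community:

    import random

    def spread(edges, n, seed_node, cut_bridges, bridges, p=0.5, seed=3):
        """Probabilistic failure spreading, optionally with bridge links cut."""
        rng = random.Random(seed)
        active = [e for e in edges if not (cut_bridges and e in bridges)]
        nbrs = {i: [] for i in range(n)}
        for a, b in active:
            nbrs[a].append(b)
            nbrs[b].append(a)
        failed, frontier = {seed_node}, [seed_node]
        while frontier:
            node = frontier.pop()
            for nb in nbrs[node]:
                if nb not in failed and rng.random() < p:
                    failed.add(nb)
                    frontier.append(nb)
        return len(failed)

    rng = random.Random(1)
    n = 60          # nodes 0-29 and 30-59 form two communities
    edges = [(i, j) for c in (range(0, 30), range(30, 60))
             for i in c for j in c if i < j and rng.random() < 0.2]
    bridges = {(0, 30), (5, 35), (10, 40)}   # weak links joining the communities
    edges += list(bridges)

    for cut in (False, True):
        size = spread(edges, n, seed_node=0, cut_bridges=cut, bridges=bridges)
        label = "with firewalls (bridges cut)" if cut else "fully connected"
        print(f"{label}: {size} of {n} nodes fail")

    In typical runs, the failure percolates through both communities when the bridges are active, but remains confined to the perturbed community once they are cut, which is exactly the decompartmentalization idea of Fig. 11.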


    V. SUMMARY, DISCUSSION, AND OUTLOOK

    In this contribution, we have summarized properties of complex systems and identified sources and drivers of systemic risks in socio-economic systems. Complex systems cannot be easily controlled. They rather tend to follow a self-organized eigendynamics, and conventional control attempts often have counter-intuitive and unintended effects.

    As the example of ecosystems shows, a networked system can have an astonishing degree of robustness without any central control. Robustness just requires the right interaction rules, which may be implemented, for example, by social norms, laws, technological measures etc., depending on the system. Properly chosen rules will lead to a self-regulation or self-control of the system, but improper specifications can lead to low performance or systemic instability. For example, if the failure rate of system elements is reduced, this may lead to larger systemic failures later on. Moreover, it is probably good if the system is regularly exposed to stress, as this is expected to strengthen its immunity to perturbations.

    It was particularly underlined that, in any larger networked system, it is essential to have firewalls ("security breaks"), which facilitate its quick decomposition or decompartmentalization into disconnected or weakly connected subnetworks before a failure cascade has percolated through the whole system or large parts of it.

    Among the success stories of complex systems research, one may mention the Nobel Prizes of Ilya Prigogine, Thomas Schelling, and Paul Krugman. Some examples of application areas of complexity science are [144-148]:

    • the organization of the internet,

    • modern epidemiology,

    • the prevention of crowd stampedes,

    • innovative solutions to improve traffic flow,

    • the understanding of global climate change,

    • the enhancement of the reliability of energy supply,

    • modern disaster response management,

    • prediction markets and other methods using the "wisdom of crowds".

    However, many socio-economic crises still occur because the system dynamics is not well enough understood, leading to serious management mistakes. In order to support decision-makers, scientists need to be put in a better position to address the increasing number of socio-economic problems. These mainly result from the fact that social and economic systems are rapidly changing, i.e. in a transformation process rather than in equilibrium.

    We must close the gap between existing socio-economic problems and solutions, and create conditions allowing us to come up with solutions before a problem occurs. This requires building up greater research capacities (a "socio-economic knowledge accelerator"). It will also be necessary to establish a new study direction ("integrative systems design") to provide decision-makers with solid knowledge regarding the behavior of complex systems, how to manage complexity in politics and economy, and how to cope with crises.

    Finally, scientists need to have access to better and more detailed data. Special super-computing centers (as for climate research) would allow scientists to simulate model societies and study the impact of policy measures before their implementation. They would also support the development of contingency plans and the investigation of alternative ways of organization ("plan B"). Such centers will require a multi-disciplinary collaboration across the various relevant research areas, ranging from the socio-economic over the natural to the engineering sciences. For this, one needs to overcome the particular challenges of multidisciplinary research regarding organization, funding, and publication.

    Considering that we know more about the origin of the universe than about the conditions for a stable society, a prospering economy, and enduring peace, we need nothing less than an "Apollo project" for the socio-economic sciences. There is no time to lose, since there are already signs of critical fluctuations indicating possible regime shifts [149-154]: The recent riots in Greece and France, for example, and the current protests against the Bologna reforms speak a clear language.

    Acknowledgements: This work was partially supported by the ETH Competence Center "Coping with Crises in Complex Socio-Economic Systems" (CCSS) through ETH Research Grant CH1-01 08-2. The author would like to thank Peter Felten for creating many of the illustrations shown in this contribution. Furthermore, the author is grateful for inspiring discussions with Stefano Battiston, Lubos Buzna, Imre Kondor, Matteo Marsili, Frank Schweitzer, Didier Sornette, Stefan Thurner, and many others.

    [1] D. Sornette (2006) Critical Phenomena in Natural Sciences: Chaos, Fractals, Selforganization and Disorder. Springer, Berlin.

    [2] S. Albeverio, V. Jentsch, and H. Kantz, eds. (2006) Extreme Events in Nature and Society. Springer, Berlin.

    [3] A. Bunde, J. Kropp, and H. J. Schellnhuber, eds. (2002) The Science of Disasters. Climate Disruptions, Heart Attacks, and Market Crashes. Springer, Berlin.


    [4] M. Buchanan (2000) Ubiquity. Why Catastrophes Happen. Three Rivers, New York.

    [5] Office of the President of Columbia University, Lee C. Bollinger (2005) Announcing the Columbia Committee on Global Thought, see http://www.columbia.edu/cu/president/docs/communications/2005-2006/051214-committee-global-thought.html

    [6] W. Weidlich (2000) Sociodynamics: A Systematic Approach to Mathematical Modelling in the Social Sciences. Harwood Academic, Amsterdam.

    [7] H. Haken (2004) Synergetics: Introduction and Advanced Topics. Springer, Berlin.

    [8] D. Helbing (1995) Quantitative Sociodynamics. Kluwer Academic, Dordrecht.

    [9] R. Axelrod (1997) The Complexity of Cooperation: Agent-Based Models of Competition and Collaboration. Princeton University, Princeton, NJ.

    [10] F. Schweitzer, ed. (1997) Self-Organization of Complex Structures: From Individual to Collective Dynamics. Gordon and Breach, Amsterdam.

    [11] D. S. Byrne (1998) Complexity Theory and the Social Sciences: An Introduction. Routledge, New York.

    [12] W. Buckley (1998) Society - A Complex Adaptive System: Essays in Social Theory. Routledge.

    [13] S. Y. Auyang (1999) Foundations of Complex-System Theories in Economics, Evolutionary Biology, and Statistical Physics. Cambridge University, Cambridge.

    [14] F. Schweitzer, ed. (2002) Modeling Complexity in Economic and Social Systems. World Scientific, Singapore.

    [15] R. K. Sawyer (2005) Social Emergence: Societies As Complex Systems. Cambridge University, Cambridge.

    [16] A. S. Mikhailov and V. Calenbuhr (2006) From Cells to Societies. Springer, Berlin.

    [17] B. Blasius, J. Kurths, and L. Stone (2007) Complex Population Dynamics. World Scientific, Singapore.

    [18] M. Batty (2007) Cities and Complexity: Understanding Cities with Cellular Automata, Agent-Based Models, and Fractals. MIT Press.

    [19] S. Albeverio, D. Andrey, P. Giordano, and A. Vancheri (2008) The Dynamics of Complex Urban Systems. Physica, Heidelberg.

    [20] K. Mainzer (2007) Thinking in Complexity: The Computational Dynamics of Matter, Mind, and Mankind. Springer, Berlin.

    [21] J. H. Miller and S. E. Page (2007) Complex Adaptive Systems: An Introduction to Computational Models of Social Life. Princeton University, Princeton, NJ.

    [22] J. M. Epstein (2007) Generative Social Science: Studies in Agent-Based Computational Modeling. Princeton University, Princeton, NJ.

    [23] B. Castellani and F. Hafferty (2009) Sociology and Complexity Science. Springer, Berlin.

    [24] D. Lane, D. Pumain, S. E. van der Leeuw, and G. West, eds. (2009) Complexity Perspectives in Innovation and Social Change. Springer, Berlin.

    [25] C. Castellano, S. Fortunato, and V. Loreto (2009) Statistical physics of social dynamics. Rev. Mod. Phys. 81, 591-646.

    [26] P. W. Anderson, K. Arrow, and D. Pines, eds. (1988) The Economy as an Evolving Complex System. Westview.

    [27] H. W. Lorenz (1993) Nonlinear Dynamical Equations and Chaotic Economy. Springer, Berlin.

    [28] P. Krugman (1996) The Self-Organizing Economy. Blackwell, Malden, MA.

    [29] W. B. Arthur, S. N. Durlauf, and D. A. Lane (1997) The Economy as an Evolving Complex System II. Westview.

    [30] R. N. Mantegna and H. E. Stanley (1999) An Introduction to Econophysics: Correlations and Complexity in Finance. Cambridge University, Cambridge.

    [31] D. Colander, ed. (2000) The Complexity Vision and the Teaching of Economics. Elgar, Cheltenham, UK.

    [32] J. D. Sterman (2000) Business Dynamics: Systems Thinking and Modeling for a Complex World. McGraw Hill.

    [33] T. Puu (2003) Attractors, Bifurcations, & Chaos. Nonlinear Phenomena in Economics. Springer, Berlin.

    [34] L. E. Blume and S. N. Durlauf, eds. (2006) The Economy as an Evolving Complex System III. Oxford University, Oxford.

    [35] B. K. Chakrabarti, A. Chakraborti, and A. Chatterjee (2006) Econophysics and Sociophysics. Wiley, New York.

    [36] A. Chatterjee and B. K. Chakrabarti, eds. (2007) Econophysics of Markets and Business Networks. Springer, Milan.

    [37] A. C.-L. Chian (2007) Complex Systems Approach to Economic Dynamics. Springer, Berlin.

    [38] V. Lacayo, What complexity science teaches us about social change, see http://www.communicationforsocialchange.org/mazi-articles.php?id=333

    [39] G. Nicolis and I. Prigogine (1977) Self-Organization in Nonequilibrium Systems. Wiley, New York.

    [40] A. S. Mikhailov (1991) Foundations of Synergetics I: Distributed Active Systems. Springer, Berlin.

    [41] A. S. Mikhailov and A. Y. Loskutov (1991) Foundations of Synergetics II: Complex Patterns. Springer, Berlin.

    [42] S. H. Strogatz (2001) Nonlinear Dynamics and Chaos: With Applications To Physics, Biology, Chemistry, And Engineering. Westview.

    [43] N. Boccara (2003) Modeling Complex Systems. Springer, Berlin.

    [44] J. Jost (2005) Dynamical Systems: Examples of Complex Behaviour. Springer, Berlin.

    [45] G. Nicolis (2007) Foundations of Complex Systems. World Scientific, New York.

    [46] P. Érdi (2007) Complexity Explained. Springer, Berlin.

    [47] C. Gros (2008) Complex and Adaptive Dynamical Systems. Springer, Berlin.

    [48] M. Schroeder (2009) Fractals, Chaos, Power Laws. Dover.

    [49] A. Saichev, Y. Malevergne, and D. Sornette (2010) Theory of Zipf's Law and Beyond. Springer, Berlin.

    [50] D. Helbing, I. Farkas, and T. Vicsek (2000) Simulating dynamical features of escape panic. Nature 407, 487-490.

    [51] D. Helbing and A. Mazloumian (2009) Operation regimes and slower-is-faster effect in the control of traffic intersections. European Physical Journal B 70(2), 257-274.

    [52] D. Dörner (1997) The Logic Of Failure: Recognizing and Avoiding Error in Complex Situations. Basic, New York.

    [53] D. Helbing (2009) Managing complexity in socio-economic systems. European Review 17(2), 423-438.

    [54] R. H. Thaler, ed. (1993) Advances in Behavioral Finance. Russell Sage Foundation, New York.

    [55] C. F. Camerer, G. Loewenstein, and M. Rabin, eds. (2004) Advances in Behavioral Economics. Princeton University, Princeton, NJ.

    [56] R. H. Thaler, ed. (2005) Advances in Behavioral Fi-nance, Vol. II. Princeton University, Princeton, NJ.

    [57] M. Schonhof and D. Helbing (2007) Empirical featuresof congested traffic states and their implications for traf-fic modeling. Transportation Science 41(2), 135-166.

    [58] H. E. Stanley (1987) Introduction to Phase Transitionsand Critical Phenomena. Oxford University.

    [59] N. Goldenfeld (1992) Lectures On Phase Transitionsand the Renormalization Group. Westview.

    [60] E. C. Zeeman, ed. (1977) Catastrophe Theory. Addison-Wesley, London.

    [61] T. Poston and I. Stewart (1996) Catastrophe Theory andIts Applications. Dover.

    [62] M. Gladwell (2002) The Tipping Point: How LittleThings Can Make a Big Difference. Back Bay.

    [63] V. I. Arnold, G. S. Wassermann, and R. K. Thomas(2004) Catastrophe Theory. Springer, Berlin.

    [64] P. Doreian and F. N. Stokman, eds. (1997) Evolution ofSocial Networks. Routledge, Amsterdam.

    [65] D. J. Watts (1999) Small Worlds. Princeton University,Princeton, NJ.

    [66] B. A. Huberman (2001) The Laws of the Web: Patternsin the Ecology of Information. MIT Press.

    [67] R. Albert and A.-L. Barabási (2002) Statistical mechanics of complex networks. Reviews of Modern Physics 74, 47-97.

    [68] S. Bornholdt and H. G. Schuster (2003) Handbook of Graphs and Networks: From the Genome to the Internet. Wiley-VCH.

    [69] S. N. Dorogovtsev and J. F. F. Mendes (2003) Evolution of Networks. Oxford University, Oxford.

    [70] P. J. Carrington, J. Scott, and S. Wasserman (2005) Models and Methods in Social Network Analysis. Cambridge University, New York.

    [71] M. Newman, A.-L. Barabási, and D. J. Watts, eds. (2006) The Structure and Dynamics of Networks. Princeton University, Princeton, NJ.

    [72] F. Vega-Redondo (2007) Complex Social Networks. Cambridge University, Cambridge.

    [73] M. O. Jackson (2008) Social and Economic Networks. Princeton University, Princeton, NJ.

    [74] J. Bruggeman (2008) Social Networks. Routledge, New York.

    [75] A. Barrat, M. Barthélemy, and A. Vespignani (2008) Dynamical Processes on Complex Networks. Cambridge University, Cambridge.

    [76] J. Reichardt (2009) Structure in Complex Networks. Springer, Berlin.

    [77] A. K. Naimzada, S. Stefani, and A. Torriero, eds. (2009) Networks, Topology, and Dynamics. Springer, Berlin.

    [78] A. Pyka and A. Scharnhorst, eds. (2009) Innovation Networks. Springer, Berlin.

    [79] T. Gross and H. Sayama, eds. (2009) Adaptive Networks. Springer, Berlin.

    [80] I. Simonsen, L. Buzna, K. Peters, S. Bornholdt, and D. Helbing (2008) Transient dynamics increasing network vulnerability to cascading failures. Physical Review Letters 100, 218701.

    [81] J. Lorenz, S. Battiston, and F. Schweitzer (2009) Systemic risk in a unifying framework for cascading processes on networks. The European Physical Journal B 71(4), 441-460.

    [82] D. Helbing and C. Kühnert (2003) Assessing interaction networks with applications to catastrophe dynamics and disaster management. Physica A 328, 584-606.

    [83] L. Buzna, K. Peters, and D. Helbing (2006) Modelling the dynamics of disaster spreading in networks. Physica A 363, 132-140.

    [84] L. Buzna, K. Peters, H. Ammoser, C. Kühnert, and D. Helbing (2007) Efficient response to cascading disaster spreading. Physical Review E 75, 056107.

    [85] D. Helbing, H. Ammoser, and C. Kühnert (2005) Disasters as extreme events and the importance of network interactions for disaster response management. In [19], pp. 319-348.

    [86] J. Reason (1990) Human Error. Cambridge University, Cambridge.

    [87] J. R. Chiles (2002) Inviting Disaster: Lessons from the Edge of Technology. Harper.

    [88] P. Bak (1999) How Nature Works: The Science of Self-Organized Criticality. Springer, Berlin.

    [89] H. J. Jensen (1998) Self-Organized Criticality: Emergent Complex Behavior in Physical and Biological Systems. Cambridge University, Cambridge.

    [90] D. Helbing (2001) Traffic and related self-driven many-particle systems. Reviews of Modern Physics 73, 1067-1141.

    [91] D. de Martino, L. Dall'Asta, G. Bianconi, and M. Marsili (2009) Congestion phenomena on complex networks. Physical Review E 79, 015101.

    [92] P. A. Davidson (2004) Turbulence. Cambridge University, Cambridge.

    [93] H. G. Schuster and W. Just (2005) Deterministic Chaos. Wiley-VCH, Weinheim.

    [94] N. G. van Kampen (2007) Stochastic Processes in Physics and Chemistry. North-Holland, Amsterdam.

    [95] G. Deco and B. Schürmann (2001) Information Dynamics. Springer, Berlin.

    [96] R. A. Jones (1981) Self-fulfilling Prophecies: Social, Psychological, and Physiological Effects of Expectancies. Lawrence Erlbaum.

    [97] R. E. A. Farmer (1999) Macroeconomics of Self-fulfilling Prophecies. MIT Press.

    [98] J. D. Murray (2003) Mathematical Biology, Vol. I. Springer, Berlin.

    [99] M. Scheffer, J. Bascompte, W. A. Brock, V. Brovkin, S. R. Carpenter, V. Dakos, H. Held, E. H. van Nes, M. Rietkerk, and G. Sugihara (2009) Early-warning signals for critical transitions. Nature 461, 53-59.

    [100] W. Michiels and S.-I. Niculescu (2007) Stability and Stabilization of Time-Delay Systems. Society for Industrial and Applied Mathematics (SIAM), Philadelphia.

    [101] N. N. Taleb (2007) The Black Swan: The Impact of the Highly Improbable. Random House.

    [102] E. E. Peters (1996) Chaos and Order in the Capital Markets. Wiley, New York.

    [103] S. Claessens and K. J. Forbes, eds. (2001) International Financial Contagion. Kluwer Academic, Dordrecht.

    [104] K. Ilinski (2001) Physics of Finance: Gauge Modelling in Non-Equilibrium Pricing. Wiley, Chichester.

    [105] B. M. Roehner (2002) Patterns of Speculation. Cambridge University, Cambridge.

    [106] D. Sornette (2004) Why Stock Markets Crash: Critical Events in Complex Financial Systems. Princeton University, Princeton, NJ.

    [107] B. Mandelbrot and R. L. Hudson (2006) The Misbehavior of Markets: A Fractal View of Financial Turbulence. Basic, New York.

    [108] M. Faggini and T. Lux, eds. (2009) Coping with the Complexity of Economics. Springer, Milan.

    [109] R. J. Breiding, M. Christen, and D. Helbing (April 2009) Lost robustness. Naissance Newsletter, 8-14.

    [110] R. M. May, S. A. Levin, and G. Sugihara (2008) Ecology for bankers. Nature 451, 893-895.

    [111] S. Battiston, D. Delli Gatti, M. Gallegati, B. C. N. Greenwald, and J. E. Stiglitz (2009) Liaisons dangereuses: Increasing connectivity, risk sharing and systemic risk, see http://www3.unicatt.it/unicattolica/CentriRicerca/CSCC/allegati/delligatti.pdf

    [112] I. Kondor and I. Varga-Haszonits (2008) Divergent estimation error in portfolio optimization and in linear regression. The European Physical Journal B 64, 601-605.

    [113] G. A. Akerlof and R. J. Shiller (2009) Animal Spirits: How Human Psychology Drives the Economy, and Why It Matters for Global Capitalism. Princeton University, Princeton, NJ.

    [114] G. Le Bon (2002) The Crowd: A Study of the Popular Mind. Dover.

    [115] W. Trotter (2005) Instincts of the Herd in Peace and War. Cosimo Classics, New York.

    [116] C. Mackay (2003) Extraordinary Popular Delusions and the Madness of Crowds.

    [117] J. Surowiecki (2005) The Wisdom of Crowds. Anchor.

    [118] P. Ball (2004) Critical Mass. Arrow, London.

    [119] R. Falcone, K. S. Barber, J. Sabater-Mir, and M. P. Singh, eds. (2009) Trust in Agent Societies. Springer, Berlin.

    [120] S. Thurner, J. D. Farmer, and J. Geanakoplos (2009) Leverage causes fat tails and clustered volatility. E-print http://arxiv.org/abs/0908.1555.

    [121] R. Axelrod and M. D. Cohen (2001) Harnessing Complexity: Organizational Implications of a Scientific Frontier. Basic, New York.

    [122] H. Eisner (2005) Managing Complex Systems: Thinking Outside the Box. Wiley, New York.

    [123] L. Hurwicz and S. Reiter (2006) Designing Economic Mechanisms. Cambridge University, New York.

    [124] M. Salzano and D. Colander, eds. (2007) Complexity Hints for Economic Policy. Springer, Berlin.

    [125] E. Schöll and H. G. Schuster, eds. (2008) Handbook of Chaos Control. Wiley-VCH, Weinheim.

    [126] D. Grass, J. P. Caulkins, G. Feichtinger, G. Tragler, and D. A. Behrens (2008) Optimal Control of Nonlinear Processes: With Applications in Drugs, Corruption, and Terror. Springer, Berlin.

    [127] D. Helbing, ed. (2008) Managing Complexity: Insights, Concepts, Applications. Springer, Berlin.

    [128] L. A. Cox Jr. (2009) Risk Analysis of Complex and Uncertain Systems. Springer, New York.

    [129] M. Gallegati and A. Kirman, eds. (1999) Beyond the Representative Agent. Edward Elgar, Cheltenham, UK.

    [130] A. Consiglio, ed. (2007) Artificial Markets Modeling. Springer, Berlin.

    [131] M. Aoki and H. Yoshikawa (2007) Reconstructing Macroeconomics. Cambridge University, Cambridge.

    [132] K. Schredelseker and F. Hauser, eds. (2008) Complexity and Artificial Markets. Springer, Berlin.

    [133] D. Delli Gatti, E. Gaffeo, M. Gallegati, G. Giulioni, and A. Palestrini (2008) Emergent Macroeconomics: An Agent-Based Approach to Business Fluctuations. Springer, Milan.

    [134] J. Surowiecki (2005) The Wisdom of Crowds. Anchor.

    [135] H. Rheingold (2003) Smart Mobs. Perseus.

    [136] D. Floreano and C. Mattiussi (2008) Bio-Inspired Artificial Intelligence: Theories, Methods, and Technologies. MIT Press.

    [137] D. Helbing, A. Deutsch, S. Diez, K. Peters, Y. Kalaidzikis, K. Padberg, S. Lämmer, A. Johansson, G. Breier, F. Schulze, and M. Zerial (2009) BioLogistics and the struggle for efficiency: Concepts and perspectives. Advances in Complex Systems, in print, see e-print http://www.santafe.edu/research/publications/wpabstract/200910041

    [138] S. Lämmer and D. Helbing (2008) Self-control of traffic lights and vehicle flows in urban road networks. Journal of Statistical Mechanics: Theory and Experiment (JSTAT), P04019.

    [139] K. Windt, T. Philipp, and F. Böse (2007) Complexity cube for the characterization of complex production systems. International Journal of Computer Integrated Manufacturing 21(2), 195-200.

    [140] D. Helbing, W. Yu, and H. Rauhut (2009) Self-organization and emergence in social systems: Modeling the coevolution of social environments and cooperative behavior, see e-print http://www.santafe.edu/research/publications/wpabstract/200907026.

    [141] E. Ostrom (1990) Governing the Commons: The Evolution of Institutions for Collective Action. Cambridge University, New York.

    [142] G. Hardin (1995) Living within Limits: Ecology, Economics, and Population Taboos. Oxford University.

    [143] J. A. Baden and D. S. Noonan, eds. (1998) Managing the Commons. Indiana University, Bloomington, Indiana.

    [144] GIACS Complexity Roadmap, see http://users.isi.it/giacs/roadmap

    [145] The Complex Systems Society Roadmap, see http://www.soms.ethz.ch/research/complexityscience/roadmap

    [146] ComplexityNET report on European Research Landscape, see http://www.soms.ethz.ch/research/complexityscience/European Complexity Landscape D2.2 short report.pdf

    [147] EU report on Tackling Complexity in Science, see http://www.soms.ethz.ch/research/complexityscience/EU complexity report.pdf

    [148] OECD Global Science Forum report on Applications of Complexity Science for Public Policy, see http://www.oecd.org/dataoecd/44/41/43891980.pdf

    [149] J. A. Tainter (1988) The Collapse of Complex Societies. Cambridge University, Cambridge.

    [150] J. E. Stiglitz (2003) Globalization and Its Discontents. Norton, New York.

    [151] D. Meadows, J. Randers, and D. Meadows (2004) Limits to Growth: The 30-Year Update. Chelsea Green, White River Junction, Vermont.

    [152] J. Diamond (2005) Collapse: How Societies Choose to Fail or Succeed. Penguin, New York.

    [153] R. Costanza, L. J. Graumlich, and W. Steffen, eds. (2007) Sustainability or Collapse? MIT Press, Cambridge, MA.

    [154] P. Krugman (2009) The Return of Depression Economics and the Crisis of 2008. Norton, New York.