Université YORK University
SCIENCE AND TECHNOLOGY STUDIES



Lecture 3:  Is the World Simpler than We Think?

| Prev | Next | Search | Syllabus | Selected References | Home |



  • One of the books I suggested you read (see Syllabus) is Mark Buchanan's Ubiquity: The Science of History… Or Why the World is Simpler than We Think. The main reason for recommending such a book is to show you an example of the kind of unification of very different phenomena under a common explanation that science is always struggling to achieve—often, but by no means always, successfully. The immediate motivation is the recent string of earthquakes which struck the Hokkaido region in Japan.

    For our purposes, Buchanan begins by noting that "mainstream geophysicists are cautious to the point of paranoia when it comes to making any earthquake predictions. This is not just due to their healthy respect for the social disruption that any prediction causes. In truth, it is the direct result of a long string of embarrassing failures in earthquake prediction by scientists themselves. After a century of research, virtually all quakes still come completely unannounced." [ M Buchanan, Ubiquity: The Science of History… Or Why the World is Simpler than We Think. Phoenix 2001, p. 21 ]

    As quoted by Buchanan, here is some background as summarized by T Rikitake:
    "Many Japanese seismologists, earthquake engineers, and national and local officials responsible for disaster prevention are quite convinced nowadays that a great quake of magnitude 8 or so will hit the Tokai area … in central Japan between Tokyo and Nagoya in the near future … The targeted area was often struck by great earthquakes in historical times such as the 1854 and 1707 earthquakes … The mean return period of recurrence of great earthquakes there is estimated at about 120 years. As more than 120 years have already passed since the last shock, there is reason to believe an earthquake will recur sooner or later." [ Buchanan, Op. cit., p. 22 ]
    Such an assessment seems quite reasonable. Science looks for regularities, patterns, trends, etc., and on that basis it ventures its predictions. But predictions are really hypotheses about the causes of the phenomenon in question, and only empirical evidence can validate or invalidate them.

    Statistics about past events are often tricky to deal with. Think, for example, of a weekly lottery. If the lottery is fair, the fact that a given number has not turned up for a while does not make it more likely to turn up next time.
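    This point (the so-called gambler's fallacy) is easy to check numerically. Here is a minimal sketch, using an invented 1-in-10 lottery; the numbers are illustrative only:

```python
import random

random.seed(1)
# A fair weekly lottery drawing one number from 1 to 10.
draws = [random.randint(1, 10) for _ in range(200_000)]

# Overall frequency of, say, the number 7...
overall = sum(1 for d in draws if d == 7) / len(draws)

# ...versus its frequency immediately after a long "drought"
# (no 7 anywhere in the previous 20 draws).  If the draws are
# independent, the two frequencies should agree.
after_drought = [draws[i] for i in range(20, len(draws))
                 if 7 not in draws[i - 20:i]]
drought_rate = sum(1 for d in after_drought if d == 7) / len(after_drought)

print(round(overall, 2), round(drought_rate, 2))  # both close to 0.10
```

    The drought makes no difference: past fair draws carry no information about the next one.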

    The failure to reliably predict the occurrence of earthquakes seems to indicate that the regularities mentioned by Rikitake may not be so … regular, and that we are missing some crucial element. This is not surprising: earthquakes are very complex phenomena, and it is very difficult even to determine whether we have a reasonably complete picture of any one of them:
    "Earthquake prediction research has been conducted for over 100 years with no obvious successes. Claims of breakthroughs have failed to withstand scrutiny. Extensive searches have failed to find reliable precursors … reliable issuing of alarms of imminent large earthquakes appears to be effectively impossible." [ Buchanan, Op. cit., p. 23 ]
    Notice however that such skepticism is not shared by everybody. See for example, Earthquake Forecast Program Has Amazing Success Rate.
    "A NASA funded earthquake prediction program has an amazing track record. Published in 2002, the Rundle-Tiampo Forecast has accurately predicted the locations of 15 of California's 16 largest earthquakes this decade, including last week's tremors [ … ] 'We're elated our computer modeling technique has revealed a relationship between past and future earthquake locations,' said Dr. John Rundle, director of the Computational Science and Engineering initiative at UC Davis."
  • The beginning of a breakthrough happened in 1956, when Beno Gutenberg and Charles Richter,
    "sifting through hundreds of books and papers … assembled details about many earthquakes all over the world. In each case, they noted the earthquake's magnitude. Next, they counted how many earthquakes had magnitudes between 2 and 2.5. Then they did the same for all those having magnitudes between 2.5 and 3, and so on. Continuing this way, they constructed a set of data showing the relative frequencies of earthquakes of different sizes. This relationship can be displayed visually in a simple graph [see the figure below]" [ Buchanan, Op. cit., pp. 34-36 ]
    Here is a sketch of the results [ from Geo 107, Earthquakes by Larry Ruff, University of Michigan ]


    Number vs Magnitude of Earthquakes


    The relationship log N = a - bM (equivalently, N = 10^a · 10^(-bM)) is the Gutenberg-Richter law, an example of a power law. In our case, a = 8.7 and b = 1.15 ≅ 1. If you don't know how the magnitude of earthquakes is defined, you will find a clean, simple summary in Earthquake Magnitudes and the Gutenberg Richter Law. If you prefer a more technical account, read Unified Scaling Law for Earthquakes.
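    With the quoted values a = 8.7 and b ≈ 1, the law can be evaluated directly; a minimal sketch:

```python
# Gutenberg-Richter: log10(N) = a - b*M, i.e. N = 10**(a - b*M),
# with the values quoted above.
a, b = 8.7, 1.0

def expected_count(magnitude):
    """Number of earthquakes of a given magnitude predicted by the law."""
    return 10 ** (a - b * magnitude)

# With b = 1, each step up in magnitude makes quakes ten times rarer:
ratio = expected_count(5) / expected_count(6)
print(round(ratio, 6))  # → 10.0
```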

    Gutenberg and Richter's work was nicely summarized in layman's terms in a June 2002 article in Scientific American entitled Scaling the Quakes: Why Aftershocks May Not Really Be Aftershocks After All. In plain English:
    "Recall that when the magnitude goes up by 1, the energy released in the quake goes up by 10. In terms of energy, it turns out that the Gutenberg-Richter law boils down to one very simple rule: if earthquakes of type A release twice the energy of those of type B, then type A quakes happen four times less frequently. Double the energy, that is, and an earthquake becomes four times as rare … This simple pattern holds for quakes over a tremendous range of energies."
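    The quoted rule pins down the exponent of the power law: if doubling the energy makes a quake four times rarer, the frequency must fall off as the inverse square of the energy. A quick arithmetic check:

```python
import math

# "Double the energy, and an earthquake becomes four times as rare."
# For a power law f(E) ~ E**(-alpha), f(2E)/f(E) = 2**(-alpha),
# so a factor of 1/4 per doubling means alpha = log2(4) = 2.
alpha = math.log(4, 2)
print(round(alpha, 3))  # → 2.0

def relative_frequency(energy, alpha=2.0):
    """Frequency of quakes of a given energy, up to a constant."""
    return energy ** (-alpha)

# Doubling the energy quarters the frequency:
print(relative_frequency(2.0) / relative_frequency(1.0))  # → 0.25
```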
  • And here is an important conclusion:
    "The scaling law supports the long-anticipated idea that earthquakes are self-organized critical phenomena, the investigators write in the April 29 Physical Review Letters. For such phenomena, a small change triggers a chain reaction of larger disturbances after some critical threshold is passed. A sandpile is the classic example of these systems: once it attains a certain slope, the addition of just a few extra grains will cause an avalanche. If real, the connection between earthquakes and self-organized critical phenomena suggests that one process is responsible for all quakes. 'It shows that one cannot understand individual earthquakes independently,' Christensen says."
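    The sandpile in the quote has a standard toy version, the Bak-Tang-Wiesenfeld model, which Buchanan discusses at length. Here is a minimal simulation sketch; the grid size, the number of grains, and the avalanche-size cutoffs are arbitrary choices:

```python
import random

def topple(grid, size):
    """Relax the pile: any site holding 4 or more grains sheds one grain
    to each of its 4 neighbours (grains falling off the edge are lost).
    Returns the avalanche size, i.e. the total number of topplings."""
    avalanche = 0
    unstable = [(i, j) for i in range(size) for j in range(size)
                if grid[i][j] >= 4]
    while unstable:
        i, j = unstable.pop()
        if grid[i][j] < 4:
            continue
        grid[i][j] -= 4
        avalanche += 1
        for ni, nj in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
            if 0 <= ni < size and 0 <= nj < size:
                grid[ni][nj] += 1
                if grid[ni][nj] >= 4:
                    unstable.append((ni, nj))
    return avalanche

random.seed(0)
size = 15
grid = [[0] * size for _ in range(size)]

# Drop grains one at a time.  After a transient, the pile self-organises
# into a critical state with avalanches of all sizes.
sizes = []
for _ in range(10_000):
    i, j = random.randrange(size), random.randrange(size)
    grid[i][j] += 1
    sizes.append(topple(grid, size))

small = sum(1 for s in sizes if 0 < s <= 5)
large = sum(1 for s in sizes if s > 50)
print(small, large)  # many small avalanches, far fewer large ones
```

    No grain is special: the same dropping rule produces both the tiny slides and the system-spanning avalanches, which is exactly the point of the quote.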
  • As we have seen, power laws are characterised by the property that the distribution (of potato shards, earthquakes, rain events, avalanches, etc.) looks the same, regardless of the scale one chooses. We say that these phenomena possess scale invariance or fractal scaling properties or self-similarity. In other words, "An object is said to be self-similar if it looks 'roughly' the same on any scale. Fractals are a particularly interesting class of self-similar objects."
    [ Self-Similarity ]

    A famous example of a fractal is the Mandelbrot Set


    The Mandelbrot Set


  • To illustrate the idea of power law, Buchanan refers to frozen potatoes:
    "Frozen potatoes are like rocks—brittle, and ready to shatter under the force of a sharp impact. Throw one against a wall and you end up with a pile of shards of different sizes … What is the typical size? To find out, you might throw a thousand or so potatoes against the wall … You would find a featureless curve of the Gutenberg-Richter sort. There would be a huge number of tiny fragments the size of grape seeds, and the number of fragments would then fall off gradually as you considered larger sizes. … In fact, you would find that the numbers of the larger fragments dwindle in an exceptionally regular way: every time you double the weight of the fragments you're looking at, their number will decrease by a factor of about six. This is the same kind of power-law pattern found by Gutenberg and Richter, the only difference being that here the reduction in numbers that follows from each doubling in weight involves a factor of six rather than four." [ Buchanan, Op. cit., pp. 38-39 ]
    Another illustration, which was also verified experimentally, is a simple pile of rice. See V Frette, K Christensen, A Malthe-Soerenssen, J Feder, T Joessang, and P Meakin, Avalanche Dynamics in a Pile of Rice, which appeared in Nature 379, 49-52 (1996).
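    The "factor of about six" translates directly into a power-law exponent: if doubling the fragment weight divides the count by six, the count falls off as the weight raised to the power log2 6 ≈ 2.6 (versus log2 4 = 2 for the Gutenberg-Richter case). A quick check:

```python
import math

# "Every time you double the weight of the fragments you're looking at,
# their number will decrease by a factor of about six."  For a power law
# N(w) ~ w**(-alpha), doubling w divides N by 2**alpha, so:
alpha_potato = math.log(6, 2)   # exponent implied by a factor of 6
alpha_quake = math.log(4, 2)    # factor of 4, as for earthquakes

print(round(alpha_potato, 3))  # ≈ 2.585
print(round(alpha_quake, 3))   # → 2.0

# Check: with this exponent, counts at doubled weights fall by ~6.
def count(weight, alpha=alpha_potato):
    return weight ** (-alpha)

print(round(count(1.0) / count(2.0), 3))  # ≈ 6.0
```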
  • As a matter of fact, the number of vastly different phenomena obeying a power law is quite large. Read Buchanan's book if you are curious. For a taste, browse through Earthquakes in the Sky, an article by O Peters, C Hertlein, and K Christensen, which discusses rainfall and shows "that rain events are analogous to a variety of nonequilibrium relaxational processes in Nature such as earthquakes and avalanches." This article was popularized in Rain is 'earthquake in the sky', which appeared in the May 3, 2002 issue of New Scientist, as well as in Rainfall and Temblors May Be a Lot Alike, which appeared in Mike Martin's WeeklyScientist in February 2002.

    Finally, "USGS studies indicate that life and property losses from earthquakes, hurricanes, floods, and tornadoes exhibit fractal scaling behavior which can be used to forecast future losses." [ Natural Disasters: Forecasting Hurricane Occurrence, Economic, and Life Losses ]

Readings, Resources and Questions

  • Read Earthquakes Can Be Predicted Months in Advance, Report UCLA Scientists Who Predicted San Simeon Earthquake. Professor Keilis-Borok's methods involve many different theoretical tools, some of which are related to the power law described above.
    "… we look backwards to make our earthquake predictions. First, we search for quickly formed long chains of small earthquakes. Each chain is our candidate to a newly discovered short-term precursor. In the vicinity of each such chain, we look backwards, and see its history over the preceding years—whether our candidate was preceded by certain seismicity patterns. If yes, we accept the candidate as a short-term precursor and start a nine-month alarm. If not, we disregard this candidate." [ ibidem ]
    In addition to the purely scientific issues involved, the work of this and other scientists involves very thorny policy and ethics questions.
    "Keilis-Borok’s team communicates the predictions to disaster management authorities in the countries where a destructive earthquake is predicted. These authorities might use such predictions, although their accuracy is not 100 percent, to prevent considerable damage from the earthquakes—save lives and reduce economic losses—by undertaking such preparedness measures as conducting simulation alarms, checking vulnerable objects and mobilizing post-disaster services, Keilis-Borok said."
    Given the long history of failures in predicting earthquakes, scientists must exercise very careful judgment in issuing predictions for events which may not happen, in order not to undermine the public's and the authorities' confidence in science, to avoid costly evacuations, and so on. Such considerations of course apply also to tornadoes, hurricanes, tidal waves, and other natural phenomena. For an example, visit CISN:
    "Large earthquakes in California are inevitable. However, the degree to which this natural hazard results in future losses of life and property in the State depends on our collective understanding of the earthquake problem and our investment in ways to mitigate the earthquake effects. Seismic monitoring is the foundation upon which earthquake understanding and mitigation practices are built. To better serve the emergency response, engineering, and scientific communities, several agencies in California have formed the California Integrated Seismic Network (CISN)."
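    The decision procedure Keilis-Borok describes in the quote above (find candidate chains of small quakes, look backwards for a preceding seismicity pattern, then start a nine-month alarm) can be sketched in code. Everything below (thresholds, data layout, function names) is invented for illustration; it is emphatically not the team's actual algorithm:

```python
# A toy sketch of the quoted decision procedure.  All thresholds and
# data structures are hypothetical.

def is_candidate_chain(chain, min_length=10, max_span_days=60):
    """A 'quickly formed long chain of small earthquakes': enough events
    packed into a short time window (hypothetical thresholds)."""
    days = [day for day, magnitude in chain]
    return len(chain) >= min_length and max(days) - min(days) <= max_span_days

def issue_alarm(chain, preceding_pattern_seen):
    """Accept the chain as a short-term precursor only if the preceding
    years showed the required seismicity pattern; then start a
    nine-month alarm (returned here as a duration in months)."""
    if is_candidate_chain(chain) and preceding_pattern_seen:
        return 9      # nine-month alarm
    return None       # candidate disregarded

# A hypothetical chain: (day, magnitude) pairs over a few weeks.
chain = [(day, 3.0) for day in range(0, 40, 4)]
print(issue_alarm(chain, preceding_pattern_seen=True))   # → 9
print(issue_alarm(chain, preceding_pattern_seen=False))  # → None
```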
  • Here is a recent, very interesting press release from UCLA:
    Strong Earth Tides Can Trigger Earthquakes, UCLA Scientists Report
    Office of Media Relations, University of California-Los Angeles
    Contact: Stuart Wolpert, Phone: 310-206-0511
    Date: October 21, 2004
    Earthquakes can be triggered by the Earth's tides, UCLA scientists confirmed Oct. 21 in Science Express, the online journal of Science. Earth tides are produced by the gravitational pull of the moon and the sun on the Earth, causing the ocean's waters to slosh, which in turn raises and lowers stress on faults roughly twice a day. Scientists have wondered about the effects of Earth tides for more than 100 years. (The research will be published in the print version of Science in November.)

    "Large tides have a significant effect in triggering earthquakes," said Elizabeth Cochran, a UCLA graduate student in Earth and space sciences and lead author of the Science paper. "The earthquakes would have happened anyway, but they can be pushed sooner or later by the stress fluctuations of the tides."

    "Scientists have long suspected the tides played a role, but no one has been able to prove that for earthquakes worldwide until now," said John Vidale, UCLA professor of Earth and space sciences, interim director of UCLA's Institute of Geophysics and Planetary Physics, and co-author of the paper. "Earthquakes have shown such clear correlations in only a few special settings, such as just below the sea-floor or near volcanoes."

    "There are many mysteries about how earthquakes occur, and this clears up one of them," Vidale said. "We find that it takes about the force arising from changing the sea level by a couple of meters of water to noticeably affect the rate of earthquakes. This is a concrete step in understanding what it takes to set off an earthquake."

    Cochran, Vidale and co-author Sachiko Tanaka are the first researchers to factor in both the phase of the tides and the size of the tides, and are using calculations of the effects of the tides more accurate than were available just three years ago. Tanaka is a seismologist with Japan's National Research Institute for Earth Science and Disaster Prevention.

    Cochran and Vidale analyzed more than 2,000 earthquakes worldwide, magnitude 5.5 and higher, which struck from 1977 to 2000. They studied earthquakes in "subduction zones" where one tectonic plate dives under another, such as near the coasts of Alaska, Japan, New Zealand and western South America. "These earthquakes show a correlation with tides because along continent edges ocean tides are strong," Vidale said, "and the orientation of the fault plane is better known than for faults elsewhere."

    Cochran conducted a statistical analysis of the earthquakes and tidal stress data, using state-of-the-science tide calculations from Tanaka and the best global earthquake data, which came from Harvard seismologists. This research follows up on a 2002 study by Tanaka. The current research was funded by the National Science Foundation and the Lawrence Livermore National Laboratory.

    Cochran and Vidale found a strong correlation between when earthquakes strike and when tidal stress on fault planes is high, and the likelihood of these results occurring by chance is less than one in 10,000, Cochran said. They found that strong tides impose enough stress on shallow faults to trigger earthquakes. If the tides are very large, more than two meters, three-quarters of the earthquakes occur when tidal stress acts to encourage triggering, she found. Fewer earthquakes are triggered when the tides are smaller.

    In California, and in fact in most places in the world, the correlation between earthquakes and tides is considerably smaller, Vidale said. In California, tides may vary the rate of earthquakes at most one or two percent; the overall effect of the tides is smaller, he said, because the faults studied are many miles inland from the coast and the tides are not particularly large.
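    The statistical claim in the release (three-quarters of quakes in the "encouraging" tidal phase, with a chance probability below one in 10,000) has the flavour of a binomial test. Here is a minimal sketch; the sample size of 100 quakes is invented for illustration, and this is not the authors' actual analysis:

```python
import math

# Under the null hypothesis that tides are irrelevant, each earthquake
# falls in the "encouraging" half of the tidal cycle with probability
# 1/2.  Observing three-quarters of quakes there is then a binomial
# tail event.  (The actual study used over 2,000 events; 100 is an
# invented sample size for illustration.)

def binomial_tail(n, k, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p ** i * (1 - p) ** (n - i)
               for i in range(k, n + 1))

p_value = binomial_tail(100, 75)
print(p_value < 1e-4)  # → True: 75 of 100 is wildly unlikely by chance
```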
  • Here are the opening and closing paragraphs of a 2002 article entitled Rain: Relaxation in the Sky, by O Peters and K Christensen:
    "Water is a precondition for human survival and civilisation. For this reason, measurements on water resources have been recorded for several centuries. A time series from the Roda gauge at the Nile reaches back to the year 622 AD. The main focus of analysis has historically been on statistics yielding a reliable estimate for the rainfall during the growth season. The most obvious question to ask in this context is: How much does it rain, on average, in the relevant months? Questions of this type can be answered using long time series without high temporal resolution, and a measurement of relatively low sensitivity may be sufficient. Entirely different levels of resolution and precision are needed in order to penetrate further into the complexity of precipitation processes. One might want to know just how reliable -- or in fact how meaningful -- is an estimate of future rainfall based on averages from the past. Of course, one would ultimately like to understand the processes that make a cloud release its water. Questions of this kind point to the statistical properties of rain events rather than temporal averages."
    "New insight into the working of rain can be gained by defining rain events, which can be regarded as energy relaxations similar to earthquakes or avalanches. Taking this perspective, scale-free power-law behaviour is found to govern the statistics of rain over a wide range of time and event size scales. Where clear deviations from the observed power laws and fractal dimensions are found, the limits and peculiarities of the underlying dynamical system become apparent, and physical insight is gained. Rainfall time series cannot be reproduced by conventional methods of probability theory. To enable anything more than an explicit reproduction of the fractal properties, a deeper understanding of self-organising processes leading to fractality must be sought. Our findings suggest that rain is an excellent example of a self-organised critical process. Rain is a ubiquitous phenomenon, and data collection is relatively easy. It is therefore well suited for work on self-organised criticality."
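    Estimating a power-law exponent from event sizes, as Peters and Christensen do for rain, is typically done by maximum likelihood. Here is a minimal sketch on synthetic data; the dataset and the exponent are invented, and this is not the authors' data or code:

```python
import math
import random

# For a power-law density p(x) ~ x**(-alpha) with x >= 1, the maximum
# likelihood estimate of the exponent is
#     alpha_hat = 1 + n / sum(ln x_i).
# The event sizes below are synthetic, drawn from a known power law,
# purely to show the estimator at work.

random.seed(42)
true_alpha = 2.4
events = [random.paretovariate(true_alpha - 1) for _ in range(50_000)]

n = len(events)
alpha_hat = 1 + n / sum(math.log(x) for x in events)
print(round(alpha_hat, 1))  # close to the true exponent, 2.4
```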

  • Explore the Mandelbrot Set, satisfying yourself that it possesses self-similarity, i.e. that it looks essentially the same at all scales. There is a veritable abundance of software, usually free, that you can use to do so. Or you can play with the Java applets at the University of Utah's website: The Mandelbrot Set
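    If you prefer to see the mechanics directly, the membership test behind every Mandelbrot program is only a few lines long. A minimal sketch (the iteration cap and the plotting window are arbitrary choices):

```python
# Membership test: c belongs to the Mandelbrot set if the iteration
# z -> z*z + c, starting from z = 0, never escapes to infinity.
# The iteration cap of 100 is a practical cutoff, not part of the
# definition.

def in_mandelbrot(c, max_iter=100):
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:        # once |z| > 2 the orbit surely escapes
            return False
    return True

# A crude ASCII rendering of the familiar cardioid-and-bulbs shape.
for im in range(12, -13, -2):
    row = ''.join('#' if in_mandelbrot(complex(re / 30, im / 30)) else '.'
                  for re in range(-60, 21, 2))
    print(row)
```

    Zooming in (shrinking the window around a boundary point) reproduces the same kind of filigreed structure at every scale, which is the self-similarity the exercise asks you to verify.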
  • Using the library or the net, try to find other examples of self-similarity or scale invariance. You may also want to look at some of the articles listed in Scale-Free / Power Law Networks. The articles are generally quite technical, but you may find them useful when comparing the original scientific discourse with the 'English' version presented here.
  • Material related to these topics can be found in Lecture 7.


© Copyright Luigi M Bianchi 2003-2005
Picture Credits: University of Michigan · University of Utah · Imperial College
Last Modification Date: 24 September 2005