Challenges in Observational Seismology
W.H.K. Lee
US Geological Survey, Menlo Park, California, USA (retired)
1. Introduction
Earthquake seismology became a quantitative scientific discipline after instruments were developed to record seismic waves in the late 19th century (Dewey and Byerly, 1969; Chapter 1 by Agnew). Earthquake seismology is essentially based on field observations. The great progress made in the past several decades was primarily due to increasingly plentiful and high-quality data that are readily distributed. Our ability to collect, process, and analyze earthquake data has been accelerated by advances in electronics, communications, computers, and software (see Chapter 85 edited by Snoke and Garcia-Fernandez).
Instrumental observation of earthquakes has been carried out for a little over 100 years by seismic stations and networks of various sizes, from local to global scales (see Chapter 87 edited by Lahr and van Eck). The observed data have been used, for example, (1) to compute the source parameters of earthquakes, (2) to determine the physical properties of the Earth's interior, (3) to test the theory of plate tectonics, (4) to map active faults, (5) to infer the nature of damaging ground shaking, and (6) to carry out seismic hazard analysis. Construction of a satisfactory theory of the earthquake process has not yet been achieved within the context of physical laws. Good progress, however, has been made in building a physical foundation of the earthquake source process, partly as a result of research directed toward earthquake prediction.
This chapter is intended for a general audience. Technical details are not given, but relevant references and chapters in this Handbook are referred to. The first part of this chapter presents a brief overview of the observational aspects of earthquake seismology, concentrating on instrumental observations of seismic waves generated by earthquakes (i.e., seismic monitoring), and readers are referred to Chapter 49 by Musson and Cecic for noninstrumental observations. A few key developments and practices are summarized by taking a general view, since many national and regional developments
have been chronicled in national and institutional reports (see Chapter 79 edited by Kisslinger). In the latter part of this chapter, the nature of seismic monitoring and some challenges
in observational seismology are discussed from a personal perspective. Comments of a technical or philosophical nature
are given in the Notes at the end of the chapter, and they are referenced by superscript numbers in the text.
2. Some Basic Information about Seismographs and Earthquakes
Besides geodetic data (see, e.g., Chapter 37 by Feigl), the primary instrumental data for the quantitative study of earthquakes are seismograms, records of ground motion caused by the passage of seismic waves generated by earthquakes. Seismograms are written by seismographs, instruments that detect and record ground motion with timing information. A seismograph usually consists of three components: (1) a seismometer that responds to ground motion and produces a signal proportional to acceleration, velocity, or displacement over a range of input motions in amplitude and in frequency; (2) a timing device; and (3) a recording device that writes seismograms (ground motion plus time marks) on paper or on electronic storage media. An accelerograph is a seismograph designed to record the time history of acceleration of strong ground motion on scale. Most modern seismographs are velocigraphs, recording the time history of ground velocity. See Chapter 18 by Wielandt for a discussion of seismometry. A seismic network (or array) is a group of seismographs that are "linked" to a central headquarters. The link is nowadays by various methods of telemetry; in the early days it was by mail or telegraphy, or simply by manual collection of the records. When we speak of a seismic station, it may be an observatory with multiple instruments in special vaults, or a small instrument package buried at a remote unmanned site.
In 1935 C.F. Richter introduced the concept of magnitude to classify local earthquakes by their "size." See Chapter 44 by Utsu for a discussion of the various magnitude scales in use. Existing instruments and environments are such that the smallest natural earthquakes we routinely observe are about magnitude 0. The largest earthquake so far for which we have instrumental records is the magnitude 9.5 Chilean earthquake in 1960. Some commonly accepted adjectives used to describe the approximate size or magnitude (M) of an earthquake are "major" for M ≥ 7 ("great" if M ≥ 8), "moderate" for 5 ≤ M < 7, "small" for 3 ≤ M < 5, and "micro" for M < 3.
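These adjectives amount to a simple threshold mapping; the following minimal Python sketch encodes the bands quoted above (the function name is illustrative, not from the original):

```python
def magnitude_class(m):
    """Classify an earthquake by magnitude, using the bands quoted above."""
    if m >= 8:
        return "great"     # a subset of "major"
    if m >= 7:
        return "major"
    if m >= 5:
        return "moderate"
    if m >= 3:
        return "small"
    return "micro"

# Example: the 1960 Chilean earthquake (M 9.5) falls in the "great" band.
print(magnitude_class(9.5))
```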
In 1941, B. Gutenberg and C.F. Richter discovered that over large geographic regions the frequency of earthquake occurrence is empirically related to magnitude by log N = a - bM, where N is the number of earthquakes of magnitude M or greater, and a and b are numerical constants. Usually b ≈ 1, implying, for example, that M = 6 earthquakes are about 10 times more frequent than M = 7 earthquakes. See Chapter 43 by Utsu for further detail. Engdahl and Villasenor in Chapter 41 show that there has been an average of about 15 major (i.e., M ≥ 7) earthquakes per year over the past 100 years. A list of deadly earthquakes of the world for the past five centuries has been compiled by Utsu in Chapter 42. It shows that M ≥ 6 earthquakes (about 150 in the world per year) can be damaging and deadly if they occur in populated areas and if their focal depths are shallow (e.g., <50 km). Strong ground motions above 0.1g in acceleration are mainly generated by M ≥ 6 earthquakes.
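A small numerical sketch of the Gutenberg-Richter relation follows. The a-value below is an assumption, chosen only so that the global rate of M ≥ 7 events matches the roughly 15 per year cited above; it is not a fitted value:

```python
import math

def annual_count(m, a, b=1.0):
    """Earthquakes per year with magnitude >= m, from log10 N = a - b*m."""
    return 10 ** (a - b * m)

# Choose a so that N(7) ~ 15 per year, matching the global rate quoted above.
a = math.log10(15) + 7.0  # ~8.18 (assumed, for illustration)

for m in (5, 6, 7, 8):
    print(f"M >= {m}: about {annual_count(m, a):,.0f} per year")
# With b = 1, each unit drop in magnitude gives ~10x more earthquakes,
# e.g., ~150 M >= 6 events per year, consistent with the text.
```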
3. Seismic Networks for Observing Earthquakes
Seismic waves from earthquakes have a vast range in amplitude and frequency (see Fig. 1 of Hutt et al. in Chapter 20): about 10 orders of magnitude in amplitude of ground acceleration, from 10^-7 to 10^3 cm sec^-2, and about 7 orders in frequency, from 10^-5 to 10^2 Hz. Since no single instrument type can cover such vast ranges, seismographs and seismic networks have evolved from three different optimizing choices, as discussed in the next three subsections. National reports, including those from Germany, Japan, Russia, the United Kingdom, and the United States (collected in Chapter 79, edited by Kisslinger), contain detailed early history on a national basis. For example, Kisslinger and Howell give a detailed account of historical developments in the United States to about 1960 in the USA National Report.
3.1 Seismic Networks Optimized for Teleseisms
Teleseisms are distant earthquakes that are big enough to produce measurable seismic waves at great distances. A major earthquake occurs about once a month (or a potentially
damaging earthquake occurs about every week) somewhere in the world (but rarely in one's own backyard), and is recorded on the seismograms of seismographs optimized for teleseisms. These seismograms also draw interest from research seismologists worldwide, as teleseisms are excellent sources of seismic waves for probing the Earth's interior (see, e.g., Chapter 11 by Romanowicz; Chapter 52 by Curtis and Snieder).
3.1.1 Early Years
In the beginning of instrumental seismology, observatories with various types of seismographs operated independently. The observatories were linked by mail, which could take months. Many seismological studies require seismograms or their readings from multiple stations. For example, arrival times of seismic waves from at least four well-distributed stations are needed to locate an earthquake satisfactorily. Even after one managed to get a few seismograms, it was difficult to work with records from different instruments with poorly synchronized time.
In the late 19th century the need for standardization and for data exchange was recognized by G. Gerland, J. Milne, and E. Rebeur-Paschwitz. With the support of the British Association for the Advancement of Science, over 30 Milne seismographs were placed at locations throughout the British Empire beginning in the late 1890s, and seismogram readings were reported to Milne's observatory at Shide on the Isle of Wight (see the UK National Report in Chapter 79). A global earthquake summary with seismogram readings was issued by John Milne beginning in 1899. These summaries are now known as the "Shide Circulars" (see Chapter 88 by Schweitzer and Lee). Milne seismographs were soon superseded by more advanced instruments,¹ and the headquarters of the International Association of Seismology was established in Strasbourg (see the German National Report in Chapter 79).
Seismographs for recording teleseisms were established at many observatories, especially meteorological and astronomical observatories. The early enthusiasts were academic professors, Jesuits, and gentleman scientists. See Chapter 89 edited by Howell for biographies of some notable pioneers. Revolutions and wars, however, frequently disrupted progress, especially in collecting and distributing earthquake information, during the first half of the 20th century.²
3.1.2 WWSSN and ESSN
In the late 1950s, attempts to negotiate a comprehensive test ban treaty failed, in part because of perceptions that seismic methods were inadequate for monitoring the underground environment for nuclear testing (see Chapter 24 by Richards). The influential Berkner report of 1959 advocated major support for seismology (see article by Kisslinger and Howell in the USA National Report in Chapter 79). As a result, the World Wide Standardized Seismograph Network (WWSSN) was created with about 120 continuously recording stations, located over much of the world (except China and USSR) in the early
1960s (Oliver and Murphy, 1971). Each WWSSN station was equipped with identical sets of short-period and long-period three-component seismographs and accurate chronometers. Seismograms were sent to the United States to be photographed onto 70-mm film chips for distribution (about $1 per chip). This network is credited with making possible rapid progress in global seismology, and with aiding the plate tectonic revolution in the Earth sciences in the late 1960s (see, e.g., Chapter 6 by Uyeda; article by Sykes in the USA National Report in Chapter 79).
At about the same time, the Unified System of Seismic Observations (ESSN) of the former USSR and its allied countries was established, consisting of almost 100 stations equipped with Kirnos short-period, broadband (1-20 sec displacement sensing) and long-period seismographs. See the Russian National Report in Chapter 79 for details.
Despite its great success, the WWSSN declined starting in the mid-1970s. By then it had produced 3 million analog seismograms, far more than seismologists could process and analyze. After about ten years of operation, funding for the WWSSN began to disappear. Although the initial costs were funded by the US Defense Advanced Research Projects Agency (DARPA), its emphasis was on research rather than long-term operation. Funding for continuing the WWSSN was left first to the National Oceanic and Atmospheric Administration (NOAA) and then to the US Geological Survey (USGS). Because of statutory restrictions, the USGS could not support global stations outside the United States. Although the US National Science Foundation (NSF) did pick up the funding for foreign stations for some time, NSF also wanted to avoid funding ongoing seismic networks. In addition, the emphasis in seismology at the USGS was shifting to earthquake prediction, then considered a new and promising venture.³ Earthquake prediction, however, turned out to be far more difficult than anticipated, as reviewed, for example, by Kanamori in Chapter 72.
3.1.3 The Digital Revolution and the GDSN
According to Duncan Agnew (personal communication, 2001), the idea of digital recording goes back to 1960, but it was not practical until the late 1970s; Block and Moore (1966) pioneered the use of the feedback gravimeter. The introduction of electronic force feedback to sealed inertial seismometers (Melton, 1976; Wielandt and Streckeisen, 1982), together with the application of high-resolution analog-to-digital converters, made it possible to construct very broadband, large dynamic range seismograph systems. The first major digital broadband installation was the German Gräfenberg (GRF) array, the first station of which started recording in 1975 (Harjes and Seidl, 1978).
The International Deployment of Accelerometers (IDA) Project was created in the 1970s as a global digital seismic network to collect data for low-frequency seismology (Agnew et al., 1976). Among the many contributions made by IDA are much improved studies of the Earth's free oscillations, long-period source mechanics of major earthquakes, and the aspherical structure of the Earth's interior (see Chapter 51 by Lay). Other notable efforts in global digital seismic networks were the French GEOSCOPE program (Romanowicz et al., 1991) and the German GEOFON project.
With the availability of broadband, large dynamic range, force feedback seismometers and 24-bit digitizers, many of the WWSSN stations were replaced by broadband digital systems starting in the 1980s (see Chapter 20 by Hutt et al.). A global digital seismic network has emerged since the 1980s under the guidance of two effective organizations (the International Federation of Digital Broadband Seismographic Networks, FDSN, and the Incorporated Research Institutions for Seismology, IRIS).⁴ Digital seismograms recorded by stations worldwide are now readily available via the Internet from the IRIS Data Management Center within tens of minutes of an M ~ 6 or larger earthquake occurring anywhere in the world (see Chapter 86 by Ahern), as well as through the European ORFEUS data center at De Bilt, the Netherlands, the GEOFON center at the GeoForschungsZentrum, Potsdam, Germany, and several other centers.
Since analog seismograms have a low dynamic range (about 3 orders of magnitude or less in amplitude) and must be digitized before computer processing, some seismologists recognized that "digital" instrumentation should be developed to achieve a much higher dynamic range and ease of computer processing. Many scientists and engineers in other disciplines had already made great advances in that direction with the emerging digital technology of the 1970s. Seismologists also recognized that the tandem use of short-period and long-period instruments was needed to avoid natural seismic noise (see Chapter 19 by Webb). They realized that a new global seismic network should be built with (1) broadband, high dynamic range seismographs, (2) digital electronics, (3) communication by telemetry or a mass storage medium, and (4) processing by computers in mind.
3.2 Seismic Networks Optimized for Regional Earthquakes
Another major development in seismic monitoring was the establishment of seismic networks optimized to record the more frequent but smaller regional and local earthquakes. In order to observe as many nearby earthquakes as possible, seismographs with high magnifications are adopted to record as small an earthquake as the technology and background noise allow. A consequence of this requirement, when applied to inexpensive sensors and telemetry with low dynamic range, is that the recorded amplitudes are saturated for earthquakes with M > ~3 within about 50 km. This is not a serious defect, because the emphasis is to obtain as many first-arrival times as possible, so that more earthquakes can be detected and located. Since
seismic waves from small earthquakes are quickly attenuated with distance, it is also necessary to deploy many instruments with small station spacing (several to a few tens of kilometers), and to cover as large a territory as possible in order to record at least a few earthquakes every week. Since funding is finite, the regional seismic networks are usually optimized for a large number of stations rather than for high data quality.
3.2.1 A Brief History
In the 1910s, the Carnegie Institution of Washington (CIW)⁵ was spending large sums of money building the world's largest telescope in southern California. Since astronomers were concerned about earthquakes that might disturb their telescopes, H.O. Wood was able to persuade the CIW to support earthquake investigations. As a result, a regional network of about a dozen Wood-Anderson seismographs was established in southern California in the 1920s. See Goodstein (1991) for the early history leading to the establishment of the California Institute of Technology (Caltech) and its Seismological Laboratory. Many astronomers played important roles in getting seismic monitoring established in various regions of the world.
Regional networks using different seismographs were also established in many countries, such as Japan, New Zealand, and the USSR with its allies. In the 1960s, high-gain, short-period, telemetered networks were developed to study microearthquakes (see, e.g., Eaton, 1989). Over 100 microearthquake networks were implemented by the 1970s in various parts of the world for detailed studies of local earthquakes and especially for the purpose of earthquake prediction (Lee and Stewart, 1981). These microearthquake networks consisted of tens to hundreds of short-period seismometers with their signals telemetered to central recording sites for processing and analysis. High magnification was achieved by electronic amplification, which permitted recording of very small earthquakes (down to magnitude 0), at the expense of saturated records for earthquakes of M > ~3. Some microearthquake networks soon expanded into regional seismic networks.
As with the WWSSN, it was difficult to improve and sustain regional seismic networks for long in many countries. For example, by the 1980s the regional seismic networks in the United States were in decline. The 1989 Loma Prieta earthquake and the 1994 Northridge earthquake demonstrated that the existing regional seismic networks in the United States were not satisfactory, especially during large damaging earthquakes.⁶ With new funding in response to these disastrous earthquakes, a new life began in the form of real-time seismology.
3.2.2 Some Recent Advances
Because of recent advances in electronics, communications, and microcomputers, it is now possible to deploy sophisticated digital seismograph stations at global, national, and local scales for real-time seismology (Kanamori et al., 1997). Many such networks, including portable networks, have been implemented in many countries. In particular, various real-time and near real-time seismic systems began operation in the 1990s, for example in Mexico (Chapter 76 by Espinosa-Aranda), California (Chapter 77 by Gee et al.; Chapter 78 by Hauksson et al.), and Taiwan (Teng et al., 1997).
For example, the Real-time Data (RTD) system operated by the Seismological Observation Center of the Central Weather Bureau in Taiwan is based on a network of telemetered digital accelerographs (see Chapter 64 by Shin et al.). This system, using pagers, e-mail, and other techniques, has automatically and rapidly issued information on the hypocenter, magnitude, and shaking amplitude of felt earthquakes (M > ~4) in the Taiwan region since 1995. The disastrous Chi-Chi earthquake (Mw = 7.6) of 20 September 1999 caused 2471 deaths and an estimated economic loss of US$14 billion. For this earthquake sequence, the RTD system delivered accurate information rapidly (102 sec after the main shock, and within about 50 sec for most aftershocks) to officials, and proved to be useful for emergency response by the Taiwan government (Wu et al., 2000; Goltz et al., 2001).
3.3 Seismic Networks Optimized to Record Damaging Ground Shaking
Observing teleseisms at a spacing of several hundreds of kilometers does not yield the information about near-source strong ground shaking required for earthquake engineering purposes. A few hundred global stations, therefore, cannot provide the detailed data necessary to help in reducing seismic hazards. Broadband seismometers optimized to record earthquakes at great distances are not designed to perform well in the near field of a major earthquake. For example, during the 1999 Chi-Chi earthquake the nearest broadband station in Taiwan (epicentral distance of about 20 km) recorded mostly saturated amplitude data and stopped about one minute into the shock (see Kao and Angelier, 2001, for the recorded seismogram).
A regional seismic network with spacing of a few tens of kilometers cannot do the job either. The station spacing is still too large and, worse yet, the records are often saturated for earthquakes with magnitude > ~3, including a big one if it should occur. In his account of the early history of earthquake engineering, Housner in Chapter 2 credited John R. Freeman, an eminent engineer, with persuading the then US Secretary of Commerce to authorize a strong-motion program and, in particular, the design of an accelerograph for engineering purposes in 1930. In a letter to R.R. Martel (Housner's professor) at Caltech, Freeman wrote:
I stated that the data which had been given to structural engineers on acceleration and limits of motion in earthquakes as a basis for their designs were all based on guesswork, that there had never yet been a precise
measurement of acceleration made. That of the five seismographs around San Francisco Bay which tried to record the Earthquake of 1906 not one was able to tell the truth.
Strong-motion recordings useful for engineering purposes are on-scale recordings of damaging earthquakes; in particular, recordings on or near structures in densely urbanized environments, within 20 km of the earthquake-rupture zone for sites on rock and within about 100 km for sites on soft soils. Recordings of motions at levels sufficient to cause damage at sites at greater distances also are of interest for earthquake engineering in areas likely to be affected by major subduction zone earthquakes or in areas with exceptionally low attenuation rates (Borcherdt, 1997).
Although several interesting accelerograms were recorded in southern California in the 1930s and 1940s, to the delight of earthquake engineers, most seismologists did not pursue strong-motion monitoring until much later. The 1971 San Fernando earthquake demonstrated the need for strong-motion data for engineering purposes (see Chapter 57 by Anderson). Two important programs emerged in the United States: the National Strong-Motion Program (http://nsmp.wr.usgs.gov/) and the California Strong Motion Instrumentation Program (http://docinet3.consrv.ca.gov/csmip/). However, their budgets were and continue to be small in comparison with those of other earthquake programs. High levels of funding for strong-motion monitoring (comparable to that of the GDSN and the regional seismic networks) occurred in Taiwan and Japan in the early and mid-1990s, respectively (see Section 5.1). The Consortium of Organizations for Strong-Motion Observation Systems (http://www.cosmos-eq.org/) has recently been established to promote the acquisition and application of strong-motion data.
4. Record Keeping and Data Processing on a Global Scale
Many scientific advances are based on accurate and long-term observations. Because disastrous earthquakes in a given region recur on a time scale of tens to hundreds of years, special efforts are needed to ensure that seismic monitoring is carried out in a consistent manner in keeping records, detecting and timing all locatable events, and processing the observed and derived data. The amount of data in earthquake seismology is large and growing⁷ and may be classified into six types (with their approximate annual output rates) as follows (a rough tally appears in the sketch after the list):
1. Information for instrument location, characteristics, and operational details: 2 × 10^7 bytes y^-1.
2. Raw observational data (continuous signals from seismographs): 10^14 bytes y^-1.
3. Earthquake waveform data (containing seismic events): 5 × 10^12 bytes y^-1.
4. Earthquake phase data, such as P, S, and secondary arrival times, maximum amplitude and period, first motion direction, signal duration, etc.: 5 × 10^8 bytes y^-1.
5. Event lists of origin time, epicenter coordinates, focal depth, magnitude, etc.: 2 × 10^7 bytes y^-1.
6. Scientific reports describing seismicity, focal mechanisms, etc.: 5 × 10^7 bytes y^-1.
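A minimal sketch tallying the annual rates listed above (the numbers are taken directly from the list; the labels are shorthand):

```python
# Approximate annual data output in earthquake seismology (bytes per year),
# from the six types listed above.
annual_rates = {
    "instrument metadata":    2e7,
    "raw continuous signals": 1e14,
    "event waveform data":    5e12,
    "phase data":             5e8,
    "event lists":            2e7,
    "scientific reports":     5e7,
}

total = sum(annual_rates.values())
print(f"total: ~{total:.2e} bytes/year")  # ~1.05e14
for name, rate in annual_rates.items():
    print(f"{name:>24}: {rate / total:8.4%}")
```

The raw continuous signals dominate the total by roughly two orders of magnitude, which is why continuous archives, not event lists or bulletins, drive long-term storage requirements.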
Seismic monitoring for the entire world depends on continuous international cooperation as discussed in Chapter 4 by Adams, because seismic waves propagate throughout the Earth without regard to national boundaries. Seismologists need not only to exchange scientific results (via reports and papers published in journals or books), but also to rely on the exchange of primary data (i.e., seismograms) and their derived products (e.g., phase data). Each seismic observatory can interpret its recorded seismograms, but single-station data are insufficient for the study of earthquakes, especially those occurring some distance away.
The International Seismological Centre (ISC) is charged with the final collection, analysis, and publication of standard earthquake information from all over the world (Willemann and Storchak, 2001). The ISC bulletins (issued since 1964) are the definitive summary of reported earthquake phase data, and from these data the earthquake parameters are determined by a standard procedure. These bulletins are published with a time lag of about two years in order to incorporate as much information as possible from cooperating seismic stations. Engdahl and Villasenor in Chapter 41 prepared a comprehensive catalog of earthquakes of the world from 1900 to 1999 by relocating systematically many thousands of earthquakes using the ISS and ISC phase data. Their catalog and the phase data they used for location are archived on the attached Handbook CD.
For more rapid dissemination of earthquake information on a global scale, the US National Earthquake Information Service (NEIS) of the US Geological Survey (in Golden, Colorado) issues results based on its own network of stations as well as phase readings sent by other stations. Visit their Web site at
http://neic.usgs.gov/. Earthquake parameters for significant
earthquakes are usually announced within about one hour, and summaries (now in computer files) are distributed on a weekly basis. The data collected by the NEIS are sent to the ISC for further analysis, as described in the preceding paragraph.
Moment tensor solutions for earthquakes of M ≥ ~5.5 worldwide have been determined by the Harvard group since 1976 (see http://www.seismology.harvard.edu/projects/CMT/), and by the USGS since 1981 (see Chapter 50 by Sipkin). Preliminary solutions are produced within minutes of the NEIC's QED results. In addition to Harvard and the USGS, several other centers determine moment tensor solutions (mostly for regional earthquakes) in near real time.
Most seismic observatories issue their own bulletins, and many national agencies publish national earthquake catalogs. However, their quality and contents vary greatly. The efforts
in seismic monitoring in many countries can be found in their national reports (summarized in Chapter 79 edited by Kisslinger and presented in full on the attached Handbook CD). See also Chapter 87 edited by Lahr and van Eck for a global inventory of seismographic networks.
5. The Nature of Seismic Monitoring of Damaging Earthquakes
Seismic monitoring of earthquakes has been most successful on the global scale because there are (1) organizations (e.g., FDSN, IRIS, ISC, IASPEI, and CTBTO) that promote it, (2) hundreds of academic users, and (3) several major and/or damaging earthquakes every year in the world that draw public and academic attention. Field work is mostly cooperative, with a central organization supplying (or augmenting) the latest equipment, and the labor cost per station is a relatively small amount that can easily be absorbed by a cooperating local institution. On the other hand, seismic monitoring on a regional or local scale is far more difficult because there are only a few academic users, and usually no significant or destructive earthquakes occur for decades in most geographical areas.
5.1 Requirements for Station Site and Spacing
The station site and spacing requirements for seismological research are very different from those for earthquake engineering purposes. Seismologists (especially those who study Earth structure) want their stations to be located at quiet sites, as far away from any human activities as possible. On the other hand, earthquake engineers want instruments in the built environment of urban areas. Since the occurrence of major earthquakes in a given region is rare, we can understand why most seismologists would be reluctant to wait for decades for a few strong-motion records.
Studies indicate that we may need a station spacing of about 1 km or less in order to reduce the observed variances in strong ground motion to a factor of less than 2 (Field and Hough, 1997; Evans, 2001). During 1991-1996, the Taiwan government deployed about 1200 accelerographs (at 640 free-field sites with a station spacing of 3-5 km, and in 56 buildings and bridges) nonuniformly in the urban areas of Taiwan (see Chapter 64 by Shin et al.). After the 1995 Kobe earthquake, the Japanese government deployed the 1000-station K-Net (see Chapter 63 by Kinoshita) at a uniform spacing of 25 km over Japan. In both cases, the total cost to deploy one station was about $30 000. Therefore, to achieve 1 km spacing in one major urban area would require a few thousand accelerograph stations, and would cost about one hundred million dollars to deploy. However, a more cost-effective alternative would be to deploy strong-motion instruments selectively at fewer sites that are representative of the local site conditions, rather than uniformly throughout an area.
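A back-of-the-envelope sketch of that scaling follows. The 3000 km² urban-area size is an assumed example; the $30 000 per-station deployment cost is the figure from the Taiwan and Japan experience cited above:

```python
def stations_needed(area_km2, spacing_km):
    """Approximate station count for a uniform grid: one station per spacing^2 cell."""
    return area_km2 / spacing_km**2

area = 3000.0               # assumed size of one major urban area, km^2
cost_per_station = 30_000   # US$, from the Taiwan/Japan deployments above

for spacing in (25.0, 5.0, 1.0):
    n = stations_needed(area, spacing)
    print(f"spacing {spacing:4.0f} km: ~{n:6.0f} stations, ~${n * cost_per_station:,.0f}")
# At 1 km spacing: ~3000 stations and ~$90 million for deployment alone,
# i.e., the "few thousand stations, about one hundred million dollars" quoted above.
```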
In regard to the strong-motion instrumentation needs of the United States, Borcherdt et al. (1997) derived estimates based on the National Seismic Hazard Maps, population exposure, and knowledge of the distribution of local geologic deposits. These estimates, reviewed and expanded to include the built environment, provided the basis for the consensus of a national workshop (Stepp, 1997). The estimates indicated that at least 7000 sites were needed to record strong ground shaking, with station spacing for rock sites of less than 7 km, and preferably about 1.7 km in densely urbanized areas such as San Francisco, California. The workshop consensus indicated that an additional 13 000 stations are needed to ensure that the next major earthquake is thoroughly documented on the built environment, with 7000 for buildings, 3000 for lifelines such as bridges and pipelines, and 3000 for critical facilities necessary for emergency response and near real-time disaster assessment.
5.2 Total Cost
There are a number of costs associated with establishing and operating a seismic network: (1) deployment, (2) maintenance and operation, and (3) staffing. We should be aware that the deployment cost must include the capital cost for instruments (including equipment for telemetry, if any), and expenses for siting, site preparation, quality assurance, and administrative overhead in procurement and management. In both the recent Taiwan and Japan cases, the instrumentation cost was about 1/3 of the total deployment cost. Maintenance costs vary, but experience has shown that at least 20% of the instrument cost should be budgeted for parts, supplies, and repair services every year for satisfactory performance. Operating costs depend mostly on the telemetry method used for getting the data from the field to the headquarters, and vary greatly in different locations. Staffing costs for maintenance, operation, and research are most difficult to estimate and control over a long period of time. Therefore, the instrument cost is relatively minor in a project that must continue for decades.
Because governments and academics usually fund capital equipment out of the same budget, there is a tendency for some seismologists to regard inexpensive instruments as a panacea.⁸ Experience has shown that the instrument cost constitutes only about 10% of the total project cost over 30 years, the length of time that is usually needed to accumulate significant earthquake data in a high-seismicity area.
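A simple 30-year cost sketch consistent with the figures above. The instrument price and the operations/staffing rate are assumed placeholders; the "instruments are about 1/3 of deployment" and "at least 20% of instrument cost per year for maintenance" ratios come directly from the text:

```python
years = 30
instruments = 1_000_000.0         # assumed instrument purchase cost, US$
deployment = 3 * instruments      # instruments ~1/3 of total deployment cost (Section 5.2)
maintenance = 0.20 * instruments * years  # >= 20% of instrument cost per year
operations_staff = 100_000.0 * years      # assumed telemetry + staffing, US$ per year

total = deployment + maintenance + operations_staff
print(f"30-year total: ${total:,.0f}")
print(f"instrument share: {instruments / total:.1%}")
# ~8% here: roughly the "about 10% over 30 years" order quoted in the text.
```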
What one may do with four different levels of capital funding is discussed next. Capital funding here means sufficient funding for the total deployment cost in the field plus the cost of setting up a headquarters with a staff.
5.3 Very Low Capital Funding at $50 000
At this very low capital funding level, one may purchase a single low-end digital broadband system to monitor teleseisms, and
operate it as a global seismic station. See Chapter 20 by Hutt et al. for more technical details. Deployment cost will be low if one simply installs it in the basement of one's building. Maintenance and operating cost will be low also, but a few thousand dollars per year are necessary for supplies and unexpected repairs. A part-time staff member is needed, but an existing staff member can usually serve in that capacity.
5.4 Low Capital Funding at $500 000
At this low capital funding level, one may consider setting up a ten-station telemetered network using digital accelerographs to cover an area of about 3000 km² at a station spacing of about 25 km. If the accelerographs allow four channels, then one would just add a vertical-component, short-period seismometer. However, if only three-component instruments are used, then to increase the number of detectable earthquakes one should replace the vertical-component accelerometer with a vertical-component, short-period seismometer (since engineers are interested mostly in the horizontal ground motions). Real-time telemetry may be achieved by leased telephone lines, radios, or satellites. If real-time performance is not needed, then it is cheaper to use the Internet or dial-up access, if available in the field. A few companies sell this type of seismic network with a central processing system to monitor regional and local earthquakes and to provide some coverage of strong ground motion. Deployment cost will depend strongly on where the network is located; the same is true for maintenance, operation, and staffing. One will need at least $100 000 per year to maintain and operate this kind of network, with at least one full-time staff member.
5.5 Medium Capital Funding at $5 000 000
At this level of capital funding one may consider setting up, for example: (1) a 100-station telemetered network using digital accelerographs to cover an area of about 10 000 km² at a station spacing of about 10 km, with the modification described in Section 5.4, for monitoring a modest area of high earthquake hazards; and (2) a few global broadband stations for academic research. There is always a trade-off between the area covered and the station spacing. Real-time telemetry may be by leased telephone lines, radios, or satellites. If real-time performance is not needed for all the stations, then one can make use of the Internet or dial-up connections, if available. Alternatively, one could set up a 100-station telemetered broadband network at a larger station spacing to cover more area, and supplement it with telemetered accelerometer signals and/or dial-up accelerographs. Deployment cost will depend strongly on where the network is located; the same is true for maintenance and operation. One will need at least $1 000 000 per year and several full-time staff members to maintain and operate such a network.
5.6 High Capital Funding at $50 000 000
At this high capital funding level, one may consider an instrumentation program similar to the TriNet in southern California (see Chapter 78 by Hauksson et al.), the 1000-station K-Net in Japan (Chapter 63 by Kinoshita), or the Taiwan program (Chapter 64 by Shin et al.). The annual budget for staff (10 or more), maintenance, and operation is typically a few million dollars or more, depending on many factors.
5.7 Remarks
It is important to plan in advance and to visit some well-established seismic networks before embarking on an instrumentation project. One must be clear as to what objectives can be accomplished within the expected long-term funding situation. This will also help in deciding which of the trade-offs discussed above is preferable. Some guidance in this respect is given in Bormann (2002). Experience indicates that, technically, telemetry is the weakest link in any telemetered seismic network, especially during and after a major earthquake. Therefore, the field units should have some on-board recording capability so that important data can be retrieved later if necessary. Some redundancy in the field units, alternative methods of telemetry, and backup data acquisition systems are absolutely necessary.
6. Some Difficulties in Seismic Monitoring for Hazard Mitigation
The major problem in seismic monitoring, especially for hazard mitigation on a long-term basis, is not technical but political and financial. Adequate funding over decades is necessary for the success of a seismic monitoring project. Major earthquakes may not be damaging if they occur in uninhabited or lightly populated areas. However, an earthquake need not be very large in magnitude to cause serious damage if it occurs in a heavily populated area, especially if the earthquake's focal depth is shallow. According to the Munich Reinsurance Company (2000), economic losses due to earthquakes in the 20th century were very large. For example, the five largest losses are $100 billion for the 1995 Kobe (Japan) Mw = 6.9 earthquake; $44 billion for the 1994 Northridge (USA) Mw = 6.7 earthquake; $14 billion for the 1999 Chi-Chi (Taiwan) Mw = 7.6 earthquake; $14 billion for the 1988 Armenia Mw = 6.7 earthquake; and $12 billion for the 1999 Turkey Mw = 7.6 earthquake. Note that three of these five earthquakes are not called "major" earthquakes, because their magnitudes are below 7.
6.1 What Seismograms are Most Important to Engineering Designs?
According to Norm Abrahamson (private communication, 2000), designs of large engineering structures benefit most from
strong-motion records of M ≥ 7 earthquakes obtained within 20 km of fault ruptures. As of mid-1999, there were only eight such strong-motion records in the world after nearly 70 years of effort in strong-motion monitoring. As discussed in the next subsection, the odds of an M ≥ 7 earthquake occurring near a station are low, and the number of strong-motion instruments at free-field sites throughout the world was not large until about 1990. The Kocaeli, Turkey, earthquake of 17 August 1999 added five more such records, and the Chi-Chi, Taiwan, earthquake of 20 September 1999 added over 60, thanks to an extensive strong-motion instrumentation program that was completed three years earlier (see Chapter 64 by Shin et al.).
6.2 Major Earthquakes are Rare in a Given Region
As noted in Section 2, the average annual number of major (M ≥ 7) earthquakes worldwide is about 15. Most of the shallow major earthquakes occur in subduction zones off the coast, and large areas of the Earth have few or no earthquakes (see Color Plate 15 of this Handbook). According to the catalog of major earthquakes from 1900 to 1999 by Engdahl and Villasenor in Chapter 41, about 32% of the M ≥ 7 earthquakes of the world occurred on land. If a seismic station is deployed on land (as deployment at sea is very expensive), what is the probability that an M ≥ 7 earthquake will occur within a radius of 20 km of this station in one year?
A very rough estimate can be derived as follows. Since the total land area of the Earth is about 1.5 × 10^8 km², the probability is about 4 × 10^-5, assuming that each M ≥ 7 earthquake is a point source. We can improve this probability by, say, a factor of 10 by placing a seismic station in a known seismically active area chosen on the basis of geological and seismological information. A probability of 4 × 10^-4 implies that we must wait about 2500 years for an M ≥ 7 earthquake to occur within 20 km of the station. Alternatively, we would need to deploy about 2500 stations in order for an M ≥ 7 earthquake to occur within 20 km of one of the stations in one year. Observing potentially large damaging earthquakes at near-field distances will, therefore, require deploying many stations, or waiting for a long time, or being very lucky in selecting station locations.
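The arithmetic behind these numbers, as a short sketch; all inputs are the figures quoted above:

```python
import math

land_area = 1.5e8               # total land area of the Earth, km^2
on_land_rate = 15 * 0.32        # M >= 7 earthquakes per year occurring on land
target_area = math.pi * 20**2   # area within 20 km of one station, km^2

# Point-source assumption, as in the text.
p = on_land_rate * target_area / land_area
print(f"P(M >= 7 within 20 km, one year): {p:.1e}")   # ~4e-5
print(f"with a 10x siting advantage:      {10*p:.1e}")  # ~4e-4
print(f"expected wait at one station: ~{1/(10*p):,.0f} years")
print(f"or stations needed for ~one event per year: ~{1/(10*p):,.0f}")
```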
6.3 Extending Observation Offshore is Expensive
As pointed out by T. Utsu (personal communication, 2001), it is important for seismologists to extend observation offshore, because more than 2/3 of all earthquakes occur at sea and some of them can be very damaging to lives and property on land. An effective means to observe earthquakes at sea is to deploy a cable ocean-bottom seismograph (OBS) system. However, this is very expensive. For example, a cable OBS system including four seismic stations, 150 km of cable, and land facilities costs about $25 million, according to N. Hamada (personal communication). As of 2001, there were seven cable OBS systems in operation in Japan, and their signals are monitored in real time by the regional centers of the Japan Meteorological Agency (JMA). These data greatly improve the determination of the location, especially the focal depth, of offshore earthquakes in Japan.
6.4 Planning, Management, and Bureaucracy
Good seismic monitoring requires long-term planning and efficient management, but few seismologists have the experience or training necessary for these planning and management tasks. Several related technical and organizational topics are elaborated in Bormann (2002). Seismic monitoring involves deploying and operating seismic instruments in the field. If we wish to be good at it, it pays to study how field operations are conducted, such as military operations in a war (e.g., Sun-Tzu, 1993).⁹ Seismic monitoring also needs good management, and it may be instructive to read two books by Andy Grove explaining how Intel became the world's most successful producer of microprocessors (Grove, 1987, 1996).
Since funding for seismic monitoring comes almost entirely from governments, we cannot avoid dealing with government bureaucracy. It helps to have some understanding of bureaucracy through reading books on this subject (e.g., Parkinson, 1957).¹⁰ Seismic monitoring requires teamwork, and therefore a good leader. It is of utmost importance to have someone who knows how the bureaucracy operates in a particular organization and country.
6.5 Integration, Reorganization, and Data Loss
Seismic monitoring of earthquakes evolved over the years into three major branches, as discussed in Section 3. A recently favored approach is to integrate all seismic monitoring into a single entity practicing real-time seismology. There are, of course, many advantages in this approach, especially from a management point of view. However, getting a large group of people to work together is by no means easy. Aki in Chapter 5 described an approach for getting people from many disciplines to work together under the Southern California Earthquake Center, and it may serve as a blueprint for integrated earthquake programs elsewhere.
As mentioned before, seismic monitoring is almost entirely funded by governments (directly or indirectly), except for some notable early efforts by nongovernment groups (e.g., the Jesuits, as reported by Udias and Stauder in Chapter 3). There is no bottom line to speak of for government agencies, since there are no "profits" or "losses." There is a tendency for staff to increase according to Parkinson's law until all available funding is consumed, and for staff quality to decrease with time as deadwood accumulates, for it is difficult to dismiss civil servants. Consequently, few government institutions can
maintain high-quality seismic monitoring for decades before going out of existence or being reorganized.¹¹
Each reorganization creates disruption, as new chiefs often have their own agendas and directions. Facing uncertainties in management and funding, the long-term preservation of seismic data has been at the bottom of priority lists in seismology. Millions of seismograms have been poorly kept or are inaccessible, and will soon disappear. This sad state of affairs was recognized in the 1970s, but little has been done (Lee et al., 1988).
Major capital investments for seismic instrumentation occur infrequently, usually in response to disastrous earthquakes (e.g., K-Net), military needs (e.g., WWSSN), or broader programs (e.g., microearthquake networks for earthquake prediction). There is a tendency to spend the new funds on equipment and more staff members, and to ignore the costs of long-term operation and maintenance. A few modern seismic networks have fallen idle because of this. It is tempting to buy the latest and most advanced equipment, without realizing the risk involved: it takes time and effort to master the latest equipment, and some of its new features often do not work well in the field.
7. Discussion
The history of earthquake seismology suggests that major advances take place shortly after the accumulation of seismic data whose quantity and quality surpass those of previous data.¹² Major advances in earthquake seismology are expected in the near future because digital seismic data have become widely available on both local and global scales. In addition, advances in computers, with increasing computing power at decreasing cost, have allowed seismologists to carry out more sophisticated data processing and analysis, and so to gain insight from the increasing volume of seismic data collected (see Chapter 22 by Scherbaum; Chapter 85 edited by Snoke and Garcia-Fernandez).
In the 1990s, especially after the 1995 Hyogo-ken Nanbu (Kobe) earthquake, large amounts of funding became available in Japan for digital seismic and other instrumentation. Under the Headquarters for Earthquake Research Promotion, a "Fundamental Seismic Survey and Observation Plan" was carried out in Japan for:
1. Earthquake observations: (a) inland earthquake observation by high-sensitivity seismographs (observation of microearthquakes), and (b) inland earthquake observation by broadband seismographs
2. Strong-motion observations
3. Observations of crustal deformation (continuous GPS observation)
4. Survey of inland and coastal active faults
These extensive instrumentation programs have now been completed, and the data from the high-sensitivity seismograph network (Hi-Net), the nationwide broadband seismograph network (Freesia), and the digital strong-motion seismograph network (KiK-Net) are effectively distributed in near real time by the National Institute for Earth Science and Disaster Prevention (NIED) via the Internet (http://www.bosai.go.jp/index.html).
Extensive strong-motion instrumentation programs (typically with 1000 digital instruments) were implemented in Japan and Taiwan, for example, at costs of several tens of millions of US dollars each (see Chapter 63 by Kinoshita; Chapter 64 by Shin et al.). The extensive strong-motion data set recorded during the Chi-Chi (Taiwan) earthquake sequence of 20 September 1999 (Lee et al., 2001) showed that these near-fault data will not only contribute information needed for earthquake engineering but will also lead to a better understanding of the earthquake process (Lee and Shin, 2001; Teng et al., 2001).¹³
Knowledge about the nature of strong ground motions expected from damaging earthquakes will become increasingly important as urbanization rapidly increases, because the lives and properties of ever greater numbers of people are exposed to seismic hazards (see Chapter 74 by Giardini et al.). Utsu in Chapter 42 compiled a list of deadly earthquakes in the world for the past five centuries and it is instructive to view his maps. A great challenge for seismologists will be the continued improvement of seismic monitoring to reduce seismic hazards. In particular, it is now technically possible to implement earthquake early-warning systems as discussed in Lee and Espinosa-Aranda (1998). These systems, in principle, provide lifeline operators and citizens with crucial seconds or minutes of lead time for taking some protective action before the strong shaking of a damaging earthquake begins. However, an earthquake early-warning system is expensive to implement and its benefit is difficult to evaluate.
Last, but not least, seismic arrays optimized to detect nuclear explosions (e.g., LASA and NORSAR) and their impact on the development of earthquake seismology have not been discussed here; readers can consult Chapter 23 by Douglas and Chapter 24 by Richards. Seismologists owe a great deal to the military for its generous support in advancing seismology,¹⁴ although some may question its motives and policies regarding data analysis and exchange.
8. Concluding Remarks
Seismic monitoring of earthquakes is becoming a big science (Price, 1963; Weinberg, 1967). For example, the Japanese Fundamental Seismic Survey and Observation Plan has been executed successfully, as discussed in Section 7. Two large projects are now underway in the United States: (1) the Advanced National Seismic System (ANSS) to integrate seismic monitoring of earthquakes (see http://www.anss.org/), and (2) the USArray, "a continental-scale seismic array to provide a coherent 3D image of the lithosphere and the deeper Earth"
(see http://www.earthscope.org/). Since 1996, the Preparatory Commission for the Comprehensive Nuclear Test-Ban Treaty Organization (CTBTO) has been establishing global monitoring systems (including seismological ones) worldwide (visit its Web site at http://www.ctbto.org/; see also Chapter 24 by Richards).
The most direct argument for governments to support long-term seismic monitoring is the collection of data relevant to hazard mitigation. As noted in Section 6, economic losses from damaging earthquakes in the past decade alone were about $200 billion, and future losses will be even greater as rapid urbanization takes place worldwide. For example, the recent Japanese Fundamental Seismic Survey and Observation Plan (costing several hundred million US dollars) is a direct response to the economic losses of about $100 billion caused by the 1995 Kobe earthquake.
In addition to scientific and technological challenges in observational seismology, seismologists must pay attention to achieving (1) stable long-term funding,¹⁵ (2) effective management and execution, and (3) delivery of useful products to the users. Observational seismologists must have perseverance in order to succeed.¹⁶
Acknowledgments
I owe my seismic training to colleagues at the USGS and at many other institutions. Roger Borcherdt and Yi-Ben Tsai convinced me of the importance of monitoring strong ground motion, and I wish to thank them for their patience. I was fortunate to have participated in the large-scale seismic instrumentation program of Taiwan from 1991 to 1996. I thank Tony Shin and his staff at the Central Weather Bureau, Taipei, for being gracious hosts.
I am grateful to Robin Adams, Duncan Agnew, Nick Ambraseys, Doc Bonilla, Dave Boore, Roger Borcherdt, Peter Bormann, Ken Campbell, Jim Cousins, John Evans, Jennifer Hele', Porter Irwin, Paul Jennings, Hiroo Kanamori, Carl Kisslinger, Fred Klein, John Lahr, Axel Plesinger, Paul Richards, Johannes Schweitzer, Tony Shakal, Shri Singh, Arthur Snoke, Chris Stephens, Ta-Liang Teng, Yi-Ben Tsai, and Erhard Wielandt for their comments and suggestions on the manuscript. I thank Tokuji Utsu for pointing out the importance of extending seismic networks to offshore zones in seismically active regions, and I am grateful to Nobuo Hamada for supplying information on such efforts in Japan. I also thank Lucy Jones and David Oppenheimer for answering my queries about the Northridge earthquake and the Loma Prieta earthquake, respectively.
References
Agnew, D., J. Berger, R. Buland, W. Farrell, and F. Gilbert (1976). International deployment of accelerometers: a network for very long period seismology. EOS 57, 180-188.
Block, B. and R.D. Moore (1966). Measurements in the Earth mode frequency range by an electrostatic sensing and feedback gravimeter. J. Geophys. Res. 71, 4361-4375.
Borcherdt, R.D. (Ed.) (1997). "Vision for the future of the US National Strong-Motion Program," The committee for the future of the US National Strong Motion Program. US Geol. Surv. Open File Rept. 97-530 B.
Borcherdt, R.D., A. Frankel, W.B. Joyner, and J. Bouabid (1997). Vision 2005 for earthquake strong ground-motion measurement in the United States. In: "Proceedings, Workshop, Vision 2005: An Action Plan for Strong Motion Programs to Mitigate Earthquake Losses in Urbanized Areas" (J.C. Stepp, Ed.), Monterey, CA, April, 1997, pp. 112-130.
Bormann, P. (Ed.) (2002). "New Manual of Seismological Observatory Practice," in preparation. [A pre-publication version is included on the attached Handbook CD under the directory \81\IASPEI Training.]
Dewey, J. and P. Byerly (1969). The early history of seismometry (to 1900). Bull. Seismol. Soc. Am. 59, 183-227.
Eaton, J.P. (1989). Dense microearthquake network study of northern California earthquakes. In: "Observatory Seismology" (J.J. Litehiser, Ed.), pp. 199-224. University of California Press, Berkeley.
Evans, J.R. (2001). Wireless monitoring and low-cost accelerometers for structures and urban sites. In: "Strong Motion Instrumentation for Civil Engineering Structures" (M. Erdik, M. Celebi, and V. Mihailov, Eds.), pp. 229-242. Kluwer Academic, Dordrecht.
Field, E.H. and S.E. Hough (1997). The variability of PSV response spectra across a dense array deployed during the Northridge aftershock sequence. Earthq. Spectra 13, 243-258.
Geschwind, C.H. (2001). "California Earthquakes: Science, Risk and the Politics of Hazard Mitigation." Johns Hopkins University Press, Baltimore, MD.
Goltz, J.D., P.J. Flores, S.E. Chang, and T. Atsumi (2001). Emergency response and early recovery. In: "1999 Chi-Chi, Taiwan, Earthquake Reconnaissance Report." Earthq. Spectra 17 (Supplement A), 173-183.
Goodstein, J.R. (1991). "Millikan's School: A History of the California Institute of Technology." Norton, New York.
Grove, A.S. (1987). "One-on-One With Andy Grove: How to Manage Your Boss, Yourself and Your Co-Workers." Putnam, New York.
Grove, A.S. (1996). "Only the Paranoid Survive." Currency Doubleday, New York.
Harjes, H.-P. and D. Seidl (1978). Digital recording and analysis of broadband seismic data of the Gräfenberg (GRF) Array. J. Geophys. 44, 511-523.
Kanamori H., E. Hauksson, and T. Heaton (1997). Real-time seismology and earthquake hazard mitigation. Nature 390, 461-464.
Kao, H. and J. Angelier (2001). Data files from "Stress tensor inversion for the Chi-Chi earthquake sequence and its implication on regional collision." Bull. Seismol. Soc. Am. 91, 1380 [and on the attached CD Supplement].
Kerr, R.A. (1991). A job well done at Pinatubo volcano. Science 253, 514.
Lawson, A.C. (Chair) (1908). "The California Earthquake of April 18, 1906: Report of the State Earthquake Investigation Commission." Carnegie Institution of Washington, Washington, DC.
Lee, W.H.K. (Ed.) (1994). "Realtime Seismic Data Acquisition and Processing." IASPEI Software Library, Vol. 1, 2nd edn. Seismological Society of America, El Cerrito.
Lee, W.H.K. and J.M. Espinosa-Aranda (1998). Earthquake earlywarning systems: current status and perspectives. In: "Proceedings of International IDNDR-Conference on Early Warning Systems for the Reduction of Natural Disasters," Potsdam, Germany.
Lee, W.H.K. and T.C. Shin (2001). Strong-motion instrumentation and data. In: "1999 Chi-Chi, Taiwan, Earthquake Reconnaissance Report." Earthq. Spectra 17 (Supplement A), 5-18.
Lee, W.H.K. and S.W. Stewart (1981). "Principles and Applications of Microearthquake Networks." Academic Press, New York.
Lee, W.H.K., H. Meyers and K. Shimazaki (Eds.) (1988). "Historical Seismograms and Earthquakes of the World." Academic Press, San Diego.
Lee, W.H.K., T.C. Shin, K.W. Kuo, K.C. Chen, and C.F. Wu (2001). CWB free-field strong-motion data from the 921 Chi-Chi (Taiwan) earthquake. Bull. Seismol. Soc. Am. 91, 1370-1376 [with data on the attached CD Supplement].
Melton, B.S. (1976). The sensitivity and dynamic range of inertial seismographs. Rev. Geophys. Space Phys. 14, 93-116.
Munich Reinsurance Company (2000). "World of Natural Hazard," CD-ROM version (see http://www.munichre.com/).
Murray, T.L., J.A. Power, G. Davidson, and J.N. Marso (1996). A PC-based real-time volcano-monitoring data-acquisition and analysis system. In: "Fire and Mud" (C.G. Newhall and R.S. Punongbayan, Eds.), pp. 225-232. University of Washington Press, Seattle.
Oliver, J. and L. Murphy (1971). WWNSS: Seismology's global network of observing stations. Science 174, 254-261.
Parkinson, C.N. (1957). "Parkinson's Law and Other Studies in Administration." Houghton Mifflin, Boston.
Price, D.J. de Solla (1963). "Little Science, Big Science." Columbia University Press, New York.
Richter, C.F. (1958). "Elementary Seismology." Freeman, San Francisco.
Romanowicz, B., J.F. Karczewski, M. Cara, et al. (1991). The GEOSCOPE Program: present status and perspectives. Bull. Seismol. Soc. Am. 81, 243-264.
Salam, A. (1979). "Gauge unification of fundamental forces." Nobel Prize in Physics Award Address, Nobel Foundation.
Stepp, J.C. (Ed.) (1997). "Vision 2005: An action plan for strong motion programs to mitigate earthquake losses in urbanized areas." In: "Proceedings of a Workshop, Monterey, CA," April 1997 (posted at http://www.cosmos-eq.org/vision2005.pdf).
Sun-Tzu (R. Ames, Transl.) (1993). "The Art of Warfare." Ballantine Books, New York.
Teng, T.L., L. Wu, T.C. Shin, Y.B. Tsai, and W.H.K. Lee (1997). One minute after: strong motion map, effective epicenter, and effective magnitude. Bull. Seismol. Soc. Am. 87, 1209-1219.
Teng, T.L., Y.B. Tsai, and W.H.K. Lee (Eds.) (2001). The 1999 Chi-Chi, Taiwan, Earthquake. Bull. Seismol. Soc. Am. 91, 893-1395.
Weinberg, A.M. (1967). "Reflections on Big Science." MIT Press, Cambridge, MA.
Wielandt, E. and G. Streckeisen (1982). The leaf-spring seismometer: design and performance. Bull. Seismol. Soc. Am. 72, 2349-2367.
Willemann, R.J. and D.A. Storchak (2001). Data collection at the International Seismological Centre. Seismol. Res. Lett. 72, 440-453.
Wu, Y.M., W.H.K. Lee, C.C. Chen, T.C. Shin, T.L. Teng, and Y.B. Tsai (2000). Performance of the Taiwan Rapid Earthquake Information Release System (RTD) during the 1999 Chi-Chi (Taiwan) earthquake. Seismol. Res. Lett. 71, 338-343.
Notes
1. Because of their low magnification, slow recording speed, and lack of damping, Milne seismograms are useful only for recording earthquakes of M ~ 8 or larger. Numerous seismographs were developed beginning in the late 1890s. Mechanical seismographs (e.g., the Wiechert) are considered the first generation. Electromagnetic seismographs (developed by B. Galitzin in the first decade of the 20th century) are considered the second generation, and dominated instrument design for the next 60+ years. The third-generation seismographs developed in the 1970s are based on electronic force feedback.
2. Many dedicated individuals kept John Milne's vision alive for 50 difficult years. After Milne's death in 1913, the Shide Circulars were continued as bulletins by the British Association's Seismological Committee (under J.H. Burgess and H.H. Turner). This publication became "The International Seismological Summary" (ISS) in 1922 under the direction of H.H. Turner, supported by the International Union of Geodesy and Geophysics. After Turner's death in 1930, ISS volumes for 1927 to 1935 were issued by the University Observatory, Oxford, and the volumes for 1936 to 1961 by the Kew Observatory, Richmond. The ISS volumes for 1962 and 1963 were issued by the International Seismological Centre (ISC), which was organized in 1964 and continues to this day (Willemann and Storchak, 2001).
3. Earthquake prediction has always been a controversial subject. Proponents and skeptics debate it with an intensity comparable to that found in religion or politics. Richter (1958, p. 8) wrote, "Prediction of earthquakes in any precise sense is not now possible. Any hope of such prediction looks toward a rather distant future. Cranks and amateurs frequently claim to predict earthquakes. They deceive themselves, and to some extent the public...." In the 1960s and 1970s, some seismologists claimed that they had good leads for predicting earthquakes, almost all in hindsight. Excitement soon faded because their claims could not be replicated. A few lonely voices even questioned the usefulness of a successful earthquake prediction. Nevertheless, earthquake prediction programs did advance earthquake seismology in many areas.
4. The concept of IRIS began in the late 1970s as a series of discussions on how to advance global seismology. One concern was the failure of the WWSSN to modernize and
expand using digital technology (Paul Richards, personal communication, 2001).
5. After the 1906 San Francisco earthquake, government officials and civic leaders did not want earthquakes studied, because publicity about them was bad for attracting business to the then-developing California (Geschwind, 2001); it was maintained that the fire, not the earthquake, destroyed San Francisco. The "Report of the State Earthquake Investigation Commission" on the 1906 earthquake (Lawson, 1908), in two volumes with an atlas, was published by the private Carnegie Institution of Washington in Washington, DC.
6. Information about the locations of the Loma Prieta and Northridge earthquakes took ~1 hour and ~30 min, respectively, to reach the media and the public, even though the real-time systems determined the locations quickly. According to David Oppenheimer (personal communication, 2001), "We lost A/C power and could not report out the solution [of the Loma Prieta earthquake location by the real-time system] to terminals or pagers." According to Lucy Jones (personal communication, 2001), in the Northridge case the real-time system did not page the results because it flagged the location as a probable telemetry glitch.
7. The amount of seismic data is large from the seismologists' point of view, but small in comparison with that in many other disciplines. Unfortunately, most seismologists do not pay much attention to data archiving and, as a result, most seismograms and seismic data more than 20 years old are difficult to access. Current technology in computer storage can easily handle terabytes of data, but all storage media have finite lives and must be periodically renewed if the data are to be preserved permanently.
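Renewing an archive means more than copying files: each copy must be checked against a known fixity record. The following is a minimal sketch in Python (the manifest format and file names are assumptions for illustration, not any archive's actual scheme) of how such a check might be done with checksums.

    import hashlib
    import os

    def sha256_of(path, chunk_size=1 << 20):
        # Compute the SHA-256 digest of a file, reading it in 1-MB chunks
        # so that even very large seismogram files fit in memory.
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def verify_archive(manifest_path):
        # The manifest is assumed to hold one "digest filename" pair per
        # line, written when the data were first archived.  Any file that
        # is missing or whose digest has changed must be restored from a
        # second, independently stored copy.
        damaged = []
        with open(manifest_path) as f:
            for line in f:
                if not line.strip():
                    continue
                stored_digest, name = line.split(maxsplit=1)
                name = name.strip()
                if not os.path.exists(name) or sha256_of(name) != stored_digest:
                    damaged.append(name)
        return damaged

Run periodically, such a check turns "renewing the media" from a hopeful copy operation into a verifiable one.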
8. Some seismologists believe that it is cheaper to design and build one's own instruments than to buy commercial ones. There is no doubt that designing and building an instrument oneself is instructive and useful for learning, and in some cases it is necessary to develop one's own instruments for applications with no commercial offering. But the argument that the parts cost only about 1/5 the price of a commercial instrument, and that one would therefore save a lot of money, is false. By the time a prototype instrument is built and working in the field, it will usually have cost more than the comparable commercial product once salaries and overheads are taken into account.
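The arithmetic is easy to make explicit. A minimal sketch with purely illustrative figures (none of them from the text) shows how quickly labor and overhead overwhelm the apparent savings on parts.

    # Illustrative cost comparison; every figure below is hypothetical.
    COMMERCIAL_PRICE = 10_000.0   # price of one commercial instrument, USD
    PARTS_FRACTION = 1.0 / 5.0    # parts cost as a fraction of that price
    LABOR_MONTHS = 12             # time to reach a field-worthy prototype
    MONTHLY_SALARY = 6_000.0      # engineer's salary, USD per month
    OVERHEAD_RATE = 0.5           # institutional overhead charged on salary

    parts = COMMERCIAL_PRICE * PARTS_FRACTION
    labor = LABOR_MONTHS * MONTHLY_SALARY * (1.0 + OVERHEAD_RATE)

    print(f"parts only:       ${parts:10,.0f}")             # $2,000
    print(f"labor + overhead: ${labor:10,.0f}")             # $108,000
    print(f"prototype total:  ${parts + labor:10,.0f}")     # $110,000
    print(f"commercial price: ${COMMERCIAL_PRICE:10,.0f}")  # $10,000

Under these assumed numbers the home-built instrument costs roughly ten times the commercial one, which is the point of the note.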
9. Sun-Tzu was a general in the Kingdom of Wu (514-496 BC) in China. He gained an audience with the King by presenting his "Art of War." He had a distaste for war and urged using military action only as a last resort, arguing that military campaigns would always be costly and questioning whether there would be any financial gain even from a successful one. He emphasized strategy, logistics, and discipline on the battlefield. The "Art of War" has been widely circulated since his time, and bamboo strips containing it (dated to about 200 BC) were discovered in 1974 in an ancient tomb.
10. C.N. Parkinson made the remarkable discovery that the number of subordinates in a government agency multiplies at a rate of 5-7% of the total staff per year, regardless of need. He based this on staffing data from the British Admiralty and other British bureaus.
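To see what such growth implies, a short compound-growth calculation (a sketch: only the 5-7% rates come from Parkinson, the rest is arithmetic) shows that an agency growing at these rates doubles its staff every 10 to 14 years, whatever the workload.

    import math

    # Staff after t years grows as staff(0) * (1 + rate) ** t, so the
    # doubling time is ln(2) / ln(1 + rate).
    for rate in (0.05, 0.06, 0.07):
        doubling_years = math.log(2.0) / math.log(1.0 + rate)
        print(f"growth {rate:.0%}: staff doubles in {doubling_years:.1f} years")

    # Output:
    #   growth 5%: staff doubles in 14.2 years
    #   growth 6%: staff doubles in 11.9 years
    #   growth 7%: staff doubles in 10.2 years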
11. For example, responsibility for earthquake monitoring in the United States government has already changed hands four times in less than a century: it began at the Weather Bureau in 1914, moved to the US Coast and Geodetic Survey in 1925, merged into the Environmental Science Services Administration (ESSA) and then the National Oceanic and Atmospheric Administration (NOAA) in the 1960s, and merged into the US Geological Survey (USGS) in the early 1970s. The USGS itself narrowly escaped being eliminated by the US Congress in the mid-1990s.
12. For example, after a few hundred seismographs were established around the world in the early 1900s, the gross structure of the Earth's interior was worked out by the 1930s. The establishment of the World-Wide Standardized Seismograph Network in the early 1960s enabled the study of global seismicity and focal mechanisms on a scale that was previously impossible. As a result, earthquake seismology made significant contributions to the development of the theory of plate tectonics in the late 1960s.
13. After the 1989 Loma Prieta earthquake, Yi-Ben Tsai realized the value of strong-motion records for correlating ground shaking with structural damage. On a visit to Taiwan, he met Dr. Ching-Yen Tsay, the then-new Director-General of the Central Weather Bureau, and proposed an extensive strong-motion instrumentation program for the urban areas of Taiwan. Dr. Tsay enthusiastically accepted the proposal and persuaded Dr. Chan-Hsuan Liu, the then Minister of Transportation and Communication, to incorporate it as a part of the Six-year National Construction Programs then being planned. An advisory board (chaired by Ta-Liang Teng) was established for the detailed planning and execution of the strong-motion instrumentation program, and subsequently a budget of about US $72 million was authorized by the Taiwan Legislature in 1991. I spent full time working in this program as an invited advisor, on loan from the USGS. The program was completed slightly ahead of schedule in 1996, for a little over US $40 million. Its successful completion was due largely to the efficient execution by Tony Shin and his staff at CWB, and
to a small extent to my desire to retire from the USGS in 1995. None of us expected that a major earthquake of magnitude 7.6 would occur so soon in Taiwan.
14. The following is a personal example. I joined the earthquake studies group of the US Geological Survey in 1967; this group had its origin in the Vela-Uniform program of the Defense Advanced Research Projects Agency (DARPA). The simple PC-based real-time system I developed in the 1980s (Lee, 1994) was a by-product of a DARPA-funded project to conduct experiments in a quarry, and I thank Bob Blandford for his kind support. Its early success was in monitoring volcanic eruptions (e.g., Kerr, 1991; Murray et al., 1996). I was able to modify it quickly for the two key elements of the CWB strong-motion instrumentation program in Taiwan: (1) the RTD system (later refined by Y.M. Wu and his associates at CWB), and (2) the strong-motion array systems deployed in buildings and bridges. This simple PC-based real-time system, with reference accelerometers and displacement gauges, permitted me to evaluate commercial accelerographs quickly on a shake table, and made it possible to accelerate the procurement of accelerographs and accelerometers for deployment in Taiwan under open bidding.
15. As pointed out by Duncan Agnew (personal communication, 2001), the funding system has a legitimate bias toward novelty and against doing the same thing over and over, as in long-term seismic monitoring. Seismology has to get by on occasional operational needs: whenever a damaging earthquake strikes a populated area, everyone becomes very interested, but not for the rest of the time. One solution is for a government to fund seismic monitoring by imposing a "tax" on building permit fees; the California Strong Motion Instrumentation Program, for example, has been funded this way.
16. In his Nobel lecture, Abdus Salam (1979) said he gave up experimental physics and turned to quantum field theory because he recognized that the craft of experimental physics was beyond him, for "it was the sublime quality of patience: patience in accumulating data, patience with recalcitrant equipment" that he sadly lacked.
Editor's Note
For readers' convenience, two out-of-print books (Lee and Stewart, 1981; Lee, Meyers, and Shimazaki, 1988) are given as PDF files on the attached Handbook CD under the directory \17Lee1.