Preferred Citation: Litehiser, Joe J., editor. Observatory Seismology: A Centennial Symposium for the Berkeley Seismographic Stations. Berkeley: University of California Press, c1989. http://ark.cdlib.org/ark:/13030/ft7m3nb4pj/


 

Four—
The Global Seismographic Network:
Progress and Promise

Adam M. Dziewonski

Introduction

The opening of the Berkeley stations preceded by two years the first documented teleseismic observation. According to Bolt (1982), E. von Rebeur-Paschwitz interpreted the disturbances recorded by horizontal pendulums at Potsdam and Wilhelmshaven as caused by the earthquake in Tokyo of April 18, 1889. After that, progress in global seismology was rapid. The first global networks were established at the turn of the century. This coincided with the beginning of international cooperation in seismology; the systematic determination of epicenters of the world's large earthquakes reaches as far back as 1899.

The rewards were not long in coming. In 1906 Oldham discovered the liquid core, and in 1909 Mohorovičić reported the discontinuity between the crust and the mantle. By the end of the 1930s the conceptual image of the spherically symmetric model of the elastic properties of the Earth's interior was essentially complete. It was also rather accurate. The travel-time tables of Jeffreys and Bullen, first published in 1939, are still used to locate earthquakes by the National Earthquake Information Center (NEIC) of the U.S. Geological Survey (USGS) and by the International Seismological Centre (ISC). Important progress was made in the 1960s in the recovery of the fine structure of the upper mantle and in observations of the free oscillations of the Earth. It is significant, however, that this progress depended on the availability of waveform data to researchers.

Studies using waveform data from a widely distributed network of stations were difficult until copies of centrally archived records became available. This, in addition to the uniformity of the instrumental responses, was the



principal advantage of the Worldwide Standardized Seismographic Network (WWSSN) established twenty-five years ago. The majority of the stations of this analog network, which once had 125 installations, are still operational.

The first two global networks with digital recording became operational in the mid-1970s. These were the International Deployment of Accelerometers (IDA; Agnew et al., 1976) and the Global Digital Seismographic Network (GDSN; Peterson et al., 1976) operated by the USGS. They were complementary in their response: IDA was designed to record ultralong-period waves, while the GDSN had its highest sensitivity in the passband important for discriminating between nuclear explosions and earthquakes. Both suffered, however, from a rather limited dynamic range. The digital array installed in Gräfenberg, West Germany, in 1976 did much to convince the seismological community of the advantages of broadband recording (Harjes and Seidl, 1980).

Data from the early digital networks made a very significant contribution to seismology; some say that seismic tomography (Anderson and Dziewonski, 1984) will cause a new revolution in earth sciences. There has been progress in the routine quantification of earthquakes. The two figures shown here are meant to convey what is being done.

Figure 1 is a view of the Earth from a point 35,000 km from its center. The surface has been cut with a spherical triangle, each of its sides 10,000 km long. Velocity anomalies are mapped with a variable density of shading: the slowest velocities are the lightest. This is a composite picture, showing shear-velocity anomalies in the upper mantle, retrieved from mantle-wave analysis by Woodhouse and Dziewonski (1984); compressional-velocity anomalies in the lower mantle, obtained from travel-time residuals by Dziewonski (1984); and compressional-velocity anomalies in the inner core, obtained from observations of the splitting of normal modes by Giardini et al. (1987). While progress continues—there are now shear-velocity models of the lower mantle from waveform inversion, and it has been proposed that the inner core is anisotropic—the important fact to notice is that images of the Earth's interior are being built from various types of data from very different frequency ranges.

Figure 2 shows eighty-eight centroid-moment tensor (CMT) solutions obtained from the analysis of GDSN data for a single month, March 1987 (Dziewonski et al., 1988). The CMT method was developed by Dziewonski et al. (1981) and later generalized by Dziewonski and Woodhouse (1983) to incorporate mantle waves. Woodhouse and Dziewonski (1984) developed algorithms for the direct and inverse problems for lateral heterogeneity; since 1984 the CMT solutions have been derived using synthetic seismograms and differential kernels corrected for aspherical structure. With the analysis extended back in time to 1977, the total number of CMT solutions for the last



Figure 1
"Window into the Earth." A perspective view from a distance of 35,000 km
to the Earth's center, showing regions of fast (dark) and slow (light) seismic
velocities in the mantle and in the inner core. Upper mantle model is determined
from inversion of long-period waveforms, lower mantle from travel times of
body waves, and inner core from splitting of free oscillation spectral peaks.
Computer graphics by John Woodhouse, Harvard University.



Figure 2
Centroid-moment tensor solutions for eighty-eight earthquakes occurring during March 1987 (Dziewonski et al., 1988). Although difficult
to distinguish in this figure, both moment-tensor (shading) and best-double-couple solutions (solid lines) are shown. Beachball diameter
is a linear function of magnitude; the smallest M₀ shown is 5 × 10²³ dyne-cm, the largest (off the coast of Chile), 5 × 10²⁷ dyne-cm.



eleven years stands at about 6,500. This data set allows us to investigate in detail spatiotemporal variations in regional stress release.
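The CMT catalog quotes scalar moments M₀ in dyne-cm, as in the caption to figure 2. The chapter does not give the conversion, but the standard moment-magnitude relation of Kanamori (1977), Mw = (2/3) log₁₀ M₀ − 10.7 (with M₀ in dyne-cm), ties those moments to familiar magnitudes; a quick check on the extremes of figure 2:

```python
import math

def moment_magnitude(m0_dyne_cm):
    """Moment magnitude from scalar seismic moment in dyne-cm
    (standard Kanamori relation): Mw = (2/3)*log10(M0) - 10.7."""
    return (2.0 / 3.0) * math.log10(m0_dyne_cm) - 10.7

# Smallest and largest events shown in figure 2:
print(round(moment_magnitude(5e23), 1))  # 5.1
print(round(moment_magnitude(5e27), 1))  # 7.8
```

The four-order-of-magnitude spread in moment thus corresponds to fewer than three magnitude units, which is why the figure can show both on one map.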

The figures were selected arbitrarily only as illustrations. When the "Science Plan for a New Global Seismographic Network" was prepared by the Incorporated Research Institutions for Seismology (1984), as many as thirty-four scientists from sixteen institutions contributed to it. Science plans were also prepared for GEOSCOPE, a global network operated by French universities (Romanowicz et al., 1984), and ORFEUS (Observatories and Research Facilities for European Seismology), an association of fifteen western European countries. Chapter 3 in this volume, by M. J. Berry, describes the formation and objectives of the Federation of Digital Broadband Seismographic Networks in the context of the international exchange of seismic data. The Federation will adopt minimum common standards for the response and dynamic range of the stations of member networks. The Federation may also adopt optimum standards. These, as preliminary discussions indicate, could be very close to those chosen by IRIS when it formulated the design goals for its Global Seismographic Network (GSN) stations. The description of a GSN station and progress toward its implementation are the subject of this chapter.

The Very Broad Band Seismograph

A meeting of the Ad Hoc Group on Global Seismographic Network was held in July 1983. Its report was an essential step toward the establishment of the GSN project. At the same time, less than thirty miles away, the final touches were being put on the first operational Very Broadband (VBB) Seismograph System with digital recording (Wielandt and Steim, 1986).

The idea of a VBB system is often credited to Plesinger (Plesinger and Horalek, 1976), who, in 1972, installed a system with a response flat to ground velocity from 0.3 to 300 seconds at Kasperske Hory, Czechoslovakia. The signals were recorded on magnetic tape using FM modulation. With the response falling off as ω² at long periods there was enough sensitivity to record not only the gravest modes of the free oscillations of the Earth, but also the tides. At short periods, frequencies up to 5 Hz could be easily captured, and thus the entire band needed to record teleseismic signals could be accommodated.

Wielandt and Steim (1986) put it this way: "The most obvious argument in favour of such a VBB system is that it does not a priori determine the seismological research. Every user would simply extract that band of frequencies in which he is interested, unrestricted by any artificial subdivision of the spectrum."

The GSN "Design Goals" document (IRIS, 1985) states several basic requirements:



1. The sensitivity must be sufficient to resolve seismic signals at the level of the lowest ambient noise within the band from 0.3 mHz to 5 or 10 Hz.

2. The system should record on scale the largest teleseismic signals—say, an earthquake with Mw = 9 at 30°.

3. The linearity of the system should be such that signals near the ground noise minimum can be resolved in the presence of the maximum ground noise at other frequencies (microseismic storms).

Wielandt and Steim argue that it is possible to meet these requirements with a single data stream generated by a feedback system with a response flat to velocity between a frequency of 5 Hz and a period of 360 s. A much larger body of evidence related to this point can be found in Steim (1986).

The wide band of frequencies, with significant variation in spectral power across it, imposes important requirements on the other elements of the system, notably the analog-to-digital (A/D) converters. The need for a 140-dB operating range could be met with a gain-ranging system; it could also be met with an A/D converter having 24-bit resolution, undoubtedly more expensive and initially subject to serious doubts as to whether it could be realized.
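The choice between gain ranging and a 24-bit converter comes down to arithmetic: an ideal N-bit converter spans 20 log₁₀ 2ᴺ ≈ 6.02N dB, so 24 bits just clears the 140-dB goal in a single fixed-gain data stream. A back-of-the-envelope sketch (quantization noise and real-converter imperfections reduce the usable figure):

```python
import math

def dynamic_range_db(bits):
    """Dynamic range of an ideal N-bit converter: 20*log10(2**N)."""
    return 20.0 * math.log10(2 ** bits)

print(round(dynamic_range_db(16), 1))  # 96.3 dB -- well short of 140 dB
print(round(dynamic_range_db(24), 1))  # 144.5 dB -- meets the goal with margin
```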

Figure 3a, from Steim (1986), demonstrates the results of a test carried out in parallel on two high-resolution digitizers of different design. Two tones were used, one signal of 0.03 Hz with an amplitude of ±5,000,000 counts and the other of 1 Hz with an amplitude of ±5 counts, a million times smaller. The top two boxes show the unfiltered outputs from both digitizers, while the bottom two show the results after processing by a high-pass digital filter. A signal of 1 Hz is clearly present in both outputs, and it appears that distortion might have been introduced in part by the analog input, since the noise components in the two time series are somewhat similar. Figure 3b is a frequency-domain representation of the same test. The peak at 1 Hz with an amplitude of −120 dB is about 30 dB above the noise level. It has been demonstrated, therefore, that high-resolution A/D converters work. But are they worth the additional cost?

The following two examples also come from the work of Steim (1986). One of the experimental versions of the system developed at Harvard used a gain-ranged A/D converter with a 15-bit mantissa. With the sensitivity set at a reasonable level, it is only for earthquakes with magnitudes above 7.5, at teleseismic distances, that the need for gain ranging arises. Therefore, not until the Michoacan earthquake of September 19, 1985, was an offset of four counts between the zeros of the high-gain and low-gain channels discovered.

In the middle of figure 4 is a plot of the original VBB trace of the vertical component of the large September 21 Michoacan aftershock. No evidence of distortion can be seen. At the top are two traces obtained by filtering the



Figure 3a
A two-tone test of two 24-bit A/D converters of different design and
manufacture. A 0.03-Hz tone and a 1-Hz tone, one million times smaller,
are applied simultaneously to the two encoders. The 1-Hz tone (lower
two panels) is visible after high-pass filtration. From Steim (1986).



Figure 3b
Spectral representation of the experiment shown in the time domain in figure 3a. Note that the
1-Hz tone is 120 dB down but still some 30 dB above the noise floor. From Steim (1986).



Figure 4
Quantization and offset errors at frequencies above 1 Hz, shown for the gain-ranged digitizer recording
of the Michoacan aftershock of September 21, 1985. See text for discussion; from Steim (1986).



VBB record with a short-period WWSSN response. The top trace shows distinct glitches at the times of gain switching. The errors appear to be largely removed when correction for the zero offset is applied. The bottom pair shows the same VBB record processed by a 4-Hz high-pass filter. The spikes due to gain switching are now the most prominent feature, and even correction for zero offset does not remove them completely. What cannot be corrected at all is the high noise floor, which is particularly noticeable at the high frequencies of the low-gain channel.

Gain-switching errors can also be important at very long periods. Figure 5 (top) shows the very-long-period recording of both the main shock and the largest aftershock of the Michoacan earthquake. In both cases significant distortions are visible immediately after the R1 arrivals. This is because the zero offset causes an overall bias when the gains are switching. In this case, correction for the offset seems to remove the distortion to a large extent.

It is next to impossible to assure perfect accuracy of the gain steps and matching of the zero levels. This is particularly true with regard to field-deployed instruments. Electronic parts can age differently, and the performance of instruments may change with time. In addition to the above argument against gain switching, there are other reasons to use 24-bit digitizers in the VBB system. Winter marine microseisms can be 4,000 times larger in ground velocity than the ground noise in the free-oscillation band. This factor could be 200,000 for a response flat to acceleration.
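The two factors quoted above can be restated in decibels, which makes plain how much of the converter's range each response shape consumes (a sketch using only the numbers in the text):

```python
import math

def ratio_db(ratio):
    """Express an amplitude ratio in decibels."""
    return 20.0 * math.log10(ratio)

print(round(ratio_db(4_000)))    # 72 dB: velocity-flat response
print(round(ratio_db(200_000)))  # 106 dB: acceleration-flat response
```

With a response flat to velocity, roughly 70 dB of a 24-bit converter's ~144-dB range is taken up by winter microseisms, leaving ample headroom near the noise floor; flat to acceleration, more than 100 dB would be consumed.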

In 1987, as an interim step toward improving the quality of digital data, IRIS, in cooperation with the USGS, installed VBB sensors at five stations already equipped with digital data loggers: College, Alaska; Afiamalu, Samoa; Kevo, Finland; Charters Towers, Australia; and Toledo, Spain. A preliminary version of the GSN station (to be described below), which includes 24-bit digitizers, will soon be deployed at several U.S. sites: College, Alaska; Kipapa, Hawaii (in cooperation with GEOSCOPE); Cathedral Caves, Missouri; Pasadena, California; and Harvard, Massachusetts. These stations will be equipped with a dial-up system, and it will be possible for individual seismologists to retrieve data from them in quasi-real time.

IRIS also cooperates with the IDA project. New VBB stations have been deployed at Piñon Flats, California, and on Easter Island. Early upgrades are scheduled for Eskdalemuir, Scotland, and Sutherland, South Africa.

A state-of-the-art Data Collection Center is being developed jointly by IRIS and the USGS at the Albuquerque Seismological Laboratory. It is planned that the vastly increased data volume (GSN stations will be recording all three components of VBB data continuously at twenty samples per second) will be handled using new optical mass-storage media.
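The scale of that data volume is easy to bound. Assuming, purely for illustration, 32-bit samples before compression (the chapter does not state the recorded word length):

```python
# Continuous VBB recording: three components at twenty samples per second.
channels = 3
sps = 20
bytes_per_sample = 4         # assumed 32-bit words, before compression
seconds_per_day = 86_400

mb_per_day = channels * sps * bytes_per_sample * seconds_per_day / 1e6
print(round(mb_per_day, 1))  # 20.7 MB per station per day, uncompressed
```

Across a hundred-station network that is on the order of 2 GB per day before the high-rate VSP and LG channels are counted, which in 1988 made optical mass storage attractive.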

Other member networks of the Federation are also adopting the VBB response. The very first network station using this response is a GEOSCOPE



Figure 5
Very-long-period recordings of the Michoacan earthquakes of September 19 and 21, 1985, before (top)
and after (bottom) correction of gain-ranged digitizer for offset and gain errors. From Steim (1986).



installation in Japan. Canada, Italy, Germany, and others are also installing VBB stations.

Description of a GSN Station

The Science Plan for New Global Seismographic Network (IRIS, 1984) documented the need for a broadband station with certain characteristics. This issue was followed up in greater detail in the Design Goals for GSN (IRIS, 1985). Both documents had a distribution of about 500 copies, roughly half sent to scientists outside the United States. In the spring of 1986 these documents were translated into design specifications and submitted to IRIS's administration.

In April 1987, following appropriate procurement procedures, IRIS entered into a contractual agreement with Gould, Inc., for construction of a prototype station processor. The choice of the architecture of the system was guided by several considerations:

1. Meeting the scientific objectives

2. Reliability and ease of maintenance

3. Modularity and minimal dependence on unique products

4. Feasibility of future upgrades and modifications of the chosen elements of the system

5. Scientific and operational benefits for the host organization

Figure 6 is a block diagram of the system. The two principal components are the data-acquisition unit (DA) and the data-processing unit (DP). They are connected by a serial link or any form of telemetry, from telephone lines to satellites. The following is a brief description of the major features of a GSN station.

Data-Acquisition (DA) Module

The DA is the heart of the system. With only minor software modifications relative to the agreed-upon specifications, and the addition of a tape recorder, this unit would be, in principle, capable of fulfilling the principal objective of the entire system, namely, the acquisition and recording of data.

The major components of the DA system are:

1. A very broadband seismic sensor system. The STS-1V(H)/VBB, built by Streckeisen & Co., is a set of sensors that satisfy the IRIS design goals for installation in a vault.

2. A high-resolution digitizer/calibrator unit (HRDCU). This is a three-channel 24-bit formatted digitizer/calibrator for use with the VBB sensors. Its data rate is twenty samples per second (sps) per channel.



Figure 6
Block diagram of the prototype GSN station. From report U87-260 by Gould Inc.



3. Kinemetrics OMEGA UTC time receiver.

4. Auxiliary digitizer unit (AUXDU). This is for monitoring various state-of-health parameters of the VBB system and, optionally, other parameters such as atmospheric pressure or geomagnetic field. Data rates are programmable.

5. Optional low-resolution digitizer-calibrator unit (LRDCU). This is for digitizing the output of additional sensors.

6. Optional very-short-period (VSP) sensor system. "Design Goals" suggested a sampling rate of 100 sps. Consideration is being given to increasing it up to 200 sps.

7. Low-gain (LG) sensor subsystem. This was originally thought of as a means to extend the dynamic range of the VBB system and could serve as a strong-motion monitor. Its sampling rate is the same as that of the VSP system. There could be a limitation on the aggregate data rate of the VSP and LG channels (about 600 sps total—all sensors and all components).

8. Data-acquisition unit (DA). This is based on a 68000 or 68020 CPU with a VME bus, operating under an OS-9 operating system and special application software. The DA processor interfaces with the three GSN digitizers (HRDCU, LRDCU, and AUXDU) and with the data-processor unit (DP). All interfaces are bi-directional. The major data-manipulation functions of the DA processor are to collect data from the three digitizers, time tag data blocks, compress VBB data, perform VSP and LG event detection, and transmit data to the DP.
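The VBB compression step exploits the fact that adjacent samples of a 20-sps trace differ by far fewer bits than their absolute values. The scheme developed in Steim (1986) packs first differences into variable-length fields within fixed frames; the sketch below keeps only the differencing idea and none of the frame format:

```python
def compress(samples):
    """First-difference encode: keep the first sample, then the deltas."""
    if not samples:
        return []
    out = [samples[0]]
    out.extend(b - a for a, b in zip(samples, samples[1:]))
    return out

def decompress(encoded):
    """Invert compress() by running summation."""
    out, total = [], 0
    for d in encoded:
        total += d
        out.append(total)
    return out

trace = [100000, 100012, 100030, 100025, 99990]
packed = compress(trace)
print(packed)                       # [100000, 12, 18, -5, -35]
assert decompress(packed) == trace  # the encoding is lossless
```

The deltas fit comfortably in 8 or 16 bits even when the raw samples need 24, which is where the actual saving comes from.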

Data-Processor (DP) Module

The DP module provides the means for on-site and remote monitoring of system performance, communications with remote users, digital and analog recording of data, maintenance of data buffers, display of data for a selected time window on a graphic terminal, and many other functions. Its principal role is the enhancement of the functions of the DA unit, on-site quality control, and providing station personnel and remote users with information on seismic events.

The station DP consists of commercially available hardware modules and is based on a standard VME bus. Figure 7 gives some details of the DP configuration. The computing power of the 68020 chip is such that only a small fraction of its capacity will be used for the planned tasks, which leaves much room for expansion. Even now it is a fairly complex, sophisticated system.

Figure 8 is a block diagram of the DP unit's data-manipulation functions. The VBB data are decompressed so that they can be processed through a series of digital filters that produce short-period (SP), long-period (LP; 1 sps), and very-long-period (VLP; 0.1 sps) data. The LP and VLP data



Figure 7
Configuration of the GSN station data processor (DP). From report U87-260 by Gould Inc.



Figure 8
Data-processing functions carried out by the DP unit. From report U87-260 by Gould Inc.

streams are obtained using finite-impulse-response (FIR) filters with a very sharp corner near the edge of the passband. This assures retention of the maximum information in the passband and may modify the way in which seismologists evaluate the usefulness of various data streams.

Figure 9 is taken from Steim (1986). It shows the original VBB channel and five channels derived from it. There is a twenty-second delay caused by a FIR filter. The trace labeled LP is obtained from the VBB stream using a 201-point FIR filter. It is clear that it contains most of the information present in the VBB stream, but with twenty times fewer samples. It very well may be that this will be the most frequently used data stream, particularly because of its relatively light archival burden. The other data streams shown contain less high-frequency energy, with the simulated SRO-LP channel representing an extreme. The VBB response is particularly convenient for a stable recovery of the ground-displacement function (bottom trace).
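The derivation of the 1-sps LP stream from the 20-sps VBB stream can be sketched as a generic windowed-sinc low-pass FIR filter followed by 20:1 decimation (a sketch only; the coefficients of the actual sharp-cornered 201-point station filter are not given in the text, and the cutoff below is illustrative):

```python
import numpy as np

def lowpass_fir(numtaps, cutoff_hz, fs):
    """Windowed-sinc low-pass FIR filter coefficients (Hamming window)."""
    n = np.arange(numtaps) - (numtaps - 1) / 2.0
    h = np.sinc(2.0 * cutoff_hz / fs * n) * np.hamming(numtaps)
    return h / h.sum()            # normalize to unit gain at DC

fs, decim = 20.0, 20              # 20-sps VBB stream, 20:1 down to 1 sps
taps = lowpass_fir(201, 0.4, fs)  # cutoff below the 0.5-Hz output Nyquist

t = np.arange(0, 60, 1.0 / fs)
vbb = np.sin(2 * np.pi * 0.05 * t)   # 20-s-period test signal
# Filter, then keep every 20th sample; mode="same" centers the output,
# absorbing the (201 - 1)/2-sample symmetric-filter delay.
lp = np.convolve(vbb, taps, mode="same")[::decim]
print(len(vbb), len(lp))             # 1200 60
```

Anti-alias filtering before decimation is what lets the 1-sps stream retain everything below its Nyquist frequency, which is the point the text makes about the LP channel.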

The command, control, and algorithmic functions executed by DP are shown in figure 10. Rather than discuss them in detail, let us list functions that the operator can execute from the system console without interrupting acquisition of data:

1. Adjust the scale and select active analog monitor channels, including the selection of simulated WWSSN and Seismic Research Observatory (SRO) response functions.

2. View a continuously updated full-screen status display that shows a snapshot of all data channels, the internal and received UTC time, and several other system parameters.

3. View the system event log.

4. Change the tape cartridge without loss of data.

5. Examine the status of active processes.

6. View selected data waveforms from buffers or in real time.

7. Set, change, or display event-detection parameters.

8. Exchange message text over the real-time and dial-up ports.



Figure 9
The information content of the "broadband" 1-Hz LP data stream, obtained by processing VBB data
with a FIR filter with a very sharp roll-off near the edge of the passband. The four lower traces,
showing alternative responses, are derived from the LP data stream. The 20-s delay is due to the
finite length of the FIR filter. From Steim (1986).



Figure 10
Command, control, and diagnostic function structure within the
data-processor (DP) unit. From Gould Inc. report U87-260.

9. Control and set up a calibration cycle or program the onset time of a calibration sequence to be recorded with the station data.

10. Run a calibration analysis as a low-priority background job.

11. Log messages and the results of calibration to the system's mass-storage device.
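The event-detection parameters of item 7 are not specified in the text; in observatory practice the standard detector is a short-term-average/long-term-average (STA/LTA) ratio trigger. A minimal sketch (the window lengths and threshold below are illustrative, not GSN settings):

```python
def sta_lta(samples, sta_len, lta_len):
    """STA/LTA ratio of mean absolute amplitude over trailing windows;
    the ratio is 0 until the long window is full."""
    out = []
    for i in range(len(samples)):
        if i + 1 < lta_len:
            out.append(0.0)
            continue
        sta = sum(abs(s) for s in samples[i - sta_len + 1 : i + 1]) / sta_len
        lta = sum(abs(s) for s in samples[i - lta_len + 1 : i + 1]) / lta_len
        out.append(sta / lta if lta > 0 else 0.0)
    return out

# Quiet background noise followed by a sudden amplitude step (an "event"):
trace = [1.0] * 200 + [10.0] * 20
ratios = sta_lta(trace, sta_len=10, lta_len=100)
trigger = next(i for i, r in enumerate(ratios) if r > 4.0)
print(trigger)  # 205 -- a few samples after the onset at sample 200
```

The short window responds quickly to the arrival while the long window tracks the background, so the ratio spikes at onsets regardless of the absolute noise level.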

Many of the functions described above can be performed through a dial-up port, which allows for frequent checks of station performance from the network maintenance center.

It is expected that a prototype of the GSN system will be delivered in the fall of 1988 and, following tests, production units will be ordered. The first ten, budget permitting, will be deployed in 1989.

A Truly Global Network

The report of the Ad Hoc Group on Global Seismographic Network, prepared in July 1983, reads:

We envisage a network consisting of 100 [digital seismographic] stations of wide dynamic range covering frequencies from a fraction of a milliHertz to several Hertz or higher.



Figure 11
Inadequacy of future global coverage by land-based stations. Shaded
areas show blocks that will contain, by 1992, at least one station
satisfying the Federation standards.

The Network would be a major research facility for earth scientists studying the structure and dynamics of our planet. The concept of an integrated set of instruments operating in unison, with widespread and convenient access to global data by a large number of users is a natural step in the evolution of seismology.

What is not clear from this quotation is that all these instruments would be land-based. Even though deployment on oceanic islands would help to alleviate some of the imbalance in distribution, islands are anomalous features in the ocean floor and are characterized by a high level of seismic noise. It is clear that without a complement of permanent ocean-bottom or sub-bottom observatories it will not be possible to examine with uniform resolution either the stress release in the Earth or its internal laterally varying structure.

While important progress is being made in the deployment of land-based stations through efforts such as the IRIS initiative in the United States, the GEOSCOPE project in France, and the activities of other members of the Federation, there are, as figure 11 shows, clear limits to the quality of global coverage obtainable in this way. The surface of the Earth has been divided into 128 regions of roughly equal area. Regions in which there will be a Federation station with "global" characteristics by 1992 are marked by hatching. Most of the large "white" areas, except for the Soviet Union, are in the major ocean basins.

The proposal of permanent geophysical observatories on the ocean bottom received strong support during the COSOD II (Conference on Scientific



Ocean Drilling) meeting held in Strasbourg, France, in July 1987. While the official report of the COSOD II proceedings is yet to be published, the text below appears to represent the consensus of the participants directly involved in this issue.

The desirability of establishing ocean bottom and sub-bottom geophysical observatories in conjunction with drill holes has been obvious from the start of the deep sea drilling program. But no such observatories have been established for two principal reasons. First, that rocks cored from the drill holes have, by far, been the most important reason for drilling the cores. Second, it has not been clear that the requisite technology has been available for the various aspects of such observatories: development of the bottom and sub-bottom instruments, their emplacement, their maintenance and data retrieval. The reason for tying the ocean bottom/sub-bottom observatories to the drilling program is three-fold. First, that it is very important to have information about the subsurface at the site of the observatories. Second, that emplacing instruments in the borehole reduces noise levels significantly for some instruments (Shearer and Orcutt, 1987). Third, that some experiments specifically require the emplacement of instruments at different levels in the boreholes.

It is critical for ocean bottom seismic observatories to supplement the land-based coverage and to become an essential part of the global observing system during the next decade. Both short period and long period seismic observations are extremely important. Deployment on the ocean floor of a very broad band seismograph system . . . would be ideal. The primary purpose of this type of an observatory would be to monitor seismic activity on the ocean floor, and to record information needed in studies of earth structure and earthquake source mechanism.

Important technological problems remain to be solved. Partly as a result of the COSOD II discussions, new initiatives are developing that involve both the oceanographic and the seismological communities. The first step will be to evaluate the pilot projects that should be undertaken as soon as possible.

There is an unusual opportunity that will develop during the next few years, which should accelerate the process. The telephone companies are now changing to transoceanic cables using fiber-optic technology. This means that old, still functional cables will be abandoned. These cables could well be used to power ocean-bottom stations and to transmit their data in real time. Indeed, POSEIDON, a very imaginative project proposed by Japanese seismologists, assumes a major deployment of permanent ocean bottom stations in the western Pacific, using the underwater communication cables.

There is a chance, perhaps the greatest during the entire first century of global seismology, of covering the world with instruments of a quality and performance that few of us dared to dream of only a few years ago. An important vehicle for this development will be the largest-ever Earth-oriented international scientific program: Global Change. It is, to a large degree, up to us to convince the rest of the geoscience community that seismology and seismographic stations are indispensable tools for a comprehensive study of our changing environment.

References

Agnew, D., J. Berger, R. Buland, W. Farrell, and F. Gilbert (1976). International Deployment of Accelerometers: A network for very long period seismology. EOS, Trans. Am. Geophys. Un., 57: 180–188.

Anderson, D. L., and A. M. Dziewonski (1984). Seismic tomography. Scientific American, 251: 60–68.

Bolt, B. A. (1982). Inside the Earth: Evidence from Earthquakes. W. H. Freeman, San Francisco, 191 pp.

Dziewonski, A. M. (1984). Mapping the lower mantle: Determination of lateral heterogeneity in P-velocity up to degree and order 6. J. Geophys. Res., 89: 5929–5952.

Dziewonski, A. M., T. A. Chou, and J. H. Woodhouse (1981). Determination of earthquake source parameters from waveform data for studies of global and regional seismicity. J. Geophys. Res., 86: 2825–2852.

Dziewonski, A. M., G. Ekström, J. H. Woodhouse, and G. Zwart (1988). Centroid-moment tensor solutions for January–March, 1987. Phys. Earth Planet. Inter., 50: 116–126.

Dziewonski, A. M., and J. H. Woodhouse (1983). An experiment in the systematic study of global seismicity: Centroid-moment tensor solutions for 201 moderate and large earthquakes of 1981. J. Geophys. Res., 88: 3247–3271.

Giardini, D., X. D. Li, and J. H. Woodhouse (1987). Three-dimensional structure of the Earth from splitting in free oscillation spectra. Nature, 325: 405–411.

Harjes, H. P., and D. Seidl (1980). Digital recording and analysis of broad-band seismic data at the Gräfenberg (GRF) array. J. Geophys., 44: 511–523.

Incorporated Research Institutions for Seismology (1984). Science Plan for New Global Seismographic Network. IRIS, Washington, D.C., 130 pp.

——— 1985. Design Goals for New Global Seismographic Network. IRIS, Washington, D.C., 31 pp.

Peterson, J., H. M. Butler, L. G. Holcomb, and C. R. Hutt (1976). The seismic research observatory. Bull. Seism. Soc. Am., 66: 2049–2068.

Plesinger, A., and J. Horalek (1976). The seismic broadband recording and data processing system FBV/DPS and its seismological applications. J. Geophys., 42: 201–217.

Romanowicz, B., M. Cara, J. F. Fels, and D. Rouland (1984). GEOSCOPE—A French initiative in long-period three-component global seismic networks. EOS, Trans. Am. Geophys. Un., 65: 753–754.

Shearer, P. M., and J. A. Orcutt (1987). Surface and near-surface effects on seismic waves: Theory and borehole seismometer results. Bull. Seism. Soc. Am., 77: 1168–1196.

Smith, S. W. (1987). IRIS: A university consortium for seismology. Rev. Geophys. Space Phys., 25: 1203–1207.



Steim, J. M. (1986). The Very-Broad-Band Seismograph System. Ph.D. diss., Harvard University, Cambridge, 183 pp.

Wielandt, E., and J. M. Steim (1986). A digital very-broad-band seismograph. Annales Geophysicae. Series B, Terrestrial and Planetary Physics, 4: 227–232.

Woodhouse, J. H., and A. M. Dziewonski (1984). Mapping the upper mantle: Three-dimensional modeling of earth structure by inversion of seismic waveforms. J. Geophys. Res., 89: 5953–5986.


