[en] A non-intrusive, seismic subbottom profile survey of pond sediments was conducted on a former U.S. Naval Facility at Argentia, Newfoundland, to characterize the nature and extent of contamination. An IKB Seistec boomer was used in conjunction with C-CORE's HI-DAPT digital data acquisition and processing system and a differential GPS system. The survey was successful in locating regions of soft muddy sediments and in determining the thickness of these deposits. Subsurface buried objects, which are potential sources of pollution, were also identified. Intrusive profiling of the sediment was done with a new tool, the Soil Stiffness Probe, which combines two geophysical measurement systems to determine bulk density and shear stiffness. The muddy sediments were found to be highly 'fluidized', indicating that they could be easily removed with a suction dredge. 4 refs., 5 figs.
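The abstract does not describe how the probe's two measurements are combined, but a standard relation in soil dynamics links the quantities it reports: the small-strain shear modulus follows from bulk density and shear-wave velocity as G₀ = ρ·Vs². A minimal sketch, with purely illustrative values (the actual probe processing is not given in the record):

```python
# Illustrative only: small-strain shear modulus from bulk density and
# shear-wave velocity, G0 = rho * Vs^2 (a standard soil-dynamics relation).
# The Soil Stiffness Probe's actual processing is not described in the abstract.

def shear_modulus(bulk_density_kg_m3: float, vs_m_s: float) -> float:
    """Small-strain shear modulus G0 in Pa."""
    return bulk_density_kg_m3 * vs_m_s ** 2

# Soft, fluidized mud: low density and very low shear-wave velocity
g_mud = shear_modulus(1400.0, 10.0)     # 140 kPa -> easily dredged
# A stiffer sediment for comparison
g_stiff = shear_modulus(1900.0, 150.0)  # ~42.8 MPa
```

The orders-of-magnitude contrast between the two cases is what makes such a probe useful for flagging 'fluidized' deposits.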
[en] The following subjects were discussed: rock mechanics, physical properties of rock strata, seismics and seismology, exploration and data processing, geophysical investigations and borehole logs, studies of the earth's crust and mantle, archaeometry, geotechnics, rock and palaeomagnetism, petrohydraulics, petrophysical studies, electromagnetic probing, geodynamics. Separate records are available in the database for 14 papers. (RB)
Background: On March 25, 2015, a rapid landslide occurred upstream of the village of Gessi-Mazzalasino, in the municipality of Scandiano, affecting two buildings. Rapid landslides, owing to their high velocity and mobility, can affect large areas and cause extensive damage. Because landslide kinematics are often unpredictable, many authors have studied post-failure behavior in order to predict the landslide runout phase for hazard assessment.
Findings: To characterize the Gessi-Mazzalasino landslide, field surveys were integrated with the results of laboratory tests. The geometric characteristics (thickness, area and volume) and kinematic aspects of the landslide were estimated using a laser scanning survey and geomorphological data. To model the landslide and obtain its rheological parameters, a back analysis of the event was performed with a depth-averaged 3D numerical code called DAN3D. The results of the back analysis of the landslide propagation were validated against field surveys and velocity estimates along selected sections of the landslide. Finally, potential areas prone to failure or reactivation were identified, and a new simulation was performed using the back-calculated rheological parameters.
Conclusions: Rapid landslides are among the most dangerous natural hazards and the most frequent natural disasters in the world. Prediction of post-failure motion is therefore an essential component of hazard assessment wherever a potential source of a mobile landslide is identified. To assess the risk to the area, both numerical and empirical methods were applied to predict the runout phase of the phenomenon. For the numerical modelling of the landslide, carried out with the DAN3D code, the best results were obtained with a Voellmy rheological model, with a constant turbulence parameter (ξ) of 250 m/s² and a friction parameter (μ) between 0.15 and 0.19. The rheological parameters obtained through dynamic back analyses were used to evaluate the propagation phase and the deposition areas of potential new landslides that could affect the same area as the March 25, 2015 event. The runout length predicted by the DAN3D software was compared to runout lengths predicted by the empirical relations of Corominas (Can Geotech J 33:260–271, 1996), (Nat Hazards 19:47–77) and (UNICIV Report R-416, School of Civil & Environmental Engineering, UNSW, Sydney, Australia, 2003). All the data confirm that the impact area of possible future events will be smaller than that of the 2015 event, probably owing to the safety measures established after the landslide.
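The Voellmy rheology used in the back analysis combines a Coulomb friction term with a velocity-dependent "turbulence" term: τ = μσ + ρgv²/ξ. A short sketch with the back-calculated parameters quoted above (ξ = 250 m/s², μ = 0.15–0.19); the flow density and depth are illustrative assumptions, not values from the paper:

```python
# Hedged sketch of the Voellmy basal resistance used in DAN3D-style runout
# models: tau = mu * sigma + rho * g * v**2 / xi.
# xi = 250 m/s^2 and mu = 0.15-0.19 come from the abstract's back analysis;
# density and flow depth below are illustrative assumptions.

G = 9.81  # gravitational acceleration, m/s^2

def voellmy_resistance(mu, xi, rho, depth, velocity):
    """Basal shear resistance (Pa) for a flow of given depth and velocity."""
    sigma = rho * G * depth                  # bed-normal stress, Pa
    frictional = mu * sigma                  # Coulomb friction term
    turbulent = rho * G * velocity**2 / xi   # velocity-dependent term
    return frictional + turbulent

# Example: a 2 m deep flow of density 1800 kg/m^3 moving at 5 m/s
tau = voellmy_resistance(mu=0.17, xi=250.0, rho=1800.0, depth=2.0, velocity=5.0)
```

The turbulent term grows with v², so it dominates during fast propagation and the friction term controls where the flow finally stops, which is why μ governs the predicted runout length.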
[en] Conventional approaches based on adaptive subtraction for ground roll attenuation first predict an initial model for ground rolls and then adaptively subtract it from the original data using a stationary matching filter (MF). Because of the non-stationary property of seismic data and ground rolls, the application of a traditional stationary MF is not physically plausible. Thus, in the case of highly non-stationary seismic reflections and ground rolls, a stationary MF cannot obtain satisfactory results. In this paper, we apply a non-stationary matching filter (NMF) to adaptively subtract the ground rolls. The NMF can be obtained by solving a highly under-determined inversion problem using non-stationary autoregression. We apply the proposed approach to one synthetic example and two field data examples, and demonstrate a much improved performance compared with the traditional MF approach. (paper)
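The stationary baseline the abstract argues against can be sketched in a few lines: estimate a single least-squares filter f so that the convolved ground-roll model best matches the data, then subtract the match. This is only an illustration of the conventional MF step, not the paper's non-stationary method (which lets f vary with time via regularized non-stationary autoregression):

```python
import numpy as np

# Minimal sketch of stationary matched-filter adaptive subtraction, the
# conventional baseline described above. One least-squares filter is fit for
# the whole trace; the paper's NMF would instead allow time-varying filter
# coefficients and is not reproduced here.

def stationary_mf_subtract(data, noise_model, nf=11):
    """Return data minus the least-squares-matched noise model."""
    # Build the convolution matrix of the noise model (one column per lag)
    cols = [np.roll(noise_model, lag) for lag in range(nf)]
    for lag in range(nf):
        cols[lag][:lag] = 0.0  # causal filter: zero the wrapped samples
    A = np.stack(cols, axis=1)                    # shape (n_samples, nf)
    f, *_ = np.linalg.lstsq(A, data, rcond=None)  # least-squares filter
    return data - A @ f

# Toy example: data = scaled noise + one reflection spike;
# subtraction removes the noise component and leaves the spike.
rng = np.random.default_rng(0)
noise = rng.standard_normal(200)
signal = np.zeros(200)
signal[100] = 5.0
residual = stationary_mf_subtract(signal + 0.8 * noise, noise)
```

In the toy case the noise is perfectly stationary, so one filter suffices; the paper's point is that real ground rolls are not, which is what motivates the NMF.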
[en] We describe the design and data analysis of the DEEP2 Galaxy Redshift Survey, the densest and largest high-precision redshift survey of galaxies at z ∼ 1 completed to date. The survey was designed to conduct a comprehensive census of massive galaxies, their properties, environments, and large-scale structure down to absolute magnitude M_B = −20 at z ∼ 1 via ∼90 nights of observation on the Keck telescope. The survey covers an area of 2.8 deg² divided into four separate fields observed to a limiting apparent magnitude of R_AB = 24.1. Objects with z ≲ 0.7 are readily identifiable using BRI photometry and rejected in three of the four DEEP2 fields, allowing galaxies with z > 0.7 to be targeted ∼2.5 times more efficiently than in a purely magnitude-limited sample. Approximately 60% of eligible targets are chosen for spectroscopy, yielding nearly 53,000 spectra and more than 38,000 reliable redshift measurements. Most of the targets that fail to yield secure redshifts are blue objects that lie beyond z ∼ 1.45, where the [O II] 3727 Å doublet lies in the infrared. The DEIMOS 1200 line mm⁻¹ grating used for the survey delivers high spectral resolution (R ∼ 6000), accurate and secure redshifts, and unique internal kinematic information. Extensive ancillary data are available in the DEEP2 fields, particularly in the Extended Groth Strip, which has evolved into one of the richest multiwavelength regions on the sky. This paper is intended as a handbook for users of the DEEP2 Data Release 4, which includes all DEEP2 spectra and redshifts, as well as for the DEEP2 DEIMOS data reduction pipelines. Extensive details are provided on object selection, mask design, biases in target selection and redshift measurements, the spec2d two-dimensional data-reduction pipeline, the spec1d automated redshift pipeline, and the zspec visual redshift verification process, along with examples of instrumental signatures or other artifacts that in some cases remain after data reduction.
Redshift errors and catastrophic failure rates are assessed using more than 2000 objects with duplicate observations. Sky subtraction is essentially photon-limited even under bright OH sky lines; we describe the strategies that permitted this, based on high image stability, accurate wavelength solutions, and powerful B-spline modeling methods. We also investigate the impact of targets that appear to be single objects in ground-based targeting imaging but prove to be composite in Hubble Space Telescope data; they constitute several percent of targets at z ∼ 1, approaching ∼5%–10% at z > 1.5. Summary data are given that demonstrate the superiority of DEEP2 over other deep high-precision redshift surveys at z ∼ 1 in terms of redshift accuracy, sample number density, and amount of spectral information. We also provide an overview of the scientific highlights of the DEEP2 survey thus far.
[en] The design of a system for collecting and processing radioactive total-count data using a SHARP PC-15000(A) pocket computer is discussed in this paper. The composition of the system, its features, the principles of the hardware and software, and the applications of the system are also described.
[en] When aerial radiometric data are converted into images, isolated striping is frequently present in the resulting images. The author puts forward a spatial-domain algorithm that removes this striping from the converted images. Although the computational cost of the algorithm is modest, the striping is effectively removed: information masked by the striping reappears and information losses are greatly reduced. As a result, the desired image quality can be obtained.
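The abstract does not detail the author's algorithm, but a common spatial-domain destriping approach, offered here only as a sketch of the general idea, is line moment matching: adjust each scan line's mean and standard deviation toward the global image statistics so per-line offsets disappear.

```python
import numpy as np

# Illustrative spatial-domain destriping by line moment matching.
# This is a generic technique for line-scanned imagery, NOT the specific
# algorithm of the paper, which the abstract does not describe.

def destripe(image):
    """Match each row's mean/std to the whole-image statistics."""
    img = image.astype(float)
    g_mean, g_std = img.mean(), img.std()
    row_mean = img.mean(axis=1, keepdims=True)
    row_std = img.std(axis=1, keepdims=True)
    row_std[row_std == 0] = 1.0  # avoid division by zero on flat rows
    return (img - row_mean) / row_std * g_std + g_mean

# Striped synthetic image: a constant field plus per-row offsets
base = np.full((4, 6), 10.0)
stripes = base + np.array([[0.0], [3.0], [-3.0], [0.0]])
clean = destripe(stripes)  # per-row offsets are removed
```

Because the correction is per line, genuine along-track detail within each line is preserved while the constant per-line bias that causes the visible striping is removed.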
[en] This work presents an overview of disasters in South Asian countries. It briefly outlines geographical aspects and institutional structures in each country and identifies gaps in disaster management regimes. Identification of these gaps is expected to give the media insights for developing more informal disaster communications in South Asian countries. Natural disasters have become a severe global problem: the deaths, displacements and damage resulting from them are colossal. During the 1990s, global economic losses from major natural catastrophes averaged more than US$40 billion a year. The current tsunami disaster has broken all previous records, particularly in Indonesia, Sri Lanka and India. This paper focuses on the subcontinental countries of South Asia, on how disasters there are managed and mismanaged, and aims to provide condensed resource material on the subject. In these countries, issues related to natural disasters are covered under the legal frameworks for environment, land use, water resources and human settlements. The shift from emergency management to disaster preparedness requires coordination among government departments and ministries, international organizations and various community organizations. (author)
[en] It is now well-established that the elemental abundance patterns of stars hold key clues not only to their formation, but also to the assembly histories of galaxies. One of the most exciting possibilities is the use of stellar abundance patterns as “chemical tags” to identify stars that were born in the same molecular cloud. In this paper, we assess the prospects of chemical tagging as a function of several key underlying parameters. We show that in the fiducial case of 10⁴ distinct cells in chemical space and stars in the survey, one can expect to detect groups that are overdensities in the chemical space. However, we find that even very large overdensities in chemical space do not guarantee that the overdensity is due to a single set of stars from a common birth cloud. In fact, for our fiducial model parameters, the typical overdensity is comprised of stars from a wide range of clusters, with the most dominant cluster contributing only 25% of the stars. The most important factors limiting the identification of disrupted clusters via chemical tagging are the number of cells in the chemical space and the survey sampling rate of the underlying stellar population. Both of these factors can be improved through strategic observational plans. While recovering individual clusters through chemical tagging may prove challenging, we show, in agreement with previous work, that different cluster mass functions (CMFs) imprint different degrees of clumpiness in chemical space. These differences provide the opportunity to statistically reconstruct the slope and high-mass cutoff of the CMF and its evolution through cosmic time.
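The bookkeeping behind chemical tagging can be sketched simply: assign stars to discrete cells in an abundance space and flag cells holding far more stars than the mean. The 2-D grid, cell count, star sample and threshold below are illustrative assumptions, not the paper's fiducial setup (which uses ~10⁴ cells):

```python
import numpy as np

# Toy sketch of chemical-tagging overdensity detection: histogram stars in a
# 2-D abundance space and flag cells well above the mean occupancy. All
# numbers here (grid size, sample sizes, threshold) are illustrative.

def find_overdense_cells(abundances, n_bins=10, threshold=3.0):
    """Bin stars in 2-D abundance space; return counts of overdense cells."""
    counts, _, _ = np.histogram2d(
        abundances[:, 0], abundances[:, 1],
        bins=n_bins, range=[[-1.0, 1.0], [-1.0, 1.0]],
    )
    mean = counts.mean()
    return counts[counts > threshold * mean]

rng = np.random.default_rng(1)
field = rng.uniform(-1.0, 1.0, size=(500, 2))          # smooth background
cluster = rng.normal([0.3, -0.3], 0.02, size=(60, 2))  # one disrupted cluster
over = find_overdense_cells(np.vstack([field, cluster]))
```

The paper's caveat applies even to this toy: a flagged cell may contain background stars as well as cluster members, so an overdensity alone does not prove a common birth cloud.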
[en] In recent years, the geographical coverage of magnetic field data has improved thanks to the release of old and new data acquired from the Earth's surface up to satellite altitudes. Concerted international efforts to compile and publish these data in digital format, such as the World Digital Magnetic Anomaly Map (WDMAM) project, represented a key motivation for also improving our methods for interpreting and modelling marine, airborne and satellite data. These unprecedented high-spatial-resolution data have challenged our ability 1) to accurately extract the contribution of the lithospheric field from the total measurements, 2) to represent the data with potential field modelling techniques capable of locally merging all kinds of data, and 3) to interpret these models in terms of source distribution and depth, heat flow, etc. I will first briefly review recent advances made towards improving the marine and aeromagnetic compilations at the worldwide scale. Then, I will focus on the other end of the lithospheric magnetic field spectrum and discuss the consistency of various recent satellite-based lithospheric field models. This will allow me to illustrate the ambiguities and compatibility issues that remain to be addressed before we can successfully merge near-surface and satellite data. Finally, I will report on different studies carried out to interpret lithospheric magnetic field models in terms of tectonics, and discuss some original methods employed to estimate local and average properties of the Earth's magnetic crust.