Results 1 - 10 of 258824
[en] The author reviews recent developments and practical applications of solid-phase spectrophotometry in the analysis of mining and metallurgical materials and products. Separation and preconcentration steps, color-development conditions, sensitivity and detection ranges, and the interferences affecting each method are discussed.
[en] Polymers are a common component of chemical background which complicates data analysis and can impair interpretation. Undesired chemical background cannot always be addressed via pre-analytical methods, chromatography, or existing data processing methods. The Kendrick mass filter (KMF) is presented for the computational removal of undesired signals present in MS1 spectra. The KMF is analogous to mass defect filtering but utilizes homology information via Kendrick mass scaling in combination with chromatographic retention time and the number of observed signals. The KMF is intended to assist in situations in which current data processing methods for background removal, e.g., blank subtraction, are either not possible or not effective. The major parameters affecting the KMF were investigated using PEG 400 and NIST standard reference material 1950 (metabolites in human plasma). KMF performance was further explored using an extract of a swab known to contain polymers, and an illustrative real-world example of skin analysis with polymeric signal is discussed. The KMF can also provide a high-level view of the composition of the data with respect to signals with repeat units and can indicate the presence of different polymers.
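The Kendrick scaling at the core of the KMF can be illustrated with a short sketch. Assuming the PEG repeat unit C2H4O (44.02621 Da) and illustrative tolerance and series-size parameters (the published KMF additionally uses retention time and the number of observed signals), peaks whose Kendrick mass defects coincide across a long homologous series are flagged as likely polymer background:

```python
from collections import defaultdict

def kendrick_mass_defect(mz, repeat_exact=44.02621, repeat_nominal=44):
    """Kendrick mass defect with the mass scale rebased to the repeat unit
    (default 44.02621 Da = C2H4O, the PEG repeat)."""
    km = mz * repeat_nominal / repeat_exact   # Kendrick mass
    return round(km) - km                     # defect: distance to integer mass

def flag_polymer_series(mz_values, tol=0.002, min_series=4):
    """Bin peaks by rounded Kendrick mass defect; members of a series that
    share a defect (i.e. differ by whole repeat units) are flagged as
    likely polymer background once the series is long enough."""
    bins = defaultdict(list)
    for mz in mz_values:
        bins[round(kendrick_mass_defect(mz) / tol)].append(mz)
    flagged = set()
    for members in bins.values():
        if len(members) >= min_series:
            flagged.update(members)
    return flagged
```

For protonated PEG oligomers every peak falls into the same defect bin and is flagged, while an isolated metabolite mass passes through untouched.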
[en] We developed a method for estimating the distribution of hyperparameters in Markov random field (MRF) models. This study was motivated by the growing quantity of image data in the natural sciences owing to recent advances in measurement techniques. MRF models are used to restore images in information science, and the hyperparameters of these models can be adjusted to improve restoration performance; the parameters appearing in such data analysis represent physical quantities such as diffusion coefficients. Many frameworks for hyperparameter estimation have been proposed, but most perform point estimation, which is susceptible to stochastic fluctuations. Distribution estimation can be used to evaluate the confidence one has in point estimates of hyperparameters, much as physicists use error bars when evaluating important physical quantities. We use a solvable MRF model to investigate the performance of distribution estimation in simulations.
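The contrast between point and distribution estimation can be illustrated with a minimal sketch on a solvable Gaussian chain MRF (an illustrative stand-in for the paper's model; the prior, noise level, and grid below are assumptions). The marginal likelihood of the data is evaluated on a grid of smoothness hyperparameters and normalized into a posterior distribution, whose spread plays the role of an error bar around the point estimate:

```python
import numpy as np

def chain_laplacian(n, ridge=1e-3):
    """Graph Laplacian of an n-node chain plus a small ridge so the
    prior precision matrix is invertible."""
    L = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    L[0, 0] = L[-1, -1] = 1.0
    return L + ridge * np.eye(n)

def log_marginal_likelihood(y, lam, sigma2, A):
    """log p(y | lam) for prior x ~ N(0, (lam*A)^-1) and data y = x + noise,
    noise ~ N(0, sigma2*I); everything stays Gaussian, hence solvable."""
    n = len(y)
    C = np.linalg.inv(lam * A) + sigma2 * np.eye(n)  # marginal covariance of y
    _, logdet = np.linalg.slogdet(2.0 * np.pi * C)
    return -0.5 * (logdet + y @ np.linalg.solve(C, y))

def hyperparameter_posterior(y, lam_grid, sigma2, A):
    """Normalized posterior over lam on a grid (flat prior): a full
    distribution rather than a single point estimate."""
    logp = np.array([log_marginal_likelihood(y, l, sigma2, A) for l in lam_grid])
    logp -= logp.max()          # subtract max to avoid overflow in exp
    p = np.exp(logp)
    return p / p.sum()
```

The mean and standard deviation of this grid posterior then serve as the point estimate and its error bar, in the spirit of the abstract.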
[en] The Loess Plateau, the transitional zone between the humid and arid regions of China, is an important region for examining the regional hydrological cycle and the variation of humid and arid regions under global climate change. The aridity index (AI), the ratio of precipitation (P) to potential evapotranspiration (ET0), is an important indicator of regional climate conditions and is also used to classify drylands. In this study, data from 51 national meteorological stations over the period 1961–2014 were collected to estimate the AI in the Loess Plateau. A downward trend in annual AI was detected, and the boundary of the dryland region based on the AI expanded across the Loess Plateau over 1961–2014. The spatiotemporal variability of P was the main cause of the AI variations. Furthermore, the data analysis suggested that occurrences of extreme minimum AI values were mostly driven by fluctuations of the two factors (ET0 and P) rather than by their corresponding trends during the period. This study thus identified the major driving factor of the AI and the relationship between extreme AI values and global climate anomalies in the Loess Plateau region, and it provides an understanding of the impacts of climate change on the hydrological cycle in the Loess Plateau of China.
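The AI computation and trend detection described above can be sketched in a few lines (a minimal illustration; the study's actual ET0 estimation and statistical trend tests are not reproduced here):

```python
import numpy as np

def aridity_index(precip, et0):
    """Annual aridity index AI = P / ET0; values below about 0.65 are
    commonly classified as drylands."""
    return np.asarray(precip, float) / np.asarray(et0, float)

def linear_trend(years, ai):
    """Least-squares slope of AI per year; a negative slope indicates a
    drying (downward AI) tendency such as the one reported for 1961-2014."""
    slope, _intercept = np.polyfit(years, ai, 1)
    return slope
```

For a station series with slowly declining AI the fitted slope comes out negative, matching the downward trend the study reports.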
[en] A steady-state analysis for the catalytic turnover of molecules containing two substrate sites is presented. A broad class of Markovian dynamic models, motivated by the action of DNA-modifying enzymes and the rich variety of translocation mechanisms associated with these systems (e.g., sliding, hopping, intersegmental transfer, etc.), is considered. The modeling suggests an elementary and general method of data analysis that enables the enzyme's processivity to be extracted directly and unambiguously from experimental data. This analysis is not limited to the initial-velocity regime. The predictions are validated both against detailed numerical models and by revisiting published experimental data for EcoRI endonuclease acting on DNA.
[en] Nowadays, continuous signal digitization has become a standard procedure in experimental physics. However, separating piled-up signals at high count rates remains a problem. This article presents algorithms that can be used for the detection and extraction of events based on the shape of a single pulse. We also explain how these algorithms were applied in the data analysis for the "Troitsk nu-mass" experiment.
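A simple shape-based scheme of the kind described can be sketched with an idealized single-pulse template: locate the best-matching template position in the digitized waveform, fit its amplitude by least squares, subtract it, and repeat, so that moderately piled-up pulses are separated one by one (the template shape and threshold are illustrative assumptions, not the experiment's actual pulse model):

```python
import numpy as np

def pulse_template(length=64, rise=2.0, fall=10.0):
    """Idealized single-pulse shape: fast exponential rise, slow decay,
    normalized to unit peak height."""
    t = np.arange(length)
    shape = (1.0 - np.exp(-t / rise)) * np.exp(-t / fall)
    return shape / shape.max()

def find_pulses(waveform, template, threshold=0.3, max_pulses=10):
    """Iteratively find the waveform position best matching the template,
    fit the pulse amplitude by least squares, record it, subtract the
    fitted pulse, and rescan the residual; overlapping (piled-up) pulses
    are peeled off one at a time until no amplitude exceeds threshold."""
    residual = np.asarray(waveform, float).copy()
    energy = template @ template
    found = []
    for _ in range(max_pulses):
        corr = np.correlate(residual, template, mode="valid")
        start = int(corr.argmax())
        amp = corr[start] / energy      # least-squares amplitude at best lag
        if amp < threshold:
            break
        found.append((start, amp))
        residual[start:start + len(template)] -= amp * template
    return found
```

Each returned pair is a (start sample, fitted amplitude); the subtraction step is what allows a second, piled-up pulse hidden under the first to surface on the next iteration.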
[en] In this paper, the regionalization of geographical space according to selected topographic factors and the spatial distribution of precipitation is discussed. The model takes into account qualitative and quantitative data describing the conditions associated with the studied precipitation. In the modelling, data mining methods were used, including data clustering for agglomeration and artificial neural networks for classification. They were chosen to classify the area according to precipitation-related conditions, to distinguish similar areas, and to delimit the propagation of the phenomenon and its transition zones. To realize the research aims, professional software for data management, spatial data analysis, mathematical calculations and data mining was used. The result of the research was a model of classes representing areas with specific conditions affecting the phenomenon, transition zones between classes, and areas whose conditions differ from those in the surroundings of the measuring stations and which are therefore not assigned to any class. The classification results indicate the boundaries of the areas in which the values measured at stations can be modelled, the transition zones of possible discontinuous change, and the areas in which the phenomenon should not be modelled because the conditions differ significantly from those in the neighbourhoods of the measuring stations. Unclassified areas are also potential locations for new measuring stations.
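The clustering-plus-classification idea can be sketched as follows (a minimal illustration: k-means stands in for the agglomeration step, a nearest-centre rule with a distance cutoff stands in for the neural-network classifier, and the features and cutoff are assumptions):

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Plain Lloyd k-means with deterministic farthest-point initialization."""
    centres = [X[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centres], axis=0)
        centres.append(X[int(d.argmax())])
    centres = np.array(centres, float)
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centres[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centres[j] = X[labels == j].mean(axis=0)
    return labels, centres

def classify_region(X_train, X_all, k=3, max_dist=1.5):
    """Cluster cells near the measuring stations (X_train), then assign every
    cell in X_all to its nearest class centre; cells farther than max_dist
    from all centres are labelled -1: transition zones or areas with
    conditions unlike any station's surroundings."""
    _, centres = kmeans(X_train, k)
    d = np.linalg.norm(X_all[:, None] - centres[None], axis=2)
    labels = d.argmin(axis=1)
    labels[d.min(axis=1) > max_dist] = -1
    return labels
```

Cells labelled -1 correspond to the paper's unclassified areas, i.e. candidate locations for new measuring stations.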
[en] The next generation of fusion experiments will use object-oriented technology, creating the need for worldwide sharing of an underlying hierarchical file system. The Andrew File System (AFS) is a well-known and widely deployed global distributed file system. Multiple-Resident AFS (MR-AFS) combines the features of AFS with hierarchical storage management systems, so files in MR-AFS may be migrated to secondary storage, such as robotic tape libraries. MR-AFS is in use at IPP for the current experiments and for data originating from supercomputer applications. Experiences and scalability issues are discussed.
[en] Single nuclear particles induce single-event transients and upsets in memory cells. A single-particle strike on only one of the two transistor clusters of a DICE cell produces a single-event transient but not a single-event upset. A new read-data-correction unit increases the soft-error immunity of the two-cluster DICE cell by enabling the cell's data to be read successfully. Successful data reading has been confirmed by simulation of a 28-nm multiport CMOS memory cell equipped with the read-data-correction unit.
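The read-correction idea can be sketched as simple selection logic (a hedged illustration: the node-to-cluster mapping and encoding below are assumptions, since the abstract does not specify the unit's internal design). A DICE cell holds the bit redundantly on four nodes in a complementary pattern, with the nodes split between two transistor clusters; a single-particle strike can disturb only one cluster, so the other cluster's node pair still encodes the bit:

```python
def dice_read_corrected(n0, n1, n2, n3):
    """Return the stored bit by reading from whichever cluster still holds
    a valid complementary pair.  Assumed mapping: cluster A drives nodes
    (n0, n1) = (x, not-x), cluster B drives nodes (n2, n3) = (x, not-x).
    A strike leaves the struck cluster in an invalid (non-complementary)
    state while the other cluster retains the pre-strike value."""
    for x, x_bar in ((n0, n1), (n2, n3)):
        if x != x_bar:          # valid complementary encoding
            return x
    return None                 # both clusters invalid: uncorrectable
```

Reading through this selector instead of a single node pair is what lets the cell's data survive a transient on one cluster.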
[en] Under the impact of a single nuclear particle, a complementary metal-oxide-semiconductor (CMOS) DICE memory cell enters a non-stationary state in which erroneous data reading occurs. Using a correction unit on the data path to the cell output, which reliably identifies the two DICE latch nodes that retain the stationary state held before the single-particle strike, enables correct data reading and increases the soft-error immunity of the memory cell state. Simulations of a 28-nm CMOS cell based on the DICE latch confirmed that reliable reading of data at the cell output in a multiport CMOS RAM is feasible when the memory cell's latch is disturbed.
[en] Today, data play a central role in most fields of science. In recent years, the amount of data from experiment, observation, and simulation has increased rapidly and the complexity of the data has grown. In addition, communities and shared storage have become geographically more distributed. Therefore, the methods and techniques applied to scientific data need to be revised and partially replaced, while keeping community-specific needs in focus. The Helmholtz Portfolio Extension "Large Scale Data Management and Analysis" (LSDMA) focuses on the optimization of the data life cycle in different research areas. In its five Data Life Cycle Labs (DLCLs), data experts collaborate closely with the communities in joint research and development to optimize the respective data life cycle. In addition, the Data Services Integration Team provides data analysis tools and services that are common to several DLCLs. This presentation describes the various activities within LSDMA and focuses on the work done in the DLCL "Structure of Matter". The main topics of this DLCL are support for the international FAIR project (Facility for Antiproton and Ion Research), which is being built at GSI in Darmstadt, and for the European XFEL and PETRA III at DESY in Hamburg.