Results 1 - 10 of 272999. Search took: 0.101 seconds
[en] To enhance the credibility of human reliability analysis, various kinds of data have recently been collected and analyzed. Although it is obvious that the quality of the data is critical, the practices and considerations for securing data quality have not been sufficiently discussed. In this work, based on the experience of recent human reliability data extraction projects, which produced more than fifty thousand data points, we derive a number of issues to be considered for generating meaningful data. As a result, thirteen considerations are presented here, pertaining to four different data extraction activities: preparation, collection, analysis, and application. Although the lessons were acquired from a single kind of data collection framework, it is believed that these results will guide researchers toward the important issues in the process of extracting data.
[en] The author reviews recent developments and practical applications of solid-phase spectrophotometry in the analysis of materials and products of the mining and metallurgical industry. Separation and preconcentration procedures, conditions of color development, sensitivity and detection range, as well as interferences of the corresponding methods, are discussed.
[en] Polymers are a common component of chemical background, which complicates data analysis and can impair interpretation. Undesired chemical background cannot always be addressed via pre-analytical methods, chromatography, or existing data processing methods. The Kendrick mass filter (KMF) is presented for the computational removal of undesired signals present in MS1 spectra. The KMF is analogous to mass defect filtering but utilizes homology information via Kendrick mass scaling in combination with chromatographic retention time and the number of observed signals. The KMF is intended to assist in situations in which current data processing methods to remove background, e.g., blank subtraction, are either not possible or not effective. The major parameters affecting the KMF were investigated using PEG 400 and NIST standard reference material 1950 (metabolites in human plasma). KMF performance was explored further using an extract of a swab known to contain polymers, and an illustrative real-world example of skin analysis with polymeric signal is discussed. The KMF is also able to provide a high-level view of the composition of the data with respect to signals with repeat units and to indicate the presence of different polymers.
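The Kendrick rescaling behind the KMF is easy to sketch: rescaling m/z so that the repeat unit has integer mass makes every member of a homologous polymer series share (nearly) the same Kendrick mass defect. Below is a minimal, hypothetical Python illustration of that idea only, not the published KMF (which also uses retention time and signal counts); the PEG-like repeat unit C2H4O and all thresholds are assumptions:

```python
# Hedged sketch of a Kendrick-mass-style background flagger. Not the published
# KMF: repeat unit, tolerance, and cluster-size threshold are assumed values.
import numpy as np

def kendrick_mass_defect(mz, repeat_exact=44.02621, repeat_nominal=44):
    """Kendrick mass defect for a chosen repeat unit (default: C2H4O, PEG-like)."""
    km = mz * repeat_nominal / repeat_exact   # rescale so the repeat unit is integral
    return np.round(km) - km                  # defect: distance to nearest integer

def flag_homologous_series(mz, tol=0.002, min_members=4):
    """Flag peaks falling into large groups of (nearly) equal Kendrick mass
    defect, i.e. likely members of a polymeric homologous series."""
    kmd = kendrick_mass_defect(np.asarray(mz))
    order = np.argsort(kmd)
    flags = np.zeros(len(mz), dtype=bool)
    start = 0
    for i in range(1, len(order) + 1):
        # close the current KMD cluster when the gap exceeds the tolerance
        if i == len(order) or kmd[order[i]] - kmd[order[start]] > tol:
            if i - start >= min_members:
                flags[order[start:i]] = True   # large cluster: candidate background
            start = i
    return flags
```

On a synthetic PEG-like series (peaks spaced by exactly 44.02621 Da) the series members are flagged while isolated peaks are not.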
[en] We developed a method for distribution estimation of hyperparameters in Markov random field (MRF) models. This study was motivated by the growing quantity of image data in the natural sciences owing to recent advances in measurement techniques. MRF models are used to restore images in information science, and the hyperparameters of these models can be adjusted to improve restoration performance; the parameters appearing in such data analysis represent physical quantities, such as diffusion coefficients. Many frameworks for hyperparameter estimation have been proposed, but most provide point estimates, which are susceptible to stochastic fluctuations. Distribution estimation can be used to evaluate the confidence one has in point estimates of hyperparameters, much as physicists use error bars when they evaluate important physical quantities. We use a solvable MRF model to investigate the performance of distribution estimation in simulations. (paper)
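As an illustration of distribution (rather than point) estimation of an MRF hyperparameter, the following sketch evaluates the marginal likelihood of a small, exactly computable 1D Gaussian MRF on a grid of smoothness values and normalizes it into a posterior whose spread plays the role of an error bar. The model, all parameter values, and the flat hyperprior are assumptions for illustration, not the paper's setup:

```python
# Toy distribution estimation of an MRF smoothness hyperparameter lambda:
# grid-evaluate the marginal likelihood p(y | lambda) and normalize.
# All model choices here are illustrative assumptions.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
N, sigma, eps = 64, 0.3, 0.01

# Chain-graph Laplacian; prior precision lambda*(L + eps*I) (eps keeps it invertible)
L = 2 * np.eye(N) - np.eye(N, k=1) - np.eye(N, k=-1)
L[0, 0] = L[-1, -1] = 1.0

def log_evidence(y, lam):
    # y = x + noise with x ~ N(0, (lam*(L+eps*I))^-1), so y is zero-mean Gaussian
    cov = sigma**2 * np.eye(N) + np.linalg.inv(lam * (L + eps * np.eye(N)))
    return multivariate_normal(mean=np.zeros(N), cov=cov).logpdf(y)

# Synthetic smooth signal observed with noise
true_lam = 5.0
chol = np.linalg.cholesky(np.linalg.inv(true_lam * (L + eps * np.eye(N))))
x = chol @ rng.standard_normal(N)
y = x + sigma * rng.standard_normal(N)

lams = np.linspace(0.5, 30, 120)
dl = lams[1] - lams[0]
logp = np.array([log_evidence(y, lam) for lam in lams])
p = np.exp(logp - logp.max())
p /= p.sum() * dl                                     # normalized posterior (flat prior)
mean = (lams * p).sum() * dl
std = np.sqrt(((lams - mean) ** 2 * p).sum() * dl)    # "error bar" on lambda
```

The point estimate (posterior mean) now comes with a spread, which is exactly the kind of confidence information a point estimator alone cannot provide.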
[en] The production of phosphorite concentrates from the Rivat Deposit is considered in the present work. The chemical and mineral composition of the phosphorite ores has been determined, and flotation studies of the ore have been carried out.
[en] A steady-state analysis for the catalytic turnover of molecules containing two substrate sites is presented. A broad class of Markovian dynamic models, motivated by the action of DNA-modifying enzymes and the rich variety of translocation mechanisms associated with these systems (e.g., sliding, hopping, intersegmental transfer), is considered. The modeling suggests an elementary and general method of data analysis, which enables the extraction of the enzyme's processivity directly and unambiguously from experimental data. This analysis is not limited to the initial velocity regime. The predictions are validated both against detailed numerical models and by revisiting published experimental data for EcoRI endonuclease acting on DNA.
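The steady-state machinery underlying such an analysis can be sketched generically: the stationary occupation probabilities of a finite Markov rate matrix follow from one linear solve, and turnover-type observables are then read off from them. The three-state cycle and all rates below are invented for illustration and are not the paper's model:

```python
# Generic steady-state solver for a finite Markov rate matrix Q
# (off-diagonal rates >= 0, each row summing to zero). Illustrative only.
import numpy as np

def steady_state(Q):
    """Solve pi @ Q = 0 subject to sum(pi) = 1."""
    n = Q.shape[0]
    A = np.vstack([Q.T, np.ones(n)])   # append the normalization constraint
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

# Hypothetical 3-state enzyme cycle: free -> bound -> modified -> free
Q = np.array([[-1.0,  1.0,  0.0],
              [ 0.5, -2.0,  1.5],
              [ 2.0,  0.0, -2.0]])
pi = steady_state(Q)   # stationary occupation probabilities of the three states
```

Mean turnover rates in such models are linear functionals of `pi`, which is what makes the direct extraction of kinetic quantities from steady-state data tractable.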
[en] Nowadays, continuous signal digitization has become a standard procedure in experimental physics. However, pulse pileup separation at high count rates remains a problem. The article presents algorithms that can be used for the detection and extraction of events based on the shape of a single pulse. We also explain how these algorithms were applied in data analysis for the "Troitsk nu-mass" experiment.
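A common baseline for shape-based event detection, offered here only as a hedged illustration (the actual Troitsk nu-mass algorithms are not reproduced), is a matched filter: correlate the digitized trace with a single-pulse template and take local maxima above a threshold. Resolving genuinely piled-up pulses requires more than this sketch, e.g. template fitting or deconvolution:

```python
# Matched-filter event detection on a digitized trace. Template shape,
# pulse positions, noise level, and threshold are illustrative assumptions.
import numpy as np

def detect_events(trace, template, threshold):
    """Return indices of local maxima of the template correlation above threshold."""
    score = np.correlate(trace, template, mode="same")
    return [i for i in range(1, len(score) - 1)
            if score[i] > threshold
            and score[i] >= score[i - 1] and score[i] > score[i + 1]]

# Synthetic trace: two well-separated exponential pulses plus mild noise
template = np.exp(-np.arange(20) / 5.0)
trace = np.zeros(200)
for start in (50, 120):
    trace[start:start + 20] += template
rng = np.random.default_rng(3)
trace += 0.02 * rng.standard_normal(200)

events = detect_events(trace, template, threshold=2.0)
```

The matched filter maximizes signal-to-noise for a known pulse shape in white noise, which is why single-pulse templates are a natural starting point for shape-based extraction.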
[en] Modern cosmological analyses constrain physical parameters using Markov Chain Monte Carlo (MCMC) or similar sampling techniques. Oftentimes, these techniques are computationally expensive to run, requiring up to thousands of CPU hours to complete. Here we present a method for reconstructing the log-probability distributions of completed experiments from an existing chain (or any set of posterior samples). The reconstruction is performed using Gaussian process regression to interpolate the log-probability. This allows for easy resampling, importance sampling, marginalization, testing of different samplers, investigation of chain convergence, and other operations. As an example use case, we reconstruct the posterior distribution of the most recent Planck 2018 analysis. We then resample the posterior and generate a new chain with 40 times as many points in only 30 min. Our likelihood reconstruction tool is made publicly available online.
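The reconstruction idea can be sketched in one dimension: fit a Gaussian process to (sample, log-probability) pairs from an existing chain, then evaluate the interpolated log-posterior anywhere. The toy standard-normal "chain", the RBF kernel length scale, and the thinning step below are assumptions for illustration, not the paper's pipeline:

```python
# GP interpolation of a log-posterior from posterior samples (1-D toy sketch).
import numpy as np

def rbf(a, b, ell=0.5):
    """Squared-exponential kernel between 1-D point sets (assumed length scale)."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

def gp_interpolate(x_train, y_train, x_test, jitter=1e-6):
    """GP posterior mean through (x_train, y_train), evaluated at x_test."""
    K = rbf(x_train, x_train) + jitter * np.eye(len(x_train))
    alpha = np.linalg.solve(K, y_train)
    return rbf(x_test, x_train) @ alpha

# "Chain": samples from N(0,1) with their log-probabilities recorded
rng = np.random.default_rng(1)
xs = np.sort(rng.standard_normal(200))
logp = -0.5 * xs ** 2 - 0.5 * np.log(2 * np.pi)

# Thin the chain so training points are well separated (helps conditioning)
keep = [0]
for i in range(1, len(xs)):
    if xs[i] - xs[keep[-1]] > 0.1:
        keep.append(i)
keep = np.array(keep)

grid = np.linspace(-1.5, 1.5, 31)
logp_hat = gp_interpolate(xs[keep], logp[keep], grid)
```

Once the interpolant exists, resampling or importance reweighting only requires cheap evaluations of `gp_interpolate` instead of the original expensive likelihood.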
[en] In this paper, the regionalization of geographical space according to selected topographic factors and the spatial distribution of precipitation is discussed. The model takes into account qualitative and quantitative data describing the conditions associated with the studied precipitation. In the modelling, data mining methods have been used, including data clustering methods for agglomeration and artificial neural networks for classification. These were used to classify the area according to conditions related to precipitation, to distinguish similar areas, and to delimit the propagation of the phenomenon and its transition zones. To realize the research aims, professional software for data management, spatial data analysis, mathematical calculations and data mining has been used. The result of the research was a model of classes representing areas with specific conditions affecting the phenomenon, transition zones between classes, and areas whose conditions differ from those in the surroundings of the measuring stations and are therefore not assigned to any class. The classification results indicate the boundaries of the areas in which we can model the values measured at stations, the transition zones of possible discontinuous change, and the areas in which the phenomenon should not be modelled owing to conditions significantly different from those in the neighbourhoods of the measuring stations. Unclassified areas are also potential locations for new measuring stations.
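The clustering-plus-transition-zone idea can be sketched on synthetic data; the features, the cluster count, and the near-tie threshold below are invented for illustration, and the paper's actual model additionally uses neural-network classification:

```python
# Regionalization sketch: agglomerative clustering of locations by assumed
# topographic/precipitation features, then flagging "transition zones" where
# the two nearest cluster centroids are nearly tied. Illustrative only.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import cdist

rng = np.random.default_rng(2)
# Features per location: [elevation, slope, mean precipitation] (synthetic,
# three well-separated groups of 50 locations each)
X = np.vstack([rng.normal(m, 0.3, size=(50, 3)) for m in (0.0, 2.0, 4.0)])

# Ward agglomeration into three classes
labels = fcluster(linkage(X, method="ward"), t=3, criterion="maxclust")
centroids = np.array([X[labels == k].mean(axis=0) for k in (1, 2, 3)])

# Transition zone: distance to the second-nearest centroid is almost as
# small as to the nearest one (assumed near-tie threshold of 0.5)
d = cdist(X, centroids)
d.sort(axis=1)
transition = (d[:, 1] - d[:, 0]) < 0.5
```

The same near-tie criterion, applied to unmeasured grid cells rather than stations, is one simple way to delimit zones where class membership changes and interpolation becomes unreliable.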
[en] The present article is devoted to the enrichment technology for ores from the lower levels (bottom lifts) of the Dzhizhikrut Deposit. The influence of various modifiers and other factors on the flotation of the studied ores has been determined. On the basis of the research conducted, a technological flotation scheme for the ores of this deposit has been elaborated and proposed.