Results 1 - 10 of 177
[en] For several decades, the French Alternative Energies and Atomic Energy Commission (CEA) has been undertaking experimental programs aimed at validating the calculation tools used to design standard and advanced LWRs, as well as FBRs.
[en] History matching a channelized reservoir with multiple facies has always posed a great challenge to researchers. In this paper, we present a workflow combining the ensemble smoother with multiple data assimilation (ES-MDA) method with a parameterization algorithm referred to as the common basis discrete cosine transform (DCT) and a post-processing technique to integrate static and dynamic data into multi-facies channelized reservoir models. The parameterization algorithm is developed to capture the critical features and describe the geological similarity between different realizations in the prior ensemble by transforming the discrete facies indicators into continuous variables. The ES-MDA method is then employed to update the continuous variables by assimilating the static and dynamic data. Finally, a post-processing technique based on a regularization framework is used to improve the spatial continuity of facies and estimate the non-Gaussian distributed reservoir properties. We apply this automatic history matching workflow to two synthetic problems that represent complex three-facies (shale, levee, and sand) channelized reservoirs. One is a 2D three-facies reservoir with a relatively high number of channels and the other is a 3D three-facies five-layer reservoir containing two geological zones with different channel patterns. The computational results show that the proposed workflow can greatly reduce the uncertainty in the reservoir description through the integration of production data. The posterior realizations also preserve the key geological features of the prior models, with a good match to the history data and good predictive capacity. In addition, we illustrate the superiority of the common basis DCT over the traditional DCT algorithm.
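The common basis variant is specific to the paper, but the underlying idea of DCT parameterization (transform the field to frequency space, retain only low-frequency coefficients as the parameters an ensemble method would update, then reconstruct) can be sketched as follows. The field, grid size, and truncation level are illustrative, not taken from the paper:

```python
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(0)

# Hypothetical 2D "facies indicator" field (continuous proxy), 64 x 64.
field = rng.standard_normal((64, 64))

# Forward 2D DCT: transform the field into frequency coefficients.
coeffs = dctn(field, norm="ortho")

# Keep only the low-frequency block (here 16 x 16), discarding fine detail.
k = 16
truncated = np.zeros_like(coeffs)
truncated[:k, :k] = coeffs[:k, :k]

# Inverse DCT reconstructs a smoothed field from the retained coefficients;
# those retained coefficients are the low-dimensional continuous parameters
# that a method such as ES-MDA would update.
recon = idctn(truncated, norm="ortho")
print(recon.shape)  # (64, 64)
```

The truncation level trades reconstruction fidelity against the dimension of the parameter vector the ensemble method must estimate.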
[en] Meteorological observations in Tibet are poor in quality, with a severe amount of missing data, mostly caused by extreme climatological conditions and high maintenance costs. This paper focuses on the imputation of missing data and the reconstruction of the regional temperature field. Because of insufficient observation stations and complicated topography, we employ the weather research and forecasting (WRF) model to produce the proper orthogonal decomposition (POD) basis for the study region. We then develop the gappy POD method for the imputation of missing data. The gappy POD method is compared against the regularized expectation-maximization (EM) algorithm for various missing-data cases, and the results show that the gappy POD method dramatically outperforms the regularized EM algorithm when the amount of missing spatial data is not severe. Furthermore, of the two methods, only the gappy POD method is capable of reconstructing the temperature field at locations where data are absent. The gappy POD method can also be generalized for data assimilation under the assumption that the data across all model grids have missing values.
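A minimal sketch of the gappy POD idea, assuming a POD basis obtained from an SVD of complete snapshots and a least-squares fit of the basis coefficients on the observed entries only; the synthetic low-rank data below stand in for WRF model output and are not from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical snapshot matrix: 200 spatial points x 50 complete snapshots
# (in the paper these would come from WRF model output).
n, m = 200, 50
modes_true = rng.standard_normal((n, 3))
snapshots = modes_true @ rng.standard_normal((3, m))

# POD basis from the SVD of the snapshot matrix; keep r modes.
U, _, _ = np.linalg.svd(snapshots, full_matrices=False)
r = 3
Phi = U[:, :r]

# A new field with missing values (mask True = observed).
field = modes_true @ rng.standard_normal(3)
mask = rng.random(n) > 0.3          # roughly 70% of points observed
observed = field[mask]

# Gappy POD: least-squares fit of the POD coefficients using only the
# observed entries, then reconstruct the full field everywhere.
coeffs, *_ = np.linalg.lstsq(Phi[mask], observed, rcond=None)
reconstructed = Phi @ coeffs
print(np.max(np.abs(reconstructed - field)))
```

Because the synthetic field lies exactly in the span of the POD modes, the reconstruction here is essentially exact; for real temperature fields the truncated basis introduces an approximation error.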
[en] Drought is a major limiting factor for wheat production worldwide. We aimed to study the effect of soil water deficit on dry matter remobilization (DMR), grain yield (GY) and yield components of durum and bread wheat genotypes. Drought stress accelerated DMR. The lowest remobilization of dry matter into grains was detected in the tallest, late-heading genotypes, which were also characterized by a low harvest index (HI). Drought stress had less effect on plant height (PH), peduncle length (PL), spike length (SL), spike width (SW) and spikelet number per spike (SNS), but strongly affected biological yield (BY), spike mass (SM), grain number per spike (GNS), grain mass per spike (GMS) and thousand-kernel mass (TKM). GY correlated positively and significantly with spike number per m² (SN), BY and HI under drought stress conditions. We consider DMR, SN, BY and HI to be good selection criteria for wheat under drought stress. (author)
[en] Highlights: • Coherent patterns can be used to form effective data assimilation schemes. • A pattern-based distance is used in a likelihood-free data assimilation. • The pattern-based scheme is unaffected by chaotic advection.
We introduce a data assimilation method to estimate model parameters with observations of passive tracers by directly assimilating Lagrangian Coherent Structures. Our approach differs from the usual Lagrangian Data Assimilation approach, where parameters are estimated based on tracer trajectories. We employ the Approximate Bayesian Computation (ABC) framework to avoid computing the likelihood function of the coherent structure, which is usually unavailable. We solve the ABC by a Sequential Monte Carlo (SMC) method, and use Principal Component Analysis (PCA) to identify the coherent patterns from tracer trajectory data. Our new method shows remarkably improved results compared to the bootstrap particle filter when the physical model exhibits chaotic advection.
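The ABC idea used in this abstract (accept parameter draws whose simulated summaries lie close to the observed ones, without ever evaluating a likelihood) can be illustrated with a toy rejection sampler. The Gaussian model, prior range, and tolerance below are invented stand-ins for the paper's SMC scheme and pattern-based distance:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy setup: "observed" data generated with a true parameter theta = 2.0.
# The summary statistic (a stand-in for the paper's pattern-based distance)
# is simply the sample mean here.
theta_true = 2.0
data_obs = rng.normal(theta_true, 1.0, size=500)
s_obs = data_obs.mean()

# ABC rejection: draw theta from the prior, run the forward simulation,
# and accept when the summaries agree within a tolerance eps.
# No likelihood function is evaluated at any point.
eps = 0.05
accepted = []
for _ in range(20000):
    theta = rng.uniform(0.0, 4.0)            # prior draw
    sim = rng.normal(theta, 1.0, size=500)   # forward simulation
    if abs(sim.mean() - s_obs) < eps:
        accepted.append(theta)

posterior_mean = np.mean(accepted)
print(round(posterior_mean, 2))
```

An SMC-ABC scheme, as in the paper, replaces this brute-force rejection with a sequence of shrinking tolerances and weighted resampling, but the accept/reject kernel is the same.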
[en] It is well known that the true values of measured and computed data are impossible to know exactly because of various uncontrollable errors and uncertainties arising in the data measurement and interpretation/reduction processes. Hence, all inferences, predictions, engineering computations, and other applications of measured and/or computed data are necessarily based on weighted averages over the possibly true values, with weights indicating the degree of plausibility of each value. Furthermore, combining data from different sources involves a weighted propagation (e.g., via sensitivities) of all uncertainties, requiring reasoning from incomplete information and using probability theory to extract optimal values together with 'best-estimate' uncertainties from often sparse, incomplete, error-afflicted, and occasionally discrepant data. The current state-of-the-art data assimilation/model calibration methodologies for large-scale nonlinear systems cannot take into account uncertainties of order higher than second (i.e., covariances), thereby failing to fully quantify the deviations of the problem under consideration from a normal (Gaussian) multivariate distribution. Such deviations would be quantified by the third- and fourth-order moments (skewness and kurtosis) of the model's predicted results (responses). These higher-order moments would be constructed by combining modeling and experimental uncertainties (which also incorporate the corresponding skewness and kurtosis information), using derivatives of the model responses with respect to the model's parameters. This paper presents explicit expressions for the skewness and kurtosis of computed responses, thereby permitting quantification of the deviations of the computed response uncertainties from multivariate normality. In addition, this paper presents a new and highly efficient procedure for computing the second-order response derivatives with respect to model parameters using the 'adjoint sensitivity analysis procedure' (ASAP).
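The closed-form adjoint-based expressions are not reproduced in the abstract, but the quantities themselves, skewness and kurtosis as measures of a response's departure from Gaussianity, can be illustrated by brute-force sampling of a hypothetical quadratic response; the model and parameter distribution below are invented for illustration:

```python
import numpy as np
from scipy.stats import skew, kurtosis

rng = np.random.default_rng(3)

# Hypothetical nonlinear model response R(p) = p + 0.5 * p**2; the quadratic
# term makes the response non-Gaussian even though the parameter p is Gaussian.
p = rng.normal(0.0, 1.0, size=200_000)
response = p + 0.5 * p**2

# Sample skewness and excess kurtosis quantify the deviation of the response
# distribution from normality (both are 0 for a Gaussian).
print(round(skew(response), 2), round(kurtosis(response), 2))
```

The point of the paper's adjoint-based expressions is precisely to obtain such third- and fourth-order response moments from sensitivities, without the expense of sampling a large-scale model.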
[en] We show that modifying a Bayesian data assimilation scheme by incorporating kinematically consistent displacement corrections produces a scheme that is demonstrably better at estimating partially observed state vectors in settings where feature information is important. While the displacement transformation is generic, here we implement it within an ensemble Kalman filter framework and demonstrate its effectiveness in tracking stochastically perturbed vortices.
[en] This paper studies the effect of targeted observations on state and parameter estimates determined with Kalman filter data assimilation (DA) techniques. We first provide an analytical result demonstrating that targeting observations within the Kalman filter for a linear model can significantly reduce state estimation error relative to fixed or randomly located observations. We next conduct observing system simulation experiments for a chaotic model of meteorological interest, where we demonstrate that the local ensemble transform Kalman filter (LETKF) with targeted observations based on the largest ensemble variance provides more accurate state estimates than the LETKF with randomly located observations. Additionally, we find that a hybrid ensemble Kalman filter parameter estimation method accurately updates model parameters within the targeted observation context to further improve state estimation.
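A minimal sketch of the targeting rule described in this abstract: observe the state component with the largest ensemble variance, then apply a stochastic ensemble Kalman update for that single observation. The state dimension, ensemble size, and noise levels are invented for illustration and are not from the paper:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical ensemble of 100 members for a 10-dimensional state, with
# spread increasing across components.
n_ens, n_state = 100, 10
truth = rng.standard_normal(n_state)
ensemble = truth + rng.standard_normal((n_ens, n_state)) * np.linspace(0.5, 3.0, n_state)

# Targeting rule: observe the component with the largest ensemble variance.
var = ensemble.var(axis=0, ddof=1)
j = int(np.argmax(var))

# Simulate one noisy observation of that component.
obs_err = 0.1
y = truth[j] + rng.normal(0.0, obs_err)

# Stochastic EnKF update for a scalar observation of component j.
# The gain K = P H^T (H P H^T + R)^-1 reduces, for H selecting one
# component, to cov(x, x_j) / (var(x_j) + R).
anoms = ensemble - ensemble.mean(axis=0)
cov_xj = anoms.T @ anoms[:, j] / (n_ens - 1)       # cross-covariance with x_j
gain = cov_xj / (var[j] + obs_err**2)              # Kalman gain vector
perturbed_obs = y + rng.normal(0.0, obs_err, n_ens)
ensemble = ensemble + np.outer(perturbed_obs - ensemble[:, j], gain)

# The targeted component's spread collapses toward the observation error.
print(var[j], ensemble.var(axis=0, ddof=1)[j])
```

Targeting the highest-variance component gives the single observation the largest possible leverage on the analysis, which is the intuition behind the paper's analytical result.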
[en] Teleconnections are links between regions that are distant from each other but nevertheless exhibit some relation. The study of such teleconnections is a well-known task in climate research, and climate simulations should reproduce known teleconnections; detecting them in simulation output is therefore a crucial aspect of judging simulation quality. It is common practice to run scripts that execute a sequence of analysis steps on climate simulations to search for teleconnections, but such a scripting approach is inflexible and targeted at one specific goal. It is desirable to have a single tool that allows for a flexible analysis of all teleconnection patterns within a dataset. We present such a tool, in which the extracted information is provided in an intuitive visual form to users, who can then interactively explore the data. We developed an analysis workflow modeled around four views showing different facets of the data with coordinated interaction. We present a teleconnection study with simulation ensembles and reanalysis data obtained by data assimilation to observe how well the teleconnectivity patterns match and to demonstrate the effectiveness of our tool.
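A one-point correlation map is the standard first analysis step behind such teleconnection searches: correlate a base-point time series with every other grid cell and look for strong correlations at a distance. The synthetic grid, base point, and shared oscillation below are illustrative, not from the paper's tool:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic monthly anomaly series on a tiny 4 x 8 grid over 240 time steps.
# A shared oscillation links two distant grid cells, mimicking a
# teleconnection; the rest of the field is noise.
t = np.arange(240)
signal = np.sin(2 * np.pi * t / 40)
field = rng.standard_normal((240, 4, 8)) * 0.5
field[:, 0, 0] += signal          # base point
field[:, 3, 7] += signal          # distant, but linked, region

# One-point correlation map: correlate the base-point series with every cell.
base = field[:, 0, 0]
flat = field.reshape(240, -1)
corr = np.array([np.corrcoef(base, flat[:, k])[0, 1] for k in range(flat.shape[1])])
corr_map = corr.reshape(4, 8)

# The distant linked cell shows a high correlation with the base point.
print(round(corr_map[3, 7], 2))
```

An interactive tool like the one described would essentially let users pick the base point and explore such maps on the fly instead of re-running a fixed script.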