Results 1 - 10 of 41993. Search took: 0.047 seconds
[en] The directly observed average apparent magnitude (or in one case, angular diameter) as a function of redshift in each of a number of large complete galaxy samples is compared with the predictions of hypothetical redshift-distance power laws, as a systematic statistical question. Due account is taken of observational flux limits by an entirely objective and reproducible optimal statistical procedure, and no assumptions are made regarding the distribution of the galaxies in space. The laws considered are of the form z ∝ r^p, where r denotes the distance, for p = 1, 2 and 3. The comparative fits of the various redshift-distance laws are similar in all the samples. Overall, the cubic law fits better than the linear law, but each shows substantial systematic deviations from observation. The quadratic law fits extremely well except at high redshifts in some of the samples, where no power law fits closely and the correlation of apparent magnitude with redshift is small or negative. In all cases, the luminosity function required for theoretical prediction was estimated from the sample by the non-parametric procedure ROBUST, whose intrinsic neutrality as programmed was checked by comprehensive computer simulations. (author)
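A minimal sketch of why the three laws are distinguishable in magnitude–redshift data: for a standard candle, m = const + 5 log10(r), and under z = k r^p the distance is r = (z/k)^(1/p), so apparent magnitude rises linearly in log10(z) with slope 5/p. This is only an illustration of the slope difference; the abstract's actual test uses the estimated luminosity function and flux limits, which this sketch ignores.

```python
# Predicted slope d(m)/d(log10 z) for a standard candle under z = k * r**p:
# m = const + 5*log10(r) and r = (z/k)**(1/p) give m = const' + (5/p)*log10(z).
def slope(p):
    """Slope of apparent magnitude vs. log10(redshift) for z proportional to r**p."""
    return 5.0 / p

for p in (1, 2, 3):
    print(f"p = {p}: slope = {slope(p):.3f} mag per dex in z")
```

The steepening at small p is what lets a magnitude–redshift comparison discriminate between the linear, quadratic and cubic laws.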
[en] Highlights: → We use the data assimilation BLUE technique. → We apply it to the problem of nuclear mass evaluation, combining model and data. → We evaluate the improvement in accuracy obtained with this technique. → We conclude that data assimilation can be used in this framework and that the technique is promising. - Abstract: This paper presents methods to provide an optimal evaluation of nuclear masses. The techniques used for this purpose come from data assimilation, which allows information coming from experiment and from numerical models to be combined in an optimal and consistent way. Using all the available information leads not only to improved mass evaluations, but also to reduced uncertainties. Each newly evaluated mass value is associated with an accuracy that is appreciably improved with respect to the values given in tables, especially in the case of the less well-known masses. In this paper, we first introduce a useful tool of data assimilation, the Best Linear Unbiased Estimation (BLUE). This BLUE method is applied to nuclear mass tables and some of the resulting improvements are shown.
[en] New approaches in radiation oncology are based on the concept that combinations of irradiation and molecularly targeted drugs can yield synergistic, or at least additive, effects. Up to now, combinations of two treatment modalities have been tested in almost all cases. As with conventional anti-cancer agents, the efficacy of targeted approaches is also subject to predefined resistance mechanisms. Therefore, it seems reasonable to speculate that a combination of more than two agents will ultimately increase the therapeutic gain. No tools for a bio-mathematical evaluation of a given degree of interaction for more than two anti-neoplastic agents are currently available. The present work introduces a new method for the evaluation of triple therapies and provides some graphical examples in order to visualize the results.
[en] When simulating the outdoor thermal environment, the radiation calculation is very important. In this study, a coupled simulation of convection and radiation is proposed as a new method to evaluate the outdoor thermal environment precisely. The proposed simulation method is validated by comparing the simulation results with those of field measurements.
[en] A versatile photoelectrochemical cell was built for the comparative study of photosensitive materials used as photoelectrodes in solar hydrogen production. The experimental arrangement makes possible a relative evaluation of the electrode properties through measurement of the electrical parameters, directly giving I = f(U) for the cell electric circuit with and without an external electrical bias. It also gives a direct measurement of the volume of the evolved gases, and an on-line analysis of the gases by the coupled gas chromatograph, or off-line by a mass spectrometer.
[en] In this paper, a simple model for analysing variability in radon concentrations in homes is tested. The approach used here involves two error components, representing additive and multiplicative errors, together with between-house variation. We use a Bayesian approach for our analysis and apply this model to two datasets of repeat radon measurements in homes: one based on 3-month measurements for which the original measurements were close to the current UK Radon Action Level (200 Bq m⁻³), and the other based on 6-month measurement data (from regional and national surveys) for which the original measurements cover a wide range of radon concentrations, down to very low levels. The model with two error components provides a better fit to these datasets than does a model based on solely multiplicative errors. - Highlights: → A new multilevel model for analysing variability in radon measurements in homes is tested. → The model includes additive and multiplicative errors together with between-house variation. → The model with two error components provides a better fit than a model based on only multiplicative error. → The model can also be used for other environmental data where the variation at low levels cannot be modelled by multiplicative error alone.
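The motivation for the second error component can be seen in a quick simulation: under a purely multiplicative (log-normal) error, the spread of repeat measurements shrinks towards zero as the true level falls, whereas adding an additive component keeps a floor on the variability at low concentrations. The error magnitudes below are illustrative assumptions, not the paper's fitted values:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate(true_level, n, sigma_mult=0.2, sigma_add=5.0):
    """Repeat measurements of one home at a fixed true radon level,
    with a multiplicative (log-normal) and an additive (Gaussian)
    error component. sigma values are illustrative assumptions."""
    mult = np.exp(rng.normal(0.0, sigma_mult, n))
    add = rng.normal(0.0, sigma_add, n)
    return true_level * mult + add

low = simulate(10.0, 100_000)    # near-background home
high = simulate(200.0, 100_000)  # home near the UK Action Level

# The additive term dominates the spread at low levels; a purely
# multiplicative model would predict a spread proportional to the level.
print(np.std(low), np.std(high))
```

Extending this to a hierarchical (multilevel) version, with a between-house distribution of true levels, gives the structure of the model the abstract describes.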
[en] Aiming at the problem of parameter estimation in analog circuits, a new approach is proposed. The approach uses a fractional wavelet to derive the Volterra series model of the circuit under test (CUT). Using a gradient search algorithm on the Volterra model, the unknown parameters of the CUT are estimated and the Volterra model is identified. Simulations show that the parameter estimation results of the proposed method are better than those of other parameter estimation methods. (paper)
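A minimal sketch of gradient-search identification on a Volterra-type model: here a memoryless second-order series y = a·x + b·x², where the kernels a and b stand in for the unknown parameters. This toy example only illustrates the gradient step on the mean-squared error; it does not reproduce the paper's fractional-wavelet construction of the model.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, 500)          # excitation signal
a_true, b_true = 2.0, -0.5               # illustrative "circuit" kernels
y = a_true * x + b_true * x**2           # observed response (noise-free toy case)

# Gradient descent on the mean-squared error of the Volterra-type model.
a, b = 0.0, 0.0
lr = 0.1
for _ in range(2000):
    e = (a * x + b * x**2) - y           # residual
    a -= lr * np.mean(2 * e * x)         # d(MSE)/da
    b -= lr * np.mean(2 * e * x**2)      # d(MSE)/db

print(a, b)  # converges towards the true kernel values
```

With the model identified, the circuit parameters are read off from (or mapped back out of) the estimated kernels, which is the role the gradient search plays in the abstract's approach.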
[en] Anti-alpha-fetoprotein (AFP) radioimmunodetection was performed in response to clinical requests in 16 patients. In two patients, assessment of a known tumour was required; the anti-AFP scans were accurate and provided useful clinical information in both cases. In the remaining 14 patients the request was for localisation of suspected recurrent tumour. Accurate information was provided in four of these patients. In this latter group, various conventional methods of investigation had failed to disclose the site of recurrence. However, of a total of 21 sites reported as positive in these 14 patients, eight proved to be false positives. Two false-negative results also occurred in this group, and nine sites could not be evaluated. Although some patients were usefully scanned, improvements are necessary before consistently reliable information can be obtained with this technique. (author)
[en] We designed a novel sensor specifically aimed at ex vivo measurement of white thrombus volume growth; a white thrombus is induced within an artificial micro-channel in which hemostasis takes place, starting from whole blood under flow conditions. The advantage of the proposed methodology is that it identifies the time evolution of the thrombus volume by means of an original data fusion methodology based on 2D optical and electrical impedance data processed simultaneously. By contrast, present state-of-the-art optical imaging methodologies allow the thrombus volume to be estimated only at the end of the hemostatic process.