Results 1 - 10 of 442
[en] We present a set of indicators of vulnerability and capacity to adapt to climate variability, and by extension climate change, derived using a novel empirical analysis of data aggregated at the national level on a decadal timescale. The analysis is based on a conceptual framework in which risk is viewed in terms of outcome and is a function of physically defined climate hazards and socially constructed vulnerability. Climate outcomes are represented by mortality from climate-related disasters, using the Emergency Events Database (EM-DAT) data set. Statistical relationships between mortality and a shortlist of potential proxies for vulnerability are used to identify key vulnerability indicators. We find that 11 key indicators exhibit a strong relationship with decadally aggregated mortality associated with climate-related disasters. Validation of the indicators, relationships between vulnerability and adaptive capacity, and the sensitivity of subsequent vulnerability assessments to different sets of weightings are explored using expert judgement data collected through a focus group exercise. The data are used to provide a robust assessment of vulnerability to climate-related mortality at the national level, and represent an entry point to more detailed explorations of vulnerability and adaptive capacity. They indicate that the most vulnerable nations are those situated in sub-Saharan Africa and those that have recently experienced conflict. Adaptive capacity, one element of vulnerability, is associated predominantly with governance, civil and political rights, and literacy. (author)
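The abstract does not spell out the statistical screening step; as a rough illustration of the kind of analysis described (relating decadally aggregated disaster mortality to candidate proxy indicators), the sketch below correlates a log-mortality series with two candidate proxies. All variable names and data here are hypothetical, not the paper's.

```python
import numpy as np

# Hypothetical national-level panel, decadally aggregated.
rng = np.random.default_rng(42)
n_countries = 120
literacy = rng.uniform(0.3, 1.0, n_countries)      # candidate proxy
governance = rng.normal(0.0, 1.0, n_countries)     # candidate proxy
# Synthetic mortality, higher where literacy and governance are low.
log_mortality = (4.0 - 2.0 * literacy - 0.5 * governance
                 + rng.normal(0, 0.4, n_countries))

# Screen each proxy by the strength of its relationship with mortality;
# in the paper, this kind of screening identified 11 key indicators.
for name, proxy in [("literacy", literacy), ("governance", governance)]:
    r = np.corrcoef(proxy, log_mortality)[0, 1]
    print(f"{name}: r = {r:+.2f}")
```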
[en] This report describes a method and process for assigning values to system functions in order to categorize the safety-significant SSCs (structures, systems, and components) in graded quality assurance (GQA). In this study, we used the Delphi process to elicit expert opinions. The Delphi process requires a guideline to ensure consistency, so we developed a guideline for SSC categorization. The guideline specifies how to assign values to system functions so as to screen out the safety-significant SSCs, and provides requirements for an ideal expert panel composition, the questionnaire, descriptions of the issues, and guidance on assigning values.
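The report's guideline concerns a Delphi elicitation; as a minimal sketch of one Delphi aggregation round (the scores, panel size, and consensus threshold below are hypothetical, not taken from the report):

```python
import statistics

# One Delphi step: experts score a system function's safety significance;
# the panel is shown the median and interquartile range, then re-scores
# until the spread falls below an agreed threshold.

def delphi_round(scores: list[float]) -> tuple[float, float]:
    """Return (median, interquartile range) fed back to the panel."""
    q1, med, q3 = statistics.quantiles(scores, n=4)
    return med, q3 - q1

round1 = [3, 7, 5, 8, 4, 6, 5]      # expert scores, round 1
med, iqr = delphi_round(round1)
print(f"round 1: median={med}, IQR={iqr}")

round2 = [5, 6, 5, 7, 5, 6, 5]      # after feedback, opinions converge
med, iqr = delphi_round(round2)
if iqr <= 2:                        # hypothetical consensus criterion
    print(f"consensus value for this system function: {med}")
```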
[en] Transparency and certainty are essential qualities for an acceptable and trusted valuation method. The evaluation of the expert judgement method developed in the Delphi I study suggests that such a method meets both criteria only partially. As for the technical procedure, the method is well documented and its transparency is good; the reasoning behind the judgements, however, should be reported more fully. The quality of the valuation indexes is explicitly available, but their certainty is very low for most interventions. The experts' opinions differ considerably from one another, and it is impossible to assess how much of this stems from differing values and how much from differences in knowledge. It is likewise difficult to assess how much the elicitation technique and the statistical handling of the expert answers may have affected the eventual scores of the different interventions. However, applying expert judgement via the Delphi technique to LCA valuation is a new idea, and the method is consequently still very much under development, far from maturity. This should be taken into account when considering the results of the case-study evaluation, which was the third of its kind in Europe.
[en] A Nuclear Material Control and Accountability (MC&A) Functional Model has been developed to describe MC&A systems at facilities possessing Category I or II Special Nuclear Material (SNM). Emphasis is on achieving the objectives of 144 'Fundamental Elements' in key areas ranging from categorization of nuclear material to establishment of Material Balance Areas (MBAs), access control, quality measurements of inventories and transfers, timely reporting of all activities, and detection and investigation of anomalies. An MC&A System Effectiveness Tool (MSET), incorporating probabilistic risk assessment (PRA) technology for evaluating MC&A effectiveness and relative risk, has been developed to accompany the Functional Model. The Functional Model and MSET were introduced at the 48th annual Institute of Nuclear Materials Management (INMM) meeting in July 2007 [1,2]. A survey/questionnaire is used to accumulate comprehensive data on the MC&A elements at a facility; the questionnaire data are converted to numerical values using the Delphi method, and exercises are conducted to evaluate the overall effectiveness of an MC&A system. In 2007, a peer review was conducted, a questionnaire was completed for a hypothetical facility, and exercises were carried out. In the first quarter of 2008, a questionnaire was completed at Idaho National Laboratory (INL) and MSET exercises were conducted. The experience gained from the MSET exercises at INL helped evaluate the completeness and consistency of the MC&A Functional Model, the descriptions of its fundamental elements, the relationship between the Functional Model and the MC&A PRA tool, and the usefulness of the MSET questionnaire data-collection process.
[en] Highlights: • Integrated key findings derived from three interdependent analytical phases. • Obtained a collective view of shipping experts pertaining to safety leadership. • Developed and validated a weighted safety leadership model in the shipping context. • Effective safety leadership behaviors were articulated, verified and prioritized. • Provided a practical standard/basis for accelerating safety leadership development. Abstract: Recent years have witnessed a growing concern for safety and highlighted the importance of leadership in safety practice within high-risk organizations. Following up on and integrating state-of-the-art research trends, this study aims at (1) bridging a gap in safety leadership research, namely the lack of a holistic understanding of safety leadership contributions at all managerial levels within high-risk organizations; and (2) developing and validating a weighted safety leadership model in the context of shipping, incorporating key safety leadership behaviors that may enable researchers and practitioners to better understand and exercise safety leadership in shipping organizations. To systematically fulfill these aims, the study integrates numerical and descriptive data by sequentially applying three interdependent research techniques: inductive analysis of the literature, a modified Delphi method, and the Analytic Hierarchy Process (AHP). The study results in a holistic weighted model with concrete safety leadership behaviors at each managerial level, which contributes to the theoretical foundations of safety leadership research and serves as a practical standard for accelerating safety leadership development in shipping organizations.
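The abstract names AHP but not its mechanics; as a hedged sketch of the standard AHP weighting step (Saaty's principal-eigenvector method) that such a study would apply, with a purely illustrative 3x3 comparison matrix over hypothetical leadership behaviors:

```python
import numpy as np

# Saaty random-index values for matrix sizes 1..5.
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}

# Reciprocal pairwise comparison matrix: how much more important is
# behavior i than behavior j (values are illustrative only).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                   # normalized priority weights

n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)       # consistency index
CR = CI / RI[n]                            # consistency ratio
print("weights:", np.round(weights, 3))
print(f"CR = {CR:.3f}")                    # CR < 0.1 is conventionally acceptable
```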
[en] Objective: To apply the Delphi method and the Analytic Hierarchy Process (AHP) to establish an index system for the allocation of nuclear emergency medical resources in Sichuan Province, and to provide a scientific basis for evaluating Sichuan's nuclear emergency rescue capability. Methods: The Delphi method was used to select indexes and establish the evaluation index system; AHP was used to determine the index weights. Results: Through the Delphi method and AHP, the index system for the allocation of nuclear emergency medical resources in Sichuan Province was established, comprising 6 primary indexes, 13 secondary indexes and 85 tertiary indexes. Conclusion: The index system has a reasonable structure and comprehensive contents, reflects the core information of nuclear emergency medical resource allocation in Sichuan Province, and provides a basis for its application. (authors)
[en] Noise propagation in iterative reconstruction can be reduced by exact data projection. This can be done by area-weighted projection using the convolution method. Large arrays have to be convolved in order to achieve satisfactory image quality. Two procedures are described which improve the convolution method used so far. Variable binning helps to reduce the size of the convolution arrays without loss of image quality. Computation time is further reduced by abbreviated convolution. The effects of the procedures are illustrated by means of phantom measurements. (orig.)
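The abstract is terse about the two procedures; the sketch below illustrates, in one dimension and with hypothetical sizes, what area-weighted projection by convolution means, and approximates "abbreviated convolution" by truncating small kernel taps. It is an interpretation, not the authors' implementation.

```python
import numpy as np

oversample = 8                             # sub-samples per pixel width
box_pixel = np.ones(oversample)            # pixel footprint (box profile)
box_bin = np.ones(oversample)              # detector bin response (box)

# Area-weighted footprint kernel: the overlap area of pixel and bin as
# the pixel slides across the detector = convolution of the two boxes.
kernel = np.convolve(box_pixel, box_bin).astype(float)
kernel /= kernel.sum()

image = np.zeros(64 * oversample)          # 1D activity distribution
image[240] = 1.0                           # point source

fine = np.convolve(image, kernel, mode="same")
projection = fine.reshape(64, oversample).sum(axis=1)   # per-bin counts

# "Abbreviated convolution": drop the kernel's small outer taps to
# shorten the convolution array, accepting a small controlled error.
keep = kernel >= 0.25 * kernel.max()
short_kernel = kernel[keep] / kernel[keep].sum()
print(len(kernel), "->", len(short_kernel), "taps")
```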
[en] To provide more accurate and precise interpretation and analysis of spectrum data collected from a gamma spectrometry counting system, a fully interactive computer code, named Y-Spect, has been developed using the Delphi 7.0 programming language. The code combines several popular methods for peak search, i.e.: Mariscotti, Phillips-Marlow, Robertson et al., Routti-Prussin, Black, Sterlinski, Savitzky-Golay and Block et al. Any combination of these methods can be chosen during a peak search, which can be performed in automatic or manual mode. Moving-window-average and Savitzky-Golay methods are available for spectrum data smoothing. Peak fitting uses the Levenberg-Marquardt non-linear least-squares method, for either a pure Gaussian peak shape or one with an additional right/left tail function. Beyond standard features such as peak identification and determination of the continuum, regions of interest (ROIs) and peak areas, Y-Spect has a special feature that can predict the existence of escape and/or sum peaks belonging to a given radioisotope. Aside from displaying the complete spectrum graph, including singlet or multiplet ROIs and peak identifications, Y-Spect can also display the first or second derivative of the spectrum data. Evaluation results are reported as isotope names, peak energies, net counts (or count rates), etc. Y-Spect is provided with a complete ENDF/B-VII.0 gamma-ray library file that contains 16,089 gamma energy lines from 1,420 different radioisotopes. Other general specifications are maximum numbers of: spectrum channels = 16*1024; ROIs = 2*1024; ROI width = 2*1024 channels; overlapping peaks (multiplets) = 20; identified isotopes = 3*1024; and isotope library energy lines = 16*1024. (author)
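Y-Spect itself is written in Delphi; as a language-neutral illustration of its Levenberg-Marquardt peak-fitting step (pure Gaussian on a linear continuum, no tail term), here is a sketch using SciPy's curve_fit, whose default unconstrained solver is Levenberg-Marquardt. The ROI, counts, and starting values are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss_peak(x, area, centroid, sigma, b0, b1):
    """Pure Gaussian peak on a linear continuum (no tail term)."""
    return (area / (sigma * np.sqrt(2 * np.pi))
            * np.exp(-0.5 * ((x - centroid) / sigma) ** 2)
            + b0 + b1 * x)

# Synthetic ROI: channels 480..520 with a peak near channel 500.
chan = np.arange(480, 521, dtype=float)
rng = np.random.default_rng(0)
true = gauss_peak(chan, area=5000, centroid=500, sigma=2.5, b0=40, b1=-0.02)
counts = rng.poisson(true).astype(float)

# Levenberg-Marquardt fit (curve_fit's default method "lm"); weights
# follow Poisson counting statistics.
p0 = [counts.sum(), chan[np.argmax(counts)], 2.0, counts.min(), 0.0]
popt, pcov = curve_fit(gauss_peak, chan, counts, p0=p0,
                       sigma=np.sqrt(np.maximum(counts, 1)), method="lm")

net_area, centroid = popt[0], popt[1]
print(f"net area = {net_area:.0f} counts, centroid = {centroid:.2f} ch")
```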
[en] This paper estimates the historical relationship between carbon emissions and GDP using data across countries and across time. We combine this relationship with plausible projections for GDP and population growth to construct a model that offers insights into the likely path of global emissions over the next century. In addition, we experiment with a method for incorporating oil prices into the model. Our analysis provides independent confirmation of the business-as-usual forecasts generated by the larger structural models. (author)
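The abstract gives no equations; a common reduced form consistent with its description is a pooled log-log regression of emissions on GDP, whose slope (an income elasticity) can then be combined with an assumed GDP growth path. The sketch below uses synthetic data and assumed parameters throughout.

```python
import numpy as np

# Hypothetical country-year panel (synthetic placeholder data).
rng = np.random.default_rng(1)
n = 500
log_gdp = rng.normal(9.0, 1.2, n)                        # log GDP per capita
log_co2 = -8.0 + 0.9 * log_gdp + rng.normal(0, 0.5, n)   # log emissions

# Pooled OLS: log_co2 = alpha + beta * log_gdp; beta is the income
# elasticity of emissions implied by the data.
X = np.column_stack([np.ones(n), log_gdp])
(alpha_hat, elasticity), *_ = np.linalg.lstsq(X, log_co2, rcond=None),
print(f"elasticity of emissions w.r.t. GDP = {elasticity:.2f}")

# Projection step: apply the fitted elasticity to an assumed growth path.
gdp_growth = 0.025                         # assumed annual GDP growth
emissions_growth = elasticity * gdp_growth
print(f"implied emissions growth = {emissions_growth:.1%} per year")
```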
[en] SDCE (Software Development Cost Estimation) has always been an interesting and growing field in software engineering. This study supports SDCE by exploring its techniques and models and collecting them in one place, helping future researchers find comprehensive information on SDCE techniques and models in a single paper and save time. In this paper, we review numerous software development effort and cost estimation models and techniques, divided into categories: parametric models, expertise-based techniques, learning-oriented techniques, dynamics-based models, regression-based techniques, fuzzy-logic-based methods, size-based estimation models, and composite techniques. Techniques that do not fall into any specific category are also briefly explained. We conclude that no single technique is best for all situations; rather, different techniques suit projects of different natures. All techniques have their own pros and cons, and all are challenged by the rapidly changing software industry. Since no single technique gives a hundred percent accuracy, no one technique or model should be preferred over all others. We recommend a hybrid approach for SDCE, in which the limitations of one model or technique are complemented by the merits of another. We also recommend model calibration to obtain accurate results, because a model developed in one environment cannot be expected to produce reliable estimates in a completely different one.
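As a concrete instance of the "parametric models" category the paper reviews, here is basic COCOMO (Boehm, 1981), a classic power-law effort model. This is a generic textbook example, not one of the paper's own contributions, and, as the authors recommend, real use would require recalibration to the local environment.

```python
# Basic COCOMO: effort in person-months as a power law of size in KLOC.
# Coefficients are the published basic-COCOMO values per project mode.

COEFFS = {                 # (a, b) per project mode
    "organic": (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded": (3.6, 1.20),
}

def basic_cocomo_effort(kloc: float, mode: str = "organic") -> float:
    """Estimated effort in person-months: E = a * KLOC**b."""
    a, b = COEFFS[mode]
    return a * kloc ** b

# Example: a 32 KLOC organic-mode project.
print(f"{basic_cocomo_effort(32.0):.1f} person-months")
```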