Results 1 - 10 of 4501. Search took: 0.036 seconds
[en] This paper presents a general overview of scientific visualization from a historical perspective. It looks first at visualization before the advent of computers, and then describes the development of early visualization tools in the 'computer age'. There was a surge of interest in visualization in the latter part of the 1980s, following the publication of an NSF report. This sparked the development of a number of major visualization software systems such as AVS and IRIS Explorer. These are described, and the paper concludes with a look at future developments.
[en] In this paper, two families of non-narrow-sense (NNS) BCH codes over a finite field are studied. The maximum designed distances of these dual-containing BCH codes are determined by a careful analysis of the properties of the cyclotomic cosets. NNS BCH codes that achieve these maximum designed distances are presented, and a sequence of nested NNS BCH codes containing these maximum-designed-distance codes is constructed and their parameters are computed. Consequently, new nonbinary quantum BCH codes are derived from these NNS BCH codes. The new quantum codes presented here include many classes of good quantum codes, with parameters better than those constructed from narrow-sense BCH codes and from negacyclic and constacyclic BCH codes in the literature.
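Cyclotomic cosets, which drive the designed-distance analysis above, can be computed in a few lines. The sketch below uses an illustrative classical binary example (q = 2, n = 15), not the parameters of the paper:

```python
def cyclotomic_cosets(q, n):
    """Partition {0, ..., n-1} into q-cyclotomic cosets mod n.

    The coset of s is {s, s*q, s*q^2, ...} reduced mod n; unions of
    cosets determine which BCH designed distances are attainable.
    Assumes gcd(q, n) == 1.
    """
    seen = set()
    cosets = []
    for s in range(n):
        if s in seen:
            continue
        coset, x = [], s
        while x not in seen:
            seen.add(x)
            coset.append(x)
            x = (x * q) % n
        cosets.append(sorted(coset))
    return cosets

# Example: 2-cyclotomic cosets mod 15 (binary BCH codes of length 15)
print(cyclotomic_cosets(2, 15))
# → [[0], [1, 2, 4, 8], [3, 6, 9, 12], [5, 10], [7, 11, 13, 14]]
```

A BCH code's defining set is a union of such cosets, which is why the attainable designed distances are dictated by how consecutive exponents fall into the cosets.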
[en] Highlights: • CAD sensitivity is still limited for automated detection of subsolid nodules. • CAD detection rate is higher for part-solid than for pure ground-glass nodules. • Part-solid nodule detection is not better for nodules with larger solid component. - Abstract: Objectives: To evaluate the performance of a commercially available CAD system for automated detection and measurement of subsolid nodules. Materials and methods: The CAD system was tested on 50 pure ground-glass and 50 part-solid nodules (median diameter: 17 mm) previously found on standard-dose CT scans in 100 different patients. True nodule detection and the total number of CAD marks were evaluated at different sensitivity settings. The influence of nodule and CT acquisition characteristics was analyzed with logistic regression. Software and manually measured diameters were compared with Spearman and Bland-Altman methods. Results: With sensitivity adjusted for 3-mm nodule detection, 50/100 (50%) subsolid nodules were detected, at the average cost of 17 CAD marks per CT. These figures were respectively 26/100 (26%) and 2 at the 5-mm setting. At the highest sensitivity setting (2-mm nodule detection), the average number of CAD marks per CT was 41 but the nodule detection rate only increased to 54%. Part-solid nodules were better detected than pure ground-glass nodules: 36/50 (72%) versus 14/50 (28%) at the 3-mm setting (p < 0.0001), with no influence of the solid component size. Except for the type (i.e. part-solid or pure ground-glass), no other nodule characteristic influenced the detection rate. High-quality segmentation was obtained for 79 nodules, for which automated measurements correlated well with manual measurements (rho = 0.90 [0.84–0.93]). All part-solid nodules had software-measured attenuation values above −671 Hounsfield units (HU).
Conclusion: The detection rate of subsolid nodules by this CAD system was insufficient, but high-quality segmentation was obtained in 79% of cases, allowing automated measurement of size and attenuation.
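The statistical comparison used above (Spearman correlation plus Bland-Altman limits of agreement) can be sketched in a few lines of pure Python; the diameters below are toy values, not the study's data:

```python
def ranks(xs):
    """Average ranks (1-based), with ties sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean 1-based rank of the tie group
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

def bland_altman(x, y):
    """Bias and 95% limits of agreement (mean diff ± 1.96 SD)."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = (sum((d - mean) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return mean, mean - 1.96 * sd, mean + 1.96 * sd

# Toy diameters (mm), NOT the study's measurements
manual = [12, 17, 9, 22, 15]
auto = [11, 18, 10, 21, 14]
rho = spearman_rho(manual, auto)
bias, low, high = bland_altman(auto, manual)
```

Spearman captures whether the two methods rank lesions the same way; Bland-Altman shows the systematic offset and spread of the disagreement, which a correlation coefficient alone hides.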
[en] Highlights: • Review of the requirements and recommendations for BEPU methodology. • Summary of the advantages and limitations of the current deterministic bounding method for non-LOCA transient analysis. • Description of a pragmatic, graded approach for application of the BEPU methodology to non-LOCA transient analysis. • Proposal for a demonstration case. - Abstract: Since the 1990s, the best estimate plus uncertainty (BEPU) methodology has become common practice for large-break Loss-Of-Coolant Accident (LOCA) analysis. However, the development and application of the BEPU methodology impose more stringent requirements on the verification, validation, and uncertainty quantification (VVUQ) of the calculational methods and computer codes used. This can make BEPU methodology development costly and hence prevent the industry from taking full benefit of BEPU applications. This paper proposes a pragmatic, graded approach for applying the BEPU methodology to non-LOCA transient analyses.
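The abstract does not spell out the statistical machinery, but BEPU analyses commonly size the number of uncertainty-propagation code runs with non-parametric order statistics (Wilks' formula); a minimal sketch under that assumption:

```python
def wilks_runs(beta=0.95, gamma=0.95):
    """Smallest number of code runs n such that the maximum of n random
    outputs bounds the beta-quantile with confidence gamma (first-order,
    one-sided Wilks' formula): 1 - beta**n >= gamma."""
    n = 1
    while 1 - beta ** n < gamma:
        n += 1
    return n

print(wilks_runs())  # classic 95%/95% result: 59 runs
```

The non-parametric character of this criterion is part of what makes the VVUQ burden mentioned above tractable: the run count depends only on the requested probability/confidence levels, not on the shape of the output distribution.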
[en] During the commissioning of the KSTAR device, an electronic logbook was developed to record participants' observations regarding experimental procedures and results. The experimental logbook, one part of the electronic logbook, automatically inserts the principal experimental parameters and collects the experimenters' comments. Since browsing raw experimental data usually takes a long time, a summarized comment in the experimental logbook helps physicists analyze the results. The operation logbook, the other part of the electronic logbook, records the history of system abnormalities, including how they were managed. Records in the operation logbook provide criteria for validating the device's stability and a basis for refining the device operation procedures. Since a large majority of readers are interested in the information in the electronic logbook, the data are made searchable and readable on a web site accessible to authenticated users. The web site also includes a formatting function that reports the logbook data as a document using the Java Document Object Model (DOM) and Simple API for XML (SAX) APIs. Because insufficient tokamak operation experience made it difficult to define action scenarios for events in advance, the logbook was developed in parallel with the commissioning. By training operators to comment on every detail of the experimental results and operation events, it can become an even more valuable data source for future experiments.
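The DOM-based report generation described above can be illustrated with a minimal sketch. This Python analogue of the Java DOM export uses hypothetical field names ('shot', 'comment'), not KSTAR's actual schema:

```python
from xml.dom.minidom import Document

def logbook_to_xml(entries):
    """Serialize logbook entries to XML by building a DOM tree,
    analogous to the Java DOM-based report generation.
    Entry fields ('shot', 'comment') are illustrative only."""
    doc = Document()
    root = doc.createElement("logbook")
    doc.appendChild(root)
    for e in entries:
        node = doc.createElement("entry")
        node.setAttribute("shot", str(e["shot"]))
        node.appendChild(doc.createTextNode(e["comment"]))
        root.appendChild(node)
    return doc.toxml()

xml = logbook_to_xml([{"shot": 1234, "comment": "Plasma current reached target."}])
print(xml)
```

A DOM builder like this holds the whole report in memory, which suits document generation; the SAX side mentioned in the abstract is the complementary streaming approach, better suited to parsing large logbook exports.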
[en] Structural properties of u-constacyclic codes over a finite ring are given, where p is an odd prime. Under a special Gray map, some new non-binary quantum codes are obtained from this class of constacyclic codes.
[en] The growing popularity of social media as a channel for distributing and debating scientific information raises questions about the types of discourse that surround emerging technologies, such as nanotechnology, in online environments, as well as the different forms of information that audiences encounter when they use these online tools of information sharing. This study maps the landscape surrounding social media traffic about nanotechnology. Specifically, we use computational linguistic software to analyze a census of all English-language nanotechnology-related tweets expressing opinions posted on Twitter between September 1, 2010 and August 31, 2011. Results show that 55 % of tweets expressed certainty and 45 % expressed uncertainty. Twenty-seven percent of tweets expressed optimistic outlooks, 32 % expressed neutral outlooks and 41 % expressed pessimistic outlooks. Tweets were mapped by U.S. state, and our data show that tweets are more likely to originate from states with a federally funded National Nanotechnology Initiative center or network. The trend toward certainty in opinion coupled with the distinct geographic origins of much of the social media traffic on Twitter for nanotechnology-related opinion has significant implications for understanding how key online influencers are debating and positioning the issue of nanotechnology for lay and policy audiences.
[en] The aim of the study was to evaluate the reliability of analyzing only 10 frames, rather than a whole clip, in performing quantitative assessment of tumor enhancement of focal liver lesions (FLLs) following ultrasound contrast injection. Contrast-enhanced ultrasonography (CEUS) examinations of 31 FLLs (median diameter: 30 mm) were performed. All clips were analyzed and quantified with an early prototype of the SonoLiver software (TomTec GmbH, Munich and Bracco Research SA, Geneva), first evaluating the entire clip and then selecting only 10 frames at different time intervals. Enhancement measurements obtained from the analysis of the entire clip or of only 10 frames were closely correlated (r = 0.931 and p < 0.0001 for Area Under the Curve; r = 0.944 and p < 0.0001 for Perfusion Index). In conclusion, enhancement quantification of FLLs can be reliably obtained from only 10 frames rather than the entire clip, at least for most parameters, making the procedure easier for potential routine use.
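The frame-reduction idea can be illustrated by computing the Area Under the Curve from a full time-intensity curve versus 10 subsampled frames. The curve below is synthetic and the evenly-spaced subsampling rule is an assumption, since the SonoLiver internals are not described:

```python
import math

def trapezoid_auc(times, values):
    """Area under a time-intensity curve by the trapezoidal rule."""
    return sum((values[i] + values[i + 1]) / 2 * (times[i + 1] - times[i])
               for i in range(len(times) - 1))

def subsample(times, values, k=10):
    """Pick k frames at roughly evenly spaced indices (an assumption;
    the study selected frames 'at different time intervals')."""
    idx = [round(i * (len(times) - 1) / (k - 1)) for i in range(k)]
    return [times[i] for i in idx], [values[i] for i in idx]

# Synthetic enhancement curve: rapid wash-in, slow wash-out, 2 fps for 60 s
times = [t * 0.5 for t in range(121)]
values = [100 * t * math.exp(-t / 15) for t in times]

full_auc = trapezoid_auc(times, values)
t10, v10 = subsample(times, values)
reduced_auc = trapezoid_auc(t10, v10)
```

On a smooth perfusion-like curve, the 10-frame AUC lands close to the full-clip AUC, which is the intuition behind the strong correlations reported above; the approximation degrades if the reduced frames miss the enhancement peak.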