[en] This paper presents a general overview of scientific visualization from a historical perspective. It looks first at visualization before the advent of computers, and then describes the development of early visualization tools in the 'computer age'. There was a surge of interest in visualization in the latter part of the 1980s, following the publication of an NSF report. This sparked the development of a number of major visualization software systems such as AVS and IRIS Explorer. These are described, and the paper concludes with a look at future developments.
[en] ATLAS is one of the largest collaborations ever undertaken in the physical sciences. This paper explains how the software infrastructure is organized to manage collaborative code development by around 300 developers with varying degrees of expertise, situated in 30 different countries. ATLAS offline software currently consists of about 2 million source lines of code contained in 6800 C++ classes, organized in more than 1000 packages. We describe how releases of the offline ATLAS software are built, validated and subsequently deployed to remote sites. Several software management tools have been used, the majority of which are not ATLAS-specific; we show how they have been integrated.
[en] Abacus-based mental calculation is a unique part of Chinese culture. Abacus experts can perform complex computations mentally with exceptionally fast speed and high accuracy. However, the neural bases of this computation processing are not yet clearly known. This study used a BOLD contrast 3T fMRI system to explore the differences in brain activation between abacus experts and non-expert subjects. All the acquired data were analyzed using SPM99 software. The results revealed different ways of performing calculations between the two groups. The experts tended to adopt an efficient visuospatial/visuomotor strategy (bilateral parietal/frontal network) to process and retrieve all the intermediate and final results on a virtual abacus during calculation. By contrast, coordination of several networks (verbal, visuospatial processing and executive function) was required in the normal group to carry out arithmetic operations. Furthermore, greater involvement of visuomotor imagery processing (right dorsal premotor area) for imagining bead manipulation, and lower use of executive function (frontal-subcortical area) for launching the relatively time-consuming, sequentially organized process, were noted in the abacus expert group than in the non-expert group. We suggest that these findings may explain why abacus experts exhibit exceptional computational skills compared to non-experts after intensive training.
[en] The GRENOUILLE traces of Gemini pulses (15 J, 30 fs, PW, one shot per 20 s) were acquired in the Gemini Target Area PetaWatt at the Central Laser Facility (CLF), Rutherford Appleton Laboratory (RAL). A comparison was made between the characterizations of the laser pulse parameters obtained using two different types of algorithm: VideoFrog and GRenouille/FrOG (GROG). The temporal and spectral parameters were found to be in good agreement between the two algorithms. This experimental campaign showed that GROG, the newly developed algorithm, performs as well as the VideoFrog algorithm for petawatt-class pulses. - Highlights: • Integration of the diagnostic tool on a high-power laser. • Validation of the GROG algorithm against well-known, commercially available software. • Complete characterization of the GEMINI ultra-short high-power laser pulse.
[en] Most physics analysis jobs involve multiple selection steps on the input data. These selection steps are called cuts or queries. A common strategy to implement these queries is to read all input data from files and then process the queries in memory. In many applications the number of variables used to define these queries is a relatively small portion of the overall data set, so reading all variables into memory takes an unnecessarily long time. In this paper we describe an integration effort that can significantly reduce this unnecessary reading by using an efficient compressed bitmap index technology. The primary advantage of this index is that it can process arbitrary combinations of queries very efficiently, while most other indexing technologies suffer from the 'curse of dimensionality' as the number of queries increases. By integrating this index technology with the ROOT analysis framework, end-users can benefit from the added efficiency without having to modify their analysis programs. Our performance results show that for multi-dimensional queries, bitmap indices outperform the traditional analysis method by up to a factor of 10.
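The core idea behind the bitmap-index approach can be sketched in a few lines. This is an illustrative toy, not the actual compressed-bitmap/ROOT integration described in the abstract: each variable is binned, each bin keeps a bitmap marking the rows that fall into it, and a multi-dimensional cut is evaluated by AND-ing bitmaps, so only the selected rows ever need to be read. The class and function names here are invented for the example.

```python
# Toy bitmap index: one bitmap (a Python int) per value bin; a
# multi-dimensional query is a bitwise AND of per-variable selections.
from collections import defaultdict

class BitmapIndex:
    def __init__(self, values, bin_width):
        self.bin_width = bin_width
        self.bitmaps = defaultdict(int)        # bin id -> bitmap of row numbers
        for row, v in enumerate(values):
            self.bitmaps[int(v // bin_width)] |= 1 << row

    def select_range(self, lo, hi):
        """Bitmap of rows with lo <= value < hi (bin-aligned bounds assumed)."""
        result = 0
        for b in range(int(lo // self.bin_width), int(hi // self.bin_width)):
            result |= self.bitmaps.get(b, 0)
        return result

def rows(bitmap):
    """Decode a bitmap into the sorted list of selected row numbers."""
    out, row = [], 0
    while bitmap:
        if bitmap & 1:
            out.append(row)
        bitmap >>= 1
        row += 1
    return out

# Two 'cuts' combined with a bitwise AND, mimicking a two-dimensional query:
pt  = BitmapIndex([5.0, 25.0, 40.0, 12.0, 60.0], bin_width=10.0)
eta = BitmapIndex([0.5, 1.8, 0.2, 2.3, 0.9], bin_width=0.5)
selected = pt.select_range(20.0, 70.0) & eta.select_range(0.0, 1.0)
print(rows(selected))   # rows passing both cuts
```

Because each additional cut is just one more bitwise AND, the cost of combining queries grows linearly with the number of cuts rather than exploding with dimensionality, which is the property the abstract highlights.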
[en] GAMOS is a software system for GEANT4-based simulation. It comprises a framework, a set of components providing functionality to simulation applications on top of the GEANT4 toolkit, and a collection of ready-made applications. It allows GEANT4-based simulations to be performed using a scripting language, without requiring any C++ code to be written. Moreover, the GAMOS design allows the existing functionality to be extended through user-supplied C++ classes. The main characteristics of GAMOS and its embedded functionality are described.
[en] Modular software frameworks have become indispensable for large-scale experiments like the KM3NeT neutrino telescope. This article gives a generic definition of a software framework and presents an adaptation of IceTray, the framework currently in use by the IceCube collaboration, for water-based detectors.
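The generic framework pattern the abstract refers to can be illustrated with a minimal sketch. This is not the real IceTray API; the class and method names below are invented for the example. The essential idea is that data flows through a chain of modules as self-contained "frames", and each module reads from or adds to the frame without knowing about the other modules.

```python
# Minimal sketch of a modular processing framework: the driver pushes each
# frame through the configured module chain in order.

class Module:
    """Base class: one processing step in the chain."""
    def process(self, frame):
        raise NotImplementedError

class HitReader(Module):
    def process(self, frame):
        frame["hits"] = [1.2, 3.4, 2.1]        # stand-in for detector readout

class HitCounter(Module):
    def process(self, frame):
        frame["n_hits"] = len(frame["hits"])

class Framework:
    def __init__(self):
        self.modules = []

    def add(self, module):
        self.modules.append(module)

    def execute(self, n_frames):
        results = []
        for _ in range(n_frames):
            frame = {}                          # one frame per event
            for m in self.modules:
                m.process(frame)
            results.append(frame)
        return results

fw = Framework()
fw.add(HitReader())
fw.add(HitCounter())
frames = fw.execute(2)
print(frames[0]["n_hits"])   # 3
```

The value of this design for a large experiment is that detector-specific steps (here, `HitReader`) can be swapped out, which is exactly what adapting an ice-based framework to a water-based detector requires.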
[en] The CMS muon system has been taking cosmic ray data since 2006 and is using them to study the performance of the three different detector technologies and their triggers (drift tubes, cathode strip chambers and resistive plate chambers). The muon system is described, with emphasis on the software tools that were developed and used to take data and to study, online and offline, the performance of the muon system. The results obtained by analyzing up to 300 million cosmic ray events acquired with the CMS detector are described.
[en] This paper describes the steps needed to use SPEC's 'macro hardware facility' for control and readout of EPICS process variables. This facility allows many instruments that can communicate with EPICS to be integrated into the SPEC diffractometer-control program.
[en] The uncertainty associated with the assessment of the radioactive content of any sample depends on the net counting rate registered during the measuring process and on the different weighting factors needed to transform this counting rate into activity, activity per unit mass or activity concentration. This work analyses the standard uncertainties in these weighting factors, as well as their contribution to the uncertainty in the activity reported for three typical determinations in environmental radioactivity laboratory measurements. It also studies the corresponding characteristic limits and their dependence on the standard uncertainty of those weighting factors, offering an analysis of the effectiveness of the simplified characteristic limits as evaluated by various measurement software packages and laboratories.
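The propagation described above can be sketched numerically. This is a generic illustration of standard uncertainty propagation for a quantity built from a net counting rate divided by multiplicative weighting factors (detection efficiency, emission probability, sample mass); the function name and all input values are invented for the example and are not taken from the paper.

```python
# Activity concentration A = R / (eff * p_gamma * mass), with the combined
# standard uncertainty obtained by summing uncorrelated relative uncertainties
# in quadrature (first-order propagation for a purely multiplicative model).
import math

def activity_concentration(rate, u_rate, eff, u_eff,
                           p_gamma, u_p, mass, u_mass):
    """Return (A, u_A) for A = rate / (eff * p_gamma * mass)."""
    a = rate / (eff * p_gamma * mass)
    rel = math.sqrt((u_rate / rate) ** 2 +
                    (u_eff / eff) ** 2 +
                    (u_p / p_gamma) ** 2 +
                    (u_mass / mass) ** 2)
    return a, a * rel

# Illustrative inputs: net rate 0.50 +/- 0.02 s^-1, efficiency 0.25 +/- 0.01,
# emission probability 0.85 +/- 0.005, sample mass 0.100 +/- 0.001 kg
a, u_a = activity_concentration(0.50, 0.02, 0.25, 0.01,
                                0.85, 0.005, 0.100, 0.001)
print(f"A = {a:.1f} +/- {u_a:.1f} Bq/kg")
```

Note how the relative uncertainties of the counting rate and the efficiency (4% each here) dominate the combined uncertainty, which is the kind of weighting-factor contribution the abstract sets out to quantify.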