Results 1 - 10 of 3581
[en] This paper aims to interpret the problem of the quantum-classical divide following Bohm's holographic model and to reformulate it as an indication of a new physical order. First of all I briefly outline the differences between the classical world and the quantum one (such as locality against nonlocality, determinism against indeterminism and continuity against discontinuity); then I claim that in order to understand the divide between the two domains we should start from what is common, and regard them as two abstractions and limiting cases of a general theory. In particular, following Bohm, I show that the central notion of this new theory is an undivided whole characterized by a general order consisting of a holomovement from an implicate order - the quantum domain - to an explicate order - the classical domain. This part is explained with the aid of the structure of the hologram and is supported by a reflection on some key terms such as 'order', 'structure', 'implicate' and 'explicate'. Finally I propose that this movement of unfoldment and enfoldment can explain the apparent incompatibility of the two physical domains and the passage from one to the other.
[en] Physicists and philosophers have long claimed that the symmetries of our physical theories - roughly speaking, those transformations which map solutions of the theory into solutions - can provide us with genuine insight into what the world is really like. According to this 'Invariance Principle', only those quantities which are invariant under a theory's symmetries should be taken to be physically real, while those quantities which vary under its symmetries should not. Physicists and philosophers, however, are generally divided (or, indeed, silent) when it comes to explaining how such a principle is to be justified. In this paper, I spell out some of the problems inherent in other theorists' attempts to justify this principle, and sketch my own proposed general schema for explaining how - and when - the Invariance Principle can indeed be used as a legitimate tool of metaphysical inference.
[en] The nuclear fusion experiment Wendelstein 7-X is currently under construction in Greifswald, Germany. After completion in 2014, the experiment will be the largest and most advanced stellarator ever built. The cryostat hosting the superconducting coils and the vacuum vessel has a diameter of 16 meters and a height of 5 meters, and the magnetic field will be 2.5 T on axis. Wendelstein 7-X is designed to demonstrate simultaneous high-density, high-temperature, steady-state plasma operation. The first plasma is planned for 2015. After initial tests, the plasma pulse time will be gradually increased up to 30 minutes from 2019 on. The core plasma temperature in this device will exceed 100 million degrees. Contact between the plasma and the plasma-facing components must therefore be controlled carefully. One challenge in this connection is that the plasma shape will change during operation due to internal currents generated by the plasma itself. Using state-of-the-art codes, we are investigating and developing operational scenarios for the first, relatively short plasma pulses that allow us to address important issues for the later steady-state operation.
[en] The Higgs boson decay into bottom quarks has the highest branching fraction of all decay modes. Among the decays not yet observed, the branching fraction to charm quarks is the second highest. This talk presents a search for the Higgs boson in the gluon fusion production mode with high Lorentz boosts, decaying to a pair of bottom quarks. The analysis was published in 2018 and was sensitive enough to observe the boosted Z boson decay into b quarks for the first time. Given the recently developed deep-learning-based tools for the identification of bottom- and charm-flavor jets in such topologies, the natural next step is an analogous search for the decay to a pair of charm quarks. Probing this channel is not only important for completeness and for studying the Higgs couplings to the second generation of fermions, but it could also be sensitive to potential beyond-Standard-Model corrections.
[en] We present an implementation of WZjj production via vector-boson fusion in the POWHEG BOX, a public tool for the matching of next-to-leading order QCD calculations with multi-purpose parton-shower generators. We provide phenomenological results for electroweak WZjj production with fully leptonic decays at the LHC in realistic setups and discuss theoretical uncertainties associated with the simulation. We find that beyond the leading-order approximation the dependence on the unphysical factorization and renormalization scales is mild. Furthermore, the two tagging jets are very stable against parton-shower effects. However, considerable sensitivities to the shower Monte Carlo program used are observed for central-jet veto observables.
[en] In this talk I will show that scattering amplitudes with massive or massless particles of spin s ≤ 1 in renormalisable theories can be constructed recursively out of amplitudes with a smaller number of external legs. The basic idea is to deform the momenta of the external particles by shifting them into the complex plane in such a manner that on-shellness and overall momentum conservation are preserved. The precise shifting procedure depends on the kind of particles involved and on their spin projected along a reference axis. Cauchy's residue theorem then connects the physical amplitude to the poles emerging in the complex plane and to a contour integral, which is forced to vanish by deforming sufficiently many external momenta. It turns out that shifting 5 external legs is always sufficient, and shifting 3 external legs is sufficient unless all external particles are scalars or longitudinally polarised vector bosons. Furthermore, the introduction of special reference frames to guarantee a good behaviour of the contour integral, and hence recursive constructibility, involves some subtleties; in particular, the reference frames for the individual particles differ. To overcome these problems, a formalism to convert a posteriori between different reference frames is introduced, leading to a description in a common frame. Together with construction rules that forbid certain helicity combinations in the 3-point amplitudes, a set of selection and suppression rules can be derived.
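The residue argument described in this abstract can be illustrated with the familiar two-line (BCFW-type) shift of massless momenta; this is a sketch of the underlying mechanism only, not the multi-line shifts for massive and spinning particles that the talk itself develops:

```latex
% Two-line shift of external momenta p_i and p_j by a complex parameter z.
% The reference vector q is chosen lightlike and orthogonal to both momenta,
% so that on-shellness and total momentum conservation are preserved:
\hat p_i(z) = p_i + z\,q, \qquad \hat p_j(z) = p_j - z\,q,
\qquad q^2 = q \cdot p_i = q \cdot p_j = 0 .
% If the deformed amplitude A(z) vanishes as |z| -> \infty, the contour
% integral at infinity vanishes and Cauchy's residue theorem gives
0 = \frac{1}{2\pi i}\oint \frac{\mathrm{d}z}{z}\,A(z)
  = A(0) + \sum_{\text{poles } z_k} \operatorname{Res}_{z_k} \frac{A(z)}{z},
% so the physical amplitude A(0) is a sum over factorization poles,
% each built from two on-shell amplitudes with fewer external legs:
A(0) = \sum_{k} A_L(z_k)\,\frac{1}{P_k^2}\,A_R(z_k).
```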
[en] The process of forming particle trajectories from measurements is called track reconstruction. Due to pile-up, it quickly becomes the most resource-intensive part of event reconstruction in HEP. In recent runs of the LHC, experiments have successfully used highly optimized software to achieve the desired computational and physics performance. In light of the upcoming increase in luminosity for the HL-LHC, new solutions are being developed. Propagation and navigation of particle trajectories through the detector are particularly CPU-intensive tasks. They are essential for fitting tracks and are therefore revisited in the Acts project. In computer graphics, highly performant ray-tracing algorithms are used routinely. Using ray-box intersections in hierarchies of boxes, intersections of the assumed direction of a track with the detector geometry can be found efficiently. These algorithms could enable a more robust and flexible alternative to other navigation solutions. The approach could also alleviate some of the sophistication and fine-tuning required in building geometries that can be navigated easily. Different navigation strategies through HEP detector geometries and their interplay with track reconstruction are covered in this talk. Benefits and pitfalls of the different approaches will be reviewed. The usage of intersection algorithms for fast navigation is investigated, and their applicability to real-world tracking scenarios is evaluated.
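The ray-box test mentioned in this abstract is, in its simplest form, the classic "slab" method from computer graphics: intersect the ray with the pair of parallel planes bounding the box along each axis and keep the overlap of the resulting parameter intervals. The sketch below is illustrative only; all names are invented for this example, and the actual navigation code in Acts is more elaborate (hierarchical, vectorized, in C++):

```python
def ray_box_intersect(origin, direction, box_min, box_max):
    """Slab-method intersection of a ray with an axis-aligned box.

    Returns (hit, t_near, t_far), where t_near/t_far parametrize the
    entry and exit points along the ray as origin + t * direction.
    """
    t_near, t_far = float("-inf"), float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if d == 0.0:
            # Ray parallel to this slab: a hit is only possible if the
            # origin already lies between the two bounding planes.
            if o < lo or o > hi:
                return False, 0.0, 0.0
            continue
        t1, t2 = (lo - o) / d, (hi - o) / d
        if t1 > t2:
            t1, t2 = t2, t1
        # Intersect this slab's interval with the running interval.
        t_near, t_far = max(t_near, t1), min(t_far, t2)
        if t_near > t_far or t_far < 0.0:
            # Slab intervals do not overlap, or the box lies behind the ray.
            return False, 0.0, 0.0
    return True, t_near, t_far
```

For example, a ray starting at the origin along +x enters a box spanning x in [1, 2] at t = 1 and exits at t = 2. Applied recursively to a bounding-volume hierarchy, this test prunes whole subtrees of detector volumes that the extrapolated track direction cannot reach.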