Results 1 - 10 of 121. Search took: 0.02 seconds
[en] In this paper we advance a generalized spinor classification based on the so-called Lounesto classification. The program developed here rests on an existing freedom in the definition of the spinorial dual structures which, in a certain simple physical and mathematical limit, allows us to recover the usual Lounesto classification. The protocol accomplished here pays full attention to the underlying mathematical structure, so as to satisfy the quadratic algebraic relations known as the Fierz-Pauli-Kofink identities and to provide physical observables. As we will see, these identities impose restrictions on the number of spinorial classes allowed in the classification. We also expose a subsidiary mathematical device - a slight modification of the Clifford algebra basis - which ensures real spinorial densities and preserves the Fierz-Pauli-Kofink quadratic relations.
[en] The Particle Data Group recommends a set of procedures to be applied when discrepant data are to be combined. We introduce an alternative method based on a more general and solid statistical framework, providing a robust way to include possible unknown systematic effects interfering with experimental measurements or their theoretical interpretation. The limit of large data sets and practical cases of interest are discussed in detail.
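The abstract contrasts its alternative method with the standard Particle Data Group recipe for combining discrepant measurements. As a point of reference, a minimal sketch of that baseline PDG procedure (weighted average plus scale factor; this is the recipe the paper improves on, not the paper's own method):

```python
import math

def pdg_combine(values, errors):
    """Combine N measurements via the PDG weighted-average and
    scale-factor recipe: if the chi-square about the weighted mean
    indicates discrepant inputs (S > 1), inflate the error by S."""
    weights = [1.0 / e**2 for e in errors]
    wsum = sum(weights)
    mean = sum(w * x for w, x in zip(weights, values)) / wsum
    err = math.sqrt(1.0 / wsum)
    # chi-square of the inputs about the weighted mean
    chi2 = sum(((x - mean) / e) ** 2 for x, e in zip(values, errors))
    # scale factor S = sqrt(chi2 / (N - 1))
    s = math.sqrt(chi2 / (len(values) - 1))
    if s > 1.0:
        err *= s
    return mean, err, s

# Two consistent measurements: no error inflation is applied
mean, err, s = pdg_combine([10.0, 10.0], [1.0, 1.0])
```

For consistent inputs the combined error is simply the inverse-variance result; for discrepant inputs the scale factor widens it.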
[en] We propose using high-order partial least squares path modeling (PLS-PM) to define a synthetic Italian well-being index that merges traditional data, represented by the Quality of Life index proposed by “Il Sole 24 Ore”, with information provided by big data, represented by a Subjective Well-being Index (SWBI) built by extracting moods from Twitter. High-order constructs allow one to define a more abstract higher-level dimension together with its more concrete lower-order sub-dimensions. These layered constructs have gained wide attention in applications of PLS-PM, and many contributions in the literature have proposed their use to build composite indicators. The aim of the paper is to underline some critical issues in the use of these models and to suggest the implementation of a new, adapted repeated-indicator approach. Furthermore, following some recommendations on the use of PLS-PM in longitudinal studies, we compare the situation in 2016 and 2017.
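The layered-construct idea can be illustrated outside of PLS-PM with a much cruder stand-in: standardize the indicators of each lower-order block, aggregate within blocks, then aggregate the blocks into the higher-order index. All data and weights below are hypothetical placeholders, not the paper's estimation:

```python
import numpy as np

# Hypothetical data: rows = territorial units, columns = indicators
rng = np.random.default_rng(0)
quality_of_life = rng.normal(size=(10, 4))  # traditional-style indicators
swbi = rng.normal(size=(10, 2))             # Twitter-mood indicators

def block_score(X):
    # Standardize each indicator, then aggregate with equal weights
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    return Z.mean(axis=1)

# Lower-order constructs, then the higher-order well-being index
lo_qol = block_score(quality_of_life)
lo_swb = block_score(swbi)
well_being = (lo_qol + lo_swb) / 2
```

In actual PLS-PM the block weights are estimated iteratively rather than fixed equal, which is precisely where the repeated-indicator issues discussed in the paper arise.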
[en] Progress in Big Data has grown exponentially in recent years, allowing the detection and processing of large amounts of data. Until recently this was unattainable because corporate governance reports were not mechanized. This paper investigates how corporate governance decisions affect the indebtedness policies of 1,956 industrial companies listed in Europe and the USA over the period 2016–2018 (5,868 observations). To measure corporate governance decisions, we use detailed information on the expertise of audit committees, the proportion of independent directors, board structures and women's presence on corporate boards. Our findings, based on a static panel data analysis, show a strong negative relationship between audit committee expertise and indebtedness level in European and North American companies. There is also evidence that European and American companies with a one-tier board structure and audit committee expertise are less likely to have a lower level of indebtedness. Our results shed new light on corporate governance in relation to the experience of audit committees and the influence of their characteristics on indebtedness policy.
[en] In this paper we use high-frequency multidimensional textual news data and propose an index of inflation news. We utilize the power of text mining and its ability to convert large collections of text from unstructured to structured form for in-depth quantitative analysis of online news data. The significant relationship between households' inflation expectations and news topics is documented, and the forecasting performance of news-based indices is evaluated for different horizons and model variations. Results suggest that, with an optimal number of topics, a machine learning model is able to forecast inflation expectations with greater accuracy than simple autoregressive models. Additional results from forecasting headline inflation indicate that the overall forecasting accuracy is at a good level. The findings in this paper support the view in the literature that news is a good indicator of inflation and captures inflation expectations well.
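The pipeline described here (unstructured news text to structured topic weights) can be sketched with a standard bag-of-words plus topic-model step. This is a generic illustration with a toy corpus, not the paper's model or data:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy stand-in for a scraped online-news corpus
docs = [
    "consumer prices rose as food inflation accelerated",
    "central bank holds interest rates amid inflation fears",
    "tech earnings beat expectations this quarter",
    "stock markets rally on strong earnings reports",
]

# Convert unstructured text to a structured document-term matrix
vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(docs)

# Fit a topic model; doc_topic[i, k] = weight of topic k in document i.
# These per-document topic weights are the kind of structured series
# one could aggregate into a news-based index and feed to a forecaster.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topic = lda.fit_transform(X)
```

Choosing the number of topics is the tuning step the abstract refers to as "optimal number of topics".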
[en] Nowadays, data augmentation techniques are in common use in machine learning, especially with deep neural networks, where a large amount of data is required to train the network. The effectiveness of data augmentation has been analyzed for many applications; however, it has not been analyzed separately for multimodal biometrics. This research analyzes the effects of data augmentation on single-modality and multimodal biometric data. The features from two biometric modalities, fingerprint and signature, have been fused together at the feature level; the primary motivation for fusing biometric data at the feature level is to secure the privacy of the user's biometric data. The experimental results for fingerprint recognition, signature recognition and the feature-level fusion of fingerprint with signature are presented separately. The results show that the accuracy of the training classifier can be enhanced with data augmentation techniques when the number of real data samples is insufficient. The study also explores how the effectiveness of data augmentation gradually increases with the number of templates for the fused biometric data, doubling the number of templates each time until the classifier achieves an accuracy of 99%. (author)
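The two mechanisms described, feature-level fusion and template augmentation, can be sketched as follows. The feature dimensions, the noise-based augmentation, and all numbers are illustrative assumptions, not the paper's actual extraction or augmentation method:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical extracted feature vectors for one user
fingerprint_feat = rng.normal(size=64)
signature_feat = rng.normal(size=32)

# Feature-level fusion: concatenate the two modality vectors into one
fused = np.concatenate([fingerprint_feat, signature_feat])

def augment(template, n_copies, noise_scale=0.05):
    """Generate synthetic templates by perturbing the fused template
    with small Gaussian noise (one simple augmentation scheme)."""
    noise = rng.normal(scale=noise_scale, size=(n_copies, template.size))
    return template + noise

# Doubling rounds as in the abstract: 1 -> 2 -> 4 -> 8 templates ...
templates = augment(fused, n_copies=8)
```

Fusing before classification means the raw per-modality features never need to be stored separately, which is the privacy motivation the abstract mentions.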
[en] We study the Massless Semi-Relativistic Harmonic Oscillator within the framework of quantum mechanics with a Generalized Uncertainty Principle (GUP). The latter derives from the idea of a minimal observable length, a quantity whose existence is expected to affect the energy eigenvalues and eigenfunctions of the system. These effects are worked out, to first order in the deformation parameter, using a perturbative approach based on Brau's representation of the position and momentum operators. In addition, we discuss the impact of the GUP on the known duality between the considered model and the Schrödinger equation with a linear potential. (author)
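The first-order treatment mentioned here is standard Rayleigh-Schrödinger perturbation theory; schematically, with $\beta$ the deformation parameter and $H'$ the GUP-induced correction to the Hamiltonian (generic symbols, not the paper's explicit operators):

```latex
E_n \simeq E_n^{(0)} + \beta \, \langle \psi_n^{(0)} | H' | \psi_n^{(0)} \rangle ,
```

where $E_n^{(0)}$ and $\psi_n^{(0)}$ are the undeformed eigenvalues and eigenfunctions, so the leading GUP shift is the expectation value of the correction in the unperturbed states.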
[en] In Mobile Ad hoc Networks (MANETs), nodes often change their location independently, and neither fixed nor centralized infrastructure is present. Nodes communicate with each other directly or via intermediate nodes. The advantages of the MANET layout include self-organization and support for important functions such as traffic distribution and load balancing. Whenever a host moves rapidly in the network, the topology is updated and the structure of the MANET varies accordingly. In the literature, different routing protocols have been studied and compared; still, there are open questions regarding the performance of these protocols under different scenarios, since MANETs are not based on a predesigned structure. In this paper, the Quality of Service (QoS) performance of protocols such as Ad hoc On-Demand Distance Vector (AODV), Temporally Ordered Routing Algorithm (TORA) and Zone Routing Protocol (ZRP) is assessed for various numbers of communicating nodes. The performance metrics considered in the simulations are throughput, end-to-end delay and packet delivery ratio. The NS-2.35 simulator is used to carry out the simulations, and results are compared for the AODV, TORA and ZRP routing protocols. The results show that AODV and TORA perform better than ZRP in end-to-end delay, while ZRP performs better than both other protocols in packet delivery ratio and throughput. (author)
[en] INVAP performs design activities using a calculation suite: a set of codes for the different disciplines that also includes procedures and methodologies covering the whole design process, including regulatory aspects, analyst qualification and the validation of computer codes. INVAP continuously develops and validates the calculation suite used for the design and optimization of nuclear facilities, while also improving the procedures and the training and retraining of analysts. The IAEA Coordinated Research Programs (CRPs) play an important role in the V&V process, allowing and promoting strong interaction between data providers and benchmarkers from different countries, which greatly enhances knowledge sharing and the learning process of analysts. They also enable other important activities: i) the comparison of different calculation codes and methodologies, ii) establishing procedures for the qualification of computational tools and users, iii) the transfer of know-how in the area of innovative methods in RR and iv) the minimization of user effects. This paper describes the validation of our calculation line and our experience in the qualification of analysts and other side effects, based on INVAP's participation in the IAEA CRPs. INVAP's proposed scheme for the eight evaluated benchmarks is to develop alternate calculation approaches that consider not only the diverse paths of the calculation suite but also different specialists for each path. (author)