Results 1 - 10 of 36338
[en] A simple model for a set of interacting idealized neurons in scale-free networks is introduced. The basic elements of the model are endowed with the main features of neuronal function. We find that our model displays power-law behavior of avalanche sizes and generates long-range temporal correlations. More importantly, we find different dynamical behavior for nodes with different connectivity in the scale-free networks.
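The abstract above reports power-law avalanche sizes on a scale-free network. A minimal sketch of how such a measurement can be set up (not the authors' actual model; the preferential-attachment construction, sandpile-style threshold rule, and all parameters are illustrative assumptions):

```python
import random

random.seed(0)

def build_pa_network(n, m=2):
    """Barabasi-Albert-style graph: each new node links to m existing
    nodes sampled roughly proportionally to their degree."""
    targets = list(range(m))
    repeated = []                      # each node repeated once per edge end
    adj = {i: set() for i in range(n)}
    for new in range(m, n):
        for t in set(targets):
            adj[new].add(t)
            adj[t].add(new)
        repeated.extend(targets)
        repeated.extend([new] * m)
        targets = random.sample(repeated, m)
    return adj

def run_avalanches(adj, drives=2000):
    """Drive random nodes with unit load; a node 'topples' when its load
    reaches its degree, sending one unit to each neighbour (with a small
    dissipation probability so avalanches terminate)."""
    load = {v: 0 for v in adj}
    sizes = []
    nodes = list(adj)
    for _ in range(drives):
        load[random.choice(nodes)] += 1
        size, stack = 0, list(nodes)
        while stack:
            u = stack.pop()
            if load[u] >= len(adj[u]):
                load[u] -= len(adj[u])
                size += 1
                for w in adj[u]:
                    if random.random() > 0.01:   # 1% dissipation per transfer
                        load[w] += 1
                        stack.append(w)
        sizes.append(size)                        # avalanche size = # topplings
    return sizes

adj = build_pa_network(200)
sizes = run_avalanches(adj)
```

A histogram of `sizes` (on log-log axes) is what one would inspect for power-law behavior; the heavy-tailed degree distribution is what distinguishes the hub nodes' dynamics from that of low-degree nodes.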
[en] Because of intense synaptic activity, cortical neurons are in a high-conductance state. We show that this state has important consequences for the properties of a population of independent model neurons with conductance-based synapses. Using an adiabatic-like approximation, we study both the membrane potential and the firing probability distributions across the population. We find that the latter is bimodal, in such a way that at any particular moment some neurons are inactive while others are active. The population rate and the response variability are also characterized.
[en] This Letter addresses the qualitative properties of equilibrium points in continuous Hopfield neural networks. We derive a sufficient condition for an equilibrium point to be locally exponentially stable. We also present an estimate of the domains of attraction of locally exponentially stable equilibrium points. Our condition and estimate are formulated in terms of the network parameters, the neurons' activation functions, and the associated equilibrium point. Hence, they are easily checkable. In addition, these results depend neither on the monotonicity of the activation functions nor on coupling conditions between the neurons. Consequently, our results are of practical importance in the evaluation of the performance of Hopfield associative memory networks.
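The kind of check the abstract describes can be illustrated concretely. A hedged sketch (not the Letter's actual criterion): for a two-neuron continuous Hopfield network dx_i/dt = -x_i + sum_j w_ij tanh(x_j) + b_i, locate an equilibrium numerically and test local exponential stability via the trace/determinant of the Jacobian. The weights and biases are illustrative assumptions:

```python
import math

# Illustrative 2-neuron network parameters (an assumption, not from the paper).
W = [[0.2, -0.5],
     [0.4,  0.1]]
b = [0.3, -0.2]

def f(x):
    """Right-hand side of dx/dt = -x + W*tanh(x) + b."""
    return [-x[i] + sum(W[i][j]*math.tanh(x[j]) for j in range(2)) + b[i]
            for i in range(2)]

# Find an equilibrium x* (f(x*) = 0) by damped fixed-point iteration.
x = [0.0, 0.0]
for _ in range(500):
    fx = f(x)
    x = [x[i] + 0.5*fx[i] for i in range(2)]

# Jacobian at x*: J = -I + W * diag(sech^2(x_j*)),
# since d/dx tanh(x) = sech^2(x).
s = [1.0/math.cosh(x[j])**2 for j in range(2)]
J = [[(-1.0 if i == j else 0.0) + W[i][j]*s[j] for j in range(2)]
     for i in range(2)]

# For a 2x2 system, both eigenvalues lie in the open left half-plane
# (local exponential stability) iff trace < 0 and determinant > 0.
tr = J[0][0] + J[1][1]
det = J[0][0]*J[1][1] - J[0][1]*J[1][0]
stable = tr < 0 and det > 0
```

Note that this spectral test uses only the network parameters and the activation function's derivative at the equilibrium, mirroring the "easily checkable" character the abstract emphasizes.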
[en] Highlights: • We investigate the NDD phenomenon in a hybrid scale-free network. • Electrical synapses have the stronger influence on the emergence of NDD. • Electrical synapses are more efficient in suppressing the NDD. • Average degree has two opposite effects on the appearance time of the first spike. - Abstract: We study the phenomenon of noise-delayed decay (NDD) in a scale-free neural network consisting of excitable FitzHugh–Nagumo neurons. In contrast to earlier works, which considered only electrical synapses among neurons, here we primarily examine the effects of hybrid synapses on the noise-delayed decay. We show that electrical synaptic coupling is more influential than chemical coupling in determining the appearance time of the first spike, and more efficient in mitigating the delay in the detection of a suprathreshold input signal. We find that hybrid networks including inhibitory chemical synapses have higher signal detection capabilities than those including excitatory ones. We also find that the average degree exhibits two opposite effects, strengthening or weakening the noise-delayed decay depending on the noise intensity.
[en] The dynamics of the strongly diluted version of a model recently proposed by Herz et al. to store sequences of patterns with spatio-temporal retrieval properties is solved. The spurious sequence solutions are analyzed, and the region in the (α,T) plane where the only relevant attractors are the learnt cycles is found. (Author)
[en] Biological neural communications channels transport environmental information from sensors through chains of active dynamical neurons to neural centers for decisions and actions to achieve required functions. These kinds of communications channels are able to create information and to transfer information from one time scale to the other because of the intrinsic nonlinear dynamics of the component neurons. We discuss a very simple neural information channel composed of sensory input in the form of a spike train that arrives at a model neuron, then moves through a realistic synapse to a second neuron where the information in the initial sensory signal is read. Our model neurons are four-dimensional generalizations of the Hindmarsh-Rose neuron, and we use a model of chemical synapse derived from first-order kinetics. The four-dimensional model neuron has a rich variety of dynamical behaviors, including periodic bursting, chaotic bursting, continuous spiking, and multistability. We show that, for many of these regimes, the parameters of the chemical synapse can be tuned so that information about the stimulus that is unreadable at the first neuron in the channel can be recovered by the dynamical activity of the synapse and the second neuron. Information creation by nonlinear dynamical systems that allow chaotic oscillations is familiar in their autonomous oscillations. It is associated with the instabilities that lead to positive Lyapunov exponents in their dynamical behavior. Our results indicate how nonlinear neurons acting as input/output systems along a communications channel can recover information apparently ''lost'' in earlier junctions on the channel. Our measure of information transmission is the average mutual information between elements, and because the channel is active and nonlinear, the average mutual information between the sensory source and the final neuron may be greater than the average mutual information at an earlier neuron in the channel. 
This behavior is strikingly different from the passive role communications channels usually play, and the ''data processing theorem'' of conventional communications theory is violated by these neural channels. Our calculations indicate that neurons can reinforce reliable transmission along a chain even when the synapses and the neurons are not completely reliable components. This phenomenon is generic in parameter space, robust in the presence of noise, and independent of the discretization process. Our results suggest a framework in which one might understand the apparent design complexity of neural information transduction networks. If networks with many dynamical neurons can recover information not apparent at various waystations in the communications channel, such networks may be more robust to noisy signals, may be more capable of communicating many types of encoded sensory neural information, and may be the appropriate design for components, neurons and synapses, which can be individually imprecise, inaccurate ''devices.''
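The information measure named above, average mutual information, can be estimated directly from joint symbol frequencies of two discretized signals. A minimal sketch of such an estimator (the synthetic binary signals are an assumption, used only to exercise it):

```python
from collections import Counter
import math
import random

def mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits:
    I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) )."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))          # joint symbol counts
    px, py = Counter(xs), Counter(ys)   # marginal counts
    return sum((c/n) * math.log2((c/n) / ((px[a]/n) * (py[b]/n)))
               for (a, b), c in pxy.items())

random.seed(1)
src = [random.randint(0, 1) for _ in range(5000)]      # binary source symbols
out_copy = src[:]                                      # perfectly reliable channel
out_indep = [random.randint(0, 1) for _ in range(5000)]  # independent output

i_copy = mutual_information(src, out_copy)    # near H(X), about 1 bit
i_indep = mutual_information(src, out_indep)  # near 0 bits
```

In the active channels the abstract describes, one would compare such estimates between the source and successive neurons; the claimed violation of the data-processing inequality is precisely the case where the downstream estimate exceeds the upstream one.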
[en] The effect of lymphocytes from normal mice on the growth of a syngeneic radiation-induced, T-cell-derived lymphoma was investigated. Thymus and spleen cells enhanced the growth of admixed lymphoma cells in a reproducible manner. Growth enhancement was manifested by the earlier appearance and higher final incidence of tumours. Lymphocytes also enhanced the growth of radiation-damaged lymphoma cells. The enhancing activity of spleen cells was predominantly a property of T cells, since it was abolished by treatment with anti-theta serum plus complement and significantly less in spleen cells of nude mice. Tumour-enhancing thymocytes seem to belong to the immature thymic subpopulation, as indicated by their binding to peanut agglutinin. (author)
[en] The dynamics of class II neurons, whose firing frequencies are strongly regulated by inherent neuronal properties, have been extensively studied since the formulation of the Hodgkin-Huxley model in 1952. However, how class II neurons process stimulus information, and what kind of external information and internal structure the firing patterns of neurons represent, are only vaguely understood, in contrast to firing-rate coding by class I neurons. Here we show that the FitzHugh-Nagumo class II neuron simultaneously filters inputs based on the input frequency and represents the signal strength by interspike intervals. In this sense, the class II neuron works as an AM processor that passes on information about both the carrier and the temporal waveform of signals.
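The interspike-interval readout mentioned above can be sketched with a single FitzHugh-Nagumo neuron. This is a hedged illustration, not the paper's experiment: the parameters, the constant drive current, and the Euler integration are illustrative assumptions.

```python
def simulate_fhn(I, a=0.7, b=0.8, eps=0.08, dt=0.05, steps=8000):
    """Euler-integrate the FitzHugh-Nagumo equations
        dv/dt = v - v^3/3 - w + I
        dw/dt = eps * (v + a - b*w)
    and record spike times as upward crossings of v = 1.0."""
    v, w = -1.0, -0.5
    prev_v = v
    spike_times = []
    for k in range(steps):
        dv = v - v**3/3 - w + I
        dw = eps*(v + a - b*w)
        v, w = v + dt*dv, w + dt*dw
        if prev_v < 1.0 <= v:            # upward threshold crossing
            spike_times.append(k*dt)
        prev_v = v
    isis = [t2 - t1 for t1, t2 in zip(spike_times, spike_times[1:])]
    return spike_times, isis

spikes_on, isis_on = simulate_fhn(I=0.5)   # suprathreshold drive: tonic spiking
spikes_off, _ = simulate_fhn(I=0.0)        # subthreshold drive: quiescent
```

Sweeping `I` and plotting the resulting `isis` against it is the kind of measurement that would show interspike intervals encoding signal strength.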
[en] The collective behavior of an array of coupled Hodgkin-Huxley neurons, which are subject to a subthreshold signal and external noise, is investigated by numerical methods. It is found that the network size, i.e. the number of Hodgkin-Huxley neurons in the network, has an optimal value at which the collective behavior shows the best performance. The value of the optimal network size goes up when the coupling strength increases. Such a nontrivial dependence on the network size is not found if we only consider the response of an individual neuron in the network.
[en] Gap junctions are effective electrical couplings between neurons and form a very important mode of communication between them. Since they can be considered as points on the neuron's membrane at which, for example, the dendrites of different cells merge into one piece, in three dimensions they can be modelled by reproducing this property in the constructed geometry. They can thus easily be made part of an already existing three-dimensional model for signal propagation on the neuron's membrane, provided the geometries are chosen so as to respect the blending of the membranes. A small network of two cells whose dendrites blend was created, and a simulation of the three-dimensional model was carried out, which reveals the fast transmission of the signal from one cell to the other.
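The electrical coupling underlying the result above can be illustrated with a much-reduced model. A hedged point-neuron sketch (an assumption; the paper itself uses a full three-dimensional membrane geometry): two passive leaky compartments coupled by a gap-junction conductance g_gap, where a current step into cell 1 depolarizes cell 2 through the electrical coupling alone.

```python
def simulate_pair(g_gap=0.5, g_leak=0.1, I1=1.0, dt=0.01, steps=5000):
    """Euler-integrate two leaky compartments (potentials relative to rest):
        dv1/dt = -g_leak*v1 - g_gap*(v1 - v2) + I1
        dv2/dt = -g_leak*v2 + g_gap*(v1 - v2)
    Cell 2 receives no injected current; it is driven only via the junction."""
    v1 = v2 = 0.0
    trace2 = []
    for _ in range(steps):
        i_gap = g_gap*(v1 - v2)          # junctional current, cell 1 -> cell 2
        v1 += dt*(-g_leak*v1 - i_gap + I1)
        v2 += dt*(-g_leak*v2 + i_gap)
        trace2.append(v2)
    return v1, v2, trace2

v1, v2, trace2 = simulate_pair()
```

Because the junctional current flows as soon as a voltage difference exists, cell 2 begins to depolarize essentially immediately, which is the fast, sign-preserving transmission that distinguishes gap junctions from chemical synapses.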