Results 1 - 10 of 37759
[en] A simple model of a set of interacting idealized neurons on scale-free networks is introduced. The basic elements of the model are endowed with the main features of neuronal function. We find that the model displays power-law behavior of avalanche sizes and generates long-range temporal correlations. More importantly, we find different dynamical behavior for nodes with different connectivity in the scale-free networks.
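The abstract does not specify the model's update rule, so as a loose illustration of how avalanche-size statistics can be measured on a scale-free graph, here is a toy sandpile-style sketch; the graph construction, toppling thresholds, and leak rate are all assumptions for illustration, not the paper's model:

```python
import random
from collections import defaultdict

def scale_free_graph(n=300, m=2, seed=0):
    """Toy Barabasi-Albert growth: each new node attaches to m
    existing nodes chosen preferentially by degree."""
    rng = random.Random(seed)
    adj = defaultdict(set)
    targets = list(range(m))
    weighted = []                      # node ids repeated by degree
    for new in range(m, n):
        for t in set(targets):
            adj[new].add(t)
            adj[t].add(new)
            weighted += [new, t]
        targets = [rng.choice(weighted) for _ in range(m)]
    return adj

def avalanche_sizes(adj, drives=1500, leak=0.05, seed=1):
    """Sandpile-like dynamics: a node topples when its load reaches its
    degree, sending one unit per neighbour; each transmitted unit is
    lost with probability `leak` so avalanches terminate."""
    rng = random.Random(seed)
    load = defaultdict(int)
    nodes = list(adj)
    sizes = []
    for _ in range(drives):
        load[rng.choice(nodes)] += 1   # slow external drive
        size = 0
        stack = [v for v in nodes if load[v] >= len(adj[v])]
        while stack:
            v = stack.pop()
            deg = len(adj[v])
            if load[v] < deg:
                continue
            load[v] -= deg             # topple
            size += 1
            for u in adj[v]:
                if rng.random() > leak:
                    load[u] += 1
                    if load[u] >= len(adj[u]):
                        stack.append(u)
        if size:
            sizes.append(size)
    return sizes
```

A histogram of `avalanche_sizes(scale_free_graph())` on log-log axes would be the usual way to look for the power-law behavior the abstract reports.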
[en] Because of intense synaptic activity, cortical neurons are in a high-conductance state. We show that this state has important consequences for the properties of a population of independent model neurons with conductance-based synapses. Using an adiabatic-like approximation, we study both the membrane potential and the firing probability distributions across the population. We find that the latter is bimodal, in such a way that at any particular moment some neurons are inactive while others are active. The population rate and the response variability are also characterized.
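As a rough illustration of sampling membrane-potential statistics across a population of independent conductance-based model neurons, here is a minimal point-conductance sketch; the parameters and the clipped random-walk conductance process are illustrative assumptions, not the paper's model:

```python
import random

def membrane_samples(n_neurons=100, t_steps=500, dt=0.1, seed=0):
    """Each neuron obeys dV/dt = (-(V-EL) - ge*(V-Ee) - gi*(V-Ei)) / tau
    with excitatory/inhibitory conductances fluctuating around their
    means (Ornstein-Uhlenbeck-like, clipped at zero)."""
    rng = random.Random(seed)
    EL, Ee, Ei, tau = -70.0, 0.0, -80.0, 20.0   # mV, mV, mV, ms
    vs = []
    for _ in range(n_neurons):
        V, ge, gi = EL, 0.5, 1.0
        for _ in range(t_steps):
            ge = max(0.0, ge + 0.05 * (0.5 - ge) + 0.05 * rng.gauss(0, 1))
            gi = max(0.0, gi + 0.05 * (1.0 - gi) + 0.10 * rng.gauss(0, 1))
            dV = (-(V - EL) - ge * (V - Ee) - gi * (V - Ei)) / tau
            V += dV * dt
        vs.append(V)     # one membrane-potential sample per neuron
    return vs
```

Histogramming the returned samples approximates the across-population membrane potential distribution the abstract studies.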
[en] This Letter addresses the qualitative properties of equilibrium points in continuous Hopfield neural networks. We derive a sufficient condition for an equilibrium point to be locally exponentially stable. We also present an estimate of the domains of attraction of locally exponentially stable equilibrium points. Our condition and estimate are formulated in terms of the network parameters, the neurons' activation functions, and the associated equilibrium point; hence, they are easily checkable. In addition, these results depend neither on the monotonicity of the activation functions nor on coupling conditions between the neurons. Consequently, our results are of practical importance in evaluating the performance of Hopfield associative memory networks.
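A standard generic check of local exponential stability for a continuous Hopfield network dx/dt = -x + W g(x) + I is that the Jacobian at the equilibrium has eigenvalues with negative real parts. The sketch below implements that generic eigenvalue test, not the Letter's specific parameter-based condition, and assumes this common form of the dynamics:

```python
import numpy as np

def hopfield_jacobian(W, x_eq, g_prime):
    """Jacobian of dx/dt = -x + W g(x) + I at equilibrium x_eq:
    J = -I_n + W diag(g'(x_eq))."""
    n = len(x_eq)
    return -np.eye(n) + W @ np.diag(g_prime(x_eq))

def is_locally_exp_stable(W, x_eq, g_prime):
    """True if all Jacobian eigenvalues have negative real part."""
    eig = np.linalg.eigvals(hopfield_jacobian(W, x_eq, g_prime))
    return bool(np.all(eig.real < 0))

# Example: tanh activation, weak symmetric coupling, equilibrium at the origin.
W = np.array([[0.0, 0.4],
              [0.4, 0.0]])
g_prime = lambda x: 1.0 - np.tanh(x) ** 2   # derivative of tanh
stable = is_locally_exp_stable(W, np.zeros(2), g_prime)
```

For this weakly coupled example the Jacobian eigenvalues are -0.6 and -1.4, so the origin is locally exponentially stable; tripling the coupling pushes one eigenvalue positive and destroys stability.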
[en] Highlights: • We investigate the NDD phenomenon in a hybrid scale-free network. • Electrical synapses have a stronger influence on the emergence of NDD. • Electrical synapses are more efficient at suppressing the NDD. • Average degree has two opposite effects on the appearance time of the first spike. - Abstract: We study the phenomenon of noise-delayed decay (NDD) in a scale-free neural network consisting of excitable FitzHugh–Nagumo neurons. In contrast to earlier works, where only electrical synapses among neurons are considered, we primarily examine the effects of hybrid synapses on noise-delayed decay. We show that electrical synaptic coupling is more influential than chemical coupling in determining the appearance time of the first spike, and more efficient at mitigating the delay in detecting a suprathreshold input signal. We find that hybrid networks including inhibitory chemical synapses have higher signal-detection capabilities than those including excitatory ones. We also find that the average degree exhibits two different effects, strengthening or weakening the noise-delayed decay depending on the noise intensity.
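The central observable here, the appearance time of the first spike, is easy to illustrate on a single noisy FitzHugh–Nagumo neuron. The sketch below uses Euler–Maruyama integration of the standard FHN equations; the parameter values, threshold, and omission of any network coupling are simplifying assumptions, not the paper's setup:

```python
import math
import random

def first_spike_time(I=0.5, noise=0.1, a=0.7, b=0.8, eps=0.08,
                     dt=0.01, t_max=200.0, v_th=1.0, seed=0):
    """Integrate dv = (v - v^3/3 - w + I)dt + noise*dW,
    dw = eps*(v + a - b*w)dt and return the time of the first
    upward crossing of v_th, or None if no spike occurs."""
    rng = random.Random(seed)
    v, w, t = -1.2, -0.6, 0.0        # rest-like initial condition
    sqdt = math.sqrt(dt)
    while t < t_max:
        dv = (v - v ** 3 / 3.0 - w + I) * dt + noise * sqdt * rng.gauss(0, 1)
        dw = eps * (v + a - b * w) * dt
        v_new = v + dv
        if v < v_th <= v_new:        # upward threshold crossing = spike
            return t
        v, w, t = v_new, w + dw, t + dt
    return None
```

Sweeping `noise` and recording how the mean first-spike time changes is the kind of measurement behind the noise-delayed decay curves the paper reports.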
[en] The dynamics of the strongly diluted version of a model recently proposed by Herz et al. to store sequences of patterns with spatio-temporal retrieval properties is solved. The spurious sequence solutions are analyzed, and the region in the (α,T) plane where the only relevant attractors are the learnt cycles is found. (Author)
[en] Biological neural communications channels transport environmental information from sensors through chains of active dynamical neurons to neural centers for decisions and actions to achieve required functions. These kinds of communications channels are able to create information and to transfer information from one time scale to another because of the intrinsic nonlinear dynamics of the component neurons. We discuss a very simple neural information channel composed of sensory input in the form of a spike train that arrives at a model neuron, then moves through a realistic synapse to a second neuron where the information in the initial sensory signal is read. Our model neurons are four-dimensional generalizations of the Hindmarsh-Rose neuron, and we use a model of chemical synapse derived from first-order kinetics. The four-dimensional model neuron has a rich variety of dynamical behaviors, including periodic bursting, chaotic bursting, continuous spiking, and multistability. We show that, for many of these regimes, the parameters of the chemical synapse can be tuned so that information about the stimulus that is unreadable at the first neuron in the channel can be recovered by the dynamical activity of the synapse and the second neuron. Information creation by nonlinear dynamical systems that allow chaotic oscillations is familiar from their autonomous oscillations; it is associated with the instabilities that lead to positive Lyapunov exponents in their dynamical behavior. Our results indicate how nonlinear neurons acting as input/output systems along a communications channel can recover information apparently "lost" in earlier junctions on the channel. Our measure of information transmission is the average mutual information between elements, and because the channel is active and nonlinear, the average mutual information between the sensory source and the final neuron may be greater than the average mutual information at an earlier neuron in the channel.
This behavior is strikingly different from the passive role communications channels usually play, and the "data processing theorem" of conventional communications theory is violated by these neural channels. Our calculations indicate that neurons can reinforce reliable transmission along a chain even when the synapses and the neurons are not completely reliable components. This phenomenon is generic in parameter space, robust in the presence of noise, and independent of the discretization process. Our results suggest a framework in which one might understand the apparent design complexity of neural information transduction networks. If networks with many dynamical neurons can recover information not apparent at various waystations in the communications channel, such networks may be more robust to noisy signals, may be more capable of communicating many types of encoded sensory neural information, and may be the appropriate design for components, neurons and synapses, which can be individually imprecise, inaccurate "devices."
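The paper's figure of merit, average mutual information between elements of the channel, can be estimated from discretized signals (for example, binned spike counts) with a standard plug-in histogram estimator. This is a generic sketch; the choice of binning and symbol alphabet is left to the user and is not taken from the paper:

```python
import math
from collections import Counter

def average_mutual_information(xs, ys):
    """Plug-in estimate of I(X;Y) in bits from two equal-length
    sequences of discrete symbols, using empirical frequencies:
    I = sum_xy p(x,y) log2( p(x,y) / (p(x) p(y)) )."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        p = c / n
        # p(x,y)/(p(x)p(y)) = c*n / (count(x)*count(y))
        mi += p * math.log2(p * n * n / (px[x] * py[y]))
    return mi
```

With this estimator one can compare I(source; neuron 1) against I(source; neuron 2) along the chain, which is the comparison behind the "data processing theorem" violation the paper discusses. Note that plug-in estimates are biased upward for small samples.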
[en] Cell-cycle distributions were measured by flow cytometry for Chinese hamster ovary (CHO) cells cultured continuously under hypoxic conditions. DNA histograms showed an accumulation of cells in the early S phase, followed by a delayed traverse through the S phase and a G2 block. During hypoxic culturing, cell viability decreased rapidly to less than 0.1% at 120 h. Radiation responses for cells cultured under these conditions showed an extreme radioresistance at 72 h. The results suggest that hypoxia induces a condition similar to cell synchrony, which itself changes the radioresistance of hypoxic cells. (author)
[en] Radiation-induced mitotic delay in mice was investigated at each cell position along the side of the crypt (positions corresponding to different levels of maturation). From the results presented, it is shown that the duration of mitotic delay is shorter the closer the proliferative cells are to their last cell division in the proliferative hierarchy of the crypt, and longest for cells situated where the stem cells are expected. (UK)
[en] We perform an extensive numerical investigation of the retrieval dynamics of the synchronous Hopfield model, also known as the Little-Hopfield model, up to sizes of 2^18 neurons. Our results correct and extend much of the early simulations on the model. We find that the average convergence time has a power-law behavior for a wide range of system sizes, whose exponent depends both on the network loading and on the initial overlap with the memory to be retrieved. Surprisingly, we also find that the variance of the convergence time grows as fast as its average, making it a non-self-averaging quantity. Based on the simulation data we differentiate between two definitions of memory retrieval time: one that is mathematically strict, τc, the number of updates needed to reach the attractor whose properties we just described, and a second, corresponding to the time τη at which the network stabilizes within a tolerance threshold η, such that the difference between two consecutive overlaps with a stored memory is smaller than η. We show that the scaling relationships between τc and τη and typical network parameters, such as the memory load α or the network size N, vary greatly, with τη relatively insensitive to system size and loading. We propose τη as the physiologically realistic measure of the typical attractor network response.
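Both retrieval-time definitions are easy to state in code. The sketch below runs synchronous (Little) updates on a small Hebbian network and records τc (first fixed point) and τη (first step at which consecutive overlaps differ by less than η); the size, load, and cue corruption are small illustrative values, far from the paper's 2^18-neuron simulations:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 5                        # network size, stored patterns (alpha = P/N)
xi = rng.choice([-1, 1], size=(P, N))
W = (xi.T @ xi) / N                  # Hebbian couplings
np.fill_diagonal(W, 0.0)

def retrieve(pattern, flip_frac=0.2, eta=1e-3, t_max=100):
    """Synchronous updates from a corrupted cue. Returns (tau_c, tau_eta):
    steps to reach a fixed point (None if caught in a 2-cycle), and steps
    until successive overlaps with the stored pattern differ by < eta."""
    s = pattern.copy()
    flip = rng.choice(N, size=int(flip_frac * N), replace=False)
    s[flip] *= -1                    # corrupt the cue
    tau_c = tau_eta = None
    m_prev = (s @ pattern) / N
    for t in range(1, t_max + 1):
        s_new = np.where(W @ s >= 0, 1, -1)      # parallel update
        m = (s_new @ pattern) / N                # overlap with the memory
        if tau_eta is None and abs(m - m_prev) < eta:
            tau_eta = t
        if np.array_equal(s_new, s):             # strict fixed point
            tau_c = t
            break
        s, m_prev = s_new, m
    return tau_c, tau_eta

tc, te = retrieve(xi[0])
```

At this low load the corrupted cue falls into the stored pattern within a few parallel steps, and τη can never exceed τc, since a fixed point leaves the overlap unchanged.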
[en] This work deals with the synchronization of two coupled Hodgkin-Huxley (H-H) neurons, where the master neuron possesses inner noise and the slave neuron is considered both in a resting state (without inner noise) and in an excited state (with inner noise). Synchronization is achieved via feedback control, using a class of high-order sliding-mode controllers that provide chattering reduction and finite-time synchronization convergence with satisfactory performance. A theoretical analysis establishes the closed-loop stability of the proposed controller and the calculated finite convergence time. The main results are illustrated via numerical experiments.
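To illustrate the master-slave setup without reproducing the paper's method, the sketch below synchronizes two FitzHugh–Nagumo neurons (a much simpler stand-in for Hodgkin-Huxley) with plain proportional feedback on the voltage, not the paper's high-order sliding-mode controller; all parameters are illustrative assumptions:

```python
import random

def fhn_step(v, w, I, dt=0.01, a=0.7, b=0.8, eps=0.08):
    """One Euler step of the FitzHugh-Nagumo equations."""
    dv = (v - v ** 3 / 3.0 - w + I) * dt
    dw = eps * (v + a - b * w) * dt
    return v + dv, w + dw

def synchronize(k=5.0, steps=20000, dt=0.01, seed=0):
    """Master neuron with additive 'inner noise' in its drive; the slave
    receives a proportional feedback u = k*(v_master - v_slave).
    Returns the final voltage error |v_master - v_slave|."""
    rng = random.Random(seed)
    vm, wm = 0.0, 0.0          # master state
    vs, ws = -1.5, 1.0         # slave, deliberately different initial state
    for _ in range(steps):
        noise = 0.05 * rng.gauss(0, 1)
        vm, wm = fhn_step(vm, wm, 0.5 + noise, dt)
        u = k * (vm - vs)                      # feedback coupling
        vs, ws = fhn_step(vs, ws, 0.5 + u, dt)
    return abs(vm - vs)

final_error = synchronize()
```

With a sufficiently large gain the voltage error contracts despite the master's noise; a sliding-mode controller, as in the paper, would additionally guarantee convergence in finite time and reject bounded disturbances exactly.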