Examples of "tsallis_entropy"
Tsallis entropy for an exponential family can be written as
For the multivariate normal distribution, the term "k" is zero, and therefore the Tsallis entropy is available in closed form.
The Tsallis entropy has been used together with the principle of maximum entropy to derive the Tsallis distribution.
The q-Gaussian is a probability distribution arising from the maximization of the Tsallis entropy under appropriate constraints. It is one example of a Tsallis distribution. The q-Gaussian is a generalization of the Gaussian in the same way that Tsallis entropy is a generalization of standard Boltzmann–Gibbs entropy or Shannon entropy. The normal distribution is recovered as "q" → 1.
Among the various available theoretical results that clarify the physical conditions under which Tsallis entropy and its associated statistics apply, the following can be selected:
Given a discrete set of probabilities formula_1 satisfying the condition formula_2, with formula_3 any real number, the Tsallis entropy is defined as
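As an illustrative sketch of the discrete definition, the code below uses the conventional form S_q = k/(q − 1) · (1 − Σ p_i^q) with the constant k set to 1; the function name and test values are ours, not from the source. In the limit q → 1 this recovers the Boltzmann–Gibbs/Shannon entropy −Σ p_i ln p_i.

```python
import numpy as np

def tsallis_entropy(p, q, k=1.0):
    """Tsallis entropy S_q = k/(q-1) * (1 - sum_i p_i**q).

    In the limit q -> 1 this reduces to the Boltzmann-Gibbs/Shannon
    entropy -k * sum_i p_i * log(p_i).
    """
    p = np.asarray(p, dtype=float)
    if np.isclose(q, 1.0):
        # q = 1 limit: Shannon entropy, with the convention 0*log(0) = 0
        nz = p[p > 0]
        return -k * np.sum(nz * np.log(nz))
    return k / (q - 1.0) * (1.0 - np.sum(p ** q))

p = [0.5, 0.25, 0.25]
print(tsallis_entropy(p, q=2.0))    # 1 - (0.25 + 0.0625 + 0.0625) = 0.625
print(tsallis_entropy(p, q=1.001))  # close to the Shannon entropy ~ 1.04
```

Evaluating the entropy at a value of q slightly above 1 and comparing with the Shannon entropy makes the q → 1 limit concrete.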
Conversely, a quantum measurement cannot be claimed to reveal the properties of a system that existed before the measurement was made. This controversy has encouraged some authors to introduce the non-additivity of Tsallis entropy (a generalization of the standard Boltzmann–Gibbs entropy) as the main reason for recovering a true quantal information measure in the quantum context, claiming that non-local correlations ought to be described by the particular non-additive form of the Tsallis entropy.
Entropy is considered to be an extensive property, i.e., its value depends on the amount of material present. Constantino Tsallis has proposed a nonextensive entropy (Tsallis entropy), which is a generalization of the traditional Boltzmann–Gibbs entropy.
In a similar procedure to how the normal distribution can be derived using the standard Boltzmann–Gibbs entropy or Shannon entropy, the q-Gaussian can be derived from a maximization of the Tsallis entropy subject to the appropriate constraints.
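The maximization just described yields a density proportional to the q-exponential of −βx², where the q-exponential is e_q(x) = [1 + (1 − q)x]₊^{1/(1−q)}. The following sketch (the function names and the choice β = 1 are our illustrative assumptions, not from the source) shows numerically that this density approaches the ordinary Gaussian exp(−βx²) as q → 1:

```python
import numpy as np

def q_exponential(x, q):
    """q-exponential e_q(x) = [1 + (1-q) x]_+ ** (1/(1-q)); e_1(x) = exp(x)."""
    if np.isclose(q, 1.0):
        return np.exp(x)
    base = np.maximum(1.0 + (1.0 - q) * np.asarray(x, float), 0.0)
    return base ** (1.0 / (1.0 - q))

def q_gaussian_unnormalized(x, q, beta=1.0):
    """Unnormalized q-Gaussian density e_q(-beta * x**2)."""
    return q_exponential(-beta * np.asarray(x, float) ** 2, q)

x = np.linspace(-3.0, 3.0, 61)
# As q -> 1, the q-Gaussian converges pointwise to exp(-beta * x**2).
print(np.max(np.abs(q_gaussian_unnormalized(x, 1.001) - np.exp(-x ** 2))))
```

For q slightly above 1 the maximum deviation from the Gaussian over this grid is already small, illustrating the q → 1 limit stated above.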
In statistics, a Tsallis distribution is a probability distribution derived from the maximization of the Tsallis entropy under appropriate constraints. There are several different families of Tsallis distributions, yet different sources may reference an individual family as "the Tsallis distribution". The q-Gaussian is a generalization of the Gaussian in the same way that Tsallis entropy is a generalization of standard Boltzmann–Gibbs entropy or Shannon entropy. Similarly, if the domain of the variable is constrained to be positive in the maximum entropy procedure, the q-exponential distribution is derived.
In physics, the Tsallis entropy is a generalization of the standard Boltzmann–Gibbs entropy. It was introduced in 1988 by Constantino Tsallis as a basis for generalizing the standard statistical mechanics, and is identical in form to Havrda–Charvát structural α-entropy within Information Theory. In the scientific literature, the physical relevance of the Tsallis entropy has been debated. However, from the years 2000 on, an increasingly wide spectrum of natural, artificial and social complex systems have been identified which confirm the predictions and consequences that are derived from this nonadditive entropy, such as nonextensive statistical mechanics, which generalizes the Boltzmann–Gibbs theory.
In a similar procedure to how the exponential distribution can be derived using the standard Boltzmann–Gibbs entropy or Shannon entropy and constraining the domain of the variable to be positive, the q-exponential distribution can be derived from a maximization of the Tsallis entropy subject to the appropriate constraints.
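By analogy with the q-Gaussian case, the resulting density on the positive half-line is proportional to the q-exponential of −λx. A minimal sketch (the function name and the rate λ = 1 are our illustrative assumptions) checks that it reduces to the ordinary exponential density exp(−λx) as q → 1:

```python
import numpy as np

def q_exp_pdf(x, q, lam=1.0):
    """Unnormalized q-exponential density e_q(-lam * x) for x >= 0.

    For q -> 1 this reduces to the ordinary exponential density
    exp(-lam * x), the Shannon maximum-entropy density on x >= 0
    with a fixed mean.
    """
    x = np.asarray(x, dtype=float)
    if np.isclose(q, 1.0):
        return np.exp(-lam * x)
    base = np.maximum(1.0 + (1.0 - q) * (-lam * x), 0.0)
    return base ** (1.0 / (1.0 - q))

x = np.linspace(0.0, 5.0, 51)
# As q -> 1, the q-exponential density converges to exp(-lam * x).
print(np.max(np.abs(q_exp_pdf(x, 1.0001) - np.exp(-x))))
```

For q = 1.5, λ = 1 and x = 1, for instance, the formula gives [1 + 0.5]^(−2) ≈ 0.444, a heavier tail than the ordinary exponential.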
Just as the normal distribution is the maximum information entropy distribution for fixed values of the first moment formula_19 and second moment formula_20 (with the fixed zeroth moment formula_21 corresponding to the normalization condition), the q-Gaussian distribution is the maximum Tsallis entropy distribution for fixed values of these three moments.
Tsallis is credited with introducing the notion of what is known as Tsallis entropy and Tsallis statistics in his 1988 paper "Possible generalization of Boltzmann–Gibbs statistics", published in the "Journal of Statistical Physics". The generalization is considered to be a good candidate for formulating a theory of non-extensive thermodynamics. The resulting theory is not intended to replace Boltzmann–Gibbs statistics, but rather to supplement it, such as in the case of anomalous systems characterised by non-ergodicity or metastable states.
A number of interesting physical systems obey entropic functionals that are more general than the standard Tsallis entropy. Several physically meaningful generalisations have therefore been introduced. The two most general of these are notably: Superstatistics, introduced by C. Beck and E. G. D. Cohen in 2003, and Spectral Statistics, introduced by G. A. Tsekouras and Constantino Tsallis in 2005. Both of these entropic forms have Tsallis and Boltzmann–Gibbs statistics as special cases; Spectral Statistics has been proven to contain at least Superstatistics, and it has been conjectured to also cover some additional cases.
Behavioural finance attempts to explain price anomalies in terms of the biased behaviour of individuals, and is mostly concerned with the agents themselves and, to a lesser degree, with the aggregation of agent behaviour. Statistical finance is concerned with emergent properties arising from systems with many interacting agents, and as such attempts to explain price anomalies in terms of collective behaviour. Emergent properties are largely independent of the uniqueness of individual agents because they depend on the nature of the interactions between the agents rather than on the agents themselves. This approach has drawn strongly on ideas arising from complex systems, phase transitions, criticality, self-organized criticality, non-extensivity (see Tsallis entropy), q-Gaussian models, and agent-based models (see agent-based model), as these are known to be able to recover some of the phenomenology of financial market data, the stylized facts, in particular the long-range memory and scaling due to long-range interactions.