Paper
Learning, entropy, free energy, an underlying commonality?
John E. Gray and Harold H. Szu
9 April 2007
Abstract
Statistical mechanics, due primarily to Maxwell, Gibbs, and Boltzmann in the nineteenth century, has proven to be a useful model for drawing inferences about the collective behavior of individual objects that interact according to a known force law (in more general usage, interacting units). Collective behavior is not determined by computing F = ma for each interacting unit, because that problem is mathematically intractable; instead, one computes the partition function for the collection of interacting units and predicts statistical behavior from it. Statistical mechanics was united with Bayesian inference by Jaynes [4], who, building on Shannon's entropy [7], demonstrated that the partition function assignment of probabilities via the interaction Hamiltonian is the Bayesian assignment of probabilities obtained from the maximum entropy method with known means and standard deviations. Once this technique has been applied to a variety of problems, one can, of course, solve the inverse problem of determining what interaction model gives rise to a given probability assignment [1], [8]. Statistical mechanics thus allows one to draw general inferences about any complex system, including networks [5], by defining "energy", "heat capacity", "temperature", and other thermodynamic characteristics of the system, based on the common standard of the Helmholtz free energy. Principe has noted that the entropy used in reasoning with uncertainty may not be the most appropriate entropy for learning mechanisms [6]; instead, he has explored Renyi entropy and derived a form of information learning dynamics that has some promising features [2]. To fully realize the potential of a more generalized entropy for the three aspects of survival (inference, learning, and sensing), we suggest some connections between the free energy and learning. We also connect some aspects of sensing to probability distributions, which suggests why certain search strategies perform better than others. In making these connections, we suggest that a fundamental connection waits to be discovered between inference, learning, and the manner in which sensing mechanisms perform.
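For concreteness, the maximum-entropy construction mentioned in the abstract can be sketched as follows; this is a standard textbook derivation in our own notation, not a set of equations reproduced from the paper. Maximizing the Shannon entropy $S = -\sum_i p_i \ln p_i$ subject to normalization and a known mean energy $\sum_i p_i E_i = \langle E \rangle$ yields the Gibbs distribution

$$ p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i}, $$

where $\beta$ is the Lagrange multiplier conjugate to the energy constraint, and the Helmholtz free energy that serves as the common standard is

$$ F = -\frac{1}{\beta} \ln Z = \langle E \rangle - \frac{1}{\beta} S. $$

Conversely, the inverse problem noted above reads off the effective interaction energies from a given probability assignment, $E_i = -\beta^{-1} \ln p_i + \mathrm{const}$. The generalization explored by Principe replaces the Shannon entropy with the Renyi entropy of order $\alpha$,

$$ H_\alpha = \frac{1}{1-\alpha} \ln \sum_i p_i^{\alpha}, $$

which recovers the Shannon form in the limit $\alpha \to 1$.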
© (2007) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
John E. Gray and Harold H. Szu "Learning, entropy, free energy, an underlying commonality?", Proc. SPIE 6576, Independent Component Analyses, Wavelets, Unsupervised Nano-Biomimetic Sensors, and Neural Networks V, 657606 (9 April 2007); https://doi.org/10.1117/12.725208
KEYWORDS: Mechanics, Complex systems, Inverse problems, Bayesian inference, Independent component analysis, Neural networks, Sensors