Advances in Information Entropy
1Shijiazhuang Tiedao University, Shijiazhuang, China
2Northeastern State University, Tahlequah, USA
3Northeastern University, Qinhuangdao, China
Description
Entropy originates in thermodynamics, where it is a state parameter of matter describing the degradation of energy in a material system. With the development of statistical physics and information theory, its essence was gradually clarified: entropy measures the degree of internal disorder of a system. The concept has important applications in many fields, including cybernetics, probability theory, the life sciences, and astrophysics. In information theory, entropy quantifies the amount of information carried by an event drawn from a probability distribution: the higher the entropy of a source, the more information it can transmit; the lower the entropy, the less. Information entropy is therefore a useful tool for quantifying the information content of random variables, and it also serves as a measure of system complexity. It is widely used in signal processing, system analysis, and related fields.
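The notion of entropy as a measure of information can be sketched with Shannon's formula, H(X) = -Σ p(x) log₂ p(x). The short Python example below is a minimal illustration (the function name and sample distributions are ours, not part of this call): a uniform distribution yields maximum entropy, while a certain outcome carries none.

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits.

    Outcomes with zero probability contribute nothing to the sum.
    """
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))

# A biased coin is more predictable, so each toss carries less information.
print(shannon_entropy([0.9, 0.1]))

# A certain outcome carries no information at all.
print(shannon_entropy([1.0]))
```

This illustrates the claim in the paragraph above: the more uniform (disordered) the distribution, the higher the entropy and the more information each observation conveys.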
The aim of this Special Issue is to bring together original research and review articles highlighting the recent advances in this field. We hope that this Special Issue provides a platform to outline the continuing efforts to understand this field.
Potential topics include but are not limited to the following:
- Information entropy analysis
- Statistical signal processing with information entropy
- Nonlinear adaptive filters with information entropy
- Complex system analysis with information entropy
- Optimal control with information entropy
- Information entropy in bioinformatics
- Machine learning with information entropy
- Computational modelling and statistical analysis of information entropy
- Image analysis using information entropy