The Maximum Entropy Method

Author: Nailong Wu
Publisher: Springer Science & Business Media
ISBN: 9783642606298
Release Date: 2012-12-06
Genre: Science

Forty years ago, in 1957, the Principle of Maximum Entropy was first introduced by Jaynes into the field of statistical mechanics. Since that seminal publication, this principle has been adopted in many areas of science and technology beyond its initial application. It is now found in spectral analysis, image restoration and a number of branches of mathematics and physics, and has become better known as the Maximum Entropy Method (MEM). Today MEM is a powerful means to deal with ill-posed problems, and much research work is devoted to it. My own research in the area of MEM started in 1980, when I was a graduate student in the Department of Electrical Engineering at the University of Sydney, Australia. This research work was the basis of my Ph.D. thesis, The Maximum Entropy Method and Its Application in Radio Astronomy, completed in 1985. As well as continuing my research in MEM after graduation, I taught a course of the same name at the Graduate School, Chinese Academy of Sciences, Beijing from 1987 to 1990. Delivering the course was the impetus for developing a structured approach to the understanding of MEM and writing hundreds of pages of lecture notes.

Maximum Entropy and Bayesian Methods in Science and Engineering

Author: G. Erickson
Publisher: Springer Science & Business Media
ISBN: 9789401090544
Release Date: 2013-03-13
Genre: Mathematics

This volume has its origin in the Fifth, Sixth and Seventh Workshops on "Maximum-Entropy and Bayesian Methods in Applied Statistics", held at the University of Wyoming, August 5-8, 1985, and at Seattle University, August 5-8, 1986, and August 4-7, 1987. It was anticipated that the proceedings of these workshops would be combined, so most of the papers were not collected until after the seventh workshop. Because most of the papers in this volume are in the nature of advancing theory or solving specific problems, as opposed to status reports, it is believed that the contents of this volume will be of lasting interest to the Bayesian community. The workshop was organized to bring together researchers from different fields to critically examine maximum-entropy and Bayesian methods in science and engineering as well as other disciplines. Some of the papers were chosen specifically to kindle interest in new areas that may offer new tools or insight to the reader or to stimulate work on pressing problems that appear to be ideally suited to the maximum-entropy or Bayesian method. These workshops and their proceedings could not have been brought to their final form without the support or help of a number of people.

Entropy Measures, Maximum Entropy Principle and Emerging Applications

Author: Karmeshu
Publisher: Springer
ISBN: 9783540362128
Release Date: 2012-10-01
Genre: Mathematics

The last two decades have witnessed an enormous growth with regard to applications of the information theoretic framework in areas of the physical, biological, engineering and even social sciences. In particular, growth has been spectacular in the fields of information technology, soft computing, nonlinear systems and molecular biology. Claude Shannon in 1948 laid the foundation of the field of information theory in the context of communication theory. It is indeed remarkable that his framework is as relevant today as it was when he proposed it. Shannon died on Feb 24, 2001. Arun Netravali observes: "As if assuming that inexpensive, high-speed processing would come to pass, Shannon figured out the upper limits on communication rates. First in telephone channels, then in optical communications, and now in wireless, Shannon has had the utmost value in defining the engineering limits we face." Shannon introduced the concept of entropy. The notable feature of the entropy framework is that it enables quantification of the uncertainty present in a system. In many realistic situations one is confronted with only partial or incomplete information in the form of moments, or bounds on these values, etc.; it is then required to construct a probabilistic model from this partial information. In such situations, the principle of maximum entropy provides a rational basis for constructing a probabilistic model. It is thus necessary and important to keep track of advances in the applications of the maximum entropy principle to ever expanding areas of knowledge.
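The construction described above can be sketched numerically. The following is a minimal illustration (the support {0,...,5}, target mean, and all variable names are our own choices, not taken from the book): the maximum-entropy distribution on a finite support given only its mean has the Gibbs form p_i ∝ exp(-λ·i), and the Lagrange multiplier λ is found here by bisection on the mean constraint.

```python
import math

# Hypothetical example: maximum-entropy distribution on {0,...,5}
# given only its mean (partial information). The solution is the
# Gibbs form p_i proportional to exp(-lam * i).
xs = list(range(6))
target_mean = 1.5

def mean_given(lam):
    # Mean of the Gibbs distribution for a trial multiplier lam.
    w = [math.exp(-lam * x) for x in xs]
    z = sum(w)
    return sum(x * wi for x, wi in zip(xs, w)) / z

# mean_given is decreasing in lam, so bisection brackets the root.
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = (lo + hi) / 2
    if mean_given(mid) > target_mean:
        lo = mid
    else:
        hi = mid

lam = (lo + hi) / 2
w = [math.exp(-lam * x) for x in xs]
z = sum(w)
p = [wi / z for wi in w]  # maximum-entropy probabilities
```

The resulting p sums to one and matches the specified mean exactly (to numerical precision); adding further moment constraints would add one multiplier per constraint in the same way.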

Engineering Applications of Neural Networks

Author: Lazaros Iliadis
Publisher: Springer
ISBN: 9783642410161
Release Date: 2013-09-25
Genre: Computers

The two-volume set, CCIS 383 and 384, constitutes the refereed proceedings of the 14th International Conference on Engineering Applications of Neural Networks, EANN 2013, held in Halkidiki, Greece, in September 2013. The 91 revised full papers presented were carefully reviewed and selected from numerous submissions. The papers describe the applications of artificial neural networks and other soft computing approaches to various fields such as pattern recognition-predictors, soft computing applications, medical applications of AI, fuzzy inference, evolutionary algorithms, classification, learning and data mining, control techniques-aspects of AI evolution, image and video analysis, classification, pattern recognition, social media and community based governance, medical applications of AI-bioinformatics and learning.

Maximum Entropy and Bayesian Methods in Inverse Problems

Author: C.R. Smith
Publisher: Springer Science & Business Media
ISBN: 9789401722216
Release Date: 2013-04-17
Genre: Mathematics

This volume contains the text of the twenty-five papers presented at two workshops entitled Maximum-Entropy and Bayesian Methods in Applied Statistics, which were held at the University of Wyoming from June 8 to 10, 1981, and from August 9 to 11, 1982. The workshops were organized to bring together researchers from different fields to critically examine maximum-entropy and Bayesian methods in science, engineering, medicine, oceanography, economics, and other disciplines. An effort was made to maintain an informal environment where ideas could be easily exchanged. That the workshops were at least partially successful is borne out by the fact that there have been two succeeding workshops, and the upcoming Fifth Workshop promises to be the largest of all. These workshops and their proceedings could not have been brought to their final form without the substantial help of a number of people. The support of David Hofmann, the past chairman, and Glen Rebka, Jr., the present chairman of the Physics Department of the University of Wyoming, has been strong and essential. Glen has taken a special interest in seeing that the proceedings have received the support required for their completion. The financial support of the Office of University Research Funds, University of Wyoming, is gratefully acknowledged. The secretarial staff, in particular Evelyn Haskell, Janice Gasaway, and Marce Mitchum, of the University of Wyoming Physics Department has contributed a great number of hours in helping C. Ray Smith organize and direct the workshops.

Maximum Entropy and Bayesian Methods

Author: John Skilling
Publisher: Springer Science & Business Media
ISBN: 9789400901070
Release Date: 2012-12-06
Genre: Mathematics

This volume records papers given at the fourteenth international maximum entropy conference, held at St John's College Cambridge, England. It seems hard to believe that just thirteen years have passed since the first in the series, held at the University of Wyoming in 1981, and six years have passed since the meeting last took place here in Cambridge. So much has happened. There are two major themes at these meetings, inference and physics. The inference work uses the confluence of Bayesian and maximum entropy ideas to develop and explore a wide range of scientific applications, mostly concerning data analysis in one form or another. The physics work uses maximum entropy ideas to explore the thermodynamic world of macroscopic phenomena. Of the two, physics has the deeper historical roots, and much of the inspiration behind the inference work derives from physics. Yet it is no accident that most of the papers at these meetings are on the inference side. To develop new physics, one must use one's brains alone. To develop inference, computers are used as well, so that the stunning advances in computational power render the field open to rapid advance. Indeed, we have seen a revolution. In the larger world of statistics beyond the maximum entropy movement as such, there is now an explosion of work in Bayesian methods, as the inherent superiority of a defensible and consistent logical structure becomes increasingly apparent in practice.

Information and Self-Organization

Author: Hermann Haken
Publisher: Springer Science & Business Media
ISBN: 9783662078938
Release Date: 2013-11-11
Genre: Science

Complex systems are ubiquitous, and practically all branches of science ranging from physics through chemistry and biology to economics and sociology have to deal with them. In this book we wish to present concepts and methods for dealing with complex systems from a unifying point of view. Therefore it may be of interest to graduate students, professors and research workers who are concerned with theoretical work in the above-mentioned fields. The basic idea for our unified approach stems from that of synergetics. In order to find unifying principles we shall focus our attention on those situations where a complex system changes its macroscopic behavior qualitatively, or in other words, where it changes its macroscopic spatial, temporal or functional structure. Until now, the theory of synergetics has usually begun with a microscopic or mesoscopic description of a complex system. In this book we present an approach which starts out from macroscopic data. In particular we shall treat systems that acquire their new structure without specific interference from the outside; i.e. systems which are self-organizing. The vehicle we shall use is information. Since this word has several quite different meanings, all of which are important for our purpose, we shall discuss its various aspects. These range from Shannon information, from which all semantics has been exorcised, to the effects of information on receivers and the self-creation of meaning.

Maximum Entropy and Bayesian Methods

Author: C.R. Smith
Publisher: Springer Science & Business Media
ISBN: 9789401722193
Release Date: 2013-06-29
Genre: Computers

Bayesian probability theory and maximum entropy methods are at the core of a new view of scientific inference. These 'new' ideas, along with the revolution in computational methods afforded by modern computers, allow astronomers, electrical engineers, image processors of any type, NMR chemists and physicists, and anyone at all who has to deal with incomplete and noisy data, to take advantage of methods that, in the past, have been applied only in some areas of theoretical physics. This volume records the Proceedings of the Eleventh Annual 'Maximum Entropy' Workshop, held at Seattle University in June, 1991. These workshops have been the focus of a group of researchers from many different fields, and this diversity is evident in this volume. There are tutorial papers, theoretical papers, and applications in a very wide variety of fields. Almost any instance of dealing with incomplete and noisy data can be usefully treated by these methods, and many areas of theoretical research are being enhanced by the thoughtful application of Bayes' theorem. The contributions contained in this volume present a state-of-the-art review that will be influential and useful for many years to come.

Self-Organizing Maps

Author: Teuvo Kohonen
Publisher: Springer Science & Business Media
ISBN: 9783642976100
Release Date: 2012-12-06
Genre: Science

The book we have at hand is the fourth monograph I wrote for Springer-Verlag. The previous one, named "Self-Organization and Associative Memory" (Springer Series in Information Sciences, Volume 8), came out in 1984. Since then the self-organizing neural-network algorithms called SOM and LVQ have become very popular, as can be seen from the many works reviewed in Chap. 9. The new results obtained in the past ten years or so have warranted a new monograph. Over these years I have also answered lots of questions; they have influenced the contents of the present book. I hope it would be of some interest and help to the readers if I now first very briefly describe the various phases that led to my present SOM research, and the reasons underlying each new step. I became interested in neural networks around 1960, but could not interrupt my graduate studies in physics. After I was appointed Professor of Electronics in 1965, it still took some years to organize teaching at the university. In 1968-69 I was on leave at the University of Washington, and D. Gabor had just published his convolution-correlation model of autoassociative memory. I noticed immediately that there was something not quite right about it: the capacity was very poor and the inherent noise and crosstalk were intolerable. In 1970 I therefore suggested the autoassociative correlation matrix memory model, at the same time as J.A. Anderson and K. Nakano.

Entropy-Based Parameter Estimation in Hydrology

Author: Vijay Singh
Publisher: Springer Science & Business Media
ISBN: 9789401714310
Release Date: 2013-04-17
Genre: Science

Since the pioneering work of Shannon in the late 1940's on the development of the theory of entropy and the landmark contributions of Jaynes a decade later leading to the development of the principle of maximum entropy (POME), the concept of entropy has been increasingly applied in a wide spectrum of areas, including chemistry, electronics and communications engineering, data acquisition, storage and retrieval, data monitoring network design, ecology, economics, environmental engineering, earth sciences, fluid mechanics, genetics, geology, geomorphology, geophysics, geotechnical engineering, hydraulics, hydrology, image processing, management sciences, operations research, pattern recognition and identification, photogrammetry, psychology, physics and quantum mechanics, reliability analysis, reservoir engineering, statistical mechanics, thermodynamics, topology, transportation engineering, turbulence modeling, and so on. New areas finding application of entropy have since continued to unfold. The entropy concept is indeed versatile and its applicability widespread. In the area of hydrology and water resources, a range of applications of entropy have been reported during the past three decades or so. This book focuses on parameter estimation using entropy for a number of distributions frequently used in hydrology. In entropy-based parameter estimation the distribution parameters are expressed in terms of the given information, called constraints. Thus, the method lends itself to a physical interpretation of the parameters. Because the information to be specified usually constitutes sufficient statistics for the distribution under consideration, the entropy method provides a quantitative way to express the information contained in the distribution.
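As a hedged illustration of the entropy-based estimation described above (this example and its variable names are ours, not taken from the book): when the only specified information about a nonnegative variable is its mean, POME yields the exponential distribution, and the constraint equation reduces the parameter estimate to the sample mean.

```python
import math
import random

# Hedged sketch: with the mean as the sole constraint on a distribution
# over [0, inf), the principle of maximum entropy (POME) gives the
# exponential density f(x) = (1/mu) * exp(-x/mu). The moment constraint
# then identifies mu with the sample mean.
random.seed(42)
true_mu = 3.0
# Simulated "observed" data, e.g. inter-event times; expovariate takes
# the rate 1/mu as its argument.
sample = [random.expovariate(1.0 / true_mu) for _ in range(100_000)]

mu_hat = sum(sample) / len(sample)   # entropy-based (POME) estimate of mu
entropy = 1.0 + math.log(mu_hat)     # differential entropy of Exp(mu_hat)
```

With enough data the estimate mu_hat recovers the true mean, and the achieved entropy 1 + ln(mu) is the maximum over all densities on [0, inf) with that mean; real hydrologic distributions simply involve more constraints and more multipliers.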

Entropy and Energy Dissipation in Water Resources

Author: Vijay Singh
Publisher: Springer Science & Business Media
ISBN: 9789401124300
Release Date: 2012-12-06
Genre: Science

Since the landmark contributions of C. E. Shannon in 1948, and those of E. T. Jaynes about a decade later, applications of the concept of entropy and the principle of maximum entropy have proliferated in science and engineering. Recent years have witnessed a broad range of new and exciting developments in hydrology and water resources using the entropy concept. These have encompassed innovative methods for hydrologic network design, transfer of information, flow forecasting, reliability assessment for water distribution systems, parameter estimation, derivation of probability distributions, drainage-network analysis, sediment yield modeling and pollutant loading, bridge-scour analysis, construction of velocity profiles, comparative evaluation of hydrologic models, and so on. Some of these methods hold great promise for advancement of engineering practice, permitting rational alternatives to conventional approaches. On the other hand, the concepts of energy and energy dissipation are being increasingly applied to a wide spectrum of problems in environmental and water resources. Both entropy and energy dissipation have their origin in thermodynamics, and are related concepts. Yet, many of the developments using entropy seem to be based entirely on statistical interpretation and have seemingly little physical content. For example, most of the entropy-related developments and applications in water resources have been based on the information-theoretic interpretation of entropy. We believe that if the power of the entropy concept is to be fully realized, then its physical basis has to be established.

Maximum Entropy and Bayesian Methods in Science and Engineering

Author: G. Erickson
Publisher: Springer Science & Business Media
ISBN: 9027727937
Release Date: 1988-08-31
Genre: Mathematics

This volume has its origin in the Fifth, Sixth and Seventh Workshops on "Maximum-Entropy and Bayesian Methods in Applied Statistics", held at the University of Wyoming, August 5-8, 1985, and at Seattle University, August 5-8, 1986, and August 4-7, 1987. It was anticipated that the proceedings of these workshops would be combined, so most of the papers were not collected until after the seventh workshop. Because all of the papers in this volume are on foundations, it is believed that the contents of this volume will be of lasting interest to the Bayesian community. The workshop was organized to bring together researchers from different fields to critically examine maximum-entropy and Bayesian methods in science and engineering as well as other disciplines. Some of the papers were chosen specifically to kindle interest in new areas that may offer new tools or insight to the reader or to stimulate work on pressing problems that appear to be ideally suited to the maximum-entropy or Bayesian method. A few papers presented at the workshops are not included in these proceedings, but a number of additional papers not presented at the workshop are included. In particular, we are delighted to make available Professor E. T. Jaynes' unpublished Stanford University Microwave Laboratory Report No. 421, "How Does the Brain Do Plausible Reasoning?" (dated August 1957). This is a beautiful, detailed tutorial on the Cox-Polya-Jaynes approach to Bayesian probability theory and the maximum-entropy principle.