
Statistical Field Theory for Neural Networks

Specifications
Paperback | English
Springer International Publishing | 2020
ISBN13: 9783030464431
Part of the series Lecture Notes in Physics
Expected delivery time: approximately 9 working days

Summary

This book presents a self-contained introduction to techniques from field theory applied to stochastic and collective dynamics in neuronal networks. These powerful analytical techniques, well established in other fields of physics, are the basis of current developments and offer solutions to pressing open problems in theoretical neuroscience as well as in machine learning. They enable a systematic and quantitative understanding of the dynamics in recurrent and stochastic neuronal networks.

This book is intended for physicists, mathematicians, and computer scientists. It is designed for self-study by researchers who want to enter the field, or as the main text for a one-semester course at the advanced undergraduate or graduate level. The theoretical concepts are developed systematically from the very beginning, requiring only basic knowledge of analysis and linear algebra.

Specifications

ISBN13: 9783030464431
Language: English
Binding: paperback
Publisher: Springer International Publishing

Table of Contents

I. Introduction

II. Probabilities, moments, cumulants
    A. Probabilities, observables, and moments
    B. Transformation of random variables
    C. Cumulants
    D. Connection between moments and cumulants

III. Gaussian distribution and Wick's theorem
    A. Gaussian distribution
    B. Moment and cumulant generating function of a Gaussian
    C. Wick's theorem
    D. Graphical representation: Feynman diagrams
    E. Appendix: Self-adjoint operators
    F. Appendix: Normalization of a Gaussian

IV. Perturbation expansion
    A. General case
    B. Special case of a Gaussian solvable theory
    C. Example: "phi^3 + phi^4" theory
    D. External sources
    E. Cancellation of vacuum diagrams
    F. Equivalence of graphical rules for n-point correlation and n-th moment
    G. Example: "phi^3 + phi^4" theory

V. Linked cluster theorem
    A. General proof of the linked cluster theorem
    B. Dependence on j - external sources - two complementary views
    C. Example: Connected diagrams of the "phi^3 + phi^4" theory

VI. Functional preliminaries
    A. Functional derivative
        1. Product rule
        2. Chain rule
        3. Special case of the chain rule: Fourier transform
    B. Functional Taylor series

VII. Functional formulation of stochastic differential equations
    A. Onsager-Machlup path integral*
    B. Martin-Siggia-Rose-De Dominicis-Janssen (MSRDJ) path integral
    C. Moment generating functional
    D. Response function in the MSRDJ formalism

VIII. Ornstein-Uhlenbeck process: The free Gaussian theory
    A. Definition
    B. Propagators in time domain
    C. Propagators in Fourier domain

IX. Perturbation theory for stochastic differential equations
    A. Vanishing moments of response fields
    B. Vanishing response loops
    C. Feynman rules for SDEs in time domain and frequency domain
    D. Diagrams with more than a single external leg
    E. Appendix: Unitary Fourier transform

X. Dynamic mean-field theory for random networks
    A. Definition of the model and generating functional
    B. Property of self-averaging
    C. Average over the quenched disorder
    D. Stationary statistics: Self-consistent autocorrelation as motion of a particle in a potential
    E. Transition to chaos
    F. Assessing chaos by a pair of identical systems
    G. Schrödinger equation for the maximum Lyapunov exponent
    H. Condition for transition to chaos

XI. Vertex generating function
    A. Motivating example for the expansion around a non-vanishing mean value
    B. Legendre transform and definition of the vertex generating function Gamma
    C. Perturbation expansion of Gamma
    D. Generalized one-line irreducibility
    E. Example
    F. Vertex functions in the Gaussian case
    G. Example: Vertex functions of the "phi^3 + phi^4" theory
    H. Appendix: Explicit cancellation until second order
    I. Appendix: Convexity of W
    J. Appendix: Legendre transform of a Gaussian

XII. Application: TAP approximation
    Inverse problem

XIII. Expansion of cumulants into tree diagrams of vertex functions
    A. Self-energy or mass operator Sigma

XIV. Loopwise expansion of the effective action - Tree level
    A. Counting the number of loops
    B. Loopwise expansion of the effective action - Higher numbers of loops
    C. Example: "phi^3 + phi^4" theory
    D. Appendix: Equivalence of loopwise expansion and infinite resummation
    E. Appendix: Interpretation of Gamma as effective action
    F. Loopwise expansion of self-consistency equation

XV. Loopwise expansion in the MSRDJ formalism
    A. Intuitive approach
    B. Loopwise corrections to the effective equation of motion
    C. Corrections to the self-energy and self-consistency
    D. Self-energy correction to the full propagator
    E. Self-consistent one-loop
    F. Appendix: Solution by Fokker-Planck equation

XVI. Nomenclature

Acknowledgments

References
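To give a flavor of the formalism the chapter titles refer to (chapters II-III on moments, cumulants, and Wick's theorem), the following LaTeX snippet states the standard relations for a centered Gaussian theory. This is a minimal sketch of textbook material, not an excerpt from the book; the notation (x, j, A) is chosen here purely for illustration.

% Moment generating function of a centered Gaussian, p(x) ~ exp(-1/2 x^T A x),
% and the resulting Wick theorem; notation chosen for illustration only.
\begin{align}
  Z(j) &= \big\langle e^{\,j^{\mathsf T} x} \big\rangle
        = \exp\!\Big(\tfrac{1}{2}\, j^{\mathsf T} A^{-1} j\Big), \\
  \langle x_i\, x_j \rangle &= \big(A^{-1}\big)_{ij}, \\
  \langle x_{i_1} \cdots x_{i_{2n}} \rangle
       &= \sum_{\text{pairings } \pi}\ \prod_{(k,l)\in\pi} \big(A^{-1}\big)_{kl}.
\end{align}

The later chapters build on this Gaussian case: the cumulant generating function W(j) = ln Z(j) appears in the linked cluster theorem (chapter V), and its Legendre transform defines the vertex generating function Gamma (chapter XI).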
