Regularization (mathematics)
In mathematics, statistics, finance, and computer science, particularly in machine learning and inverse problems, regularization is a process that changes the answer of a problem to be "simpler". It is often used to obtain results for ill-posed problems or to prevent overfitting. Although regularization procedures can be divided in many ways, the following delineation is particularly helpful: explicit regularization is regularization whenever one explicitly adds a term to the optimization problem.
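Explicit regularization can be illustrated with ridge (L2) regression, where a penalty term is added directly to the least-squares objective. The sketch below is illustrative; the function name and the regularization strength `lam` are assumptions, not from the text:

```python
import numpy as np

# Explicit L2 (ridge) regularization for linear least squares.
# Adding the penalty lam * ||w||^2 to the objective gives the
# closed-form solution  w = (X^T X + lam * I)^{-1} X^T y.
# The name ridge_fit and the value of lam are illustrative assumptions.
def ridge_fit(X, y, lam=1.0):
    n_features = X.shape[1]
    A = X.T @ X + lam * np.eye(n_features)
    return np.linalg.solve(A, X.T @ y)

# Tiny synthetic example: regularization shrinks the coefficients.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=20)

w_unreg = ridge_fit(X, y, lam=0.0)  # ordinary least squares
w_reg = ridge_fit(X, y, lam=5.0)    # regularized: smaller norm
print(np.linalg.norm(w_reg) < np.linalg.norm(w_unreg))
```

Setting `lam=0` recovers the unregularized least-squares fit; larger `lam` trades fit quality for a "simpler" (smaller-norm) solution, which is the mechanism that mitigates overfitting.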
Two-dimensional conformal field theory
A two-dimensional conformal field theory is a quantum field theory on a Euclidean two-dimensional space that is invariant under local conformal transformations. In contrast to other types of conformal field theories, two-dimensional conformal field theories have infinite-dimensional symmetry algebras. In some cases, this allows them to be solved exactly, using the conformal bootstrap method. Notable two-dimensional conformal field theories include minimal models, Liouville theory, massless free bosonic theories, Wess–Zumino–Witten models, and certain sigma models.
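The infinite-dimensional symmetry algebra mentioned above is (two commuting copies of) the Virasoro algebra. As a brief illustration, its generators $L_n$ obey the commutation relations

```latex
[L_m, L_n] = (m - n)\, L_{m+n} + \frac{c}{12}\, m \,(m^2 - 1)\, \delta_{m+n,\,0}
```

where $c$ is the central charge, a number characterizing the theory. Because there is one generator $L_n$ for every integer $n$, the algebra is infinite-dimensional, unlike the finite-dimensional conformal algebras in higher dimensions.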
Neural coding
Neural coding (or neural representation) is a field of neuroscience concerned with characterising the hypothetical relationship between a stimulus and the responses of individual neurons or neuronal ensembles, as well as the relationships among the electrical activity of the neurons in an ensemble. Based on the theory that sensory and other information is represented in the brain by networks of neurons, it is thought that neurons can encode both digital and analog information.
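One simple hypothesized scheme for encoding analog information is rate coding, where stimulus intensity is represented by a neuron's firing rate. The sketch below simulates this under a Poisson spiking model; the model choice, function names, and all parameters are illustrative assumptions, not from the text:

```python
import numpy as np

# Minimal sketch of rate coding: an analog stimulus intensity is
# represented by a neuron's firing rate. The Poisson spiking model
# and all parameters are illustrative assumptions.
def simulate_spikes(rate_hz, duration_s=1.0, dt=0.001, rng=None):
    """Simulate a Poisson spike train; returns a 0/1 array, one entry per time bin."""
    rng = rng if rng is not None else np.random.default_rng(0)
    n_bins = int(duration_s / dt)
    # In each small bin of width dt, a spike occurs with probability rate * dt.
    return (rng.random(n_bins) < rate_hz * dt).astype(int)

def decode_rate(spikes, duration_s=1.0):
    """Decode the stimulus by counting spikes over the observation window."""
    return spikes.sum() / duration_s

rng = np.random.default_rng(1)
weak = simulate_spikes(10.0, rng=rng)    # weak stimulus -> low firing rate
strong = simulate_spikes(80.0, rng=rng)  # strong stimulus -> high firing rate
print(decode_rate(weak), decode_rate(strong))
```

Here the graded (analog) stimulus is recovered from a train of all-or-nothing (digital) spikes by averaging over time, which is one way a neuron could carry both kinds of information.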