Hopfield network
A Hopfield network (or Amari-Hopfield network, Ising model of a neural network, or Ising–Lenz–Little model) is a form of recurrent artificial neural network and a type of spin glass system, popularised by John Hopfield in 1982 but described earlier by Shun'ichi Amari in 1972 and by William Little in 1974, building on Ernst Ising's work with Wilhelm Lenz on the Ising model. Hopfield networks serve as content-addressable ("associative") memory systems with binary threshold nodes, or with continuous variables.
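A minimal sketch of this content-addressable behaviour, assuming binary (±1) units, a Hebbian outer-product rule for storage, and asynchronous threshold updates for recall (the function names and sizes here are illustrative, not from the source):

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian learning: sum outer products of the stored +1/-1 patterns."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)          # no self-connections
    return W / patterns.shape[0]

def recall(W, state, steps=10):
    """Asynchronous updates: each unit flips toward its local field."""
    state = state.copy()
    for _ in range(steps):
        for i in np.random.permutation(len(state)):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

# Store one pattern, then recover it from a corrupted cue.
pattern = np.array([1, -1, 1, -1, 1, -1])
W = train_hopfield(pattern[None, :])
noisy = pattern.copy(); noisy[0] *= -1
print(recall(W, noisy))             # converges back to the stored pattern
```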
Transformer (machine learning model)
A transformer is a deep learning architecture that relies on the parallel multi-head attention mechanism. The modern transformer was proposed in the 2017 paper "Attention Is All You Need" by Ashish Vaswani et al. of the Google Brain team. It is notable for requiring less training time than previous recurrent neural architectures, such as long short-term memory (LSTM), and its later variants have been widely adopted for training large language models on large (language) datasets, such as the Wikipedia corpus and Common Crawl, by virtue of processing the input sequence in parallel.
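A shape-level sketch of why this parallelizes well: every position attends to every other in a few dense matrix products, with no sequential recurrence. The projection matrices below are random stand-ins for learned weights, so this illustrates only the data flow, not a trained model:

```python
import numpy as np

def multi_head_attention(X, num_heads):
    """Self-attention over all positions at once (random, untrained weights)."""
    seq_len, d_model = X.shape
    d_head = d_model // num_heads
    rng = np.random.default_rng(0)
    outputs = []
    for _ in range(num_heads):
        Wq, Wk, Wv = (rng.standard_normal((d_model, d_head)) for _ in range(3))
        Q, K, V = X @ Wq, X @ Wk, X @ Wv
        scores = Q @ K.T / np.sqrt(d_head)                 # (seq_len, seq_len)
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)     # row-wise softmax
        outputs.append(weights @ V)
    return np.concatenate(outputs, axis=-1)                # (seq_len, d_model)

X = np.random.default_rng(1).standard_normal((5, 8))   # 5 tokens, d_model=8
print(multi_head_attention(X, num_heads=2).shape)      # (5, 8)
```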
Integrated information theory
Integrated information theory (IIT) attempts to provide a framework capable of explaining why some physical systems (such as human brains) are conscious, why they feel the particular way they do in particular states (e.g. why our visual field appears extended when we gaze out at the night sky), and what it would take for other physical systems to be conscious (Are other animals conscious? Might the whole Universe be?).
Reservoir computing
Reservoir computing is a framework for computation derived from recurrent neural network theory that maps input signals into higher-dimensional computational spaces through the dynamics of a fixed, non-linear system called a reservoir. After the input signal is fed into the reservoir, which is treated as a "black box," a simple readout mechanism is trained to read the state of the reservoir and map it to the desired output. The first key benefit of this framework is that training is performed only at the readout stage, as the reservoir dynamics are fixed.
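A minimal echo-state-style sketch of that key benefit: the reservoir weights below are drawn once and never trained, and only the linear readout is fit by least squares. The sizes, spectral-radius scaling, and the next-sample prediction task are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n_res = 100

# Fixed random reservoir; only W_out is ever trained.
W_res = rng.standard_normal((n_res, n_res))
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))  # keep dynamics stable
W_in = rng.standard_normal(n_res)

def run_reservoir(inputs):
    """Drive the fixed non-linear reservoir and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W_res @ x + W_in * u)
        states.append(x.copy())
    return np.array(states)

# Train the readout alone, by linear least squares.
u = np.sin(np.linspace(0, 8 * np.pi, 400))
target = np.roll(u, -1)                     # task: predict the next sample
S = run_reservoir(u)
W_out, *_ = np.linalg.lstsq(S, target, rcond=None)
print(np.mean((S @ W_out - target) ** 2))   # small training error
```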
Stream of consciousness
In literary criticism, stream of consciousness is a narrative mode or method that attempts "to depict the multitudinous thoughts and feelings which pass through the mind" of a narrator. The term was coined by Daniel Oliver in 1840 in First Lines of Physiology: Designed for the Use of Students of Medicine, when he wrote: "If we separate from this mingled and moving stream of consciousness, our sensations and volitions, which are constantly giving it a new direction, and suffer it to pursue its own spontaneous course, it will appear, upon examination, that this, instead of being wholly fortuitous and uncertain, is determined by certain fixed laws of thought, which are collectively termed the association of ideas."
Collective unconscious
Collective unconscious (German: kollektives Unbewusstes) refers to the unconscious mind and shared mental concepts. It is generally associated with idealism, and the term was coined by Carl Jung. According to Jung, the human collective unconscious is populated by instincts, as well as by archetypes: ancient primal symbols such as the Great Mother, the Wise Old Man, the Shadow, the Tower, Water, and the Tree of Life. Jung considered the collective unconscious to underpin and surround the unconscious mind, distinguishing it from the personal unconscious of Freudian psychoanalysis.
List of mathematical jargon
The language of mathematics has a vast vocabulary of specialist and technical terms. It also has a certain amount of jargon: commonly used phrases which are part of the culture of mathematics, rather than of the subject. Jargon often appears in lectures, and sometimes in print, as informal shorthand for rigorous arguments or precise ideas. Much of this is common English, but with a specific non-obvious meaning when used in a mathematical sense. Some phrases, like "in general", appear below in more than one section.
Superseded theories in science
This list catalogs formerly well-accepted theories in science and pre-scientific natural philosophy and natural history which have since been superseded by other scientific theories. Many discarded explanations were once supported by a scientific consensus, but were replaced after more empirical information became available that identified flaws and prompted new theories which better explain the available data. Pre-modern explanations originated before the scientific method, with varying degrees of empirical support.
Foundations of mathematics
Foundations of mathematics is the study of the philosophical, logical, and/or algorithmic basis of mathematics or, in a broader sense, the mathematical investigation of what underlies the philosophical theories concerning the nature of mathematics. In this latter sense, the distinction between foundations of mathematics and philosophy of mathematics turns out to be vague. Foundations of mathematics can be conceived as the study of basic mathematical concepts (set, function, geometrical figure, number, etc.).
Set theory
Set theory is the branch of mathematical logic that studies sets, which can be informally described as collections of objects. Although objects of any kind can be collected into a set, set theory, as a branch of mathematics, is mostly concerned with those that are relevant to mathematics as a whole. The modern study of set theory was initiated by the German mathematicians Richard Dedekind and Georg Cantor in the 1870s. In particular, Georg Cantor is commonly considered the founder of set theory.
Attention (machine learning)
Machine learning-based attention is a mechanism mimicking cognitive attention. It calculates "soft" weights for each word, or more precisely for its embedding, in the context window. These weights can be computed either in parallel (as in transformers) or sequentially (as in recurrent neural networks). "Soft" weights can change during each runtime, in contrast to "hard" weights, which are (pre-)trained and fine-tuned and remain frozen afterwards. Multiple attention heads are used in transformer-based large language models.
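A small sketch of what "soft" means here: the weights below are a softmax over query-key similarities, recomputed for whatever embeddings arrive at runtime, in contrast to fixed, pre-trained parameters. The toy embeddings and function name are illustrative assumptions:

```python
import numpy as np

def soft_weights(query, keys):
    """Soft attention weights: a softmax over scaled query-key similarities,
    recomputed at every runtime for the current context window."""
    scores = keys @ query / np.sqrt(len(query))
    e = np.exp(scores - scores.max())
    return e / e.sum()

# Toy 4-dimensional embeddings for a 3-word context window.
embeddings = np.array([[1.0, 0.0, 0.5, 0.0],
                       [0.9, 0.1, 0.4, 0.0],
                       [0.0, 1.0, 0.0, 0.8]])
w = soft_weights(embeddings[0], embeddings)
print(w, w.sum())   # input-dependent weights that sum to 1
```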
Philosophy
Philosophy (from Ancient Greek, "love of wisdom") is a systematic study of general and fundamental questions concerning topics like existence, reason, knowledge, values, mind, and language. It is a rational and critical inquiry that reflects on its own methods and assumptions. Historically, many of the individual sciences, such as physics and psychology, formed part of philosophy, but they are now considered separate academic disciplines in the modern sense of the term.