Out-of-body experience
An out-of-body experience (OBE or sometimes OOBE) is a phenomenon in which a person perceives the world from a location outside their physical body. An OBE is a form of autoscopy (literally "seeing self"), although that term more commonly refers to the pathological condition of seeing a second self, or doppelgänger. The term out-of-body experience was introduced in 1943 by G. N. M. Tyrrell in his book Apparitions and was adopted by researchers such as Celia Green and Robert Monroe as an alternative to belief-centric labels such as "astral projection" or "spirit walking".
Experience
Experience refers to conscious events in general, more specifically to perceptions, or to the practical knowledge and familiarity that is produced by these processes. Understood as a conscious event in the widest sense, experience involves a subject to which various items are presented. In this sense, seeing a yellow bird on a branch presents the subject with the objects "bird" and "branch", the relation between them and the property "yellow". Unreal items may be included as well, which happens when experiencing hallucinations or dreams.
Near-death experience
A near-death experience (NDE) is a profound personal experience associated with death or impending death, which researchers describe as having broadly similar characteristics across individuals. When positive, which the great majority are, such experiences may encompass a variety of sensations including detachment from the body, feelings of levitation, total serenity, security, warmth, joy, the experience of absolute dissolution, review of major life events, the presence of a light, and seeing dead relatives.
Prior probability
A prior probability distribution of an uncertain quantity, often simply called the prior, is its assumed probability distribution before some evidence is taken into account. For example, the prior could be the probability distribution representing the relative proportions of voters who will vote for a particular politician in a future election. The unknown quantity may be a parameter of the model or a latent variable rather than an observable variable.
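As an illustrative sketch (the Beta distribution and its parameters here are hypothetical choices, not from the text), a prior over a voter proportion such as the one in the example above can be encoded as a probability density:

```python
from math import gamma

def beta_pdf(p, a, b):
    """Density of a Beta(a, b) prior at p, 0 < p < 1."""
    const = gamma(a + b) / (gamma(a) * gamma(b))
    return const * p ** (a - 1) * (1 - p) ** (b - 1)

# A prior belief that roughly 40% of voters favour the politician,
# encoded as Beta(4, 6), whose mean is a / (a + b) = 0.4.
prior_at_mean = beta_pdf(0.4, 4, 6)
```

Beta(1, 1) recovers the uniform (flat) prior, a common choice when no evidence at all is taken into account.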
Empirical probability
In probability theory and statistics, the empirical probability, relative frequency, or experimental probability of an event is the ratio of the number of outcomes in which a specified event occurs to the total number of trials, i.e., by means not of a theoretical sample space but of an actual experiment. More generally, empirical probability estimates probabilities from experience and observation. Given an event A in a sample space, the relative frequency of A is the ratio m/n, where m is the number of outcomes in which the event A occurs and n is the total number of outcomes of the experiment.
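The ratio m/n can be computed directly; a minimal sketch (the die rolls below are hypothetical data, not from the text):

```python
def empirical_probability(outcomes, event):
    """Relative frequency m/n: m outcomes where the event occurs, n trials."""
    n = len(outcomes)
    m = sum(1 for outcome in outcomes if event(outcome))
    return m / n

# Eight recorded die rolls; the event is "rolled a six".
rolls = [1, 4, 6, 6, 3, 6, 2, 6]
freq_six = empirical_probability(rolls, lambda x: x == 6)  # m = 4, n = 8
```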
User experience
The user experience (UX) is how a user interacts with and experiences a product, system or service. It includes a person's perceptions of utility, ease of use, and efficiency. Improving user experience is important to most companies, designers, and creators when creating and refining products because negative user experience can diminish the use of the product and, therefore, any desired positive impacts; conversely, designing toward profitability often conflicts with ethical user experience objectives and even causes harm.
Empirical distribution function
In statistics, an empirical distribution function (commonly also called an empirical cumulative distribution function, eCDF) is the distribution function associated with the empirical measure of a sample. This cumulative distribution function is a step function that jumps up by 1/n at each of the n data points. Its value at any specified value of the measured variable is the fraction of observations of the measured variable that are less than or equal to the specified value.
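The step-function definition translates almost literally into code; a minimal sketch (the sample values are hypothetical):

```python
def ecdf(sample):
    """Empirical CDF of a sample: F(x) = (# observations <= x) / n."""
    data = sorted(sample)
    n = len(data)

    def F(x):
        # Fraction of observations less than or equal to x.
        return sum(1 for value in data if value <= x) / n

    return F

F = ecdf([1.0, 2.0, 2.0, 5.0])
# F jumps by 1/n at each data point (by 2/n at the tied value 2.0):
# F(0) = 0.0, F(1) = 0.25, F(2) = 0.75, F(5) = 1.0
```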
Bayesian inference
Bayesian inference (/ˈbeɪziən/ or /ˈbeɪʒən/) is a method of statistical inference in which Bayes' theorem is used to update the probability for a hypothesis as more evidence or information becomes available. Bayesian inference is an important technique in statistics, and especially in mathematical statistics. Bayesian updating is particularly important in the dynamic analysis of a sequence of data. Bayesian inference has found application in a wide range of activities, including science, engineering, philosophy, medicine, sport, and law.
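A single Bayes' theorem update can be sketched as follows (the probabilities are hypothetical numbers for illustration, e.g. a diagnostic test, not figures from the text):

```python
def bayes_update(prior, likelihood, likelihood_given_not):
    """Posterior P(H|E) via Bayes' theorem:
    P(H|E) = P(E|H) P(H) / [P(E|H) P(H) + P(E|not H) P(not H)]."""
    evidence = likelihood * prior + likelihood_given_not * (1 - prior)
    return likelihood * prior / evidence

# Hypothetical: a 1% prior, a test with 90% sensitivity and a
# 5% false-positive rate. The posterior becomes the next prior
# when further evidence arrives, which is the "updating" step.
posterior = bayes_update(prior=0.01, likelihood=0.90, likelihood_given_not=0.05)
```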
Religious experience
A religious experience (sometimes known as a spiritual experience, sacred experience, or mystical experience) is a subjective experience which is interpreted within a religious framework. The concept originated in the 19th century, as a defense against the growing rationalism of Western society. William James popularised the concept. In some religions this may result in unverified personal gnosis. Many religious and mystical traditions see religious experiences (particularly the knowledge which comes with them) as revelations caused by divine agency rather than ordinary natural processes.
Bayesian probability
Bayesian probability (/ˈbeɪziən/ or /ˈbeɪʒən/) is an interpretation of the concept of probability, in which, instead of frequency or propensity of some phenomenon, probability is interpreted as reasonable expectation representing a state of knowledge or as quantification of a personal belief. The Bayesian interpretation of probability can be seen as an extension of propositional logic that enables reasoning with hypotheses; that is, with propositions whose truth or falsity is unknown.
Fisher information
In mathematical statistics, the Fisher information (sometimes simply called information) is a way of measuring the amount of information that an observable random variable X carries about an unknown parameter θ of a distribution that models X. Formally, it is the variance of the score, or the expected value of the observed information. The role of the Fisher information in the asymptotic theory of maximum-likelihood estimation was emphasized by the statistician Ronald Fisher (following some initial results by Francis Ysidro Edgeworth).
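The "variance of the score" definition can be checked concretely for a single Bernoulli(θ) observation, a standard textbook case chosen here for illustration (not taken from the text):

```python
def fisher_information_bernoulli(theta):
    """Fisher information of one Bernoulli(theta) observation, computed as
    the variance of the score d/dtheta log p(x; theta).
    The score is (x - theta) / (theta * (1 - theta)); since its mean is 0,
    the variance is the expected squared score under the model."""
    score = lambda x: (x - theta) / (theta * (1 - theta))
    return theta * score(1) ** 2 + (1 - theta) * score(0) ** 2

# This matches the closed form I(theta) = 1 / (theta * (1 - theta)),
# largest near the endpoints where a single observation is most informative.
```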
Empirical process
In probability theory, an empirical process is a stochastic process that describes the proportion of objects in a system in a given state. For a process in a discrete state space, a population continuous-time Markov chain or Markov population model is a process that counts the number of objects in a given state (without rescaling). In mean field theory, limit theorems (as the number of objects becomes large) are considered; these generalise the central limit theorem for empirical measures.
Body of light
The body of light, sometimes called the 'astral body' or the 'subtle body,' is a "quasi material" aspect of the human body, being neither solely physical nor solely spiritual, posited by a number of philosophers, and elaborated on according to various esoteric, occult, and mystical teachings. Other terms used for this body include body of glory, spirit-body, luciform body, augoeides ('radiant body'), astroeides ('starry or sidereal body'), and celestial body.
Evidence
Evidence for a proposition is what supports the proposition. It is usually understood as an indication that the supported proposition is true. What role evidence plays and how it is conceived varies from field to field. In epistemology, evidence is what justifies beliefs or what makes it rational to hold a certain doxastic attitude. For example, a perceptual experience of a tree may act as evidence that justifies the belief that there is a tree. In this role, evidence is usually understood as a private mental state.
Psychedelic experience
A psychedelic experience (known colloquially as a trip) is a temporary altered state of consciousness induced by the consumption of a psychedelic substance (most commonly LSD, mescaline, psilocybin mushrooms, or DMT). For example, an acid trip is a psychedelic experience brought on by the use of LSD, while a mushroom trip is a psychedelic experience brought on by the use of psilocybin. Psychedelic experiences feature alterations in normal perception such as visual distortions and a subjective loss of self-identity, sometimes interpreted as mystical experiences.
Jeffreys prior
In Bayesian probability, the Jeffreys prior, named after Sir Harold Jeffreys, is a non-informative prior distribution for a parameter space; its density function is proportional to the square root of the determinant of the Fisher information matrix: p(θ) ∝ √det I(θ). It has the key feature that it is invariant under a change of coordinates for the parameter vector θ. That is, the relative probability assigned to a volume of a probability space using a Jeffreys prior will be the same regardless of the parameterization used to define the Jeffreys prior.
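A minimal sketch for a single Bernoulli parameter (a standard worked case, chosen here for illustration): the Fisher information is the scalar I(θ) = 1/(θ(1−θ)), so the Jeffreys prior is proportional to θ^(−1/2)(1−θ)^(−1/2), i.e. the Beta(1/2, 1/2) density:

```python
from math import sqrt, pi

def jeffreys_density_bernoulli(theta):
    """Jeffreys prior for the Bernoulli parameter: proportional to
    sqrt(I(theta)) with I(theta) = 1 / (theta * (1 - theta)).
    Normalised, this is the Beta(1/2, 1/2) density, with constant 1/pi."""
    return 1.0 / (pi * sqrt(theta * (1 - theta)))

# The density is symmetric about 0.5 and piles mass near 0 and 1,
# where a Bernoulli observation is most informative.
```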
Bayesian statistics
Bayesian statistics (/ˈbeɪziən/ or /ˈbeɪʒən/) is a theory in the field of statistics based on the Bayesian interpretation of probability where probability expresses a degree of belief in an event. The degree of belief may be based on prior knowledge about the event, such as the results of previous experiments, or on personal beliefs about the event. This differs from a number of other interpretations of probability, such as the frequentist interpretation that views probability as the limit of the relative frequency of an event after many trials.
Philosophical methodology
In its most common sense, philosophical methodology is the field of inquiry studying the methods used to do philosophy. But the term can also refer to the methods themselves. It may be understood in a wide sense as the general study of principles used for theory selection, or in a more narrow sense as the study of ways of conducting one's research and theorizing with the goal of acquiring philosophical knowledge.
Empirical research
Empirical research is research using empirical evidence. It is also a way of gaining knowledge by means of direct and indirect observation or experience. Empiricism values some research more than other kinds. Empirical evidence (the record of one's direct observations or experiences) can be analyzed quantitatively or qualitatively. Quantifying the evidence or making sense of it in qualitative form, a researcher can answer empirical questions, which should be clearly defined and answerable with the evidence collected (usually called data).
Information theory
Information theory is the mathematical study of the quantification, storage, and communication of information. The field was originally established by the works of Harry Nyquist and Ralph Hartley, in the 1920s, and Claude Shannon in the 1940s. The field, in applied mathematics, is at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering. A key measure in information theory is entropy.
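Entropy, the key measure mentioned above, is H = −Σ p·log₂(p) for a discrete distribution; a minimal sketch (the example distributions are illustrative):

```python
from math import log2

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) of a discrete distribution,
    in bits. Zero-probability outcomes contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries one bit per toss; a certain outcome carries none.
fair_coin = entropy([0.5, 0.5])
```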