Cross-entropy: In information theory, the cross-entropy between two probability distributions $p$ and $q$ over the same underlying set of events measures the average number of bits needed to identify an event drawn from the set if a coding scheme used for the set is optimized for an estimated probability distribution $q$, rather than the true distribution $p$. The cross-entropy of the distribution $q$ relative to a distribution $p$ over a given set is defined as follows: $H(p, q) = -\operatorname{E}_p[\log q]$, where $\operatorname{E}_p[\cdot]$ is the expected value operator with respect to the distribution $p$.
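As a quick numerical illustration of this definition for finite distributions, here is a minimal Python sketch; the function name and the example distributions are illustrative assumptions, not taken from the article.

```python
import numpy as np

def cross_entropy(p, q, base=2):
    """Cross-entropy H(p, q) = -sum_x p(x) * log q(x), in the chosen log base."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0                      # terms with p(x) = 0 contribute nothing
    return -np.sum(p[mask] * np.log(q[mask])) / np.log(base)

# True distribution p versus an estimated model q over three outcomes.
p = [0.5, 0.25, 0.25]
q = [0.4, 0.4, 0.2]
print(cross_entropy(p, q))   # average code length (bits) when codes are optimized for q
print(cross_entropy(p, p))   # equals the entropy H(p), the attainable minimum
```

If $q$ assigns probability zero to an outcome that $p$ can produce, the cross-entropy is infinite; the sketch above does not guard against that case.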
Scientific method: The scientific method is an empirical method for acquiring knowledge that has characterized the development of science since at least the 17th century (with notable practitioners in previous centuries; see the article history of scientific method for additional detail). It involves careful observation and the application of rigorous skepticism about what is observed, given that cognitive assumptions can distort how one interprets the observation.
Joint entropy: In information theory, joint entropy is a measure of the uncertainty associated with a set of variables. The joint Shannon entropy (in bits) of two discrete random variables $X$ and $Y$ with images $\mathcal{X}$ and $\mathcal{Y}$ is defined as $H(X, Y) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} P(x, y) \log_2[P(x, y)]$, where $x$ and $y$ are particular values of $X$ and $Y$, respectively, $P(x, y)$ is the joint probability of these values occurring together, and $P(x, y) \log_2[P(x, y)]$ is defined to be 0 if $P(x, y) = 0$. For more than two random variables $X_1, \ldots, X_n$ this expands to $H(X_1, \ldots, X_n) = -\sum_{x_1 \in \mathcal{X}_1} \cdots \sum_{x_n \in \mathcal{X}_n} P(x_1, \ldots, x_n) \log_2[P(x_1, \ldots, x_n)]$, where $x_1, \ldots, x_n$ are particular values of $X_1, \ldots, X_n$, respectively, $P(x_1, \ldots, x_n)$ is the probability of these values occurring together, and $P(x_1, \ldots, x_n) \log_2[P(x_1, \ldots, x_n)]$ is defined to be 0 if $P(x_1, \ldots, x_n) = 0$.
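A small Python sketch of the two-variable case, computing the joint entropy directly from a table of joint probabilities (the function name and toy table are illustrative assumptions):

```python
import numpy as np

def joint_entropy(joint, base=2):
    """Joint Shannon entropy of a 2-D array of joint probabilities P(x, y)."""
    joint = np.asarray(joint, dtype=float)
    mask = joint > 0                  # P log P is taken to be 0 when P = 0
    return -np.sum(joint[mask] * np.log(joint[mask])) / np.log(base)

# Joint distribution of two independent fair binary variables (rows: X, columns: Y).
P_xy = [[0.25, 0.25],
        [0.25, 0.25]]
print(joint_entropy(P_xy))   # 2.0 bits: one bit of uncertainty from each variable
```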
Differential entropy: Differential entropy (also referred to as continuous entropy) is a concept in information theory that began as an attempt by Claude Shannon to extend the idea of (Shannon) entropy, a measure of average surprisal of a random variable, to continuous probability distributions, giving the expression $h(X) = -\int_{\mathcal{X}} f(x) \log f(x) \, dx$ for a random variable $X$ with density $f$. Unfortunately, Shannon did not derive this formula, and rather just assumed it was the correct continuous analogue of discrete entropy, but it is not. The actual continuous version of discrete entropy is the limiting density of discrete points (LDDP).
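One way to see that this expression is not a faithful analogue of discrete entropy is that it can be negative; the short sketch below (with illustrative numbers, not from the article) evaluates it for uniform densities of different widths.

```python
import numpy as np

# For a uniform density on [0, a], the differential entropy is log2(a) bits,
# which turns negative once a < 1 -- impossible for a discrete entropy.
for a in (2.0, 1.0, 0.5):
    f = 1.0 / a                       # constant density value on [0, a]
    h = -(f * np.log2(f)) * a         # -integral of f * log2(f) over [0, a]
    print(f"uniform on [0, {a}]: h = {h:+.2f} bits")
```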
World population: In demographics, the world population is the total number of humans currently living. It was estimated by the United Nations to have exceeded eight billion in mid-November 2022. It took over 200,000 years of human prehistory and history for the human population to reach one billion and only 219 years more to reach 8 billion. The human population has experienced continuous growth following the Great Famine of 1315–1317 and the end of the Black Death in 1350, when it was nearly 370,000,000.
Methodology: In its most common sense, methodology is the study of research methods. However, the term can also refer to the methods themselves or to the philosophical discussion of associated background assumptions. A method is a structured procedure for bringing about a certain goal, like acquiring knowledge or verifying knowledge claims. This normally involves various steps, like choosing a sample, collecting data from this sample, and interpreting the data. The study of methods concerns a detailed description and analysis of these processes.
Human population planning: Human population planning is the practice of managing the growth rate of a human population. The practice, traditionally referred to as population control, has historically been implemented mainly with the goal of increasing population growth, though from the 1950s to the 1980s, concerns about overpopulation and its effects on poverty, the environment and political stability led to efforts to reduce population growth rates in many countries.
Entropy (information theory): In information theory, the entropy of a random variable is the average level of "information", "surprise", or "uncertainty" inherent to the variable's possible outcomes. Given a discrete random variable $X$, which takes values in the alphabet $\mathcal{X}$ and is distributed according to $p : \mathcal{X} \to [0, 1]$, the entropy is $H(X) := -\sum_{x \in \mathcal{X}} p(x) \log p(x)$, where $\Sigma$ denotes the sum over the variable's possible values. The choice of base for $\log$, the logarithm, varies for different applications. Base 2 gives the unit of bits (or "shannons"), while base $e$ gives "natural units" nat, and base 10 gives units of "dits", "bans", or "hartleys".
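The definition and the role of the logarithm base can be checked with a few lines of Python; the helper below and its example distribution are illustrative assumptions, not the article's notation.

```python
import numpy as np

def entropy(p, base=2):
    """Shannon entropy H(X) = -sum_x p(x) * log p(x), in the chosen base."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # the 0 * log 0 terms are treated as 0
    return -np.sum(p * np.log(p)) / np.log(base)

p = [0.5, 0.25, 0.125, 0.125]
print(entropy(p, base=2))       # 1.75 bits (shannons)
print(entropy(p, base=np.e))    # the same uncertainty expressed in nats
print(entropy(p, base=10))      # and in hartleys (bans / dits)
```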
Human overpopulation: Human overpopulation (or human population overshoot) describes a concern that human populations may become too large to be sustained by their environment or resources in the long term. The topic is usually discussed in the context of world population, though it may concern individual nations, regions, and cities. Since 1804, the global human population has increased from 1 billion to 8 billion due to medical advancements and improved agricultural productivity. Annual world population growth peaked in the late 1960s at roughly 2% per year and has declined since.
Conditional entropy: In information theory, the conditional entropy quantifies the amount of information needed to describe the outcome of a random variable $Y$ given that the value of another random variable $X$ is known. Here, information is measured in shannons, nats, or hartleys. The entropy of $Y$ conditioned on $X$ is written as $H(Y \mid X)$. The conditional entropy of $Y$ given $X$ is defined as $H(Y \mid X) = -\sum_{x \in \mathcal{X},\, y \in \mathcal{Y}} p(x, y) \log \frac{p(x, y)}{p(x)}$, where $\mathcal{X}$ and $\mathcal{Y}$ denote the support sets of $X$ and $Y$. Note: Here, the convention is that the expression $0 \log 0$ should be treated as being equal to zero. This is because $\lim_{\theta \to 0^{+}} \theta \log \theta = 0$.
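Since the definition above is equivalent to the chain rule $H(Y \mid X) = H(X, Y) - H(X)$, a compact way to compute it from a joint probability table is the following Python sketch (the function name and toy table are illustrative assumptions):

```python
import numpy as np

def conditional_entropy(joint, base=2):
    """H(Y | X) from a 2-D joint probability array P(x, y),
    via the chain rule H(Y | X) = H(X, Y) - H(X)."""
    joint = np.asarray(joint, dtype=float)

    def h(p):
        p = p[p > 0]                  # 0 log 0 -> 0 by convention
        return -np.sum(p * np.log(p)) / np.log(base)

    p_x = joint.sum(axis=1)           # marginal distribution of X (rows)
    return h(joint.ravel()) - h(p_x)

# Y is an exact copy of X, so knowing X leaves no uncertainty about Y.
P_xy = np.array([[0.5, 0.0],
                 [0.0, 0.5]])
print(conditional_entropy(P_xy))      # 0.0 bits
```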
Population growth: Population growth is the increase in the number of people in a population or dispersed group. Actual global human population growth amounts to around 83 million annually, or 1.1% per year. The global population has grown from 1 billion in 1800 to 7.9 billion in 2020. The UN projected population to keep growing, and estimates have put the total population at 8.6 billion by mid-2030, 9.8 billion by mid-2050 and 11.2 billion by 2100.
Dimensionality reduction: Dimensionality reduction, or dimension reduction, is the transformation of data from a high-dimensional space into a low-dimensional space so that the low-dimensional representation retains some meaningful properties of the original data, ideally close to its intrinsic dimension. Working in high-dimensional spaces can be undesirable for many reasons; raw data are often sparse as a consequence of the curse of dimensionality, and analyzing the data is usually computationally intractable (hard to control or deal with).
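As one concrete example of such a transformation, the sketch below uses principal component analysis (a common linear choice, not one singled out by the article) to project toy data onto a lower-dimensional space; the helper name and data are illustrative.

```python
import numpy as np

def pca_reduce(X, k):
    """Project data X (n_samples x n_features) onto its top-k principal components."""
    Xc = X - X.mean(axis=0)                     # center each feature
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                        # n_samples x k representation

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))                  # toy high-dimensional data
Z = pca_reduce(X, 2)
print(Z.shape)                                  # (100, 2)
```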
Entropy estimation: In various science and engineering applications, such as independent component analysis, genetic analysis, speech recognition, manifold learning, and time delay estimation, it is useful to estimate the differential entropy of a system or process, given some observations. The simplest and most common approach uses histogram-based estimation, but other approaches have been developed and used, each with its own benefits and drawbacks.
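A minimal version of the histogram-based approach mentioned above might look like the Python sketch below; the estimator form, bin count, and test distribution are illustrative assumptions.

```python
import numpy as np

def histogram_entropy(samples, bins=30):
    """Histogram estimate of differential entropy (in nats): approximate the
    density by p_i / width_i on each bin and sum -p_i * log(p_i / width_i)."""
    counts, edges = np.histogram(samples, bins=bins)
    widths = np.diff(edges)
    p = counts / counts.sum()                   # probability mass per bin
    mask = p > 0
    return -np.sum(p[mask] * np.log(p[mask] / widths[mask]))

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)                    # samples from a standard normal
print(histogram_entropy(x))                     # near 0.5 * log(2 * pi * e) ~ 1.42 nats
```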
Falsifiability: Falsifiability is a deductive standard of evaluation of scientific theories and hypotheses, introduced by the philosopher of science Karl Popper in his book The Logic of Scientific Discovery (1934). A theory or hypothesis is falsifiable (or refutable) if it can be logically contradicted by an empirical test. Popper proposed falsifiability as the cornerstone solution to both the problem of induction and the problem of demarcation.
Impedance analogy: The impedance analogy is a method of representing a mechanical system by an analogous electrical system. The advantage of doing this is that there is a large body of theory and analysis techniques concerning complex electrical systems, especially in the field of filters. By converting to an electrical representation, these tools in the electrical domain can be directly applied to a mechanical system without modification.
Mobility analogy: The mobility analogy, also called admittance analogy or Firestone analogy, is a method of representing a mechanical system by an analogous electrical system. The advantage of doing this is that there is a large body of theory and analysis techniques concerning complex electrical systems, especially in the field of filters. By converting to an electrical representation, these tools in the electrical domain can be directly applied to a mechanical system without modification.
Mechanical–electrical analogies: Mechanical–electrical analogies are the representation of mechanical systems as electrical networks. At first, such analogies were used in reverse to help explain electrical phenomena in familiar mechanical terms. James Clerk Maxwell introduced analogies of this sort in the 19th century. However, as electrical network analysis matured it was found that certain mechanical problems could more easily be solved through an electrical analogy.
Estimates of historical world population: This article lists current estimates of the world population in history. In summary, the article tabulates ranges of estimates for the progression of world population since the Late Middle Ages. Estimates for pre-modern times are necessarily fraught with great uncertainties, and few of the published estimates have confidence intervals; in the absence of a straightforward means to assess the error of such estimates, a rough idea of expert consensus can be gained by comparing the values given in independent publications.
Hydraulic analogy: Electronic-hydraulic analogies are the representation of electronic circuits by hydraulic circuits. Since electric current is invisible and the processes in play in electronics are often difficult to demonstrate, the various electronic components are represented by hydraulic equivalents. Electricity (as well as heat) was originally understood to be a kind of fluid, and the names of certain electric quantities (such as current) are derived from hydraulic equivalents.
Kullback–Leibler divergence: In mathematical statistics, the Kullback–Leibler divergence (also called relative entropy and I-divergence), denoted $D_{\text{KL}}(P \parallel Q)$, is a type of statistical distance: a measure of how one probability distribution $P$ is different from a second, reference probability distribution $Q$. A simple interpretation of the KL divergence of $P$ from $Q$ is the expected excess surprise from using $Q$ as a model when the actual distribution is $P$.
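That interpretation can be made concrete with a short Python sketch for discrete distributions (the function name and example distributions are illustrative assumptions); note that $D_{\text{KL}}(P \parallel Q)$ also equals the cross-entropy $H(P, Q)$ minus the entropy $H(P)$.

```python
import numpy as np

def kl_divergence(p, q, base=np.e):
    """D_KL(P || Q) = sum_x p(x) * log(p(x) / q(x)); assumes q(x) > 0
    wherever p(x) > 0, otherwise the divergence is infinite."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask])) / np.log(base)

p = [0.5, 0.25, 0.25]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))   # expected excess surprise (nats) from modelling p with q
print(kl_divergence(p, p))   # 0.0: no excess surprise when the model matches exactly
```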