Uncertainty quantification: Uncertainty quantification (UQ) is the science of quantitative characterization and estimation of uncertainties in both computational and real-world applications. It tries to determine how likely certain outcomes are if some aspects of the system are not exactly known. An example would be to predict the acceleration of a human body in a head-on crash with another car: even if the speed was exactly known, small differences in the manufacturing of individual cars, how tightly every bolt has been tightened, etc., would lead to different results that can only be predicted in a statistical sense.
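The crash example can be sketched with a toy Monte Carlo propagation. Everything below is hypothetical: the linear-spring crash model, the parameter values, and their manufacturing spreads are made up solely to show how an exactly known speed still yields a distribution of outcomes.

```python
import random
import statistics

random.seed(0)

def peak_deceleration(speed, stiffness, mass):
    # Hypothetical toy model: treat the crumple zone as a linear spring,
    # giving peak deceleration v * sqrt(k / m).
    return speed * (stiffness / mass) ** 0.5

# The impact speed is assumed exactly known; stiffness and mass vary
# slightly between individual cars (made-up manufacturing spreads).
speed = 15.0  # m/s
samples = [
    peak_deceleration(speed,
                      random.gauss(120_000, 6_000),  # spring stiffness, N/m
                      random.gauss(1_500, 30))       # vehicle mass, kg
    for _ in range(20_000)
]

print(f"mean peak deceleration: {statistics.mean(samples):.1f} m/s^2")
print(f"spread (std. dev.):     {statistics.stdev(samples):.1f} m/s^2")
```

Even with a fixed speed, the output is a distribution whose spread quantifies the effect of the uncertain inputs.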
Measurement uncertainty: In metrology, measurement uncertainty is the expression of the statistical dispersion of the values attributed to a measured quantity. All measurements are subject to uncertainty, and a measurement result is complete only when it is accompanied by a statement of the associated uncertainty, such as the standard deviation. By international agreement, this uncertainty has a probabilistic basis and reflects incomplete knowledge of the quantity value. It is a non-negative parameter.
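As a minimal illustration (the readings are made up), the dispersion of repeated measurements can be summarized by the sample standard deviation, and the standard uncertainty of the averaged result by the standard deviation of the mean:

```python
import statistics

# Hypothetical repeated readings of the same length, in mm.
readings = [10.03, 9.98, 10.01, 10.00, 9.97, 10.02, 9.99, 10.01]

mean = statistics.mean(readings)
s = statistics.stdev(readings)        # dispersion of individual readings
u = s / len(readings) ** 0.5          # standard uncertainty of the mean

print(f"result: {mean:.3f} mm with standard uncertainty {u:.3f} mm")
```

Reporting the mean together with u is what makes the measurement result complete in the sense described above.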
Uncertainty: Uncertainty refers to epistemic situations involving imperfect or unknown information. It applies to predictions of future events, to physical measurements that are already made, or to the unknown. Uncertainty arises in partially observable or stochastic environments, as well as due to ignorance, indolence, or both. It arises in any number of fields, including insurance, philosophy, physics, statistics, economics, finance, medicine, psychology, sociology, engineering, metrology, meteorology, ecology and information science.
Propagation of uncertainty: In statistics, propagation of uncertainty (or propagation of error) is the effect of variables' uncertainties (or errors, more specifically random errors) on the uncertainty of a function based on them. When the variables are the values of experimental measurements they have uncertainties due to measurement limitations (e.g., instrument precision) which propagate due to the combination of variables in the function. The uncertainty u can be expressed in a number of ways. It may be defined by the absolute error Δx.
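For uncorrelated variables, the standard first-order rule is u_f² = (∂f/∂x)² u_x² + (∂f/∂y)² u_y². A small sketch for the product f = x·y (the measurement values are hypothetical):

```python
def propagate_product(x, ux, y, uy):
    """First-order uncertainty of f = x * y for uncorrelated x and y.

    u_f^2 = (df/dx)^2 u_x^2 + (df/dy)^2 u_y^2 = y^2 u_x^2 + x^2 u_y^2
    """
    f = x * y
    uf = (y**2 * ux**2 + x**2 * uy**2) ** 0.5
    return f, uf

# Hypothetical measurements: length 4.0 ± 0.1, width 3.0 ± 0.1.
area, u_area = propagate_product(4.0, 0.1, 3.0, 0.1)
print(f"area = {area:.2f} ± {u_area:.2f}")  # → area = 12.00 ± 0.50
```

The same pattern extends to any differentiable function by substituting its partial derivatives.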
Uncertainty principle: In quantum mechanics, the uncertainty principle (also known as Heisenberg's uncertainty principle) is any of a variety of mathematical inequalities asserting a fundamental limit to the product of the accuracy of certain related pairs of measurements on a quantum system, such as position, x, and momentum, p. Such paired variables are known as complementary variables or canonically conjugate variables.
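For the position–momentum pair, the best-known form of the inequality bounds the product of the standard deviations σx and σp by the reduced Planck constant ħ:

```latex
\sigma_x \, \sigma_p \;\ge\; \frac{\hbar}{2}
```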
Statistical inference: Statistical inference is the process of using data analysis to infer properties of an underlying distribution of probability. Inferential statistical analysis infers properties of a population, for example by testing hypotheses and deriving estimates. It is assumed that the observed data set is sampled from a larger population. Inferential statistics can be contrasted with descriptive statistics. Descriptive statistics is solely concerned with properties of the observed data, and it does not rest on the assumption that the data come from a larger population.
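The contrast can be made concrete with a sketch (hypothetical data; a plain z-test that, for simplicity, treats the sample standard deviation as if it were the population value):

```python
from statistics import NormalDist, mean, stdev

# Hypothetical sample, assumed drawn from a larger population.
sample = [5.1, 4.9, 5.3, 5.0, 4.8, 5.2, 5.1, 4.7, 5.0, 5.4]

# Descriptive statistics: properties of the observed data only.
m, s = mean(sample), stdev(sample)

# Inferential statistics: test H0 "population mean = 5.0" with a
# z-test, using s in place of the population standard deviation.
z = (m - 5.0) / (s / len(sample) ** 0.5)
p_value = 2 * (1 - NormalDist().cdf(abs(z)))
print(f"sample mean = {m:.2f}, z = {z:.2f}, p = {p_value:.3f}")
```

The mean and standard deviation describe only these ten numbers; the p-value is an inferential statement about the population they are assumed to come from.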
Neutron moderator: In nuclear engineering, a neutron moderator is a medium that reduces the speed of fast neutrons, ideally without capturing any, leaving them as thermal neutrons with only minimal (thermal) kinetic energy. These thermal neutrons are far more likely than fast neutrons to propagate a nuclear chain reaction of uranium-235 or another fissile isotope by colliding with their atomic nuclei. Water (sometimes called "light water" in this context) is the most commonly used moderator (roughly 75% of the world's reactors).
Statistical mechanics: In physics, statistical mechanics is a mathematical framework that applies statistical methods and probability theory to large assemblies of microscopic entities. It does not assume or postulate any natural laws, but explains the macroscopic behavior of nature from the behavior of such ensembles. Sometimes called statistical physics or statistical thermodynamics, its applications include many problems in the fields of physics, biology, chemistry, and neuroscience.
Statistical assumption: Statistics, like all mathematical disciplines, does not infer valid conclusions from nothing. Inferring interesting conclusions about real statistical populations almost always requires some background assumptions. Those assumptions must be made carefully, because incorrect assumptions can generate wildly inaccurate conclusions. Here are some examples of statistical assumptions: independence of observations from each other (this assumption is an especially common source of error), and independence of observational error from potential confounding effects.
Neutron activation analysis: Neutron activation analysis (NAA) is the nuclear process used for determining the concentrations of elements in many materials. NAA allows discrete sampling of elements as it disregards the chemical form of a sample, and focuses solely on atomic nuclei. The method is based on neutron activation and thus requires a source of neutrons. The sample is bombarded with neutrons, causing its constituent elements to form radioactive isotopes. The radioactive emissions and radioactive decay paths for each element have long been studied and determined.
Reactor-grade plutonium: Reactor-grade plutonium (RGPu) is the isotopic grade of plutonium that is found in spent nuclear fuel after the uranium-235 primary fuel that a nuclear power reactor uses has burnt up. The uranium-238 from which most of the plutonium isotopes derive by neutron capture is found along with the U-235 in the low enriched uranium fuel of civilian reactors.
Tolerance interval: A tolerance interval (TI) is a statistical interval within which, with some confidence level, a specified sampled proportion of a population falls. "More specifically, a 100×p%/100×(1−α) tolerance interval provides limits within which at least a certain proportion (p) of the population falls with a given level of confidence (1−α)." "A (p, 1−α) tolerance interval (TI) based on a sample is constructed so that it would include at least a proportion p of the sampled population with confidence 1−α; such a TI is usually referred to as p-content − (1−α) coverage TI."
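One classical distribution-free construction (due to Wilks) takes the sample minimum and maximum as the interval limits; the coverage of [min, max] then follows a Beta(n−1, 2) distribution, which gives a closed-form confidence. The sketch below finds the smallest sample size for which [min, max] is a (p = 0.90, 95%-confidence) tolerance interval:

```python
def wilks_confidence(n, p):
    """Confidence that [min, max] of an n-sample covers at least a
    proportion p of the population (distribution-free result).
    Coverage of (x_(1), x_(n)) is Beta(n-1, 2), so:
        conf = 1 - n * p**(n-1) + (n-1) * p**n
    """
    return 1 - n * p ** (n - 1) + (n - 1) * p ** n

# Smallest n making [min, max] a (0.90, 0.95) tolerance interval.
n = 2
while wilks_confidence(n, 0.90) < 0.95:
    n += 1
print(n, round(wilks_confidence(n, 0.90), 3))  # → 46 0.952
```

With 46 observations, at least 90% of the population lies between the smallest and largest observation with about 95% confidence, whatever the underlying continuous distribution.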
Reprocessed uranium: Reprocessed uranium (RepU) is the uranium recovered from nuclear reprocessing, as done commercially in France, the UK and Japan and by nuclear weapons states' military plutonium production programs. This uranium makes up the bulk of the material separated during reprocessing. Commercial LWR spent nuclear fuel contains on average (excluding cladding) only four percent plutonium, minor actinides and fission products by weight.
Enriched uranium: Enriched uranium is a type of uranium in which the percent composition of uranium-235 (written 235U) has been increased through the process of isotope separation. Naturally occurring uranium is composed of three major isotopes: uranium-238 (238U, with 99.2739–99.2752% natural abundance), uranium-235 (235U, 0.7198–0.7202%), and uranium-234 (234U, 0.0050–0.0059%). 235U is the only nuclide existing in nature (in any appreciable amount) that is fissile with thermal neutrons.
Statistical theory: The theory of statistics provides a basis for the whole range of techniques, in both study design and data analysis, that are used within applications of statistics. The theory covers approaches to statistical-decision problems and to statistical inference, and the actions and deductions that satisfy the basic principles stated for these different approaches. Within a given approach, statistical theory gives ways of comparing statistical procedures; it can find a best possible procedure within a given context for given statistical problems, or can provide guidance on the choice between alternative procedures.
Radionuclide: A radionuclide (radioactive nuclide, radioisotope or radioactive isotope) is a nuclide that has excess nuclear energy, making it unstable. This excess energy can be used in one of three ways: emitted from the nucleus as gamma radiation; transferred to one of its electrons to release it as a conversion electron; or used to create and emit a new particle (alpha particle or beta particle) from the nucleus. During those processes, the radionuclide is said to undergo radioactive decay.
Cosmogenic nuclide: Cosmogenic nuclides (or cosmogenic isotopes) are rare nuclides (isotopes) created when a high-energy cosmic ray interacts with the nucleus of an in situ Solar System atom, causing nucleons (protons and neutrons) to be expelled from the atom (see cosmic ray spallation). These nuclides are produced within Earth materials such as rocks or soil, in Earth's atmosphere, and in extraterrestrial items such as meteoroids. By measuring cosmogenic nuclides, scientists are able to gain insight into a range of geological and astronomical processes.
Isotope: Isotopes are distinct nuclear species (or nuclides, as a technical term) of the same element. They have the same atomic number (number of protons in their nuclei) and position in the periodic table (and hence belong to the same chemical element), but differ in nucleon numbers (mass numbers) due to different numbers of neutrons in their nuclei. While all isotopes of a given element have almost the same chemical properties, they have different atomic masses and physical properties.
Weapons-grade nuclear material: Weapons-grade nuclear material is any fissionable nuclear material that is pure enough to make a nuclear weapon or has properties that make it particularly suitable for nuclear weapons use. Plutonium and uranium in grades normally used in nuclear weapons are the most common examples. (These nuclear materials have other categorizations based on their purity.) Only fissile isotopes of certain elements have the potential for use in nuclear weapons.
Interval estimation: In statistics, interval estimation is the use of sample data to estimate an interval of possible values of a parameter of interest. This is in contrast to point estimation, which gives a single value. The most prevalent forms of interval estimation are confidence intervals (a frequentist method) and credible intervals (a Bayesian method); less common forms include likelihood intervals and fiducial intervals.
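A minimal sketch of the frequentist form, a confidence interval for a mean (the data are made up, and a normal approximation is used for simplicity rather than a t-distribution):

```python
from statistics import NormalDist, mean, stdev

# Hypothetical sample data.
data = [2.3, 1.9, 2.1, 2.4, 2.0, 2.2, 1.8, 2.5, 2.1, 2.2]

m = mean(data)
se = stdev(data) / len(data) ** 0.5     # standard error of the mean
z = NormalDist().inv_cdf(0.975)         # 95% two-sided critical value

lo, hi = m - z * se, m + z * se
print(f"point estimate: {m:.2f}")
print(f"95% confidence interval: ({lo:.2f}, {hi:.2f})")
```

The point estimate answers "what single value?" while the interval answers "what range of values is consistent with the data at a stated confidence level?".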