Transmission electron microscopy
Transmission electron microscopy (TEM) is a microscopy technique in which a beam of electrons is transmitted through a specimen to form an image. The specimen is most often an ultrathin section less than 100 nm thick or a suspension on a grid. An image is formed from the interaction of the electrons with the sample as the beam is transmitted through the specimen. The image is then magnified and focused onto an imaging device, such as a fluorescent screen, a layer of photographic film, or a sensor such as a scintillator attached to a charge-coupled device.
Scanning electron microscope
A scanning electron microscope (SEM) is a type of electron microscope that produces images of a sample by scanning the surface with a focused beam of electrons. The electrons interact with atoms in the sample, producing various signals that contain information about the surface topography and composition of the sample. The electron beam is scanned in a raster scan pattern, and the position of the beam is combined with the intensity of the detected signal to produce an image.
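The raster principle lends itself to a short sketch. The following is a minimal illustration, not real instrument code: detector_signal is a hypothetical stand-in for the physical detector, and the image is simply the detected intensity recorded at each beam position.

```python
import numpy as np

# Hypothetical detector response when the beam dwells at (x, y); here a
# synthetic sample with one bright circular feature on a dim background.
def detector_signal(x, y):
    return 1.0 if (x - 32) ** 2 + (y - 32) ** 2 < 100 else 0.2

def raster_scan(width=64, height=64):
    image = np.zeros((height, width))
    for row in range(height):        # slow scan axis
        for col in range(width):     # fast scan axis
            # Beam position and detected intensity combine into one pixel.
            image[row, col] = detector_signal(col, row)
    return image

image = raster_scan()
print(image.shape, image.min(), image.max())
```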
Electron microscope
An electron microscope is a microscope that uses a beam of electrons as a source of illumination. It uses electron optics analogous to the glass lenses of an optical light microscope. As the wavelength of an electron can be up to 100,000 times shorter than that of visible light, electron microscopes have a higher resolution of about 0.1 nm, which compares to about 200 nm for light microscopes.
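The wavelength claim can be checked with the standard relativistic de Broglie formula $\lambda = h / \sqrt{2 m_0 e V \left(1 + eV / 2 m_0 c^2\right)}$. A back-of-the-envelope sketch:

```python
import math

h = 6.62607015e-34     # Planck constant, J s
m0 = 9.1093837015e-31  # electron rest mass, kg
e = 1.602176634e-19    # elementary charge, C
c = 2.99792458e8       # speed of light, m/s

def electron_wavelength(V):
    """Relativistic de Broglie wavelength (m) at accelerating voltage V (volts)."""
    return h / math.sqrt(2 * m0 * e * V * (1 + e * V / (2 * m0 * c ** 2)))

lam = electron_wavelength(100e3)           # 100 kV, a common TEM voltage
print(f"{lam * 1e12:.2f} pm")              # ~3.70 pm
print(f"{550e-9 / lam:,.0f}x shorter than 550 nm green light")
```

At 100 kV the electron wavelength comes out near 3.7 pm, roughly 150,000 times shorter than green light, consistent with the figure quoted above.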
Electron diffraction
Electron diffraction refers to changes in the direction of electron beams due to interactions with atoms. Close to the atoms the changes are described as Fresnel diffraction; far away they are called Fraunhofer diffraction. The resulting map of the directions of the electrons far from the sample (Fraunhofer diffraction) is called a diffraction pattern. These patterns are similar to x-ray and neutron diffraction patterns, and are used to study the atomic structure of gases, liquids, surfaces and bulk solids.
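For crystalline samples the geometry of such a pattern follows the Bragg condition $n\lambda = 2d\sin\theta$. A small worked example, using an assumed electron wavelength of about 2.51 pm (typical of a 200 kV beam) and an illustrative lattice spacing:

```python
import math

wavelength = 2.51e-12  # m, electron wavelength at 200 kV (assumed here)
d_spacing = 2.03e-10   # m, an illustrative crystal lattice spacing

# First-order (n = 1) Bragg condition: lambda = 2 * d * sin(theta)
theta = math.asin(wavelength / (2 * d_spacing))
print(f"Bragg angle theta: {math.degrees(theta):.3f} degrees")
print(f"Scattering angle 2*theta: {2 * theta * 1e3:.2f} mrad")
```

The milliradian-scale angles are characteristic of high-energy electron diffraction.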
Covariance matrix
In probability theory and statistics, a covariance matrix (also known as auto-covariance matrix, dispersion matrix, variance matrix, or variance–covariance matrix) is a square matrix giving the covariance between each pair of elements of a given random vector. Any covariance matrix is symmetric and positive semi-definite and its main diagonal contains variances (i.e., the covariance of each element with itself). Intuitively, the covariance matrix generalizes the notion of variance to multiple dimensions.
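A brief numerical illustration of these properties, using a sample covariance matrix computed with NumPy on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))       # 1000 observations of a 3-dim random vector
X[:, 2] = X[:, 0] + 0.1 * X[:, 2]    # make two coordinates strongly dependent

C = np.cov(X, rowvar=False)              # 3x3 sample covariance matrix
print(np.allclose(C, C.T))               # True: symmetric
print(np.linalg.eigvalsh(C) >= -1e-12)   # all True: positive semi-definite
print(np.diag(C))                        # main diagonal holds the variances
```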
Empirical process
In probability theory, an empirical process is a stochastic process that describes the proportion of objects in a system in a given state. For a process in a discrete state space, a population continuous-time Markov chain (or Markov population model) is a process which counts the number of objects in a given state (without rescaling). In mean field theory, limit theorems (as the number of objects becomes large) are considered and generalise the central limit theorem for empirical measures.
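As a concrete example of an empirical measure, the sketch below builds the empirical CDF $F_n$ from $n$ i.i.d. Uniform(0, 1) samples and measures its maximal deviation from the true CDF $F(x) = x$; the deviation shrinks as $n$ grows, in line with the limit theorems mentioned above.

```python
import numpy as np

rng = np.random.default_rng(1)

def ks_distance(n):
    """Max deviation of the empirical CDF of n Uniform(0,1) draws from F(x) = x."""
    x = np.sort(rng.uniform(size=n))
    i = np.arange(1, n + 1)
    # Kolmogorov-Smirnov statistic: check both sides of each jump of F_n.
    return np.max(np.maximum(i / n - x, x - (i - 1) / n))

for n in (100, 10_000, 1_000_000):
    print(n, ks_distance(n))
```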
Low-voltage electron microscope
A low-voltage electron microscope (LVEM) is an electron microscope which operates at accelerating voltages of a few kilovolts or less. Traditional electron microscopes use accelerating voltages in the range of 10–1000 kV. Low-voltage imaging of transmitted electrons is possible in many newer scanning electron microscopes fitted with a transmitted-electron detector. A low-cost alternative is a dedicated tabletop low-voltage transmission electron microscope.
Scanning transmission electron microscopy
A scanning transmission electron microscope (STEM) is a type of transmission electron microscope (TEM). Pronunciation is [stɛm] or [ɛsti:i:ɛm]. As with a conventional transmission electron microscope (CTEM), images are formed by electrons passing through a sufficiently thin specimen. However, unlike CTEM, in STEM the electron beam is focused to a fine spot (with a typical spot size of 0.05–0.2 nm) which is then scanned over the sample in a raster illumination system constructed so that the sample is illuminated at each point with the beam parallel to the optical axis.
Covariance
In probability theory and statistics, covariance is a measure of the joint variability of two random variables. If the greater values of one variable mainly correspond with the greater values of the other variable, and the same holds for the lesser values (that is, the variables tend to show similar behavior), the covariance is positive. In the opposite case, when the greater values of one variable mainly correspond to the lesser values of the other (that is, the variables tend to show opposite behavior), the covariance is negative.
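A quick simulation makes the sign convention concrete (synthetic data, NumPy):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=10_000)

y_similar = 2 * x + rng.normal(size=10_000)    # tends to rise with x
y_opposite = -2 * x + rng.normal(size=10_000)  # tends to fall as x rises

print(np.cov(x, y_similar)[0, 1])    # ~ 2  (positive covariance)
print(np.cov(x, y_opposite)[0, 1])   # ~ -2 (negative covariance)
```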
Independent and identically distributed random variables
In probability theory and statistics, a collection of random variables is independent and identically distributed if each random variable has the same probability distribution as the others and all are mutually independent. This property is usually abbreviated as i.i.d., iid, or IID. IID was first defined in statistics and finds application in different fields such as data mining and signal processing. Statistics commonly deals with random samples. A random sample can be thought of as a set of objects that are chosen randomly.
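A small sketch of what i.i.d. means in practice: each draw comes from the same fixed distribution, and distinct draws carry no information about each other (so, in particular, they are uncorrelated).

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.exponential(scale=2.0, size=(100_000, 2))  # two i.i.d. columns

print(X.mean(axis=0))          # both ~2.0: identically distributed
print(X.var(axis=0))           # both ~4.0
print(np.corrcoef(X.T)[0, 1])  # ~0: independent draws are uncorrelated
```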
Scanning tunneling microscope
A scanning tunneling microscope (STM) is a type of microscope used for imaging surfaces at the atomic level. Its development in 1981 earned its inventors, Gerd Binnig and Heinrich Rohrer, then at IBM Zürich, the Nobel Prize in Physics in 1986. STM senses the surface by using an extremely sharp conducting tip that can distinguish features smaller than 0.1 nm with a 0.01 nm (10 pm) depth resolution. This means that individual atoms can routinely be imaged and manipulated.
Test oracle
In computing, software engineering, and software testing, a test oracle (or just oracle) is a mechanism for determining whether a test has passed or failed. The use of oracles involves comparing the output(s) of the system under test, for a given test-case input, to the output(s) that the oracle determines that product should have. The term "test oracle" was first introduced in a paper by William E. Howden. Additional work on different kinds of oracles was explored by Elaine Weyuker.
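A minimal sketch of the idea in Python; system_under_test and oracle are hypothetical stand-ins for the product and for an independent source of expected results.

```python
def system_under_test(x):
    return x * x                  # the implementation being tested

def oracle(x):
    return x ** 2                 # independent mechanism giving the expected output

def run_test(x):
    expected = oracle(x)
    actual = system_under_test(x)
    # The test passes exactly when the system's output matches the oracle's.
    return "pass" if actual == expected else f"fail: {actual} != {expected}"

for case in (0, 3, -4):
    print(case, run_test(case))
```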
Software testing
Software testing is the act of examining the artifacts and the behavior of the software under test by validation and verification. Software testing can also provide an objective, independent view of the software to allow the business to appreciate and understand the risks of software implementation. Test techniques include, but are not necessarily limited to: analyzing the product requirements for completeness and correctness in various contexts like industry perspective, business perspective, feasibility and viability of implementation, usability, performance, security, infrastructure considerations, etc.
Unit testing
In computer programming, unit testing is a software testing method by which individual units of source code—sets of one or more computer program modules together with associated control data, usage procedures, and operating procedures—are tested to determine whether they are fit for use. It is a standard step in development and implementation approaches such as Agile. Before unit testing, capture and replay testing tools were the norm. In 1997, Kent Beck and Erich Gamma developed and released JUnit, a unit test framework that became popular with Java developers.
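For illustration, here is the analogous idiom in Python's built-in unittest framework (JUnit, mentioned above, is its counterpart for Java); slugify is a made-up function under test.

```python
import unittest

def slugify(title):
    """Lowercase a title and join its words with hyphens."""
    return "-".join(title.lower().split())

class TestSlugify(unittest.TestCase):
    def test_lowercases_and_hyphenates(self):
        self.assertEqual(slugify("Unit Testing Basics"), "unit-testing-basics")

    def test_empty_string(self):
        self.assertEqual(slugify(""), "")

if __name__ == "__main__":
    unittest.main()
```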
Stochastic process
In probability theory and related fields, a stochastic (/stəˈkæstɪk/) or random process is a mathematical object usually defined as a sequence of random variables, where the index of the sequence has the interpretation of time. Stochastic processes are widely used as mathematical models of systems and phenomena that appear to vary in a random manner. Examples include the growth of a bacterial population, an electrical current fluctuating due to thermal noise, or the movement of a gas molecule.
Microscope
A microscope is a laboratory instrument used to examine objects that are too small to be seen by the naked eye. Microscopy is the science of investigating small objects and structures using a microscope. Microscopic means being invisible to the eye unless aided by a microscope. There are many types of microscopes, and they may be grouped in different ways.
Normal distribution
In statistics, a normal distribution or Gaussian distribution is a type of continuous probability distribution for a real-valued random variable. The general form of its probability density function is
$$f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^{2}}.$$
The parameter $\mu$ is the mean or expectation of the distribution (and also its median and mode), while the parameter $\sigma$ is its standard deviation. The variance of the distribution is $\sigma^{2}$. A random variable with a Gaussian distribution is said to be normally distributed, and is called a normal deviate.
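The density formula can be checked directly; the sketch below evaluates it at the peak and confirms numerically that it integrates to 1.

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of the normal distribution, as in the formula above."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2 * math.pi))

dx = 0.001
total = sum(normal_pdf(-8 + i * dx) * dx for i in range(int(16 / dx)))
print(normal_pdf(0.0))  # peak height 1/sqrt(2*pi) ~ 0.3989
print(total)            # ~1.0: the density integrates to one
```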
Donsker's theorem
In probability theory, Donsker's theorem (also known as Donsker's invariance principle, or the functional central limit theorem), named after Monroe D. Donsker, is a functional extension of the central limit theorem. Let $X_1, X_2, X_3, \ldots$ be a sequence of independent and identically distributed (i.i.d.) random variables with mean 0 and variance 1. Let $S_n := \sum_{i=1}^{n} X_i$. The stochastic process $S := (S_n)_{n \in \mathbb{N}}$ is known as a random walk. Define the diffusively rescaled random walk (partial-sum process) by
$$W^{(n)}(t) := \frac{S_{\lfloor nt \rfloor}}{\sqrt{n}}, \qquad t \in [0, 1].$$
The central limit theorem asserts that $W^{(n)}(1)$ converges in distribution to a standard Gaussian random variable $W(1)$ as $n \to \infty$.
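A simulation sketch of the rescaled walk: with Rademacher steps ($\pm 1$ with equal probability, so mean 0 and variance 1), the endpoint $W^{(n)}(1) = S_n / \sqrt{n}$ should look standard Gaussian for large $n$.

```python
import numpy as np

rng = np.random.default_rng(4)
n, trials = 1000, 100_000

# With +/-1 (Rademacher) steps, S_n = 2*Binomial(n, 1/2) - n, so the walk's
# endpoint can be sampled directly without storing every path.
S_n = 2.0 * rng.binomial(n, 0.5, size=trials) - n
endpoints = S_n / np.sqrt(n)             # W^(n)(1) for each trial

print(endpoints.mean())                  # ~0
print(endpoints.std())                   # ~1
print((np.abs(endpoints) < 1.96).mean()) # ~0.95, as for a standard normal
```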
Distance correlation
In statistics and in probability theory, distance correlation or distance covariance is a measure of dependence between two paired random vectors of arbitrary, not necessarily equal, dimension. The population distance correlation coefficient is zero if and only if the random vectors are independent. Thus, distance correlation measures both linear and nonlinear association between two random variables or random vectors. This is in contrast to Pearson's correlation, which can only detect linear association between two random variables.
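A minimal sketch of the (biased, V-statistic) sample distance correlation, showing it detecting a nonlinear dependence that Pearson correlation misses; this is an illustrative implementation, not a library API.

```python
import numpy as np

def _centered_dists(x):
    # Pairwise distance matrix, double-centered (row, column, and grand means).
    d = np.abs(x[:, None] - x[None, :])
    return d - d.mean(0) - d.mean(1)[:, None] + d.mean()

def distance_correlation(x, y):
    A, B = _centered_dists(x), _centered_dists(y)
    dcov2 = (A * B).mean()                         # squared distance covariance
    dvar_x, dvar_y = (A * A).mean(), (B * B).mean()
    return np.sqrt(dcov2 / np.sqrt(dvar_x * dvar_y))

rng = np.random.default_rng(5)
x = rng.uniform(-1, 1, size=2000)
y = x ** 2 + 0.05 * rng.normal(size=2000)    # nonlinear, non-monotone dependence

print(np.corrcoef(x, y)[0, 1])       # ~0: Pearson misses the dependence
print(distance_correlation(x, y))    # clearly > 0: distance correlation detects it
```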
Base pair
A base pair (bp) is a fundamental unit of double-stranded nucleic acids consisting of two nucleobases bound to each other by hydrogen bonds. They form the building blocks of the DNA double helix and contribute to the folded structure of both DNA and RNA. Dictated by specific hydrogen bonding patterns, "Watson–Crick" (or "Watson–Crick–Franklin") base pairs (guanine–cytosine and adenine–thymine) allow the DNA helix to maintain a regular helical structure that is subtly dependent on its nucleotide sequence.
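The pairing rules are simple enough to state in code; the sketch below maps each base to its Watson–Crick partner to derive the reverse complement of one DNA strand.

```python
# Watson-Crick pairing: guanine-cytosine and adenine-thymine.
WATSON_CRICK = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(strand):
    """Sequence of the partner strand, read 5' to 3'."""
    return "".join(WATSON_CRICK[base] for base in reversed(strand))

print(reverse_complement("GATTACA"))  # TGTAATC
```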