Review article
A review article is an article that summarizes the current state of understanding on a topic within a certain discipline. A review article is generally considered a secondary source since it may analyze and discuss the methods and conclusions of previously published studies. It resembles a survey article or, in news publishing, an overview article, which also surveys and summarizes previously published primary and secondary sources rather than reporting new facts and results.
Systematic review
A systematic review is a scholarly synthesis of the evidence on a clearly presented topic using critical methods to identify, define and assess research on the topic. A systematic review extracts and interprets data from published studies on the topic, then analyzes, describes, and summarizes interpretations into a refined conclusion. For example, a systematic review of randomized controlled trials is a way of summarizing and implementing evidence-based medicine.
Philosophical methodology
In its most common sense, philosophical methodology is the field of inquiry studying the methods used to do philosophy. But the term can also refer to the methods themselves. It may be understood in a wide sense as the general study of principles used for theory selection, or in a more narrow sense as the study of ways of conducting one's research and theorizing with the goal of acquiring philosophical knowledge.
Research
Research is "creative and systematic work undertaken to increase the stock of knowledge". It involves the collection, organization and analysis of evidence to increase understanding of a topic, characterized by a particular attentiveness to accounting for and controlling sources of bias and error. A research project may be an expansion on past work in the field. To test the validity of instruments, procedures, or experiments, research may replicate elements of prior projects or the project as a whole.
Peer review
Peer review is the evaluation of work by one or more people with similar competencies as the producers of the work (peers). It functions as a form of self-regulation by qualified members of a profession within the relevant field. Peer review methods are used to maintain quality standards, improve performance, and provide credibility. In academia, scholarly peer review is often used to determine an academic paper's suitability for publication. Peer review can be categorized by the type of activity and by the field or profession in which the activity occurs.
Machine learning
Machine learning (ML) is an umbrella term for solving problems for which developing algorithms by human programmers would be cost-prohibitive; instead, such problems are solved by helping machines 'discover' their 'own' algorithms, without needing to be explicitly told what to do by any human-developed algorithm. Recently, generative artificial neural networks have surpassed the results of many previous approaches.
Methodology
In its most common sense, methodology is the study of research methods. However, the term can also refer to the methods themselves or to the philosophical discussion of associated background assumptions. A method is a structured procedure for bringing about a certain goal, like acquiring knowledge or verifying knowledge claims. This normally involves various steps, like choosing a sample, collecting data from this sample, and interpreting the data. The study of methods concerns a detailed description and analysis of these processes.
Cochrane Library
The Cochrane Library (named after Archie Cochrane) is a collection of databases in medicine and other healthcare specialties provided by Cochrane and other organizations. At its core is the collection of Cochrane Reviews, a database of systematic reviews and meta-analyses which summarize and interpret the results of medical research. The Cochrane Library aims to make the results of well-conducted controlled trials readily available and is a key resource in evidence-based medicine.
Literature review
A literature review is an overview of previously published works on a topic. The term can refer to a full scholarly paper or to a section of a scholarly work such as a book or an article. Either way, a literature review is supposed to provide the researcher/author and the audience with a general picture of the existing knowledge on the topic in question. A good literature review can ensure that a proper research question has been asked and a proper theoretical framework and/or research methodology have been chosen.
Islamic studies
Islamic studies refers to the academic study of Islam, and more generally to academic multidisciplinary "studies" programs in which scholars from diverse disciplines (history, culture, literature, art) participate and exchange ideas pertaining to the particular field of study. Such programs resemble those that focus on the history, texts and theologies of other religious traditions, such as Eastern Christian Studies or Jewish Studies, as well as fields like environmental studies, Middle East studies, race studies, and urban studies.
Jewish studies
Jewish studies (or Judaic studies; madey ha-yahadut) is an academic discipline centered on the study of Jews and Judaism. Jewish studies is interdisciplinary and combines aspects of history (especially Jewish history), Middle Eastern studies, Asian studies, Oriental studies, religious studies, archeology, sociology, languages (Jewish languages), political science, area studies, women's studies, and ethnic studies. Jewish studies as a distinct field is mainly present at colleges and universities in North America.
Data extraction
Data extraction is the act or process of retrieving data out of (usually unstructured or poorly structured) data sources for further data processing or data storage (data migration). The import into the intermediate extracting system is thus usually followed by data transformation and possibly the addition of metadata prior to export to another stage in the data workflow. Usually, the term data extraction is applied when (experimental) data is first imported into a computer from primary sources, like measuring or recording devices.
Knowledge extraction
Knowledge extraction is the creation of knowledge from structured (relational databases, XML) and unstructured (text, documents, images) sources. The resulting knowledge needs to be in a machine-readable and machine-interpretable format and must represent knowledge in a manner that facilitates inferencing. Although it is methodically similar to information extraction (NLP) and ETL (data warehouse), the main criterion is that the extraction result goes beyond the creation of structured information or the transformation into a relational schema.
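A minimal sketch of the target representation, using made-up example facts (a simplified in-memory triple store, not a full RDF or OWL toolchain): the extracted knowledge is held as subject-predicate-object triples so that simple inference rules can be applied mechanically.

# Toy knowledge-extraction sketch: facts as subject-predicate-object triples,
# a machine-interpretable form that supports simple inference.
# Simplified illustration; not a full RDF toolchain.
triples = {
    ("Cochrane Library", "is_a", "database collection"),
    ("systematic review", "is_a", "review article"),
    ("review article", "is_a", "secondary source"),
}

def infer_is_a(kb):
    # Repeatedly add transitive facts: if (A, is_a, B) and (B, is_a, C), add (A, is_a, C).
    changed = True
    while changed:
        new = {(a, "is_a", c)
               for (a, p1, b) in kb for (b2, p2, c) in kb
               if p1 == "is_a" and p2 == "is_a" and b == b2}
        new -= kb
        changed = bool(new)
        kb |= new
    return kb

print(infer_is_a(triples))
# Includes the inferred fact ('systematic review', 'is_a', 'secondary source').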
Maximum likelihood estimation
In statistics, maximum likelihood estimation (MLE) is a method of estimating the parameters of an assumed probability distribution, given some observed data. This is achieved by maximizing a likelihood function so that, under the assumed statistical model, the observed data is most probable. The point in the parameter space that maximizes the likelihood function is called the maximum likelihood estimate. The logic of maximum likelihood is both intuitive and flexible, and as such the method has become a dominant means of statistical inference.
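As a sketch of the standard formulation (the density notation f(x | theta) is introduced here for illustration and is not tied to any particular model): for independent observations x_1, ..., x_n, the likelihood is the joint probability of the data viewed as a function of the parameter, and the maximum likelihood estimate is the parameter value that maximizes it, or equivalently its logarithm:

L(\theta; x_1, \dots, x_n) = \prod_{i=1}^{n} f(x_i \mid \theta),
\qquad
\hat{\theta}_{\mathrm{MLE}} = \arg\max_{\theta} L(\theta; x_1, \dots, x_n)
                            = \arg\max_{\theta} \sum_{i=1}^{n} \log f(x_i \mid \theta).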
Automated machine learning
Automated machine learning (AutoML) is the process of automating the tasks of applying machine learning to real-world problems. AutoML potentially includes every stage from beginning with a raw dataset to building a machine learning model ready for deployment. AutoML was proposed as an artificial intelligence-based solution to the growing challenge of applying machine learning. The high degree of automation in AutoML aims to allow non-experts to make use of machine learning models and techniques without requiring them to become experts in machine learning.
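A minimal sketch of the underlying idea, assuming scikit-learn and a synthetic dataset (a hand-rolled search, not any particular AutoML system): evaluate a small space of candidate models and hyperparameters with cross-validation and keep the best performer.

# Sketch of AutoML-style model selection: search a small space of models and
# hyperparameters with cross-validation and keep the best one.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Candidate (name, model) configurations to evaluate.
candidates = [
    ("logreg C=0.1", LogisticRegression(C=0.1, max_iter=1000)),
    ("logreg C=1.0", LogisticRegression(C=1.0, max_iter=1000)),
    ("random forest, 50 trees", RandomForestClassifier(n_estimators=50, random_state=0)),
    ("random forest, 200 trees", RandomForestClassifier(n_estimators=200, random_state=0)),
]

best_name, best_score = None, float("-inf")
for name, model in candidates:
    score = cross_val_score(model, X, y, cv=5).mean()  # mean accuracy across folds
    if score > best_score:
        best_name, best_score = name, score

print(f"selected: {best_name} (cv accuracy {best_score:.3f})")

Real AutoML systems extend this loop to data preprocessing, feature engineering, much larger search spaces, and smarter search strategies such as Bayesian optimization.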
Academic publishing
Academic publishing is the subfield of publishing which distributes academic research and scholarship. Most academic work is published in academic journal articles, books or theses. The part of academic written output that is not formally published but merely printed up or posted on the Internet is often called "grey literature". Most scientific and scholarly journals, and many academic and scholarly books, though not all, are based on some form of peer review or editorial refereeing to qualify texts for publication.
Selection bias
Selection bias is the bias introduced by the selection of individuals, groups, or data for analysis in such a way that proper randomization is not achieved, thereby failing to ensure that the sample obtained is representative of the population intended to be analyzed. It is sometimes referred to as the selection effect. The phrase "selection bias" most often refers to the distortion of a statistical analysis resulting from the method of collecting samples. If the selection bias is not taken into account, then some conclusions of the study may be false.
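A small simulation can make the distortion concrete (the population and selection rule below are invented for illustration): when individuals with larger values are more likely to enter the sample, the sample mean overestimates the population mean.

# Illustrative simulation of selection bias: a sampling rule that favors large
# values yields an upwardly biased estimate of the population mean.
import random

random.seed(0)
population = [random.gauss(50, 10) for _ in range(100_000)]  # true mean is about 50

# Biased selection: individuals with larger values are more likely to be sampled.
biased_sample = [x for x in population if random.random() < x / 100]
# Proper randomization: every individual has the same chance of selection.
random_sample = random.sample(population, len(biased_sample))

print(f"true mean:          {sum(population) / len(population):.2f}")
print(f"random-sample mean: {sum(random_sample) / len(random_sample):.2f}")
print(f"biased-sample mean: {sum(biased_sample) / len(biased_sample):.2f}  (overestimates)")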
Information extraction
Information extraction (IE) is the task of automatically extracting structured information from unstructured and/or semi-structured machine-readable documents and other electronically represented sources. In most cases this activity concerns processing human language texts by means of natural language processing (NLP). Recent activities in multimedia document processing, like automatic annotation and content extraction out of images/audio/video/documents, could be seen as information extraction. Due to the difficulty of the problem, current approaches to IE (as of 2010) focus on narrowly restricted domains.
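As a toy illustration of the task (the sentences and the pattern are invented; production IE systems rely on NLP pipelines and trained models rather than hand-written patterns), the sketch below pulls structured records out of free text.

# Toy information-extraction sketch: turn free text into structured records
# with a hand-written pattern. Real IE systems use NLP pipelines and trained models.
import re

text = (
    "Alice Smith joined Acme Corp in 2019. "
    "Bob Jones joined Widget Ltd in 2021."
)

# Pattern: <person> joined <organization> in <year>
pattern = re.compile(
    r"(?P<person>[A-Z]\w+ [A-Z]\w+) joined (?P<org>[A-Z][\w ]+?) in (?P<year>\d{4})"
)

records = [m.groupdict() for m in pattern.finditer(text)]
for r in records:
    print(r)
# {'person': 'Alice Smith', 'org': 'Acme Corp', 'year': '2019'}
# {'person': 'Bob Jones', 'org': 'Widget Ltd', 'year': '2021'}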
Naturalism (philosophy)
In philosophy, naturalism is the idea or belief that only natural laws and forces (as opposed to supernatural ones) operate in the universe. According to philosopher Steven Lockwood, naturalism can be separated into an ontological sense and a methodological sense. "Ontological" refers to ontology, the philosophical study of what exists. On an ontological level, philosophers often treat naturalism as equivalent to materialism. For example, philosopher Paul Kurtz argues that nature is best accounted for by reference to material principles.
Data scraping
Data scraping is a technique whereby a computer program extracts data from human-readable output coming from another program. Normally, data transfer between programs is accomplished using data structures suited for automated processing by computers, not people. Such interchange formats and protocols are typically rigidly structured, well-documented, easily parsed, and minimize ambiguity. Very often, these transmissions are not human-readable at all.
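A minimal sketch of the technique (the report text is invented for illustration): fields are recovered from output that was formatted for human readers, such as a fixed-width report, rather than read from a structured interchange format.

# Minimal data-scraping sketch: recover structured fields from output that was
# formatted for human readers (the report text here is invented).
import re

report = """\
Name            Dept        Salary
Alice Smith     Research     83,000
Bob Jones       Sales        61,500
"""

rows = []
for line in report.splitlines()[1:]:  # skip the header row
    m = re.match(r"(.+?)\s{2,}(\S+)\s+([\d,]+)", line)
    if m:
        name, dept, salary = m.groups()
        rows.append({"name": name.strip(), "dept": dept,
                     "salary": int(salary.replace(",", ""))})

print(rows)
# [{'name': 'Alice Smith', 'dept': 'Research', 'salary': 83000},
#  {'name': 'Bob Jones', 'dept': 'Sales', 'salary': 61500}]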