
Bayesian statistics

Bayesian statistics is a mathematical approach that applies the rules of probability to combine sample data with prior, or existing, information. By employing both sources, Bayesian statistics aims to update subjective beliefs in light of new evidence (the sample data) and so produce inferences that can be more precise than either source of information would yield on its own. The goal of Bayesian inference is to provide a rational, mathematically sound procedure for combining prior beliefs with the evidence at hand to produce an updated posterior (or later) belief. Bayesian inference thus allows beliefs to be adjusted continually as new data arrive.
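In symbols, the update follows Bayes' theorem: the posterior is proportional to the likelihood of the data times the prior. As a minimal, purely illustrative sketch (in Python, with all numbers assumed for the example), consider estimating the probability that a coin lands heads, where a Beta prior combined with binomial data gives a closed-form update:

    # A minimal sketch of Bayesian updating: a coin-flip example with a Beta prior
    # on the unknown heads probability. All numbers here are assumed for illustration.
    prior_alpha, prior_beta = 2.0, 2.0   # prior belief: the coin is probably roughly fair
    heads, tails = 7, 3                  # new evidence: 10 observed flips

    # With a Beta prior and binomial data, the posterior is again a Beta distribution.
    post_alpha = prior_alpha + heads
    post_beta = prior_beta + tails

    prior_mean = prior_alpha / (prior_alpha + prior_beta)
    post_mean = post_alpha / (post_alpha + post_beta)

    print(f"prior mean:     {prior_mean:.3f}")   # 0.500
    print(f"posterior mean: {post_mean:.3f}")    # 0.643, belief shifted toward the data

Each time more flips are observed, the posterior from one round can serve as the prior for the next, which is exactly the continual adjustment described above.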

To understand Bayesian inference, it helps first to view statistical inference in general as the process of deducing properties of an underlying, larger population distribution from the analysis of smaller sample data. The techniques include collecting and analyzing sample data, applying the rules of probability (and possibly other information), and from these deriving estimates, such as the mean and variance of the population, as well as performing hypothesis tests.
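As a small sketch of that classical workflow, with sample values assumed purely for demonstration, estimating a population mean and variance takes only a few lines:

    import statistics

    # Illustrative sample values, assumed purely for demonstration.
    sample = [4.8, 5.1, 5.4, 4.9, 5.2, 5.0, 5.3, 4.7]

    mean_estimate = statistics.mean(sample)          # estimate of the population mean
    variance_estimate = statistics.variance(sample)  # sample variance as an estimate of the population variance

    print(f"estimated mean:     {mean_estimate:.3f}")
    print(f"estimated variance: {variance_estimate:.3f}")

A hypothesis test would then ask, for example, whether the population mean plausibly equals some stated value given these estimates.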

In Bayesian inference, probability is interpreted as a measure of believability or confidence that an individual may have about the occurrence of a specific event. Bayesian statistics contrasts with classical, or frequentist, statistics, which assumes that probabilities are the frequency of random events that happen in a long run of repeated trials. While frequentist statistics aims to eliminate uncertainty by providing estimates without prior information, Bayesian statistics tries to retain and refine uncertainty by adjusting individual beliefs as a consequence of new evidence.
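The difference is easiest to see on a tiny data set. The sketch below, with invented numbers, contrasts the frequentist relative-frequency estimate with a Bayesian posterior mean under a uniform prior:

    # Assume 3 heads are observed in 3 flips of a coin of unknown bias.
    heads, n = 3, 3

    # Frequentist point estimate: the observed relative frequency (maximum likelihood).
    freq_estimate = heads / n                    # 1.0, i.e. "the coin always lands heads"

    # Bayesian estimate: start from a uniform Beta(1, 1) prior and report the posterior mean.
    post_alpha, post_beta = 1 + heads, 1 + (n - heads)
    bayes_estimate = post_alpha / (post_alpha + post_beta)   # 0.8, some uncertainty is retained

    print(freq_estimate, bayes_estimate)

With only three observations the frequentist estimate commits fully to what was seen, while the Bayesian estimate is pulled back toward the prior, reflecting how little evidence there is.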

At the heart of Bayesian inference is Bayesian logic, named for Thomas Bayes, an 18th-century clergyman and mathematician. Bayesian logic can be applied to decision making and to inferential statistics concerned with probability inference: using knowledge of prior events to predict future events.

Although Bayesian theory has roots in the 18th century, the concept took flight in the mid-20th century and has become more popular in recent decades, with applications including animal breeding in the 1950s, educational measurement in the 1960s and 1970s, spatial statistics in the 1980s, and marketing and political science in the 1990s.

Today, Bayesian models and methods developed in various fields, from financial forecasting to medical research, are widely considered useful. Bayesian inference has applications in artificial intelligence (AI) and expert systems, as well as in algorithms for identifying email spam.
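As one illustration of the spam-filtering application, a naive Bayes classifier scores a message under each class by combining prior class probabilities with per-word likelihoods; the word counts, priors, and message below are invented for the example, not taken from any real data set:

    import math

    # Assumed training counts: how often each word appeared in spam vs. legitimate mail.
    spam_counts = {"free": 20, "winner": 15, "meeting": 1, "report": 2}
    ham_counts  = {"free": 3,  "winner": 1,  "meeting": 25, "report": 20}
    prior_spam, prior_ham = 0.4, 0.6   # assumed prior probabilities of each class

    def log_score(words, counts, prior):
        # Log of the prior times the (smoothed) word likelihoods; the shared
        # normalizing constant in Bayes' theorem is ignored, since only the
        # comparison between classes matters.
        total = sum(counts.values())
        score = math.log(prior)
        for w in words:
            # Laplace smoothing so an unseen word does not zero out the probability.
            score += math.log((counts.get(w, 0) + 1) / (total + len(counts)))
        return score

    message = ["free", "winner", "report"]
    spam_score = log_score(message, spam_counts, prior_spam)
    ham_score = log_score(message, ham_counts, prior_ham)
    print("spam" if spam_score > ham_score else "ham")   # prints "spam" for this message

Real spam filters train these counts on large labeled corpora, but the underlying update is the same application of Bayes' theorem.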

