"In probability theory and statistics, Bayes' theorem (alternatively Bayes' law or Bayes' rule) relates current to prior belief. It also relates current to prior evidence. It is important in the mathematical manipulation of conditional probabilities. Bayes's rule can be derived from more basic axioms of probability, specifically conditional probability.
When applied, the probabilities involved in Bayes' theorem may have any of a number of probability interpretations. In one of these interpretations, the theorem is used directly as part of a particular approach to statistical inference. In particular, with the Bayesian interpretation of probability, the theorem expresses how a subjective degree of belief should rationally change to account for evidence: this is Bayesian inference, which is fundamental to Bayesian statistics. However, Bayes' theorem has applications in a wide range of calculations involving probabilities, not just in Bayesian inference.
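As a sketch of how a degree of belief is revised by evidence, the following example applies the rule to a hypothetical diagnostic test; the prevalence and accuracy figures are illustrative assumptions, not values from the source.

```python
# Illustrative Bayesian update for a hypothetical diagnostic test.
# All numbers below are assumptions chosen for the example.

prior = 0.01            # P(disease): prior degree of belief that a person has the disease
sensitivity = 0.95      # P(positive test | disease)
false_positive = 0.05   # P(positive test | no disease)

# Total probability of observing a positive test (the evidence).
p_positive = sensitivity * prior + false_positive * (1 - prior)

# Bayes' theorem: posterior belief after observing a positive test.
posterior = sensitivity * prior / p_positive

print(f"P(disease | positive test) = {posterior:.3f}")  # ~0.161
```

Even with an accurate test, the posterior remains modest because the prior probability of disease is low; the update weighs the evidence against that prior.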
Bayes' theorem is named after Rev. Thomas Bayes (/ˈbeɪz/; 1701–1761), who first showed how to use new evidence to update beliefs. Bayes' unpublished manuscript was significantly edited by Richard Price before it was posthumously read at the Royal Society. Bayes' result remained little known until it was independently rediscovered and further developed by Pierre-Simon Laplace, who first published the modern formulation in his 1812 Théorie analytique des probabilités.
Sir Harold Jeffreys put Bayes' theorem and Laplace's formulation on an axiomatic basis. Jeffreys wrote that Bayes' theorem "is to the theory of probability what Pythagoras's theorem is to geometry".