Bayes' Theorem in Statistics

 

Bayes' Theorem


Bayes' theorem:

  • It is named after the 18th-century British mathematician Thomas Bayes.
  • It is a mathematical formula for determining conditional probability.
  • It shows the relationship between one conditional probability and its inverse. 


Conditional probability is the likelihood of an outcome occurring, given that another outcome has already occurred. Bayes' theorem shows the relationship between a conditional probability and its inverse.
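
Formally, for two events A and B with P(B) > 0, the conditional probability of A given B is:

P(A|B) = P(A ∩ B) / P(B)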

Related terms in Bayes' theorem

  • P(A): The probability of event A on its own, without regard to event B. This is also called the prior probability of A.
  • P(B): The probability of event B on its own, without regard to event A. This is also called the prior probability of B.
  • P(B|A): The conditional probability of B given A. This is also called the likelihood.
  • P(A|B): The conditional probability of A given B. This is also called the posterior probability.
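
Putting these terms together, Bayes' theorem states:

P(A|B) = P(B|A) × P(A) / P(B)

In words: the posterior probability of A given B equals the likelihood times the prior probability of A, divided by the prior probability of B.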

Bayes' Theorem Example

Suppose we want to find a patient’s probability of having liver disease given that they are an alcoholic. “Being an alcoholic” serves as the test (a kind of litmus test) for liver disease.
  • A could mean the event “Patient has liver disease.” Past data tells you that 10% of patients entering your clinic have liver disease. P(A) = 0.10.
  • B could mean the event “Patient is an alcoholic.” Five percent of the clinic’s patients are alcoholics. P(B) = 0.05.
  • You might also know that among those patients diagnosed with liver disease, 7% are alcoholics. This is your P(B|A): the probability that a patient is an alcoholic, given that they have liver disease. P(B|A) = 0.07.

Solution:

P(A|B) = P(B|A) × P(A) / P(B) = (0.07 × 0.10) / 0.05 = 0.14
In other words, if the patient is an alcoholic, their chance of having liver disease is 0.14 (14%). This is a large increase from the 10% suggested by past data. But it’s still unlikely that any particular patient has liver disease.
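
The same arithmetic can be checked with a few lines of Python. This is just a minimal sketch; the function name posterior is illustrative and not part of any library:

def posterior(p_a, p_b, p_b_given_a):
    # Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
    return p_b_given_a * p_a / p_b

# Values from the clinic example above
p_liver = 0.10            # P(A): patient has liver disease
p_alcoholic = 0.05        # P(B): patient is an alcoholic
p_alc_given_liver = 0.07  # P(B|A): alcoholic, given liver disease

print(f"{posterior(p_liver, p_alcoholic, p_alc_given_liver):.2f}")  # prints 0.14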

