Bayes Classifier

Mahima Jain
Feb 1, 2021 · 2 min read


A Bayesian classifier is based on the idea that the role of a (natural) class
is to predict the values of features for members of that class. Examples are
grouped into classes because they have common values for the features. Such
classes are often called natural kinds. Here, the target feature corresponds
to a discrete class, which is not necessarily binary.

The idea behind a Bayesian classifier is that, if the class is known, the
values of the other features can be predicted. If the class is not known,
Bayes' rule can be used to predict the class given (some of) the feature
values. In a Bayesian classifier, the learning agent builds a probabilistic
model of the features and uses that model to predict the classification of a
new example.

Bayes' rule gives the posterior probability of class Ci given the observed feature values x:

P(Ci | x) = P(x | Ci) · P(Ci) / P(x)

where,
P(Ci | x) : Posterior probability, the conditional probability of class Ci
after the relevant evidence x is taken into account.
P(x | Ci) : Likelihood, the probability of observing the feature values x
given that the example belongs to class Ci.
P(Ci) : Prior, the probability of class Ci before any new evidence is taken
into account.
P(x) : Evidence, the overall probability of observing the feature values x.
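To make the formula concrete, here is a minimal worked sketch in Python; the spam scenario and all of the numbers are made-up assumptions for illustration, not from the article.

```python
# A worked Bayes' rule example with made-up numbers: suppose 30% of emails
# are spam, the word "offer" appears in 60% of spam emails and in 10% of
# non-spam emails, and we want P(spam | "offer" present).

p_spam = 0.30                 # prior P(C1)
p_ham = 0.70                  # prior P(C2)
p_offer_given_spam = 0.60     # likelihood P(x | C1)
p_offer_given_ham = 0.10      # likelihood P(x | C2)

# Evidence P(x): total probability of seeing the word "offer"
p_offer = p_offer_given_spam * p_spam + p_offer_given_ham * p_ham

# Posterior P(C1 | x) via Bayes' rule
p_spam_given_offer = p_offer_given_spam * p_spam / p_offer
print(round(p_spam_given_offer, 3))  # 0.72
```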

Bayes Classifier algorithm

We can implement a Bayes classifier by following the steps below (a small from-scratch sketch follows the list):
• Convert the data set into a frequency table.
• Create a likelihood table by computing the probabilities from those frequencies.
• Use the Naive Bayes equation to calculate the posterior probability
for each class. The class with the highest posterior probability is the
outcome of the prediction.
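The sketch below walks through those three steps from scratch for a single categorical feature. The tiny "outlook → play" data set and all variable names are assumptions chosen for illustration, not something prescribed by the article.

```python
from collections import Counter, defaultdict

# Training data: one categorical feature ("outlook") and a class ("play")
data = [("sunny", "no"), ("sunny", "no"), ("overcast", "yes"), ("rainy", "yes"),
        ("rainy", "yes"), ("rainy", "no"), ("overcast", "yes"), ("sunny", "no"),
        ("sunny", "yes"), ("rainy", "yes"), ("sunny", "yes"), ("overcast", "yes"),
        ("overcast", "yes"), ("rainy", "no")]

# Step 1: frequency table of feature value vs. class
freq = defaultdict(Counter)          # freq[class][feature_value]
class_counts = Counter()
for outlook, play in data:
    freq[play][outlook] += 1
    class_counts[play] += 1

# Step 2: likelihood table P(x | Ci) and priors P(Ci)
n = len(data)
priors = {c: class_counts[c] / n for c in class_counts}
likelihood = {c: {v: freq[c][v] / class_counts[c] for v in freq[c]}
              for c in freq}

# Step 3: posterior (up to the constant evidence P(x)) for a new example;
# the class with the highest score is the prediction. With several features,
# naive Bayes would multiply the per-feature likelihoods.
def predict(outlook):
    scores = {c: priors[c] * likelihood[c].get(outlook, 0.0) for c in priors}
    return max(scores, key=scores.get), scores

print(predict("sunny"))  # ('no', ...): P(no)*P(sunny|no) > P(yes)*P(sunny|yes)
```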

Pros and Cons of Bayes Classifier

Below are the pros of choosing a Bayes classifier:
• It is easy and fast to predict the class of a test data set, and it also
performs well in multi-class prediction.
• When the assumption of independence holds, a Naive Bayes classifier
performs better than other models such as logistic regression, and it
needs less training data.
• It performs well with categorical input variables compared to numerical
ones. For numerical variables, a normal distribution is assumed (a bell
curve, which is a strong assumption).
Below are the cons of choosing a Bayes classifier:
• If a categorical variable has a category in the test data set that was
not observed in the training data set, the model will assign it a zero
probability and will be unable to make a prediction (commonly fixed with
smoothing, as sketched after this list).
• Naive Bayes is also known to be a poor estimator, so the probability
outputs (e.g. from predict_proba in scikit-learn) should not be taken too
seriously.
• Another limitation of Naive Bayes is the assumption of independent
predictors, which rarely holds for real data.
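The zero-probability problem in the first point is usually handled with Laplace (add-one) smoothing, which scikit-learn's naive Bayes estimators expose as the alpha parameter. A minimal sketch, reusing the hypothetical freq and class_counts tables from the earlier example:

```python
# Laplace (add-one) smoothing: add 1 to every count so that a feature value
# never seen with a class gets a small non-zero probability instead of 0.
# `freq`, `class_counts` and `all_values` are assumed to come from a
# frequency table like the one sketched earlier.
def smoothed_likelihood(freq, class_counts, all_values):
    k = len(all_values)  # number of distinct values of this feature
    return {c: {v: (freq[c][v] + 1) / (class_counts[c] + k) for v in all_values}
            for c in class_counts}

# A value unseen in training, e.g. "foggy", now gets probability
# 1 / (class_counts[c] + k) rather than zero.
```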
