In the following article, I introduce some of the basic concepts of Bayesian networks, which are rooted in probability theory. I then continue with the definitions of probability theory, joint probability distributions, and the graph theory underlying Bayesian networks.
In the following post, we will learn about the nature of convolutional neural networks and their layers, along with a short example of defining transfer learning models in Python using the Keras library.
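As a rough illustration of the kind of transfer learning model the post discusses, here is a minimal Keras sketch. The choice of VGG16 as the pre-trained base, the input shape, and the binary output head are assumptions made for this example, not details taken from the post.

```python
# Minimal transfer-learning sketch with Keras (base model and shapes are illustrative).
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

# Load a pre-trained convolutional base without its classification head.
base_model = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base_model.trainable = False  # freeze the convolutional layers

# Stack a new, trainable classification head on top of the frozen base.
model = models.Sequential([
    base_model,
    layers.GlobalAveragePooling2D(),
    layers.Dense(128, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # hypothetical binary output
])

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```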
The feature selection process is divided into supervised (considering the target feature) and unsupervised (ignoring the target) approaches. Moreover, the supervised approach includes various strategies, for instance filter, wrapper, and embedded methods.
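To make the supervised/unsupervised distinction concrete, here is a small sketch using scikit-learn. The synthetic data and the parameter choices (k=5, variance threshold of 0.1) are assumptions for illustration only.

```python
# Contrasting supervised and unsupervised feature selection with scikit-learn.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif, VarianceThreshold

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Supervised: score each feature against the target and keep the best k.
supervised = SelectKBest(score_func=f_classif, k=5)
X_supervised = supervised.fit_transform(X, y)

# Unsupervised: drop low-variance features without looking at the target.
unsupervised = VarianceThreshold(threshold=0.1)
X_unsupervised = unsupervised.fit_transform(X)

print(X_supervised.shape, X_unsupervised.shape)
```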
A Bayesian network is a graphical model, specifically a directed acyclic graph (DAG), that is based on probability theory and captures the conditional dependencies between variables and events (the chance of an event occurring).
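The key consequence of the DAG structure is that the joint probability distribution factorizes over each variable's parents, which is the standard way these conditional dependencies are written:

```latex
P(X_1, \dots, X_n) = \prod_{i=1}^{n} P\big(X_i \mid \mathrm{Parents}(X_i)\big)
```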
If we want to solve this problem with machine learning models, we need a clean dataset with class labels, which makes it a supervised binary classification problem. In this study, I introduce an example of this problem and try to predict resistance with a classifier.
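The setup above translates into a very standard classification pipeline. The sketch below assumes a hypothetical CSV file and a binary "resistant" label column, and uses logistic regression only as a placeholder classifier; none of these specifics come from the study itself.

```python
# Hedged sketch of the supervised binary-classification setup described above.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical clean, labeled dataset with a binary "resistant" column.
df = pd.read_csv("resistance_data.csv")  # placeholder file name
X = df.drop(columns=["resistant"])
y = df["resistant"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)
print("Test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```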