Classification Algorithm
Classification is a supervised learning approach in which the model learns from labelled input data and then uses what it has learned to classify new observations. The predicted classes can be binary or multi-class. Examples include speech recognition, biometric identification, and document classification. An algorithm that implements classification on a dataset is known as a classifier. There are three types of classification, as shown in the figure:
Binary classification is a supervised learning task in which the training dataset is labelled with exactly two classes. In many practical binary classification problems the two kinds of error are not symmetric, so what matters is not overall accuracy alone but the cost attached to each type of failure. For example, in medical testing, a false positive (reporting a virus when it is not present) is treated very differently from a false negative (failing to detect a virus when it is present).
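To make the asymmetry concrete, the sketch below counts the four outcome types of a binary test separately instead of lumping them into a single accuracy figure. The label vectors are made-up illustrative data (1 = virus present, 0 = absent), not from any real study:

```python
# Hypothetical medical-test labels: 1 = virus present, 0 = absent.
y_true = [1, 1, 0, 0, 0, 1, 0, 0]   # ground truth
y_pred = [1, 0, 0, 1, 0, 1, 0, 0]   # classifier output

# Count each outcome type separately, since their costs differ.
tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))  # hits
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))  # missed virus
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))  # false alarm
tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))  # correct rejections

accuracy = (tp + tn) / len(y_true)
print(tp, fn, fp, tn, accuracy)  # 2 1 1 4 0.75
```

Here one false negative and one false positive contribute equally to the 0.75 accuracy, yet in a medical setting the missed virus (fn) is usually the far costlier mistake.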
Zero-R is the simplest method: it relies only on the target and ignores all of the features. It always predicts the most frequent class. Because of this, Zero-R has no real predictive power; it is useful only for establishing a baseline performance against which other classifiers can be compared.
To apply it, build a frequency table for the target class and choose its most frequently occurring value.
The features contribute nothing in Zero-R, since the algorithm does not use any of them.
Zero-R can only classify instances of the majority class correctly. As mentioned earlier, it serves purely as a baseline for evaluating other classification systems.
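The Zero-R procedure described above can be sketched in a few lines; the spam/ham labels are illustrative placeholder data:

```python
from collections import Counter

def zero_r_fit(y_train):
    # Zero-R ignores every feature: it just memorises the majority class.
    return Counter(y_train).most_common(1)[0][0]

def zero_r_predict(majority_class, X_test):
    # Predict the same (majority) class for every test instance.
    return [majority_class] * len(X_test)

y_train = ["spam", "ham", "ham", "ham", "spam"]
majority = zero_r_fit(y_train)              # "ham" occurs 3 times out of 5
preds = zero_r_predict(majority, [[0], [1]])
print(majority, preds)  # ham ['ham', 'ham']
```

Any classifier worth using should beat this baseline: here Zero-R is right 60% of the time on the training data simply because 60% of the labels are "ham".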
The One-R ("One Rule") classifier is a simple yet accurate algorithm that generates one rule for each predictor in the dataset, then selects the rule with the smallest total error as its "one rule".
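A minimal sketch of One-R over categorical features follows. For each feature it maps every value to that value's majority class, counts the resulting misclassifications, and keeps the feature whose rule makes the fewest errors. The tiny weather-style dataset is invented for illustration:

```python
from collections import Counter, defaultdict

def one_r(X, y):
    # X: rows of categorical feature values; y: class labels.
    best_feature, best_rule, best_errors = None, None, len(y) + 1
    for f in range(len(X[0])):
        # Count class frequencies for each value of feature f.
        value_classes = defaultdict(Counter)
        for row, label in zip(X, y):
            value_classes[row[f]][label] += 1
        # One rule: each feature value predicts its majority class.
        rule = {v: c.most_common(1)[0][0] for v, c in value_classes.items()}
        # Errors = instances not belonging to their value's majority class.
        errors = sum(sum(c.values()) - c[rule[v]]
                     for v, c in value_classes.items())
        if errors < best_errors:
            best_feature, best_rule, best_errors = f, rule, errors
    return best_feature, best_rule, best_errors

X = [["sunny", "hot"], ["sunny", "mild"], ["rain", "mild"], ["rain", "hot"]]
y = ["no", "no", "yes", "yes"]
feature, rule, errors = one_r(X, y)
print(feature, rule, errors)  # 0 {'sunny': 'no', 'rain': 'yes'} 0
```

In this toy example the first feature (outlook) separates the classes perfectly, so One-R picks it with zero training errors, while the second feature's rule would misclassify two instances.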