

    Supervised learning

    Supervised learning is a type of machine learning in which an algorithm makes predictions using a training dataset. A training dataset contains labelled examples: input variables paired with the output values they should map to. Typical applications include risk assessment, fraud detection, and spam filtering. The algorithm learns from this dataset to build a model, and the model can then predict output values for new inputs. A separate test dataset is used to validate the model. In general, a larger training dataset gives the model greater power to generalise to new data.
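    The train/validate workflow described above can be sketched in a few lines. This is a minimal illustration (the data and the use of scikit-learn's train_test_split are assumptions for the sketch, not part of the text): labelled data is split so the model learns from one part and is validated on the held-out part.

    ```python
    # Minimal sketch of the supervised-learning workflow:
    # learn from labelled training data, validate on a held-out test set.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.linear_model import LinearRegression

    x = np.arange(20).reshape(-1, 1)   # input variables (features)
    y = 3 * x.ravel() + 5              # labelled output variables

    # Hold out 25% of the labelled data as the test dataset.
    x_train, x_test, y_train, y_test = train_test_split(
        x, y, test_size=0.25, random_state=0)

    model = LinearRegression().fit(x_train, y_train)  # learn from training data
    score = model.score(x_test, y_test)               # validate on unseen data
    print(round(score, 2))
    ```

    Because the example data is exactly linear, the validation score (R²) comes out at 1.0; on real, noisy data it would be lower.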

    Thus, supervised learning algorithm can be written as:

    Y = f(x)

    Where Y is the predicted output value, computed by applying the mapping function f to an input value x. During training, the machine learning model learns this mapping function, which links input features to a predicted output. The figure shows the process of the supervised learning algorithm:

    [Figure: Supervised learning process]

    Further, supervised learning is classified into two different categories:

    [Figure: Categories of supervised learning]
    Regression Algorithm

    Regression is a mathematical predictive method in which the model finds a significant relationship between a dependent variable and one or more independent variables. The algorithm aims to predict a continuous number, such as sales, income, or test scores. The general regression equation can be written as:

    Yi = f(Xi, β) + ei

    Where, Yi = dependent variable, f = function, Xi = independent variable, β = unknown parameters, and ei = error terms.
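    To make the equation concrete, here is a small sketch (the data and parameter values are invented for illustration): given a candidate function f, the error terms ei are simply the gaps between the observed values Yi and the predicted values f(Xi).

    ```python
    # Illustration of Yi = f(Xi, beta) + ei:
    # the residuals ei are observed minus predicted values.
    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0])   # independent variable Xi
    y = np.array([2.1, 3.9, 6.2, 7.8])   # dependent variable Yi

    def f(x, beta0=0.0, beta1=2.0):      # assumed parameters, for illustration
        return beta0 + beta1 * x

    e = y - f(x)                         # error terms ei around the line y = 2x
    print(e)
    ```

    The printed residuals are small because the invented data lies close to the line y = 2x; regression algorithms choose the parameters β so as to make these residuals as small as possible overall.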

    There are many different types of regression algorithms. The three most common are listed below:

    Linear regression

    Linear regression is one of the simplest and most common machine learning algorithms. It is a statistical method that models the relationship between two variables, x and y. In other words, it shows how the values of the dependent variable (y) vary with the values of the independent variable (x).

    y = mx + c

    Where, y = dependent variable

    x = independent variable

    m = coefficient (slope) of x

    c = intercept

    Generally, on the basis of the number of input variables, linear regression is divided into two categories:

    • Simple linear regression:

      Only one independent variable is used.

    • Multiple linear regression:

      More than one independent variable is used.
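    The multiple-variable case can be sketched briefly before moving to the worked example below. In this sketch (the data is invented for illustration), two independent variables are used and the model learns one coefficient per variable:

    ```python
    # Minimal sketch of multiple linear regression:
    # two independent variables instead of one.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Each row is one observation: [x1, x2]
    X = np.array([[1, 2], [2, 1], [3, 4], [4, 3], [5, 6]])
    y = 2 * X[:, 0] + 3 * X[:, 1] + 1    # y depends on both variables

    model = LinearRegression().fit(X, y)
    print(model.coef_)                   # one coefficient per variable
    print(model.intercept_)
    ```

    Because the invented data follows y = 2·x1 + 3·x2 + 1 exactly, the fitted coefficients come out as approximately [2, 3] with intercept 1.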

    Now, let’s understand, with the help of an example, how to implement linear regression in Python (for instance, in a Jupyter notebook):

    Step 1: Import Python modules

    To import the required modules use:

    import numpy as np

    import matplotlib.pyplot as plt

    from sklearn.linear_model import LinearRegression

    Firstly, it is necessary to import the libraries: NumPy handles numerical calculations, Matplotlib plots and visualises the data, and scikit-learn (sklearn) provides the LinearRegression model.

    Step 2: To enter dataset

    x = np.array([1,3,5,7,9,11,13,15])

    y = np.array([2,4,6,8,10,12,14,16])

    Two variables have been defined, namely the dependent variable (y) and the independent variable (x).

    Step 3: Fit the model to the dataset

    linreg = LinearRegression()

    x = x.reshape(-1,1)

    linreg.fit(x,y)

    y_pred = linreg.predict(x)

    Here a LinearRegression model is created, and the input array is reshaped into a column vector because scikit-learn expects a 2-D feature array. Once the model is created, the fit function estimates its parameters from the data, and predict computes the fitted values.

    Step 4: Visualise the dataset

    plt.scatter(x,y)

    plt.plot(x, y_pred, color='red')

    plt.show()

    [Figure] Output: Linear Regression

    Besides, the values of the coefficient and the intercept can also be printed to check the correctness of the model.

    print(linreg.coef_)

    print(linreg.intercept_)

    For the above program, the coefficient is 1 and the intercept is 1.0, since every y value is exactly one more than the corresponding x value.
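    Once fitted, the same model can also predict outputs for inputs it has never seen. The sketch below continues the example above (the data is repeated so the snippet stands on its own):

    ```python
    # Using the fitted linear model to predict unseen inputs.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    x = np.array([1, 3, 5, 7, 9, 11, 13, 15]).reshape(-1, 1)
    y = np.array([2, 4, 6, 8, 10, 12, 14, 16])

    linreg = LinearRegression().fit(x, y)

    # Since y = x + 1 exactly, the fitted line reproduces that relationship
    # for new inputs as well.
    new_x = np.array([[17], [21]])
    print(linreg.predict(new_x))
    ```

    The predictions come out as 18 and 22, i.e. each new input plus one, which is exactly the relationship the model learned from the training data.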

    Polynomial regression

    Polynomial regression is a form of linear regression in which the independent variable is raised to powers greater than one. A linear model applied to a linear dataset gives a good outcome; but when the same simple linear regression model is applied, without modification, to a non-linear dataset, the outcome is poor: the loss function increases, the error rate is high, and the accuracy decreases. For such cases, where the data points are arranged in a non-linear fashion, we need the polynomial regression model.

    An equation for two variables is:

    y = b0 + b1x …… (1)

    Similarly, an equation for multiple values:

    y = b0 + b1x + b2x^2 + b3x^3 + … + bnx^n …… (2)

    Thus, Polynomial regression is:

    y = b0 + b1x + b2x^2 + … + bmx^m + residual error …… (3)

    Where, y=dependent variable

    x=independent variable

    b0, b1, …, bm = coefficients (b1 is the slope of the linear term)

    Now, let’s understand, with the help of an example, how to implement polynomial regression in Python (for instance, in a Jupyter notebook):

    Step 1: Import Python modules

    import numpy as np

    import pandas as pd

    import matplotlib.pyplot as plt

    from sklearn.linear_model import LinearRegression

    from sklearn.preprocessing import PolynomialFeatures

    As before, NumPy handles numerical calculations, pandas helps with data handling, Matplotlib visualises the data, and scikit-learn provides the LinearRegression model. In addition, PolynomialFeatures generates a new feature matrix containing all polynomial combinations of the input features up to the given degree; you can visualise the input being transformed into the matrix that PolynomialFeatures generates.
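    To see what that transformation looks like, here is a tiny standalone demonstration with degree 2 (the two input values are chosen arbitrarily for illustration):

    ```python
    # What PolynomialFeatures produces: for degree=2, each input
    # value x is expanded into the row [1, x, x^2].
    import numpy as np
    from sklearn.preprocessing import PolynomialFeatures

    x = np.array([[2], [3]])
    poly = PolynomialFeatures(degree=2)
    print(poly.fit_transform(x))
    ```

    The output rows are [1, 2, 4] and [1, 3, 9]: a constant column (for the intercept), the original value, and its square. A linear model fitted on this expanded matrix is what makes polynomial regression a form of linear regression.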

    Step 2: Input dataset

    x = np.array([-1.8,-0.8,-0.6,-0.5,-0.4,-0.2,0,0.2,0.4,0.5,0.6,0.8,1.8])

    y = np.array([-0.5,-0.75,-0.47,-0.98,-0.53,-0.92,0.1,0.6,0.69,0.96,-0.56,-0.93,-0.7])

    The dataset is stored in NumPy arrays.

    Step 3: Fit the model to the dataset

    x = x.reshape(-1,1)

    poly = PolynomialFeatures(degree=13)

    x_poly = poly.fit_transform(x)

    linreg = LinearRegression()

    linreg.fit(x_poly,y)

    y_pred = linreg.predict(x_poly)

    The fit_transform function takes the input array and produces the transformed feature matrix, which is then passed on to the fit function. Designing a polynomial model is therefore similar to designing a linear regression model: once the model has been fitted, it can be used for prediction.

    Step 4: Visualise the dataset

    plt.scatter(x,y,color='blue')

    plt.plot(x,y_pred,color='red')

    plt.show()

    Here the scatter plot places the data points on the horizontal and vertical axes and shows how one variable is affected by the other. The fitted polynomial curve is then drawn through the existing data points.

    [Figure] Output: Polynomial Regression
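    A quick numeric check of the fit can complement the plot. The sketch below (using scikit-learn's r2_score, an addition not in the steps above) repeats the example end to end and scores it:

    ```python
    # Scoring the polynomial fit from the steps above with R^2.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.metrics import r2_score

    x = np.array([-1.8,-0.8,-0.6,-0.5,-0.4,-0.2,0,
                  0.2,0.4,0.5,0.6,0.8,1.8]).reshape(-1, 1)
    y = np.array([-0.5,-0.75,-0.47,-0.98,-0.53,-0.92,0.1,
                  0.6,0.69,0.96,-0.56,-0.93,-0.7])

    poly = PolynomialFeatures(degree=13)
    x_poly = poly.fit_transform(x)
    linreg = LinearRegression().fit(x_poly, y)
    y_pred = linreg.predict(x_poly)

    # An R^2 near 1 means the curve passes close to every point. With
    # degree 13 and only 13 data points, the model can fit them almost
    # exactly -- which is also why so high a degree risks overfitting.
    print(round(r2_score(y, y_pred), 2))
    ```

    A very high R² here reflects near-interpolation of the training points rather than genuine predictive power; on held-out data such a model would likely score much worse.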

    Copyright 1999- Ducat Creative, All rights reserved.