Assumptions of Linear Regression and Testing the assumptions in Python
Assumptions of Linear Regression and Testing the assumptions in Python. We will discuss the key assumptions of Linear Regression and test them using Python code.
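As a taste of what such checks might look like, here is a minimal Python sketch (not the post's own code; the data and variable names are made up) that tests residual normality, homoscedasticity, and independence of errors with scipy and statsmodels:

# Minimal sketch: checking key regression assumptions on a hypothetical dataset.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy import stats
from statsmodels.stats.diagnostic import het_breuschpagan
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(0)
df = pd.DataFrame({"x": rng.normal(size=100)})
df["y"] = 2.0 * df["x"] + rng.normal(size=100)

X = sm.add_constant(df[["x"]])          # design matrix with an intercept column
model = sm.OLS(df["y"], X).fit()
residuals = model.resid

# Normality of residuals (Shapiro-Wilk test)
print("Shapiro-Wilk p-value:", stats.shapiro(residuals).pvalue)

# Homoscedasticity (Breusch-Pagan): a small p-value suggests heteroscedasticity
bp_stat, bp_pvalue, _, _ = het_breuschpagan(residuals, X)
print("Breusch-Pagan p-value:", bp_pvalue)

# Independence of errors (Durbin-Watson): values near 2 suggest no autocorrelation
print("Durbin-Watson statistic:", durbin_watson(residuals))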
Assumptions of Linear Regression and Testing the assumptions in R
Assumptions of Linear Regression and Testing the assumptions in R. We will discuss the key assumptions of Linear Regression and test them using R code.
No Intercept Linear Regression Model and RMSE
A linear regression model with intercept = 0, i.e. one whose regression equation passes through the origin, is called a No Intercept Linear Regression Model.
Linear Regression | No Intercept Linear Regression Model and RMSE in R
No Intercept Linear Regression Model and RMSE in R. We will learn how to estimate fitted values with the model and compute the error (RMSE).
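For readers who prefer Python, here is a minimal sketch of the same idea (the post itself uses R; the data here are simulated and the names are hypothetical): fit a regression through the origin and compute its RMSE.

# Minimal sketch: no-intercept regression and RMSE with scikit-learn.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(50, 1))
y = 3.5 * X.ravel() + rng.normal(scale=2.0, size=50)

# fit_intercept=False forces the fitted line through the origin
no_intercept_model = LinearRegression(fit_intercept=False).fit(X, y)
y_pred = no_intercept_model.predict(X)

rmse = np.sqrt(mean_squared_error(y, y_pred))
print("Slope:", no_intercept_model.coef_[0])
print("RMSE:", rmse)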
Variable Transformation in Machine Learning
In machine learning, we apply Variable Transformation to improve the fit of a regression model on the data and thereby improve model performance.
Linear Regression | Importance of Variable Transformation and R Code
Importance of Variable Transformation and R Code. I will explain how a small variable transformation can drastically improve model performance.
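To illustrate the idea, here is a minimal Python sketch on simulated data (not the post's own example, which uses R): a log transformation of an exponentially growing target turns a poor linear fit into a very good one.

# Minimal sketch: R-squared before and after a log transformation of the target.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
x = rng.uniform(1, 10, size=200)
y = np.exp(0.5 * x + rng.normal(scale=0.3, size=200))   # exponential relationship

X = sm.add_constant(x)
raw_fit = sm.OLS(y, X).fit()             # linear fit on the raw target
log_fit = sm.OLS(np.log(y), X).fit()     # linear fit on the log-transformed target

print("R-squared (raw y):            ", round(raw_fit.rsquared, 3))
print("R-squared (log-transformed y):", round(log_fit.rsquared, 3))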
Multicollinearity and Variance Inflation Factor
Multicollinearity occurs when two or more independent variables are highly intercorrelated. The Variance Inflation Factor (VIF) is a standard test for multicollinearity.
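A minimal Python sketch of the test, assuming statsmodels is available (the data and column names below are made up; VIF values well above roughly 5-10 are a common rule of thumb for problematic multicollinearity):

# Minimal sketch: compute the VIF of each predictor.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(3)
x1 = rng.normal(size=100)
x2 = 0.9 * x1 + rng.normal(scale=0.1, size=100)   # highly correlated with x1
x3 = rng.normal(size=100)
X = pd.DataFrame({"x1": x1, "x2": x2, "x3": x3})

X_const = sm.add_constant(X)
vif = pd.Series(
    [variance_inflation_factor(X_const.values, i) for i in range(1, X_const.shape[1])],
    index=X.columns,
)
print(vif)   # x1 and x2 should show large VIFs, x3 a VIF near 1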
Multiple Linear Regression & Adjusted R-Squared
This blog provides a detailed explanation of Multiple Linear Regression and Adjusted R-Squared with practical examples using Python & R code.
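A minimal Python sketch of the key point (hypothetical data, not the post's own example): Adjusted R-Squared penalises the number of predictors p via 1 - (1 - R^2) * (n - 1) / (n - p - 1), so adding a useless predictor does not inflate it the way it inflates plain R-Squared.

# Minimal sketch: R-squared vs Adjusted R-squared in a multiple regression.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 100
X = pd.DataFrame(rng.normal(size=(n, 3)), columns=["x1", "x2", "x3"])
y = 1.0 + 2.0 * X["x1"] - 0.5 * X["x2"] + rng.normal(size=n)   # x3 is pure noise

model = sm.OLS(y, sm.add_constant(X)).fit()
print("R-squared:         ", round(model.rsquared, 3))
print("Adjusted R-squared:", round(model.rsquared_adj, 3))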
“R Squared” Formula, Concept & Calculation in Regression
“R Squared” is a measure of how well the linear regression model fits the data. It is calculated as the ratio of Explained Variance to Total Variance.
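In symbols, using the standard definition (SS denotes a sum of squares):

R^2 = \frac{\text{Explained Variance}}{\text{Total Variance}} = \frac{SS_{\text{regression}}}{SS_{\text{total}}} = 1 - \frac{SS_{\text{residual}}}{SS_{\text{total}}}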