Learn from labeled data to make predictions on new examples
Interactive demonstration of the linear regression algorithm finding the best-fit line through data points.
Objective: Find the best line (y = mx + b) that minimizes the sum of squared residuals.
Method: Uses gradient descent or the normal equation to optimize the parameters m and b (see the sketch below).
Use Cases: Predicting house prices, forecasting stock prices, temperature forecasting.
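As a rough sketch of the method described above, the snippet below fits y = mx + b to synthetic data with plain gradient descent and cross-checks the result against the normal equation. The data, learning rate, and iteration count are illustrative assumptions, not the demo's actual settings.

    # Minimal sketch: fit y = m*x + b by gradient descent on synthetic data (assumed, not the demo's data)
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(0, 10, size=100)
    y = 2.5 * x + 1.0 + rng.normal(scale=2.0, size=100)  # noisy line

    m, b = 0.0, 0.0
    lr = 0.01                     # learning rate (illustrative choice)
    for _ in range(2000):
        error = m * x + b - y
        # Gradients of the mean squared error with respect to m and b
        grad_m = 2 * np.mean(error * x)
        grad_b = 2 * np.mean(error)
        m -= lr * grad_m
        b -= lr * grad_b

    print(f"gradient descent: y = {m:.2f}x + {b:.2f}")

    # Closed-form cross-check via the normal equation (least squares)
    X = np.column_stack([x, np.ones_like(x)])
    m_ne, b_ne = np.linalg.lstsq(X, y, rcond=None)[0]
    print(f"normal equation:  y = {m_ne:.2f}x + {b_ne:.2f}")

Both routes should recover roughly the same slope and intercept; gradient descent simply reaches them iteratively instead of in one closed-form step.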
Visualize how different classification algorithms separate data into classes; a minimal comparison sketch follows the list below.
Logistic Regression: Uses the sigmoid function to model the probability of class membership.
SVM: Finds the optimal hyperplane that maximally separates the classes.
K-NN: Classifies based on the majority vote of the k nearest neighbors.
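For reference, a minimal comparison of the three classifiers on a toy 2-D dataset might look like the following. scikit-learn is assumed here purely for illustration; the interactive demo may use its own implementations.

    # Sketch: fit the three classifiers listed above on a toy 2-D dataset (scikit-learn assumed)
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.svm import SVC
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=300, n_features=2, n_redundant=0,
                               n_informative=2, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    models = {
        "Logistic Regression": LogisticRegression(),        # sigmoid over a linear score
        "SVM": SVC(kernel="linear"),                         # maximum-margin hyperplane
        "K-NN (k=5)": KNeighborsClassifier(n_neighbors=5),   # majority vote of 5 nearest neighbors
    }

    for name, model in models.items():
        model.fit(X_train, y_train)
        print(f"{name}: test accuracy = {model.score(X_test, y_test):.2f}")

Each model learns a different decision boundary (linear for logistic regression and a linear-kernel SVM, locally defined for K-NN), which is exactly what the visualization contrasts.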
Reported metrics (populated by the demo): Mean Squared Error, R² Score, Accuracy.
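These metrics are standard; a minimal NumPy sketch of how each could be computed follows (illustrative only, not necessarily the demo's own code).

    # Sketch: definitions of the reported metrics (NumPy only)
    import numpy as np

    def mean_squared_error(y_true, y_pred):
        # Average of the squared residuals
        return np.mean((y_true - y_pred) ** 2)

    def r2_score(y_true, y_pred):
        # 1 minus the ratio of residual variance to total variance
        ss_res = np.sum((y_true - y_pred) ** 2)
        ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
        return 1 - ss_res / ss_tot

    def accuracy(y_true, y_pred):
        # Fraction of predicted labels that match the true labels
        return np.mean(y_true == y_pred)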