
Add RMSE and Log-Cosh Loss Functions in Machine Learning for Model Evaluation #13379

@Addyk-24

Description

Feature Description

I would like to add the RMSE and Log-Cosh loss functions to loss_function.py in the Machine Learning folder.

RMSE

RMSE (Root Mean Squared Error) is a metric that measures the difference between predicted and actual values in regression models. It is the square root of the average of the squared differences between the predicted values (y_pred) and the actual values (y_true). It is given by:

RMSE = $$\sqrt{\frac{1}{n}\sum_{i=1}^{n}(y_i - \hat{y}_i)^2}$$
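A minimal sketch of what this could look like in loss_function.py, assuming NumPy array inputs (the function name and signature here are illustrative, not a final API):

```python
import numpy as np


def root_mean_squared_error(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """
    Root Mean Squared Error: the square root of the mean squared difference
    between actual and predicted values.

    >>> root_mean_squared_error(np.array([1.0, 2.0, 3.0]), np.array([1.0, 2.0, 3.0]))
    0.0
    """
    if len(y_true) != len(y_pred):
        raise ValueError("Input arrays must have the same length.")
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))
```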

Log-Cosh Loss function

Log-Cosh is the logarithm of the hyperbolic cosine of the prediction error. It behaves very similarly to Mean Squared Error but is **less sensitive to outliers**.
It is computed by summing log(cosh(ŷ_i − y_i)) over all prediction errors.
It is given by:

LCL = $$\sum_{i=1}^{n} \log(\cosh(\hat{y}_i-y_i))$$
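A corresponding sketch for the Log-Cosh loss, following the sum form of the formula above (again, the name `log_cosh_loss` is only a suggestion):

```python
import numpy as np


def log_cosh_loss(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """
    Log-Cosh loss: the sum of the logarithm of the hyperbolic cosine of each
    prediction error.

    >>> log_cosh_loss(np.array([1.0, 2.0]), np.array([1.0, 2.0]))
    0.0
    """
    if len(y_true) != len(y_pred):
        raise ValueError("Input arrays must have the same length.")
    return float(np.sum(np.log(np.cosh(y_pred - y_true))))
```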

Benefits of Adding RMSE and Log-Cosh

  • RMSE: squaring the error penalizes large deviations more heavily than small ones.
  • RMSE: useful when outliers are important and you want the model to focus on minimizing them.
  • Log-Cosh: behaves like MSE for small errors (smooth) and like MAE for large errors (less sensitive to extreme outliers).
  • Log-Cosh: helps prevent the model from being overly influenced by rare extreme values.
