Error calculation allows you to see how well a machine learning
method is performing.
One way of determining this performance is to calculate a numerical error. This number is sometimes a percentage,
but it can also be a score or a distance. The goal is usually to minimize an error
percentage or distance; however, the goal may instead be to minimize or maximize a score. Encog supports the
following error calculation methods (a sketch of each calculation follows the list).
Sum of Squares Error (ESS)
Root Mean Square Error (RMS)
Mean Square Error (MSE) (default)
SOM Error (Euclidean Distance Error)
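To make the differences between these measures concrete, the following standalone Java sketch computes each of them directly from arrays of predicted and actual values. These are the standard textbook definitions; the class and method names are illustrative only and do not come from the Encog API, and Encog's internal implementation may scale some of these quantities differently.

/**
 * Minimal sketch of the error measures listed above, computed directly from
 * predicted and actual values. Class and method names are illustrative; this
 * does not use Encog's own error-calculation classes.
 */
public class ErrorMeasures {

    /** Sum of squared differences between predicted and actual values (ESS). */
    public static double sumOfSquares(double[] predicted, double[] actual) {
        double sum = 0;
        for (int i = 0; i < predicted.length; i++) {
            double diff = predicted[i] - actual[i];
            sum += diff * diff;
        }
        return sum;
    }

    /** Mean of the squared differences (MSE). */
    public static double meanSquare(double[] predicted, double[] actual) {
        return sumOfSquares(predicted, actual) / predicted.length;
    }

    /** Square root of the MSE (RMS / RMSE). */
    public static double rootMeanSquare(double[] predicted, double[] actual) {
        return Math.sqrt(meanSquare(predicted, actual));
    }

    /** Euclidean distance between the two vectors (SOM-style distance error). */
    public static double euclideanDistance(double[] predicted, double[] actual) {
        return Math.sqrt(sumOfSquares(predicted, actual));
    }

    public static void main(String[] args) {
        double[] predicted = {0.9, 0.2, 0.7};
        double[] actual    = {1.0, 0.0, 1.0};
        System.out.println("ESS:  " + sumOfSquares(predicted, actual));
        System.out.println("MSE:  " + meanSquare(predicted, actual));
        System.out.println("RMSE: " + rootMeanSquare(predicted, actual));
        System.out.println("SOM:  " + euclideanDistance(predicted, actual));
    }
}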
RMSE measures the error of a predicted numeric value, so it applies to contexts such as
regression and some recommender system techniques,
which rely on predicting a numeric value. It is not relevant to classification
techniques
such as logistic regression and Naive Bayes, which predict categorical values.
It is also not relevant to unsupervised techniques such as clustering.
The root-mean-square deviation (RMSD) or root-mean-square error (RMSE) is a
frequently used measure of the
differences between values predicted by a model or an estimator and the values
actually observed. Basically,
the RMSD represents the sample standard deviation of the differences between
predicted values and observed values.
These individual differences are called residuals when the calculations are
performed over the data sample that was used for estimation,
and are called prediction errors when computed out-of-sample. The RMSD serves
to aggregate the magnitudes
of the errors in predictions for various times into a single measure of predictive
power. RMSD is a good measure of accuracy,
but only to compare forecasting errors of different models for a particular variable
and not between variables, as it is scale-dependent.
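In formula form, with predicted values $\hat{y}_i$, observed values $y_i$, and $n$ samples, the RMSD is the square root of the mean squared residual:

$$\mathrm{RMSD} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(\hat{y}_i - y_i\right)^2}$$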