# Table 2 *Evaluation metrics*

From: Recommender systems to support learners’ Agency in a Learning Context: a systematic review

Measure | Definition | Equation | N | %
---|---|---|---|---
Precision | The ratio of relevant items selected to the total number of items selected. | \( P=\frac{N_{rs}}{N_s} \) | 20 | 66.67
Recall | The ratio of relevant items selected to the total number of relevant items available. | \( R=\frac{N_{rs}}{N_r} \) | 20 | 66.67
F1 | The harmonic mean of precision and recall, combining the two measures because one can be increased at the expense of the other. | \( F_1=\frac{2PR}{P+R} \) | 14 | 46.67
Accuracy rate | The ratio of good recommendations to all recommendations. | \( AcR=\frac{N_r}{N} \) | 7 | 23.33
Mean absolute error | The average divergence between predicted and actual ratings. | \( MAE=\frac{\sum_{i=1}^{n}\left|p_i-r_i\right|}{N} \) | 5 | 16.67
Coverage | The ratio of items for which the recommender system can provide recommendations. | \( Coverage=\frac{N_s}{N} \) | 4 | 13.33
Root mean squared error | Mean squared error (MSE) penalizes large errors more heavily than small ones, but lacks an intuitive scale; RMSE maps MSE back onto the original rating scale. | \( RMSE=\sqrt{\frac{\sum_{i=1}^{n}\left(p_i-r_i\right)^2}{N}} \) | 1 | 3.33
Average rating | The average rating from all the users. | \( AvR=\frac{\sum_{i=1}^{n} r_i}{\#\,ratings} \) | 1 | 3.33
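The metrics in the table are straightforward to compute from counts and from paired predicted/actual ratings. The sketch below is illustrative and not from the review itself: it assumes binary relevance for the count-based metrics (precision, recall, F1) and numeric rating pairs for the error metrics, with function and variable names chosen here for clarity.

```python
# Illustrative implementations of the table's metrics.
# Count-based metrics take the counts N_rs, N_s, N_r directly;
# error metrics take paired predicted (p_i) and actual (r_i) ratings.
from math import sqrt

def precision(n_relevant_selected, n_selected):
    # P = N_rs / N_s
    return n_relevant_selected / n_selected

def recall(n_relevant_selected, n_relevant):
    # R = N_rs / N_r
    return n_relevant_selected / n_relevant

def f1(p, r):
    # F1 = 2PR / (P + R), the harmonic mean of precision and recall
    return 2 * p * r / (p + r)

def mae(predicted, actual):
    # MAE = sum(|p_i - r_i|) / N
    return sum(abs(p - r) for p, r in zip(predicted, actual)) / len(actual)

def rmse(predicted, actual):
    # RMSE = sqrt(sum((p_i - r_i)^2) / N)
    return sqrt(sum((p - r) ** 2 for p, r in zip(predicted, actual)) / len(actual))

# Example: 8 of 10 recommended items were relevant,
# out of 16 relevant items available in total.
p = precision(8, 10)        # 0.8
r = recall(8, 16)           # 0.5
print(round(f1(p, r), 4))   # 0.6154
```

Note how F1 (0.6154) sits below the arithmetic mean of precision and recall (0.65): the harmonic mean penalizes the imbalance between the two, which is exactly why it is preferred when one can be inflated at the other's expense.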