orion.evaluation.point_accuracy

orion.evaluation.point_accuracy(expected, observed, data=None, start=None, end=None)

Compute an accuracy score between the ground truth and the detected anomalies.

Parameters
  • expected (DataFrame or list of timestamps) – Ground-truth anomalies, passed either as a pandas.DataFrame with a single timestamp column or as a list of timestamps.

  • observed (DataFrame or list of timestamps) – Detected anomalies, passed either as a pandas.DataFrame with a single timestamp column or as a list of timestamps.

  • data (DataFrame) – Original data, passed as a pandas.DataFrame containing a timestamp column. Used to extract start and end when they are not given.

  • start (int) – Minimum timestamp of the original data; if not given, it is extracted from data.

  • end (int) – Maximum timestamp of the original data; if not given, it is extracted from data.

Returns

Accuracy score between the ground truth and detected anomalies.

Return type

float
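
Example

A minimal usage sketch. It assumes only what the signature above shows: point_accuracy is importable from orion.evaluation and accepts lists of timestamps together with start/end bounds. The integer timestamps are illustrative values, not taken from the library's documentation.

    from orion.evaluation import point_accuracy

    # Illustrative ground-truth and detected anomalous timestamps.
    expected = [2, 3, 7]
    observed = [2, 7, 8]

    # start and end bound the range being scored; alternatively, pass the
    # original data as a DataFrame and let the function extract them.
    score = point_accuracy(expected, observed, start=0, end=10)

    # The result is a float; for point-based evaluation this is typically the
    # fraction of timestamps in the evaluated range on which the ground truth
    # and the detections agree.
    print(score)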