In the implementation of the bag-of-words error rate, you take the maximum of the positive deltas (i.e. what one could call the sum of false negative frequencies) and the negative deltas (i.e. the sum of false positive frequencies):
return Math.max(dplus, dminus);
What is the logic behind this? Which definition is it based on?
I would expect a BoW metric, depending on what it is meant to measure, to be calculated as follows:
- as an error rate (complement of accuracy): the sum of both deltas over the total count of tokens in GT and OCR combined; or
- as a false negative rate (complement of recall): the sum of positive deltas over the total count of tokens in the GT; or
- as a false discovery rate (complement of precision): the sum of negative deltas over the total count of tokens in the OCR.
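To make the three candidate definitions concrete, here is a minimal sketch of how they could be computed from token frequency maps. This is illustrative code, not the library's implementation; the class and method names (`BowRates`, `rates`) and the array-of-doubles return shape are my own assumptions.

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class BowRates {
    // Illustrative sketch (not the library's API): compute dplus/dminus
    // from bag-of-words frequency maps and derive the three candidate rates.
    // Returns {errorRate, falseNegativeRate, falseDiscoveryRate}.
    public static double[] rates(String[] gt, String[] ocr) {
        Map<String, Integer> gtFreq = count(gt);
        Map<String, Integer> ocrFreq = count(ocr);

        Set<String> vocabulary = new HashSet<>(gtFreq.keySet());
        vocabulary.addAll(ocrFreq.keySet());

        int dplus = 0;   // sum of positive deltas: GT occurrences missing from OCR (false negatives)
        int dminus = 0;  // sum of negative deltas: OCR occurrences absent from GT (false positives)
        for (String token : vocabulary) {
            int delta = gtFreq.getOrDefault(token, 0) - ocrFreq.getOrDefault(token, 0);
            if (delta > 0) {
                dplus += delta;
            } else {
                dminus -= delta;
            }
        }

        double errorRate = (double) (dplus + dminus) / (gt.length + ocr.length); // 1 - accuracy
        double falseNegativeRate = (double) dplus / gt.length;                   // 1 - recall
        double falseDiscoveryRate = (double) dminus / ocr.length;                // 1 - precision
        return new double[] {errorRate, falseNegativeRate, falseDiscoveryRate};
    }

    private static Map<String, Integer> count(String[] tokens) {
        Map<String, Integer> freq = new HashMap<>();
        for (String t : tokens) {
            freq.merge(t, 1, Integer::sum);
        }
        return freq;
    }
}
```

For example, with GT `{"the", "cat", "sat"}` and OCR `{"the", "cat", "cat"}`, the deltas are `sat: +1` and `cat: -1`, so dplus = dminus = 1; the implementation's `Math.max(dplus, dminus)` yields 1, while the three definitions above give 2/6, 1/3, and 1/3 respectively.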