This may not be the perfect place to post this, but I didn't know where else.
Several websites discuss how to calculate net typing speed based on the raw (or gross) typing speed and the number of errors.
Here's a typical one: https://www.speedtypingonline.com/typing-equations.
The standard formula for the gross typing speed (gwpm) is:
gwpm = (total keystrokes / 5) / minutes
That is, divide the total number of characters typed (including spaces and punctuation) by 5 to get words and then by the number of minutes it took to type it.
To get the net typing speed, the standard practice is to deduct 1 wpm for each error/minute.
nwpm = gwpm - (errors/minutes)
But this can be written as
nwpm = words/minute - errors/minute = (words - errors)/minute
But a word is 5 characters, so a character error rate of 20% is a word error rate of 100% (5 x 20%), which means the net wpm is zero. Or, to look at it another way: if the error rate is 20%, there is an error every 5 characters, which is the same as every word, so again the net wpm is zero.
And if the error rate is greater than 20%, the net wpm is negative.
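To sanity-check the arithmetic, here's a minimal sketch of those two formulas in Python (the function and variable names are my own, not from any standard library). With 500 keystrokes in 2 minutes and a 20% character error rate (100 errors), the net wpm does come out to exactly zero:

```python
def gross_wpm(total_keystrokes, minutes):
    # A "word" is standardized as 5 keystrokes (including spaces/punctuation).
    return (total_keystrokes / 5) / minutes

def net_wpm(total_keystrokes, errors, minutes):
    # Standard practice: deduct 1 wpm for each error per minute.
    return gross_wpm(total_keystrokes, minutes) - (errors / minutes)

# 500 keystrokes in 2 minutes, 20% character error rate -> 100 errors
g = gross_wpm(500, 2)       # 50.0
n = net_wpm(500, 100, 2)    # 0.0
print(g, n)

# Error rate above 20% (e.g. 25% -> 125 errors) goes negative:
print(net_wpm(500, 125, 2)) # -12.5
```

So the zero (and negative) result falls directly out of the formula, exactly as described above.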
Did I do that right, or am I missing something?