I heard that it is normal for the SMART attribute "Read Error Rate" to be very high on Seagate drives because of the different way they count read errors.
For Western Digital drives, if you don't have any read errors, the raw value for the read error rate will say 0.
For Seagate drives, it is different. I basically want to know how to tell, by looking at the raw value of the Read Error Rate attribute, whether a Seagate drive has had any read errors.
I read from one source that the raw value represents a sector count rather than an error count, and that it rolls over to 0 when it reaches 250 million. There is apparently also some math involved, so that a raw value of 1 on a Western Digital drive means 1 read error, but a raw value of 1 on a Seagate drive does not.
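If it helps clarify what I mean, here is a sketch of the decoding I think that source was describing. It assumes the often-claimed (but not officially documented by Seagate) layout where the 48-bit raw value packs an actual error count in the upper 16 bits and a running operation count in the lower 32 bits:

```python
# Hypothetical decoding of a Seagate Raw_Read_Error_Rate raw value.
# Assumption (not confirmed by Seagate): upper 16 bits = error count,
# lower 32 bits = running operation count.

def decode_seagate_raw(raw_value: int) -> tuple[int, int]:
    """Split a 48-bit SMART raw value into (error_count, operation_count)."""
    error_count = (raw_value >> 32) & 0xFFFF    # upper 16 bits
    operation_count = raw_value & 0xFFFFFFFF    # lower 32 bits
    return error_count, operation_count

# Under this layout, a scary-looking raw value like 117642896 would
# decode to 0 errors and ~117 million read operations.
print(decode_seagate_raw(117642896))  # -> (0, 117642896)
```

Is this layout actually correct, or is the interpretation something else entirely?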
So does anybody here know enough about Seagate drives to explain how to interpret the Read Error Rate SMART attribute?