Shortcomings of the 80% Rule - Arbitrariness and Lack of Statistical Basis

The final shortcoming of the 80% Rule that I will discuss is its arbitrariness and lack of statistical basis. The choice of 80% has no scientific or empirical justification; there is no underlying reason to prefer it over, say, 75% or 90%.
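To make the arbitrariness concrete, here is a minimal sketch (the hiring numbers are made up purely for illustration) showing how the conclusion of an adverse impact analysis can flip depending on which threshold is chosen:

```python
# Hypothetical hiring data (illustrative numbers only).
group_a_hired, group_a_applicants = 47, 100   # selection rate 0.47
group_b_hired, group_b_applicants = 60, 100   # selection rate 0.60

rate_a = group_a_hired / group_a_applicants
rate_b = group_b_hired / group_b_applicants

# Impact ratio: lower selection rate divided by the higher one (~0.78 here).
impact_ratio = min(rate_a, rate_b) / max(rate_a, rate_b)

# The same data passes a 75% threshold but fails an 80% or 90% threshold.
for threshold in (0.75, 0.80, 0.90):
    flagged = impact_ratio < threshold
    print(f"threshold {threshold:.2f}: adverse impact flagged? {flagged}")
```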

While all "threshold" rules involve some degree of arbitrariness, some are more arbitrary than others. In Hazelwood, for example, the court held that a disparity of "2 or 3" standard deviations could be considered statistically significant. The court could have chosen 1 standard deviation, or 5, but it chose "2 or 3". Why? Presumably because those values correspond roughly to the 5% and 1% significance levels, which are widely accepted within the statistics and social science communities. In the case of the 80% Rule, there is no such general acceptance.
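For readers who want to see where those probabilities come from, here is a short sketch using a two-tailed normal approximation (one common convention; the exact figures depend on the convention used):

```python
from scipy.stats import norm

# Two-tailed probability of a result at least this many standard
# deviations from the expected value, under a normal approximation.
for z in (2, 3):
    p = 2 * norm.sf(z)
    print(f"{z} standard deviations -> p = {p:.4f}")
# 2 SD -> p = 0.0455 (roughly the 5% level)
# 3 SD -> p = 0.0027 (well below the 1% level)

# Conversely, the z-values that correspond exactly to the 5% and 1% levels:
for alpha in (0.05, 0.01):
    z_crit = norm.isf(alpha / 2)
    print(f"alpha = {alpha}: |z| = {z_crit:.2f}")
# alpha = 0.05 -> |z| = 1.96 and alpha = 0.01 -> |z| = 2.58, which is why
# "2 or 3" standard deviations maps onto the conventional 5% and 1% levels.
```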

The 80% Rule's arbitrary nature has been questioned by U.S. courts since the 1980s. In fact, the U.S. Equal Employment Opportunity Commission suggested in a recent memorandum that a "more defensible standard" would be based on comparing an organization's hiring rate for a particular protected group with the rate that would occur if the organization "simply selected people at random".
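One way to read that suggestion, and this is my own illustration rather than the memorandum's wording, is to compare the observed number of protected-group hires against what random selection from the applicant pool would produce, for example with a hypergeometric model:

```python
from scipy.stats import hypergeom

# Hypothetical applicant pool and hiring outcome (illustrative numbers only).
total_applicants = 200
protected_applicants = 80      # 40% of the pool
total_hired = 50
protected_hired = 12           # 24% of the hires

# Expected protected-group hires if the employer
# "simply selected people at random" from the pool.
expected = total_hired * protected_applicants / total_applicants   # 20.0

# Probability of hiring this few (or fewer) protected applicants by chance,
# drawing the 50 hires at random from the pool.
p = hypergeom.cdf(protected_hired, total_applicants,
                  protected_applicants, total_hired)
print(f"expected under random selection: {expected:.1f}, observed: {protected_hired}")
print(f"P(observed or fewer by chance) = {p:.4f}")
```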

Given all of the problems associated with the 80% Rule, it makes sense to explore other statistical techniques for the kinds of questions to which the 80% Rule is often applied. In the next post, I will discuss Fisher's Exact test, which is generally accepted by both the scientific and legal communities and rests on sound statistical principles.
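As a brief preview, and with hypothetical numbers only, applying the test to a 2x2 hiring table takes just a few lines:

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 contingency table (illustrative numbers only):
#                   hired   not hired
# protected group     12        68
# other group         38        82
table = [[12, 68], [38, 82]]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```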
