Obtaining illogical results from posterior LDA-classification because of "too good" data?

1 message · Uwe Ligges

Arne Schulz wrote:
Your training data has a within-group variance close to 0, and hence the 
pooled variance is also almost 0.
As a result, even a minimal deviation from the group center pushes the 
posterior to almost 1 in the corresponding direction.

In your second example you are increasing the variance by orders of 
magnitude.
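A minimal sketch of the effect, here in Python with scikit-learn (the thread itself does not include code; the class means, sample sizes, and evaluation point below are illustrative). With a tiny within-group variance the pooled variance is nearly 0, so a point barely off the midpoint between the groups already gets a posterior near 1; inflating the variance makes the same point far less decisive:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

def posterior_at_offset(sd):
    # Two classes with means 0 and 1 and within-group standard deviation `sd`.
    X = np.concatenate([rng.normal(0.0, sd, 50),
                        rng.normal(1.0, sd, 50)]).reshape(-1, 1)
    y = np.array([0] * 50 + [1] * 50)
    lda = LinearDiscriminantAnalysis().fit(X, y)
    # Posterior probability of class 1 at a point just right of the midpoint.
    return lda.predict_proba([[0.51]])[0, 1]

# Near-zero within-group variance: the tiny 0.01 offset from the midpoint
# is huge relative to the pooled variance, so the posterior saturates at ~1.
print(posterior_at_offset(0.001))

# Variance inflated by orders of magnitude: the same offset now yields
# a posterior only slightly above 0.5.
print(posterior_at_offset(0.3))
```

The posterior in LDA depends on the squared deviation from each group mean scaled by the pooled variance, which is why shrinking that variance toward 0 makes any deviation look overwhelming.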

Best,
Uwe Ligges