
In a sample size of <500, assuming a standard deviation of 15 for the IQ test (since that's the most common), 3 points wouldn't seem to mean anything. Even a well-versed psychologist wouldn't be able to reliably tell apart people with a difference of 0.2 standard deviations.


I am not so used to statistics anymore (in fact, I probably never was). I just wonder if there isn't a difference between comparing the IQs of two people (a 5-point difference is maybe not significant) and comparing the average IQs of two "large" groups of people. Since the average was 95, it means there were also some people with an IQ of 80 and some with an IQ of 120 or whatever. Might a difference of 5 points in the averages be more significant than a difference of 5 points between two people?

As I said, I am not used to these calculations anymore, and it is too late at night now to do the research... Maybe tomorrow... Or maybe you can answer that question (guessing that years of playing poker have made you quite experienced in such things...).


You are correct. I'm not sure exactly, but I suspect that it would be entirely within the realm of reasonable possibility for two equal groups of 250 to end up 5 points apart with an SD of 15. You could find out pretty easily with a script and a Gaussian RNG function if you were so inclined. My curiosity on this one is just slightly overpowered by my laziness though.
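For what it's worth, here is a minimal sketch of the kind of script described above (not the commenter's actual code): it assumes two groups of 250 scores drawn from the same normal distribution (mean 100, SD 15) and estimates how often their sample means land 5 or more points apart by chance.

    import random

    TRIALS = 100_000
    N = 250            # people per group
    MEAN, SD = 100, 15 # assumed IQ distribution
    GAP = 5            # difference in group means we're testing for

    def group_mean():
        # Draw one group of N IQ scores and return its average.
        return sum(random.gauss(MEAN, SD) for _ in range(N)) / N

    # Count trials where two equal groups end up GAP or more points apart.
    hits = sum(abs(group_mean() - group_mean()) >= GAP for _ in range(TRIALS))
    print(f"P(|mean difference| >= {GAP}) ~= {hits / TRIALS:.5f}")

Analytically, the standard error of the difference between two such means is 15 * sqrt(2/250), about 1.34 points, so a 5-point gap is roughly 3.7 standard errors and the script should report it as quite rare; a 3-point gap (about 2.2 standard errors) is considerably more plausible.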


Well, I admit the 5 points don't look so dramatic after all ;-)


Ha, yeah, and the study cited 3 points, which I'm guessing is even less meaningful than 5.



