Originally, IQ was a score obtained by dividing a person’s estimated mental age, as measured by an intelligence test, by the person’s chronological age. The resulting fraction (quotient) was then multiplied by 100 to get the IQ score.
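For concreteness, here is a minimal sketch of that original ratio calculation; the ages are made-up example values, not real test data.

```python
# Ratio IQ as originally defined: (mental age / chronological age) * 100.
# The ages below are hypothetical example values.
mental_age = 12.0         # age level the test-taker's performance matches
chronological_age = 10.0  # the test-taker's actual age

ratio_iq = (mental_age / chronological_age) * 100
print(ratio_iq)  # 120.0
```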
“Originally” because that’s not how modern IQ tests work: scores are now fitted to a normal distribution (conventionally mean 100, standard deviation 15), which gives a much more reliable and repeatable measurement.
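A minimal sketch of that “deviation IQ” idea, assuming the conventional mean-100/SD-15 scale; the raw scores are invented example values, not a real norming sample.

```python
import statistics

# Modern "deviation IQ": a raw score is standardized against the norming
# sample and rescaled onto a normal distribution with mean 100 and SD 15.
# The raw scores below are made-up example values.
raw_scores = [41, 47, 52, 55, 58, 60, 63, 66, 70, 78]
mean = statistics.mean(raw_scores)
sd = statistics.stdev(raw_scores)

def deviation_iq(raw):
    z = (raw - mean) / sd   # standard score within the norming sample
    return 100 + 15 * z     # rescale to the IQ metric

print(round(deviation_iq(63)))  # a raw score above the sample mean -> IQ above 100
```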
Furthermore, even if that quotient formula were still used, the average score of others your age is still a population parameter (something whose true value you cannot measure directly); you can only sample it and estimate it for a possibly indefinite population. Your confidence in your estimate of the average depends on the number of samples; the parameter itself does not, because it is (supposedly) an inherent property of the class of things you’re sampling.
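To illustrate that point with a toy simulation (the “population” here is just a simulated normal distribution with mean 100 and SD 15, purely for demonstration): the parameter stays fixed, but the estimate tightens as the sample grows.

```python
import random
import statistics

# The population mean is a fixed parameter; only our estimate of it improves
# with more samples. Simulated data, not real test scores.
random.seed(0)
true_mean, true_sd = 100, 15

for n in (10, 100, 1000, 10000):
    sample = [random.gauss(true_mean, true_sd) for _ in range(n)]
    estimate = statistics.mean(sample)
    std_error = statistics.stdev(sample) / n ** 0.5  # shrinks roughly as 1/sqrt(n)
    print(f"n={n:>5}  estimate={estimate:6.2f}  std_error={std_error:.2f}")
```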
Please just go through a statistics crash course; I don’t know how to explain this better.
There is a limit, and it’s a limit set by the population, because, again, it’s not a raw score. It’s a quotient:
(Mental Age / Chronological Age) × 100.
Mental age is pegged to the average score of others at each age, so if you aren’t comparing to the population, you aren’t calculating an IQ. (See the sketch below.)
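A minimal sketch of why the comparison is unavoidable: even under the old formula, mental age is read off a table of age-group averages. The norm table below is invented for illustration; real norms come from sampling the population.

```python
# Hypothetical norm table: average raw score for each age group.
# These numbers are invented for illustration only.
age_norms = {8: 40, 9: 45, 10: 50, 11: 55, 12: 60, 13: 64}

def mental_age(raw_score):
    # Mental age = the age whose average score is closest to this raw score.
    return min(age_norms, key=lambda age: abs(age_norms[age] - raw_score))

def ratio_iq(raw_score, chronological_age):
    return mental_age(raw_score) / chronological_age * 100

# A 10-year-old scoring like the average 12-year-old:
print(ratio_iq(61, 10))  # 120.0
```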