When examining tolerances or variation in sensor readings, the key question is how much individual measurements deviate from the average. The standard CompTIA reasoning:
Standard deviation measures the spread of data around the mean.
A small standard deviation ⇒ readings are tightly clustered (low variation).
A large standard deviation ⇒ readings vary widely (high variation), indicating potential issues with sensor consistency.
Why the other options are less appropriate:
Median (C) and Mean (D) are measures of central tendency, not dispersion; they tell you about the center, not how tightly values cluster around it.
Quartile range (B) (or interquartile range) is also a dispersion measure, but standard deviation is the classic choice when dealing with continuous measurements and tolerances in engineering/QA contexts.
Thus, to evaluate the tolerances in IoT sensor readings, Standard deviation (A) is the best measure.
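The distinction above can be sketched with Python's standard `statistics` module. The readings below are hypothetical, chosen so that both sensors share the same mean and median; only the standard deviation separates the consistent sensor from the erratic one:

```python
import statistics

# Hypothetical readings (°C) from two IoT temperature sensors.
# Values are chosen so both series have the same mean and median.
tight = [22.0, 21.5, 22.5, 22.0, 21.75, 22.25]   # consistent sensor
loose = [22.0, 16.0, 28.0, 22.0, 18.0, 26.0]     # erratic sensor

for name, data in (("tight", tight), ("loose", loose)):
    mean = statistics.mean(data)      # central tendency: identical for both
    median = statistics.median(data)  # also identical for both
    sd = statistics.stdev(data)       # sample standard deviation: the discriminator
    print(f"{name}: mean={mean:.2f} median={median:.2f} stdev={sd:.2f}")
```

Both series report mean = median = 22.0, yet their standard deviations differ sharply (roughly 0.35 vs 4.56), which is exactly the consistency signal a tolerance check needs and why a central-tendency measure alone cannot answer the question.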
CompTIA Data+ Reference (concept alignment):
DA0-001 Exam Objectives – Data analysis: measures of central tendency and dispersion (standard deviation, variance, range, IQR).
CompTIA Data+ Study content: standard deviation is emphasized as a primary measure of spread for continuous data.