The science behind the Core Numerical Reasoning Assessment

Learn how the Core Numerical Reasoning Assessment (CNRA) is designed, validated, and continuously improved.

Overview

The Core Numerical Reasoning Assessment (CNRA) measures how individuals interpret, analyze, and reason with quantitative and data-based information. The assessment evaluates skills that are essential for data-driven decision-making, problem-solving, and analytical thinking in many professional contexts. The CNRA combines multiple types of numerical reasoning tasks into a concise, adaptive assessment that typically takes 10–15 minutes to complete.


Key features

  • Adaptive design: The CNRA uses adaptive testing, where each response influences the difficulty of subsequent questions. This allows for more precise measurement of ability across different skill levels while minimizing unnecessary questions and maintaining accuracy.

  • Validity and reliability: The CNRA has been validated on a global sample of more than 1,400 participants. It is calibrated using a 1-Parameter Logistic (Rasch) IRT model, enabling fair, stable, and interpretable measurement across a wide range of ability levels.

  • Fairness: Psychometric analyses show no evidence of bias across age, gender, or ethnicity. Adverse impact ratios exceed standard fairness thresholds, supporting equitable outcomes across diverse groups.
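To make the adaptive, Rasch-calibrated design above concrete, here is a minimal sketch of how a 1-Parameter Logistic (Rasch) model can drive adaptive item selection. The function names and the item bank are illustrative assumptions, not the CNRA's actual implementation:

```python
import math

def rasch_prob(theta, b):
    """Probability of a correct response under the 1PL (Rasch) model:
    P(correct) = 1 / (1 + exp(-(theta - b))),
    where theta is the test-taker's ability and b is the item difficulty."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def item_information(theta, b):
    """Fisher information of a Rasch item: I = P * (1 - P).
    It peaks when the item's difficulty matches the person's ability."""
    p = rasch_prob(theta, b)
    return p * (1.0 - p)

def pick_next_item(theta_est, item_difficulties):
    """Adaptive selection: administer the item that is most informative
    at the current ability estimate, so each response sharpens the
    estimate with as few questions as possible."""
    return max(item_difficulties, key=lambda b: item_information(theta_est, b))

# Hypothetical item bank of calibrated difficulties (logit scale).
bank = [-2.0, -1.0, 0.0, 1.0, 2.0]
next_item = pick_next_item(0.3, bank)  # selects the item nearest theta = 0.3
```

Because item information is maximized where difficulty equals ability, this selection rule naturally steers harder questions to stronger test-takers and easier ones to weaker test-takers, which is why an adaptive test can stay short without losing precision.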
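The adverse impact ratios mentioned in the fairness bullet are typically computed by comparing each group's pass rate to the highest group's pass rate, with the common "four-fifths rule" flagging ratios below 0.80. The sketch below uses invented pass rates purely for illustration; it does not reflect CNRA data:

```python
def adverse_impact_ratios(selection_rates):
    """Adverse impact ratio per group: each group's selection (pass) rate
    divided by the highest-rate group's. Under the four-fifths rule,
    ratios below 0.80 indicate potential adverse impact."""
    top = max(selection_rates.values())
    return {group: rate / top for group, rate in selection_rates.items()}

# Illustrative pass rates only -- not CNRA results.
rates = {"group_a": 0.52, "group_b": 0.48, "group_c": 0.50}
ratios = adverse_impact_ratios(rates)
# Every ratio here exceeds 0.80, which is what "adverse impact ratios
# exceed standard fairness thresholds" refers to.
```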


Continuous improvement

The CNRA is continuously refined through ongoing research, expanded datasets, and new item development. Planned enhancements include recalibration using more advanced models and the introduction of new AI-assisted question types, further strengthening both scientific rigor and practical relevance.


Note: The Core Numerical Reasoning Assessment is currently available in English only.


Learn more

For a deeper explanation of the CNRA’s theoretical foundations, methodologies, validation studies, and ongoing research, refer to the technical summary below.
