Published 22 August 2008, doi:10.1136/bmj.a1282
Cite this as: BMJ 2008;337:a1282

Head to Head

Are national qualifying examinations a fair way to rank medical students? Yes

Chris Ricketts, director of assessment 1, Julian Archer, NIHR academic lecturer in medical education 1

1 Peninsula College of Medicine and Dentistry, Plymouth PL4 8AA

Correspondence to: J Archer julian.archer{at}pms.ac.uk

Chris Ricketts and Julian Archer argue that a national test is the only fair way to compare medical students, but Ian Noble (doi: 10.1136/bmj.a1279) believes that it will reduce the quality of education

The General Medical Council’s consultation on student assessment1 and the inquiry into Modernising Medical Careers2 have prompted interest in national examinations for medical students or newly qualified doctors. We believe that national examinations are the only fair way to rank medical students because they offer a unique opportunity for standardisation, consensus, and pooling of resources.

Level playing field

The UK already has a system for ranking medical students as part of the application process for their first postgraduate position. Students are ranked 1, 2, 3, or 4 depending on their performance within their medical school. In 2007 this rank contributed 45 marks of the total application score of 100 (with 45 the maximum and 30 the minimum mark, allocated according to rank) and therefore had a major effect on every student's chance of getting his or her preferred post. Each medical school uses its own internally devised assessments to rank students.
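The weighting described above can be made concrete with a short sketch. Only the 45-mark maximum and 30-mark minimum are stated here; the intermediate marks for the middle ranks are a hypothetical, evenly spaced interpolation, not the official scale.

```python
# Illustrative sketch of the 2007 application scoring described above.
# The 45-mark maximum (rank 1) and 30-mark minimum (rank 4) are as stated;
# the marks assumed for ranks 2 and 3 are hypothetical interpolations.
RANK_MARKS = {1: 45, 2: 40, 3: 35, 4: 30}

def application_score(rank: int, other_marks: int) -> int:
    """Total application score out of 100: the quartile rank mark plus
    the remaining components (worth up to 55 marks)."""
    if rank not in RANK_MARKS:
        raise ValueError("rank must be 1, 2, 3, or 4")
    if not 0 <= other_marks <= 55:
        raise ValueError("other components are worth at most 55 marks")
    return RANK_MARKS[rank] + other_marks

# Even a perfect score on the other components cannot lift a rank 4
# student above a rank 1 student who scored well elsewhere:
print(application_score(4, 55))  # 85
print(application_score(1, 40))  # 85
```

The fixed 15-mark spread between top and bottom ranks shows why the local, non-standardised origin of the rank matters so much to each applicant.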

The current system is inherently unfair because it erroneously assumes that all medical schools have the same spread of candidate capability and that their assessment data are of equal value or validity for ranking. However, recent studies of medical school final examinations in the UK have shown several important differences in their qualifying assessments3 and standard setting,4 and the value of the current ranking system in the application process has recently been down-weighted. These differences may persist through a doctor’s working life because graduates from different medical schools show significantly different performance in subsequent postgraduate examinations.5 6

Fair ranking requires good reliability. Reliability requires standardisation and structure.7 Standardisation is best achieved by all candidates experiencing the same assessment tasks rather than the current plethora of non-standardised local assessments. National examinations would provide a common set of assessment tasks for every student, a prerequisite for fair ranking.

Quality and usability

Any national examination with high stakes would need to be designed and delivered to current standards of best practice in test procedures.8 This includes a robust and defensible approach to definition of the test, implementation, standard setting, and quality assurance. These criteria are best achieved by consensus of stakeholders, including employers,9 and pooling rare assessment expertise, as is currently done in the United States and Canada. However, such an approach is impossible in the UK while resources for assessment are divided among medical schools. It costs as much to set a high quality test for 100 students as for 10 000. Pooling of resources to create one national examination could therefore reduce costs or make better use of available funds.

A national examination has other advantages. Firstly, it would support the establishment of a common curriculum. This will be increasingly important with the establishment of private medical schools. Secondly, because national examinations are independent, they can remove any local bias. Some students perform particularly well in one high profile area: the subsequent "halo" effect10 11 may bias their local ranking. Thirdly, a national examination would provide prospective students with a more robust comparison of how medical schools perform. Applicants to seemingly expensive medical school courses will increasingly demand better data to inform their choices. The performance of each medical school's graduates in a national examination would be important information. Fourthly, with the movement of doctors globally, especially freely within the European Economic Area, a national examination would allow direct comparison of all graduates and doctors wanting to enter practice.

Finally, a national qualifying examination is likely to improve patient care. Although much of the evidence for the effect of national examinations on patient outcomes comes from postgraduate certification examinations,12 there is also evidence that performance in earlier licensing examinations affects patient care.13

Role of national examination

A further question is whether a national examination should automatically be a qualifying examination. The benefits of a national examination, and its unique role in ranking students, are clear. The main difference between a ranking and a qualifying examination is that the first provides information about candidates' relative ability whereas the second also delivers a pass or fail decision. Several nationally delivered examinations primarily provide ranking or grading information without pass or fail decisions—for example, A levels and medical school admission tests. The main purpose of national ranking of medical students is to aid recruitment to the first postgraduate training post. So, a national examination may not need to be a qualifying examination.

However, relying on national ranking but local qualification may produce anomalies. A student may qualify from one medical school but be ranked lower nationally than a student who has failed to qualify from another. Using a national examination for both ranking and qualification is therefore fair to medical students, standardises the minimum qualification level, and is likely to improve patient care.

Competing interests: None declared.

References

  1. General Medical Council. Strategic proposal for assessment. 2008. www.gmc-uk.org/education/documents/strategic_proposal_for_assessment.pdf.
  2. Tooke J. Final report of the independent inquiry into modernising medical careers. www.mmcinquiry.org.uk/draft.htm.
  3. McCrorie P, Boursicot KAM, Southgate LJ. Order in variety we see; though all things differ, all agree: clinical assessment for graduation—is there equivalence across UK medical schools? Ottawa Conference Abstracts 2008:244.
  4. Boursicot KAM, Roberts T, Pell G. Using borderline methods to compare passing standards for OSCEs at graduation across three medical schools. Med Educ 2007;41:1024.
  5. Wakeford R, Foulkes J, McManus C, Southgate L. MRCGP pass rate by medical school and region of postgraduate training. BMJ 1993;307:542-3.
  6. McManus IC, Elder AT, de Champlain A, Dacre JE, Mollon J, Chis L. Graduates of different UK medical schools show substantial differences in performance on MRCP(UK) part 1, part 2 and PACES examinations. BMC Medicine 2008;6:5.
  7. Van der Vleuten CPM. The assessment of professional competence. Adv Health Sci Educ 1996;1:41-67.
  8. Downing SM, Haladyna TM. Handbook of test development. Philadelphia, PA: Lawrence Erlbaum, 2006.
  9. Wass V. Ensuring medical students are fit for purpose. BMJ 2005;331:791-2.
  10. Iramaneerat C, Yudkowsky R. Rater errors in a clinical skills assessment of medical students. Eval Health Prof 2007;30:266-83.
  11. Rosenzweig P. The halo effect … and the eight other business delusions that deceive managers. New York: Free Press, 2007.
  12. Norcini JJ, Lipner RS, Kimball HR. Certifying examination performance and patient outcomes following acute myocardial infarction. Med Educ 2002;36:853-9.
  13. Tamblyn R, Abrahamowicz M, Dauphinee D, Wenghofer E, Jacques A, Klass D, et al. Physician scores on a national clinical skills examination as predictors of complaints to medical regulatory authorities. JAMA 2007;298:993-1001.
