Does the UK need medical schools?


Clinical academics, too, flee the lecture theatres for their labs. They know that involvement in undergraduate teaching – and its administration – early in their careers does not pay off. Better to concentrate on research – or else on the postgraduate taught courses that bring in additional revenue.

Explaining to non-medical colleagues how clinical undergraduate medical education in the UK is organised is usually met with polite scepticism. The medical school chooses whom to admit and provides most of the initial teaching. But after the first two years, things change. The bulk – perhaps over 90 per cent – of teaching is then delivered by NHS staff with no contract, or even meaningful contact, with the university.

At one time, most teaching was located within a largely monogamous partner hospital. But many medical schools now admit 10 times as many students as the archetypal London medical school, on which the model was founded. Students from several medical schools, with different structures, may mingle at far-flung hospitals or clinics, raising the issue of what any single university contributes. Add to that the rise of specialisation and the pressure on doctors’ time and a modern medical student might be “taught” by 500 people during their clinical years, in fleeting, anonymous rotations across scores of sites.

This quasi-outsourcing model of teaching, with the medical school acting as purchaser and the NHS as provider, has allowed many medical schools, particularly the newer ones, to dispense with most of their full-time academic staff. They simply do not possess clinical academic expertise across the breadth of medicine.

NHS staff have noticed these shifts. The challenge to their professional status and autonomy by the growing ethos of medicine as a service industry has sensitised them to the fact that their relationship with their academic colleagues is increasingly instrumental rather than collegial. Besides, everybody – including the students – knows that meeting NHS waiting-list targets and treating the emergency patients waiting on trolleys trump teaching medical students. So the students get a raw deal, too.

In much of Europe, medical schools have never claimed to deliver the immersive apprentice model of UK medical education, and their graduates’ immediate employment roles reflect this. In the US, meanwhile, medicine is a graduate-only course, and medical schools are explicitly in the business of providing clinical care, so their priorities are not so split. However, I doubt that anybody in the UK would seriously contemplate all would-be UK doctors paying for both undergraduate and graduate medical degrees.

Let us reflect on the fact that those wishing to become hospital specialists train on the job at a single department (or between a few geographically linked departments), but with teaching and certification supervised at the UK level by the Royal Colleges. We happily accept that universities could not perform either function.

We should also recall the now largely historical Oxbridge “3+3” model, whereby three years of university education in the basics and fundamentals was followed by three years’ immersion in a hospital, usually in London. The students were not always paid as apprentices, but they were certainly treated as such.

I suggest that we likewise move the second “3” of the primary medical qualification – which is now a faux apprenticeship – to where it belongs: the workplace. Trainees should graduate, then learn and work as paid employees before obtaining professional certification, as lawyers and accountants do. Given that NHS funding already supports clinical placements, this would probably cost less overall than the current system.

Meanwhile, medical schools would be liberated from the constraints of professional certification and the associated pretence that they have full control of the clinical experience. Their core function would be the age-old one of providing the foundational knowledge and intellectual skills that equip graduates to make sense of – and grow within – professional practice. While all students would take core modules on physiology, cell biology and public health, we could offer options in medical history and innovation, in how law and economics impact on healthcare, and in the nature of evidential claims and statistics. This would allow us to produce graduates who bring diverse intellectual qualities to the “trade” aspects of their profession.

Finally, this readjustment would open up medicine to those who don’t want to practise clinically. It would allow medicine as an area of contemporary thought and scholarship to diffuse more widely. After all, medicine is too important to be left just to doctors.

Author Bio: Jonathan Rees is an Emeritus Professor of Dermatology at the University of Edinburgh.