Thursday, 7 September 2017

Whose money is it anyway?

It’s hard not to notice the current focus by some in government, parliament and the media on universities, and in particular issues of value (levels of tuition fees) and accountability (how can VCs’ high salaries be justified?).

There’s lots to be said on this, but in this blog I want to focus on an underlying issue: whose money is it anyway? Put bluntly, if universities are spending private money, then it’s no business of the state what they spend it on, as long as it’s legal.

Universities get money from lots of sources, and they publish information annually – through their annual accounts and through statutory returns to the Higher Education Statistics Agency (HESA) – about what exactly they get and from whom. The information is in a standard format, with many categories. Bear with me while I list these; it’s worth seeing to give context to the argument I’ll be making later. There are:

  • Funding body grants


  • Tuition fees, comprising Full-time undergraduate, Full-time postgraduate, Part-time undergraduate, Part-time postgraduate, PGCE, Non-EU domicile, Non-credit-bearing course fees, FE course fees, and Research training support grants.


  • Research grants and contracts, comprising grants from: BEIS Research Councils, The Royal Society, British Academy and The Royal Society of Edinburgh; UK-based charities; UK central government bodies/local authorities, health and hospital authorities; UK central government tax credits for research and development expenditure; UK industry, commerce and public corporations; other UK sources; EU government bodies; EU-based charities; EU industry, commerce and public corporations; EU (excluding UK) other; Non-EU-based charities; Non-EU industry, commerce and public corporations; Non-EU other


  • Other services rendered, comprising income from BEIS Research Councils, UK central government/local authorities, health and hospital authorities, EU government bodies and other sources


  • Other income, comprising: Residences and catering operations (including conferences); Grants from local authorities; Income from health and hospital authorities (excluding teaching contracts for student provision); Other grant income; Capital grants recognised in the year; Income from intellectual property rights; and Other operating income


  • Donations and endowments, comprising New endowments; Donations with restrictions and Unrestricted donations

If you’ve made it through the list (well done!) you’ll see that some of these come from public sources (eg BEIS research grants) and some are private (eg UK industry grants). Add together all of the public income for a university, divide by the total income, and you can work out what percentage of the university’s income is from public sources. Which is surely relevant for understanding how accountable universities need to be with their spending choices.

For some categories, though, it isn’t obvious whether the money is public. The big one here is tuition fee income.

Income from non-EU students is clearly private. Even if those students are supported by their own governments, the UK government has no duty or obligation in relation to the money.

For postgraduate tuition fees paid by home and EU students, it will be a mixed bag: some will be paid by the students themselves or their employers; some will be funded via postgraduate grants; some will be paid via public PG loans schemes.

For home and EU undergraduate fees, we need to think about it. Where students have to pay tuition fees (remember that Scottish students in Scotland pay no fees) they are able to take out a loan, on less than commercial terms, from the Student Loans Company. And students do this. After graduation, students make repayments towards the loan from their salary; the amount they repay depends on how much they earn. And after 30 years the remaining debt is cancelled. The initial funds are provided to the Student Loans Company by the state; and an allowance for the ultimately unrepaid element – called the RAB charge – is also part of government spending. So is it public or private money? With hindsight, a proportion of it is private, and a proportion public. Up front, the cash is public.

On this basis it is possible to make the calculation of the proportion of universities’ income which comes from public funds. I’ve included home and EU undergraduate tuition fees; I’ve excluded postgraduate tuition fees; I’ve included research and other services rendered income from UK government and public bodies, and from EU government and public bodies (the income for this ultimately derives from UK government funds, as we’re a net contributor to the EU budget).
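To make the arithmetic concrete, here’s a minimal sketch in Python of the sum involved. The category names and figures below are placeholders of my own, not actual HESA return lines; the point is simply which buckets get counted as public and which as private.

```python
# Hypothetical income lines for a single university, in £ millions.
# Category names and values are illustrative only, not real HESA data.
income = {
    "funding_body_grants":       120.0,  # public
    "ug_home_eu_fees":           180.0,  # public (loan-backed, so public cash up front)
    "pg_fees":                    40.0,  # excluded from 'public' here: a mixed bag
    "non_eu_fees":                90.0,  # private
    "research_uk_public":         75.0,  # public (BEIS Research Councils, government bodies)
    "research_eu_government":     15.0,  # public (ultimately UK-funded, as a net contributor)
    "research_industry_charity":  35.0,  # private
    "other_income":               60.0,  # private
}

public_keys = {
    "funding_body_grants",
    "ug_home_eu_fees",
    "research_uk_public",
    "research_eu_government",
}

total = sum(income.values())
public = sum(v for k, v in income.items() if k in public_keys)

print(f"Total income:  £{total:.1f}m")
print(f"Public income: £{public:.1f}m ({public / total:.0%} of total)")
```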

What this shows is that universities receive significant public funding. Across the UK as a whole, 58% of income in 2015-16 (the most recent year for which HESA data is available) comes from public sources. In actual money, that is £20.3 billion out of a total income of £34.7 billion. Yes, I did say billion. It is a lot of money!

Nation             % publicly-funded
England            58%
Wales              65%
Scotland           59%
Northern Ireland   75%
Total UK           58%

Of course this varies between individual universities. Some have very little income (comparatively!) from non-public sources; a few have very little (again, comparatively!) from public sources.

The graph shows the data: each university is one of the bars; they’re rank-ordered from the most dependent on the left (Plymouth College of Art, since you ask, with 96% dependency on public funding) through to Heythrop College on the right (with no public funding whatsoever). Even the famously private Buckingham University has a little public income - £95k in funding body grants and research income from UK public bodies. Which means that it is second from the right, with about 0.25% of its income from public sources.

Source: HESA data
What of other universities? The Russell Group members range from the mid-20s (LSE with 24%) to the high 60s (Queen’s Belfast with 69%). The big post-1992 civic universities range from the mid-50s (Sunderland with 56%) to the mid-80s (Liverpool John Moores with 86%). The smaller or specialist research intensives (the 1994 Group, as was) range from the high 30s (SOAS with 38%) to the mid-60s (Birkbeck College, with 66%).

So does the state have an interest in how universities spend their money? The data say yes: at least to the extent that the money derives from public sources.

This doesn’t mean that all of the criticisms made of universities are valid. And it doesn’t mean that university autonomy isn’t a good idea. History, and international comparisons, tell us that the best universities are those that have the most freedom to make their own academic choices.

But it does lend validity to arguments that universities need to be accountable for their spending choices. In my experience, universities don’t disagree with this need for accountability. 

What of current criticisms? The danger is that the huge good that universities do for individuals and for society as a whole is forgotten amongst the current hubbub, and damage is then done. To avoid this, those making the noise need to be careful that their criticisms are well-founded. There’s an anti-elitism in current public discourse which easily mutates into unthinking policy.

And universities themselves need to be aware that some, at least, of the criticisms come from a real place. Are students always the first thought? Sometimes research seems like it is king. And is there real transparency? A few universities have a student on their remuneration committees, and their world has not fallen down. Why not more?

Tuesday, 15 August 2017

Make the National Student Survey Great Again!

The NSS data came out last week. This year it’s a new set of questions – some are the same as in previous surveys, some are amended versions of previous questions, and some are entirely new. This means that year-on-year comparisons need to be treated with a little caution.

But one aspect of the reporting continues to bother me. The survey measures final-year undergraduate students’ responses to a number of statements – for instance, “Overall, I am satisfied with the quality of the course” – on a Likert scale: that is, a 1-5 scale, where 1 = definitely disagree; 2 = mostly disagree; 3 = neither agree nor disagree; 4 = mostly agree; and 5 = definitely agree. The data is presented by simply summing the percentages who respond 4 or 5 to give a ‘% agree’ score for every question at every institution. Which in turn means universities can say “93% satisfaction” or whatever it might be.

This is simple and straightforward, but loses important data which could be summarised by using a GPA (Grade Point Average) approach – just like the HE sector commonly uses in other settings, for instance in REF outcomes. Using a GPA, an overall score for a question reflects the proportion giving each of the five different responses.

To calculate a GPA, there’s a simple sum:

GPA = (proportion saying ‘1’ x 1) + (proportion saying ‘2’ x 2) + (proportion saying ‘3’ x 3) + (proportion saying ‘4’ x 4) + (proportion saying ‘5’ x 5), where each proportion is the percentage of respondents divided by 100.

This gives a number which will be 5 at most (if all respondents definitely agreed) and a minimum of 1 (if all respondents definitely disagreed).
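If it helps, here’s the same sum as a few lines of Python – a minimal sketch, assuming the five responses arrive as percentages that total 100, as in the published NSS data. The split in the example (61% ‘mostly agree’, 39% ‘definitely agree’) matches the real case I discuss below.

```python
def gpa(percentages):
    """Grade Point Average for one NSS question.

    `percentages` maps each response (1-5) to the percentage of
    respondents giving it; the values are assumed to sum to 100.
    """
    return sum(score * pct / 100 for score, pct in percentages.items())

# Illustrative: 61% 'mostly agree' (4) and 39% 'definitely agree' (5)
print(gpa({1: 0, 2: 0, 3: 0, 4: 61, 5: 39}))  # 4.39
```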

If GPA was used for the reporting, there’d still be one number which users would see, but it would contain more nuance. GPA measures how extreme people’s agreement or disagreement is, not just the proportion who are positive. And this matters.

I looked at the raw data for all 457 teaching institutions in the 2017 NSS. (This is not just universities but also FE colleges – which work with universities to provide foundation years, foundation degrees and top-up degrees – and alternative providers.) I calculated the agreement score and the GPA for all teaching institutions for question 27: Overall, I am satisfied with the quality of the course. And then I rank-ordered the institutions using each method.

What this gives you is two ordered lists, each with 457 institutions in it. Obviously, in some cases institutions get the same score; where this happens, they all share the same rank. An institution’s rank reflects the number of institutions above it in the rank order.

So, for example, on the ‘agreement score’ method, 27 institutions score 100%, the top score available in this method. So they are all joint first place. One institution scored 99%: so this is placed 28th.  Similarly, on the GPA ranking, one institution scored 5.00, the top score using the GPA method. The next highest score was 4.92, which two institutions got. So those two are both joint second.
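For anyone wanting to reproduce the ranking, this is what’s sometimes called ‘competition ranking’: tied institutions share a rank, and the next score down takes a rank equal to the number of institutions above it plus one. A rough sketch (pandas users can, I believe, get the same effect with rank(method='min') on a Series):

```python
def competition_rank(scores):
    """Rank institutions so that ties share a rank and the next rank
    reflects how many institutions sit above them."""
    ordered = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    ranks, prev_score, prev_rank = {}, None, 0
    for position, (name, score) in enumerate(ordered, start=1):
        if score != prev_score:
            prev_rank, prev_score = position, score
        ranks[name] = prev_rank
    return ranks

# Illustrative scores only: three institutions tied on 100%, one on 99%
print(competition_rank({"A": 100, "B": 100, "C": 100, "D": 99}))
# {'A': 1, 'B': 1, 'C': 1, 'D': 4}
```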

What I did next was compare the rank orders, to see what difference it made. And it makes a big difference! Take, for example, the Anglo-European College of Chiropractic. Its 100% score on the ‘agreement score’ method puts it in joint first place. But its GPA of 4.39 places it in joint 79th place. In this instance, its agreement score was 61% ‘mostly agree’ and 39% ‘definitely agree’. Very creditable. But clearly not as overwhelmingly positive as Newbury College, which with 100% ‘definitely agree’ was joint 1st on the agreement score method and also in first place (and all on its own) on the GPA measure.

The different measures can lead to very significant rank-order differences. The examples I’m going to give relate to institutions lower down the pecking order.  I’m not into name and shame so I won’t be saying which ones (top tip – the data is public so if you’re really curious you can find out for yourself with just a bit of Excel work), but take a look at these cases:

Institution A: With a score of 87% on the agreement score method, it is ranked 138/457 overall: just outside the top 30%. With a GPA of 3.95, it is ranked 349/457: in the bottom quarter.

Same institution, same data. 

Or try Institution B: with an agreement score of 73% it is ranked 382/457, putting it in the bottom one-sixth of institutions. But its GPA of 4.28 places it at 129/457, well within the top 30%.

Again, same institution, same data.

In the case of Institution A, 9% of respondents ‘definitely disagreed’ with the overall satisfaction statement. This means that the GPA was brought down. Nearly one in ten students were definitely not satisfied overall.

In the case of institution B, no students at all disagreed that they were satisfied overall (although a decent number, more than a quarter, were neutral on the subject). This means that its GPA was higher, but its agreement score was dragged down by the non-committal quarter.

I’m not saying that institution A is better than B or vice versa. It would be easy to argue that the 9% definitely disagree was simply a bad experience for one class, and unlikely to be repeated. Or that the 27% non-committal indicated a lack of enthusiasm. Or that the 9% definitely disagree was a worrying rump who were ignored. But what I am saying is that we’re doing a disservice by not making it easier for applicants to access a more meaningful picture.

The whole point of the National Student Survey is to help prospective students make judgements about where they want to study. By using a simple ‘agreement’ measure, the HE sector is letting them down. Without any more complexity we can give a more nuanced picture, and help prospective students. It’ll also give a stronger incentive to universities to work on ensuring that nobody is unhappy. Can this be a bad thing?

GPA is just as simple as the ‘agreement score’. It communicates more information. It encourages universities to address real dissatisfaction.

So this is my call: let’s make 2017 the last year that we report student satisfaction in the crude ‘agreement score’ way. GPA now.

Tuesday, 1 August 2017

Value for money

Universities seem to be having a torrid time, at least as far as their standing in the political firmament goes. As well as pension headaches for USS member institutions (mostly the pre-1992s), there are high-profile stories on VC salaries, Lord Adonis' campaign about a fee-setting cartel, and (low) teaching contact hours. So far, so not very good at all.

There's a feeling that this might be more than a quiet season set of grumbles: David Morris at Wonkhe writes interestingly on this. For what it's worth, I suspect that this is indeed politically driven rather than accidental. Maybe Lord Adonis is marking out ground for his re-emergence within a new model Labour Party; maybe Jo Johnson is preparing for tough discussions around future fees. But whatever the end point, it's worth looking at whether the concerns are real.

An underlying point is value for money. The charge is that (English) students don't get a lot for their money. One quick way to look at this is university spend on staff, the single biggest item in universities' accounts. HESA publish handy data on student numbers and staff numbers. It's straightforward to calculate the ratio of students to academic staff over the years.

source: HESA, my calculations
The data show that from 2004-05 to 2011-12, for every member of academic staff there were about 14 students. In 2012-13 - the first year of the new fees regime in England - this ratio started to fall, and by 2015-16 there were just over 11 students for every member of academic staff.
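The calculation itself is nothing fancy – divide the student headcount by the academic staff headcount for each year. A sketch with rounded, illustrative figures (not the actual HESA counts):

```python
# Rounded, illustrative headcounts only - the real figures come from the
# HESA student and staff records for each year.
students = {"2004-05": 2_200_000, "2011-12": 2_300_000, "2015-16": 2_280_000}
staff    = {"2004-05":   155_000, "2011-12":   165_000, "2015-16":   200_000}

for year in students:
    ratio = students[year] / staff[year]
    print(f"{year}: {ratio:.1f} students per member of academic staff")
```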

Does this mean that the stories of low contact hours, and questionable value for money are wrong? Not necessarily - the data doesn't speak to the reality at individual universities or programmes, nor does it describe any individual student's experience. But it does show that universities have invested in the most important element of their provision: academic staff.

Monday, 24 July 2017

Reflections on the UK 'Carnegie Classification'

I posted yesterday on how UK universities and higher education institutions would map onto the US Carnegie classifications. That post simply presented the data; it's worth a little reflection on what the data show.

Unsurprisingly, perhaps, the Doctoral classification is the biggest, but there is more differentiation than I expected. When I did a similar exercise 15-20 years ago there weren’t any UK HEIs in the Master’s or Baccalaureate categories - every institution was R1 or R2 (in the old Carnegie scheme). This reflects the broadening of the pool in the UK (there are more universities now than there were); but also, perhaps, the more selective approach to funding of research studentships. Some of the institutions which are now Master's universities made the Doctoral cut, if I remember correctly, on the old classification.

Also not present in this list are HE and FE colleges, which between them would occupy the Baccalaureate category, the Baccalaureate/Associate’s category and the Associate’s category. In the UK the equivalent of the Associate's category is Foundation Degrees and foundation years, but the principle remains the same. It's clear that if you want to understand the breadth of UK HE you need to look at HE delivered in FE colleges. This is a challenge to most of the usual narratives about UK higher education; perhaps it reflects the university-sector 'ownership' of some sector-wide bodies such as UCAS, HESA etc. (I'm using 'ownership' loosely here.)

And of course, as the UK did most of its historical oppressing in countries which are now sovereign, there isn’t an equivalent of the Tribal Colleges category.

Is this a useful analytical framework? At the moment the categories used are often driven by mission groups. Whilst membership of these is in part driven by data, it isn't transparent. And as mission groups are clubs not leagues, there isn't often relegation, although promotion does happen. (Remember the expansion of the Russell Group in 2012.) Perhaps we need a UK equivalent, to allow for more transparent analysis of the sector and how it is developing?

Sunday, 23 July 2017

Mapping UK universities against the US Carnegie Classifications

In 1970 the Carnegie Commission on Higher Education developed a framework for analysing universities and colleges, to facilitate research and policy development on higher education. The framework has continued to be developed by higher education researchers in the US. As England’s HE policy framework begins to approximate the US’s – with an increasing emphasis on diversity of institution, market entry and exit, and the student as consumer – in this blog I’m looking at the US classifications, what they tell us, and how the UK’s HEIs would map onto the Carnegie classification.

The current Carnegie Classifications – using a methodology last significantly updated in 2005 – divide universities and colleges into seven broad classes. In some classes there is further differentiation by scale of activity. These are:

  • Doctoral Universities (with subclasses R1, R2 and R3, defined by scale): institutions that award at least 20 PhD/DPhil degrees per year

  • Master’s Universities (with subclasses M1, M2 and M3, defined by scale): institutions that award at least 50 Master’s degrees per year

  • Baccalaureate Colleges: institutions where Bachelor’s degrees make up more than 50% of degrees awarded

  • Baccalaureate/Associate’s Colleges: institutions with at least one Bachelor’s programme and more than 50% of awards at Associate degree level

  • Associate’s Colleges: institutions whose highest award is an Associate degree

  • Special Focus Institutions: institutions where more than 75% of degree awards relate to a single field or set of related fields

  • Tribal Colleges: institutions which are members of the American Indian Higher Education Consortium

The first four categories are applied hierarchically: if you’re Doctoral, you’re not counted in Master’s; if you’re Master’s, you’re not counted in Baccalaureate, even though numerically you’d meet both sets of criteria.

An associate degree is a two-year undergraduate qualification. Typically it would equate to the first two years of the four-year baccalaureate degree. 
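To make the hierarchy concrete, here’s a rough sketch of the assignment logic in Python. The thresholds follow the definitions above; everything else (applying the specialist test first, and collapsing the scale-based subclasses) is a simplification of mine, not the official Carnegie methodology.

```python
def classify(phds_awarded, masters_awarded, bachelors_share, largest_field_share):
    """Simplified Carnegie-style classification.

    `bachelors_share` and `largest_field_share` are fractions (0-1) of all
    awards in the year. Applying the specialist test first is my own
    simplification, not the official rule.
    """
    if largest_field_share > 0.75:
        return "Special Focus Institution"
    if phds_awarded >= 20:
        return "Doctoral University"
    if masters_awarded >= 50:
        return "Master's University"
    if bachelors_share > 0.5:
        return "Baccalaureate College"
    return "Baccalaureate/Associate's or Associate's College"

# Illustrative figures only
print(classify(phds_awarded=150, masters_awarded=900,
               bachelors_share=0.6, largest_field_share=0.2))
# Doctoral University
```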

Using HESA data from 2015-16, it’s possible to match UK institutions to these categories. I haven’t done the detailed analysis required to categorise Doctoral universities as R1, R2 or R3; or Master’s universities similarly.

All UK HEIs – or at least those which reported to HESA in 2015-16 – fall within one of the Doctoral, Master’s, Baccalaureate or Specialist classes. Here’s the classification.

Doctoral Universities

  • Aberystwyth University
  • Anglia Ruskin University
  • Aston University
  • Bangor University
  • Birkbeck College
  • Birmingham City University
  • Bournemouth University
  • Brunel University London
  • Canterbury Christ Church University
  • Cardiff Metropolitan University
  • Cardiff University
  • City, University of London
  • Coventry University
  • Cranfield University
  • De Montfort University
  • Edinburgh Napier University
  • Glasgow Caledonian University
  • Goldsmiths College
  • Heriot-Watt University
  • Imperial College of Science, Technology and Medicine
  • Keele University
  • King's College London
  • Kingston University
  • Leeds Beckett University
  • Liverpool John Moores University
  • London Metropolitan University
  • London School of Economics and Political Science
  • London South Bank University
  • Loughborough University
  • Middlesex University
  • Newcastle University
  • Oxford Brookes University
  • Queen Margaret University, Edinburgh
  • Queen Mary University of London
  • Roehampton University
  • Royal Holloway and Bedford New College
  • Sheffield Hallam University
  • St George's, University of London
  • Staffordshire University
  • Swansea University
  • Teesside University
  • The Institute of Cancer Research
  • The Manchester Metropolitan University
  • The Nottingham Trent University
  • The Open University
  • The Queen's University of Belfast
  • The Robert Gordon University
  • The School of Oriental and African Studies
  • The University of Aberdeen
  • The University of Bath
  • The University of Birmingham
  • The University of Bradford
  • The University of Brighton
  • The University of Bristol
  • The University of Buckingham
  • The University of Cambridge
  • The University of Central Lancashire
  • The University of Dundee
  • The University of East Anglia
  • The University of East London
  • The University of Edinburgh
  • The University of Essex
  • The University of Exeter
  • The University of Glasgow
  • The University of Greenwich
  • The University of Huddersfield
  • The University of Hull
  • The University of Kent
  • The University of Lancaster
  • The University of Leeds
  • The University of Leicester
  • The University of Lincoln
  • The University of Liverpool
  • The University of Manchester
  • The University of Northampton
  • The University of Oxford
  • The University of Portsmouth
  • The University of Reading
  • The University of Salford
  • The University of Sheffield
  • The University of Southampton
  • The University of St Andrews
  • The University of Stirling
  • The University of Strathclyde
  • The University of Sunderland
  • The University of Surrey
  • The University of Sussex
  • The University of Warwick
  • The University of Westminster
  • The University of Wolverhampton
  • The University of York
  • University College London
  • University of Abertay Dundee
  • University of Bedfordshire
  • University of Chester
  • University of Durham
  • University of Gloucestershire
  • University of Hertfordshire
  • University of Northumbria at Newcastle
  • University of Nottingham
  • University of Plymouth
  • University of South Wales
  • University of the Arts, London
  • University of the Highlands and Islands
  • University of the West of England, Bristol
  • University of Ulster
  • University of Wales Trinity Saint David
Master's Universities
  • Bath Spa University
  • Buckinghamshire New University
  • Edge Hill University
  • Falmouth University
  • Glyndŵr University
  • Liverpool Hope University
  • Newman University
  • Royal Agricultural University
  • Southampton Solent University
  • St Mary's University, Twickenham
  • The University of Bolton
  • The University of Chichester
  • The University of the West of Scotland
  • The University of West London
  • The University of Winchester
  • University College Birmingham
  • University of Cumbria
  • University of Derby
  • University of Suffolk
  • University of Worcester
  • York St John University
Baccalaureate Universities
  • Bishop Grosseteste University
  • Leeds Trinity University
  • SRUC
  • St Mary's University College
  • University of St Mark and St John
Specialist Focus Institutions
  • Conservatoire for Dance and Drama
  • Courtauld Institute of Art
  • Glasgow School of Art
  • Guildhall School of Music and Drama
  • Harper Adams University
  • Heythrop College
  • Leeds College of Art
  • Liverpool School of Tropical Medicine
  • London Business School
  • London School of Hygiene and Tropical Medicine
  • Norwich University of the Arts
  • Plymouth College of Art
  • Ravensbourne
  • Rose Bruford College
  • Royal Academy of Music
  • Royal College of Art
  • Royal College of Music
  • Royal Conservatoire of Scotland
  • Royal Northern College of Music
  • Stranmillis University College
  • The Arts University Bournemouth
  • The Liverpool Institute for Performing Arts
  • The National Film and Television School
  • The Royal Central School of Speech and Drama
  • The Royal Veterinary College
  • The University of Wales (central functions)
  • Trinity Laban Conservatoire of Music and Dance
  • University for the Creative Arts
  • University of London (Institutes and activities)
  • Writtle University College


Monday, 17 July 2017

How many overseas students are there?

With more-than-usual amounts of chatter around UK HE policy at the moment, it’s useful to remind ourselves about evidence and actuality. With this in mind, I’m presenting two views of UK HE and its reliance upon students from other countries.

The chart below shows this – it’s drawn from HESA data, and shows the headcount of students – all levels, all modes – from outside the UK. This includes other EU and EEA countries, and students from all other countries. It’s about domicile as recorded for fees purposes, which practitioners will know does not map simply onto passport.


The data show that there’s a wild and fluctuating market, with sharp declines and even sharper rises. It must be a nightmare for universities in such a market – no wonder so much emphasis is given to recruitment activities.

Let’s look at a different chart. Again, students from outside the UK. The data here tell a different story – a stable market (you might even say strong and stable). Universities can concentrate on what they do best – teaching and supporting students’ learning, and conducting research. Our global partnerships look safe and secure.


It’s the same data of course. The trickery is in the Y axis (the vertical one) – in the first chart the scale is truncated to start at 415k students; in the second chart the scale starts at 0. The first gives the detail, the second the big picture. The chicanery is that the eye is tempted to focus on the line, not the numbers. The conclusions drawn from the two charts are quite different.
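If you want to see the trick for yourself, a quick matplotlib sketch with made-up numbers (not the real HESA series) does the job – the only difference between the two panels is the y-axis limits:

```python
import matplotlib.pyplot as plt

# Made-up headcounts hovering around 430k - not the real HESA series.
years = list(range(2007, 2017))
students = [418_000, 425_000, 433_000, 441_000, 435_000,
            428_000, 430_000, 436_000, 438_000, 437_000]

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

ax1.plot(years, students)
ax1.set_ylim(415_000, 445_000)   # truncated axis: looks wild and fluctuating
ax1.set_title("Truncated y-axis")

ax2.plot(years, students)
ax2.set_ylim(0, 500_000)         # zero-based axis: looks flat and stable
ax2.set_title("Zero-based y-axis")

plt.tight_layout()
plt.show()
```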

There are two lessons in this.

Firstly, don’t be fooled by bad charts. Darrell Huff’s How to lie with statistics is essential reading for everyone, in my view.

Secondly, the detail does matter. Although total numbers are stable, the total is made up of the totals at each of the UK’s universities and HEIs – there are almost 200 components to this, and maintaining, or growing, your numbers makes the difference between adding jobs and giving a better student experience, or retrenchment, retraction and job losses.

So, look at the detail; remember the bigger picture might say something different; and try not to mistake the wood for the trees, and vice versa!


Monday, 10 July 2017

Cartel schmartel

The question of university tuition fees in England is causing brouhaha again. Part of the issue seems to be some Conservative ministers wanting to catch up on an issue where the Labour Party had the edge on them at the recent General Election. Part of the issue appears to be a Damascene conversion by Andrew Adonis, the behind-the-scenes architect of the 2004-06 increases in tuition fees, and in particular his assertions that universities are operating a cartel about fees. A particular trigger appears to be recent revelations about the amount of debt that students will incur and not repay (http://www.bbc.co.uk/news/uk-politics-40547740).

At heart, a lot of the issues appear to relate to the consistent £9k fee charged by universities in England. This costs more than the government planned (an estimated cost of £7.5k per student per year underpinned the calculations in 2010 and 2011). So why do universities charge £9,000 per year? And why do students pay?

(Cautionary note: although I graduated from LSE, I am not an economist. But I don’t think that the arguments I make are bad arguments. Second cautionary note – the Higher Education and Research Act and the Office for Students change a lot of the nomenclature in this, but the fundamentals remain the same.)

(By the way, here is a really good article from the BBC showing some data on this complex topic.)

Firstly, let’s look at the question of why students pay.

The student loans scheme in England is not like a normal loan. Repayments are contingent upon income levels (ie you don’t pay until you earn £21,000 per year, and then you pay a flat rate of 9% of income over £21,000 per year). You keep paying until you’ve paid off the debt, or until 30 years post-graduation.

This means that many won’t fully repay their student loans, as their income levels during the 30 years after graduation aren’t high enough for them to pay for enough years. It also means that, for any given level of income, the amount you repay each year is the same regardless of the amount of loan you took out. Let’s show how that works:

Student A borrowed £9,000 per year - £27,000 in total – to fund tuition fees at Poppleton University. They now earn £25,000 per year, and so every year they repay £360 – that is, 9% of £4,000, which is the difference between their income (£25,000) and the threshold (£21,000).

Student B went to the Poppleton Metropolitan University which charges £6,000 fees per year, so they borrowed £18,000 in total. They now earn £25,000 per year, and so every year they repay £360 – that is, 9% of £4,000, which is the difference between their income (£25,000) and the threshold (£21,000).

You can see that the annual amount of repayments is not related to the amount borrowed. Total amount repaid does relate – in principle – to the amount of borrowing, but you’ll only be expected to repay all of the debt if you earn enough.
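The repayment rule is simple enough to write as one line – a sketch using the £21,000 threshold and 9% rate quoted above:

```python
def annual_repayment(income, threshold=21_000, rate=0.09):
    """Annual student loan repayment: 9% of income above the threshold,
    regardless of how much was borrowed."""
    return max(0, income - threshold) * rate

# Student A borrowed £27,000; Student B borrowed £18,000.
# Both earn £25,000, so both repay the same amount each year:
print(annual_repayment(25_000))  # 360.0
```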

This leads to my first proposition: the loans system does not encourage students to be price-sensitive.

The second question relates to why universities charge £9,000. It’s important to understand how this £9,000 is made up. The legislation provides for a standard amount (£6,000 per year) and a variable element (initially an additional £3,000 per year, now £3,250.)

Any approved English HE provider can charge £6,000 per year. Institutions can charge an additional fee if they have an access agreement with the Director of the Office for Fair Access (OFFA).  The Access Agreement sets out what additional fee they plan to charge, and what they’ll do to ensure that this does not militate against fair access. The assumption was that only in exceptional cases would £9,000 be payable; but there was no mechanism to enforce this. OFFA had to judge each application on its merits. And so if, say, University A had got an access agreement for £9,000 fees with fair access spend of £500 per student, then University B which proposed £9,000 fee with £500 fair access spend per student would be able to argue that to deny them the right to charge £9k would be perverse. The quality of the university was not a consideration.

In that initial round, nearly every university had its access agreement approved without conditions; a few had to make revisions. But none were turned down. £9,000 proved not to be exceptional.

Why did universities even suggest such a fee level? In part because fees act as a marker for quality. If your rival charges £9,000, why would you charge less? To do so could be to indicate that you weren’t as confident as your rival in the value of your offer. It isn’t about greed or excess; simply about market position.

So this is my second proposition: there was no incentive for universities not to charge £9,000.

Also relevant was the actual behaviour of students. When setting higher fees for the first time, many university governing bodies recognised that they were entering the unknown. There was a recognition that student numbers might well fall. So a £9,000 fee, together with internal budgeting for fewer students, would make sense. We put on a brave face for the world, but plan for hard times. As it turned out, student numbers did not (after the first year) decline. And this is the third factor which I think is relevant.

Student recruitment has historically been controlled by the government, through a funding cap. Essentially, universities had a recruitment target set by HEFCE; there were penalties for exceeding this. When the cap was relaxed and then removed, as happened in 2013-14 onwards, universities were free to recruit as many students as they liked. This meant that a university which was growing could afford to spend resources on other activities, as in most cases the £9,000 fee was greater than the full cost of teaching. This enabled development of new subjects; investment in new buildings for teaching and research; and investment to improve reputation in, for instance, the REF.

This creates my third proposition: universities had many incentives to grow; few to remain the same size.

Put these three factors together, and a lot of the features of the current system become clear. £9,000 fees were the natural desire for universities. There was no price competition between universities, because enough students (in a growing market) were not price sensitive. So £9,000 becomes the norm.

This isn’t a cartel. Universities are by habit compliant with law and regulation, and the injunction ‘do not discuss fees with your peers’ was, in my experience, very well observed.  But it was bad regulation. The design of the system did not, beyond pious words, prevent £9k becoming the norm.

What would Nick do?
Equally, universities had for decades been enjoined to behave more entrepreneurially. In relation to PGT fees, universities were encouraged to see these as a price, not a cost. Would students pay the fee? If so, charge it. If the fee wasn’t enough, stop teaching the programme. And this habit infused undergraduate fee decisions. It would be possible to regulate or legislate for a fee regime that reflected cost not market price. But this wasn’t done.

So what is to be done now? I’ve a simple plan. Ask Nick Barr. He’s Professor of Public Economics at LSE, knows more about higher education funding than almost anybody else in the world, and is wise and fair-minded. Ask Nick Barr how to fund students and universities on a fair and sustainable basis. And do what he says.