Tuesday 25 August 2015

A framework for service quality - second draft

In July I posted about the framework for service quality which I'm working on. The post has generated a lot of interest (I can tell that from the Google Analytics) and some very thoughtful feedback.

I'm now sharing the second draft of the elements - you can download it from my Resources page as a pdf, and the pdf itself is here. In this document I ask a specific question for feedback - in relation to staff appraisals - and say a bit more about the concept of a maturity matrix. You can also see how the draft has changed from the first version, and, again, comments and feedback are welcome.

The point of the exercise is to create a diagnostic tool which anyone can use to help them measure and improve the quality of a professional service in higher education. It's published with a Creative Commons licence - you're free to use and amend it as you wish, as long as you acknowledge the source and share what you've done on the same basis.

You can post comments in reply to this blog; or reply to the tweets or linked-in posts which I use to publicise it. Or, if you'd rather, email me directly - hugh @ hughjonesconsulting.co.uk.

Thank you!

Monday 24 August 2015

An export business

With about a month to go until the start of the new academic year, universities are busy with admissions and preparations for enrolment. Nothing new about that. But it's worth looking at who is being admitted.

[Chart: total student numbers and the proportion of non-UK domiciled students, by year. Source: HESA, my calculations]
The chart shows two things.

The columns, in blue, represent the total number of students enrolled in universities in a given year. These are headcounts - actual people, not full-time equivalents - and include all levels of study: undergraduate, postgraduate taught and postgraduate research.

The line, in orange, represents the proportion of students whose domicile is outside the UK - that is, from any other EU country or from anywhere in the rest of the world. (A technical note for Theresa May, James Brokenshire and others - domicile is not identical to nationality; there will be a small number of people who count as domiciled outside the UK who have UK citizenship - it's very complicated...)

(A second technical note for data geeks - the rest of you can skip over this one. HESA changed population definitions and from 2007-08 did not include writing-up and sabbatical students within the overall student numbers, recording them separately without domiciliary data. The proportion of non-UK students is calculated on the basic HESA data; the total number of students is the raw HESA data plus the writing up/sabbatical data. The difference is negligible, but best to be clear.)
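(For anyone who wants to reproduce the arithmetic, here's a minimal sketch in Python using pandas. The column names and figures are invented for illustration, not the real HESA extract; the point is simply that the proportion comes from the detailed data while the total adds the writing-up/sabbatical count back in.)

import pandas as pd

# Illustrative data only. 'detailed' stands in for the standard HESA record,
# which carries domicile; 'writing_up' stands in for the separately-recorded
# writing-up/sabbatical count, which carries no domicile information.
detailed = pd.DataFrame({
    "year":     ["2012-13", "2012-13", "2013-14", "2013-14"],
    "domicile": ["UK", "Non-UK", "UK", "Non-UK"],
    "students": [1_800_000, 500_000, 1_790_000, 510_000],
})
writing_up = pd.DataFrame({
    "year":     ["2012-13", "2013-14"],
    "students": [15_000, 14_000],
})

by_year = detailed.pivot_table(index="year", columns="domicile",
                               values="students", aggfunc="sum")

# Proportion of non-UK students: calculated on the detailed data only.
by_year["pct_non_uk"] = by_year["Non-UK"] / (by_year["UK"] + by_year["Non-UK"])

# Total student numbers: detailed data plus the writing-up/sabbatical count.
by_year["total"] = (by_year["UK"] + by_year["Non-UK"]
                    + writing_up.set_index("year")["students"])

print(by_year[["total", "pct_non_uk"]])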

So the overall picture is one of growth - and it's too soon to see definitively whether there's a peak in 2010-11 or a temporary trough in 2012-13 and 2013-14. But the growing proportion of non-UK domiciled students adds to the picture: here's another chart, with one fewer axis:

[Chart: UK and non-UK domiciled student numbers, by year. Source: HESA, my calculations]
The blue is UK domiciled student numbers, the orange is students from the rest of the world. (Data geeks: I've assumed the split for writing-up students mirrors the split for PG students generally and calculated on this basis.)
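(If you want to follow that assumption through, the apportionment looks something like the fragment below - Python/pandas again, with invented figures rather than the real data.)

import pandas as pd

# Invented figures: postgraduate headcount by domicile, plus the writing-up/
# sabbatical total for which no domicile is recorded.
pg = pd.Series({"UK": 400_000, "Non-UK": 200_000})
writing_up_total = 15_000

# Assume writing-up students split between UK and non-UK in the same ratio
# as postgraduates generally, then add that split to each group's total.
writing_up_split = writing_up_total * pg / pg.sum()
totals = pg + writing_up_split
print(writing_up_split)   # UK 10,000 and Non-UK 5,000 in this illustration
print(totals)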

The chart seems to me to show that UK student numbers in 2013-14 are pretty much where they were in 2002-03 (actually about eight thousand fewer). The number of UK-domiciled students hasn't been static over the period (there were nearly a quarter of a million more in 2009-10 than in 2013-14), but the overall growth between 2002-03 and 2013-14 is driven by non-UK students.

This really does go to show that higher education is an export business. Universities UK regularly seeks to explain - to government and to the public - that universities are a major export industry. And with good reason - without overseas students in particular, many UK universities would be in financial difficulty. It would be a good idea - economically speaking - for the government to discount overseas students from its migration figures, and ease up on visa restrictions.

Thursday 20 August 2015

More fun with the NSS

The NSS data - about which I posted earlier this week - allows you to compare specific universities. I've analysed the data to show how the different university mission groups compare over the last six years.

For each mission group I've calculated the mean satisfaction score for each year across the mission group's members, unweighted by the number of students at each institution. The 'satisfaction score' is the proportion of students who agreed with the statement 'Overall, I am satisfied with the quality of my course' (Question 22 of the NSS). The mission group membership for each year is that at August 2015 - that is, I haven't taken account of historic changes, such as the addition of four members to the Russell Group in 2012.
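(The calculation itself is straightforward - roughly the pandas fragment below, where the institutions, group labels and scores are invented for illustration.)

import pandas as pd

# Invented data: one row per institution per year, with the Q22 satisfaction
# score and the mission group membership as at August 2015.
nss = pd.DataFrame({
    "institution":   ["A", "B", "C", "D"],
    "mission_group": ["Russell Group", "Russell Group", "1994 Group", "Unaligned"],
    "year":          [2015, 2015, 2015, 2015],
    "satisfaction":  [0.89, 0.87, 0.88, 0.85],
})

# Unweighted mean: every institution counts equally, regardless of its size.
group_means = (nss.groupby(["mission_group", "year"])["satisfaction"]
                  .mean()
                  .unstack("year"))
print(group_means)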

Although there are clearly differences between the mission groups, with sustained differences over time, it's important to recognise that the data show a large amount of satisfaction whatever the mission group - scores in 2015 range from 85% satisfaction to 89% satisfaction. Although every Vice-Chancellor will say that there's room for improvement, it's a good solid performance.

I've also been a little mischievous - the 1994 Group folded a few years ago but I've resurrected it for this comparison. It was formed of research-intensive universities, like the Russell Group, but they tended to be smaller and, as their informal strap-line had it, 'elite but not elitist'.  Most of the former 1994 Group members are no longer in a mission group, but the pattern of performance shows that mission group alone should not be taken as a guide to a university's performance.

Do the data mean that teaching and the student experience are better at the Russell Group? Not necessarily. Remember - the data show student satisfaction, which might be higher at Russell Group universities for other reasons. Maybe Russell Group universities are better at managing expectations; maybe students at Russell Group universities have more realistic expectations of university because their families have a history of going to university already - they have the social capital to know what to expect and to make the most of it. But of course it is also possible that students at Russell Group universities just do have a better experience ...

Monday 17 August 2015

If you're happy and you know it ...

The results of the 2015 National Student Survey (NSS) were published last week. The NSS asks all final year undergraduates in the UK to rate their satisfaction with various elements of their study. It’s a survey which has had plenty of criticism over time in relation to its usefulness, but it is here to stay.

The survey includes a question asking students to rate their overall satisfaction, and it is this that often generates the headlines. I’ve compiled a data set going back to 2010 for all UK universities, showing their performance on this question. The data shows the proportion of respondents who definitely or mostly agree with the statement ‘Overall, I am satisfied with the quality of my course’.

The overall trend is upwards: in 2010 the institutional average (mean) score was 81.8%; in 2015 it was 85.9%, and it has risen in every year. This might be because students are generally more satisfied; in my view it also reflects greater effort by universities to manage student expectations, to address issues, and to encourage higher rates of survey completion. (A good general rule is that the crosser someone is, the more likely they are to give feedback anyway – so the more responses you can encourage, the more of the contented majority you capture, and the better, on average, the responses will be.)

You can see some interesting patterns when you look at the four nations in the UK. England dominates the UK sector in numbers, so it's unsurprising that the England pattern is like the UK average. But the three other nations – Scotland, Wales, and Northern Ireland – show distinct patterns. Northern Ireland in particular performs well, and although this is only a small number of universities, its scores are consistently higher.

Headlines often focus, understandably, on who has the most satisfied students, but there's another measure possible. A benchmark is calculated for each university, reflecting sector average satisfaction levels but adjusted for the mix of students at the institution. Where performance above (or below) benchmark is statistically significant, this is flagged, and so we can see which universities consistently score better than the data say they ought to. These are places which, if the survey is to be believed, understand how to ensure that their students are satisfied.
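(Mechanically, the filter looks something like the fragment below - Python/pandas again, with an invented flag column standing in for the published significance marker.)

import pandas as pd

# Invented data: one row per institution per survey year; 'above_benchmark' is
# True where the score is flagged as significantly above the NSS benchmark.
flags = pd.DataFrame([
    {"institution": inst, "year": year,
     "above_benchmark": inst == "A" or year != 2013}
    for inst in ["A", "B"] for year in range(2010, 2016)
])

# Keep only institutions flagged above benchmark in every year, 2010 to 2015.
consistent = flags.groupby("institution")["above_benchmark"].all()
print(consistent[consistent].index.tolist())   # ['A'] in this made-up example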

I’ve identified those universities which – in every year from 2010 to 2015 – perform significantly better than their benchmark for overall student satisfaction. Here they are – in alphabetical order. This would be my go-to list if I wanted to understand how to make NSS results better:

University of Buckingham
Conservatoire for Dance and Drama
University of East Anglia
University of Essex
University of Keele
Loughborough University
University of Newcastle upon Tyne

Well done those universities.

Tuesday 4 August 2015

Focusing on what matters

Ernst and Young - a large firm of accountants, m'lud - have changed their staff recruitment practices. The BBC picked up the story, and undoubtedly it is of interest.

What's happening is that EY are no longer looking - during the early stages of recruitment - at an applicant's A levels or university degree. Applicants will still give these details, and they will be looked at by EY recruiters before appointments are made, but at the first stage EY will only go on the outcomes of their own tests.

EY use, it seems, online 'strengths assessments' which focus on Leadership, Commerciality, Networking and Influence, as well as numerical tests. By excluding information about a candidate's background and prior education, EY hope to remove a barrier to entry to the firm - moving away from practices which meant that such firms often recruited in existing staff members' own image: 'good' schools, 'good' universities, and similar (middle-class) backgrounds. Stop using a mirror!
[Image caption: Not the best interview panel]

This isn't a leap in the dark - EY have looked at 18 months' results of their own assessments and are confident that they capture all the attributes that they want. Nor is it political correctness. It is part of what helps equality - a focus on what you're looking for, not just what you see at first glance. And it is good news that EY don't assume that a good degree measures all that you might want in a future colleague. Getting the right people is the hardest part of any venture.

So will this mean the end of students chasing good universities? I doubt it: on the positive side, education in an intellectually challenging environment can never be a bad thing for a person. And on the negative side, I'm sure there'll be ways for candidates to make known where they studied. There's a loophole in every process: I'm sure many will have experience of age-blind recruitment processes where it's possible to use dates of degree awards and career progress to estimate a candidate's age in advance of meeting them.

There are two lessons for universities in this. Firstly, the importance of developing students' skills beyond the curriculum. And secondly, a question: are there things universities can learn about assessment? It'll be interesting to see whether other firms follow EY's lead.

Monday 3 August 2015

Staff costs

I’m looking at the issues around managing costs in higher education at the moment (look out for a post in the next few days about why not all vacant posts can be replaced). My starting point has been to look at the data, and in particular the high level data on staff costs.

[Chart: staff costs as a proportion of total income, by nation and year. Source: HJ calculation from HESA data]
The chart – which I have calculated using HESA data – shows the proportion of universities' total income which is spent on staff. (See my post from October 2014, which explains what this means and why it matters.) There is data for the four home nations plus UK-wide data, covering the period from 1994-95 to 2013-14, the most recent HESA data set available.
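(The calculation is simple enough to sketch - something like the pandas fragment below, with made-up figures standing in for the HESA finance record.)

import pandas as pd

# Made-up figures (£m): staff costs and total income by nation and year.
finance = pd.DataFrame({
    "nation":       ["England", "England", "Wales", "Wales"],
    "year":         ["2012-13", "2013-14", "2012-13", "2013-14"],
    "staff_costs":  [13_000, 13_400, 800, 840],
    "total_income": [24_000, 25_000, 1_400, 1_430],
})

# Proportion of total income spent on staff, by nation and year.
summed = finance.groupby(["nation", "year"])[["staff_costs", "total_income"]].sum()
summed["staff_share"] = summed["staff_costs"] / summed["total_income"]
print(summed["staff_share"].unstack("year"))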

The chart tells a story of good financial times in the early years of the Labour government in the late 1990s, and of closer management of spend following the introduction of top-up fees in 2006.

Of the four nations, England accounts for by far the largest share of overall income, so it's no surprise that the UK and England lines follow each other closely. Scotland, in turn, appears over time to track England more and more closely. The data for Northern Ireland looks peaky, but don't forget that it's a very small sector (only four institutions, of which two are very small), so individual institutional strategy will have a disproportionate impact. Wales looks to be going against the grain – increasing the proportion of income spent on staff compared to the other nations.

With devolved funding this can happen – this may reflect differences in funding for capital, for instance, rather than deliberate policy by Welsh universities to grow staff spend. But HEFCW in Llanishen might be interested to find out why, before the Assembly Finance Committee in Cardiff Bay asks the same question.