International Strategy for Higher Education Institutions
Posted by Vicky Lewis
This is the tenth in my series of blogs sharing insights and emerging ideas on ways to measure international success, based on a review of university international strategies. Links to earlier blogs in the series are provided at the end of this one.
Despite well-rehearsed reservations about the commercial global rankings, these are often used (by institutions and their stakeholders) as a measure of reputation and a marker of prestige.
I was amused to find out recently that the term ‘prestige’ used to be a synonym for ‘practising juggling or sleight of hand’, and is derived from a Latin adjective meaning ‘full of tricks’ or ‘deceitful’.
It can feel as if institutions are ‘practising sleight of hand’ when they focus on optimising specific metrics simply to climb the rankings; and the rankings agencies open themselves up to accusations of ‘deceit’ when they offer reputation-building consultancy to institutions with one hand, while compiling various rankings with the other.
This insightful piece in International Higher Education by Chris Glass and Gerardo Blanco investigates the worrying dynamics of what they describe as the ‘new analytics-industrial complex’, reflecting a transition on the part of companies like THE and QS from ‘rankings to regulatory power’.
Some institutions (e.g. Utrecht University) have publicly withdrawn from global rankings. Others are signing up to initiatives such as More Than Our Rank. Meanwhile, the rankings organisations have branched out into new, more specific rankings such as the THE Impact Rankings and QS Sustainability Rankings.
The former allows institutions to differentiate themselves by selecting their ‘priority SDGs’ – but still ends up with a global ranking.
The very notion that you can reliably rank institutions that have widely differing missions and contexts (apples, oranges and – in the illustration for this blog – a pineapple) is, when you think about it, ludicrous.
That doesn’t stop rankings being used by a whole host of stakeholders – from prospective students to institutional partners – to inform decisions.
So, are rankings still a popular KPI in international strategies? And what other measures of global reputation are used?
My analysis of the KPIs of six published international strategies with an end date between 2024 and 2030 shows that only one includes a KPI linked to global rank or reputation (and this is carried through from the main institutional strategy).
In this instance, the KPI is ‘ranking in the THE World Reputation Rankings’ and the target is ‘Top 100 in the world’.
When I was reviewing UK university strategic plans in 2020 (134 of them), in those cases where there was a single international KPI (as opposed to several), the most frequent choice was ‘global rankings’ (even ahead of international student enrolments).
It can be argued that such rankings are actually a composite KPI, since they reflect performance in a range of different areas. However, the areas prioritised are selected by the rankings compiler, rather than being tailored to the priorities of any individual institution. As a result, peer groups are chasing the same measures of ‘success’ and institutional distinctiveness is often diluted.
Across the international strategies I reviewed in 2020, KPIs relating to ‘global rank and reputation’ fell into two thematic categories: those that were focused largely on climbing the rankings; and those that emphasised different aspects of brand awareness, impact and engagement.
Climbing the rankings
To supplement general ranking-climbing KPIs (e.g. ‘achieve a position in the top X in the Y rankings’), there were some more detailed ones such as:
And (from the same institution)
Another institution mentioned the QS reputation survey as one way of measuring ‘positive recommendation and endorsement’ by overseas academics and professionals.
Increasing brand awareness, impact and engagement
One area of focus in this category was improving communications to enhance awareness of the university. This included KPIs such as:
Looking at things from the outcomes perspective, other common KPIs related to measuring the reach and impact of communications (with metrics including media coverage and social media mentions / content). One institution sought to:
Beyond the media profile / communications dimension, there were some one-off reputation-linked KPIs such as:
I’m aware of another, more recent, international strategy where the hosting of international conferences is included as a reputation-related KPI.
As you can probably tell from the introduction to this blog, I’m not a fan of the whole concept of ranking institutions. I would certainly be wary of reducing ‘international success’ to a single KPI linked to climbing the rankings.
If a rankings KPI has to be included, it shouldn’t be allowed to dominate institutional thinking about reputation-building. A long-term, sustainable approach is to focus on establishing a global reputation in different ways (with an uplift in the rankings being a positive by-product), rather than taking a short-term, tactical approach to ‘playing the rankings game’.
If the strategy highlights specific areas (thematic and/or geographic) where the institution seeks to build its international reputation, a campaign approach can be adopted with associated measures of impact and engagement (mission-aligned media mentions, stakeholder engagement etc., but also some of the other indicators mentioned above such as international accreditations or inward missions).
A strong theme in the narrative of some strategies is linking local and global communities, nurturing civic, academic and employer networks and tackling shared interest projects collaboratively. Yet this is rarely reflected in KPIs.
I’m sure the reputation experts among you can think of other international KPIs to help institutions focus on the things that matter (and that make them distinctive). I’d love to hear your ideas.
In my next blog, which I think is going to be the final one in this series (though I reserve the right to change my mind as I have a feeling there’s a lot to say!), I’ll be trying to wrap things up with some missing metrics, some key learning points from my research and – just to keep things interesting – some challenges to the value of KPIs.
Part 1 – What sits below the top of the iceberg?
Part 2 – Characteristics and key themes
Part 3 – Good KPIs, traps and tips
Part 4 – TNE students, programmes and partnerships
Part 5 – Attracting international students to the home campus
Part 6 – International and intercultural experiences and exposure
Part 7 – International student experience, success and alumni engagement
Part 8 – Internationalised research and knowledge exchange
Part 9 – International staff base and development opportunities