International Strategy for Higher Education Institutions
Posted by Vicky Lewis
This is the twelfth and, you'll be relieved to hear, final blog in my series sharing insights and emerging ideas on ways to measure international success, based on a review of university international strategies. Links to earlier blogs in the series are provided at the end of this one.
In this final blog, I review some of the constructive and insightful comments made when I’ve shared my KPI blogs on LinkedIn. It was truly gratifying that this series stimulated such lively debate, and I appreciate that so many of you took the time to share your own observations.
I’m going to go through each of the thematic blogs in turn.
In response to the blog on TNE students, programmes and partnerships, Jenny Jenkin flagged up the impact on staff when there’s an ‘emergent approach’ to the operating model. It inevitably takes longer than planned to get a TNE operation up and running – and KPIs are often overambitious in this respect. This can be problematic, especially when executive teams and governing bodies lack understanding of the time needed to build relationships and adapt to different cultures and operating processes. This point resonated with Douglas Proctor and Ines Teresa-Palacio, who both also highlighted the importance of risk management KPIs when it comes to TNE. Moving to a more outward-looking perspective, there was an interesting exchange with Ishan Cader about how institutions can best measure the societal impact of TNE in the host country.
The blog on international student recruitment KPIs prompted Ruth Arnold to highlight the importance of measuring the impact of alumni around the world, in order to connect international recruitment with ‘the wider purpose of education’. Rhianna Skeates noted the challenges around evidencing the employment outcomes of international students due to the paucity of data. Vincenzo Raimo contrasted the emphasis in the UK with the more values-led approach to international engagement which shapes priorities and success measures at many continental European universities. Eddie West shared a comment made by a colleague that ‘international students should be valued first and foremost for their contributions to the curriculum, per se, and not for their contributions to the institutional bottom line’. And Julie Vincent highlighted the need to balance data-heavy recruitment / income targets with primary research that involves listening to students and measuring success in terms of what they value.
In response to my blog on international and intercultural experiences and exposure, Nita Timmerman welcomed the emphasis on Internationalisation at Home. Kathleen Griffiths noted the growing importance of new forms of collaboration – in the shape of COIL and short-term programmes – as there’s a move away from traditional one-for-one reciprocal exchanges. I also had an interesting offline exchange with Elspeth Jones about the tendency, when it comes to KPIs relating to the curriculum, to measure success in terms of ‘proportion of international content’ in programmes, rather than achievement of meaningful internationalised learning outcomes.
The blog on international student experience, success and alumni engagement led Vincenzo Raimo to remind us of the adage from some other industries that ‘happy customers = repeat orders’, reinforcing the link between positive student experience and healthy, sustainable student recruitment. Kamila Malavia and I had an interesting exchange about the danger of overlooking KPIs relating to international student and alumni experience when there is poor integration across different university functions. The recommendation in the blog to find ways to gauge – and enhance – international students’ sense of belonging resonated with Jenny Jenkin.
My blog on internationalised research and knowledge exchange noted the danger of some potential KPIs falling down the cracks between research strategy and international strategy, and the need to have conversations to align these strategies and develop mutually reinforcing KPIs. Ines Teresa-Palacio mentioned a ‘Global and Cooperation’ strategy she had seen, which was designed mainly around the UN Sustainable Development Goals.
In response to the blog on KPIs relating to an international staff base and development opportunities, Douglas Proctor highlighted the driver of international rankings, for which the proportion of international staff is a key metric. Sandy Bhangal-Chaib noted that ‘few institutions actually take advantage of, develop and utilise the international perspectives, expertise, and networks of their staff’, which is a missed opportunity – a point echoed by George Abraham. There was interest in the idea of having KPIs linked to the development of global leadership competencies (as one institution does). As Kamila Malavia pointed out, ‘when international engagement is valued and embedded within frameworks for recognition and progression, it moves beyond policy commitments and into everyday academic and professional life, where it can truly influence culture’.
The final thematic blog on KPIs linked to global rank and reputation caused Kamila Malavia to reflect on how younger universities are often deeply engaged in innovative, impactful, mission-driven international work, which is not recognised within the traditional criteria used to formulate global rankings. Ines Teresa-Palacio highlighted the urgent need to understand and unpick what ‘reputation’ means for each institution and stakeholder group, using this as a starting point for defining relevant KPIs. Jenny Jenkin noted the cost in internal resources of fixating on climbing the rankings, while Gareth Topp – and others – observed that, while universities remain reliant on international fee income, rankings are here to stay since they are such a major driver of student decision making. Gareth also noted the growth in importance of employability as evidence of ROI – and the fact that, in some markets, high ranking is perceived to be a proxy for strong employment outcomes.
Some of the blogs sparked broader reflections.
Being honest about the ‘why’
Dörte Stevenson noted the importance of questioning why we are doing what we’re doing. This links to a fascinating exchange of comments with Vincenzo Raimo about the importance of KPIs reflecting an honest evaluation of strategic purpose.
Complementing and contextualising hard KPIs with softer ones
Abigail Gregory suggested that Faculty-level strategies may address some of the ‘softer’ metrics that sometimes get omitted at institutional level. There was a widely shared view that KPIs can be useful but should be contextualised, and that numerical KPIs should be complemented by qualitative analysis.
Alignment between strategy, KPIs and practice
George Abraham raised the disconnect between strategy and practice (what people prioritise in their daily work). Gareth Topp commented on the need for accountability for KPIs, the importance of leading (as opposed to lagging) indicators, and the value of regular progress reporting. Not performative reporting, but the kind that leads, where necessary, to action being taken!
Horizontal alignment across different institutional strategies
Observations about the interrelatedness of different strands of strategy prompted Vinitha Gengatharan to reflect on the ‘intertwined complexities and challenges’ that institutions must navigate. There are opportunities for KPIs linked to different institutional priorities to reinforce one another. However, if horizontal alignment is poor, there’s also a risk that they may be inconsistent or even conflicting.
Taking a long-term view – to beyond the end of the strategy period
Ruth Arnold highlighted the less tangible – but nonetheless hugely valuable – benefits of international education, noting that ‘when students meet and connect with one another as individuals at a formative age it changes their perception not only of one another but of themselves and the world we share’. She suggested that ‘the short term metrics of attainment and retention are only the beginning of a lifelong process, and the income gained by an institution or a country step one in what can be a way of forming authentic connections with the power to do long term good’. The true measures of success don’t always fit neatly within a 5 or even 10-year strategy envelope.
I’ve found the discussions sparked by my exploration of the KPIs in UK university international strategies fascinating – and there are lots of areas it would be interesting to follow up. I need a bit more time to reflect on that.
However, I keep coming back to the notion that the keys to developing an effective (and distinctive) international strategy (supported by KPIs that are useful) are:
Oh, and not trying to do everything.
Making considered strategic choices (including choosing what to leave out) is pretty important too!
Here are some of the key questions, specifically relating to measuring success, which I proposed back in 2021 in the conclusion to my Global Strategies Report.
New ways of measuring success - key questions:
What would you add to these?
Future blogs
If you have any thoughts about international strategy related topics that it might be worth me exploring in more detail, please let me know. I’m open to suggestions.
And, if you’d like to receive occasional email updates from me, you can sign up here. They are very sporadic (you won’t get bombarded). I mainly use the updates to flag up blogs I’ve written or other pieces I’ve contributed to. Sometimes I throw in links to other people’s insights too!
Earlier blogs in the series
Part 1 – What sits below the top of the iceberg?
Part 2 – Characteristics and key themes
Part 3 – Good KPIs, traps and tips
Part 4 – TNE students, programmes and partnerships
Part 5 – Attracting international students to the home campus
Part 6 – International and intercultural experiences and exposure
Part 7 – International student experience, success and alumni engagement
Part 8 – Internationalised research and knowledge exchange
Part 9 – International staff base and development opportunities
Part 10 – Global rank and reputation
Part 11 – What’s missing – and do we really need to measure everything?