What's working and why?
As a software supplier and partner to over 70 universities, Lorensbergs is monitoring how universities are responding to the challenges of measuring and understanding student satisfaction levels. Being able to enhance the student experience is key when developing effective student-facing software - it's a topic of high importance to Lorensbergs and our university customers. In this briefing, we explore how universities are remaining focused on student satisfaction and teaching quality measurement but identify some wide variations in how it’s being approached. We hope you enjoy reading the findings provided in this latest briefing.
What’s really getting measured?
It’s no exaggeration to say that established student satisfaction ratings, like the yearly Wall Street Journal/Times Higher Education College rankings, have become compulsory reading for all university administrators and faculty. Yet these surveys are not without their critics.
Are they reliable barometers of teaching quality and academic merit, or are they really measuring a different set of variables altogether? A recent study of U.S. student satisfaction survey data by the University of Groningen provides some intriguing insights.
The study suggests that the data capture student enjoyment (influenced to some extent by how lenient a teacher is or how little effort is required to take the class) more often than learning value, yielding a worryingly misleading evaluation of the real academic merits of institutions in the Higher Education sector.
Another problem exists though, and that's growing student despondency toward the many attempts to measure their satisfaction. There's certainly a need to review the approach and develop one that's more sustainable.
Can we really obtain a clear understanding of the quality of student experience and the true value of their academic pursuits in one annual survey? Critics suggest that we can’t, and that a more agile and responsive approach to harvesting student feedback is sorely needed.
This isn’t to decry the inherent value of garnering comprehensive student opinion. But it is to question whether data obtained via survey machinery with such lengthy lead times each year is the best way of gauging it. Important information invariably gets lost with such approaches, no matter how many ‘pillars’ of student satisfaction are solicited.
Growing numbers of institutions in the Higher Education sector are becoming increasingly hostile to the reductionist approach of college rankings published in magazines like US News, which wield enormous influence over where students choose to attend. This was reflected in a letter from the US academic excellence watchdog, the Education Conservancy, which was jointly signed by dozens of US college presidents.
The letter, published in 2007, urged other presidents to boycott such media-driven surveys. It's a concern that continues to this day, with recent new studies fueling the debate. But what if we were to sample student experience on a more continuous basis, using more organic and easy-to-access methods of collecting student feedback? Some key initiatives from ‘across the pond’ in the UK suggest that significant improvements in accuracy can indeed be achieved.
An initiative from the University of Edinburgh points to a means of obtaining student satisfaction feedback which is vastly more reliable and valid. After launching its “Student Experience Project” in 2012, the university appointed a student survey team who devised a nimble, easy solution: a central point of contact for student feedback which fed the results directly back to staff. This resulted in a series of interventions, with successful projects being developed into mainstream practices. Key to this work was the Student Communications Team, whose members kept the feedback loop alive by demonstrating that student opinion was being acted upon.
Communication and agility in responsiveness are important factors, and so is time. A major new study of 600 current students and graduates of U.S. universities found that institutions that focused on enhancing student satisfaction and attachment on a long-term basis outperformed those that focused predominantly on prestige, significantly strengthening their brand reputation and recruitment attractiveness in the process. Published in the August 2016 edition of the Journal of Business Research, the study (The role of brand attachment in higher education) encompassed the satisfaction and attachment not only of current students but also of graduates who had progressed into their careers.
Teaching quality was a significant parameter in student ratings, but so too were campus facilities, social life, atmosphere and employment opportunities. These deepened students’ feelings of connectedness to their university, which was sustained amongst graduates whose universities continued to make efforts to solicit their views and keep them connected to the institution.
Certainly, universities that view students as ‘customers’ who experience the ‘brand’ of an institution are going to greater lengths to track their opinions. For example, one institution employs a customer experience consultancy and uses touchscreen kiosks, countertop tablets and online surveys to get same-day feedback on events and new initiatives. In this way, a near constant stream of opinion is obtained and relayed to relevant departments.
Clearly, technology can help in this long-term endeavour: disposing of cumbersome paper questionnaires and emails, and opting instead for rapid-response methods.
Anecdotal evidence from faculty members suggests that it’s not uncommon for students to be asked to respond to more than 20 questionnaires a year. Is it useful to repeatedly ask students for their opinion like this?
Or is it more likely to result in survey fatigue?
But the evidence from Edinburgh suggests that when students see rapid responses and changes arising directly from their feedback, such fatigue evaporates. Its Student Communications Team emerges as exemplary, winning sustained student participation and interest throughout the process.
Universities in the U.S. are undoubtedly getting a lot of things right: the prestigious QS World University Rankings found that U.S. institutions dominated the subjects league table, leading in 31 of the 42 subjects monitored (the closest rival was the UK, leading on just eight of the subjects). But there’s no room for complacency: the latest ICEF i-graduate Agent Barometer found that of over 1,000 agents from 108 countries polled, 67 percent rated the US 'very attractive' in 2016. That sounds good, but it represents a decline from the 77 percent achieved in 2015.
The way forward
Agile, easily accessible and rapidly responsive sampling techniques enabled by digital technologies offer much promise, as results discussed above demonstrate. With these techniques, valid feedback can continue to be pooled and acted upon while survey fatigue becomes a thing of the past. Most importantly, student satisfaction and accomplishment can both continue to rise.
Lorensbergs is the company behind connect2, the equipment reservation system that’s ideal for media centers and equipment loan desks, putting the focus on accessibility and efficiency.
Connect2 is designed for universities and colleges to showcase equipment and resources, manage inventory and offer students an easy-to-use reservation facility. Most importantly, it's a system that students love using - customer feedback tells us that 88% of student users rate connect2 as ‘excellent’ or ‘good’.
For further information on connect2, please contact Danny Thomas on 646 583 2215 or email firstname.lastname@example.org