
Analysing large survey data using automated insights

Lumivero
Published: Jan. 10, 2023

NVivo Plus enables quick identification of themes and sentiment in university-wide student feedback.

An impossible task?

The National Student Survey (NSS) is an annual survey that gathers the opinions and experiences of final-year undergraduate students about the course they have studied. It is widely recognised as an authoritative and highly influential survey for building a picture of the quality of life in UK higher education.

At Lancaster University, as at many other institutions, the qualitative element of the survey had not to date been systematically analysed. The challenge was to develop a deeper understanding of the data collected in order to improve the student experience of living and studying at Lancaster.

This was taken on as a pilot project by Steve Wright, Ph.D., a Learning Technologist in the Faculty of Health and Medicine at Lancaster University.

Finding the right tool for the task

The NSS is very important to Lancaster, as the score it achieves has a major influence on the University’s standing.

Lancaster was awarded Gold in the Teaching Excellence Framework (TEF) following an outstanding NSS result (an 84.3% positive score) in 2016. It was also recently named University of the Year by The Times and The Sunday Times Good University Guide 2018.

The aim of the NSS pilot project was to feed into a broad spectrum of institutional activity in preparation for the University’s TEF submission and evaluation.

The first phase compared three possible tools for the qualitative analysis:

  • QDA Miner with WordStat
  • NVivo
  • Leximancer

NVivo was chosen because:

  • The interface is much more comprehensible
  • The institution is familiar with NVivo and has many skilled users
  • WordStat and QDA Miner are incredibly powerful but also incredibly complex
  • The outputs of the other tools lacked NVivo’s interactivity

NVivo’s ability to present data visually was also an important factor in the selection process. “In NVivo the visualisations enable you to explore the data, which makes it a very powerful tool for both analysis and presentation,” said Steve. Rather than put a bar chart in a report, he presents directly from within NVivo. “The ability to use it as an interactive presentation is what makes it so powerful – for example, being able to click on a column in the histogram and get the underlying data,” he said.

Testing the automation of sentiment and themes

Across the institution, Lancaster University received 8,000 NSS comments, amounting to 25,000 words to be analysed.

The NSS asked students to provide three open-ended comments:

  • A positive comment
  • A negative comment
  • A comment about the institution

Steve was interested in what insights could be gleaned from the data, particularly using a systematic approach that could be replicated. The approach would pull out key topics, group them, and then explore the sentiment related to those topics. The idea was that, because sentiment is already built into the NSS structure (students are asked for a positive and a negative comment), the accuracy of NVivo’s automated sentiment analysis (negative/positive) could be checked against the known comment types, and the result used as a measure of the confidence you can have in NVivo’s automated sentiment tagging for other datasets.

“We have loads of text, but mostly what happens is we share it, with no analysis and only basic structure, with the key people in a department for them to read through. And the easiest thing that happens, when they receive three or four pages of comments, is to read the first few, construct a narrative in their head and immediately get information bias,” said Steve.

“NVivo Plus tools were really good for this. I was able to take this minimally structured data – which only gave the department it related to and the type of comment – and then extract topics, cross-reference those with the sentiment, and provide summaries,” he said.
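
To illustrate the kind of cross-referencing Steve describes, the sketch below applies the same logic outside NVivo. It uses a hypothetical sample of minimally structured comments (department, comment type, an auto-coded topic and an auto-coded sentiment); the data values and column names are assumptions for illustration only, not part of the NVivo workflow used at Lancaster.

```python
import pandas as pd

# Hypothetical sample of the minimally structured data described above:
# each comment carries only its department and comment type, plus the topic
# and sentiment codes applied automatically. Values are illustrative only.
comments = pd.DataFrame({
    "department": ["History", "History", "Physics", "Physics"],
    "comment_type": ["positive", "negative", "positive", "negative"],
    "topic": ["teaching", "feedback", "teaching", "timetabling"],
    "sentiment": ["positive", "negative", "positive", "negative"],
})

# Cross-reference the extracted topics with sentiment: a count of comments
# for each topic/sentiment pairing, similar in spirit to a matrix coding query.
topic_by_sentiment = pd.crosstab(comments["topic"], comments["sentiment"])
print(topic_by_sentiment)

# The same cross-reference summarised per department.
by_department = (
    comments.groupby(["department", "topic", "sentiment"])
    .size()
    .unstack(fill_value=0)
)
print(by_department)
```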

Benefits of using NVivo

The analysis was well received, although it drew one critical response, which contended that it showed nothing beyond the earlier statistical analysis. Steve argued that was not the case. “The statistics show that students are broadly happy here. They like certain things, and the quantitative statistics show specific areas as being lower. However, what the statistics don’t do is give any real insight into the processes and experiences that inform those lower scores.”

“What the qualitative analysis allows you to do is pull out those topics and break them apart to see why some departments had a higher score – to identify good practice – and some of the specific reasons given where there were lower scores, to inform interventions and development, for example,” Steve said.

NVivo’s sentiment analysis capabilities played an important part in the data analysis, particularly given the way survey data is collected for the NSS.

“Because of the structure of the NSS, of positive comment and negative comment, we were able to cross-tabulate that with NVivo’s sentiment analysis and get a kind of built-in check of accuracy,” said Steve. “And it was very high. It tends to get things wrong by addition, not omission – i.e. it will classify something as both positive and negative when it’s just positive. The classic instance being ‘I had a load of personal problems and the department was fantastic.’ That is a positive comment, but, because it has the word ‘problems’ in it, it is automatically classified as negative as well.

“Overwhelmingly NVivo classifies it correctly. We know that there will be some false matches, but they’re a minority and, given the volume of data it enables us to work with, they can be accounted for,” he said. “What’s more, this gives us a baseline for being confident in the sentiment analysis when we apply this approach to other student feedback and comment data without this structure.”
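
The “built-in check of accuracy” can be thought of as a simple comparison between the comment type the NSS already records and the sentiment codes assigned automatically. The sketch below is a generic illustration of that logic on a hypothetical coded sample (not NVivo’s own output format): it measures how often the expected sentiment is present and flags the “wrong by addition” case where the opposite code has been applied as well.

```python
import pandas as pd

# Hypothetical coded sample: the comment type known from the NSS structure,
# and the set of sentiment codes applied automatically to each comment.
coded = pd.DataFrame({
    "comment_type": ["positive", "positive", "negative", "negative"],
    "auto_codes": [
        {"positive"},              # correct
        {"positive", "negative"},  # "wrong by addition", e.g. the 'problems' example
        {"negative"},              # correct
        {"negative"},              # correct
    ],
})

# Agreement: the automated coding includes the sentiment implied by the
# question the comment answered (missing it would be an error of omission).
coded["includes_expected"] = [
    ct in codes for ct, codes in zip(coded["comment_type"], coded["auto_codes"])
]

# Errors by addition: the expected code is present, but so is the opposite one.
coded["added_opposite"] = [
    ct in codes and len(codes) > 1
    for ct, codes in zip(coded["comment_type"], coded["auto_codes"])
]

print("Agreement rate:", coded["includes_expected"].mean())
print("Errors by addition:", int(coded["added_opposite"].sum()))
```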

Being able to share the project across the University with other staff who are familiar with NVivo, rather than sharing a static report, is an ambition for the future. Staff could then delve straight into the project and discover insights for themselves.

Future work

The University is planning to repeat the analysis next year and build upon the framework.

There has been real interest from those who have seen the analysis. “I think the real potential is with student or staff surveys. Most organisations have staff surveys, and they ask for extensive qualitative comments and usually don’t do any sort of systematic analysis of them,” said Steve.

The point of the project was to develop a method, and NVivo assisted with a better analysis of this data. “The questions were:

  • How could we develop this into a good method not just for this project, but with an eye towards other datasets to improve our institutional data analysis?
  • How can the institution actually use this data which is essentially about our student “customers” and improve the student experience?

And I really think the support NVivo provided has real potential for other sectors with those practical priorities for working with qualitative data, rather than the software being part of the somewhat arcane and highly theoretical pursuits of qualitative analysis within academia,” said Steve.

He also suggests that there is a significant opportunity for commercial and public-sector organisations that need to work with unstructured datasets to analyse customer experience, with considerable potential for further development of the methods and approaches introduced here.

About Steve Wright

Earning his Ph.D. in E-Research and Technology Enhanced Learning in 2014, Steve Wright works as a Learning Technologist in the Faculty of Health and Medicine at Lancaster University in the UK. He is also an independent CAQDAS trainer, consultant and certified NVivo expert. As a researcher, he has completed five small-scale research projects in addition to his Ph.D. thesis on sensory learning, with a focus on craft beer, to which he took an ethnographic approach.

As an academic-related professional, he is particularly interested in e-research: discovering what is possible with digital tools and how they will influence new approaches remains his focus. He also has an interest in the development, research, and teaching of methods. Steve’s consultancy and training work is available through www.caqdas.co.uk.

About QSR International

Every day, QSR International helps 1.5 million researchers, marketers and others to utilize Qualitative Data Analysis (QDA) to uncover deeper insights contained within the “human data” collected via social media, consumer and community feedback and other means. We give people the power to make better decisions by uncovering more insights to advance their area of exploration.

