For our study, 90 students were chosen to participate, spanning a range of subject areas, levels of study and self-reported confidence in using GAI tools. Before beginning the tasks, half of the participants worked through an online tutorial on using the CLEAR method (adapted from Lo, 2023) to create effective GAI prompts and evaluate outputs. All participants were then asked to complete a variety of tasks using a GAI tool of their choice and to explain their satisfaction with each output. Our findings showed that students who completed the online tutorial had a significantly higher mean rank score for following the CLEAR method (59.88, N = 45) than students who did not complete the tutorial (31.12, N = 45), U = 365.500, p < 0.001.
Please briefly describe the activity undertaken for the case study
Entering relevant prompts is essential to getting the desired results from any GAI tool. We developed an adapted version of the CLEAR method, which was detailed in the online tutorial completed by half of the students. The tutorial explored the following principles (an illustrative example follows the list):
- Being Clear and specific
- Laying out the context and giving background information
- Experimenting with different tools and formats
- Adjusting the prompt as needed
- Reviewing and critically evaluating the output
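For illustration, a prompt written along these lines might look like the following (a hypothetical example, not one taken from the study):

```
Context: I am a second-year undergraduate preparing a 10-minute presentation
on renewable energy for a non-specialist audience.

Task: Suggest an outline of five slides, with one key point per slide, each
under 25 words and written in plain English.
```

If the first output misses the mark, the prompt can then be adjusted (for example, "make slide three more practical") and the final output critically checked for accuracy before use.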
The aims of our project were:
- To discover how proficient students are in writing prompts and evaluating outputs produced by GAI tools for various academic tasks.
- To examine student satisfaction with outputs generated by GAI tools.
- To establish whether completing an online tutorial improved students' prompt literacy and satisfaction with GAI outputs.
- To identify areas of good and poor practice in creating prompts, which could inform an online tutorial that further supports students in using GAI effectively.
How was the activity implemented?
We launched a survey to gauge student interest in participating in the project, promoted via a Canvas announcement on our KnowHow courses. The survey asked for each student’s name, email address, level of study, confidence in academic writing and confidence in using GAI tools. Ninety students were selected at random to participate in the study. All students who participated were given a £20 Amazon voucher funded by CIE’s Innovation Fund.
The 40-minute slots took place in a study room in the library, and students were asked to bring their own laptops so that they could access any technologies they were accustomed to using. Upon arrival, half of the participants first completed an online tutorial, created in Articulate Storyline, which focused on creating effective GAI prompts and evaluating outputs. All students were then provided with a list of nineteen tasks, including writing emails, fixing broken R code, creating revision flashcards, generating images and creating elements of essays. They were informed that they could choose any tasks they wished and use any GAI tool to complete them during their slot. After completing each task, they were asked to explain verbally how satisfied they were with the output and why. Each student’s screen and narration were recorded in Microsoft Teams as they completed the tasks. The research team transcribed the steps taken in each recording, and statistical analysis was performed on the results.
Has this activity improved programme provision and student experience, if so, how?
This activity has given us insight into how students use GAI technologies, along with examples of the good and poor practice taking place. This has informed the creation of tutorials and content on using GAI, which can be found on our KnowHow AI Library Guide.
From our study, we concluded:
- Students who completed the online tutorial had a significantly higher mean rank score for following the CLEAR method (59.88, N = 45) than students who did not complete the tutorial (31.12, N = 45), U = 365.500, p < 0.001 (a sketch of this style of analysis appears after this list). This demonstrates the impact of our team and the importance of teaching students these critical-thinking skills.
- Confidence in using GAI technologies was not associated with following the CLEAR method overall (rs = 0.128, p > 0.05, N = 90), suggesting that it was completing the tutorial, rather than prior confidence with GAI technologies, that improved prompt literacy.
- ChatGPT was chosen by the overwhelming majority of students (96.7%), suggesting that more awareness needs to be raised of alternative tools that may be better suited to specific tasks.
- Overall, 79% of students reported that they were satisfied or very satisfied with the outputs produced by the GAI tool. The task requiring participants to use a GAI tool to fix broken code received the highest satisfaction score, while the tasks on creating an image for a website and organising data in Excel received the lowest.
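For readers who wish to run a similar analysis on their own data, a minimal sketch in Python using SciPy is shown below. The scores are randomly generated placeholders, not the project's data, and the variable names are illustrative assumptions.

```python
# Minimal sketch of the style of analysis reported above (placeholder data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
tutorial = rng.integers(3, 10, size=45)   # hypothetical CLEAR scores, tutorial group
control = rng.integers(0, 7, size=45)     # hypothetical CLEAR scores, no-tutorial group

# Mann-Whitney U test comparing the two independent groups
u_stat, p_value = stats.mannwhitneyu(tutorial, control, alternative="two-sided")

# Mean rank for each group, computed by ranking the pooled scores
ranks = stats.rankdata(np.concatenate([tutorial, control]))
print(f"U = {u_stat:.1f}, p = {p_value:.4f}")
print(f"Mean ranks: tutorial {ranks[:45].mean():.2f}, control {ranks[45:].mean():.2f}")

# Spearman correlation between self-reported confidence and CLEAR scores
confidence = rng.integers(1, 6, size=90)  # hypothetical 1-5 confidence ratings
rho, p_rho = stats.spearmanr(confidence, np.concatenate([tutorial, control]))
print(f"Spearman rs = {rho:.3f}, p = {p_rho:.4f}")
```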
Did you experience any challenges in implementation, if so, how did you overcome these?
Our main challenge was students not turning up for their research project slots: 108 did not attend, so to reach our 90 participants we ended up sending invitations to and scheduling 198 students in total. Having our student partners available to run simultaneous slots allowed us to work through this more quickly.
How does this case study relate to the Hallmarks and Attributes you have selected?
Our project aligns with several aspects of the Liverpool Curriculum Framework:
Digital Fluency: we examined the ways in which students use and evaluate a variety of digital tools. This provided insight into their prompt-writing and output-evaluation skills, and has allowed us to develop resources that help students improve their confidence and ability in using GAI tools.
Inclusivity: by seeking a broad range of students for our study, we have been able to produce support materials that will appeal to learners with different support needs.
Research-connected teaching: the project findings will underpin the GAI-related aspects of teaching undertaken by Learning Development & Academic Liaison staff, for example by embedding GAI content into teaching sessions where appropriate.
How could this case study be transferred to other disciplines?
The findings of this study are relevant across the university. Our previous study showed that up to 51% of university students are considering using GAI tools for academic purposes (Johnston et al., 2024), so our role is to educate students on how to use these tools most effectively. Areas for student upskilling include choosing the most suitable GAI tool for the task, prioritising the quality of the prompt and output over speed, and taking time to reflect on prompts. This study has demonstrated that even a short online tutorial can effectively teach these skills and lead to measurable improvements in GAI prompt engineering and output evaluation, so departments across the university should take advantage of this by embedding KnowHow content where appropriate.
If someone else were to implement the activity within your case study what advice would you give them?
Having student partners help on projects such as this is valuable: ours played a key role in helping to design the study, conduct the research slots and evaluate the outputs. We would recommend allowing more time for data collection than you anticipate, and preparing for no-shows if you are doing interviews or another in-person method of data collection.
Generative Artificial Intelligence - Prompt Literacy Student Research Project is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.