This project trialled the integration of the Canvas Peer Review tool into a live, two-hour session, transforming infographic production into a dynamic, "authentic" learning experience. By enabling a large cohort to produce interim work and immediately apply the formal marking rubric to their peers' work, the initiative successfully bridged the gap between theory and practice. This manageable approach directly addresses the challenges of large-scale engagement and digital fluency, fostering a robust community of enquiry that aligns with the core LLF Hallmarks and Graduate Attributes.
Please briefly describe the activity undertaken for the case study
This activity involved a large cohort (approx. 140 students) in a live, two-hour session focused on the creation and evaluation of infographics. Using Canvas’s in-built Peer Review tool, students were tasked with producing an interim infographic based on specific data and then immediately peer-reviewing their classmates' work using the official assessment criteria for their upcoming formal coursework. We found this live setting to be unique and highly effective. It ensured technical reliability by providing immediate support, maximised student engagement through a structured classroom environment, and improved outcomes by giving students instant, actionable feedback before their final submission.
How was the activity implemented?
The implementation was a tightly synchronised collaboration between the Management School, the Library, and the CIE:
- Canvas Configuration: We used a standard Canvas Assignment with 'Require Peer Reviews' and 'Automatically assign peer reviews' enabled (a scripted configuration sketch follows this list).
- Anonymity: Reviews were set to appear anonymously to foster a safe environment for honest feedback.
- Live Session Timeline:
  - 0-30 mins: Library delivery on design principles and information criticality.
  - 30-60 mins: Students produced and uploaded their infographics to Canvas.
  - 60-75 mins: At the set deadline, Canvas automatically distributed two reviews to each student who had submitted work.
  - 75-120 mins: Students completed reviews using the module rubric.
- Proactive Inclusion: Because the system automatically assigns reviews only to students who have themselves submitted an infographic, there is a risk that late-comers or those who fall behind are left with nothing to do. To keep everyone included, students who missed the deadline were manually paired with those who had submitted, allowing them to take part in the marking process alongside their peers.
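
For colleagues who prefer to script this setup, or who need to rebuild it quickly for a new cohort, the same settings can also be applied through the Canvas REST API. The sketch below is illustrative only, not the configuration we used: it assumes Python with the requests library, and the instance URL, API token, course ID, deadline and review count are placeholders standing in for your own values. Settings should always be checked in the Canvas interface afterwards.

```python
import requests

BASE = "https://canvas.example.edu/api/v1"         # placeholder Canvas instance URL
HEADERS = {"Authorization": "Bearer <API_TOKEN>"}   # placeholder API token
COURSE_ID = 12345                                   # placeholder course ID

# Create the interim infographic assignment with live peer-review settings:
# peer reviews required, assigned automatically at the due date, anonymous
# reviews, and uploads restricted to PDF so they render for reviewers.
payload = {
    "assignment": {
        "name": "Interim infographic (live peer review)",
        "submission_types": ["online_upload"],
        "allowed_extensions": ["pdf"],
        "peer_reviews": True,
        "automatic_peer_reviews": True,
        "anonymous_peer_reviews": True,
        "peer_review_count": 2,
        "due_at": "2025-01-01T10:30:00Z",  # placeholder: the exact in-session deadline
    }
}

resp = requests.post(f"{BASE}/courses/{COURSE_ID}/assignments",
                     headers=HEADERS, json=payload)
resp.raise_for_status()
print("Created assignment", resp.json()["id"])
```

Setting due_at to the exact minute the review phase should begin matters here, because Canvas uses that trigger to distribute the automatic peer reviews (see the advice on strict timings below).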
Has this activity improved programme provision and student experience, if so, how?
Implementing the Peer Review tool live rather than asynchronously significantly reduced student and staff anxiety while increasing engagement.
- Immediate Feedback: Students received formative feedback in real-time, allowing them to apply lessons learned to their final coursework immediately.
- Authentic Learning: By using the actual module rubric, students moved beyond passive reception to active "marking," gaining a deeper understanding of how their own work would be assessed.
- AI Deterrence: Engaging students in a live, critical review process encourages authentic production rather than sole reliance on Generative AI outputs.
The "haste" of the live session proved to be a positive catalyst for learning:
- Focus Over Perfection: We were not expecting masterpieces within the short timeframe; instead, the time pressure encouraged students to concentrate on applying specific parts of the rubric criteria rather than polishing a finished design.
- Heightened Engagement: The fast-paced, synchronised nature of the task created a buzz of activity that moved students from passive listeners to active participants.
- Technical Reliability: Running the activity live meant staff could resolve technical hurdles or interface confusion immediately, ensuring no student was left behind.
Did you experience any challenges in implementation, if so, how did you overcome these?
- The "To-Do List" Interface: We discovered that students clicking the link in their Canvas "To-Do" list saw a different, less intuitive interface. We overcame this by providing clear instructions and slides directing them to access reviews specifically through the Assignment page.
- Timing Pressures: Some students did not submit by the strict deadline and were excluded from the automatic distribution. To mitigate this in the future, we recommend a second "post-work" submission box for those who missed the live window.
How does this case study relate to the Hallmarks and Attributes you have selected?
This project aligns with the Liverpool Learning Framework (LLF) through the following explicit linkages:
- Authentic Assessment (Hallmark): The task mirrors the format and critical thinking required of a graduate-level professional. Students engaged in a meaningful application of knowledge by applying real-world assessment criteria to their peers' work.
- Active Learning (Hallmark): Students were active participants in their learning, using collaborative peer-to-peer feedback to develop self-efficacy and identify their own learning needs.
- Digital Fluency (Graduate Attribute): Students developed the ability to connect and collaborate via digital platforms while making balanced, ethically informed judgements about the information they find and use.
- Confidence (Graduate Attribute): By acting as evaluators, students became more proactive and resilient in their approach to the subject matter, building the confidence to apply academic knowledge in a fast-paced setting.
- Design Principle 6: The activity used digital technology as a direct enabler of learning, progressively developing students' information literacy and critical thinking skills.
How could this case study be transferred to other disciplines?
The "Live Peer Review" model is highly adaptable:
- Health Sciences: For rapid peer-evaluation of patient case notes or triage plans during a seminar.
- Architecture/Design: For "lightning" reviews of digital sketches or floor plans.
- Law: For checking the structure of legal arguments against a specific marking framework in-class.
If someone else were to implement the activity within your case study what advice would you give them?
- Strict Timings: Ensure your Canvas "Due Date" is set to the exact minute you want peer reviews to begin; the system relies on this trigger to distribute work.
- File Restrictions: Restrict uploads to PDF to ensure the infographic renders correctly within the Canvas feedback interface for peers.
- Manage Inspiration: Consider creating a post-session "Gallery" where students can see all submissions (or a gallery of worked examples) to gain inspiration, but be wary of providing "perfect" templates that might discourage original thinking.
- Monitor Live: Use the "Peer Review" button on the assignment page to monitor progress checkmarks in real time, ensuring no student is left without work to review; a scripted equivalent is sketched after this list.
- Collaborate: Involve your Liaison Librarian or CIE early to help script the session and build student-facing guidance slides.
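
Building on the "Monitor Live" and "Proactive Inclusion" points above, the same checks can be run programmatically for very large cohorts. This is a minimal sketch under the same assumptions as the earlier one (hypothetical instance URL, token and placeholder IDs); it lists the peer reviews for the assignment, counts those still outstanding, and manually pairs a late-arriving reviewer with an existing submission.

```python
import requests

BASE = "https://canvas.example.edu/api/v1"         # placeholder Canvas instance URL
HEADERS = {"Authorization": "Bearer <API_TOKEN>"}   # placeholder API token
COURSE_ID, ASSIGNMENT_ID = 12345, 67890             # placeholder IDs

# Mirror the 'Peer Review' page checkmarks: list assigned reviews and
# count those not yet completed. (For cohorts larger than one page of
# results, follow the API's pagination Link headers.)
reviews = requests.get(
    f"{BASE}/courses/{COURSE_ID}/assignments/{ASSIGNMENT_ID}/peer_reviews",
    headers=HEADERS, params={"per_page": 100},
).json()
pending = [r for r in reviews if r.get("workflow_state") != "completed"]
print(f"{len(pending)} of {len(reviews)} peer reviews still outstanding")

# Pair a late-arriving student with an existing submission so they are
# not left without work to review (the manual inclusion step above).
SUBMISSION_ID = 111111      # placeholder: a submitted classmate's submission ID
LATE_REVIEWER_ID = 222222   # placeholder: the late student's Canvas user ID
requests.post(
    f"{BASE}/courses/{COURSE_ID}/assignments/{ASSIGNMENT_ID}"
    f"/submissions/{SUBMISSION_ID}/peer_reviews",
    headers=HEADERS,
    data={"user_id": LATE_REVIEWER_ID},
).raise_for_status()
```

In a live session the in-page checkmarks are usually sufficient; a script like this is only worth the effort where staff want a single at-a-glance count across a very large cohort.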

Live Peer Review in Canvas for Rapid Feedback and Criticality by Dr Regina Frank, Liam Kaye and Will Moindrot is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
