Innovative Digital Assessment

In this guide, we invite you to consider setting assessments in formats you may not have used before. What counts as innovation in one discipline may not count as such in another. In general, however, innovation means change directed at improving current practice, where the initiator sees that change as innovative in a specific context (Hannan and Silver, 2000). To proponents of digital innovation, the question is not whether technology should be used in classrooms, but how it should be used (Wellings and Levine, 2009).

Authors: Dr Monica Chavez Munoz & Dr Tunde Varga-Atkins

Benefits

By completing innovative digital assessments, students:

  • Engage more with subject content. Creating an output in a different medium deepens students’ engagement with the subject.
  • Increase their employability skills. Producing the output develops authentic skills that employers want.
  • Demonstrate diverse skills. Innovative assessments can make assessment fairer by recognising abilities not otherwise fully assessed, such as collaboration (Cox, Vasconcelos & Holdridge, 2010).

Putting it into practice

Some innovative student outputs include:

  • Talk show performances. Turn a traditional presentation into a more interactive, character-based experience by recording or streaming it (Rao and Stupans, 2012).
  • Podcasts. Empower students’ voice and creativity by developing their ability to express and explain subject content while recording themselves (Lee et al., 2008).
  • Vlogs. Digital video projects give students ownership of their work and motivate them, as the result is viewed and evaluated not only by the teacher but also by their peers (Kearney and Shuck, 2006).
  • Infographics. Students synthesise and present information in a creative visual form, drawing on information, visual, and technology literacies.
  • Digital posters. Poster production can be individual or group work and offers an opportunity for peer assessment and feedback. Clearly explained marking criteria help students assess their peers accurately and reliably (Orpen, 1982).

Considerations

  • Purpose & alignment. Technological methods should be constructively aligned with learning outcomes, assessment and feedback (Biggs and Tang, 2011).
  • Marking. Assessment of disciplinary knowledge should be emphasised over other aspects of the final product (e.g. the media used).
  • Formative assessment. Students may study only what is assessed (Innis, 1996). Formative assessments should therefore be strategically timed and designed to connect with the summative assessment (Gibbs, 2006).
  • Reflection. A reflective commentary on students’ learning and meaning-making through different modes, or on the skills gained through the process, makes the assessment more meaningful and effective.

Challenges

  • Assessment of academic knowledge. Digital assessments do not compromise students’ ability to apply knowledge from their subject area. Instead, they help students relate subject knowledge to the real world and, in turn, increase their employability.
  • Technological support. Plan and develop resources to support different levels of digital capabilities of students to produce media content. The time investment you make will pay off in the long term as you are able to reuse the assessment design with other cohorts.
  • Students prefer the essay or exam. Fear of change is the main reason students are wary of innovative digital assessments. With good design and support, however, students can be more motivated than by traditional methods.
  • Resistance from external accrediting bodies and examiners. Implementing a digital assessment has the potential to be recognised as best practice by professional bodies and examiners (Bragg, Whitworth & Firth, 2019).

References

Biggs, J. and Tang, C. (2011). Teaching for Quality Learning at University. 4th ed. Maidenhead: Open University Press.

Bragg, J., Whitworth, D., & Firth, M. (2019). Busting myths on alternative assessments. Available from: https://luminate.prospects.ac.uk/busting-myths-on-alternative-assessments 

Cox, A. M., Vasconcelos, A. C., & Holdridge, P. (2010). Diversifying assessment through multimedia creation in a non‐technical module: reflections on the MAIK project. Assessment & Evaluation in Higher Education, 35(7), 831-846.

Gibbs, G. (2006). Why assessment is changing. In Innovative assessment in higher education (pp. 31-42). London: Routledge.

Hannan, A. and Silver, H. (2000). Innovating in Higher Education: Teaching, Learning and Institutional Cultures. Buckingham: SRHE and Open University Press.

Innis, K. (1996). Diary Survey: how undergraduate full-time students spend their time. Leeds: Leeds Metropolitan University.

Kearney, M. & Shuck, S. (2006). Spotlight on authentic learning: student developed digital video projects. Australasian Journal of Educational Technology, 22(2):189–208.

Lee, M. J. W., McLoughlin, C., & Chan, A. (2008). Talk the talk: Learner-generated podcasts as catalysts for knowledge creation. British Journal of Educational Technology, 39, 501–521.

Orpen, C. (1982). Student versus lecturer assessment of learning: a research note. Higher Education, 11, 567-572. 

Rao, D., & Stupans, I. (2012). Exploring the potential of role play in higher education: development of a typology and teacher guidelines. Innovations in Education and Teaching International, 49(4), 427-436.

Wellings, J., & Levine, M. H. (2009). The digital promise: Transforming learning with innovative uses of technology. New York: Joan Ganz Cooney Center at Sesame Workshop.

Help and Feedback

Can you help us improve this resource or suggest a future one? Do you need this resource in an alternative format? Please contact us at cie@liverpool.ac.uk

Creative Commons Licence
Innovative Digital Assessment by Dr Monica Chavez Munoz & Dr Tunde Varga-Atkins is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.