How Online Peer Assessment Platforms Can Enhance Student Engagement and Learning in Actuarial Science

By Mirabelle Huynh and Emily Kozlowski

Expanding Horizons, August 2023

As two lecturers in Canada’s largest actuarial science program, we’re always exploring new educational methods to enhance student learning. Over the last two years, we’ve experimented with online peer assessment platforms in undergraduate courses and have been thrilled to see them deliver major learning benefits and prove highly popular with students.

In this article, we explain why peer assessment shows great promise for actuarial science education and the main factors to consider when designing peer assessment activities. We also detail our use of online platforms in two undergraduate courses.

By sharing our experience, we hope to provide you, our fellow educators, with a solid starting point for experimenting with this format. To this end, we’ve made some of our course materials available under a Creative Commons licence for others to use and build on. We’re keen to see how the potential of peer assessment in actuarial science education will be further realized in the future.

Why Peer Assessment?

Both educators and employers agree that strong communication skills are key attributes of successful actuarial science graduates. There’s little doubt that underdeveloped written and oral communication skills hinder professional growth. However, these skills often lag behind technical skills in our students. That’s why strengthening communication skills through coursework has long been a goal of actuarial science educators.

Yet there’s only one way to strengthen students’ communication skills: have them create and present original content and then receive detailed, constructive feedback. But with class sizes ranging from 30 to 250 students, this format has been administratively prohibitive until now. While peer assessment was always an option to distribute the workload of evaluating student content, it’s only with the arrival of online platforms that it has become feasible for a single instructor to administer peer assessment activities at scale.

Designing Peer Assessments

In this context, “peer assessment” means any activity where students review the work of their peers and then provide high-quality, constructive feedback. We consider two main factors when designing peer assessment activities: task design—the type of content students create—and feedback design—how feedback from peers is collected and used.

For communication-based assignments, task design addresses these considerations:

  • What format, e.g., written memos/reports or audio/video presentations
  • What audience, e.g., one with expert knowledge, like a manager, or one without, like a client or friend
  • What purpose, e.g., explain a concept or framework, justify an analysis or recommend a course of action
  • What scope, e.g., open-ended or a well-defined right answer
  • What depth, e.g., demonstrate a deeper understanding of material or apply it to new scenarios

Feedback design addresses these considerations:

  • How feedback is collected, e.g., number of peer reviewers per submission, anonymity of authors and reviewers
  • How feedback is used by students, e.g., whether there is an opportunity to incorporate feedback into a final submission
  • Feedback type, e.g., using a detailed rubric with numerical scores and written comments
  • How feedback is graded, e.g., not graded, pass/fail or based on its quality

Two Applications of Online Peer Assessments in Undergraduate Courses

We have each experimented with peer assessments in different undergraduate courses.

Mirabelle’s Second-Year Introductory Financial Mathematics Course

In a large 200-student introductory course, administrative constraints typically limit student assessments to technical questions with numerical answers. As a result, my interactions with students suggested they had little opportunity to develop and demonstrate an in-depth understanding of the course material.

To give them this opportunity, I believe students need exposure to more realistic, open-ended questions that demand critical thinking and verbal communication. An online peer assessment platform made this a viable option.

I asked students to advise a nontechnical audience on how to solve a realistic financial problem by applying actuarial concepts.

Twice during the term, students were asked to record a five-minute narrated PowerPoint presentation (maximum two slides) for one of two problems randomly assigned to them. Here’s an excerpt of a problem I gave them:

Your friend is buying a car and has only five minutes before she meets with her car dealer to choose a payment plan. She says, “I know that payment plans differ in three ways: the down payment size, the interest rate and the loan term. But how do I know which payment plan is the best deal? How should I factor in my personal financial situation?”
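To give a sense of the reasoning students might walk their friend through, the sketch below compares two financing options by their level monthly payment and by the present value of all outlays at the buyer’s own discount rate. The prices, rates and terms are hypothetical figures chosen for illustration; they are not part of the assigned problem or of any sample solution.

# Hypothetical comparison of two car payment plans; all figures below are
# illustrative only and are not taken from the assigned problem.

def level_payment(principal, annual_rate, years, m=12):
    """Level end-of-period payment for a loan amortized over `years` with
    `m` payments per year (standard annuity-immediate payment formula)."""
    i = annual_rate / m                # periodic interest rate
    n = years * m                      # total number of payments
    return principal * i / (1 - (1 + i) ** -n)

def plan_cost(price, down, annual_rate, years, personal_rate, m=12):
    """Present value of all outlays (down payment plus discounted loan
    payments), discounted at the buyer's own opportunity-cost rate."""
    pmt = level_payment(price - down, annual_rate, years, m)
    v = 1 / (1 + personal_rate / m)    # one-period discount factor
    n = years * m
    pv_payments = pmt * v * (1 - v ** n) / (1 - v)   # PV of the payment annuity
    return down + pv_payments, pmt

price, personal_rate = 30_000, 0.05    # hypothetical car price and personal discount rate
for label, down, rate, years in [("Plan A", 6_000, 0.069, 4),
                                 ("Plan B", 2_000, 0.049, 6)]:
    cost, pmt = plan_cost(price, down, rate, years, personal_rate)
    print(f"{label}: monthly payment {pmt:,.2f}, PV of total cost {cost:,.2f}")

The point students were asked to convey is that comparing plans requires putting them on a common footing, such as the present value of all payments, rather than looking at the down payment, rate or term in isolation.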

Each presentation was assigned to three anonymous peers, who used a detailed rubric to provide numerical scores and written feedback.

Each student’s assessment score combined two scores given by peers: one for the quality of their presentation and another for the quality of the feedback they provided. Only 5 percent of students requested a re-grade by the instructor, and handling those requests took less than three hours in total.

Using the online Kritik platform enabled me to easily manage submissions, assign peer reviewers, collect scores and comments, and calculate final grades.
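The sketch below shows one simple way such scores could be combined; the 80/20 weighting and the straight averaging of peer ratings are assumptions chosen for illustration and do not reproduce Kritik’s actual scoring formula.

# Illustrative aggregation of peer scores; the 80/20 weighting and simple
# averaging below are assumptions, not Kritik's actual scoring formula.
from statistics import mean

def assessment_score(presentation_scores, feedback_scores,
                     w_presentation=0.8, w_feedback=0.2):
    """Combine peer ratings of a student's presentation with peer ratings
    of the feedback that student wrote, both on a 0-100 scale."""
    return (w_presentation * mean(presentation_scores)
            + w_feedback * mean(feedback_scores))

# Example: three peers scored this student's presentation; the recipients
# of this student's written comments rated the feedback quality.
print(assessment_score([78, 85, 90], [95, 88, 92]))   # -> 85.8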

In course evaluations, almost all students agreed that this activity helped them learn the material better and that evaluating their peers was a valuable learning experience.

You can find the peer assessment instructions provided to students, the presentation rubric, sample questions and other administrative details in the shared folder.

Emily’s Third-Year Life Contingencies Course

I decided to introduce a peer assessment activity to address some of my key goals in teaching this course:

  • to develop critical thinking and written communication skills in actuarial students and
  • to ensure assignments are formative learning opportunities.

As such, my peer assessment activity had students create a written report for each assignment that analyzed and summarized the technical results. To maximize the learning opportunity of each assignment, I wanted to create a feedback cycle in which students received formative feedback that would ideally lower their anxiety about being assessed on their written communication skills and lead to greater improvement.

Peer assessment allowed me to implement such a cycle by:

  • giving students the opportunity to see their peers’ work and identify their own misconceptions and areas for improvement before final submission,
  • providing students detailed feedback on their own work and
  • reducing the instructor workload required to give detailed formative feedback to each student.

For each of the three assignments during the term, students completed a one- to two-page written response in which they communicated and justified their findings in accessible language, as if they were presenting to their manager at work. Using an online peer assessment platform (PEAR, software developed at the University of Guelph and available at the University of Waterloo), I could easily create multiple stages for the assignment.

Students first submitted their technical assignment work for grading, along with a draft written response. Each student was then randomly assigned the written responses of three other anonymous students to assess using a provided rubric. After the peer assessments were completed and released, students had three days to revise their draft before submitting a final written response for grading by the instructor or TA. The final version could incorporate changes based on the feedback from the three peers who reviewed their work or on insights gleaned from assessing their peers’ work.

Each assignment was worth 3.5 percent of the total course grade, broken down as follows:

  • 2 percent for the graded computational questions
  • 0.5 percent for completing the peer assessments to a satisfactory standard
  • 1 percent for the graded final written response

The PEAR platform presented the peer assessors’ rubric in drop-down form with space for detailed comments, and it allowed the instructor to grade the final written response on the same rubric within the same platform. The peer assessment feedback itself was also graded in the platform, which was necessary to ensure participation and equity in the assessments.

Overall, feedback from students in post-activity surveys was very positive. Almost all students agreed that the written responses and peer assessment process helped develop their abilities to communicate in an actuarial setting. Further, 75 percent of respondents agreed that the amount of work throughout the term was manageable.

You can find the peer assessment instructions provided to students, the written response rubric and a sample assignment in the shared folder.

Conclusion: Benefits of Peer Assessment to Students and Educators

  1. Students think more critically about course material.
    Communication-based assignments don’t just test the quality of communication; they also reveal a student’s depth of understanding and clarity of thinking. By completing these assignments, students and instructors alike can gain insight into learning gaps and identify ways to bridge them.
  2. Students receive high-quality, constructive feedback.
    Peer feedback was often more thoughtful and thorough than we anticipated. We believe students were motivated by several factors: knowing that feedback itself was graded, knowing that their comments would be read carefully and feeling a sense of reciprocity. In fact, the peer assessment activities seemed to help build community in the courses.
  3. Students gain insight from critiquing their peers.
    By reviewing their peers’ work, students were better able to identify their own strengths and weaknesses. Further, by comparing alternative solutions, they were able to recognize the diversity of valid solutions that differed by communication strategy, structure or emphasis.
  4. Online platforms afford instructors flexibility and ease of use.
    Our experiences show the diverse applications of peer assessment that are enabled by online platforms: different topics at different levels and with different task and feedback designs. The PEAR and Kritik platforms, among many others available, make administration easy and still have many features that we haven’t yet explored.

Statements of fact and opinions expressed herein are those of the individual authors and are not necessarily those of the Society of Actuaries, the editors, or the respective authors’ employers.


Mirabelle Huynh, Ph.D., FSA, ACIA, is a continuing lecturer in the Department of Statistics and Actuarial Science at the University of Waterloo in Waterloo, Ontario, Canada. Mirabelle can be reached at m3huynh@uwaterloo.ca.

Emily Kozlowski, MMath, BEd, ASA, ACIA, is a continuing lecturer in the Department of Statistics and Actuarial Science at the University of Waterloo in Waterloo, Ontario, Canada. Emily can be reached at eikozlow@uwaterloo.ca.