Redesign of a Course from Face-to-Face (F2F) to Fully Online

I have extensive experience in course design and development through my involvement in MSCI0501, which I helped to revise and blend, and CLIM1001, which I helped to develop into online-only mode in 2017. Both are general education courses that need to accommodate a broad range of learners.

My CCRC colleagues Gab Abramowitz and Willem Huiskamp and I, in collaboration with members of UNSW's Learning and Teaching team, redesigned CLIM1001 (Introduction to Climate Change) into an online-only format. We followed a backward design approach (Wiggins and McTighe, 1998), in which we considered what we wanted students to be able to understand and do before delving into content and assessment. Consequently, the course redesign was extensive: it began with a revision of the learning outcomes and included new online lessons, new digital assets (illustrations and short videos), and new online formative and summative assessments, all sourced or developed by the teaching team (this was not part of a Digital Uplift). Of particular note was a new assessment I designed in which students play the role of peer reviewers, reviewing each other's work within a mock peer review process and submitting their work to a fictional journal with staff acting as editors. Beyond the role-play element, a point of difference between this assessment and many peer marking models is that students work through three phases (write-up, review, and response to review), in line with the learning outcome of gaining critical insight into the scientific peer review process. This critical insight is then assessed in a final individual reflection piece.

We aimed to optimise student consultation and feedback by providing students with a weekly link entitled "What can we do better?". This allowed students to report challenges with the course as soon as they experienced them, rather than when these might be long forgotten by the time MyExperience surveys come around. It also allowed us to address easy fixes immediately and to consult the entire course cohort via discussions when a particular issue or student suggestion seemed unlikely to have strong consensus (e.g., group work design, assessment weightings).

Evaluation is an important element of education design. To evaluate learning outcomes, I compared student outputs from assignments against our key learning outcomes, as well as against the previous version of the course (keeping in mind that the learning outcomes had been redefined), to determine whether the assessment tasks were effective. We found, for example, that students' personal reflections showed broad and deep critical insight into the peer review process, examining issues such as the time constraints on reviewers, expert consensus, journal choice, and single-blind versus double-blind review models, whereas previously students had generally parroted back that scientific peer review was best practice, without justification.

A major constraint in undergraduate teaching is providing opportunities for students to act on feedback. An advantage of the three-phase peer review assessment model is that addressing feedback is built into the design. Phases one and two are formative, and students must address peer review comments as part of their rejoinder, which is a summative assessment task. They also receive broad feedback from staff at each stage, leading to marked improvements in student papers by phase three. Many students are caught off guard by the amount of writing expected in this course and need feedback on language, grammar, and essay structure to improve their formal writing. To support them, I ran a trial of UNSW's Smarthinking by Pearson in the course in T3 2019. Students could optionally submit a draft from any phase of the peer review process, and a submission was mandatory for their reflection piece. The Learning and Teaching Unit's Student Academic and Career Success team analysed the 2019 T1 and T2 grades to evaluate the impact of this tool. While they found no statistically significant difference between student grades with and without Smarthinking in this course, they did find a statistically significant improvement in grades with Smarthinking in another large course.

References:

Wiggins, G. and McTighe, J. (1998) Understanding by Design. Alexandria, VA: Association for Supervision and Curriculum Development.

Evidence:

Sample student paper: individual reflection on the peer review process (anonymised, marks and feedback removed).
Smarthinking feedback: learning outcomes alignment and framework.
Sample feedback from Smarthinking tutors on draft papers. Students choose which elements they wish to receive feedback on; this student chose content development.
Smarthinking pilot evaluation report, which includes CLIM1001.