Quality Assurance for Online Courses:
Implementing Policy at RMIT
by Carmel McNaught
Note: This article was originally published in The Technology Source (http://ts.mivu.org/) as: Carmel McNaught, "Quality Assurance for Online Courses: Implementing Policy at RMIT," The Technology Source, January/February 2002. Available online at http://ts.mivu.org/default.asp?show=article&id=1034. The article is reprinted here with permission of the publisher.

While it has existed in various guises since 1887, the Royal Melbourne Institute of Technology was granted university status in 1992 and is now referred to as RMIT University. With 50,000 students and 5,000 academic and general staff, RMIT is large by Australian standards, and it hosts more international students than any other Australian university. To best meet the needs of its diverse student population, RMIT is bi-sectoral; that is, it includes both a higher education sector and a vocational sector. Programs are taught within and across seven faculties, where faculty refers to a group of departments or schools sharing a disciplinary interest. Over the past few years, RMIT University has made a substantial investment in the use of technology. This initiative, known as the IT Alignment Program (ITAP), directs investment to infrastructure, enterprise computer-based systems, library resources, academic development, and program and course renewal. The aim of this investment is to ensure that our teaching programs are pedagogically more valuable and also have greater market flexibility. As part of ITAP, we have developed a university-wide quality assurance system governing aspects of educational design for online courses. In describing the components of this quality assurance system, this article provides a model, still evolving, that readers may be able to adapt, whole or in part, to their own environments.

With its history of strong central policy and tightly controlled implementation, RMIT represents what McNay (1995) terms a corporate university. The overall ethos of the institution makes it possible to develop and implement policy and systems that would not be acceptable in more laissez-faire institutions. While I do not attempt to assess the pros and cons of establishing and adhering to university-wide systems and processes, it is important to keep RMIT's position on this point in mind when reading this article.

Through our distributed learning system (DLS), we offer academic teachers, the Australian term for faculty members, a set of online tools to assist them in renewing their programs and courses. McNaught, Kenny, Kennedy, and Lord's report (1999) on RMIT’s work describes our overall policy framework, the online toolset, early implementation experiences, and early evaluations. We now have approximately 1,000 online courses in our system and several more on local servers that have yet to be migrated to central servers and incorporated into the DLS. The intellectual property rights of our staff are protected in this system, though RMIT, as the employer, has a non-exclusive right to the use of these materials through its copyright policy.

In order to assist academic teachers in using the DLS, we periodically grant release time (26 days) to approximately 145 academic teachers across the university, two or three in each department, as a professional development exercise; they develop online materials and encourage colleagues in their departments to engage in online teaching and learning (McNaught & Kennedy, 2000). This support for online courses does not entail the abandonment of other approaches. At RMIT, the vast majority of our courses use mixed-mode designs that incorporate traditional classroom or face-to-face teaching, workplace learning arrangements (a central feature of many RMIT programs), or partnerships with other educational providers (especially international ones).

Ensuring quality through policy

RMIT's policy for the quality assurance of online courses has three primary components: educational design, peer review, and formal evaluation. The degree to which these components apply varies according to the nature of the course. At RMIT, each faculty has a list of strategically important programs, chosen both for their importance to the character of the faculty and for their perceived market potential. There is significant investment in developing or renewing the courses in these strategic programs. This investment secures the release time needed to support academic teachers as they reconceptualize the educational design of their courses and produce materials that will expand online learning opportunities in strategic programs. While all courses must show evidence of educational design, only strategic courses need to demonstrate evidence of peer review and formal evaluation.

  • Evidence of educational design. All online courses (no matter how minor the online component) must be signed off at the faculty level by each faculty's Director of Teaching Quality (DoTQ). For sign-off to occur and enable a course to become "live" on the DLS, the DoTQ needs evidence of clear educational design and planning. DoTQs ensure that an academic teacher has considered the design features of an online system, has an overall rationale for the course, and complies with basic publishing standards (including copyright matters). DoTQs also assess the level of coherence between RMIT's Course Guide (which includes information relating to course details, learning outcomes, planned student learning experiences, assessment, and study program), the Online Checklist, and the online component of the course.

  • Evidence of peer review. The main purpose of the peer review sessions is to ensure that peer scrutiny improves the quality of online courses and to document that this scrutiny has taken place. The scholarship of teaching is an important concept at RMIT, and we are anxious to encourage scholarly peer review of aspects of teaching in ways similar to the peer review of research outputs. A report of the decisions made at peer review sessions is required. The DoTQs manage this peer review process, but the evidence is also examined by a senior person in Learning Technology Services (LTS) on behalf of the chair of the university's Programs Committee.

  • Evidence of forward thinking through an evaluation plan. The requirement to have an evaluation plan indicates the significance RMIT places on ongoing quality improvement. The plan outlines which evaluation strategies will be used once the course is taught to students. As with the peer reviews, the process is managed by the DoTQs but also reviewed by LTS.

Enacting policy through processes

RMIT has developed processes to help academic teachers meet the criteria set by its quality assurance policy. One such process, the professional development exercise, has been mentioned above. But RMIT also offers criterion-specific forms of support.

  • Facilitating educational design with models. One mechanism by which RMIT helps faculty members meet this quality criterion is through the use of exemplars. Because they enjoy professional educational design and production input during development, strategic courses serve as particularly good models for academic teachers. Their Course Guides include careful documentation and provide a clear explanation of the ways in which the learning outcomes, teaching and learning strategies, and assessments cohere. Their course materials offer numerous examples of simulations, animations, online role-plays, quizzes, and interactive lecture notes to assist other staff with design ideas.

  • Facilitating educational design by course review. In addition to offering models for educational design, we also review all DLS courses. The first review actually predates the university's policy for the quality assurance of online courses; it occurred in June 2000 and involved 530 online courses in our DLS. There were seven review teams, one for each faculty; each team consisted of academic teachers from that faculty and academics from LTS. Each review team examined all of the online courses produced in its faculty. Initially there was a great deal of suspicion within the faculties about this "police" activity, but LTS academics worked hard to build a collaborative environment with academic teachers, so that the exercise came to be viewed as providing evaluative feedback as much as implementing quality control. Still, a fine line exists between the two. We did not look at all online learning environments at RMIT; the review teams did not examine courses still under development, courses "switched off" because they were not currently being taught, or online materials that still resided on faculty and departmental servers.

    Relying on a checklist similar to the Online Checklist, our review concluded that only about half of the courses appeared adequate. Typical concerns raised in the review included a lack of clarity in linking resources and activities to learning outcomes; a lack of flexibility in catering for diverse groups of students; failure to link to strategic priorities (e.g., internationalization or work-integrated learning); failure to link to activities as well as resources; inclusion of extraneous buttons; and unclear navigation strategies. Suggestions for improvement were given to all course owners.

    The 2001 course review involves 1,000 courses, all of which will have been signed off by DoTQs, and is mostly complete. As with its predecessor, the 2001 review employs collaborative teams at the faculty level. Wherever possible, program leaders are involved in order to facilitate discussion about the coherence of courses within a particular program. Coherence is important because we want our students to experience a unified program rather than a collection of isolated classes. Moreover, we are interested in seeing how our online courses contribute to each student's total learning experience. This year we have collected more detailed information about the online design features being used. Feedback, covering all sections of the Online Checklist as well as publishing standards, is given to each course coordinator.

  • Facilitating peer reviews. We provide a range of optional proformas for this process and have found the peer review sessions to be an extremely valuable academic development exercise. Basically, academic teachers gather in a computer lab, review partially or nearly completed online strategic courses, make comments, and then have an open discussion. Peer reviews typically run one to two hours. The peer review process has two main beneficial outcomes. First, members of each strategic course team receive explicit feedback on the design features of their courses; this feedback ranges from comments about the pros and cons of particular strategies to specific editorial and technical suggestions. Second, review team members see the work done by others and discuss new strategies that they might then apply to their own teaching.

  • Facilitating formal evaluation plans. Each department and group at RMIT produces an annual student feedback plan, but, as in most universities, the evaluation focuses mostly on measures of student satisfaction rather than on the degree to which student learning of the course content has been enhanced. To assist in developing scholarly approaches to evaluation, RMIT is involved in the ASCILITE CUTSD evaluation project. At present, the proformas we prepare for academic teachers draw on the evaluation framework outlined in Bain (1999). Although the development of these evaluation plans has clarified the nature of evaluation data for many staff, we realize that the work required must be feasible within already stretched academic workloads. The extent to which the evaluation plans are enacted in the face of the pressures and time constraints of teaching remains to be seen.

Achieving quality outcomes

Are we really making headway? Is our online quality assurance policy improving online learning? We are cautiously optimistic, and, overall, we are satisfied with the processes we have designed. Listed below are some of the indicators that give us optimism.

  • The 2001 DLS review is more rigorous than that of 2000. Because it looks more closely at courses and holds them to higher standards, it has also been taken more seriously.

  • The need to consider quality assurance issues for courses in relation to program level design and management is now more widely accepted. This is true across most Australian universities because of the recent commissioning of the Australian Universities Quality Agency. Increasingly, Australian universities will be asked to provide evidence for how they are maintaining academic standards, and this evidence will be subject to audit.

  • We have a much clearer idea of how to document and report on design issues to all stakeholders.

  • There is little faculty resistance to the requirement to provide evidence of planning, which includes copyright and intellectual property sign-off. The response of academic teachers at workshops and discussions has been largely positive.

  • The response and engagement of academic teachers at peer review sessions has been positive.

  • Seventy-five course teams went through the full quality assurance process of evidence of educational design, peer review, and evaluation planning by the end of 2001.

However, we are unclear about the effectiveness of the evaluation plans at this stage. Although evaluation plans may look scholarly and academic teachers may intend to enact them, once teaching is under way, it is hard to find time for serious evaluation. A rigorous evaluation process would add to the workload of academic teachers and be perceived as yet another demand rather than an asset.

One of the keys to the success of quality assurance systems is effective and efficient management of the process, which means that we need to be flexible and adaptable in improving it. For example, we are developing online document repositories and online peer review processes. We have done this for our Global University Alliance strategic courses and hope to use that work as a model for other strategic courses, thereby linking our online policies to the improvement of our processes.

References

Bain, J. D. (1999). Introduction to the special issue: Learner-centred evaluation of innovation in higher education. Higher Education Research & Development, 18 (2), 165-172.

McNaught, C., & Kennedy, P. (2000, Nov/Dec). Learning technology mentors: Bottom-up action through top-down investment. The Technology Source. Retrieved June 6, 2001, from http://technologysource.org/article/185/

McNaught, C., Kenny, J., Kennedy, P., & Lord, R. (1999). Developing and evaluating a university-wide online Distributed Learning System: The experience at RMIT University. Educational Technology and Society, 2 (4). Retrieved June 6, 2001, from http://ifets.massey.ac.nz/periodical/vol_4_99/mcnaught.html

McNay, I. (1995). From the collegial academy to the corporate enterprise: The changing cultures of universities. In T. Schuller (Ed.), The changing university? Buckingham: SRHE & Open University Press.
