Abstract
Program development and review are a central part of institutional and industry quality assurance. Traditional approaches, although well established, present several barriers that could undermine the integrity of the process and the quality of outcomes. Here, a new approach to program development and design is explored with the goal of enhancing outcomes for students and institutions.
As part of an institution’s ongoing commitment to providing high-quality learning and teaching, and to ensure that graduates are market-ready, programs typically undergo cyclic internal review. A 5-y review cycle is fairly typical and is the policy at Charles Sturt University (CSU); however, some events may trigger a shorter cycle (e.g., external accreditation). Although the life cycle and review processes of a program vary from institution to institution and will accommodate nuances specific to different countries and regulatory procedures, the process has traditionally been insight-driven. Here, an outcome-driven approach is outlined as an alternative.
TRADITIONAL PROGRAM-REVIEW PROCESS
For professional degrees leading to an accredited qualification, program design is typically driven by the standards set by an external accrediting agency. In the United States, the Joint Review Committee on Educational Programs in Nuclear Medicine Technology provides a set of standards to be satisfied for a program to be approved or accredited for training. These standards include both resources and curriculum. In Australia and elsewhere in the world, similar bodies have guided the essential curriculum required to be trained in a specific profession. In Australia, these bodies have been the Australian Institute of Radiography for radiography and radiation therapy and the Australian and New Zealand Society of Nuclear Medicine for nuclear medicine technology. In essence, the program design is driven by the curriculum (Fig. 1). This approach has value, including creating a program design that is relatively consistent across an educational community. Nonetheless, although programs at different institutions can have nearly identical structures (i.e., at a minimum, the evidence of specific curricula across all programs can be mapped), the learning outcomes of graduates can be quite different.
In a typical program review, the existing program is evaluated against the accreditation standards by academics or external stakeholder groups (usually a convened committee) and by feedback from industry. The review is informed by collective wisdom and expertise, which offer powerful insight yet may be diluted by numerous factors. This approach is somewhat of an art form, with the end product representing an overall opinion that the structure "feels right." The program structure itself is typically populated by defining the course subjects required to meet the accreditation standards, defining the curricula for those subjects, and determining what the learning outcomes could or should be. An experienced team of academics would then map student assessment methods against those learning outcomes and across the entire program to ensure that skills and capabilities are appropriately developed. As the program takes shape, this mapping might reference Bloom's taxonomy, a framework for cognitive capabilities, with lower-order outcomes in earlier years and higher-order outcomes in later years (1).
The traditional process for designing and reviewing a program typically uses accrediting body standards as the backbone, with refinements coming from input by the academic team, external stakeholder groups, and intuition. Institutional outcomes are then woven throughout the program. This weaving might be in the form of a liberal arts component of a program, for example, or, in the case of CSU, weaving in expected graduate learning outcomes (GLOs), including Indigenous cultural competence (2). At CSU, nuclear medicine is offered as 1 of 3 specializations—nuclear medicine, radiation therapy, or medical imaging (radiography)—in a 4-y, 100% science degree (as opposed to one with a liberal arts component). More than 50% of the program is common to all specializations. The nuclear medicine program currently being offered (Fig. 2) represents the collective wisdom and insight of a large group of academics, external stakeholder groups from all specializations, multiple accrediting bodies, and tradition.
REIMAGINING PROGRAM DEVELOPMENT
Recently in Australia, program accreditation has been moved from the professional bodies to an independent national authority: the Australian Health Practitioner Regulation Agency (AHPRA). Among the changes in approach is that the AHPRA standards are driven by graduate outcomes, not by curricula. Although a quick glance at the AHPRA requirements for programs in medical radiation science provides hints of specific curricula that are important (e.g., anatomy and physiology), the regulatory body has stepped away from prescribing the ingredients of the curriculum. Instead, the AHPRA has provided 5 key domains common to all specializations in medical radiation science and 1 additional domain with detailed learning outcomes for each specialization. It has been challenging to retrofit the traditionally developed program to provide evidence of student capability against the AHPRA domains. The key AHPRA domains are a macro perspective, with each domain having numerous subdomains and each subdomain having very specific expected outcomes for graduates (Supplemental Appendix A; supplemental materials are available at http://jnm.snmjournals.org) (3). These key domains include professional and ethical conduct; communication and collaboration; evidence-based practice and professional learning; radiation safety and risk management; and practice in medical radiation sciences, including practice in diagnostic radiography, nuclear medicine, and radiation therapy. Furthermore, CSU GLOs (Supplemental Appendix B) need to be built into the student capabilities woven across the program.
Notwithstanding the AHPRA change, the traditional approach to program development is somewhat counterintuitive despite being well established. At CSU, program design has been reimagined with the implementation of a new university-wide learning platform for design, implementation, and evaluation of programs. The learning platform itself posed some implementation challenges for the medical radiation science programs, particularly in the review phase. The discipline team therefore hybridized the learning platform's philosophies and reimagined their application and implementation. The reimagined approach to program review and design (Fig. 3) is more intuitive and less traditional.
The program structure is no longer defined by intuition, experience, or curricula. The starting point is simply the expected learning outcomes, recognizing that they are a national standard and evidence-based (against scope of practice). The learning outcomes for graduates are defined by the accreditation body (Supplemental Appendix A) and the university (Supplemental Appendix B). These learning outcomes need to be scaffolded across the 4 y of the program, with some learning outcomes not being evidenced until late in the program and others being evidenced at increasing depth and capability multiple times over the 4 y. For example, higher-order capabilities associated with radiopharmaceutical administration are not introduced until deep into the second year, whereas understanding of professionalism is built from lower-order capability in the first year to higher-order capability in the fourth year. The capabilities themselves are not simply scaffolded against the traditional cognitive taxonomy of Bloom (Table 1) but rather are a hybridization of both Bloom's knowledge (Table 2) and cognitive domains, providing a richer outcome for graduates (Fig. 4) (4–6).
Once the learning outcomes for the entire program are determined and scaffolded across the various years, these learning outcomes are used to develop student assessment methods. The assessments themselves need to provide evidence of the extent to which students have achieved the learning outcome. Importantly, the medical radiation science program at CSU mapped and scaffolded the skill-based capabilities defined by CSU GLOs to increase the skill of students in some specific domains: portfolio development and reflection, evidence sourcing and evaluation, dissemination (written, oral, poster), and collaboration. First-year students are not expected to develop a portfolio but rather to build skills in, and an appreciation of the value of, reflection; these then develop further with the introduction of the portfolio in the second year. The ability to source, evaluate, and synthesize evidence builds from basic search-and-assess tasks in the first year through the writing of authentic systematic reviews, reports, and teaching cases at the end of the program, using the guidelines of the Journal of Nuclear Medicine Technology. The ability to work collaboratively, productively, and collegially while solving clinical problems is an essential capability, independent of curriculum, that is scaffolded from the first year to the fourth.
The learning outcomes defined by the national accreditation standards require the creation of specific assessment tasks, which in turn dictate the specific curricula required for students to effectively complete these tasks, with the discrete subject units then being based on these specific curricula. This logical grouping of curricula challenges the traditional boundaries of subjects. Combined with scaffolding, the curriculum then drives the organization of subjects into a logical program design. Although a complete breakdown of the extensive mapping undertaken is beyond the scope and length limit of this discussion, there may be some benefit to providing an example using a single subdomain of the regulatory standards: "5B.5. Implement the Delivery of Nuclear Medicine Radioisotope Examinations and Therapies" (3). In this example, the learning outcome for the first year was simply to be able to explain the difference between diagnostic and therapeutic radionuclides and doses, and the learning outcome for the second year was to be able to calculate the radiation dose and decay of diagnostic and therapeutic radionuclides. Increasing capabilities were expected in the second, third, and fourth years, with clinical practicum students expected to be able to apply the principles of diagnostic and therapeutic radionuclides and doses within the clinical context and to be aware of the implications for patients. Higher-order capabilities were expected in the third and fourth years: explaining, planning, and applying the principles of radionuclide therapy to ensure appropriate preparation, management (care and aftercare), safety, and delivery to patients. The fourth-year expectations were of the highest order: planning and applying the principles of patient care, radiation safety, aseptic technique, and radiation physics to implement and deliver appropriate radionuclide therapy.
REENGINEERING PROGRAM DESIGN
The new approach to developing and reviewing programs challenges the traditional approaches and intrinsic assumptions, as well as the prejudices of both internal and external stakeholders about what is required. At CSU, the process was led by a small group of academics who championed the approach and outcomes. The group had 100% buy-in to the process, creating a level of productivity and collegiality not previously seen. The process was self-perpetuating and intuitive. The endpoint, rather than being one giving the sense that a task is complete (art), was one defined by measurable outcomes (science), similar to the now measurable endpoints and outcomes of our graduates. The new program structure is outlined in Figure 5.
Despite some superficial similarities to the old structure, significant changes resulted from adoption of the reimagined approach. For example, one change was that the requirement for a course on research methods was omitted on the basis that it matched no learning outcomes of either the accrediting body or the university GLOs. The general sense that research was important and that the course subject should be kept was supplanted by outcome expectations focused on the ability to source, evaluate, and synthesize evidence rather than the ability to conduct a research project.
A second change was a refocusing from imaging anatomy to cross-sectional anatomy in response to the themes associated with the learning outcomes. A third was the introduction of an entire subject dealing with pharmacology, as this specific learning outcome was not mapped elsewhere and, indeed, could not be convincingly demonstrated by graduates. The idea of embedding pharmacology as a subject in the program was rejected by much of the discipline team until mapping against the AHPRA learning outcomes led to its unanimous support. Another change was the conversion of the separate ultrasound and MRI subjects into a single, combined subject (nonionizing imaging techniques) to reorient the expected learning outcomes and remove superfluous and extraneous curricula. Similarly, digital image processing and informatics fell short of a whole subject once the actual learning outcomes had been mapped and duplication across the program removed. This digital theme was instead linked to the CSU GLOs associated with sustainability and the global community. In addition, refinement of learning outcomes in the first year and removal of duplication and redundant curricula afforded an opportunity for other changes: the introduction of dedicated studies on sociology and health, and the introduction in the first year (previously the third year) of a key GLO as a foundation for understanding Indigenous health.
Besides these changes, the clinical practicum was otherwise built into nuclear medicine–specific subjects to better connect learning and key theory, and several subjects were moved forward in the program to allow students to better contextualize learning in the clinical environment (e.g., radiopharmacy, instrumentation, and radiation biology/protection).
These changes were driven by learning outcomes and would not have occurred with the traditional approach to program review. Indeed, most of these subjects and curricula had remained largely unchanged and firmly entrenched through the various iterations of the program for the past 20 y. Although there is neither scope nor space sufficient to detail the entire program, the extracts in Table 3 provide insight into how individual learning outcomes drive and scaffold assessments to become evidence that informs curricula and ultimately create course subjects and the program structure.
DISCUSSION
The balance between art and science has been previously debated (7), and the approach to undertaking a program review is another example of this debate. Traditional approaches to program review rely heavily on the rich insights and intuition of academics and clinical experts, with the endpoint being defined as a sense or feeling that the program is complete. Conversely, the reimagined and reengineered approach is informed by evidence and defined outcomes, with an endpoint indicating that "it works" rather than "it feels right." The new approach is not without its skeptics. It mirrors in some ways the Brad Pitt movie Moneyball, in which evidence- and outcome-driven recruitment in baseball is held up to the scorn of the intuition-based brain trust yet ultimately shows that what actually works is not always what feels right.
Some may view the combined evidence-informed and outcome-informed approach as counterintuitive with respect to the traditional educational perspective. However, it is imperative that we draw from the successful experience of other disciplines (such as social work and other allied health practices) in which such an approach has led to cultural competence and collaboration among all stakeholders (8,9). At CSU, this approach has placed outcomes, critical thinking, reflection, and evidence-informed practice at the heart of pedagogy (10), producing job-ready graduates who meet regulatory capabilities and standards within the clinical context.
Why strive for this approach? It affords several benefits. First, this approach provides evidence of learning and of meeting the learning outcomes. A major challenge of the previous program was simply providing evidence of student capability and demonstrated learning outcomes against the AHPRA registration standards (domains). When a program is driven by a structure that informs curricula and then custom-fits learning outcomes, there are gaps, overlaps, and discordance. Mapping these types of learning outcomes to the specific outcomes defined by the AHPRA becomes challenging, to say the least. A course driven by learning outcomes has not only assessment methods and curricula designed to deliver those outcomes but also a rich portfolio of evidence that can be used to demonstrate them. Second, the previous approach creates barriers to assessment and allows curriculum creep. It is common for an assessment mapped across a program to take subtle changes in direction, and ultimately these can lead to noncontinuity of GLOs, capabilities, knowledge, and outcomes. When the assessment is informed by the learning outcomes, and the curriculum is informed by the assessment, subtle changes to the assessment or curriculum require revision of the learning outcomes—a task that is possible when essential but prohibitive for on-the-fly changes to suit an individual academic preference. Third, the new approach enhances the quality of both product and service to the students and their employers. Finally, it provides integration, allowing students to engage in authentic learning and authentic assessment that are constructively aligned and appropriately scaffolded.
One of the by-products of this approach to program design is that reflection has been integrated into assessment tasks and rolled into an overall portfolio by mirroring the AHPRA continuing education template for practitioners, making assessment a rich learning environment (Supplemental Fig. 1; Supplemental Table 1). Collectively, this approach provides a clear vision for the program at a macro level, mapping the interconnectedness of student needs, university mission, program vision, industry needs, regulatory capabilities, and future opportunities.
CONCLUSION
Program review need not be an onerous task. The quality of courses is enhanced by reimagining the development of a program and reengineering its design, in turn allowing the learning outcome to become the hero of the program. This approach creates graduates who have capabilities that align with the expectations of industry and affords institutions rich resources to provide evidence that student learning meets regulatory criteria.
DISCLOSURE
No potential conflict of interest relevant to this article was reported.
ACKNOWLEDGMENTS
In undertaking this review, it is important that we recognize the contribution of the broader medical radiation science team at CSU to the work done. Although we were the architects of this approach, specific mention needs to be made of the members of the group championing and leading this process, without whom it would not have been possible: Kelly Spuur (discipline lead), Johnathan Hewis (associate head of school), Matt Collins (associate head of school), Andrew Kilgour (radiography), and Chad Harris (radiation therapy). We give them enormous thanks for investing in the vision and doing the hard yards to get the job done.
Footnotes
Published online May 3, 2018.
Received for publication January 16, 2018.
Accepted for publication March 1, 2018.