Abstract
In the past, program assessment was considered a supplemental activity designed to analyze program performance once instruction had concluded. This process was often a summative activity that overlooked the possibility of adjusting instruction while the curriculum was being implemented. However, the assessment process has evolved in such a way that it can now be considered an integral part of curriculum development. Forms J and L of the Joint Review Committee on Educational Programs in Nuclear Medicine Technology (JRCNMT) annual report requirements have recently been updated to support Nuclear Medicine Technology (NMT) programs in the JRCNMT’s effort to meet and exceed industry standards. At Bronx Community College, the NMT program has taken advantage of these newly developed forms to streamline program assessment. These modifications have changed not only how assessment is implemented at the end of the program but also how students are evaluated throughout their coursework.
The reason for instituting an assessment plan in a program or curriculum is the need for overall improvement. With this goal in mind, an assessment plan must be designed to address a particular set of learning outcomes. As described by the University of Central Florida, “behavioral and cognitive learning outcomes are given to highlight how Bloom’s taxonomy can be incorporated into the larger-scale educational goals or guidelines” (1). At Bronx Community College (BCC), the hierarchical structure of Bloom’s taxonomy is used as a guiding principle in the creation of appropriate and meaningful learning outcomes. The Nuclear Medicine Technology (NMT) program at BCC executes various levels of assessment, with the intention of creating a continually evolving program based on its assessment findings.
Program assessment happens at 2 levels. The first (and most frequent) level is to ensure that the program satisfies the student learning outcomes (SLOs) set forth by the college and approved by the Joint Review Committee on Educational Programs in Nuclear Medicine Technology (JRCNMT). The director of the NMT program and the college administration collaborate to determine which goals the program should target for assessment purposes. These outcomes are then clearly defined in the college’s course catalog as well as in the individual course syllabi. The idea is to ensure that students are aware of what is required of them and how they will be evaluated: throughout the course as a formative assessment and, ultimately, at the end of the course as a summative measure. Generally speaking, and as a practice at BCC, these outcomes are accompanied by a rubric that acts not only as a metric for formative assessment by the instructor but also as a guide for student expectations.
The second level of assessment is a directive from the JRCNMT. In recent years, the JRCNMT has begun to place a strong emphasis on assessment (at both the student and the program level). As part of this emphasis, the JRCNMT has established assessment standards that are reflected on several forms serving as the basis of the assessment portion of both the required annual report and the larger self-study report.
ASSESSMENT RESOURCES
It stands to reason that NMT programs across the country have similar resources when it comes to assessment. At BCC, the NMT program benefits from the guidance of the college’s Assessment Council, in which each department has its own representation. This council was created to help design assessment strategies that address the stated outcomes for each program as listed in the course catalog.
In past years, the NMT program at BCC had to rely on this council to determine how to properly use the gathered data to formulate a strategy for overall improvement. This strategy was designed on the basis of the program-level outcomes and SLOs that were ultimately decided on by the college administration, NMT advisory board, and JRCNMT. Recently, the JRCNMT has increased its involvement in assessment by devoting more resources to and creating new streamlined metrics for its assessment requirements (i.e., Form J, Assessment of Program Student Learning Outcomes; Form L, Program Effectiveness Data).
SLOs
According to Cornell University, SLOs can be defined as “measurable statements that articulate at the beginning what students should know, be able to do, or value as a result of taking a course or completing a program” (2).
Each institution is required to create its own SLOs based on JRCNMT guidelines and requirements for the accredited program. Recent communications from the JRCNMT have offered guidance focusing on the development and implementation of SLOs. Through a collaborative effort between the NMT program director, department assessment coordinator, and college administration, BCC has embraced these suggestions and consequently updated the college’s current SLO statements.
SLO DEVELOPMENT
Developing learning outcomes for the program is a multifaceted process. These outcomes must serve several purposes. First, the list of SLOs should be designed so that, in theory, when all are met, the student possesses the skills and knowledge required for graduation. This list should address the most important skills, knowledge, and aptitudes that students should acquire across the entire program. To be effective, the list is published in the course catalog, making it available for incoming (or current) students to use as a rubric for self-assessment. Through analysis of final grades, class participation, and direct observation reports from both instructors and clinical supervisors as part of the formal assessment procedures for the college, it has been determined that students who remain cognizant of these outcomes tend to perform at higher levels because of an increased understanding of course and program requirements. To supplement the effectiveness of “publicly” posting these SLOs, BCC requires inclusion of the list of SLOs on each course’s syllabus, which is distributed to the students at the beginning of each semester.
Previously, the NMT program at BCC had 10 SLOs, which were created based on the requirements of each course. However, through feedback provided by the assessment coordinator and the JRCNMT, we found this list of outcomes too cumbersome for assessment to be performed properly. The original intent was to link individual SLOs to individual courses. However, program-level outcomes are not designed to address specific course outcomes; instead, they provide an overall evaluation of the skills and knowledge a student acquires over the entire program. After evaluating the number of SLOs (not necessarily the SLOs themselves), we determined that there was redundancy among these outcomes, which resulted in inaccurate assessment: aligning program-level outcomes with individual courses diluted the differentiation between the two.
To rectify this redundancy, we elected to reduce the number of SLOs to 5. However, to accomplish this reduction and still have the list represent all the skills and knowledge the students are required to obtain, the SLOs had to be rewritten. The new list of SLOs is general enough to encompass all that is required yet still retains an alignment with specific assessment tools for proper evaluation.
The next issue to be addressed was determining the assessment vehicle that would be used to evaluate each of these SLOs. At this point, a collaborative effort materialized between the teaching faculty and the assessment and program coordinators. The goal was to look at the syllabus of each course in the NMT program and determine which courses offered the content or activities that addressed the specific SLOs. We would then look toward the formative or summative evaluations of those activities (e.g., tests, presentations) and use those data as the assessment vehicle for a particular SLO.
As a welcome, yet unintended, consequence, we encountered another set of redundancies, which involved multiple assessment vehicles for each of the SLOs. This time, however, these redundancies would benefit the assessment process. Having multiple assessment vehicles for the same SLO allows the SLOs to be assessed without interruption through each assessment cycle. For example, because of the coronavirus disease 2019 pandemic, some of the assessment methods in each course had to be modified to satisfy the change in teaching modality. Because there are multiple ways of assessing each SLO, we are less likely to be in a situation in which a particular outcome cannot be assessed because of a change in, or omission from, the curriculum. Ultimately, being able to use multiple assessment vehicles across various courses for the same SLO assured the college that each SLO could be implemented and assessed.
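To make this mapping concrete, the short sketch below shows one hypothetical way such an SLO-to-vehicle mapping could be recorded and checked; the outcome wording, course names, and assessment vehicles are illustrative placeholders rather than the program’s actual outcomes or tools.

```python
# Hypothetical mapping of SLOs to the assessment vehicles that can evaluate
# them across courses; labels, course names, and vehicles are illustrative.
slo_assessment_map = {
    "SLO 1: safe radiopharmaceutical preparation and handling": [
        ("NMT course A", "laboratory practical"),
        ("clinical internship", "supervisor competency checklist"),
    ],
    "SLO 2: effective communication with patients and staff": [
        ("NMT course B", "oral presentation rubric"),
        ("clinical internship", "clinical instructor evaluation"),
    ],
}

# Flag any outcome that depends on a single vehicle, which is exactly the
# situation the redundancy described above is meant to avoid.
for slo, vehicles in slo_assessment_map.items():
    status = "covered" if len(vehicles) >= 2 else "AT RISK: single vehicle"
    print(f"{slo}: {len(vehicles)} vehicles ({status})")
```

A check of this kind simply confirms that every outcome can still be assessed even if one course or activity changes during a given cycle.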
PROGRAM EFFECTIVENESS DATA & BENCHMARKING
Once a proper method of evaluating SLOs has been constructed, and assessment vehicles chosen to address specific SLOs, the data collected must be analyzed for the ultimate purpose of improving the learning experience for students. Again, this is a multifaceted process.
The method chosen to evaluate a program’s effectiveness is useful only when compared with a standard. This standard is known as a benchmark. According to the Center for Community College Student Engagement (CCCSE), “Benchmarking is the systematic process of comparing an organization’s performance on key measures to the performance of others” (3).
At BCC, the NMT program has established benchmarks that can be found on the recently updated Forms J and L of the JRCNMT compliance report.
The benchmarks found on Form J reflect the level of competence required of each student as stated in the published SLOs. These benchmarks were chosen on the basis of several factors. First, historical assessment data of the program were analyzed to determine an appropriate and reasonable goal (as described by the SLOs) for the students to achieve. Historical assessment data were used to minimize the “shot-in-the-dark” attempts at establishing reasonable student goals. These goals are ultimately assessed through both formative and summative means in various courses and throughout various stages of a student’s progress through the program.
Another consideration in the formulation of a benchmark is how it compares with outside requirements. Benchmarks that reflect individual student performance are created at the “local” level and tend to address the requirements of the college. Although these benchmarks are designed with academic performance in mind, they must also align with industry performance.
Form L of the JRCNMT compliance report establishes the benchmarks at the industry level. Largely influenced by accrediting standards, these benchmarks are designed as an assessment tool for the program. These benchmarks are influenced by assessment data gathered on an occupational level and reflect a common standard throughout the profession.
Whether at the program level or the student level, a benchmark that will yield accurate assessment data is overwhelmingly one that assesses a quantitative activity. Because of the objective nature of quantitative analysis, program data can be gathered and assessed across the curriculum regardless of who is performing the evaluation. This is a crucial aspect of assessment when dealing with program-level effectiveness: to yield accurate assessment results, limiting the amount of subjective variance is critical.
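As a rough illustration of how such a quantitative benchmark can be evaluated, the sketch below computes the fraction of a cohort meeting a score threshold and compares it with a target; the scores, threshold, and target fraction are hypothetical values, not the program’s published benchmarks.

```python
# Illustrative check of a quantitative benchmark: what fraction of the cohort
# met the score threshold, and does that fraction satisfy the target?
# The scores, threshold, and target below are hypothetical values.
scores = [88, 74, 91, 83, 79, 95, 68, 85]

threshold = 80          # minimum score counted as meeting the outcome
target_fraction = 0.75  # fraction of students expected to meet it

met = sum(score >= threshold for score in scores)
fraction_met = met / len(scores)

print(f"{met}/{len(scores)} students ({fraction_met:.0%}) met the threshold")
print("Benchmark met" if fraction_met >= target_fraction else "Benchmark not met")
```

Because the calculation depends only on recorded scores, any evaluator who repeats it will obtain the same result, which is the point of favoring quantitative benchmarks.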
Effective assessment needs to happen at multiple levels at varying times to yield meaningful results. To compile the most accurate data possible, it is up to the instructor to maintain a focus on addressing the student and program outcomes. At the program level, although data are collected on a continual basis, outcomes are generally assessed every 2 y (which represents a full program cycle). These outcomes should differ from those that are course-level outcomes or SLOs.
Over the past few years, our program at BCC has implemented several tools to streamline the process of completing Forms J and L while ensuring that the SLOs are met. The feedback from the JRCNMT has helped to restructure our program by targeting more efficient ways to retain records, organize data, and implement teaching tools. Below is a summary of some elements that we have already restructured to improve the assessment process of our program while also focusing on plans to enhance the monitoring of our SLOs.
WEB-BASED COURSE MANAGEMENT SYSTEMS
Web-based course management systems (such as Blackboard) have served as an integral tool for program assessment, allowing for thorough record keeping that helps track individual grades on assignments and exams. Writing assignments can now be kept in a digital format rather than in paper form, which often required not only numerous paper files for each student but also storage for those files. In addition, the use of discussion boards has allowed students to interact with their classmates, resulting in an environment that promotes immediate and meaningful feedback, as compared with a more traditional strategy of assignment submission that does not allow such interaction. Another major benefit of these course management systems is the ability to run reports and statistics on assignments or exams. If all students in the course have their assignment recorded in the grade center of Blackboard, for example, the instructor can simply select the Column Details option from the drop-down menu, which reports the average, median, SD, range of grades, and more (Fig. 1). Using a course management system thus improves teaching methods and also helps calculate the benchmark results reported on Form J.

Administering exams on Blackboard (especially if done in person in a monitored computer lab) can also be beneficial. By limiting (and in some cases eliminating) the need for traditional test evaluation methods, such as “by-hand” markup, the instructor can provide immediate feedback with the added benefit of reducing student testing anxiety. In addition, taking an online exam while being monitored in the classroom sets up an environment similar to that of board exams. We found that administering a “mock board exam,” with the same amount of time and number of questions set by the American Registry of Radiologic Technologists, has better prepared students for their licensure exams. Finally, the way the course material is presented to the student through Blackboard has improved overall student organization. Web-based course management allows the student to access materials all in 1 location: the course syllabus, lectures, homework assignments, learning outcomes, handbooks, and more can be placed in 1 place for the student. This has been an incredible asset in improving student performance and retention within our courses.
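For readers without access to Blackboard’s Column Details view, the short sketch below reproduces the same descriptive statistics (average, median, SD, and range) for a single grade column; the grades shown are illustrative, not actual student data.

```python
# Illustrative reproduction of the descriptive statistics reported for a grade
# column (average, median, SD, range); the grades are not actual student data.
import statistics

grades = [72, 85, 90, 78, 88, 95, 81, 69, 84, 91]

print(f"Average: {statistics.mean(grades):.1f}")
print(f"Median:  {statistics.median(grades):.1f}")
print(f"SD:      {statistics.stdev(grades):.1f}")
print(f"Range:   {min(grades)}-{max(grades)}")
```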
ONLINE SURVEYS
Transitioning from a paper-based collection method to an online format has had some challenges; however, it is proving to be a more efficient method of record keeping. In the past, all of our surveys were administered on paper and retained for the appropriate amount of time. This not only took up a lot of space but also made data analysis a daunting task. In recent years, we have begun the transition to online surveys, which has immensely improved our organization and collection. Surveys administered to students for individual course instruction, clinical site evaluation, and overall program effectiveness are now administered online. Administering the surveys in this format has allowed us to quickly run reports based on student feedback, aiding in assessment and program improvement. Online surveys have been particularly helpful in addressing JRCNMT standard D3.1g (evaluating graduate assessment of program effectiveness): in this format, the program can put together several questions that pertain to program effectiveness and send them to the students to complete easily. These results serve as an integral component in completing the assessment questions on Form L (Fig. 2).

Currently, the surveys that we give the clinical instructors for evaluating student performance are still on paper. This has created some recent issues, because analyzing the data on specific questions that relate to individual SLOs can be time consuming. In addition, if a clinical instructor is busy with other tasks, they sometimes fill out the form incorrectly and eventually have to redo it, creating inefficiencies for both the clinical coordinators and the affiliate education supervisors. Moving forward, we will begin to implement these surveys online as well. We feel that this will streamline the process of student evaluations, allowing us to easily identify trends and areas we need to address with the entire class, thereby improving student performance and assessment strategies.
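As a simple illustration of how online survey results might be tallied by SLO once the clinical instructor surveys move online, the sketch below groups Likert-style responses by the outcome each question addresses; the question identifiers, the question-to-SLO mapping, and the ratings are all hypothetical.

```python
# Hypothetical tally of online survey responses grouped by the SLO that each
# question addresses; question IDs, the mapping, and ratings are illustrative.
from collections import defaultdict
from statistics import mean

question_to_slo = {"Q1": "SLO 1", "Q2": "SLO 1", "Q3": "SLO 2"}
responses = [
    {"Q1": 5, "Q2": 4, "Q3": 3},
    {"Q1": 4, "Q2": 4, "Q3": 5},
    {"Q1": 3, "Q2": 5, "Q3": 4},
]

ratings_by_slo = defaultdict(list)
for response in responses:
    for question, rating in response.items():
        ratings_by_slo[question_to_slo[question]].append(rating)

for slo, ratings in sorted(ratings_by_slo.items()):
    print(f"{slo}: mean rating {mean(ratings):.2f} across {len(ratings)} responses")
```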
EPORTFOLIOS
Electronic portfolios are valuable tools that improve student learning while also aiding in assessment strategies. These portfolios allow students to create individual work that they can store electronically and on which they can continue to reflect at their leisure, thus enhancing their learning. In addition, electronic portfolios allow students to access information not only while they are enrolled in the program but also after graduation. Our program has created an NMT EPortfolio for students enrolled in the program. Currently, it is used to store resources in 1 easily accessible location: our handbooks, blank evaluations and rubrics, student learning outcomes, and competency forms can all be accessed from the same site. Moving forward, we plan to add a collaborative area for job postings, allowing both recent graduates and instructors to post information about current job openings. We plan to use this to improve the job placement assessment portion of Form L. The main benefit of this database in comparison to the Blackboard learning management system is that students can still access it after graduation. Helpful resources such as job postings, CT competency forms, board exam information, and the like can all be accessed in this 1 location.
VIDEO CONFERENCING
The pandemic has brought unforeseen challenges, which required instructors to quickly adapt to new teaching methods and technology. Although the incorporation of video conferencing software, such as Zoom, was a significant transition, it has proved very useful for both teaching and assessment.
Our program has primarily been using Zoom for a combination of online instruction, meetings, and advisement since the pandemic began in March 2020. This online conferencing platform has allowed us to improve some of our teaching methods as well as our assessment. Using this technology has allowed us to hold online information sessions for incoming and prospective students, greatly improving participation at these events while still allowing us to share our screen to show PowerPoint presentations, course expectations, prerequisites, and more. In addition, it still allows students to ask any questions they may have about the nuclear medicine field or program expectations. We feel strongly that the increased participation at the information sessions will improve student retention in the program.
Many of our students are considered “nontraditional” (i.e., not fresh out of high school but rather older than 25 y). According to an article published in Contemporary Issues in Education Research, “a vast majority of fresh-out-of-high-school ‘traditional’ aged (18–24) enrollees have shifted toward a wave of ‘nontraditional’ aged (25+) students, featuring displaced workers, first-generation college attendees, returning students, and those who desire a change in career (either due to financial hardship or preference), administrators have no choice but to alter collegiate curriculums, services, and overall philosophies. An overwhelming majority of institutions affected by this trend are community colleges” (4). Many of our nontraditional students deal with the challenges associated with balancing work, family, and school. With these students, in particular, we feel it is important to hold detailed information sessions specifying program requirements and expectations. During the clinical internship portion of the program, we believe that this transparency is imperative to improve student retention and graduation rates. Form L in the compliance report asks for an assessment of the graduation rate, which the college’s learning management system (LMS) should help to improve.
Aside from an increase in information session participation, we have also noticed an increase in participation at the advisory board meetings since they began to be held online. Although the pandemic forced the transition to an online format, we plan to retain it because of the noticeable increase in attendance. In addition, many of the clinical instructors find it challenging to commute to our campus after they finish work for the day; traffic, weather, and our proximity to Yankee Stadium can cause immense delays in travel time during rush hours. Fortunately, using Zoom for our advisory board meetings has allowed board members to call in from anywhere, greatly increasing our advisory board attendance and improving assessment strategies on Form L.
Lastly, using this technology has helped us to communicate with students in a private setting. We can now easily hold individual Zoom sessions for radiation badge review, midrotation clinical evaluations, and advisement. Zoom sessions can easily be worked around students’ clinical internship schedules while also accommodating the instructors. As we transition out of the pandemic, our plan is to continue these meetings online.
STUDENT RESOURCES
The annual compliance report has helped our program to recognize areas in need of improvement, especially due to additional challenges associated with the pandemic. More than ever, students are dealing with additional pressures, whether they be financial, psychologic, or physical. Over the last few years, we have worked to compile resources offered to our students, easing the burden of some of the financial constraints associated with attending college while also working to improve their job outlook on graduation.
In the last few years, we have been fortunate enough to have applied for and received grant funding for the program. We have used this funding to jump-start tutoring, CT instruction, review classes, conference allocations, and textbook support. Students in the program now have an option for free tutoring, in which select second-year students tutor the first-year students. The second-year students receive an hourly wage (helping them earn some money during the clinical internship), while the first-year students can review core nuclear medicine topics. Similarly, we have recently begun review sessions for the board exams with past lecturers or outside speakers. Both the tutoring and these review sessions are free for the students and have helped to improve both program retention and board exam pass rates.
In addition, with the growing need for PET/CT technologists, we felt it was imperative to incorporate CT instruction into the program. This grant funding has allowed us to hold an elective CT course for students, again at no additional charge. Also secured within this grant funding are allocations for conference attendance. We have recently been able to secure funding for hotel and travel expenses to the annual Greater New York Chapter of the Society of Nuclear Medicine (GNYCSNM) conference. This conference allows students to present abstracts while increasing their opportunity to network within the industry. These resources have greatly helped to improve job placement rates on graduation, again allowing us to more easily meet our benchmarks on Form L.
As with any program, there is a direct correlation between the support a program receives and how well it meets its intended goals. Using the feedback from the JRCNMT compliance report, specifically Forms J and L, the NMT program at BCC has been able to restructure the tools used for assessment. This restructuring has allowed us not only to improve areas of instruction and assessment that focus on student success but also to streamline data collection for future analysis. We plan to continue using the resources provided by the JRCNMT to track trends within assessment data while focusing on overall student performance.
DISCLOSURE
No potential conflict of interest relevant to this article was reported.
Footnotes
Published online Jun. 30, 2022.
Received for publication January 24, 2022.
Revision received June 29, 2022.