To the University of Wyoming:

The members of the Committee approve the work of Kari Brown-Herbst presented on May 4, 2020.

Dr. Courtney McKim, Chair
Dr. Alan Buss, Outside Member
Dr. W. Reed Scull

APPROVED:
Dr. Suzanne Young, Director, School of Counseling, Leadership, Advocacy and Design
Dr. Ray Reutzel, Dean, College of Education

Brown-Herbst, Kari, Measuring Faculty Use of the Learning Management System in the First Semester of Availability, Ed.D., School of Counseling, Leadership, Advocacy, and Design, May 2020

Abstract

The purpose of this project is to quantify the use of a Learning Management System (LMS) by faculty teaching online at a rural community college in the western United States. The LMS is ubiquitous in higher education, in place at 99% of colleges and universities and used by more than 80% of faculty and 82% of students. It is used in all instructional modalities but is an essential tool in online teaching. At institutions offering distance education, the LMS is the web-based environment through which faculty and students interact despite separation by time and geography. Understanding the level of online faculty activity in the LMS is foundational to knowing the opportunities for meaningful learning afforded to online students. This project will analyze the empirical data generated through faculty behavior in the LMS in the first semester of availability at a community college. This examination of faculty activity will provide a baseline measurement to describe faculty usage in the initial semester. It is the College's first step towards understanding what has taken place in its online campus, anticipating what will happen next, and intervening effectively in subsequent semesters.

Keywords: LMS, online faculty, analytics

MEASURING FACULTY USE OF THE LEARNING MANAGEMENT SYSTEM IN THE FIRST SEMESTER OF AVAILABILITY

by
Kari Brown-Herbst

A project submitted to the University of Wyoming in partial fulfillment of the requirements for the degree of Doctor of Education in INSTRUCTIONAL TECHNOLOGY

Laramie, Wyoming
May 2020

Acknowledgements

I would like to thank my colleagues at High Plains Community College for their participation in this work. Without their supportive conversation regarding Canvas Data and its role in the institution this project would not have gotten off the ground. Their interest in the story unfolding in our Canvas Data was motivational and important in seeing the work through to completion. I also want to acknowledge the supportive nudges of James and Joe. They are academicians always and have been steadfast in their encouragement throughout the dissertation process. My collaborators at Canvas were willing to provide redirection, clarification, and technology assistance when I needed it most. Thank you to Casey, Caroline, Erik, Stan, and Michael. Courtney McKim served as Committee Chair and her insight and guidance have made this work stronger. I am indebted to her for her willingness to assume Chair responsibilities long after this work began, and for the energy and expertise she invested in encouraging its completion. Alan Buss and Reed Scull rounded out the committee enthusiastically, prompting me to consider broader research than I might otherwise have. Thank you.

Dedication

To friends and family members who have shared this journey. When my ambition faltered and my commitment wavered, your encouragement lifted me. To my husband Glenn, you have been my partner in every important step forward for 26 years.
There is no milestone in this work that doesn't bear your mark. I am so grateful to have your steady support.

Table of Contents

Abstract
List of Tables
List of Figures
Introduction
    Problem
    Purpose
    Research Questions
    Definition of Terms and Acronyms
Methodology
    Context
    Participants
        Online faculty.
    Classes
        System of record.
        The learning management system.
    Procedure
    Reliability and Validity
    Limitations
Results
Discussion
    Implications
References
Appendix A
Appendix B
List of Tables

1  The Seven Principles and LMS Tools
2  Online Class Categorization per HPCC Enrollment Report
3  HPCC Revised Class Dataset
4  Days per Calendar Block, Fall 2018
5  Calendar Block Course Distribution per School
6  Average Daily Engagement per Week (h:mm) Fall 2018
7  Canvas-native Tools Used in the Fall 2018 Semester
8  Canvas-native Tools Application at HPCC
9  Academic Divisions' Portion of Overall Tool Use

List of Figures

1   Dataset Determination
2   Average Online Faculty Engagement per Calendar Block
3   Average Online Faculty Engagement per Academic Division
4   Average Online Faculty Engaged Time per Day
5   Average Online Faculty Activity per Day
6   Tools Activation per Academic Division
7   Tools Count per Academic Division
A1  Canvas Data Star Schema
B1  Requesting Faculty Participation in Data Verification
B2  Post-Conference Confirmation of Data Interpretation
B3  Participant Faculty Data Parity Confirmation

Introduction

Online education is no longer a trend. From 2002-2012 enrollments in online classes at colleges and universities in the United States saw an annual increase which mirrored the increase in college enrollments overall. However, enrollments in higher education overall began to decline in 2013 and declined annually through 2016. More specifically, the number of on-campus higher education students dropped by more than one million from 2013-2016. During this same period the number of students enrolled exclusively in face-to-face courses dropped by more than 11% (Seaman, Allen, & Seaman, 2018). In Fall 2016, there were more than 6.3 million students in the United States enrolled in at least one college-level class being delivered online, more than 30% of the total student population (Allen & Seaman, 2017; Magda, 2019; Schroeder & Cook, 2018; Seaman, Allen, & Seaman, 2018). Despite this growth, the 2017 Horizon Report identified expanding access to education as an ongoing challenge impeding progress in teaching and learning in the United States (Adams, Brown, Cummins, & Diaz, 2017). Broad use and continued development of the Internet and sophisticated web-based technologies have encouraged new interest in online education (Falvo & Johnson, 2007; Rhode, Richter, Gowen, Miller, & Wills, 2017).
The demand for the technology-rich, asynchronous, and always available classroom characterizing online programs today cannot be ignored. Colleges and universities positioned to meet this demand have invested in technology solutions through which such classes can be delivered. A Learning Management System (LMS) is such a solution, enabling institutions to create and deliver course content, monitor enrolled student participation, and assess performance.

Technology is an important element in all instructional modalities, from traditional brick and mortar to fully online or "brick and click" classrooms, and the hybridized modality which combines the face-to-face and online environments into a single structure (Dahlstrom & Brooks, 2014). According to Mars and Ginter (2007), instructional technology is an important component of community colleges' endeavor to reach a diverse student population. The technology in online programs affords educational opportunities across geographic and socioeconomic boundaries, which expands the potential student market (Mars & Ginter, 2007).

The belief that technology supports student learning was an identified theme in the 2014 ECAR Study of Faculty and Information Technology (Dahlstrom & Brooks, 2014). Faculty participants in that survey agreed effective technology integration held the power to enhance teaching across the academy. Faculty further reported a willingness to engage in professional development opportunities to help them integrate technology more effectively (Dahlstrom & Brooks, 2014). Participants in the 2018 Survey of Faculty Attitudes on Technology reported full support for the expanded use of educational technologies (Jaschik & Lederman, 2018). Participants cited their own enjoyable experiences with technology, former classroom success in integrating technology, and a belief that technology increased student engagement as important factors in their support.

With reference to all instructional modalities, Ertmer and Ottenbreit-Leftwich (2010) resolved that a low level of technology adoption and integration into instruction was not an option for higher education. The presence of technology-based tools to engage students and extend learning is an expectation of students. Ertmer and Ottenbreit-Leftwich (2010) proposed such academic technology was an essential tool in effective teaching, a position that is supported throughout the literature (Dahlstrom & Brooks, 2014; Jaschik & Lederman, 2018; Mars & Ginter, 2007; Pomerantz & Brooks, 2017). In the 2014 Study of Faculty and Information Technology (Dahlstrom & Brooks, 2014), 57% of the faculty participants agreed they would be more effective instructors if they were more skilled at integrating academic technology. The call for inclusion of technology in instruction is clear. However, it is not clear how faculty utilize technology in a purely online environment.

The educational technology most widely used by higher education faculty is the LMS (Dahlstrom & Brooks, 2014). For many faculty the LMS represents the entry point for integrating technology in their instructional practices (Morgan, 2003). The LMS was formerly known as a Virtual Learning Environment or a Course Management System (Rhode et al., 2017). Its introduction in education dates back to the 1960s when the first computer-assisted learning platform, PLATO, was introduced (Rhode et al., 2017; Watson & Watson, 2007).
This new platform was the first integrated learning system which offered functionality such as learner management, tracking, and application across a broad system (Watson & Watson, 2007), very similar to the affordances of most LMSs today. The modern LMS is characterized as self-contained and web-based (Wichadee, 2015). It serves as "the course hub for management and administration, communication and discussion, material creation and storage, and subject mastery assessment" (Dahlstrom & Brooks, 2014, p. 16). Watson and Watson (2007) similarly defined the LMS as the framework through which instructional content is managed and delivered, and learning assessment can be cataloged or tracked. Nearly all higher education institutions in the United States have an LMS (Pomerantz & Brooks, 2017).

Functionality of the LMS has evolved with changes in the communications technology and infrastructure upon which such systems rely and with the advent of distance learning in higher education. The modern LMS has become systemic, inclusive of tools that enable the delivery and management of course content and the identification of and tracking towards mastery of academic performance goals (Watson & Watson, 2007; Williams & Whiting, 2016).

Many of the tools in the modern LMS support the seven principles of good practice. The principles were derived from multiple resources and were presented as common-sense practices in response to a national search for ways to improve college teaching and learning (Chickering & Gamson, 1987). According to Chickering and Gamson (1987), good practice: (a) encourages contacts between students and faculty; (b) develops reciprocity and cooperation among students; (c) uses active learning techniques; (d) gives prompt feedback; (e) emphasizes time on task; (f) communicates high expectations; and (g) respects diverse talents and ways of learning.

Chickering and Gamson could not have envisioned the technology-enriched classroom common in higher education today when their seminal work was first published in 1987. However, because the seven principles are focused on how we teach as opposed to what is taught (Chickering & Gamson, 1999; Lai & Savage, 2013) they remain relevant today. Though decades old, the seven principles remain applicable to current definitions of quality instruction in the traditional classroom, as well as the technology-enriched environment of an LMS-supported online class (Bangert, 2004; Chickering & Ehrmann, 1996; Chickering & Gamson, 1999; Hathaway, 2014; Lai & Savage, 2013; Suen, 2005; Tirrell & Quick, 2012). For example, LMS-based discussion forums and announcements afford communication between faculty and students and provide a secured and exclusive platform for student-to-student interactions. Collaborative efforts among students are supported in the LMS through the native assignment tools. Additionally, immediate and impactful feedback is possible through the built-in quizzing and gradebook features in the LMS (Hathaway, 2014; Williams & Whiting, 2016). The relationship between each of the seven principles and common LMS tools is presented in Table 1. Research conducted by Tirrell and Quick (2012) found a direct relationship between faculty-reported inclusion of the seven principles in their instructional practices, and lower student attrition rates in online classes.
Similar studies have determined the seven principles provide an appropriate framework for the evaluation of online courses (Bangert, 2004; Hathaway, 2014; Lai & Savage, 2013; Suen, 2005). Lai and Savage (2013) applied the seven principles in their examination of LMSs' support for "good" teaching, acknowledging the potential for effective instruction in the LMS environment.

Table 1
The Seven Principles and LMS Tools

Principle: LMS tool
1. Encourages contacts between students and faculty: Messaging or Inbox, Chat, Announcements, Roster
2. Develops reciprocity and cooperation among students: Collaborations, Discussions, Assignments, Groups
3. Uses active learning techniques: Collaborations, Rich Content editor, Assignments, Virtual meeting
4. Gives prompt feedback: Gradebook, Assignment, Discussion, Virtual meeting, Chat, Confirmation
5. Emphasizes time on task: Calendar, Schedule, Timed-release, Analytics
6. Communicates high expectations: Rubrics, Assignments, Content, Syllabus
7. Respects diverse talents and ways of learning: Rich Content editor

Faculty use of the LMS varies among the disciplines and among institution types. Morgan (2003) reported the LMS was being employed primarily in face-to-face classes. Faculty participants identified static information sharing as the main LMS function being used (Morgan, 2003). Several years later, Dahlstrom and Brooks (2014) reported 60% of faculty identified the LMS as a critical tool in their teaching. Faculty in Associate's and Master's degree programs reported a slightly higher importance of the LMS than other faculty (Dahlstrom & Brooks, 2014).

Research has identified several factors that impede technology adoption in higher education. Buchanan, Sainter, and Saunders (2013) examined Internet self-efficacy as a predictor of technology adoption. In interview-based research of online educators they found the degree of self-reported confidence in using online technologies was a strong indicator of the level of adoption in instructional practice. Their research also supported the construct of perceived usefulness as a determining factor, similar to the Technology Acceptance Model (TAM). First introduced in 1986, TAM and its modifications have been widely used to understand why faculty adopt technology tools, including those within the LMS (Buchanan, Sainter, & Saunders, 2013; Ertmer & Ottenbreit-Leftwich, 2010; Fathema, Shannon, & Ross, 2015; Rhode et al., 2017; Schoonenboom, 2014; Wichadee, 2015). Buchanan et al. (2013) introduced structural factors such as resource limitations and training as a third variable that impacted adoption. Similar factors were identified in a modification of TAM, named TAM II, which referenced system quality and institutional support as obstacles to adoption (Lee et al., 2003).

Pomerantz and Brooks (2017) reported that the particular LMS has little impact on faculty members' use, which is generally reserved for the basic features. Their longitudinal examination of faculty satisfaction found little variance from 2014-2017 in terms of use and favored features. In an examination of faculty attitude towards the LMS, Wichadee (2015) found no correlation between a positive attitude towards the LMS and the actual application of its tools in instruction. Faculty participants in that study agreed that the LMS was highly useful; however, this attitude was not conveyed in the actual use of the LMS tools (Wichadee, 2015). Similar findings were reported by Lonn and Teasley (2009) and Jaschik and Lederman (2018).
Despite advances in the LMS's tools, faculty reported their most common application of the LMS to instruction was as a repository for static content sharing (Jaschik & Lederman, 2018; Lonn & Teasley, 2009). Additional research has documented the impact on faculty use caused by the migration from one LMS to another. Continued use of the LMS by faculty through a software migration depends upon available training, increased technical support, and a provision for faculty support through planning and implementation (Rucker & Downey, 2016; Ryan, Toye, Charron, & Park, 2012; Sanga, 2016).

This previous research was founded on self-reported use of the LMS. It has provided important insights from key stakeholder groups in the environment including students and faculty. However, data-mining is gaining interest as an avenue for empirical evidence regarding the use of the LMS (Mandernach & Palese, 2016; Rhode et al., 2017; Romero, Ventura, & Garcia, 2008; Siemens & Long, 2011) and will serve as the foundational process in this project. Data-mining refers to the process of extracting meaningful information from a large set of raw data. For purposes of this study the large dataset will represent all user activity in the LMS at a rural community college in a western state. To ensure anonymity throughout this research the college was identified by a pseudonym, High Plains Community College (HPCC).

Within the LMS a digital footprint of usage is created as a direct reflection of user actions within the system. Every mouse click, quiz question response, discussion post, and course announcement that occurs in the LMS is recorded and preserved. When examined systematically this vast data collection tells the College's story of "What happened?" For an institution that holds evidence-based decision making as a transparent value, understanding the digital footprint in the LMS provides a context for institutional planning to impact resource allocation and, ultimately, instruction in the online campus.

The HPCC Vision Statement delivers the expectation that data-driven decision making will inform operations throughout the institution. The College defines its intentions as being diligent in the use of evidence to make decisions pertinent to resource allocation, instructional initiatives, and productivity in operations (High Plains Community College [HPCC], 2016). HPCC implemented the Canvas by Instructure (Canvas) LMS throughout the College in 2018 and has contracted with this software service through June 2023. An examination of the faculty-produced Canvas data during the inaugural semester at the College will assist HPCC in understanding the degree to which the LMS is being used by faculty. This examination will create a baseline measure established by the faculty behaviors recorded in the system. It will further provide a benchmark for future planning such as establishing indicators of acceptable adoption associated with the online initiative, resource allocation, and faculty training.

Problem

High Plains Community College has invested considerable time and resources in LMS implementation in recent years. The College transitioned to a new LMS in 2008, to a different LMS in 2013, and most recently to Canvas in 2018. These transitions have been triggered by a broad faculty appraisal of existing LMS functionality and vendor success in a Request for Proposal (RFP) process at the expiration of contracted licenses.
Like most colleges, HPCC has provided the LMS as a tool through which online courses should be delivered, but the institution has not examined the application of the LMS and its use across the College. To date the College has relied on self-reported information from faculty regarding their use of the LMS and the implementation of the LMS's native tools in instruction. The disparate collection of this information has been treated as an end in itself, perhaps a reflection of the difficult task of analyzing it. Consequently, after more than 15 years of offering online courses and programs the College is without any aggregated understanding of the degree to which the LMS is being used by the online faculty.

Purpose

The purpose of this quantitative research was to understand the patterns of LMS activity generated by HPCC online faculty during the first semester of Canvas availability. This examination of activity in the initial semester provided a baseline measurement of the current status of Canvas implementation at the College. Understanding the status of Canvas use during the first semester of availability allows the College to define its envisioned future status, to develop internal supports in targeting that status, and to establish indicators of acceptable adoption in its pursuit of that state.

Baseline studies are employed to help an organization benchmark a performance metric upon which to gauge impact over time. They examine a starting point from which to define behavior and consequently a marker from which to measure growth (Odhiambo, 2013). The baseline measurement determined through this research provides a benchmark for describing the use of Canvas by HPCC online faculty over time. This measure sets the stage for the examination of online faculty use longitudinally, which will be an important factor in the next LMS renewal consideration at HPCC, scheduled for 2022.

Research Questions

The research questions addressed in this project are:

1) What was online faculty's level of activity in the LMS in the Fall 2018 semester?

2) What Canvas-native tools were used in online classes during the Fall 2018 semester? Specifically, to what extent were those Canvas-native tools that encourage student-to-student, faculty-student, and student-to-content engagement used by the online faculty? Canvas' Announcements, Assignments, Discussions, Quizzes, and Modules tools were examined in this regard.

Definition of Terms and Acronyms

Canvas Data is the collection of text-based tables that report all user activity in an institution's Canvas instance. The tables reflect the visible actions from a user, such as a click, as well as the series of systematic backend responses within the architecture of the Canvas LMS.

High Plains Community College (HPCC) is the pseudonym applied to the research location throughout this project.

Learning Management System (LMS) is a web-based environment through which online instruction can be delivered. It centralizes the management and administration of a class and includes the tools that enable the development and delivery of course content.

School of Arts & Humanities (A&H) is the academic division at HPCC that includes such disciplines as English, Communication, and Humanities and those associated with the visual and performing arts.
School of Business, Agriculture, & Technical Studies (BATS) is the academic division at HPCC that includes such disciplines as Business, Agriculture and Equine, Computer Information Systems, and industrial trades such as Wind Energy, Welding, and Automotive.

School of Health Science & Wellness (HSW) is the academic division at HPCC that includes a wide range of health-related fields such as Nursing, Radiography, Dental Hygiene, and Surgical Technology.

School of Math & Sciences (M&S) is the academic division at HPCC that includes such disciplines as Mathematics, Statistics, Chemistry, Biology, and History.

Stacking is an enrollment-based practice at HPCC wherein two different sections of a single course are combined and delivered through one Canvas shell. Stacking allows the institution to deliver low-enrolled classes by adding the enrollments of two classes together and counting the stacked section as a single offering on the faculty workload.

Methodology

This study sought to understand the use of Canvas by online faculty at HPCC during the first semester of implementation. The selection of participant faculty and courses is discussed in this section. Additionally, the procedures used in refining these data and the establishment of reliability and validity are examined. Finally, a discussion of the limitations of this study is included.

Context

High Plains Community College is in the western United States. The College was established in 1968 and since its inception has become the largest community college in the state. The environment is essentially rural, though the College is situated in the state's largest city. In addition to its main campus HPCC also has limited operations at three satellite locations within a 100-mile radius. HPCC offers more than 80 different credentials ranging from Associate's degrees to Credit Diplomas. HPCC has the second largest population of online students in the state, behind only the state's sole 4-year institution (Seaman & Seaman, 2017).

The College comprises four academic schools. Each school is administered by a dean, and every dean reports directly to the Vice President of Academic Affairs. The academic schools are: Arts & Humanities (A&H); Business, Agriculture, and Technical Studies (BATS); Health Science & Wellness (HSW); and Math & Sciences (M&S). The School of Math & Sciences employs the greatest percentage of the full-time faculty; all schools rely on the contributions of adjunct faculty. The largest program at the College is Nursing, which is positioned in the School of Health Science & Wellness. Education and Computer Information Systems are the second and third largest programs, and are housed under the School of Arts & Humanities and the School of Business, Agriculture, & Technical Studies, respectively.

The College serves a student population of approximately 2,638 full-time equivalent students (U.S. Department of Education, 2017). Academic courses at HPCC are delivered through face-to-face, online, and hybrid modalities. The student population is primarily Caucasian. More than 80% of the student population are in-state residents, 59% are women, and 58% are part-time students. The average age of an HPCC student was 27 years and 62% of the student body was under 25 years of age (2015). The College operates on a semester basis; each academic year includes a consecutive sequence of Fall, Spring, and Summer.

HPCC strives to make evidence-informed decisions in its pursuit of teaching excellence.
However, to date the College has relied on anecdotal and self-reported information regarding the faculty use of learner-centered practices and technology tools to support instruction. HPCC transitioned from the Desire2Learn (D2L) LMS to Canvas in 2018. When the LMS transition conversation was initiated at the College the five-year contract with D2L was nearing expiration. In addition, there was a new initiative to consider a common LMS throughout the state. The College's commitment to that initiative gave the HPCC faculty exposure to, and the opportunity to compare, several LMSs. It eventually resulted in the determination that HPCC would join five other institutions in higher education and several K-12 partner school districts in the state in adopting Canvas. Fall 2018 was the first semester in which Canvas was the only LMS through which academic courses at HPCC could be conducted or managed.

Participants

Online faculty. Participants in this research included all faculty teaching a fully online and credit-bearing class at HPCC during the Fall 2018 semester. Credit-bearing courses are those that are designated in the College Catalog and are required in one or more of the College's degree programs. Faculty participants have either full-time or part-time employment status at the College with no specified gender, race, age, or class. Administrative Procedure at HPCC requires that all faculty meet minimum faculty qualifications prior to employment; procedural compliance means all participants will have two or more years of post-secondary education. Length of employment at HPCC among the participants ranges from a first year of service to as many as 32 years. Faculty members were affiliated with one of the College's four academic schools as determined by the class(es) in their instructional workload for the Fall 2018 semester. The faculty listing provided by the College indicates 125 full-time faculty were employed at HPCC during the studied semester. The listing further indicates 323 part-time faculty were eligible for employment during this time. At a minimum, faculty members in all academic schools at HPCC utilize the LMS to share the course syllabus, to provide availability and contact information, and to maintain a current gradebook.

Classes

System of record. HPCC maintains a single student information system that serves the institution as the system of record. It is considered the definitive source for many data elements at the College including semester schedules, class enrollments, mid-term and semester grades, and student demographic data. The system of record provides these data from which many of the College's metrics are derived and from which the official reporting to external stakeholders is developed. The system of record at HPCC is Colleague, a secure database in which much of the institution's records are stored.

The Fall 2018 Section Enrollment Report, FINAL (HPCC Enrollment Report) was derived from Colleague. This report indicated a total of 814 classes were delivered in the studied semester. Applying a LOCATION filter to this report revealed 162 fully online classes were delivered during this time (HPCC, 2018). The reported online classes represent 19.9% of all classes offered at the College in the studied semester. Categorization of the online classes in the HPCC Enrollment Report across the academic schools indicated 41 in A&H, 35 in BATS, 27 in HSW, and 52 in M&S.
Additionally, seven classes that were classified as Academic Affairs (AA) offerings were held in the Fall 2018 semester. Table 2 disaggregates the HPCC Enrollment Report data pertaining to faculty assignments in all online classes offered.

Table 2
Online Class Categorization per HPCC Enrollment Report, Fall 2018

School    Online Classes    Percentage of Online Offerings
AA               7                        4%
A&H             41                       25%
BATS            35                       22%
HSW             27                       17%
M&S             52                       32%
TOTAL          162                      100%

The learning management system. The disaggregated class list from Colleague was compared with the Canvas Data COURSE_DIM (Canvas Course) table in order to define the sample set of classes to be used in this study. The Canvas Course table is one of several extracts provided through Canvas Data. It attributes numerous values to each class created in the LMS. There is an entry in the Canvas Course table for each class migrated from Colleague into Canvas which associates the class with identifiers that assist with institutional reporting. Two LMS-specific metrics reported in the Canvas Course table that are pertinent to this study are the enrollment_term_id and the workflow_state variables (Canvas, 2015). Filtering the table for the Fall 2018 semester (i.e., enrollment_term_id) isolated those classes conducted in the semester of interest. Applying a second filter to the workflow_state value further reduced the dataset by excluding those classes that were created but never published and therefore never available to students. In the HPCC Canvas instance unpublished courses could be accessed by the assigned faculty; however, any work conducted in these course shells was concealed from students. For purposes of this study faculty activity in an unpublished course was considered exploratory and experimental and not directly associated with students. The dataset was further reduced to reflect the institutional practices that impact these data recorded in the LMS. This distillation of the Fall 2018 classes at HPCC is demonstrated in Figure 1.

Figure 1
Dataset Determination
(Figure: the full Fall 2018 dataset narrowed by removing classes of another modality, unused shells, classes stacked to hybrid delivery, off-block classes, and shells with no data.)

Enrollment-based scheduling decisions in place at HPCC can impact the degree to which enrollment in the student information system was reflected in the LMS. For example, if a faculty member was assigned two sections of the same course in a single semester, that faculty member may choose to conduct the sections through a single Canvas class shell. This practice yields class shells in the LMS with no recorded faculty activity, and similarly results in LMS class shells where the faculty activity was reflective of two classes as opposed to one. There is no accounting for this practice in the College's system of record, and consequently there is misalignment between Colleague and the LMS in semesters where the combining of sections was in place.

Combined course sections were easily identified in the ENROLLMENT_DIM (Canvas Enrollment) table in the Canvas Data. By filtering the type variable for StudentEnrollment and the workflow_state variable for active, the visible dataset displayed active student enrollments in all classes (Canvas, 2015). Enrollment totals were determined by using the SUM operation in Excel. Comparing these totals to the enrollments identified in the HPCC Enrollment Report located the discrepancies; sorting the HPCC Enrollment Report by instructor conveniently aligned common courses and allowed for the identification of courses where a single instructor was assigned multiple sections. A sketch of this cross-check appears below.
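The filtering and cross-check just described can be expressed in code. The following is a minimal pandas sketch, not the procedure actually used in this study (which relied on Excel). The extract file names, the Fall 2018 term id, and the Colleague report's column names (canvas_course_id, enrollment, section, instructor) are hypothetical; the Canvas table and variable names follow the Canvas Data dictionary as described above.

    import pandas as pd

    # Load the tab-delimited Canvas Data extracts and the Colleague report.
    # File names are placeholders.
    course_dim = pd.read_csv("course_dim.txt", sep="\t")
    enrollment_dim = pd.read_csv("enrollment_dim.txt", sep="\t")
    hpcc_report = pd.read_csv("fall_2018_enrollment_report.csv")

    FALL_2018_TERM_ID = 1234  # hypothetical enrollment_term_id value

    # Keep only published Fall 2018 classes; 'available' is assumed to
    # mark a published shell in the workflow_state variable.
    published = course_dim[
        (course_dim["enrollment_term_id"] == FALL_2018_TERM_ID)
        & (course_dim["workflow_state"] == "available")
    ]

    # Active student enrollments per class (the SUM step performed in Excel).
    active = enrollment_dim[
        (enrollment_dim["type"] == "StudentEnrollment")
        & (enrollment_dim["workflow_state"] == "active")
    ]
    canvas_counts = (
        active[active["course_id"].isin(published["id"])]
        .groupby("course_id")
        .size()
        .rename("canvas_enrollment")
    )

    # Align Canvas totals with the Colleague report; mismatches flag the
    # empty shells left behind when two sections were run through one shell.
    merged = hpcc_report.merge(
        canvas_counts, left_on="canvas_course_id", right_index=True, how="left"
    )
    mismatches = merged[merged["canvas_enrollment"].fillna(0) != merged["enrollment"]]
    print(mismatches[["section", "instructor", "enrollment", "canvas_enrollment"]])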
This analysis resulted in five classes being identified as combined, and the associated empty class shells were ultimately eliminated from the dataset of online classes used in this study.

An additional practice at HPCC which impacted activity data reflected in the LMS is stacking. This practice involves the combining of two course sections, generally two sections of the same course offered in a single semester under differing instructional modalities. Stacking is a way of meeting the institutionally determined enrollment threshold. The enrollment threshold is a measure of financial efficacy for a class at HPCC and is a determinant in whether a class is conducted or cancelled. Stacking allows for the joined sections to be conducted as a single hybridized class, generally an instructor-determined combination of face-to-face and asynchronous online instruction. This naturally alters the instructional modality of the class; however, the impacted classes are not recharacterized in the College's system of record when stacking occurs.

The Canvas Enrollment table also served as a source to initially identify courses in the Fall 2018 semester which held the potential to be stacked. By filtering the same variables as were used to determine combined sections (type and workflow_state), the filtered dataset from the Canvas Enrollment table reflected all classes with active student enrollments. Comparing this dataset with the HPCC Enrollment Report, filtered by Section, identified those courses where multiple sections were scheduled in a variety of instructional modalities. Scheduling multiple sections of the same course in different modalities affords students the opportunity to enroll in a class that best meets their needs and/or is more easily accessible due to scheduling constraints or other individual considerations. HPCC schedules classes in this manner to positively impact enrollment (High Plains Community College [HPCC], 2013). The examination of these data for stacked classes revealed six classes that were classified as online offerings in Colleague that were stacked to face-to-face sections of the same course and were delivered as a hybrid option. Because these classes were not delivered fully online, they were removed from the list of online classes used for this study.

A third impactful practice at HPCC is the postponement of a class start date in order to allow for late enrollments. This impacts the comparability of that class's data generated in the LMS. The College normally schedules classes into five distinct calendar blocks: A16, A8, B14, B12, and B8. The numeric reference in a block's title reveals the number of weeks of instruction included in that block. When the start date is delayed beyond the start of the block the class is referred to as "off block" in Colleague. While it falls within the defined parameters of the College's academic calendar, an "off block" class does not adhere to the published start and end dates of any block. In the studied semester there were five online classes that were conducted "off block" with each class offered in a unique date and duration configuration. The classes that were delivered "off block" were removed from the dataset. Their inclusion in this study would have placed each course as a separate category in the analysis and would have made the class and the assigned faculty easily identifiable.
Finally, there were two faculty assigned to seven online classes for which the Canvas Course table showed no activity despite the inclusion of student enrollments in the HPCC Enrollment Report. The underlying cause of this inconsistency in Canvas Data cannot be determined. Similarly, the faculty activity cannot be reconstructed. This necessitated removing these courses from the dataset. The revised list of participant courses used for this study is defined in Table 3.

Table 3
HPCC Revised Class Dataset

                              Removed due to Enrollment-based Scheduling Decisions
School   Classes per HPCC    Single Shell,    Stack to    Off      No Data Reported    Revised
         Enrollment Report   Multiple Class   Hybrid      Block    in Canvas Course    Total
AA                7                 -             -          -             -               7
A&H              41                 2             -          -             -              39
BATS             35                 1             4          5             -              25
HSW              27                 1             -          -             3              23
M&S              52                 1             2          -             4              45
TOTAL           162                 5             6          5             7             139

Summarily, the sample size of 139 classes in this study represents all classes that were offered at HPCC in the Fall 2018 semester that were characterized as published, conducted completely online within dates determined by an identified calendar block, and reflected in the Canvas Course table as a class within which a single faculty member was active. The courses in the sample set were categorized according to academic division. The School of Math & Sciences had the largest portion of classes in the study at 32.4%. The second largest percentage was in the School of Arts & Humanities (28.1%) followed by Business, Agriculture, & Technical Studies (18.0%), Health Science & Wellness (16.5%), and Academic Affairs (5.0%).

Procedure

The research questions posed in this study required the identification of the online faculty at HPCC and the courses to which they were assigned for the Fall 2018 semester. An examination of the HPCC Enrollment Report was completed to determine the volume of online classes offered at HPCC during the semester studied (HPCC, 2018). The information in this report was reduced to data pertinent to this study. Specifically, the report was filtered by LOCATION to determine the number of online class sections, and filtered further by SCHOOL to determine the class volume per academic school at the College. This provided an initial identification of study participants and the academic schools in which they taught during the studied semester. A comparison of the reduced report with the Canvas Course and Canvas Enrollment tables extracted from Canvas Data resulted in the finalized list of participant courses categorized in Table 3.

Additional extracts from Canvas Data were conducted in order to address the research questions. Canvas Data is a voluminous collection of tab-delimited flat files. This large dataset includes LMS activity by all account users related to courses, enrollments, activities, logins, and student submissions for a particular Canvas instance. The files are formatted following data warehouse conventions and are comprised of fact and dimension tables (Instructure, 2015). In Canvas Data the fact tables contain the basic units of measurement of user activity. Examples include assignment, discussion, enrollment, and login. The related dimension tables contain the "who, what, when, and where" information for each of the measures identified as fact. Sample dimensions from Canvas Data include timestamps, course names, and enrollment status. Joining the fact and dimension tables together through associated keys allowed for the comparison of associated files to understand patterns and query the dataset in an efficient manner. A sketch of such a join appears below.
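To make the joining step concrete, the following minimal pandas sketch joins hypothetical assignment fact and dimension extracts. The table names follow the Canvas Data naming convention of one fact and one dimension table per tool, but the exact column names are assumptions and may differ by schema version.

    import pandas as pd

    # Load the fact and dimension extracts for the Assignments tool,
    # plus the course dimension for class names. File names are placeholders.
    assignment_fact = pd.read_csv("assignment_fact.txt", sep="\t")
    assignment_dim = pd.read_csv("assignment_dim.txt", sep="\t")
    course_dim = pd.read_csv("course_dim.txt", sep="\t")

    # The fact table's foreign key (assignment_id) joins the dimension
    # table's primary key (id); a second join attaches the course name.
    assignments = assignment_fact.merge(
        assignment_dim,
        left_on="assignment_id",
        right_on="id",
        suffixes=("", "_dim"),
    ).merge(
        course_dim[["id", "name"]].rename(
            columns={"id": "course_id", "name": "course_name"}
        ),
        on="course_id",
    )

    # One row per assignment: its class, due date, submission type, and
    # point value. Rubric presence would require a further join to the
    # rubric tables (omitted here).
    print(assignments[["course_name", "title", "due_at",
                       "submission_types", "points_possible"]].head())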
Joining the fact and dimension tables associated with the Assignments tool, for example, allowed for the analysis of every assignment in the HPCC Canvas instance in terms of the class in which it was included, the dates upon which it was created, assigned, and due, the submission type specified, the point value, and the presence or absence of a formalized scoring rubric.

The Canvas Data Portal extracts and delivers Canvas activity data for a member institution. HPCC enabled Canvas Data in its hosted environment and is the recipient of all data generated in the LMS since its implementation in 2018. These data are exported to the College upon request, allowing for just-in-time queries of interest.

Research question 1 sought to quantify the level of activity of the online faculty in the LMS during the Fall 2018 semester. In order to address this question, the Canvas requests table was examined. The requests table in Canvas Data is the single repository of all pageview activity in Canvas and is a compilation of Canvas log files. Essentially, the table shows all events associated with the learning management system for a defined period of time. This includes every visible, user-triggered event such as a click as well as the back-end details associated with that event. The requests table is exceptional for the breadth of data it contains and likewise for its size. Processing the table monopolizes computer resources for extended periods of time. Two attempts at HPCC to import the requests table into Excel or open it in Tableau resulted in failed jobs running for several days. The need to refine the requests table in order to access its content was bridged through collaboration with a Canvas data scientist.

While the requests file contains a vast amount of data that is not pertinent to this study, it is the most comprehensive record of faculty activity in the LMS (Canvas, 2015). Prior to transmitting the file to HPCC it was refined for this project in the following manner:

1. Data pertaining to semesters other than Fall 2018 and to classes not represented on the HPCC Enrollment Report was removed.

2. The research question was posed as a Canvas user story in order to evaluate the relationship of all variables included in the original requests table.

   a. The user story was stated as, "As a researcher working with Canvas Data, I want to know how many 'requests', or course actions, each teacher had in any online course over the span of the Fall 2018 term. This includes date and time-stamped access to each course and date and time-stamped exit from each course but not recorded activity within the courses."

   b. Two files resulted from analysis of the requests table via the user story. The first file, named teacher_access_rollup, included a single row of data for each class and associated the teacher name and the number of requests made by that teacher.

   c. The second resultant file, teacher_access, included a separate row for each access time, for each class. The variables included were Teacher Name, Course Name, Timestamp, IP Address, and URL.

The refined request files were sent to HPCC via ShareFile for security. The teacher_access_rollup file contained 162 rows of data and was 15 KB in size. The teacher_access file contained approximately 5.2 million rows of data and was 5.2 GB in size. For efficiency in downloading, this file was divided into 12 separate files, each representing the user-triggered activity of a subset of the online faculty. A sketch of how such a rollup can be reproduced from the teacher_access extract follows.
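The following minimal pandas sketch reproduces a teacher_access_rollup-style summary from the twelve teacher_access splits. The split file names are hypothetical; the column names mirror the variables listed above.

    import pandas as pd

    # Recombine the twelve downloaded splits into one frame;
    # file names are placeholders.
    frames = [
        pd.read_csv(f"teacher_access_{i:02d}.csv", parse_dates=["Timestamp"])
        for i in range(1, 13)
    ]
    teacher_access = pd.concat(frames, ignore_index=True)

    # One row per class and teacher, with that teacher's total request
    # count for the semester (the rollup described above).
    rollup = (
        teacher_access.groupby(["Course Name", "Teacher Name"])
        .size()
        .rename("requests")
        .reset_index()
    )
    print(rollup.sort_values("requests", ascending=False).head())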
The analysis of data from the requests table was completed in Excel. All files were formatted as comma-separated values (.csv). The tabular data was seamlessly sorted and categorized using standard capabilities of Excel. These data revealed the first time an instructor entered a class on a given day. Determining when that instructor left the class, however, was not apparent and is indeterminate in the data gathered. While activity in a second course could be interpreted as departure from a previous class, that interpretation is not necessarily true and cannot be verified. For example, an instructor may have been using multiple browser tabs or windows, and in that case would have been active in both classes simultaneously. Using the elapsed time function, it was possible to calculate the time an instructor spent in a course on a given day. This information was used to determine the length of time a faculty member was "active" in the online course on any calendar day in the semester.

Analysis of the refined requests table has provided a baseline understanding of online faculty activity in the LMS. Canvas defines the purpose of the requests table as an opportunity to examine activity in the aggregate (Canvas, 2015). It has been used for exactly this purpose in its application to research question 1.

Research question 2 sought to identify the Canvas-native tools that were being used in the online classes at the College. For purposes of this study tools were considered "Canvas-native" if they do not require services external to the LMS. For example, the Canvas Gradebook is "native"; the remote proctoring solution provided by another vendor and integrated into the LMS via Learning Tools Interoperability (LTI) is not. The Canvas-native tools are Assignment, Communication, Discussion, Files, Groups, Modules, Quiz, and Wiki. Information about the inclusion of each of these tools in any class is stored in the fact and dimension tables bearing the tool's name. Unlike the requests table, the Canvas Data fact and dimension tables are small text files that can be efficiently downloaded and accessed. To determine the extent to which the Canvas-native tools were included in an online course at HPCC, and to further describe those tools, the fact and dimension tables were joined by the foreign and primary keys defined in the Canvas Data dictionary (Canvas, 2015). A detailed explanation of this process is provided in Appendix A.

Reliability and Validity

The Canvas Data afforded this study was drawn in a single extract from all activity recorded in HPCC's instance of Canvas as of the end of the Fall 2018 semester. Much of the work in this study was associated with drawing from that data the subset that pertained to the restricted dates of the Fall 2018 semester. Further data cleaning was required to restrict the subset to activity in fully online classes and data that was generated exclusively by faculty activity. Once that subset was determined it was important to demonstrate reliability and validity in order to assess the quality of the proposed study. Reliability and validity are indicators of how well a test or technique measures something. Reliability is associated with the consistency of a measure while validity is about a measure's accuracy (Huck, 2008).

The reliability of the Canvas Data was determined through a random sampling of online classes in the dataset. One class was randomly selected from each of the academic schools.
The instructor of each class was contacted and their participation in data verification was requested and secured (K. Brown-Herbst, personal communication, September 22, 2019) (see Appendix B). Prior to engaging the instructor, the Canvas Data tables respective to each class were analyzed to quantify the inclusion of the Canvas-native tools in the class. This analysis was shared with the instructor. In three cases this occurred in a face-to-face meeting while the instructor was logged into Canvas. Analysis of the fourth class occurred remotely.

The comparison of the information discerned by the researcher from the Canvas Data fact and dimension tables with the elements of the actual classes as reported by the faculty revealed that these data are reliable. Discrepancies between the interpretations of the researcher and the actual activity in each class were assumed to be misinterpretations of the Canvas Data variables by the researcher. Such discrepancies prompted reexamination of the Canvas Data until interpretation of all variables aligned with the actual class activity as defined by the faculty. Without exception, in each of the four classes sampled the final count of the Canvas-native tools interpreted from the Canvas Data was an exact match to the appearance of the tools in the class shell. This parity was confirmed via email correspondence with the instructor of each of the sampled courses (K. Brown-Herbst, personal communication, September 22, 2019) (see Appendix B).

Reliable measures are said to be those that produce consistent results over time, among different items and/or among different participants (Huck, 2008). The Canvas Data is consistent for each of these considerations, quantifying the different Canvas-native tools through the duration of a semester in a multitude of different classes. The assessment and confirmation of reliability is a condition upon which validity must be established; data must be reliable to be valid (Huck, 2008). The endorsement by the faculty that Canvas Data presented an accurate reflection of their course activity is a demonstration of the validity of these data and of the process through which the researcher has examined it (K. Brown-Herbst, personal communication, September 22, 2019) (see Appendix B).

This project sought to define the first semester of Canvas LMS online faculty use through descriptive analytics in order to answer the question, "What happened?" Specifically, the project sought to determine the level of activity recorded in the LMS by online faculty in the Fall 2018 semester and the level of use of Canvas-native tools in online classes during the same period. The Canvas Data flat files pertinent to these questions have provided data with which these research questions were answered.

Limitations

Limitations to this project that are beyond the control of the researcher do exist:

• The Canvas data reflects all LMS activity including any that is accidental or exploratory in nature. Identifying only the intentional activity is not possible in the scope of this project and all activity was considered in the aggregated results.

• The technology required to access the whole of Canvas Data was not available to the researcher. The voluminous nature of the Data dictated cloud-based services and/or direct server access in order to view and/or analyze the data in its entirety. Efficiencies in reducing the Data to those tables relevant to this research were achieved through partnership with Canvas.
Secure file transfer from Canvas allowed for on-site data analysis for this research.

• Generalizability of the results will be limited. This project sought to understand the LMS usage by online faculty at a single institution in order to assist that specific institution in future planning.

Results

Research question 1 sought to understand the level of online faculty activity in the LMS during the first semester of implementation at HPCC. Following identification of the classes to be included in the study as well as the respective faculty, the Canvas requests table was used to determine the number of days an instructor was active in a class. This project aimed to quantify the aggregate information regarding the faculty activity in the LMS and the results are presented as such.

The faculty at HPCC have daily opportunities to be present in the online classes to which they are assigned. Being "present" in an online class for purposes of this study meant the faculty member was logged into Canvas and had accessed one or more of the Canvas tools at least once on a day within the calendar block. The longest calendar block in any semester at the College is the A16 block, followed by the B14 and B12 blocks, and then the A8 and B8 blocks. Online classes at HPCC fall under the same academic calendar as classes held on the physical campuses of the College. As such, when the College is closed for holidays that closed status pertains to all locations of HPCC, including the College's virtual campus. However, the "always on" nature of the LMS translates to ready access via the Internet regardless of the accessibility of the College's physical space. This access was evident in the examination of the online faculty activity as reported in the Canvas Data.

Table 4 outlines the number of calendar days in each of the blocks in the HPCC Academic Calendar during the Fall 2018 semester. The table further outlines the total calendar days per block when HPCC was closed, as well as the number of Saturdays and Sundays (weekend days).

Table 4
Days per Calendar Block, Fall 2018

Block   Total Calendar   Scheduled College   Weekend Days   Block Total      Block Total Less Closures
        Days in Block    Closures            in Block       Less Closures    and Weekend Days
A16          110                4                 26             106                    80
B14           96                3                 24              93                    69
B12           82                3                 20              79                    59
A8            54                1                 12              53                    41
B8            54                3                 12              51                    39

Summarily, Table 4 identifies the number of scheduled days in each calendar block when the online faculty at HPCC could be considered available. For this research each calendar block was considered as the number of calendar days in the block less the scheduled College closures. Scheduled closures are communicated at HPCC via the Academic Calendar; as such they were considered days on which faculty work was not expected. Weekend days, however, cannot be systematically characterized as work or non-work days. The College does not have a procedural determination for expected effort from faculty on Saturday or Sunday. Online faculty at HPCC communicate their availability to students via the course syllabus; the examination of course syllabi was not in the scope of this project.

The analysis of all online activity in the Fall 2018 semester revealed that on average the faculty were engaged online 67.5% of the days available in a calendar block. Aggregate data from the requests table demonstrated that online faculty activity was most common in the A8 block with engagement on 71.1% of the days, while engagement was at its lowest in the B8 block with faculty online 63.1% of the days (Figure 2).
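A minimal pandas sketch of the presence calculation summarized above follows. The consolidated access file and the course-to-block lookup are hypothetical, and the available days per block are taken from Table 4 (calendar days less scheduled closures).

    import pandas as pd

    # Consolidated teacher_access extract and a course-to-block lookup
    # built from the semester schedule; both file names are placeholders.
    teacher_access = pd.read_csv("teacher_access_all.csv", parse_dates=["Timestamp"])
    course_blocks = pd.read_csv("course_blocks.csv")  # columns: Course Name, Block

    # Available days per block, from Table 4.
    available_days = {"A16": 106, "B14": 93, "B12": 79, "A8": 53, "B8": 51}

    # A faculty member is "present" on a day if at least one request occurred.
    teacher_access["day"] = teacher_access["Timestamp"].dt.date
    presence = (
        teacher_access.groupby(["Course Name", "Teacher Name"])["day"]
        .nunique()
        .rename("days_present")
        .reset_index()
        .merge(course_blocks, on="Course Name")
    )
    presence["pct_of_available"] = presence.apply(
        lambda row: row["days_present"] / available_days[row["Block"]], axis=1
    )

    # Average presence per calendar block, mirroring Figure 2.
    print(presence.groupby("Block")["pct_of_available"].mean())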
Figure 2
Average Online Faculty Engagement per Calendar Block

[Figure: percentage of available days online for each calendar block (A16, B14, B12, A8, B8); y-axis: % of Available Days Online.]

There is limited application for data aggregated at the institutional level, particularly if an institution has not defined an expectation for faculty engagement as defined by measurable activity in the LMS. Similar analysis of the requests table data per academic school revealed school-based averages generated by the activity among the faculty of associated disciplines. The online courses pertinent to this study were organized according to the academic school in which the assigned faculty teaches. Seven of the participant courses were categorized as Academic Affairs (AA) offerings, indicative of the multiple disciplines these courses served. The distribution of online classes per calendar block is outlined in Table 5.

Table 5
Calendar Block Course Distribution per School

            Classes per Calendar Block          Total Online   % of Total Online
School      A16   B14   B12   A8   B8           Classes        Classes Offered
AA          4     1     1     —    1            7              5.0%
A&H         23    5     4     2    5            39             28.1%
BATS        13    1     2     3    6            25             18.0%
HSW         11    8     2     —    2            23             16.5%
M&S         19    4     12    4    6            45             32.4%
TOTAL       70    19    21    9    20           139            100%

In the Fall 2018 semester the calendar block with the greatest number of classes was A16, which had 70 offerings. The School of Arts & Humanities had the greatest portion of A16 classes with 23, followed closely by the School of Math & Sciences with 19. The calendar block with the fewest offerings (9 classes) was the A8 block, with courses delivered only from A&H, BATS, and M&S. The analysis of the impact of online faculty engagement on student success was beyond the scope of this study. However, presuming comparable enrollment across the online classes at the College, the categorization of the Fall 2018 online classes as shown in Table 5 allows for some generalizations. For example, it can be presumed that because the School of Math & Sciences hosted the greatest percentage of the online offerings (32.4%), the online faculty teaching in M&S are likely to have impacted the largest portion of the College’s online students. Similarly, the faculty in A&H impacted the second largest portion, followed by BATS and HSW, respectively. The impact of the Academic Affairs (AA) offerings, in terms of course volume, was fairly limited.

Further disaggregation of the requests table data demonstrated the levels of online faculty engagement within the different academic areas at HPCC (Figure 3). Though the calendar blocks are institutionally defined and therefore consistent throughout the College, there were notable differences in the average level of online faculty attendance when the classes were considered per academic school. Figure 3 demonstrates that in the A16 calendar block, the average number of days online was greater among the faculty in BATS (75.5%) than it was in any other school. The BATS faculty were also online more days on average than their colleagues in other schools in the B14 (81.7%) and A8 (77.4%) blocks. During the B12 block the faculty teaching in Academic Affairs (AA) were online on average 79.7% of the days in the block, and in the B8 block the HSW faculty were online on average 80.4% of the available days, nearly 20 percentage points more than their colleagues in other schools. Figure 3 also demonstrates the AA faculty averaged the fewest days online in three calendar blocks: A16, B14, and B8.
The least present school in the B12 block was HSW, while the least present in the A8 block was the School of Arts & Humanities (A&H).

Figure 3
Average Online Faculty Engagement per Academic Division

[Figure: percentage of available days online per calendar block (A16, B14, B12, A8, B8), plotted separately for AA, A&H, BATS, HSW, and M&S; y-axis: % of Available Days Online.]

A final consideration of the requests table data demonstrated the faculty activity throughout the course of a calendar block. Figure 4 presents the ebb and flow of presence in the LMS per online faculty group as determined by academic area. This representation does not discriminate between calendar blocks; it simply outlines the average number of hours an online faculty member was active in the LMS per calendar day, per week. This visualization also calls attention to the limitations inherent in defining faculty engagement from a purely quantitative lens limited to the LMS metrics applied to this research. For example, Figure 4 indicates that the online faculty in the School of BATS averaged more than 16 hours and 48 minutes in class per day during Week 12 of the Fall 2018 semester. This level of engagement would be extraordinary. It is more likely a reflection of the login/logout activity and an institutional setting that does not time out a session in the LMS than it is a realistic indication of faculty activity. This conclusion is supported by the data presented in Table 6.

Figure 4
Average Online Faculty Engaged Time per Day

[Figure: average time online per day per week (h:mm) across the semester weeks, plotted separately for AA, A&H, BATS, HSW, and M&S.]

Table 6 summarizes the average daily online faculty activity per calendar day for each week in the Fall 2018 semester at HPCC. The calculation does not include days upon which the College was closed; weekend days, however, were considered in establishing the average for all weeks. As is visualized in Figure 4, it is improbable that this project’s consideration of the Canvas Data offers a complete indication of the online activity in the LMS. These data were likely influenced by a faculty member’s use of multiple browser windows. It is also likely that a generous time-out setting in the HPCC Canvas instance maintained the Canvas login for extended periods of time irrespective of LMS activity by an individual. Table 6 demonstrates a daily average greater than 18 hours (BATS, Week 12) and several weeks in which all faculty were online daily for more than a typical 8-hour work day. The per-day average across all online faculty at the College during the Fall 2018 semester, as examined through the Canvas Data, was 8 hours and 56 minutes.

Table 6
Average Daily Engagement per Week (h:mm), Fall 2018

                          Academic Areas                    All Faculty,
Semester Week   AA      A&H     BATS    HSW     M&S         Weekly Average
1               9:43    8:37    11:24   10:43   9:12        9:56
2               6:32    6:16    10:14   11:25   8:11        8:32
3               7:22    8:22    8:18    10:14   6:16        8:06
4               14:13   10:19   11:03   9:43    8:32        10:46
5               1:12    9:01    10:00   5:03    5:44        6:12
6               7:23    8:42    13:03   9:45    10:06       9:48
7               6:54    9:48    12:28   10:53   9:07        9:50
8               9:14    10:32   13:07   9:49    9:13        10:23
9               3:18    12:58   10:36   6:46    8:40        8:28
10              5:17    7:52    10:56   8:19    9:02        8:17
11              13:31   10:30   11:01   8:38    10:02       10:44
12              9:21    7:39    18:14   6:44    10:08       10:25
13              3:55    7:50    10:47   6:52    7:30        7:23
14              7:08    8:41    8:04    7:29    7:06        7:41
15              5:36    5:07    5:42    5:20    6:38        5:41
16              10:39   10:46   10:09   10:08   11:48       10:42
AVERAGE         7:35    8:56    10:57   8:37    8:35        8:56

The average online faculty activity for each day of the week is presented in Figure 5.
The figure demonstrates that the overall faculty activity, when aggregated across all academic areas at HPCC, was greatest during the typical workweek defined as Monday through Friday. Numerically, the activity across all schools was highest on Tuesday, when faculty were in the LMS on average for 10 hours and 12 minutes. The average activity on the remaining weekdays ranged from 10 hours and 8 minutes on Monday to 9 hours and 27 minutes on Friday. The average online time on Saturday was considerably less at 7 hours and 22 minutes, and the average fell to 6 hours and 18 minutes on Sunday. This representation and the accompanying calculations were impacted by the same limitations previously expressed. The scope of the analysis could not discern intentional activity from accidental activity, nor was it possible to understand from these data when faculty were performing tasks online versus simply being logged into the LMS.

Figure 5
Average Online Faculty Activity per Day

[Figure: average time online (h:mm) for each day of the week, Sunday through Saturday, plotted separately for AA, A&H, BATS, HSW, and M&S.]

Research question 2 sought to examine the use of Canvas’ native tools by the online faculty at HPCC during the Fall 2018 semester. The Canvas LMS places numerous tools in the hands of users. The availability of these tools generally rests with a system administrator; the implementation of tools in a single class rests with the assigned faculty.

In addition to the learning tools inherent in the Canvas LMS, the Canvas Data revealed that during the Fall 2018 semester there were 42 unique LTI tools deployed in one or more of the studied courses across the HPCC Canvas instance. Among these tools were publishers’ e-textbook supplements, web-based homework and tutorial packages that provided supplemental skills development, and online media repositories such as YouTube. The deployment of an LTI in any course at the College represents the selection of the tool by the assigned faculty and the subsequent implementation by an LMS Administrator. While LTI tools are developed for compatibility with the LMS, they are external to the instance and are generally LMS agnostic. The usage data of an LTI is considered proprietary. The simple existence of an LTI across the institution is the extent to which these tools were reflected in the LMS’s empirical data obtained for this study. The LTI tools’ application was not examined further in this project.

Research question 2 examined the prevalence of those Canvas-native tools associated with student-to-student and/or student-to-faculty interactions in an online class. While individual LMSs have proprietary tools and unique features, tools of this kind are commonly found in most. The tools that support interactions between faculty and students and between students and their peers create remote opportunities for students to engage with the course content in a structured format. The Canvas LMS architecture provides several tools that support this interaction. Research question 2 sought to quantify the inclusion of the Canvas-native tools in the online classes offered at HPCC during the Fall 2018 semester. The examined tools and their functionality and purpose are outlined in Table 7 (Instructure, 2017).
Table 7
Canvas-native Tools Used in the Fall 2018 Semester

Announcements
  Functionality: Broadcasting information to all members of a class.
  Purpose: Sharing resources, posting reminders, clarifying materials, addressing time-bound alterations to class progression.

Assignments
  Functionality: Submission of student work in varied formats.
  Purpose: Organizing students’ artifacts, assessing students’ progress on expected outcomes, establishing the association between demonstrated learning and the gradebook.

Discussions
  Functionality: Collaborative development of conversation around a central theme or topic.
  Purpose: Structuring the exploration of class questions, refining complex ideas, exchanging resources, debating.

Modules
  Functionality: Organization of content.
  Purpose: Creating the intentional flow through the curricular elements of a class, associating activities and resources with the outcomes they serve.

Quizzes
  Functionality: Formative and summative assessment of student learning through the presentation of varied question types.
  Purpose: Evaluating student comprehension, soliciting student opinions via an ungraded survey.

The examination of the Canvas Data revealed Discussions were used at least once in 95.1% of the online classes at HPCC during the Fall 2018 semester. The Canvas Data table addressing the frequency of implementation for the Discussions tool included discussions in three different states: active, deleted, and unpublished. Those discussions identified as either deleted or unpublished were removed from the data prior to analysis. Tools in either of these states were not impactful to the course, as they could not invite student engagement or communication with the instructor. Where the same states were included in other Canvas Data tables associated with this project, those data points were also excluded.

Discussions was the Canvas-native tool used in the greatest number of classes, followed by Modules (92.3%), Announcements (85.4%), Assignments (83.4%), and Quizzes (74.5%). Figure 6 represents the application of the tools across each academic division in the Fall 2018 semester. When disaggregated to this level the presence of each tool within the division is more clearly defined. Discussions was used in more classes than any other tool in Academic Affairs (AA) and BATS. In the School of Arts & Humanities (A&H), however, Discussions and Modules were both present in 97.4% of the division’s classes. In Health Science & Wellness (HSW) Announcements were present in more classes than any other tool. In Math & Sciences (M&S) the tool most consistently present was Modules.

Figure 6
Tools Activation per Academic Division

[Figure: percentage of classes where each Canvas-native tool (Announcements, Assignments, Discussions, Modules, Quizzes) was active, plotted separately for AA, A&H, BATS, HSW, and M&S.]

Table 8 outlines overall tool use during the Fall 2018 semester represented by each of the Canvas-native tools studied. When the tools’ use was quantified across the College, the most commonly applied Canvas-native tool was Modules, followed by Announcements, Discussions, Assignments, and Quizzes, respectively. This quantification is indicative of tool count across the online classes at the institution only. It did not take into consideration any of the impactful factors of calendar block or academic discipline. A more granular representation of quantity per academic division is provided in Figure 7.
Table 8
Canvas-native Tools Application at HPCC

Tool             n       Percentage of Tools Applied
Announcements    1,411   22.6%
Assignments      974     15.6%
Discussions      1,284   20.6%
Modules          1,624   26.0%
Quizzes          945     15.1%

Each division at HPCC comprises faculty from numerous academic disciplines. A school’s disciplines are related to some degree, and the courses associated with the school’s disciplines are also considered components of that school. While the tools in an LMS are developed for application in all disciplines, the quantified use of each tool at HPCC varied per discipline and, in this case, per academic division. The Fall 2018 implementation of the Canvas-native tools in each academic division is depicted in Figure 7.

Figure 7
Tools Count per Academic Division

[Figure: total active tool count for each Canvas-native tool (Announcements, Assignments, Discussions, Modules, Quizzes), plotted separately for AA, A&H, BATS, HSW, and M&S; y-axis: Total Active Tool Count.]

As was represented in Table 8, the Modules tool was the most often deployed tool at the College. The use of this tool outnumbered the use of all other Canvas-native tools in four of the five academic divisions in the Fall 2018 semester: A&H, BATS, HSW, and M&S. In Academic Affairs (AA) the Announcements tool was applied more than two times as often as Modules (130 occurrences of Announcements as opposed to only 56 Modules across the division). Figure 7 also identifies the tools that were applied with the least frequency in each of the academic areas. In summary, Quizzes was the least used tool in Academic Affairs, Arts & Humanities, and BATS. In Health Science & Wellness the Canvas-native tool with the lowest frequency in Fall 2018 was Discussions, while in Math & Sciences the tool implemented at the lowest frequency was Assignments.

A summarized overview of the Fall 2018 implementation of the Canvas-native tools throughout the academic divisions is outlined in Table 9. The table considers each of the Canvas-native tools and provides an analysis of the extent to which an academic division used each tool when compared with use of the same tool in the other divisions. It offers an indication of where the depth of the faculty experience with each tool existed in the semester of interest.

Table 9
Academic Divisions’ Portion of Overall Tool Use

School   Announcements   Assignments   Discussions   Modules   Quizzes
AA       9.2%            10.1%         5.7%          3.4%      1.8%
A&H      28.8%           29.9%         28.9%         27.4%     21.1%
BATS     11.3%           15.5%         17.7%         21.2%     9.6%
HSW      20.1%           30.2%         14.0%         18.4%     20.8%
M&S      30.5%           14.4%         33.7%         29.6%     46.7%

Table 9 demonstrates the greatest portion of the faculty experience with four of the five tools studied existed in the School of Math & Sciences during the first semester of Canvas availability at HPCC. Given that 45 of the College’s online classes during that semester were associated with M&S, the larger portion of tools application when compared with other divisions is not surprising. Table 9 also shows the M&S faculty accounted for less than 15% of the Assignments tool usage. This is less than half the usage attributed to the faculty in HSW, a division with roughly half as many online classes (23 classes as compared to 45). The Academic Affairs division hosted the fewest online classes in the Fall 2018 semester. The apportioned use of the Canvas-native tools attributed to this division was understandably smaller than the use of the tools in any of the other divisions.
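As an illustration, the figures in Tables 8 and 9 reduce to two aggregation steps over a tidy record of active tool instances. The sketch below assumes a hypothetical extract with one row per active (not deleted or unpublished) tool instance; the file name and the column names (course_id, division, tool) are assumptions, not part of the Canvas Data schema.

    # Sketch: Table 8 (tool counts and shares) and Table 9 (divisions' shares per tool).
    import pandas as pd

    tools = pd.read_csv("active_tools_fall2018.csv")  # hypothetical tidy extract

    # Table 8: count of each Canvas-native tool and its share of all tool use.
    table8 = tools["tool"].value_counts().to_frame(name="n")
    table8["pct_of_tools_applied"] = table8["n"] / table8["n"].sum() * 100

    # Table 9: each division's portion of overall use, per tool.
    # normalize="columns" makes each tool's column sum to 100%.
    table9 = pd.crosstab(tools["division"], tools["tool"], normalize="columns") * 100

    print(table8.round(1))
    print(table9.round(1))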
Discussion

This study was undertaken to establish a measurement of the online faculty activity within the LMS at High Plains Community College in the western United States. By examining the use of the LMS in the first semester of availability, the study sought to determine the baseline level of activity as an opportunity for the institution to gauge the degree to which the LMS was being used in online instruction. It was proposed that this baseline was foundational to the institution’s knowing the opportunities for meaningful learning afforded students in the online environment. While learning is a shared proposition between an institution and the students it serves, it is incumbent upon any college to verify that opportunities for meaningful learning exist (Coates, 2005). This study further proposed that an examination of the empirical data generated by faculty activity in the LMS would clarify for HPCC what took place online in the first semester of LMS availability and would provide an opportunity for the College to be impactful with interventions in subsequent semesters.

Understanding the level of online faculty activity in the LMS provides an objective lens for the College in regards to institutional expectations of faculty presence in an online class. Previous research has demonstrated that the faculty use of the LMS directly impacts student learning outcomes (Rhode et al., 2017; Walker, Lindner, Murphrey, & Dooley, 2016; Williams & Whiting, 2016; Zanjani, Edwards, Nykvist, & Geva, 2016). The Canvas Data revealed that HPCC’s online faculty were available an average of 67.5% of the days for which a class was scheduled. With this metric established, the College is able to assess the online faculty presence and determine the parameters it must set to assure faculty availability meets the needs of the online student population. This average availability marker was determined at the institution level. It was impacted by the block scheduling at HPCC and the variable configuration of classes of 16, 14, 12, and eight weeks’ duration.

Understanding the level of online faculty activity for the calendar blocks also provides baseline information from which the College can examine scheduling practices and their alignment with the faculty work. For example, in the B8 calendar block the online faculty in the Schools of Math & Sciences and Health Science & Wellness were active in the LMS more than 80% of the days in the block, notably greater than the online presence of their colleagues in the other academic divisions. The B8 trend identified at HPCC suggests that teaching online in the compressed block poses an increased demand on the faculty. This may be an important consideration when the shorter calendar blocks are configured and the faculty assignments are made.

Dawson, McWilliam, and Tan (2008) found differences in faculty usage trends per academic discipline in their analysis of LMS empirical data. Additional patterns of online faculty activity at HPCC were revealed in this study and can serve institutional conversations as the College considers efficacy in its online environment. For example, the Canvas Data demonstrated that the online faculty in the School of Business, Agriculture, & Technical Studies averaged more than two additional hours per day online compared with their colleagues in other academic divisions of the College. These results can serve as the foundation in discussions regarding course offerings and their alignment with online instructional methods.
Considerations for additional faculty development in using the LMS can also emerge from the realization that the time attributed to teaching online in BATS is exceptional when compared to the time online in other academic divisions.

During the Fall 2018 semester the online faculty at HPCC were most commonly available in the LMS on Tuesdays and only slightly less so on Mondays. Conversely, the faculty were least engaged online on Sunday and Saturday. The online faculty activity on weekends averaged between two and four hours less per day than on weekdays. The availability of the faculty is an important clarification for students and is particularly so when instruction exists in the remote LMS environment. Detailing faculty availability in any class is normally a syllabus component, and as such this metric may not be of value across the College. However, understanding the daily activity averages and the limited access to instructors on the weekends may be of import in conversations with students in the advising and/or academic support services arenas. The “always on” nature of the Internet extends naturally to the LMS, and perceptions of 24/7 availability abound. While technologically that perception is accurate, this study demonstrated that the availability of the LMS does not equate to the availability of the instructor.

In addition to quantifying the faculty activity level in the LMS, this study further sought to define what activities HPCC’s online faculty engaged in when they were present in online classes during the Fall 2018 semester. This analysis of the Canvas Data focused on five tools embedded in the LMS in order to determine the degree to which they were implemented in the first semester of Canvas availability. Each of the tools supports faculty-to-student and student-to-content interaction (Hodges & Grant, 2015; Rhode et al., 2017; Walker, Lindner, Murphrey, & Dooley, 2016). They are commonly found in most LMSs, and with the exception of Canvas’ Modules tool, all of the studied tools are characteristic of face-to-face instruction as well as being common elements in the LMS.

At HPCC 95% of the online classes contained one or more Discussions. This tool was present in the greatest number of classes at the College, though it was used less frequently than Modules and Announcements when the institution was examined as a whole. Modules, Canvas’ organizational tool through which class components can be grouped, was implemented in every class studied. The Canvas architecture requires at least one Module in a class; the average number of modules per class in the dataset was 12. The inclusion of Modules in a class is an indication of intentional organization (Hodges & Grant, 2015). The inclusion of Discussions and Announcements represents the establishment of communication channels between and among the students and the faculty. The online faculty at the College implemented these tools across all academic divisions and in all calendar blocks during the first semester of Canvas availability. The quantity of each tool in each class has been determined; familiarity with the tools has been established. This creates the opportunity for the College to determine the quality with which the tools are being used, to identify the faculty expertise, and to further enhance the online classes through the pursuit of valued class design standards.

It is through the Assignments tool in Canvas that student work is submitted.
The tool allows for distinct assignment design by the faculty such that student submissions can take varied forms including an online upload, media recording, media upload, or an online text entry. Assignments is one of the most versatile of the LMS tools, allowing for faculty annotations on any submission, student revision and resubmission, and a conversational channel that wraps around this process. The Canvas Data revealed that the Assignments tool was widely used in all academic divisions at HPCC with the exception of Math & Sciences. Every class attributed to Academic Affairs (AA) implemented Assignments, and more than 79% of the classes in A&H, BATS, and HSW included active Assignments. However, in Math & Sciences the Assignments tool was the least frequently applied of the engagement tools used in the studied semester, present in only 62% of the classes associated with that division. This measurement may be an important element in several conversations at HPCC as the Canvas Data is considered. For example, is the absence of Assignments in the online classes impactful to student success? Is there an external tool being deployed in lieu of the Assignments tool? If so, is it serving the same purpose and is it doing so with success? The baseline determined what happened in the semester; it is intended to invite subsequent questions.

Formative assessment opportunities in the LMS are supported by the Quizzes tool, which was integrated in nearly 75% of the online classes in the Fall 2018 semester. The Canvas Quizzes tool supports quizzing through multiple question types, surveying, and exams. In addition to the student submission, Quizzes also allows for faculty-to-student interaction through preprogrammed feedback for correct and incorrect responses as well as faculty narrative response.

As was demonstrated with each of the tools studied, the Canvas Data reflected an overall awareness of the Canvas-native tools among HPCC’s online faculty. The faculty have implemented the tools in all academic divisions and in all calendar blocks. Previous research suggested faculty implementation of LMS tools increases through peer feedback, professional development, student pressures, and increased familiarity with a tool (Jaggars, 2018; Morgan, 2003; Zanjani et al., 2016). HPCC is positioned to identify those areas where an increase in implementation rates is desired and to encourage that increase through these researched approaches.

The purpose of this study was to quantify the level of online faculty activity in the LMS during the first semester of availability. The analysis of the Canvas Data provided metrics that define faculty activity respective of calendar block, semester week, and academic division. These data further defined the level to which the Canvas-native tools were applied in HPCC’s online environment in the studied semester. Understanding the level of use of instructional tools can provide an indication of the level of adoption across an institution (Dawson, McWilliam, & Tan, 2008; Morgan, 2003; Rhode et al., 2017).

There are limitations to this study. First, the acquisition of a Canvas Data extract is subject to technology specifications that are beyond the capacity of most individuals and many institutions. The dataset is voluminous, and the transmission of the immense file is not easily achieved. Consequently, the collection of the pertinent data and the subsequent verification processes were arduous and time consuming.
Future studies would best be completed with iterative data collection and analysis to avoid the need for a bulk download. Additionally, the Canvas Data provides only non-immediate feedback on activity in the LMS. While it is applicable to the development of historical benchmarks, it does not provide actionable information on a regular and ongoing basis.

The generalizability of the results is also impacted by the single location of the research study, the distinct composition of the academic divisions of the institution, and the levels to which data was aggregated in the research. While the application of the LMS activity data in the determination of a baseline understanding is an approach that can be replicated, the results of any such study would be contextualized for the respective institution. The school and/or college structure of any institution is localized; results reported at the academic division level in this study cannot be associated with academic disciplines or clusters of such.

Finally, in the analysis of the LMS activity the data was limited to disaggregation at the academic division level. This approach associated one online faculty member’s LMS activity with that of a colleague defined solely by organizational structure at the College. This limits the ability to draw conclusions about the LMS activity associated with academic discipline, course level, and the experience level of the faculty participants. However, the limitations do shape ideas for further research. The LMS activity data provides the objective foundation from which future discussion associated with the use of the LMS can take shape. The data does not call attention to individual faculty, nor does it dissect activity in a single course or in a single discipline.

This study examined only what teachers did. There was no intention to equate the quantity of time online or the quantity of tools usage with standards for quality online teaching. Rather, this study was an examination of the use of an institutional resource, the LMS, in a subset of the College’s academic offerings. The applicability of the results is limited to HPCC and is pertinent to LMS usage in a single semester only. While the results provided a measure of what happened in its initial deployment of Canvas, the value is less in the measure and more directly in the subsequent direction that it yields.

Implications. During the completion of this research many colleges and universities around the world faced an immediate need to transition all instruction to a virtual environment. The COVID-19 pandemic of 2020 forced institutions to shutter their campuses for public health reasons and to establish alternative means to deliver on their promise of academic opportunity to the students they were serving. HPCC was challenged to convert all instructional modalities to remote delivery through the LMS for an undetermined length of time. This mandate to teach remotely came to the College at a time when the faculty were on vacation. The nature of the pandemic prevented face-to-face contact and necessitated the remote delivery of professional development in order to assist the faculty in the transition of their courses. Ultimately, in the final eight weeks of the Spring 2020 semester all instruction at HPCC was online and all instructors were online faculty. Midway through that period the College determined the Summer 2020 semester would be similarly characterized.
The Canvas Data applied in this research will provide HPCC with an objective platform from which to understand the institutional response as it pertained to virtual instruction during the COVID-19 pandemic. The data associated with faculty activity can reveal tools’ application, time required for course development and instructional design, faculty availability, and faculty engagement. In addition to examining the faculty activity in this context, further examination of the Canvas Data can associate the resultant student responses with faculty activity and further examine the impact of both on the academic success of engaged students. Such application of the LMS activity data extends the research beyond baseline determination to a quantified understanding of the impact of the faculty response to the extreme circumstances surrounding the institution’s need for change. Such an understanding can be impactful for the institution as the pandemic recedes and the College establishes instructional practices from its own lessons learned.

References

Adams Becker, S., Brown, M., Cummins, M., & Diaz, V. (2017). NMC horizon report: 2017 higher education edition. Retrieved from Educause website: https://library.educause.edu/resources/2017/2/2017-horizon-report

Allen, I. E., & Seaman, J. (2017). Digital learning compass: Distance education enrollment report 2017. Retrieved from Babson Survey Research Group website: https://www.onlinelearningsurvey.com/highered.html

Bangert, A. W. (2004). The seven principles of good practice: A framework for evaluating online teaching. Internet and Higher Education, 7(3), 217-232. doi:10.1016/j.iheduc.2004.06.003

Buchanan, T., Sainter, P., & Saunders, G. (2013). Factors affecting faculty use of learning technologies: Implications for models of technology adoption. Journal of Computing in Higher Education, 25(1), 1-11. doi:10.1007/s12528-013-9066-6

Canvas data portal version 4.2.4. (2015, October 22). Retrieved from https://portal.inshosteddata.com/docs

Chickering, A., & Ehrmann, S. (1996). Implementing the seven principles: Technology as lever. AAHE Bulletin, 49(2), 3-6.

Chickering, A., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin, March 1987, 3-7.

Chickering, A., & Gamson, Z. (1999). Development and adaptations of the seven principles for good practice in undergraduate education. New Directions for Teaching and Learning, 1999(80), 75-81.

Coates, H. (2005). The value of student engagement for higher education quality assurance. Quality in Higher Education, 11(1), 25-36. doi:10.1080/13538320500074915

Dahlstrom, E., & Brooks, D. C. (2014). ECAR study of faculty and information technology. Research report. Louisville, CO: ECAR. Retrieved from http://www.educause.edu/ecar

Dawson, S., McWilliam, E., & Tan, J. (2008). Teaching smarter: How mining ICT data can inform and improve learning and teaching practice [Address]. Annual Conference of the Australasian Society for Computers in Learning in Tertiary Education, Melbourne, Australia. http://ro.ouw.edu.au/medpapers/141

Ertmer, P. A., & Ottenbreit-Leftwich, A. (2010). Teacher technology change: How knowledge, confidence, beliefs, and culture intersect. Journal of Research on Technology in Education, 42(3), 255-284. Retrieved from http://proxy.lccc.wy.edu/login?url=https://proxy.lccc.wy.edu:2144/docview/742874943?accountid=29648

Falvo, D. A., & Johnson, B. F. (2007).
The use of learning management systems in the United States. TechTrends, 51(2), 40-45. doi:10.1007/s11528-007-0025-9

Fathema, N., Shannon, D., & Ross, M. (2015). Expanding the technology acceptance model (TAM) to examine faculty use of learning management systems (LMSs) in higher education institutions. MERLOT Journal of Online Learning and Teaching, 11(2), 210-232.

Hathaway, K. L. (2014). An application of the seven principles of good practice to online courses. Research in Higher Education Journal, 22, 1-12. Retrieved from http://proxy.lccc.wy.edu/login?url=https://proxy.lccc.wy.edu:2144/docview/1720062840?accountid=29648

Hodges, C., & Grant, M. (2015, April 16-17). Theories to support you: Purposeful use of learning management system features [Address]. Global Learn 2015, Berlin, Germany. https://www.researchgate.net/publicatin/27709610

Huck, S. W. (2008). Reading statistics and research. Boston, MA: Pearson Education, Inc.

Instructure. (2017). Canvas instructor guide – Table of contents. Retrieved from https://community.canvaslms.com/docs/DOC-10460-canvas-instructor-guide-table-of-contents

Instructure. (2015). Getting started with Canvas Data [Video file]. Retrieved from https://community.canvaslms.com/videos/1916

Jaggars, S. S. (2018). Online learning in the community college context. In Moore, M. G., & Diehl, W. C. (Eds.). Handbook of distance education (pp. 445-455). Retrieved from https://ebookcentral.proquest.com

Jaschik, S., & Lederman, D. (2018). 2018 survey of faculty attitudes on technology. Inside Higher Education. Retrieved from https://www.insidehighered.com/booklet/2018-survey-faculty-attitudes-technology

Lai, A., & Savage, P. (2013). Learning management systems and principles of good teaching: Instructor and student perspectives. Canadian Journal of Learning and Technology, 39(3), 1-21. Retrieved from http://proxy.lccc.wy.edu/login?url=https://proxy.lccc.wy.edu:2144/docview/1651857815?accountid=29648

Laramie County Community College. (2018). 2018 Fall Section Enrollment Report FINAL. Retrieved from https://lcccwy.sharepoint.com/Offices/InstitutionalResearch/SitePages/Home.aspx

Laramie County Community College. (2013). Course Enrollment Management Procedure. Retrieved from http://policies.lccc.wy.edu/Files/Procedure%202.13P%20Course%20Enrollment%20Management%20Procedure-CCjun14-13.pdf

Laramie County Community College. (2016). Mission, Vision & Values. Retrieved from http://www.lccc.wy.edu/about/ourFuture/mission.aspx

Lee, Y., Kozar, K., & Larsen, K. (2003). The technology acceptance model: Past, present, and future. Communications of the Association for Information Systems, 12(1), 752-780.

Lonn, S., & Teasley, S. D. (2009). Saving time or innovating practice: Investigating perceptions and uses of learning management systems. Computers & Education, 53(3), 686-694. doi:10.1016/j.compedu.2009.04.008

Magda, A. J. (2019). Online learning at public universities: Recruiting, orienting, and supporting online faculty. Louisville, KY: The Learning House, Inc.

Mandernach, B. J., & Palese, K. (2016). Teaching analytics: Efficient strategies to support effective online instruction. Paper presented at the Higher Learning Commission’s 2016 Annual Conference, Chicago, IL.

Mars, M. M., & Ginter, M. B. (2007). Connecting organizational environments with the instructional technology practices of community college faculty. Community College Review, 34(4), 324.

Morgan, G. (2003). Faculty use of course management systems. Research report. Boulder, CO: ECAR.
Retrieved from https://library.educause.edu/resources/2003/7/faculty-use-of-course-management-systems

Odhiambo, F. O. (2013, May 28). Baseline studies/surveys [Web log post]. Retrieved from https://impact-evaluation.net/2013/07/02/types-of-monitoring-in-monitoring-and-evaluation-me/

Pomerantz, J., & Brooks, D. (2017). ECAR study of faculty and information technology, 2017. Research report. Louisville, CO: ECAR, October 2017.

Rhode, J., Richter, S., Gowen, P., Miller, T., & Wills, C. (2017). Understanding faculty use of the learning management system. Online Learning, 21(3), 68-86. doi:10.24059/olj.v21i3.1217

Romero, C., Ventura, S., & Garcia, E. (2008). Data mining in course management systems: Moodle case study and tutorial. Computers & Education, 51(1), 368-384. doi:10.1016/j.compedu.2007.05.016

Rucker, R., & Downey, S. (2016). Faculty technology usage resulting from institutional migration to a new learning management system. Online Journal of Distance Learning Administration, 19(1).

Ryan, T. G., Toye, M., Charron, K., & Park, G. (2012). Learning management system migration: An analysis of stakeholder perspectives. International Review of Research in Open and Distance Learning, 13(1), 220-237.

Sanga, M. W. (2016). An analysis of technological issues emanating from faculty transition to a new learning management system. Quarterly Review of Distance Education, 17(1), 11-21.

Schoonenboom, J. (2014). Using an adapted, task-level technology acceptance model to explain why instructors in higher education intend to use some learning management system tools more than others. Computers & Education, 71, 247-256. doi:10.1016/j.compedu.2013.09.016

Schroeder, R., & Cook, V. S. (2018). Needs assessment and strategic planning in distance education. In Moore, M. G., & Diehl, W. C. (Eds.). Handbook of distance education (pp. 445-455). Retrieved from https://ebookcentral.proquest.com

Seaman, J. E., Allen, I. E., & Seaman, J. (2018). Grade increase: Tracking distance education in the United States. Retrieved from Babson Survey Research Group website: https://www.onlinelearningsurvey.com/highered.html

Seaman, J. E., & Seaman, J. (2017). Digital learning compass: Distance education state almanac 2017. Retrieved from Babson Survey Research Group website: https://www.onlinelearningsurvey.com/highered.html

Siemens, G., & Long, P. (2011). Penetrating the fog: Analytics in learning and education. Educause Review, 46(5), 31-40.

Suen, L. (2005). Teaching epidemiology using WebCT: Application of the seven principles of good practice. Journal of Nursing Education, 44(3), 143-146.

Tirrell, T., & Quick, D. (2012). Chickering's seven principles of good practice: Student attrition in community college online courses. Community College Journal of Research and Practice, 36(8), 580-590.

Walker, D., Lindner, J., Murphrey, T., & Dooley, K. (2016). Learning management system usage: Perspectives from university instructors. The Quarterly Review of Distance Education, 17(2), 41-50.

Watson, W., & Watson, S. (2007). An argument for clarity: What are learning management systems, what are they not, and what should they become? TechTrends: Linking Research and Practice to Improve Learning, 51(2), 28-34. doi:10.1007/s11528-007-0023-y

Wichadee, S. (2015). Factors related to faculty members' attitude and adoption of a learning management system. Turkish Online Journal of Educational Technology - TOJET, 14(4), 53-61.

Williams, D., & Whiting, A. (2016).
Exploring the relationship between student engagement, Twitter, and a learning management system: A study of undergraduate marketing students. International Journal of Teaching and Learning in Higher Education, 28(3), 302-313.

Zanjani, N., Edwards, S., Nykvist, S., & Geva, S. (2016). LMS acceptance: The instructor role. The Asia-Pacific Education Researcher, 25(4), 519-526.

Appendix A

The Canvas Data is built on a star schema in which data is presented in a collection of fact and dimension tables. To evaluate the inclusion of the Canvas tools in the online classes at HPCC and to further describe those tools, the fact and dimension tables were joined by the foreign and primary keys of the respective tables as shown in Figure A1.

Figure A1
Canvas Data Star Schema

[Figure: diagram of the Canvas Data star schema showing fact tables joined to dimension tables by foreign and primary keys.]

For example, to evaluate the inclusion of the assignment tool in any course an analysis of the ASSIGNMENT_FACT table was conducted. This table detailed a number of measures associated with every assignment, including the assignment’s unique identifier, the course in which the assignment was used, the points possible on the assignment, and the parameters of peer review for the assignment when that feature had been attributed. The ASSIGNMENT_FACT table also contains foreign keys for several of the dimension tables in the Canvas Data. The foreign keys are the columns in the fact table that cross-reference columns in a dimension table. By joining the foreign key from ASSIGNMENT_FACT with the primary key from ASSIGNMENT_DIM, the attributes of the assignments that occurred in the HPCC Canvas instance during the studied semester were discovered, and a detailed analysis was done in Excel. The joining of the tables in this case allowed for the categorization of assignments as published, unpublished, or deleted. Deleted and unpublished assignments would not have impacted students in a class and were not included when considering the use of the LMS tools in an online class. Fact and dimension tables exist for each of the Canvas-native tools quantified in response to research question 2. These tables were joined as described above, and their analysis and results were visualized with Excel. (An illustrative sketch of this join appears following Appendix B.)

Appendix B

Faculty Participation in Canvas Data Verification

Four Fall 2018 online classes were selected randomly as participants in the verification of Canvas Data reliability and validity. The assigned faculty were introduced to the project and their participation was sought via email correspondence. Figure B1 presents a print screen image of this correspondence.

Figure B1
Requesting Faculty Participation in Data Verification

[Figure: screenshot of the email requesting faculty participation.]

Email correspondence was used to summarize the shared examination of the online class and the results of that initial conversation (Figure B2). When necessary this correspondence identified discrepancies in the researcher’s interpretation of the Canvas Data.

Figure B2
Post-Conference Confirmation of Data Interpretation

[Figure: screenshot of the follow-up email confirming the data interpretation.]

Confirmation of parity between the researcher’s interpretation of Canvas Data and the actual faculty behavior was received via email (Figure B3).

Figure B3
Participant Faculty Data Parity Confirmation

[Figure: screenshot of the faculty confirmation email.]
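As referenced in Appendix A, the following is an illustrative sketch of the fact-to-dimension join, shown in Python rather than the Excel workflow actually used in this project. The table and column names (assignment_id, id, workflow_state, course_id) follow the Canvas Data flat-file documentation but should be treated as assumptions to be checked against the schema version in use.

    # Sketch: join ASSIGNMENT_FACT to ASSIGNMENT_DIM and keep only active assignments.
    import pandas as pd

    fact = pd.read_table("assignment_fact.txt")  # foreign key: assignment_id
    dim = pd.read_table("assignment_dim.txt")    # primary key: id; includes workflow_state

    # Join the fact table's foreign key to the dimension table's primary key.
    assignments = fact.merge(dim, left_on="assignment_id", right_on="id")

    # Deleted and unpublished assignments could not have impacted students and
    # are excluded, mirroring the treatment of all tool tables in this project.
    active = assignments[~assignments["workflow_state"].isin(["deleted", "unpublished"])]

    # Count the active assignments per course, as was done in the Excel analysis.
    print(active.groupby("course_id").size())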