Introduction to Online Learning Volume 24, Issue 1

Peter Shea
University at Albany, SUNY
Editor of Online Learning

Our first issue of 2020 contains 13 articles in three sections. These papers are included in
sections on student and faculty issues and concerns, as well as a collection of other empirical
studies investigating online learning environments from a variety of theoretical and
methodological approaches.
The first section includes four papers on faculty, professional development, and online
teaching. This section begins with “Examining How Online Professional Development Impacts
Teachers’ Beliefs about Teaching Statistics” by Hollylynne Lee and Gemma Mojica of North
Carolina State University and Jennifer Lovett of Middle Tennessee State University. In this study
the authors are concerned with improving the teaching of statistics through online professional
development provided through a Massive Open Online Course (MOOC). The data is drawn from
412 participants in the MOOC who identified themselves as classroom teachers. The central
questions of the study center on identifying elements of the MOOC that trigger critical reflection
and evidence that engaging in the MOOC influenced teachers’ beliefs, perspectives, and practices
in teaching statistics. The paper identifies aspects of the MOOC that hold promise in promoting
positive change in teacher beliefs and practices.
The second paper in this section is “Facilitation Matters: Instructor Perception of
Helpfulness of Facilitation Strategies in Online Courses” by Florence Martin, Chuang Wang, and
Ayesha Sadaf of the University of North Carolina, Charlotte. In an earlier study some of these
authors found online instructors’ roles can be categorized as facilitator, course designer, course
manager, subject matter expert, and mentor. This paper seeks to flesh out the role of facilitator by
first reviewing the literature on facilitation and then presenting a result of a faculty survey. The
results, from 100 online instructors, identify what they deem the most and least helpful facilitation
strategies that were identified in the literature.
The next paper is “Social Media Learning Activities (SMLA): Implications for Design” by
Ghania Zgheib of the University of Balamand, Lebanon, and Nada Dabbagh of George Mason
University. There can be little doubt that our students are very engaged in social media use and
that many instructors are exploring its educational applications. The authors of this paper review
promising research in this area and conclude that there is consensus on the benefits of social media
use for learning. However, we need to better understand principles that can effectively guide
SMLA design going forward. The paper investigates the types of learning activities designed
through social media, the knowledge and cognitive processes they promote, and strategies
experienced faculty use to design SMLAs.
The fourth paper in this issue is “Shifting Teaching and Learning in Online Learning
Spaces: An Investigation of a Faculty Online Teaching and Learning Initiative” by Jayson
Richardson and John Eric Lingat of the University of Kentucky, Ericka Hollis of Regis College, and Mikah Pritchard of Eastern Kentucky University. This study utilizes the Diffusion of
Innovation (DOI) theory to interpret pedagogical changes that occurred as a result of professional development activities and a subsequent year-long faculty learning community, and to examine how perceptions of DOI characteristics influence the level of adoption of online/blended teaching. Results of the study indicate that participants most frequently mentioned
experiences that refer to DOI components reflecting relative advantage, compatibility, and
trialability.
The next section of this issue contains five papers broadly related to students, community,
and online learning. The first of these is “From Discussion Forums to eMeetings: Integrating High
Touch Strategies to Increase Student Engagement, Academic Performance, and Retention in Large
Online Courses” by Glenda Gay of The University of the West Indies at Cave Hill and Kristen
Betts of Drexel University. This paper utilizes an action research approach to examine strategies
to address issues that can arise in larger format online courses: student disengagement, poor
performance, and subsequent dropout. Based on data collected from more than 3,300 students over
a six-year period, results indicate that the use of the new eMeeting format, integrating online high touch strategies, is correlated with significant increases in student engagement and academic performance. Additionally, a comparison between the pre- and postintegration data revealed decreases in attrition and higher scores on the standardized final exam. Course evaluations after the introduction of these strategies also reflect increased student satisfaction with the course. These approaches appear very promising and deserve further study.
The second paper in this section is “Postgraduate Online Teaching in Healthcare: An
Analysis of Student Perspectives” by Cuisle Forde and Silvia Gallagher of Trinity College, Dublin.
The goal of this study was to understand student perspectives in online health science courses. The
main research questions explored in this study related to the expectations and concerns healthcare students have before they start a postgraduate online course, and to their perceptions and experiences during and after the course. The authors develop a set of recommendations for online educators
that can serve as a guide for online course development and facilitation for students in the
healthcare field.
The next paper is “Student Preferences for Learning Resources in a Land-based
Postgraduate Online Degree Program” by Duncan Royd Slater of Myerscough College, Lancashire
and Richard Davies of the University of Central Lancashire. This paper focuses on a newly
emerging area of online study: specialized graduate education. Options for providing learning
resources for such programs are myriad and understanding which resources and modes of delivery
(e.g., text, audio, and video) are deemed useful is an important component of ensuring
effectiveness. The study identified three resources currently used in the program that were
significantly more favored than the others: online lectures, academic papers, and tutor’s
viewpoints. Other resources were rated lower. A number of responses showed there was a clear
focus on relevant content over medium of delivery. This study demonstrates the importance of
context in making decisions regarding the selection of resources for online learning.
The fourth paper in this section is “Factors Influencing Programming Expertise in a Web-
based E-learning Paradigm” by Wajid Rafique, Wanchun Dou, and Khurshid Ahmed of Nanjing
University and Khalid Hussain of East China University of Science and Technology. This study
investigates the challenges of teaching computer programming in an online environment through
the lens of the Technology Acceptance Model (TAM). Using data from 460 seniors in an
undergraduate computer science program, the authors validate and extend the TAM for students enrolled in computer programming. They conclude that teaching practices, intrinsic factors,
perceived usefulness, efficacy problems, and learning intentions are key factors contributing
toward programming expertise development in online learning environments. Much more detail is
included in the full paper.
The final paper in this section is “Purposeful Interpersonal Interaction: What is it and How
is it Measured?” by Scott Mehall of Carlow University. While we can all agree that interaction is
an important contributor to learning in online settings, we can also acknowledge that not all
interaction promotes learning or processes that contribute to learning. This paper provides an in-
depth investigation of the nature of productive interpersonal interaction to promote online learning.
The author outlines a framework for purposeful interpersonal interaction characterized by three
components: instructional, social, and support. These forms of interaction have been associated
with either processes that support learning (a sense of community) or learning itself. The study
details the conditions for creating these forms of productive interaction.
The final section of this issue contains four additional empirical studies on a variety of
topics. The first paper is “The Role of an Interactive Visual Learning Tool and its Personalizability
in Online Learning: Flow Experience” by Young Ha of California State University, Long Beach
and Hyunjoo Im of the University of Minnesota, Twin Cities. Flow theory suggests that interactive
visual learning tools have a high potential to engage students in learning processes and the effect
is greater when the students’ ability is close to the task difficulty level. This study tests these
hypotheses with two experiments. The first experiment examines the effect of online interactivity
on student learning processes that manifest as flow experiences. The second experiment investigates
whether students’ learning experience is enhanced when students are able to match their skill level
with the task difficulty through personalization options. Among other findings, the results
demonstrate the important role of dynamic, real-time interactivity in improving students’ learning
by reducing awareness of physical surroundings and increasing flow states.
The next article in this section is “Using Structured Pair Activities in a Distributed Online
Breakout Room” by Jeffrey Saltz and Robert Heckman of Syracuse University. Benefits of
classroom-based collaborative learning when using breakout rooms include deeper learning, better
grades, longer retention of information, greater communication and teamwork skills, and a better
understanding of the professional environment in which students will work. How to structure
breakout sessions in synchronous online environments is less well documented, especially for
coursework in data science. This exploratory study seeks to close that gap by investigating a
promising approach: structured pair activities, specifically using a strategy called “pair programming.” In pair programming, one member of the pair types at the keyboard while the other reviews each line as it is typed, checking for errors and thinking about the overall design. The
paper provides observations of structured and unstructured student behaviors in online
synchronous breakout rooms to highlight how the approach improves collaborative learning
processes and outcomes.
The third paper in this section is “The Validity and Instructional Value of a Rubric for
Evaluating Online Course Quality: An Empirical Study” by Ji Eun Lee and Mimi Recker of Utah
State University, and Min Yuan of the University of Utah. Rubrics designed to assess the quality
of online course design are commonly used in higher education institutions, but few have been
empirically tested for reliability or validity. Even fewer have been assessed for their influence on
promoting productive online interactions or actual student outcomes (e.g., grades). This paper
seeks to address that issue by providing validity and reliability measures for an online course quality rubric and tying those metrics to learner outcomes (course passing rates). Using data from
121 online courses enrolling 5,240 students, an analysis demonstrates that only rubric items related
to learner engagement and interaction have positive effects on online interactions, while only student-content interaction positively influences course passing rates. This paper will be of interest to faculty and instructional designers seeking to improve the quality of online coursework.
The next paper is “A Dramaturgical Examination of Online University Student Practices
in a Second Year Psychology Class” by Dawn Marie Gilmore of the Royal Melbourne Institute of
Technology, Australia. This study adopts a framework based on Erving Goffman’s classic
sociological theory which posits metaphors for the presentation of self as being either on the front
stage or the backstage. If the Learning Management System (LMS) is conceived as the front stage,
then other environments that students use to prepare for their performance in the LMS combine to
form the students’ backstage learning environment. This study analyzes what students do beyond
the LMS and how social media spaces (especially Facebook) preferred by students support social
learning and enhance the student experience. In part the paper concludes that some students avoid
the front stage discussion board because the audience is too slow, too harsh, and too formal. The backstage online audience in social media addressed these drawbacks of the front stage, which made it a more attractive option for learning.
We invite you to read and share this issue with colleagues and to consider submitting your
own original work to Online Learning.

Examining How Online Professional Development Impacts Teachers’ Beliefs About Teaching Statistics
Hollylynne S. Lee and Gemma F. Mojica
North Carolina State University

Jennifer N. Lovett
Middle Tennessee State University

Abstract
With online learning becoming a more viable option for teachers to develop their expertise, our
report shares one such effort focused on improving the teaching of statistics. We share design
principles and learning opportunities in an online course developed specifically to serve as a wide-
scale online professional development opportunity for educators, thus deemed as a massive open
online course for educators (MOOC-Ed). In this report we focus on a subset of 412 participants
who identified themselves as classroom teachers. We use multiple data sources, quantitative and
qualitative, to characterize changes in teachers’ beliefs and perspectives about statistics and
identify triggers in the course that appear to influence teachers’ sense making about issues related
to teaching statistics. Implications about specific course experiences that served as triggers for
critical reflection and change are discussed.

Keywords: MOOC teacher training, online professional development, statistics education, beliefs

Lee, H.S., Mojica, G.F., & Lovett, J.N. (2020). Examining how online professional development
impacts teachers’ beliefs about teaching statistics. Online Learning, 24(1), 5-27.
https://doi.org/10.24059/olj.v24i1.1992

Examining How Online Professional Development Impacts Teachers’ Beliefs About Teaching Statistics
Innovations in online learning environments and changes in K-12 mathematics curricula have
created new opportunities to think creatively about how technological solutions could be used for
providing professional development for teachers. Indeed, in 2013 Marrongelle, Sztajn, and Smith
proclaimed it was “incumbent on the field to capitalize on emerging technologies in the design and
delivery of effective professional development” and emphasized the need for “research that
focused on teacher learning in these environments” (p. 208). The past several decades have
included an increased emphasis on student-centered, investigative approaches to learning and
teaching content within science, technology, engineering, and mathematics (STEM) classrooms
(Granger, Bevis, Saka, Southerland, Sampson, & Tate, 2012; National Research Council, 2000).
Changes in mathematics standards over the past twenty years have given the topic of statistics a
prominent place in secondary curricula in the U.S. and many other countries.

Across the globe, platforms, tools, and internet access have paved the way for many Massive Open
Online Courses (MOOCs) and other distance course offerings related to STEM content, especially
statistics. Options abound for courses in which a learner can develop knowledge in statistics. Two examples include the Data to Insight course at the University of Auckland in New Zealand (www.futurelearn.com/courses/data-to-insight) and a five-course sequence developed at Duke University in the U.S. (www.coursera.org/specializations/statistics). However,
online courses designed for learning to teach STEM content, particularly teaching statistics, are
relatively rare.
Franklin et al. (2015) call for greater attention to the statistical education of teachers, including
practicing teachers. Professional development (PD) for secondary mathematics teachers to develop their statistical content and pedagogy is being offered across the world, typically in small local settings in schools or districts. While such efforts may effectively impact the practices of teachers
in these small settings, the need for preparing teachers to teach statistics is much bigger than what
can be addressed only by local programs. For example, in Germany, Biehler (2016) led
development and implementation of PD for secondary teachers that started on a smaller scale and
expanded to reach many more math teachers in Germany. Two efforts to offer MOOCs on learning
to teach statistics, with very different approaches, have been developed in the U.S. The design of
these courses and lessons learned have been shared by Lee and Stangl (2015; 2017). One of these
courses, Teaching Statistics with Data Investigations (TSDI), is the focus of this paper.
With an online solution at a much larger scale, methods for examining impacts must also
evolve. While research on face-to-face PD can examine teachers’ development in-situ and their
local classroom practices, PD done at a distance online adds challenges for examining such
development. We offer a glimpse at one effort to use participants’ online activity, forum
discussions, and self-reported changes on surveys to measure impact.
Specifically, our focused questions are:
1. Which resources and experiences in the course seem to trigger critical reflection?
2. What evidence is there that engaging in the MOOC-Ed impacted teachers’ beliefs and
perspectives about teaching statistics, that could in turn impact teaching practices?

Review of Related Literature


The intent of this section is to provide background information critical to the domain of STEM teacher education, especially statistics teacher education. We then focus the literature review on broader issues of designing online professional learning experiences and on how to frame our study to examine the impacts of an online PD course for teaching statistics.
Teaching Beliefs, Perspectives, and Practices
The success of reform movements in STEM education is contingent on changes in teachers’
classroom practice (Milner, Sondergeld, Demir, Johnson, & Czerniak, 2012). Many researchers in
STEM education agree that understanding teachers’ beliefs is critical to integrating reforms in
classrooms (e.g., Yasar, Baker, Robinson-Kurpius, Krause, & Roberts, 2006) as teachers’ beliefs
are an important factor in influencing their practice (Grossman, 1990). According to Stipek,
Givvin, Salmon, and MacGyvers (2001), most teachers believe mathematics is a static body of
knowledge that involves rules and procedures that lead to one right answer, whereas inquiry-oriented mathematics teachers view mathematics as dynamic and as a tool for problem solving.
They found that teachers’ beliefs were associated with their classroom practices in predicted
directions (i.e., more traditional beliefs were associated with more traditional practices). Capps and Crawford (2012) found that even well-qualified, highly motivated teachers had difficulty enacting reform-based teaching in science; in particular, teachers held limited views of inquiry-based instruction and the nature of science, and these perspectives were reflected in their practice.
However, there is evidence to suggest that teachers are able to shift from a perspective that learning
is about rules and procedures to one of inquiry, investigation, and critical thinking about key
STEM concepts (e.g., Seung, Park, & Narayan, 2011). De Vries, Jansen, and Van De Grift (2013)
found that the more teachers engaged in continuing PD, the more student-centered they became,
shifting from more traditional orientations.
Beliefs and perspectives that teachers may hold specifically related to statistics include ideas
about the nature of statistics, about themselves as learners of statistics, and about what they
perceive as important goals for students’ learning of statistics (e.g., Eichler, 2011; Pierce & Chick,
2011). Statistics beliefs and perspectives include how teachers view themselves as learners of
statistics, which often include memories of lessons focused on graphing or using formulas to
generate statistical measures, often without the aid of technology (Lovett & Lee, 2017). Such
experiences may lead teachers to believe statistics is about performing a set of procedures.
However, teachers may also feel that reasoning with context-rich data and uncertainty in statistical
claims can make statistics difficult to learn and teach (e.g., Lovett & Lee, 2017; Leavy, Hannigan,
& Fitzmaurice, 2013). One’s confidence to teach statistics is then influenced by beliefs and
perspectives about statistics, prior experiences in learning and teaching statistics, and
understanding of statistical content (Lovett & Lee, 2017; Harrell-Williams, Sorto, Pierce, Lesser,
& Murphy, 2015).
Teachers’ beliefs and confidence levels would likely lead to different teaching practices. For
example, if a teacher believes that statistics is a way of quantifying data and that procedures for
computing statistical measures lead to such quantification, they may be quite confident in teaching
statistics and their teaching practices may favor a focus on statistical procedures. Such teaching
would likely have less emphasis on the rich contexts of data, the process of ensuring good data is
collected and available, and making claims about data that are uncertain in nature (Pierce & Chick,
2011). Eichler (2011) posited that the focus of teachers’ intended curriculum in statistics can be
considered on a continuum from traditionalists (focused on procedures absent of context), to those
wanting students to be prepared to use statistics in everyday life (focused on engaging in an
investigative process that is tightly connected to contexts of real data). A goal in statistics teacher
PD is to move teachers along this continuum towards a focus on investigative processes, which
requires impacting teachers’ beliefs about the nature of statistics and learning goals for students
related to statistics.
Designing Online Professional Development
Seaton and colleagues (2015) found that teachers (university and K-12) were enrolling in
content-focused MOOCs on the edX platform and that they were highly engaged as participants
in discussion forums. The teachers, representing only 4% of MOOC participants, contributed 22%
of posts in forums. This suggests that an online community in a MOOC may attract and support
teachers as they learn new content and pedagogy. Designing PD in a MOOC context, though,
should be based on effective practices for teachers’ learning, on and offline.

The Conference Board of the Mathematical Sciences (2012) recommends that PD engages
teachers in solving problems and deeply exploring content in a professional learning community,
analyzing authentic student work, and participating in collaborative task design. PD that includes
accessible, personalized, and self-directed elements can provide increased opportunities for
sustained, collaborative, and meaningful work among teachers that can affect their knowledge,
beliefs, and practice (e.g., Vrasidas & Zembylas, 2004). Online PD that addresses the varied needs
and abilities of its participants has been shown to be effective in changing teachers’ instructional
practice (e.g., Renninger, Cai, Lewis, Adams, & Ernst, 2011). Many designers of online PD
emphasize that activities should be meaningful, accessible, and relevant so participants can apply
their professional learning to their individual educational context (e.g., Luebeck, Roscoe, Cobbs,
Diemert, & Scott, 2017; Vrasidas & Zembylas, 2004). While research on the impacts of MOOCs often examines click logs as an indicator of whether or not educators are accessing important learning material, Jacobsen’s (2019) work clearly illustrates how busy professional educators who appear to have “dropped out” of a PD MOOC indeed accessed and utilized selected resources they perceived as relevant to their educational context, which in turn had an impact on their teaching perspectives and practices.
Active learning experiences and peer interactions are hallmarks of most PD experiences for
teachers and can help build a community among participants. Just as communities can form in
face-to-face PD, online PD should facilitate an online community. Designers of online courses
should build infrastructure to support active learning and peer interaction across geographic and
time zone boundaries. Within online PD for educators, asynchronous discussion forums, for
example, provide opportunities for participants to reflect on practice, exchange ideas, and discuss
ways to improve on their own schedules with colleagues with whom they may not otherwise
interact (e.g., Treacy, Kleiman, & Peterson, 2002). Researchers have highlighted benefits of such
communities that are not always afforded in traditional face-to-face PD. For example, Mackey and
Evans (2011) argued that online communities provide members with “extended access to resources
and expertise beyond the immediate school environment” (p. 11), thereby offering ongoing PD
and the potential for increased application in classrooms. In order to maximize benefits, designers
of online PD programs must be creative in building the infrastructure necessary to support such
communities, as participants have the challenge of not being physically in the same place when
engaging in online activities.

Online Course Context for the Study


In recognizing the potential for MOOCs to serve as large-scale teacher PD, we are part of
teams that have created MOOCs for Educators (MOOC-Eds) to assist teachers in developing new
strategies for improving teaching and forming local and global communities of educators. While
MOOC-Eds have not had the “massive,” large-scale enrollment of other MOOCs, they do reach
larger numbers of educators than typical online PD courses. MOOC-Eds are intended to attract
professional educators who are specifically looking to engage in a free, open online course that is
marketed to educators beyond specific geographical boundaries. Thus, the MOOC-Ed effort at the
Friday Institute for Educational Innovation at North Carolina State University includes a collection
of courses built using research-based design principles of effective PD and online learning (Garet et al., 2001; Darling-Hammond et al., 2009) that emphasize: (a) self-directed learning, (b) peer-
supported learning, (c) job-connected learning, and (d) learning from multiple voices (Kleiman,
Wolf, & Frye, 2015).

In accordance with suggestions from Sztajn (2011) on aspects of PD that are necessary to
understand and interpret research results based on PD, we provide details about the intent, learning
goals, and specific designs of the TSDI course. The overarching goal of the course is to engage
participants in thinking about statistics teaching and learning in ways that are likely different from
their current practices in middle school through college-level introductory statistics
(http://go.ncsu.edu/tsdi). The course did not focus on a particular grade level or specific statistical
content. A major goal was for teachers to be introduced to and use a framework to consider
statistics as a four-phase investigative process (pose, collect, analyze, interpret) that incorporates
statistical habits of mind, and views learning statistics from a developmental perspective (Franklin
et al., 2007).
The course consisted of an orientation unit and five units, each with seven components. The
course was open for about 15 weeks to allow for flexibility for participants to engage while
managing their busy professional lives. On September 21, 2015, the Orientation and Unit 1 opened. The Orientation unit included an overview video, a survey for participants to self-assess their confidence to teach statistics (i.e., SETS), and a forum in which they could introduce themselves and learn about
other participants. Each unit opened in weekly intervals for 4 weeks thereafter, with earlier units
always remaining accessible. This allowed participants to start and engage in course material at
their own pace. Once Unit 5 opened, the entire course remained active for seven more weeks. Upon
closure, participants could still access material and discussion forums in a read-only format (no
new posts allowed), though this activity was not included in our analysis.
Each unit began with an Introduction video of the instructor highlighting critical aspects of teaching and learning statistics that participants could learn about in the unit. The Essentials included
materials to read or watch that were created by the course development team or compiled from
open online resources (open journal articles, lesson plans, data, videos). Each unit included video
of students and teachers engaged in statistics lessons. Teacher educators have shown how
impactful video cases depicting learning and teaching in classrooms can be in focusing teachers’
learning about pedagogical issues (e.g., Wilson, Lee, & Hollebrands, 2011; Sherin & Van Es,
2005). However, when rich examples were available in statistics education literature, animated
illustrations of real students’ work were created (using tools like Go Animate or Powtoon) that
represented students’ statistical reasoning and use of technology tools. Such animations have been
shown to be an effective way to include artifacts of practice in teacher education materials (e.g.,
Herbst, Chazan, Chen, Chieu, & Weiss, 2011; Chazan, 2018). The teachers and students in videos
also brought in multiple voices that are closest to the practice of teaching.
Self-directed and job-connected learning opportunities often included a selection of statistics
tasks for different grade levels (to provide choice) to engage teachers in doing statistics in ways
likely different than what they have experienced before (Franklin, et al., 2015; Stein & Smith,
1998). These tasks included Dive into Data experiences for participants to use free technology
tools (e.g., Gapminder, Tuva, CODAP, GeoGebra simulations) or import data into their favorite
data analysis tools. These active learning experiences allowed teachers to experience investigative
statistics tasks using tools accessible in their schools and connected them to relevant and free
sources of data. For example, in Unit 4, Dive into Data used the Census at School website and asked teachers to download data and engage in a cycle of statistical investigation. Extensions
include extra material (e.g., datasets, lesson plans, brief articles, applets, videos) to explore content
and resources of interest that may be useful in their own teaching context. Again, these extension
materials provide opportunities for self-directed learning.

The design principle of learning from multiple voices also guided the decision for each unit to
include a video of an Expert Panel discussion with the instructor and three experts in statistics
education. The conversations in these videos brought forth practical experiences and research-
based suggestions in a conversational tone where listeners could feel they were part of the
conversation. Peer-supported learning is a cornerstone of the MOOC-Ed experience to provide
focused and ample opportunities for participants to connect with and support one another (e.g.,
Borko, 2004). Each unit contains two discussion forums: (a) a forum focused on discussing a
specific Pedagogical Investigation about aspects of teaching statistics (e.g., analyzing statistics
tasks, considering students’ approaches to statistics tasks through video clips), and (b) a forum
where participants Discuss with Colleagues about unit materials or other ideas related to teaching
statistics.
Because of its importance in the course, we provide details about a critical framework
integrated across the course. Frameworks can assist teachers in applying content and strategies
learned in PD to their own instructional practices (Franke, Carpenter, Levi, & Fennema, 2001; Boston
& Smith, 2011). Building upon an existing framework (Franklin et al., 2007), the development
team incorporated recent research on students’ statistical thinking and productive statistical habits
of mind (e.g., Burrill & Biehler, 2011; Wild & Pfannkuch, 1999). A habit of mind is developed
when a person approaches situations in similar ways so they develop a more general heuristic over
time (Cuoco, Goldenberg, & Mark, 1996). The new framework, Students’ Approaches to
Statistical Investigations (SASI), needed a variety of learning materials and opportunities for
participants to develop an understanding of its importance and potential ways it can influence their
classroom practices. Both a static and an interactive version of a diagram were created to communicate
the investigative cycle, reasoning in each phase at three levels of sophistication, and an indication
of productive habits of mind (Figure 1). Two brief documents described the framework and how
to apply it to task design. In a video, the instructor illustrated the framework using example student
work, and other videos featured expert discussions and interviews, including one expert statistics
educator illustrating the development of the concept of mean across levels of sophistication.
Participants could also engage in a simulation task and watch two animated video illustrations of
students’ work that highlighted how students approach an investigation using different levels of
sophistication. See Appendix for a list of URLs to these openly accessible resources.

Figure 1. Framework for supporting students’ approaches to statistical investigations.


Theoretical Framing of the Study


While making changes in teachers’ statistics teaching practices is a major goal, our research
is framed by an integrated model for teacher learning in PD proposed by Clarke and Hollingsworth
(2002). Their model represents a change process for teachers as including reflection and enactment
among an external domain of PD experiences and a teacher’s professional world that includes
domains of personal, practice, and consequence. The external domain includes information and
resources often experienced through a PD, including interactions with others. In our study, the
external domain includes learning opportunities (through a variety of resources) within the course
and the discussion forums within each unit. The personal domain includes one’s knowledge,
beliefs and attitudes. The practice domain includes any professional experimentation a teacher may
do in their classroom, with content or instructional strategies, and the domain of consequence is
concerned with salient outcomes that result in sustained practice and impacts in a teacher’s
classroom.
Because of the massive size of our online PD about teaching statistics, we are most concerned
with the reflections and enactments between the external domain (experiences and resources in the course) and the personal domain, where we can discern teachers’ beliefs and perspectives about statistics and teaching statistics. To aid us in considering how the
MOOC-Ed experiences may impact teachers’ beliefs, perspectives, and practices related to
statistics, we draw upon Mezirow’s (2009) theory of transformational learning in adult education,
consistent with constructivist assumptions about learning. Mezirow (2009) describes how meaning
schemes—comprised of knowledge, expectations, beliefs and perspectives, and feelings—are used
by an individual to interpret their experiences, and through reflection on these experiences, one
may transform their understandings. Peters (2014) illustrated how this theory could be used to
understand statistics teachers’ development of an understanding of variation. In the context of our
study, our intent is that a teacher might transform their meaning schemes for teaching statistics by
rejecting prior conceptions of what it means to teach statistics. Transforming meaning schemes
often begins with a stimulus, a disorienting dilemma, which requires one to question their current
understandings and beliefs that have been formed from previous experiences (Mezirow, 2009).
Specifically, we are interested in what stimuli and experiences within the TSDI course may act as
triggers to evoke disorienting dilemmas (or cognitive dissonance) for teachers where they engage
in critical reflection and question their current understandings or perspectives.

Methods
Participant Demographics
Though the course has been offered multiple times, this paper focuses on the Fall 2015 section.
To attract a broad audience, the free course was advertised through websites and listservs of many
different educational organizations (NCTM, ASA, CAUSEweb, IASE), social media posts, emails
to past participants in any MOOC-Ed, state-level leaders in mathematics education in the U.S.,
and personal contacts. For the purpose of the research reported in this paper, we are only interested
in the potential ways the course experiences could be impacting the beliefs and perspectives of K-
12 classroom teachers. Of the course’s total enrollees (n = 829), over half self-classified as
classroom teachers (n = 489). In this study, we focus on these 489 teachers. The enrolled classroom
teachers resided in 46 different states and 29 different countries, with most teachers in the U.S. (n
= 380) and New Zealand (n = 48). The majority of the 489 classroom teachers were female (67.5%) and 72.8% had a master’s degree or above. Their years of experience in education, however, were
fairly evenly distributed, creating a diverse pool of participants with varied teaching experiences
that impact their starting perspectives and growth opportunities during the course. Of those 489
self-identified classroom teachers, we were able to use additional registration data (e.g.,
organization type and name) to infer that 412 enrollees seemed to be actively working in K-12
contexts. For example, some enrollees identifying as a classroom teacher also identified their
organization type as a college/university and provided a community college as their organization.
Data Sources and Analysis Methods
In our research, we needed data from a variety of sources to help us measure impact of the
online learning opportunities for a broad range of active and passive teacher participants. Aside
from registration data, five other data sources were used: (a) click logs; (b) discussion forum posts;
(c) end-of-unit surveys; (d) an end-of-course survey; and (e) a follow-up survey sent six months after the course to participants who engaged in any aspect of the course. The purpose of the follow-up
survey was to inquire about how they may have applied their learning and what they considered
the most impactful ideas from the course.
Course activity was tracked through click logs that allowed us to examine trends in
participants’ engagement. We limited data to those click logs made by classroom teachers that
occurred between September 21, 2015 (opening of Orientation Unit) and December 31, 2015 when
the course closed. All registration and click log data were merged and displayed in a dashboard
that allowed investigators to visualize participants’ engagement over time and with certain types
of resources. Descriptive statistics and graphical displays were used to examine overall
engagement patterns.
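
To make this concrete, the following is a minimal sketch, not the authors’ actual pipeline, of how registration and click log data might be merged and summarized; the file names and columns (user_id, role, timestamp, resource_type) are hypothetical.

import pandas as pd

# Hypothetical inputs: one row per enrollee, and one row per click event.
registration = pd.read_csv("registration.csv")
clicks = pd.read_csv("click_logs.csv", parse_dates=["timestamp"])

# Keep self-identified classroom teachers and events in the course window.
teachers = registration[registration["role"] == "classroom teacher"]
in_window = clicks["timestamp"].between("2015-09-21", "2015-12-31")
activity = clicks[in_window].merge(teachers, on="user_id", how="inner")

# Descriptive statistics on engagement: events per teacher, per resource
# type, and per week (the kind of trend a dashboard would display).
print(activity.groupby("user_id").size().describe())
print(activity["resource_type"].value_counts())
print(activity.set_index("timestamp").resample("W").size())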
Our qualitative analysis initially focused on teachers’ discussions in forums. Because the needs
of a community college classroom teacher may differ from those of a K-12 teacher, we focused our
qualitative analysis of discussion forum data on posts made by those we had inferred were K-12
teachers. There were 2,097 total posts made by all participants in the course (after removing the instructional team), across 12 forums. We eliminated the introduction forum in the Orientation unit
and the project discussion forum, leaving 10 forums across the five units. Of the remaining posts,
977 were made by 206 participants classified as classroom teachers. For this study, since we were
only interested in beliefs and perspectives of K-12 classroom teachers, only these 977 posts were
analyzed, with each post considered a unit of analysis. The posts by teachers were first analyzed
using open coding (Strauss & Corbin, 1998), guided by our focus on cognitive dissonance and
critical reflection that may lead to change in beliefs, perspectives and practices related to teaching
statistics. Posts were tagged for evidence of what course elements seemed to be triggering critical
reflection and any evidence that a teacher may put forth in their written post that may indicate a
reflection on, or shift in, their perspectives or beliefs related to teaching statistics. We documented
which triggers were the most prevalent and only kept triggers that were associated with many
instances of critical reflection. The occurrences of triggers were quite skewed, with some occurring an abundance of times and a few occurring only once or twice, making the major triggers for change straightforward to identify. Codes for describing perspectives and beliefs
about teaching statistics were sorted and collapsed into broader themes.
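
As a simple illustration of how trigger prevalence might be tallied during such coding, a sketch follows; the tags and posts below are invented for illustration and are not the study’s data.

from collections import Counter

# Hypothetical coder output: each post carries zero or more trigger tags.
tagged_posts = [
    ["SASI framework"],
    ["expert panel video", "SASI framework"],
    [],
    ["Dive into Data"],
    ["SASI framework"],
]

trigger_counts = Counter(tag for tags in tagged_posts for tag in tags)

# A heavily skewed distribution separates the major triggers from tags
# that occur only once or twice.
for trigger, count in trigger_counts.most_common():
    print(f"{trigger}: {count}")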
In accordance with Loizzo, Ertmer, Watson and Watson (2017), to more deeply understand
aspects of the external domain that triggered critical reflection and impacts on the personal and
practice domains, we examined open-ended responses to end-of-unit and end-of-course surveys, as well as the follow-up survey. The themes generated from the analysis of the discussion forum
data—related to changes in beliefs and perspectives, and triggers that seemed to impact such
critical reflection—were used as initial codes to examine K-12 classroom teachers’ open-ended
responses on the end-of-unit, end-of-course, and follow-up surveys to questions related to what
they appreciated most in a unit and what they considered to be the most impactful learning
experiences. While we were looking for confirming and disconfirming evidence of themes and
triggers, disconfirming evidence was not evident, and no new themes or triggers were documented.

Results
We first briefly describe teachers’ participation in the MOOC-Ed (external domain) to help
situate our findings. We then present our results related to the four elements of the course that teachers identified as triggering critical reflection. We discuss each element and provide evidence
to illustrate the critical reflection the element triggered. Then, we discuss ways that engagement
with, and triggers from, elements of the external domain seemed to impact teachers’ perspectives
and beliefs about teaching statistics in the personal domain.
Teachers’ Participation
The purpose of this section is to briefly describe how classroom teachers chose to participate
in the course and engage with resources (external domain). The click log data used in this analysis
included all 489 enrollees who self-classified as classroom teachers at any level at registration.
Overall, a majority of enrolled classroom teachers (n = 370, 75.6%) engaged in various aspects
of the course (e.g., accessing a page, viewing a video, downloading a document, posting in a
forum). While some started in Orientation, others started in Unit 1. There were 293 classroom
teachers who engaged in Unit 1, with an assumed intent to engage in PD through accessing learning
material. Participants did not have to view Orientation or earlier units to access later ones, though
almost all traversed the course linearly once they engaged in Unit 1. Figure 2 shows the sharp drop
in teachers’ participation between Units 1 and 2. By Unit 5, 31.4% (n = 92) of classroom teachers
who began Unit 1 were still engaging in the course.

Figure 2. Number of teachers accessing each unit in course.


Over half of classroom teachers who began the course posted to a discussion forum (n = 206,
57.5%). The frequency of posts per teacher was a skewed distribution, with 57% of teachers
posting 1–3 times (typically in Orientation and Units 1–2), 38% of teachers posting 4–14 times
across several units, and 11 very active teachers posting 15–45 times. The level of engagement in discussion forums by classroom teachers was highest in Units 1–3.
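
A brief sketch of binning posts per teacher into the bands reported above follows; the counts are invented for illustration, not the study’s data.

import pandas as pd

# Hypothetical posts-per-teacher counts.
posts_per_teacher = pd.Series([1, 2, 2, 5, 8, 14, 3, 22, 45, 1])

# Bin into the reported bands: 1-3, 4-14, and 15-45 posts per teacher.
bands = pd.cut(posts_per_teacher, bins=[0, 3, 14, 45],
               labels=["1-3 posts", "4-14 posts", "15-45 posts"])
print(bands.value_counts().sort_index())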
The examination of the click log data provides a strong indication of how classroom teachers
took advantage of learning opportunities in the course through accessing resources and
participating in discussion forums, with about a third of them finishing the course. A deeper dive
into the qualitative data highlights which of the learning experiences in the course (external
domain) seemed to trigger pedagogical dilemmas for them.
Course Features Triggering Critical Reflection
Four elements from the external domain emerged as often cited for triggering critical
reflection. We briefly discuss each trigger and use examples from classroom teachers to illustrate
the types of dilemmas or critical reflection they engaged in.
SASI framework. By far, the SASI framework (and all documents and multimedia associated
with it, see Appendix) was the most dominant trigger for change. For example, in Unit 5, upon
reflecting on why their confidence to teach statistics had increased, some teachers noted how the
framework triggered changes. Triggers are bolded. A teacher posted,
The most important point that I got from this course is being able to develop habits
of mind that will help students to build conceptual frameworks for statistics. …
We should be interested in the students’ reasonings (as opposed to the result).
In the same discussion thread, a teacher responded, “I have found the frameworks for statistical
thinking presented in the videos and materials to be very helpful in articulating the essence of
statistics to my students.” These teachers view statistics as more than a set of procedures and
describe how the SASI framework impacted their perception of teaching statistics. Also in Unit 5,
another teacher reflected on how the framework will help to improve her lessons.
I feel more confident as well. It is my first time teaching stats and I was
overwhelmed with ideas of how to approach it. This MOOC has supplied us with a
framework to base our classwork on. I am developing a set of tasks for my class
using the A-B-C levels as a way for me to differentiate instruction because I have
a wide variety of ability. I knew I wanted to go in this direction but … the
framework gave me the perfect guidelines to do this.
More specifically, this participant indicated that this framework guided her in developing several
tasks to differentiate instruction and support students at different levels of statistical sophistication.
Another participant indicated, “The SASI framework instilled in me a new mind-set. It showed
me the study Statistics under a different light. It allowed me to view it from a different angle and
really excited me to start applying and implementing it.” Engaging with the SASI framework in
the course not only led to teachers expressing a different perception of statistics, it supported them
in imagining ways to change their practice.
Expert panel videos. The discussions among experts within the expert panel videos were
another main trigger to assist teachers in reconsidering prior experiences in learning and teaching statistics. In Unit 2, a teacher began a discussion thread detailing a dilemma about prior teaching
practices because of points made in a video by the expert panel. The extensive post began as:
I had a "lightbulb moment." Although I have been teaching HS math for 24 years,
I have never actually taught "statistics" as defined by the members of the expert
panel. I have taught units that I THOUGHT were statistics, but I was merely
providing students with a few mathematical tools that statisiticians [sic] can use
(e.g. finding a mean, making a histogram, calculating a standard deviation, etc.) ...
Twelve participants joined that discussion, 10 of whom were teachers. They echoed that they were
“guilty” of teaching statistics this way and that their own prior experiences in learning statistics
treated the subject in a procedural manner for computing measures and creating graphs. Similar
discussions and replies about this issue were also started by several others. To complete the first
shift in perspective, teachers also recognized that attending and engaging in all parts of an
investigation would give students opportunities to make sense of how statistics is used to answer
questions and how important data collection (or experimental design) is to the process. Many
admitted they spent little time on this with students and aimed to improve.
In their reflections in discussions and on surveys, several teachers referred to a Unit 3 video
where one expert illustrates developing the concept of mean through tasks at different levels of
sophistication.
Wow—that whole idea around how to introduce the idea of variability as seen in
the 'Number in your family activity' at level A through to C is fantastic. Loved
the video of [Expert Name]. I can see what an advantage it is when they get to
high school level to have been introduced to the concept [of mean] in this way.
The expert panel videos evoked critical reflection and many opportunities for teachers to consider
different perspectives and learn how statistics learning and teaching could be conceived of as
something different from their own experiences as teachers and learners.
Classroom-based videos. The videos of students and teachers engaged in statistics tasks, both those of real classrooms and the animated videos depicting real students’ work, also triggered critical reflection about how students and teachers engage in statistics, helping participants envision a different outcome for their students if they changed their practices. In Unit 4, several teachers
discussed the use of hands-on projects and experiments.
I loved the Gummy Bears In Space Video. It was short, and to the point but I
loved the activity … The students in this video were able to conduct their own
experiment, collect data, and really analyze what was going on... A common theme
I am seeing with statistics is that it is very project based friendly and can be an
extremely engaging classroom!
Another teacher in Unit 4 shared their reflection after watching two animated videos of representations of students’ work with a sample of messy Census at School data and technology, and described how they envisioned using such an approach with their students.
I had several "aha" moments throughout these two videos. It occurred to me that
cleaning up data is a valuable lesson that students must know in order to correctly
interpret their findings and draw conclusions to answer their questions. If my
students were to work with Census at School data to investigate a question of
interest to them, I think they would struggle with cleaning up their data to interpret their results... I would think my students would accept the data as is, and begin to
draw conclusions using the raw messy data. I think this tool would be a great
resource for teaching this type of lesson, and showing students how to make sure
their data is meaningful in accordance to the context.
These quotes represent typical posts where teachers reflected on and discussed videos of students
and teachers engaging in statistics and made connections to their own classroom practices.
Dive into Data activities. The use of technology in the Dive into Data activities for
investigating real data that were multivariable and sometimes “messy” served as an additional
trigger that seemed to impact teachers’ perspectives. Technology experiences directly influenced
their ideas that engaging in statistics is enhanced by using dynamic technology tools and real-
world messy data. As illustrated in quotes from teachers in the above section on the impact of
viewing videos of students’ work with data, experiences that triggered reflection on the usefulness
of technology came from learning opportunities that included videos of students using technology,
discussions in expert panel videos, and opportunities to Dive into Data themselves.
Two prominent triggers were using the Gapminder tool in Unit 1 and engaging with Census at
School for gathering and sampling data from students in Unit 4. In a Unit 5 discussion, teachers
were prompted to discuss course impacts and share ideas for their classrooms. One teacher posted,
“I loved the Gapminder site! I spent three very engaging days doing activities with the site and my
students were simply shocked at some of the numbers. What an eye-opener!” Another indicative
post mentions Census at School,
The School Census [sic] data is very interesting and serves as a great resource for
teaching. This type of data is applicable to our students and since it is real data,
not simply some fabricated textbook example, it has more power to influence
learning and thinking.
The teacher discussing Gapminder used this new resource and implemented it in his classroom. We cannot tell whether the teacher discussing Census at School intends to use it with students, but it seemed to trigger the notion of using real data as an important aspect of statistics.
On a follow-up survey that asked participants the most valuable thing they learned, teachers often identified one or more of the four triggers above. The following is an example of a teacher
reflecting on the MOOC-Ed holistically and identifying several triggers.
The most valuable aspect of the MOOC was obtaining resources for the improved
use of technology to make instruction come to life and be more meaningful to
students. I was able to see the statistical process in action and now have an idea
of what it should look like in the classroom.
For this teacher, a combination of learning about new technologies to use in statistics (Dive into
Data) and engaging with videos that showed students and teachers using technology in statistical
investigations seemed to make a lasting impact.
Impact on Perspectives and Beliefs
In accordance with our guiding framework, we are interested in ways that engagement with,
and triggers from, elements of the external domain impact teachers’ perspectives and beliefs in the
personal domain. Here we describe evidence of impact on teachers’ perspectives and beliefs
related to teaching statistics. Because we saw comments related to these themes in discussion
forums in Units 1–2, on unit and end-of-course surveys, and on the follow-up survey from
participants who had only engaged in early units, the impacts on perspectives and beliefs seemed
to occur with both classroom teachers who completed the course as well as those who only engaged
in early units. It is beyond the scope of this paper to include a deeper analysis about differences
between these groups of participants.
We found four major ideas related to how teachers’ beliefs and perspectives about teaching
statistics may have changed:
● viewing statistics as more than computations and procedures,
● engaging in statistics is enhanced with technology,
● engaging in statistics requires real data, and
● statistical thinking develops across a continuum.
Each perspective is described below, highlighting teachers' beliefs and the implied changes they would
need to make in their teaching practices.
We noticed a shift in thinking about statistics as more than computations and procedures that
began in discussion forums in Unit 1 and expanded in later units. This was also evident in responses
to surveys. There were two aspects to this shift in perspective. The first can be characterized as a
realization that the statistics they experienced and tended to teach was too focused on procedures.
This was illustrated above by the teacher who had a "lightbulb moment" when listening to an expert
panel video. Further, teachers recognized that a procedural approach to statistics was not aligned
with their experiences in the TSDI course. For example, one teacher posted that she
used to teach statistics like a pure mathematics course with a focus more on the
process rather than the investigative side. This course has opened my eyes to the
variety of statistical methods you can demonstrate using data investigations.
This shift in beliefs about statistics appeared in teachers’ responses to the follow-up survey, where
one teacher suggested, "The MOOC prompted me to rethink what sorts of questions I ask
students, shifting more to statistical reasoning questions and away from statistical processes.” One
teacher summarized what she learned in the course.
The statistics that I got in high school and higher education was only based on direct
teaching of formulas and drill learning. After going through all the simulations,
videos, and technological tools that are provided here I came to realize what
statistics really is. It is much more than just the ability to read graphs or compute
numerical results, but it is more about quantitative reasoning, figuring/analyzing
the messy data, and building critical arguments.
The second theme that emerged is that teachers recognized that engaging in statistics is
enhanced with technology. For some teachers, using statistical software was also intertwined with
using real data. An example of this perception is expressed by a teacher who stated on a follow-up
survey “I use more technology throughout my semester to help intergrate [sic] my lessons that help
intertwine real world applications.” Another teacher joined in a discussion in Unit 4 started by
another participant to express gratitude (subject: “a Big thank you”) for the course focused on how
a particular Dive into Data experience in Unit 2 had made an impact for her.

Online Learning Journal – Volume 24 Issue 1 – March 2020 5 17


Examining How Online Professional Development Impacts Teachers’ Beliefs About Teaching Statistics

I have really enjoyed getting to know the Tuva labs website [an online graphing
tool] and exploring some of the activity worksheets. I created box plots from the
Pixar and Dreamworks data and got the students to try and discuss the different
comparisons using the SASI levels of sophistication with median, range, IQR and
LQ and UQ.
For this teacher, a combination of learning about new technologies to use in statistics and
applying her understandings of the SASI framework was assisting her in creating new experiences
for her students. There were certainly several posts where teachers explicitly described how they
were using technology to assist themselves in learning new approaches and how they hoped to use
these in their classroom. For example, a teacher in Unit 4 described:
Last year, I created an account with tuvalabs, but never looked into it. So I took the
data from census at schools and was able to upload into tuva labs. There, I was able
to create dot plots, bar graphs, histograms, and more. The stats section is coming
up here at the end of November, and I'm excited to have my students be able to use
this free resource.
While she had previously accessed Tuva, it was her experience using Tuva in the TSDI course with
Census at School data that gave her the knowledge she needed to plan to implement this with
students in her practice.
A third theme that emerged was that engaging in statistics requires the use of real (and messy)
data, and in many cases datasets that included bigger data (more attributes and cases). One
participant shared in a follow-up survey,
the data emphasis was what I really took away from the course. There were little
tidbits here and there I have "borrowed" to polish what I do—but by far I am most
proud of creating more concrete data sets for my students to actually experience
(say, the left/negative skew effect) rather than just showing a picture.
Teachers recognized the need to use data that included a large number of cases and multiple
attributes (numerical and categorical) and that may require some cleaning (e.g., “getting real/messy
data that needs to be cleaned is an important exercise in itself"). Using real data was one idea to
which experienced teachers contributed heavily in the community discussions. These teachers were
reaffirming their pedagogies and sharing what they do so that others could learn from them. Consider how this
classroom teacher gave glimpses into her practice in one of the longest discussion
threads in Unit 2, a 48-post thread on the subject "Classroom experiments."
I think that by having these meaningful discussions about the real world
implications of statistics is what makes it real for them. Using real data sets and
showing them how it relates to the world around them is not only meaningful, but
is what statistics truly is. Use contexts that are real for your students. I had a class
last year that was made up mostly of students who played sport. I used lots of sports
datasets which are easily accessible and full of stats. This year I had a lot of students
passionate about government and politics so I used a lot of governmental datasets
This extended discussion is a strong example of how the online community allowed the teachers
to learn from one another by discussing issues that emerged when they did classroom experiments,
some sharing types of experiments they have tried, and others reflecting on their newfound bravery
to try these types of experiments in their classroom.
The final theme is that teachers began to realize that statistical thinking and understanding
develops across a continuum and that they could use this thinking to inform instructional decisions,
use of tasks, and assessment of students. For example, one teacher indicated that, “The idea of the
4-process cycle and the different levels for different ages of each process, has helped me a lot. I
understand more and feel I am a better teacher to my students.” Considering statistics as developing
across levels was a cornerstone aspect of the SASI framework and seemed to take hold for many
teachers. After commenting on students’ work in a video in Unit 3 and describing what levels she
thought students may be working at on a task, another teacher noted,
… with the SASI framework, I like how it never mentions age or grade level. I feel
it's a continuum that students, depending on the context, can move back and forth
between. If they get to a harder problem, they may not know how to exactly collect
the data without bias and ensuring randomness. But with an easier experiment, that
may be more obvious to them.
Some teachers indicated they would use specific tasks from the course with their own
students, suggesting they would implement tasks that included more student engagement with the
four phases of a statistical investigation. For example, one teacher said, “I have done a lot of labs
with my students but I really loved this one [coke vs. pepsi] to try. I can't wait to see how they
react with this one.” Other teachers showed evidence of applying more general pedagogical
knowledge about implementing tasks that involve the investigation cycle and can develop
statistical habits of mind. Some indicated they would utilize the task design resource in selecting
and/or adapting and implementing tasks in their classrooms that could support students at different
levels.

Discussion and Conclusion


Researchers have yet to agree on the most appropriate ways to measure participants’ progress
and outcomes as they engage in MOOCs (Perna et al., 2014). Despite these inconsistencies, a
common way to evaluate the impact of MOOCs has been to report completion rates or retention
rates. Koller, Ng, and Chen (2013) define retention rate, or completion rate, as the fraction of
enrolled participants who successfully complete the course according to criteria established by the
instructor. Perna et al. (2014) define retention rate as the number of people who accessed the last
module of the MOOC, divided by the number of participants who accessed the first module. While
definitions of both vary throughout the literature, completion rates typically range between 5%
and 19% of registrants (Ho et al., 2014; Koller et al., 2013; Perna et al., 2014). Recall that Jacobsen
(2019) found that educators who had accessed only a few resources in the first two modules of an
online PD reported having meaningful interactions with those resources and described how their
engagement impacted their practices. Loizzo et al. (2017) found that one measurement of success
of a MOOC was that participants gained new resources. The major findings from our study are
discussed below to provide broader implications for research and design in online PD.
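To make the two operationalizations concrete, here is a minimal Python sketch; the function names
are ours, and the counts are illustrative, echoing the enrollment figures reported in the next
paragraph:

    def completion_rate(completers, enrollees):
        # Koller, Ng, and Chen (2013): fraction of enrolled participants who
        # meet the instructor-defined completion criteria.
        return completers / enrollees

    def retention_rate(accessed_last, accessed_first):
        # Perna et al. (2014): participants who accessed the last module,
        # divided by participants who accessed the first module.
        return accessed_last / accessed_first

    # Illustrative counts: roughly 300 teachers engaged in Unit 1 and 31% of
    # them (about 93) completed the course through Unit 5.
    print(retention_rate(accessed_last=93, accessed_first=300))  # 0.31

Either way, the numerator and denominator must be chosen explicitly, which is one reason reported
rates vary so widely across studies.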
In just this one course offered over a 15-week period, almost 300 classroom teachers engaged
in at least the first unit, with 31% of those teachers completing the course through Unit 5. Thus,
the MOOC-Ed succeeded in reaching and engaging K-12 teachers, with evidence of high
engagement by many with different resources and active participation in discussion forums. This
completion rate is higher than those reported for most other MOOCs (e.g., Perna et al., 2014). We know
that not everyone intended to complete the course, but some teachers who only participated in Unit
1 engaged in discussions, responded to follow-up surveys, and showed evidence of reflections
based on triggers such as expert video discussions about how statistics is different than
mathematics and seeing students in a video using the Gapminder tool (all introduced in Unit 1).
By using data from discussion forums, end-of-unit surveys, and follow-up surveys that included
anyone who enrolled in the course, we were able to include perspectives of teachers who may have
only engaged with a few resources. Thus, our approach to data sources expands how Jacobsen
(2019) examined ways online PD can impact educators’ beliefs, perspectives, and practices.
One challenge in designing online PD for teachers is identifying how to leverage stimuli that
have the potential to act as triggers to impact teachers' beliefs about teaching. For those facing
this challenge, our identification of triggers can provide guidance as they design and implement
online PD efforts for teachers. While we have no evidence (yet) that teachers' experiences in a
brief online PD in teaching statistics have impacted actual teaching practices and students'
learning, our research indicates that the purposeful design elements of the course were successful
in prompting critical reflection through certain triggers.
Having a framework that can guide teachers’ ability to plan tasks and assess students can provide
a way for teachers to understand a bigger picture of teaching the content beyond what is in their
particular grade-level curriculum. Active learning opportunities to experience new technology
tools and engaging tasks were a critical trigger. PD for teachers should include opportunities to
engage more deeply, and perhaps in a different way, with content teachers are expected to teach.
Designers of online PD need to continue to find ways to engage teachers in such active learning
opportunities.
The use of two types of videos that appeared as triggers is important to consider in future
designs. For those who work in teacher education, it is not surprising to hear that teachers can learn
much from watching and reflecting on videos depicting students’ thinking on tasks and teachers’
pedagogical moves (e.g., Chazan, 2018). It may be surprising, though, that teachers learn a lot from
videos of conversations between expert educators in a domain. In a typical face-to-face PD, there
are generally one or two leaders who engage teachers in activities and present material.
Current practices in online PD may tend to feature a single instructor presenting critical
information in lecture-style videos. Rarely do teachers get an opportunity to hear a discussion
about critical issues related to teaching and learning. While each unit in the TSDI course had a
brief video of the instructor introducing key ideas in the unit, these were rarely brought up in
discussions. The exception was a video in Unit 3 where the instructor illustrated the SASI
framework with examples from students' work. Quite simply, hearing from the instructor alone in
videos did not seem impactful; however, hearing the instructor engage in discussions with experts
in the field (see the sample expert videos linked in the Appendix) served as a trigger for educators
to experience cognitive dissonance about their own ideas, which they in turn seemed willing to
discuss in forums.
The classroom teachers not only learned from expert opinions, but also from the voices and
experiences of other teachers and participants with whom they interacted in the course. This
is similar to findings from Loizzo et al. (2017) where some MOOC participants expanded their
world views by engaging in forums where they shared their personal experiences. In other research
on the posting behaviors of participants in this course, Bonafini (2018) found that one classroom
teacher and three participants who were not classroom teachers served as super-posters,
contributing greatly to conversations by starting threads and replying to many posts by others.
Peer voices along with the voices of the instructional staff in the forums acted as additional
resources to support collegiality and practical exchange of ideas outside of teachers’ physical
school environment (Borko, 2004; Mackey & Evans, 2011). Well-designed discussion prompts
focused on pedagogical issues and an open forum for sharing indeed provided opportunities for
teachers to express their critical reflections and share in development of new classroom practices.
Many teachers reported increasing their confidence to teach statistics and appeared to move
towards beliefs that we should engage students through investigations, not merely teach them
mathematical tools to apply to numbers devoid of context. Thus, our results in this online context
align with others who have done PD about teaching STEM content in face-to-face contexts (De
Vries et al., 2013; Eichler, 2011; Seung et al., 2011). Like the MOOC participants in Loizzo et al.'s
(2017) study, in which one measure of success was whether participants could apply what they
had learned, our teachers were attracted to a framework and made sense of how to apply it to their
practice. Teachers learned a lot about what it means to engage in statistics by doing it themselves,
as well as from examining students’ thinking in videos. Is any of this a big surprise? Perhaps not
to experienced teacher educators. However, the key is to include these types of learning
opportunities in online PD, whether it is offered to a local group or is massive and open to teachers
around the world. To help answer the call from Marrongelle et al. (2013), our research also supports
the idea that online courses that emphasize (a) self-directed learning, (b) peer-supported learning,
(c) job-connected learning, and (d) learning from multiple voices can be effective designs for online
PD in STEM content areas (e.g., teaching statistics) that need wide-scale efforts to impact the
perspectives and practices of classroom teachers.
Of course, our research is limited by the fact that we did not conduct interviews, collect
artifacts of practice (e.g., lesson plans or tasks), or carry out classroom observations of a subset of
teachers. Such methods should be included in future studies and would provide more nuanced and
direct evidence of whether teachers’ espoused changes in perspectives and beliefs, and intentions
for changes in their practices, were actually realized in classrooms.

Acknowledgements
The design, implementation, evaluation, and research of MOOC-Ed courses at the Friday
Institute for Educational Innovation was partially funded by the William and Flora Hewlett
Foundation. Any opinions, findings, and recommendations expressed are those of the authors, and
do not necessarily reflect the views of the Foundation. Thank you to Dung Tran, Theresa Gibson,
Alex Dreier, and Glenn Kleiman for their contributions in course design. A preliminary version of
this manuscript appeared in the 2017 proceedings of the North American Chapter of the
International Group for the Psychology of Mathematics Education (PME-NA) conference.

References
Bonafini, F. C. (2018). Characterizing super-posters in a MOOC for teachers' professional
development. Online Learning, 22(4), 89–108. https://doi.org/10.24059/olj.v22i4.1503
Borko, H. (2004). Professional development and teacher learning: Mapping the terrain.
Educational Researcher, 33(8), 3–15.
Boston, M. D., & Smith, M. S. (2011). A ‘task-centric approach’ to professional development:
Enhancing and sustaining mathematics teachers’ ability to implement cognitively
challenging mathematical tasks. ZDM, 43(6–7), 965–977.
Burrill, G., & Biehler, R. (2011). Fundamental statistical ideas in the school curriculum and in
training teachers. In C. Batanero, G. Burrill, C. Reading, & A. Rossman (Eds.), Teaching
statistics in school mathematics: Challenges for teaching and teacher education (pp. 57–
69). Springer.
Capps, D. K., & Crawford, B. A. (2013). Inquiry-based instruction and teaching about nature of
science: Are they happening? Journal of Science Teacher Education, 24(3), 497–526.
https://doi.org/10.1007/s10972-012-9314-z
Chazan, D. (2018). Considering what we want to represent. In O. Buchbinder & S. Kuntze
(Eds.), Mathematics teachers engaging with representations of practice (pp. 163–167).
Springer.
Clarke, D., & Hollingsworth, H. (2002). Elaborating a model of teacher professional growth.
Teaching and Teacher Education, 18(8), 947–967.
Conference Board of the Mathematical Sciences (2012). The mathematical education of teachers
II. American Mathematical Society and Mathematical Association of America.
https://www.cbmsweb.org/archive/MET2/met2.pdf
Cuoco, A., Goldenberg, P. E., & Mark, J. (1996). Habits of mind: An organizing principle for
mathematics curricula. Journal of Mathematical Behavior, 15(4), 375–402.
Darling-Hammond, L., Wei, R. C., Andree, A., Richardson, N., & Orphanos, S. (2009).
Professional learning in the learning profession. National Staff Development Council.
De Vries, S., Jansen, E. P. W. A., & Van De Grift, W. J. C. M. (2013). Profiling teachers’
continuing professional development and the relation with their beliefs about learning and
teaching. Teaching and Teacher Education, 33, 78–89.
Eichler, A. (2011). Statistics teachers and classroom practices. In C. Batanero, G. Burrill, & C.
Reading (Eds.), Teaching statistics in school mathematics: Challenges for teaching and
teacher education (pp. 175–186). Springer.
Franke, M. L., Carpenter, T. P., Levi, L., & Fennema, E. (2001). Capturing teachers’ generative
change: A follow-up study of professional development in mathematics. American
Educational Research Journal, 38(3), 653–689.
Franklin, C., et al. (2007). Guidelines for assessment and instruction in statistics education
(GAISE) Report: A Pre-K-12 curriculum framework. American Statistical Association.
https://www.amstat.org/asa/files/pdfs/GAISE/GAISEPreK-12_Full.pdf
Franklin, C., Bargagliotti, A. E., Case, C. A., Kader, G. D., Schaeffer, R. L., & Spangler, D. A.
(2015). The statistical education of teachers. American Statistical Association.
Granger, E. M., Bevis, T. H., Saka, Y., Southerland, S. A., Sampson, V. & Tate, R. L. (2012).
The efficacy of student-centered instruction in supporting science learning. Science, 338,
105–108.
Garet, M. S., Porter, A. C., Desimone, L., Birman, B. F., & Yoon, K. S. (2001). What makes
professional development effective? Results from a national sample of teachers. American
Educational Research Journal, 38(4), 915–945.
Grossman, P. (1990). The making of a teacher. Teacher’s College Press.
Harrell-Williams, L. M., Sorto, M. A., Pierce, R. L., Lesser, L. M., & Murphy, T. J. (2015).
Identifying statistical concepts associated with high and low levels of self-efficacy to teach
statistics in middle grades. Journal of Statistics Education, 23(1).
http://jse.amstat.org/v23n1/harrell-williams.pdf
Herbst, P., Chazan, D., Chen, C. L., Chieu, V. M., & Weiss, M. (2011). Using comics-based
representations of teaching, and technology, to bring practice to teacher education courses.
ZDM, 43(1), 91–103.
Ho, A. D., Reich, J., Nesterko, S., Seaton, D. T., Mullaney, T., Waldo, J., & Chuang, I. (2014).
HarvardX and MITx: The first year of open online courses (HarvardX and MITx Working
Paper No. 1). https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2381263
Jacobsen, D. Y. (2019). Dropping out or dropping in? A connectivist approach to understanding
participants' strategies in an e-Learning MOOC pilot. Technology, Knowledge and Learning,
24(1), 1–21. https://doi.org/10.1007/s10758-017-9298-z
Kleiman, G., Wolf, M.A., & Frye, D. (2015). Educating educators: Designing MOOCs for
professional learning. In Kim, P. (Ed.), Massive open online courses: The MOOC revolution
(pp. 117–144). Routledge.
Koller, D., Ng, A., & Chen, Z. (2013). Retention and intention in massive open online courses:
In depth. EDUCAUSE Review Online, 48(3), 62–63.
https://er.educause.edu/articles/2013/6/retention-and-intention-in-massive-open-online-
courses-in-depth
Lee, H. S., & Stangl, D. (2015). Professional development MOOCs for teachers of statistics in
K-12. Chance, 28(3), 56–63. http://dx.doi.org/10.1080/09332480.2015.1099368
Lee, H. S., & Stangl, D. (2017). Design and implementation of professional development
MOOCs for teachers of statistics. AMSTAT News (Special issue on Statistics Education),
September. http://magazine.amstat.org/blog/2017/09/01/pd_teachers/
Lovett, J. N., & Lee, H. S. (2017). New standards require teaching more statistics in high school:
Are preservice mathematics teachers ready? Journal of Teacher Education, 68(3), 299–311.
https://doi.org/10.1177/0022487117697918
Loizzo, J., Ertmer, P. A., Watson, W. R., & Watson, S. L. (2017). Adults as self-directed and
determined to set and achieve personal learning goals in MOOCs: Learners’ perceptions of
MOOC motivation, success, and completion. Online Learning, 21(2).
http://dx.doi.org/10.24059/olj.v21i2.889
Luebeck, J., Roscoe, M., Cobbs, G., Diemert, K., & Scott, L. (2017). Re-envisioning
professional learning in mathematics: Teachers’ performance, perceptions, and practices in
blended professional development. Journal of Technology and Teacher Education, 25(3),
273–299.
Mackey, J., & Evans, T. (2011). Interconnecting networks of practice for professional learning.
International Review of Research in Open and Distance Learning, 12(3), 1–18.
Marrongelle, K., Sztajn, P., & Smith, M. (2013). Scaling up professional development in an era
of common state standards. Journal of Teacher Education, 64(3), 202–211.
Mezirow, J. (2009). Transformative learning theory. In J. Mezirow & E. W. Taylor (Eds.),
Transformative learning in practice: Insights from community, workplace, and higher
education (pp. 18–31). Jossey-Bass.
National Governors Association Center for Best Practices & Council of Chief State School
Officers. (2010). Common core state standards for mathematics. Author.
National Research Council. (2000). How people learn: Brain, mind, experience and school. The
National Academies Press.
Perna, L. W., Ruby, A., Boruch, R. F., Wang, N., Scull, J., Seher, A., & Evans, C. (2014).
Moving through MOOCs: Understanding the progression of users in massive open online
courses. Educational Researcher, 43(9), 421–432.
Peters, S. A. (2014). Developing understanding of statistical variation: Secondary statistics
teachers’ perceptions and recollections of learning factors. Journal of Mathematics Teacher
Education, 17(6), 539–582.
Pierce, R., & Chick, H. (2011). Teachers’ beliefs about statistics education. In C. Batanero, G.
Burrill, & C. Reading (Eds.). Teaching statistics in school mathematics: Challenges for
teaching and teacher education (pp. 151–162). Springer.
Renninger, K. A., Cai, M., Lewis, M. C., Adams, M. M., & Ernst, K. L. (2011). Motivation and
learning in an online, unmoderated, mathematics workshop for teachers. Educational
Technology Research and Development, 59(2), 229–247.
Seaton, D. T., Coleman, C., Daries, J., & Chuang, I. (2015). Enrollment in MITx MOOCs: Are
we educating educators? EDUCAUSE Review.
https://er.educause.edu/articles/2015/2/enrollment-in-mitx-moocs-are-we-educating-
educators
Seung, E., Park, S., & Narayan, R. (2011). Exploring elementary pre-service teachers’ beliefs
about science teaching and learning as revealed in their metaphor writing. Journal of Science
Education and Technology, 20(11), 703–714.
Sherin, M. G., & van Es, E. A. (2005). Using video to support teachers' ability to notice
classroom interactions. Journal of Technology and Teacher Education, 13(3), 475–491.
Stein, M. K., & Smith, M. S. (1998). Mathematical tasks as a framework for reflection: From
research to practice. Mathematics Teaching in the Middle School, 3(4), 268–275.
Stipek, D. J., Givvin, K. B., Salmon, J. M., & MacGyvers, V. L. (2001). Teachers' beliefs and
practices related to mathematics instruction. Teaching and Teacher Education, 17, 213–226.
Strauss, A., & Corbin, J. (1998). Basics of qualitative research: Techniques and procedures for
developing grounded theory. Sage.
Sztajn, P. (2011). Standards for reporting mathematics professional development in research
studies. Journal for Research in Mathematics Education, 42(3), 220–236.
Treacy, B., Kleiman, G., & Peterson, K. (2002). Successful online professional development.
Leading & Learning with Technology, 30(1), 42–47.
Vrasidas, C., & Zembylas, M. (2004). Online professional development: Lessons from the field.
Education + Training, 46(6/7), 326–334.
Wild, C. J., & Pfannkuch, M. (1999). Statistical thinking in empirical enquiry. International
Statistical Review, 67(3), 223–248.
Wilson, P. H., Lee, H. S., & Hollebrands, K. F. (2011). Understanding prospective mathematics
teachers’ processes for making sense of students’ work with technology. Journal for
Research in Mathematics Education, 42(1), 39–64.
Yasar, S., Baker, D., Robinson-Kurpius, S., Krause, S., & Roberts, C. (2006). Development of
a survey to assess K–12 teachers' perceptions of engineers and familiarity with teaching
design, engineering, and technology. Journal of Engineering Education, 95(3), 205–216.

Appendix A

Multimedia resources for participants to learn elements of SASI framework in Units 2 and 3.

Resources accessed through the course library resource database

Statistical habits of mind
Multimedia elements: text with color-coded diagrams
Link: https://fi-courses.s3.amazonaws.com/tsdi/unit_2/Essentials/Habitsofmind.pdf

Describing the SASI framework
Multimedia elements: text with diagrams and color-coordinated tables
Link: https://s3.amazonaws.com/fi-courses/tsdi/unit_3/SASI%20Framework.pdf

Illustrating the SASI framework
Multimedia elements: "talking head" video with diagrams and animated titles, interspersed with slides with voice overlay (12:32 min)
Link: https://youtu.be/XTobbqSpUZc

Interactive diagram of SASI framework
Multimedia elements: webpage with framework diagram and pop-up descriptions of different aspects of the framework
Link: https://s3.amazonaws.com/fi-courses/tsdi/sasi_framework/index.html

Considerations for design and implementation of statistical tasks
Multimedia elements: text with tables that applies the SASI framework to task design
Link: http://fi-courses.s3.amazonaws.com/tsdi/unit_3/CDIST.pdf

Resources accessed through video embedded on a course page

Expert panel discussion on the investigation cycle, differences between mathematics and statistics, and statistical habits of mind
Multimedia elements: video with instructor and 3 experts having a discussion (16:39 min)
Link: https://youtu.be/Te5EyDD-QE8

Expert panel discussion on task design
Multimedia elements: video with instructor and 3 experts having a discussion (18:32 min)
Link: https://youtu.be/xG-5ockl7Tg

Expert interview on development of the 2007 GAISE K-12 framework
Multimedia elements: video interview between instructor and expert (7:06 min)
Link: https://youtu.be/QSEPd7afQRo

Expert interview on developing the concept of mean across levels
Multimedia elements: video of instructor interviewing expert, with interspersed slides (22:07 min)
Link: https://youtu.be/h5t0V9qe82k

Working with a dynamic simulation tool (to explore the Schoolopoly task)
Multimedia elements: video with animated depiction of students working on the task, with a human reading the task, real student voices, and images and video of computer work (4:24 min)
Link: https://youtu.be/VuFjTaGgsCw

Multiple levels of sophistication (with the Schoolopoly task)
Multimedia elements: video with animated depiction of a teacher introducing the task and three student pairs working on it, shown with computer images or written work (voices automated) (5:09 min)
Link: https://youtu.be/tdLx7eMecB4

Sample Dive into Data experiences

Dive into Data about vehicles using CODAP
Description: A random sample of 300 vehicles manufactured in 2015 is provided to explore questions about relationships between fuel economy in the city and highway, types of transmission, hybrid vehicles, annual fuel cost, and number of cylinders.
Link: https://codap.concord.org/releases/latest/static/dg/en/cert/index.html#shared=16202

Dive into Data about fairness of dice for the Schoolopoly game, with a GeoGebra simulation
Description: Given a simulation of dice produced by six companies, investigate whether or not the die made by each company is fair. Collect data through the simulation and support a decision as to whether to recommend that dice be purchased from each company.
Die Roll Simulation: https://www.geogebra.org/m/KBAEuEJh
PDF of activity: https://s3.amazonaws.com/fi-courses/tsdi/unit_3/Schoolopoly%20Task.pdf
Facilitation Matters: Instructor Perception of Helpfulness of Facilitation Strategies in Online Courses
Florence Martin, Chuang Wang, and Ayesha Sadaf
University of North Carolina, Charlotte

Abstract
Online course facilitation is critical to the success of online courses. Instructors use various
facilitation strategies in online courses to engage students. One hundred instructors were surveyed
on their perception of helpfulness of twelve different facilitation strategies used in online courses
to enhance instructor presence, instructor connection, engagement, and learning. Instructors’
timely response to questions and instructors’ timely feedback on assignments/projects were rated
the highest in three of four constructs (instructor presence, engagement, and learning). For
instructor connection, ability to contact the instructor in multiple ways was rated the highest.
Interactive visual syllabi of the course were rated the lowest in all four constructs. In the open-ended
comments, group projects and synchronous sessions were rated helpful. Descriptive statistics for
each construct by gender, delivery method, and course level taught are presented. Significant
differences were found by gender, but analysis of variance failed to detect differences by primary
delivery method or course level taught.

Keywords: facilitation strategies, instructor presence, online learning, instructor perception, instructor connection

Martin, F., Wang, C., & Sadaf, A. (2020). Facilitation matters: Instructor perception of
helpfulness of facilitation strategies in online courses. Online Learning, 24(1), 28-49.
https://doi.org/10.24059/olj.v24i1.1980

Facilitation Matters:
Instructor Perception of Helpfulness of Facilitation Strategies in Online Courses
According to the National Center for Education Statistics (2017), almost twenty million
students are enrolled in online courses, and enrollment is likely to grow. By interviewing award-winning
online instructors, Martin, Budhrani, Kumar, and Ritzhaupt (2019) found that online
instructors' roles can be categorized as facilitator, course designer, course manager, subject matter
expert, and mentor. In this study, the online instructor role of facilitator is examined (Berge,
1995; Pappas, 2014). Online facilitation is described as being present and available, sharing expertise
online, and modeling for students what it means to participate in an online course (Martin,
Budhrani, Kumar & Ritzhaupt, 2019). Gustafson and Gibbs (2000) state that successful online
facilitators need to learn strategies to humanize the online course and identify new ways to engage
learners in constructing meaning. Online instructors use multiple strategies to facilitate student
learning and critical thinking skills (Richardson et al., 2015; Schindler & Burkholder, 2014), to
improve students’ sense of community (Rovai, 2007), and to promote students’ connectedness and
learning (Shea, Li, & Pickett, 2006). Berge's (1995) Instructor Roles Model focused on the
functions of instructors, whose role shifted from subject expert to course facilitator, and
categorized facilitation into four categories: pedagogical, social, managerial, and technical.
Previous literature discussed various types of facilitation strategies in online settings,
such as instructors' feedback on students' assignments (Badiee & Kaufman, 2014; Thiele, 2003),
responses to students' questions (Sheridan & Kelly, 2010), announcements (Ko & Rossen, 2010),
questioning (Wang, 2014), and video-based course introductions (Jones, Naugle & Kolloff, 2008).
Few studies have looked at students' perceptions of facilitation strategies in online environments
and outcomes (Martin, Wang & Sadaf, 2018; Shea, Li, & Pickett, 2006; Hew, 2015). Martin, Wang
and Sadaf (2018) reported that instructors' timely responses and feedback were highly valued by
students for establishing instructor presence, instructor connectedness, engagement, and learning.
Hosler and Arend (2012) found that course organization and timely specific feedback improved
students’ participation. Shea, Li, and Pickett (2006) added that instructors’ questioning and
feedback have a positive impact on students' perceptions of learning and connectedness.
However, few studies have examined instructor perceptions regarding facilitation
strategies in online classes and their impact on students’ learning achievements. Cavanaugh and
Song (2014) compared instructor and students’ perspectives regarding audio feedback and written
feedback and found that instructors had mixed feelings about giving feedback using audio, whereas
students welcome audio feedback. Borup, West, and Thomas (2015) surveyed both students and
instructors on their perceptions of text and video feedback in blended courses and discovered that
both students and instructors believed that feedback in a written form is more efficient and
organized whereas video feedback facilitated supportive communication. Santilli and Beck (2005)
examined graduate faculty perceptions of online learning and found that about half of the
instructors considered peer interaction as the most significant feature of online discussion and
instructor feedback as the second most important feature. Hsiao (2012) discovered that online
teachers use several strategies to facilitate online communication, including providing clear
guidelines, rubrics and examples for online discussions; showing instructor presence by
monitoring students’ discussion; and absorbing other strategies that facilitate online discussion.
Although these studies identified faculty perceptions regarding a few facilitation strategies, online
faculty need to be knowledgeable in the use of facilitation strategies in order to maintain high
academic standards in online courses (Bigatel & Williams, 2015; Al-Salman, 2011).
While specific online facilitation strategies have been examined by other researchers,
faculty perceptions of the helpfulness of a variety of these facilitation strategies, and the factors
associated with those perceptions, have not been studied in online settings. In order to address this
limitation, Berge's (1995) Instructor Roles Model was used as a comprehensive, validated model of
online instruction to identify a variety of facilitation strategies based on the most important roles of online
instructors as learning facilitators. In this study, we examine (a) which facilitation strategies
instructors perceive to be most and least helpful in establishing instructor presence, instructor
connection, engagement, and learning in online courses and (b) which factors (gender, delivery
method, level taught, discipline) are associated with instructor perceptions of facilitation strategies
in online teaching.

Theoretical Framework for Online Course Facilitation
Berge (1995) categorizes instructor facilitation strategies into four functions: Managerial,
Social, Pedagogical and Technical (Figure 1). These instructor facilitation roles were initially
described within the online discussion context, but later Berge (2008) changed the roles to focus
on broader online learning environments that are “informal, collaborative, reflective learning, with
user-generated content” (p. 412). Berge (2008) suggested that some functions of instructors and
facilitation strategies may overlap or can be categorized in more than one group.

Figure 1. Online Facilitation framework (Berge, 1995): Online Course Facilitation at the center, surrounded by its four functions (Managerial, Pedagogical, Social, and Technical).


Pedagogical
In the pedagogical role, instructors facilitate students’ learning and sustain their
participation and motivation in an online course (Bawane & Spector, 2009). Pedagogical
facilitation strategies include having clear objectives, encouraging participation, promoting
conversations, making the course material relevant, and encouraging contributions (Berge, 1995).
Instructors also model effective learning and keep discussions on track, provide special knowledge
and perceptions, combine course content, and maintain group harmony (Rohfeld & Hiemstra,
1995). To facilitate and focus effective discussions, instructors use questions and probes (Berge,
2008). Eskey and Schulte (2010) found that instructors’ prompt responses to questions in the
discussion and via email are two important facilitation strategies for students to be successful in
online courses. Swan (2001) found that student-to-instructor interaction and active discussions
significantly impact students' satisfaction and their perceived learning of the course material in
asynchronous online environments.
Managerial
In the managerial role, instructors design the logistics of the course. Some of the
managerial strategies include carrying out administrative responsibilities, providing procedural
leadership, planning and developing course materials, organizing the course, deciding due dates, and
pacing the online discussions (Anderson, Rourke, Garrison, & Archer, 2001; Berge, 1995). Wei
and Chen (2012) suggested that online instructional design should include a roadmap to effectively
guide the learner through the course to foster a positive learning environment. In addition to
facilitation and scaffolding, instructors should focus on organizational structure, such as learning
objectives, due dates, and expectations to facilitate effective online learning (González-
Sanmamed, Muñoz-Carril, & Sangrà, 2014; Richardson et al., 2015). Research indicates that
instructors' facilitation, in terms of prompt responses to questions and timely feedback on
assignments, is important in creating instructor presence, engaging students in their courses,
and facilitating higher levels of learning (Hodges & Cowan, 2012; Martin, Wang & Sadaf, 2018;
Sheridan, & Kelly, 2010). Ko and Rossen (2010) suggested that regular announcements in an
online course can be used to get students' attention and remind them about course activities
during the semester. In online courses, instructors sending weekly reminders about activities and
assignments that are due helps students manage their time effectively (Kelly, 2014).
Social
In the social role, instructors encourage and promote meaningful human relationships for
working together in a mutual cause. Some of the social facilitation strategies include using
introductions to help build a sense of community, facilitating interactivity, modeling discussion
behaviors, and reinforcing online etiquette (Berge, 1995). In online learning,
promoting student-student or student-instructor relationships, developing cohesive groups, and
helping students work together for their shared benefit are helpful to the success of online learning
activities (Berge, 2008). Ko and Rossen (2010) suggested strategies for instructors to design and
facilitate the discussions that include narrowing down topics, starting topic threads, responding to
discussion posts, and mentioning student names.
Jones et al. (2008) found that video-based instructor introductions helped instructors connect
with students from the start of the course, which contributed to students' growth in the course.
Researchers suggest that students demonstrate high levels of cognitive presence in discussions
facilitated by well-structured discussions and discussion questions (Oh & Kim, 2016; Richardson,
Sadaf & Ertmer, 2012; Sadaf & Olesova, 2017). Lowenthal (2010) recommended that instructors
create a space to interact socially with students, engage them, and provide timely feedback.
Technical
In the technical role, instructors facilitate a transparent technology environment so that the
learners can focus on the academic tasks and learning activities (Berge, 1995). Technical
facilitation strategies include providing resources, materials, and other tools to facilitate learning
within the online course. Berge (2008) suggested that it is important for the facilitator to help
learners become comfortable with the information and communication technologies being used
within the online course. Research suggests that using multimedia tools in online courses increases
student learning and engagement. Synchronous tools provide opportunities for instructors
and students to interact with each other using various features, including audio, video, text chat,
interactive whiteboard, and applications (Martin & Parker, 2014). Draus, Curran, and Trempus
(2014) found positive relationships between content created by instructors in the form of videos
and student engagement, satisfaction, and retention. Instructor-created videos help students grasp
the instructional content better and connect with their instructors (Borup et al., 2012; Rose, 2009).
Table 1 lists the 12 facilitation strategies proposed by Martin, Wang and Sadaf (2018), categorized
by Berge's framework.

Table 1
Facilitation Strategies in Online Courses (Martin, Wang & Sadaf, 2018)

Social
● Video-based instructor introduction
● Instructor being present in the discussion forums
● Being able to contact the instructor in multiple ways

Managerial
● Video-based course orientation
● Instructor's timely response to questions
● Instructor's weekly announcements to the class

Pedagogical
● Instructor's timely feedback on assignments/projects
● Instructor's feedback using various modalities
● Instructor's personal response to student reflections

Technical
● Instructor's use of various features in synchronous sessions to interact with students
● Interactive visual syllabi of the course
● Instructor-created content in the form of short videos/multimedia

Helpfulness of online facilitation in this manuscript is examined through four variables:
instructor presence, instructor connection, engagement, and learning. The following sections
discuss the literature on how each facilitation strategy helped the instructor be present in the classroom
(instructor presence), how the facilitation strategy helped the instructor get to know the students
(instructor connection), how the facilitation strategy helped the instructor engage the students in
the online course (engagement), and how the facilitation strategy helped the instructor facilitate
learning of the content (learning).
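Viewed as data, the instrument therefore yields, for each respondent, a rating of every one of the
12 strategies on each of the four constructs. The sketch below illustrates that structure in Python;
the names are ours for illustration and are not the survey's actual variable names:

    CONSTRUCTS = ("presence", "connection", "engagement", "learning")

    # Each response maps a strategy to its ratings on the four constructs,
    # e.g., {"timely response to questions": {"presence": 5, ...}, ...}.
    def construct_means(responses, strategy):
        # Mean helpfulness rating of one strategy on each construct,
        # averaged over all respondents.
        return {
            c: sum(r[strategy][c] for r in responses) / len(responses)
            for c in CONSTRUCTS
        }

Ranking strategies by these means is all that is needed to answer the "most and least helpful" part
of the first research question.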
Instructor Presence
According to Richardson et al. (2015), instructor presence is defined as the “specific
actions and behaviors taken by the instructor that project him/herself as a real person” (p. 259).
Within the context of online instruction, the instructor's role can be seen as more that of a facilitator
than of a teacher or lecturer (Richardson & Swan, 2003). With a focus on the role of facilitator,
instructor presence is described as the instructor validating their personal identity by
acknowledging and performing their role through various strategies (Martin, Wang & Sadaf,
2018). Research has demonstrated that instructor presence influences students' affective learning,
cognition, and motivation (Baker, 2010), students' satisfaction (Brinkerhoff & Koroghlanian, 2007),
and students' sense of community (Sheridan & Kelly, 2010). Vesely, Bloom, and Sherlock (2007)
stated that receiving frequent, timely, and constructive feedback from the instructor is an important
element of instructor presence for online students. Richardson,
Besser, Koehler, Lim, and Strait (2016) found that instructors perceived their presence as an
important factor in online courses. Mandernach, Gonzales, and Garrett (2006) studied instructor
interactivity and establishing standards to enhance instructor presence in online discussions.
Results showed that the majority of instructors believed online instructors’ participation in online
discussions is important.
Instructor Connectedness
According to Gallien and Oomen-Early (2008), “Connectedness refers to a person’s sense of
belonging or presence, feelings of support, and level of communication/interaction with the
instructor. Students who perceive a sense of connectedness with their instructor are likely to feel
satisfied and perform well in their online courses” (p. 468). Similarly, D’Alba (2014) believes
connectedness is the “perceived closeness between the student and instructor as well as the
instructor and student” (p. 8). Regarding connectedness and its psychological effects, students with
close connection with instructors are likely to build more confidence (Ryan, Gheen & Midgley,
1998), feel less isolated (Cates & Slagter van Tryon, 2002), and reduce anxiety (Creasey, Jarvis &
Knapcik, 2009). Creasey et al. (2009) conducted a survey with 94 students to validate the scale of
student and instructor relationship and found students were less anxious as they felt more
connected with their instructors. Creasey, Jarvis, and Gadke (2009) found that instructor
immediacy impacts student achievement orientations, an effect partially mediated by the student-instructor
relationship. Micari and Pazos (2016) reported that instructor connectedness together
with self-efficacy and peer alignment are predictors of student satisfaction. LaBarbera (2013)
examined how email correspondence between student and instructor influences students’
perceived connectedness with instructors. Results showed that students' sense of connectedness
was associated with instructor feedback, instructor interaction and support, email correspondence,
and their satisfaction with the online course.
Engagement
Newmann, Wehlage, and Lamborn (1992) defined student engagement as "the students'
psychological investment in and effort directed toward learning, understanding, or mastering the
knowledge, skills, or crafts that academic work is intended to promote” (p. 12). Student
engagement denotes student commitment and effort to learning (Krause & Coates, 2008).
Compared to traditional classes, engagement is more important in online courses due to their lack
of face-to-face interaction between instructor and students. Engagement has a positive impact on
students' satisfaction (Swan, 2001), sense of community (Robinson, 2011), and persistence (Kuh
et al., 2008). An interactive online course that connects instructors and students can help to
eliminate students’ feelings of isolation and reduce dropout rates and online attrition (Banna, Lin,
Steward, & Fialkowski, 2015; Boton & Gregory, 2015). Dixson (2010) studied 186 students
enrolled in six universities and found that instructor presence had a positive influence on student
engagement. Bolliger and Martin (2018) compared student and instructor perceptions of online
student engagement strategies and found that instructors and students showed consensus on the
significance of multiple engagement strategies. However, instructors tended to rate most of the
strategies higher than students, including “the use of virtual lounges, icebreaker discussion,
reflections, peer review, interaction with peers, student moderation of discussions, collaborative
activities and projects, and the use of learner’s names in discussion forums by instructors” (p. 13).
Learning
Learning is defined as the attainment of knowledge or skills through experience or
education (Martin, Wang & Sadaf, 2018). In terms of online learning, Ally (2004) defined learning
as “the use of the internet to access learning materials, to interact with the content, instructor, and
other learners, and to obtain support during the learning process, in order to acquire knowledge, to
construct personal meaning, and to grow from the learning experience" (p. 7). Online learning
benefits learners and instructors in that they can update or access learning materials at any time
and from any location. However, there are differences between learner and instructor perceptions
of online learning. Comparing business faculty and undergraduate students' perceptions, Tanner,
Noser, and Totaro (2009) found that faculty showed less preference for online learning than
students did. Delaney-Klinger, Vanevenhoven, Wagner, and Chenoweth (2014) found that faculty
members who lack online teaching experience and knowledge of effective tools in online
environments are at a disadvantage that may have a negative impact on their students' learning.
Role of Instructor Demographics in Online Courses
Because of the impact online instructors have on students’ learning achievements,
researchers have explored and found differences in demographic factors that may influence
faculty facilitation in online teaching and learning environments (Chang, Lin, & Song, 2011; Shea,
2007). For example, Chang et al. (2011) investigated faculty perceptions of teaching efficacy and
their relation to demographic backgrounds and found that education faculty had higher perceptions
of efficacy than faculty in other disciplines, female faculty scored higher in class management and
learning assessments than male faculty, and faculty with less teaching experience indicated lower
perceptions of their teaching efficacy. In another study, Shea (2007) explored instructors'
motivations to teach online and found that female faculty were more attracted to online teaching
than male faculty, older faculty (those 45 or over) were more motivated to experiment with new
pedagogy than were younger faculty, and faculty at four-year institutions were more motivated to
teach online than community college faculty. Similarly, Seaman (2009), examining online teaching
and course development by gender, found that females were more confident in their instructional
skills and more involved in course development than males. The results of these studies show that,
since the demographics of online faculty can play an important role in their online teaching, having
a clear picture of whether and how demographics affect instructors' perceptions of facilitation
strategies is essential to enhancing student learning.
Purpose of the Study
There is limited research focusing on online course facilitation, and existing studies on
facilitation focus on individual facilitation strategies. Since the choices faculty make to facilitate
online learning in their courses can have important effects on desired student learning outcomes,
identifying their perceptions of facilitation strategies can help enhance student learning in online
courses. Therefore, the main purpose of this study is to bring together several strategies and
examine faculty perceptions of how these strategies help their online teaching. The second purpose
is to identify factors associated with faculty perceptions of facilitation strategies. The following
questions guided the study:
1. What facilitation strategies do instructors perceive to be most and least helpful in
establishing instructor presence, instructor connection, engagement, and learning in online
courses?
2. What factors (gender, delivery method, level taught, discipline) are associated with
instructor perception of facilitation strategies in online teaching?


Method
Data Collection Procedure
After institutional review board approval was received, data were collected using an
online survey tool (SurveyShare) licensed by the university with which the researchers were
affiliated. Email invitations were sent to the Association for Educational Communications and
Technology (AECT) email list (1,900 members) and, through the director of distance education,
to the distance education instructors at a southeastern university (411 instructors). The response
rate was 4.8%. The survey instructions stated that only faculty who teach hybrid or online courses
should complete it. Because the AECT email list includes practitioners and students as well as
faculty, the low response rate was expected. One reminder was sent about two weeks after the
initial email. Three $25 gift cards were awarded through a random drawing as incentives for
participation.
Participants
A total of 115 instructors responded to the survey. Of these 115 respondents, 11 skipped
at least 10% of the questions and were therefore dropped from the study. Three respondents
reported teaching face-to-face and one did not report this information, so these four respondents
were also removed from the analyses. The final sample consisted of 100 instructors who
responded to at least 90% of the questions in the survey. The sample was mostly female (n = 65,
65%), with 34 (34%) male instructors; one person reported “other” as their gender identity. Ages
ranged from 25 to 68 years, with a mean of 49.25 years and a standard deviation of 10.71 years.
About half of the participants (n = 48, 48%) taught undergraduate students and the others (n = 51,
51%) taught graduate-level courses; one participant did not report this information. Most
participants (n = 83, 83%) taught online courses (72 asynchronous and 11 synchronous), and the
rest taught hybrid courses (n = 17, 17%). Faculty came from various disciplines, including arts
(n = 20, 20%), business (n = 7, 7%), engineering (n = 6, 6%), health (n = 9, 9%), and education
(n = 53, 53%); five participants (5%) did not report this information.
Instrument
This study used the instrument developed in a previous study on facilitation strategies by
Martin, Wang, and Sadaf (2018), where it was administered to students. The Cronbach’s alpha for
students’ responses to all items was .98, and the alphas for students’ responses to items measuring
instructor presence, instructor connection, and engagement were .91, .94, and .95, respectively
(Martin, Wang, & Sadaf, 2018). Evidence of structural validity was obtained through confirmatory
factor analysis, with satisfactory results: all comparative fit index values were greater than .93,
normed fit index values greater than .90, and standardized root mean square residual values less
than .09 (Martin, Wang, & Sadaf, 2018). The online facilitation strategies survey was developed
after an extensive literature review on facilitation strategies in online courses and was based on
the practical experience of expert online instructors. Participants were asked to rate each of the 12
facilitation strategies on a five-point Likert scale from 1 (strongly disagree) to 5 (strongly agree)
for each of the four aspects of facilitation: instructor presence, instructor connection, engagement,
and learning. The online instructors responded to the following prompts:
(1) The following facilitation strategy helped me be present in my classroom (instructor presence);
(2) The following facilitation strategy helped me get to know my students (instructor connection);
(3) The following facilitation strategy helped me to engage my students in the online course; and
(4) The following facilitation strategy helped me facilitate learning of the content.
The internal consistency of instructors’ responses to all items, measured by Cronbach’s
alpha, was very satisfactory (.96). The Cronbach’s alpha values for the subscales were .85 for
instructor presence, .88 for instructor connection, .87 for engagement, and .81 for learning. In
addition to these 12 items, two open-ended questions solicited instructors’ use of facilitation
strategies beyond those listed in the 12 items: (a) What are some facilitation strategies not listed
here that you use and have found helpful? (b) What are some facilitation strategies not listed here
that you use and have found least helpful?
Data Analytical Procedure
Participants’ perceptions of the facilitation strategies were summarized with descriptive
statistics. Analysis of variance (ANOVA) was used to examine whether instructor perceptions of
facilitation strategies varied by gender, delivery method (hybrid versus online), level taught
(undergraduate versus graduate courses), and discipline (education versus non-education).
Pearson correlation coefficients were used to represent the relations between perceptions of
facilitation strategies and both age and the number of online courses taught. Thematic analysis
was used to code instructor responses to the open-ended questions.
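The analytic steps described above can be illustrated with a short sketch. The example below runs a one-way ANOVA on total scores by delivery method and a Pearson correlation between total scores and age using scipy; the DataFrame, its column names, and its values are hypothetical stand-ins, not the study’s dataset.

```python
import pandas as pd
from scipy import stats

# Hypothetical respondent-level data; columns are invented for illustration.
df = pd.DataFrame({
    "total_score": [4.2, 3.8, 3.5, 4.6, 3.1, 4.0, 3.7, 4.4],
    "delivery":    ["online", "hybrid", "online", "online",
                    "hybrid", "online", "hybrid", "online"],
    "age":         [34, 51, 46, 29, 60, 42, 55, 38],
})

# One-way ANOVA: does mean total score differ by delivery method?
groups = [g["total_score"].values for _, g in df.groupby("delivery")]
f_stat, p_anova = stats.f_oneway(*groups)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.3f}")

# Pearson correlation between total score and age.
r, p_corr = stats.pearsonr(df["total_score"], df["age"])
print(f"Pearson r = {r:.2f}, p = {p_corr:.3f}")
```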

Results
Facilitation Strategies
Across the 12 items, instructors rated the helpfulness of the facilitation strategies with an
overall mean of 3.80 and a standard deviation of 0.63. Descriptive statistics at the item and
subscale levels (instructor presence, instructor connection, engagement, learning) are presented in
Table 2.

Table 2
Instructor Perception of the Helpfulness of Facilitation Strategies
Each cell is M (SD); columns are Instructor Presence, Instructor Connection, Engagement, and Learning, in that order.

1. Video-based instructor introduction (e.g., VoiceThread, Animoto, Camtasia): 3.83 (1.21); 3.40 (1.35); 3.55 (1.25); 3.39 (1.24)
2. Video-based course orientation (e.g., recording using Camtasia, Screencast-O-Matic): 3.87 (1.14); 3.05 (1.22); 3.61 (1.12); 3.69 (1.13)
3. Able to contact the instructor in multiple ways (Contact the Instructor forum, email, phone, virtual office hours): 4.47 (0.78); 4.31 (0.88); 4.29 (0.85); 3.91 (1.01)
4. Instructor’s timely response to questions (e.g., within 24 to 48 hours) via forums or email: 4.74 (0.59); 4.18 (1.08); 4.50 (0.74); 4.45 (0.75)
5. Instructor’s weekly announcements to the class (e.g., every Monday via announcement forum or email): 4.17 (1.02); 3.25 (1.35); 3.92 (1.12); 3.78 (1.09)
6. Instructor-created content in the form of short videos/multimedia (e.g., Camtasia, Articulate modules): 3.94 (1.09); 3.02 (1.29); 3.91 (1.04); 4.05 (1.06)
7. Instructor being present in the discussion forums (e.g., refers to students by name, responds to students’ posts): 3.93 (1.16); 3.92 (1.14); 3.97 (1.13); 3.85 (1.12)
8. Instructor’s timely feedback on assignments/projects (e.g., within 7 days): 4.62 (0.69); 4.09 (1.15); 4.43 (0.78); 4.43 (0.75)
9. Instructor’s feedback using various modalities (e.g., text, audio, video, and visuals) on assignments/projects: 3.58 (1.26); 3.31 (1.31); 3.61 (1.22); 3.70 (1.15)
10. Instructor’s personal response to student reflections (e.g., via journals on benefits/challenges): 4.04 (1.10); 3.86 (1.18); 4.06 (0.95); 4.03 (1.05)
11. Instructor’s use of various features in synchronous sessions to interact with students (e.g., polls, emoticons, whiteboard, text, or audio and video chat): 3.43 (1.23); 3.45 (1.30); 3.55 (1.28); 3.49 (1.23)
12. Interactive visual syllabus of the course (e.g., includes a visual of the instructor and other interactive components): 2.98 (1.18); 2.82 (1.27); 2.89 (1.23); 2.94 (1.20)

Subscale total: 3.97 (0.65); 3.55 (0.82); 3.86 (0.68); 3.81 (0.64)

Participants rated item 4 (instructor’s timely response to questions) as most helpful for
instructor presence, engagement, and learning. For instructor connection, the most helpful strategy
was item 3 (able to contact the instructor in multiple ways), with item 4 rated second. Item 8
(instructor’s timely feedback on assignments/projects) was rated second highest for instructor
presence, engagement, and learning. Item 12 (interactive visual syllabus of the course) was rated
least helpful for all four subscales: instructor presence, instructor connection, engagement, and
learning.


Most-Helpful Instructor Facilitation Strategies


Responses to the open-ended questions revealed the facilitation strategies instructors
found most helpful (Table 3). Using group projects to support peer learning was cited as helpful
by 13% of the respondents. Another facilitation strategy, cited as helpful by 11% of the
respondents, was using synchronous sessions to explain the content. A representative quote: “I
have found an always open synchronous chat to be helpful, especially with graduate students.
Tools like this could include Skype chat, Slack chat, and the like.”

Table 3
Most Helpful Facilitation Strategies
Code: frequency (percentage of respondents)

Group projects to support peer learning: 15 (13%)
Synchronous sessions to present content or answer questions: 13 (11%)
Feedback to enhance communication between students and instructor: 9 (8%)
Students taking an active role in leading discussions or presenting projects: 8 (7%)
Consistent course structure in terms of deadlines and content: 8 (7%)
Personal interaction with students to clarify concepts: 7 (6%)

Least-Helpful Instructor Facilitation Strategies


The least helpful facilitation strategies are presented in Table 4. Some participants (4%)
did not find synchronous sessions helpful; as one of them put it, “Synchronous sessions tend to
not be well attended and does not really encourage active learning due to limitations in how you
can present information.” Similarly, discussion boards were not effective for some instructors
(4%); for example: “Required replies to student discussion posts without a specific requirement
or prompt. Discussions too hard to read via threads and don't have sufficient new info to make the
effort valuable.”


Table 4
Least Helpful Facilitation Strategies
Code: frequency (percentage of respondents)

Synchronous sessions: 4 (4%)
Discussion boards: 4 (4%)
Group projects: 3 (3%)
Personal interactions: 3 (3%)
Announcements: 2 (2%)
Exams and quizzes: 2 (2%)

Demographics and Facilitation Strategies


Because the correlations between the subscales of facilitation strategies and the total score
were very high (ranging from .87 to .95; see Table 5), the total score of the instructor facilitation
strategies instrument was used in the following analyses.

Table 5
Relationships Between Subscales of the Facilitation Instrument

             Connection   Engagement   Learning   Total
Presence     .73***       .89***       .81***     .93***
Connection                .74***       .64***     .87***
Engagement                             .84***     .95***
Learning                                          .89***

Note. *** p < .001.

Statistically significant differences were noted between male and female instructors who
teach online with respect to their perceptions of the helpfulness of facilitation strategies.
Specifically, female instructors endorsed the strategies more than male instructors: t(97) = 2.63,
p = .01, Cohen’s d = 0.54 (medium effect size). Results from a four-way ANOVA suggested no
statistically significant differences by delivery method, level taught, or discipline after controlling
for gender, and no statistically significant interaction effects (p > .05). Specifically, no statistically
significant differences in instructor perceptions of facilitation strategies were found between
education and non-education faculty, F(1, 90) = 0.34, p = .56, partial η² = .004 (small effect size);
between faculty who teach online courses and faculty who teach hybrid courses, F(1, 90) = 0.96,
p = .33, partial η² = .011 (small effect size); or between faculty who teach undergraduate courses
and faculty who teach graduate courses, F(1, 90) = 0.96, p = .33, partial η² = .011 (small effect
size). Means and standard deviations of instructor perceptions of facilitation strategies by gender,
delivery method, level taught, and discipline are reported in Table 6.

Table 6
Means and Standard Deviations of Instructor Perception of Facilitation Strategies

Gender   Delivery   Level Taught    Discipline       M      SD     n
Female   Hybrid     Undergraduate   Non-Education    4.38   --     1
Female   Hybrid     Undergraduate   Education        3.86   0.38   5
Female   Hybrid     Graduate        Non-Education    4.29   --     1
Female   Hybrid     Graduate        Education        3.64   0.63   7
Female   Online     Undergraduate   Non-Education    3.83   0.69   15
Female   Online     Undergraduate   Education        3.81   0.33   10
Female   Online     Graduate        Non-Education    3.77   0.62   7
Female   Online     Graduate        Education        4.13   0.44   19
Male     Hybrid     Undergraduate   Non-Education    --     --     0
Male     Hybrid     Undergraduate   Education        3.92   --     1
Male     Hybrid     Graduate        Non-Education    --     --     0
Male     Hybrid     Graduate        Education        4.11   0.58   2
Male     Online     Undergraduate   Non-Education    3.36   0.63   13
Male     Online     Undergraduate   Education        2.41   2.00   2
Male     Online     Graduate        Non-Education    3.68   0.50   8
Male     Online     Graduate        Education        3.88   0.63   7
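For readers who want to reproduce the kind of gender comparison reported above, the sketch below computes an independent-samples t test and Cohen’s d with a pooled standard deviation. The score vectors are invented for illustration; they are not the study’s data.

```python
import numpy as np
from scipy import stats

# Hypothetical total scores by gender (illustrative values only).
female = np.array([4.1, 3.9, 4.3, 3.6, 4.0, 4.2, 3.8])
male   = np.array([3.5, 3.7, 3.2, 3.9, 3.4, 3.6])

t_stat, p_val = stats.ttest_ind(female, male)  # Student's t (equal variances)

# Cohen's d with pooled standard deviation.
n1, n2 = len(female), len(male)
pooled_sd = np.sqrt(((n1 - 1) * female.var(ddof=1) +
                     (n2 - 1) * male.var(ddof=1)) / (n1 + n2 - 2))
d = (female.mean() - male.mean()) / pooled_sd
print(f"t({n1 + n2 - 2}) = {t_stat:.2f}, p = {p_val:.3f}, d = {d:.2f}")
```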

Discussion
In the following section, we discuss the most helpful and least helpful instructor facilitation
strategies based on instructors’ perceptions.
Timely response to questions/feedback is very helpful
Instructors rated timely response to questions and timely feedback on assignments/projects
as the two most helpful facilitation strategies for three of the four constructs (instructor presence,
engagement, and learning). This is consistent with research indicating that timely responses to
questions and timely feedback on assignments are important for establishing instructor presence,
engaging students in their courses, and facilitating higher levels of learning (Hodges & Cowan,
2012; Sheridan & Kelly, 2010).


Martin, Wang, and Sadaf (2018) noted that instructors’ timely responses and feedback were highly
valued by students; students and instructors are thus consistent in valuing timeliness in online
course facilitation. When instructors respond promptly, they establish immediacy and reduce
isolation for online students. Instructors can provide timely responses in a variety of ways,
including a group forum where all students can see posted questions, periodic virtual office hours,
and collective feedback.
Group Work is helpful
In the open-ended comments, instructors identified group projects as a helpful facilitation
strategy, and research has shown the benefits of group work. Koh, Barbour, and Hill (2010)
identified strategies for instructors to improve online group work, including assisting group
formation, building a sense of connection, being involved in group processes, and evaluating
group processes. Chang and Kang (2016) recommend that instructors split group work into
individual portions, use peer evaluation, create guidelines for communication, and oversee group
work processes. Instructors in the open-ended comments noted that group work supports peer
learning, and these studies corroborate that finding.
Synchronous sessions: helpful or not
In the quantitative data, instructors rated synchronous sessions as moderately helpful, and
in the open-ended comments 11% of instructors considered them helpful while 4% considered
them least helpful. Instructor perceptions of the benefits of synchronous sessions have been
mixed. In fully asynchronous online programs, instructors may not see the benefit of synchronous
facilitation strategies, since their students are not required to participate in synchronous sessions;
notably, 72% of the instructors in this study primarily taught in an asynchronous format.
According to Lowenthal, Dunlap, and Snelson (2017), faculty avoid synchronous communication
for various reasons, including the flexibility of not having to be in class at a specific time,
scheduling difficulties, and technological challenges. Instructors who see the benefit of
synchronous sessions use them to reduce student isolation and provide immediacy in the online
environment (Martin & Parker, 2014).
The visual syllabus is least helpful
The interactive visual syllabus was rated least helpful by instructors for all four constructs.
Although the literature references the importance of visual syllabi, they are not widely researched
(Grigorovici, Nam, & Russill, 2003; Richards, 2003). Like the instructors, students rated the visual
syllabus least helpful (Martin, Wang, & Sadaf, 2018). Martin et al. recommended that, for visual
syllabi to be beneficial, online instructors create syllabi with hyperlinks and visuals in which
students can easily find the information they need and answers to their questions.
Demographics
Female instructors rated the facilitation strategies higher than male instructors did. Prior
research has found female faculty to be more interested in online teaching than male faculty (Shea,
2007) and to be more confident in instructional skills and more involved in course development
(Chang, Lin, & Song, 2011; Seaman, 2009). The sample in this study was 65% female, and the
majority of participants taught fully online (83%). The finding that female instructors rated the
strategies highly is consistent with these previous studies.


Implications and Recommendations


This study has implications for online instructors, instructional designers, and
administrators. The results point to helpful facilitation strategies for instructors who teach online.
It is important for instructors not only to design an effective online course but also to be effective
facilitators.
Instructors
Use a variety of facilitation strategies: All the facilitation strategies except the interactive
visual syllabus were rated highly. This suggests that instructors who teach online could use the
remaining 11 strategies in their online teaching and benefit from them. Most important, it is
essential for instructors to provide timely responses to questions and timely feedback, along with
multiple ways for students to contact them.
Set aside time for facilitation: It is important to reserve time for facilitating the online
course. This will assist in responding to questions, grading work, and providing timely feedback
to students. Creative strategies for providing timely responses include using a common forum for
questions, which saves instructors from responding to each question individually; maintaining a
frequently-asked-questions document that students can read and benefit from; and hosting
synchronous office hours to answer student questions. Reusing feedback comments, providing
collective feedback to the class, and using other modalities (audio, video) when they save time
will also assist in providing timely feedback.
Include facilitation policies in the syllabus: It is essential to state in the syllabus the
timeframe within which students can expect to receive responses and feedback.
Instructional Designers
This study also has recommendations for instructional designers who support instructors
in designing online courses. Facilitation strategies are important in addition to design; in many
cases, the emphasis is placed on design when instructional designers work with instructors. The
findings from this study suggest that instructional designers also encourage faculty to build
various facilitation practices into their online courses. All the recommendations listed above for
online instructors apply to instructional designers as well.
Administrators
This study has recommendations for administrators who provide support for instructional
designers and faculty who teach online.
Teaching evaluation: When peer observation or teaching evaluation forms and processes
are created, it is important to include items on facilitation along with design. More and more
campuses are adopting processes, such as Quality Matters, that focus on design only. The findings
from this study emphasize that, in addition to design, facilitation is also important: a well-designed
course that is not facilitated well will not be effective.
Policies on online teaching: Creating policies on instructor presence in online courses will
encourage instructors to be present and active in those courses.


Limitations and Future Research


This study is limited by its small sample size and low response rate. We received only 100
complete responses, a response rate of 4.8%, in part because the AECT email list includes faculty,
practitioners, and students, and we did not have access to a faculty-only email list; only faculty
who teach online or hybrid courses were asked to complete the survey. In addition, the data were
self-reported, so there may be response bias. For example, instructors who chose to respond to the
survey might differ from those who chose not to; in that case, the data would not be representative
of the population, and the conclusions reached in this study would be limited in external validity.
Moreover, only the 12 facilitation strategies identified in a previous study (Martin, Wang, &
Sadaf, 2018) were used in the survey, and they may not be an exhaustive list of strategies. Future
research could examine other facilitation strategies and use qualitative methods, such as
interviews with expert instructors, to identify more facilitation strategies in online teaching.


References
Ally, M. (2004). Foundations of educational theory for online learning. In T. Anderson & F.
Elloumi (Eds.), Theory and practice of online learning (pp. 3–32). Athabasca University.
Al-Salman, S. M. (2011). Faculty in online learning programs: Competencies and barriers to
success. Journal of Applied Learning and Technology, 1(4), 6–13.
Anderson, T., Rourke, L., Garrison, R., & Archer, W. (2001). Assessing teaching presence in a
computer conferencing context. Journal of Asynchronous Learning Networks, 5(2), 1–17.
https://auspace.athabascau.ca/bitstream/handle/2149/725/assessing_teaching_presence.pd
f?sequence=1
Badiee, F., & Kaufman, D. (2014). Effectiveness of an online simulation for teacher education.
Journal of Technology and Teacher Education, 22(2), 167–186.
https://www.learntechlib.org/primary/p/45934/
Baker, C. (2010). The impact of instructor immediacy and presence for online student affective
learning, cognition, and motivation. Journal of Educators Online, 7(1), 1–30.
doi:10.9743/jeo.2010.1.2
Banna, J., Lin, M. F. G., Stewart, M., & Fialkowski, M. K. (2015). Interaction matters: Strategies
to promote engaged learning in an online introductory nutrition course. Journal of Online
Learning and Teaching, 11(2), 249. http://jolt.merlot.org/Vol11no2/Banna_0615.pdf
Bawane, J., & Spector, J. M. (2009). Prioritization of online instructor roles: Implications for
competency-based teacher education programs. Distance Education, 30(3), 383–397. doi:
10.1080/01587910903236536
Berge, Z. L. (1995). Facilitating computer conferencing: Recommendations from the field.
Educational Technology, 35(1), 22–30. https://www.jstor.org/stable/44428247
Berge, Z. (2008). Changing instructor’s roles in virtual worlds. Quarterly Review of Distance
Education, 9(4), 407–415. https://www.learntechlib.org/p/106706/
Bigatel, P., & Williams, V. (2015). Measuring student engagement in an online program. Online
Journal of Distance Learning Administration, 18(2).
https://www.westga.edu/~distance/ojdla/summer182/bigatel_williams182.html
Bolliger, D. U., & Martin, F. (2018). Instructor and student perceptions of online student
engagement strategies. Distance Education, 39(4), 568–583.
Borup, J., West, R. E., & Thomas, R. (2015). The impact of text versus video communication on
instructor feedback in blended courses. Educational Technology Research and
Development, 63(2), 161–184. https://www.learntechlib.org/p/159957/
Boton, E. C., & Gregory, S. (2015). Minimizing attrition in online degree courses. Journal of
Educators Online, 12(1), 62–90. doi:10.9743/jeo.2015.1.6
Brinkerhoff, J., & Koroghlanian, C. M. (2007). Online students' expectations: Enhancing the fit
between online students and course design. Journal of Educational Computing Research,
36(4), 383–393. doi:10.2190/r728-28w1-332k-u115


Cates, W., & Slagter van Tryon, P. (2002). Thinking systematically about online learning:
Strategies for enhancing social connectedness. Paper presented at the annual meeting of
the Association for Educational Communications and Technology, Dallas, TX.
Cavanaugh, A. J., & Song, L. (2014). Audio feedback versus written feedback: Instructors' and
students' perspectives. Journal of Online Learning and Teaching, 10(1), 122.
http://jolt.merlot.org/vol10no1/cavanaugh_0314.pdf
Chang, B., & Kang, H. (2016). Challenges facing group work online. Distance Education, 37(1),
73–88. doi:10.1080/01587919.2016.1154781
Chang, T. S., Lin, H. H., & Song, M. M. (2011). University faculty members’ perceptions of
their teaching efficacy. Innovations in Education and Teaching International, 48(1), 49–
60. doi:10.1080/14703297.2010.543770
Creasey, G., Jarvis, P., & Gadke, D. (2009). Student attachment stances, instructor immediacy,
and student–instructor relationships as predictors of achievement expectancies in college
students. Journal of College Student Development, 50(4), 353–372.
doi:10.1353/csd.0.0082
Creasey, G., Jarvis, P., & Knapcik, E. (2009). A measure to assess student-instructor
relationships. International journal for the scholarship of teaching and learning, 3(2), n2.
doi:10.20429/ijsotl.2009.030214
D'Alba, O. A. (2014). A case study of student instructor connectedness in an asynchronous
modular online environment (Doctoral dissertation). Retrieved from
https://scholarworks.gsu.edu/msit_diss/140/
Delaney-Klinger, K., Vanevenhoven, J., Wagner, R., & Chenoweth, J. (2014). Faculty transitions
in online delivery: Make or buy? Tips for developing a 'new to you' online course.
Journal of College Teaching & Learning, 11(1), 45–52. doi:10.19030/tlc.v11i1.8396
Dixson, M. D. (2010). Creating effective student engagement in online courses: What do
students find engaging? Journal of the Scholarship of Teaching and Learning, 10(2), 1–
13. https://scholarworks.iu.edu/journals/index.php/josotl/article/view/1744
Draus, P. J., Curran, M. J., & Trempus, M. S. (2014). The influence of instructor-generated video
content on student satisfaction with and engagement in asynchronous online classes.
Journal of Online Learning and Teaching, 10(2), 240.
http://jolt.merlot.org/vol10no2/draus_0614.pdf
Eskey, M. T., & Schulte, M. (2010). What online college students say about online instructors
and what do online faculty members say about online instruction: A comparison of
attitudes. Journal of Online Education, 1–20.
http://www.nyu.edu/classes/keefer/waoe/eskeym.pdf
Gallien, T., & Oomen-Early, J. (2008). Personalized versus collective instructor feedback in the
online courseroom: Does type of feedback affect student satisfaction, academic
performance and perceived connectedness with the instructor? International Journal on
E-Learning, 7(3), 463–476. https://www.learntechlib.org/primary/p/23582/
González-Sanmamed, M., Muñoz-Carril, P., & Sangrà, A. (2014). Level of proficiency and
professional development needs in peripheral online teaching roles. International Review
of Research in Open & Distance Learning, 15(6), 162–187. doi:10.19173/irrodl.v15i6.1771
Grigorovici, D., Nam, S., & Russill, C. (2003). The effects of online syllabus interactivity on
students' perception of the course and instructor. The Internet and Higher
Education, 6(1), 41–52. doi:10.1016/S1096-7516(02)00163-X
Gustafson, P., & Gibbs, D. (2000). Guiding or hiding? The role of the facilitator in online
teaching and learning. Teaching Education, 11(2), 195–210.
Hew, K. F. (2015). Student perceptions of peer versus instructor facilitation of asynchronous
online discussions: further findings from three cases. Instructional Science, 43(1), 19–38.
doi:10.1007/s11251-014-9329-2
Hodges, C. B., & Forrest Cowan, S. (2012). Preservice teachers’ views of instructor presence in
online courses. Journal of Digital Learning in Teacher Education, 28(4), 139–145.
doi:10.1080/21532974.2012.10784694
Hosler, K. A., & Arend, B. D. (2012). The importance of course design, feedback, and
facilitation: student perceptions of the relationship between teaching presence and
cognitive presence. Educational Media International, 49(3), 217–229.
doi:10.1080/09523987.2012.738014
Hsiao, E. L. (2012). Synchronous and asynchronous communication in an online environment:
Faculty experiences and perceptions. Quarterly Review of Distance Education, 13(1), 15.
https://www.learntechlib.org/p/131977/
Jones, P., Naugle, K., & Kolloff, M. (2008). Teacher presence: Using introductory videos in
hybrid and online courses. Learning Solutions. Retrieved March 26, 2018, from
www.learningsolutionsmag.com
Kelly, R. (2014). Five things online students want from faculty. Faculty focus. Retrieved
November 20, 2019 from
https://www.saddleback.edu/uploads/goe/five_things_online_students_want_from_facult
y_-_faculty_focus.pdf
Ko, S., & Rossen, S. (2010). Teaching online: A practical guide. Routledge.
Koh, M. H., Barbour, M., & Hill, J. R. (2010). Strategies for instructors on how to improve
online groupwork. Journal of Educational Computing Research, 43(2), 183–205.
doi:10.2190/EC.43.2.c
Kuh, G. D., Cruce, T. M., Shoup, R., Kinzie, J., & Gonyea, R. M. (2008). Unmasking the effects
of student engagement on first-year college grades and persistence. Journal of Higher
Education, 79(5), 540–563. doi:10.1353/jhe.0.0019
LaBarbera, R. (2013). The relationship between students’ perceptive sense of connectedness to
the instructor and satisfaction in online courses. Quarterly Review of Distance Education,
14(4), 209. https://www.infoagepub.com/qrde-issue.html?i=p54c3c328b31d0
Lowenthal, P. R. (2010). Social presence. In Social computing: Concepts, methodologies, tools,
and applications (pp. 129–136). IGI Global. doi:10.4018/9781605669847.ch011


Lowenthal, P., Dunlap, J., & Snelson, C. (2017). Live synchronous web meetings in
asynchronous online courses: Reconceptualizing virtual office hours. Online Learning
Journal, 21(4). doi:10.24059/olj.v21i4.1285
Mandernach, B. J., Gonzales, R. M., & Garrett, A. L. (2006). An examination of online
instructor presence via threaded discussion participation. Journal of Online Learning and
Teaching, 2(4), 248–260. http://jolt.merlot.org/vol2no4/mandernach.htm
Martin, F., Budhrani, K., Kumar, S., & Ritzhaupt, A. (2019). Award-winning faculty online
teaching practices: Roles and competencies. Online Learning, 23(1), 184–205.
Martin, F., & Parker, M. A. (2014). Use of synchronous virtual classrooms: Why, who, and how.
MERLOT Journal of Online Learning and Teaching, 10(2), 192–210.
Martin, F., Wang, C., & Sadaf, A. (2018). Student perception of helpfulness of facilitation
strategies that enhance instructor presence, connectedness, engagement and learning in
online courses. The Internet and Higher Education, 37, 52–65.
doi:10.1016/j.iheduc.2018.01.003
Micari, M., & Pazos, P. (2016). Fitting in and feeling good: the relationships among peer
alignment, instructor connectedness, and self-efficacy in undergraduate satisfaction with
engineering. European Journal of Engineering Education, 41(4), 380–392.
doi:10.1080/03043797.2015.1079814
National Center for Educational Statistics. (2017). Digest of Education Statistics.
https://nces.ed.gov/programs/digest/d17/tables/dt17_311.33.asp?current=yes
Newman, F. M., Wehlage, G. G., & Lamborn, S. D. (1992). The significance and sources of
student engagement. In F. M. Newman (Ed.), Student engagement and achievement in
American secondary schools (pp. 11–39). Teachers College, Columbia University.
Pappas, C. (2014). From instructor to effective online facilitator. eLearning Industry. Retrieved
January 16, 2020, from https://elearningindustry.com/from-instructor-to-effective-online-
facilitator
Richards, S. L. (2003). The interactive syllabus: A resource-based, constructivist approach to
learning. The Technology Source.
Richardson, J., & Swan, K. (2003). Examining social presence in online courses in relation to
students' perceived learning and satisfaction. Journal of Asynchronous Learning
Networks. 7(1). 68–88. http://hdl.handle.net/2142/18713
Richardson, J. C., Sadaf, A., & Ertmer, P. A. (2012). Relationship between types of question
prompts and critical thinking in online discussions. In Z. Akyol & D. R. Garrison (Eds.),
Educational communities of inquiry: Theoretical framework, research and practice
(pp. 197–222). IGI Global. doi:10.4018/978-1-4666-2110-7.ch011
Richardson, J. C., Koehler, A. A., Besser, E. D., Caskurlu, S., Lim, J., & Mueller, C. M. (2015).
Conceptualizing and investigating instructor presence in online learning environments.
The International Review of Research in Open and Distributed Learning, 16(3).
doi:10.19173/irrodl.v16i3.2123


Richardson, J. C., Besser, E., Koehler, A., Lim, J., & Strait, M. (2016). Instructors’ perceptions
of instructor presence in online learning environments. The International Review of
Research in Open and Distributed Learning, 17(4). doi:10.19173/irrodl.v17i4.2330
Robinson, J. (2011). Assessing the value of using an online discussion board for engaging
students. Journal of Hospitality, Leisure, Sports and Tourism Education (Pre-2012),
10(1), 13. doi:10.3794/johlste.101.257
Rohfeld, R. W., & Hiemstra, R. (1995). Moderating discussions in the electronic classroom.
Computer Mediated Communication and the Online Classroom, 3, 91–104.
http://roghiemstra.com/moderating.html
Rose, K. K. (2009). Student perceptions of the use of instructor-made videos in online and face-
to-face classes. MERLOT Journal of Online Learning and Teaching, 5(3).
http://jolt.merlot.org/vol5no3/rose_0909.htm
Rovai, A. P. (2007). Facilitating online discussions effectively. The Internet and Higher
Education, 10(1), 77–88. doi:10.1016/j.iheduc.2006.10.001
Ryan, A. M., Gheen, M. H., & Midgley, C. (1998). Why do some students avoid asking for help?
An examination of the interplay among students' academic efficacy, teachers' social–
emotional role, and the classroom goal structure. Journal of Educational Psychology,
90(3), 528. doi:10.1037//0022-0663.90.3.528
Sadaf, A., & Olesova, L. (2017). Enhancing cognitive presence in online case discussions with
questions based on the practical inquiry model. American Journal of Distance Education,
31(1), 56–12. doi:10.1080/08923647.2017.1267525
Santilli, S., & Beck, V. (2005). Graduate faculty perceptions of online teaching. Quarterly
Review of Distance Education, 6(2). https://www.infoagepub.com/qrde-
issue.html?i=p54c3cb86a9a11
Seaman, J. (2009). Online Learning as a Strategic Asset. Volume II: The Paradox of Faculty
Voices—Views and Experiences with Online Learning. Results of a National Faculty
Survey, Part of the Online Education Benchmarking Study Conducted by the APLU-Sloan
National Commission on Online Learning. Association of Public and Land-Grant
Universities. https://files.eric.ed.gov/fulltext/ED517311.pdf
Shea, P. (2007). Bridges and barriers to teaching online college courses: A study of experienced
online faculty in thirty-six colleges. Journal of Asynchronous Learning Networks, 11(2),
73–128. https://onlinelearningconsortium.org/jaln_full_issue/volume-11-issue-2-july-
2007/
Shea, P., Li, C. S., & Pickett, A. (2006). A study of teaching presence and student sense of
learning community in fully online and web-enhanced college courses. The Internet and
Higher Education, 9(3), 175–190. doi:10.1016/j.iheduc.2006.06.005
Sheridan, K., & Kelly, M. A. (2010). The indicators of instructor presence that are important to
students in online courses. MERLOT Journal of Online Learning and Teaching, 6(4),
767–779. http://jolt.merlot.org/Vol6_No4.htm


Schindler, L., & Burkholder Jr, G. J. (2014). Instructional design and facilitation approaches that
promote critical thinking in asynchronous online discussions: A review of the literature.
Higher Learning Research Communications, 4(4), 10–29. doi:10.18870/hlrc.v4i4.222
Swan, K. (2001). Virtual interaction: Design factors affecting student satisfaction and perceived
learning in asynchronous online courses. Distance Education, 22(2), 306–331.
doi:10.1080/0158791010220208
Tanner, J. R., Noser, T. C., & Totaro, M. W. (2009). Business faculty and undergraduate
students' perceptions of online learning: A comparative study. Journal of Information
Systems Education, 20(1), 29. https://www.learntechlib.org/p/105718/
Thiele, J. E. (2003). Learning patterns of online students. Journal of Nursing Education, 42(8), 3.
doi:10.3928/0148-4834-20030801-08
Vesely, P., Bloom, L., & Sherlock, J. (2007). Key elements of building online community:
Comparing faculty and student perceptions. MERLOT Journal of Online Learning and
Teaching, 3(3), 234–246. http://jolt.merlot.org/vol3no3/vesely.htm
Wang, Y. M. (2014). Questioning as facilitating strategies in online discussion. Journal of
Educational Technology Systems, 42(4), 405–416. doi:10.2190/ET.42.4.f
Wei, C. W., & Chen, N. S. (2012). A model for social presence in online classrooms.
Educational Technology Research and Development, 60(3), 529–545.
doi:10.1007/s11423-012-9234-9



Social Media Learning Activities (SMLA):
Implications for Design

Ghania E. Zgheib
University of Balamand, Lebanon

Nada Dabbagh
George Mason University

Abstract
This study explored how experienced faculty are using social media to support student learning. It
analyzed the types of social media learning activities (SMLAs), their design, the cognitive
processes that they support, and the types of knowledge that students engage in when completing
SMLAs. Data gathered from five different cases of six faculty using social media in their courses
revealed that social media has the potential to support student learning and promote different levels
of cognitive processes and types of knowledge. Results also revealed that experienced faculty
select social media tools based on their technology features or their popularity in the field of study,
and they recommend integrating several media sources in the design of a single SMLA.
Furthermore, this study suggested that experienced faculty who use social media, specifically
wikis and blogs, use them as Learning Management Systems (LMS). Finally, the social factor of
social media was not evident in the design of the learning activities, and faculty reported promoting
more dialogue in their revised SMLAs. The findings of this study yielded significant considerations
for faculty when designing SMLAs.

Keywords: social media, social media learning activities, social media design,
instructional design, online learning, LMS

Zgheib, G. E., & Dabbagh, N. (2020). Social media learning activities (SMLA): Implications for
design. Online Learning, 24(1), 50-66. https://doi.org/10.24059/olj.v24i1.1967



In the last two decades, the world has experienced a degree of networked digital
connectedness that exceeds the limits of traditional communication tools such as phone or email.
The rise of social media over the last ten years has led to a wired universe impacting the way
people interact with each other and the way they process the wealth of information surrounding
them. Social media technologies have become integral in today’s learning environments,
especially for college students, leading to a paradigm shift in the education system calling for
learner collaboration, personalization, and user-generated content.


Social media, also referred to as Web 2.0 applications or technologies (Ravenscroft,
Warburton, Hatzipanagos, & Conole, 2012; Väljataga, Pata, & Tammets, 2011), are defined as “a
group of Internet-based applications that build on the ideological and technological foundations of
Web 2.0, and that allow the creation and exchange of user-generated content” (Kaplan & Haenlein,
2010, p. 61). There are hundreds of social media technologies at the user’s disposal; many share
similar affordances, such as networking, communicating, and sharing, while other tools have
additional distinguishing features. Kitsantas and Dabbagh (2010) and Dabbagh and Reo
(2011b) classified social media technologies as follows:
• Experience- and resource-sharing tools that enable online/social bookmarking,
blogging, wiki-ing, and microblogging such as Delicious, WordPress, PbWorks, and
Twitter;
• Media sharing tools that enable social tagging such as Flickr and YouTube;
• Social networking tools that enable socio-semantic networking such as Facebook and
LinkedIn;
• Communication tools such as email and web-conferencing tools like Skype.
In a recent EDUCAUSE Center for Applied Research (ECAR) study, Brooks and
Pomerantz (2017) reported that 97% of undergraduate students own a smartphone and that 78% of
these students consider these devices moderately important for their academic achievement. The
use of mobile devices and mobile apps is a driving force in the increasing adoption of social media
(Bannon, 2012). Perrin (2015) reported that 90% of young adults (18–29 year-olds) use social
media. Likewise, in a 2015 ECAR study, students requested the use of social media as a learning
tool (Dahlstrom et al., 2015). Research shows that students mainly use social media technologies
for collaboration through online file sharing tools, sharing information through websites, tracking
and managing their academic schedules, and communicating with peers (Smith, 2017).
On the other hand, faculty adoption of social media to support student engagement and
learning has been on the rise. Seaman and Tinti-Kane (2013) reported that 41% of faculty in higher
education use social media in their teaching with a higher percentage in the Humanities and Arts
disciplines. They also reported that faculty mostly use wikis and blogs for instructional purposes
and prefer using online videos through YouTube and similar platforms as course resources.
Similarly, del Valle, Gruzd, Haythornthwaite, Paulin, and Gilbert (2017) reported that multimedia
repositories, social networking sites, and document sharing tools were the most commonly used
tools for teaching. Del Valle et al. (2017) also reported a correlation between faculty personal and
academic use of social media: the more faculty use social media for personal benefit, the more
likely they are to integrate these tools into their teaching.
While social media use for teaching and learning is on the rise at the tertiary level, few
studies have examined how faculty are designing learning activities using social media and
whether faculty are leveraging the intrinsic or integral affordances of social media for teaching.
Understanding how experienced faculty are using social media in higher education is essential to
developing best practices for implementing social media in teaching and learning contexts.


Social Media as Learning Resources


A review of existing research suggests a positive impact of social media on student
learning, specifically on students’ engagement with peers and with the content, and as tools to
supplement classroom teaching (Yang & Chang, 2012; Churchill, 2009; Rambe, 2012; Hung &
Yuen, 2010; Domizi, 2013; Fox & Varadarajan, 2011; Menkhoff & Bengtsson, 2012; Lichter,
2012). These studies revealed that faculty from different disciplines including education,
pharmacy, language learning, public administration, information technology, science, business,
music, and visual arts are using social media to support their face-to-face or online courses.
Studies revealed that social media learning activities mainly engaged students in
connecting with peers, extending learning outside the classroom, commenting on each other’s
work, collaborating, and creating projects through microblogging platforms, social networking
sites, media sharing tools, and experience and resource sharing tools. A review of the literature
conducted by Zachos, Paraskevopoulou-Kollia, and Anagnostopoulos (2018) synthesized the
benefits of using online social networks (OSNs) like Facebook and Twitter in education. Their
findings suggested that OSNs support student formal and informal learning, provide opportunities
for students to be exposed to new perspectives through virtual communities, and enhance student
communication, collaboration, and motivation.
Furthermore, blogs have been used for writing essays, giving students opportunities to
comment on each other’s blogs, accessing course material, posting course artefacts, forming
online groups, and keeping reflective journals (Chawinga, 2017; Churchill, 2009; Farwell &
Kruger-Ross, 2013; Gedera, 2011; Yang & Chang, 2012). Wikis have been used as collaboration
tools to complete group and capstone projects, for peer reviewing and editing, for sharing
resources, for asking questions, and for reflecting on readings (Abdekhodaee, Chase, & Ross,
2017; Berthude & Gliddon, 2018; Bonne & Lin, 2013; Franklin & Thankachan, 2013; Hu &
Johnston, 2012; Oskoz & Elola, 2011; Park et al., 2010). Social networking tools are used for
asking and answering questions, participating in discussion forums, sharing resources, inviting
guest speakers, and posting notifications and reminders (Cain & Policastri, 2011; Hung & Yuen,
2010; Irwin, Ball, Desbrow, & Leveritt, 2012; Junco, 2012; Omar, Embi, & Yunus, 2012; Rambe,
2012). Microblogging tools such as Twitter are used to post tweets about course topics, tweet
class announcements and reminders, discuss topics in and outside of class, ask and answer
questions, and vote on answers (Andrade, Castro, & Ferreira, 2012; Chawinga, 2017; Domizi,
2013; Fox & Varadarajan, 2011; Gao, Luo, & Zhang, 2012; Junco, Heibergert, & Loken, 2011;
Lin, Hoffman, & Borengasser, 2013). Media sharing tools such as YouTube and Flickr are used
to create and share videos, upload and tag photos, comment on photos and videos, summarize
important lecture notes, and record demonstrations (Bussert, Brown, & Armstrong, 2008;
Lehmen, Dufren, & Lehman, 2010; Lichter, 2012; Orùs, 2016; Price, Tsui, Hart, & Saucedo,
2011). While the research is clear regarding the benefits of social media use for learning, it is
lacking in the area of designing social media learning activities (SMLAs). In other words, how
are faculty integrating SMLAs into their teaching? Is there a well-defined process that guides the
design of SMLAs?
Social Media Learning Design Frameworks
Existing Web 2.0/social media learning design frameworks have taken into consideration
the interaction between technology and pedagogy. Bower, Hedberg, and Kuswara (2010) proposed
a Web 2.0 learning design process with the following steps: (a) identifying learning goals; (b)
identifying the type of knowledge that students should gain from the activity; (c) identifying the
cognitive processes that students should engage in; (d) selecting the type of pedagogy; and finally
(e) selecting the “preferred modalities of representation,” such as audio, video, and text. Two
main components of Bower et al.’s (2010) Web 2.0 learning design process are the cognitive
processes established by Bloom’s Taxonomy of Cognitive Domains and the knowledge
dimensions, or types of knowledge (factual, conceptual, procedural, and metacognitive), proposed
by Anderson and Krathwohl (2001). Bower et al. presented a conceptual framework that
cross-tabulated Bloom’s revised cognitive processes with the types of knowledge and another
component, types of online pedagogies.
Similarly, Karvounidis, Chimos, Bersimis, and Douligeris (2015) presented i-SERF as a
guiding framework for the integration of social media in higher education. i-SERF is a two-layered
framework: the first layer is educational and draws on the interaction between three forms of
knowledge (content, technology, and pedagogy), while the second layer proposes an evaluation
methodology for the first. This framework adds elements of learner self-regulation and
self-evaluation that were missing in previous frameworks (Karvounidis, Chimos, Bersimis, &
Douligeris, 2018).
Since Bloom’s taxonomy plays a key role in the design of learning activities, Bosman and
Zagenzysk (2011) and Lightle (2011) interpreted social media learning using Bloom’s Taxonomy.
For instance, they reported that social bookmarking promotes remembering, social blogging
promotes understanding, social file sharing supports applying, social collaboration supports
analyzing, social decision-making tools stimulate evaluating, and social creativity sharing tools
promote creating. However, Bosman and Zagenzysk’s (2011) and Lightle’s (2011) analyses of
social media in light of Bloom’s taxonomy are only perceptual. Hence, there is a need to formalize
our understanding of social media use for learning, and of the levels of cognitive skills and types
of knowledge it promotes, through evidence-based research.
Current Study and Research Questions
This study aimed to explore how experienced faculty are using social media to support
learning activities in their courses. More specifically, it aimed to analyze social media learning
activities (SMLA) in light of cognitive processes and types of knowledge that students engage in
when completing these activities. Research questions addressed in this study were:
a. What types of learning activities are designed through social media?
b. What cognitive processes do SMLA promote?
c. What types of knowledge do SMLA promote?
d. What strategies do experienced faculty use to design SMLA?

Method
This study was conducted in a public higher education institution in the mid-Atlantic region
of the U.S. A qualitative approach was used, with quantification of some results. A multiple
case-study design was implemented, and data were gathered from five cases involving six faculty
(n = 6) who had been using social media in their courses for at least two years. Students enrolled
in the six courses taught by the faculty participants were considered secondary participants and
consented to observation of their course-related posts in the examined SMLAs. Of the 279
students enrolled in the six courses, 115 (n = 115) gave the researcher consent to observe their
course-related social media posts.

Table 1
Description of Participants

Faculty | Course (abbreviation) | Semesters teaching course with SM | Delivery format | Years in higher ed | Year started using SM | Students per course | Student consents
Faculty A | Digital Future: Digital Activism (DFDA) | 2 | Hybrid, 6 credits | 17 | 1997 | 18 | N/A
Faculty B1 | Food, Culture, and Technology (FTC) | 3 | Face-to-face, non-credit | 15 | 2007 | 6 | 5
Faculty B2 | Food, Culture, and Technology (FTC) | 3 | Face-to-face, non-credit | 18 | 2010 | -- | --
Faculty C | Leading Change (LC) | 3 | Face-to-face, 4 credits | 19 | 2009 | 25 | 22
Faculty C | Leadership Theory and Practice (LTP) | 3 | Face-to-face, 3 credits | 19 | 2009 | 20 | 16
Faculty D | Introduction to Digital Studies (IDS) | 1 | Face-to-face, 3 credits | 9 | 2005 | 25 | 22
Faculty E | Introduction to Business Information Systems (IBIS) | 5 | Face-to-face, 3 credits | 3 | 2011 | 185 | 50

Note. Dashes indicate cells left blank in the original table.

Data Sources
Data sources included syllabi and course documents describing the social media learning
activities (SMLAs), students’ posts in the SMLAs, and initial and follow-up faculty interviews.
The syllabi and the descriptions of the SMLAs provided baseline data about the requirements and
deadlines that guided the analyses of the SMLAs. Faculty participants were interviewed at the
beginning and end of the semester, in initial and follow-up interviews, giving them the freedom
to express the full range of their perceptions about the use of social media in their courses
(Maxwell, 2013). Both interviews were semi-structured and included open-ended questions. In
the initial interview, faculty were asked to analyze their SMLAs in light of Bloom’s taxonomy
and were asked about their perceptions of social media for supporting student learning, the criteria
they use to choose their social media tools, and the strategies they used to develop the learning
activities involving social media. In the follow-up interview, faculty were asked to describe their
experiences with the outcomes of the social media activity, whether it achieved what it was
intended to achieve, the types of knowledge that students gained, and the revisions they would
make to their SMLAs. The social media platforms used by the faculty and students were also
observed online, and students’ posts and interactions in the SMLAs were analyzed. The focus of
the observations was to identify the cognitive processes and knowledge domains evident in
students’ SMLA posts.
Data Analysis
Influenced by Bower et al.’s conceptual framework for Web 2.0 learning design, two
taxonomies guided the analysis of the SMLAs in this study: original and digital versions of
Bloom’s Taxonomy of Cognitive Domain (Churches, 2009) (see Figure 1), and Knowledge
Dimensions or Types of Knowledge (Anderson & Krathwohl, 2001).
Krathwohl (2002) provided a detailed explanation of the different types of knowledge:
• Factual Knowledge—The basic elements that students must know to be acquainted with a
discipline or solve problems in it.
• Conceptual Knowledge—The interrelationships among the basic elements within a larger
structure that enable them to function together.
• Procedural Knowledge—How to do something; methods of inquiry, and criteria for using
skills, algorithms, techniques, and methods.
• Metacognitive Knowledge—Knowledge of cognition in general as well as awareness and
knowledge of one's own cognition (p. 215).
As presented in Tables 3 and 4, both the researcher “R” and the faculty participants “F”
analyzed the social media activities as described in the syllabi. Content analysis of SMLAs was
conducted using preestablished categories pulled from Bloom’s Taxonomy of Cognitive Domains
and Krathwohl’s (2000) Knowledge Dimensions. The students’ posts in social media were also
analyzed by the researcher using the preestablished categories. Percentages in Tables 3 and 4
suggest the extent to which cognitive processes and knowledge dimensions where evident in the
students’ posts. The boxes that include “F” indicate that Faculty identified the presence of the
corresponding cognitive process or knowledge domain in the SMLA and the “R” shows the
researcher’s analysis of the SMLAs. Patterns relevant to the absence of cognitive processes and
knowledge domains were identified based on triangulated data from faculty analysis, researcher’s
analysis, and students’ posts. In some boxes, the researcher’s analysis and the analysis of students’
posts highly converged, as indicated by a percentage greater than 50%.
To achieve fairness in the analysis of students' posts in SMLAs, 30% of the posts in each
SMLA were selected, resulting in a total of 343 student posts analyzed. The posts were sampled
from the beginning, middle, and end of each activity in order to analyze students' work across the
whole activity. The researchers conducted the same analysis to establish
inter-rater reliability.
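The sampling mechanics are not spelled out beyond this description; the sketch below shows one way such a draw could work, assuming posts are stored in chronological order (the function and variable names are hypothetical):

```python
import math
import random

def sample_posts(posts, fraction=0.30, seed=42):
    """Draw roughly `fraction` of a student's posts, spread across the
    beginning, middle, and end thirds of the activity (posts assumed to
    be in chronological order)."""
    random.seed(seed)
    target = max(1, math.ceil(len(posts) * fraction))
    third = len(posts) // 3
    segments = [posts[:third], posts[third:2 * third], posts[2 * third:]]
    per_segment = max(1, target // 3)
    sample = []
    for segment in segments:
        sample.extend(random.sample(segment, min(per_segment, len(segment))))
    return sample

posts = [f"post_{i}" for i in range(20)]  # 20 hypothetical posts
print(sample_posts(posts))                # ~6 posts spread across the activity
```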
Initial and follow-up interviews were analyzed using deductive coding (Miles &
Huberman, 1994). Deductively, categories from the initial and follow-up interview questions were
first established based on the research questions that were addressed in the interviews. Further,
open coding was conducted to analyze data that did not align with the preestablished categories.
Credibility was established by obtaining member checks, triangulation of data, and long-term
involvement in data collection. Since this multiple-case study is holistic in nature, a meta-matrix
was created in order to focus on the findings across cases rather than on every individual case
(Miles & Huberman, 1994).
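The meta-matrix itself is not reproduced in the paper; below is a minimal sketch of how such a cross-case display could be assembled, with hypothetical cases and themes (the pandas usage is illustrative, not the authors' procedure):

```python
import pandas as pd

# Hypothetical coded interview segments: one row per (case, theme) occurrence.
codes = pd.DataFrame({
    "case":  ["Faculty A", "Faculty A", "Faculty B1", "Faculty C", "Faculty C"],
    "theme": ["affordances", "dialogue", "affordances", "course fit", "affordances"],
})

# Meta-matrix: cases as rows, themes as columns, counts of coded segments,
# so patterns can be read across cases rather than case by case.
meta_matrix = codes.pivot_table(index="case", columns="theme",
                                aggfunc="size", fill_value=0)
print(meta_matrix)
```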

Figure 1. Bloom’s digital taxonomy (adapted from Churches [2009]).

Research Findings
The analysis revealed that overall, social media has the potential to support student learning
and promote different levels of cognitive processes and types of knowledge. The findings are
reported by research questions below.
Research Question 1: What types of learning activities are designed through social media?
A total of 12 SMLAs across the five cases were identified in this study. Of these
activities, four were microblogging activities, two were blogging activities, three were wiki
activities, one was a podcasting activity, one was an infographic activity integrated into a blog,
and one was a YouTube activity (see Table 2). Of the 12 SMLAs, two were unstructured and used
for informal class reminders, announcements, and discussions, while 10 were structured, graded,
and described in the syllabi. The latter represented 5% to 100% of the total course grade. Seven
of the 10 structured activities were mandatory and the remaining three were optional. In the
optional activities, students could choose between an SMLA and traditional non-social media
activities identified in the course syllabus that would count toward the course grade.

Table 2
Social Media Learning Activities (SMLAs) Included in the Study

Social Media / Course Title | SMLA | Private vs. Public | Structured vs. Unstructured | Mandatory vs. Optional | Course Grade Percentage
Microblogs / Leading Change (LC) | Personal Transformation Experiment using Twitter | Public | Structured | Mandatory | 15%
Microblogs / Introduction to Business Information Systems (IBIS) | Online Class Participation at Twitter | Public | Structured | Optional | 5%
Microblogs / Digital Futures: Digital Activism (DFDA) | Digital Activism Twitter Projects | Public | Structured | Mandatory | 15%
Microblogs / DFDA | Twitter in-class and small-group participation | Public | Unstructured | Optional | Unspecified
Blogs / Food, Culture and Technology (FCT) | Language Blog | Public | Structured | Mandatory | 100%
Blogs / Introduction to Digital Studies (IDS) | Digital Studies Course Blog | Public | Structured | Mandatory | 20%
Wiki / Leadership Theory and Practice (LTP) | Collaborative Note-Taking | Private | Structured | Mandatory | 25%
Wiki / LTP | Wiki as LMS | Private | Unstructured | Optional | Unspecified
Podcasts / FCT | Podcasting | Public | Structured | Mandatory | Unspecified
Infographic / FCT | Creating Infographics | Public | Structured | Mandatory | Unspecified
YouTube / DFDA | Participatory Action Video using YouTube | Private | Structured | Mandatory | 50%
Wikipedia / DFDA | Wikipedia | Public | Structured | Optional | 25%

The social media technologies supporting the SMLAs were either private or open to the
public, allowing any person to observe the students' work or interact with them. Nine of the 12
SMLAs were public and three were private. Microblogging (Twitter) activities were all public
because the tool does not have private features. Two blogging activities were public; both were
also searchable online, although only specified users could contribute to them. Wiki activities were
private, and access to them required an invitation from the wiki administrator. The Wikipedia
activity, however, was public because students had to edit an existing Wikipedia entry and could
get feedback on their edits from the public. Podcast and infographic activities were public since
they were posted on a public blog, while the YouTube activity was private, since students posted
their videos privately to YouTube and only students and faculty had access to them.
Research Question 2: What cognitive processes do SMLAs promote?
As explained in the data analysis, the SMLAs were analyzed using Bloom’s original and
digital taxonomies of cognitive processes to identify the levels of cognitive processes that students
were expected to achieve while completing the learning activities, as well as evidence of students'
cognitive processes in their SMLA posts. The analysis of the data across courses and social media
technologies revealed two overarching themes. First, both higher and lower levels of cognitive
processes were evidenced through SMLAs. Second, alignment was perceived between particular
social media affordances and cognitive processes.
These overarching themes were based on common patterns observed in the analysis. Based
on Bloom’s Digital Taxonomy, “Remembering” and “Understanding” were perceived as basic
cognitive processes promoted in all the examined SMLAs (see Table 3). The analysis of blogging
and wiki activities revealed that blogs and wiki SMLAs may promote several cognitive processes
ranging from “Remembering” to “Creating.” Furthermore, the analysis suggested that higher levels
of cognitive processes may be promoted mainly by blogs, wikis, and media sharing tools such as
the Collaborative Note Taking activity, the Language Blog, and the Digital Studies Course Blog.
Finally, the results suggested that SMLAs may promote “Analyzing” through hyperlinking and
may promote “Evaluating” through judging and critiquing peer work.

Table 3
Sample Analysis of SMLAs Based on Bloom's Taxonomy (cognitive processes: Remember, Understand, Apply, Analyze, Evaluate, Create)

Twitter: Personal Transformation Experiment (PTE) | F, F | R, R | 37.2%, 95%, 8.13%, 12.8%, 33.7%
Twitter: Online Course Participation (and sharing resources) | F, F | R, R, R, R | 50%, 22%, 12.9%, 1.2%

Note. Letter "F" indicates the faculty member's content analysis of the SMLA as presented in the syllabus. Letter
"R" shows the researcher's content analysis of the SMLA as described in the syllabus. The percentages show the
researcher's analysis of the presence of cognitive processes in the students' posts on social media.
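The percentages in Tables 3 and 4 read as the share of sampled posts in which the researcher coded a category as present; a short illustration with hypothetical codings (the post tags are invented):

```python
# Each sampled post is tagged with the cognitive processes a coder judged
# to be present (categories from Bloom's taxonomy).
coded_posts = [
    {"Remember", "Understand"},
    {"Understand", "Analyze"},
    {"Understand"},
    {"Remember", "Understand", "Evaluate"},
]

PROCESSES = ["Remember", "Understand", "Apply", "Analyze", "Evaluate", "Create"]

# Percentage = share of sampled posts in which a process was observed.
for process in PROCESSES:
    share = 100 * sum(process in post for post in coded_posts) / len(coded_posts)
    print(f"{process}: {share:.1f}%")
```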

Research Question 3: What types of knowledge do SMLAs promote?


Both faculty and the researcher analyzed the knowledge domains perceived in SMLAs as
listed in the course syllabi and documents. The students’ posts in SMLAs were also analyzed in
terms of types of knowledge using pre-established categories (see Table 4). The convergence in
the data analysis revealed two overarching themes: all types of knowledge were evidenced through
SMLAs, and there was a perceived alignment between particular social media affordances and types
of knowledge.
The overarching themes were based on common patterns observed in the data analysis. The
analysis of students’ posts in seven SMLAs revealed that students achieved factual knowledge
about the course content in five out of the seven observed SMLAs, which suggests that factual
knowledge is a common outcome in SMLAs. Results also suggested that linking and tagging in
SMLAs promoted conceptual knowledge, especially in activities that required students to use
hashtags or include links to external resources. Procedural knowledge was mainly evident in
activities that required students to create a product, such as a personal language blog, podcasts, or
infographics. In these activities, students engaged in a procedure to create the final product and
learned how to use the technology. Finally, metacognitive knowledge was identified in three of the seven
activities that were examined. Students were expected to think about their learning or how they
might use the subject matter to reflect on their own cognition. The design of the SMLAs in these
courses suggested that students had several chances to reflect on their learning and revise their
posts before sharing them.
Table 4
Sample Analysis of the SMLAs Based on Krathwohl's (2002) Knowledge Dimensions (knowledge domains: Factual, Conceptual, Procedural, Metacognitive)

Twitter: Personal Transformation Experiment (PTE) | F | R, R, 1%, R | 44%, 65%, 60%
Twitter: Online course participation | F | R, F | 68%, 54%

Note. Letter "F" indicates the faculty member's content analysis of the SMLA as presented in the syllabus and
shows that the knowledge domain was present in this SMLA. Letter "R" shows the researcher's content analysis of
the SMLA as described in the syllabus. The percentages show the researcher's analysis of the presence of knowledge
domains in the students' posts on social media.

Research Question 4: What strategies do experienced faculty use to design SMLAs?


Faculty experienced with social media were selected for this study in order to capture best
practices in designing SMLAs. In the initial interview, faculty were asked about the criteria they
used to select social media technologies and how they paired those technologies with the learning
activity. The follow-up interview captured faculty reflections on the SMLAs and suggestions for
future revisions. The two overarching themes that emerged in the data collected for this research question
were Faculty Reliance on Social Media Affordances and Fit With their Course Content and
Integrating Additional Media Sources to Enhance SMLAs. These themes were the result of six
common patterns observed across findings related to strategies that faculty use when designing
SMLAs:
a. Matching the discipline with the social media;
b. Selecting social media based on affordances;
c. Taking advantage of the affordances of social media;
d. Including media sharing (website, video, audio) in the SMLA;
e. Integrating tools or social media affordances that support dialogue; and
f. Making the SMLA mandatory and not optional.

Discussion
Evidence of Several Cognitive Processes and Types of Knowledge in SMLAs
The SMLAs analyzed in this study suggested that all social media tools could promote more
than one type of knowledge or level of cognitive processes depending on the design of the SMLA
and how students use the social media technology, a finding that is in line with Bower et al. (2010),
Bosman and Zagenczyk (2011), Lightle (2011), and Gülbahar, Rapp, Kilis, and Sitnikova (2017).
Findings suggested that wiki SMLAs can promote all levels of cognitive processes, and can
support Factual, Conceptual, and Metacognitive knowledge. Blog SMLA can also foster all levels
of cognitive processes and can support all types of knowledge, a finding that resonates with
Gülbahar et al. (2017). Microblog SMLA can promote Remembering, Understanding, and
Analyzing, and foster Factual, Conceptual, and Metacognitive Knowledge. Podcast SMLA can
support Creating, Applying, and Remembering, and promote all types of knowledge. Finally,
media editing and sharing SMLA can support Creating, Understanding, and Remembering, and
promote Factual, Conceptual, and Metacognitive Knowledge.
The Absence of Dialogue
One of social media’s roles is to promote social presence through social networking in
addition to shareable user-generated content (Anderson, 2017). Because social media is grounded in
Bandura's social learning theory, it is supposed to enhance students' self-efficacy beliefs through
social interaction in a low-risk environment (Deaton, 2015). The examined SMLAs in this study
did not require conversational or interaction tasks among students. This was evidenced in the
description of the SMLAs and in the deactivation of the comment feature in the blogging activities,
the lack of comments in wikis, and sparse commenting or retweeting between students on Twitter.
As a result, communication took place mainly between faculty and students rather than between
students and students or students and others. This suggested that the SMLAs were designed mainly
at the level of "private information management" and "basic interaction or sharing," without
taking advantage of the social networking affordance of social media (Dabbagh & Reo, 2011a).

Social Media as Learning Management Systems


Analysis and observations of SMLAs revealed that four out of the five cases in this study
used mostly wikis and blogs as social media technologies in their courses, a finding that concurs
with Moran, Seaman, and Tinti-Kane (2012), who suggested that wikis and blogs are faculty’s
most adopted social media tools for teaching. Furthermore, the findings revealed that social media
was used to replace Learning Management Systems (LMSs) to share course content and communicate
with students. More specifically, in courses where blogs and wikis were used, the faculty did not
use the institution’s LMS to share content and communicate with students. Rather, wikis and blogs
were used as an integrative platform to share content with students, post assignment descriptions,
and allow students to share their work. In other studies, Meishar-Tal, Kurtz, and Pieterse (2013),
Zachos et al. (2018), and Salavuo (2008) reported the advantages of using social media as LMS in
promoting collaboration and active learning over traditional institutional LMSs.
This study went a step further and suggested that the public nature of blogs gives them an
advantage over LMSs, which are limited to the course participants. For instance, public blogging
activities made students’ work visible beyond their peers and teacher reaching out to a public
audience which made their posts of higher quality and activated their metacognitive knowledge.
In line with this finding, Chawinga (2017) reported the benefits of blogs as tools that allow students
to write longer posts and comments because there is no word limit, which supports self-expression
and self-reflection (Deng & Yuen, 2011). Previous studies revealed that blogs are used in some
cases as LMSs and as platforms for students to access course materials and comment on each
other's blogs, and in other cases as reflective journals or personal writing sites (Churchill, 2009;
Farwell & Kruger-Ross, 2013; Gedera, 2011; Yang & Chang, 2012).
Wikis are primarily used as collaboration tools and support peer reviewing and editing
(Abdekhodaee, Chase, & Ross, 2017; Donne & Lin, 2012; Franklin & Thankachan, 2011;
Menkhoff & Bengtsson, 2012; Oskoz & Elola, 2011; Park et al., 2010). The wikis examined in
this study resembled LMSs in their private access, but little evidence of student social interaction
was perceived. Hence, this study revealed that blogs and wikis were used for sharing course
content and assignments rather than promoting social interaction and collaboration among
students.
Strategies for Designing SMLAs
This study did not reveal a formal approach or strategy for designing SMLAs. Rather,
experienced faculty approached this task differently based on their familiarity with social media
technology, the popularity of the tool in their discipline, and affordances of the technology.
However, in the follow-up interviews, faculty suggested that SMLAs should be mandatory because
students should learn to experiment with technology. This finding resonated with Lin, Hoffman,
and Borengasser (2013), who explained that Twitter activities should be structured and mandatory
so that students participate in them.
Bower et al. (2010) explained that the design of the learning activity and the selection of
social media are interdependent. When the faculty in this study designed the SMLAs, some selected
the social media technology intuitively because they had been using it for a while, while others
designed the activity first and then selected the social media whose affordances supported the
learning goals of the activity. Still others selected social media technologies because they were
popular and could be experimented with to add an innovative layer to course delivery. Therefore,
experienced faculty's strategies for designing SMLAs concur with Bower et al. (2010), who
emphasized the interdependence between the social media tool and the design of the learning
activity. Integrating different media sources within a SMLA was another design feature that
faculty recommended to help students gather
information from different sources, a finding confirmed by Soares (2008).
The findings also revealed that while faculty were not aware of Bloom’s Taxonomy or did
not design SMLAs with cognitive processes and types of knowledge in mind, the researcher's
analysis showed that SMLAs promoted different cognitive processes and different types of
knowledge. This finding suggests that these faculty had little pedagogical training. In previous studies,
Keengwe, Kidd, and Kyei-Blankson (2009) and Hughes and Zulkifli (2012) explained that faculty
need organizational support and technology training in order to use technology in their teaching.
Twitter as a Popular Course Tool
Although Moran et al. (2012) revealed that faculty use Twitter the least in their courses,
Twitter was used by three faculty participants in three out of five cases in this study. Twitter
assignments in this study were mainly micro-reflection activities and course participation tweets
about course topics. A more informal activity was in-class participation using Twitter. The findings
in this study concurred with previous studies that revealed Twitter as a reflection tool and a
platform to post tweets about course related topics (Domizi, 2013; Fox & Varadarajan, 2011;
Junco, Heiberger, & Loken, 2011; Lin, Hoffman, & Borengasser, 2013). However, there was little
evidence of communication using Twitter in the observed SMLAs, a finding that contradicted
previous research that claimed Twitter is a tool that supports communication with the professor
and classmates (Fox & Varadarajan, 2011; Junco et al., 2011).

Conclusion
This study and previous studies implied that social media technologies may engage students
with the subject matter when integrated into course learning activities. Hence, designing SMLAs that
take into account the technology affordances of social media can engage students’ higher levels of
cognitive processes and knowledge.
Findings from this study suggested that faculty use of social media in their courses varies.
SMLAs can promote learning as perceived by faculty participants in this study. The study also
suggested that wikis and blogs may be used in place of an LMS, as perceived by faculty in this study.
Furthermore, well-structured SMLAs should take into consideration the social affordances of the
tools to optimize these activities, and designing SMLAs is a reciprocal process of selecting social
media affordances and fitting the tool to the activity. Mandatory use of SMLAs in courses may
ensure student engagement. The study also suggested a perceived disconnect between the cognitive
processes and types of knowledge that faculty intended and those observed in SMLAs. As a
result, faculty should receive pedagogical training and support to design more effective SMLAs.
Although the study examined the use of social media in higher education within cases and
across cases, because of the nonexperimental design of the study, the impact of social media activities
on students’ learning was not measured. Furthermore, the study was limited to faculty perceptions and
students’ posts in social media. Hence, students’ perceptions about these SMLA were not explored.
Due to the complexity of cognitive processes, identification of students’ processes was limited in cases
where students had short posts on social media. Furthermore, this study included faculty from a single
institution, which might have limited the external validity and the generalizability of the study. Further
research could involve students in the evaluation of these SMLAs and their impact on student learning.

References

Abdekhodaee, A., Chase, A.-M., & Ross, B. (2017). Wikis for group work: Encouraging
transparency, benchmarking, and feedback. Australasian Journal of Educational
Technology, 33(5), 15–31. doi:10.14742/ajet.2829
Anderson, T. (2017). How communities of inquiry drive teaching and learning in the digital age.
Retrieved from https://teachonline.ca/tools-trends/insights-online-learning/2018-02-27/how-
communities-inquiry-drive-teaching-and-learning-digital-age
Anderson, L., & Krathwohl, D. (2001). A taxonomy for learning, teaching and assessing: A revision
of Bloom’s taxonomy of educational objectives. Longman.
Andrade, A., Castro, C., & Ferreira, S. A. (2012). Cognitive communication 2.0 in higher education:
To tweet or not to tweet? Electronic Journal of E-learning, 10(3), 293–305.
Bannon, D. (2012, December 4). State of the media: The social media report 2012. The Nielsen
Company. Retrieved from http://www.nielsen.com/us/en/reports/2012/state-of-the-media-the-social-media-
report-2012.html
Berthoud, L., & Gliddon, J. (2018). Using wikis to investigate communication, collaboration and
engagement in Capstone engineering design projects. European Journal of Engineering
Education, 43(2), 247–263. doi:10.1080/03043797.2017.1332574
Bosman, L., & Zagenczyk, T. (2011). Revitalize your teaching: Creative approaches to applying
social media in the classroom. Social media and Platforms in Learning Environment, 3–15.
doi:10.1007/978-3-642-20392-3_1
Bower, M., Hedberg, J. G., & Kuswara, A. (2010). A framework for Web 2.0 learning design.
Educational Media International, 47(3), 177–198. doi:10.1080/09523987.2010.518811
Brooks, D. C., & Pomerantz, J. (2017). ECAR study of undergraduate students and information
technology, 2017. Research report. ECAR. Retrieved from
https://library.educause.edu/~/media/files/library/2017/10/studentitstudy2017.pdf
Bussert, K., Brown, N. E., & Armstrong, A. H. (2008). IL 2.0 at the American University in Cairo.
Internet Reference Services Quarterly, 13(1), 1–13. doi:10.1300/J136v13n01_01
Cain, J., & Policastri, A. (2011). Using Facebook as an informal learning environment. American
Journal of Pharmaceutical Education, 75(10), 207. Retrieved from
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3279026/
Chawinga, W. D. (2017). Taking social media to a university classroom: teaching and learning using
Twitter and blogs. International Journal of Educational Technology in Higher Education, 14(3).
doi:10.1186/s41239-017-0041-6
Churches, A. (2009). Bloom’s digital taxonomy. Retrieved from
https://www.researchgate.net/publication/228381038_Bloom's_Digital_Taxonomy
Churchill, D. (2009). Educational applications of Web 2.0: Using blogs to support teaching and
learning. British Journal of Educational Technology, 40(1), 179–183. doi:10.1111/j.1467-
8535.2008.00865.x
Dabbagh, N., & Reo, R. (2011a). Impact of Web 2.0 on higher education. In D.W. Surry, T. Stefurak,
and R. Gray (Eds.), Technology integration in higher education: Social and organizational
aspects (pp. 174–187). IGI Global.

Dabbagh, N., & Reo, R. (2011b). Back to the future: Tracing the roots and learning affordances of
social software. In M.J.W. Lee and C. McLoughlin (Eds.), Web 2.0-based e-Learning: Applying
social informatics for tertiary teaching (pp. 1–20). IGI Global.
Dahlstrom, E., Brooks, D. C., Grajek, S., & Reeves, J. (2015). ECAR study of students and
information technology. Educause. Retrieved from
https://library.educause.edu/resources/2015/8/~/media/24ddc1aa35a5490389baf28b6ddb3693.as
hx
Deaton, S. (2015). Social learning theory in the age of social media: implications for educational
practitioners. Journal of Educational Technology, 12(1), 1–6.
Del Valle, M. E., Gruzd, A. Haythornthwaite, C. Paulin, D., & Gilbert, S. (2017). Social media in
educational practice: faculty present and future use of social media in teaching. Proceedings of
the 50th Hawaii International Conference on System Sciences. Retrieved from
https://www.researchgate.net/profile/Marc_Esteve_Del_Valle/publication/317121792_Social_M
edia_in_Educational_Practice_Faculty_Present_and_Future_Use_of_Social_Media_in_Teachin
g/links/59fc3bad0f7e9b9968bb8fc9/Social-Media-in-Educational-Practice-Faculty-Present-and-
Future-Use-of-Social-Media-in-Teaching.pdf
Domizi, D. P. (2013). Microblogging to foster connections and community in a weekly Graduate
seminar course. TechTrends, 57(1), 43–51. doi:10.1007/s11528-012-0630-0
Donne, V., & Lin, F. (2012). Special education teacher induction: The Wiki way. The Clearing
House, 86, 43–47. doi:10.1080/00098655.2012.735279
Deng, L., & Yuen, A. H. (2011). Towards a framework for educational affordances of blogs.
Computers & Education, 56(2), 441–451.
Farwell, T. M., & Kruger-Ross, M. (2013). Is there (still) a place for blogging in the classroom?
Using blogging to assess writing, facilitate engagement, and evaluate student attitudes. In K.K.J.
Seo (Ed.), Using social media effectively in the classroom: blogs, wikis, Twitter, and more (pp.
207–222). Routledge.
Fox, B., & Varadarajan, R. (2011). Technology in pharmacy education: Use of Twitter to encourage
interaction in a multi-campus pharmacy management course. American Journal of
Pharmaceutical Education, 75(5), 1–9. doi:10.5688/ajpe75588
Gedera, D. S. P. (2011). Integration of weblogs in developing language skills of ESL learners.
International Journal of Technology in Teaching and Learning, 7(2), 124–135.
Gülbahar, Y., Rapp, C., Kilis, S., & Sitnikova, A. (2017). Enriching higher education with social
media: development and evaluation of a social media toolkit. International Review of Research
in Open and Distributed Learning, 18(1), 23–39.
Hu, Q., & Johnston, E. (2012). Using a wiki-based course design to create a student-centered
learning environment: strategies and lessons. Journal of Public Affairs Education, 18(3), 493–
512.
Hung, H.-T., & Yuen, S. C.-Y. (2010). Educational use of social networking technology in higher
education. Teaching in Higher Education, 15(6), 703–714. doi:10.1080/13562517.2010.507307
Irwin, C., Ball, L., Desbrow, B., & Leveritt, M. (2012). Students’ perceptions of using Facebook as
an interactive learning resource at university. Australasian Journal of Educational Technology,
28(7), 1221–1232.

Junco, R., Heiberger, G., & Loken, E. (2011). The effect of Twitter on college student engagement
and grades. Journal of Computer Assisted Learning, 27(2), 119–132. doi:10.1111/j.1365-
2729.2010.00387.x
Junco, R. (2012). Too much face and not enough books: The relationship between multiple indices of
Facebook use and academic performance. Computers in Human Behavior, 28(1), 187–198.
doi:10.1016/j.chb.2011.08.026
Kaplan, A. M., & Haenlein, M. (2010). Users of the world, unite! The challenges and opportunities of
social media. Business Horizons, 53(1). doi:10.1016/j.bushor.2009.09.003
Karvounidis, T., Chimos, Κ., Bersimis, S., & Douligeris, C. (2015). I-SERF: An integrated self-
evaluated and regulated framework for deploying web 2.0 Technologies in Higher Education.
The Electronic Journal of e-Learning, 13(5), 319–333.
Karvounidis, T., Chimos, K., Bersimis, S., & Douligeris, C. (2018). Factors, issues and
interdependencies in the incorporation of a web 2.0 based learning environment in higher
education. Education and Information Technologies, 23(2), 935–955. doi:10.1007/s10639-017-
9644-8
Kitsantas, A., & Dabbagh, N. (2010). Learning to learn with Integrative Learning Technologies
(ILT): A practical guide for academic success. Information Age Publishing.
Krathwohl, D. R. (2002). A revision of Bloom’s taxonomy: An overview. Theory into Practice,
41(4), 212–218.
Lehman, C. M., DuFrene, D. D., & Lehman, M. W. (2010). YouTube video project: A “Cool” way to
learn communication ethics. Business Communication Quarterly, 73(4), 444–449.
doi:10.1177/1080569910385382
Lichter, J. (2012). Using YouTube as a platform for teaching and learning solubility rules. Journal of
Chemical Education, 89, 1133–1137. doi:10.1021/ed200531
Lin, M. F. G., Hoffman, E. S., & Borengasser, C. (2013). Is social media too social for class? A case
study of Twitter use. TechTrends, 57(2), 39–45. doi:10.1007/s11528-013-0644-2
Maxwell, J. A. (2013). Qualitative research design: An interactive approach. SAGE Publications.
Meishar-Tal, H., Kurtz, G., & Pieterse, E. (2013). Facebook groups as LMS: A case study. The
International Review of Research in Open and Distance Learning, 13(4), 33–48.
Menkhoff, T., & Bengtsson, M. L. (2011). Engaging students in higher education through mobile
learning: lessons learnt in a Chinese entrepreneurship course. Educational Research for Policy
and Practice, 11(3), 225–242. doi:10.1007/s10671-011-9123-8
Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook. (2nd
ed.). Sage.
Omar, H., Embi, M. A., & Yunus, M. M. (2012). ESL learners’ interaction in an online discussion
via Facebook. Asian Social Science, 8(11), 67–74. doi:10.5539/ass.v8n11p67
Orús, C., Barles, M. J., Belanche, D., Casalò, L., Fraj, E., & Gurrea, R. (2016). The effects of
learner-generated videos for YouTube on learning outcomes and satisfaction. Computers and
Education, 95, 254–269. doi:10.1016/j.compedu.2016.01.007

Oskoz, A., & Elola, I. (2011). Meeting at the Wiki: The new arena for collaborative writing in foreign
language courses. In M. J. W. Lee and C. McLoughlin (Eds.), Web 2.0-based e-Learning:
Applying social informatics for tertiary teaching (pp. 209–227). IGI Global.
Park, C. L., Crocker, C., Nussey, J., Springate, J., & Hutchings, D. (2010). Evaluation of a teaching
tool—wiki—in online graduate education. Journal of Information Systems Education, 21(3),
313–321.
Perrin, A. (2015). Social Networking Usage: 2005–2015. Pew Research Center. Retrieved from
http://www.pewinternet.org/2015/10/08/2015/Social-Networking-Usage-2005-2015/
Pomerantz, J., & Brooks, C. (2017). ECAR study of faculty and information technology. Educause.
Retrieved from
https://library.educause.edu/~/media/files/library/2017/10/facultyitstudy2017.pdf
Price, E., Tsui, S., Hart, A., & Saucedo, L. (2011). Don’t erase that whiteboard! Archiving student
work on a photo-sharing website. The Physics Teacher, 49(7), 426. doi:10.1119/1.3639151
Rambe, P. (2012). Constructive disruptions for effective collaborative learning: Navigating the
affordances of social media for meaningful engagement. Electronic Journal of e-Learning,
10(1), 132–146.
Ravenscroft, A., Warburton, S., Hatzipanagos, S., & Conole, G. (2012). Designing and evaluating
social media for learning: shaping social networking into social learning? Journal of Computer
Assisted Learning, 28(3), 177–182. doi:10.1111/j.1365-2729.2012.00484.x
Seaman, J. & Tinti-Kane, H. (2013). Social media for teaching and learning. Pearson. Retrieved
from http://www.pearsonlearningsolutions.com/assets/downloads/reports/social-media-for-
teaching-and-learning-2013-report.pdf#view=FitH,0
Smith, E. E. (2017). Social media in undergraduate learning: Categories and characteristics.
International Journal of Educational Technology in Higher Education, 14(12).
doi:10.1186/s41239-017-0049-y
Soares, D. A. (2008). Understanding class blogs as a tool for language development. Language
Teaching Research, 12(4), 517–533.
Valjataga, T., Pata, K., & Tammets, K. (2011). Considering students’ perspectives on personal and
distributed learning environments in course design. In M. J. W. Lee and C. McLoughlin (Eds.),
Web 2.0-based e-Learning: Applying social informatics for tertiary teaching (pp. 85–108). IGI
Global.
Yang, C., & Chang, Y.S. (2012). Assessing the effects of interactive blogging on student attitudes
towards peer interaction, learning motivation, and academic achievements. Journal of Computer
Assisted Learning, 28, 126–135. doi:10.1111/j.1365-2729.2011
Zachos, G., Paraskevopoulou-Kollia, E.-A., & Anagnostopoulos, I. (2018). Social media use in
higher education: A review. Education Sciences, 8(4), 194. doi:10.3390/educsci8040194


Shifting Teaching and Learning in Online Learning Spaces: An Investigation of a Faculty
Online Teaching and Learning Initiative
Jayson W. Richardson and John Eric M. Lingat
University of Kentucky

Ericka Hollis
Regis College

Mikah Pritchard
Eastern Kentucky University

Abstract
This article presents results from a study of a year-long, teaching and learning center-directed,
professional development initiative that focused on both the technology and the pedagogical
supports for online and blended course delivery at a research university. The purpose of this mixed
methods study was two-fold. The first purpose was to investigate pedagogical changes that
occurred as a result of the professional development that included a year-long faculty learning
community by exploring influences on pedagogical changes. The second purpose was to
understand the perceptions of the diffusion of innovations (DOI) characteristics that influenced the
level of adoption of online/blended teaching by faculty participants. A survey was used to measure
the perceived characteristics of innovation as defined in the theoretical framework. Following the
survey, one-on-one interviews that were linked to the DOI theoretical framework were conducted
to better understand those characteristics. The results presented herein focus on barriers,
challenges, and successes of adopting e-learning pedagogy in these online and blended learning
environments.

Keywords: online learning, higher education, professional development, diffusion of innovations
theory, faculty development, instructor development

Richardson, J.W., Hollis, E., Pritchard, M., & Lingat, J.E.M. (2020). Shifting teaching and
learning in online learning spaces: An investigation of a faculty online teaching and learning
initiative. Online Learning, 24(1), 67–91. https://doi.org/10.24059/olj.v24i1.1629

Review of Related Literature


Recent trends in higher education indicate that distance learning courses are in high
demand, with 31.6% of enrolled undergraduate, graduate, and noncredit students in the United
States taking at least one course in a fully online format (Allen & Seaman, 2018). Given the
popularity of online and blended courses and programs, 63.3% of chief academic officers in
institutions of higher education in the United States have integrated online or blended learning into
their long-term strategic planning (Allen & Seaman, 2018; Chen, Lambert, & Guidry, 2010). With
more than 6.3 million university students enrolled in online courses (Allen & Seaman, 2018), it
has become accepted that “institutions must ensure that online students receive high-quality
instruction, support services, and other fringe benefits enjoyed by traditional face-to-face students”
(Chen et al., 2010, p. 1229). Universities are attempting to meet this need through an array of
professional development opportunities for their instructors that focus on various aspects of
teaching and learning in a distance learning environment. While universities address the growing
complexities of distance education, studies regarding these efforts by institutions remain limited.
The existing literature often provides numerical figures that depict how many faculty
members adopted a given online teaching practice as a result of professional development.
Alternatively, researchers tend to list barriers or lessons learned that are disconnected from existing
innovation adoption or implementation theories. In contrast, some existing research focuses on the
design and implementation of a given professional development program without considering how
these design decisions influenced adoption decisions by faculty members. Seldom does the
existing research use theory to support the investigation of these practices. Hence, there is a dearth
of literature at the nexus of theory, experiences of the instructors, and professional development
for online teaching and learning in higher education.
The existing literature base contains several studies of adopting online teaching among
higher education instructors in specific fields where the focus is expanding upon the nuances of
that field such as with agriculture (e.g., Drape, 2013) or with nursing (e.g., Cash & Tate, 2012).
The literature, however, rarely focuses on researching the adoption of distance education through
a theoretical framework. Additionally, the details of how professional development influenced
faculty members’ teaching approaches are seldom told. When this story of adoption is told, it is
usually captured in a single survey as in the work conducted by Shea (2007) where the researcher
used a survey to capture motivating and demotivating factors to teaching online.
Of the studies reviewed that focused specifically on course instructors’ professional
development in higher education around online learning, only four studies were located that explicitly
noted a theoretical framework that grounded the study. For example, Barker (2003) researched faculty
development that used change theory to leverage faculty buy-in. Additionally, Shipman (2017) used
the Substitution, Augmentation, Modification, and Redefinition (SAMR) model, which focuses on
technology’s impact on teaching and learning, to identify challenges and barriers to technology use in
university classrooms. Shea, Pickett, and Li (2005) used the DOI theory as a lens to analyze
satisfaction with online learning among 913 faculty members in the State
University of New York (SUNY) Learning Network. A study by Wingo, Ivankova, and Moss (2017)
took a different approach and used the Technology Adoption Model (TAM) to organize a review of
the research to discuss what is known about faculty perceptions about teaching online. These theory-
driven research approaches to understanding the experiences of instructors at higher education
institutions with professional development for online and blended learning, though useful, remain
limited in the current literature body.

Nevertheless, there remains a paucity of, and yet increasing interest in, research focused
on how universities support e-learning efforts to improve online and blended teaching and
learning. As evidence, Mohr and Shelton (2017) conducted a four-survey-round Delphi study of
higher education leaders of online learning initiatives to determine best practices for online faculty
professional development. Mohr and Shelton found that professional development topics should
include training in faculty roles, classroom design, learning processes, and legal issues. This
research is compelling but does not bring to light the lived experiences of the stakeholders.
A limited number of existing studies of online professional development focus on training
faculty for blended course delivery (Childre & Van Rie, 2015; Linder, 2017; Littlefield, 2012;
Varkonyi, 2012), training faculty for online course delivery (Barker, 2003; Gunay, 2013; Keengwe
& Georgina, 2011), understanding factors that influence faculty satisfaction with asynchronous
teaching and learning (Fredericksen, Pickettt, Shea, Pelz, & Swan, 2000), and student engagement
in online learning (Chen et al., 2010). Few studies addressed both online and blended course
deliveries (Powell, 2010). Some studies take an anecdotal approach and explain how a given
training was conducted and what worked or did not work in that training (Linder, 2017; Terantino
& Agbehonou, 2012). Nevertheless, these studies lack a theory to drive the investigation.
Alas, the ever-changing nature of online and blended learning, coupled with a broad
conception of professional development, makes comparing studies difficult. For example, studies
of professional development around distance education in higher education institutions include on-
demand training (Sullivan, Burns, Gradel, Shi, Tysick, & van Putten, 2013), traditional seated
courses (Linder, 2017; Littlefield, 2012; Powell, 2010), workshops (Keengwe & Georgina, 2011),
and faculty mentorship programs (Barker, 2003; Childre & Van Rie, 2015). Despite these efforts,
there is a lack of empirical research that connects faculty experiences and perceptions of their
professional development with e-learning and the resultant shifts in their attitudes and teaching
approaches with regards to online and blended learning.
Given the lack of empirical research published in peer-reviewed journals on this topic, it is
possible that this knowledge remains contained within universities as internal evaluations. Thus,
it is likely that most e-learning program evaluations are reported internally within a given
university and not shared with the outside world. Another complication is that professional
development opportunities might be constrained to the implementation in a specific college or
department, rather than a university-wide implementation. The few published works that exist
typically take the approach of anecdotally explaining how a given training was conducted and what
worked or did not work (e.g., Linder, 2017; Terantino & Agbehonou, 2012) or understanding
motivators and demotivators to teaching online (Shea, 2007). Success is typically based on an
internally developed self-reported survey instrument that has not been analyzed for validity or
reliability. Concomitantly, these studies are often devoid of a theoretical approach. Thus, there is
a need to disseminate research on e-learning professional development that is theoretically driven,
situated in institutions of higher education, and captures the lived experiences of the stakeholders.
In this study, this multilayered approach is taken.
Theoretical Framework
The diffusion of innovations (DOI) theory was used to guide the current research. The
primary focus of diffusion research is to understand the adoption of a given innovation (Rogers,
1962). This theory was chosen as it is prominent in research studies situated in instructional
technology as well as general postsecondary faculty development (Drape, Westfall-Rudd, Doak,
Guthrie, & Mykerezi, 2013; Grosz, 2012; Huun & Hughes, 2014; Jordan et al., 2012; Lewis &
Slapak-Barski, 2014; Martin, Parker, & Allred, 2013; Molina, 2013; Soffer, Nachmias, & Ram,
2010). This theory has also been used to understand technology initiatives such as massive open
online courses (MOOCs) (Claffey, 2015), technology policy diffusion (DeRousie, 2014), team-
based learning (Freeman, 2012), mobile campuses (Han & Han, 2014), personalized learning
(Karmeshu & Nedungadi, 2012), adoption of online education by traditional liberal arts colleges
(Hollis, 2016), and technology in the education systems of developing countries (Richardson,
2009, 2011). The theory has also been used to understand changes in organizational culture
(Shiflett, 2013). Additionally, the DOI theory has been applied to determining barriers to the
continued growth of online teaching based on faculty satisfaction in the entire SUNY Learning
Network (Shea, Pickett, & Li, 2005). According to Meyer (2004), Rogers’ theoretical model has
been used in thousands of studies across many fields including education and technology (e.g.,
sociology, marketing, public health, economics).
Rogers (2003) defined an innovation as “an idea, practice, or object that is perceived as new
by an individual or other unit of adoption” (p. 12) and noted how “diffusion is the process in which
an innovation is communicated through certain channels over time among the members of a social
system” (p. 5) through four fundamental elements: innovation, communication channels, time, and
social system. This definition indicates a critical point—the newness of the “idea, practice, or
object”—is not objectively measured but rather based on the perception of the adopter. DOI seeks
to explain the processes through which ideas, practices, or objects are communicated and thereby
adopted by members of a particular social system.
There are five characteristics of innovation that explain differences in adoption rates:
relative advantage, compatibility, complexity, trialability, and observability. These five attributes
account for most of the variance (between 49% and 87%) in the rate of adoption of an innovation (Rogers,
1962). These attributes were subsequently modified, operationalized, and expanded by Moore and
Benbasat (1991), who generated three additional
adoption constructs (see Table 1). The authors included: image (the degree to which the use of a
system is perceived to enhance one’s image or status in one’s social system); voluntariness (the
degree to which use of the innovation is perceived as being of free will); and result demonstrability
(the ability to show results of using an innovation).
While Rogers (1962) provided a general approach to the theory, Moore and Benbasat
(1991) focused specifically on the adoption of information technology innovations. As such,
Moore and Benbasat created an instrument to measure the eight characteristics. Given the
increasing demand for online and blended courses, the limited body of literature on e-learning
professional development in higher education, and the need to use theory to understand this
innovation in higher education, this study is both timely and needed.

Table 1
Description of the Perceived Characteristics of Innovation

PCI | Description
Relative Advantage | Degree to which an innovation is perceived as a better idea, measured by economics, social factors, convenience, and satisfaction
Image | Degree to which the innovation enhances one's reputation with peers
Compatibility | Degree of perceived consistency with one's values, experiences, and needs
Ease of Use | Perceived degree of difficulty in using the innovation
Visibility | Degree to which the innovation is visible
Results Demonstrability | Degree to which one can see results from using the innovation
Trialability | Degree to which the innovation can be experimented with or practiced
Voluntariness | Degree to which using the innovation is viewed as voluntary

Source. Rogers, E. M. (1962). Diffusion of innovations. Free Press; Moore, G. C., & Benbasat, I. (1991).
Development of an instrument to measure the perceptions of adopting an information technology
innovation. Information Systems Research, 2(3), 192–222.

Method
A mixed method sequential explanatory design (Creswell & Plano-Clark, 2018) was used
in this study so that quantitative results could be further explored through the collection and
analysis of qualitative interview data. An initial survey was used to measure the perceived
characteristics of innovation as defined in the theoretical framework. Following the survey, one-on-
one interviews that were linked to the DOI theoretical framework were conducted to better
understand those characteristics. The research questions guiding this study were:
1. What pedagogical changes occurred as a result of the professional development and
subsequent year-long faculty learning community?
2. How did the perceptions of the diffusion of innovations characteristics influence the
level of adoption of online/blended teaching by participants?
Project Background
The University of Kentucky launched the eLearning Innovation Initiative (eLII) in 2014.
The eLII provided funding for the creation of new online or blended degree programs and the
innovative redesign of large-lecture courses. Recruitment for participation in this training initiative
occurred via email. The Center for the Enhancement of Learning and Teaching (CELT) emailed
all faculty and instructors at the university through an open call for applications. Participation was
open to anyone who wanted to participate. Thirty-six faculty members received eLII professional
development funding and agreed to participate in two training initiatives.

Phase 1 of the initiative was a week-long, face-to-face professional development workshop
that occurred in the summer. Phase 2 required faculty members to participate in monthly, face-to-
face faculty learning communities (FLCs) for one year. These FLCs consisted of eight to ten
faculty members and were led by an instructional coach from CELT. The year-long FLCs were
designed as opportunities for small groups of faculty members to come together monthly to share
their experiences with their own online and blended efforts. Each FLC was tasked with creating a
resource that would be of service to the other learning communities. This practice allowed each
group to work on a given topic while discussing the challenges and successes experienced by
individual faculty members.
Participants
After Institutional Review Board approval, all 36 course instructors who participated in the
training were emailed a link to the DOI survey on January 8, 2015. Of the possible participants,
31 out of 36 completed the online survey thus yielding an 86.1% response rate. The last question
on the survey linked to a new survey where participants were asked to volunteer to engage in an
interview. Thirteen of the 31 survey completers indicated their willingness to be interviewed. The
interviews were conducted via Uberconference. The interviews ranged from 30 to 45 minutes long.

Table 2
Survey and Interview Participants by Rank

Instructor Role | Survey (N = 31) | Interview (N = 13)
Lecturer | n = 11 | n = 5
Assistant | n = 5 | n = 3
Associate | n = 7 | n = 4
Full | n = 6 | n = 1
Other | n = 1 | n = 0
Unknown | n = 1 | n = 0

Measures
Survey instrument. The survey used to measure DOI characteristics was a slightly altered
version of the Moore and Benbasat (1991) survey (see Appendix A). The survey used a 4-point
Likert-type scale and consisted of eight scales with a total of 25 items. Items were reworded for the
eLII professional development program such that “personal work stations” was replaced with
“skills gained from the eLII professional development.” This initial instrument was developed and
tested by Moore and Benbasat in three stages: item creation, scale development, and instrument
testing in two pilot rounds and two field test rounds. The parsimonious instrument was developed
with “a high degree of confidence in their content and construct validity” (p. 210).

In addition to the survey, three 5-point Likert-type scaled questions were used for
participants to self-assess their level of adoption of the training techniques. In this study, this score
is referred to as an innovation score. Here, participants rated their level of adoption using digital
technology, blended learning, and online learning. The scale ranged from 1 (last to adopt) to 5 (first
to adopt). Each participant received one innovation score that was calculated by averaging answers
to the three items.
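As a worked example, a participant who rated themselves 5, 4, and 5 on the three items would receive an innovation score of 4.67, in line with the scores reported in Table 4; a minimal sketch (the ratings are hypothetical):

```python
# Hypothetical self-ratings on the three 5-point adoption items.
ratings = {"digital technology": 5, "blended learning": 4, "online learning": 5}

innovation_score = sum(ratings.values()) / len(ratings)
print(round(innovation_score, 2))  # 4.67
```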
Semi-structured interviews. Interviews were conducted to explore survey responses
further, providing concrete examples about the experience. This additional investigation allowed
the exploration of latent themes and underlying trends that may not have been immediately evident.
Questions for the semi-structured interview protocol (see Appendix B) were designed to explore
the constructs on the Moore and Benbasat (1991) survey. Hence, interview questions were
designed to better understand the eight theory-driven constructs detailed in Table 1.
Data Analysis
Analysis of the quantitative data began with running tests for reliability to determine if this
population responded to the survey differently than tested in the construction of the original
instrument. ANOVAs were conducted to determine if and how characteristics of innovation
accounted for innovation uptake. Next, t-tests were run to determine if the instructors who
completed the survey and then were interviewed differed on the eight perceived characteristics of
innovation from those who only completed the survey. This was done to ascertain if selection bias
existed for the individuals interviewed.
The quantitative analysis was followed by the analysis of the interviews. Analysis of the
qualitative data began with an a priori coding scheme that was restricted to the eight characteristics
defined by the DOI framework (see Table 1). As a first step, one coder coded all data within the
eight constructs. After coding for these constructs, the codebook was expanded by the team to
include codes related to perceptions of professional development as they related to the theoretical
framework. As a second step, using inductive coding, one researcher coded all the transcripts. A
second and third researcher confirmed all codes. This allowed the team to capture deep rich details
about the professional development as it related to the theory-driven characteristics.

Results
Internal consistency of reliability was investigated for the eight individual characteristics
using coefficient alpha (see Table 3). Most characteristics had a Cronbach's alpha greater than
0.80, with only visibility (α = 0.79) and trialability (α = 0.69) falling below this level. The internal
consistency of the trialability characteristic, the lowest of all constructs, is similar to what
was reported by Moore and Benbasat (1991). The internal consistency of reliability for the entire
instrument was considered suitable (α = 0.92).

Table 3
Diffusion of Innovations Short Scales Cronbach’s Alpha

Characteristic             Number of   Cronbach's alpha reported by   Cronbach's alpha of
                           Items       Moore and Benbasat (1991)      the current study

Compatibility                  3                  0.86                       0.90
Ease of use                    4                  0.84                       0.94
Image                          3                  0.79                       0.99
Relative advantage             5                  0.90                       0.90
Results demonstrability        4                  0.79                       0.92
Trialability                   2                  0.71                       0.69
Visibility                     2                  0.83                       0.79
Voluntariness                  2                  0.82                       0.86
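For readers who want to reproduce alphas like those in Table 3, coefficient alpha for a k-item scale is α = k/(k−1) × (1 − Σ item variances / variance of scale totals). The NumPy sketch below implements that standard formula; the data it runs on are invented, not the study's.

```python
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Coefficient alpha for a respondents-by-items score matrix.

    Standard formula: alpha = k/(k-1) * (1 - sum(item variances) / var(totals)).
    A generic sketch, not code from the study itself.
    """
    k = item_scores.shape[1]
    item_variances = item_scores.var(axis=0, ddof=1)      # per-item sample variance
    total_variance = item_scores.sum(axis=1).var(ddof=1)  # variance of respondents' totals
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Invented data: 31 respondents x 2 trialability items on the 4-point scale.
# Random responses yield a meaningless alpha; substitute real item scores.
rng = np.random.default_rng(0)
fake_trialability = rng.integers(1, 5, size=(31, 2))
print(cronbach_alpha(fake_trialability))
```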

A one-way ANOVA was used to determine whether the DOI characteristics differed by innovation score among the 31 participants who completed the survey. There was no significant effect of innovation level on any characteristic at the p < .05 level. These results indicate that the survey did not accurately capture the degree to which participants adopted this innovation, which could be attributed to the small sample size (see Cohen, 1992). Table 4 provides the innovation score for each of the 13 interview participants.


Table 4
Interview Participants Descriptive Summary

Participant    Faculty Rank   Gender   College                       Innovation Score   Adoption Level

Instructor A   Lecturer       Female   Fine Arts                          5.00              Early
Instructor B   Associate      Male     Education                          4.67              Early
Instructor C   Associate      Female   Business                           4.67              Early
Instructor D   Lecturer       Female   Business                           4.67              Early
Instructor E   Associate      Male     Law                                4.33              Moderate
Instructor F   Lecturer       Female   Engineering                        4.33              Moderate
Instructor G   Lecturer       Female   Communication & Information        4.33              Moderate
Instructor H   Lecturer       Female   Communication & Information        4.00              Moderate
Instructor I   Assistant      Male     Arts and Sciences                  3.67              Moderate
Instructor J   Assistant      Female   Design                             3.33              Late
Instructor K   Assistant      Female   Education                          3.33              Late
Instructor L   Full           Male     Arts and Sciences                  2.67              Late
Instructor M   Associate      Male     Education                          2.67              Late
Independent-samples t-tests were used to determine whether interview participants differed from the rest of the sample on the eight scales (see Table 5). No statistically significant differences were found between interview participants (n = 13) and participants who completed the survey but were not interviewed (n = 18). Thus, selection bias was not believed to be an issue. The qualitative results reported below are constrained to those faculty members who completed the survey and participated in the interviews (n = 13).


Table 5
Results of t-Test and Descriptive Statistics for Diffusion of Innovations Survey Short-Scale by
Interview Participation
                          Did Not Interview       Interviewed          95% CI for Mean
Scale                      M     SD    n           M     SD    n       Difference          t      df

Compatibility             3.09  0.69   18         2.97  0.67   13      -0.63, 0.39       -0.47    29
Ease of Use               2.75  0.75   17         2.77  0.84   13      -0.58, 0.62        0.07    28
Image                     2.31  0.76   18         1.85  0.90   13      -1.08, 0.14       -1.57    29
Relative Advantage        3.03  0.71   18         2.86  0.70   13      -0.70, 0.35       -0.67    29
Results Demonstrability   3.01  0.71   18         3.04  0.83   13      -0.54, 0.59        0.09    29
Trialability              2.67  0.51   18         2.38  0.71   13      -0.73, 0.17       -1.28    29
Visibility                2.47  0.76   18         2.23  0.86   13      -0.84, 0.35       -0.83    29
Voluntariness             2.69  1.11   18         2.54  1.31   13      -1.05, 0.74       -0.36    29

* p < .05.
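A minimal sketch of the selection-bias check summarized in Table 5, using SciPy's pooled-variance t-test plus a hand-computed 95% CI for the mean difference; the scores below are fabricated placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Invented scores for one characteristic: 13 interviewees vs. 18
# survey-only respondents (stand-ins for the groups in Table 5).
interviewed = rng.normal(loc=2.9, scale=0.7, size=13)
not_interviewed = rng.normal(loc=3.0, scale=0.7, size=18)

t_stat, p_value = stats.ttest_ind(interviewed, not_interviewed)  # pooled variances
df = len(interviewed) + len(not_interviewed) - 2

# 95% CI for the mean difference (interviewed minus not interviewed),
# computed from the pooled standard error.
diff = interviewed.mean() - not_interviewed.mean()
pooled_var = (((len(interviewed) - 1) * interviewed.var(ddof=1)
               + (len(not_interviewed) - 1) * not_interviewed.var(ddof=1)) / df)
se = np.sqrt(pooled_var * (1 / len(interviewed) + 1 / len(not_interviewed)))
t_crit = stats.t.ppf(0.975, df)
print(f"t({df}) = {t_stat:.2f}, p = {p_value:.3f}, "
      f"95% CI [{diff - t_crit * se:.2f}, {diff + t_crit * se:.2f}]")
```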

Based on interview results, participants most frequently discussed their experiences related to relative advantage, compatibility, and trialability. Faculty also shared experiences regarding online teaching in general and professional development specifically. Their innovation scores were taken into consideration when interpreting the interviews. The three adoption classifications used previously were carried forward into this analysis and were determined by rounding each participant's innovation score: scores that rounded to 5 were classified as early adopters, scores that rounded to 4 as moderate adopters, and scores that rounded to 3 as late adopters. These classifications were considered acceptable based on the normal distribution, or bell curve, of innovation adoption discussed by Rogers (2003). The following sections outline how the perceived characteristics of innovations were discussed by participants in the interviews.
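The rounding rule just described maps an averaged innovation score to one of three labels; a tiny hypothetical sketch follows (the mapping comes from the article, the code does not).

```python
def classify_adopter(innovation_score: float) -> str:
    """Map an averaged 1-5 innovation score to the article's labels:
    rounds to 5 -> early, 4 -> moderate, 3 -> late."""
    # Note: Python's round() uses banker's rounding at exact .5 ties.
    labels = {5: "early", 4: "moderate", 3: "late"}
    return labels.get(round(innovation_score), "unclassified")

# Examples consistent with Table 4:
print(classify_adopter(4.67))  # early
print(classify_adopter(3.67))  # moderate
print(classify_adopter(2.67))  # late
```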


Relative Advantage
Most participants (10 out of 13) found the professional development and FLC personally advantageous; only two instructors (both moderate adopters) stated that the weeklong professional development was not beneficial. Instructors who did benefit noted advantages related to social factors, convenience, and personal satisfaction.
The desire to increase student engagement was brought up by three participants across
different adoption levels. One participant noted an effort to increase instructor presence in
discussion boards stating, “I respond to them more frequently. I just want to make sure the students
realize that I’m responding, and they don’t feel like I left them hanging” (Instructor M). Another
participant said, “I think that we learned things that will allow my students to be more engaged”
(Instructor K). Another social factor that was mentioned was the willingness to utilize web
conferencing technologies to hold meetings. Instructor M noted, “I’ve been more open to it, but
I’ve only had one or two students taking me up on Skype meetings or virtual meetings.”
Moderate and late adopters (n = 9) seemed satisfied with the specific pedagogical lessons
gleaned from the professional development. One participant was particularly satisfied with the
training regarding the alignment between learning outcomes and course activities which included
assessments. Instructor G stated “We really talked about ...what those outcomes are and what’s
going to really work best in an online environment and what’s going to work best in a face-to-face
environment.” A late adopter, Instructor L, shared “the workshop really gave me insight into ways
that I can use a lot of different modes of delivery. When I’m delivering a single topic, I’m using
video, I’m using some writing, I’m using Prezi presentations, I’m using discussions, I’m using
open-ended quizzes…all just to deliver one idea.”
Four participants, spanning all adoption levels, found that learning how to leverage
a learning management system (i.e., Canvas) was the most advantageous element of the
professional development. “For grading and project submittals, I do a lot more of online submittals
and online grading and doing assessments and rubrics through Canvas. But I also use the
anonymous survey tool in Canvas to get reflective feedback from the students” (Instructor J).
Similarly, Instructor D said, “My face-to-face [courses] continue to improve because I can now put the very important key pieces of material or expectations in a user-friendly manner online so the students have access to it 24/7.” Regardless of the mode of implementation, faculty members found learning about tools and how to deliver content beneficial.
Compatibility
Participants (n = 13) discussed the level of compatibility of the professional development
with their needs, teaching styles, and pedagogical preferences. These participants discussed how
networking, with either new or veteran colleagues, proved to be helpful. Instructor C remarked
“We get to network together and share practices on how to do things better. I enjoy that part.”
Similarly, Instructor H noted that “hearing how other people have gone about it and attending some
of the meetings that we have had within our faculty learning communities have been pretty good
because we were able to talk about what worked and what isn't working in others’ courses.”
Consistency with teaching approach. Eight of the 13 participants (62%) noted that components of the professional development and subsequent FLC were incongruent with their preferred teaching approach. Instructor E remarked that “The pedagogical instruction was
completely disconnected from the way I teach. It was all directed at lecture teachers. I’m not a lecture style teacher.” Another participant shared similar feelings in saying that “It’s not really
helping so much because...the challenge I have is with the large class size. And, my teaching style
involves mostly interaction with my students. I don’t do lecturing” (Instructor C). According to
four of the participants, the focus of the initial professional development was how to convert
lecture-based instruction into an asynchronous online learning environment. This approach was
a poor fit for instructors who were not going to teach in an asynchronous format, and it created a schism between participant needs and training objectives. Instructor B, an early adopter,
highlighted this issue by stating “The professional development was more focused on
asynchronous teachings, but all of my courses are synchronous so there’s a little bit of disconnect
there.” Late adopters also noticed this disconnect. As Instructor M noted, “They threw together
synchronous and asynchronous. I think those crowds are a bit different.”
Various benefits of the training were also recognized. Both early and moderate adopters (n
= 8) found the range of topics beneficial, noting that exposure to different technologies allowed
them to find the tool that would best address their teaching needs. Instructor C noted how “the
workshop actually opened my eyes. I can see it as a good way of helping me to make the online course more interactive. In addition to the content, how I can use it to bring more interaction with the participants was useful.” Another participant pointed out, “They presented all
kinds of different options...You can pick what you need and what works for you. That really
worked well for me” (Instructor F). One late adopter discussed how her teaching strategies
improved as a result of learning new online teaching strategies and techniques. “I think it really
helped my teaching style. I try to use technology and social media in the classroom to gain
awareness” (Instructor J).
Consistent with expectations. Several participants (n = 7) expected more individualized
and tailored instruction to assist with the design of their own courses. Instructor L stated, “It was
not really tailored to individual needs.” Additionally, Instructor K shared “For me, I'm a very
hands-on learner and so not being able to actually implement what we're learning didn’t really
work for me. But for people who learned by watching someone else do something, this may have
been helpful for them...but it wasn’t for me.” This less hands-on approach led some participants to
feel less confident in executing delivery strategies that were discussed. For example, Instructor E
shared that “I just don’t know how to do it myself. So, I feel like I’m back at square one with just
a lot more knowledge about what’s out there.” Likewise, another participant commented “Some
sessions just kind of talk about technology and we didn’t actually try it. I prefer trying it”
(Instructor F).
In addition to the less hands-on training approach, moderate and late adopters tended to
feel that a one-size-fits-all approach was utilized. Instructor H commented “I probably would have
benefited from having us grouped by level of experience or level of interest in certain topics...I
probably could have utilized my time a little bit better if there had been stronger sessions offered
for different things.” Two participants perceived that prerequisite knowledge was presumed. “I felt
like sometimes the [professional development] instructors almost assumed prior knowledge—at
least for me... I think there were too many assumed knowledges about what you knew for teaching
online” (Instructor M). Instructor M continued by stating “I think that if the talks or workshops
have been individualized to certain interest groups, and more hands-on...that would have been a
lot more helpful.”
Conversely, four early adopters, such as Instructor D, articulated the opposite view: “I think participating in that kind of hands-on, pretty intense professional development helped me find the things that I could implement and find the things that could apply to me specifically and then go to it.” These two opposing viewpoints might point to a disparity between the training needs of early adopters and those of moderate and late adopters.
Trialability
Nearly half (46%) of the participants indicated they practiced using some online tools,
skills, and strategies presented in the training. Some participants (n = 7) reported that trying to use
new tools and techniques was vital. For example, “I think we had class time to practice and ask
questions. Some things that interest me, I would practice more than others. I also think I didn't
have a clear enough understanding of what I wanted to know and what I needed to practice”
(Instructor J).
Additionally, Instructor A stated, “I brought my laptop. I did everything as we were
learning. I was able to try out as we were learning it.” As an example, Instructor F created a blog
during the training. “I put all the proctoring websites that I've used on a blog and shared with the
other faculty. So, that was very productive, and I actually got to do it hands-on.”
Five participants commented that they ended up practicing on their own. Instructor M
commented “I think I actually practiced with students or other faculty. I’ve done that with a few
faculty or a couple of faculty where I’m able to show what I’ve created or show them how I created
it and how to put it online. That’s how I’m able to practice it.” Along the same line, Instructor H
stated that “implementing Adobe Connect and just doing that trial and error, trying to see what
works... I didn't do that with the eLII staff. I did that on my own with our information technologist
over in my own college. But I definitely practiced.” One participant even practiced with family
members. “I tested out Adobe Connect with my wife who just acted like a pretend student. That
tool is really easy” shared Instructor E.
Practicing on their own after the training was also noted by Instructor F, who commented
“I learned to use Adobe Captivate and I practiced that on my own.” Likewise, Instructor A
remarked “I tried a lot of different things...I have a lot of accounts to try to find out more and see
what would really work. It took me getting in there, signing up for it and everything to really start
playing around with it to really understand what was going to work best.” Independent
experimentation and exploration of new tools was more common among early and moderate adopters.
Ease of Use
Most participants (9 out of 13) found the skills gained from the professional development
easy to implement. The remaining participants claimed that implementation would be either too difficult or too time-consuming. One instructor noted that they did not gain any skills and did not
have an opportunity to use the skills. When participants were asked to comment on the ease of
implementation, two participants shared how selectiveness is important when thinking about what
to implement in blended and online courses. Instructor D said “I think one thing I did take away
from it is that you can’t do all of it. You must pick one thing and try to make it work this time.
And if it doesn’t, then try something different. So, I find that every time I try a new platform or a
new app that it seems to work, but I can’t do everything.” Along the same line, Instructor G
commented “I try to be selective in the type of things that I'm going to try to implement in my
classes. If I don't think I can do it, or I think that I'm not going to be able to figure it out and do it
well with my students, then I don't do it. I think that's probably the better way that I handle it.”
This selectiveness complicates any simple classification of implementation as either “easy” or “difficult.”


Six participants, spanning all adoption levels, commented that incorporating video and web
components into a course would be difficult and time-consuming. Synchronous video components,
such as using Adobe Connect, or recording and editing lectures using Echo 360 or Camtasia, were
specifically mentioned as challenges. “It’s such a simple thing, but I didn’t learn how to use it
during our training. I think that it’s such a basic thing that we should have known. We really should
have learned how to use it” remarked Instructor K. Another participant asserted, “Everything is
very time-consuming. Even though Captivate is cool, there’s so much to it, and as I try to explore
it takes a lot of time” (Instructor C). Similarly, Instructor J shared her experiences with video
creation, “Well, I think that it was challenging—creating, adding, and coming up with video stuff.
I just didn't understand. Maybe I didn't have a clear idea of what were the best or most effective
practices, but I didn't know enough.” Another participant shared how initial difficulty resulted in
long-term benefits. “What I've learned about all of this, any time you create something digital, you
have to keep at it! So, I don’t mind putting a lot of work into something that I can use every
semester over and over,” proclaimed Instructor F.
Voluntariness
Out of the 13 interviewed instructors, only three (23%) of the adopters reported being
required to teach online or hybrid. Each of the three was classified as a moderate adopter. The
requirement to teach online appeared to be most closely associated with rank and title. Participants with full faculty rank did not express administrative pressure, while individuals of lower faculty rank felt that demands from their superiors made participation involuntary. One participant
discussed how her rank as lecturer contributed to the requirement of teaching online. “The Dean
asked me to develop the online class. So now that it's developed, I guess I'm kind of required to
teach it. I'm a lecturer, so a lot of this distance learning falls on the lecturers,” commented Instructor
F. Likewise, another lecturer expressed how her contract called for her to teach online during the
summer. “I'm on a twelve-month contract as opposed to a nine-month contract. The first time they
[the department] needed somebody to teach online was during a summer when people weren’t
around. So, basically, it was given to me” (Instructor G).
The remaining ten participants reported that they teach blended courses on a voluntary
basis. Instructor I stated, “I'm a tenured faculty member so there would not be any requirement per
se to teach online. There are certainly opportunities provided from my department. I'm interested
in experimenting and trying to figure out new and compelling ways to incorporate [technology].”
Similarly, Instructor H shared, “There is no requirement to do that [teach online]. It's encouraged,
but it's not required. Honestly, it wouldn't work for all of our classes.” Many of the participants
commented that they were just interested in learning more about online and hybrid teaching
practices.
Image
Like voluntariness, image appeared to be unrelated to adoption level. Participants were either neutral (n = 8) about whether implementing the skills affected their image or reputation, or positive (n = 5) that the training had improved their reputation and image with peers. For example, Instructor
K said “There’s not a perceived difference between people who participated in the training. I don’t
think people in my department even know that I participated in it.” Likewise, Instructor H shared
“In my division, honestly, it's not really a big deal. I mean I think people are like, ‘Oh, that's cool.
Tell me how it goes.’ But it’s not this prestige thing.” In contrast, another participant shared, “I’m sure that the faculty who are not part of the eLII process see it as perhaps a good thing and
something that we should be doing. We should be training new cohorts of faculty” (Instructor I).
On a similar note, Instructor A commented “I’d say on the university level, it’s perceived
as what’s going to push the university forward and progress the university.” Another participant
shared how her involvement in this professional development led to speaking engagements.
Instructor G shared “From my perspective people are perceived pretty well. As a result of my
involvement with this program, I've been invited to give professional development sessions not
only for my own college, but also for other colleges around the university for the eLII program. I
reviewed some of the new rounds of eLII grants because of my experience. So, it seems like we're
perceived in a positive manner.”
Those participants who reported a positive impact on their image (n = 5) tended to note
knowledge gained and the status of being an early adopter of online teaching. Instructor D
remarked “The perception is that we’re the most tech-savvy people. However, it seems that I've
always been the person that if anybody has problems with clickers or with Blackboard or with
Echo 360 or with any of other technology, they’ll come find me.” Similarly, Instructor M stated “I
think people probably perceive it positively.” Instructor F shared a similar experience. “My chair
sent another faculty to me who had a question about recording lectures and that kind of thing. So,
I guess we are perceived a little bit as the experts in the area.”
When asked about their improved image, the same five participants indicated positive
perceptions about peers who participated in the professional development. “All of them are pretty
motivated regarding wanting to be better teachers online, so I think of them positively in that sense.
They are motivated to be good teachers” commented Instructor K. Likewise, Instructor C shared
that “It’s nice to know others are so excited about teaching because we are [a] research school. And
so most of the time we're excited about research, but the teaching part is so fun on each side. So,
I'm very happy to see that so many of us also have a heart for how our students learn and how can
I do a better job for them and for me.”
Visibility
Participants (n = 11) discussed being more aware of instructors teaching online as a result
of the training. Instructor B commented, “I hear about what some people do, but I have no idea
whether it is connected with eLII or not...Sure we kick around stuff in our departments, and some
of those folks were involved in eLII stuff, but they were doing this stuff already anyway.” On a
more global level, Instructor C asserted “I hear about more people teaching online now I think just
because that’s where the market is going, and we’re going to have to respond to that.” Instructor
A shared her experience:
I've seen it [online learning] across our department...I would say a positive outcome is the
fact that if other people want to do it. This friend of mine over in [another department], we
talk all the time. She tells me about how she is implementing flipped learning. She does
more of the traditional flipped classroom where she does the lectures outside of class and
then they do the problem working inside of class.


Discussion
Findings from the current study illustrated some of the changes that occurred as a result of
the year-long professional development initiative at a single research university. The results
suggest that early adopters benefited from a wider exposure to tools and required a much less formal, hands-on approach. In contrast, instructors who were moderate or late adopters of online
and blended learning benefited from a step-by-step training approach that walked them through
the integration of digital tools based on their specific teaching needs.
The current study is a tale of a single university and provides details on the barriers, challenges, and successes of a small group of instructors. Nevertheless, this study demonstrated the benefits of combining qualitative and quantitative approaches when the sample size is small. In this case, the quantitative results (i.e., the survey) provided a baseline at a single point in time, but the data were inadequate for drawing conclusions about group and individual differences. Likely due to this limited sample size, no statistically significant findings emerged regarding differences by innovation level. However, the qualitative data illustrated nuanced differences and gave voice to the experiences of the instructors.
As detailed in the literature review, few studies situated in higher education institutions focus on online and blended learning and use a theory to ground the methodology. The
current study was grounded in Rogers's (1962, 2003) innovation model and Moore and Benbasat's (1991) conceptualization of the perceived characteristics of innovation. Using this theory to guide the current inquiry helped to better understand how innovation characteristics influenced one another in the context of preparing instructors at a research university to teach in online and blended environments. Results of this study indicate that faculty members most frequently mentioned experiences that fell within the perceived characteristics of relative advantage, compatibility, and trialability. The characteristics of voluntariness and image were interpreted as having little influence on adoption levels. The fact that voluntariness did not influence innovation adoption is likely because instructors at research institutions, on the whole, do not choose which courses they will teach or in which format those courses will be taught, with the caveat that rank (i.e., lecturer, assistant, associate, or full) might provide an individual with leverage in these decisions. The fact that image did not influence adoption rates is likely a result of the siloed nature of research institutions. At these types of institutions, instructors rarely interact across departments and might never interact with others across colleges. Thus, an instructor at a research university might be unaware of what is happening outside of his/her own department.
The current study furthers the research that has been conducted on faculty development for
online and blended learning in institutions of higher education. For example, a study conducted by
Shea, Pickett, and Li (2005) focused on instructors' satisfaction with online teaching across 33 unique and diverse campuses, including community colleges, technical colleges, four-year colleges, doctoral universities, and university centers. Although those findings were also
theoretically situated in the DOI, those researchers focused on satisfaction with online learning
within a network. In the current study, the findings focus on the story of one research-intensive university and the pedagogical changes that resulted, interpreted through the eight perceived characteristics. The current study also took a more theoretical approach than previous research by
using the perceived characteristics of the DOI theory as the measurable constructs, both
qualitatively and quantitatively. Thus, by focusing on accepted theoretical constructs in the research design, the study was able to go deeper into the theoretical levers that may impact the adoption of online teaching and learning, not just overall satisfaction.
Limitations
Limitations of the current study include a lack of distinction between online and blended
delivery. This lack of distinction may have resulted in a feeling of mismatch between the purpose
of the training and faculty expectations. There were also no presurvey data from faculty participants; presurvey information would have helped determine whether the training increased an individual's self-reported innovation level. Changes in faculty perceptions of the innovation characteristics may also have differed between the initial week-long training and the follow-up meetings. Lastly, the relatively small sample size hindered the use of advanced
quantitative analysis.

Conclusion
It is important to note that networking through the initial professional development, and
later in the faculty learning communities, was an unexpected beneficial aspect of the professional
development training. The creation of the learning communities with small groups of participants
allowed faculty members with differing expertise to support one another through the learning
process over a longer period beyond the initial week-long training. This direct application of skills
and networking with peers may result in increases to some innovation characteristics (e.g., results
demonstrability, relative advantage) in the context of a specific endeavor.
The research presented in this article details how one research university used professional development training to increase the quality and quantity of online and blended courses. As research-intensive universities shift more resources from the brick-and-mortar classroom into online and blended learning environments, professional development for course instructors will be imperative. This research highlighted one approach to that professional development as well as the method used to evaluate its outcomes. The lessons learned can be of service to future instructors, learners, and leaders.


References

Allen, I. E., & Seaman, J. (2010). Learning on demand: Online education in the United States, 2009.
Sloan Consortium.
Allen, I. E., & Seaman, J. (2018). Grade increase: Tracking distance education in the United States.
Babson Survey Research Group.
Barker, A. (2003). Faculty development for teaching online: Educational and technological issues.
The Journal of Continuing Education in Nursing, 34(6), 10–16. doi:10.3928/0022-0124-
20031101-10
Cash, P. A., & Tate, B. (2012). Fostering scholarship capacity: The experience of nurse educators.
Canadian Journal for the Scholarship of Teaching and Learning, 3(1), 7.
doi:http://dx.doi.org/10.5206/cjsotl-rcacea.2012.1.
Chen, K. Z., Lowenthal, P. R., Bauer, C., Heaps, A., & Nielsen, C. (2017). Moving beyond smile
sheets: A case study on the evaluation and iterative improvement of an online faculty development program. Online Learning Journal, 21(1), 85–111. doi:10.24059/olj.v21i1.810
Chen, P. D., Lambert, A. D., & Guidry, K. R. (2010). Engaging online learners: The impact of web-
based learning technology on college student engagement. Computers & Education, 54, 1222–
1232. doi: http://dx.doi.org/10.1016/j.compedu.2009.11.008
Childre, A. L., & Van Rie, G. L. (2015). Mentor teacher training: A hybrid model to promote
partnering in candidate development. Rural Special Education Quarterly, 3, 10–16.
Claffey, G. F. (2015). MOOC learning and impact on public higher education. (Doctoral
dissertation, Northeastern University). Retrieved from ProQuest Dissertations & Theses. (UMI
No. 3732233)
Cohen, J. (1992). A power primer. Psychological Bulletin, 112(1), 155–159.
Creswell, J. W., & Plano Clark, V. L. (2018). Designing and conducting mixed methods research (3rd
ed.). Sage.
DeRousie, J. C. (2014). An exploration of the diffusion and adoption of four innovations in higher
education. (Doctoral dissertation, The Pennsylvania State University). Retrieved from ProQuest
Dissertations & Theses (UMI No. 3690116).
Drape, T. A., Westfall-Rudd, D., Doak, S., Guthrie, J., & Mykerezi, P. (2013). Technology
integration in an agriculture associate's degree program: A case study guided by Rogers'
diffusion of innovation. NACTA Journal, 57, 24–35.
Fredericksen, E., Pickett, A., Shea, P., Pelz, W., & Swan, K. (2000). Factors influencing faculty
satisfaction with asynchronous teaching and learning in the SUNY learning network. Online
Learning Journal 4(3), 245–278.
Freeman, M. (2012). To adopt or not to adopt innovation: A case study of team-based learning. The
International Journal of Management Education, 10, 155–168.
doi:http://dx.doi.org/10.1016/j.ijme.2012.06.002
Garrett, R., Legon, R., & Fredericksen, E. E., (2019). CHLOE 3 behind the numbers: The changing
landscape of online education 2019. Retrieved from qualitymatters.org/qa-resources/resource-
center/articles-resources/CHLOE-3-report-2019


Grosz, T. (2012). Faculty training for blended learning in higher education (Doctoral dissertation,
Northcentral University). Retrieved from ProQuest Dissertations & Theses (UMI No. 3504768)
Gunay, N. (2013). Predictors of involvement in online teaching among faculty in technical colleges
(Doctoral dissertation, Capella University). Retrieved from ProQuest Dissertations & Theses.
(UMI No. 1318673167)
Han, I., & Han, S. (2014). Adoption of the mobile campus in a cyber university. The International
Review of Research in Open and Distributed Learning, 15(6).
doi:https://doi.org/10.19173/irrodl.v15i6.1950
Herman, J. H. (2012). Faculty development programs: The frequency and variety of professional
development programs available to online instructors. Journal of Asynchronous Learning
Networks, 16(5), 87–106. doi:10.24059/olj.v16i5.282
Hollis, E. (2016). Traditional liberal arts colleges’ consideration and adoption of online education:
A presidential perspective. (Doctoral Dissertation, University of Kentucky). Retrieved from
ProQuest Dissertations & Theses. (UMI No. 10306932)
Huun, K., & Hughes, L. (2014). Autonomy among thieves: Template course design for student and
faculty success. Journal of Educators Online, 11, 1–30.
Jordan, C., Jones-Webb, R., Cook, N., Dubrow, G., Mendenhall, T. J., & Doherty, W. J. (2012).
Competency-based faculty development in community-engaged scholarship: A diffusion of
innovation approach. Journal of Higher Education Outreach and Engagement, 16, 65–96.
Karmeshu, R. R., & Nedungadi, P. (2012). Modeling diffusion of a personalized learning framework.
Educational Technology Research and Development, 60, 585–600. doi:10.1007/s11423-012-
9249-2
Keengwe, J., & Georgina, D. (2011). The digital course training workshop for online learning and
teaching. Educational Information Technology, 17, 365–379.
Lewis, D., & Slapak-Barski, J. (2014). “I’m not sharing my work!” An approach to community
building. Quarterly Review of Distance Education, 15(2).
Linder, K. E. (2017). Training faculty to teach in blended settings. New Directions for Teaching &
Learning, 149, 47–58.
Littlefield, C. M. (2012). Blended course design and delivery: Faculty approaches, essential
components, and the impact of professional development in community colleges (Doctoral
dissertation, Widener University). Retrieved from ProQuest Dissertations & Theses. (UMI No. 1347340589)
Martin, F., Parker, M., & Allred, B. (2013). A case study on the adoption and use of synchronous
virtual classrooms. Electronic Journal of E-learning, 11, 124–138.
Meyer, G. (2004). Diffusion methodology: Time to innovate. Journal of Health Communication, 9,
59–69. https://doi.org/10.1080/10810730490271539
Mohr, S. C., & Shelton, K. (2017). Best practices framework for online faculty professional
development: A Delphi study. Online Learning Journal, 21(4), 123–140.
doi:10.24059/olj.v21i4.1273
Molina, P. G. (2013). An integrated model for the adoption of information technologies in U.S.
colleges and universities. (Doctoral dissertation, Georgetown University). Retrieved from
ProQuest Dissertations & Theses. (UMI No. 3556162)


Moore, G. C., & Benbasat, I. (1991). Development of an instrument to measure the perceptions of
adopting an information technology innovation. Information Systems Research, 2, 192–222.
doi:http://dx.doi.org/10.1287/isre.2.3.192
Powell, T. (2010). Delivering effective faculty training: A course and methods to prepare faculty to
teach online. Mobile, Blended, and On-Line Learning. Proceedings from ELML: Second
International Conference. CPS. https://www.iaria.org/conferences2010/eLmL10.html
Richardson, J. W. (2009). The diffusion of technology adoption in Cambodia: The test of a theory.
International Journal of Education and Development Using ICT, 5(3), 157–171.
Richardson, J. W. (2011). Technology adoption in Cambodia: Measuring factors impacting adoption
rates. Journal of International Development, 23(5), 697–710. doi:10.1002/jid.166
Rogers, E. M. (1962). Diffusion of innovations. Free Press.
Rogers, E. M. (2003). Diffusion of innovations (5th ed.). Free Press.
Shea, P. (2007). Bridges and barriers to teaching online college courses: A study of experienced
online faculty in thirty-six colleges. Online Learning Journal, 11(2), 73–128.
Shea, P., Pickett, A., & Li, C. S. (2005). Increasing access to higher education: A study of the
diffusion of online teaching among 913 college faculty. International Review of Research in
Open and Distance Learning, 6(2), 1–27. https://doi.org/10.19173/irrodl.v6i2.238
Shiflett, K. H. (2013). The relationship between organizational culture and adherence to regulatory
requirements for online programs. (Doctoral dissertation, University of Pittsburgh). Retrieved
from ProQuest Dissertations & Theses. (UMI No. 3582616)
Shipman, C. D. (2017). Perceptions of faculty toward integrating technology in undergraduate
higher education traditional classrooms at research-focused regional universities in south
Texas (Doctoral dissertation, Texas A&M, Kingsville). Retrieved from ProQuest Dissertations
& Theses. (UMI No. 10285869)
Soffer, T., Nachmias, R., & Ram, J. (2010). Diffusion of web supported instruction in higher
education-the case of Tel-Aviv University. Journal of Educational Technology & Society, 13,
212–223.
Sullivan, R., Burns, B., Gradel, K., Shi, S., Tysick, C., & van Putten, C. (2013). Tools of engagement
project: On-demand discovery learning professional development. Journal of Educational
Technology Systems, 41, 255–266.
Terantino, J. M., & Agbehonou, E. (2012). Comparing faculty perceptions of an online development
course: Addressing faculty needs for online teaching. Online Journal of Distance Learning
Administration, 14.
Varkonyi, I. (2012). Blended education: Combining the benefits of face-to-face learning with online
distance learning! Defense Transportation Journal, 68, 22–23, 28.
Wingo, N. P., Ivankova, N. V., & Moss, J. A. (2017). Faculty perceptions about teaching online:
Exploring the literature using the Technology Adoption Model as an organizing framework.
Online Learning, 21(1), 15–35.


Appendix A

eLII Analysis of Online Learning Professional Development Survey

Q1 Please rate how much you personally agree or disagree with these statements regarding voluntariness. (Response options for Q1–Q8: Strongly Disagree, Disagree, Agree, Strongly Agree.)

• My department does not require me to use the skills I gained in the eLearning Innovation Initiative professional development (i.e., I am not required to teach online or blended now or in the foreseeable future).
• Although it might be helpful, implementing the skills I gained in the eLearning Innovation Initiative professional development is not compulsory in my job.

Q2 Please rate how much you personally agree or disagree with these statements regarding implementing the skills you gained in the eLearning Innovation Initiative professional development.

• The skills enable me to accomplish tasks more quickly.
• The skills improve the quality of work I do.
• The skills make it easier to do my job.
• The skills enhance my effectiveness in my job.
• The skills give me greater control over my work.

Q3 Please rate how much you personally agree or disagree with these statements regarding how people in your organization who implement the skills gained in the eLearning Innovation Initiative professional development are perceived.

• They have more prestige.
• They have a higher profile.
• They are a status symbol in my organization.

Q4 Please rate how much you personally agree or disagree with these statements regarding implementing the skills you gained in the eLearning Innovation Initiative professional development.

• The skills are compatible with all aspects of my work.
• The skills fit well with the way I like to work.
• The skills fit into my work style.

Q5 Please rate how much you personally agree or disagree with these statements regarding implementing the skills you gained in the eLearning Innovation Initiative professional development.

• Using the skills is clear and understandable.
• I believe it is easy for me to do what I want to do with the skills.
• Overall, I believe it is easy for me to implement the skills.
• Learning the skills is easy for me.

Q6 Please rate how much you personally agree or disagree with these statements regarding the demonstrability of implementing the skills you gained in the eLearning Innovation Initiative professional development.

• I would have no difficulty telling others how I implemented the skills I learned.
• I believe I could communicate to others the consequences of implementing the skills.
• The results of implementing the skills are apparent to me.
• I would have no difficulty explaining why implementing the skills may or may not be beneficial.

Q7 Please rate how much you personally agree or disagree with these statements regarding visibility.

• In my organization, I see other eLearning Innovation Initiative professional grant recipients using the skills I gained.
• People who use the skills from the eLearning Innovation Initiative grant are not very visible in my organization.

Q8 Please rate how much you personally agree or disagree with these statements regarding the skills you gained in the eLearning Innovation Initiative professional development.

• Before deciding whether to use any of the skills, I was able to adequately practice those skills.
• I was permitted to use the skills on a trial basis long enough to see what I could do.

Q9 Please rate your adoption level on a scale from 1 to 5, with 1 being the last person to adopt and 5 being the first person to adopt.

• How would you rate your adoption level using digital technology? (1 2 3 4 5)
• How would you rate your adoption level with regards to teaching blended courses?* (1 2 3 4 5)
• How would you rate your adoption level with regards to teaching fully online courses?** (1 2 3 4 5)

* Blended courses are courses that have traditional face-to-face on-campus instruction in which some on-campus activities have been replaced by online learning activities.
** Fully online courses are courses that have all content and course activities online. There is no traditional face-to-face on-campus instruction.


Appendix B
Interview Guide

1. Which eLII group do you belong to?
2. Which eLII cohort do you belong to?
3. How would you classify yourself with respect to digital technology? On a scale of
one to five, with one being not technologically savvy at all and five being very tech
savvy, where would you rate yourself? Can you tell me a brief story that best
exemplifies this rating?
4. Had you taught blended courses before your participation in the eLII professional
development?
5. Had you taught fully online courses before your participation in the eLII professional
development?
6. Have you taught blended courses after your participation in the eLII professional
development?
7. Have you taught fully online courses after your participation in the eLII professional
development?
8. How has your teaching changed since participating in the eLII professional
development?
9. Do you feel that you are required to teach online?
10. Do you feel you were required to apply for the eLII grant? Please explain your
answer.
11. What skills did you gain in the eLII professional development that you have now
implemented?
12. Talk to me about how people in your organization who implement the skills gained in
the eLII professional development are perceived.
13. Talk to me about how the eLII professional development is compatible with your
needs, teaching style, and pedagogy. Can you give me examples?
14. Describe how easy or difficult it is for you to implement the skills you gained in the
eLII professional development. Can you give me examples?
15. Describe the results of implementing the skills you gained in the eLII professional
development. Can you give me examples?
16. Is the implementation of skills gained in the eLearning Innovation Initiative
professional development visible in your organization? Can you give me examples?
17. How were you able to practice the skills gained in the eLearning Innovation Initiative
professional development? Can you give me examples?
18. What were your expectations for your professional development from the eLearning
Innovation Initiative? Did it meet those expectations?
19. Describe one aspect that was particularly beneficial to you.
20. Describe one aspect that was least beneficial to you. How would you change this
aspect?
21. Is there anything else about the eLearning Innovation Initiative professional
development or online teaching and learning that you would like to share?


From Discussion Forums to eMeetings:


Integrating High Touch Strategies to Increase Student
Engagement, Academic Performance, and Retention in
Large Online Courses
Glenda H. E. Gay
The University of the West Indies at Cave Hill

Kristen Betts
Drexel University

Abstract
Student engagement and group work are critical to developing competencies, deeper learning, and
attributes that align with 21st-century skills. In an increasingly competitive and dynamic
workforce, the ability for employees to engage in collaborative workgroups is essential. A new
capstone group-work assignment using Online Human Touch (OHT) strategies was integrated into
an Information Systems course at a regional university in the Caribbean. The course typically
enrolls 250–300+ students per semester with one instructor. The assignment simulated a real-world
business ‘eMeeting’ to proactively increase student engagement and retention. This action research
study collected quantitative and qualitative data three years prior to and three years after the
integration of the new ‘eMeeting’ group-work assignment. Quantitative data showed improved
academic performance, higher scores on the standardized final exam, and decreases in attrition, while qualitative data showed significant increases in student engagement. Integrating the
‘eMeeting’ assignment into the large online course provided students with the opportunity to apply
the knowledge, skills, and experience gained throughout the semester. It also enhanced key soft
skills sought by employers including problem-solving, ability to work in teams, communication,
leadership, and time management.

Keywords: large courses, online instruction, online learning, distance learning, teacher
presence, student engagement, attrition, retention, Online Human Touch, high touch strategies,
group work, 21st-century skills

Gay, G.H.E. & Betts, K. (2020). From discussion forums to eMeetings: Integrating high touch
strategies to increase student engagement, academic performance, and retention in large online
courses. Online Learning, 24(1), 92-117. https://doi.org/10.24059/olj.v24i1.1984


From Discussion Forums to eMeetings: Integrating High Touch Strategies to Increase


Student Engagement, Academic Performance, and Retention in Large Online Courses
Online enrollments worldwide have increased exponentially since the turn of the new
millennium. In the United States, the proportion of higher education students enrolled in at least
one online course increased to 33.1 percent in Fall 2017 from 31.1 percent in 2016. Concurrently,
students enrolled exclusively online grew to 15.4 percent while students enrolled in a mix of online
and in-person courses grew to 17.6 percent (Lederman, 2018). While cumulative online enrollment
growth worldwide is more elusive to quantify, the global market for e-learning was estimated at $90 billion in 2002 (Yong, 2003, cited in Chawla & Joshi, 2012) and $166.5 billion in 2016 (Yu & Hu, 2016), with projections of $275 billion by 2022 (Reuters, 2017). It is evident through global market
growth and increasing enrollments that online learning is now a cornerstone in education
worldwide.
As institutions of higher education (IHE) continue to expand online offerings, academic
quality and the student experience must be central to course design and instruction. This is of
particular importance for large online courses in which enrollments may reach 150 students per
course with one instructor. Very large online courses, which enroll 150 or more students per
course, are often managed by one instructor with one or more teaching assistants (Elison-Bowers,
Sand, Barlow, & Wing, 2011). Therefore, instructing up to 150 students in a large online course
or 150+ students in a very large online course is different than teaching the same online course
with 20–60 students with one instructor or teaching the same course in a traditional on-campus
classroom (Elison-Bowers, Sand, Barlow, & Wing, 2011; Berry, 2009).
One of the primary challenges for students enrolled in online courses is feelings of isolation, a lack of community, and limited engagement with the instructor (Boton & Gregory, 2015; Mokoena, 2013). These factors can result in higher attrition rates than in traditional
courses (Thomas, Herbert, & Teras, 2014). For courses with enrollments of 150+, students may be
at an even greater risk of attrition if these factors are not considered as part of course design or
addressed through high touch instructional strategies.
The University of the West Indies (The UWI), a regional university in the Caribbean, was
established in 1948 on the island of Jamaica with 33 students. The UWI now enrolls over 45,000
undergraduate students and approximately 9,000 graduate students across three physical campuses
and an online campus. Of these 54,000 students, over 20,000 are enrolled through the online
campus (The University of the West Indies, 2016). As part of the three-year undergraduate
management degree, the Information Systems course is offered online every semester to second-year students, with typical enrollments of 250–350+ students per course per semester. Each
Information Systems course has one instructor and four to five tutors who support student-to-
instructor and student-to-student engagement. While instructors teaching the Information Systems
course may not be concerned with students finding an open seat in a crowded large lecture hall,
they are concerned with how to actively engage each of the 250–350+ students while taking the
course that semester.


Review of Literature
Online Instruction with Large Courses
Online education provides increased opportunities for students to enroll in many programs.
Like on-campus programs, some online courses have increased student enrollments that mirror
large lecture classrooms and may include 150 or more students. According to Berry (2009),
“Teaching an online class session that has over 150 students enrolled is substantially different from
teaching a face-to-face class on campus or an online class with 25–35 students” (para. 1). A key
concern with large courses is that students may become disengaged or feel alienated which
can “erode students’ sense of responsibility and lead to behaviors that both reflect and promote
lack of engagement” (Wilsman, n.d., para. 1).
As online education continues to grow worldwide, there is increasing literature on how to
engage students in online courses. Strategies include keeping work relevant (Toney, 2017),
providing opportunities for learner interaction (Briggs, 2015), and providing effective and timely
feedback (Briggs, 2015; Toney, 2017). Creating opportunities for meaningful discussion and
collaboration in a large online course is one of the biggest challenges of online instruction
(Trammell & LaForge, 2017). Therefore, implementing such teaching techniques becomes an important
factor in the design and successful management of large online courses.
Group Work, Communication, and 21st-Century Skills
Collaboration through group work (i.e., teamwork) is critical to developing competencies
and attributes that align with 21st-century skills and deeper learning. According to the organization
P21 Partnership for 21st-Century Skills, collaboration is the “ability to work effectively and
respectfully with diverse teams” (Framework for 21st century learning, n.d.). In reviewing deeper
learning competencies, collaboration occurs when “students learn to work in teams to achieve
shared goals” (Bitter & Loney, 2015, p. 3). Collaboration also supports the development of
communication skills as students work together to collectively solve problems as a group.
Group work is a skill that prospective employers rate highly when hiring
graduates (Loughry, Ohland, & Woehr, 2014). This is evident in annual national studies
and media publications which identify skills that employers are seeking. According to the 2018
Job Outlook Report, the National Association of Colleges and Employers (NACE) reported that
the top three attributes an employer seeks on a candidate’s resume included: (a) problem-solving
skills, (b) ability to work in teams, and (c) communication (para. 7). Business Insider in 2018
spotlighted what LinkedIn identified as the four most important soft skills employers are seeking,
which included: (a) leadership, (b) communication, (c) collaboration, and (d) time management
(Leighton, 2018). Recognizing that employers are seeking these critical soft skills, it is important
that such skills be integrated into course design and instruction to support course and program outcomes.
As corporations become increasingly diverse, the ability to collaborate is critical whether
employees are working onsite or virtually. Employees are expected to be able to communicate
through email, discussion forums, and video conferencing. This study adapted the threads of a
discussion forum to represent the phases of an online meeting conducted in the corporate sector.
For the purpose of this study this capstone group assignment that simulated a business meeting is
referred to as an eMeeting.
Online Human Touch
The Online Human Touch (OHT) conceptual framework builds upon five areas of research
that support student engagement, retention, and completion. The five areas include:
• Student Engagement (Astin, 1984; Chickering & Gamson, 1987; Tinto, 1993)
• Personalized Communication (Faharani, 2003; Mehrabian, 1971)
• Community Development (Palloff & Pratt, 1999; Stanford-Bowers, 2008)
• Work-Integrated Learning (Milne, 2005), and
• Data Driven Decision-Making (Cranton & Legge, 1978).
Each of these areas, when integrated into program development, course design, and
instruction, supports student engagement through high touch strategies during the student lifecycle.
Student engagement. High touch student engagement strategies connect students to the
instructor and other students through course orientation sessions, announcements, discussion
forums, synchronous sessions, and group assignments. When students are fully engaged, focused,
and present, they can experience flow, which is a state of optimal experience (Csikszentmihalyi,
1990; Spencer, 2017). Additionally, student engagement is an important factor in proactively
addressing student retention and creating a lifelong bond with future alumni (Betts, 2008).
The literature has shown that students learn best when they have specific assessment
guidelines, including a rubric (Rose & Smith, 2007). Furthermore, students learn better and faster
through multimedia presentations that supplement text-based coursework, thus allowing them to
review content at their own pace (Buzzetto-More, 2015).
Personalized communication. High touch personalized communication strategies
encourage regular and ongoing interaction with the instructor (e.g., faculty, adjunct faculty) and
students. It involves being active in the discussion forums, such as using students’ names when
responding to posts; providing customized feedback on graded assignments; and having meetings
with students or groups via Zoom or Skype regarding activities, assignments, or as needed.
Feedback using multiple modalities also supports personalized communication through text, voice,
and video feedback on assignments and group work.
A 2018 study by the National Association of Colleges and Employers (NACE) revealed
that students may rate their own communication and collaboration skills more highly
than employers do. For example, a NACE report showed that when asked to rate their
oral and written communication skills as well as their ability to work with others in teams, students
overall rated themselves 79.4% for oral/written communication and 85.1% for working with others
in teams (Bauer-Wolf, 2018). However, for these same skill sets, employers rated students at
41.6%, and 77% respectively (Bauer-Wolf, 2018). Providing creative and personalized feedback
can therefore be used to enhance students’ written communication skills, while demonstrating to
students that the instructor and tutors are interested in their contributions (Mokoena, 2013).
Feedback by the instructor is also important since it could serve as a catalyst for students who have
yet to join or engage in the discussion threads (Rose & Smith, 2007).
Community development. High touch community development strategies involve creating
activities that support student-to-instructor and student-to-student engagement. Community
development can be fostered through discussion forums that actively engage students with topics
relevant to weekly/module content, current/emerging issues, and upcoming assignments.
Discussion forums can also incorporate group assignments in which students collaborate
asynchronously or synchronously. “The starting point for learning occurs when knowledge is
actuated by learners connecting to and participating in a learning community” (Goldie, 2016, p.
1065).
Collaboration skills are critical to today’s workforce. According to Laux, Luse, and
Mennecke (2016), “When students use a virtual community as a basis for learning, they are
exposed to unfolding events similar to real life. This is different from a single exposure to a concept
in one classroom session” (p. 289). However, there are some students who tend to participate in
group discussions but give shallow or short responses instead of providing in-depth reflective
responses that integrate their experiences with the material (Rose & Smith, 2007; Mokoena, 2013).
The Partnership for 21st Century Skills organization suggests that when collaborating, students
should develop the ability to work effectively and respectfully in their group, exercise willingness
in making necessary compromises to accomplish a common goal, assume shared responsibility for
collaborative work, and value the individual contributions made by each team member (n.d., para.
1).
Work-integrated learning. High touch work-integrated learning strategies assist students
in understanding the connection between activities and assignments and real-world issues. It aligns
with various instructional strategies that support providing student choice while adhering to the
same learning objectives and rubrics. For key assignments, this high touch strategy could include
having students select a topic of their choice, within identified parameters, which supports interest,
relevance, and significance for assignments that align with real-world contexts on current and
emerging issues related to the course.
Experiential and work-integrated learning are important when students are able to make
content applicable to their real-world experiences and they are involved in assignments in which
they use research and creative thinking skills, develop ideas, or solve a problem (Bigatel, 2016).
In postsecondary education, “experience-based education has become widely accepted as a method
of instruction” (Kolb, 2014). Learning experiences that expose students to a professional culture
and workplace practice are needed to support this transition from study to employment (Betts,
2008).
Data-driven decision-making. High touch data-driven decision-making strategies
involve formative and summative assessment. Diverse learning assignments actively engage
students in becoming reflective learners and practitioners. This can be achieved through
personalized feedback with scaffolded assignments as well as peer evaluation and self-evaluation
(Betts, 2008). Moreover, it can also engage faculty in modifying, refining, expanding, or replacing
activities or assignments based on summative feedback from course evaluations or program
reviews (Betts, 2008).
Collectively, the five research areas within the OHT framework support high touch
strategies for student success and the transfer of learning across real-world contexts. This action
research study, therefore, examined how the integration of an eMeeting design affected student
engagement in the course. The following research questions were addressed:
1. How did the integration of an eMeeting designed using OHT strategies impact student
success (i.e., academic performance, student engagement, attrition) in a large online
course?
2. How did students perceive the eMeeting using OHT strategies in a required Information
Systems online course?
Course Structure 2011–2013
The original course in 2011 had three individual assignments, which included (a) one
orientation activity, (b) one database project, and (c) one essay assignment. There were also weekly
discussion forums and a standardized final exam. The orientation activity involved getting familiar
with the course and an introduction to databases. The database project included creating queries
and reports while the essay assignment focused on responding to an information systems issue.
Students were assigned to groups of approximately 20–30 with one tutor per group. Each tutor was
responsible for their group’s student orientation and for grading their students’ submissions for the
database project, essay assignment, and discussion forums. There was no capstone group work
assignment in this course structure.
The assigned weekly discussion forums were designed to test fundamental concepts:
students were required to read chapters from an online course manual and post responses to generic
questions in the weekly forum. Group work was therefore incidental, since students
were placed in groups on registration and assigned a tutor. There were also limited guidelines on
how to actively participate in the groups. According to the literature, a lack of guidance on how to
effectively work as a group may cause a “sink-or-swim” approach (Vik, 2001). Moreover, research
by Rose and Smith (2007) indicates that the stipulation for ‘participation in group discussions’
tends to be too vague in terms of what is required of students as well as the extent and level of
participation. Having minimal guidance on what was generally permitted or expected in a group
environment did not foster or optimize student interaction. Students therefore replied to the
primary thread and to two other students in their group to meet the requirements, but the responses
typically did not go beyond the initial prompt. The final exam was a proctored, standardized two-
hour written exam that followed The UWI’s regulations for course completion.
Course Structure 2014–2016 with the New eMeeting Design
The original Information Systems coursework in 2011–2013 and the revised coursework in
2014–2016 shared the same learning outcomes, weekly objectives, assignments, readings, and
discussion forums. The orientation activity and database project still involved becoming familiar
with the course and introduction to databases. However, the essay assignment was revamped as a
capstone group-work assignment. It was now a two-week asynchronous discussion forum that was
introduced to the students as an eMeeting. Additionally, a restructured and enhanced standardized
final exam now focused on a case study.
The eMeeting was designed to simulate a real-world online business meeting as well as
proactively increase student engagement and retention. The discussion threads of an eMeeting were
sequenced, starting from student introductions through various tasks to the group's final submission.
The sequencing provided guidance as students progressed through the threads of the eMeeting.
Additionally, the eMeeting design integrated the strategies from the OHT conceptual framework to
support student engagement and retention. The OHT framework asserts that students are more
likely to persist in an online program if they are engaged in and outside of their courses; the
educational experience is personalized; and activities support transfer of learning across real-world
contexts (Betts, 2008).
While instructor presence and teaching were and are still important aspects of the course,
the role of the tutor has shifted from primarily grading course assignments and providing feedback
to guiding and engaging students with high touch strategies through the eMeeting. This new
approach now provides students with extended opportunities to develop critical workforce skills,
explore career interests, and build upon prior knowledge. Additionally, this new knowledge could
allow them to gain exposure to 21st-century skills and align with The UWI’s attributes in order to
expand their network, increase their regional identity and global awareness, and identify
innovative ways to transfer new knowledge and skills as socially, culturally, and environmentally
responsible citizens.

Methods
Action research was selected for this study since this methodology is used in real-world
contexts to solve problems and improve professional practice. Action research is typically
conducted by practitioners to explore practical problems in which the research results in a desired
change that is shared within an educational community (Norton, 2018; Efron & Ravid, 2013). Parsons and
Brown (2002) define action research as follows:
Action research is a form of investigation designed for use by teachers to attempt to solve
problems and improve professional practices in their own classrooms. It involves
systematic observations and data collection which can be then used by the practitioner-
researcher in reflection, decision-making and the development of more effective classroom
strategies (p. 55).
Action research models typically include from three to seven or more steps. For the
purpose of this study, there were five steps (see Figure 1).

Figure 1. Action research steps for this study.
Both quantitative and qualitative data were collected for the three years prior to and the three
years after the new eMeeting assignment. Quantitative data, including exam scores and student attrition
before and after the introduction of the OHT strategies, were summarized and evaluated for trends
in attrition. Qualitative feedback was also collected from over 2,000 posts during the eMeeting.
Context and Setting
One of the core courses in the undergraduate management program in Social Sciences at The UWI is
Information Systems. As at many institutions of higher education (IHEs), required courses in
management programs can have very large enrollments. The UWI's Information Systems course typically enrolls 250–350+ students per
course per semester and is offered online every semester to second year students pursuing a three-
year undergraduate management degree. Each Information Systems course has one instructor and
four to five tutors who support student-to-instructor and student-to-student engagement.
Instructors therefore want to ensure that they are actively engaging the 250–350+ students enrolled
in the course.
The Information Systems course model builds upon an instructor-tutor relationship.
Students first register for the course that has one instructor. Within the course, students are then
assigned to groups of approximately 20–30 with one tutor per group. The tutors are adjunct faculty,
who are working professionals with content expertise. They are required to complete The UWI’s
online training courses on managing and facilitating online instruction. In their role as tutors, the
adjunct faculty are actively involved in the discussion forums providing workforce-related
perspectives. They are also responsible for grading their group’s assignments and providing
student feedback.
This method allows the instructor to focus on “managing the course” throughout the
semester. This includes preparing and posting course materials and assignments, such as
integrating supplemental materials related to current and emerging issues into the weekly course
content and the discussion forums, preparing rubrics (mark schemes), leading synchronous
sessions, and managing the student-tutor experience. The instructor also posts weekly reminders,
course-related announcements, and any institutional-related announcements.
Interaction is paramount in this course. The instructor is also responsible for managing the
instructor-tutor interaction, which includes required online meetings throughout the semester. The
instructor first reviews the coursework for the upcoming semester and discusses any nuances
regarding the new assignments with the tutors. Each tutor is required to complete the course
assignments. This provides a unique collaborative opportunity for the instructor and tutors to
discuss the assignments and make any needed modifications before they are approved for use.
Although the tutors are responsible for grading assignments, the instructor "standardizes" the
grading by randomly selecting samples from each group to ensure consistency and quality
control. The instructor also ensures that the tutors mark assignments and enter the marks
within a two-week period. Students would then have the opportunity to reflect on the feedback
prior to the next assignment submission. This approach, in many ways, fosters a student
relationship with both the instructor and a professional within the field (the tutor).
Sample
Convenience sampling was used for this study and included 3,386 students who were
enrolled in the Information Systems online course over a six-year period: 2,386 between 2011–
2013, and 1,500 students between 2014–2016. All students were enrolled in the second year of
their undergraduate program. Table 1 provides an overview of student enrollments in the same
course as offered between 2011 and 2016.

Table 1
Student Enrollment
Year Student Enrollment
2011 801
2012 834
2013 751
2014a 547
2015 484
2016 469
Note. a refers to the year of integration of the eMeeting design using OHT strategies.

The eMeeting Design with OHT Strategies
Students were sorted alphabetically by their first name and then assigned to sub-groups of
five where they could only see their own small group’s activity in the eMeeting. Five sequenced
threads were created using “MoodleForums,” which supports multiple sub-forums. This tool was
used to monitor student interaction and ensure clarity about participation (see Figure 2).
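As an illustration of this sorting-and-allocation step, a short Python sketch is shown below. The names are invented and the allocation at The UWI was performed within Moodle rather than by an external script; this is simply one way to express the same logic.

# Minimal sketch of the grouping step described above: sort the roster
# alphabetically by first name, then chunk it into five-member sub-groups.
# The final group may hold fewer than five students if the roster size is
# not a multiple of five.
def make_subgroups(students, size=5):
    roster = sorted(students, key=str.lower)  # "First Last" strings sort by first name
    return [roster[i:i + size] for i in range(0, len(roster), size)]

students = ["Maria Lopez", "Marcus Chen", "Akilah James", "Andre Boyce",
            "Keisha Grant", "Kevon Ali", "Dana Pierre"]
for number, group in enumerate(make_subgroups(students), start=1):
    print(f"Group {number}: {', '.join(group)}")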
These five threads played an important role in guiding students through the stages of their
eMeeting with instructions on how to participate at each stage. For a two-week period, students
had access to their eMeeting. Each thread replicated the instructions for the specific task along
with the corresponding rubric and grades (mark allocations). The left half of Figure 2 illustrates
the outline of the five threads of the eMeeting. The right half of Figure 2 illustrates a screenshot
of the contents of thread one, with the assignment document (at top) and an accompanying 10-
minute video created by the instructor (at bottom).

Figure 2. First thread of an eMeeting (upper left), comprising the course assignment document
(upper right) and an accompanying instructor-led video (lower right).


Table 2 provides an overview of the content for the eMeeting threads with associated high
touch strategies.

Table 2
Overview of eMeeting Threads with High Touch Strategies
eMeeting Thread 1. Assignment materials: Each group's assignment instructions are posted in this thread along with an instructor-led video explaining the requirements.
High touch strategy: Student engagement (instructor-led videos). A video describing the approach and purpose of the eMeeting was developed to ensure that all requirements were clear. The video also integrated metacognitive approaches to learning by relating the direct connection between their eMeeting and business communication skills that can be readily transferred to the workforce. Additionally, the video referenced the assignment document that detailed the coursework rubric, so that students were aware of how they would be assessed. Students were also advised that they would not be graded for partial, vague, or general responses.
eMeeting Thread 2. Introductions: Each member is expected to greet the other members and provide specific information identified in the instructions.
High touch strategy: Personalized communication (self-introductions). This strategy required students to introduce themselves in preparation for interacting in an online business setting as opposed to a social situation. The instructor video provided in Thread 1 included examples of how students should introduce themselves in an online environment, so that they were able to gain marks for creating their personalized introductions for the eMeeting. The list of students was also sorted by first name before allocating them to their five-member groups. This intentional sorting resulted in students sharing similar names or initials, which could be used as an icebreaker to start interaction and build camaraderie.
eMeeting Thread 3. Topic selection group activity: Students suggest and then agree on a common aspect from the case study that will be used throughout the eMeeting.
High touch strategy: Community development with agreement on application. To foster this virtual community, Thread 3 required members to suggest and agree on a common application, theme, or topic from the case study that would be used throughout the eMeeting. Group members were expected to share similar interests and thus create free-flowing interactions among all group members.
eMeeting Thread 4. Role play group activity: This main thread was used by students to engage their group members using key terminology, database queries, and reports from a prior assignment. They were expected to incorporate this information based on the case study from which they made real-world decisions. Students were encouraged to use information from their database project or provide links to Internet sources that supported their decision-making.
High touch strategy: Work-integrated learning across real-world contexts. The eMeeting provided students with an opportunity to meet online regularly and discuss elements of their group assignment. This is similar to scheduled meetings within an organization when working on a project. This strategy also supported instructional strategies used by tutors to encourage further interaction in some groups, while creating friendly competition amongst members in other groups.
eMeeting Thread 5. Summary notice board group activity: Students were expected to access their group's private online notice board to post outcomes, decisions, or recommendations from their discussions, but the final product should reflect each group's collective summary. The completed summary board is then embedded in this discussion thread as evidence of the group's final activity.
High touch strategy: Community development (student-student brainstorming). Students demonstrate their collaborative skills by working effectively and respectfully throughout the eMeeting threads. As they demonstrate willingness to assist group members in accomplishing the required objectives, their work aligns with the Partnership for 21st-Century Skills framework.

Evidence of High Touch Strategies: Data-Driven Decision-Making
Quantitative and qualitative data were collected prior to and after the integration of the new
eMeeting group-work assignment to determine the effect of the high touch strategies on student
engagement. Quantitative data included course pass rates, course evaluations, and attrition.
Qualitative data included instructor observations, student posts, and student feedback on
the new capstone group-work assignment.

Results
This study provides summary quantitative data on academic performance and course
satisfaction as well as qualitative student feedback. Examples of five high touch strategies that
align with the OHT framework are provided in the following tables and figures. The screenshots
of examples and associated strategies are illustrated using the January–April (Semester 2) 2016
cohort. For the 469 students enrolled in the course with access to Moodle, up to 68 smaller groups
were created using a "MoodleForum" format that supports multiple sub-forums. This also enabled
the five tutors to each monitor and actively work with approximately 13 five-member groups.
Student Engagement
Data collected from this cohort showed a total of 5,536 student visits to the eMeeting
threads and 2,502 posts by the eMeeting deadline. Each tutor's set of approximately 13 groups was also
provided with an instructor-led video relating to its eMeeting case study. These videos were viewed 1,037 times
while students worked through the tasks (see Table 3).

Table 3
Summary of Interactions Among Students at End of Each eMeeting for Semester 2, 2016
eMeeting Case Study                        Number of posts at deadline (b)   Video views   Average group mark out of 15
1. Management Reporting                    498                               263           14
2. Management Decision Making              771                               190           14
3. Mobile Data Security                    332                               167           13
4. Systems Development                     404                               223           14
5. International Information Systems       497                               194           14
Note. (b) Excluding private and miscellaneous posts.
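As a consistency check, the per-case-study counts in Table 3 sum to the cohort totals reported above (2,502 posts and 1,037 video views). The short snippet below simply makes that arithmetic explicit using the published figures.

# Arithmetic check on Table 3 using the published per-case-study counts.
posts = [498, 771, 332, 404, 497]   # posts at deadline, per case study
views = [263, 190, 167, 223, 194]   # video views, per case study
print(sum(posts), sum(views))       # -> 2502 1037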


Most of the students systematically followed the sequenced discussion threads.
Additionally, student engagement in a majority of the eMeetings went beyond the requirements
posted with each thread. The applied high touch strategies, instructor observations, and exemplars
of student posts from each thread are discussed in the following sections. Students’ names included
in posts have been modified to ensure anonymity.
eMeeting Thread 1: High Touch Student Engagement Strategy
Assignment materials with instructor-led videos. The new eMeeting design for this thread resulted in fewer
questions from the students asking for clarification on some aspect of the requirements. They were
able to refer to the written instructions and the instructor-led video to assist each other during the
eMeeting.
Instructor observation 1. Many students watched the video first before starting the tasks
and referred to the video repeatedly to ensure that they understood the tasks and knew what was
expected of them. This suggested that they made the effort to meet those expectations (Figure 3,
top).
Instructor observation 2. The information provided in the video was useful in guiding
students with the specific technology tools required, while keeping them on task as they
brainstormed during the group activity (Figure 3 bottom).

Figure 3. Students referring to the video in Thread 1.

eMeeting Thread 2: High Touch Personalized Communication Strategy
eMeeting with self-introductions. Students were not usually required to formally
introduce themselves in discussion forums. However, it was a requirement for this eMeeting. For
this thread, a few students posted a minimum response while others were professional, not only
posting the correct information but reaching out to welcome their groupmates. Tutors also used
this new self-introduction opportunity to guide students by posting private messages to those who
did not meet the requirements for their introductory post.
Instructor observation 1. Some students used salutations in their greeting, such as "Hello,
my name is...", which was an accepted and more formal style of communication. However, a few
students greeted their colleagues using casual terms such as "Hey" or "Hi guys", which are often
associated with informal social media greetings (see Figure 4).


Figure 4. Examples of unsuitable (top example) and appropriate (bottom two examples) introductory posts.
Instructor observation 2. Several students reposted their corrected self-introductions (see
Figure 5). The eMeeting also fostered a sense of community among members. For example, Figure
6 shows a sample of personalized communication from a student who reached out to a group
member to direct her to the correct discussion thread.

Figure 5. Example of reposted self-introduction.


Figure 6. Example of personalized communication among group members.

eMeeting Thread 3: High Touch Community Development Strategy
Group decision-making with agreement on application. The requirements in this thread
provided the structure for the eMeeting. It involved consensus for the selection of the application,
theme, or topic. This thread was therefore designed to engage students in discussions as they
collaborated on various components of the assignment.
Instructor observation 1. In one of the eMeetings, group members were required to post
a suitable online application and explain their choice. In Figure 7, students recommended familiar
video conferencing applications, thus integrating their previous experiences with the application
to the activity in this thread.

Figure 7. Students posting recommendations on a suitable application for video conferencing.
Instructor observation 2. In another eMeeting, students were required to choose a mobile
phone and provide reasons why it would be appropriate for company employees to use that phone
when travelling, given its cost and data plan. Some students posted their personal brand of mobile
phone, stating generic reasons for their choice, while others compared various attributes of
different phones, showing that they conducted research and made informed decisions. The top posts
in Figure 8 are examples of unsuitable responses; the bottom posts are examples of
well-researched responses.


Figure 8. Students posting recommendations that were general (top) compared to well-
researched (bottom) regarding a choice of mobile phone when travelling.

eMeeting Thread 4: High Touch Work-Integrated Learning Strategy
Role play group activity across real-world contexts. Students communicated and
progressed through the threads of the eMeeting while receiving valuable feedback from their tutor
and group members. Additionally, the instructor and tutors used creative ways to maintain the
momentum for those who were engaged in the discussion. A key strategy used by one tutor was to
engage a group’s members as though they were working as managers and the tutor was a
supervisor, thus changing the group dynamic and enhancing the quality of responses. Another
strategy used by a tutor was a summary post at the conclusion of each eMeeting, which shared the
progress of all groups, and provided them with a comparison of their progress among their peers.
Instructor observation 1. Figure 9 illustrates a strategy used by a tutor to engage students
in an eMeeting. As the number of student comments increased, it became necessary to summarize
the information in such a way that it became a creative resource. Initially, some groups spent too
much time in an earlier thread trying to agree on a common theme. Other groups were not as active,
indicating that they were waiting for all members to make an initial post. One tutor not only praised
groups for starting their eMeeting but used a memo to provide guidance and encouragement to
those who had not posted or were not posting within the required deadlines. The tutor’s memo was
also written with a tone of urgency to align with the group’s cybersecurity theme thus encouraging
members to complete their tasks by the deadline to avoid a ‘security breach’ (Figure 9).


Figure 9. Tutor using a memo for role-play.

Instructor observation 2. Another tutor used a different strategy by posting a summary
of the status of the groups’ progression through the threads (Figure 10). This gave the students an
overview of how well each group was performing without embarrassing or identifying anyone
individually. In this summary post, groups B, D, and E were progressing very well, and so the
tutor used a competitive strategy to maintain their momentum. A teachable moment for
groups A, C, and F from the memo was that the specific outcomes for the eMeeting were still
required irrespective of the non-attendance of all group members.

Figure 10. Tutor using group summaries for role-play.


eMeeting Thread 5: High Touch Community Development Strategy
Group submission with student-student brainstorming. In previous threads, students were
expected to collaborate and agree on a common application, theme, or topic from the case study
that would be used throughout the eMeeting, and then actively engage each other in discussion.
This last task of the eMeeting required students to assume shared responsibility for collaborative
work. Students were expected to contribute ideas in order to create an online poster that reflected a
cohesive summary of their group assignment.
Instructor observation 1. Students were observed posting messages, reminders, and
questions as they worked effectively in their groups. There was evidence of students interacting
respectfully with their members, making necessary compromises, and assuming shared
responsibility in collaborating on the final group poster (see Figures 11 to 13). Suggestions on
improving the final product were also observed, an indication that members valued the individual
contributions made by other team members.

Figure 11. Student summarizing a series of posts for the group.

Figure 12. Requesting feedback from other group members.


Figure 13. Communicating new ideas to other group members.

Instructor observation 2. Most groups completed their summary poster by the two-week
deadline, thus achieving the final objective of the assignment. Samples from the posters shared in
Figures 14 and 15 demonstrated creativity and innovation through working together and
brainstorming in an online environment.

Figure 14. Submission of final online poster by a group–Example A.


Figure 15. Submission of final online poster by a group–Example B.

Course Pass Rates, Course Evaluations, and Student Feedback
Overall, the data collected showed increased pass rates, increased student satisfaction, and
decreased student attrition. The following sections provide further details on these results.
Course pass rate. Historically, the overall pass rate for the course ranged from a low
of 69% to a high of 88% between 2011–2013. While there were fluctuations in these early years,
since the integration of the eMeeting with the high touch strategies the pass rate has shown a steady
increase, with a low of 90% and a high of 93% between 2014–2016. Table 4 presents the
comparison of overall student pass rates from 2011–2013 and 2014–2016 for each semester.

Table 4
Overall Pass Rates from 2011 to 2016 for Students Who Had Both Coursework and Exam Marks
Year Semester 1 Semester 2 Average
2011 69% 83% 76%
2012 79% 75% 77%
2013 86% 88% 87%
2014c 91% 90% 91%
2015 91% 93% 92%
2016 93% 93% 93%
Note. The UWI refers to the fall and spring semesters respectively as Semester 1 and Semester 2. c
indicates the year that the eMeeting design was integrated into the course.


Course evaluation. The course evaluation is based on a 5-point Likert scale, with 1 being
Strongly Disagree and 5 being Strongly Agree, regarding workload, learning outcomes, and clarity
of instructions for completing coursework. Data from 2014 to 2016 indicated a high
level of agreement across three key areas of the evaluation: (a) Instructor's Communication (see
Table 5), (b) Learning Activities with the Instructor (see Table 6), and (c) Feedback from the
Instructor (see Table 7).

Table 5
Instructor's Communication
I understood…                                                                                                            2014   2015   2016
the course learning outcomes (learning outcomes = what students should know/be able to do by the end of the course)     4.56   4.56   4.74
how I could use the knowledge/skills developed during this course to achieve other goals (e.g., career, further study)  4.71   4.64   4.79
the instructions for completing assignments/assessments                                                                 4.57   4.74   4.88
the criteria that the instructor used to grade my assignments/coursework                                                4.61   4.77   4.83
Overall                                                                                                                  4.61   4.68   4.81

Table 6
Learning Activities with the Instructor
Learning activities with the instructor…                                                                                2014   2015   2016
were varied (that is, involved different types of activity)                                                             4.56   4.56   4.63
were interesting or intellectually stimulating                                                                          4.71   4.64   4.81
encouraged me to interact/collaborate with other students about course topics                                           4.57   4.74   4.42
helped me to develop the knowledge, attitudes, and skills specified in the course learning outcomes                     4.61   4.77   4.81
required me to apply my new knowledge and skills to problems/new scenarios                                              4.48   4.78   4.77
Overall                                                                                                                  4.61   4.68   4.71

Table 7
Feedback from the Instructor
Statement                                                                                                                2014   2015   2016
Feedback on assignments/in-course assessments was provided in sufficient time to be useful                              4.56   4.56   4.83
Feedback helped me to develop/improve my knowledge or skills                                                            4.71   4.64   4.78
Grades for assignments/in-course assessments were based only on the criteria that the instructor had specified          4.57   4.74   4.98
Overall                                                                                                                  4.61   4.65   4.86


Student feedback. As part of course evaluations, students could provide written feedback
through open-ended questions. These included additional thoughts or information regarding (a) what
they liked best about the course, (b) what they liked least about the course, and (c) how the course
could be improved. A last section on lessons learned provided an opportunity for students to share
their overall experiences of the course. While most comments concerned maintaining focus
throughout the course or the overall workload, a few students each semester commented on the
impact of the eMeeting assignment. No negative comments or experiences were posted specifically
regarding the revamped eMeeting design.
Examples of student feedback from 2014–2016 included:
• The graded group discussion on recycling was effective in its intent to stir up a need to be part
of a recycling action, rather than a passive bystander. I feel more confident about what I have
to offer in relation to company decision making the impact on the environment (2014).
• In going forward, I would try to implement the knowledge I have gained from the course in
my business to enhance its performance (2014).
• I would like to say that from the initial tasks to the project and e-meeting in the board room
have all contributed to my learning and the e-meeting definitely had an impact as it provided
guidelines as to what I can expect in a meeting, as this was my first “e-meeting experience”
(2015).
• This course has thought me a lot about teamwork and I enjoyed every moment of it. For some
reason, this course has been the first I have felt so passionate about hence the reason why I felt
obligated to participate in every forum in this course (2016).
• I found I was able to directly apply some areas of this course to other courses and in so doing
enhanced my understanding of those interconnected areas. I feel ready to apply what I have
gleaned to my job and other areas of my life (2016).
Decrease in student attrition. The course historically had attrition rates of around 5%.
Since the integration of the eMeeting design with high touch strategies, the attrition rate has dropped to
approximately 1%. Table 8 shows attrition from 2011 to 2013, before the new group assignment
with high touch strategies was introduced, and from 2014 to 2016, after its introduction. Table 8
also highlights the overall decrease in student attrition from 5% to 1%.
Table 8
Summary Data on Number of Students Who Dropped the Course Between 2011 and 2016
Year    Student enrollment   Number of students who completed course   Number of students who dropped course   Percentage of students who dropped course
2011    801                  760                                       41                                      5.1%
2012    834                  790                                       44                                      5.3%
2013    751                  715                                       36                                      4.8%
2014*   547                  534                                       13                                      2.4%
2015    484                  476                                       8                                       1.7%
2016    469                  464                                       5                                       1.1%
Note. Data were obtained from final exam mark sheets for both coursework and final exam marks. * indicates the year that the eMeeting design was integrated into the course.
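The attrition percentages in Table 8 follow directly from the enrollment and completion counts. The sketch below recomputes them in Python using only the figures published in the table; it reproduces, for example, 41/801 = 5.1% for 2011.

# Recomputing the Table 8 attrition percentages from the published
# enrollment and completion counts (year: (enrolled, completed)).
cohorts = {
    2011: (801, 760), 2012: (834, 790), 2013: (751, 715),
    2014: (547, 534), 2015: (484, 476), 2016: (469, 464),
}
for year, (enrolled, completed) in cohorts.items():
    dropped = enrolled - completed
    print(f"{year}: {dropped} dropped of {enrolled} ({dropped / enrolled:.1%})")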


Recommendations
There are five recommendations from this study that align with the OHT conceptual
framework. Each recommendation builds upon the literature and makes a connection to the context
and results of this action research study. The recommendations can be used in courses ranging
from low to very large enrollments to support engagement and retention.
1. High Touch Student Engagement Strategy: Enhance Directions with Instructor-Led
Videos
It is recommended that instructors develop supporting materials beyond the syllabus. These
materials can include instructor-led videos, such as those used in this study, which describe
the approach and purpose of an assignment to ensure that all requirements are clear. The instructor-
led videos should reference the assignment document and detailed coursework rubric, so that
students are aware of how they will be assessed.
2. High Touch Personalized Communication Strategy: Provide Examples for Self-
Introductions
It is recommended that as part of the requirements, students introduce themselves in
preparation for interacting in an activity. Examples could also be shared on how students should
introduce themselves in an online environment. Similarities in students’ names, if sorted, could be
used as an icebreaker during initial introductory posts. Instructors should also be aware that it is
important to provide creative and personalized feedback to students, whether individually or as a
group. Apart from demonstrating that the instructor is interested in students' contributions,
personalized posts can also serve to encourage students who have yet to join or engage in the
threads for various reasons.
3. High Touch Community Development Strategy: Require Collaborative Agreement
It is recommended that instructors integrate strategies that actively engage students with
topics relevant to weekly/module content, current/emerging issues, and prior or upcoming
assignments. These assignments in which students collaborate could include tasks that require
interaction asynchronously or synchronously.
4. High Touch Work-Integrated Learning Strategy: Integrate Role Play Using Real-world
Contexts
It is recommended that assignments align with real-world contexts, to support transfer of
learning. This offers opportunities for communication, critical thinking, problem solving, and
collaboration.
5. High Touch Data-Driven Decision-Making Strategy: Review Quantitative and
Qualitative Data
A consistent review of quantitative and qualitative data from assessments and evaluations
is recommended. Thus, monitoring various aspects of the course can assist in modifying current
assignments or integrating new course assignments into courses with large student enrollments.
These five high touch recommendations are provided for instructors seeking to actively
engage students in large classes. The eMeeting-type assignment, as shared in this study, has been
shown to assist students in developing the knowledge, skills, and experience needed as they transition
from the educational classroom to the corporate sector.


Conclusion
The UWI is committed to providing all students with the same high-quality courses across
all instructional modalities, in alignment with program outcomes, 21st-century skills, and
The UWI's attributes. This study has demonstrated that an online discussion forum can be
successfully designed and introduced to students as an eMeeting or other real-world group activity.
The results show that since the integration of the new eMeeting format using OHT strategies,
increases were observed in student engagement and academic performance, while a
comparison between the pre- and post-integration periods revealed decreased attrition and higher scores
on the standardized final exam. Course evaluations between 2014–2016 also reflect increased
student satisfaction with the course.
The integration of the new eMeeting design using high touch strategies was successful for
students in this assignment. Future research could involve using these strategies in other course
assignments requiring group work to further evaluate learning outcomes and capture students'
experiences. Monitoring student attrition and obtaining feedback could determine whether these types of
strategies influence student satisfaction and retention. eMeetings can serve students and
instructors alike as a valuable teaching tool, especially for simulating real-world group meetings.
Integrating group assignments with real-world requirements into large courses not only provides
students with the opportunity to apply the knowledge, skills, and experience gained throughout the
course, but also enhances key soft skills sought by employers, including problem-solving, the ability to work
in teams, communication, leadership, and time management.


References

AACSB International. (2013). Eligibility procedures and accreditation standards for business
accreditation (Adopted April 8, 2013). Retrieved from
aacsb.edu/-/media/aacsb/docs/accreditation/business/standards-and-tables/2018-business-
standards
Astin, A. W. (1984). Student involvement: A developmental theory for higher education. Journal of
College Student Personnel, 25(4), 297–308.
Bauer-Wolf, J. (2018, February 23). Overconfident students, dubious employers. Inside Higher Ed.
Washington, DC. Retrieved from https://www.insidehighered.com/news/2018/02/23/study-
students-believe-they-are-prepared-workplace-employers-disagree
Berry, R. W. (2009). Meeting the challenges of teaching large online classes: Shifting to a learner-
focus. MERLOT Journal of Online Learning and Teaching, 5(1).
http://jolt.merlot.org/vol5no1/berry_0309.htm
Betts, K. (2008). Online human touch (OHT) instruction and programming: A conceptual framework
to increase online student engagement and retention in online education, Part 1. Journal of
Online Learning and Teaching, 4(3), 399–418.
Betts, K., Kramer, R., & Gaines, L. L. (2011). Online faculty and adjuncts: Strategies for meeting
current and future demands of online education through online human touch training and
support. International Journal of Online Pedagogy and Course Design, 1, 20–38. doi:
10.4018/ijopcd.2011100102
Bitter, C., & Loney, E. (2015). Deeper learning: Improving student outcomes for college, career,
and civic life. American Institutes for Research.
Boton, E. C., & Gregory, S. (2015). Minimizing attrition in online degree courses. The Journal of
Educators Online, 12(1), 62–90.
Briggs, A. (2015, Feb.). Ten ways to overcome barriers to student engagement online. Online
Learning Consortium. Retrieved from onlinelearningconsortium.org/news_item/tenways-
overcome-barriers-student-engagement-online/
Buzzetto-More, N. (2015). Student attitudes towards the integration of YouTube in online, hybrid
and web-assisted courses: An examination of the impact of course modality on perception.
Journal of Online Learning and Teaching, 11(1), 55–73.
Bigatel, P. (2016, March 14). Student engagement strategies for the online learning environment.
Faculty Focus. Retrieved from https://www.facultyfocus.com/articles/online-
education/student-engagement-how-to-help-students-succeed-in-the-online-environment/
Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate
education. AAHE Bulletin, 39(7), 3–7.
Cohen, L., & Manion, L. (1994). Research methods in education (4th ed). Routledge.
Cranton, A., & Legge, L. H. (1978). Program evaluation in higher education. The Journal of Higher
Education, 49(5), 464–471.
Csikszentmihalyi, M. (1990). Flow: The psychology of optimal experience. Harper & Row.
Efron, S. E., & Ravid, R. (2013). Action research in education: A practical guide. Guilford Press.
Elison-Bowers, P., Sand, J., Barlow, M. R., & Wing, T. J. (2011). Strategies for managing large
online classes. The International Journal of Learning, 18(2), 57–66.
Faharani, G. O. (2003). Existence and importance of online interaction (Doctoral dissertation,
Virginia Polytechnic Institute, 2003). Retrieved from
http://scholar.lib.vt.edu/theses/available/etd-04232003 202143/unrestricted/Gohar-Farahani-
Dissertation.pdf
Framework for 21st century learning. (n.d.). Partnership for 21st-Century Learning. Retrieved from
http://www.battelleforkids.org/networks/p21/frameworks-resources
Goldie, J. (2016). Connectivism: A knowledge learning theory for the digital age? Medical Teacher,
38(10), 1064–1069. https://doi.org/10.3109/0142159X.2016.1173661
Kolb, D. A. (2014). Experiential learning: Experience as the source of learning and development.
Pearson Education Ltd.
Laux, D., Luse, A., & Mennecke, B. (2016). Collaboration, connectedness, and community: An
examination of the factors influencing student persistence in virtual communities. Computers
in Human Behavior, 57, 452–464. doi:10.1016/j.chb.2015.12.046
Lederman, D. (2018, November 7). Online Education Ascends. Inside Higher Ed. Retrieved from
https://www.insidehighered.com/digital-learning/article/2018/11/07/new-data-online-
enrollments-grow-and-share-overall-enrollment
Mehrabian, A. (1971). Silent messages. Wadsworth.
Milne, P. (2005). A model for work integrated learning: Optimizing student learning outcomes.
World Association for Cooperative Education (WACE). Retrieved January 16, 2018, from
http://www.waceinc.org/pdf/A%20Model%20for%20Work%20Integrated%20Learning%20
%20Optimizing%20Student%20Learning%20Outcomes%20-%20Milne.pdf
Mokoena, S. (2013). Engagement with and participation in online discussion forums. The Turkish
Online Journal of Educational Technology, 12(2), 97–105.
Norton, L. (2018). Action research in teaching and learning: A practical guide to conducting
pedagogical research in universities. Retrieved from https://ebookcentral.proquest.com
Palloff, R. M., & Pratt, K. (1999). Building learning communities in cyberspace: Effective strategies
for the online classroom. Jossey-Bass Inc.
Parsons, R. D., & Brown, K. S. (2002). Teacher as reflective practitioner and action researcher.
Wadsworth/Thomson Learning.
Reuters. (2017, June 15). Global e-learning market 2017 to boom $275.10 billion value by 2022 at a
CAGR of 7.5%. Orbis Research. Reuters Solutions. Retrieved from
https://www.reuters.com/brandfeatures/venture-capital/article?id=11353
Rose, K. K. (2009). Student perceptions of the use of instructor-made videos in online and face-to-
face classes. Journal of Online Learning and Teaching, 5(3), 8. Retrieved from
http://jolt.merlot.org/vol5no3/rose_0909.htm
Rose, R., & Smith, A. (2007). Online discussions. In C. Cavanaugh & R. Blomeyer (Eds.),
What works in K–12 online learning (pp. 143–160). International Society for Technology in
Education.
Spencer, J. (2017). Making learning flow. Solution Tree Press.
Stanford-Bowers, D. (2008). Persistence in online classes: A study of perceptions among
stakeholders. MERLOT Journal of Online Learning and Teaching, 4, 37–50. Retrieved from
http://jolt.merlot.org/vol4no1/stanford-bowers0308.pdf
The University of the West Indies. (2016). The University of the West Indies Strategic Plan 2012–
2017. The University Office of Planning and Development. Retrieved from
http://www.uwi.edu/sf-docs/default-source/planningdocs/UWI_Strategic_Plan_2012–
2017_Final.pdf
Thomas, L., Herbert, J., & Teras, M. (2014). A sense of belonging to enhance participation, success
and retention in online programs. The International Journal of the First Year in Higher
Education, 5(2), 69–80. doi:10.5204/intjfyhe.v5i2.233
Tinto, V. (1993). Leaving college: Rethinking the causes and cures of student attrition. University of
Chicago Press.
Toney, K. (2017, Aug 10). Five ways to keep students engaged in their online classes. Distance and
Online Learning. Retrieved from https://evolllution.com/revenue-
streams/distance_online_learning/five-ways-to-keep-students-engaged-in-their-online-
classes/
Trammell, B. A., & LaForge, C. (2017). Common challenges for instructors in large online courses:
Strategies to mitigate student and instructor frustration. Journal of Educators Online, 14(1),
1–10.
Vik, G. N. (2001). Doing more to teach teamwork than telling students to sink or swim. Business
Communication Quarterly, 64, 112–119.
Wilsman, A. (n.d.). Teaching large classes. Vanderbilt University’s Center for Teaching. Retrieved
December 18, 2018, from https://wp0.vanderbilt.edu/cft/guides-sub-pages/teaching-large-
classes/
Yong, A. (2003). Success factors in e-learning implementation. The Star in Tech, 22(19), 19.
Yu, J., & Hu, Z. (2016). Is online learning the future of education? World Economic Forum.
Retrieved August 8, 2017, from https://www.weforum.org/agenda/2016/09/is-online-
learning-the-future-of-education/

Online Learning Journal – Volume 24 Issue 1 – March 2020 5 117


Postgraduate Online Teaching in Healthcare: An Analysis of Student Perspectives

Postgraduate Online Teaching in Healthcare: An Analysis of Student Perspectives
Cuisle Forde and Silvia Gallagher
Trinity College Dublin

Abstract
The use of online learning in postgraduate teaching has increased dramatically in recent years.
Healthcare professionals can benefit from the flexibility afforded by online learning to fulfil their
continuing professional development goals. Understanding student expectations, concerns, and
experiences of such courses is crucial for the development and successful facilitation of this
education modality. The aim of this paper was to examine student perspectives of an online
postgraduate certificate in clinical exercise prescription. A set of recommendations based on these
findings was also created, which may serve to inform those involved in online education.
Students expressed their expectations and concerns about taking the course before it began, and
completed surveys on their experience after module completion. A multi-method approach using
both qualitative content analysis and quantitative survey analysis was used to analyze student
responses on the online modules in the virtual learning environment.
Students (n = 19) had a combination of academic, personal, and clinical expectations entering the
course. Concerns entering the course included the ability to reach the academic standards set by the course, due to personal circumstances or lack of academic ability; the ability to manage time and workload;
and the online nature of the course. Students felt supported throughout the course, although some
had difficulties keeping up with the workload or managing their time. Results of this study can be
used to inform the structure and coordination of online modules, in particular in the postgraduate
healthcare setting.

Keywords: online education, postgraduate, recommendations, student perspectives, student support

Forde, C., & Gallagher, S. (2020). Postgraduate online teaching in healthcare: An analysis of student perspectives. Online Learning, 24(1), 118–139. https://doi.org/10.24059/olj.v24i1.1566

Postgraduate Online Teaching in Healthcare: An Analysis of Student Perspectives


Recent years have seen an increase in the number and types of courses in higher education
being offered online (Allen et al., 2016). At postgraduate level, online teaching (or e-learning) is
particularly suited to healthcare professionals, as they are required to engage in continuing professional development (CPD), yet cite time and lack of provision of study leave as barriers to
attending traditional face-to-face classes (Haywood et al., 2013a; Haywood et al., 2013b). Many
online courses have been developed specifically for healthcare professionals (Brown & Bullock,
2014; Field, 2002; Gardner et al., 2016; Murphy et al., 2015; Wolbrink & Burns, 2012) and this
delivery method can help overcome some of the challenges for healthcare education (Ruckert et
al., 2014).
As a teaching method, online education in its various forms has been hailed for its potential to promote higher-level thinking. This stems from the fact that theory can be learned at a time that suits the student, freeing synchronous and asynchronous interactions between academics and students for debate, discussion, case scenarios, and problem-solving (Ally, 2004). In general, student and staff acceptance of such courses has been very high, with effective use of financial and time resources as well as effective learning cited as benefits of e-learning (Bergold et al., 2013; Fisher, 2015; Gummesson & Nordmark, 2012; Mącznik et al., 2015).
However, healthcare students have varied perceptions of using information and communication technology in relation to education (Costello et al., 2014), and incorporating new pedagogical models can challenge the student learning experience (McDonald et al., 2014). Understanding student perceptions as online learners in health sciences education can help health science educators address students’ concerns and expectations, tailor the online modules or information imparted accordingly, and, as previous research has shown, build more effective online courses (Howland & Moore, 2002; Song et al., 2004). It can also provide evidence to the wider discipline on the concerns, needs, and expectations of healthcare professionals
undertaking further education in this space. This can support the development of more online
courses and help guide educational and professional institutions in future efforts.
The purpose of this paper was to expand understanding of student perspectives in this field,
specifically in health science courses. The main research questions explored in this study were:
1. What expectations do healthcare students have prior to starting a postgraduate online course?
2. What concerns do healthcare students have prior to starting a postgraduate online course?
3. What perceptions and experiences do healthcare students have during and after modules
on a postgraduate online course?
Finally, so that the information gathered as part of this study would be of most use, the authors aimed to develop a set of recommendations for online educators to serve as a guide for online course development and facilitation.
Background on the Course and Aims of this Study
The Postgraduate Certificate in Clinical Exercise was delivered online over one academic year via four modules with a total of 27 teaching weeks. Teaching included weekly asynchronous lectures (interactive slides with a voice-over), weekly synchronous tutorials (webinars), self-directed reading, discussion board posts that were moderated by academic staff, reflective journal entries, and multiple-choice questions. Assessment consisted of essays, case scenarios, engagement with online material (equivalent to attendance), and multiple-choice questions. Each module had the same week-by-week structure, whereby students began the week with an asynchronous lecture followed by multiple-choice questions. Students were then required to carry out a task (e.g., write a reflective journal entry), and the week ended with a synchronous webinar that addressed any issues with the material presented during the week and encouraged discussion and debate on the topic at hand. The length of each of the four modules depended on the credits attributed to it (in line with the European Credit Transfer and
Accumulation System) and ranged in length from four to 10 weeks. The course had a total of
27 active teaching weeks, one revision week, and three exam weeks. There was a four-week break
for Christmas holidays. Students were expected to log in during each active teaching week. In
order to gain marks equivalent to attendance, students had to be present during the live webinar
and to have engaged with the asynchronous material during the week preceding the live webinar.
The structure of the certificate was determined by staff developing and teaching on the
course in collaboration with the online education team at the university. It was considered
imperative during the development of the course that there would be both asynchronous and live
components and that the students would be required to contribute to the course in the form of a
blog post or reflective journal entry during each active teaching week. The main difference
between these online learning tools is that a blog post is shared with all members of the class
whereas a reflective journal entry is between the academic and student and is not shared with the
class. It was hoped that this structure would encourage continuous engagement with the course material and a high level of learning.
Twenty students registered for the Online Postgraduate Certificate in Clinical Exercise and
seventeen students completed the course in 2016. All three students who left the course did so
during the first module. The average student age was 35 (SD = 8) years; the youngest was aged 24 and the oldest 50. Twelve female and eight male students registered for the course. Of these students, 12 were physiotherapists, four were nurses, and four were other allied health scientists. All students had English as their mother tongue or had completed a higher intermediate English language exam within the previous 2 years. The majority of the students (18) lived on the island of Ireland (where the course was hosted) and were working full time in clinical
positions.
As part of the course orientation, students were asked to write a short introduction about themselves, to contribute to a discussion board about their expectations of the course, and to write a journal entry about their apprehensions or concerns about taking the course. After each module, students were asked to complete a simple feedback survey with five Likert-style questions and one open-ended question, which served to assess the basic structure of the online modules and whether each module succeeded in meeting the learning objectives.

Methods
This study used both qualitative content analysis and quantitative survey analysis to
analyze student contributions in the virtual learning environment (VLE) and post-module surveys.
Content analysis is a method for analyzing written, verbal, or visual communication messages
(Hsieh & Shannon, 2005). Its purpose is to produce a condensed description of a phenomenon and
to generate concepts used for theoretical categorization (Elo & Kyngas, 2008). Qualitative data, which included open-ended comments and suggestions from the feedback surveys as well as relevant discussion posts and reflective journal entries, were analyzed using NVivo 11 software (see Table 1 for data sources). This software aids qualitative analysis of large amounts of text-based data and has been successfully used to support similar research (Anaf & Sheppard, 2010; Lefmann & Sheppard, 2014; Moore et al., 2003). Text can be coded with themes, and the software also provides a useful audit trail of the analysis in which temporal changes in categorical interpretation and coding can be seen. Thus, NVivo 11 provided the framework for analysis and consolidation of themes. For the qualitative analysis, an inductive method was employed. Initially, several broad
themes were identified relating to our research questions (student expectations and concerns); on further synthesis, these were consolidated into main themes and subthemes that led to a more concise view of students’ experiences. A reflective, iterative approach by two researchers facilitated this process. All coding was completed in NVivo 11 and grouped into two categories: “pre-course” and “during course.” Codes were compared within the NVivo software and any differences were resolved through consensus discussion. Any remaining differences were resolved through discussion with a third researcher not directly involved in the study.
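For readers who want a concrete picture of the tallying step that produces the per-theme reference counts (the n values reported in Table 2), the minimal Python sketch below is offered purely as an illustration; the study itself carried out coding and counting in NVivo 11, and the coded references shown here are invented.

```python
# Hypothetical illustration only: the study performed its coding and
# tallying in NVivo 11. This sketch shows how coded references could be
# counted by temporal category and theme; all example data are invented.
from collections import Counter, defaultdict

# Each tuple is one coded reference: (temporal category, theme).
coded_references = [
    ("pre-course", "Learning"),
    ("pre-course", "Learning"),
    ("pre-course", "Achievements"),
    ("during course", "Time or workload"),
]

tallies = defaultdict(Counter)
for category, theme in coded_references:
    tallies[category][theme] += 1

for category, counts in tallies.items():
    print(category)
    for theme, n in counts.most_common():
        print(f"  {theme}: n = {n}")
```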
All contributions from students were collected after the course was completed and final
marks had been awarded. All students provided written informed consent. Ethical approval for this
study was granted by the Trinity College Medical Research Committee. Individuals could not be
associated with any information given, and this study was designed retrospectively.

Results
Table 1
Data Sources for Analysis

Course period | Question posed | Resource type | Number of codable items | Number of words
Pre-course | What are you most looking forward to with regard to the course? | Blog | 15 | 2,060
Pre-course | What are your concerns surrounding taking this course? | Learning journal | 15 | 1,817
Pre-course | Hello Class Thread | Discussion board | 29 | 2,740
During course | Open-ended questions at end of each module survey | Survey | 28 | 954

A summary of the themes and subthemes that emerged from the qualitative data analyses is outlined in Table 2.

Table 2
Themes and Subthemes Emerging from Analyzed Qualitative Data

Temporal period | Theme | Subthemes
Pre-course (Expectations) | Learning (n = 87) | Improve exercise prescription skills (n = 33); New knowledge (n = 19); Broaden or deepen knowledge (n = 8); Develop evidence-based knowledge (n = 14); Multidisciplinary learning (n = 6)
Pre-course (Expectations) | Achievements (n = 34) | Complete course (n = 1); Connecting with others (n = 19); Improve confidence (n = 8)
Pre-course (Expectations) | Change in clinical practice (n = 50) | Patient benefit (n = 14); Career benefit (n = 13)
Pre-course (Expectations) | Looking forward (n = 11) | —
Pre-course (Expectations) | Studying online (n = 10) | —
Pre-course (Concerns) | Personal (n = 14) | Personal circumstances (n = 9); Communication (n = 4)
Pre-course (Concerns) | Time and workload (n = 11) | —
Pre-course (Concerns) | Academic (n = 11) | Personal ability (n = 10); Lack of evidence base (n = 1)
Pre-course (Concerns) | Technology (n = 10) | Internet (n = 7); Online learning (n = 3)
During course implementation | Survey feedback after completion of each module (positive) | Online nature of course; Support; Relevance of material covered
During course implementation | Survey feedback after completion of each module (negative) | Time or workload; Online nature of course; Practical classes required

Note: Only subthemes with the largest number of coded entries have been included under each theme, due to space constraints and to provide a concise representation of the data analysis.


Pre-course Expectations.
The following themes emerged for pre-course expectations: learning, achievement, change in clinical practice, looking forward, and studying online.
Learning. Within the learning theme, the identified subthemes were to improve exercise prescription skills, to acquire new knowledge, to broaden or deepen knowledge, to develop evidence-based knowledge, and to learn from the multidisciplinary student base. Detailed analysis
of this theme revealed a focus on learning exercise prescription as a skill through the acquisition
of knowledge. The subtheme of improving exercise prescription skills was often cited with reference to clinical populations, with students expressing a lack of confidence in this area. This
finding indicates that the students did not feel confident prescribing exercise to certain patient
populations.
My main interest and expectation surrounding the course would be to develop knowledge
and confidence regarding the prescription of exercise for the prevention and treatment of
chronic diseases.
The second-most prominent subtheme was to acquire new information. Students expressed
an expectation of learning about exercise prescription in clinical populations that could be helpful
in their clinical settings. For example:
I plan to develop a specific area of my practice, namely health promotion in the over 60s,
and I believe a more in-depth understanding of the effect of exercise will help me to deliver
a quality service to my clients.
The third subtheme, broaden or deepen current knowledge, was highlighted through
students’ curiosity to learn about the new research emerging in the area of exercise prescription: “I
am hoping to learn the latest research and methods in prescribing exercise as a physiotherapy
intervention.” Results clearly show that healthcare professionals expected this course to have a
strong evidence base.
The final learning subtheme was to learn from the multidisciplinary student body. One
student remarked: “I'm looking forward to learning from the many different backgrounds and
disciplines that has already been posted here.” Research has shown that teamwork and collaboration between all health professionals are essential for high-quality patient care (Chaboyer & Patterson, 2001; McPherson et al., 2001). It is encouraging to see a desire for multidisciplinary learning extending into online opportunities for continuing professional development.
Achievement. The second-most referenced “expectations” theme was identified as
achievement. Upon commencing the online certificate, students were expecting to be able to connect with academic staff and their classmates, as well as to share their knowledge beyond the classroom environment. Furthermore, this was something that they were clearly looking forward to. Comments that highlight this theme include:
I'm looking forward to meaningful discussions with fellow course mates and clinical
educators so as to learn from their experiences and expertise
One student in particular expressed this theme very succinctly: “Can't wait to chat to you all and learn from other people's experiences.” This theme highlights the fact that students expected to interact with each other despite the online nature of the course, a format that may traditionally have been considered more isolating than face-to-face teaching.
The second subtheme relating to achievement was simply to complete the course. This theme seemed to stem from the concern of not having engaged in formal education for some years. For example, one student remarked that their expectation of themselves was to “get through the course after years working in the clinical environment.” This theme is of particular importance in postgraduate clinical education and highlights the fact that clinicians may regularly take part in different forms of continuing professional development and still perceive formal education as a significant personal challenge. Although personal barriers such as family commitments, resistance from peers, and time constraints (French & Dowds, 2008) have been identified in the literature, the challenge of returning to formal education after a long gap since completing undergraduate education has not previously been identified in this space (see O'Donnell et al. [2009] for a discussion of this theme within the social science discipline).
Change in clinical practice. The third theme to emerge from student expectations was that
of changes to clinical practice. Students expected to change their clinical practice through
introducing or improving exercise prescription. The subthemes, in turn, were benefiting patient health and developing their own careers. The caring nature of healthcare professionals was evident in
this theme with the most referenced subtheme being that of expected benefits to patient health. For
example, one student stated:
I was drawn to this postgraduate certificate as I have a special interest in rehabilitation and
believe that as clinicians we should lead the way in improving quality of life for individuals
through exercise especially for those with chronic conditions who find it more difficult and
lack the confidence to exercise safely.
Results indicate that students expected the changes they made to result in improved
outcomes for their patients. For example, one student remarked:
We know as clinicians why patients need to engage in exercise, I hope this course will
deepen my own understanding of the subject. I anticipate that in turn, this
theoretical knowledge will inform my practice and help me to have more positive outcomes
with service users. My goal is to effect long term change in the lifestyles of my patients,
thereby improving their current physical and mental health, life expectancy and prognosis
Students also expressed an expectation that the course may benefit their careers:
Due to unforeseen circumstances I was away from work for a significant time so I have
decided that undertaking this particular … online course would be an essential step forward
to refresh and regain my career ambition.
Looking forward and studying online. The final two themes which emerged from student expectations were looking forward and studying online. Students expressed positive
expectations regarding studying online: “I am excited by the online interactive setup of this post
grad certificate” and “It is my first time pursuing online education and I'm enjoying the format so
far.”
Pre-course Concerns.
Before course teaching began, students were asked to voice their concerns about the course
in their online learning journals. Four main themes emerged, which are listed in Table 2: personal
concerns, academic concerns, time management and workload, and technology.

Personal concerns. As with any student body, those taking the online Postgraduate Certificate in Clinical Exercise had concerns around the theme of balancing their personal and
academic lives. Many of the students were working full time in clinical environments and also had
family commitments.
Organisational skills with regard to all the weekly tasks is my main concern at present,
making sure that I can keep up with all the reading requirements and also keep family life
as sane as possible.
My biggest concern about taking on this post graduate study is that it will take away from
quality time with my 14 month old.
Several students also expressed concern over being able to express themselves and their
opinions concisely.
Another concern that I have is around my reflective writing skills. I am very good at
reflecting in my head and verbally but have struggled in the past to get it down on paper
succinctly. I feel that as a result I have lost some confidence in this area so this will be a
good challenge for me.
Academic concerns. Students expressed concern about engaging in an academic course.
Some had not been engaged with formal education for several years:
However, regarding my initial concerns—the main one is that it has been over 10 years
since I have had to do any scientific or academic writing, and I am nervous about my ability
to research and access material, and reference it accurately.
This is similar to the theme of achievement, where students expressed a wish to complete the course. On further examination of this theme, academic writing in particular was identified by students as a concern.
The second subtheme with regard to academic concerns was expressed by only one student; however, it is important to note. The student in question was concerned that the body of
evidence and learning material available would not meet their academic needs. This reflects the
fact that the area of clinical exercise prescription is a complex and ever-evolving one.
My main concern is that I am expecting that their [sic] will be research that will allow my
[sic] to prescribe exercise intensities based on specific physiological processes, or
biological changes that in turn decrease the patients symptoms. I am in some way expecting
that these levels will be physiologically measurable and patient specific. I think that I will
find that the research is lacking in this area, and will ultimately be disappointed by this.
Unfortunately, there are frequently times when the research to date is not capable of
answering specific clinical questions. This student’s concern highlights the fact that we do not yet have all the answers in the area of exercise prescription, but that in a learning environment we can share our concerns and questions and strive to make informed, evidence-based decisions in clinical care with the available resources.
Time management and workload. The third main theme to arise from students’ concerns
is related to, but distinct from, the first two. Students were concerned about their ability to manage their time and meet the workload expected of the Postgraduate Certificate in Clinical Exercise. Although this was not the most referenced concern, it was cited as the main concern by some students, which emphasizes its importance. For example: “The main concern I have for this course
is the amount of time and dedication it will take to achieve a high mark” and “My main challenge
will be managing my time effectively.”
Technology. The final theme to emerge was that of technology. Some students were
concerned about their internet connection, while others expressed some apprehension about the
online nature of the course and their ability to engage with a course delivered entirely on an online
platform. Ireland has a large rural community. One advantage of online learning is that students
don’t have to make long journeys to cities to avail of learning opportunities. However, there are
also potential shortcomings related to connectivity: “There have been some teething problems,
namely internet access, as I live in a very rural area with no high speed or fibre optic broadband
connection.” In healthcare, learning as part of a group has been shown to have significant
advantages (McPherson et al., 2001); whether this could be achieved through an online format was
a source of concern for one student:
While I am enjoying the online format, it will be unusual to not be in a physical classroom
with real live classmates. The Collaborate forum is surprisingly personal though, and I do
think we will get to know each other as the year goes on.
Learning online is a skill in itself, and therefore ample orientation is required, especially for those
who are concerned about their ability to navigate an online learning platform or those who may
feel uncomfortable contributing to one. This concern was expressed by some students:
While I was somewhat apprehensive starting an online course, it had more to do with my
previous experience than the fact I am not techie enough for it all to go smoothly. So far,
things have gone great and I have managed to be introduced to the online platform and
even engage, as I am now doing.
During course: Feedback.
Table 3 outlines the results of the quantitative survey analysis. Overall, results show that students considered that the topics covered during the online modules met learning objectives and learning outcomes to a good or very good standard. As a group, students considered the number of lectures “just right.” The vast majority of students, approximately 90% of survey respondents, rated the organization of lectures and module timetables as organized or very organized, and over 90% felt that the learning materials provided were either good or very good. Overall, modules were rated as either good or very good.
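As a purely illustrative aside (not drawn from the paper's materials), the short Python sketch below shows one way an “Average overall (%)” figure such as those in Table 3 could be derived: for each response option, the percentage of respondents choosing it is computed per module survey and then averaged across the four modules; all counts are invented.

```python
# Hypothetical sketch: averaging per-module response percentages into an
# overall percentage per response option, as in Table 3's final column.
# The per-module counts below are invented for illustration.
per_module_counts = [
    {"Very good": 10, "Good": 7, "Average": 0},  # module 1 survey
    {"Very good": 11, "Good": 6, "Average": 1},  # module 2 survey
    {"Very good": 9, "Good": 8, "Average": 0},   # module 3 survey
    {"Very good": 10, "Good": 6, "Average": 1},  # module 4 survey
]

options = ["Very good", "Good", "Average"]
averages = {}
for option in options:
    percentages = [
        100 * counts.get(option, 0) / sum(counts.values())
        for counts in per_module_counts
    ]
    averages[option] = sum(percentages) / len(percentages)

for option, pct in averages.items():
    print(f"{option}: {pct:.0f}%")
```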
The final question on the survey was open-ended and simply asked students to contribute
any comments or suggestions related to the module. These answers were analyzed for all modules
and results are categorized into positive and negative comments.

Table 3
Results of Post-Module Surveys

Question and response options | Average overall (%)
Q1: To what standard did the topics covered meet the objectives and learning outcomes stated in the course handbook?
Very good | 58
Good | 40
Average | 0
Poor | 0
Very poor | 0
I am unaware of the learning outcomes in the handbook | 0
Unanswered | 2
Q2: Was the number of lectures sufficient to meet the course objectives and learning outcomes?
Too many | 3
Just right | 80
Too few | 17
I am unaware of the learning outcomes in the handbook | 0
Unanswered | 0
Q3: How do you rate the organisation of lectures and timetabling for this module, including availability of resources on Blackboard?
Very organised | 37
Organised | 53
Average | 11
Not organised | 0
Very poorly organised | 0
Unanswered | 0
Q4: How do you rate the learning material (from presentations to webinars) provided during the module?
Very good | 51
Good | 44
Average | 4
Poor | 0
Very poor | 0
Unanswered | 0
Q5: How would you evaluate the overall module?
Very good | 58
Good | 39
Average | 0
Poor | 3
Very poor | 0
Unanswered | 0

Positive themes. Subthemes that emerged from positive comments related to the relevance or applicability of the learning material to students’ clinical environments and to personal interest.
For example: “All in all this module was highly informative and topics for example re limiting
sedentary behaviour were adaptable in most situations for us as healthcare professionals, even
pregnancy!” and “I'm looking forward to reading those articles for personal interest and guidance.”
Online delivery. Students also commended the online nature of the course and its
organization with comments such as: “Excellent module well presented and organised” and
I have to say there is massive benefits to education when you don't have to get into a car or
public transport and race to … your lectures. I am really enjoying this element of flexibility.
So I can just log off right now and finish doing what I was earlier - technology is great!
Quality of online learning content. The quality of the online content was also highlighted
as a strength of the course: “The online material provided is excellent.”
Support. The final positive theme to emerge from the anonymous survey was that of
support. Students felt engaged and supported throughout their learning journey online. This is
reflected in comments such as: “Thanks for the motivation everyone in posting all your
discussions, I was thinking that i wouldn't have a chance to complete these tasks this week but you
all spurred me on!!” and “I am really enjoying the content and engagement with lecturers and other
classmates.”
Negative themes. Negative comments were also collated. Subthemes that emerged echoed
some of the main concerns expressed by students before starting the course. The most referenced
negative comment related to time management and workload.
Time management and workload. Some students felt that the workload was too much: “I
have found the amount of material to pre-read and the level of exercises in the pre read a lot along
with the assessments which seem to leave little time to breath with doing ordinary work and life”
and “The workload was very heavy…All very relevant and interesting but hard to keep up with
everything” while one felt that more learning material was needed: “I actually expected the lectures
to be more in depth in terms of physiology and how it affects exercise.”
Online delivery. There were also negative comments regarding the online nature of the
course, in particular the lack of any practical teaching. One student remarked: “A disadvantage of
the online format was evident for this module as a session in a lab or gym would make [sic] facilitate
better understanding of the concepts” while another stated that “Some practical sessions would be
invaluable.” A suggestion on how to address this issue was also provided with one student
commenting that “… one weekend module where one can see exercise testing and a few different
types of exercise prescription in action would be a great addition and ideally a follow on practical
module :).”
Technical difficulties. Finally, one student had technical difficulties that resulted in a
negative experience and that were considered to be a problem specific to online learning as it
would not have happened in a face-to-face situation:
I have been very unfortunate to have my MCQ crash twice during this module which was
really very upsetting at the time when I was all geared to take the exam and on both
occasions [sic] disrupted my work day. This would not happen in a sit down exam.

Limitations
A notable limitation of this study is that only one course was analyzed. It is possible that
courses of a different nature would require different student supports and result in different online
student experiences. Data presented in this paper relate to a postgraduate health sciences course and therefore may not be applicable to undergraduate or other postgraduate courses/students. However, although the data were taken from a specific course, case studies such as these can enrich the literature on healthcare student experiences in a virtual learning environment, help improve teaching methods, and ultimately ameliorate student experiences. Another limitation of this paper is that although post-module feedback was gathered, no detail on the overall course experience was gathered from students after completion of the entire course.

Discussion
Recent years have seen a surge in the number of online courses offered to healthcare professionals. Courses offered online have the advantage of giving clinicians more flexibility in reaching their continuing professional development goals and have been reported as successfully leading to improvements in both knowledge and skill (Bello et al., 2005; Hopper & Johns, 2007; Hugenholtz et al., 2008; Rohwer et al., 2013). This paper details the expectations, concerns, and experiences of 19 students enrolled on an online Postgraduate Certificate in Clinical Exercise.
Ten general recommendations are proposed as supplemental material to this paper based
on the analysis carried out and the experience of the researchers involved. Understanding students’
expectations and concerns as they begin an online course can greatly help academic staff to tailor
the learning experience in a way that supports students in reaching their learning goals.
The most prominent theme to emerge regarding students’ expectations of the course was that of learning and gaining or deepening knowledge. This highlights the fact that healthcare professionals may not feel confident in exercise prescription (Hayes et al., 2009; Heath & Stuart, 2002), despite a large body of evidence showing exercise to be an effective treatment method for many clinical conditions that healthcare professionals may encounter on a daily basis. The gap in knowledge expressed by students may stem from the fact that much of the evidence in this area is new. It is likely that when many of the students on this course graduated, the role of exercise as a treatment tool for those with cancer, depression, non-communicable diseases, and other such conditions was not part of their curriculum. This is not a fault of any curriculum per se but highlights the need for continuing professional development in areas where research is evolving rapidly. This finding also points to the opportunity for online courses to keep professionals up to date on evolving research in their area.
What was perhaps most interesting about the student expectations was that students expected the knowledge gained to lead to an improvement in their prescription skills.
It has been shown that fully online courses imparting knowledge can lead to an improvement in
clinician skills (Bello et al., 2005; Edrich et al., 2016; Rohwer et al., 2013).
It is important for academic staff to be aware of concerns that students have as they engage
in further education. The flexibility afforded by online learning enables students who would be
unable to attend traditional face-to-face teaching to engage in formal and informal education.
However, the physical ability to log on and participate in a course does not remove other barriers
to further education, such as family and work commitments (Muilenburg & Berge, 2005; Sorensen
& Donovan, 2017). This was evident from the results of the current study where students were
positive about the online nature of the course but concerned about their ability to fully engage with
and complete the course due to their personal commitments.
High dropout rates are often reported in online courses (Bawa, 2016). The reasons for such
high dropout rates are often unclear since the students of interest are those who are no longer
engaging with the course (Fetzner, 2013). However, results of this study give some insight into the
concerns and difficulties expressed by online learners. Students were concerned about the
workload and time commitment that the course would require. In line with this concern, some
students reported in the post-module surveys that the course had a heavy workload. This issue may
be indirectly related to the online nature of the course: its flexibility allowed students to continue working clinically full time, whereas with face-to-face teaching they would likely have had to reduce their working hours to attend classes during the working day.
Another concern was students’ ability to express themselves. This concern was at times
linked to the fact that the course was online. Some students were anxious about their ability to
navigate and engage with the online platform. A greater concern, however, seemed to be the ability
to meet academic standards. This stemmed from the fact that many students had not taken part in
formal education for a number of years. It is possible that the online nature of this course finally
afforded them an opportunity to do so.
Results of the post-module surveys showed that students highly rated the teaching resources and found them to be relevant and informative. All four modules were rated as good or very good. Learning material seemed to appeal to students’ clinical/professional as well as their personal interests. Since the majority of students would have been working in a specific clinical area (e.g., mental health, cardiovascular medicine, or a community setting), it is reassuring that material which may not have been directly related to their speciality was viewed positively and
enjoyed. The online nature of the course was also received positively.
Feedback from the post-module surveys also revealed limitations to this course. Most notably, students expressed a desire for a practical teaching session. This finding highlights a limitation of online learning in the area of clinical exercise and has previously been expressed by students studying anatomy and physiotherapy-specific courses online (Harvey et al., 2014; Swinnerton et al., 2017). Previous research has shown no difference in the course results of students who engage with course material online compared to those who engage via the traditional face-to-face method (Bello et al., 2005; Cook & Steinert, 2013; Edrich et al., 2016; Matzie et al., 2010; Pourmand et al., 2013). However, studies have also shown that a blended learning approach in which both methods are used could yield even better results (Edirippulige et al., 2012; Eksteen, 2011). The examination of a blended learning approach was beyond the scope of this study.
Many, but not all, courses involving the acquisition of skills have taken a blended learning
approach, combining traditional and e-learning methodologies in order to optimize face-to-face time for imparting skills. While the literature would suggest that blended learning is effective at
undergraduate level, students at postgraduate level can be successful at enhancing both their
knowledge and practical skills through online learning alone (Rohwer et al., 2013). Few studies
have examined the efficacy of teaching clinical skills through e-learning alone. Of those that have,
Edrich et al. concluded that web-based training was just as effective as traditional methods at
teaching anaesthesiologists lung ultrasound skills (Edrich et al., 2016), and Roesch et al. advocated
a computer-assisted learning program for the provision of both theoretical biomedical knowledge and clinical skills in the area of dermatology (Roesch et al., 2003). Other studies have shown positive results from teaching airway management (Bello et al., 2005) and paediatric rheumatology (Manners & Guttinger, 2013) fully online.
Investigating the concerns, expectations, and experience of a cohort of online postgraduate
students in clinical exercise has contributed valuable knowledge to the discipline. Understanding
key themes can support future development of online modules in this space and adds to the body of literature on online learning within the health sciences. It appears that, overall, students were highly appreciative of the learning resources available to them in an easily accessible and flexible format, the quality of teaching, the support received from peers as well as teaching staff, and
the relevance of the material presented to their clinical settings and learning goals. The online
nature of the course was perceived as a challenge by some students and was not without its limitations.
However, students also saw it as an advantage and as something which enabled them to reach their
learning goals despite work and family commitments.
Overall, analysis of the findings presented in this paper provides evidence for the success of teaching clinical exercise online. However, it should also be noted that the workload may be perceived as heavy by students who choose to continue working full time, and there may be a need to supplement online learning in practical subjects with face-to-face practical teaching sessions. Online
learning results in student expectations and concerns that are unique to the VLE. The evidence-
based recommendations provided as supplemental material to this paper may help online clinical
educators and students to maximize the success of their teaching and learning experiences,
respectively.

References
Allen, E., Seaman, J., Poulin, R., & Taylor Straut, T. (2016). Online report card: Tracking
online education in the United States. Babson Survey Research Group and Quahog Research
Group. http://onlinelearningsurvey.com/reports/onlinereportcard.pdf
Ally, M. (2004). Foundations of educational theory for online learning. Athabasca University
Press. http://stoa.usp.br/ewout/files/1073/6047/TerryAndersonEntireBook.pdf.
Anaf, S., & Sheppard, L. (2010). Lost in translation? How patients perceive the extended scope
of physiotherapy in the emergency department. Physiotherapy, 96(2), 160-168.
Bawa, P. (2016). Retention in online courses: Exploring issues and solutions—A literature review. SAGE Open, 6(1). https://doi.org/10.1177/2158244015621777
Bello, G., Pennisi, M. A., Maviglia, R., Maggiore, S. M., Bocci, M. G., Montini, L., & Antonelli,
M. (2005). Online vs live methods for teaching difficult airway management to
anesthesiology residents. Intensive Care Medicine, 31(4), 547–552.
Bergold, M., Strametz, R., Weinbrenner, S., Khan, K. S., Zamora, J., Moll, P., & Weberschock,
T. (2013). Evidence-based Medicine online for young doctors–a randomised controlled
trial. Zeitschrift für Evidenz, Fortbildung und Qualität im Gesundheitswesen, 107(1), 36–43.
Blundell, A., Gordon, A. L., Masud, T., & Gladman, J. (2011). Innovations in teaching
undergraduates about geriatric medicine and ageing–results from the UK National Survey of
Teaching in Ageing and Geriatric Medicine. European Geriatric Medicine, 2(1), 12–14.
Brown, M., & Bullock, A. (2014). Evaluating PLATO: Postgraduate teaching and learning
online. The Clinical Teacher, 11(1), 10–14.
Chaboyer, W. P., & Patterson, E. (2001). Australian hospital generalist and critical care nurses'
perceptions of doctor–nurse collaboration. Nursing & Health Sciences, 3(2), 73–79.
Cook, D. A., & Steinert, Y. (2013). Online learning for faculty development: A review of the
literature. Medical Teacher, 35(11), 930–937.
Costello, E., Corcoran, M. A., Barnett, J. S., Birkmeier, M. C., Cohn, R., Ekmekci, O., ... &
Walker, B. (2014). Information and communication technology to facilitate learning for
students in the health professions: Current uses, gaps, and future directions. Online learning:
Official Journal of the Online Learning Consortium, 18(4). doi:10.24059/olj.v18i4.512
Dennis, J. M., Phinney, J. S., & Chuateco, L. I. (2005). The role of motivation, parental support, and peer support in the academic success of ethnic minority first-generation college students. Journal of College Student Development, 46(3), 223–236. doi:10.1353/csd.2005.0023
Edirippulige, S., Smith, A. C., Armfield, N. R., Bensink, M., & Wootton, R. (2012). Student
perceptions of a hands-on practicum to supplement an online eHealth course. Journal of
Medical Internet Research, 14(6). doi:10.2196/jmir.2029
Edrich, T., Stopfkuchen-Evans, M., Scheiermann, P., Heim, M., Chan, W., Stone, M. B., ... &
Szabo, A. L. (2016). A comparison of web-based with traditional classroom-based training
of lung ultrasound for the exclusion of pneumothorax. Anesthesia & Analgesia, 123(1), 123–
128.

Eksteen, C. (2011). Blended teaching strategies in physiotherapy to optimize variety learning in students. Physiotherapy, 97(Suppl.), eS1476.
Elo, S., & Kyngäs, H. (2008). The qualitative content analysis process. Journal of Advanced
Nursing, 62(1), 107–115.
Fetzner, M. (2013). What do unsuccessful online students want us to know? Online Learning, 17(1). doi:10.24059/olj.v17i1.319
Field, T. (2002). Internet-based education for enrolled nurses: Could it be e-ffective? Australian Journal of Advanced Nursing, 19(4), 33.
French, H. P., & Dowds, J. (2008). An overview of continuing professional development in
physiotherapy. Physiotherapy, 94(3), 190–197.
Gardner, P., Slater, H., Jordan, J. E., Fary, R. E., Chua, J., & Briggs, A. M. (2016).
Physiotherapy students’ perspectives of online e-learning for interdisciplinary management
of chronic health conditions: a qualitative study. BMC Medical Education, 16(1), 62.
Salmon, G. (2013). E-tivities: The key to active online learning. Routledge.
Gummesson, C., & Nordmark, E. (2012). Self-reflections in an online course—Reflecting
learning strategies? Advances in Physiotherapy, 14(2), 87–93.
Harvey, L. A., Glinsky, J. V., Lowe, R., & Lowe, T. (2014). A massive open online course for
teaching physiotherapy students and physiotherapists about spinal cord injuries. Spinal
Cord, 52(12), 911.
Hayes, S. C., Spence, R. R., Galvão, D. A., & Newton, R. U. (2009). Australian Association for
Exercise and Sport Science position stand: Optimising cancer outcomes through
exercise. Journal of Science and Medicine in Sport, 12(4), 428–434.
Haywood, H., Pain, H., Ryan, S., & Adams, J. (2013a). The continuing professional
development for nurses and allied health professionals working within musculoskeletal
services: a national UK survey. Musculoskeletal Care, 11(2), 63–70.
Haywood, H., Pain, H., Ryan, S., & Adams, J. (2013b). Continuing professional development:
Issues raised by nurses and allied health professionals working in musculoskeletal
settings. Musculoskeletal Care, 11(3), 136–144.
Heath, J. M., & Stuart, M. R. (2002). Prescribing exercise for frail elders. The Journal of the
American Board of Family Practice, 15(3), 218–228.
Hopper, K. B., & Johns, C. L. (2007). Educational technology integration and distance learning
in respiratory care: practices and attitudes. Respiratory Care, 52(11), 1510–1524.
Howland, J. L., & Moore, J. L. (2002). Student perceptions as distance learners in Internet-based
courses. Distance Education, 23(2), 183–195.
Hsieh, H. F., & Shannon, S. E. (2005). Three approaches to qualitative content
analysis. Qualitative Health Research, 15(9), 1277–1288.
Hugenholtz, N. I., De Croon, E. M., Smits, P. B., Van Dijk, F. J., & Nieuwenhuijsen, K. (2008).
Effectiveness of e-learning in continuing medical education for occupational
physicians. Occupational Medicine, 58(5), 370–372.

Lefmann, S. A., & Sheppard, L. A. (2014). Perceptions of emergency department staff of the role
of physiotherapists in the system: a qualitative investigation. Physiotherapy, 100(1), 86–91.
Mącznik, A. K., Ribeiro, D. C., & Baxter, G. D. (2015). Online technology use in physiotherapy
teaching and learning: a systematic review of effectiveness and users’ perceptions. BMC
Medical Education, 15(1), 160.
Manners, P., & Guttinger, R. (2013). FRI0333 The online graduate certificate of paediatric
rheumatology UWA (grad cert prheum): An evaluation. Annals of the Rheumatic Diseases,
71, 426–427.
Matzie, K. A., Philbrook, L., Mitani, A., Lipsitz, S., Gerhard-Herman, M., Pozner, C., & Frendl,
G. (2010). Comparison of web-based and classroom-based training programs for point-of
care, real-time ultrasound-guided central venous catheter placement. American Journal of
Respiratory and Critical Care Medicine, 181. http://doi.org/10.1164/ajrccm-
conference.2010.181.1_MeetingAbstracts.A5535
McDonald, P. L., Lyons, L. B., Straker, H. O., Barnett, J. S., Schlumpf, K. S., Cotton, L., &
Corcoran, M. A. (2014). Educational mixology: A pedagogical approach to promoting
adoption of technology to support new learning models in health science disciplines. Online
Learning: Official Journal of the Online Learning Consortium, 18(4).
doi:10.24059/olj.v18i4.514
McPherson, K., Headrick, L., & Moss, F. (2001). Working and learning together: Good quality
care depends on it, but how can we achieve it? BMJ Quality & Safety, 10(suppl 2), ii46–
ii53.
Moore, A., Morris, J., Crouch, V., & Martin, M. (2003). Evaluation of physiotherapy clinical educational models: Comparing 1:1, 2:1 and 3:1 placements. Physiotherapy, 89(8), 489–501.
Muilenburg, L. Y., & Berge, Z. L. (2005). Student barriers to online learning: A factor analytic
study. Distance Education, 26(1), 29–48.
Murphy, J., Worswick, L., Pulman, A., Ford, G., & Jeffery, J. (2015). Translating research into
practice: Evaluation of an e-learning resource for healthcare professionals to provide
nutrition advice and support for cancer survivors. Nurse Education Today, 35(1), 271–276.
O'Donnell, V. L., Tobbell, J., Lawthom, R., & Zammit, M. (2009). Transition to postgraduate
study: Practice, participation and the widening participation agenda. Active Learning in
Higher Education, 10(1), 26–40.
Pourmand, A., Lucas, R., & Nouraie, M. (2013). Asynchronous web-based learning, a practical
method to enhance teaching in emergency medicine. Telemedicine and e-Health, 19(3),
169–172.
Roesch, A., Gruber, H., Hawelka, B., Hamm, H., Arnold, N., Popal, H., ... & Stolz, W. (2003).
Computer assisted learning in medicine: A long-term evaluation of the “Practical Training
Programme Dermatology 2000.” Medical Informatics and the Internet in medicine, 28(3),
147–159.

Rohwer, A., Young, T., & Van Schalkwyk, S. (2013). Effective or just practical? An evaluation
of an online postgraduate module on evidence-based medicine (EBM). BMC Medical
Education, 13(1), 77.
Ruckert, E., McDonald, P. L., Birkmeier, M., Walker, B., Cotton, L., Lyons, L. B., ... & Plack, M. M. (2014). Using technology to promote active and social learning experiences in health professions education. Online Learning, 18(4). http://dx.doi.org/10.24059/olj.v18i4.515
Song, L., Singleton, E. S., Hill, J. R., & Koh, M. H. (2004). Improving online learning: Student perceptions of useful and challenging characteristics. The Internet and Higher Education, 7(1), 59–70.
Sorensen, C. & Donovan, J. (2017). An examination of factors that impact the retention of online
students at a for-profit university. Online Learning, 21(3), 206–221.
Swinnerton, B. J., Morris, N. P., Hotchkiss, S., & Pickering, J. D. (2017). The integration of an
anatomy massive open online course (MOOC) into a medical anatomy
curriculum. Anatomical Sciences Education, 10(1), 53–67.
Tan, S., Ladyshewsky, R., & Gardner, P. (2010). Using blogging to promote clinical reasoning
and metacognition in undergraduate physiotherapy fieldwork programs. Australasian
Journal of Educational Technology, 26(3), 355–368.
Wolbrink, T. A., & Burns, J. P. (2012). Internet-based learning and applications for critical care
medicine. Journal of Intensive Care Medicine, 27(5), 322–332.

Appendix A: Recommendations
Each recommendation below was informed by results derived from this study; however, the recommendations have been developed to reflect general situations in an attempt to be useful to those teaching in other subject areas and with other student bodies.
Recommendation Explanation

1: Inter-professional learning should be encouraged between students
In this study, students had positive expectations about learning from each other, especially from
those in other health care disciplines. In medical education, inter-professional learning is
encouraged at all levels. The results of this paper suggest that qualified professionals expect to
be able to learn from each other, even in an online format, and are positive about sharing
knowledge across disciplines. Students also expressed a desire to get to know one another. In this
course, students were provided with discussion boards, blogs, and virtual rooms, which they were
encouraged to use as informal spaces to discuss course material and share views.

2: Do not limit learning outcomes to the acquisition of knowledge; strive to encourage students to apply this knowledge and develop practical skills
The expectation that the knowledge gained would lead directly to skill acquisition was
interesting, since the course offered was fully online. This shows that students expect to directly
apply knowledge gained in their workplace. This finding is encouraging and supports previous
research which has demonstrated the ability of health care professionals to acquire practical
skills through online learning. This may be particularly relevant at postgraduate level, where
students already possess a clinical skill set and interact with patients on a daily basis.

3: Support evidence-based learning with digital resources and online orientation of web-based learning materials
In line with best practice, it is advised that online courses are evidence based. The results of
this study indicated that students expected the information presented to be heavily evidence
based. It can be argued that the online space is well equipped to present information in an
evidence-based fashion: active links can be provided to reference texts, ensuring students have
easy and quick access to relevant sources of information. From the authors' experience running
this course, it is recommended that students are provided with digital resources, including
journal articles and e-books, and that this material is easily accessible within the course
learning platform. It is also recommended that students are familiarised with the electronic
databases and how to use them, as well as with electronic libraries (where available), prior to
commencing the course. Despite the majority of students living within easy reach of the college's
physical library, anecdotal evidence suggests that digital resources were used by all students as
primary resources, over the physical library. All students were obliged to attend an hour-long
orientation of the electronic library prior to commencing the course, during which lead academics
demonstrated how to use the electronic databases and the college's electronic library. It is
recommended that ample time be given to orientation and familiarisation with the learning
platform, including any digital resources that are available, in order for students to gain the
most from the course material.
4: Provide flexible student supports
Despite initial concerns, students reported feeling supported by their class and by the academics
teaching on the course. This is important, as lack of support can be a barrier to online learning
(Muilenburg, 2005). Support was provided formally by academic staff in the form of open office
hours, whereby a member of staff was present in an online room during specific times each week.
Students did not need an appointment, but could simply access this room to speak with the academic
during the allocated time. Students were also encouraged to email or ring academic or technical
staff if they encountered any problems. Furthermore, being fully registered, online students could
avail of the same college supports as any other student, including their student union, medical
and counselling services. Of all the services and supports that were put in place, students
typically cited the encouragement they received from one another through discussion posts as a
valued source of support. Peer support is highly valued in university settings (Dennis, 2005).
This study illustrated that it is also possible to garner peer support informally in an online
course.

5: Be cognisant that online students may also be working full time and therefore perceive the course workload as 'heavy'
Online learning provides flexibility of learning; however, it is also subject to the same academic
rigour and structures as face-to-face teaching. As such, the online Certificate in Clinical
Exercise carried the same number of credits, or workload, as other certificate courses. This
uniform workload may have been perceived as 'heavy' by those who chose to remain in full-time
employment. It is important to highlight the expected workload of the course prior to students
enrolling. The authors encourage a realistic approach to this issue. Online education affords
flexibility to students, and while students can 'catch up' on material they missed during periods
when they could not dedicate time to the course, the authors suggest that there is a limit to this
flexibility. For example, students may be awarded a small number of credits for being present
during live webinars (online classes), and assignments set during the course should be submitted
by a pre-determined date before the course ends. These arrangements reflect traditional
face-to-face teaching and render it impossible to complete the entire course in a very short space
of time.
6: Those who have had a break in their formal education may need additional support
Several students expressed concern over not having engaged in formal education for many years. It
is recommended that at the beginning of a course students are supported and encouraged heavily,
and that as the course progresses this support can be reduced. It is worth noting that those
students who expressed concern over a gap in their education performed very well academically.
This experience can be used to encourage other students in similar situations who are concerned
about re-entering formal education.

7: Provide resources on writing skills
Interestingly, students expressed concern about their writing skills. It is important to note that
academic writing is not a requirement of continued professional development. Resources to aid
students with their academic writing can easily be put in place to support students and address
this concern.

8: Clearly outline time commitments
A key concern students had before they commenced this course, and one which remained an issue
throughout, was time management. It is recommended that a detailed timetable is provided to
students before they commence the course, and that the number of hours of expected engagement with
the course is outlined before students begin. The degree of flexibility should also be detailed.
For example, in some online courses students can engage with the material in their own time (for
example, at the weekend), while in others there is a requirement to be online at a given time. The
level and structure of the engagement required for the course should be clearly specified before
students enrol, to enable them to realistically determine whether they can afford the time needed
to undertake the course. One method could be to ask students to create a timetable of their usual
week, and determine whether they can dedicate the required number of hours to the course, before
they begin.

9: Consider blended learning
Feedback from this course suggested that a practical element may have been helpful to students.
Blended learning combines online with face-to-face teaching and has been well received by both
students and staff in the field of medical education (Eksteen, 2011; Gardner et al., 2016). It
should be noted that students felt the need for a practical component within the online course,
rather than for the course to be delivered completely in the traditional face-to-face method.
Students also felt that very little practical teaching would be required, with one student
suggesting a weekend would suffice and another mentioning a single practical session. This finding
is noteworthy, especially since students felt that the learning materials were of a high quality
and that the course learning objectives were met.

10: Learning online is a skill in itself – let students know this so that they allow themselves time to acquire it
For students who are nervous about learning in an online environment, it is worth letting them
know that learning to be an online learner will take some time and effort, but that it is a life
skill in itself. Once they are competent in the VLE, they will be equipped with the skills needed
to engage with other online resources, for example professional society blogs and online learning
spaces. A student does not have to be a 'techie' to be an effective and efficient online learner;
however, they do need to set aside some time to acquire the necessary skills. Course coordinators
should consider students becoming online learners as a goal of their courses and work this into
the course orientation time. The model we used to support students in becoming independent online
learners was the five-stage model of Professor Gilly Salmon (Salmon, 2013).


Student Preferences for Learning Resources on a Land-Based Postgraduate Online Degree Program
Duncan Royd Slater
Myerscough College, Lancashire, UK

Richard Davies
University of Central Lancashire, UK

Abstract
Creating engaging online resources is an important part of the rapidly changing discipline of e-
tutoring. There is increasing use of a wide range of media for online training but only a limited
number of studies assessing their effectiveness. This study involved an educator working
collegiately with cohorts of online students studying a specialist land-based postgraduate degree
program (n = 79). The opinions of these mature online students, on current and potential learning
resources, informed two interventions that provided novel online resources to the course. Student
opinion on these new resources was captured and subjected to thematic analysis. The results
identify that these students’ favored resources were online lectures, course notes, primary
literature, and tutors’ opinion pieces because they were perceived as accessible, easy to engage
with, assignment-related and/or provided something akin to a ‘university campus experience’. In
contrast, podcasts and knowledge review quizzes were strongly disfavored by the majority of
respondents. The implications of this study in relation to online teaching practice are discussed.

Keywords: learning resources; online learning; online lectures; podcasts

Slater, D.R. & Davies, R. (2020). Student preferences for learning resources on a land-based
postgraduate online degree program. Online Learning, 24(1), 140-161.
https://doi.org/10.24059/olj.v24i1.1976

Student Preferences for Learning Resources on a Land-Based Postgraduate Online Degree Program
Part of the international market in Higher Education (HE) is an increasing number of highly
specialist postgraduate programs whose feasibility depends on global recruitment. For specialist
colleges (in this study, a college of land-based studies), maximizing involvement in this global
postgraduate market enables sufficient "economies of scale." Further, global recruitment enables
the recruitment of a cohort large enough to ensure a "learning community" of peers. The program
on which we focus links together individuals working in a similar profession (urban forestry) in


order to develop mastery in their professional role. It is worth noting that these postgraduate
students are often already local experts. The development of internet-associated technologies
(IATs) has supported the improvement of such courses. Here we report on a review of the resource
preferences for a group of such students. The students, drawn from six countries, are work-based
and are enrolled on one of very few specialist master’s level programs in the field. Although they
all have sufficient English skills to access the course, individual levels of English ability and
opportunities to use English are variable. For most of the students the choice of distance learning
results from a lack of viable alternatives rather than a commitment to distance learning per se.
Although not unique, this type of program, with these student cohort characteristics, is in need of further,
more specific research. In this paper we focus on students’ perspectives on online learning
resources.
For online vocational courses to be effective, ongoing review of their online learning
resources is essential (Anderson, 2008). Such reviews need to engage with students’ perspectives
and consider the impact of a range of online learning resources in relation to the students’ study
behaviors (Palloff & Pratt, 1999; Means et al., 2009; Redmond et al., 2018). Here we report on a
two-stage design to elicit students’ perspectives on the use they make of the present resources and
what other resources they would use. In stage one, four cohorts of students (n = 79) completed a
questionnaire and a sample (n = 15) were interviewed to ascertain students’ views on the present
resources. In stage two, we designed two interventions in relation to two different submodule
learning units. The first offered the same content in multiple, different online formats. The second
offered different content in different online formats. Students used these resources as a normal part
of their learning and were invited to comment on the usefulness of the resources in a
postintervention questionnaire.
Following a review of the extant literature related to online learning, we will set out more
formally our research design. We will then review the data from stage one and then stage two of
the study before identifying some key implications for course leaders and learning designers for
these types of programs and students.

Review of Related Literature


The Experience of Studying Online
If you are part of that minority that chooses to take your degree online, what is the learning
experience like? Much research has reported that online study is often seen as a poorer form of
delivery of HE courses than on-campus study, both by students and tutors (e.g., Picciano, 2002;
Vonderwell, 2003; Song et al., 2004; Muilenburg & Berge, 2005; Weller, 2007; Cole et al., 2014;
Gillett-Swan, 2017). A key factor contributing to this perception, of online study being a ‘lesser
experience’, is the isolation of the online learner (Selwyn et al., 2006). Another major
consideration is that regular communication between student and tutor is considered critical for
the success of an online course (Beaudoin, 2002; Beuschel et al., 2003; Augar et al., 2006).
Previous research, particularly in the 1990s and early 2000s, highlighted that online courses can
suffer from high drop-out rates when compared to their on-campus equivalents (Fisher, 2003;
Palloff & Pratt, 2003; MacDonald, 2006). Withdrawals from programs are associated not only
with feelings of isolation and lack of communication with tutors, but also whether the course is
relevant to the learner and whether student support systems are put in place (Lee et al., 2011).


Specialist literature on e-learning seeks to address these issues of isolation, lack of
communication and subsequent student drop-out. For example, several authors point to the
potential for online students to have greater autonomy for their studies (e.g., Lockwood & Gooley,
2001; MacDonald, 2006; Smith, 2008) and the flexibility of asynchronous online delivery is
highlighted as giving more choice and more control of their learning to the online student (Inglis
et al., 2002; Gillani, 2003; Conrad & Donaldson, 2004). In addition, some authors speculate that
online learning, if developed in a suitable way, can be more student-centered than typical on-
campus teaching at a university (Forsyth, 2001; Richardson, 2006; Weller, 2007). Some key
characteristics of an effective online student and her/his online learning community that emerge
from educational literature are presented in Figure 1.

Effective online student:
* Sufficient study time
* Committed
* Access to ICT
* Open-minded
* Respectful
* Strategy for studying
* Critical thinker
* Reflective
* Flexible

Effective course design:
* Good visual and text design
* Learner-focused
* Easy to access and navigate
* Relevant content
* Time efficient
* Interactive
* Collaborative
* Outward looking
* Addresses learning styles & culture

Effective online tutor:
* Responsive
* Present regularly on VLE
* Open & honest
* Reflective
* Respectful
* Knowledgeable
* Relevant experience
* Efficient
* Empowers others

Effective student support:
* Accessible 24/7
* Technical support & training
* Course handbook
* University regulations
* Learner-centered
* Learning support specialists
* Well-resourced
* Library resources
* Access to e-books & e-journals

Figure 1. Key factors in the building of an effective online learning community. Adapted from
Palloff & Pratt (2003).


Creating Effective Learning Resources for Online Courses


There are a considerable number of instructional texts, aimed at tutors and course
managers, on the creation of engaging online learning materials (e.g., Palloff & Pratt, 2003; Race,
2005; Bonk & Zhang, 2008; Garrison & Anderson, 2011). Unfortunately, due to the fast pace of
change in software and virtual learning environment (VLE) capabilities, some sections of such
texts become rapidly dated. This fast-developing method of course delivery presents a considerable
number of challenges to both the provider and to the cohorts of students engaged with online
learning (McVay-Lynch, 2002). For instance, the last decade has seen a move towards
compatibility of online resources with mobile technology which results in students having their
university courses ‘in their pocket’ (Sharples et al., 2006; Bell, 2008). Creating bespoke resources,
ensuring there is formative assessment, and providing motivational rewards for undertaking these
formative assessments is recommended (Gillani, 2003). Seale et al. (2007) see the challenge of
creating new online learning resources as three-fold: they should be designed for learning, highly
accessible, and highly reusable. In addition, Martin and Bolliger (2018) emphasise that interaction
between the learner, other learners, course resources, and tutors is of critical importance for
good student engagement.
There are several studies that found the use of audio/visual learning resources for online
learners to be effective, especially the use of instructional video and online lectures (MacPherson
& Nunes, 2004; Mitra et al., 2010; Borup et al., 2011; Carmichael et al., 2017; Crook & Schofield,
2017; Scagnoli et al., 2017). Although now a popular delivery format in online courses, moving
away from a high level of reading content, Pomales-Garcia and Liu (2006) warn against providing
overlong videos or lectures. Their research highlights that online learners were less likely to
complete modules with resources that took longer for them to work through. For some online
provision, the use of synchronous webinars has become more common. These allow students to
interact whilst an online lecture is being delivered (Moore & Kearsley, 2012). It is a format which
comes closest to simulating an interactive lecture hall environment. In addition to these
audio/visual offerings, some researchers have assessed the impact of podcasts as online learning
materials (Richardson, 2006; Salmon, 2008; Lawlor & Donnelly, 2010). Lawlor and Donnelly
(2010) found that podcasts were extensively used by a proportion of postgraduate students taking
an online course, making them a valuable form of differentiation.
Although there is a consensus that students gain from a planned program of high quality,
well-tested online learning resources, Bonk (2001) found, through surveying online tutors, that
less than 40% of sampled online courses contained the interactive elements that the tutors
themselves stated would be valuable for their students. In addition, Kinash et al. (2015) state that
there is a lack of empirical evidence in relation to the effectiveness of the technologies used for
online teaching. They concluded, from their meta-analysis of online student experiences, that this
is a key knowledge gap. This study explores aspects of this identified “knowledge gap” within the
specific context of an educator providing bespoke online learning resources to students on a
specialized postgraduate course.


This action research study was framed around two interrelated research questions:
RQ1 Of the current online resources on a selected postgraduate course, which did the
students use most frequently, and for what reasons?
RQ2 Of the newly generated online resources created for the two interventions in this study,
which did the students use more frequently, and for what reasons?

Methods
Participants in this study were enrolled on a specialist online master’s degree course in
arboriculture and urban forestry, delivered by Myerscough College on behalf of the University of
Central Lancashire (UCLan). In addition to present students, a cohort of ex-students who had just
completed the qualification was invited to be involved in the interview phase of the study.
Details of the student cohorts are provided in Table 1.

Table 1
Breakdown of Participating Student Cohorts in terms of Year Groups, Numbers of Students in
Each Cohort, Nationalities in Each Cohort and Gender Mix
Cohort | Number of Students | Student Nationalities | Student Gender
First Years | 35 | 9 British; 25 Hong Kongese; 1 Singaporean | 14 Female, 21 Male
Second Years | 11 | 5 British; 1 Canadian; 4 Hong Kongese; 1 Irish | 3 Female, 8 Male
Third Years | 15 | 5 British; 2 Canadian; 1 Croatian; 7 Hong Kongese | 7 Female, 8 Male
Ex-Students | 18 | 8 British; 1 Canadian; 9 Hong Kongese | 7 Female, 11 Male

An initial online questionnaire (created within SurveyMonkey® and provided in Appendix A) asked
participants about their views of the learning resources they had previously had access to
on the course (Table 2).


Table 2
Current Learning Resources used in this Postgraduate Course.
Learning Resource | Form of resource | Accessibility
1. Online lectures | Slideshows with accompanying audio, most with a script to view; created in Adobe Presenter®, OfficeMix®, or Powerpoint®. | Not downloadable; could only be viewed when the student was logged into the VLE.
2. Academic papers | Journal papers, typically in PDF format, selected by the tutors as the most relevant for the topic being covered in that teaching session. | Downloadable; could be viewed on a range of devices.
3. Knowledge review quizzes | Created within the VLE, these quizzes focused on reviewing knowledge gained from reading the academic paper(s) highlighted in each session. | Not downloadable; could only be undertaken when the student was logged into the VLE.
4. Tutor's viewpoints | A concise, illustrated article by the tutor on a contentious and contemporary topic, typically in PDF format. | Downloadable; could be viewed on a range of devices.
5. Further reading | A range of documents, mainly PDF and Word® documents. | Downloadable; could be viewed on a range of devices.
6. Tutor's own papers | Some module tutors have authored their own research papers; where these are relevant to modules, they are provided to students (typically as PDF files). | Downloadable; could be viewed on a range of devices.
7. External links | Links to other websites and external learning resources selected by the tutor. | Would initially have to be accessed via the VLE, but can then be saved and accessed independently.
8. Discussion board | In-built discussion board within the VLE, where students and tutors can create, read, and answer discussion threads. | Could only be accessed via the VLE.
9. Announcements | A messaging system in-built to the VLE, used by tutors to contact the whole of an online class of students. | Announcements are viewable within the VLE but are also sent out to student email addresses.


This initial questionnaire finished with a question to ascertain students’ preferences for
new learning resources that the tutor could create for them. This was a closed question with
predefined options, scored by participants using a Likert scale. Further, fifteen students across
the four cohorts participated in an online, semi-structured interview to obtain more in-depth
views on their learning experience on the course and their reasons for favoring some learning
resources over others. The framework of questions asked in these interviews is provided in
Appendix B.
Data from both these processes fed into the creation of new learning resources for two
distinct interventions during the delivery of the course program. The first intervention involved
supplying the same content in a range of different formats (online lecture, course notes, video,
streamed video, and podcast) for students to trial. The second intervention involved supplying a
wider range of formats for learning materials with each new resource distinct from any other in
terms of content (Table 3). The rationale behind this approach was to seek to separate the students’
preferences for different media formats from the effectiveness of unique learning objects due to
their form and content.

Table 3
Details of the Learning Resources Produced for the Second Intervention of this Study, Providing
Learning Resource Titles, Type, and File Format.
Learning Resource | Resource Type | Resource Format
The urban forest of downtown Singapore | Online lecture | Also made available as a downloadable video (MP4)
Trees as biotechnology | Academic paper | PDF file
People love trees | Tutor’s viewpoint | PDF file
Landscaping of Birchwood Park, Warrington | Video taken outdoors | MP4 file; available for download
The urban forest of Aarhus, Denmark | Online lecture | Also made available as a downloadable video (MP4)
From the front line (concerning recent science about urban forests) | Podcast | MP3 file; available for download
The urban forest of Pistoia, Italy | Online lecture | Also made available as a downloadable video (MP4)
Hopping on one leg | Academic paper | PDF file
People loathe trees | Tutor’s viewpoint | PDF file
Landscaping of Deepdale Retail Park, Preston | Video taken outdoors | MP4 file; available for download
External links | Links to other websites | Links embedded in the VLE session page


These two interventions were followed up with a final questionnaire (created within
SurveyHero® and supplied in Appendix C) sent to all students who had participated in the
interventions (n = 63), which asked them to evaluate their experience of the new online resources
provided via the two interventions. A timeline for the key actions in this study is provided in Figure
2.
Data Collection and Analysis

Figure 2. Timeline illustrating the four key phases of this study: an initial questionnaire,
semi-structured interviews, two interventions, and a final questionnaire.

This self-reporting approach to data capture was considered effective in answering our research
questions. The respondents in this case were all mature learners in professional roles who had a
good rapport with the lead researcher. The discussions during the interviews were consistent with
the responses to the questionnaires, with previous module evaluations, and with the reflections of
the course team. Answers were also consistent with known VLE metrics of students’ use of existing
resources. The research itself was articulated to participants as part of the resource development
for the course. Care was taken to ensure that participants were aware that there was no “correct”
answer and that involvement in the study would not impact on their learning or assessment. The
study gained ethical approval at UCLan.
The responses from the semi-structured interviews and the open questions within the
questionnaires were first coded and then themed (Burton et al., 2008). The theming of responses
was iterated three times, to merge similar themes and to identify emergent, more specific themes
that had initially been placed within broader themes (Tracy, 2013). Relevant insightful quotes
were selected to provide supporting evidence for the thematic analysis (Galletta, 2013). The
closed, Likert-scale questions were analyzed using nonparametric, one-way analysis of variance.
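
For readers who wish to reproduce this kind of analysis, the sketch below shows how a
Kruskal-Wallis test with a post-hoc Mood's median test can be run in Python using SciPy. It is a
minimal illustration only: the rating vectors are invented placeholders standing in for 1-to-5
Likert ratings grouped by resource type, not the study's data, and the paper does not state which
software was actually used.

    # Minimal sketch: nonparametric one-way analysis of Likert ratings,
    # mirroring the Kruskal-Wallis and post-hoc Mood's median tests
    # reported in this paper. The ratings below are invented placeholders.
    from scipy import stats

    # Hypothetical ratings, 1 (low) to 5 (high), for three resource types.
    online_lectures = [5, 4, 5, 4, 5, 3, 4, 5]
    academic_papers = [4, 3, 4, 5, 3, 4, 4, 3]
    review_quizzes = [2, 1, 3, 2, 2, 1, 2, 3]

    # Kruskal-Wallis H test across all groups (df = number of groups - 1).
    h_stat, p_value = stats.kruskal(online_lectures, academic_papers, review_quizzes)
    print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_value:.4f}")

    # Post-hoc Mood's median test on one pair of groups.
    m_stat, m_p, grand_median, _table = stats.median_test(online_lectures, review_quizzes)
    print(f"Mood's median test: chi2 = {m_stat:.2f}, p = {m_p:.4f}, grand median = {grand_median}")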


Results and Discussion


Initial questionnaire
The initial questionnaire received forty responses across the three cohorts of current
students, representing a response rate of 54.8%. Figure 3 provides the outcomes when students
were asked which current online resources they found most efficient and effective for their
learning. A Kruskal-Wallis test and post-hoc Mood’s median tests identified significant
differences in the rating of these resources by these respondents (H = 69.21; df = 6; p < 0.001).

Figure 3. Student preferences, expressed via a rating scale from 1 (low) to 5 (high), for the
range of current learning resources offered on the postgraduate course.

Knowledge review quizzes scored the lowest in terms of preferences, so, without
highlighting this to the participants, these quizzes were omitted from the learning resources created
for the two interventions. As we note shortly, students did not mention missing these resources
during the intervention.
An open question about student preferences in terms of existing resources elicited a clear
preference for the online lectures (Table 4), but the creation of course notes was the highest-scoring
preference for new learning materials. This latter preference is probably best explained by this
course being predominantly delivered via online lectures at that time. Therefore, the addition of
course notes had the potential to diversify the learning materials. In addition, some students stated
a preference for more reading materials rather than for further audio/visual resources to be created.
They noted that their high reading speeds made this form of learning resource efficient for their
learning, and that they could read away from their computer.


Table 4
Students' Stated Most-used Current Online Resources (n = 40), Showing Frequency of Preference, Coded Themes, and Selected Student Comments

Online lectures (frequency: 25)
Coded themes: Ease of use; Key information; Relevant; Concise; Guide to learning; Enjoyable; Unique; Like attending Uni.
Selected comments: “Because I am in full time employment and this is the easiest format to use.” “It gives me an introduction into the topic/subject matter and helps explain key elements.” “This kind of resource is a more interesting learning material, which summarize the content of the lesson.” “It cannot be found elsewhere.” “Mimics a real lecture. I learn best from either physically doing something (difficult with an online course!) or verbal communication. Online lectures are as close as one can get to real life university.”

Academic papers (frequency: 11)
Coded themes: Research links; Key information; Specific; Assignment-related; Accessibility; Up-to-date.
Selected comments: “It provides a strong basis to the study of the particular topic. Other references and searches can then be undertaken.” “It can be accessed anytime, anywhere.” “Very informative.”

Tutor’s viewpoints (frequency: 3)
Coded themes: Specific; More usable; Assignment-related.
Selected comment: “Tailored to the specific module and provides a good overview and insight into the key elements that are being explored.”

Further reading (frequency: 1)
Coded theme: Scientific evidence.
Selected comment: “Good for knowing the latest research.”

Note. Coded themes are ordered with the most frequently occurring first.

Semi-structured interviews
The interview process gave rise to much “rich data” and only a small proportion can be
reported here. Three main learning resources were mentioned as the students’ most-favored
resources: the online lectures (n = 13), tutor’s viewpoints (n = 8), and the academic papers (n = 6).


The knowledge review quizzes were mentioned as helpful by two interviewees, and no other online
resources (i.e., the external links and further reading) were mentioned.
Key themes relating to online learning resources are provided here, with example quotes
from interviewees.
• A limited use of the online discussion board: “I didn’t tend to contribute, but I read all the
contributions to the discussion board. It was interesting to see what the other students were
thinking. Most of the student contributions were well-written and they had quite interesting
perspectives. I could not often add to what was being said.”
• A preference for online lectures: “I mostly use online lectures, tutor viewpoints and related
research. Most of all I favor online lectures as they guide you through the topic and they are
easier to consume than written text, especially scientific articles.”
• Low usage of knowledge review quizzes: “Quizzes—I use the least. When I am time-poor,
these are not essential.”
• Mixed views on suitable formats for future learning resources: “I already listen to podcasts—
usually when I am doing something mundane like the washing-up—so I can take it in. If you do sit
down to study, though, you want the audio/visual—something to look at as well as to listen to.
Videos could also be good, for clarifying things further.” (Respondent A); “I am not used to using
podcasts—they are not needed for me—I wouldn’t use them. The slides and online lectures work well
for me. I can see video being of some use—for example, to look at hazardous trees.” (Respondent B)
Final questionnaire
The final questionnaire was completed by eighteen students on the course program. Their
responses on preferred learning resources from Intervention One are provided in Table 5. In this
intervention the same content was provided in several different formats.
In the first intervention, the online lectures were most-favored and received no critical
comments. The podcast received the most criticism as a learning resource. Downloadable video
received the most mixed reviews. It required the largest data allowance on the students’ devices,
but some respondents felt that it gave the best viewing performance. Some of these differences can
be traced to the level of English language of the students. Students whose reading and
comprehension ability was higher than their oral comprehension preferred written texts. It was
also clear that in some contexts certain resources could more easily be used, surreptitiously, in
“work time,” and hence were preferred.


Table 5
Student Views on the Formats of Learning Resources—Intervention One (n = 18)

Positive responses:
Course notes (5). Coded themes: Accessible; Efficient; Supporting. Comments: “I used less than five minutes to read the new learning resource PDF. While I watch the video, although I change the speed to 2x, I still have to use fifteen minutes. Reading is always more efficient for me.” “As a foreign student, understanding of English might be difficult at some point. The course notes will be a great help to understand both presentation and video.”
Online lecture (10). Coded themes: Effective; Like attending Uni; Note-taking; Stimulating; Tutor emphasis. Comments: “It is the closest imitation mode to being in the class physically.” “I preferred the audio/visual presentation as this is the most like a lecture and got me in the mindset of studying.”
Audio podcast (2). Coded themes: Accessible; Convenient. Comment: “I enjoyed the ability to listen to the podcast whilst doing my day-to-day work.”
Streamed video (1). No specific positive themes. Comment: “I think the audio/visual presentations were of equal merit. They were clear and well-structured.”
Video download (7). Coded themes: Accessible; IT compatibility; Technically superior. Comments: “Smooth watch experience.” “My first choice would be the MP4, which had the best resolution and works on all devices.”

Negative responses:
Course notes (2). Coded theme: Not essential. Comment: “I avoided downloading the course notes. I prefer using audio/visual and making my own notes which I find easier to reference for assignments.”
Online lectures (0). No comments received.
Podcast (6). Coded themes: Lesser resource; No images; Not effective. Comments: “No pictures and no words to read.” “Podcast—not funny.” “Had no need to listen to them on the move/remotely.”
Streamed video (2). Coded themes: Poor screen size; Poorer format. Comment: “Just preferred resources that are better for using.”
Video download (3). Coded themes: Data uploading; Not essential. Comments: “Time consuming to download, used much of my internet data allowance.” “Didn't use the downloadable videos - didn't see the advantage when the audio/visual presentation works fine.”


Table 6
Student Views on the Learning Resources—Intervention Two (n = 18)

Positive responses:
Academic papers (1). Coded theme: Assignment-related. Comment: “Academic paper helps us to have a better direction on what we should include in our assignment.”
Online lectures (9). Coded themes: Effective; Good quality; Knowledge; Like attending Uni. Comments: “Online lecture with script - easier to follow and get the points easier.” “Audio and video more suits my learning style from which I can make my own notes for future reference.”
Outdoor video (4). Coded themes: Engaging; Real world. Comment: “Encourages myself to process what I am seeing and filter it in my brains, as I understand that ideas are linked and can be organized and associated.”
Podcast (0). Coded theme: Ineffective. Comment: “Podcasts require the lecturer to be more descriptive to fill in the gaps to be effective.”
Streamed video (0). No comments.
Tutor’s viewpoints (5). Coded themes: Accessible; Critique; Efficient; Good content. Comment: “Tutor's viewpoint - in this case, the alternative, thought-provoking take on the topic was very useful as it helped in forming a more objective standpoint.”
Video download (4). Coded themes: Accessible; Technically better. Comment: “I prefer the MP4 due to the quality and flexibility of the format.”
No preference (3). Coded themes: Does not matter; Liking all formats. Comment: “Not specific, all formats would be acceptable as long as it is necessary for the modules.”

Negative responses:
Academic papers (1). Coded theme: Difficult to use. Comment: “I use academic papers but I struggle to maintain focus in trawling through lots of written information.”
External links (2). Coded themes: Not academic; Not specific. Comment: “Time consuming to search for specific information.”
Online lectures (0). No negative comments.
Outdoor video (2). Coded themes: Outdoor noises; Quality of content. Comment: “The outdoor lecture was a bit more difficult to listen to because of the environmental conditions and the content seemed a bit thin compared to the more prepared lectures in other formats.”
Podcast (8). Coded themes: Ineffective; No images; Potential to misunderstand; Too long. Comments: “This is too casual and cannot get enough of my attention.” “No script and without illustration, I may very likely misunderstand the meaning.”
Streamed video (1). Coded theme: Lower quality. Comment: “I did not like the low resolution of some of the embedded video options.”
Tutor’s viewpoints (1). Coded theme: Not assignment-related. Comment: “Tutor's viewpoints I have read before have been interesting and thought provoking but have not been directly relevant to the assignment.”
Video download (0). No negative comments.
No preference between formats (2). Comment: “I have no need or desire to use the resources whilst on the move.”


Table 6 (above) gives the students’ views on the resources made available to them in the
second intervention.
A very similar pattern of feedback was received for both interventions: that online lectures
were most favored (supporting the findings of MacPherson & Nunes, 2004; Mitra et al., 2010;
Borup et al., 2011; Carmichael et al., 2017; Crook & Schofield, 2017; Scagnoli et al., 2017) and
that the podcasts were the most criticized (Lawlor and Donnelly, 2010). Views were mixed on the
use of outdoor videos to assess urban trees in the second intervention; some students wanted higher
production values in terms of technical content and sound recording. Similar comments were
received about the streamed videos, where lower quality was a key limitation.
No respondents commented on the absence of the knowledge review quizzes that had been
consistently provided in all previous eleven teaching sessions of this module. This strongly
suggests that these formative assessments were not a highly valued learning resource for these
students.

Conclusions
The findings of this study should be understood in the specific context of a specialist online
course on a technical topic at postgraduate level. It is considered likely that the ages, previous
educational backgrounds, and English language ability of the students who participated in this
study will have had a strong bearing on the results reported here.
The initial questionnaire in this study identified three current resources that were
significantly more favored than the others: online lectures, academic papers, and tutor’s
viewpoints. These resources were conceived to be key elements of the teaching provision for this
MSc course program by its tutors—so this instructional design was strongly supported by the
students’ views (n = 40). A continuing preference for these resources was shown after both
interventions (n = 18), suggesting that successful course delivery should involve a mix of
audio/visual resources and ample written resources. The responses received provided a clear
answer to our two research questions. These postgraduate students appreciated the mix of learning
resource types made available to them but showed strong preferences toward the use of online
lectures, reading primary literature and having bespoke course-related notes or articles.
In terms of creating an effective online learning community (Figure 1), this study elicited
responses in a number of key areas that may be relevant to other practitioners. At this postgraduate
level, critical thinking is a key expectation of students’ work. Students valued the “Tutor’s
viewpoints” in the course materials because this learning resource always provided a critical stance
on key topics from which students could develop their own views and opinions. Provided in written
form (PDF files), it was the critical content that the students valued; there were no comments on
this being a less valuable learning object because of its medium. Responses across the range of
learning resources provided showed a clear focus on relevant content over medium.
Students valued the flexibility of their studies but only a few expressed an interest in the use of the
podcasts. Those that did emphasized that they “freed them from the screen.” The majority of these
students, however, did not like this medium. We would recommend that podcasts be provided only as
a means of minor differentiation, offering the same content as, say, an online lecture or course
notes in a format that suits only a minority.


Other suggested key attributes of an effective online course, in terms of its learning
materials, are that they should be learner-focused, interactive and collaborative (Figure 1). The
most interactive elements of the current course were considered by the tutor to be the review
quizzes and the discussion board, but these were not favored by the students. They often stated a
strong, individualistic focus on gaining new knowledge for themselves rather than on interaction
or collaboration. The quizzes in their current format were clearly ineffective learning resources.
They were both rated low on the initial questionnaire and were not missed by students when they
were omitted from the second intervention. This may be because they were provided as knowledge
reviews on specific reading material. We are intending to trial alternative interactive quizzes with
different foci to see if it was their original specificity that was off-putting to students. Likewise,
adjusting the “rules” of the discussion board may lead to a better level of interaction. For
example, we are considering allowing anonymous postings or pseudonyms, so that students feel less
daunted about using this collaborative tool, or making engagement with the discussion board
compulsory (Malkin et al., 2018).
An online course designer or tutor should act to empower their students by creating
flexible, interactive, attractive, and content-rich learning resources that lead to stronger
engagement by the students on the course (Redmond et al., 2018). This study has reported on one
such approach: an iterative cycle of learning object creation, appraisal, and user feedback. The knowledge
acquired has provided a more nuanced understanding of the ways in which individuals, on this
program, value and utilize the resources made available to them. Our findings have resonated with
tutors delivering other online postgraduate programs to mature, work-based learners. The broad
learning from this work is twofold. Firstly, that care is needed in transferring general research in
online learning to specialist, atypical groups of learners. Secondly, that an iterative cycle of
reviewing resources brings educational and financial benefits to tutors delivering online courses.
As a result of this work, time and effort have been more effectively directed towards the
generation of appropriate and engaging online learning resources.


References
Anderson, T. (Ed.) (2008). The theory and practice of online learning (2nd edition). AU Press,
Athabasca University.
Augar, N., Raitman, R., Lanham, E., & Zhou, W. (2006). Building virtual learning communities.
In Z. Ma (Ed.), Web-based intelligent e-learning systems: Technologies and applications
(pp. 72–100). Information Science Publishing.
Beaudoin, M. F. (2002). Learning or lurking?: Tracking the “invisible” online student. Internet &
Higher Education, 5(2), 147–155.
Bell, D. (2008). The university in your pocket. In G. Salmon and P. Edirisingha (Eds.),
Podcasting for learning in universities (pp. 178–187). OUP.
Beuschel, W., Gaiser, B., & Draheim, S. (2003). Communication needs of online students. In
A. K. Aggarwal (Ed.), Web-based Education: Learning from experience (pp. 203–222).
Information Science Publishing.
Bonk, C. J. (2001). Online teaching in an online world. CourseShare.
Bonk, C. J., & Zhang, K. (2008). Empowering online learning: 100+ activities for reading,
reflecting, displaying and doing. Jossey-Bass.
Borup, J., Graham, C., & Velasquez, A. (2011). The use of asynchronous video communication
to improve instructor immediacy and social presence in a blended learning environment. In
A. Kitchenham (Ed.), Blended learning across disciplines: Models for implementation (pp.
38–57). IGI Global.
Burton, N., Brundrett, M., & Jones, M. (2008). Doing your education research. Sage
Publications Ltd.
Carmichael, M., Reid, A.-K., & Karpicke, J. D. (2017). Assessing the impact of educational
video on student engagement, critical thinking and learning: The current state of play. Sage
Publishing Ltd.
Cole, M. T., Shelley, D. J., & Swartz, L. B. (2014). Online instruction, e-learning and student
satisfaction: A three-year study. The International Review of Research in Open and
Distributed Learning, 15, 111–131.
Conrad, R., & Donaldson, J. (2004). Engaging the online learner: Activities and resources for
creative instruction. Jossey-Bass.
Crook, C., & Schofield, L. (2017). The video lecture. The Internet and Higher Education, 34,
56–64.
Fisher, M. (2003). Designing courses and teaching on the web: A “how-to” guide to proven,
innovative strategies. Scarecrow Education.
Forsyth, I. (2001). Teaching and learning materials and the internet (3rd edition). Kogan Page
Ltd.


Galletta, A. (2013). Mastering the semi-structured interview and beyond: From research design
to analysis and publication. New York University Press.
Garrison, D. R., & Anderson, T. (2011). E-learning in the 21st century: A framework for
research and practice (2nd edition). RoutledgeFalmer.
Gillani, B. B. (2003). Learning theories and the design of e-learning environments. University
Press of America Inc.
Gillett-Swan, J. (2017). The challenges of online learning: Supporting and engaging the isolated
learner. Journal of Learning Design, 10(1), 20–30.
Inglis, A., Long, P., & Joosten, V. (2002). Delivering digitally: Managing the transition to the
knowledge media (2nd edition). Kogan Page Ltd.
Kinash, S., Knight, D., & McLean, M. (2015). Does digital scholarship through online lectures
affect student learning? Journal of Educational Technology & Society, 18(2), 129–139.
Lawlor, B., & Donnelly, R. (2010). Using podcasts to support communication skills
development: A case study for content format preferences among postgraduate research
students. Computers & Education, 54(4), 962–971.
Lee, S. J., Srinivasan, S., Trail, T., Lewis, D., & Lopez, S. (2011). Examining the relationship
among student perception of support, course satisfaction and learning outcomes in online
learning. The Internet and Higher Education, 14(3), 158–163.
Lockwood, F., & Gooley, A. (Eds.) (2001). Innovation in open and distance learning: Successful
development of online and web-based learning. Kogan Page Ltd.
MacDonald, J. (2006). Blended learning and online tutoring: A good practice guide. Gower
Publishing Ltd.
Malkin, A., Rehfeldt, R. A., & Shayter, A. M. (2018). An investigation of the efficacy of
asynchronous discussion on students’ performance in an online research method course.
Behavior Analysis in Practice, 11, 274–278.
Martin, F., & Bolliger, D. U. (2018). Engagement matters: Student perceptions on the
importance of engagement strategies in the online learning environment. Online Learning, 22(1),
205–222.
McVay-Lynch, M. (2002). The online educator: A guide to creating the virtual classroom.
RoutledgeFalmer.
Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2009). Evaluation of evidence-
based practices in online learning: A meta-analysis and review of online learning studies.
Office of Planning, Evaluation and Policy Development, U.S. Department of Education.
Mitra, B., Lewin-Jones, J., Barrett, H., & Williamson, S. (2010). The use of video to enable deep
learning. Research in Post-Compulsory Education, 15(4), 405–414.
Moore, M. G., & Kearsley, G. (2012). Distance education: A systems view of online learning
(3rd edition). Cengage Learning.


Muilenburg, L. Y., & Berge, Z. L. (2005). Student barriers to online learning: A factor analytic
study. Distance Education, 26, 29–48.
Palloff, R. M., & Pratt, K. (1999). Building learning communities in cyberspace: Effective
strategies for the online classroom. Jossey-Bass.
Picciano, A. G. (2002). Beyond student perceptions: Issues of interaction, presence and
performance in an online course. Journal of Asynchronous Learning Networks, 6(1), 21–40.
Pomales-Garcia, C., & Liu, Y. (2006). Web-based distance learning technology: The impacts of
web modules’ length and format. American Journal of Distance Education, 20(3), 163–179.
Race, P. (2005). 500 tips for open and online learning (2nd edition). RoutledgeFalmer.
Redmond, P., Heffernan, A., Abawi, L.-A., Brown, A., & Henderson, R. (2018). An online
engagement framework for higher education. Online Learning, 22(1), 183–204.
Richardson, W. (2006). Blogs, wikis, podcasts and other powerful web tools for classrooms.
Sage Publications Ltd.
Salmon, G. (2008). The future of podcasting. In G. Salmon & P. Edirisingha (Eds.), Podcasting
for learning in universities (pp. 178–187). OUP.
Scagnoli, N. I., Choo, J., & Tian, J. (2017). Students’ insights on the use of video lectures in
online classes. British Journal of Educational Technology, 50, 399–414.
doi:10.1111/bjet.12572
Seale, J., Boyle, T., Ingraham, B., Roberts, G., & McAvinia, C. (2007). Designing digital
resources for learning. In G. Conole & M. Oliver (Eds.), Contemporary perspectives in e-
learning research: Themes, methods and impact on practice (pp. 121–133). Routledge.
Selwyn, N., Gorard, S., & Furlong, J. (2006). Adult learning in the digital age. Routledge.
Sharples, M., Taylor, J., & Vavoula, G. (2006). A theory of learning for the mobile age. In R.
Andrews & C. Haythornthwaite (Eds.), The sage handbook of e-learning research (pp. 221–
247). Sage Publications Ltd.
Smith, R. M. (2008). Conquering the content: A step-by-step guide to online course design.
Jossey-Bass.
Song, L., Singleton, E. S., Hill, J. R., & Koh, M. H. (2004). Improving online learning: Student
perceptions of useful and challenging characteristics. Internet and Higher Education, 7, 59–
70.
Tracy, S. J. (2013). Qualitative research methods. Wiley-Blackwell.
Vonderwell, S. (2003). An examination of asynchronous communication experiences and
perspectives of students in an online course: A case study. The Internet and Higher
Education, 6(1), 77–90.
Weller, M. (2007). The distance from isolation: Why communities are the logical conclusion in
e-learning? Computers & Education, 49, 148–159.


Appendix A:
Template of the Initial Questionnaire used in this study

Question 01: What year of study are you in, on this MSc course?
Possible responses: 1st year, 2nd year, or 3rd year.

Question 02: Rate the extent that you have used the listed online learning resources provided by this MSc course.
Possible responses: Rating of 1 (low use) to 5 (high use) for all named resources.

Question 03: Rate the existing online learning resources in terms of their usefulness to you as a student on the MSc course, based on how efficient and effective your learning is from these resources.
Possible responses: Rating of 1 (low use) to 5 (high use) for all named resources.

Question 04: Of all existing online resources, which do you use the most?
Possible responses: Students could only select one type of named resource.

Question 05: Why did you use this particular resource the most?
Possible responses: Open question (textbox).

Question 06: My research work this year will involve creating new online resources for learning which you will have access to. Some options are “downloadable”, in that you could download a file and use it when not connected to the internet. Other options are not downloadable because an internet connection is needed at all times for these resources to work. What online resources would you prefer to see created?
Possible responses: Rating of 1 (low use) to 5 (high use) for all named options for new resources.


Appendix B:
Template for the semi-structured interviews used in this study

Your motivations:
• Reasons for taking the course
• Deep or strategic studier?
Your study behavior:
• Study time spent during the week—and pattern
• Your mix of reading, creating and interacting
• Your use of the discussion board
• The biggest benefits from studying online
• The problems with online study that you would like to highlight
Your favored resources:
• What resources do you personally favor to use?
• Why did you favor these?—what was it about them that made them better to use or learn
from?
• What course resources were more valuable to you for putting together assignment work,
if any?
• What resources do you find you are using the least?—and why is that the case?
• Highlighting essential resources for assignments?
• Rating resources—student ratings? Tutor’s guidance?
Future learning resources:
• Would more accessible resources be more useful to you in your studies?—If so, why?
• Would resources you can keep after you leave the course be more attractive to you? If so,
why?
• From your perspective, what is a good balance between written materials and audio
materials? 50/50?
• In terms of audio materials, do you have a particular preference for audio recordings,
audio/visual lectures or videos? If so, why?
• Is there any benefit in having a mix of audio resources, or is it better to standardise these
to just one or two types, for consistency in delivery?
• Is there any benefit in putting any audio on a more stable platform (e.g., YouTube)?—or
would you find that off-putting?
Appendix C:
Template of the Final Questionnaire used in this study

01. Did you use any new online resources that were created for MR4001 this year? If yes, continue to question 3. If no, please just answer question 2.
    Responses: Yes/No.

02. What factors caused you not to engage with these new online resources?
    Responses: Open question (textbox).

03. Intervention One: Which formats of this resource did you attempt to use?
    Responses: List of formats to tick.

04. Intervention One: Please state the format for this resource that you preferred and why you had a preference for this format.
    Responses: Open question (textbox).

05. Intervention One: If there were one or more formats of Alternative Urban Forest Futures that you avoided using, please explain why you chose not to try to use that format/those formats.
    Responses: Open question (textbox).

06. Intervention Two: Which formats from Session 12 did you attempt to use?
    Responses: List of formats to tick.

07. Intervention Two: Which resources did you find most useful in terms of ideas or citations for your assignment work for MR4001?
    Responses: List of resources to tick.

08. Intervention Two: Which resources did you prefer in terms of their content?
    Responses: List of resources to tick.

09. Intervention Two: Which resources did you prefer in terms of their format/media?
    Responses: List of resources to tick.

10. Intervention Two: The audio presentations did not come with a script in this trial. Would you have preferred the presentations to also be supplied with a script?
    Responses: Yes/No.

11. Intervention Two: What format of resources that you tried during this intervention would you want to see used regularly by tutors of this online MSc course, and why did you find them effective for your learning purposes?
    Responses: Open question (textbox).

12. Intervention Two: What resources provided by this intervention did you not find helpful or not use at all, and why did you not think them effective for your learning purposes?
    Responses: Open question (textbox).

13. Intervention Two: Did you think there was anything missing from this session, or something that should be added? If so, please contribute what other resources you would have liked to be part of this final session for MR4001.
    Responses: Open question (textbox).

14. Intervention Two: If you would like to contribute further thoughts on learning resources that could be effective for students studying this online MSc course, please use the comments box provided below.
    Responses: Open question (textbox).
Factors Influencing Programming Expertise in a Web-based E-learning Paradigm
Wajid Rafique and Wanchun Dou
Department of Computer Science and Technology, Nanjing University, P.R. China
State Key Laboratory of Novel Software Technology, Nanjing University, P.R. China

Khalid Hussain
School of Business, East China University of Science and Technology, Shanghai, P.R. China
Department of Management Sciences, COMSATS University Islamabad, Sahiwal Campus, Pakistan

Khurshid Ahmed
School of Information Management, Nanjing University, Nanjing, P.R. China
Department of Library and Information Science, The Islamia University of Bahawalpur, Pakistan

Abstract
Modern internet technologies have revolutionized traditional education by providing flexible and resourceful e-learning opportunities across disciplines. Programming is an integral part of the undergraduate computer science curriculum, and an adequate level of programming expertise is expected of graduates. In this paper, we explore and examine the key factors that contribute to developing programming skills among undergraduate students in e-learning. We propose that programming education follows the Technology Acceptance Model (TAM), which affects students' attitudes toward learning. We extend the TAM by integrating the factors of teaching practices, intrinsic factors, perceived usefulness, and efficacy problems with learning intentions in our research framework.
This research draws on the responses of 460 final-year students studying for a Bachelor of Computer Science or Software Engineering degree at an e-learning institution. Structural Equation Modelling (SEM) and Confirmatory Factor Analysis (CFA) were employed to evaluate the relationships between the factors of the model. The results demonstrate that teaching practices, intrinsic factors, and perceived usefulness play a key role in fostering students' learning intentions. Further analysis reveals that learning intentions positively influence programming expertise, whereas efficacy problems have an adverse impact. The results indicate that perceived usefulness, teaching practices, and intrinsic factors develop adequate learning intentions in students, which overcome the efficacy problems and lead to better programming expertise. This research provides critical implications for policymakers seeking to implement computer science programs effectively in an e-learning paradigm.

Keywords: e-learning, programming, web, expertise, teaching, programming education, barriers in programming

Rafique, W., Dou, W., Hussain, K., & Ahmed, K. (2020). Factors influencing programming
expertise in a web-based e-learning paradigm. Online Learning, 24(1), 158-177.
https://doi.org/10.24059/olj.v24i1.1956
E-learning has been extensively introduced in higher educational institutions due to the rapid development of information and communication infrastructure (Hung & Chou, 2015). Many universities have started to offer online degree courses in addition to traditional study programs. Online services provide immense opportunities for the effective implementation of e-learning: they offer e-learning providers the flexibility to host their Learning Management Systems (LMS) online, and they enable students to access course material independent of location and time constraints (Jose & Christopher, 2018). Additionally, web technologies can support big data and multimedia streams, freeing e-learning providers from bandwidth limitations, computation resources, storage concerns, and many other issues. Figure 1 depicts the architecture of a web-based e-learning education system. It illustrates that the e-learning stakeholders interact with a web-based e-learning management system equipped with virtual machines and physical hardware. Students, tutors, and administrators interact with the LMS through the interface provided by the service providers. The LMS stands at the core of e-learning, as it mediates all educational interactions with students.
Owing to the wide proliferation of web technologies, a large number of free online courses are available, hosted by YouTube, Coursera, Udemy, edX, and many other providers. E-learning courses are now offered in almost every field, including history, the social sciences, the natural sciences, engineering, and medicine. Computer science has been one of the most popular disciplines because of an ever-increasing demand for IT professionals. Programming is a fundamental aspect of computer science programs, and most universities worldwide begin the computer science curriculum with programming courses (Raigoza, 2017).
An adequate level of programming expertise is expected of computer science graduates. Programming education is demanding because it involves logical reasoning, mathematical skills, and extensive domain knowledge; moreover, it becomes even more challenging in e-learning (Lam, Chan, Lee, & Yu, 2008). It requires extensive effort from students, who must work out complex program logic and develop a procedural algorithm before writing the code for the underlying problem. Because of these complexities, high dropout ratios have been observed in computer science degree programs (Sarpong, Arthur, & Amoako, 2013). Similarly, many students complete their undergraduate studies in computer science with a sufficient amount of general programming knowledge but without the specific skills needed to develop high-quality computer applications (Kelleher & Pausch, 2005).
Figure 1. A web-based e-learning system model.

In e-learning, students become frustrated when they try to execute their code without a correct understanding of the problem, forcing the entire program to run without modular development and testing (Raigoza, 2017). Programming education in e-learning therefore becomes a critical challenge. Previous studies in this context mainly focus on the motivational factors that affect programming education for computer science students (Law, Lee, & Yu, 2010). Although a wide range of research is available on the problems faced by e-learning students in general, there is a lack of literature identifying the issues specific to programming education, which remains a noticeable challenge for the practical realization of e-learning (Liaw, 2008). To address these challenges, we designed this study to identify and analyze the fundamental factors involved in the development of programming skills in e-learning. We perform the analysis from the perspective of both the student and the tutoring environment. A holistic approach is taken to the programming education ecosystem to identify the core factors that affect programming skills development in e-learning, and the interdependence of these factors is analyzed. This research can help policymakers and administrators effectively develop, deliver, and manage e-learning to address the problem of programming education. The main contributions of this study are as follows:
• We perform an extensive literature study and identify Teaching Practices (TP), Intrinsic
Factors (IF), Perceived Usefulness (PU), Efficacy Problems (EP), and Learning Intentions
(LI) as key factors contributing toward programming expertise development in e-learning.
• A research model has been proposed by extending the Technology Acceptance Model (TAM), which measures the influence of these factors on overall programming skills development.
• A set of detailed validation and evaluation experiments has been performed on the survey data to establish its validity. Moreover, Structural Equation Modelling (SEM) and Confirmatory Factor Analysis (CFA) have been employed to evaluate the effectiveness of the proposed research model.
Review of Related Literature
Computer science constitutes one of the most important degree programs offered by e-learning institutions due to an ever-increasing demand for IT professionals. Programming is the fundamental skill expected of computer science graduates (Lam et al., 2008). It requires analytical and problem-solving skills from the learner, involving the description of processes and procedures, the development of algorithms, and their implementation in the desired programming language (Law et al., 2010). E-learning students lose interest and face problems during coding tasks; therefore, there is a strong need to improve programming education in e-learning. We perform an extensive literature survey and find that five factors greatly affect programming expertise in e-learning: TP, IF, EP, PU, and LI. We devise a research model by extending TAM and explain how these factors influence programming expertise in e-learning.
Nganji (2018) reveals that education providers must focus on learners to increase their participation in the learning process, which can improve their knowledge and skills. The medium of instruction therefore plays an important role in the learning process. The communication strategy (e.g., synchronous or asynchronous) strongly influences students' understanding of the subject. In synchronous communication, direct interaction between teachers and students provides the basis for the academic discussions that help students assimilate the course content. In e-learning, by contrast, asynchronous communication is typically adopted, with discussion forums and emails used for student-teacher interaction. Offir, Lev, and Bezalel (2008) demonstrate that asynchronous communication has an adverse impact on students' performance; they propose that asynchronous communication does not produce a student-teacher dialogue, depriving students of the chance to ask questions. Boelens, De Wever, and Voet (2017) devise a strategy of arranging face-to-face meetings at the start of the course so that students are introduced to their mentors and classmates; this introduction gives e-learning students a sense of community later in the course. The impact of student-teacher interaction on the final year project has been discussed by Dos Santos and Cechinel (2018), whose findings reveal that face-to-face meetings yield positive results. Hence, student-teacher interaction is at the core of effective e-learning.
Programming is challenging in that it requires both a theoretical understanding of the concepts and hands-on experience in specific programming languages (Lam et al., 2008). Numerous techniques have been devised to ease the complexity of the programming environment, including pair programming, shared code, and tools that facilitate the debugging process. Sarpong et al. (2013) suggest that extensive lab work on programming tasks under the guidance of a tutor helps students master programming skills and reduces dropout rates. Celepkolu and Boyer (2018) discuss the importance of a shared coding system in a hybrid pair programming environment for overcoming the common mistakes made by students. Zin, Idris, and Subramaniam (2006) introduce a virtual pair programming solution in which one student writes the code while the other proofreads simultaneously; however, this technique consumes a great deal of time and requires constant interaction between the students, which is sometimes difficult in e-learning. Lam et al. (2008) propose an automatic debugging assistant to stand in for a mentor providing hints on mistakes during the coding process: students submit their code to a debugger, which identifies common errors and offers suggestions for improvement. However, this tool only works for smaller programming tasks.
One of the primary media of communication in e-learning is the LMS, which facilitates student-teacher interaction and enables effective follow-up of course activities.
Yunkul and Cankaya (2017) highlight the importance of the Edmodo LMS, which provides a secure environment for student-teacher interaction and feedback and incorporates a social media platform that creates a social environment among students. Ateş Çobanoğlu (2018) explores the preferences of students in an information technology course for learning in a blended environment involving both traditional education and e-learning; the results show that using an LMS for blended learning increased students' performance. Similarly, the perceived usefulness of the underlying e-learning course is also an important factor in acquiring programming skills: B.-C. Lee, Yoon, and Lee (2009) suggest that students who perceive the e-learning course as beneficial to their future try to perform better during their studies.
Different solutions have been proposed for the problems faced by students during programming. However, there is still a lack of research ascertaining the factors involved in programming education. The tools proposed in the literature address specific aspects of programming education, but a holistic view of the programming education ecosystem is still missing. Therefore, we provide an empirical study and design a research framework for identifying and analyzing the factors that affect programming expertise in e-learning.
E-learning Research Framework
We present a research framework based on an extensive literature review, identifying TP, IF, EP, PU, and LI as the key factors involved in programming skills development in e-learning (Martín-Rodríguez, Fernández-Molina, Montero-Alonso, & González-Gómez, 2015). E-learning acceptance has been shown to follow TAM (Venkatesh & Davis, 2000) across different countries, and a thorough review of the literature establishes that TAM captures the key factors for students' effective acceptance of e-learning. The TAM framework is flexible enough to accommodate additional factors (variables) depending on the context (Pituch & Lee, 2006). TAM predicts an individual's attitude toward using ICT and has a long track record in the field of e-learning (Alharbi & Drew, 2014); Wingo, Ivankova, and Moss (2017), for example, applied TAM to teachers' perceptions of teaching online. The EP factor in our research framework, which moderates programming expertise, is strongly associated with TAM's Perceived Ease of Use (PEU) factor, since problems with PEU can negatively affect intentions to use. The intention factor in TAM corresponds to LI in our framework, which reflects the motivation to learn programming. Finally, usage behavior in TAM maps onto programming expertise in our framework, corresponding to the effectiveness of adapting to programming education in e-learning. A critical review of research in the field demonstrates that TAM has not yet been applied to the e-learning of computer science. Therefore, we utilize PU and IF to predict behavioral intentions toward programming education; moreover, we add TP as a further antecedent of those intentions, in line with the TAM framework. Furthermore, we add computer self-efficacy in performing programming tasks to capture its impact on students' programming expertise. For these reasons, we adopt TAM to ascertain the impact of students' intentions toward the continued use of e-learning on programming expertise development. Figure 2 illustrates the configuration of all the factors in the framework. We discuss the development of the research hypotheses below.
3.1 Learning Intentions (LI)
LI can be defined as the extent to which continuous effort is directed toward achieving a specific goal (here, learning how to code effectively). Programming requires constant effort, and LI is reflected in students' disposition to learn and practice programming tasks (Xia & Liitiäinen, 2017). LI depends on the effectiveness of TP, the PU of the studies, and the IF for studying in an e-learning course (Pugh, 2019). As discussed in the literature review, TP greatly enhance students' LI, so students perform well in understanding concepts through TP. These motives serve as an impetus for learning plans that support students in knowledge acquisition and academic success (Linnenbrink & Pintrich, 2002). The quality of the education system plays an essential role in determining students' success; moreover, the mentoring process and support increase students' LI (Dorner & Kárpáti, 2010). Hence, as this study proposes that LI positively impact programming expertise in e-learning, we formulate hypothesis H3.
• H3: LI positively impact the programming expertise of students in an e-learning system.
3.2 Intrinsic Factors (IF)
IF constitute a student's motivation toward the learning process and concern the individual's personal rather than environmental circumstances (Hendijani, Bischak, Arvai, & Dugar, 2016). Students need to be motivated in e-learning because it is easy to lose the habit of self-evaluation in a state of isolation (Galusha, 1998). Khan and Nawaz (2013) argue that when intrinsic motivations are high, learning outcomes are positive, which demonstrates that IF play a positive role in developing LI. Most IF in higher education involve students' satisfaction with their current studies and their interest in the current course (Bouhnik & Marcus, 2006). When students are satisfied with their studies, they learn effectively, which has a positive impact on their overall skills development in e-learning (Eom, Wen, & Ashill, 2006). Considering the above discussion, we hypothesize that IF are positively related to LI, and we devise hypotheses H1 and H6.

Figure 2. E-learning research model indicating all the hypotheses of the research.

• H1: IF positively affect students' intentions toward programming education.
• H6: The impact of IF positively carries over to programming expertise via the indirect path of LI.
3.3 Teaching Practices (TP)
TP constitute the teaching methodologies adopted in e-learning (Martin, Budhrani, Kumar, & Ritzhaupt, 2019). In e-learning, TP involve the course content delivery methods, including video lectures, handouts, and LMS discussions for mentoring (Valentine, 2002). Students perceive programming as effective and enjoyable in the presence of a mentor providing hints on their mistakes; however, it is difficult to provide such mentors in e-learning. Offir et al. (2008) posit that direct interaction between students and teachers positively affects the learning process, helping students discuss their problems and get immediate feedback from their mentors. TP greatly affect the programming expertise of students because a positive correlation exists between a learner's attitude and the mentoring process (Dahalan, Hassan, & Atan, 2012). When students maintain constant interaction with their tutor, they grasp more concepts and discuss their issues with their mentor, which has a positive impact on their learning process (Berry, 2019). This study categorizes TP as one of the key factors influencing programming expertise and measures the effect of TP on students' LI. Hence, we hypothesize the following.
• H2: TP in e-learning positively influence students' LI.
• H7: The impact of TP positively carries over to programming expertise via the indirect path of LI.
3.4 Efficacy Problems (EP)
Efficacy is the self-belief that one can execute a course of action to attain a desired learning outcome in the e-learning system; factors that negatively impact efficacy are denoted as EP. It is natural for students to face problems during programming tasks, and immediate support can help them get out of the programming complications into which they tend to sink. Often, students lose interest in programming when they practice by themselves and experience failures in the learning process. Jenkins (2001) reveals that special mentoring arrangements are required to teach programming in e-learning and enhance efficacy. Allen, Cartwright, and Stoler (2002) suggest that it is difficult for some beginners to start programming, and that automated Integrated Development Environments (IDEs) can be used to assist them in writing code. The PASS program submission system was developed to help beginners learn programming, incorporating an easy-to-use IDE to assist students in programming education (Law et al., 2010; Yu, Poon, & Choy, 2006). Isolation also contributes to EP: students feel isolated owing to the unavailability of face-to-face interaction with their peers, which hampers their learning process. Taking these arguments into consideration, we hypothesize that EP negatively affect programming expertise and moderate the influence of LI on programming expertise. Hypotheses H4 and H9 follow from this discussion.
• H4: EP adversely impact programming expertise.
• H9: EP moderate the impact of LI on programming expertise.
3.5 Perceived Usefulness (PU)
PU is the extent to which e-learning students find their course beneficial (B.-C. Lee et al., 2009). PU has been widely used to predict the adoption of e-learning (Tarhini, Hone, & Liu, 2014; Y.-H. Lee, Hsiao, & Purnomo, 2014). It is also pertinent to note that PU has a positive influence on developing students' behavioral intentions (Jan & Contreras, 2011). When students find that their studies will benefit them in the future, they put in more effort; hence, their LI increase (Cheng, 2011). Keeping this in mind, we hypothesize the following.
• H5: PU has a positive impact on developing LI in programming education.
• H8: The impact of PU positively carries over to programming expertise in e-learning via the indirect path of LI.

Methods
Figure 3 describes the workflow of the current research, which comprises an extensive background study, factor identification, questionnaire development, data collection, validation, and finally the results and discussion. The participants in this study are students of the Virtual University of Pakistan (VU), which employs a web-based LMS to facilitate e-learning. The LMS enables students to submit assignments, check results, follow the class schedule, participate in discussions, and contact the course tutor. Video lectures are recorded and delivered to the students, who attend them according to the schedule provided on the LMS. Every tutor regularly creates a discussion topic for each lecture, where students discuss their issues and problems via text messages. We employed a random sampling procedure to select participants. The original sampling frame consisted of 550 students majoring in different fields of computer science. We used Google Forms to conduct the survey, which made it easy to import the data directly into the analysis tools. We distributed the online survey by email and sent reminders twice a month until participants had completed the survey. Out of the total sampling frame, we collected responses from 460 respondents (response rate = 83%). We used a five-point Likert scale comprising strongly agree, agree, neutral, disagree, and strongly disagree for all items except items 3, 7, 8, and 10, which were instead measured on a five-point scale of strongly satisfied, satisfied, neutral, somewhat satisfied, and not satisfied.

Figure 3. The workflow of the research.
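As an illustration of this coding step, the sketch below shows how such Likert responses might be mapped to numeric scores in Python with pandas. It is not the authors' code; the column names and raw response strings are illustrative assumptions.

```python
# A minimal sketch (not the authors' code) of numerically coding the two
# five-point scales described above. Column names and response strings
# are hypothetical.
import pandas as pd

agreement = {"Strongly disagree": 1, "Disagree": 2, "Neutral": 3,
             "Agree": 4, "Strongly agree": 5}
satisfaction = {"Not satisfied": 1, "Somewhat satisfied": 2, "Neutral": 3,
                "Satisfied": 4, "Strongly satisfied": 5}

# Hypothetical Google Forms export: one column per questionnaire item.
df = pd.DataFrame({
    "item_1": ["Strongly agree", "Neutral", "Disagree"],
    "item_3": ["Strongly satisfied", "Somewhat satisfied", "Not satisfied"],
})

satisfaction_items = {"item_3", "item_7", "item_8", "item_10"}
for col in df.columns:
    mapping = satisfaction if col in satisfaction_items else agreement
    df[col] = df[col].map(mapping)  # responses become integers 1-5

print(df)
```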

The data analysis was performed in two phases. In the first phase, the demographics were summarized and the reliability of the measurement model was examined. In the second phase, hypothesis testing, CFA, and SEM were conducted. The research group consisted of students in the final year of a Bachelor of Computer Science (n = 309) or Software Engineering (n = 151) degree. Final-year students develop independent projects that require extensive programming skills; we selected them because they have extensive experience of studying in the e-learning system and their programming skills should have been actively developed to accomplish their final year project. The responses include students' demographic information and items measuring TP, IF, PU, EP, LI, and programming expertise. Table 1 shows the demographic information of the students involved in the study.
Table 1.
Demographic Information of the Participants
Variable             Value
Age range (years)    20–43
Gender               Male 47.8%; Female 52.2%
Major                Computer Science 67.2%; Software Engineering 32.8%

Figure 4. (a) Values of KMO and Cronbach's Alpha; (b) factor loadings for all variables. The dotted line indicates the acceptable threshold in both panels.

4.1 Model Validation
In this section, we present the experimental evaluation of the proposed research. We analyze the data with the Statistical Package for the Social Sciences (SPSS, version 22.0) and Mplus version 8.1 to perform Exploratory Factor Analysis (EFA) and compute reliability coefficients (Tenenhaus, Vinzi, Chatelin, & Lauro, 2005; Chin, 1998). We first verify the reliability and validity of the data before testing the proposed hypotheses. All the measures in this research were newly developed; therefore, their adequacy was established by performing EFA. The results of the EFA are presented in Table 2, which indicates that all the measurement scales meet the recommended standards. According to Hair, Black, Babin, Anderson, and Tatham (2010), the data must be checked with the Kaiser-Meyer-Olkin (KMO) measure and Bartlett's test of sphericity before proceeding to EFA: KMO values should exceed 0.7, and Bartlett's test should be significant. Figure 4 presents the validation of the dataset; Figure 4a shows that the values of Cronbach's Alpha and KMO are greater than 0.7, and Figure 4b shows that the factor loadings are greater than 0.5, which verifies the suitability of the data for the current research. The result of Bartlett's test is also significant, which shows the adequacy of the data for EFA. We extract factors using Principal Component Analysis (PCA), a reliable extraction technique for EFA. The results in Table 2 demonstrate that each measurement scale loaded on a single factor and that the eigenvalue of the first factor exceeds 1.0 in every case; hence, we retain one factor per measurement scale. Moreover, Hair et al. (2010) and Sekaran and Bougie (2011) state that the factor loadings of the individual items of each measurement scale should be higher than 0.5 to meet the criterion of convergent validity; correspondingly, all items loaded successfully on their respective factors.
The PU, LI, and EP items are adapted from Davis and Venkatesh (1996), the IF items from Eom et al. (2006), the TP items from Hung and Chou (2015), and the programming expertise items from Kelleher and Pausch (2005). The factor loadings of all items on these measurement scales were higher than 0.5, which shows adequate convergence of the measures. The reliability of the data was measured with Cronbach's Alpha, the most common test of internal consistency. Finally, Figure 4a illustrates that Cronbach's Alpha for all the measurement scales is higher than 0.7, which shows that the measurement scales are sufficiently reliable for further analysis (Hair et al., 2010; Sekaran & Bougie, 2011).

Table 2.
Statistics of the EFA on the Dataset (factor loadings in parentheses)

Intrinsic Factors (KMO = 0.887; eigenvalue = 3.521)
1. Did you enroll in an e-learning institution because of its flexible accessibility? (0.855)
2. Would you like to take another higher degree course in computer sciences at an e-learning institution? (0.852)
3. How do you rate your satisfaction level with the e-learning studies? (0.848)
4. Did you follow up the course material alongside your other schedules regularly? (0.844)
5. Did you find the computer science course fruitful for your career before enrolling? (0.796)
6. Do you wish to have step-by-step guidelines for the complex coding tasks? (0.512)

Teaching Practices (KMO = 0.759; eigenvalue = 2.480)
7. How do you rate the content quality of the course material? (0.686)
8. How do you rate the lecture delivery of the tutors during the lectures? (0.732)
9. Do you like to have face-to-face conversation with the tutor during coding tasks? (0.767)
10. How effective was the instructor's response on the LMS when you interacted with them? (0.789)

Learning Intentions (KMO = 0.701; eigenvalue = 2.144)
11. Are you intending to join, or already working in, a software development company? (0.710)
12. Do you try to start coding and fail? (0.675)
13. Are you motivated toward learning programming? (0.793)
14. Do you explore online tutorials other than course material for programming help? (0.745)

Programming Expertise (KMO = 0.793; eigenvalue = 2.553)
15. Do you have sound knowledge of basic programming skills? (0.827)
16. Have you developed your final year project by yourself? (0.765)
17. Do you have excellent skills in a reputed programming language, e.g., .Net, Java, Python? (0.847)
18. Did you complete your programming assignments by yourself during your studies? (0.752)

Efficacy Problems (KMO = 0.721; eigenvalue = 2.291)
19. Do you think that if you get stuck on a programming task you are unable to get out of it because of a lack of support? (0.851)
20. Do you think that it is difficult to access the right programming help on the internet? (0.876)
21. Do you think integrated software development tools (IDEs) are complex? (0.893)

Perceived Usefulness (KMO = 0.792; eigenvalue = 2.312)
22. Do you think that this e-learning course will earn you a good job? (0.751)
23. Do you think that you will find a good career after getting the degree? (0.862)
24. Do you think that career-oriented learning is the need of the current rapid development environment? (0.789)
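To make the reported EFA pipeline concrete, the following is a minimal sketch in Python of the same checks (KMO, Bartlett's test of sphericity, a single-factor PCA-style extraction, and Cronbach's Alpha), using the open-source factor_analyzer package rather than SPSS. The DataFrame `items` (one column per question of a single scale) is an assumed input for illustration.

```python
# A minimal sketch of the pre-EFA checks and reliability statistics
# reported above; `items` is an assumed DataFrame of numeric responses.
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import (
    calculate_bartlett_sphericity, calculate_kmo)

def cronbach_alpha(items: pd.DataFrame) -> float:
    # Classic alpha: k/(k-1) * (1 - sum of item variances / total variance).
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

def efa_checks(items: pd.DataFrame) -> dict:
    chi2, p = calculate_bartlett_sphericity(items)  # should be significant
    _, kmo_total = calculate_kmo(items)             # should exceed 0.7
    fa = FactorAnalyzer(n_factors=1, rotation=None, method="principal")
    fa.fit(items)                                   # principal-components extraction
    return {
        "bartlett_p": p,
        "kmo": kmo_total,
        "first_eigenvalue": fa.get_eigenvalues()[0][0],  # should exceed 1.0
        "loadings": fa.loadings_.ravel(),                # should exceed 0.5
        "cronbach_alpha": cronbach_alpha(items),         # should exceed 0.7
    }
```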

4.2 Model Fitness
The hypothesized paths were tested with the SEM technique using Mplus version 8.1. Before the SEM, we validated the data with confirmatory factor analysis (CFA). The model fitness indices for both CFA and SEM demonstrate an acceptable fit of the data to the proposed model. Five fit indices were utilized: the Comparative Fit Index (CFI), the Tucker-Lewis Index (TLI), the Root Mean Square Error of Approximation (RMSEA), the Standardized Root Mean Square Residual (SRMR), and the chi-square to degrees of freedom ratio (χ²/d.f.), as presented in Table 3. Guidelines suggest that CFI and TLI should be near 0.95 for a good fit and that RMSEA and SRMR should be less than 0.10 for an adequate fit (Hu & Bentler, 1999); additionally, the chi-square to degrees of freedom ratio should be less than 3.0 for an acceptable fit (Amemiya & Anderson, 1990). Table 3 presents the CFA and SEM indices: the values of CFI and TLI for the structural model are 0.942 and 0.931, respectively, in line with the thresholds described by Hu and Bentler (1999) for an acceptable fit, and the values of SRMR and RMSEA also satisfy the Hu and Bentler (1999) criteria. Similarly, the chi-square to degrees of freedom ratio is 1.62, which meets the criterion proposed by Amemiya and Anderson (1990).

Table 3.
Model Fitness Indices Used in the Research
Fitness Index     CFA      SEM
CFI               0.966    0.942
TLI               0.958    0.931
RMSEA             0.045    0.056
SRMR              0.048    0.058
χ²/d.f.           1.40     1.62
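As a worked illustration of these cut-offs (not part of the original analysis), a small Python helper can check a set of fit indices against the criteria cited above; the 0.90 lower bound for CFI/TLI used here is a common convention for acceptable fit, with values near 0.95 indicating good fit.

```python
# Illustrative check of the fit criteria cited in the text.
def acceptable_fit(cfi, tli, rmsea, srmr, chi2_df):
    return {
        "CFI >= 0.90": cfi >= 0.90,
        "TLI >= 0.90": tli >= 0.90,
        "RMSEA < 0.10": rmsea < 0.10,    # Hu & Bentler (1999)
        "SRMR < 0.10": srmr < 0.10,      # Hu & Bentler (1999)
        "chi2/df < 3.0": chi2_df < 3.0,  # Amemiya & Anderson (1990)
    }

# The SEM values from Table 3 satisfy every criterion:
print(acceptable_fit(cfi=0.942, tli=0.931, rmsea=0.056, srmr=0.058,
                     chi2_df=1.62))
```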
To fulfill the requirements of validity and reliability, we carried out CFA on the collected data before proceeding to the structural analysis. The model fit indices presented in Table 3 indicate that the model fits the data well. The factor loadings of the individual question indicators of all the variables, shown in Table 4, are higher than 0.5, which satisfies the requirements of convergent validity (Hair et al., 2010). We deleted two question items, one from the teaching practices construct and one from the learning intentions scale, because their factor loadings fell below the 0.5 cut-off; we carried out further analysis on the remaining items. The Cronbach's Alpha reliability test indicates that all the measures used in this study are reliable, as the values are higher than 0.70 for all the constructs (Hair et al., 2010).

Table 4.
Confirmatory Factor Analysis
Construct               No. of Items   Factor Loading (Range)   Cronbach's Alpha
Intrinsic Factors       5              0.740 – 0.813            0.894
Teaching Practices      4              0.559 – 0.743            0.741
Learning Intentions     3              0.504 – 0.649            0.705
Programming Expertise   4              0.648 – 0.801            0.810
Efficacy Problems       3              0.735 – 0.880            0.845
Perceived Usefulness    3              0.613 – 0.793            0.791
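For readers without Mplus, the sketch below shows how a comparable CFA-plus-structural model could be specified in Python with the open-source semopy package. This is an assumed re-expression of the model, not the authors' script: the item column names (if1 ... pu3) and the input file are hypothetical, with item counts per construct following Table 4.

```python
# A minimal, assumed semopy re-expression of the hypothesized model.
import pandas as pd
import semopy

MODEL_DESC = """
# Measurement model (latent =~ observed indicators)
IF =~ if1 + if2 + if3 + if4 + if5
TP =~ tp1 + tp2 + tp3 + tp4
LI =~ li1 + li2 + li3
PE =~ pe1 + pe2 + pe3 + pe4
EP =~ ep1 + ep2 + ep3
PU =~ pu1 + pu2 + pu3
# Structural paths: H1, H2, H5 predict LI; H3 and H4 predict PE
LI ~ IF + TP + PU
PE ~ LI + EP
"""

df = pd.read_csv("responses.csv")  # hypothetical file of numeric item scores
model = semopy.Model(MODEL_DESC)
model.fit(df)
print(model.inspect())             # path estimates with standard errors
print(semopy.calc_stats(model))    # CFI, TLI, RMSEA, chi-square, etc.
```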

Results
The results of the structural model are presented in Table 5, which shows the hypothesized paths and their respective coefficients. H1 explores the relationship between IF and LI, where the items measuring IF include students' intentions in joining the e-learning program, their follow-up of the course, and their intrinsic desire to learn the course content. The results significantly support H1, affirming that IF have a positive impact on LI (β = 0.743, p = 0.000), as can also be observed in Figure 5, which shows the path performance of all the hypotheses. A plausible reason is that when students join the programming course out of interest, they follow up the course regularly and are satisfied with their studies, so their learning motivation grows. In the same context, H6 suggested that the impact of IF on LI carries over positively to programming expertise. This hypothesis was also supported by the results (β = 0.703, p = 0.001), implying that students with a high intrinsic motivation for joining computer science degree programs will be able to gain good programming expertise.
H2 proposes an affirmative impact of TP on LI, which was significantly supported in the evaluation (β = 0.564, p = 0.000). The items measuring TP include help with programming tasks, course content quality, lecture delivery, and effective student-teacher interaction. Similarly, H7 proposes that the impact of TP on LI carries over positively to improved programming expertise (indirect effect; β = 0.937, p = 0.002). Our results provide evidence that TP are among the most critical factors in determining students' LI, which further contributes to programming expertise: effective TP help develop intentions to learn, which in turn help students grasp programming knowledge effectively.
Figure 5. Path coefficients of the research framework.

Moreover, H3 proposes that LI significantly affect programming expertise, which is also supported by the results (β = 0.771, p = 0.000). However, EP hinder students' ability to learn programming by undermining their learning potential; H4 was likewise supported in the evaluation (β = -0.176, p = 0.017), with an increase in EP resulting in lower programming performance. A plausible reason is students' inability to select relevant information from the internet: given the heap of information available online, extracting timely, relevant information is a challenging task. The lack of real-time feedback is another critical problem in programming. Sometimes students find themselves plunged into a programming problem where some support could help them out; however, students lack this support in e-learning. Hence, real-time feedback and support should be provided for effective learning. Recent developments in interactive programming languages and tools can be used to assist learners in writing and compiling their code.
Even students who show high LI may face a lack of real-time support, problems with the complex interfaces of IDEs, and difficulty selecting correct information on the internet. These factors were expected to weaken the relationship between LI and programming expertise: although e-learning provides immense opportunities for students, these opportunities may not yield the desired results. We therefore postulated that students' EP might moderate the relationship between LI and programming expertise (H9); however, we did not find significant results for this hypothesis (β = -0.599, p = 0.139). The reason may be that students' LI have such a powerful impact on their programming expertise that the moderating effect of EP becomes insignificant. Even though EP adversely influence students' programming expertise, most students successfully achieve a programming education and join the ever-increasing pool of programming experts. The reason is that students are motivated enough to overcome EP and continue learning programming until they achieve a specific level of expertise, even if it takes them more time and energy.
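For clarity about what the H9 interaction term tests, below is an illustrative sketch of a common regression-based moderation check on composite scores. It is an assumption about the general technique, not necessarily the authors' latent-interaction procedure in Mplus; `df` is assumed to hold composite LI, EP, and PE scores.

```python
# Illustrative moderation (interaction) test on composite scores.
import pandas as pd
import statsmodels.formula.api as smf

def moderation_test(df: pd.DataFrame):
    d = df.copy()
    d["LI_c"] = d["LI"] - d["LI"].mean()  # mean-centering reduces collinearity
    d["EP_c"] = d["EP"] - d["EP"].mean()
    d["LIxEP"] = d["LI_c"] * d["EP_c"]    # the interaction (moderation) term
    model = smf.ols("PE ~ LI_c + EP_c + LIxEP", data=d).fit()
    return model.summary()                # H9 hinges on the LIxEP coefficient
```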
H5 implies that PU positively influences LI, which is also supported by the results (β = 0.762, p = 0.000). Similarly, H8 suggests that the impact of PU on LI carries over positively to programming expertise; this hypothesis was also supported in the evaluation (β = 0.821, p = 0.001). This corresponds to the fact that students who consider e-learning beneficial to them perform better in their academic studies and will be able to learn more programming skills.
The above empirical evidence leads us to deduce that, in e-learning, IF, TP, and PU are the key factors that serve as an impetus to foster students' LI, which in turn contributes to the development of their programming expertise. Mentoring and real-time support will help students escape the isolation they endure during e-learning and assist them in engaging with programming tasks. By applying this empirical research, human development organizations, governments, and the education sector can use web-based e-learning to generate an exceptional pool of talented individuals who can serve effectively to alleviate poverty and social imbalance; moreover, they can help fulfill the need for human resources in the software sector.

Table 5.
Standardized Coefficients of the Structural Model
Hypothesis   Causal Path          β        SE      T-Value   Significance
Direct effects
H1           IF → LI              0.743    0.137    8.353    0.000
H2           TP → LI              0.564    0.141    4.000    0.000
H3           LI → PE              0.771    0.081    9.555    0.000
H4           EP → PE             -0.176    0.074   -2.285    0.017
H5           PU → LI              0.762    0.129    1.456    0.000
H9           LI × EP → PE        -0.599    0.068   -1.487    0.139
             (interaction term: EP expected to weaken the LI–PE relationship)
Indirect effects
H6           IF → LI → PE         0.703    0.171    4.312    0.001
H7           TP → LI → PE         0.937    0.309    3.036    0.002
H8           PU → LI → PE         0.821    0.297    0.292    0.001
LI (R² = 0.503); PE (R² = 0.579)

Conclusion
In this study, we identify and evaluate the factors that influence programming expertise in e-learning. We ascertain that teaching practices, intrinsic factors, perceived usefulness, efficacy problems, and learning intentions are the key factors in developing programming skills. A research model has been proposed by extending the technology acceptance model, integrating all the identified factors. Empirical evidence indicates that effective teaching practices, perceived usefulness, and the right intrinsic motivations are the basis for instigating the aspiration to learn programming. Students' efficacy problems undermine their ability to learn; however, they do not significantly moderate the effect of learning intentions on programming skills. At the institutional level, effective learning management systems should be provided that encompass features approximating face-to-face communication in e-learning. Moreover, effective student-teacher interaction needs to be established, as students need immediate help with programming problems. The availability of quick responses can be highly effective, as students sometimes plunge into problems and lose motivation; specifically, when students confront complex programming tasks, they need spontaneous help to sustain their motivation for learning and to complete the tasks.
Step-by-step programming tutorials accompanying the formal lectures can also help students gain a basic understanding of the programming activities discussed in the lectures. Furthermore, mentoring and support staff can help students complete complex programming tasks; this helps students escape the problem of isolation and break out of a self-centered view of learning, broadening their knowledge horizons. Moreover, student support helps relieve the stress of learning everything by oneself. Therefore, an interactive teaching environment and immediate assistance can enhance students' learning intentions, which will overcome the efficacy problems and facilitate the attainment of the right programming expertise in a web-based e-learning environment.

Discussion
The experimental analysis presented in the results section demonstrates that TP play a pivotal role in instigating learning motivation for programming. These findings are aligned with the research of Dos Santos and Cechinel (2018), which supports the view that an effective teaching style yields better learning outcomes in students. Effective TP involve more interaction between students and teachers, which leads students to grasp the contents of the lectures adequately. The interactivity of the e-learning system can help compensate for the absence of face-to-face learning in the e-learning paradigm. This outcome is consistent with the findings of Pituch and Lee (2006) and Chen (2011), who reveal that the interactivity of the LMS and immediate responses motivate the learner in e-learning.
We propose that IF positively affect LI, where the essential elements of IF include the student's perception of and motivation toward learning. The empirical results of our study show that these factors have a positive impact on students' LI. This finding is in line with the results of Pugh (2019), who argues that student motivation is the key success factor in higher education; the same impact has been demonstrated by Venkatesh (2000), who states that IF positively affect learner motivation. However, in e-learning it is difficult to have interactive sessions with a mentor; thus, the LMS should be developed so that it provides a platform for face-to-face discussions with tutors alongside text discussions. In this regard, our study shows that IF also positively affect students' LI toward programming.
This study demonstrates that PU plays an important role in developing LI, which further contributes to programming expertise. The items in PU correspond to the fact that career-oriented professionals grasp more knowledge in e-learning. This finding is consistent with Nganji (2018) and Wingo et al. (2017), who argue that career-oriented learners use strategic methods to complete their tasks on time and perform well in examinations, and are thus able to obtain valuable learning outcomes. Similarly, the impact of PU is aligned with Park (2009) and Van Raaij and Schepers (2008), who demonstrate that success in e-learning depends on the usefulness of the e-learning system.
The result for H4 suggests that EP negatively affect programming expertise. Here, efficacy concerns the contextual problems faced by students during programming, including difficulty in using IDEs, obtaining online help, and lack of support. Although numerous interactive IDEs have been developed, students still face problems while using them. The research of Altınay (2017) demonstrates the effectiveness of peer learning for improving the online learning process: peer learning draws on the experience of other students, their motivation, and social interaction to help fellow students in e-learning. In this regard, feedback and support for e-learning students are necessary. Tsai (2013) conducted an empirical analysis of e-learning students and demonstrated that students who receive immediate feedback on their learning process perform better than other e-learning students. Tang, Tang, and Chiang (2014) demonstrate the positive impact of learning from online resources; moreover, they show that students keep visiting a website or blog if they get the help they need from it. In addition to student-teacher communication, student-student interaction should also be provided, because students can communicate more easily with their peers than with their teachers. Providing students with step-by-step solutions for programming tasks will increase their interest in those tasks, and online advising and mentoring services can help e-learning students discuss their problems. Finally, students need to be satisfied enough with their study program before joining a course in e-learning, and an interactive LMS and responsive teaching facilities should be provided, which can contribute substantially to programming skills development in e-learning.
Web technologies provide immense opportunities for students worldwide to take state-of-the-art courses through e-learning. For effective programming education, practitioners should provide more support to students through the LMS, which may include video conferencing services for real-time student-teacher interaction.

Acknowledgements
This work was supported in part by the National Key Research and Development Program
of China under Grant 2017YFB1400600; in part by the National Science Foundation of China
under Grant 61672276; and in part by the Collaborative Innovation Center of Novel Software
Technology and Industrialization, Nanjing University.

Author Notes
The corresponding author is Wanchun Dou, Department of Computer Science and Technology and State Key Laboratory of Novel Software Technology, Nanjing University, P.R. China; email: [email protected]
References

Alharbi, S., & Drew, S. (2014). Using the technology acceptance model in understanding
academics’ behavioural intention to use learning management systems. International
Journal of Advanced Computer Science and Applications, 5(1), 143-155.
Allen, E., Cartwright, R., & Stoler, B. (2002). DrJava: A lightweight pedagogic environment for
Java. In Proceedings of the 33rd SIGCSE Technical Symposium on Computer Science
Education (pp. 137-141). Cincinnati, OH: ACM.
Altınay, Z. (2017). Evaluating peer learning and assessment in online collaborative learning
environments. Behaviour & Information Technology, 36(3), 312-320. doi:
10.1080/0144929X.2016.1232752.
Amemiya, Y., & Anderson, T. W. (1990). Asymptotic chi-square tests for a large class of factor
analysis models. The Annals of Statistics, 18(3), 1453-1463.
Ateş Çobanoğlu, A. (2018). Student teachers’ satisfaction for blended learning via Edmodo
learning management system. Behaviour & Information Technology, 37(2), 133-144. doi:
10.1080/0144929X.2017.1417481.
Berry, S. (2019). Teaching to connect: Community-building strategies for the virtual classroom.
Online Learning, 23(1), 164-183. doi: 10.24059/olj.v23i1.1425.
Boelens, R., De Wever, B., & Voet, M. (2017). Four key challenges to the design of blended
learning: A systematic literature review. Educational Research Review, 22, 1-18.
Bouhnik, D., & Marcus, T. (2006). Interaction in distance‐learning courses. Journal of the
Association for Information Science and Technology, 57(3), 299-305.
Celepkolu, M., & Boyer, K. E. (2018). The importance of producing shared code through pair
programming. In Proceedings of the 49th ACM Technical Symposium on Computer
Science Education (pp. 765-770). Baltimore, Maryland: ACM.
Chen, J.-L. (2011). The effects of education compatibility and technological expectancy on e-
learning acceptance. Computers & Education, 57(2), 1501-1511.
Cheng, Y. M. (2011). Antecedents and consequences of e‐learning acceptance. Information
Systems Journal, 21(3), 269-299.
Chin, W. W. (1998). The partial least squares approach to structural equation modeling. Modern
Methods for Business Research, 295(2), 295-336.
Dahalan, N., Hassan, H., & Atan, H. (2012). Student engagement in online learning: Learners
attitude toward e-mentoring. Procedia-Social and Behavioral Sciences, 67, 464-475.
Davis, F. D., & Venkatesh, V. (1996). A critical assessment of potential measurement biases in
the technology acceptance model: three experiments. International Journal of Human-
computer Studies, 45(1), 19-45.
Dorner, H., & Kárpáti, A. (2010). Mentoring for innovation: Key factors affecting participant
satisfaction in the process of collaborative knowledge construction in teacher training.
Online Learning, 14(4), 63-77. doi: 10.24059/olj.v14i4.127.
dos Santos, H. L., & Cechinel, C. (2018). The final year project supervision in online distance
learning: assessing students and faculty perceptions about communication tools.
Behaviour & Information Technology, 38, 1-20. doi: 10.1080/0144929X.2018.1514423.
Eom, S. B., Wen, H. J., & Ashill, N. (2006). The determinants of students' perceived learning
outcomes and satisfaction in university online education: An empirical investigation.
Decision Sciences Journal of Innovative Education, 4(2), 215-235.
Galusha, J. M. (1998). Barriers to learning in distance education. University of Southern
Mississippi, USA. (ERIC No. ED416377), 1-23.
Hair, J., Black, W., Babin, B. J., Anderson, R., & Tatham, R. (2010). Multivariate data
analysis: A global perspective. London: Pearson Prentice Hall.
Hendijani, R., Bischak, D. P., Arvai, J., & Dugar, S. (2016). Intrinsic motivation, external
reward, and their effect on overall motivation and performance. Human Performance,
29(4), 251-274. doi: 10.1080/08959285.2016.1157595.
Hu, L., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis:
Conventional criteria versus new alternatives. Structural Equation Modeling: A
Multidisciplinary Journal, 6(1), 1-55.
Hung, M.-L., & Chou, C. (2015). Students' perceptions of instructors' roles in blended and online
learning environments: A comparative study. Computers & Education, 81, 315-325.
Jan, A. U., & Contreras, V. (2011). Technology acceptance model for the use of information
technology in universities. Computers in Human Behavior, 27(2), 845-851.
Jenkins, T. (2001). The motivation of students of programming. In Proceedings of the 6th
Annual Conference on Innovation and Technology in Computer Science Education (pp.
53-56). Canterbury, United Kingdom, ACM.
Jose, G. S. S., & Christopher, C. S. (2018). Secure cloud data storage approach in e-learning
systems. Cluster Computing, 1-6.
Kelleher, C., & Pausch, R. (2005). Lowering the barriers to programming: A taxonomy of
programming environments and languages for novice programmers. ACM Computing
Surveys (CSUR), 37(2), 83-137.
Khan, A. S., & Nawaz, A. (2013). Role of contextual factors in using e-learning systems for
higher education in developing countries. Journal of Educational Research and Studies,
1(4), 27-34.
Lam, M., Chan, E., Lee, V., & Yu, Y. (2008). Designing an automatic debugging assistant for
improving the learning of computer programming. In Proceedings of the 1st International
Conference on Hybrid Learning and Education (pp. 359-370). Hong Kong, China.
Law, K. M., Lee, V. C., & Yu, Y.-T. (2010). Learning motivation in e-learning facilitated
computer programming courses. Computers & Education, 55(1), 218-228.
Lee, B.-C., Yoon, J.-O., & Lee, I. (2009). Learners’ acceptance of e-learning in South Korea:
Theories and results. Computers & Education, 53(4), 1320-1329.

Online Learning Journal – Volume 24 Issue 1 – March 2020 5 175


Factors Influencing Programming Expertise in a Web-based E-learning Paradigm

Lee, Y.-H., Hsiao, C., & Purnomo, S. H. (2014). An empirical examination of individual and
system characteristics on enhancing e-learning acceptance. Australasian Journal of
Educational Technology, 30(5), 562-579.
Liaw, S.-S. (2008). Investigating students’ perceived satisfaction, behavioral intention, and
effectiveness of e-learning: A case study of the Blackboard system. Computers &
Education, 51(2), 864-873.
Linnenbrink, E. A., & Pintrich, P. R. (2002). Motivation as an enabler for academic success.
School Psychology Review, 31(3), 313-327.
Martín-Rodríguez, Ó., Fernández-Molina, J. C., Montero-Alonso, M. Á., & González-Gómez, F.
(2015). The main components of satisfaction with e-learning. Technology, Pedagogy and
Education, 24(2), 267-277. doi: 10.1080/1475939X.2014.888370.
Martin, F., Budhrani, K., Kumar, S., & Ritzhaupt, A. (2019). Award-winning faculty online
teaching practices: Roles and competencies. Online Learning, 23(1), 184-205. doi:
10.24059/olj.v23i1.1329.
Nganji, J. T. (2018). Towards learner-constructed e-learning environments for effective personal
learning experiences. Behaviour & Information Technology, 37(7), 647-657. doi:
10.1080/0144929X.2018.1470673.
Offir, B., Lev, Y., & Bezalel, R. (2008). Surface and deep learning processes in distance
education: Synchronous versus asynchronous systems. Computers & Education, 51(3),
1172-1183.
Park, S. Y. (2009). An analysis of the technology acceptance model in understanding university
students' behavioral intention to use e-learning. Educational Technology & Society,
12(3), 150-162.
Pituch, K. A., & Lee, Y.-k. (2006). The influence of system characteristics on e-learning use.
Computers & Education, 47(2), 222-244.
Pugh, C. (2019). Self-determination: Motivational profiles of bachelor’s degree seeking students
at an online, for-profit university. Online Learning, 23(1), 111-131. doi:
10.24059/olj.v23i1.1422.
Raigoza, J. (2017). A study of students' progress through introductory computer science
programming courses. Paper presented at the 2017 IEEE Frontiers in Education
Conference (FIE), (pp. 1-7). Indianpolis, Indiana: IEEE Education Society.
Sarpong, K. A.-M., Arthur, J. K., & Amoako, P. Y. O. (2013). Causes of failure of students in
computer programming courses: The teacher-learner perspective. International Journal
of Computer Applications, 77(12), 27-32.
Sekaran, U., & Bougie, R. (2011). Business research methods: A skill-building approach. New
York: McGraw-Hill.
Tang, J.-t. E., Tang, T.-I., & Chiang, C.-H. (2014). Blog learning: effects of users' usefulness and
efficiency toward continuance intention. Behaviour & Information Technology, 33(1),
36-50. doi: 10.1080/0144929X.2012.687772.

Online Learning Journal – Volume 24 Issue 1 – March 2020 5 176


Factors Influencing Programming Expertise in a Web-based E-learning Paradigm

Tarhini, A., Hone, K., & Liu, X. (2014). The effects of individual differences on e-learning
users’ behaviour in developing countries: A structural equation model. Computers in
Human Behavior, 41, 153-163.
Tenenhaus, M., Vinzi, V. E., Chatelin, Y.-M., & Lauro, C. (2005). PLS path modeling.
Computational Statistics & Data Analysis, 48(1), 159-205.
Tsai, C.-W. (2013). An effective online teaching method: The combination of collaborative
learning with initiation and self-regulation learning with feedback. Behaviour &
Information Technology, 32(7), 712-723. doi: 10.1080/0144929X.2012.667441.
Valentine, D. (2002). Distance learning: Promises, problems, and possibilities. Online Journal of
Distance Learning Administration 5(3), 1-11.
Van Raaij, E. M., & Schepers, J. J. (2008). The acceptance and use of a virtual learning
environment in China. Computers & Education, 50(3), 838-852.
Venkatesh, V. (2000). Determinants of perceived ease of use: Integrating control, intrinsic
motivation, and emotion into the technology acceptance model. Information Systems
Research, 11(4), 342-365.
Venkatesh, V., & Davis, F. D. (2000). A theoretical extension of the technology acceptance
model: Four longitudinal field studies. Management Science, 46(2), 186-204.
Wingo, N. P., Ivankova, N. V., & Moss, J. A. (2017). Faculty perceptions about teaching online:
Exploring the literature using the technology acceptance model as an organizing
framework. Online Learning, 21(1), 15-35.
Xia, B. S., & Liitiäinen, E. (2017). Student performance in computing education: An empirical
analysis of online learning in programming education environments. European Journal of
Engineering Education, 42(6), 1025-1037. doi: 10.1080/03043797.2016.1250066.
Yu, Y., Poon, C., & Choy, M. (2006). Experiences with PASS: Developing and using a
programming assignment assessment system. Paper presented at the Sixth International
Conference on Quality Software. Beijing: IEEE.
Yunkul, E., & Cankaya, S. (2017). Students' attitudes towards Edmodo, a social learning
network: A scale development study. Turkish Online Journal of Distance Education,
18(2), 16-29.
Zin, A. M., Idris, S., & Subramaniam, N. K. (2006). Improving learning of programming through
e-learning by using asynchronous virtual pair programming. Turkish Online Journal of
Distance Education, 7(3), 162-173.

Online Learning Journal – Volume 24 Issue 1 – March 2020 5 177


Purposeful Interpersonal Interaction in Online Learning: What is it and How is it Measured?

Purposeful Interpersonal Interaction in Online Learning: What is it and How is it Measured?
Scott Mehall
Bloomsburg University

Abstract
Despite extensive studies surrounding the topic of interaction in online learning, faculty are often still left attempting to replicate their face-to-face course interactions in the online environment. Interpersonal interaction is a necessary yet nebulous concept in online learning. This paper attempts to build a quality lens through which to view interpersonal interaction in online learning, called purposeful interpersonal interaction (PII), by exploring types of interpersonal interaction demonstrated in the literature to lead to better student outcomes. PII encompasses three main types
of interaction: purposeful interpersonal instructional interaction, purposeful social interaction, and
supportive interaction. These interaction types have been associated with important student
outcomes like perceived learning, satisfaction, and academic achievement. Roblyer and Wiencke’s (2003) rubric for assessing interactive qualities of distance courses (RAIQDC) includes many of
the concepts identified as important to PII and has been established as a valid and reliable tool for
assessing the amount of quality interpersonal interaction that occurs in an online course.

Keywords: online learning, interaction, instructional design, online pedagogy

Mehall, S. (2020). Purposeful interpersonal interaction in online learning: What is it and how is it
measured? Online Learning, 24(1), 182-204. https://doi.org/10.24059/olj.v24i1.2002

Purposeful Interpersonal Interaction in Online Learning: What is it and How is it Measured?
Interaction has long been a popular topic of research in online learning. Since the beginning of online education, many have worried that it could devolve into an electronic form of correspondence education, lacking sufficient interaction between faculty and students. Moore’s
(1989) seminal work on interaction in online learning identified how interpersonal interaction can
decrease transactional distance and thus provide a more robust educational experience for the
learner. Moore’s three types of interaction included student-content interaction, student-student
interaction, and student-faculty interaction. Interpersonal interaction includes both student-student
and student-faculty interaction (York & Richardson, 2012) and is generally accepted as a critical
element for all educational settings.
The use of social constructivist (Vygotsky, 1997) based online course designs has been
leveraged in order to promote greater interpersonal interaction. Educators often seek to replicate
the dialogue that is easily achievable in their face-to-face courses in the online setting by utilizing
discussion boards and similar technologies. Despite this quest for sufficient interpersonal
interaction, educators still lack consensus on which interpersonal interaction strategies best
promote effective student learning and satisfaction. Often, faculty are pressured to increase the
quality of their online courses but are not aware of strategies to encourage students to interact
(Paquette, 2016). In other cases, faculty have been teaching in the face-to-face environment for
years and are being asked to convert their courses into the online format without pedagogical and
technical support (Lane, 2009).
Additionally, many of the studies on interaction in the online environment do not consider
the qualitative aspects of interaction and instead only measure the number of interactions, which
typically occurs through methods like counting discussion board posts or course updates.
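To make the quantity-versus-quality distinction concrete, the sketch below contrasts a naive post count with a crude substance filter. It is illustrative only: the posts, the 25-word threshold, and the list of stock phrases are invented assumptions, not an instrument drawn from the literature reviewed here.

```python
# Illustrative contrast between counting posts and filtering for substance.
# The stock-phrase list and word threshold are hypothetical heuristics.

LOW_SUBSTANCE = {"i agree", "me too", "good point", "nice post"}

def quantity_score(posts):
    """Naive metric: every post counts the same."""
    return len(posts)

def substantive_score(posts, min_words=25):
    """Crude quality filter: drop stock replies and very short posts."""
    count = 0
    for post in posts:
        text = post.strip().lower().rstrip(".!")
        if text in LOW_SUBSTANCE or len(text.split()) < min_words:
            continue
        count += 1
    return count

posts = [
    "I agree!",
    "Me too",
    "Moore's notion of transactional distance suggests that dialogue and "
    "structure interact; in our group project the weekly video check-ins "
    "reduced that distance far more than the required discussion posts did.",
]
print(quantity_score(posts))     # 3 -- every post counted
print(substantive_score(posts))  # 1 -- only the detailed reply survives
```

Even this toy filter shows how a count-based measure and a quality-oriented measure can diverge on the same data.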
This lack of clarity about which types of interpersonal interaction are most effective warrants exploration into the types that have been demonstrated to lead to better student outcomes. A comprehensive review of the pertinent literature on interpersonal interaction in online learning and its relationship to important student outcomes follows. This review
allows for a qualitative view of interpersonal interaction, called Purposeful Interpersonal
Interaction (PII). Lastly, recommendations for evaluating existing courses for PII using an
established rubric are given.

Review of Related Literature


Interpersonal Interaction is Beneficial
Since interaction in online learning has been extensively studied in the last few decades,
studies demonstrating the positive benefits of interpersonal interaction are plentiful. Interpersonal
interaction in online environments has been associated with increased perceived learning
(Richardson & Swan, 2003; Sher, 2009; Swan, 2002), higher levels of student satisfaction with the
course (Cole, Shelley, & Swartz, 2014; Fedynich, Bradley, & Bradley, 2015; Khalid & Quick,
2016; Richardson & Swan, 2003; Sher, 2009; Swan, 2002), higher levels of faculty satisfaction
with the course (Su et al., 2005), and improved student academic achievement (Long et al., 2011).
Open-ended responses in Sher’s (2009) study revealed that students valued
opportunities to interact meaningfully with their faculty and their peers. Berge (1999) elaborates
on the reason behind the benefits of interpersonal interaction: “When students have the opportunity
to interact with one another and their instructors about the content, they have the opportunity to
build within themselves, and to communicate, a shared meaning to ‘make sense’ of what they are
learning” (p. 8). In a study conducted by Northrup, Lee, and Burgess (2002) that investigated the
interactions students perceived to be important in online environments using the online learning
interaction inventory (OLLI), students strongly expressed that prompt feedback from faculty and
their peers was essential. Clearly, learners value interpersonal interaction opportunities and feel
they are important to their successful outcomes in online courses.
Chickering and Gamson’s (1987) widely cited Seven Principles for Good Practice in
Undergraduate Education was designed to improve undergraduate education and endorse concepts
that incorporate the different types of interaction. Four of Chickering and Gamson’s principles
correspond to the critical student-faculty interpersonal interaction types in the online environment:
(a) “Encourages contact between students and faculty,” (b) “Develops reciprocity and cooperation among students,” (c) “Gives prompt feedback,” and (d) “Communicates high expectations” (p. 2).
Lack of Interpersonal Interaction
Not only have studies shown that interpersonal interaction generally leads to better outcomes, but they have also shown that a lack of interpersonal interaction can be detrimental. A three-year study
by Cole, Shelley, and Swartz (2014) that examined graduate and undergraduate student satisfaction
with online instruction at a university discovered lack of interaction with faculty and with
classmates as the main source of student dissatisfaction. This is supported in a study of higher
education students in Kenya conducted by Muuro, Wagacha, Oboko, and Kihoro (2014), who identified lack of feedback from faculty and lack of feedback from peers as major challenges perceived by the students. From students’ perspectives, interpersonal interaction can not only lead
to a more satisfying online course, but a lack of appropriate levels of interpersonal interaction has
a negative perceived impact on the learner. Faculty and students alike see value in interpersonal
interaction, yet both are frustrated with the barriers to achieving sufficient levels of this type of
interaction in online environments.
Point of Diminishing Returns
Although interpersonal interaction has generally been demonstrated to lead to better
student outcomes, more interaction may not always be better. Castano-Munoz, Sancho-Vinuesa,
and Duart (2013) found evidence of a point of diminishing returns on academic achievement as a
result of interpersonal interaction that existed in the online environment but did not exist in the
face-to-face environment. This may be due to students becoming overwhelmed with the
interactions, whether written or otherwise, in the online environment. Picciano (2002) mentions an example where students must monitor comments in an online discussion, and states that the nature of these comments makes monitoring them more demanding than following discussions in face-to-face settings, which may lead to information overload. Northrup, Lee, and Burgess (2002) support this
idea by stating that there seems to be an ideal range of appropriate interaction with an upper and
lower limit. In Northrup, Lee, and Burgess’ (2002) study, some participants reported being
frustrated with an overwhelming amount of interactive assignments within a weekly module.
Downing, Lam, Kwong, Downing, and Chan (2007) recommend that interaction in online
environments be sustained only as long as there is an educational benefit in doing so. Based on the
results of their study, the group theorized that students may disengage from interaction once they
have the information they need to complete tasks. These studies give some evidence that increasing interpersonal interaction beyond a saturation point may not only fail to add any benefit for students but may actually be detrimental to their educational experience.
What is Purposeful Interaction?
One technique for promoting engaging learning activities is to provide opportunities for
students to interact with one another and with faculty purposefully. Garrison and Cleveland-Innes (2005) provide support for the idea that the quality of interaction, not the quantity, is important to fostering deep learning, stating that high levels of interaction do not necessarily facilitate meaningful learning.
According to Garrison and Cleveland-Innes, “There must be a qualitative dimension characterized
by interaction that takes the form of purposeful and systematic discourse” (p. 135) and “simple
interaction, absent of structure and leadership, is not enough. We need to have a qualitatively richer
view of interaction” (p. 145).
There is little research specifically referring to purposeful interaction in online
environments. In one instance, Abrami et al. (2011) mention purposeful interaction: “Guided,
focused, and purposeful interaction goes beyond whether opportunities exist to consider especially
why and how interaction occurs” (p. 88). This statement again speaks to the qualitative component
of interaction over simply measuring the volume of interaction.
Unfortunately, not all instances of interpersonal interaction in any learning environment
directly impact or facilitate intellectual growth. In a face-to-face setting, interactions can be off-
topic, redundant, or even distracting for students. In a similar way, interactions in the online
environment (e.g., an “I agree” response to a discussion post) may not always be purposeful,
valuable, or contributory to student learning. Conversely, not all interactions that do not directly
relate to course content or learning objectives are without purpose and/or student benefit. For
example, a case where students form social bonds with faculty or their fellow students can be a
purposeful interaction. Research has shown that social presence can be an important characteristic
in learning (Gilbert & Moore, 1998; Richardson & Swan, 2003; Tu & McIsaac, 2002; Paquette,
2016). Abrami et al. (2011) believe the next generation of online education should be designed to
facilitate more purposeful interaction by promoting targeted, intentional, and engaging
interactions. In order for online interaction to fulfill its objectives and advance the learning process,
interaction opportunities should be designed in a way that allows students to interact with content,
faculty, and other students in a manner that is not fake or forced but meaningful and purposeful.
Purposeful Interpersonal Interaction
Purposeful interpersonal interaction (PII) is any high-quality, organic, and valid
communication exchange between two or more participants of the learning process that directly
relates to the achievement of established learning outcomes or to the building of social
relationships. As the review above shows, a seemingly endless number of studies have attempted to look at interpersonal interaction from a quantity perspective. Fewer studies have examined the quality of interpersonal interaction in online learning, and even fewer have examined interaction through the lens of measuring the amount of quality interpersonal interaction, defined here as PII.
Quality Interaction
An important aspect of PII is quality. Berge (1999) argues that just because interaction
opportunities may increase in quantity, this does not automatically lead to increased quality of
interaction in the course. Clearly, not all interactions in online learning are created equal;
interactions may have differing levels of value to learners. Although interactions in the online
environment can be easily structured by utilizing the robust features of many of today’s widely
used learning management systems (LMS), it is vital that many of these interactions are
purposeful. According to Woo and Reeves (2007), an interaction is viewed as meaningful when it
has a direct influence on intellectual growth for the student.
Social and instructional interactions among students and between student and faculty are
common elements of a face-to-face classroom (Picciano, 2002). According to Picciano (2002),
“The ability to ask a question, to share an opinion with a fellow student, or to disagree with the
point of view in a reading assignment are all fundamental learning activities” (p. 1). In the face-
to-face classroom, many interactions among students and between students and faculty occur
spontaneously and organically (Hirumi, 2002), and the interactions help advance the learning
process. Face-to-face learning provides many opportunities for informal learning where an
interaction is not planned, but class discussions, reflections, debates, or group projects lead to the
stimulation of learning. This process is allowed to happen organically, as a faculty member may
notice verbal and nonverbal cues from students and feel the need to elaborate on a topic, for
example (Hirumi, 2002). In the online environment, this informal learning and the ability to adapt
in real-time to fill the gap in understanding may be decreased if students are not given the
opportunity and appropriate tools to interact with their peers and faculty. For that reason, quality
instructional and social interaction opportunities in online environments need to be deliberately
designed into the course (Berge 1999; Bernard et al., 2009; Hirumi, 2002; Northrup, Lee, &
Burgess, 2002).
Roblyer and Wiencke (2003) highlight the importance of structuring these opportunities,
stating, “Highly interactive learning environments are rarely serendipitous; activities must be
designed to encourage, support, and even require interaction” (p. 87). The success of online courses
often directly relates to the quantity and quality of these interactions (Picciano, 2002). These types
of interactions in the online environment must occur in a purposeful way if learning is to effectively
occur. According to Martin, Parker, and Deale (2012), “Effectively designed courses should
impact students in such a way that there is an increased and spontaneous use of opportunities for
interaction within the course” (p. 231).
Three Components of PII
PII can be broken into three main categories: instructional interaction, social interaction,
and support interaction, as displayed in Figure 1. The first two types of interaction that make up
PII directly relate to two types of interaction theorized by Gilbert and Moore (1998) to categorize
interaction. The two categories identified are content interaction and social interaction. Gilbert and
Moore (1998) state that many skeptics of online learning are concerned mostly with a lack of
ability to foster two categories of interaction that are routinely found in face-to-face instruction:
social activity and instructional activity. Courses with high levels of quality interaction will have
components of content and social interaction designed in them (Northrup, 2002). When referring
to content interaction in this context, it is not meant to be confused with Moore’s (1989) student-
content interaction, but rather it refers to interpersonal interaction that focuses on the content
(relevant topics) of the course. These two categories seem to mirror two important categories of interaction that Berge (1999) identifies as task/content interaction and social interaction, and two categories of interaction Gilbert and Moore (1998) describe as instructional interactivity and social interactivity. As a component of PII, the term instructional interaction will be used in place
of content interaction or task interaction to avoid confusion. The third and final category of PII
deals with providing online learners with appropriate support. Therefore, the three types of PII are purposeful interpersonal instructional interaction (PIII), purposeful social interaction (PSI), and supportive interaction (SI).

Figure 1. Three components of purposeful interpersonal interaction in online learning.

Purposeful Interpersonal Instructional Interaction (PIII)


A major part of all educational ventures consists of interactions directly associated with the
instructional content of the course. Northrup (2002) states that “Content interaction is always
directed at attaining the specific learning outcomes or goal of the instruction” (p. 220). In this
sense, PIII is any interaction between participants in the learning process that directly relates to completing learning objectives. Although admittedly a very broad category on the surface, this
interaction category omits any instances of extraneous (nonpurposeful) interaction. Woo and
Reeves (2008) explain that when students post to a discussion board simply to meet assignment
requirements, it is not likely to lead to meaningful learning. This is an example of extraneous
interaction that would not reflect a purposeful approach, especially in the event that the posting
does not relate in any direct way to course objectives. A student posting an “I agree” or “me too”
type of response in a discussion board would not be considered a PIII. Berge (1999) lists some
examples of interpersonal interaction that faculty might employ:
• disseminating information not readily available from texts or workbooks in
appropriately-sized pieces according to a teacher-determined structure;
• arousing or heightening student interest;
• reviewing previously learned skills and knowledge; and
• giving feedback and corrective guidance. (pp. 7–8)
All of the items on Berge’s list are consistent with PIII. These faculty interactions can be utilized
as a strategy to increase instructor presence in online courses. Dennen, Darabi, and Smith (2007)
state:
Perceptions of instructor presence are based on learners’ psychological reactions to an
online instructor’s actions in both public (whole class) and private correspondence. Further,
presence is not only confined to the amount of instructor-learner interaction, but also to the
content of those interactions. (p. 67)
Clearly, the items on Berge’s list would all be interpersonal interaction occurrences that could be
classified as leading to enhanced instructor presence in the online environment.
Timely feedback. The last item on Berge’s list for instructional interactions, giving
feedback and corrective guidance, has also been identified as an essential component of any
learning environment (Berge, 1999; Hirumi, 2005; Lewis & Abdul-Hamid, 2006; Woo & Reeves,
2008). Students’ perception that they have access to faculty and receive timely, valuable feedback is essential to their educational experience (Croxton, 2014). According to Kranzow
(2013), “When students receive feedback promptly, they can either have reassurance that they
understand the content sufficiently, or conversely, students can request assistance to guide them in
the right direction” (p. 132). Students are often frustrated when they do not receive timely feedback
(Woo & Reeves, 2008), so it is essential for faculty to “close the loop” on student work in a timely
manner by providing students with a grading rationale, confirmation, and corrective feedback.
Dennen et al. (2007) found that learners consider receiving timely feedback more important than receiving extensive feedback.
Northrup (2002) also demonstrated that students rate regular feedback from faculty as
important. Although feedback can occur in both nonverbal and verbal ways in the face-to-face
environment, it is arguably even more important in the online environment as it can be imperative
to student satisfaction and performance (Dennen et al., 2007; Northrup, Lee, & Burgess, 2002;
Thurmond, Wambach, Connors, & Frey, 2002; Vrasidas & McIsaac, 1999). Two major types of
feedback, corrective feedback and confirmatory feedback, are differentiated in the literature.
Corrective feedback allows students to make improvements to their work as faculty stress key areas for improvement, while confirmatory feedback allows students to gain approval from faculty that their work is correct (Hirumi, 2005). Studies have demonstrated that feedback can improve
course satisfaction as well as academic performance in the online environment (Espasa & Meneses, 2010).
Feedback is also not limited to faculty, as other students can be a source of feedback as
well. As stated previously, lack of feedback from faculty and from peers is a major perceived
challenge for online students (Muuro et al., 2014). Tu and Corry (2003) state, “when students are
allowed and encouraged to obtain support from peers, assignments become social exercises while
maintaining original objectives. This may enhance assignment performance and will permit the
addition of peer evaluation activities” (p. 55).
The timeliness of feedback is a vital characteristic of PII in the online environment. Faculty
must ensure that learners are receiving prompt corrective and confirmatory feedback in order to
allow them to progress through the learning process and achieve key course goals. Without
feedback, students cannot identify their errors or understand what they are doing well; in that regard, feedback helps students identify their weaknesses and recognize their strengths.
Collaborative learning. Modern LMS features enable learners to collaborate in
the online environment in better ways than ever before. Group assignments and projects are
common in many online courses, as online instructors recognize that collaborative learning is
important to cognitive development (Garrison et al., 2000). Graduate students especially can
benefit from collaborative learning through the completion of authentic learning tasks and projects
that will prepare them for similar assignments they will encounter in their professional lives.
In writing about the conceptual approach to collaboration, Kreijns, Kirschner, and Jochems
(2003) summarize the set of conditions that enhance collaboration:
• Positive interdependence: team members are linked to each other in such a way that
each team member cannot succeed unless the others succeed and/or that each member’s
work benefits the others (and vice versa).
• Promotive interaction: individuals encourage and help each other’s efforts in order to reach the group’s goals.
• Individual accountability: all group members are held accountable for doing their share
of the work and for mastery of all of the material to be learned.
• Interpersonal and small-group skills: specific skills are needed when learners are learning within a group; students who have not been taught how to work effectively with others cannot be expected to do so, and these skills must be developed.
• Group processing: the group determines which behaviors should continue or change
for maximizing success based upon reflection of how the group has performed so far.
(p. 339)
Kreijns, Kirschner, and Jochems (2003) state that ensuring these conditions exist for collaborative
learning promotes the positive benefits of this type of learning while also reducing negative aspects
of collaborative learning (e.g., social loafing, free-riders, and the “sucker” effect). In this respect,
creating these conditions in collaborative learning can be viewed as PIII. The key to unlocking
quality collaborative learning that enables students to achieve specific learning objectives in online
environments while interacting as a group is social interaction (Kreijns, Kirschner, & Jochems,
2003); this is the bridge to the next category of purposeful interpersonal interaction.
Purposeful Social Interaction (PSI)
Purposeful social interaction (PSI) is the second main component of PII. According to
Powell and Kalina (2009), “Vygotsky would say that social interaction and culturally organized
activities are necessary in the classroom for proper psychological development” (p. 246). Although
social interaction often may not deal directly with the instructional goals of the course, this sort of
interaction can help shape the learning environment (Gilbert & Moore, 1998). Muilenburg and Berge (2005) found lack of social interaction to be the most significant barrier to online learning perceived by students. Administrative/faculty issues, a category that incorporates instances of student-faculty interaction, were the second most reported barrier. Tu and McIsaac (2002) found that social presence
positively impacts online interaction and recommend that faculty promote informal relationships
to achieve greater interactivity in their courses. In a study conducted by Jung, Choi, Lim, and Leem
(2002), the group receiving high levels of social interaction had higher levels of learning and
greater participation than groups receiving only academic forms of interaction. Finally, in a study
of 97 students enrolled in online courses, Richardson and Swan (2003) found that students
reporting high levels of social presence also had high levels of perceived learning and satisfaction.
In light of this research, it is recognized that social interactions that are in some ways
separate from the learning outcomes of the course are purposeful as well. Berge (1999) supports
this sentiment by stating, “Much of learning inevitably takes place within a social context, and
the process includes the mutual construction of understanding” (p. 8).
An important consideration of PSI is the concept of social presence. Garrison et al. (2000)
describe social presence as the ability of participants of the online environment to come across to
others as real people and state that its primary importance is to indirectly facilitate the process of
critical thinking and support cognitive presence. Garrison (2009) later updates this definition to
include the ability of participants to “communicate purposefully in a trusting environment, and
develop inter-personal relationships by way of projecting their individual personalities” (p. 352).
Social presence is defined by Tu and McIsaac (2002) as “the degree of feeling, perception, and
reaction of being connected by CMC to another intellectual entity” (p. 140). These definitions
demonstrate that social presence is understood as a perception that directly results from
interpersonal interaction and has influence on the learning process.
Social presence among participants in the learning process is often viewed as a prerequisite
that must be established in order for instructional interaction and purposeful learning to occur
(Garrison et al., 2000; Garrison & Cleveland-Innes, 2005; Tu, 2000; Woods & Baker, 2004). This
precondition allows learners to create relationships and recognize the course as a safe setting where
purposeful interpersonal interaction can occur.
Social presence is not always measured by the amount of social interaction that takes place
in the online environment or improved by additional social interaction. For example, in Tu’s
(2000) study, social presence decreased when a group member participated too much or dominated
the conversation. In a different study, Tu and McIsaac (2002) found that social presence positively
impacts interaction, yet a high amount of participation does not necessarily equal a high level of
social presence.
Northrup (2002) distinguishes social interaction from content (instructional) interaction by
stating, “Social interaction, on the other hand, provides opportunities for peers to connect in non-
task specific conversation” (p. 220). A key difference between instructional and social interaction
is that social interaction is more flexible and mutual than instructional interaction (Gilbert & Moore, 1998). Gilbert and Moore (1998) confirm that social interaction can improve instructional
interaction: “Social interaction between students and teachers and between students and students
can sometimes have little to do with instructional learning, but can still help to create a positive
(or negative) learning environment…” (p. 30). Social interaction can have real, measurable
impacts on student outcomes in the online environment. Quality and intensity of social interaction
has been associated with increased academic achievement (Kozuh et al., 2015).
Tu and McIsaac (2002) elaborate on how social interaction relates to overall interpersonal
interaction, stating, “By incorporating concepts such as building trust online, providing ‘hand-
holding’ technical support, and promoting informal relationships, instructors can help provide
greater interactivity within the online community of learners” (p. 147). The results of Richardson and Swan’s (2003) study of 97 students in online courses demonstrated that students who reported higher levels
of social presence in their online course also reported higher levels of perceived learning and
satisfaction with faculty than students who reported lower levels of social presence.
Social interaction must be designed into the beginning of courses, and when designed
correctly, it can continue on its own without faculty stimulus (Northrup, 2002). Garrison (2009)
states that social presence incrementally develops in the online environment and warns faculty not
to overstress this interaction early in the course. An overabundance of social interaction early in a
course may become a source of frustration for students and some may be unwilling to build deep
social relationships early on. For that reason, it is essential that faculty determine the appropriate
level of social interaction (not too little and not too much) when beginning a course. Downing et
al. (2007) identified a pattern of engagement for discussions in an online course that is
characterized by a socially active phase (where promotion of social interactions by faculty is key
to developing relationships), an instrumental phase (characterized by the assignments in the
course), and then a gradual disengagement from the discussion, which may be similar to the
process of social engagement and then disengagement that occurs in a face-to-face course.
Kreijns, Kirschner, and Jochems (2003) describe two mistakes many faculty make pertaining to social interaction. The first is assuming social interaction will occur just because the online
environment provides tools (LMS or external) for it to occur. Kreijns, Kirschner, and Jochems
(2003) give an example: “Just putting a forum in a group and labeling it ‘café’ or ‘lobby’ does not
increase interaction” (p. 347). The second pitfall is restricting social interaction among students to
strictly task contexts without consideration to nontask, socioemotional interactions. Both academic
and personal social interaction appear to be important to learning in the online environment. It is
therefore essential that faculty facilitate social interaction opportunities that allow students to
develop trust, a sense of belonging, and social relationships, especially early in an online course.
Immediacy. Immediacy in the online environment refers to “expressiveness, stimulation,
and the conveying of feelings and emotions through online language” (Tu, 2000, p. 1665). Swan
(2002) reports that one of the ways faculty and students attempt to develop social presence in an
asynchronous online course where face-to-face interaction is limited or nonexistent is by deploying
verbal immediacy behaviors (e.g., paralanguage, self-disclosure, greetings, agreement, etc.)
through text-based communication. Response time and communication style were also found to
be contributors to social presence (Tu, 2000).
Supportive Interaction (SI)
The third and final main component of PII is support, which is an important factor for any
learning environment (Caliskan, 2009). Providing support in a variety of ways to students is
something many faculty take for granted in the online environment because the face-to-face
environment allows them to be far more agile and responsive to student issues. In the online
environment, students are separated by time and distance from the faculty and other learners, so
student issues have the potential to further isolate students and increase the transactional distance
faculty seek to decrease. For this reason, it is essential that faculty provide supportive interactions
to students, as well as find ways to facilitate support from various resources in the event that a
student needs assistance.
The concept of student-interface interaction (Hillman et al., 1994) holds that instructors cannot expect all learners to have the ability to interact with content, faculty, and their peers effectively without first ensuring that they can interact with the LMS, which is an important component of support in the online environment. Providing support for navigating the LMS, either through
tutorials, university resources (e.g., instructional design teams or tutors), or by request is an
essential part of the online teaching experience, as other interactions cannot be successful if the
student cannot effectively navigate the LMS.
Students may also struggle in a variety of other areas. In an online writing class, it may be
appropriate to supply students with supportive assistance for APA or MLA formatting. Various
software tools, external websites, and social networking tools may need to be thoroughly explained to some learners, while others may embrace them early on. Many times, these student issues differ
drastically by course, so it is essential for faculty to be aware of areas of their courses that warrant
additional supportive interaction in the online environment.
Results from Northrup’s (2002) study reveal that support is an important consideration for
successful outcomes in the online environment. Providing support mechanisms can help reduce the possibility of learners becoming frustrated and feeling isolated in an online course. Although the range of potential student issues is vast, it is most important for faculty to be cognizant that
they will occur and be agile and responsive in providing supportive interaction to those students.
PII Summary
Purposeful interpersonal interaction (PII) is made up of three components: purposeful
interpersonal instructional interaction (PIII), purposeful social interaction (PSI), and supportive
interaction (SI). These interactions together make up the interpersonal interactions found in the
literature that have been identified as important to student outcomes. The literature contains many attempts to examine the quantity of interpersonal interaction in the online environment; PII, by contrast, can be summarized as looking at interaction from a quality standpoint.
How Can We Measure PII?
The rubric for assessing interactive qualities of distance courses (RAIQDC) created by Roblyer and Wiencke (2003), shown in Appendix A, focuses on the level of interaction perceived by participants in an online course. This instrument can be used to determine the amount of PII perceived by students in an online course. The RAIQDC has been demonstrated to be a valid, reliable instrument to measure interaction in distance courses (Roblyer & Wiencke, 2003, 2004). Roblyer and Wiencke (2003) revealed that the rubric had convergent and divergent validity and had consistency among different raters of the same course, as 95% of the student ratings were within four points of one another on the 25-point scale. The rubric was also reviewed and improved based on feedback from 42 distance educators to be clearer and more comprehensive (Roblyer & Wiencke, 2004).
Roblyer and Wiencke’s (2004) study used the rubric alongside course evaluations in four classes that had no or limited face-to-face components across two universities. The researchers assessed the reliability and validity of the rubric in three different ways in the study. First, inter-rater reliability was determined to be good, with Cronbach’s alpha levels of .88, .64, .93, and .95 for the four courses involved in Roblyer and Wiencke’s (2004) study. Interestingly, the course with the lowest Cronbach’s alpha, Course Two at .64, was the course with the greatest face-to-face component (80% asynchronous online and 20% face-to-face). Second, concurrent validity was determined using Pearson correlations between formal course evaluations and scores on the RAIQDC. For the four courses, the correlations were determined to be .630, .720, .643, and .475. Three of the four correlations were significant at the .01 level, while Course One was significant at the .05 level (Roblyer & Wiencke, 2004). Third, correlations between specific rubric elements and course evaluation scores were computed and revealed that each of the five rubric elements was correlated with course evaluation subscores at the .01 significance level. The results of these two studies (Roblyer & Wiencke, 2003, 2004) give evidence that the RAIQDC is a valid and reliable
instrument to assess the interactivity of online courses. The rubric is an acceptable measure for
student samples, as demonstrated by Restauri (2006).


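To make the reliability and validity checks described above concrete, the following sketch computes an inter-rater Cronbach’s alpha (treating each rater as an “item,” one common approach to inter-rater alpha) and a Pearson correlation between rubric totals and course-evaluation scores. All numbers are invented for illustration, not data from Roblyer and Wiencke’s studies; note that statistics.correlation requires Python 3.10 or later.

```python
# Hypothetical reliability/validity checks in the spirit of the two studies.
from statistics import variance, correlation

def cronbach_alpha(ratings_by_rater):
    """Inter-rater reliability, treating each rater as an item.

    ratings_by_rater: equal-length lists, one per rater, each holding that
    rater's total RAIQDC scores for the same set of courses.
    """
    k = len(ratings_by_rater)
    totals = [sum(course) for course in zip(*ratings_by_rater)]
    item_variance = sum(variance(r) for r in ratings_by_rater)
    return k / (k - 1) * (1 - item_variance / variance(totals))

# Three hypothetical raters scoring the same five courses (totals out of 25).
raters = [
    [18, 22, 14, 20, 11],
    [17, 23, 13, 21, 12],
    [19, 21, 15, 20, 10],
]
print(round(cronbach_alpha(raters), 2))  # ~0.98: raters largely agree

# Concurrent validity: correlate rubric totals with course-evaluation means.
rubric_totals = [18, 22, 14, 20, 11]
eval_means = [4.1, 4.6, 3.5, 4.4, 3.2]
print(round(correlation(rubric_totals, eval_means), 2))  # strong positive r
```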
The instrument is easy for students to complete. Using a Likert-type scale, respondents
choose one of five possible levels (1–5) for each of five different elements corresponding to the
interaction in their course. Each level of each element has a corresponding label and description,
and the respondents choose the option they perceive as most closely reflecting their course. The
labels are as follows: Low is 1 point, Minimum is 2 points, Moderate is 3 points, Above Average
is 4 points, and High is 5 points. The points for all elements are then totaled and used to categorize
each course into one of three groups. The three groups are as follows: low interactive qualities
group (1–9 points), intermediate interactive qualities group (10–17 points), and high interactive
qualities group (18–25 points). These groups were used as a way to categorize courses in the study.
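As a minimal sketch of this scoring procedure, the code below totals five 1–5 element ratings and assigns the course to one of the three bands; the element ratings shown are hypothetical.

```python
# Total five 1-5 element ratings and band the course per the rubric's cutoffs.
BANDS = [(18, "high"), (10, "intermediate"), (1, "low")]

def score_course(element_ratings):
    """element_ratings: dict mapping the five RAIQDC elements to 1-5 ratings."""
    assert len(element_ratings) == 5
    assert all(1 <= r <= 5 for r in element_ratings.values())
    total = sum(element_ratings.values())
    for floor, label in BANDS:
        if total >= floor:
            return total, f"{label} interactive qualities"

ratings = {  # hypothetical ratings for one course
    "social/rapport-building designs": 4,
    "instructional designs": 3,
    "interactivity of technology resources": 5,
    "evidence of learner engagement": 4,
    "evidence of instructor engagement": 3,
}
print(score_course(ratings))  # (19, 'high interactive qualities')
```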
The five elements that make up the different sections of the RAIQDC are used to assess
various types of quality interaction in the online environment. Each element either directly
incorporates components of PII or facilitates PII to occur. In order to justify the use of this rubric
as a measurement of PII, each element is tied to the components of PII by stating the criteria for
the highest score level in for each element and using concepts from the components of PII to
support its legitimacy and importance to student outcomes in online courses.
Element 1: Social/Rapport-Building Designs for Interaction
High Level description—In addition to providing for exchanges of personal information
among students and encouraging student-student and instructor-student communication and
social interaction, the instructor also interacts with students on a social/personal basis.
This element relates to PSI through its focus on establishing social interaction and building
social presence in an online course, especially early in the course. Social interaction and social
presence have been identified as important precursors for meaningful learning to occur and have
been demonstrated to positively impact perceived learning.
Element 2: Instructional Designs for Interaction
High Level description—In addition to requiring students to communicate with the
instructor, instructional activities require students to develop products by working together
cooperatively (e.g., in pairs or in small groups) and share results and feedback with other groups
in the class.
This element relates directly to PIII, as it requires interpersonal interaction with regard to
instructional activities. In addition, the description refers to collaborative learning, which has been
identified as crucial to cognitive development. The description also references the importance of
peer feedback, which is one of the aspects that is highlighted as a component of PIII. Lack of
feedback from faculty and peers was one of the identified challenges of online students.
Element 3: Interactivity of Technology Resources
High Level description—In addition to technologies used for two-way exchanges of text
information, visual technologies such as two-way video or videoconferencing technologies allow
synchronous voice and visual communications between instructor and students and among
students.
This element is likely the most subtle, but it essentially deals with the communication tools made available to students in an online environment. This is an instance where the rubric is not assessing direct interpersonal interaction, but rather the facilitation of interpersonal interaction using LMS tools. Two-way exchanges of information refer to faculty and students being able to communicate reciprocally either by text or by video (e.g., instant messaging, videoconferencing, etc.), whereas one-way exchanges of information refer only to instances where information can
be presented by one party but not by the other (i.e., faculty posting a course announcement with
no response area for students). These tools allow faculty to have a greater presence in the course
as well as enable a deeper social presence for all participants. In addition, such tools may allow
faculty to increase the immediacy in their courses. The use of videoconferencing using a
synchronous tool (e.g., Zoom, Adobe Connect, or Skype for Business) can help to humanize online
distance education. In essence, the use of interactive technology resources as communication tools allows faculty and learners to interact interpersonally in a deeper fashion, which can effectively
decrease the level of transactional distance in the online environment.
Element 4: Evidence of Learner Engagement
High Level description—By end of course, all or nearly all students (90%–100%) are both
replying to and initiating messages, both when required and voluntarily; most messages are
detailed, responsive to topics, and reflect efforts to communicate well.
This element reflects interpersonal interaction as a result of effective course design as well
as social presence. It has been identified that social interaction and the development of social
presence are key to unlocking instructional interaction. Social presence is something that must be
developed early in a course and, when developed appropriately, will continue throughout the
course without faculty influence. This element reflects the literature well as it requires that at least
90% of students are actively engaging in messages (whether through the discussion board or other
communication tools) by the end of the course. In addition, it relates to purposeful interaction
because messages are sent both when required and voluntarily (not forced) and must be detailed (i.e., not a simple “I agree” or “good point” response). The element of learner engagement seeks to measure how well a course and faculty have established social presence and in turn created an environment conducive to PII for learners.
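A rough sketch of checking Element 4’s 90% criterion against a message log follows. The log format and field names are invented for illustration, and a real analysis would also have to judge whether messages are detailed and responsive, which this count ignores.

```python
# Share of the roster who both initiated and replied to messages.
def engagement_rate(messages, roster):
    initiated = {m["author"] for m in messages if m["in_reply_to"] is None}
    replied = {m["author"] for m in messages if m["in_reply_to"] is not None}
    return len(initiated & replied & set(roster)) / len(roster)

roster = ["ana", "ben", "chloe", "dev"]
messages = [  # hypothetical log: in_reply_to is None for thread-starting posts
    {"author": "ana", "in_reply_to": None},
    {"author": "ben", "in_reply_to": 1},
    {"author": "ana", "in_reply_to": 2},
    {"author": "chloe", "in_reply_to": None},
    {"author": "ben", "in_reply_to": None},
    {"author": "chloe", "in_reply_to": 4},
]
print(f"{engagement_rate(messages, roster):.0%}")  # 75% -- short of the 90% bar
```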
Element 5: Evidence of Instructor Engagement
High Level description—Instructor responds to all student queries; responses are always
prompt, i.e., within 24 hours; feedback always offers detailed analysis of student work and
suggestions for improvement, along with additional hints and information to supplement learning.
This item directly relates to two types of PII: supportive interaction and purposeful interpersonal instructional interaction. Responding to student issues and concerns is identified as
an important part of the online teaching experience. Whether through issues with navigating the
LMS or different e-learning tools, faculty should provide support to students in a variety of areas
when needed. Timely feedback has been identified as an essential component to successful
learning in the online environment and positively impacts student satisfaction and academic
achievement. The literature demonstrated that students would rather receive prompt feedback than
extensive feedback, and the 24-hour time frame reflects this. Offering detailed analysis of student
work and suggestions for improvement can be both confirmatory and corrective feedback. This
feedback serves to guide learners on a path to achieving the key instructional goals of the course.
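As a small illustration of auditing the 24-hour promptness standard in Element 5, the sketch below computes the share of student queries answered within a day; the timestamps are hypothetical.

```python
# Share of (query, response) pairs answered within 24 hours.
from datetime import datetime, timedelta

def within_24h_rate(pairs):
    """pairs: list of (query_time, response_time) datetime tuples."""
    prompt = sum(1 for q, r in pairs if r - q <= timedelta(hours=24))
    return prompt / len(pairs)

pairs = [
    (datetime(2020, 3, 2, 9, 0), datetime(2020, 3, 2, 15, 30)),  # 6.5 hours
    (datetime(2020, 3, 3, 20, 0), datetime(2020, 3, 4, 18, 0)),  # 22 hours
    (datetime(2020, 3, 5, 8, 0), datetime(2020, 3, 7, 8, 0)),    # 48 hours
]
print(f"{within_24h_rate(pairs):.0%} answered within 24 hours")  # 67%
```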


Summary of RAIQDC as PII


The five elements of the RAIQDC relate directly and indirectly to the different components
of PII. In principle, all five of these elements either directly influence or facilitate PII in online
courses. In that regard, this instrument can be used to identify how much PII has occurred in any
online course from the students’ perspectives. This rubric can be utilized as a tool for instructors
to improve their online course design and instruction by finding an appropriate level of interaction
for their course.

Conclusion
Despite extensive studies surrounding the topic of interaction in online learning, faculty are often still left attempting to replicate their face-to-face course interactions in the online environment. Building a quality lens through which to view interpersonal interactions in online learning is
possible through purposeful interpersonal interaction (PII). The three interaction types in PII—
purposeful interpersonal instructional interaction, purposeful social interaction, and supportive
interaction—have been associated with important student outcomes like perceived learning,
satisfaction, and academic achievement. Roblyer and Wiencke’s (2003) rubric for assessing
interactive qualities of distance courses (RAIQDC) includes many of the concepts identified as
important to PII and has been established as a valid and reliable tool for assessing the amount of
quality interpersonal interaction that occurs in an online course.
Instructors can utilize this rubric to improve their online course design and instruction.
Furthermore, instructors and researchers can utilize other validated research instruments in
conjunction with the RAIQDC to determine the association between level of PII and important
student outcomes like satisfaction, perceived learning, academic achievement, and persistence.
Studies of this type will allow further insight into the point of diminishing returns for interpersonal
interaction in online learning. Future research in this area is warranted to examine the effect of
supplementing PII and decreasing nonpurposeful interactions on important student outcomes.


References

Abrami, P. C., Bernard, R. M., Bures, E. M., Borokhovski, E., & Tamim, R. M. (2011).
Interaction in distance education and online learning: Using evidence and theory to
improve practice. Journal of Computing in Higher Education, 23(2–3), 82–103.
doi:10.1007/s12528-011-9043-x
Berge, Z. L. (1999, January–February). Interaction in post-secondary web-based learning.
Educational Technology, 39, 5–11. Retrieved from
https://www.researchgate.net/profile/Zane_Berge/publication/246496634_Interaction_in_
post-secondary_Web-based_learning/links/5614987e08ae983c1b40a111.pdf
Bernard, R. M., Abrami, P. C., Borokhovski, E., Wade, C. A., Tamim, R. M., Surkes, M. A., &
Bethel, E. C. (2009). A meta-analysis of three types of interaction treatments in distance
education. Review of Educational Research, 79(3), 1243–1289.
doi:10.3102/0034654309333844
Caliskan, H. (2009). Facilitators’ perception of interactions in an online learning program.
Turkish Online Journal of Distance Education, 10(3), 193–203. Retrieved from
http://dergipark.ulakbim.gov.tr/tojde/article/viewFile/5000102609/5000095706
Castano-Munoz, J., Sancho-Vinuesa, T., & Duart, J. M. (2013). Online interaction in higher
education: Is there evidence of diminishing returns? The International Review of
Research in Open and Distance Learning, 14(5), 240–257. Retrieved from
http://files.eric.ed.gov/fulltext/EJ1017547.pdf
Chickering, A. W., & Gamson, Z. F. (1987, March). Seven principles for good practice in
undergraduate education. American Association for Higher Education Bulletin, 3, 1–6.
Retrieved from http://files.eric.ed.gov/fulltext/ED282491.pdf
Cole, M. T., Shelley, D. J., & Swartz, L. B. (2014). Online instruction, e-learning, and student
satisfaction: A three year study. The International Review of Research in Open and
Distance Learning, 15(6), 111–131. Retrieved from
http://www.irrodl.org/index.php/irrodl/article/view/1748/3123
Croxton, R. A. (2014). The role of interactivity in student satisfaction and persistence in online
learning. Journal of Online Learning and Teaching, 10(2), 314–325. Retrieved from
https://pdfs.semanticscholar.org/2a3c/ab58d3d0637d20d907d67fecf3c346851393.pdf
Dennen, V. P., Darabi, A. A., & Smith, L. J. (2007). Instructor–learner interaction in online
courses: The relative perceived importance of particular instructor actions on
performance and satisfaction. Distance Education, 28(1), 65–79.
Downing, K. J., Lam, T., Kwong, T., Downing, W., & Chan, S. (2007). Creating interaction in
online learning: a case study. Research in Learning Technology, 15(3), 201–215.
doi:10.1080/09687760701673592
Espasa, A., & Meneses, J. (2010). Analysing feedback processes in an online teaching and
learning environment: An exploratory study. Higher Education, 59, 277–292.
doi:10.1007/s10734-009-9247-4


Fedynich, L., Bradley, K. S., & Bradley, J. (2015). Graduate students’ perceptions of online
learning. Research in Higher Education Journal, 27, 1–13. Retrieved from
http://files.eric.ed.gov/fulltext/EJ1056187.pdf
Garrison, D. R. (2009). Communities of inquiry in online learning. In Encyclopedia of distance
learning (2nd ed., pp. 352–355). IGI Global.
Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based
environment: Computer conferencing in higher education. The Internet and Higher
Education, 2(2), 1–34. Retrieved from
http://auspace.athabascau.ca:8080/bitstream/2149/739/1/critical_inquiry_in_a_text.pdf
Garrison, D. R., & Cleveland-Innes, M. (2005). Facilitating cognitive presence in online
learning: Interaction is not enough. The American Journal of Distance Education, 19(3),
133–148. Retrieved from
http://anitacrawley.net/Articles/GarrisonClevelandInnes2005.pdf
Gilbert, L., & Moore, D. R. (1998, May–June). Building interactivity into web courses: Tools for
social and instructional interaction. Educational Technology, 38(3), 29–35.
Hillman, D. C., Willis, D. J., & Gunawardena, C. N. (1994). Learner-interface interaction in
distance education: An extension of contemporary models and strategies for practitioners.
American Journal of Distance Education, 8(2), 30–42.
Hirumi, A. (2002). The design and sequencing of elearning interactions: A grounded approach.
International Journal on E-Learning, 1(1), 19–27. Retrieved from
https://www.researchgate.net/publication/248580777_The_Design_and_Sequencing_of_e
Learning_InteractionsA_Grounded_Approach
Hirumi, A. (2005). In search of quality: An analysis of e-learning guidelines and specifications.
Quarterly Review of Distance Education, 6, 309–329. Retrieved from
https://www.researchgate.net/profile/Atsusi_Hirumi/publication/234590442_In_Search_o
f_Quality_An_Analysis_of_e-
Learning_Guidelines_and_Specifications/links/564095f408aedaa5fa451ce3.pdf
Jung, I., Choi, S., Lim, C., & Leem, J. (2002). Effects of different types of interaction on
learning achievement, satisfaction and participation in web-based instruction. Innovations
in Education and Teaching International, 39(2), 153–162.
doi:10.1080/13558000210121399
Khalid, M. N. M., & Quick, D. (2016). Teaching presence influencing online students’ course
satisfaction at institution of higher education. International Education Studies, 9(3), 62–
70. doi:10.5539/ies.v9n3p62
Kozuh, I., Jeremic, Z., Sarjas, A., Bele, J. L., Devedzic, V., & Debevc, M. (2015). Social
presence and interaction in learning environments: The effect on student success.
Educational Technology & Society, 18(1), 223–236. Retrieved from
https://pdfs.semanticscholar.org/b16c/99f068e06f0442ee2f3e0c1d9b43f0f8d520.pdf
Kranzow, J. (2013). Faculty leadership in online education: Structuring courses to impact student
satisfaction and persistence. Journal of Online Learning and Teaching, 9(1), 131–139.
Retrieved from http://jolt.merlot.org/vol9no1/kranzow_0313.htm


Kreijns, K., Kirschner, P. A., & Jochems, W. (2003). Identifying the pitfalls for social interaction
in computer-supported collaborative learning environments: a review of the research.
Computers in Human Behavior, 19(3), 335–353. Retrieved from
http://estudosdirigidos20151.pbworks.com/w/file/fetch/94054940/Identifying%20the%20
pitfalls%20for%20social%20interaction%20in%20computer-
supported%20collaborative%20learning.pdf
Lane, L. M. (2009, October 5). Insidious pedagogy: How course management systems impact
teaching. First Monday, 14, 1–8. Retrieved from
http://journals.uic.edu/ojs/index.php/fm/article/view/2530/2303
Lewis, C. C., & Abdul-Hamid, H. (2006). Implementing effective online teaching practices:
Voices of exemplary faculty. Innovative Higher Education, 31(2), 83–98.
doi:10.1007/s10755-006-9010-z
Long, G. L., Marchetti, C., & Fasse, R. (2011). The importance of interaction for academic
success in online courses with hearing, deaf, and hard of-hearing students. The
International Review of Research in Open and Distance Learning, 12(6), 1–19. Retrieved
from http://www.irrodl.org/index.php/irrodl/article/viewFile/1015/1987
Martin, F., Parker, M. A., & Deale, D. F. (2012). Examining interactivity in synchronous virtual
classrooms. The International Review of Research in Open and Distance Learning, 13(3),
227–261. Retrieved from http://files.eric.ed.gov/fulltext/EJ1001021.pdf
Moore, M. G. (1989). Three types of interaction. American Journal of Distance Education, 3(2),
1–4. Retrieved from http://aris.teluq.uquebec.ca/portals/598/t3_moore1989.pdf
Muilenburg, L. Y., & Berge, Z. L. (2005). Student barriers to online learning: A factor analytic
study. Distance Education, 26(1), 29–48. doi:10.1080/01587910500081269
Muuro, M. E., Wagacha, W. P., Oboko, R., & Kihoro, J. (2014). Students’ perceived challenges
in an online collaborative learning environment: A case of higher learning institutions in
Nairobi, Kenya. The International Review of Research in Open and Distance Learning,
15(6), 132–161. Retrieved from http://files.eric.ed.gov/fulltext/EJ1048242.pdf
Northrup, P., Lee, R., & Burgess, V. (2002). Learner perceptions of online interaction. In
Proceedings from 2002 World Conference on Educational Multimedia, Hypermedia &
Telecommunications (pp. 1–7). Association for the Advancement of Computing in
Education (AACE). Retrieved from http://files.eric.ed.gov/fulltext/ED477075.pdf
Paquette, P. (2016). Instructing the instructors: Training instructors to use social presence cues in
online courses. The Journal of Educators Online, 13(1), 80–108. Retrieved from
http://files.eric.ed.gov/fulltext/EJ1087698.pdf
Picciano, A. G. (2002). Beyond student perceptions: Issues of interaction, presence, and
performance in an online course. Journal of Asynchronous Learning Networks, 6(1), 21–
40.


Powell, K. C., & Kalina, C. J. (2009). Cognitive and social constructivism: Developing tools for
an effective classroom. Education, 130, 241–251.
Restauri, S. L. (2006). Faculty-student interaction components in online education: What are the
effects on student satisfaction and academic outcomes? (Doctoral dissertation, Capella
University). Retrieved from ProQuest Dissertations and Theses (UMI No. 3206695).
Richardson, J. C., & Swan, K. (2003). Examining social presence in online courses in relation to
students’ perceived learning and satisfaction. Journal of Asynchronous Learning
Networks, 7(1), 68–88. Retrieved from
https://www.ideals.illinois.edu/bitstream/handle/2142/18713/RichardsonSwan%20JALN
7(1).pdf?sequence=2
Roblyer, M. D., & Wiencke, W. R. (2003). Design and use of a rubric to assess and encourage
interactive qualities in distance courses. The American Journal of Distance Education,
17(2), 77–98. Retrieved from
http://spot.pcc.edu/~rsuarez/rbs/school/EPFA_511/articles/rubric.pdf
Roblyer, M. D., & Wiencke, W. R. (2004). Exploring the interaction equation: Validating a
rubric to assess and encourage interaction in distance courses. Journal of Asynchronous
Learning Networks, 8(4), 24–37. Retrieved from
http://www.adesignmedia.com/OnlineResearch/(ourRole)rubrics-
interactionv8n4_roblyer.pdf
Sher, A. (2009). Assessing the relationship of student-instructor and student-student interaction
to student learning and satisfaction in web-based online learning environment. Journal of
Interactive Online Learning, 8, 102–120.
Su, B., Bonk, C. J., Magjuka, R. J., Liu, X., & Lee, S. (2005). The importance of interaction in
web-based education: A program-level case study of online MBA courses. Journal of
Interactive Online Learning, 4(1), 1–19. Retrieved from
http://actxelearning.pbworks.com/f/4.1.1.pdf
Swan, K. (2002). Building learning communities in online courses: The importance of
interaction. Education, Communication, & Information, 2(1), 23–49.
doi:10.1080/143631022000005016
Swan, K. (2003). Learning effectiveness online: What the research tells us. Elements of quality
online education, practice and direction, 4, 13–47. Retrieved from
http://ltc.nutes.ufrj.br/constructore/objetos/learning%2520effectiveness4.pdf
Thurmond, V. A., Wambach, K., Connors, H. R., & Frey, B. B. (2002). Evaluation of student
satisfaction: Determining the impact of a web-based environment by controlling for
student characteristics. The American Journal of Distance Education, 16, 169–189.
Retrieved from
https://www.researchgate.net/profile/Helen_Connors/publication/248940463_Evaluation_of_Student_Satisfaction_Determining_the_Impact_of_a_Web-Based_Environment_by_Controlling_for_Student_Characteristics/links/5491b3600cf269b048616a5c.pdf
Tu, C. (2000). Strategies to increase interaction in online social learning environments. Society
for Information Technology & Teacher Education International Conference: Proceedings
of SITE 2000 (pp. 2–7). Association for the Advancement of Computing in Education
(AACE). Retrieved from http://files.eric.ed.gov/fulltext/ED444550.pdf
Tu, C., & Corry, M. (2003). Building active online interaction via a collaborative learning
community. Computers in the Schools, 20(3), 51–59. doi:10.1300/J025v20n03_07
Tu, C., & McIsaac, M. (2002). The relationship of social presence and interaction in online
classes. The American Journal of Distance Education, 16(3), 131–150. Retrieved from
http://www.mentormob.com/hosted/cards/71178_cfc5725a0c013f51c6279e4e3fdaed03.p
df
Vrasidas, C., & McIsaac, M. S. (1999). Factors influencing interaction in an online course.
American Journal of Distance Education, 13(3), 22–36. Retrieved from
http://vrasidas.com/wp-content/uploads/2007/07/ajde_vrasidas.pdf
Vygotsky, L. (1997). Interaction between learning and development. In M. Gauvin & M. Cole
(Eds.), Readings on the development of children (2nd ed., pp. 34–40). Scientific
American Books. Retrieved from http://blogs.spsk12.net/8576/files/2017/02/Day-4-ZDP-
article-vygotsky.pdf
Woo, Y., & Reeves, T. C. (2007). Meaningful interaction in web-based learning: A social
constructivist interpretation. The Internet and Higher Education, 10(1), 15–25.
doi:10.1016/j.iheduc.2006.10.005
Woods, Jr., R. H., & Baker, J. D. (2004). Interaction and immediacy in online learning.
International Review of Research in Open and Distance Learning, 5(2), 1–13. Retrieved
from http://www.irrodl.org/index.php/irrodl/article/viewArticle/186/268
York, C. S., & Richardson, J. C. (2012). Interpersonal interaction in online learning: Experienced
online instructors’ perceptions of influencing factors. Journal of Asynchronous Learning
Networks, 16(4), 83–98. Retrieved from http://files.eric.ed.gov/fulltext/EJ982684.pdf


Appendix A: Rubric for Assessing Interactive Qualities of Distance Courses

(Roblyer & Wiencke, 2003)

Copyright © 2004, M. D. Roblyer ([email protected]). Used by blanket permission of the author for nonprofit research and/or education only. For other permission, contact the author.


The Role of an Interactive Visual Learning Tool and its Personalizability in Online Learning: Flow Experience
Young Ha
California State University, Long Beach

Hyunjoo Im
University of Minnesota, Twin Cities

Abstract
The purpose of this study was to examine the effect of interactive online learning tools on college
student learning, using flow as the guiding perspective. Study 1 tested the effect of online
interactivity, manipulated through dynamic visual learning tools, on students’ flow experience,
level of telepresence, actual performance on tests, and perceived value of such activities. Study 2
tested the effect of personalizability of difficulty levels in the interactive online activity on
students’ learning experience. The results showed that interactive online learning tools can
facilitate students’ active learning process by increasing attention, curiosity, and interest in the
online activity and by reducing awareness of physical surroundings. In addition, the interactive
activity significantly improved students’ test scores. This study also found that personalized
difficulty options available in the interactive online activity significantly increased students’
perceived hedonic value (i.e., enjoyment) of, and level of satisfaction with, the activity. The
results emphasize the critical role of interactive visual learning tools in online activities for
improving students’ flow experience and actual performance. Personalizability of task levels is
also recommended in online learning activities to increase students’ perceived hedonic value and
satisfaction.

Keywords: online learning, interactivity, visual learning tools, personalizability, flow, perceived value

Ha, Y. & Im, H. (2020). The role of an interactive visual learning tool and its personalizability in
online learning: Flow experience. Online Learning, 24(1), 205-226.
https://doi.org/10.24059/olj.v24i1.1620


The Role of an Interactive Visual Learning Tool and its Personalizability in Online Learning:
Flow Experience
Online learning has become an increasingly common mode of learning in higher
education. According to a report by the Babson Survey Research Group (Allen, Seaman, Poulin, &
Straut, 2016), 28.4% of all enrolled students in higher education took at least one distance learning
course in 2014. In the fall of 2016, more than 6.3 million U.S. students took at least one online
class (Friedman, 2018). As of 2017, over 700 learning management system suppliers offer
products to the growing eLearning market (Jasmini, 2017). Despite the increasing popularity of
online learning, online courses in higher education still suffer from high dropout rates (Chen,
2018). Some attribute this to a lack of interactivity and personalized experience in the
context of online learning (Oria, 2017).
With current technology, the online learning environment provides an exciting opportunity to
enhance learners’ experience by offering interactive and personalizable content. Because online
information is predominantly visual (Carroll & Kop, 2016), properly designed visual learning
tools that allow an interactive and personalized learning experience can be critical for successful
online learning.
Previous research and literature provide support for the importance of interactivity and
personalizability in online learning effectiveness. As emphasized by online educators (Moreillon,
2015), interactivity is a key feature of online education that helps attract and retain students in
online classes. Interactive online tools provide opportunities for instructors to communicate better
with students and enhance students’ online learning experience. While the online tools are often
adopted to compensate for the loss of face-to-face interaction in a traditional education setting
(Sun & Hsu, 2013), well-designed online tools not only can transfer some face-to-face teaching
techniques but also can increase individual students’ engagement and motivation to learn.
Computer-mediated interactions can elicit students’ curiosity and hedonic motivation when the
learning material is interactive and engaging (Oudeyer, Gottlieb, & Lopes, 2016). Kucuk and
Richardson (2019) found that a well-designed online learning interface made learners cognitively
and emotionally engaged in learning and increased their satisfaction as well.
The theory of flow provides the conceptual framework for why interactive visual learning
tools help students engage and actively participate in the learning process (Csikszentmihalyi, 1990).
Flow theory suggests that interactive visual learning tools have a high potential to engage
students in the learning process: students are likely to experience flow, and the effect will be
greater when a student’s skill matches the task difficulty (Csikszentmihalyi, Abuhamdeh, &
Nakamura, 2005). Interactive online learning activities with personalizable options enable learners
to be more focused and engaged because they can select the learning level that matches their skills
(Pandey, 2017). Ou, Joyner, and Goel (2019) also emphasized the critical role of personalized
online teaching materials in stimulating learners’ interest and engaging them in learning.
While previous research investigated the role of interactivity and flow in learning, a few
gaps in the literature call for further investigation. The scope of the online learning literature is
mostly focused on the role of human-human interaction (e.g., learner-instructor and/or learner-
learner), limiting our understanding of the human-computer interaction (i.e., learner-
content/interface) effects on online learning. Considering the importance of the interface in online
learning, Wei, Peng, and Chou (2015) urged expanding the scope of interactivity from
human-human interaction to human-computer interaction in the online learning environment. The
current study responds to this call and investigates human-computer interaction effects on online
learning. Also, findings on interactivity and flow effects on learning have been inconsistent in
previous literature, particularly in internet-based learning environments (Meyer & Jones, 2013).
Such inconclusive findings may be due to the fact that the majority of studies adopted self-reported
surveys of learning experience (e.g., Chou, Peng, & Chang, 2010; Etemad-Sajadi, 2016; Wei et al.,
2015), which are prone to response biases, such as social desirability and memory biases, and to an
inability to detect causal relationships. In addition, while personalization is hailed as a critically
important element of the online interface, few scholarly journal articles have examined the effect of
personalization on students’ online learning. To fill this gap, the current
study aims to understand the causal impact of interactivity and personalization of online visual
learning tools on student’s learning through a series of experiments. This study focused on
understanding two key factors linked to online visual learning tools: interactivity and the balance
between skill level and task difficulty (i.e., interactivity with personalizable options). Two
experimental studies were conducted to investigate each element. The purpose of Study 1 was to
examine the effect of online interactivity on student learning process manifested as flow
experience. The focus of Study 2 was to investigate whether students’ learning experience is
enhanced when students could match their skill level with the task difficulty through
personalization options.

Review of Related Literature


Interactivity in Online Learning
Previous research (e.g., Cho & Kim, 2013; Park, 2011; Rodriguez-Ardura et al., 2016; Wei,
Peng, & Chou, 2015) has emphasized interactivity as a critical success factor of online learning
because it enhances students’ learning experience and their performance. The concept of
interactivity used and examined in the online learning literature has varied (Domagk, Schwartz,
& Plass, 2010; Wei et al., 2015). The most common type of interactivity tested in previous online
learning research was human-human interaction and its effect on student learning (e.g., Chen,
Chang, Ouyang, & Zhou, 2018; Luo, Zhang, & Qi, 2017; Kent, Laslo, & Rafaeli, 2016; Yeh, Rega,
& Chen, 2019). Studies of human-human interactivity in the e-learning literature reported significant
effects of student-instructor communication (Cheng, 2013; Luo et al., 2017; Paechter, Maier, &
Macher, 2010), student-student interaction (Chen et al., 2018; Cheng, 2013; Luo et al., 2017), and
peer evaluation (Yeh et al., 2019) on students’ online learning experience. Researchers have also
emphasized the critical role of human-computer interaction in the online learning environment
(Chou, 2003; Low, Low, & Koo, 2003; Wei et al., 2015). Previous research on human-computer
interactivity tended to focus on learner-interface interactivity, attempting to understand the effects
of using various new learning management systems (e.g., Wei et al., 2015). A small number of
studies investigated learner-content interactivity, which addresses learning-specific online content
such as individualized guides, activities, and instructions.
In this study, learner-content interactivity is of focal interest. Interactivity is defined
as a characteristic of an online system that allows a user to modify elements and contents of the
online environment in real time (Rodriguez-Ardura & Meseguer-Artola, 2016) and provides
immediate responses to the user’s input (Chang & Wang, 2008). Evans and Sabry (2003)
conceptualized a three-way model of human-computer interactivity in a computer-mediated learning
environment: computer-initiation, learner-response, and computer-feedback. Accordingly, in the
e-learning environment, students interact with online activities as they respond to the learning
activity (e.g., by clicking and moving images or by answering questions) and receive immediate
feedback from the activity (e.g., correct answers, tips, and guidance). The interactive online
learning tools examined in the current study were developed to incorporate this three-way
interactivity.
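As a rough illustration of this three-way cycle, the Python sketch below pairs a computer-initiated prompt with a learner response and immediate computer feedback; all names and content are hypothetical stand-ins for the actual learning tool.

```python
# A minimal sketch of the three-way interactivity cycle described above
# (computer-initiation, learner-response, computer-feedback); all names
# and content are hypothetical stand-ins for the actual learning tool.
from dataclasses import dataclass

@dataclass
class Prompt:
    """Computer-initiation: the tool presents a question."""
    question: str
    answer: str
    hint: str

def give_feedback(prompt: Prompt, response: str) -> str:
    """Computer-feedback: immediate, confirmatory or corrective."""
    if response.strip().lower() == prompt.answer.lower():
        return "Correct!"
    return f"Not quite. Hint: {prompt.hint}"

prompt = Prompt(
    question="Which color property names the color family (e.g., red vs. blue)?",
    answer="hue",
    hint="It is the first dimension of the Munsell system.",
)
learner_response = "hue"  # learner-response (e.g., a typed answer or a click)
print(give_feedback(prompt, learner_response))
```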
Telepresence
In the online environment, interactivity is a critical determinant of engagement
(Karageorgakis, 2018) because high interactivity of a system allows the users to be fully present
in the mediated environment. This feeling or perception of being present in a simulated or mediated
environment is called telepresence (Li, Daugherty, & Biocca, 2002). Telepresence is described as
a user’s immersive experience in a mediated environment (Steuer, 1992) and sometimes also noted
as immersion in the literature (e.g., Carrozzino & Bergamasco, 2010).
Previous research in the mediated environment found interactivity of an online system is
an important predictor of telepresence (Esteban-Millat, Martinez-Lopez, Huertas-Garcia,
Meseguer, & Rodriguez-Ardura, 2014; Li et al., 2002; Lim & Ayyagari, 2018; Skadberg &
Kimmel, 2004). For example, in a study of online advertising, participants felt stronger
telepresence when the advertisement was interactive (e.g., Fortin & Dholakia, 2005).
Likewise, interactive online features such as clickable images with hyperlinks were found to increase
telepresence (Coyle, Mendelson, & Kim, 2008). Therefore, H1 was proposed.
H1: Students who used the interactive visual learning tools will report a higher level of
telepresence than those who used the noninteractive visual learning tools.
Flow
Flow is a subjective experience of total immersion in the activity (Csikszentmihalyi, 1990)
and a momentary feeling of complete engagement (Meyer, Klingenberg, & Wilde, 2016). Flow is
often characterized by the simultaneous experience of several dimensions: attention focus (or
concentration); positive emotions such as enjoyment, joy, and pleasure; a sense of control; a
distorted sense of time; and reduced awareness of physical surroundings and self (e.g., Rossin,
Ro, Klein, & Guo, 2009). Researchers in human-computer interaction emphasized the role of flow as an
important antecedent of learning in an online environment because of the interactive nature of
online operations (Hoffman & Novak, 2009).
Because interactivity increases telepresence, it is likely that high interactivity also increases
the flow experience. Hoffman and Novak (2009), after a review of 12 empirical studies using flow
theory, reported that interactivity has both direct and indirect effects on flow. Researchers reported
empirical evidence of the positive effect of interactivity on flow experience in a web-based training
program (Choi, Kim, & Kim, 2007), e-learning environment (Rodriguez-Ardura & Meseguer-
Artola, 2016), and online university courses (Esteban-Millat et al., 2014; Guo, Xiao, van Toorn,
Lai, & Seo, 2016). In online flow experience research, the majority of researchers understood flow
as a multidimensional construct and measured these multiple constructs to capture flow (Hoffman
& Novak, 2009). Similarly, in this study, the core elements of the flow experience are
operationalized as (a) control, (b) attention focus, (c) curiosity, and (d) intrinsic interest,
following the conceptualization of Huang (2003). Therefore, H2 was formulated.


H2: Students who used the interactive visual learning tools will experience a higher level of flow
(control (H2a), attention focus (H2b), curiosity (H2c), and intrinsic interest (H2d)) than those who
used the noninteractive visual learning tools.
Interactivity and Learning
Interactive tools can be effective in facilitating student learning. A range of literature
provides evidence that interactivity increases learning measured as test scores, understanding of
concepts, retention of information (Evans & Gibbons, 2007; Wang, Vaughn, & Liu, 2011), and
perceived knowledge gain (Skadberg & Kimmel, 2004; Sun & Hsu, 2013).
Interactivity of a tool can positively influence learning for several reasons. Some noted that
interactive instructional tools can encourage learners to self-motivate and direct their own learning,
consequently increasing learning by actively constructing knowledge (Evans & Gibbons, 2007;
Reiter, Lakoff, Trueger, & Shah, 2013). Others reasoned interactive tools enhance learning
because they allow users to control the learning process by engaging in the learning activity at
their own pace and by skipping, reviewing, and repeating the content as needed (Wang et al.,
2011). Others argued the interactions within the instructional tools help engage learners in the
learning process and prolong their concentration on learning (Esteban-Millat et al., 2014; Kiili,
2005). Therefore, H3 was developed.
H3: Students who used the interactive visual learning tools will perform better on a test than those
who used the noninteractive visual learning tools.
Utilitarian and Hedonic Value
Online information tools can provide utilitarian/instrumental value (e.g., useful
information to enhance performance efficiency) or hedonic/experiential value (e.g., enjoyment)
(van der Heijden, 2004). Researchers have emphasized the importance of comprehensively
understanding both hedonic and utilitarian values (Babin, Darden, & Griffin, 1994). In the context of learning, utilitarian value
refers to the degree to which a tool provides benefits to achieve learning. Hedonic value, on the
other hand, is the degree to which a tool provides emotional and entertainment benefits. Previous
research emphasized the effect of interactive learning environment on learners’ perceived hedonic
(e.g., pleasure) and utilitarian (e.g., usefulness) value about online learning (Liaw, 2008; Liaw &
Huang, 2013). Wei et al. (2015) also found that students’ perceptions of online learning are highly
related to the teacher’s design of interactive learning activities. Cheng (2013), who longitudinally
examined interactivity features in an e-learning environment, found that online interactive
features (e.g., responsiveness, personalization) positively influenced learners’ perceived
usefulness and enjoyment of the e-learning system. Similarly, when students
use an interactive visual learning tool, they are more likely to find the learning tool to be useful
and fun because the tool not only effectively provides contents (i.e., utilitarian value) but also
inherently possesses the ability to dynamically change in response to the user input. Therefore, H4
was formulated.
H4: Students who used the interactive visual learning tools will perceive a higher level of
utilitarian (H4a) and hedonic value (H4b) than those who used the noninteractive visual learning
tools.
Personalization: Skill-Challenge Level
Csikszentmihalyi et al. (2005) identified three important preconditions for flow experience:
clarity of the goal, clear and immediate feedback, and the skill-challenge balance. The interactive
visual learning tools can be designed to satisfy the first two conditions. The students are usually
given a clear objective for learning (i.e., to accomplish the task and learn the materials) and the
interactivity provides immediate and clear feedback on the student’s input. However, the last
condition is dependent on individual student’s ability (e.g., prior knowledge). While learning
activities are designed to offer a reasonable level of challenge for students, the balance can only
be achieved when the students are properly prepared for the given task. According to the model of
flow (Csikszentmihalyi & Csikszentmihalyi, 1988), when the challenge and skill do not match, the
individuals will feel anxiety (low skill-high challenge), boredom (high skill-low challenge) or
apathy (low skill-low challenge). In a meta-analysis study of antecedents of flow, Fong, Zaleski,
and Leach (2015) found the skill-challenge balance to be a strong antecedent of flow among nine
antecedents investigated. Therefore, in order to make students fully engage in online learning and
experience flow, the skill-challenge balance should be achieved. Guo and colleagues (2016)
empirically showed that the skill-challenge balance positively influenced flow that students
experienced during online learning.
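One simple way to express the quadrant model described above is as a classification over the skill-challenge plane. The sketch below assumes a 0–1 scale and a 0.5 midpoint purely for illustration; the model itself specifies the quadrants, not these numeric thresholds.

```python
# A minimal sketch of the flow-channel quadrants from the model of flow
# (Csikszentmihalyi & Csikszentmihalyi, 1988); the 0-1 scale and the
# 0.5 midpoint are assumptions made for illustration.
def flow_state(skill: float, challenge: float, mid: float = 0.5) -> str:
    """Classify a (skill, challenge) pair, each on an assumed 0-1 scale."""
    if skill >= mid and challenge >= mid:
        return "flow"     # high skill, high challenge: balance achieved
    if skill < mid and challenge >= mid:
        return "anxiety"  # low skill, high challenge
    if skill >= mid and challenge < mid:
        return "boredom"  # high skill, low challenge
    return "apathy"       # low skill, low challenge

print(flow_state(0.8, 0.7))  # -> flow
print(flow_state(0.2, 0.9))  # -> anxiety
```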
While previous studies relied on self-reported perceptions of skill-challenge balance, either by
measuring perceived skill and challenge levels and comparing the two scores to determine the
balance (e.g., Fullagar, Knight, & Sovern, 2013) or by measuring the perceived balance itself (e.g.,
Engeser & Rheinberg, 2008), the current study attempted to achieve the balance by providing
varying degrees of task challenge options (i.e., a personalization option). It is assumed that the
individuals would find the balance between their skill level and the task challenge if they could
choose from easy, medium, and hard difficulty level tasks. When multiple difficulty levels are
offered, individuals can personalize the difficulty level to match their skill level. This way, many
individuals with different levels of skill can find the balance and satisfy the precondition of flow,
and therefore, are likely to experience flow. Hence, H5 was developed.
H5: Students in the personalizable difficulty condition will experience a higher level of flow
(control (H5a), attention focus (H5b), curiosity (H5c), and intrinsic interest (H5d)) than students
in the fixed difficulty condition.
Since a personalized online learning activity empowers students to choose a learning
path that is right for their skill level, it helps students manage what they learn and perform better
in the given task (Pandey, 2017). Skadberg and Kimmel (2004) found that website visitors learned
content better when skill and challenge levels were balanced. Wang et al. (2011) found that
animated online interactivity that allows students to personalize input levels to generate different
visual presentations significantly enhanced students’ understanding of the content covered in the
online lecture. A personalized virtual learning environment was also found to significantly
improve learners’ performance on a final exam (Xu & Wang, 2006). Accordingly, H6
was formulated.
H6: Students in the personalizable difficulty condition will perform better on a test than students
in the fixed difficulty condition.
When the balance is achieved, learner performance and perceived hedonic and utilitarian
value are expected to be also enhanced. Learners are likely to perceive an interactive tool as helpful
in increasing their performance when there is personalization option. Hoffman and Novak (1996),
in their seminal work, theorized the skill-challenge balance leads to positive subjective experience
and exploratory mindset. These intrinsic motivations are directly connected to hedonic values.
Empirical research supported the positive effect of skill-challenge balance on utilitarian and
hedonic values. Cordova and Lepper (1996), in their experimental research, found that individually
personalized computer activities significantly enhanced students’ engagement in learning,
perceived competence, and hedonic motivation. In the experimental research, Xu and Wang (2006)
found that personalized online learning materials positively influenced students’ perception of
system usefulness and hedonic motivation. Guo and colleagues (2016) empirically showed that the
skill-challenge balance indirectly influenced perceived utilitarian and hedonic value of an online
course. Thus, H7 was developed.
H7: Students in the personalizable difficulty condition will perceive a higher level of (H7a)
utilitarian (i.e., usefulness) and (H7b) hedonic value (i.e., enjoyment) than students in the fixed
difficulty condition.
Personalizable learning tools are likely to increase learner satisfaction with the learning
activity. When the learner can match the task difficulty with their skill level, they are able to reduce
negative emotions such as anxiety or apathy and are encouraged to engage in learning. Such an
experience is likely to create positive learning experience and increase satisfaction. Online learning
research found that e-learning interface with various presentation types improved learner
satisfaction (Liu, Liao, & Pratt, 2009). Özyurt and Özyurt (2015) content-analyzed 69 articles on
individualized adaptive e-learning published between 2005 and 2014 and concluded that the most
robust finding in this literature was the positive outcome for learner satisfaction, usability, and
preferability. Of the 69 studies, 18 investigated and reported a significant effect of adaptive
e-learning (i.e., personalized teaching tools based on students’ learning styles) on learner satisfaction.
Therefore, it is also anticipated that the skill-challenge balance positively affects user satisfaction
because the flow experience leads to a positive mood and an enhanced feeling of satisfaction
(Hoffman & Novak, 1996). Therefore, H8 was formulated.
H8: Students in the personalizable difficulty condition will show a higher level of satisfaction than
students in the fixed difficulty condition.

Study 1
The purpose of Study 1 was to test the effects of interactivity on online learning. Study 1 was
designed to test H1 through H4.
Method
Experimental stimuli development. To test the effect of interactivity on student online
learning experience, a single factor, two-level (Interactivity: Yes/No), between-subjects
experiment was designed. For the manipulation of interactivity, two versions of an online
instructional website on color theory were developed: one with noninteractive visual contents and
the other with an interactive visual learning tool that allows dynamic manipulation of visual
contents. Both websites contained basic explanations for key concepts of color theory: hue, value,
intensity, and color schemes. The noninteractive visual learning tool was one long webpage with
written information about color theory and still images to illustrate the concepts without interactive
features embedded. Thus, students scrolled down the webpage to read and learn the materials. The
interactive visual learning tool was an embedded interactive Flash object that presented the same
content. Students could click tabs, buttons, and checkboxes to open or collapse the content and to
interact with the educational materials. As students interacted with the learning tool, the Flash object
modified its content in response to the user input. See Figure 1 for sample screenshots of the
interactive visual learning tool used in Study 1.
Instrument development. Eight telepresence items, adapted from Kim and Biocca (1997),
were measured using 5-point Likert scales. To measure students’ flow experience during the
learning activity, four constructs associated with flow were measured using 5-point Likert scales.
Three items were used to measure each of four flow constructs: control, attention focus, curiosity,
and intrinsic interest (Nel, van Niekerk, Berthon, & Davis, 1999). Utilitarian value, operationalized
as perceived usefulness (Davis, Bagozzi, & Warshaw, 1992), was measured using four items (e.g.,
“The online activities like the color theory exercise would improve my learning productivity”).
Hedonic value, operationalized as perceived enjoyment (Childers, Carr, Peck, & Carson, 2001),
was measured using eight items (e.g., “Studying with the online activities would be fun for its own
sake”). Both measures used 5-point Likert scales. The inter-item reliability of the measurements was
checked with Cronbach’s alpha, and all scales showed good reliability (Cronbach’s α > .70).
Multi-item measurements were averaged to obtain single scores.
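As an illustration of the reliability check and score construction just described, the sketch below computes Cronbach’s alpha for a three-item scale and averages the items into a single score; the data and column names are hypothetical, not the study’s dataset.

```python
# A sketch of the reliability check and scale construction described
# above; the data and column names are hypothetical. Cronbach's alpha:
# alpha = k / (k - 1) * (1 - sum of item variances / variance of totals)
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

rng = np.random.default_rng(0)
# Hypothetical 5-point Likert responses for three curiosity items
# (random here, so the resulting alpha is not meaningful).
curiosity_items = pd.DataFrame(
    rng.integers(1, 6, size=(45, 3)),
    columns=["curiosity_1", "curiosity_2", "curiosity_3"],
)
print(f"Cronbach's alpha = {cronbach_alpha(curiosity_items):.2f}")
curiosity_score = curiosity_items.mean(axis=1)  # averaged to a single score
```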
Sample and procedure. Forty-five undergraduate students participated in the experiment
for extra credit. The experiment was conducted in a lab setting to minimize the effects of
miscellaneous environmental factors (e.g., technology type, computer specifications, internet
speed, time spent) on the dependent measures. In the computer lab, students were asked to learn
the materials by exploring the assigned website for 10 minutes. Students were randomly assigned
to either the interactive (N = 24) or the noninteractive site (N = 21). After the 10 minutes, students
were given a survey questionnaire which included items measuring flow experience and
telepresence while browsing the site, and utilitarian and hedonic values of using the online learning
tool. Students were also asked to provide demographic information (age, ethnicity, year in college)
and previous experience with online learning tools. Upon the completion of the activity, students
completed a short quiz on color theory consisting of six questions. The quiz scores were used to
measure actual student learning after the completion of the online activity.
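A minimal sketch of this kind of random assignment is below; the participant labels are hypothetical, and simple randomization yields an uneven split (such as the 24/21 reported here) by chance.

```python
# A minimal sketch of random assignment to the two conditions;
# participant labels are hypothetical. Simple randomization produces
# an uneven split (such as 24/21) by chance.
import random

random.seed(7)
participants = [f"S{i:02d}" for i in range(1, 46)]  # N = 45
assignment = {
    p: random.choice(["interactive", "noninteractive"]) for p in participants
}
n_interactive = sum(c == "interactive" for c in assignment.values())
print(n_interactive, len(participants) - n_interactive)
```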

Results
Description of participants. Participants’ (N = 45) mean age was 20.73, with a range of
18 to 26. Hispanic American was the single largest group accounting for about 35.6% of
participants. Other participants were Caucasian-American (28.9%), African-American (8.9%),
Asian/Asian-American (17.8%), and other (6.7%). Most participants were sophomores (46.7%)
and juniors (37.8%). The number of freshmen (4.4%) and seniors (8.9%) was small. The majority
(80% of participants) reported that they had previously used online learning tools, such as study
guides or other online activities, in four or more classes.
Hypotheses testing. Multivariate analysis of variance (MANOVA) was used to test the
effects of interactivity on various dependent measures. The results showed a significant
multivariate main effect of interactive online activity on dependent measures (F [8, 36] = 5.426, p
< .0001). Univariate analyses of variance (ANOVA) were also conducted to test each hypothesis
proposed in Study 1, as follows.
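For readers who want to reproduce this MANOVA-then-ANOVA pattern, the sketch below uses statsmodels on simulated data; the data frame, group sizes, and variable names are hypothetical, not the study’s dataset.

```python
# A sketch of the MANOVA-then-ANOVA analysis pattern reported here,
# using statsmodels on simulated data; all data and names are
# hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.multivariate.manova import MANOVA
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(42)
df = pd.DataFrame({
    "condition": ["interactive"] * 24 + ["noninteractive"] * 21,
    "telepresence": np.r_[rng.normal(3.0, 1.0, 24), rng.normal(2.0, 0.6, 21)],
    "curiosity": np.r_[rng.normal(3.9, 0.7, 24), rng.normal(3.1, 0.7, 21)],
})

# Multivariate test of the condition effect across dependent measures
manova = MANOVA.from_formula("telepresence + curiosity ~ condition", data=df)
print(manova.mv_test())

# Follow-up univariate ANOVA for a single dependent measure
model = smf.ols("telepresence ~ C(condition)", data=df).fit()
print(anova_lm(model))
```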


Figure 1. Sample screenshot of the interactive visual learning tool used in Study 1.


Hypothesis 1. ANOVA found a significant main effect of interactivity on telepresence (F [1, 43] = 15.729, p < .0001). Students in the interactive condition showed significantly higher mean
scores for telepresence (M = 2.99, SD = .983) than those in the noninteractive condition (M = 2.02,
SD = .570). The result indicates that interactive visual imageries used in the online activity
contributed to telepresence. Thus, H1 was supported.
Hypothesis 2. ANOVA revealed a significant main effect of interactivity on attention focus
(F [1, 43] = 10.608, p < .001), curiosity (F [1, 43] = 14.053, p < .001), and intrinsic interest (F [1,
43] = 26.969, p < .0001), supporting H2b, H2c, and H2d. Students in the interactive condition
showed significantly higher mean scores than those in the noninteractive condition for attention
focus (interactive: M = 3.34, SD = .726, noninteractive: M = 2.60, SD = .807), curiosity
(interactive: M = 3.88, SD = .679, noninteractive: M = 3.13, SD = .654), and intrinsic interest
(interactive: M = 4.03, SD = .629, noninteractive: M = 3.05, SD = .635). However, control did not
show a significant difference between groups, rejecting H2a. Although the difference was not
statistically significant, mean scores showed the direction consistent with our prediction
(interactive: M = 3.68, SD = .641 vs. noninteractive: M = 3.41, SD = .893).
Hypothesis 3. A significant main effect of interactivity on student’s actual performance in
the test was also found (F [1, 43] = 35.110, p < .0001). An inspection of the cell means revealed
that students in the interactive condition performed significantly better in the quiz (interactive: M
= 5.33, SD = 1.049 vs. noninteractive: M = 3.38, SD = 1.161). Thus, H3 was supported. The results
indicate that the interactive visual learning tool used for the online activity could enhance student
learning.
Hypothesis 4. ANOVA revealed a significant main effect of interactivity on utilitarian (F
[1, 43] = 18.161, p < .0001) and hedonic value (F [1, 43] = 7.334, p < .01). An inspection of the
cell means revealed that students in the interactive condition perceived the online activity more
useful and enjoyable (utilitarian: M = 4.50, SD = .659, hedonic: M = 4.01, SD = .601) than those
in the noninteractive condition (utilitarian: M = 3.68, SD = .628, hedonic: M = 3.51, SD = .619).
Thus, H4a and H4b were supported.

Study 2
The purpose of Study 2 was to test the effects of skill-challenge balance on flow and
learning. Study 2 was designed to test H5 through H8.
Method
Experimental stimuli development. To examine the effects of skill-challenge balance on
flow experience, a single factor, 2-level (personalizable difficulty vs. fixed difficulty) between-
subjects design was used. It is assumed that student skill level and task difficulty would be more
likely to match when the students have an option to personalize the level of task difficulty.
Therefore, two interactive visual learning tools, one with three personalizable difficulty levels and
the other with a fixed difficulty level, were developed.
The learning tools had multiple tabs for providing concepts and theory explanation and for
application activities. Both tools contained the identical theory information tab that provided
written information on the basic color theory with proper visual examples and interactive features
to assist understanding of the basic concepts. Both learning tools presented an interactive activity
tab that was designed to help students understand various color relationships using the Munsell color
chart. Students were able to drag and drop color chips in the correct order on a two-dimensional
chart with the x-axis representing intensity or chroma and the y-axis representing value of a hue.
The activity could be repeated for four different hues and students could select one hue at a time.
Two learning tools differed in the availability of difficulty selection options. For the tool
with the personalizable option, students were able to choose a difficulty level out of three options
(i.e., easy, medium, and hard) using a drop-down menu. The easy, medium, and hard levels (see
Figure 2 for three difficulty levels) presented 8–15, 18–28, and 61–93 color swatches to be
placed in the chart, respectively. The exact number of color swatches varied based on value and
intensity of the selected hue. The activity with the fixed difficulty option presented the medium
difficulty level only with 18–28 color swatches (see Figure 2 for the medium difficulty option).
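As a rough sketch of how the difficulty option might gate the task, the code below maps each difficulty level to the swatch-count ranges reported above; the selection logic itself is a hypothetical stand-in for the actual tool.

```python
# A sketch of how the difficulty option might gate the number of color
# swatches presented. The ranges come from the text; the selection
# logic is a hypothetical stand-in for the actual tool.
import random

SWATCH_RANGES = {"easy": (8, 15), "medium": (18, 28), "hard": (61, 93)}

def swatch_count(difficulty: str, hue: str) -> int:
    """Pick a swatch count in the range for the chosen difficulty; the
    exact count varies with the value/intensity of the selected hue."""
    low, high = SWATCH_RANGES[difficulty]
    return random.Random(hue).randint(low, high)

print(swatch_count("medium", "red"))  # fixed-difficulty condition
print(swatch_count("hard", "red"))    # personalizable condition, learner's choice
```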
Sample and procedure. One hundred forty undergraduate students from four sections
of the same course taught at two large universities participated in the experiment for extra credit.
In a computer lab, students were randomly assigned to one of the experimental conditions
(personalizable [N = 72] vs. fixed difficulty [N = 68]) and were asked to learn the materials and
explore the online activity for 20 minutes. Students were directed to view the basic information
tab first to learn about the color theory, and then to complete the interactive online activity through
which students created a value/intensity color chart. Upon the completion of the activity, students
were asked to complete a survey questionnaire online, which included items measuring four flow
constructs, utilitarian and hedonic values of using such activities, and satisfaction. Students (N =
50) from one university also completed a short test to measure the effect of skill-challenge balance
on learning. Test scores were used to examine students’ actual performance after the activity.
Students at the other university could not complete the test due to course schedule limitations.
Demographic information (age, ethnicity, year in college) and four questions regarding previous
experience with the online learning tools were also collected.

Figure 2a. Three difficulty levels manipulated in Study 2: Easy


Figure 2b. Three difficulty levels manipulated in Study 2: Medium (top), Hard (bottom)


Instrument development. The same items used in Study 1 were used to measure flow
experience and utilitarian/hedonic values. Three overall satisfaction items (e.g., “Do you like the
online activities like XXX?”) were added in Study 2 and measured using a 5-point Likert-type scale
(Not at all to Very much). The inter-item reliability of the measurements was checked with
Cronbach’s alpha, and all scales showed good reliability (Cronbach’s α > .702). Multi-item
measurements were averaged to obtain single scores for hypothesis testing.

Results
Description of participants. The mean age of participants (N = 140) was 21.5, with a
range of 18 to 39. Caucasian American was the single largest group accounting for about 68.6%
of participants followed by Asian American (12.9%), Hispanic American (10.7%), African
American (2.1%), and other (5.7%). Most participants were seniors (66.4%), with about equal
numbers of juniors (15%) and sophomores (14.3%); the number of freshmen (4.3%) was small. The
majority (over 70%) of participants had often used online learning tools to obtain course information
(e.g., lecture notes, grades) and for group discussion.
Hypotheses testing. Multivariate analysis of variance (MANOVA) was used to test the
effects of personalizable difficulty level option on various dependent measures. The results showed
a significant multivariate main effect of personalizable difficulty-level option on dependent
measures (F [7, 125] = 2.292, p < .05).
Hypothesis 5. Univariate analysis of variance (ANOVA) revealed a significant main effect
of personalizable difficulty level option in the interactive activity on curiosity (F [1, 131] = 4.823,
p < .05) and intrinsic interest (F [1, 131] = 10.09, p < .005), supporting H5c and H5d. Students in
the condition with the three personalizable difficulty levels showed significantly higher mean
scores for curiosity (personalizable: M = 4.02, SD = .644 vs. fixed: M = 3.74, SD = .809) and
intrinsic interest (personalizable: M = 4.13, SD = .624 vs. fixed: M = 3.75, SD = .766) than those
in the fixed condition. However, control and attention focus did not show a significant difference
between groups, rejecting H5a and H5b.
Hypothesis 6. The ANOVA revealed no significant difference between the two groups in
actual test scores (F [1, 48] = 2.97, p = .09), rejecting H6. This result suggests that the
availability of personalizable difficulty-level option in the interactive online activity did not
influence students’ actual performance on the test.
Hypothesis 7. ANOVA results showed a significant main effect of personalizable difficulty
level option in the online activity on hedonic value (F [1, 131] = 6.048, p < .05) but not on
utilitarian value (F [1, 131] = 3.272, p = .073). Cell means also revealed that students perceived
the interactive online activity with the personalizable difficulty level option more enjoyable (M =
4.05, SD = .502) than the activity with the fixed option (M = 3.80, SD = .677). Thus, only H7b was
supported.
Hypothesis 8. A significant main effect of the personalizable difficulty level option on
students’ satisfaction with the interactive online learning activity was also found (F [1, 131] =
4.839, p < .05). This indicates that students found the interactive online activity with personalizable
difficulty level option more favorable than the one with the fixed option (personalizable: M = 4.42,
SD = .574 vs. fixed: M = 4.15, SD = .819). Therefore, H8 was supported.


Table 1
Mean (M) and Standard Deviation (SD) Scores for Different Conditions in Study 1 and Study 2

                                   Study 1                              Study 2
                       No Interactivity   Interactivity    Fixed Difficulty   Three Difficulty
                                                                Level              Levels
Dependent Variables      M      SD         M      SD         M      SD          M      SD
Control                 3.41   .893       3.68   .641       3.90   .682        3.89   .644
Attention Focus         2.60   .807       3.34   .726       3.33   .878        3.47   .707
Curiosity               3.13   .654       3.88   .679       3.74   .809        4.02   .644
Intrinsic Interest      3.05   .635       4.03   .629       3.75   .766        4.13   .624
Hedonic Value           3.51   .619       4.01   .601       3.80   .677        4.05   .502
Utilitarian Value       3.68   .628       4.50   .659       4.21   .804        4.45   .734

Note. All items were measured using a 5-point Likert scale.

Discussion and Conclusion


Contribution of the Study
The current study empirically investigated the effects of interactive visual learning tools
on student learning experience and performance through two experiments. The findings contribute
to the literature on human-computer interaction in the context of online learning. Based on the
theory of flow, the current study highlighted the potential of interactive visual learning tools for
teaching visual content online. The study also responds to a call for empirical testing of
human-computer interaction effects on student online learning (Wei et al., 2015). Additionally,
this study contributes to the e-learning literature by testing two important characteristics of online
learning tools, interactivity and personalizable options for skill-challenge balance, that positively
influence flow experience, learner perceptions, and performance. It is noteworthy that this study
provides evidence for causal effects of interactivity and personalization through controlled
experiments, while many studies inferred such effects from correlations between learners’
self-reported perceptions and academic performance (e.g., Chou et al., 2010; Etemad-Sajadi, 2016;
Wei et al., 2015).
Effect of Interactivity on Telepresence, Flow, and Learning
The purpose of Study 1 was to investigate the effect of interactive visual learning tools on students' learning process. Similar to prior research suggesting interactivity as a major antecedent of telepresence in the e-commerce setting (Lim & Ayyagari, 2018), this study demonstrates a significant effect of interactivity on telepresence in the online learning environment. This result underscores the critical role of dynamic, real-time interactivity in improving students' learning by reducing awareness of physical surroundings.
Consistent with previous research (Esteban-Millat et al., 2014; Rodriguez-Ardura & Meseguer-Artola, 2016), the results of this study also indicate that the interactive visual learning tools augmented students' flow experience in an e-learning environment. Students who used the interactive visual learning tools experienced a higher level of flow (attention focus, curiosity, intrinsic interest) than those who used the noninteractive one. This implies that interactive visual learning tools can facilitate students' active learning process by
increasing attention, curiosity, and interest in the online activity. Therefore, to enrich the student learning experience in online environments, it is critically important to incorporate interactivity, by means of dynamic visualization, into online instructional materials. This can be especially beneficial when students learn abstract concepts, particularly in online learning environments, where students easily lose attention and interest in lecture materials.
This study also confirms that interactive visual learning tools contribute to learning, as evidenced by higher test scores for the interactive group than for the noninteractive group. Previously published studies have reported mixed findings on the effects of interactive learning tools on performance: some found supporting evidence for positive effects (e.g., Sharp & Hamil, 2018), while others failed to confirm them (Wei et al., 2015). According to Wang et al. (2011), this inconsistency might arise because the levels or types of learning examined in previous studies differed. Wang et al. (2011) found that animated interactivity is more effective for the intermediate level of learning (i.e., understanding concepts) than for the lowest (i.e., remembering) or highest level (i.e., high-level applying). Since the current study employed the interactive activity to help students understand the concept of color theory, the learning activities students engaged in can be considered intermediate-level learning. Thus, our result corroborates Wang et al.'s (2011) findings. When developing an interactive online learning activity, online content developers and instructors should consider the levels or types of learning students are expected to achieve.
Effect of Personalized Interactivity on Flow and Satisfaction
Study 2 tested how the interactive online activity with personalization (i.e., three difficulty-level options to achieve skill-challenge balance) influenced students' learning experience. As expected, students experienced significantly higher levels of curiosity and interest in the online activity when three difficulty-level options were available than when one fixed option was available. Once the balance between students' skill level and task challenge in the online activity is achieved, students tend to experience a higher level of flow, elicited by greater curiosity and interest in the activity. This result is consistent with previous research (Guo et al., 2016), which found a significant impact of perceived balance between challenge and skill level on flow experience in online learning.
However, inconsistent with the hypotheses, the influence of the skill-challenge balance on attention focus was not significant. The effect could have been minimal because both conditions presented highly interactive tools with dynamic visualizations. When compared with the noninteractive group in Study 1, both the personalizable and fixed difficulty groups in Study 2 experienced a fairly high level of attention focus (see Table 1 for mean scores). It is possible that the availability of online interactivity has a stronger effect on attention focus than the availability of personalizable difficulty-level options. In addition, Engeser and Rheinberg (2008) found that perceived importance of the task moderates the effects of the skill-challenge balance on flow experience. It is therefore possible that when students feel the task is important, the effects of the balance are attenuated because their goal of achieving the end outcome predominantly determines their level of flow.
Although no significant difference was found in test scores between the two groups, students' overall satisfaction with the online activity was significantly higher for the group with the personalizable difficulty-level option. This finding is in line with a previous study reporting that perceived balance of challenge and skill affects only satisfaction, not perceived learning of subject matter or actual performance (Rossin et al., 2009). Rossin et al. (2009) argued
that this might be because of an intrinsic reward associated with the tasks performed. As demonstrated earlier in the current study, personalized difficulty options induced higher curiosity and interest among students and influenced satisfaction. The results imply that the online task serves as its own intrinsic reward (i.e., satisfaction) at the moment of first use, so that no extrinsic reward (e.g., test score improvement) is needed for continued adoption of the task. Wei et al. (2015) claimed that once the task is adopted and used frequently, performance scores will improve as well.
Control in Online Learning
In both Studies 1 and 2, the predicted effects of interactivity and skill-challenge balance on control were not supported. Although the mean scores trended in the predicted direction, control did not differ statistically between the two experimental conditions in Study 1. Similarly, the mean control scores for the personalizable-difficulty and fixed-difficulty groups were statistically equivalent. Control is a feeling that one is in control of one's own actions and interactions
at the moment (Koufaris, 2002) and is an important element of flow experience. However, it could
be that the students in all conditions felt equally in control of their actions because the context of
the experiment was online learning and they were left to explore the learning tools on their own.
Regardless of their experimental conditions, whether they were using the interactive tool or not,
or working on the activity with the personalizable difficulty levels or not, the students were given
the time, space, and the computer to play with the learning tool. Therefore, in the context of online
learning, control may not be as important as some other dimensions of flow. Consistent with this logic, Fong et al. (2015), after analyzing 46 studies specifically investigating the relationships between skill-challenge balance and flow, concluded that the effect of skill-challenge balance on flow is weakest in work or education contexts (vs. leisure or personal contexts). Fong et al. (2015) also noted that the skill-challenge balance seemed to be more important for older populations (i.e., aged 30 and over). This implies that personalization effects on the feeling of control may be stronger for older people. Because our sample consisted of younger students in their early 20s, the effects could have been attenuated.
Hedonic and Utilitarian Values
Consistent with previous research (Cheng, 2013a), the results of Study 1 showed that students exposed to the interactive visual learning tools perceived the online activity as more useful (utilitarian value) and enjoyable (hedonic value) than those exposed to the noninteractive tools. The result confirms the critical role of human-computer interactivity in enhancing students' hedonic and utilitarian motivation to use online learning tools. Online instructional designers should therefore utilize interactive online content, which students find more useful and enjoyable and which cultivates learner involvement in learning.
As demonstrated in Study 2, students perceived a higher level of hedonic value in the online activity when they were able to balance the task-challenge level with their own skill level. Students tend to enjoy an online learning activity more when they have personalizable options to choose the challenge level than when they have no option. This result supports Cordova and Lepper (1996), who found a significant impact of personalization and choice on students' perceived hedonic value (i.e., enjoyment) in the learning process. Both Studies 1 and 2 demonstrate that interactivity and personalizability play important roles in motivating students hedonically.


Although no statistically significant difference was found in perceived utilitarian value, the cell mean comparisons (see Table 1) suggest that students perceived the online activity with personalizable options as somewhat more useful than the one with the fixed option. Notably, both conditions showed high usefulness mean scores, indicating that students tended to perceive the interactive online activity, whether personalizable or not, as highly useful and
valuable for their learning productivity. Similarly, Wang et al. (2011) found that three levels of
animated interactivity (i.e., low to high interactivity) did not change students’ perceived usefulness
of the activity used. More importantly, students in all three interactivity treatment groups in Studies
1 and 2 showed higher perception scores than the control group (i.e., no interactivity group in
Study 1). Therefore, it is possible that the availability of dynamic visual interactivity contributes
more to students’ perceived utilitarian value than that of personalization options (or higher level
of interactivity). Results from the two experiments suggest that online interactivity is a major determinant of both hedonic and utilitarian value, and that achieving a skill-challenge balance through personalizable options also matters for perceived hedonic value. This implies that as long as dynamic interactivity exists in the online learning context, students will perceive such activities as useful and enjoyable for the e-learning process. For engagement and intrinsic motivation, however, hedonic value can be particularly important. Therefore, online course designers are advised to offer task-challenge options matched to learners' skill levels to enhance their interest in the online learning process.
Limitations and Future Studies
Although this study contributes to the understanding of students' learning process as shaped by flow experience in online learning environments, employing real online activities in two experimental studies and measuring actual test scores upon the completion of each activity, a few limitations should be addressed. To minimize the effects of confounding factors (e.g., internet access/speed, computer specifications and types) on the dependent measures, both studies were conducted in a lab setting with limited time given to students. Interpretation and generalization of the findings should therefore be done with caution. Results might differ when various personal and situational factors (e.g., computer or mobile devices used, internet speed, time spent on the activity, other environmental factors) are introduced. Replications of the current study in other settings, such as a future online experiment, are therefore necessary to understand the combined effects of these factors. Also, the current study used two versions of a single-content learning object to test the interactivity and personalization effects; studies with similar online learning materials would help test the robustness of the effects across multiple interactive learning tools. In future studies, it is also important to examine how individual differences in learning styles affect students' performance and responses to this type of interactive online learning activity, particularly with customizable options, because not everyone learns in the same way.

Author Note
Correspondence related to this paper should be addressed to Young Ha, California State
University, Long Beach, [email protected]


References

Allen, I. E., Seaman, J., Poulin, R., & Straut, T. T. (2016, February). Online report card:
Tracking online education in the United States. Online Learning Survey.
https://onlinelearningsurvey.com/reports/onlinereportcard.pdf
Babin, B. J., Darden, W. R., & Griffin, M. (1994). Work and/or fun: Measuring hedonic and
utilitarian shopping value. Journal of Consumer Research, 20(4), 644–656.
Carroll, F., & Kop, R. (2016). Colouring the gaps in learning design: Aesthetics and the visual in
learning. International Journal of Distance Education Technologies, 14(1), 92–103.
Carrozzino, M., & Bergamasco, M. (2010). Beyond virtual museum: Experiencing immersive
virtual reality in real museums. Journal of Cultural Heritage, 11(4), 452–458.
Chang, H. H., & Wang, I. C. (2008). An investigation of user communication behavior in
computer mediated environments. Computers in Human Behavior, 24(5), 2336–2356.
Chen, R. (2018, March 11). How to cut high dropout rates of online courses. eLearning Industry.
https://elearningindustry.com/dropout-rates-of-online-courses-cut-high
Chen, B., Chang, Y. H., Ouyang, F., & Zhou, W. (2018). Fostering student engagement in online
discussion through social learning analytics. The Internet and Higher Education, 37, 21–
30.
Cheng, Y. (2013a). Roles of interactivity and usage experience in e-learning acceptance: A longitudinal study. International Journal of Web Information Systems, 10(1), 2–23.
Cheng, Y. (2013b). Exploring the roles of interaction and flow in explaining nurses' e-learning
acceptance. Nurse Education Today, 33(1), 73–80.
Childers, T. L., Carr, C. L., Peck, J., & Carson, S. (2001). Hedonic and utilitarian motivations for
online retail shopping behavior. Journal of Retailing, 77(4), 511–535.
Cho, M. H., & Kim, B. J. (2013). Students' self-regulation for interaction with others in online
learning environments. The Internet and Higher Education, 17, 69–75.
Choi, D. H., Kim, J., & Kim, S. H. (2007). ERP training with a web-based electronic learning
system: The flow theory perspective. International Journal of Human-Computer Studies,
65, 223–243.
Chou, C. (2003). Interactivity and interactive functions in web-based learning systems: A
technical framework for designers. British Journal of Educational Technology, 34(3),
265–279.
Chou, C., Peng, H., & Chang, C. Y. (2010). The technical framework of interactive functions for
course-management systems: Students’ perceptions, uses, and evaluations. Computers &
Education, 55(3), 1004–1017.
Cordova, D. I., & Lepper, M. R. (1996). Intrinsic motivation and the process of learning:
Beneficial effects of contextualization, personalization, and choice. Journal of
Educational Psychology, 88(4), 715–730.


Coyle, J., Mendelson, A., & Kim, H. (2008). The effects of interactive images and goal-seeking
behavior on telepresence and site ease of use. Journal of Website Promotion, 3(1/2), 39–
61.
Csikszentmihalyi, M. (1990). Flow: The psychology of optimal experience. Harper & Row.
Csikszentmihalyi, M., Abuhamdeh, S., & Nakamura, J. (2005). Flow. In A. J. Elliot & C. S.
Dweck (Eds.), Handbook of competence and motivation (pp. 598–608). Guilford.
Csikszentmihalyi, M., & Csikszentmihalyi, I. (1988). Optimal experience: Psychological studies
of flow in consciousness. Cambridge University Press.
Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1992). Extrinsic and intrinsic motivation to use computers in the workplace. Journal of Applied Social Psychology, 22(14), 1111–1132.
Domagk, S., Schwartz, R. N., & Plass, J. L. (2010). Interactivity in multimedia learning: An
integrated model. Computers in Human Behavior, 26(5), 1024–1033.
Engeser, S., & Rheinberg, F. (2008). Flow, performance and moderators of challenge-skill
balance. Motivation and Emotion, 32(3), 158–172.
Esteban-Millat, I., Martinez-Lopez, F. J., Huertas-Garcia, R., Meseguer-Artola, A., &
Rodriguez-Ardura, I. (2014). Modeling students’ flow experiences in an online learning
environment. Computers & Education, 71, 111–123.
Etemad-Sajadi, R. (2016). The impact of online real-time interactivity on patronage intention:
The use of avatars. Computers in Human Behavior, 61, 227–232.
Evans, C., & Gibbons, N. J. (2007). The interactivity effect in multimedia learning. Computers &
Education, 49(4), 1147–1160.
Evans, C., & Sabry, K. (2003). Evaluation of interactivity of web-based learning systems:
Principles and process. Innovations in Education and Teaching International, 40(1), 89–
99.
Fong, C. J., Zaleski, D. J., & Leach, J. K. (2015). The challenge-skill balance and antecedents of
flow: A meta-analytic investigation. The Journal of Positive Psychology, 10(5), 425–446.
Fortin, D., & Dholakia, R. (2005). Interactivity and vividness effects on social presence and
involvement with a web-based advertisement. Journal of Business Research, 58(3), 387–
396.
Friedman, J. (2018, January 11). Study: More students are enrolling in online courses. U.S.
News and World Report. https://www.usnews.com/higher-education/online-
education/articles/2018-01-11/study-more-students-are-enrolling-in-online-courses
Fullagar, C. J., Knight, P. A., & Sovern, H. S. (2013). Challenge/skill balance, flow, and
performance anxiety. Applied Psychology, 62(2), 236–259.
Guo, Z., Xiao, L., van Toorn, C., Lai, Y., & Seo, C. (2016). Promoting online learners’
continuance intention: An integrated flow framework. Information & Management, 53,
279–295.


Hoffman, D. L., & Novak, T. P. (1996). Marketing in hypermedia computer-mediated environments: Conceptual foundations. Journal of Marketing, 60(July), 50–68.
Hoffman, D. L., & Novak, T. P. (2009). Flow online: Lessons learned and future prospects.
Journal of Interactive Marketing, 23, 23–34.
Huang, M-H. (2003). Designing website attributes to induce experiential encounters. Computers in Human Behavior, 19, 425–442.
Jasmini, V. (2017, August 13). Online learning statistics and trends. eLearning Industry. https://elearningindustry.com/online-learning-statistics-and-trends
Karageorgakis, T. (2018, April 7). The importance of interactivity in eLearning programs.
eLearning Industry. https://elearningindustry.com/interactivity-in-elearning-programs-
importance
Kent, C., Laslo, E., & Rafaeli, S. (2016). Interactivity in online discussions and learning
outcomes. Computers & Education, 97, 116–128.
Kim, T., & Biocca, F. (1997). Telepresence via television: Two dimensions of telepresence may
have different connections to memory and persuasion. Journal of Computer-Mediated
Communication, 3(2). https://doi.org/10.1111/j.1083-6101.1997.tb00073.x
Koufaris, M. (2002). Applying the technology acceptance model and flow theory to online consumer behavior. Information Systems Research, 13(2), 205–223.
Kucuk, S., & Richardson, J. C. (2019). A structural equation model of predictors of online
learners’ engagement and satisfaction. Online Learning, 23(2), 196–216.
Li, H., Daugherty, T., & Biocca, F. (2002). Impact of 3-D advertising on product knowledge,
brand attitude, and purchase intention: The mediating role of presence. Journal of
Advertising, 31(3), 43–58.
Liaw, S. S. (2008). Investigating students’ perceived satisfaction, behavioral intention, and
effectiveness of e-learning: A case study of the Blackboard system. Computers &
Education, 51(2), 864–873.
Liaw, S. S., & Huang, H. M. (2013). Perceived satisfaction, perceived usefulness and interactive
learning environment as predictors to self-regulation in e-learning environments.
Computers & Education, 60(1), 14–24.
Lim, J., & Ayyagari, R. (2018). Investigating the determinants of telepresence in the e-commerce
setting. Computers in Human Behavior, 85, 360–371.
Liu, S. H., Liao, H. L., & Pratt, J. A. (2009). Impact of media richness and flow on e-learning
technology acceptance. Computers & Education, 52(3), 599–607.
Low, A. L., Low, K. L. T., & Koo, V. C. (2003). Multimedia learning systems: A future
interactive educational tool. The Internet and Higher Education, 6(1), 25–40.
Luo, N., Zhang, M., & Qi, D. (2017). Effects of different interactions on students’ sense of
community in e-learning environment. Computers & Education, 115, 153–160.
Meyer, A., Klingenberg, K., & Wilde, M. (2016). The benefits of mouse keeping—An empirical
study on students’ flow and intrinsic motivation in biology lessons. Research in Science
Education, 46, 79–90.
Meyer, K. A., & Jones, S. J. (2013). Do students experience flow conditions online? Journal of
Asynchronous Learning Networks, 17(3), 1–12.


Moreillon, J. (2015). Increasing interactivity in the online learning environment: Using digital
tools to support students in socially constructed meaning-making. TechTrends, 59(3), 41–
47.
Nel, D., van Niekerk, R., Berthon, J., & Davis, T. (1999). Going with the flow: Web sites and
customer involvement. Internet Research, 9(2), 109–116.
Oria, V. (2017, June 7). Lowering online student dropout rates. Inside Higher Ed.
https://www.insidehighered.com/digital-learning/views/2017/06/07/tools-lower-student-
dropout-rates
Ou, C., Joyner, D. A., & Goel, A. K. (2019). Designing and developing video lessons for online
learning: A seven-principle model. Online Learning, 23(2), 82–104.
Oudeyer, P-Y., Gottlieb, J., & Lopes, M. (2016). Intrinsic motivation, curiosity, and learning:
Theory and applications in educational technologies. Progress in Brain Research, 229,
257–284.
Özyurt, Ö., & Özyurt, H. (2015). Learning style based individualized adaptive e-learning environments: Content analysis of the articles published from 2005 to 2014. Computers in Human Behavior, 52, 349–358.
Paechter, M., Maier B., & Macher, D. (2010). Students’ expectations of, and experiences in e-
learning: Their relation to learning achievements and course satisfaction. Computers &
Education, 54(1), 222–229.
Pandey, A. (2017, May 30). How personalized e-learning engages learners-featuring a case
study. eLearning Industry. https://elearningindustry.com/personalized-elearning-engages-
learners-featuring-case-study
Park, J. (2011). Design education online: Learning delivery and evaluation. The International
Journal of Art & Design Education, 30(2), 176–187.
Rodriguez-Ardura, I., & Meseguer-Artola, A. (2016). E-learning continuance: The impact of
interactivity and the mediating role of imagery, presence, and flow. Information &
Management, 53, 504–516.
Rossin, D., Ro, Y. K., Klein, B. D., & Guo, Y. M. (2009). The effects of flow on learning
outcomes in an online information management course. Journal of Information Systems
Education, 20(1), 87–98.
Scheiter, K., Schubert, C., Schuler, A., Schmidt, H., Zimmermann, G., Wassermann, B., Krebs, M. C., & Eder, T. (2019). Adaptive multimedia: Using gaze-contingent instructional guidance to provide personalized processing support. Computers & Education, 139, 31–47.
Sharp, L. A., & Hamil, M. (2018). Impact of a web-based adaptive supplemental digital resource on student mathematics performance. Online Learning, 22(1), 81–92.
Skadberg, Y. X., & Kimmel, J. R. (2004). Visitors’ flow experience while browsing a web site: Its measurement, contributing factors, and consequences. Computers in Human Behavior, 20, 403–422.


Steuer, J. (1992). Defining virtual reality: Dimensions determining telepresence. Journal of Communication, 42(4), 73–93.
Sun, J-N., & Hsu, Y-C. (2013). Effect of interactivity on learner perceptions in web-based
instruction. Computers in Human Behavior, 29(1), 171–184.
Van der Heijden, H. (2004). User acceptance of hedonic information systems. MIS Quarterly,
28, 695–704.
Wang, P-Y., Vaughn, B. K., & Liu, M. (2011). The impact of animation interactivity on novices’
learning of introductory statistics. Computers & Education, 56, 300–311.
Wei, H. C., Peng, H., & Chou, C. (2015). Can more interactivity improve learning achievement in an online course? Effects of college students’ perception and actual use of a course-management system on their learning achievement. Computers & Education, 83, 10–21.
Xu, D., & Wang, H. (2006). Intelligent agent supported personalization for virtual learning
environments. Decision Support Systems, 42(2), 825–843.
Yeh, Y. C., Rega, E. M., & Chen, S. Y. (2019). Enhancing creativity through aesthetics-
integrated computer-based training: The effectiveness of a FACE approach and
exploration of moderators. Computers & Education, 139, 48–64.


Using Structured Pair Activities in a Distributed Online Breakout Room
Jeffrey Saltz and Robert Heckman
Syracuse University

Abstract
With the increasing availability of synchronous video-based breakout rooms within online courses,
a growing need exists to understand how to best leverage this technology for enhanced online
education. To help address this challenge, this paper reports on a case study that explored student
activity within online video-based breakout rooms via a Structured Paired Activity (SPA)
methodology. SPA, which is adapted from the concept of Paired Programming, defines a general
way to structure roles and activities for the participants within the breakout room. Initial qualitative
results suggest that the use of SPA in online breakout rooms increases student engagement and
process effectiveness. These results are potentially applicable to a broad range of web-based
synchronous online courses.

Keywords: online education, synchronous distance learning, breakout rooms

Saltz, J., & Heckman, R. (2020). Using structured pair activities in a distributed online breakout
room. Online Learning, 24(1), 227-244. https://doi.org/10.24059/olj.v24i1.1632



With the continued growth of online education (Allen & Seaman, 2013), and the increasing
ability for instructors to use video conferencing tools to share computer screens and documents, a
growing need exists to understand how to best leverage these technologies in order to enhance
online education. One method of learning often available within this type of web-based learning
environment is a breakout room, a form of peer collaborative learning where students
synchronously work together in small groups. A breakout session is an active learning technique
designed to engage a small group in solution of a problem outside of the larger class meeting
(Lougheed et al., 2012). Breakout sessions have been a staple of face-to-face class sessions, and
more recently have been employed in both asynchronous and synchronous online courses (e.g.,
Chandler, 2016; Martin & Parker, 2014).
The benefits of collaborative learning using breakout sessions have been demonstrated in many studies. These benefits include deeper learning, better grades, longer retention of
information, greater communication and teamwork skills, and a better understanding of the
professional environment in which students will work (Oakley et al., 2004). But Oakley and her
colleagues caution that these benefits are not automatic. Kuhn (2015) warns that “cognitive
collaboration with peers does not always yield identifiable benefits, and whether it does or not
appears to depend on who is learning what and under what conditions” (p. 46). Others have observed that students often have difficulties coordinating their interactions and achieving the
benefits of peer collaborative learning when left to their own devices (Hesse, Garsoffky, & Hron,
1997; cited by Weinberger, 2011). Hence, to achieve the benefits of peer collaborative learning,
instructors must create an effective classroom structure for teamwork. This challenge, of how to
design synchronous video-based breakout room student interaction, is especially acute because, in this type of environment, the instructor may not be able to actively monitor all the breakout rooms at the same time. In the face-to-face classroom, an instructor can more easily
observe, at least at a high level, all the team interactions at once.
To address the challenge of how to effectively use such rooms, this paper explores one
approach to structuring the activities in online breakout rooms. Specifically, it reports on a case
study observing two semesters of an introductory data science course that used a structured
methodology within its virtual breakout rooms. This approach, described as a Structured Paired
Activity (SPA) methodology, is loosely based on Pair Programming (PP), in which two
programmers work together at one keyboard. SPA can be considered a form of a collaborative
script designed to provide learners with a specific socio-cognitive structure that maps their roles
and interactions (Weinberger, 2011), and thereby overcomes some of the difficulties observed in
unscripted peer collaboration.
This case study was done within a data science course. Data science integrates concepts
across a range of fields, including computer science, information systems, software engineering,
and statistics. It combines basic computer coding with iterative problem-based discussions to
understand the goals of the effort, the knowledge needed to reach the goals, and the best approach
to solving the problem at hand. For these reasons, a data science course is an appropriate domain
to evaluate the use of this more structured process. While this case study was done within a data
science course, an additional goal of this research was to understand the potential applicability of
SPA in breakout rooms to other domains.
The case study compared team behavior when using SPA to behavior when students were
left to their own devices on how to work in virtual breakout rooms. It also explored how graduate
data science students perceived the utility of using SPA in breakout rooms. Specifically, this
research focused on the following questions:
RQ1: How does student team behavior change when using SPA, compared with unscripted
collaboration?
RQ2: Do students perceive SPA as being a useful structuring mechanism within virtual
breakout rooms?
This paper begins by describing previous research related to breakout rooms. Then a
description of SPA as well as an explanation of the methodology used in the case study is
discussed. This is followed by a review of the findings from the case study. Finally, a concluding
discussion includes possible next steps and limitations.

Review of Related Literature


This section first reviews the general topic of distributed synchronous group learning,
which has been in existence for almost twenty years. Next, research with respect to the use of
video-based breakout rooms is discussed. This is followed by a review of pair programming and, more importantly, distributed pair programming.


Emerging and Scripted Role Assignments in Distributed Synchronous Collaborative Learning
There has been significant research on the use of distributed synchronous group learning, much of it occurring when basic synchronous computer-mediated communication technology was first realized, approximately fifteen years ago. While there were many technical challenges, such as network bandwidth limitations, these research efforts typically focused on how an instructor should interact with a class during a synchronous online session. Of course, much of that computer-mediated communication was hindered by the lack of audio and video capabilities (Wang, 2004).
Despite these technology challenges, there was still a research focus on distance-based group collaboration. Generally, case studies (e.g., Chen, Ko, Kinshuk, & Lin, 2005) found that online synchronous live instruction could be valuable to students. However, when examining
synchronous collaboration in a chat environment, Pfister & Mühlpfordt (2002) noted that “lack of
coordination and coherence among contributions is a typical problem” and found that establishing
scripts within the chat environment helped provide some structure and improved student learning.
As the technology improved, it was noted that students, while skilled at watching videos,
still lacked the knowledge of how to collaborate in a formal synchronous learning environment
(Cole, 2009). Perhaps even more important, Warden, Stanworth, Ren, and Warden (2013), drawing on nine years of research evaluating synchronous learning environments, found that issues were typically due not to technology but to human behavior, observing that “while students are familiar with virtual worlds and video meetings, they are inexperienced as virtual learners.” Since scripts were shown to improve synchronous collaboration in the chat environment,
scripts may also be a useful approach to help students overcome their inexperience as online
collaborative learners using other synchronous technologies. Weinberger (2011) suggests that
scripts can help learners engage in activities that are related to knowledge construction, reduce
process losses in complex collaborative learning arrangements by taking over coordination tasks
not inherently related to learning, and can make learners aware of the different responsibilities
within the group and thereby facilitate beneficial motivational states and self-regulation.
One scripting approach that has shown promise is the use of scripted role assignments.
Within this context, roles are defined as stated functions and/or responsibilities that guide students’
behavior and group interaction, and scripted role assignments specify and externalize the roles
expected from learners during collaboration (Strijbos & Weinberger, 2010). In a study of
undergraduate students in an asynchronous environment, Olesova et al. (2016) found that scripted
roles were an effective strategy to improve both learning processes and outcomes. They randomly
assigned students in online discussions into one of three roles (starter, skeptic, wrapper) or no role
at all. Role assignments were rotated. They found that students demonstrated a higher level of
cognitive presence when assigned a role than they did with no role assignment. In an earlier study
of an asynchronous learning environment, Aviv et al. (2003) found that knowledge construction
and critical thinking reached their highest level when the learning network was more highly
structured. Other researchers (e.g., Schellens et al., 2005; DeWever et al., 2010) have found that
different roles have different impacts on knowledge construction, with the summarizer role in
online discussions having the most positive effect. Research on scripted role assignment has also
suggested the importance of rotating assigned roles (O’Donnell & Dansereau, 1992).
There has been much discussion concerning the strengths and weaknesses of using scripts
to structure collaborative interaction in the computer supported collaborative learning (CSCL)
community. Kollar et al. (2006) and Weinberger (2011) have pointed out that preexisting, internal collaboration scripts may be in conflict with whatever scripted role assignments an
instructor might design. These internal, or emerging, scripts may be observed when
a collaborative activity is unscripted and allowed to proceed as the participants desire. Weinberger
(2011) cites three potential risks inherent in overscripting collaborative interactions. First, overly
constraining scripts can dampen student motivation (Rummel, Spada, & Hauser, 2009). Second,
externally provided scripts may also interfere with existing, well-functioning internal collaboration
scripts (Kollar, Fischer, & Slotta, 2007). Finally, externally provided scripts may, by their very nature, interfere with self-regulated, playful, and exploratory thinking (Dillenbourg, 2002). Thus,
it is important to observe and understand both scripted and emergent role assignments when
exploring role effects in distributed synchronous online breakout rooms.
Use of Breakout Rooms in Online Learning
Breakout rooms are increasingly used within online learning environments. For example,
Martin and Parker (2014) found that 25% of the surveyed online educators used breakout rooms.
In general, the use of breakout rooms encourages “learner-learner interaction,” which, as noted by
Moore (1993), is a valuable resource for learning. Chandler (2016) found that breakout rooms are
useful for facilitating collaborative learning and interaction. Chandler noted that breakout rooms
provide distance-learning students with the opportunity for peer-to-peer contact, which can be
invaluable in building relationships and confidence. Some have argued that the effectiveness of
active learning techniques such as breakout rooms lies as much in the enhancement of engagement
as in the ability to generate in-depth exploration of the topic (Redish, Saul, & Steinberg, 1997).
Some educators have even begun to research the incorporation of “escape-room” narrative and
gamification to provide experiential structure to the use of breakout rooms. They claim two
benefits of adopting the escape room strategy: a clear problem-based structure for students,
combined with a higher level of engagement.
However, there has been minimal research exploring the pedagogical aspects of breakout
groups in face-to-face or online classrooms. Lougheed et al. (2012) reported that research about
the use of breakout groups in postsecondary education is sparse. They also reported that most of
the published literature pertaining to the use of breakout groups describes the feedback generated
during the breakout sessions rather than specific pedagogical elements of the breakout groups
themselves. This dearth of published information highlights the need for research on the factors that affect the use of breakout groups in this context.
There has also been little research into online interaction during synchronous breakout
sessions (Brown, Schroeder, & Eaton, 2016). Two papers briefly discuss online breakout rooms,
but without any explicit focus on the viability of breakout rooms or the process to be used in the
breakout room. In one paper, Martin and Parker (2014) noted that using breakout rooms could
enhance interaction and build a sense of community; however, they did not examine the use or effectiveness of breakout rooms. Ellingson and Notbohm (2012) also discussed the use of breakout rooms, but focused on the technical details, such as how to set up a breakout room. They
described breakout rooms as an “appealing feature,” but did not discuss any guidelines on how to
use the breakout rooms, nor did they report on any observations of use of breakout rooms.
There are also some indications that breakout rooms do not automatically create engagement and higher levels of learning. Blackstone and Oldmixon (2016) found that students in a lecture class with breakout sessions were not more satisfied and did not succeed at higher levels compared to their peers in a lecture-only class. Lougheed et al. (2012) found that higher-GPA students had a significantly less favorable response to the use of breakout groups than did their
lower-GPA colleagues. In these studies, researchers speculated that possible reasons for these
findings were that students may not have been clear about what they were supposed to gain from
the breakout group sessions, or that some (e.g., high-GPA students) may have found that the
structure of the sessions did not meet their needs.
These concerns are consistent with the critique of Kuhn (2015), who argues that the results of collaborative learning are often precarious and that it therefore should not be considered a “silver bullet.” Kuhn argues that without careful design attention to the nature of the task or problem, and
specification of the learning goals expected, the outcome of any collaborative learning intervention
is likely to be unpredictable. Thus, what little research exists on the subject of synchronous
distributed breakout rooms suggests that much more attention needs to be paid to the pedagogical
structures and scripts used to prepare students to use them.
Distributed Pair Programming
Pair Programming (PP) is an agile software development technique that is part of Extreme
Programming (XP). When using PP, two developers work together, side-by-side, at one keyboard.
One person, “the driver,” types at the keyboard. The other person, “the observer,” reviews each
line as it is typed, checking for errors and thinking about the overall design (McDowell et al.,
2002). Distributed Pair Programming (DPP) is pair programming with the two programmers
working at a distance via online tools (Hanks, 2005). Pair programming is thought to provide
several benefits, including fewer errors in the code, enhanced ability to share best practices, faster
team learning, and social support that improves morale.
Research in DPP within an educational context has typically reported on the use of DPP
when the students have been able to build a relationship within a face-to-face context. Early
research with respect to DPP, such as Stotts et al. (2003), used students within a face-to-face class
to compare the results of DPP and PP. Even though the technology used was not as advanced as
what is possible today, in those early experiments, DPP was shown to have a positive impact on
outcomes, similar to PP. In more recent research, Tsompanoudi et al. (2016) implemented a system
that supports the application of DPP within an integrated development environment (IDE), and found that the use of collaboration scripts defined to implement DPP yields benefits such as improved student learning. Like many of the earlier studies, their experiment was for a
face-to-face class that used DPP, not for a distributed team using DPP. In fact, in a review of DPP
research, Estácio (2015) notes that while there have been 34 articles discussing DPP, these papers
have primarily covered tools to support DPP, or reported on experiments where a face-to-face class
uses DPP (e.g., Stotts et al., 2003; Tsompanoudi et al., 2016), and that “few studies explore DPP as a
pedagogical tool and how DPP could be integrated with the trend of online courses.”
Overall, researchers have rarely explored DPP when the students were not taking a colocated, face-to-face class. This distinction is important, since colocation enables students to establish a connection in a face-to-face context and then use online tools to do DPP. This gap in
the research has also been noted by Edwards et al. (2010), who called for more comprehensive and
intensive investigation into the power of pair programming when used within purely online
courses.


Motivation for the Study


Gaps in the literature reviewed above provide three dimensions of motivation for this study.
First, scripted collaboration role assignments have primarily been studied in asynchronous environments. This prior research suggests potential benefits and risks that should also be explored in synchronous environments, especially in the comparison of scripted versus naturally emerging role behavior. Thus, this study observes role behavior in both emergent and
scripted situations. Second, while technology advances have made the use of distributed online
breakout rooms more common, research on the pedagogical structures supporting their use has
been sparse. Finally, while the use of the Distributed Pair Programming concept provides a
potential model for breakout room role assignment, there has been little research exploring its use
in purely online courses. Thus, this study explores the impact of using a Structured Pair Activity
(SPA) methodology for scripted role assignment on students’ collaborative behavior in distributed
online breakout rooms.

Methods
Pair programming concepts were used to develop the SPA scripts, which structured student
collaboration in breakout rooms during an online data science course.
The impact of using SPA within breakout rooms was explored via a case study. Merriam
(1988) indicated that a case study should have a bounded system that can be identified as the focus
of the investigation. This study examines the process of using SPA within synchronous online
breakout room sessions, where students have access to video conferencing, chat, and the sharing
of files.
Case Study Context and Setting
SPA was evaluated within two one-semester sections of an online graduate-level
introduction to data science course. In addition to the class’s asynchronous activities, the course
also met in a synchronous online session weekly at a specific day and time. For part of each
synchronous session, students worked in two-person teams using breakout rooms. Over the two
semesters, 26 graduate information systems students participated in the study. Students were randomly assigned into teams of two for work in the breakout rooms. Twelve students (six
teams) were in the first semester’s class and 14 students (seven teams) were in the second
semester’s class. The same breakout teams were used across the entire semester. The students had
a wide variety of educational and career backgrounds. Twenty-five percent of the participants were
female. The students were geographically distributed across multiple time zones, with students
participating from North America, Europe, and the Middle East. Eighty percent of the students had
a STEM-focused undergraduate degree. Finally, 92% of the students had full-time jobs. The
instructor, a coauthor of this research, was the same for each of the two semesters and had
previously taught the data science course many times.
Each week, over an eight-week period, there was a different breakout-room assignment.
Five of the assignments were programming assignments. In them, students were required to use
the R programming language, a popular data science tool that is used in both industry and
academia. For these assignments, the student teams were expected to do R programming, using
typical data science techniques such as machine learning algorithms and geographic information
analysis. For two of the assignments, the work focused on a more qualitative task that required students to document the result of the team’s discussion. The final assignment was the creation of
a client presentation visualizing the results of the data analyses and documenting the outcomes that
would likely be actionable by their client. The sequence of these breakout assignments is shown
in Table 2. The synchronous sessions were 90 minutes long and typically the students were in
breakout rooms for 40 to 50 minutes. The technology used for the synchronous sessions was
similar to that described by Martin & Parker (2014) and included video conferencing, chat, screen-
sharing, and the sharing of documents. Each virtual breakout room was equipped with similar
tools.
Over the two-semester period, 104 breakout sessions were monitored. In each semester,
the first four breakout sessions used a baseline condition where the instructor provided the
assignment to be done in the breakout room but provided little guidance with respect to how the
students should collaborate. These sessions provided an opportunity to observe naturally occurring, or emergent, role behavior. Three of these four breakout sessions focused on coding
tasks. For the following four breakout sessions, two of which were primarily coding tasks, the
students used SPA. Thus, these sessions provided an opportunity to observe the impact of scripted
role assignments.
Structured Pair Activity
For the first four sessions, the process used within the breakout room was left to the
students. There was no specific process defined for them to use. Based on industry best practices
(McKinnie, 2018), as well as the lack of identified research addressing how to use breakout rooms, this baseline condition, with unscripted role assignments, appears to be common practice for many instructors who use breakout rooms. SPA was then used to provide a structure of scripted
role assignments for the following four weeks.
Before the first use of SPA, the SPA process was explained to students via discussions and
a documented presentation. Specifically, SPA role assignments were described to the students in
terms of the following key concepts:
• Within each breakout room, there was one driver (the person who had control of the shared screen and was typing within a shared document). The second student was the active observer who, via the shared screen, saw what was being written by the driver (R programming code or other documents such as a PowerPoint presentation). These roles were explained to the students.
• Drivers were instructed with the following scripted role assignment:
When you're the driver:
Agree with your partner on one tiny goal at a time, something you can complete
within a few minutes.
State the problem in words.
Talk to your partner!
Ensure that you both know what you are working on right now.
Complete the current tiny task (e.g., coding goal, presentation text, etc.) as quickly
as you can.
Ignore larger issues (but note them out loud).
Trust the observer to be your safety net.


• Observers were instructed with the following scripted role assignment:
When you're the observer:
Read what the driver is writing as he or she writes it; evaluate it for accuracy.
Your job is to review and think how it fits into the larger picture.
Pay total attention, aiming to let nothing get by you.
Think about possible issues and ways to simplify.
Bring up issues directly related to the tiny task.
Wait until the current tiny goal is done to bring up larger issues and ideas for design
improvement.
Don't dictate; the driver should be actively thinking about how to achieve the current tiny task, not just typing.
Exploit the fact that you don't need to focus on the details.
• All students were encouraged to be actively engaged with each other, to share their
thoughts and ideas, and to ask questions.
• Students were instructed to frequently rotate roles between driver and observer, with a goal
of rotating every fifteen minutes.
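To make the rotation cadence concrete, the following small Python sketch generates a driver/observer schedule for a breakout session; the pair labels, session length, and rotation interval are illustrative assumptions, not part of the course materials.

```python
# Illustrative sketch: generate a driver/observer rotation schedule for a
# breakout session. Names, session length, and interval are hypothetical.
from itertools import cycle

def rotation_schedule(pair, session_minutes=45, rotation_minutes=15):
    """Alternate driver/observer roles every rotation_minutes."""
    roles = cycle([(pair[0], pair[1]), (pair[1], pair[0])])
    schedule = []
    for start in range(0, session_minutes, rotation_minutes):
        driver, observer = next(roles)
        end = min(start + rotation_minutes, session_minutes)
        schedule.append((start, end, driver, observer))
    return schedule

for start, end, driver, observer in rotation_schedule(("Student A", "Student B")):
    print(f"{start:>2}-{end:<2} min: driver = {driver}, observer = {observer}")
```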
Evaluating the Impact of SPA
To evaluate the impact of SPA, the research adapted Hackman’s team effectiveness model
(1987). This model, shown in Figure 1, states that to evaluate the effectiveness of a team process,
one should observe task process and output, the team's continued desire to work together, and the satisfaction of individual team members.

[Figure 1 depicts three inputs to process effectiveness: task process and output, satisfaction of individual team members, and willingness of the team to work together on future tasks.]

Figure 1. Evaluating the effectiveness of a process (adapted from Hackman, 1987).

To evaluate the model shown in Figure 1, multiple data sources were used, which is
consistent with Eisenhardt (1989). First, since an instructor was able to easily move between the
breakout rooms unobtrusively, systematic instructor observations provided insight into how the
teams were working together, the group dynamics within each team, and the ways the scripted role
assignments affected team behavior (answering RQ1). Students were informed that the instructor
would periodically observe their interactions. During each breakout session, the instructor
systematically moved through each of the different breakout rooms, observed the student teams in
each breakout room, and documented those observations. Each room was observed for 3–5 minutes
at a time and each room was visited 2–3 times per class session.


The observations of student behavior patterns were focused through a set of specific questions: Who is leading the conversation? Are students equally participating in the dialog? Does the team appear to be productive and effectively working toward completion of the task? Are there indications of expert-novice conditions (or experience gaps)? In sessions where SPA was used, the instructor added an additional question: Are students rotating roles? These systematic observations
provided a qualitative view of task process and output for each team.
Student satisfaction with SPA (RQ2) was explored through a three-item student
satisfaction scale, which had a reliability, or internal consistency, of 0.94 (based on Cronbach’s
alpha). The scale consisted of the following three items:
I want to use SPA for future small group assignments.
SPA was useful for our work.
I am satisfied when using SPA.
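For reference, Cronbach's alpha for a k-item scale is alpha = k/(k-1) * (1 - (sum of item variances) / (variance of total scores)). The following minimal Python sketch computes it from a hypothetical response matrix standing in for the actual survey data.

```python
# Minimal sketch of Cronbach's alpha for a k-item scale.
# Rows are respondents, columns are the three satisfaction items;
# the values are hypothetical, not the study's survey data.
import numpy as np

responses = np.array([
    [5, 5, 5],
    [4, 4, 5],
    [3, 4, 3],
    [5, 4, 5],
    [4, 4, 4],
])

def cronbach_alpha(items):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)     # per-item sample variances
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```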
The survey also included an open-ended question:
What were the strengths and weaknesses of the SPA breakout room process?
The survey was given to students at the end of each semester as part of a voluntary course
evaluation process. The response rate was 69%. Consistent with IRB review guidelines, student survey
participation was voluntary and students were informed that survey results could be used, in an
anonymous fashion, as part of an ongoing pedagogical research project.
Finally, an indicator of students’ willingness to work together on future tasks (RQ2) was
obtained when students had the opportunity to re-form teams for a subsequent project. Table 1 maps
the data sources to the key measures defined in our model to evaluate the effectiveness of SPA.

Table 1
Measuring Team Effectiveness

Key Measures                                            How Measured
Task process and output                                 Instructor observations
Satisfaction of individual team members                 Student survey
Willingness of team to work together on future tasks    Selection of (new) project team members

Results
Task Output and Team Process
An assessment of the effects of SPA on team process was made based on a systematic,
week-by-week observation of the student teams, as described above. The weekly observations
before the introduction of SPA are summarized in Table 2. Initial analysis showed similar results
in both semesters: each began with a mix of Codeveloper and One-Person-Dominant teams, and
the One-Person-Dominant teams modified their behavior similarly after the introduction of SPA.
Results from the two semesters have therefore been combined into a single presentation.

Table 2
Weekly Observations of Team Process Prior to the Introduction of SPA
Week 1: Discussion of a real-world situation and how it could use data science
• Pairs were mostly polite with each other.
• All teams began discussions of “How should we proceed?”
• 7 teams seemed to have a more talkative person, who appeared to lead or dominate the discussion.

Week 2: R Coding
• The teams exhibited three distinct patterns of role behavior:
• 7 Teams: One-Person-Dominant. One person, seemingly the most experienced, was the dominant person and did all the coding (this week was a coding assignment). In most of these teams, the other person was quiet and relatively uninvolved.
• 4 Teams: Codevelopers. Both team members contributed equally, cutting and pasting code to each other via the chat function. Neither dominated the interaction. They appeared to have clear emergent role expectations, or internal collaboration scripts, that were compatible.
• 2 Teams: Looking-For-Guidance. These two teams were continuously asking the instructor what to do next. Neither person was dominant, but they did not appear to have a functional emerging collaboration script.

Week 3: R Coding
• The two teams that had previously asked for help migrated to the One-Person-Dominant strategy. This left:
• 9 One-Person-Dominant teams: In most of these teams, the non-dominant person continued to be relatively uninvolved, trying to understand what the more experienced partner was doing.
• 4 Codeveloper teams: These teams continued to work effectively. Their internal collaboration scripts were active and functional.

Week 4: R Coding
• The same pattern continued as in week 3. Roles had become normalized into the two basic emergent role scripts: One-Person-Dominant (9) and Codeveloper (4).
• The less-experienced person in One-Person-Dominant teams remained relatively uninvolved, and there was a growing gap in their level of knowledge, since the “doers” were learning more while doing. Thus, this emergent script was not producing the desired learning outcomes for these individuals.

Observations on the Unscripted Sessions.


Across both semesters, in the first week, students’ use of the breakout rooms was often a
bit awkward. Because this was the first week of the course, the students did not know each other
well and did not want to “step on the other person’s toes”; they still had to develop a social
connection, especially since they were connected only via computer-mediated communication.
Unfortunately, during the following three unscripted sessions, only four teams (the Codeveloper
teams) were perceived to work effectively. The dynamics between the students during these
unscripted weeks appear to have been driven by a number of factors, such as how outgoing each
person was and how much knowledge each had with respect to the assignment. Hence, the more
outgoing and/or knowledgeable person often dominated the two-person discussion.
The weekly observations after the introduction of SPA are summarized in Table 3. From
these weekly observations, four key themes emerged suggesting that task process improved; these
themes are discussed in the rest of this section.

Table 3
Weekly Observations of Team Process After the Introduction of SPA

Week 5: Discussion of a real-world situation and how it could use data science
• There was some initial confusion on the roles and how to “rotate” who was “driving.”
• The instructor clarified questions and encouraged teams to swap who was driving and who was observing.
• Switching roles was technologically challenging due to the limitations of the platform.

Week 6: R Coding
• Most teams started to get the hang of SPA. They figured out workarounds to more easily switch who was driving (e.g., using Google Drive or emailing files).
• Teams started to become more productive and got into a rhythm of doing work.
• In seven of the nine original One-Person-Dominant teams (often due to an experience imbalance), the less experienced person was clearly more engaged and doing more. The amount of discussion was greater this week compared to last week.
• Two of the original One-Person-Dominant teams were still unable to swap driver/observer roles, and in these teams, the observer remained fairly uninvolved. One of these teams made no effort to switch roles.
• Some observers expanded their role to do outside research (e.g., they looked for solutions to problems on websites such as Google or Stack Overflow). The two original Codeveloper teams in the first semester were the leaders in this role expansion. They modified their previous co-equal collaboration scripts to include observer research while the driver was doing the writing/coding (this became an “active researcher/observer” role).

Week 7: R Coding
• At this point, there were only two One-Person-Dominant teams remaining; eleven were classified as Codeveloper teams.
• The active researcher/observer role spread further, with more observers becoming proactive in searching for answers using external resources.
• As teams became more comfortable, they seemed to be better at decomposing work into smaller tasks (short bursts of work), perhaps due to the need to switch roles and their improving experience in being able to do so.
• Students still did not switch roles as frequently as the SPA instructions called for (every 15 minutes); actual switching time was approximately 25 minutes.
• One team was still unable to switch roles.

Week 8: Creation of a presentation with visualization of findings
• Similar to week 7, teams were fairly predictable in how they were interacting.
• Student-to-student engagement and dialogue continued to increase.
• By the end of this week, only one team was still struggling to swap driver/observer; the observer on that team remained uninvolved.

Observations on the Sessions with SPA Scripted Role Assignments.


Improved team coordination and focus when using SPA. The dynamics within the
breakout rooms changed when students were introduced to SPA. The nondominant person in the
One-Person-Dominant teams started to be more productive. For example, it was noted that these
teams, when using SPA, “would quickly determine who was the driver, and what was their short-
term goal.” SPA seemed to provide two key advantages. First, it divided leadership between two
roles: doing the writing (the driver) and doing the brainstorming (the observer). This was
especially helpful for the originally One-Person-Dominant teams, as the definition and rotation of
the roles helped to balance the dominance, making the observer more active and giving both
students well-defined roles. Second, there was more two-way dialog under SPA, due to the active
nature of the observer role, than under the baseline condition.
Expanded observer responsibilities. During the first semester, the instructor observed
that in both of the initial codeveloper teams, the person in the SPA observer role often started to
work on tasks beyond what was suggested for the observer. Specifically, the students who were
observers would sometimes start to actively look for solutions (via websites such as Stack
Overflow or a specific data science website). These students then shared their insight with their
driver so that the driver could leverage that insight. The questions addressed by the observer when
doing the searching ranged from specific coding details (such as the parameters of a specific R
function) to much more conceptual open-ended questions (such as how one might handle missing
data). Since it was believed that this type of active research improved the team effectiveness, the
description of the observer was expanded to include this type of activity in the second semester.
This addition did not change the basic pattern of unscripted versus scripted role behavior in the
second semester. In the first semester, there were two original codeveloper teams and four original
one-person-dominant teams by week three. In the second semester, there were two original
codeveloper teams and five original one-person-dominant teams by week three. In both semesters,
all but one of the One-Person-Dominant teams evolved to become Codeveloper teams.
Role-switching difficulty. Even though the frequency of role rotation increased under
SPA, the teams did not rotate between driver and observer at the suggested frequency. Specifically,
the SPA instructions suggested that students rotate every fifteen minutes, but most teams rotated
approximately once every twenty-five minutes (i.e., one rotation within the breakout session). This
slower-than-prescribed switching was at least partly due to the technology being used, in that
switching roles was not seamless. For example, files needed to be explicitly “uploaded and then
downloaded” from one student to the other.
Increased student engagement. An unexpected observation was that, later in the course,
there was a perceived increase in student engagement (i.e., questions to the instructor, dialog
between students) compared to the first half of the course and to other course sections offered in
previous semesters (sections that used unscripted breakout rooms without SPA). This might have
been due to the observed bonding that occurred within the SPA breakout sessions, where the social
sharing of information was much greater than in a more traditional breakout room process. In other
words, a more well-defined breakout room process might have improved team bonding due to the
structure of alternating who was “in charge” (i.e., the person typing at the keyboard).
Team Member Satisfaction
To explore student satisfaction, the three-item student satisfaction scale described above
was used. The voluntary survey was administered at the end of each term, and the response rate
was 69%. The average student response for this scale was 4.4, suggesting that students were
relatively satisfied with SPA.
The open-ended qualitative feedback, within the same survey, was analyzed to more deeply
explore the drivers of student satisfaction when using SPA. Three key themes emerged that seemed
to drive their satisfaction. These themes are described below:
Improved learning. Students thought that their learning improved when using SPA. This
improved learning was driven by better insight shared between the partners. For example, one
student stated “I got to learn more by working with my partner in this way.”
Improved coordination & collaboration. Since a key goal of SPA is to improve
coordination between the two students, it was not surprising that several students noted that they
thought that SPA improved coordination, which often led to a feeling of improved collaboration.
For example, “it helped me coordinate with my partner” and “it allowed us to collaborate much
easier” were statements that exemplified how the students perceived their improved collaboration
when using SPA. However, one student did note a disadvantage to using SPA, in that “some people
are hard to keep on track, or are very rigid in needing control.” Note that this last feeling could
have been instilled during the first four sessions. In any event, this observation suggests that
additional upfront discussion about working in a team may be needed prior to the use of
breakout rooms.
Improved productivity. Students also focused on their perceived improved productivity.
For example, one student noted that “we were most productive during class time when we used
SPA versus on our own when we did not.” This productivity was also aided by the fact that students
thought it was easy to work with their partner, perhaps due to the structured dialog used with SPA.
For example, one student simply noted that SPA “Made it easy to work with someone else remote”
and another stated “It was an easy way to work with my partner.”
Willingness to Work Together on Future Projects
In terms of the students’ willingness to work together on future projects, after the four SPA
breakout sessions, the students had to form a project team to work on an end-of-the semester
project. Students were given the opportunity to stay in their current “breakout team” or select
different team members (with or without the help of the instructor). Ninety-two percent of the
students wanted to continue working with their breakout team member, and the others did not
strongly object to staying with their current breakout team. While this could have been driven by
students being comfortable with the status quo and not wanting to risk working with a “bad”
partner, it nevertheless does show that the students were at least not frustrated with the current
partner. Hence, there was a clear favorable response with respect to the students’ desire to continue
to work together on future projects.

Discussion
This paper defined a process, Structured Paired Activity (SPA), for use within breakout
rooms of an online course. A case study was performed to explore the effects of using SPA within
a breakout room. Systematic observations suggested that SPA was a useful way to provide
structure within breakout rooms and positively modified student behavior (thus addressing the first
research question). In addition, students also thought that SPA was a useful way to provide
structure (addressing the second research question). Furthermore, based on the fact that (1) task
output was thought to improve, (2) team members were very satisfied while using SPA, and (3)
the students wanted to continue working with their teammate, our model of process effectiveness
suggests that SPA was an effective intervention. One additional finding was that the use of
breakout rooms seems to have enabled learning via a social and constructive process. This
connectedness was evident via increased student-to-student interaction during class as well as
increased student-to-student communication outside of class.
Prior to the introduction of SPA, there was clearly a mismatch between the internal
collaboration scripts possessed by nine of the thirteen teams and the requirements of the virtual
breakout rooms. Since there was no instruction on how to use the breakout rooms, it is not
surprising that these teams experienced a momentary lack of support (underscripting) as described
by Dillenbourg (2002). Because of the scarcity of research on breakout rooms (virtual or face-to-
face) described by Lougheed et al. (2012), it is difficult to know how often students are provided
with little or no structure to guide collaboration in real breakout-room environments. But personal
experience suggests that such underscripting may not be uncommon. The introduction of SPA
alleviated this underscripting in all but two of the teams. The results support the idea that at least
some of the problems previously observed in breakout rooms (e.g., lack of success and satisfaction
(Blackstone & Oldmixon, 2016); dissatisfaction among higher-GPA students (Lougheed et al.,
2012)) can be attributed to lack of structure and underscripting.
It is also interesting to consider the four codeveloper teams that began the course with
seemingly effective internal collaboration scripts. Some research has suggested that external
scripts may interfere with previously effective internal collaboration scripts (Weinberger, 2011;
Kollar, Fischer, & Slotta, 2007). In this case, however, these student teams not only adopted the
SPA script but also positively modified it by expanding and enriching the SPA observer role. The
concerns expressed by Dillenbourg (2002), that externally provided scripts may by their very
nature interfere with self-regulated, playful, and exploratory thinking, were not evident in this
exploration of SPA. Thus, the SPA script seems robust and flexible enough to avoid both
overscripting and underscripting.
Ideally, the provision of an external collaboration script is intended to achieve several
different outcomes. First, the goal is to regulate learning activities and provide complementary
process knowledge that leads to more effective team performance (Weinberger, 2011). SPA
appears to have achieved this goal. Collaboration scripts are also intended to increase both
individual and shared domain knowledge. While this appears to be the case in this study, the qualitative design
provides no direct evidence. Future studies of SPA should develop explicit measures of individual
and team learning to study the learning effects of SPA. Designs such as those used by Kuhn (2015)
would be beneficial. This is especially germane since this study revealed expert-novice experience
gaps in a number of the teams. Such gaps may be common in many types of courses. Finally,
instructor-provided collaboration scripts are intended to help students learn how to collaborate
more effectively in the future; that is, the ultimate goal is that students will gradually transition
from external to internal collaboration scripts. This study indicated that students found SPA to be
useful. Future research should investigate how much of the SPA collaboration script is
internalized.
While over one hundred breakout sessions were observed, there were only two classes
in this case study, and each had a small number of students. Hence, one limitation is the small
sample size, in terms of both the number of students and the number of courses in the study.
Another limitation of this study’s design was, as mentioned above, the lack of a direct measure of
learning. In addition, SPA was compared to a straightforward no-script alternative. While this
alternative may reflect reality in a number of classrooms, a possible next step could be to explore
the value of using the SPA process versus breakout rooms with different structuring
methodologies.
While data science was an interesting class to evaluate SPA (since the assignments ranged
from open-ended discussions to more structured programming tasks), it would be interesting if
other types of courses evaluated SPA. For example, more discussion-focused courses could be
explored to better understand the strengths and weaknesses of SPA in other contexts. Kuhn’s
(2015) probing discussion of the types of skills best suited to collaborative learning (e.g.,
argumentation skills, inquiry skills) can be useful in guiding this future expanded research. In our
study, inquiry skills were clearly needed, as students labored to develop new ways of approaching
problems in data analytics. Research in other domains would help us to understand if some tasks
are better suited for using this methodology (or, in general, if some tasks are better suited for
breakout rooms).
In summary, this case study suggests that when students use the scripted SPA role
assignments in a video-enabled web-based breakout room, student process, productivity,
motivation and connectedness to other students improve. While additional research on how to best
structure student interaction in breakout rooms is required, this research indicates that the practice
of just sending students into a breakout room without much structure is not ideal.

References
Allen, I. E., & Seaman, J. (2013). Changing course: Ten years of tracking online education in
the United States. Sloan Consortium.
Blackstone, B., & Oldmixon, E. (2016). Assessing the effect of breakout sessions on student
success and satisfaction. PS: Political Science and Politics, 49(1), 117–121.
Brown, B., Schroeder, M., & Eaton, S. (2016). Designing synchronous online interactions and
discussions. IDEAS 2016: Designing for Innovation Selected Proceedings.
http://dx.doi.org/10.11575/PRISM/5325
Chandler, K. (2016). Using breakout rooms in synchronous online tutorials. Journal of
Perspectives in Applied Academic Practice, 4(3), 16–23.
Chen, N. S., Ko, H. C., Kinshuk, & Lin, T. (2005). A model for synchronous learning using the
Internet. Innovations in Education and Teaching International, 42(2), 181–194.
Cole, M. (2009). Using Wiki technology to support student engagement: Lessons from the
trenches. Computers & Education, 52(1), 141–146.
De Wever, B., Van Keer, H., Schellens, T., & Valcke, M. (2010). Roles as a structuring tool in
online discussion groups: The differential impact of different roles on social knowledge
construction. Computers in Human Behavior, 26(4), 516–523.
Dillenbourg, P. (2002). Over-scripting CSCL: The risks of blending collaborative learning with
instructional design. In P. A. Kirschner (Ed.), Three worlds of CSCL: Can we support
CSCL? (pp. 61–91). Open Universiteit Nederland.
Edwards, R. L., Stewart, J. K., & Ferati, M. (2010). Assessing the effectiveness of distributed
pair programming for an online informatics curriculum. ACM Inroads, 1(1), 48–54.
Ellingson, D. A., & Notbohm, M. (2012). Synchronous distance education: Using web
conferencing in an MBA accounting course. American Journal of Business Education,
5(5), 555–562.
Eisenhardt, K. (1989). Building theories from case study research. Academy of Management
Review, 14(4), 532–550.
Estácio, D., José, B., & Prikladnicki, R. (2015). Distributed pair programming: A systematic
literature review. Information and Software Technology, 63, 1–10.
Hackman, J. (1987). The design of work teams. In J. Lorcsh (Ed.), Handbook of Organizational
Behavior (pp. 315–342). Prentice Hall.
Hanks, B. (2005). Student performance in CS1 with distributed pair programming. ACM SIGCSE
Bulletin, 37(3), 316–320.
Hardin, J., Hoerl, R., Horton, N. J., & Nolan, D. (2014). Data science in the statistics curricula:
Preparing students to “Think with Data.” Retrieved from https://arxiv.org/abs/1410.3127
Hesse, F. W., Garsoffky, B., & Hron, A. (1997). Interface-design für computerunterstütztes
kooperatives Lernen [Interface design for computer supported cooperative learning]. In
L. J. Issing & P. Klimsa (Eds.), Information und Lernen mit Multimedia [Information and
learning with multimedia] (pp. 253–267). Beltz.
Kollar, I., Fischer, F., & Slotta, J. D. (2007). Internal and external scripts in computer-supported
collaborative inquiry learning. Learning and Instruction, 17(6), 708–721.
Kollar, I., Fischer, F., & Hesse, F. W. (2006). Collaboration scripts: A conceptual analysis.
Educational Psychology Review, 18, 159–185.
Kuhn, D. (2015). Learning together and alone. Educational Researcher, 44(1), 46–53.
Lougheed, J., Kirkland, J., & Newton, G. (2012). Using breakout groups as an active learning
technique in a large undergraduate nutrition classroom at the University of Guelph. The
Canadian Journal for the Scholarship of Teaching and Learning, 3(2), 1–15.
Martin, F., & Parker, M. A. (2014). Use of synchronous virtual classrooms: Why, who, and how?
Journal of Online Learning and Teaching, 10(2), 192–210.
McKinnie, R. (2018). Best practices for delivering virtual classroom training. Adobe. Retrieved
December 20, 2018, https://www.elearningguild.com/showfile.cfm?id=3159
Merriam, S. (1998). Qualitative research and case study applications in education. Jossey-Bass.
McDowell, C., Bullock, H., Fernald, J., & Werner, L. (2002). A study of pair-programming in an
introductory programming course. In Proceedings of the 33rd ACM SGICSE (pp. 38–42).
ACM.
Mellody, M. (2014). Training students to extract value from big data. Summary of a Workshop,
The National Academies Press. http://www.nap.edu/openbook.php?record_id=18981
Moore, M. J. (1993). Three types of interaction. In K. Harry, M. John, & D. Keegan (Eds.),
Distance education theory (pp. 19–24). Routledge.
Nicholson, S. (2016). The state of escape: Escape room design and facilities. Paper presented at
Meaningful Play 2016, Lansing, Michigan. Retrieved from
http://scottnicholson.com/pubs/stateofescape.pdf
Oakley, B., Felder, R., Brent, R., & Elhajj, I. (2004). Turning student groups into effective
teams. Journal of Student-Centered Learning, 2(1), 9–34.
O'Donnell, A. M., & Dansereau, D. F. (1992). Scripted cooperation in student dyads: A method
for analyzing and enhancing academic learning and performance. In R. Hertz-Lazarowitz
& N. Miller (Eds.), Interaction in cooperative groups: The theoretical anatomy of group
learning (pp. 120–144). Cambridge University Press.
Olesova, L., Slavin, M., & Lim, J. (2016). Exploring the effects of scripted roles on cognitive
presence in asynchronous online discussions. Online Learning, 20(4), 34–53.
O’Neil, M. (2014). As data proliferate, so do data-related graduate programs, The Chronicle of
Higher Education. Retrieved from https://www.chronicle.com/article/As-Data-
Proliferate-So-Do/144363
Pfister, H. R., & Mühlpfordt, M. (2002). Supporting discourse in a synchronous learning
environment: The learning protocol approach. In Proceedings of the Conference on
Computer Support for Collaborative Learning: Foundations for a CSCL Community
(pp. 581–582). Erlbaum.
Redish, E., Saul, J., & Steinberg, R. (1997). On the effectiveness of active-engagement
microcomputer-based laboratories. American Journal of Physics, 65(1), 45.
Rummel, N., Spada, H., & Hauser, S. (2009). Learning to collaborate while being scripted or by
observing a model. International Journal of Computer-Supported Collaborative
Learning, 4, 69–92.
Saltz, J., & Heckman, R. (2016). Big data science education: A case study of a project-focused
introductory course. Themes in Science and Technology Education, 8(2), 85–94.
Schellens, T., Van Keer, H., & Valcke, M. (2005). The impact of role assignment on knowledge
construction in asynchronous discussion groups: A multilevel analysis. Small Group
Research, 36(6), 704–745.
Strijbos, J., & Weinberger, A. (2010). Emerging and scripted roles in computer-supported
collaborative learning. Computers in Human Behavior, 26, 491–494.
Stotts, D., Williams, L., Nagappan, N., Baheti, P., Jen, D., & Jackson, A. (2003). Virtual
teaming: Experiments and experiences with distributed pair programming. In F. Maurer
& D. Wells (Eds.), Conference on Extreme Programming and Agile Methods, Lecture
Notes in Computer Science, vol. 2753 (pp. 129–141). Springer.
Tsompanoudi, D., Satratzemi, M., & Xinogalos, S. (2016). Evaluating the effects of scripted
distributed pair programming on student performance and participation. IEEE
Transactions on Education, 59(1), 24–31.
Warden, C. A., Stanworth, J. O., Ren, J. B., & Warden, A. R. (2013). Synchronous learning best
practices: An action research study. Computers & Education, 63, 197–207.
Wang, Y. (2004). Distance language learning: Interactivity and fourth-generation Internet-based
videoconferencing. CALICO Journal, 373–395.
Weinberger, A. (2011). Principles of transactive computer-supported collaboration scripts.
Nordic Journal of Digital Literacy, 6(3), 189–202.

The Validity and Instructional Value of a Rubric for
Evaluating Online Course Quality: An Empirical Study
Ji Eun Lee and Mimi Recker
Utah State University

Min Yuan
University of Utah

Abstract
This study investigates the validity and instructional value of a rubric developed to evaluate the
quality of online courses offered at a midsized public university. This rubric is adapted from an
online course quality rubric widely used in higher education, the Quality Matters rubric. We first
examine the reliability and preliminary construct validity of the rubric using quality ratings for 202
online courses and eliminate 12 problematic items. We then examine the instructional value of the
rubric by investigating causal relationships between: (a) course quality scores, (b) online
interactions between students, instructors, and content, and (c) student course performance (course
passing rates). A path analysis model, using data from 121 online courses enrolling 5,240 students,
shows that only rubric items related to learner engagement and interaction have a significant and
positive effect on online interactions, while only student-content interaction significantly and
positively influences course passing rates.

Keywords: online course quality, rubric, online interactions, rubric reliability, rubric
validity, quality matters rubric

Lee, J. E., Recker, M., & Yuan, M. (2020). The validity and instructional value of a rubric for
evaluating online course quality: An empirical study. Online Learning, 24(1), 245–263.
https://doi.org/10.24059/olj.v24i1.1949

The Validity and Instructional Value of a Rubric for Evaluating Online Course Quality:
An Empirical Study
The number of college students taking online courses has increased dramatically over the
past decade, with almost 31% of U.S. undergraduate students (about 5.2 million) having taken at
least one course online as of the 2016 fall semester (McFarland et al., 2018). With this rapid growth
in the number of online courses, evaluating their quality has taken on a new urgency. While many
approaches have been developed to evaluate online course quality for example, surveys, checklists,
observations, peer reviews, and expert reviews—one common way is through quality rubrics
(Custard & Sumner, 2005; Jaggars & Xu, 2016; Roblyer & Wiencke, 2003; Yuan & Recker, 2019).
With a quality rubric, a course can be rated along several constituent quality dimensions—for
example, the Quality Matters (QM) rubric (Quality Matters, 2018) consists of eight dimensions,
such as learning objectives, instructional materials, learner support, accessibility, and usability.
Each of these dimensions may, in turn, be composed of one or more specific quality indicators
(Custard & Sumner, 2005). In addition, for each indicator, rubrics often use rating scales and may
be accompanied by a scoring guide.
While quality rubrics are commonly used in many higher education institutions, few rubrics
have been empirically tested in terms of their reliability or validity (Yuan & Recker, 2015).
Moreover, an often-ignored aspect of course quality is its influence on online interactions and
student outcomes; in other words, the instructional value of the rubric. A key assumption is that a
well-designed course following a proven instructional design theory will enhance student learning
and engagement and thereby lead to improved outcomes (Reigeluth, 1999). Thus, a course that
scores high on quality should result in better student outcomes than one receiving a low score.
However, this relationship has seldom been examined in the literature (Jaggars & Xu, 2016).
The purpose of this article is twofold. The first is to test the validity of a rubric developed
to evaluate the quality of online courses offered at a midsized public university. This rubric, called
the AS rubric, was adapted from the QM rubric. The QM rubric is one of the most widely used
rubrics in higher education and its design is informed by online learning research (Quality Matters,
2018). In particular, using the course quality scores from 202 online courses, we examined the
preliminary construct validity of the AS rubric.
The second purpose is to examine the implicit logic linking online course quality to online
interactions and student course performance. We investigated the causal relationships between
course quality scores, online interactions between students, instructors, and content, and student
performance as measured by their course passing rates. We characterized student and instructor
online interactions in a subset of these online courses (the number of courses = 121; the number
of students = 5,240) using the clickstream data automatically captured by the learning management
system (LMS) for these courses. Finally, we examined the extent that the course quality measures,
mediated by student and instructor interactions, influenced passing rates. The specific research
questions guiding this research are:
1. To what extent is the AS online course quality rubric valid in measuring quality along
a number of course quality dimensions? Which specific indicators are reliable (internal
consistency reliability of the rubric) and valid (construct validity of the rubric)?
2. How do the course quality measures, when mediated by student and instructor online
interactions, influence course passing rates?
Figure 1 articulates the logic underpinning this study: an online course that rates highly on
quality along several key dimensions will positively influence the online interactions of its students
and instructors and how they interact with content, which will ultimately lead to improved course
performance. Figure 1 also illustrates how these three constructs are operationalized in our study.

Figure 1. The study’s logic linking instructional design to student course performance with
measures for each component.

Review of Literature
In this section, we review the literature related to these three constructs shown in Figure 1.
We first review the growing literature surrounding the use of course quality rubrics in higher
education. We also specifically review the few studies that examine the relationship between
online course quality scores and student learning outcomes. Finally, we describe a framework for
characterizing and classifying interactions in online courses.
Course Quality Rubrics
We conducted a search of course quality rubrics in ERIC and Google Scholar with the
following keywords: online course, quality, rubric, and evaluation. We also found rubrics from
reviewing references of existing rubrics and getting recommendations from colleagues. These
strategies yielded 31 rubrics. Ten course quality rubrics were ultimately selected based on the
following criteria: they (a) were used for evaluating the quality of online courses; (b) consisted of
more than two dimensions, with accompanying definitions of the dimensions; and (c) were used
in higher education settings. Building on the approach used in a prior review of the quality rubric
literature (Yuan & Recker, 2015), we examined online course quality rubrics used by higher
education institutions in terms of three aspects: (a) development process, (b) quality dimensions,
and (c) results of reliability and validity testing.
First, in terms of the development process, most of the rubrics were adapted from other
existing rubrics, rather than based on online learning theories or models (see Table 1). Regarding
revisions to the rubrics, eight rubrics noted that they went through several rounds of revisions.


Table 1
Development Process, Reliability, and Validity of the Ten Rubrics Reviewed

1. Checklist for Evaluating Online Courses (Southern Regional Education Board, 2006)
   Development process: Developed based on Southern Regional Education Board’s standards for quality online courses.
   Reliability & validity (publicly reported): Not reported.

2. Quality Standards Inventory (Egerton & Posey, 2007)
   Development process: Developed based on the principles of active learning and effective teaching.
   Reliability & validity: Not reported.

3. Online Course Design Rubric (New Mexico State University, 2011)
   Development process: Developed based on QM; noted that “the rubrics are updated regularly.”
   Reliability & validity: Not reported.

4. Online Course Best Practices Checklist (Palomar College, 2012)
   Development process: Informed by a few existing rubrics (e.g., Blackboard, QM); revised several times.
   Reliability & validity: Reported that “a pilot test of the checklist was conducted,” but specific results were not reported.

5. Quality Learning and Teaching Instrument (California State University, 2015)
   Development process: Informed by existing rubrics and models (e.g., QM, Community of Inquiry model); revised several times.
   Reliability & validity: Not reported.

6. Online Educational Initiative Course Design Rubric (California Community College, 2016)
   Development process: First version developed in 2014 by the OEI Course Development work group; revised based on feedback from instructors and reviewers.
   Reliability & validity: Not reported.

7. Exemplary Course Program Rubric (Blackboard Inc., 2017)
   Development process: First developed in 2000; reviewed and updated annually by Blackboard experts.
   Reliability & validity: Not reported.

8. Rubric for Evaluating Online Courses (University of North Dakota, 2017)
   Development process: Developed based on a few existing rubrics (e.g., Blackboard); revised several times.
   Reliability & validity: Not reported.

9. Quality Online Course Initiative Rubric (Illinois Center College, 2017)
   Development process: Informed by existing rubrics; dimensions were brainstormed first and then chunked into categories; revised several times.
   Reliability & validity: Not reported.

10. Quality Matters (QM): Course Design Rubric Standards (2018)
    Development process: Informed by a few research articles and revised based on users’ inputs; revised across several versions.
    Reliability & validity: Improvement process reported (Shattuck et al., 2014); measured “rater agreement.”

Second, with regard to quality dimensions, although each rubric used slightly different
terms, our review found five common dimensions for measuring online course quality across the
rubrics. These were: (a) course design and introduction, (b) learning objectives and assessment, (c)
interaction and collaboration, (d) learning resources and support, and (e) course technology and
accessibility. However, the rubrics also showed differences in their evaluation focus. For instance,
Rubric #10 (Quality Matters, 2018) consisted of 42 weighted items, with almost 30% of the weight
addressing “learning objectives and assessment” and only 11% of the weight focused on
“interaction and collaboration.” In contrast, Rubric #6 (California Community College, 2016)
emphasized “course technology and accessibility” with 48% of the total items related to these
issues.
Finally, rubrics require sufficient levels of reliability and validity (Roblyer & Wiencke,
2003). Despite the importance of establishing reliability and validity of rubrics, none of the
reviewed rubrics publicly reported the results of reliability or construct validity tests. Only two
rubrics (Rubric #4 and #10 in Table 1) noted that they underwent empirical testing, such as a
measurement of rater agreement, but details were not reported. This lack of reliability or validity
testing calls into question the rubrics’ overall suitability for rigorously evaluating online course
quality (Yuan & Recker, 2015).
To summarize, the ten rubrics reviewed in this study show similarities in the dimensions
addressed and the rating scales used, but they differed in their focus for evaluation. These
differences seem reasonable, as all higher education institutions have different needs, interests,
and criteria for evaluating online courses (Britto, Ford, & Wise, 2013). However, from a research
perspective, key questions remain: which dimensions are more important in evaluating the quality
of an online course? Which dimensions better predict student performance?
Course Quality and Student Learning Outcomes
Our literature review suggests that rubrics for measuring course quality have been validated
mostly in terms of the opinions and perceptions of faculty and students, rather than in terms of
construct validity or relationships to learning outcomes (Hixon, Barczyk, Ralston-Berg, &
Buckenmeyer, 2016). Empirical studies (Jaggars & Xu, 2016; Lee, 2014; Liu et al., 2010; Sun et
al., 2008; Swan et al., 2012) have found that a course with high quality scores measured by rubrics
resulted in higher student learning outcomes in terms of course performance or satisfaction than
one receiving low quality scores. However, studies also showed that not all scores on dimensions
of the rubrics significantly predicted learning outcomes (Jaggars & Xu, 2016; Lee, 2014; Sun et
al., 2008). For instance, Jaggars & Xu (2016) explored the relationship between rubric scores from
23 online courses and student final grades at two community colleges in the U.S. Results revealed
that among the four rubric dimensions, only the “interpersonal interaction” dimension had a
statistically significant and positive impact on student final grades. Thus, while well-organized
courses or well-described learning objectives might be desirable, these quality aspects may not
lead to better learning outcomes per se.
Characterizing Interactions in Online Learning
Interactions among learners, instructors, and content are integral components of online
education (Bernard et al., 2009). A widely used framework for examining interactions in online
education is Moore’s (1989) interaction framework. This framework classifies interactions into
three types: Student-Instructor, Student-Student, and Student-Content.
Later, Anderson and Garrison (1998) expanded Moore’s framework by differentiating
between Student-Content and Instructor-Content interaction. These four types of interactions are
defined by Anderson (2008) as Student-Instructor (SI), Student-Student (SS), Student-Content
(SC), and Instructor-Content (IC). SI interaction refers to communication between learners and
experts, which includes instructor feedback, support, and encouragement to learners. SS
interaction is defined as communication between one learner and other learners, including
collaborative or cooperative settings. SC interaction includes student activities such as reading
course materials, watching lecture videos, and completing assignments. IC interaction refers to
instructors creating, monitoring, or modifying content or learning activities.
Many empirical studies have examined how the strength of interactions is associated with
student learning outcomes, such as their performance or satisfaction (Borokhovski et al., 2012;
Choi, Lee, Hong, Lee, Recker, & Walker, 2016; Hoey, 2017; Ke, 2013; Kuo et al., 2013; Murray
et al., 2012; Sher, 2009). However, the effects of each interaction type on learning outcomes have
not been found to be equal. Our review found that studies yielded different results depending on
the outcome variable studied.
First, studies that used measures of student course performance as dependent variables
indicated that the effects of SC or SS interaction were larger than the effect of SI interaction on
student performance. For instance, Bernard et al. (2009) reviewed 74 empirical studies to examine
the effects of three types of interaction (SS, SI, SC) strength on student performance. The results
of a meta-analysis revealed that the effects of SS and SC interactions were significantly larger than
the effect of SI interaction on performance. Similarly, in other studies, SS or SC interactions (Ke,
2013), SS interaction (Borokhovski et al., 2012; Choi et al., 2016), or SC interaction (Murray et
al., 2012) had significant and positive influences on student performance.
Second, studies that used student affective outcomes as dependent variables tended to show
somewhat different results. For instance, in the meta-analysis by Bernard et al. (2009), the effect
of SS interaction was significantly larger than the effects of SC or SI interactions on student
attitudes. However, a study by Kuo et al. (2013) produced opposite results, finding that SC and SI
interactions were significant predictors of student satisfaction, while SS interaction was not. To
summarize, our review found that the effects of each interaction type differed depending on the
dependent variable used in the study and the characteristics of interactions analyzed.

Methods
Course Quality Rubric
This study used course quality rating scores collected through a rubric used at a midsized
public university in the U.S. The rubric was developed collaboratively by instructional designers
at an Academic Support (AS) unit in order to support instructional designers in better designing
online courses as well as ensuring online course quality at this university. The AS rubric was
adapted from the well-established and reliable QM rubric and consists of nine dimensions (course
organization, course introduction and syllabus, learning objectives, assessments and activities,
resources and materials, interaction and learner engagement, accessibility, course technology, and
learner support) and 51 items to measure online course quality.
However, we identified several problems with these predefined dimensions. First, the
number of items measuring each quality dimension, which influences the coefficients of internal
consistency and reliability (Drost, 2011), varied widely across the dimensions (from 3 to 12 items).
Second, some items did not adequately reflect their dimension, which raises a content validity issue.
For instance, one item in the “course introduction and syllabus” dimension, “provides clear
expectations for student response, engagement, and participation,” also aligned to the “interaction
and learner engagement” dimension. For these reasons, we decided to ignore the predefined
dimensions and generate new ones using the results of an exploratory factor analysis, described
below.
Research Context and Participants
To measure the preliminary construct validity of the AS rubric (RQ1), we used course
quality scores collected from the ratings of 202 online courses offered at this university from 2012
to 2016. Among the 2,797 courses offered during this period, the instructional designers randomly
selected 202 courses and evaluated their course quality using the AS rubric. The courses included
both undergraduate (173 courses, 85.6% of the sample) and graduate level courses (29 courses,
14.4% of the sample) from various academic disciplines. Each course was rated by one
instructional designer in the AS unit at the beginning of the semester. The items were rated on a
two-point scale (Yes = 1, No = 0); items left without a response were coded as null.
To measure the level of online interactions in each course (RQ2), we categorized instructor
and student clickstream data automatically collected by the university’s LMS into the four types
of interactions as defined by the framework described above (see Table 2). Of the original sample
of 202 courses, 81 lacked LMS interaction data or student final grades and were excluded from
further analysis. The remaining 121 courses enrolled a total of 5,240 students. All measures were
converted to Z-scores before computing the average level of interaction. We also measured student
course performance in terms of passing rates. This was computed by dividing the number of
students who successfully passed the courses (receiving grades of A, B, C, or D) by the number of
students enrolled in each course. Among these students, 169 students (3%) received a grade of W
(Withdrawal), indicating that the students dropped the course after the first three weeks of the
semester.
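As an illustration, the passing-rate computation can be sketched in R as follows; the grades data frame and its column names are hypothetical stand-ins for the institutional records.

    # Minimal sketch (base R): course passing rate = share of enrolled
    # students receiving A, B, C, or D. Data are hypothetical.
    grades <- data.frame(
      course_id = c("C1", "C1", "C1", "C1", "C2", "C2"),
      grade     = c("A",  "B",  "F",  "W",  "C",  "D")
    )
    grades$passed <- grades$grade %in% c("A", "B", "C", "D")

    # Proportion of enrolled students (W grades count as enrolled)
    # who passed each course: C1 = 0.5, C2 = 1.0 in this toy example.
    aggregate(passed ~ course_id, data = grades, FUN = mean)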
Table 2
Summary of LMS Variables Used to Measure the Four Types of Interaction

Instructor-Content (IC)
  ic_atta: # of attachments posted by an instructor
  ic_disc: # of discussion topics posted by an instructor
  ic_wiki: # of wiki topics posted by an instructor
  ic_quiz: # of quizzes posted by an instructor
  ic_assi: # of assignments posted by an instructor
  Measure: (ic_atta + ic_disc + ic_wiki + ic_quiz + ic_assi) / 5

Student-Content (SC)
  sc_atta: Avg. # of attachments viewed by a student
  sc_disc: Avg. # of discussions viewed by a student
  sc_wiki: Avg. # of wiki topics viewed by a student
  sc_quiz: Avg. ratio of quizzes completed by a student
  sc_assi: Avg. ratio of assignments completed by a student
  Measure: (sc_atta + sc_disc + sc_wiki + sc_quiz + sc_assi) / 5

Student-Student (SS)
  ss_disc: Avg. # of discussion messages (initial messages and replies) posted by a student
  Measure: ss_disc

Student-Instructor (SI)
  si_disc: # of discussion messages (initial messages and replies) posted by an instructor
  Measure: si_disc

Note. The course is the unit of analysis. All interaction measures were converted to Z-scores.
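Following the note’s Z-score convention, the composite measures in Table 2 might be computed along the following lines in R; the course-level data frame lms and its fabricated values are hypothetical stand-ins for the LMS export.

    # Minimal sketch (base R): composite interaction measures per course.
    ic_vars <- c("ic_atta", "ic_disc", "ic_wiki", "ic_quiz", "ic_assi")
    sc_vars <- c("sc_atta", "sc_disc", "sc_wiki", "sc_quiz", "sc_assi")

    set.seed(1)  # fabricate a small course-level table for illustration
    lms <- as.data.frame(matrix(rpois(5 * 12, 10), nrow = 5))
    names(lms) <- c(ic_vars, sc_vars, "ss_disc", "si_disc")

    # Z-score every raw measure first, then average the five components
    # for the IC and SC composites; SS and SI are single Z-scored items.
    z <- as.data.frame(scale(lms))
    lms$IC <- rowMeans(z[ic_vars])
    lms$SC <- rowMeans(z[sc_vars])
    lms$SS <- z$ss_disc
    lms$SI <- z$si_disc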

Data Analysis
Before examining the validity of the rubric (RQ1), the internal consistency reliability of
the AS rubric was measured using Kuder-Richardson formula-20 (KR-20) with two-point
measurement data. Specifically, we used a stepwise procedure to find unreliable items and to
maximize scale reliability (Raubenheimer, 2004). In the stepwise procedure, the least reliable item
is removed, as indicated by the expected increase in KR-20 coefficient for the subscale. Then, the
next least reliable item is removed, and the analysis is repeated until the removal of items does not
lead to an increase in reliability.
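A minimal R sketch of this procedure, assuming a 0/1 item matrix with one row per rated course, might look as follows; the exact variance estimator, tie-breaking, and stopping details of the authors’ implementation are not reported, so this is only an approximation.

    # Minimal sketch (base R): KR-20 and greedy stepwise item removal.
    # 'x' is a hypothetical matrix of dichotomous ratings (rows = courses,
    # columns = rubric items).
    kr20 <- function(x) {
      k <- ncol(x)
      p <- colMeans(x)  # proportion of courses scoring 1 on each item
      (k / (k - 1)) * (1 - sum(p * (1 - p)) / var(rowSums(x)))
    }

    stepwise_kr20 <- function(x) {
      repeat {
        current <- kr20(x)
        # reliability of the scale with each single item dropped
        dropped <- sapply(seq_len(ncol(x)),
                          function(j) kr20(x[, -j, drop = FALSE]))
        if (ncol(x) <= 2 || max(dropped) <= current) break
        x <- x[, -which.max(dropped), drop = FALSE]  # drop least reliable item
      }
      x
    }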
To examine the preliminary construct validity of the rubric, we conducted an exploratory
factor analysis (EFA) as we had little theoretical or empirical basis for the rubric’s design. Since
our data are dichotomous, we computed tetrachoric correlation coefficients and then conducted
an EFA using these coefficients. For the extraction and rotation methods, we chose unweighted
least-squares (ULS) extraction with Promax rotation, the recommended method for the analysis of
tetrachoric correlation coefficients (Han et al., 2001).
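A condensed version of this analysis in R might look like the sketch below; ratings is a hypothetical stand-in for the 202-by-43 matrix of dichotomous item scores.

    # Minimal sketch (R, psych package): EFA on tetrachoric correlations
    # with ULS extraction and Promax rotation.
    library(psych)

    tet <- tetrachoric(ratings)$rho  # tetrachoric correlation matrix
    efa <- fa(tet, nfactors = 9, fm = "uls", rotate = "promax",
              n.obs = nrow(ratings))
    print(efa$loadings, cutoff = 0.4)  # suppress loadings below .4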
For RQ2, we conducted a path analysis to investigate the relationships between online
course quality scores, online interactions, and passing rates. The path model tested three
hypotheses: (a) the online course quality scores influence all variables (the four types of
interactions) and passing rates; (b) the four types of interactions influence passing rates, and; (c)
the online interactions mediate the influence of online course quality scores on passing rates.
RStudio with the psych and lavaan packages was used for all analyses.
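The hypothesized paths could be specified in lavaan roughly as follows; variable names are hypothetical, with quality standing in for a rubric quality score and one row per course, so this is a sketch of the model structure rather than the authors’ exact specification.

    # Minimal sketch (R, lavaan): quality -> interactions -> passing rate.
    library(lavaan)

    model <- '
      # course quality predicts each interaction type
      IC ~ quality
      SC ~ quality
      SS ~ quality
      SI ~ quality
      # interactions (plus a direct path from quality) predict passing rate
      pass_rate ~ IC + SC + SS + SI + quality
    '
    fit <- sem(model, data = courses)  # 'courses' is a hypothetical data frame
    summary(fit, standardized = TRUE, fit.measures = TRUE)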

Results
Research Question 1: Reliability and the Preliminary Construct Validity of the AS Rubric
The first research question examined the reliability and the validity of the AS quality rubric
using its quality dimensions and items. To answer this question, we conducted an internal
consistency reliability analysis and an EFA. The initial KR-20 coefficient for 51 items was .82.
Next, the stepwise procedure was performed to maximize reliability. As a result, eight items were
eliminated (16% of the total) (see Table 3), and the KR-20 coefficient for 43 items increased to .87.
As summarized in Table 3, four of the eliminated items (item #39, #40, #41, #42) were related to
the “accessibility” dimension. The other four eliminated items (item #28, #30, #31, #47) related to
course technology issues.

Table 3
The Items Eliminated from the Reliability Test and the EFA

Items removed from the reliability test:
  item40 Scanned PDF documents are made screen readable with OCR technology.
  item41 Images used for learning have a visual description.
  item39 Audio is captioned or transcribed.
  item47 Course provides sufficient instructions for students on use of tools and media.
  item31 No unreasonable software requirements.
  item42 Images have an alt tag.
  item30 Resources & materials can be accessed with multiple operating systems.
  item28 Resources & materials are easily accessed and used.

Items removed from the EFA:
  item11 Provides clear expectations for instructor response and engagement.
  item08 Evaluation methods and assessment activities are clearly outlined.
  item29 Purpose of each element is explained.
  item32 Learner engagement and interaction activities promote achievement of learning objectives.

Next, we conducted an EFA using the remaining 43 items to examine the preliminary
construct validity of the rubric. The results of Bartlett’s test of sphericity (χ2[903] = 16200.13, p
< .05) and the Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy (KMO = .70) indicated
that our data were suitable for performing a factor analysis (Yong & Pearce, 2013). Forty-three
items were analyzed using an ULS extraction method with Promax rotation. For the convergent
validity, we used cut-off loadings of 0.4. Next, to determine the number of factors to retain for
rotation, we checked eigenvalues (Kaiser’s rule) and performed a parallel analysis. The results
indicated that the nine-factor solution had the cleanest structure (i.e., fewest cross-loadings and no
factors with fewer than three items).
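For reference, both retention checks can be run with the psych package as sketched below, reusing the hypothetical ratings data frame from the earlier sketch; the authors’ exact settings are not reported.

    # Minimal sketch (R, psych package): factor retention via Kaiser's
    # rule and parallel analysis on tetrachoric correlations.
    library(psych)

    pa <- fa.parallel(ratings, cor = "tet", fm = "uls", fa = "fa")
    pa$nfact               # number of factors suggested by parallel analysis
    sum(pa$pc.values > 1)  # Kaiser's rule: component eigenvalues > 1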
Table 4 shows the results of factor loadings for the 43 items. The nine-factor solution
explained 73% of the total variance. Among the 43 items, another four items were eliminated
because one cross-loaded onto two factors, and the other three did not have primary factor loadings
of .4 or above. These four items tended to have imprecise descriptions or criteria to evaluate course
quality, perhaps making use by raters difficult (see Table 3).

Table 4
Results of Factor Loadings for AS Rubric Items (43 items)
Items 1 2 3 4 5 6 7 8 9
Eigenvalues 5.99 4.08 3.70 4.03 3.09 3.01 2.51 2.59 2.44
% of variance 0.14 0.09 0.09 0.09 0.07 0.07 0.06 0.06 0.06
Cumulative % 0.14 0.23 0.32 0.41 0.49 0.56 0.61 0.67 0.73
item19 0.59 -0.13 -0.25 -0.20 0.26 -0.03 0.09 0.12 0.37
item22 0.42 0.13 0.21 0.08 0.22 0.16 -0.28 -0.04 0.29
item23 0.87 -0.15 0.15 -0.12 0.01 0.20 0.07 0.12 -0.43
item24 0.44 -0.09 -0.02 0.19 0.06 0.34 -0.17 0.24 -0.18
item25 0.58 0.32 -0.09 -0.23 0.15 0.00 0.29 -0.05 0.14
item26 0.68 0.18 -0.12 0.28 -0.17 -0.02 0.12 -0.13 0.16
item27 0.83 -0.01 0.11 0.16 -0.25 0.09 0.01 -0.01 0.03
item43 0.64 -0.15 0.09 0.20 0.20 0.06 -0.05 0.33 -0.10
item44 0.82 0.31 -0.14 0.07 -0.03 -0.01 -0.03 -0.30 -0.03
item48 0.51 0.13 0.09 -0.09 -0.12 -0.16 0.06 0.28 -0.01
item01 0.02 0.89 -0.22 -0.04 -0.01 -0.01 -0.06 -0.01 0.11
item02 0.08 0.71 0.06 -0.08 -0.13 0.16 -0.14 0.11 0.04
item03 0.17 0.84 0.14 0.10 -0.21 0.16 -0.22 -0.07 -0.09
item04 0.12 0.62 0.18 0.20 0.10 -0.01 0.03 -0.06 -0.02
item49 0.12 -0.03 0.78 -0.13 0.00 0.10 0.10 0.02 0.00
item50 -0.03 0.06 1.02 0.04 0.00 0.04 -0.05 -0.13 -0.07
item51 -0.08 0.06 0.98 0.06 0.02 0.01 -0.05 -0.12 0.00
item13 0.18 0.10 0.03 0.76 0.09 -0.06 0.38 -0.04 -0.13
item14 -0.06 0.30 0.20 0.55 0.22 0.07 -0.05 0.12 -0.08
item34 0.03 -0.05 0.16 0.73 0.21 -0.13 0.09 -0.37 0.18
item35 0.06 -0.15 -0.18 0.74 -0.22 0.35 0.24 -0.12 0.30
item36 0.39 0.03 -0.18 0.54 0.07 0.14 0.01 -0.06 0.03
item37 -0.01 0.14 -0.26 0.47 0.08 0.03 -0.11 0.27 0.07

item16 -0.15 0.00 0.00 0.24 0.98 0.10 0.04 -0.11 -0.30
item17 -0.08 -0.10 -0.03 0.11 0.81 -0.19 -0.01 -0.05 0.13
item18 0.16 -0.17 0.03 -0.03 0.78 -0.06 0.06 0.13 0.11
item05 -0.05 0.26 0.12 0.03 -0.20 0.59 0.12 0.00 -0.07
item06 0.17 0.10 -0.02 0.02 0.08 0.76 0.07 0.08 -0.13
item12 0.12 -0.44 0.33 -0.07 -0.15 0.46 0.39 0.03 0.21
item33 0.26 -0.03 0.19 0.27 -0.02 0.44 -0.09 0.09 0.21
item07 -0.35 0.25 0.08 -0.04 0.26 0.11 0.41 0.28 0.19
item09 0.17 0.06 0.15 -0.11 0.10 0.34 0.49 -0.09 0.09
item10 0.09 -0.34 -0.08 0.40 -0.04 0.03 0.99 -0.01 0.08
item38 -0.24 0.02 0.02 0.05 0.00 0.33 0.00 0.42 0.39
item45 0.08 0.17 -0.28 -0.25 -0.11 0.22 0.08 0.96 0.11
item46 0.40 -0.19 0.17 -0.03 0.10 -0.19 -0.08 0.66 0.10
item15 -0.23 0.32 0.14 0.00 0.23 0.37 -0.12 -0.14 0.54
item20 -0.02 0.13 0.32 0.14 -0.28 -0.37 -0.09 0.18 0.44
item21 0.08 -0.10 -0.13 0.16 -0.06 -0.16 0.28 0.17 0.67
item11* -0.24 0.29 0.01 0.46 -0.08 0.21 0.43 0.13 -0.09
item08** 0.08 0.11 0.34 0.00 0.13 -0.49 0.36 0.07 0.09
item29** 0.39 0.35 -0.13 0.02 0.10 -0.03 0.06 0.19 -0.01
item32** 0.24 0.23 -0.24 0.37 -0.01 0.06 -0.07 0.17 0.24
Note. Loadings of .4 or above were treated as primary loadings. * Item cross-loaded onto multiple factors. ** Items without primary factor loadings of .4 or above.

Finally, Table 5 summarizes the nine factors, their labels, and their 39 items based on the
EFA. Factor 1 accounted for the highest amount of the total variance (14%) among the nine factors.
Ten items displayed meaningful loadings (greater than .40) for this factor and all the items related
to student activities or course content. This factor was labeled “Learning Activities & Materials.”

Table 5
Summary of New Factors and Their Items Based on the EFA
EFA constructs and labels, with their items from the AS rubric:

Factor 1 (Learning Activities & Materials)
item19 Assessments and activities are consistent with the course objectives and resources.
item22 Activities provide students with opportunities to receive feedback early and frequently, specifically in preparation for high-stakes assessments.
item23 Course includes assessments and activities that are problem-centered or application-oriented in nature.
item24 Students are encouraged to integrate new concepts into regular practice and understanding through demonstration, reflection, creation, or similar activities.
item25 Resources & materials support learning objectives.
item26 Resources & materials are sufficient for students to learn the subject.
item27 Resources, materials, and instructor interactions activate students’ prior learning and experiences while introducing new concepts.
item43 Tools and media support the learning objectives.
item44 Tools and media are appropriately chosen and appropriately varied to enhance student interactivity with course content.
item48 Course provides additional tutorials/resources as needed to accomplish objectives.

Factor 2 (Course Introduction & Design)
item01 Upon first entering the course, students can easily find the course syllabus and introductory materials.
item02 The progression of course content and activities is easy to find, clearly outlined, and appropriately segmented into units or modules.
item03 Course appears visually clean, consistent, and appealing on the home page and throughout.
item04 A course introduction orients students to the course environment and suggests the relevance of course materials and activities to students and/or program goals.

Factor 3 (Learner Support)
item49 Course provides technical support services link/description.
item50 Course provides academic support services link/description.
item51 Course provides student support link/description.

Factor 4 (Learner Engagement & Interaction)
item13 Provides clear expectations for student response, engagement, and participation.
item14 Provides clear expectations for student etiquette in participation.
item34 A means for making course announcements is clearly available and used regularly to encourage student completion and participation and to connect course content with current events and research.
item35 Course design fosters interaction with other students.
item36 Course design fosters interaction with content.
item37 Appropriate synchronous or asynchronous means are provided for students to ask questions and receive answers from the instructor and/or students.

Factor 5 (Learning Objectives)
item16 Objectives are clearly stated.
item17 Objectives are measurable.
item18 Objectives are consistent with the course material/assessments/assignments.

Factor 6 (Course Facilitation)
item05 Course has an instructor introduction.
item06 Students have an opportunity to introduce themselves.
item12* Course fees, if any, are explained.
item33 Course design fosters interaction with instructors.

Factor 7 (Course Information)
item07 The course grading policy is clearly stated.
item09 Course technology requirements are addressed up front, if applicable.
item10 Textbook information and other materials requirements are provided.

Factor 8 (Course Technology)
item38 Course has a statement directing students with an ADA-documented disability to the DRC for reasonable accommodations as needed.
item45 Tools and media are as easy to use as is reasonably possible.
item46 Tools and media are sufficiently compatible with web and other applicable standards.

Factor 9 (Course Management)
item15 Syllabus addresses course-appropriate policies, including academic honesty, harassment, withdrawal and I-grades, and the student grievance process.
item20 Appropriate pacing mechanisms (due dates, reminders, follow-ups) are used to ensure timely student completion and regular engagement.
item21 Specific descriptive criteria are provided for the evaluation of students’ work and participation, ideally in the form of a rubric.

Note. * Item does not fit well in its category.


Factors 2, 3, and 4 each explained 9% of the total variance. The four items loading onto
Factor 2 related to aesthetic dimensions of the course or its introductory materials. This factor was
labeled “Course Introduction & Design.” The three items loading onto Factor 3 dealt with whether
academic or technical support links/descriptions are provided in the courses (labeled “Learner
Support”). Six items displayed significant loadings on Factor 4 and related to interaction, student
participation, and engagement in courses (labeled “Learner Engagement & Interaction”).
Factors 5 and 6 each explained 7% of the variance. Factor 5 consisted of three items and
was labeled “Learning Objectives.” Four items displayed meaningful loadings for Factor 6. Three
of these (item05, item06, item33) dealt with facilitating the courses (labeled “Course Facilitation”).
However, one item (item12: “Course fees, if any, are explained”) did not seem to measure the
same construct as other items, which implies that revisions to the rubric are needed.
Factors 7, 8, and 9 each explained 6% of the total variance. The three items loading onto
Factor 7 dealt with course policy or requirements (labeled “Course Information”). Factor 8
consisted of three items related to course technology issues (labeled “Course Technology”). The
three items showing meaningful loadings for Factor 9 dealt with course management issues such
as syllabus, pacing mechanism, and evaluation of student work (labeled “Course Management”).
Research Question 2: Instructional Value of the Rubric
The second research question investigated how course quality measures, when mediated
by student and instructor online interactions, influenced course passing rates. We used a path
analysis to model the influence of course quality scores on the four types of online interactions
and passing rates. Table 6 summarizes the descriptive statistics for course quality rubric scores,
online interactions, and passing rates. For course quality scores, we computed average rubric
scores for the nine factors identified by the EFA.

Table 6
Descriptive Statistics of All Variables Included in the Path Model (N = 121 courses; 5,240 students)

Variables                                             M     SD    Min.   Max.
Course quality scores (rubric scores)
  Factor 1: Learning Activities & Materials           0.92  0.16   0.10   1.00
  Factor 2: Course Introduction & Design              0.87  0.26   0.00   1.00
  Factor 3: Learner Support                           0.87  0.30   0.00   1.00
  Factor 4: Learner Engagement & Interaction          0.76  0.30   0.00   1.00
  Factor 5: Learning Objectives                       0.88  0.27   0.00   1.00
  Factor 6: Course Facilitation                       0.83  0.25   0.00   1.00
  Factor 7: Course Information                        0.95  0.17   0.00   1.00
  Factor 8: Course Technology                         0.93  0.17   0.00   1.00
  Factor 9: Course Management                         0.80  0.26   0.00   1.00
Online interactions (recorded by LMS)
  Instructor-Content interaction*                     0.00  0.67  -0.95   3.37
  Student-Content interaction*                        0.00  0.58  -1.22   2.44
  Student-Student interaction*                        0.00  0.99  -0.63   4.97
  Student-Instructor interaction*                     0.00  0.99  -0.73   4.86
Course passing rate (ratio)                           0.90  0.12   0.45   1.00
Note. The underlying rubric item scores are binary; factor scores are item averages. *All interaction measures were converted to Z-scores.
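
As a small illustration of the standardization noted in the table, each interaction measure can be converted to a Z-score in one line; the sketch below assumes pandas and hypothetical column names, not the study's actual data file.

    # Hedged sketch: z-standardize the four LMS interaction measures.
    import pandas as pd

    df = pd.read_csv("course_interactions.csv")  # hypothetical: one row per course
    for col in ["instructor_content", "student_content", "student_student", "student_instructor"]:
        # center on the mean and scale by the standard deviation (mean 0, SD 1)
        df[col + "_z"] = (df[col] - df[col].mean()) / df[col].std()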


First, we performed a path analysis using the initial model, with the direct effect of the
course quality scores on course passing rates represented as path c, the direct effect of online
interactions on course passing rates represented as path b, and the effect of course quality scores
on online interactions (the indirect pathway) represented as path a (see Figure 2). The model
chi-square was statistically significant (χ2[6] = 89.34, p < .05), but the model did not have a
satisfactory fit (Comparative Fit Index [CFI] = .37; values greater than .90 are recommended) and included nonsignificant paths.

Figure 2. Path diagram for the initial model of the relationships among the course quality scores, online
interactions, and course passing rates. (Note: Path a is from each of the nine factors to the interaction
variables.)

We therefore dropped the nonsignificant paths and reran the path analysis; the revised model
showed good fit (χ2[6] = 14.26, p < .05; CFI = .91; RMSEA = .11). Figure 3 shows the
results with the standardized regression coefficients. In the revised model, all path coefficients
were significant at the .05 level except for one path (Course Facilitation - Passing rate, β = .155, p
> .05).

Figure 3. Path diagram for the final model.
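
To make the structure of the final model concrete, it can be written in lavaan-style syntax and estimated with an SEM package. The sketch below is illustrative only, not the authors' code: it assumes Python's semopy package and hypothetical variable names for the Factor 4 (engagement) and Factor 6 (facilitation) quality scores, the standardized interaction measures, and the passing rate.

    # Minimal sketch of the final path model (assumptions noted in text).
    import pandas as pd
    from semopy import Model, calc_stats

    data = pd.read_csv("courses.csv")  # hypothetical: one row per course

    # Path a: engagement score -> three student interaction measures.
    # Paths b and c: student-content interaction and facilitation score -> passing rate.
    description = """
    student_content ~ engagement_score
    student_student ~ engagement_score
    student_instructor ~ engagement_score
    passing_rate ~ student_content + facilitation_score
    """

    model = Model(description)
    model.fit(data)
    print(model.inspect())    # path coefficients and p-values
    print(calc_stats(model))  # fit indices such as CFI and RMSEA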


Regarding the causal relationships between online course quality scores and online
interactions, “learner engagement & interaction” scores had significant influences on Student-
Content (β = .286, p < .05), Student-Student (β = .333, p < .05), and Student-Instructor interactions
(β = .365, p < .05). Finally, Student-Content interaction had a significant direct effect on passing
rate (β = .358, p < .05). The R-squared value indicates that approximately 16.3% of the variance
in passing rate is explained by this model.

Discussion
This study examined the preliminary construct validity and instructional value of an online
course quality rubric, the AS rubric. Instructional value was investigated in terms of the
relationships between course quality, as measured by the AS rubric scores, online interactions
between students, instructors, and content as automatically captured by the Canvas LMS, and
student course passing rates.
For RQ1, the internal consistency reliability test for the AS quality rubric revealed eight
unreliable items. Four were related to course accessibility, while the other four were related to
course technology or course materials and resources. In addition, we found that some of the
removed items did not use precise terms or clear guidelines for evaluating course quality.
For instance, the item “no unreasonable software requirements” did not define “unreasonable.”
Similarly, in the case of the item “course provides sufficient instructions for students on use of
tools and media,” the criteria for “sufficient” can be subjectively interpreted. Internal consistency
reliability can be improved by using precise terms and clear guidelines and by making instructions
as explicit as possible (Cohen et al., 2007). The EFA revealed four additional problematic items that
either loaded on multiple factors or did not significantly load on any factor. The EFA identified
nine factors, explaining 73% of the total variance. Among these nine factors, “learning activities
& materials” explained the highest amount of total variance in course quality.
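
As an illustration of this kind of internal-consistency screening (not the authors' exact procedure), Cronbach's alpha and corrected item-total correlations can be computed directly from their definitions; for binary rubric items, alpha reduces to the KR-20 coefficient. A minimal sketch in Python:

    # Hedged sketch: internal-consistency statistics for a binary item matrix.
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        # items: 2-D array, rows = courses, columns = binary (0/1) rubric items.
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_var / total_var)

    def corrected_item_total(items: np.ndarray) -> np.ndarray:
        # Correlation of each item with the sum of the remaining items;
        # low values flag items that weaken internal consistency.
        totals = items.sum(axis=1)
        return np.array([
            np.corrcoef(items[:, j], totals - items[:, j])[0, 1]
            for j in range(items.shape[1])
        ])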
For RQ2, we modeled the causal relationships between the online course quality scores,
the four types of online interactions captured by the LMS, and passing rates using a path analysis.
First, results show that only rubric scores related to the “learner engagement and interaction”
construct had a positive and significant effect on online interactions. The quality scores of “learner
engagement and interaction” had the largest effect on student-instructor (SI) interaction, followed
by student-student (SS) and student-content (SC) interactions. Thus, online courses designed to
encourage student participation and interaction with other students appear to have not only a higher
level of SS interaction but also higher levels of SC and SI interactions. The quality measures for the other dimensions did not have
a significant impact on any of the types of online interactions. While these dimensions address
course features that are certainly desirable aspects to include in course design, they may not
contribute to enhanced online interactions per se.
Second, in terms of the associations between the four types of interactions and passing
rates, only SC interaction had a significant and positive effect on passing rates. This aligns with
previous findings that SC interaction positively influenced performance (Bernard et al., 2009; Ke,
2013; Murray et al., 2012). We also note that SS interaction did not have a significant effect on
passing rates. One reason for this result might be contextual differences as this study included
courses from various academic disciplines. Indeed, one study (Ke, 2013) found that there were
significant differences between disciplines in terms of the amount and type of online interactions.


Lastly, in terms of the relationship between the course quality scores and passing rates, the
scores for one construct, “course facilitation,” had positive and significant influences on passing
rates in the initial model, but not in the final model. However, scores on the “learner engagement
and interaction” construct had a positive and significant effect on SC interaction, which, in turn,
significantly and positively influenced passing rates. Thus, the results imply that course design
elements related to “learner engagement and interaction” are an important aspect of course quality,
indirectly contributing to course performance. Another study (Jaggars & Xu, 2016) reported a
similar result in that the “interpersonal interaction” dimension of a quality rubric had a significant
and positive impact on student final grades, while other dimensions of the rubric did not. In
addition, while the final path model explained only 16.3% of the variability in passing rates, it is
important to note that many other factors, in particular, student-related factors (e.g., academic
background, relevant experiences), also influence successful course completion (Lee & Choi,
2011).
Limitations and Future Research
Several limitations to this research are important to note. In terms of the AS rubric,
although the quality of over 200 online courses was measured, all came from a single university
with its own institutional culture. Also, the rubric was applied by only one rater, making it
impossible to determine another important form of reliability: inter-rater reliability. Finally, the
rubric used binary scoring, whereas a Likert scale might have increased its usability (Yuan
& Recker, 2015). In addition, our data were drawn from various academic disciplines. As
previously mentioned, one study (Ke, 2013) found significant disciplinary differences in online
interaction patterns. Therefore, future research should consider the quality of online interactions
using a disciplinary lens. Future work should also consider how results from this study inform
rubric design to improve validity and instructional value. Finally, future work should examine the
influence of course design and interaction variables on other important kinds of student learning
outcomes (e.g., satisfaction, perseverance).

Conclusions
While the AS rubric was based on the widely used and reliable QM rubric, almost one-
fourth of the rubric items were identified as problematic. This concerning result has implications
for other quality rubrics used in higher education institutions because (a) most of the rubrics
reviewed in the literature were adapted from existing rubrics, rather than based on empirical testing
or online learning models, and (b) none of the rubrics reported results from reliability or validity
tests. In particular, a lack of construct validity may result in misinterpretations of a construct, as
well as raise doubts about the suitability and credibility of the measurement tool (Cohen et al.,
2007; Yuan & Recker, 2015). Thus, more empirical studies are needed to establish the reliability
and validity of existing course quality rubrics.
From a practical perspective, this study has several implications. During the course design
stage, instructors and course designers could consider adding different strategies to promote
students’ engagement and interactions, for example by using games and simulations, providing
hands-on activities, and building an online course community using social networks. During the
course review process, course designers could consider providing rubric definitions and guidelines,
especially for items that are more subjective. They could also consider revising items related to
course accessibility and technology use to make them easier to apply.


At the university level, although different higher education institutions might have different
needs and criteria for evaluating online courses, a quality rubric plays an important role in
identifying and addressing elements deemed important to instructional design (e.g., accessibility,
course objectives). It is important to consider to what extent these elements serve to influence (or
not) subsequent online interactions and learning outcomes. Many factors, stakeholders, and
decisions influence the design of online courses. These results help identify the design elements
that appear to have the greatest impact on students and can guide instructors and instructional
designers in the course design process.


References
Anderson, T. (2008). The theory and practice of online learning. Athabasca University Press.
Anderson, T. D., & Garrison, D. R. (1998). Learning in a networked world: New roles and
responsibilities. In C. C. Gibson (Ed.), Distance learners in higher education (pp. 97–112).
Atwood Publishing.
Bernard, R. M., Abrami, P. C., Borokhovski, E., Wade, C. A., Tamim, R. M., Surkes, M. A., &
Bethel, E. C. (2009). A meta-analysis of three types of interaction treatments in distance
education. Review of Educational Research, 79(3), 1243–1289.
Blackboard Inc. (2017). Blackboard exemplary course program rubric. Blackboard. Retrieved
from https://www.blackboard.com/resources/are-your-courses-exemplary
Borokhovski, E., Tamim, R., Bernard, R. M., Abrami, P. C., & Sokolovskaya, A. (2012). Are
contextual and designed student-student interaction treatments equally effective in distance
education? Distance Education, 33(3), 311–329.
Britto, M., Ford, C., & Wise, J. M. (2013). Three institutions, three approaches, one goal:
Addressing quality assurance in online learning. Online Learning Journal, 17(4).
California Community College. (2016). Course design rubric for the online education initiative.
Retrieved from http://ccconlineed.org/wp-content/uploads/2015/11/OEI_Rubric_Edited-
ACC.pdf
California State University (2015). Quality learning and teaching. Retrieved from
https://www.csun.edu/it/qlt
Choi, H., Lee, J. E., Hong, W. J., Lee, K., Recker, M., & Walker, A. (2016). Exploring learning
management system interaction data: Combining data-driven and theory-driven
approaches. In T. Barnes, M. Chi & M. Feng (Eds.), Proceedings of the 9th International
Conference on Educational Data Mining (pp. 324–329). ACM.
Cohen, L., Manion, L., & Morrison, K. (2007). Research methods in education (6th ed.).
Routledge.
Custard, M., & Sumner, T. (2005). Using machine learning to support quality judgments. D-Lib
Magazine, 11(10). Retrieved from http://www.dlib.org/dlib/october05/custard/10custard.html
Drost, E. A. (2011). Validity and reliability in social science research. Education Research and
Perspectives, 38(1), 105.
Egerton, E. O., & Posey, L. (2007). Quality standards inventory. Retrieved from
http://learn.gwnursing.org/Education/QSI/QSI.html
Han, L., Neilands, T. B., & Dolcini, M. M. (2001). Factor analysis of categorical data in SAS.
Retrieved from https://www.lexjansen.com/wuss/2001/WUSS01044.pdf
Hixon, E., Barczyk, C., Ralston-Berg, P., & Buckenmeyer, J. (2016). The impact of previous
online course experience on students’ perceptions of quality. Online Learning, 20(1), 25–40.


Hoey, R. (2017). Examining the characteristics and content of instructor discussion interaction
upon student outcomes in an online course. Online Learning, 21(4), 268–281.
Illinois Central College. (2017). Quality online course initiative. Retrieved from
https://icc.edu/faculty-staff/files/ICC-QOCI-Version-4.4-published-05.04.17-2.pdf
Jaggars, S. S., & Xu, D. (2016). How do online course design features influence student
performance? Computers & Education, 95, 270–284.
Ke, F. (2013). Online interaction arrangements on quality of online interactions performed by
diverse learners across disciplines. Internet and Higher Education, 16(1), 14–22.
Kuo, Y., Walker, A. E., Belland, B. R., & Schroder, K. E. E. (2013). A predictive study of
student satisfaction in online education programs. International Review of Research in
Open and Distance Learning, 14(1), 1–39.
Lee, J. (2014). An exploratory study of effective online learning: Assessing satisfaction levels of
graduate students of mathematics education associated with human and design factors of an
online course. International Review of Research in Open and Distance Learning, 15(1),
111–132.
Lee, Y., & Choi, J. (2011). A review of online course dropout research: Implications for practice
and future research. Educational Technology Research and Development, 59(5), 593–618.
Liu, I. F., Chen, M. C., Sun, Y. S., Wible, D., & Kuo, C. H. (2010). Extending the TAM model
to explore the factors that affect intention to use an online learning community. Computers
& Education, 54(2), 600–610.
McFarland, J., Hussar, B., Wang, X., Zhang, J., Wang, K., Rathbun, A., Barmer, A., Forrest
Cataldi, E., & Bullock Mann, F. (2018). The condition of education 2018 (NCES 2018-144).
U.S. Department of Education, National Center for Education Statistics.

Moore, M. G. (1989). Three types of interaction. American Journal of Distance Education, 3, 1–7.
Murray, M., Pérez, J., Geist, D., & Hedrick, A. (2012). Student interaction with online course
content: Build it and they might come. Journal of Information Technology Education:
Research, 11, 125–140.
New Mexico State University (2011). Online course design rubric. Retrieved from
http://hpdr.education.nmsu.edu/files/2013/07/on-line-course-design-rubric-.pdf
Palomar College (2012). Online course best practices checklist. Retrieved from
http://www2.palomar.edu/poet/BestPracticesChecklistSP12.pdf
Quality Matters (2018). Quality matters higher education rubric. Retrieved from
https://www.qualitymatters.org/sites/default/files/PDFs/StandardsfromtheQMHigherEducatio
nRubric.pdf
Raubenheimer, J. E. (2004). An item selection procedure to maximize scale reliability and
validity. SA Journal of Industrial Psychology, 30(4), 59–64.


Reigeluth, C. (1999). Instructional design theories and models: A new paradigm of instructional
theory. Lawrence Erlbaum Associates.
Roblyer, M., & Wiencke, W. (2003). Design and use of a rubric to assess and encourage
interactive qualities in distance courses. American Journal of Distance Education, 17(2),
77–98.
Sher, A. (2009). Assessing the relationship of student-instructor and student-student interaction
to student learning and satisfaction in web-based online learning environment. Journal of
Interactive Online Learning, 8(2), 102–120.
Southern Regional Education Board. (2006). Checklist for evaluating online courses. Retrieved
from https://www.sreb.org/publication/checklist-evaluating-online-courses
Sun, P. C., Tsai, R. J., Finger, G., Chen, Y. Y., & Yeh, D. (2008). What drives a successful e-
Learning? An empirical investigation of the critical factors influencing learner satisfaction.
Computers & Education, 50(4), 1183–1202.
Swan, K., Matthews, D., Bogle, L., Boles, E., & Day, S. (2012). Linking online course design
and implementation to learning outcomes: A design experiment. The Internet and Higher
Education, 15, 81–88.
University of North Dakota. (2016). Rubric for evaluating online courses. Retrieved from
http://www1.und.edu/academics/center-for-instructional-and-learning-
technologies/_files/docs/bestpracticesrubric.pdf
Yong, A. G., & Pearce, S. (2013). A beginner’s guide to factor analysis focusing on exploratory
factor analysis. Tutorials in Quantitative Methods for Psychology, 9(2), 79–94.
Yuan, M., & Recker, M. (2015). Not all rubrics are equal: A review of rubrics for evaluating the
quality of open educational resources. International Review of Research in Open and
Distributed Learning, 16(5), 16–38.
Yuan, M., & Recker, M. (2019). Does audience matter? Comparing teachers’ and non-teachers’
application and perception of quality rubrics for evaluating open educational
resources. Educational Technology Research and Development, 67(1), 39–61.


A Dramaturgical Examination of Online University Student Practices in a Second Year Psychology Class
Dawn Marie Gilmore
RMIT Online

Abstract
This study employs dramaturgical analysis, the study of social interaction in terms of theatrical
performance, in examining online student interactions. Region-specific activity—front stage (the
course LMS) versus backstage (Facebook)—was examined to determine where students spend
their time doing class-related tasks. The context for this case study is a second-year online
psychology class at an Australian university. Data were collected concerning students’ course-
related activities in the two venues. Over a 12-week semester, 126 students were observed in the
LMS. Twenty-one students completed fortnightly questionnaires about where they spent their time
and with whom. At the end of the semester, 14 students participated in online interviews. Findings
suggest that the audience in each setting, as well as the timing of communication and duration
within each setting, appear to have contributed to shaping students’ learning experiences.
Awareness of these contributing factors may aid online teachers in understanding students’
learning preferences, and the roles of social networking tools in supporting learning collaborations.

Keywords: online learning, online teaching, student experience, Facebook, Goffman

Gilmore, D. M. (2020). A dramaturgical examination of online university student practices in a
second year psychology class. Online Learning, 24(1), 264–281.
https://doi.org/10.24059/olj.v24i1.1988

A Dramaturgical Examination of Online University Student Practices in a Second Year Psychology Class
Dramaturgical sociology views human interactions as determined by time, setting, and
audience (Goffman, 1959). Goffman’s approach advocates that one must not analyze the cause of
human interactions but instead examine the contexts in which those interactions occur. Goffman
(1959) uses theatrical metaphors, specifically the stage: front stage (where the actors perform for
an audience) and backstage (where the actors prepare for the performance). This study considers
the learning management system (LMS) as the front stage and Facebook as the backstage and
examines how and why online students use backstage online settings, such as Facebook, instead
of front stage settings, such as the LMS, to support their university learning. A second-year online
psychology class was selected as the case due to its large class size and the fact that students were
familiar with online learning and social media. Online observations, questionnaires, and interviews
were employed to understand students’ front stage and backstage learning experiences.


Facebook and social learning in the university context


Facebook is a popular social networking tool among university students. Junco (2014), for
example, found that university students spend over an hour a day using Facebook for university
purposes. The three main ways to express presence on Facebook are through individual profiles,
pages, and groups. A Facebook profile is a personal account where an individual can connect with
friends, see other friends’ posts, share their own thoughts, and share photos or links to internet
sites. Facebook pages, on the other hand, are for official individuals (like Taylor Swift) or
businesses to share stories and connect with people. In the university context, a university might
have an official page and students would “like” the page. Then updates from the page would appear
on individual users’ Facebook feeds. Universities have successfully used official Facebook pages
to integrate new students into academia before course registration (Lin, Hou, Wang, & Chang,
2013). There are also Facebook groups, which are settings for a small group or community to
converse and share information. In one study, students reported being members of five or six
university-related Facebook groups. These included groups for primary school alumni, political
affiliations, hobbies, sharing opinions on current topics, having academic conversations, and
sharing learning materials (Bosch, 2009). Groups can be publicly available, for anyone to join, or
privately available where those who join must be approved by an administrator.
The affordances to communicate synchronously through Facebook messaging,
asynchronously through wall posts, as well as commenting on and sharing information, are features
that make Facebook ideal for social learning. While the blurring of lines between social networking
and university learning has been criticized by some students (Donlan, 2014) and teachers (Prescott,
Wilson, & Becket, 2013), others believe the inherently social nature of such sites supports learning.
Indeed, studies surveying students found that they feel Facebook has the potential to promote
collaborative and cooperative learning (Arouri, 2015; Bicen & Cavus, 2011; Roblyer, McDaniel,
Webb, Herman, & Witty, 2010). Social learning theories see learning as generated through the
observation of others and through direct experiences with others – two modes that Facebook tools
afford. In their theory of situated learning, for example, Lave & Wenger (1991) refer to this
phenomenon as legitimate peripheral participation (LPP). LPP is the process of observing others
before direct social interaction with others. By observing others, students can learn about behaviors
and their consequences as well as reap the benefits of any information shared while observing
others’ social interactions. Once a learner moves beyond LPP, they can choose to take a more visible
role in the group; however, this is not a requirement of continual learning.
Lave & Wenger (1991) suggest that situated learning occurs when a group is made up of
novices and experts, or newcomers and old-timers. The mixed abilities create opportunities for
more experienced members to share their knowledge. An expert or old-timer can also be referred
to as a more knowledgeable other (Vygotsky, 1978; Wenger, 1998). When students are surrounded
by peers of various knowledgeability, they are afforded opportunities to go beyond the content that
was scaffolded for them in the design of the curriculum. This is advantageous for students whose
needs may not otherwise be met. As one study suggested, learning backstage on Facebook
resulted from students’ inability to find information or to understand content, assessments,
or course administration (Cuesta, Eklund, Rydin, & Witt, 2016). This suggests that information
seeking performed by a novice and information sharing performed by an expert or more
knowledgeable other occurred. In several studies, students reported that Facebook posts asking
questions that grew into discussions were beneficial to their learning (DiVall & Kirwin, 2012),
particularly when the responses came from a ‘more knowledgeable other,’ such as the teacher
(Rambe, 2012) —again illustrating how cohorts of mixed abilities can support learning. In addition,
one study found that Facebook posts about not understanding were balanced with responses of
understanding by a 20:18 ratio (English & Duncan-Howell, 2008). This suggests that more
knowledgeable others were present and willing to share their experiences and knowledge.
While expert presence may be viewed positively, it can also be disruptive. Rambe (2012)
found that students abstained from answering classmates’ Facebook posts related to content and
waited for a teacher to respond. In this instance, students viewed the content-related posts as the
teacher’s domain (Rambe, 2012). Similarly, in Facebook groups with both postgraduates and
undergraduates present, the postgraduates posted the most and the undergraduates posted the least
(Ru-Chu, 2013). When experts, such as a teacher or older student, are present on the Facebook
page, the students might defer to the expert and self-identify areas where they should not answer,
even if they can. There was, however, one exception to this: Bowman and Akcaoglu (2014) found
that “super-users” responded to classmates regardless of whether they knew the answer. With the
exception of “super-users,” students may be aware that more knowledgeable others are present
and defer to them.
Students can also use situated learning to learn how to be a university student while
learning course content. Learning to be a student involves knowledge-seeking or knowledge-
sharing as regards course management, academic codes, and course requirements, particularly
those related to assessments (Bowman & Akcaoglu, 2014; Cuesta et al., 2016). Learning content,
on the other hand, involves seeking or sharing an understanding of content specific to a particular
class. Learning-content posts, for example, can include links from class materials to current events
(Bosch, 2009; Staines & Lauchs, 2013), political thought (Hyde-Clarke, 2013), as well as work
experiences (English & Duncan-Howell, 2008). Overall, Facebook posts about learning to be a
student consistently outnumbered Facebook posts about learning the knowledge of a content area,
and, even when students used Facebook independently of their class, they continued to seek and
share information related more to assessments than to content knowledge (Selwyn, 2009).
Nonetheless, few studies explore why students migrate towards online social spaces beyond the
course; the studies above viewed students in only one context, the Facebook context. Similarly, in
studies of education it is common for researchers to explore only the formal education setting
(Livingstone & Sefton-Green, 2016).
however, is to explore student interactions in both settings. To achieve this, I employ Goffman’s
(1959) region-based behavior as the theoretical lens. The next section describes this approach and
how it was applied across the two contexts.
Theoretical Approach: Goffman’s (1959) region behavior
In dramaturgical sociology, region behavior occurs in any place defined by cultural
perception. Borrowed from theater, Goffman (1959) metaphorically employs two regions of social
behavior, the front stage and the backstage, as a means of analyzing social behaviors. In the front
stage, an actor is putting on a performance and is conscious of being observed by others. In the
backstage, an actor is afforded privacy from those in the front stage. The backstage is a place for
preparation for front stage performance and a place to seek reprieve. In Goffman’s 1959 study of
the Shetland Hotel, he identified the dining room and parlor as the front stage. This was the space
where guests and hotel staff interacted with each other. In this space, both employees and guests
behaved according to British middle-class norms. But in the backstage, the kitchen, the employees
behaved according to Shetland Islander norms. This meant that acceptable food, attire, and
behavior in the backstage differed from that of the front stage. For example, it was acceptable
to wear a hat, hang socks over the stove to dry, spit in a cup, and keep moldy soup in the backstage.
However, in the front stage, staff maintained a polished appearance and the presence of mold was
unacceptable.
Goffman’s (1959) overall observation was that an employee’s front stage (in the restaurant)
and backstage (in the kitchen) were parts of the whole individual, separated by a kitchen door.
Technology, the door, played an important role in situating behavior within the spaces of the hotel.
Behaviors the hotel managers did not want the hotel customers seeing remained hidden behind the
door in the backstage. One of the main parameters of Goffman’s body of work is co-presence of
participants. In recent times, however, technology has come to simulate a co-presence between
people. In online studies, though not yet studies of university students, Goffman’s region behaviors
have been applied to produce a fuller account of how internet users engage across the backstage
and front stage spaces (Bullingham & Vasconcelos, 2013; Hogan, 2010; Pearson, 2009; Ross,
2007; Trammell & Keshelashvili, 2005). Bullingham & Vasconcelos (2013) argue that blogs and
avatars are online environments, which could be the front stage to an offline backstage.
If the LMS is identified as the front stage, then all other environments that a student uses
to prepare for their performance there combine to form the students’ backstage learning
environment. A front stage is typically marked by the decorum of those present, not the space. In
the Shetland Hotel example, the front stage was marked by middle-class norms and the backstage
Shetland Islander norms. However, the backstage kitchen was not totally hidden from the front
stage dining area. The door, which separated the stages, could be propped open at times by
waitstaff who were carrying heavy trays. This permitted customers the opportunity to glimpse into
the kitchen. It did not suddenly turn the kitchen into a momentary front stage. By comparison,
Ross (2007) studied London cabbies-in-training who used public online message boards as a
backstage to their front stage in-person cabbie training. The backstage was an online community
for learners, created by learners, with an occasional outsider passing through. The online backstage
afforded cabbies a space to feel connected through informal language, to share resources that made
learning possible, and to remain anonymous when critiquing actors from the front stage (examiners,
customers, colleagues).
In the context of the current study, for students taking formal online courses the LMS, the
front stage, is considered the central locus of learning. It provides spaces and tools for students
and teachers to store and access learning materials, to communicate on discussion boards, and to
submit assignments. As in the case of the Shetland Hotel, where the door mediated the roles actors
played between the dining room and the hotel kitchen, the LMS mediates the role of students and
their interactions. Questions and statements posted to a front stage discussion board can be viewed
by everyone in the course. If this public action induces feelings of stage fright, this may discourage
further posting. That does not mean the question ceased to exist or went unasked. It could indeed
get asked in a backstage venue. Facebook is often used as a backstage where university students can
interact out of view from teachers and staff, and essentially learn how to be university students
(Selwyn, 2009).
For the purpose of this study, the front stage is defined as the space where an online student
gives a performance—the LMS. Actions in the LMS front stage space can be “seen” by the
university, whether through the online discussion board or through student activity logs. The
backstage is the space where an online student prepares for a performance. This study examines
what students do beyond the LMS and how social media spaces preferred by students afford social
learning and enrich the student experience.


Methods
This research employs a constructivist paradigm in that it examines participants’ lived
experiences (Waller, Farquharson, & Dempsey, 2016). Researchers applying this paradigm accept
that reality is socially constructed from the participants’ point of view. Meaning is not taken for
granted and interpretations of actions are based on how those we study define the situation (Denzin,
1989). While the findings are thus limited to the cohort studied, findings and their interpretations
can nonetheless inform theory, research, and practice (Stake, 1995).
Data were collected through observations, fortnightly questionnaires, and interviews. In
the first week of the course, the teacher announced that I would be observing for research purposes
and encouraged students to participate in the research. Over a 12-week semester I observed 126
students in the front stage LMS. Of the 126 students, 21 students opted to complete fortnightly
questionnaires that prompted the students to report where they spent their time completing class-
related-tasks in the backstage, and with whom. At the end of the semester, 14 students chose to
participate in an online interview. All data collection procedures were conducted in accordance
with the university’s human ethics guidelines. In the three sections that follow, I
include a brief description of each approach.
Observations
Being enrolled in the LMS as an observer enabled me to take in the scene of the research
setting – specifically the participants’ front stage. I knew what students were being asked to do
and when, including reading the weekly learning materials, activities, and assessments. I observed
students’ responses to the weekly activities and conversations that occurred in the discussion
boards. If the teacher sent a group email, I also received the email. My observations of the front
stage contributed to my understanding of the data generated from the backstage in the Facebook
context.
Questionnaires
Fortnightly questionnaires were used to collect data about students’ content-related tasks
and study habits over the twelve-week semester. Each fortnight students were asked to recall where
they went to seek and share information related to the course, who they interacted with, and for
how long they did each of these actions in the front stage and backstage.
Interviews
Interviews gave participants the opportunity to give voice to their front stage and backstage
data. Interviews were transcribed, uploaded to NVivo and coded by applying Braun & Clarke's
(2006) guidelines for thematic analysis. To assure trustworthiness, I used member checks and
triangulation across the three data types (see Stake, 1995). The students who participated in
the questionnaire and interview were a mix of part-time and full-time enrolments and ranged in
age from 21 to 73 years old. They were also from a variety of locations around Australia, including
major cities like Melbourne and remote areas like far north Queensland.
Setting: A second year university psychology class
This research was conducted in an online second year psychology class at an Australian
university, which offers both face-to-face and online degrees. This course was part of a fully online
bachelor’s degree in psychology. The online students are awarded the same qualifications as the
on-campus students. The class was delivered using the Blackboard Learning Management System
(Blackboard), which afforded students and teachers two main functions: access to course content
and communication. The course content function allowed students to access learning materials
such as articles, assignments, and videos. These learning materials were organized into twelve
weekly learning modules. Each week covered one theoretical approach to counselling, which
included a video of a patient receiving counselling and a discussion board activity. The
communication function allowed for both asynchronous and synchronous communication between
teachers and students. The class also used a live conferencing tool called Collaborate to host one-
hour weekly tutorials.
Participation in discussion board forums was not graded; however, the syllabus stated that
students were expected to contribute to the discussion board forums on a regular basis. Three
contact hours per week were prescribed for the course: two hours completing the learning materials
and one hour participating in a synchronous Collaborate tutorial (or watching its recording).
One unit coordinator and four tutors taught the
class. The teaching team was responsible for monitoring the discussion board forums, marking
students’ assessments, and running the weekly Collaborate sessions.

Results
Consistent patterns in the students’ participation emerged in the front stage observation
data. In order to illustrate this, I characterized the students into four front stage typologies:
performers, extras, cameos, and stagehands. The typologies not only describe the participation
patterns, but also extend Goffman’s (1959) theater terminology (front stage, backstage, actors,
props, setting). Table 1 describes the performance patterns observed in the front stage.

Table 1
Description of Front Stage Roles
Front stage role Description of the front stage performance patterns
Performer Posted weekly, or more, to front stage discussion boards
Extras Occasionally posted to the front stage discussion board;
participation was consistent at the start and then tapered off
Cameos Made brief appearances in the front stage discussion board. This
was typically to introduce themselves or ask one question about
one assessment
Stagehands Never posted to the front stage discussion board


Table 2
The Breakdown of Students in the Psychology Class and Total Participants in the Fortnightly
Questionnaires and Interviews
Level of participation    Total students enrolled in the class    Total participants in fortnightly questionnaires    Total participants in interviews
Stagehand 44 7 3
Cameo 45 6 4
Extra 23 4 3
Performer 13 4 4
Total students 126 21 14
Source: Front stage observation data

Table 2 shows the breakdown of performers, extras, cameos, and stagehands in the
psychology class. A variety of participation levels were present in the study. In addition,
participants who completed questionnaires and interviews were well represented across the
participation levels. Out of the 126 students, a total of 44 were stagehands and therefore never
posted to the discussion board, and only 13 were performers. Most students rarely, if ever, posted
to the discussion board in the front stage.
Interestingly, Table 2 illustrates how those students who had the highest representation in
the study, the stagehands, had the lowest representation in the front stage discussion board. This
participation pattern could be used to support the suggestion that a student’s front stage data, such
as posting to the discussion board or hours spent logged into the front stage, may not be an indicator
of engagement in the online class. This was further supported by Table 3, which compares the
average hours students spent in the front stage compared to the time students reported using to
complete class-related tasks in the backstage.

Table 3
Average Hours that 21 Participants Performed Class-related Tasks over 12 Weeks
Level of participation (the cast)    Average hours online in the front stage    Average hours backstage online    Total average online hours
Stagehands 10 27 74
Cameos 9 17 63
Extras 28 59 142
Performers 75 34 145
Total 122 137 424
Source: LMS data and questionnaire data


Backstage online Facebook groups and friends


As shown in Table 3, stagehands, cameos, and extras spent almost twice the amount of
time in the entire online backstage (not just Facebook) when compared with the front stage.
Although the times reported in Table 3 are not a measure of learning or engagement with the class,
they do help to identify contexts where students might prefer to learn or engage within an online
class. Responses to fortnightly questionnaires indicate that students were engaged in backstage
online spaces such as the university library, Google Scholar, and YouTube; however, interviews
reveal that the most popular space that stagehands, cameos, and extras used for learning in the
online backstage was Facebook. The Facebook groups and the purpose students described are
summarized in Table 4 below.

Table 4
Facebook Groups and Descriptions
Facebook Groups and Purposes
Social Science Majors (Closed): A student group for all majors in the social science faculty at this university only. For learning content and learning to be a student.
Psychology Majors Only (Closed): A student group for psychology majors who enrolled at this university in the same year. For learning content and learning to be a student.
Individual study groups related to a specific class (Closed): A small student group organized to study together for a specific class or to complete tasks together.
Social Facebook groups unrelated to the university (Public): A public support group for any tertiary student at any institution, for example UNI Coffee Shop. For learning to be a student.
Content Facebook groups unrelated to the university (Public): A public Facebook group for people interested in learning about content of their choice, for example The Glasser Institute. For learning content.
Facebook friends from this university: Some students made one-to-one friendships and shared study and social or personal information like family photos. For learning to be a student, learning content, and socializing.
Source: Interview data


In dramaturgical sociology, elements of human interaction depend on audience, time, and setting
(Goffman, 1959). These factors also shape social learning experiences in online courses.
Illustrations of each follow.
Audience size and attributes in the front stage and backstage
Most students reported being members of both the Social Science Majors and Psychology
Majors groups. The Social Science Majors group was the largest of the groups, with over 600
members, and the Psychology Majors group had over 130 members. The discussion board in the
LMS also had 126 members, but only 13 of those students were performers. Table 5 lists the
characteristics from the interview data that students used to describe each audience. The most
notable difference between the front stage and backstage audiences was the presence of teachers
in the front stage and the presence of peers with a variety of experience levels in the online
backstage.

Table 5
Students’ Descriptions of the Front Stage and Backstage Audiences
Front stage discussion board audience:
• Teachers who only log in at certain times of day
• Teachers who give harsh feedback or request students to relocate discussion board posts
• Teachers/university staff who have vetted learning materials
• Teachers/peers who may not respond, or respond too late, to questions or completion of tasks
• Peers and teachers who write using formal language and big words
• Peers in this class only (12 weeks’ time)
• Peers who make off-topic posts, making the discussion board unwieldy

Backstage online Facebook audience:
• Peers from the same class (near peers)
• Peers who have completed this class (experts)
• Peers who are now friends (two years’ time)
• Peers who have vetted resources for learning
• Peers from various classes but on the same academic calendar
• Peers who have around-the-clock access and easily accessible notifications about posts
Source: Interview data


Discussion
There is a marked difference between the ways online students present themselves front
stage in the LMS and backstage in Facebook. Through the lens of Lave and Wenger’s (1991) social
learning theory, which suggests that learning occurs across space and time in multiple contexts,
and Goffman’s (1959) approach of region behavior, we see that students with low front stage
participation were active and engaged in the backstage. Discussion of these practices is organized
into four sections: (a) time and social learning experiences; (b) students’ perceptions of tutors; (c)
speed; and (d) a sense of belonging.
Time and social learning experiences
For some students, time may be an important factor that supports or impedes learning.
Overall, students reported spending more time in the online backstage than they spent in the front
stage. The exception to this finding was the Performer cohort, who perpetually logged into the
front stage for fear of missing out on information that could be important. This group also acted
as first-responders, being the first to respond to questions or tasks, as well as the first to offer
encouragement to classmates. In studies of Facebook usage, Bowman and Akcaoglu (2014) referred
to these students as super-users because they respond to classmates even when they don’t know the
answer. This was also the case in the front stage discussion board. Super-users responded to almost
every post by a student or teacher. Ingrid, for example, was a super-user who posted fifteen times
more than the average student, which made her responsible for 15% of the 1,430 posts in the front
stage:
This [front stage] is my friendly place, where I feel part of something, not all alone
at my desk, looking out at the horrible gray walls of the house next door. I think I
have gained as much from various discussion boards as from all my other reading…
Ingrid uses “friendly” to describe her feelings of connectedness and sense of belonging in her
studies. Ingrid did not have a Facebook account because she felt that the discussion board was
enough to support her learning experience in the class. Like Ingrid, most of the performers were
not on Facebook because they felt that their sense of belonging was fulfilled by their active
presence in the front stage.
While students like Ingrid may be inherently social, other students may need more time to
develop social ties that support their learning experience. In that case, time may be an important
difference between the front stage and backstage that shapes the social learning experience. The
timing of communication, such as whether it occurs synchronously or asynchronously, affords
distinct behaviors (Hogan, 2010). And in this case, the length of time spent in a space may even
have affected whether communication occurred at all. The online class ran for twelve weeks, but
some students had been in the university Facebook groups for two years or more. The ephemeral
nature of the online class may not afford students the time needed to negotiate their role in each
space. In the front stage, stagehands remained constant across the twelve weeks, whereas cameos
and extras tapered off around week 3. Table 6 shows how participation in the front stage dropped
sharply between weeks one and four. This decrease in participation suggests that learning may
have shifted to the backstage.


Table 6
Participation Rate by Week
Weeks in the front stage            1   2   3   4   5   6   7   8   9  10  11  12
Total students posting per week    37  23  23  14  12  11   9   6   6   6   4   1

Source: Front stage observation data

In the large backstage Facebook groups, most of the students described changing their
participation trajectories from stagehand to performer. Kara recalled being invited to
the Social Science Facebook group during her first semester via a front stage discussion board
post. She joined the Facebook group and, at first, only observed. As an observer in the Facebook
group, Kara became acquainted with members from the Social Science Majors group because their
names frequently appeared in her everyday Facebook feed. After six months of observing she
began posting to the community because she felt more connected there than she did in the
discussion board:
I didn't really interact much at first. It is probably more after 6 months as the same
names keep cropping up. We post a bit of everything [related to psychology] and
sometimes just letting off steam over marks.
The process that Kara describes is a typical first step in social learning. Lave and Wenger
(1991) suggest that all learning begins with legitimate peripheral participation before learners feel
confident enough to participate as a newcomer, near peer, or expert. Despite being at the university
and in the Facebook group for the same number of years, Kara was a stagehand in the front stage
but described herself as an extra, if not a performer, in the online backstage. Even so, this took her
six months to achieve, which suggests that the length of a course may not be sufficient for some
learners to establish a sense of trust, belonging, and the ability to negotiate their roles and
interactions with others, all of which Wenger (1998) argues are conditions for social learning.
This was the case for extras Briana and Julia, who described how, over time, they made Facebook
friends from their online university class who helped to support their social learning in the
backstage:
Fortunately I have established online relationships with people throughout this
degree and they aren’t necessarily in my current unit but may have completed and
are often happy to discuss things via Facebook through inbox and also through
Facebook on the main group for [Social Science Majors and Psychology Majors
Only]. (Briana, Extra)
I became friends with two ladies [from a previous class]. One’s in Townsville and
Cairns. We’d brainstorm forever…We message through Facebook. Actually one
day I talked [to the one friend] for three and a half hours. So that works better than
the discussion board in my personal situation. (Julia, Extra)


As a result of participating in Facebook groups, most of the students reported making
Facebook friends who supported their learning experience. Bosch (2009) found that it was
common for university students to be members of multiple university-related Facebook groups.
One stagehand was simultaneously a member of the Social Science Majors group, the Psychology
Majors Only group, a Facebook group for every class she had enrolled in (past and present), and a
Facebook group unrelated to the university about positive mental health counselling (the topic of
the class). In addition to these groups, she had Facebook friends whom she had met over her three
years of university study. Most of the students interviewed were in more than one Facebook
group, which may have increased their chances of finding relationships where they felt a sense of
belonging, trust, and negotiation. Facebook provided multiple contexts in which students could
apply their learning from the psychology class with others over time.
Students’ perceptions of tutors in the front stage
Audience and time clearly shape social interactions. In the case of course tutors, some
students described feedback as too slow, too harsh, and sometimes disruptive to conversations that
could have contributed to a student’s understanding (see Table 5). Stagehands, and to some extent
extras and cameos, self-segregated, confirming Goffman’s observation that front stage control is
often one means of audience segregation. By segregating themselves from the front stage, actors
can escape or buffer themselves from those aspects of a setting they find unpleasant (Goffman,
1959). This is a useful way to explain students’ absence from the front stage. It might also
explain sudden decreases in front stage participation. Kathy, an extra, was the leader of a small
Facebook group of students who were unhappy with the class. Kathy described tutor feedback on
assessments as “harsh.” She also reported tutors asking her to move posts from one discussion
board forum to another. This happened to various students in the class on six occasions. In each
instance, the interrupted conversation ended and the student did not post to the discussion board
again. Kathy describes this disruption:
Kathy (Extra): I went on the discussion board and asked about ethics. About an
experience I wanted to know about a psychiatrist…and one of the tutors was
awesome about it and was telling me the procedure, but then another tutor said,
“Ah, this shouldn’t be on this discussion board, it should be just on the other
discussion board. Did you want to move this conversation there?”
Interviewer: Was that the end of the conversation?
Kathy (Extra): Yep, I was talking to the other tutor, and she was like, talking about
the ethics of it, and it was fine, but then the other tutor just like totally cut us
off…I…I [also] put it on there [the discussion board] is there anyone in the Gold
Coast who wants to study and meet up and…and then um, the tutor was just like
“Oh, can you put this on the other…another discussion board” or something… I
was like “Oh, okay”. I just…I didn’t post it to the other one. I just thought…well,
I gave up.
Interviewer: Did any of your classmates respond to you about meeting up?
Kathy (Extra): No.
Kathy eventually stopped posting in the front stage altogether. By week 4, her discussion
board participation ceased but her backstage Facebook participation increased. Goffman (1959)
suggests that actors who go backstage are afforded opportunities to derogate the audience and that


the discussion in the backstage can often turn towards problems of staging. Interestingly, students
explained how these backstage conversations eventually evolved into learning opportunities. In
Kathy’s case, she found a group of students from the Psychology Majors Only Facebook group
who were upset about technology problems and class assessment feedback. Initially, these students
bonded over their negative experiences in the front stage. Subsequently, they created their own
Facebook group for the psychology class. In this group, they worked through weekly activities
and studied for the final exam together. Kathy preferred this space to the discussion board:
Um, just I liked talking on Facebook more than the discussion board. I was able to
learn from my classmates in that way. Yes, um, we completed the tasks [from the
front stage] they [my classmates] also sent videos out on Facebook, like examples,
like YouTube videos of different counselling methods.
Kathy’s experience of studying in a Facebook group was not uncommon. A total of four small
separate study groups (containing 2–5 students) were reported during interviews, and two more
Facebook study groups were referenced by students in discussion board posts. While students’
perceptions of their tutors may not be the only reason for their segregation into the backstage,
this pattern does highlight the impact of teaching presence in online courses. As studies of
Facebook show, students seek and find information from those who make them feel more
comfortable.
Students’ perceptions of speed
In addition to self-segregating to form small study groups, students also preferred
Facebook because of the perceived speed of responses to posts. Overall, responses were reported
to be faster in the backstage Facebook groups, and during interviews students repeatedly said
that information travelled faster in the online backstage.
My first step was [the] Discussion Board and I had to wait because responses are
slow. Facebook was the second step. [But I preferred] the Facebook group because
the responses were quicker and also more personal. (Briana, Extra)
There are several reasons why the backstage audience may have been perceived as faster than the
front stage audience. In the front stage, conversations may have been “slow” for a reason
identified by Rambe (2012): if students view certain posts as a teacher’s domain, they may hold
back, slowing responses in the front stage. However, when teachers are not present, as in the
backstage Facebook groups of this study, the behavior in the setting changes; in this instance,
communication became “quicker”:
…there are quite a few really good YouTube channels that have ex professors and
teachers and they are really good they explain things without treating the audience
like a brainless dolt. Usually videos from Facebook were always good because
another classmate already used it……that is one of the great things about the
Facebook groups the sharing of links to extra material that sometimes help
understand a class or concept [from that week]… (Kara, Stagehand)
Resources in the backstage, such as videos, were vetted by more experienced students who had
already completed the class and understood what it was like to be a student in it. This supported
students in taking control of their learning experience in terms of time and access.


Students’ perceptions of belonging
Belonging emerged as a theme absent from prior research on Facebook use in the university
context. In the data, students reported a sense of belonging in the backstage that was not present
in the front stage, describing the backstage as a “less formal” and more “comfortable” setting:
Facebook was good actually, because you could post bits and pieces and whatever.
It felt less formal than the discussion board. Even though we were probably talking
about the same thing, but to me personally, it felt less structured. Less academic, is
probably the word I am looking for. (Julia, Extra)
I am not comfortable posting on DB. I think there is the fear of making an idiot of
myself but that is only part of it. I do feel disconnected there, I have posted things
and waited days for a response and sometimes no response. (Kara, Stagehand)
Students benefited from having a space for observing and a separate space for sharing. Multiple
contexts help to facilitate social learning through both observation and direct experiences (Lave &
Wenger, 1991; Wenger, 1998).
No backstage online Facebook presence
Three students reported not having a backstage online presence on Facebook. The reasons
cited were not having a Facebook account, not knowing about the university or class Facebook
pages, or not considering themselves social people. One cameo who did not consider herself a
social person explained: “I am not really a joiner though, like in general, I don’t join
groups.” This student preferred to discuss her learning experience with her face-to-face co-workers
and clients. Another cameo was a shift worker who could only study during hours when her
classmates were most likely asleep; as a result, both her front stage and backstage presences were
minimal. These caveats remind us that not all students are social learners or live in circumstances
that afford online social learning experiences. Similarly, it is unreasonable to expect that all
students want a Facebook presence to support their learning experience. As the first example
illustrates, some learners support their social learning through offline relationships, and this could
be another backstage worth exploring (see, for example, Gilmore, 2014; Gilmore, 2017;
Livingstone & Sefton-Green, 2016).

Conclusion
By using the theatre metaphor, I was able to capture the specific ways university students’
online learning practices differ across setting, time, and audience. This analytical approach
yielded useful insights into why students are absent from a class’s discussion board and what
they do instead to learn course content. Absence from the front stage may not be an absence
from learning; rather, the act of being absent affords actors the control to escape, or buffer
themselves, from deterministic demands (Goffman, 1959). Some students avoided the front stage
discussion board because the audience was too slow, too harsh, and too formal. The backstage
online audience solved these front stage problems, making it a more attractive location. Students
may need spaces where control and content are student-driven. The challenge for teachers and
universities is to develop curricula with the backstage in mind.


This study employed a dramaturgical approach to examine how online students perform
class-related tasks in spaces other than those designed and monitored by the university. Such an
approach allows for careful investigation and analysis of how setting, time, and audience impact
online students’ learning experience. While not every student used Facebook for university
purposes, a closer examination of the backstage online in this psychology class reveals how
Facebook facilitates some students’ social learning experiences, a finding that can apply to various
forms of social media and collaborative technologies outside of an LMS.


References
Arouri, Y. (2015). How Jordanian university students perceive the opportunities and challenges
of using Facebook as a supplementary learning resource? International Journal of
Emerging Technologies in Learning (iJET), 10(1), 46.
http://doi.org/10.3991/ijet.v10i1.4265
Bicen, H., & Cavus, N. (2011). Social network sites usage habits of undergraduate students: Case
study of Facebook. Procedia—Social and Behavioral Sciences, 28, 943–947.
http://doi.org/10.1016/j.sbspro.2011.11.174
Bosch, T. E. (2009). Using online social networking for teaching and learning: Facebook use at
the University of Cape Town. Communication, 35(2), 185–200.
http://doi.org/10.1080/02500160903250648
Bowman, N. D., & Akcaoglu, M. (2014). “I see smart people!”: Using Facebook to supplement
cognitive and affective learning in the university mass lecture. The Internet and Higher
Education, 23, 1–8. http://doi.org/10.1016/j.iheduc.2014.05.003
Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in
Psychology, 3(2), 77–101.
Bullingham, L., & Vasconcelos, A. C. (2013). “The presentation of self in the online world”:
Goffman and the study of online identities. Journal of Information Science, 39(1), 101–
112. http://doi.org/10.1177/0165551512470051
Catalano, A. (2015). The effect of a situated learning environment in a distance education
information literacy course. The Journal of Academic Librarianship, 41(5), 653–659.
http://doi.org/10.1016/j.acalib.2015.06.008
Chaiklin, S., & Lave, J. (Eds.). (1993). Understanding practice: perspectives on activity and
context. Cambridge University Press.
Collins, A., Brown, J. S., & Newman, S. E. (1989). Cognitive apprenticeship: Teaching the crafts
of reading, writing, and mathematics. In L. B. Resnick (Ed.), Knowing, learning, and
instruction: Essays in honor of Robert Glaser (pp. 453–494). Lawrence Erlbaum
Associates.
Cuesta, M., Eklund, M., Rydin, I., & Witt, A.-K. (2016). Using Facebook as a co-learning
community in higher education. Learning, Media and Technology, 41(1), 55–72.
http://doi.org/10.1080/17439884.2015.1064952
Denzin, N. K. (1989). The research act: A theoretical introduction to sociological methods (3rd
ed.). Prentice Hall.
DiVall, M. V., & Kirwin, J. L. (2012). Using Facebook to facilitate course-related discussion
between students and faculty members. American Journal of Pharmaceutical Education,
76(2). http://doi.org/10.5688/ajpe76232
Donlan, L. (2014). Exploring the views of students on the use of Facebook in university teaching
and learning. Journal of Further and Higher Education, 38(4), 572–588.
http://doi.org/10.1080/0309877X.2012.726973


English, R. M., & Duncan-Howell, J. A. (2008). Facebook goes to college: Using social
networking tools to support students undertaking teaching practicum. Journal of Online
Learning and Teaching, 4(4), 596–601.
Gilmore, D. (2014). Goffman’s front stage and backstage behaviors in online education. Journal
of Learning Analytics, 1(3), 187–190.
Gilmore, D. (2017). Where and with whom do students learn in an online university subject? A
multiple case study analysis. Swinburne University.
Goffman, E. (1959). The presentation of self in everyday life. Doubleday Anchor Books.
Hogan, B. (2010). The presentation of self in the age of social media: Distinguishing
performances and exhibitions online. Bulletin of Science, Technology & Society, 30(6),
377–386. http://doi.org/10.1177/0270467610385893
Hyde-Clarke, N. (2013). Facebook and public debate: An informal learning tool for the youth.
Journal of African Media Studies, 5(2), 131–148.
Junco, R. (2014). iSpy: Seeing what students really do online. Learning, Media and Technology,
39(1), 75–89. http://doi.org/10.1080/17439884.2013.771782
Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation.
Cambridge University Press.
Lin, P.-C., Hou, H.-T., Wang, S.-M., & Chang, K.-E. (2013). Analyzing knowledge dimensions
and cognitive process of a project-based online discussion instructional activity using
Facebook in an adult and continuing education course. Computers & Education, 60(1),
110–121. http://doi.org/10.1016/j.compedu.2012.07.017
Livingstone, S. M., & Sefton-Green, J. (2016). The class: Living and learning in the digital age.
New York University Press.
Markham, A. N., & Baym, N. K. (Eds.). (2009). Internet inquiry: Conversations about method.
Sage Publications.
Masood, K., Ahmed, B., Choi, J., & Gutierrez-Osuna, R. (2012). Consistency and validity of
self-reporting scores in stress measurement surveys. In Engineering in Medicine and
Biology Society (EMBC), 2012 Annual International Conference of the IEEE (pp. 4895–
4898). IEEE. Retrieved from http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=6347091
Meskill, M., & Sadykova, G. (2017). The presentation of self in everyday ether: A corpus analysis of
student self-tellings in online graduate courses. Online Learning, 11(3), 123–138.
http://dx.doi.org/10.24059/olj.v11i3.1723
Pearson, E. (2009). All the World Wide Web’s a stage: The performance of identity in online
social networks. First Monday, 14(3). http://doi.org/10.5210/fm.v14i3.2162
Phillips, S. (2007). A brief history of Facebook. The Guardian. Retrieved from
https://www.theguardian.com/technology/2007/jul/25/media.newmedia
Prescott, J., Wilson, S., & Becket, G. (2013). Facebook use in the learning environment: Do
students want this? Learning, Media and Technology, 38(3), 345–350.


Rambe, P. (2012). Critical discourse analysis of collaborative engagement in Facebook postings.
Australasian Journal of Educational Technology, 28(2), 295–314.
Roblyer, M. D., McDaniel, M., Webb, M., Herman, J., & Witty, J. V. (2010). Findings on
Facebook in higher education: A comparison of college faculty and student uses and
perceptions of social networking sites. The Internet and Higher Education, 13(3), 134–
140. http://doi.org/10.1016/j.iheduc.2010.03.002
Rogoff, B., & Lave, J. (Eds.). (1984). Everyday cognition: Its development in social context.
Harvard University Press.
Ross, D. A. R. (2007). Backstage with the knowledge boys and girls: Goffman and distributed
agency in an organic online community. Organization Studies, 28(3), 307–325.
http://doi.org/10.1177/0170840607076000
Ru-Chu, S. (2013). Effect of using Facebook to assist English for Business Communication
course instruction. TOJET: The Turkish Online Journal of Educational Technology,
12(1), 52–59.
Selwyn, N. (2009). Faceworking: Exploring students’ education-related use of Facebook.
Learning, Media and Technology, 34(2), 157–174.
Staines, Z., & Lauchs, M. (2013). Students’ engagement with Facebook in a university
undergraduate policing unit. Australasian Journal of Educational Technology, 29(6),
792–805.
Stake, R. E. (1995). The art of case study research. Sage Publications.
Trammell, K. D., & Keshelashvili, A. (2005). Examining the new influencers: A self-
presentation study of A-list blogs. Journalism & Mass Communication Quarterly, 82(4),
968–982.
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes.
Harvard University Press.
Wenger, E. (1998). Communities of practice: Learning, meaning, and identity. Cambridge
University Press.
Waller, V., Farquharson, K., & Dempsey, D. (2016). Qualitative social research: Contemporary
methods for the digital age. Sage Publications.
