
J Comput High Educ (2018) 30:452–465

https://doi.org/10.1007/s12528-018-9179-z

Online learning in higher education: exploring advantages and disadvantages for engagement

Amber D. Dumford¹ • Angie L. Miller²

Published online: 3 April 2018


© Springer Science+Business Media, LLC, part of Springer Nature 2018

Abstract As the popularity of online education continues to rise, many colleges and
universities are interested in how to best deliver course content for online learners.
This study explores the ways in which taking courses through an online medium
impacts student engagement, utilizing data from the National Survey of Student
Engagement. Data was analyzed using a series of ordinary least squares regression
models, also controlling for relevant student and institutional characteristics. The
results indicated numerous significant relationships between taking online courses
and student engagement for both first-year students and seniors. Those students
taking greater numbers of online courses were more likely to engage in quantitative
reasoning. However, they were less likely to engage in collaborative learning,
student-faculty interactions, and discussions with diverse others, compared to their
more traditional classroom counterparts. The students with greater numbers of
online courses also reported less exposure to effective teaching practices and lower
quality of interactions. The relationship between these engagement indicators and
the percentage of classes taken online suggests that an online environment might
benefit certain types of engagement, but may also be somewhat of a deterrent to
others. Institutions should consider these findings when designing online course
content, and encourage faculty to contemplate ways of encouraging student
engagement across a variety of delivery types.

Keywords Online education · Higher education · Student engagement · Assessment

Angie L. Miller
[email protected]

Amber D. Dumford
[email protected]

1 University of South Florida, 4202 E Fowler Ave, EDU 105, Tampa, FL 33620, USA
2 Indiana University Bloomington, 1900 E. 10th St, Suite 419, Bloomington, IN 47406, USA


Introduction

A rapidly increasing number of colleges and universities are looking for ways to
deliver course content online. Online technology (email, learning management
systems, discussion boards, video conferences, social media, etc.) can offer efficient
and convenient ways to achieve learning goals for online education students (Chen
et al. 2010; Junco et al. 2010, 2013; Parsad and Lewis 2008). As technology swiftly
develops and more students pursue the online learning route for a variety of reasons,
it is important to further develop assessment and evaluation techniques for the
“virtual university” (Stallings 2002). However, assessment of online learning
programs should also take into account some of the unique aspects of this type of
learning environment, as “using established techniques for student success in traditional classrooms do not always work in distance courses” (Serwatka 2002, p. 48).

Online education issues and trends

It is crucial to explore the current situations and issues with higher education online
learning to provide a better context for ways in which the student experience might
be improved. Online learning has several characteristics that can have an impact on
faculty implementation and course progress. Restauri et al. (2001) cite the
importance of considering the logistical component of online learning, suggesting
that improperly functioning technology can hinder learning and engagement if
students and instructors must devote time and resources to simple content access.
Technology failures in online courses can be especially frustrating for students and have a negative impact on their overall perception of the course
(Pollack and Wilson 2002), so user-friendly design and adequate technological
support must be considered differently within online education. Furthermore, Shuey
(2002) indicates that it can be difficult for instructors to adapt certain activities (such
as performance assessments, continuous assessment, and proctored tests) to the
online format without losing content knowledge or interaction between classmates
and/or instructors. Wijekumar et al. (2006) suggest that the feedback loop between
teacher and student that is taken for granted in a face-to-face setting must be adapted
as well, as online students may feel more isolation from their professors if
traditional assessments like multiple-choice quizzes and exams are used too heavily.
In addition to issues of cheating, overreliance on the summative feedback from
graded quizzes and exams might limit the formative feedback given to students
during the learning process, which can also be problematic.
The rapid rate of changes in technology often exceeds the rate of scientific
research on such time-sensitive topics. The explosion of social media within the past
decade has had an increasing impact on higher education, and more recent research
indicates trends for the importance of incorporating social media into the classroom
(Evans 2014; Junco et al. 2010, 2013; Tess 2013). Furthermore, there is rising
evidence for the importance of adapting online education to mobile devices, citing
that younger and full-time employed students are more likely to use mobile versions


of learning management systems (Han and Shin 2016). Most studies find positive
effects from use of mobile learning for online courses (as discussed in Wu et al.
2012), although this could also be due to bias against the publication of studies with
non-significant findings (Whitley 2002).
Another consideration in the evaluation of online learning programs is that online
education students often have different background characteristics in terms of
gender, age, academic discipline, and prior education, which contributes not only to
their preference for an online course format but also to their success in any
academic setting (Richardson et al. 1999). Investigations of differences between
online and face-to-face course formats also need to explore whether differences in
outcomes should be attributed to the online medium itself or to differences in
student-level characteristics (Wojciechowski and Palmer 2005). Some research
suggests that certain types of students, including younger, male, and Black students,
may be at a disadvantage in their ability to adapt to online courses (Xu and Smith
Jaggars 2013). Furthermore, students may need additional motivation, organization,
and self-discipline to be successful in their online learning endeavors (Jacob and
Radhai 2016). Online education has the potential to reach a wider audience, in a
sense leveling the playing field for students usually at a disadvantage in access to
education; however, the unique needs and situations of these students can greatly
impact their educational experiences and institutions should take care not to
exacerbate existing gaps.
Yet despite the potential issues with implementation and learning outcomes,
online education continues to expand. Close to 70% of higher education institutions
in the United States say that online education is crucial to their long-term strategies
(Allen and Seaman 2013), and as of 2013 there were over 5.5 million students
enrolled in at least one online education course at degree-granting postsecondary
institutions (U.S. Department of Education 2016), pursuing degrees, micro-
credentials, professional development, or personal growth. In addition to courses
taught entirely online, blended learning (i.e., instruction that combines face-to-face
with online elements) has become increasingly popular not only at the course level
(Drysdale et al. 2013) but also at the student level, as many students are taking a mix
of online only and face-to-face courses as part of their college experience (Allen and
Seaman 2013). Nearly a decade ago, research from Kim and Bonk (2006) predicted
some of these trends in the rise of learning management systems and blended
learning, but further noted the importance of planning and moderating skills for
faculty teaching online courses and that faculty generally want training and support
from their institutions to improve their online teaching abilities.

Student engagement

Since student engagement, defined as student involvement in educationally purposeful activities (Kuh 2001), has been shown in many studies to be the
strongest predictor of students’ learning and personal development (see seminal
higher education works from Astin 1993; Pace 1980; Pascarella and Terenzini
2005), understanding how the online environment affects engagement should
inform the implementation of online programs. There are many elements that


contribute to student engagement, both inside and outside of the classroom (Kuh
2001). Academic behaviors, including the use of strategies for self-regulated
learning, quantitative reasoning, activities that promote higher-order thinking, and
reflective and integrative learning can increase content knowledge as well as overall
cognitive processing and are all linked to various aspects of achievement and
success (Ormrod 2011; Pascarella and Terenzini 2005). Interactions with peers
through collaborative learning and discussions with diverse others are also
important elements of student engagement (Cabrera et al. 2002), along with
student-faculty interactions and effective teaching practices utilized by faculty (Kuh
and Hu 2001a). Additionally, there are environmental features that contribute to
student engagement, such as the quality of interactions with students, faculty, and
other types of staff, as well as an overall perception of a supportive environment
(Baird 2005).
While numerous researchers have explored the impact of the widespread
adaptation of digital technologies on students’ educational attainment and learning
outcomes (Braten and Streomso 2006; Kuh and Hu 2001b; Robinson and Hullinger
2008; Zhou and Zhang 2008), relatively little is known about how these alternate
learning experiences and practices impact overall student engagement, as a majority
of the original research on student engagement was centered on students in
traditional face-to-face settings. Newer research that does explore the effects of
technology-mediated learning on student engagement and outcomes generally
focuses solely on findings at the individual course level (Drysdale et al. 2013;
Henrie et al. 2015), not the overall student experience, so it can be difficult to
generalize these findings.
A few studies have suggested that technology can have positive effects on student
engagement (Chen et al. 2010; Henrie et al. 2015; Hu and Kuh 2001; Junco et al.
2013; Nelson Laird et al. 2005; Robinson and Hullinger 2008) and time spent in co-
curricular activities (Junco 2012). For example, one study showed that using
asynchronous technology tools promoted reflection, which leads to higher-order
thinking (Robinson and Hullinger 2008). Others found that online courses increase
the need for students to work collaboratively (Thurmond and Wambach 2004), and
that more interactivity and engagement with online discussion boards are related to
higher course performance (Kent et al. 2016).
Given these previous findings, the current study investigates the engagement of
students who access their learning content through an online medium at varying
levels, exploring patterns of engagement for online learners and those in face-to-
face settings through the percentage of classes in which a student is enrolled online.
While research has also found differences in the educational effectiveness of online
courses by discipline (Dominguez and Ridley 2001), little research has compared
the differences in overall engagement levels for online and traditional classroom
environments. Therefore, this research will address gaps in the literature through the
examination of students in online and face-to-face settings across several different
aspects of student engagement. Through its design, it will also address the fact that
many higher education students are taking a mixture of online and traditional
classroom courses (Moore and Kearsley 2011), and explore how the degree of
online course exposure contributes to student engagement, while controlling for


those elements previously suggested to differ as well—student characteristics, institutional characteristics, and discipline (Wojciechowski and Palmer 2005; Xu
and Smith Jaggars 2013).

Methodology

Participants

The National Survey of Student Engagement (NSSE) annually collects information from first-year and senior (usually in their fourth or fifth year, nearing graduation in
terms of credits) students about the nature and quality of the programs and activities
in which they are engaged while at their higher education institutions. Updated in
2013, much of the terminology on the survey was adapted or developed with the
goal of being inclusive of both online and traditional learning environments. For
instance, the words “in the classroom” were changed to “in your courses” to avoid
the reference to a physical space. This study uses 2015 NSSE data from 300,543
first-year students and seniors attending 541 U.S. institutions. None of these
institutions were considered primarily online institutions (those offering the vast
majority of courses online). The participating institutions varied across geographic
region, Carnegie classification, and enrollment size. First-year students made up
42.6% of the respondents, while the remaining 57.4% were seniors. The sample was
35.0% males and 65.0% females, with 88.5% reporting full-time enrollment status.
In addition, the sample included 17,080 students (7.2%) taking all of their courses
completely online and 180,525 students (76.1%) taking none of their courses online.
The average institutional response rate was 29% (SD = .116).

Measures

Students can be classified as online, traditional, or mixed-format students using two NSSE items. The first item asks students to report the number of courses they
are taking in the current term, and then a follow-up item asks how many of those
courses are entirely online. Using these two items, a “percentage of courses online”
variable was created by dividing the number of online courses by the number of
total courses. For instance, if a respondent reported taking 4 courses, 3 of which
were entirely online, he/she would be 75% online. This continuous variable was the
independent variable of interest in the study.
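As a minimal sketch, this derivation could be implemented as follows; the records and column names here are hypothetical illustrations, not NSSE's actual variables:

```python
import pandas as pd

# Hypothetical respondent records: total courses this term and how many
# of those courses are taken entirely online (column names illustrative).
df = pd.DataFrame({
    "courses_total": [4, 5, 3],
    "courses_online": [3, 0, 3],
})

# Percentage of courses online = online courses / total courses.
# A respondent taking 3 of 4 courses online is 75% online.
df["pct_online"] = df["courses_online"] / df["courses_total"]

print(df["pct_online"].tolist())  # [0.75, 0.0, 1.0]
```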
The dependent variables were ten scales, which NSSE terms “Engagement Indicators,” that were used to measure the engagement levels of students. These scales included higher-order learning (4 items; Cronbach's α = .85 first-year, .86 senior), reflective and integrative learning (7 items; Cronbach's α = .87 first-year, .88 senior), quantitative reasoning (3 items; Cronbach's α = .85 first-year, .87 senior), learning strategies (3 items; Cronbach's α = .77 first-year, .78 senior), collaborative learning (4 items; Cronbach's α = .81 first-year, .80 senior), discussions with diverse others (4 items; Cronbach's α = .89 first-year, .90 senior), student-faculty interactions (4 items; Cronbach's α = .83 first-year, .85 senior), effective teaching practices (4 items; Cronbach's α = .85 first-year, .87 senior), quality of interactions (5 items; Cronbach's α = .84 first-year, .81 senior), and supportive environment (8 items; Cronbach's α = .89 first-year, .89 senior). These
scales show acceptable levels of internal consistency (McMillan and Schumacher
2001) and previous research suggests sufficient evidence for construct validity with
exploratory and confirmatory factor analyses (Miller et al. 2016). Each scale was
scored on a 60-point scale by converting the response sets to 60-point intervals and
then averaging the rescaled items. Consequently, a score of zero would mean a
student responded at the bottom of the response set for every item in the scale, while
a score of 60 would mean that a student responded at the top of the response set for
every item in the scale. Thus, higher scores on the scales mean higher levels of that
particular aspect of engagement. (For a more detailed discussion of the scale scoring
and construction process and the individual items in each scale, please refer to the
NSSE website at http://nsse.indiana.edu/html/engagement_indicators.cfm#a1.) The
survey instrument also collected demographic information from respondents,
including gender, transfer status, enrollment status, parents’ education, age, major,
race/ethnicity, and grades. The survey instrument data is then combined with
institution-provided data, such as student scores for SAT/ACT, institution control,
and size. This demographic and institutional information served as control variables
for all of the models.
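The 0-to-60 rescaling described above can be sketched as follows; this is a simplified illustration assuming a single 4-point response set for every item, whereas NSSE's actual construction accommodates several response formats:

```python
import numpy as np

def rescale_to_60(responses, n_options):
    """Map responses on a 1..n_options set onto 0-60
    (bottom option -> 0, top option -> 60)."""
    r = np.asarray(responses, dtype=float)
    return (r - 1) / (n_options - 1) * 60

# Hypothetical 4-item scale answered on a 4-point response set.
items = np.array([
    [1, 1, 1, 1],  # bottom of the response set on every item -> score 0
    [4, 4, 4, 4],  # top of the response set on every item    -> score 60
    [2, 3, 4, 1],  # mixed responses
])

# Scale score = average of the rescaled items.
scores = rescale_to_60(items, n_options=4).mean(axis=1)
print(scores.tolist())  # [0.0, 60.0, 30.0]
```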

Analyses

Ordinary Least Squares (OLS) regression analyses were chosen due to the ordinal
nature of the dependent variables and the appropriateness of this method for testing
theory with real-world data collected outside of manipulated laboratory settings
(Field 2009; Tabachnick and Fidell 2001). A series of 10 step-wise OLS regression
analyses (one per engagement indicator) were conducted for both first-year and
senior students. In each of the analyses, the percentage of online courses was
entered as the last step predictor variable by itself. Selected student and institutional
characteristics were entered as step one of the model, as previous research
(Pascarella and Terenzini 2005) suggests that there are differences in student
engagement and educational experiences for students based on these characteristics.
The student-level characteristics included were gender, transfer status, enrollment
status, first generation status, age, SAT/ACT, major, race/ethnicity, and grades.
Control (private/public) and size were included as the institutional-level characteristics. All categorical independent variables were dummy coded prior to entry in the
model. The 10 engagement indicators were the outcome variables in each of the
models. The Variance Inflation Factor (VIF) values for each predictor variable in
these regression models were all well below 5 (ranging from 1.0 to 2.5), suggesting
that multicollinearity was not an issue in the models (Field 2009). Finally, since
multiple comparisons were being made, the Bonferroni correction was used and
only those predictors that were significant at the p < .005 level were considered.


Results

The results of the regression models including all first-year students indicate that the
proportion of online classes being taken has a statistically significant, negative
effect on 3 of the 10 engagement indicators. Specifically, first-year students who
take more classes online report lower levels of collaborative learning in their
courses, fewer diverse discussions with others, and lower quality of interactions. On
the other hand, the proportion of online courses taken by a first-year student had a
positive effect on the amount of time spent engaging in quantitative reasoning
activities. More online courses were related to more engagement. This suggests that
while there appear to be some disadvantages to online learning in terms of
engagement for first-year students, there are benefits as well.
Some of these findings were also observed for seniors, although the patterns were
not exactly the same. For the models with senior students, the percentage of online
courses was a statistically significant negative predictor for 5 of the 10 engagement
indicators. Students’ ratings of effective teaching practices, student-faculty interaction, discussions with diverse others, quality of interactions, and collaborative
learning were lower when they were taking more online courses. The strongest
relationship found was between the percentage of online courses taken and
collaborative learning for seniors, meaning that the higher the percentage of online
courses a student is taking, the lower the amount of collaborative learning in which
he/she is engaged. This was followed by the relationship between student-faculty
interaction and percentage of courses online for seniors, again with traditional face-
to-face students having the advantage in this type of engagement.
Individual statistically significant beta weights for all models are reported in
Table 1. In addition, the variance explained for each model is listed in Table 1. The
predictor variables accounted for 1.7–10.0% of the total variance on the engagement
indicators (first-year M = .0315, SD = .0102; senior M = .0498, SD = .0287).
While that may not seem like a very large proportion, the relatively low percentage
of explained variance is most likely due to the fact that the models are very
simple—focusing on just the percentage of online courses taken and controlling for
student and institutional characteristics.

Discussion

The significant relationships for engagement and amount of online course exposure
reveal that the online environment might encourage certain types of engagement,
such as more use of quantitative reasoning activities. In contrast, traditional face-to-
face environments seem more likely to promote collaborative learning, student-
faculty interaction, effective teaching practices, quality of interactions, and
discussions with diverse others. The findings from this study focusing on
engagement in the online environment build on the past literature in this area and
show potential need for even more exploration.


Table 1 Selected results from the OLS regression models: explained variance (R²) and the effects^a of the percentage of online courses on the ten engagement indicators

Engagement indicator                  First-year R²  First-year β  Senior R²  Senior β
Higher-order learning                 .021***        NS            .027***    NS
Reflective and integrative learning   .034***        NS            .070***    NS
Learning strategies                   .036***        NS            .044***    NS
Quantitative reasoning                .050***        .013***       .100***    NS
Collaborative learning                .037***        -.025***      .080***    -.087***
Discussions with diverse others       .017***        -.011**       .020***    -.013**
Student-faculty interaction           .036***        NS            .075***    -.048***
Effective teaching practices          .025***        NS            .028***    -.022***
Quality of interactions               .038***        -.019***      .027***    -.012**
Supportive environment                .021***        NS            .027***    NS

**p < .005 (Bonferroni cutoff); ***p < .001

^a Controlling for student characteristics: gender, transfer status, enrollment status, first generation status, age, SAT/ACT, major (arts and humanities as reference group), race/ethnicity (White as reference group), grades (mostly A's as reference group) and institutional characteristics: control (private/public), size

First-year students taking more classes online also reported higher levels of
quantitative reasoning use. This could be related to the nature of core courses most
likely to be adapted to the online environment. Certain disciplines that are higher in
use of quantitative reasoning skills, such as business and nursing, are most
commonly offered in the online format (Friedman 2014). Many course management
systems, such as MyMathLab, are geared specifically at STEM fields with online
video functions, programming to accommodate special characters and formulas, and
interactive guided problem-solving. These systems are also designed to help make
the online format simpler for faculty with automatic grading functions and pre-recorded videos. Many of the quantitative courses adapted in this way for the online format are entry-level courses, which could be the reason that this same finding was not seen in senior students.
However, those students with greater exposure to online formats showed less
engagement in collaborative learning activities among both first-years and seniors.
It may be that the potential isolation that comes with online, self-directed learning
might contribute to fewer opportunities for collaborative learning. This result is
somewhat in opposition to other research that has been done on effective uses of
student collaboration in online environments (Thurmond and Wambach 2004). Yet
whether a student is learning online or in the traditional classroom setting, the group
skills gained while working collaboratively are crucial. It is very important to create
the sense of a community of learners in an online course, since technology lacks a
human component and can lead to feelings of isolation (Cohen 2003). Assigning
group projects or requiring classmate interaction via chats and discussion boards
may be a useful approach to integrating collaborative learning activities into web-
based courses. Furthermore, since this study is taking the less traditional route of


exploring the effects of the percentage of online courses rather than comparing
online-only students to face-to-face students, those taking a combination of courses
online and traditionally might be less familiar with navigating the online classroom
environment. It may also be the case that students who dislike group interactions self-select into online courses when they are available, reflecting a preference for more individual work.
Online learning environments also seemed to be less conducive to student-faculty interaction for seniors. Although some instructors might feel that online environments necessitate greater, rather than less, interaction with students because
research suggests that online courses require more time from faculty (Allen and
Seaman 2013; Tomei 2006), it may be that the interaction is more superficial in this
type of context. It takes longer to type multiple email responses to student questions,
which may come at any time of the day, than it does to make a single in-class
announcement. Answering trivial questions can take instructor time away from
sharing course content and developing course activities; thus, interaction may suffer
from a student perspective. Interestingly, there may be ways to combat some of
these reserved and detached online interactions with faculty and replace them with
more amiable means of communication. More informal assessments of student
learning can take place during chat-room discussions, examination of problem-
solving logs, and discussion board content analysis to enhance formative feedback
for students while providing a sense of enhanced faculty interaction (Wijekumar
et al. 2006). Pukkaew (2013) found that while online students did not care for the
communication platforms in the institution-provided course system, they had greater
course success when using social media (Facebook) for chat and messaging with
instructors and tutors instead. The formality of institution-provided email and
learning management systems may be hindering student perceptions of faculty
interaction. In general, interactions between students and faculty are less common for first-year students; perhaps the reduced interaction for all first-year students masks
the online effect. The quality of interactions with other institutional representatives
follows this pattern as well, but for both first-year and senior students.
Similar to the findings for student-faculty interaction and quality of interactions,
senior students with more online exposure seemed to rate their faculty members’
effective teaching practices lower. Perhaps faculty members that are teaching
courses online need to spend more time contemplating the logistical components of
incorporating technology and adapting their courses for the online environment
(Restauri et al. 2001). This result may also be explained through faculty time
requirements, as previously noted that online courses require more time from
instructors (Allen and Seaman 2013; Tomei 2006). The extra time commitment
might mean that faculty do not have time to concentrate on improving their teaching
or trying different techniques, and instead simply do what they can to get through
the course. Furthermore, online instructors are more likely to be in the adjunct
category, with other full-time (or multiple part-time) jobs outside of their teaching
commitments. Adjunct status can impact many aspects of students’ experiences
(Umbach 2007), and in addition to time constraints, adjunct faculty may also be
constrained by pre-established syllabi and course assignments, from which they
must not stray. This may be negatively impacting engagement in online courses,


suggesting that the importance of effective teaching practices extends beyond just
learning and development to engagement as well (Kuh and Hu 2001a).
Finally, the results of this study suggest that both first-year and senior students
who take more online courses have fewer opportunities for discussions with diverse
others. While this result on first consideration may seem to be counterintuitive
because the online environment allows students who are very different from the
traditional college student to take courses (often being older, more likely to work
full-time, and from a lower socioeconomic background), the anonymity of the
online environment may actually be the cause of this finding. Students could be
reporting few interactions with diverse others because they just do not know that
they are happening. Being online takes away some of the knowledge of who is
different from oneself in the course, in terms of background. The problem with this
is that students have to know about the diversity in order to benefit from a diverse
interaction. Faculty can circumvent this issue by having students in their courses
introduce themselves and disclose more about their personal background (while still
maintaining an open and non-judgmental atmosphere), or possibly introduce video
aspects into the course.

Limitations

While this study has several strengths, some limitations should also be considered.
The first limitation is that in terms of data collection, this data set was not designed
as a project to look strictly at online education. NSSE is a national project that was
designed to be a good measure of student engagement (Kuh 2001). Although the
researchers involved in updating NSSE for 2013 took the increasing presence of
online education into consideration when revising item language, some of the
questions still might be more easily interpreted from a traditional classroom
perspective. Another limitation is that since participation in NSSE is voluntary for
institutions, they are not selected randomly or to create a representative sample of
institutions. Although this is a concern, when compared to national data, the institutions in NSSE 2015 mirror the national picture (National Survey of Student Engagement 2015). Furthermore, given the research design, this study was
unable to test for causal relationships between online learning and engagement. The results can only confirm whether or not the two are associated, which is important to consider so as not to overstate the findings or draw erroneous conclusions.
Additionally, there were relatively weak standardized regression coefficients and
low explained variance, suggesting other factors not included in the model are
having an influence on student engagement. Finally, although this research has the
advantages of large sample size and ease of online data collection, it does rely on
self-reported measures, which may not always be objective. However, most studies
looking at self-reports of students in higher education suggest that self-reports and
actual abilities are positively related (Anaya 1999; Hayek et al. 2002; Pike 1995)
and social desirability bias does not play a major role in student responses for
surveys of basic cognitive and academic behaviors (Miller 2012).

462 A. D. Dumford, A. L. Miller

Conclusion

With the proliferation of online learning in higher education, there is an increased
need to understand the engagement and gains of students whose only opportunity to
learn is through an online environment. While there may be some benefits of online
learning in the realm of engagement, it seems that there are also some sacrifices
online learners make when it comes to an engaging educational experience. These
findings open the door to more inquiry. Further research might look at particular
online tools and techniques, both general and discipline-specific, which lead to these
different types of engagement and learning in order to improve education for online
learners. Future studies might integrate concepts such as motivation (Pintrich 2004)
and achievement goal orientation (Murayama and Elliot 2009), also known to play a
role in student engagement but not specifically measured with NSSE, and apply
previous findings to the setting of online learning. This study used a multi-
institution sample, but it may be useful to conduct some in-depth explorations of a
few schools that have made progressive advances in online education, profiling their
processes and outcomes to develop an applied model for practice.
As the technology used in online education continues to evolve rapidly, research
must persistently address the impact of online learning in higher education. It might
be useful to replicate the current study with a slight reframing of the research
question, comparing subgroups of students who take 100% of their courses online to
those who take all of their courses in traditional face-to-face settings and
exploring whether the predictive power increases. More research is also needed on
whether there are disciplinary differences between academic majors and the use of
online curriculum, and if these patterns are similar to those for face-to-face learning
settings. If a primary goal of online learning is to reach a wider range of students
and provide educational opportunities for those who might not otherwise have such
access, then it is important to ensure that online education students are partaking in
equally engaging educational experiences that contribute to their learning and
success.

Compliance with ethical standards

Conflict of interest The authors declare that they have no conflict of interest.

References
Allen, E., & Seaman, J. (2013). Changing course: Ten years of tracking online education in the United
States. Babson Park, MA: Babson Survey Research Group.
Anaya, G. (1999). College impact on student learning: Comparing the use of self-reported gains,
standardized test scores, and college grades. Research in Higher Education, 40, 499–526.
Astin, A. W. (1993). What matters in college? Four critical years revisited. San Francisco, CA: Jossey-
Bass.
Baird, L. (2005). College environments and climates: Assessments and their theoretical assumptions.
Higher Education: Handbook of Theory and Research, 10, 507–537.
Bråten, I., & Strømsø, H. I. (2006). Epistemological beliefs, interest, and gender as predictors of
Internet-based learning activities. Computers in Human Behavior, 22(6), 1027–1042.


Cabrera, A. F., Crissman, J. L., Bernal, E. M., Nora, A., Terenzini, P. T., & Pascarella, E. T. (2002).
Collaborative learning: Its impact on college students’ development and diversity. Journal of
College Student Development, 43(1), 20–34.
Chen, P. D., Lambert, A. D., & Guidry, K. R. (2010). Engaging online learners: The impact of web-based
learning technology on student engagement. Computers & Education, 54, 1222–1232.
Cohen, V. L. (2003). Distance learning instruction: A new model of assessment. Journal of Computing in
Higher Education, 14(2), 98–120.
Dominguez, P. S., & Ridley, D. R. (2001). Assessing distance education courses and discipline
differences in effectiveness. Journal of Instructional Psychology, 28(1), 15–19.
Drysdale, J. S., Graham, C. R., Spring, K. J., & Halverson, L. R. (2013). An analysis of research trends in
dissertations and theses studying blended learning. Internet and Higher Education, 17, 90–100.
https://doi.org/10.1016/j.iheduc.2012.11.003.
Evans, C. (2014). Twitter for teaching: Can social media be used to enhance the process of learning?
British Journal of Educational Technology, 45(5), 902–915.
Field, A. (2009). Discovering statistics using SPSS (3rd ed.). London: Sage Publications.
Friedman, J. (2014). Online education by discipline: A graduate student’s guide. Retrieved from http://
www.usnews.com/education/online-education/articles/2014/09/17/online-education-by-discipline-a-
graduate-students-guide.
Han, I., & Shin, W. S. (2016). The use of a mobile learning management system and academic
achievement of online students. Computers & Education, 102, 79–89. https://doi.org/10.1016/j.
compedu.2016.07.003.
Hayek, J. C., Carini, R. M., O’Day, P. T., & Kuh, G. D. (2002). Triumph or tragedy: Comparing student
engagement levels of members of Greek-letter organizations and other students. Journal of College
Student Development, 43(5), 643–663.
Henrie, C. R., Halverson, L. R., & Graham, C. R. (2015). Measuring student engagement in technology-
mediated learning: A review. Computers & Education, 90, 36–53. https://doi.org/10.1016/j.
compedu.2015.09.005.
Hu, S., & Kuh, G. D. (2001). Computing experience and good practices in undergraduate education: Does
the degree of campus "wiredness" matter? Education Policy Analysis Archives, 9(49). http://epaa.
asu.edu/epaa/v9n49.html.
Jacob, S., & Radhai, S. (2016). Trends in ICT e-learning: Challenges and expectations. International
Journal of Innovative Research & Development, 5(2), 196–201.
Junco, R. (2012). The relationship between frequency of Facebook use, participation in Facebook
activities, and student engagement. Computers & Education, 58(1), 162–171.
Junco, R., Elavsky, C. M., & Heiberger, G. (2013). Putting Twitter to the test: Assessing outcomes for
student collaboration, engagement, and success. British Journal of Educational Technology, 44(2),
273–287.
Junco, R., Heiberger, G., & Loken, E. (2010). The effect of Twitter on college student engagement and
grades. Journal of Computer Assisted Learning. https://doi.org/10.1111/j.1365-2729.2010.00387.x.
Kent, C., Laslo, E., & Rafaeli, S. (2016). Interactivity in online discussions and learning outcomes.
Computers & Education, 97, 116–128. https://doi.org/10.1016/j.compedu.2016.03.002.
Kim, K. J., & Bonk, C. J. (2006). The future of online teaching and learning in higher education: The
survey says…. Educause Quarterly, 4, 22–30.
Kuh, G. D. (2001). The National Survey of Student Engagement: Conceptual framework and overview of
psychometric properties. Bloomington, IN: Indiana University, Center for Postsecondary Research.
Kuh, G. D., & Hu, S. (2001a). The effects of student-faculty interaction in the 1990s. Review of Higher
Education, 24(3), 309–332.
Kuh, G. D., & Hu, S. (2001b). The relationships between computer and information technology use,
student learning, and other college experiences. Journal of College Student Development, 42,
217–232.
McMillan, J. H., & Schumacher, S. (2001). Research in education: A conceptual introduction. New York:
Longman.
Miller, A. L. (2012). Investigating social desirability bias in student self-report surveys. Educational
Research Quarterly, 36(1), 30–47.
Miller, A. L., Sarraf, S. A., Dumford, A. D., & Rocconi, L. M. (2016). Construct validity of NSSE
engagement indicators (NSSE psychometric portfolio report). Bloomington, IN: Center for
Postsecondary Research, Indiana University, School of Education. http://nsse.indiana.edu/pdf/
psychometric_portfolio/Validity_ConstructValidity_FactorAnalysis_2013.pdf.


Moore, M. G., & Kearsley, G. (2011). Distance education: A systems view of online learning. Belmont,
CA: Wadsworth.
Murayama, K., & Elliot, A. J. (2009). The joint influence of personal achievement goals and classroom
goal structures on achievement-relevant outcomes. Journal of Educational Psychology, 101(2),
432–447. https://doi.org/10.1037/a0014221.
National Survey of Student Engagement. (2015). NSSE 2015 overview. Bloomington, IN: Indiana
University, Center for Postsecondary Research.
Nelson Laird, T. F., Shoup, R., & Kuh, G. D. (2005). Measuring deep approaches to learning using the
National Survey of Student Engagement. Paper presented at the annual meeting of the Association
for Institutional Research, Chicago, IL. http://nsse.iub.edu/pdf/conference_presentations/2006/
AIR2006DeepLearningFINAL.pdf.
Ormrod, J. E. (2011). Human learning (6th ed.). Upper Saddle River, NJ: Pearson.
Pace, C. R. (1980). Measuring the quality of student effort. Current Issues in Higher Education, 2, 10–16.
Parsad, B., & Lewis, L. (2008). Distance education at degree-granting Postsecondary Institutions:
2006–2007 (NCES 2009–044). National Center for Education Statistics, Institute of Education
Sciences. Washington, DC: US Department of Education. http://nces.ed.gov/pubs2009/2009044.pdf.
Pascarella, E. T., & Terenzini, P. T. (2005). How college affects students: A third decade of research
(Vol. 2). San Francisco, CA: Jossey-Bass.
Pike, G. R. (1995). The relationship between self-reports of college experiences and achievement test
scores. Research in Higher Education, 36(1), 1–22.
Pintrich, P. R. (2004). A conceptual framework for assessing motivation and self-regulated learning in
college students. Educational Psychology Review, 16(4), 385–407. https://doi.org/10.1007/s10648-
004-0006-x.
Pollack, P. H., & Wilson, B. M. (2002). Evaluating the impact of internet teaching: Preliminary evidence
from American national government classes. PS: Political Science and Politics, 35(3), 561–566.
Pukkaew, C. (2013). Assessment of the effectiveness of internet-based distance learning through the
VClass e-Education platform. International Review of Research in Open and Distance Learning,
14(4), 255–276.
Restauri, S. L., King, F. L., & Nelson, J. G. (2001). Assessment of students’ ratings for two methodologies
of teaching via distance learning: An evaluative approach based on accreditation. ERIC document
460-148, reports-research (143).
Richardson, J. T. E., Morgan, A., & Woodley, A. (1999). Approaches to studying distance education.
Higher Education, 37, 23–55.
Robinson, C. C., & Hullinger, H. (2008). New benchmarks in higher education: Student engagement in
online learning. Journal of Education for Business, 84(2), 101–108.
Serwatka, J. A. (2002). Improving student performance in distance learning courses. Technological
Horizons in Education THE Journal, 29(9), 48–51.
Shuey, S. (2002). Assessing online learning in higher education. Journal of Instruction Delivery Systems,
16, 13–18.
Stallings, D. (2002). Measuring success in the virtual university. The Journal of Academic Librarianship,
28, 47–53.
Tabachnick, B. G., & Fidell, L. S. (2001). Using multivariate statistics (4th ed.). Needham Heights, MA:
Allyn & Bacon.
Tess, P. A. (2013). The role of social media in higher education classes (real and virtual)—A literature
review. Computers in Human Behavior, 29(3), A60–A68.
Thurmond, V., & Wambach, K. (2004). Understanding interactions in distance education: A review of the
literature. International Journal of Instructional Technology & Distance Learning, 1, 9–33. http://
www.itdl.org/journal/Jan_04/article02.htm.
Tomei, L. A. (2006). The impact of online teaching on faculty load: Computing the ideal class size for
online courses. Journal of Technology and Teacher Education, 14, 531–541.
Umbach, P. D. (2007). How effective are they? Exploring the impact of contingent faculty on
undergraduate education. Review of Higher Education, 30(2), 91–123. https://doi.org/10.1353/rhe.
2006.0080.
U.S. Department of Education, National Center for Education Statistics. (2016). Digest of education
statistics, 2014 (NCES 2016-006), Table 311.15. Retrieved from https://nces.ed.gov/fastfacts/
display.asp?id=80.
Whitley, B. E. (2002). Principles of research in behavioral science (2nd ed.). New York, NY: Routledge.


Wijekumar, K., Ferguson, L., & Wagoner, D. (2006). Problems with assessment validity and reliability in
web-based distance learning environments and solutions. Journal of Educational Multimedia and
Hypermedia, 15(2), 199–215.
Wojciechowski, A., & Palmer, L. B. (2005). Individual student characteristics: Can any be predictors of
success in online classes? Online Journal of Distance Learning Administration, 8(2), 13.
Wu, W., Wu, Y. J., Chen, C., Kao, H., & Lin, C. (2012). Review of trends from mobile learning studies:
A meta-analysis. Computers & Education, 59, 817–827. https://doi.org/10.1016/j.compedu.2012.03.
016.
Xu, D., & Smith Jaggars, S. (2013). Adaptability to online learning: Differences across types of students
and academic subject areas (CCRC Working Paper). New York, NY: Teachers College, Columbia
University. Retrieved from http://ccrc.tc.columbia.edu/publications/adaptability-to-online-learning.
html.
Zhou, L., & Zhang, D. (2008). Web 2.0 impact on student learning process. In K. McFerrin et al. (Eds.),
Proceedings of society for information technology and teacher education international conference
(pp. 2880–2882). Chesapeake, VA: AACE.

Amber D. Dumford is an Associate Professor at the University of South Florida. She teaches in the
department of College Student Affairs and coordinates the graduate program. Her research interests
include gender issues in higher education, arts education, engineering education, creativity, and
quantitative reasoning.

Angie L. Miller is an Associate Research Scientist in the Center for Postsecondary Research at Indiana
University. She does research and data analysis for the National Survey of Student Engagement (NSSE)
and the Strategic National Arts Alumni Project (SNAAP). Her research interests include student
engagement, creativity assessment, utilization of creativity in educational settings, and factors influencing
gifted student achievement.

