nations. However, the development and deployment of STEM education differ across continents and countries. Some countries are already implementing it at an advanced level, while others are still in the introduction and trial stages (National Audit Office, 2010). Therefore, an instrument to evaluate the attitude, knowledge, and even application (AKA) of STEM education is necessary.
The AKA instrument developed here encompasses three domains at once: teachers' STEM attitude, knowledge, and application. Based on a literature review, some previous studies have also developed STEM-related instruments. Lam, Doverspike, Zhao, Zhe, and Menzemer (2008) developed an instrument to assess STEM career interest among middle school students. Meng, Idris, Eu, and Daud (2013) developed a survey instrument to elicit secondary school students' perceptions of STEM-related subjects in school assessment practices. Ibrahim, Aulls, and Shore (2017) developed an instrument to survey bachelor students' attainment values for inquiry engagement in STEM. However, these survey instruments are designed to elicit information from students.
Other studies have developed instruments focused on teachers. El-Deghaidy and Mansour (2015) developed an instrument focused on science teachers' perceptions of STEM education. In other research, Vennix, den Brok, and Taconis (2017) developed a survey questionnaire to assess perceptions of STEM-based outreach learning activities among both students and teachers in secondary education. However, that instrument is not comprehensive enough for general use: it is specialized to elicit data about perceptions of STEM-based outreach learning activities, so it cannot capture teachers' perceptions of STEM in general when no such activities have taken place beforehand. What is needed, for instance, is an instrument that captures teachers' attitude, knowledge, and application of STEM education simultaneously, together with the problems and challenges teachers face at a particular regional, national, or global level. Moreover, there is limited research that develops survey instruments focusing specifically on science teachers.
One solution is to re-evaluate, modify, combine, and reconstruct existing instruments (Summers & Abd-El-Khalick, 2018) into a single instrument aligned with an existing STEM education framework, namely the framework for K-12 science education (NRC, 2012). However, such approaches cannot overcome the ingrained problems of existing instruments. Given the vital role of science teachers (Kola, 2013; Ruiz et al., 2014), the AKA instrument directly answers the need to quantitatively measure their levels of attitude, knowledge, and application of STEM.
Attitude, knowledge, and application (AKA) are three terms that appear frequently in assessment, especially in education, and each has a broad meaning. The first aspect introduced here is attitude. Maio and Haddock (2014) argued that the term attitude can be defined in many ways. Pryor, Pryor, and Kang (2016) defined attitude as a negative, positive, or neutral feeling regarding some behavior or object; in terms of direction and strength, attitudes vary along a spectrum from extremely unfavorable to extremely favorable. Ajzen (2005) and Maio and Haddock (2014) defined attitude as an overall evaluation of an object along dimensions such as unpleasant or pleasant and bad or good. Therefore, in this research, "attitude" indicates whether science teachers agree or disagree with applying STEM in their students' classroom learning and with fostering students' curiosity towards STEM, as well as what teachers think and feel about STEM. Al-Salami, Makela, and de-Miranda (2017) showed that primary teachers need to develop both skills and attitudes toward interdisciplinary teaching. To determine secondary teachers' attitudes towards teaching integrated STEM-related subjects, Thibaut, Knipprath, Dehaene, and Depaepe (2018) developed a valid questionnaire in alignment with the theoretical framework proposed by van Aalderen-Smeets, Walma van der Molen, and Asma (2012). The findings of that validation research provided evidence for a framework composed of three components: perceived control, cognition, and emotion. In the field of STEM education, research on science teachers' attitudes remains relatively limited.
The next aspect introduced is knowledge, a term that likewise has a broad meaning. It is not surprising that many discussions of teacher knowledge note how difficult it is to probe this aspect of teaching practice with a quantitative survey. Thomson (1998) gave the term knowledge several specific meanings: firstly, familiarity or awareness gained through experience of a thing, person, or fact, or a person's range of information; secondly, a practical or theoretical understanding of a language, subject, and so on, or the sum of what is known; and thirdly, justified true belief, that is, certain understanding as opposed to perception or opinion. Furthermore, Shulman (1986) divided teacher knowledge into three forms: propositional knowledge, case knowledge, and strategic knowledge. Regarding how such knowledge is acquired, Biggam (2001) argued that there are many ways to gain knowledge: it can be obtained through experience or through rational thought, and it can become more specialized or be expanded. Nowadays, information and communication technology greatly influences how knowledge is communicated and put to work (Binckly et al., 2012). Thus, in this research, the researchers restricted the term teachers' STEM knowledge to all information held by a science teacher about STEM education: their grasp of the term STEM, their knowledge of whether or not STEM education is a kind of teaching method, and their knowledge of how to apply STEM in the classroom, including the interconnectedness of one discipline with another. Koehler and Mishra (2009) noted that teachers need to be knowledgeable in each discipline and understand how the disciplines interconnect.
STEM teachers have content knowledge that includes the scientific method, evidence-based reasoning, principles and constraints of engineering design, mathematical theories and constructs, and the technology applications that support that content knowledge. However, pedagogy is equally important: pedagogy is knowledge of how students learn, classroom management skills, lesson planning, and assessment (Koehler & Mishra, 2009). The challenge is to have sufficient STEM content knowledge and effective pedagogical knowledge to make learning effective, challenging, and engaging. Several previous studies have attempted to measure such knowledge. Lam, Doverspike, Zhao, Zhe, and Menzemer (2008) used surveys administered through workshops to elicit the knowledge and beliefs of high school students and their parents about STEM education. Gosselin and Macklem-Hurst (2002) also conducted a survey to assess the knowledge level of high school students. Nevertheless, information about STEM teachers' knowledge is still very limited.
Furthermore, the terms application, practice, and implementation have roughly the same meaning: they all denote the realization or performance of some activity. Practice is the actual implementation or use of a belief, method, or idea, as opposed to the theories related to it, while implementation is the process of putting a decision or plan into effect or execution. In this research, the researchers use the term application rather than the other two, as it is more appropriate for describing teachers' STEM performance in the classroom. Many researchers have addressed the application of STEM in the classroom, but few discuss it from the teacher's point of view, especially quantitatively. Han, Yalvac, Capraro, and Capraro (2015) used a qualitative case study approach to examine teachers' implementation and understanding of STEM-related activities. They stated that in applying STEM in the classroom, teachers should pay attention to the academic level of the students, prepare as well as possible, and make a serious effort. English (2016), Herschbach (2011), and Kelley and Knowles (2016) asserted that although STEM education is expanding significantly across countries, not much is known about methods or approaches for applying STEM instruction. Overall, to understand the current situation in the field of STEM education, such an instrument is strongly needed.
Research Focus
The limitations of existing STEM survey instruments from previous research (El-Deghaidy & Mansour, 2015; Meng, Idris, Eu, & Daud, 2013; Vennix, den Brok, & Taconis, 2017), in terms of scope and purpose, together with the need for an appropriate tool to assess the latest progress in STEM education, were the concerns addressed in this research. This research therefore focused on the design and development of a valid instrument for determining the general quantitative degree of attitude, knowledge, and application of STEM by science teachers. The instrument serves as an essential tool and reference for evaluating and understanding the general state of STEM progress. The AKA instrument enables users to obtain data and information on the development of STEM and on the problems and challenges faced by science teachers worldwide. The worth of the instrument is determined by two questions: (1) how was the AKA instrument developed, and (2) how valid is the AKA instrument?
Methodology of Research
General Background
This research attempted to develop an instrument for scrutinizing the quantitative degree of STEM attitude, knowledge, and application among science teachers. The research is categorized as Research and Development (R & D). There were four phases in producing the final instrument, namely planning, construction, quantitative evaluation, and validation. The final instrument qualifies as a survey instrument for determining the general quantitative degree of attitude, knowledge, and application of STEM simultaneously among secondary school science teachers. The quality of the instrument was determined based on reliability and validity. Reliability was assessed quantitatively using the Statistical Package for the Social Sciences (SPSS), while validity was determined both quantitatively (SPSS) and qualitatively (analysis of experts' opinions). Data were collected from Indonesian secondary school science teachers in different provinces from February to April 2018.
Sample
The sample for this pilot test consisted of 137 secondary school science teachers in Indonesia. Table 1 presents the demographic data of the respondents, comprising gender, educational background, teaching experience, and teaching specialization. The respondents were secondary school science teachers from eight provinces in Indonesia, recruited for validity and reliability testing. In terms of ethical concerns, the researchers ensured the confidentiality of the participating teachers' identities. The purpose of the research was explained in detail, and all respondents were assured of the confidentiality of their responses. Furthermore, there was no coercion in the data collection, and all participants took part and completed the survey voluntarily. The teachers taught Chemistry, Physics, Biology, or Integrated Science at junior and senior high schools.
Procedure
There were four phases in the design and development of this instrument. These phases were followed in order to obtain an instrument that accurately measures the target variables (Creswell, 2005). The four phases are planning, construction, quantitative evaluation, and validation, as shown in Figure 1.
Figure 1. Research phases of developing the AKA instrument (adapted from Creswell, 2005).
The first phase was planning. At this stage, the main focus was determining the purpose of the instrument as well as the target group to be tested. As discussed above, the aim was to provide a basic tool and reference for evaluating and understanding the general state of STEM progress. The second stage was the construction stage. In this step, the researchers determined the form of the survey instrument, adopted a 5-point Likert-type scale, divided the STEM domain into four sections, and created items for each domain (an item pool) together with additional items to strengthen the survey results. The third stage was to design and conduct a pilot study to obtain information, based on respondent feedback, about deficiencies in the instrument. The last stage was validation. The validation was conducted using face and content validation by experts as well as exploratory factor analysis using SPSS. In practice, the quantitative evaluation and validation phases were performed simultaneously, and their results fed back into the construction phase; the cycle continued until the final instrument was obtained. In the results below, the researchers describe in detail what was done in each phase, especially the second to the fourth phases.
Data Analysis
To determine the internal consistency of each domain created in the pilot study, the researchers assessed the reliability of the instrument. Internal consistency was determined from Cronbach's alpha values. The three main domains and their subdomains were assessed with this test. The domains were STEM attitude, STEM knowledge, and STEM application; the subdomains were Science-Technology, Science-Engineering, Science-Mathematics, Science-Technology-Engineering, Science-Technology-Math, Science-Engineering-Math, and Science-Technology-Engineering-Math. The next step was validation, in which two types of validation were carried out. The first was content and face validation of the instrument by three experts. These experts have experience in instrument development and are familiar with STEM education; two are professors, and the third holds a doctorate in science education. They rated their level of agreement using a 5-point Likert-type scale ranging from one to five. The validation yielded the validators' degree of approval as well as suggestions and input, and this feedback was used to improve the research instrument. The second validation was exploratory factor analysis using SPSS for Windows version 22, analyzing the factor loading of each item in each domain.
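For readers who wish to reproduce the reliability check outside SPSS, the following is a minimal Python sketch (not the authors' actual SPSS procedure) that computes Cronbach's alpha per domain; the file name and item labels such as SK1-SK4 are assumptions made only for illustration.

import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a DataFrame whose columns are the items of one domain."""
    items = items.dropna()
    k = items.shape[1]                              # number of items in the domain
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical usage: pilot responses, one row per teacher, one column per item.
df = pd.read_csv("aka_pilot_responses.csv")         # assumed file name
domains = {
    "SK":  ["SK1", "SK2", "SK3", "SK4"],            # assumed item labels
    "SAt": ["SAt1", "SAt2", "SAt3"],
}
for name, columns in domains.items():
    print(name, round(cronbach_alpha(df[columns]), 3))

The function implements the standard formula alpha = k/(k-1) x (1 - sum of item variances / variance of the total score), which is the same internal-consistency statistic reported by SPSS.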
Results of Research
The researchers titled the resulting instrument "Survey of Science Teachers' Attitude, Knowledge, and Application (AKA) of STEM." Development began by grouping the STEM domain into three main groups, namely STEM knowledge (SK), STEM application (SAp), and STEM attitude (SAt). STEM knowledge is the domain that reveals respondents' knowledge of STEM education. The second domain is STEM application; its questions elicit information about the extent to which science teachers apply STEM in the classroom. The STEM application domain was divided into several subdomains: Science-Technology (SAp-ST), Science-Engineering (SAp-SE), Science-Mathematics (SAp-SM), Science-Technology-Engineering (SAp-STE), Science-Technology-Math (SAp-STM), Science-Engineering-Math (SAp-SEM), and Science-Technology-Engineering-Math (SAp-STEM). The last domain is STEM attitude (SAt), which explores respondents' attitudes or views towards STEM education.
The STEM knowledge domain consisted of four question items representing a single construct; six questions were developed initially, but after the reliability and validity tests, four remained. The STEM application domain consisted of 23 question items spread across six constructs, again the result after the reliability and validity tests. The last domain, STEM attitude, consisted of three question items forming a single construct. The final question items and their constructs are shown in the appendix.
Reliability of Instrument
The reliability of the instrument was analyzed from the respondents' data using Cronbach's alpha. The reliability index shows the extent to which a measurement tool can be trusted or relied upon. Table 2 summarizes the reliability test results of the instrument.

Table 2 shows that the internal consistency alpha of every STEM domain was greater than .6. Among the three main domains, the highest value was for SK (.908) and the lowest for SAp (.865). Among the STEM application subdomains, the highest value was for SAp-STEM (.865) and the lowest for SAp-SEM (.684). Cronbach's alpha values greater than .6 are considered acceptable for scale reliability; therefore, the AKA instrument shows acceptable internal consistency across domains.
Validity of Instrument
The survey instrument was then validated to provide confidence about whether it was feasible for obtaining the intended data. Two types of validation were used. First, content and face validity were assessed by three experts using a 5-point Likert-type scale. This validation examined how the instrument looks and is presented. Table 3 shows the results of the content and face validity, indicating how far the experts approved of or agreed with the presentation and content of the instrument. This validation was also carried out to determine the readability, accuracy, and suitability of the instrument content, and as part of it the experts gave suggestions and comments. Second, validity was verified by conducting an exploratory factor analysis (EFA). There were several reasons for doing EFA: firstly, to identify a set of underlying variables, called factors, whose aggregation of observed variables could indicate the interrelationships among those variables; secondly, to find out whether the items hang together to form a construct; and finally, to find out whether the survey questions showed similar patterns of responses.
Table 3. Expert validation scores for each criterion (Expert 1, Expert 2, Expert 3, Mean±SD).
The average expert agreement rate was 83.33%, which, based on BSNP (2016), is at the good level. No criterion was scored below three, which means that all experts agreed on the form and content of the instrument. However, the experts still provided notes and comments to refine the instrument; these are summarized in Table 4 below.
Table 4. Summary of the experts' notes and comments and the resulting revisions.

Section | Expert | Note/comment | Revision
Identities/Demography | Expert 1 | Please insert the aim of the research in the questionnaire | The aim of the research was written in the introduction of the questionnaire
Identities/Demography | Expert 2 | It is better not to use "name", but initials only | The name item was changed to initials only
STEM Knowledge | Expert 1 | "I often see my friends doing STEM class" should be removed because it will confuse respondents | Item removed
STEM Knowledge | Expert 3 | "I know everything about STEM" | Changed to "I know the term of STEM"
STEM Application | Expert 1 | Please give some examples of the term "design something" in the SAp-SE item | "I often ask students to design something related to the topic of science (ex: design a replica of DNA, an atom, etc.)"
STEM Application | Expert 2 | Technology means not only a tool but also a system of thinking in producing something | "My students are actively involved in using simple technology or a particular procedure to produce something in learning"
STEM Application | Expert 2 | The number of items for SAp-STEM is too unbalanced relative to the other domains | The number of items was reduced from five to four
STEM Application | Expert 2 | Give an example for the item "I often invite students to use all possible technologies to collect data on learning in the science classroom" | "I often invite students to use all possible technologies to collect data on learning in the science classroom (ex: using a thermometer and mathematical computation to make a decision)"
STEM Attitude | Expert 3 | Two items are too few; add at least one more | Added the item "I am very interested to know more how to properly integrate the mathematical, technological and engineering approaches in teaching science in the classroom"
Complementary | Expert 2 | The sentence structure of the item "Based on your knowledge, provide some of the possible difficulties..." is not appropriate for the respondent; please add the words "current" and "ability" | Sentence revised to "Based on your current knowledge and abilities, provide some of the possible difficulties..."

Note: Sentences or words shown in italics in the original table (quoted in the Revision column) are the result of a change.
These notes from the experts became valuable advice for the researchers; what one person considers good may differ from what others think, which is why the experts' role was needed. However, a small number of notes are not included in the table because the researchers understood the issue differently. For example, one comment was not to use the word "always" in the survey sentences; after careful consideration of the original goal, the researchers did not adopt this suggestion. Thus, after revising the instrument to accommodate the experts' comments and suggestions, the instrument improved.
In addition to the face and content validation by the experts, the researchers also performed construct validation using exploratory factor analysis with the IBM SPSS 22.0 for Windows program. Following Hair, Black, Babin, Anderson, and Tatham (2006), only factors with an eigenvalue higher than one were retained. The Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy was .822, indicating that the variables are highly factorable (Table 4). Bartlett's Test of Sphericity was significant (p < .05), indicating that the variables were correlated. Since the Bartlett test was statistically significant and the KMO value exceeded the recommended value of .60 (Pallant, 2005), the researchers proceeded with the factor analysis.
The construct validation using exploratory factor analysis yielded three factors, as shown in Table 5 below. All items had factor loadings higher than .5; one item (SAP_STEM4) had a relatively modest value of .65, but this was still sufficient to retain the item in the analysis.
Table 5. Cronbach’s α and factor loading for the main domain of STEM.
The largest percentage of variance was explained by factor one (45.54%), out of a total explained variance of about 77.74%. Finally, using principal component analysis with the varimax rotation method, items SK1, SK2, SK3, and SK4 were shown to belong to Factor 1, as their loadings were larger than .3. Items SAP_STEM1, SAP_STEM2, SAP_STEM3, and SAP_STEM4 belong to Factor 2, and items SAt1, SAt2, and SAt3 belong to Factor 3. Factor 1 refers to STEM knowledge, Factor 2 to STEM application, and Factor 3 to STEM attitude.
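The SPSS output itself is not reproduced here. Purely as an illustrative alternative, the same factorability checks and a three-factor varimax solution can be sketched in Python with the open-source factor_analyzer package (an assumption for illustration, not the authors' tool), again assuming the responses sit in a pandas DataFrame with one column per item and an assumed file name.

import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import (
    calculate_bartlett_sphericity,
    calculate_kmo,
)

# Hypothetical DataFrame of Likert responses (rows = teachers, columns = items
# such as SK1..SK4, SAP_STEM1..SAP_STEM4, SAt1..SAt3); values range from 1 to 5.
df = pd.read_csv("aka_pilot_responses.csv")  # assumed file name

# Factorability checks reported in the paper: Bartlett's test and KMO.
chi2, p_value = calculate_bartlett_sphericity(df)
kmo_per_item, kmo_total = calculate_kmo(df)
print(f"Bartlett chi2 = {chi2:.2f}, p = {p_value:.4f}")  # expect p < .05
print(f"KMO (overall) = {kmo_total:.3f}")                # paper reports .822

# Three-factor solution with varimax rotation.
fa = FactorAnalyzer(n_factors=3, rotation="varimax")
fa.fit(df)
loadings = pd.DataFrame(fa.loadings_, index=df.columns,
                        columns=["Factor1", "Factor2", "Factor3"])
print(loadings.round(2))

# Variance explained per factor and cumulatively (paper reports about 77.74% total).
variance, proportion, cumulative = fa.get_factor_variance()
print("Cumulative variance explained:", cumulative.round(3))

Because factor_analyzer performs common factor extraction rather than SPSS's principal component analysis, the loadings and explained variance from this sketch will differ slightly from the values reported above.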
Final Instrument
The final instrument was the result of several revisions based on expert input and the statistical analysis from the pilot study. The instrument consists of three parts: the introduction, the core questionnaire items, and additional or complementary questionnaire items. The introductory section contains matters relating to the purpose of the instrument, how to use it, how to assign and calculate survey scores, and instructions for obtaining data easily. For instance, how is the survey scored? Each item response is scored from one for "strongly disagree" up to five for "strongly agree." For each construct, the participant's responses are averaged; for example, the four questions under SK (STEM knowledge) are averaged to produce one SK (STEM knowledge) score.
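As a concrete, hypothetical illustration of this scoring rule (the item labels and response values below are assumed for the example and are not the published items):

# Minimal sketch of the scoring rule described above: each response is coded
# 1 (strongly disagree) to 5 (strongly agree), and a construct score is the
# mean of that construct's items.
responses = {"SK1": 4, "SK2": 5, "SK3": 3, "SK4": 4,
             "SAt1": 5, "SAt2": 4, "SAt3": 4}

constructs = {"SK": ["SK1", "SK2", "SK3", "SK4"],
              "SAt": ["SAt1", "SAt2", "SAt3"]}

scores = {name: sum(responses[item] for item in items) / len(items)
          for name, items in constructs.items()}
print(scores)  # e.g. {'SK': 4.0, 'SAt': 4.33...}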
The core questionnaire consists of eight items collecting respondents' demographic data and the items eliciting teachers' AKA regarding STEM education. An example demographic item concerns "the range of teachers' teaching experience (< 10 years or > 10 years)." The items exploring STEM knowledge, attitude, and application total 30 questions. An example item eliciting teachers' STEM attitude is "I strongly agree to implement the mathematical, technological and engineering approaches in teaching science in the classroom." An example item eliciting teachers' STEM knowledge is "I know the term of STEM." Finally, a sample item on STEM application is "I usually teach science content using any kinds of technologies, engineering and mathematical context simultaneously."
The last part consists of questions assessing the model of STEM application and two open questions about opinions and the obstacles or challenges of implementing STEM, for instance: "Based on your current knowledge and abilities, please provide some of the possible difficulties that will be faced when applying the integration of mathematical, engineering, and technology approaches in science classroom learning!" The full set of core items on STEM knowledge, attitude, and application by science teachers in the final instrument is shown in the appendix of this paper.
Discussion
Exploratory factor analysis indicated that a three-factor structure was formed, namely STEM knowledge, STEM application, and STEM attitude, accounting for 77.74% of the total variance. The analyses showed that the items loaded onto factors corresponding to the theoretical structure. The Cronbach's alpha value of each domain or construct was above .60. It is important to note that Cronbach's alpha is evaluated like scale reliability, with values between .7 and .9 considered good and values higher than .6 still considered acceptable. Dillon and Goldstein (1984) and Joreskog (1971) asserted that scale reliability, also referred to as construct reliability, is measured based on the results of the exploratory factor analysis.
In addition, what distinguishes this development process from previous similar instrument developments is the content and face validation by experts to elicit their approval or agreement. Dorrusen, Lenz, and Blavoukos (2005) stated that researchers need to assess the reliability and validity of the information provided by interviewees in survey research and expert interviews, and they claim that inter-expert agreement is fundamental in the validation process of a new instrument. The face and content validity assessment performed by the experts showed an average approval value of 85%. The factor analysis results and the expert agreement indicated that the developed instrument is reliable and valid and is feasible for collecting data. However, further and replication studies are still necessary to make this instrument more meaningful. Lin and Tsai (2017) suggested that replication studies with national random samples may be meaningful in consolidating the findings of a study.
In terms of item content, the AKA instrument draws on several definitions raised by previous research. An item related to STEM knowledge, for example, was "I know the term of STEM." Based on Thomson (1998), this item belongs to the category of knowledge that originates from a person's awareness of or familiarity with something or some condition, in this case awareness of STEM education. Next, the example item eliciting teachers' STEM attitude was "I strongly agree to implement the mathematical, technological and engineering approaches in teaching science in the classroom." This item can be classified as a statement exploring a person's attitude because it relates to feelings of like or dislike and agreement or disagreement about a condition (Pryor, Pryor, & Kang, 2016), in this case, again, STEM education. Finally, an example item in the STEM application domain is "I usually teach science content using any kinds of technologies, engineering and mathematical context simultaneously." The authors argue that this sentence can extract information about teachers' STEM performance or activity in the classroom, in line with the definition from Thomson (1998), which treats application as the act of putting something into use. Thus, all items in the AKA instrument are relevant, valid, and feasible with respect to their constructs and definitions for assessing important aspects of science teachers' engagement with STEM education in the field.
Teachers are a vital component of the educational process. Ibrahim and Aulls (2017) found that teachers' roles in class include those of motivator and facilitator as well as mentor and model. Furthermore, the teacher's part in a STEM teaching and learning approach is to help students develop conceptions or abstractions and to decontextualize concepts for implementation in a variety of authentic contexts and different real-world problems (Moore et al., 2014). Thus, knowledge of STEM is absolutely required of science teachers. Such knowledge can lead teachers towards better understanding and attitudes, and a good attitude in turn affects STEM application in the everyday teaching and learning process. Various studies have shown that teachers' STEM application is affected by attributes such as their own level of knowledge and attitudes (Han, Yalvac, Capraro, & Capraro, 2015; Thibaut, Knipprath, Dehaene, & Depaepe, 2018). On the other hand, STEM has recently become a trend because of its reliability and potential for improving the quality of learning, yet not all teachers understand and accept STEM. Many teachers have been shown to hold misunderstandings and misconceptions about STEM education; for instance, some teachers exercise weak control over students and sometimes just sit and watch what students are doing without intervening (Han, Yalvac, Capraro, & Capraro, 2015). Inevitably, an investigation or survey of teachers' STEM attitude, knowledge, and application becomes very important for providing information on current conditions and for better planning in the future. WHO (2008) states that a survey is a tool for gathering information about a specific, well-defined topic, and suggests that survey results can serve as a baseline representing data collected at a point in time before any intervention is carried out.
This research makes several contributions to the literature. Firstly, the instrument is specifically designed to assess the general quantitative degree of attitude, knowledge, and application of STEM education simultaneously among science teachers. Secondly, the research used both factor analysis and content and face validation by experts to determine the AKA instrument's validity, a combination rarely found in similar instrument developments. Finally, the AKA instrument provides a quantitative way to measure the progress of STEM education.
Conclusions
The results showed that the designed and developed AKA instrument is valid and can be used to collect data. The AKA instrument allows quantitative data to be collected on a large scale, and by using an amalgamated instrument, the field now has a baseline to start from, or refer to, for STEM education interventions. The AKA instrument enables users worldwide to obtain information about the progress of STEM education and about the problems and challenges that science teachers face in the field. Furthermore, this instrument can serve as a prototype for other similar instruments, which may differ in some respects depending on the purpose of the development and where the instrument is used. Some of the points above also reflect the limitations of this instrument. For instance, because its purpose is to determine the general quantitative level of STEM, it is unsuitable for collecting pre-test and post-test data from a STEM workshop or similar activities. Another limitation is that the instrument has so far been tested in only one country, Indonesia. Future research should therefore explore larger samples from different areas and cultures; several repetitions, especially in different target countries, will be needed to obtain more accurate results. A further recommendation is to test the psychometric properties of the instrument in a variety of cultural contexts and with different demographic profiles of targeted respondents. Nonetheless, the researchers believe that the AKA instrument can be used by researchers interested, for instance, in exploring the factors that affect science teachers' integration of STEM in their classrooms in any area or country.
Acknowledgements
The authors would like to express their gratitude to Terrence from the Science Education Center, NTNU, who helped in the English editing process. This study was supported in part by the Ministry of Science and Technology (MOST), Taiwan, Republic of China, under grant number 106-2511-S-003-050-MY3, A Joint Adventure in Science Education Research & Practice (STEM 2TV), and by the National Taiwan Normal University Subsidy for Talent Promotion Program. We also gratefully acknowledge funding from the Ph.D. Degree Training of the 4 in 1 Project of the University of Jember, the Ministry of Research, Technology and Higher Education of Indonesia, and the Islamic Development Bank (IsDB).
References
El-Deghaidy, H., & Mansour, N. (2015). Science teachers’ perceptions of STEM education: Possibilities and challenges. International
Journal of Learning and Teaching, 1 (1), 51-54.
English, L. D. (2016). STEM education K-12: Perspectives on integration. International Journal of STEM Education, 3 (3), 2-8.
Gosselin, D. C., & Macklem-Hurst, J. L. (2002). Pre-/post knowledge assessment of an earth science course for elementary/middle
school education majors. Journal of Geoscience Education, 50 (3), 169-175.
Guskey, T. R. (2003). Analyzing lists of the characteristics of effective professional development to promote visionary leadership.
NASSP Bulletin, 87 (637), 4-20.
Hair, J. F., Black, W. C., Babin, B. J., Anderson, R. E., & Tatham, R. L. (2006). Multivariate data analysis (6th Ed.). Upper Saddle River, NJ:
Pearson Prentice Hall.
Han, S., Yalvac, B., Capraro, M.M., & Capraro, R.M. (2015). In-service teachers’ implementation and understanding of STEM project-
based learning. Eurasia Journal of Mathematics, Science & Technology Education, 11 (1), 63-76.
Herschbach, D.R. (2011). The STEM initiative: Constraints and challenges. Journal of STEM. Teacher Education, 48 (1), 96–112.
Honey, M., Pearson, G., & Schweingruber, H. (2014). STEM integration in K-12 education: Status, prospects, and an agenda for research.
Washington, DC: The National Academies Press.
Ibrahim, A., Aulls, M.W., & Shore, B.M. (2017). Teachers’ roles, students’ personalities, inquiry learning outcomes, and practices of
science and engineering: The development and validation of the McGill attainment value for inquiry engagement survey in
STEM disciplines. International Journal of Science and Mathematics Education, 15 (3), 1195–1215.
Joreskog, K.G. (1971). Statistical analysis of sets of congeneric tests. Psychometrika, 36, 109-133.
Kelley, T. R., & Knowles, J. G. (2016). A conceptual framework for integrated STEM education. International Journal of STEM Education, 3 (11), 2–11.
Kola, A. J. (2013). Importance of science education to national development and problems militating against its development.
American Journal of Educational Research, 1 (7), 225-229.
Lam, P., Doverspike, D., Zhao, J., Zhe, J., & Menzemer, C. (2008). An evaluation of STEM program for middle school students on learn-
ing disability related IEPs. Journal of STEM Education, 9 (1&2), 21-29.
Lin, T. J., & Tsai, C. C. (2017). Developing instruments concerning scientific epistemic beliefs and goal orientations in learning science:
A validation study. International Journal of Science Education, 39 (17), 2382–2401.
Loehlin, J. (1998). Latent variable models: An introduction to factor, path, and structural analysis. Lawrence Erlbaum: Associates
Publishers.
Maio, G., & Haddock, G. (2014). The psychology of attitudes and attitude change. London, England: Sage Publications.
Meng, C. C., Idris, N., Eu, L. K., & Daud, M. F. (2013). Secondary school assessment practices in science, technology, engineering and
mathematics (STEM) related subjects. Journal of Mathematics Education, 6 (2), 58-69.
Moore, T. J., Stohlmann, M. S., Wang., H. H., Tank, K. M., Glancy, A., & Roehrig, G. H. (2014). Implementation and integration of en-
gineering in K-12 STEM education. In J. Strobel, S. Purzer, M. Cardella (Eds.), Engineering in precollege settings: Research into
practice, (pp. 35–59). West Lafayette: Purdue University Press.
National Audit Office. (2010). Educating the next generation of scientists. London: The Stationery Office.
National Research Council (NRC). (2012). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. Washington, DC: National Academies Press.
Pallant, J. (2005). SPSS survival manual. (2nd Ed.). New York: Open University Press.
Pryor, B. W., Pryor, C. R., & Kang, R. (2016). Teachers’ thoughts on integrating STEM into social studies instruction: Beliefs, attitudes,
and behavioral decisions. The Journal of Social Studies Research, 40, 123–136.
Roelens, K., Verstraelen, H., Egmond, K.V., & Temmerman, M. (2006). A knowledge, attitudes, and practice survey among obstetrician-
gynaecologists on intimate partner violence in Flanders, Belgium. BMC Public Health, 6 (238), 1-10.
Ruiz, M. A. O., Osuna, L. V., Salas, B. V. A., Wienner, M. S., Garcia, J. S., Cordova, E. C., Nedev, R., & Ibarra, R. (2014). The importance of
teaching science and technology in early education levels in an emerging economy. Bulletin of Science, Technology & Society,
34 (3), 87-93.
Shulman, L. S. (1986). Those who understand: Knowledge growth in teaching. Educational Researcher, 15 (2), 4-14.
Siew, N. M., Amir, N., & Chong, C. L. (2015). The perceptions of pre-service and in-service teachers regarding a project-based STEM
approach to teaching science. SpringerPlus, 4 (8), 1-20.
Summers, R., & Abd-El-Khalick, F. (2018). Development and validation of an instrument to assess student attitudes toward science
across grades 5 through 10. Journal of Research in Science Teaching, 55 (2), 172–205.
Thibaut, L., Knipprath, H., Dehaene, W., & Depaepe, F. (2018). How school context and personal factors relate to teachers’ attitudes
toward teaching integrated STEM. International Journal of Technology and Design Education, 28 (3), 631-651.
Thomson, D. (1998). The concise Oxford dictionary (9th ed.). England: Oxford University Press.
Van Aalderen-Smeets, S. I., Walma van der Molen, J. H., & Asma, L. J. (2012). Primary teachers’ attitudes toward science: A new theo-
retical framework. Science Education, 96 (1), 158-182.
Vennix, J., den Brok, P., & Taconis, R. (2017). Perceptions of STEM-based outreach learning activities in secondary education. Learning Environments Research, 20, 21–46.
Wahono, B., Rosalina, A. M., Utomo, A. P., Narulita, E. (2018). Developing STEM based student’s book for grade XII Biotechnology
topics. Journal of Education and Learning, 12 (3), 450-456.
Watermayer, R., & Montgomery, C. (2018). Public dialogue with science and development for teachers of STEM: Linking public
dialogue with pedagogic praxis. Journal of Education for Teaching, 44 (1), 90-106.
WHO. (2008). A guide to developing knowledge, attitude and practice surveys. Switzerland: World Health Organization.
Appendix
Each item is rated on a five-point scale: Strongly disagree, Disagree, Neither agree nor disagree, Agree, Strongly agree.
Bevo Wahono MA, PhD Student, Graduate Institute of Science Education, National
Taiwan Normal University, Taiwan & Lecturer at Faculty of Teacher
Training and Education, University of Jember, Indonesia.
E-mail: [email protected]
Chun-Yen Chang PhD, Chair Professor, Director of Science Education Center, National
Taiwan Normal University, Taiwan.
Email: [email protected]
Website: http://changcy.com