Week 1-Midterm


UNIVERSITY OF CAGAYAN VALLEY

(Formerly Cagayan Colleges Tuguegarao)


Tuguegarao City, Cagayan, Philippines
SCHOOL OF LIBERAL ARTS AND TEACHER EDUCATION
Development of Classroom Assessment Tools

Introduction

One of the most important aspects of being a teacher is administering tests to students
in order to assess their learning and how well they comprehend the material at hand.
Good assessments inform teachers whether students have grasped the material and offer insight into
their capacity to think at a higher level. Assessment should be ongoing rather than something
added at the conclusion of a unit of study. Assessments should be frequent enough to
inform teachers and students about their progress, and they should be consistent in terms of
subject application and marking. Teachers assess students on a daily basis through casual
observations, data collection, and more formal classroom tests. In this lesson, students will
select and construct appropriate test items and tasks for classroom assessments that they will
use in their future teaching careers.

Learning Objectives

At the end of the lesson, students should be able to:


1. Define and give examples of each of the different types of tests;
2. Discuss guidelines in constructing different test formats; and
3. Construct different types of tests in accordance with the guidelines in test construction.

Learning Content

TYPES OF TEST ITEM FORMATS

Test items can be written in various formats, including multiple choice, matching, true/false,
short answer, and essay. These formats vary in their strengths and weaknesses, and no one
format is ideal in all circumstances.

The first three formats are known as selection type (selected-response), because the student
sees the possible answers and has to choose (or select) the correct one.
1. Multiple Choice
2. Binary-Choice
3. Matching Type.
The remaining formats are known as supply type (constructed-response), because the student has
to come up with the answer on their own.
1. Short Answer
2. Essay (restricted response)
3. Essay (extended response)

SELECTION TYPE (SELECTED-RESPONSE)

In selection type test items several possible answers/alternatives for each question are already
provided, and students are simply required to choose the correct or best response among them
rather than recalling and giving facts or information from their own memory. As a result, these
test items are easier to answer than supply test items. The following types of selection type test
items are most common:

1. Multiple Choice
The multiple choice (MC) format is the most commonly used format in formal testing.
A multiple choice test item has two parts: the stem, which comprises the question,
statement, or problem, and a list of alternative responses. Incorrect responses are
called distractors. Multiple choice is popular for several reasons:
a. No subjective evaluation is required in scoring (the answer is either right or
wrong, best or not best, not half-right or partly wrong).
b. It lends itself to detailed analysis of responses, in which even incorrect answers
can provide information on the student's skills.
c. It lends itself well to computer scoring.

Figure 1. Sample multiple choice test items.


(Taken from: http://avstop.com/ac/instructors_handbook/images/fig_b_2.jpg)
There is also one significant drawback to multiple choice. As a selected-response format,
it is unable to test writing skills, including organization of thought and originality. These
skills are generally beyond the scope of a standardized achievement test.

In addition to the general characteristics of a good test item noted in Introduction to
Test Items, there are some specific guidelines to follow when writing or evaluating MC
items. Some relate to the stem, some to the options.

Guidelines in constructing Multiple Choice Item

1. The stem should clearly state the problem. A good stem is often clear enough
that a competent student can answer the item correctly without seeing any of
the options.
2. The stem should contain as much of the item as possible, but no more. There is
no point in repeating in each option something that can be stated once in the
stem. On the other hand, the stem should not be wordy nor contain irrelevant
information, known as window dressing. One exception would be a problem that
requires the student to determine which of the facts presented are necessary to
solve it.
3. The stem should, in most cases, be worded positively and in the active voice.
When negatives do need to be used, they must be accentuated in boldface or
ALL CAPS.
4. Use "story problems" – literally or figuratively – to present scenarios that require
comprehension and analysis, not merely recall of the concept.
5. Always keep in mind that the primary goal in writing the response options in MC
is to make it difficult for an uninformed but test-wise person to figure out the
correct answer. Knowledge of the construct being evaluated should ideally be the
only factor in correctly answering an MC item or any other item format.
6. Three or four options are best. It is difficult to write more than two or three
plausible distractors. The various authors of the Handbook of Test Development
range from mild to strong support for using only three options.
7. All options should be parallel in structure and similar in length. The item is more
readable, and there will be no obvious clues as to which options may be correct
or are obviously incorrect.
8. Options must be grammatically consistent with the stem in order to prevent
elimination of distractors.
9. All options must be plausible. If someone skilled, or at least comfortable, in a
testing environment, were to take a test on a subject of which he knew nothing,
he should not be able to dismiss options that seem to be implausible.
10. Distractors should reflect typical student errors, which makes them more
plausible and more valuable in analyzing student performance.
11. The option, "All of the above", is confusing and should generally be avoided. The
option, "None of the above", should only be used when there is one absolutely
correct answer, as in spelling or math.
12. Options should avoid clang associations, in which the correct answer contains a
word or phrase from the stem that the distractors lack.
13. Options should be placed in a logical order, such as numerical, alphabetical, or
response length. On the other hand, placement of the correct response should
be random. Any discernible pattern of correct answers can invalidate a test.
14. Options should not overlap each other; one option should not be a partial version
of another.

2. Binary-choice

True or false, correct or incorrect, good or better, and so on are examples of binary-choice
items. By pure guesswork, a student who knows nothing about the exam's topic has a
50 percent chance of choosing the correct answer. Although there are methods for
correcting for guessing, it is preferable if the teacher ensures that a true-false item
can distinguish between those who know and those who are simply guessing. By requiring
students to explain their answers and disregarding a right response if the explanation is
wrong, a modified true-false exam can mitigate the effects of guessing.
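
One widely used correction-for-guessing formula (shown here for illustration; the source does not specify which correction method it has in mind) reduces the raw score by a penalty for wrong answers:

$$ S = R - \frac{W}{k - 1} $$

where R is the number of right answers, W is the number of wrong answers, and k is the number of choices per item. For a true-false item, k = 2, so the corrected score is simply R - W.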

The following are some general guidelines for creating true-false items.

1. Do not give a hint (inadvertently) in the body of the question.

Example:
The Philippines gained its independence in 1898 and therefore celebrated its
centennial year in 2000.
Obviously, the answer is FALSE because 100 years from 1898 is not 2000 but
1998.

2. Avoid using the words "always", "never", "often", and other adverbs that tend to
make statements either always true or always false.
Example:
Christmas always falls on a Sunday because it is a Sabbath day.
Statements that use the word "always" are almost always false. A test-wise
student can easily guess his way through a test like this and get a high score
even if he does not know anything about the subject.

3. Avoid long sentences as these tend to be “true”. Keep sentences short.

Example 1:
Tests need to be valid, reliable and useful, although it would require a great
amount of time and effort to ensure that tests possess these characteristics.
Notice that the statement is true. However, we are also not sure which part of
the sentence is deemed true by the student. It is just fortunate that in this case,
all parts of the sentence are true and hence, the entire sentence is true. The
following example illustrates what can go wrong in long sentences:

Example 2:
Tests need to be valid, reliable and useful since it takes very little amount of
time, money and effort to construct tests with these characteristics.
The first part of the sentence is true but the second part is debatable and may,
in fact, be false. Thus, both a "true" response and a "false" response could be
defended as correct.

4. Avoid trick statements with some minor misleading word, spelling anomaly,
misplaced phrase, etc. A test-wise student who does not know the subject matter
may detect this strategy and thus get the answer correct.

Example:
True or False. The Principle of our school is Mr. Michael Z. Alonzo.

The Principal's name may actually be correct, but since the word "Principal" is
misspelled, the entire sentence takes on a different meaning and the answer would
be false! This is an example of a tricky but utterly useless item.

5. Avoid quoting verbatim from reference materials or textbooks. This practice
sends the wrong signal to the students that it is necessary to memorize the
textbook word for word, and thus the acquisition of higher-level thinking skills is
not given due importance.

6. Avoid specific determiners or give-away qualifiers. Students quickly learn that
strongly worded statements are more likely to be false than true, for example,
statements with "never", "no", "all", or "always." Moderately worded statements
are more likely to be true than false, so statements with "many", "often",
"sometimes", "generally", "frequently", or "some" should likewise be avoided.

7. With true or false questions, avoid a grossly disproportionate number of either
true or false statements, or even patterns in the occurrence of true and false
statements.

3. Matching Type

A matching test item is made up of two lists, each of which might contain a
variety of words, terminology, pictures, phrases, or sentences. The student must match
options from one list with similar options from a second list. A matching exercise is, in
actuality, a set of linked multiple-choice items. More samples of a learner's knowledge
may generally be tested with matching rather than multiple choice items in a given
length of time. The matching item is very useful for assessing a learner's ability to detect
patterns and create connections between terms, components, words, phrases, clauses,
or symbols in one column and similar things in another. Matching decreases the
likelihood of guessing correct answers, especially when alternatives can be used
more than once. It also makes better use of testing time.
They are effective when you need to measure the learner’s ability to identify the
relationship or association between similar items. They work best when the course
content has many parallel concepts, such as:
o Terms and Definitions
o Objects or Pictures and Labels
o Symbols and Proper Names
o Causes and Effects
o Scenarios and Responses
o Principles and Scenarios to which they apply

The items in the first column are called premises and the answers in the second column
are the responses.
o Premises column – the column of items for which a match is sought
o Response column – the column from which to choose the match
By convention, the items in Column A are numbered and the items in Column B are
labelled with capital letters.

Figure 2. Sample matching type test items.


Source:https://theelearningcoach.com/elearning_design/writing-matching-test-items/

The following are some general guidelines for creating matching type items.

1. Two-part directions. Your clear directions at the start of each question need
two parts: 1) how to make the match and 2) the basis for matching the response
with the premise. You can also include whether items can be re-used, but often
pre-built templates don’t allow for this.

Example for exercise above: Drag each career name in Column B to the best
definition in Column A. No items may be used more than once.

2. Parallel content. Within one matching test item, use a common approach, such
as all terms and definitions or all principles and the scenarios to which they
apply.

3. Plausible answers. All responses in Column B should be plausible answers to
the premises in Column A. Otherwise, the test loses some of its reliability
because some answers will be "give-aways."

4. Clueless. Ensure your premises don’t include hints through grammar (like
implying the answer must be plural) or hints from word choice (like using the
term itself in a definition).
5. Unequal responses. In an ideal world, you should present more responses
than premises, so the remaining responses don’t work as hints to the correct
answer. This is not often possible when using a template.

6. Limited premises. Due to the capacity limitations of working memory, avoid a
long list of premises in the first column. A commonly cited guideline is to keep
the list down to six items; even fewer may be better, depending on the
characteristics of your audience.

7. One correct answer. Every premise should have only one correct response.
Obvious, but triple-check to make sure each response can only work for one
premise.

SUPPLY TEST (CONSTRUCTED-RESPONSE)

The term "supply item" comes from the fact that students are asked to give the solution
to a question rather than select one. There are occasions when assessments include
supply item questions with only one right answer. The student must provide a response in the
form of a word, phrase, or paragraph. The learner must organize their knowledge in a supply
type item. It requires the capacity to articulate oneself and is thus useful in assessing a
learner's overall comprehension of a topic.

Supply items are typically broken into four categories: short answer, completion,
restricted-response essay, and extended-response essay.

1. Short-answer. Open-ended questions that require pupils to develop a response are
known as short-answer questions. Short responses are created as direct answers to
questions, rather than as words supplied to complete sentences. This style is often used in
exams to test a topic's fundamental knowledge and comprehension (low cognitive
levels - declarative and procedural knowledge) before more in-depth assessment
questions are addressed. There is no standard format for short answer questions.
Items may ask students to complete a phrase, give a missing word, provide short
descriptive or qualitative replies, or supply illustrations with explanations.
The response is generally brief, ranging from a single word to a few lines. Students
frequently respond in bullet style. In this format, item statements are usually in the
interrogative form (see Table 1).

Table 1: Sample short-answer test

Stimulus: an interrogative statement (direct question); Response: a short phrase or statement

1. What is a four-sided polygon called? (Answer: quadrilateral)
2. What is the circumference of a circle with a diameter of 16.5 feet? (Answer: 51.81 feet)
3. What is the area of a circle with a radius of 5 feet? (Answer: 78.5 square feet)
4. What two shapes have all four equal sides? (Answer: square and rhombus)
5. What is a closed shape with straight lines? (Answer: polygon)
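
The numerical answers in items 2 and 3 follow from the circle formulas; the figures shown imply that the rounded value pi = 3.14 was used (an inference from the answers, not stated in the source):

$$ C = \pi d \approx 3.14 \times 16.5 = 51.81 \ \text{feet}, \qquad A = \pi r^{2} \approx 3.14 \times 5^{2} = 78.5 \ \text{square feet} $$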

Guidelines in constructing a good short answer question:


Writing short-answer items follows essentially the same guidelines as writing completion items.

1) State the item so that only one answer is correct


2) State the item so that the required answer is brief. Requiring a long response
would not be necessary and it can limit the number of items students can answer
within the allotted period of time.
3) Do not use questions verbatim from textbooks and other instructional materials.
This puts students who are not familiar with the materials at an undue
disadvantage, since the item becomes a test of memory rather than of understanding.
4) Designate units required for the answer. This frequently occurs when the
constructed response requires a definite unit to be considered correct. Without
designating the unit, a response may be rendered wrong because of a differing
mindset.
Example:
Poor: How much does the food caterer charge? (This item could be answered in
different ways, such as cost per head, per dish, per plate, or as a full package.)
Improved: How much does the food caterer charge per head?
5) State the item succinctly with words students understand. This is true for all
types of tests. The validity of a classroom-based test is at risk when students
cannot answer correctly, not because they do not know, but could be due to the
messy wording of the question.
Example:
Poor: As viewed by creatures from the earth, when does the blood moon appear
in the evening?
Improved: When does a blood moon appear?
6) Design short answer items which are the appropriate assessment of the learning
objective
7) Make sure the content of the short answer question measures knowledge
appropriate to the desired learning goal
8) Express the questions with clear wordings and language which are appropriate to
the student population
9) Ensure that the item clearly specifies how the question should be answered (e.g.,
should the student answer briefly and concisely using a single word or short
phrase? Does the question provide a specific number of blanks for students to
fill in?)
10) Write the instructions clearly so as to specify the desired knowledge and
specificity of response
11) Set the questions explicitly and precisely.
12) Direct questions are better than those which require completing the sentences.
13) For numerical answers, let the students know if they will receive marks for
showing partial work (process-based) or only for the results (product-based);
also indicate the importance of units.
14) Let the students know what your marking style is: is bullet-point format
acceptable, or does the answer have to be in essay format?
15) Prepare a structured marking sheet; allocate marks or part-marks for
acceptable answers.
16) Be prepared to accept other equally acceptable answers, some of which you may
not have predicted.

2. Fill in the Blank Test. This is a type of test in which pupils are given a sentence
with a blank and must fill it in with the most acceptable response. It primarily taps
the remembering level of Bloom's taxonomy, although an effectively structured question
can test higher-order thinking. This type of item has a stem and a blank space for
pupils to fill in the right answer (see Table 2). It is recommended that the blank in
this format be no more than two lines long.

Table 2: Sample fill-in-the-blank test

Stimulus: an incomplete statement; Response: a short word or phrase

1. A four-sided polygon is called a ______. (Answer: quadrilateral)
2. The circumference of a circle with a diameter of 16.5 feet is ______. (Answer: 51.81 feet)
3. The area of a circle with a radius of 5 feet is ______. (Answer: 78.5 square feet)
4. The two shapes that have all four equal sides are ______. (Answer: square and rhombus)
5. A closed shape with straight lines is a ______. (Answer: polygon)

3. Completion Type. Test items that are provided in the form of incomplete
statements are known as completion type test items. In a completion type of test, learners
are expected to enter a word, or a sequence of words, to complete a phrase or a series
of sentences. Completion items can be used to assess a variety of abilities and
information, although they are most commonly used to measure fundamental
knowledge.

Guidelines in the construction of Completion Items

1. When completing a statement, there should be only one right response. When
there is only one expected response, scoring is more efficient because an answer
key can be readily produced in advance. To avoid more than one right response,
the incomplete statement must be carefully worded. If you are testing for verbal
inventiveness, it is acceptable to allow a variety of appropriate replies; this
should, however, be stated clearly in the exam instructions. For instance, the
more synonyms students can give for the word costly, such as expensive,
exorbitant, and pricey, the more points they can earn. Objective scoring will
likely have to be modified in such cases. As another example, item 1 in Table 1,
as worded, may be open to more than one acceptable answer such as square,
rectangle, or parallelogram. To eliminate other terms it can be worded this way:
"A quadrilateral with four equal sides is called _____."

2. The blank should be at the end of the incomplete statement or towards the end.
This will give the reader a suitable and adequate background before they answer
the blank, preventing them from being puzzled.
3. Avoid giving unintentional hints about the correct answer. When a student
answers an item correctly without truly knowing the correct response, the
validity of his or her score is endangered; the score might reflect a
different type of skill than what was meant to be tested. This occurs, for
example, when a student who does not know the answer uses unintentional
grammatical hints, such as the indefinite article "an" before the blank,
which signals an answer beginning with a vowel sound.

ESSAY

This is a free-response test question. Unlike completion and short-answer items,
which are highly structured to elicit only one short correct answer, essay items are less
structured, allowing students to organize their responses freely and use their own writing
style to answer the question. The essay format measures students' abilities to organize,
integrate, and synthesize their knowledge, to use information to solve problems, and to be
original or innovative in their approaches to problem situations. It can be a composition
test or a definition-illustration test. This format, therefore, is appropriate for testing deep
understanding and reasoning. Some of the thinking processes involved in answering essay
questions are comparison, induction, deduction, abstracting, analyzing perspective,
decision-making, problem-solving, constructing support, and experimental inquiry
(Marzano et al., 1993). These involve higher-order thinking skills.

Types of Essay

1) Restricted-Response Essay. The content and response of a restricted-response
question are generally limited. The breadth of the issue to be discussed usually limits
the substance; constraints on the form of the answer are frequently included in the
question. Another technique for limiting replies in essay examinations is to ask questions
about specific issues.
Examples:
o Write a life sketch of Jose Rizal in 100 words.
o State any five definitions of education.

2) Extended-Response Essay. In an extended-response essay, no restriction is placed on
the student as to the points he or she will discuss or the type of organization he or she
will use. Teachers phrase these questions so as to give students the maximum possible
freedom to determine the nature and scope of the response and the way they will answer,
provided the response stays related to the topic and is completed within the stipulated
time frame.

Examples:
o Global warming is the next key to disaster. Explain.
o Do children need to go to school? Support your answer.

Guidelines in constructing an essay test as suggested by Miller, Linn & Gronlund (2009):

1. Restrict the use of essay questions to those learning outcomes that cannot be measured
satisfactorily by objective items. Objective items cannot measure such important skills as
the ability to organize, integrate, and synthesize ideas while showing one's creativity in
writing style. The essay format encourages and challenges students to engage in higher-
order thinking skills instead of simple rote memorization of facts and remembering
inconsequential details.
2. Construct questions that will call forth the skills specified in the learning standards. A
review of learning standards in school curricula will show that they range from
knowledge to deep understanding. The performance standards require the learners to
demonstrate the application of principles, analysis of experimental findings, evaluation
of results, and creation of new knowledge, and these are explicitly stated in terms of the
expected outcomes at every grade level. The essay questions to be constructed then
should make the students model how they are to perform the thinking processes.
3. Phrase the question so that the student's task is clearly defined. Restricted-response
essay questions especially state the specific task to be done in writing. As much
as possible, the students should interpret the question in the same way according to
what the teacher expects through the specifications in the question.
4. Indicate an approximate time limit for each question. This is especially important when
the test combines objective and non-objective formats, such as when essay questions are
included. Knowing how much time is allotted to each question helps students budget their
time so they do not spend all of it on the first question and consequently miss the others.
5. Avoid the use of optional questions. Some teachers have the practice of allowing the
students to select one or two essay questions from a set of five questions. Some
disadvantages of this practice may include: not being able to use the same basis for
reporting test results, or students being able to prepare through memorization for those
they will likely choose.
6. Plan what mental processes are to be tested before writing the test (the student's
analytical skills? knowledge? ability to synthesize?).
7. Use essay questions to test the students' ability to organize information.
8. Use keywords to phrase your essay questions (for example: compare, explain, predict).
9. Focus each essay question on only one issue at a time.
10. Inform the test taker that questions will be graded on the strength of their evidence,
presentation, and organization of thoughts on an issue and not on the basis of the
position taken on an issue.

WRITING GOOD MULTIPLE CHOICE TEST QUESTIONS

Multiple choice test questions, also known as items, can be an effective and efficient way to
assess learning outcomes. Multiple choice test items have several potential advantages:

Versatility: Multiple choice test items can be written to assess various levels
of learning outcomes, from basic recall to application, analysis, and
evaluation. Because students are choosing from a set of potential answers,
however, there are obvious limits on what can be tested with multiple choice
items. For example, they are not an effective way to test students’ ability to
organize thoughts or articulate explanations or creative ideas.

Reliability: Reliability is defined as the degree to which a test consistently
measures a learning outcome. Multiple choice test items are less susceptible to
guessing than true/false questions, making them a more reliable means of
assessment. Reliability is enhanced when the number of MC items focused
on a single learning objective is increased. In addition, the objective scoring
associated with multiple choice test items frees them from problems with
scorer inconsistency that can plague the scoring of essay questions.

Validity: Validity is the degree to which a test measures the learning outcomes
it purports to measure. Because students can typically answer a multiple
choice item much more quickly than an essay question, tests based on multiple choice items can
typically focus on a relatively broad representation of course material, thus increasing the
validity of the assessment.
The key to taking advantage of these strengths, however, is construction of good multiple choice
items.
A multiple choice item consists of a problem, known as the stem, and a list of suggested
solutions, known as alternatives. The alternatives consist of one correct or best alternative, which
is the answer, and incorrect or inferior alternatives, known as distractors.

CONSTRUCTING AN EFFECTIVE STEM

1. The stem should be meaningful by itself and should present a definite problem. A stem
that presents a definite problem allows a focus on the learning outcome. A stem that does
not present a clear problem, however, may test students' ability to draw inferences from
vague descriptions rather than serving as a more direct test of students' achievement of
the learning outcome.
2. The stem should be negatively stated only when significant learning outcomes require
it. Students often have difficulty understanding items with negative phrasing (Rodriguez
1997). If a significant learning outcome requires negative phrasing, such as identification
of dangerous laboratory or clinical practices, the negative element should be emphasized
with italics or capitalization.

Use negatively stated stems sparingly and when using negatives such as NOT, underline
or bold the print.
1. Which of the following is not a prime number?
A. 11
B. 21
C. 31
D. 41

BETTER STEM
2. Which of the following is NOT a prime number?
A. 11
B. 21
C. 31
D. 41
3. The stem should be a question or a partial sentence. A question stem is preferable
because it allows the student to focus on answering the question rather than holding the
partial sentence in working memory and sequentially completing it with each alternative
(Statman 1988). The cognitive load is increased when the stem is constructed with an
initial or interior blank, so this construction should be avoided.

4. Eliminate excessive wording and irrelevant information (reduce wordiness).

Sheldon developed a highly controversial theory of personality based on the
body type and temperament of the individual. Which of the following is a
criticism of Sheldon's work?
A. He was influenced too much by Freudian psychoanalysis.
B. His ratings of physique and temperament were not independent.
C. He failed to use an empirical approach.
D. His research sample was improperly selected.

BETTER
Which of the following is a criticism of Sheldon's work?
A. He was influenced too much by Freudian psychoanalysis.
B. His ratings of physique and temperament were not independent.
C. He failed to use an empirical approach.
D. His research sample was improperly selected.

5. Avoid repetitive words.

The Philippines is called “Gateway to the East” because
A. It is an archipelago
B. It is located in Southeast Asia
C. It has a strategic location
D. It is rich in resources

BETTER
The Philippines is called “Gateway to the East” because it is ___.
A. an archipelago
B. located in Southeast Asia
C. in a strategic location
D. rich in resources
6. When using incomplete statements, avoid beginning with the blank space.

____________________ is the least severe form of behavior disorder.
A. Neurasthenia
B. Neurosis
C. Panic
D. Psychosis

BETTER
The least severe form of behavior disorder is ____________________.
A. Neurasthenia
B. Neurosis
C. Panic
D. Psychosis

7. Put spaces between items.

11. Eyewitness accounts are examples of _________ sources.
A. Court
B. Primary
C. Secondary
D. All of the above

CONSTRUCTING EFFECTIVE ALTERNATIVES

8. All alternatives should be plausible. The function of the incorrect alternatives is to
serve as distractors, which should be selected by students who did not achieve the
learning outcome but ignored by students who did achieve it. Alternatives that are
implausible do not serve as functional distractors and thus should not be used.
Common student errors provide the best source of distractors.

9. Alternatives should be stated clearly and concisely. Items that are excessively wordy
assess students' reading ability rather than their attainment of the learning objective.
10. Alternatives should be mutually exclusive. Alternatives with overlapping content may
be considered “trick” items by test-takers, excessive use of which can erode trust and
respect for the testing process.

11. Alternatives should be homogeneous in content. Alternatives that are heterogeneous
in content can provide clues to students about the correct answer.

12. Is the grammar in each option consistent with the stem?

These four pictures show
A. Heat and light
B. Heating and sound
C. Light and sound
D. Sounds and colourful

BETTER
These four pictures show
A. Heat and light
B. Heat and sound
C. Light and sound
D. Sounds and color

13. When possible, present alternatives in some logical order.

The number of photoreceptors in the retina of each human is about
A. 115 million
B. 5 million
C. 65 million
D. 35 billion

BETTER
The number of photoreceptors in the retina of each human is about
A. 5 million
B. 35 million
C. 65 million
D. 115 million

14. Numbers in order.

Two dice are tossed. How many possible outcomes are there?
A. 12
B. 24
C. 36
D. 42
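
As a quick arithmetic check on the sample item above (standard counting, not worked out in the source): each die can land in 6 ways, so the number of ordered outcomes for two dice is

$$ 6 \times 6 = 36, $$

which corresponds to option C.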

15. Alphabetical order.

Who is the Father of Modern Chemistry?
A. Andre Ampere
B. Robert Brown
C. Albert Einstein
D. Antoine Lavoisier

16. The alternatives should be presented in a logical order (e.g., alphabetical or
numerical) to avoid a bias toward certain positions.

Dates in chronological order.

In what year did the Japanese bomb the American base at Pearl Harbor?
A. 1939
B. 1940
C. 1941
D. 1942
17. Lines from a passage should be arranged in the order in which they appear in the
passage; if nothing else applies, arrange options by length (pyramiding). This is not
vital and should be the last consideration.

Which observation is NOT in quantitative form?
A. Ten seeds germinated on the second day.
B. The book weighs fifteen grams.
C. Bubbles were formed on the surface.
D. Ten grams of salt was left.

BETTER
Which observation is NOT in quantitative form?
A. Ten grams of salt was left.
B. The book weighs fifteen grams.
C. Bubbles were formed on the surface.
D. Ten seeds germinated on the second day.
18. Alternatives should be free from clues about which response is correct. Sophisticated
test-takers are alert to inadvertent clues to the correct answer, such as differences in
grammar, length, formatting, and language choice among the alternatives. It is therefore
important that alternatives be consistent in these respects.

19. The alternatives “all of the above” and “none of the above” should not be
used. When “all of the above” is used as an answer, test-takers who can identify more
than one alternative as correct can select the correct answer even if unsure about other
alternative(s). When “none of the above” is used as an alternative, test-takers who can
eliminate a single option can thereby eliminate a second option. In either case, students
can use partial knowledge to arrive at a correct answer.

20. Avoid complex multiple choice items, in which some or all of the alternatives consist of
different combinations of options. As with “all of the above” answers, a sophisticated
test-taker can use partial knowledge to achieve a correct answer.

BLOOM'S TAXONOMY

Bloom's taxonomy is a set of three hierarchical models used to classify educational
learning objectives into levels of complexity and specificity. The three lists cover
the learning objectives in the cognitive, affective and sensory domains.

The goal of an educator using Bloom's taxonomy is to encourage higher-order thought in their
students by building up from lower-level cognitive skills. Behavioral and cognitive learning
outcomes are given to highlight how Bloom's taxonomy can be incorporated into larger-scale
educational goals or guidelines.
Examples of Student Activities and Verbs for Bloom’s Cognitive Levels
REFERENCES:

o De Guzman, Estefania S., and Adamos, Joel L. (2015). ASSESSMENT IN LEARNING 1.
ADRIANA Publishing Company, Incorporated. Cubao, Quezon City, Philippines.
o Macarandang, Mercedes A., and Vega, Violeta A. (2003). ASSESSMENT OF LEARNING 1.
Book Atbp. Publishing Corp.
o Navarro, Rosita L., Santos, Rosita G., and Corpuz, Brenda B. (2019). ASSESSMENT IN
LEARNING 1, Fourth Edition. LORIMAR Publishing Inc., Quezon City.
o Oredina, Nora A. (2019). AUTHENTIC ASSESSMENT METHODOLOGIES IN
MATHEMATICS. Nieve Publishing Co. Ltd.
