Suskie Handouts
Critical thinking is a widely used term with no clear consensus on its meaning. It can include any or
all of the following thinking skills.
Application is the ability to use knowledge and understanding in a new context. It includes the
abilities to understand cause-and-effect relationships, understand the meaning of various logical propositions,
criticize literary works, and apply scientific and economic principles to everyday life, provided that these
relationships, propositions, works and principles are new to the student. Many word problems in mathematics
require skill in application.
Examples:
Locate online resources on a particular topic or issue.
Apply scientific and economic principles to everyday life.
Analysis is the ability to break a complex concept apart to see the relationships of its components.
Students who can analyze can identify the elements, relationships, and underlying principles of a complex
process. Analysis is not merely understanding the components of a process or concept explained in class; that
would be simple understanding. Students who can analyze can understand the structure of concepts they
haven’t seen before. They can think holistically, make a case, discover the underlying principles of a
relationship, and understand organizational structure. They can integrate their learning, relating what they’ve
just learned to what they already know.
Examples:
Explain chemical reactions not explicitly introduced in prior study.
Explain the impact of the Korean War on U.S.-Far East relations today.
Analyze errors.
Analyze perspectives and values.
Explain why a research paper is structured the way it is.
Synthesis is the ability to put what one has learned together in a new, original way. It includes the
ability to theorize, generalize, reflect, construct hypotheses, and suggest alternatives.
Examples:
Write a poem that uses imagery and structure typical of Romantic poets.
Explain what is likely to happen when Chemicals A and B are combined and justify the
explanation.
Use writing and research skills to write a term paper.
Design and conduct a research study.
Design a community service project.
Create a work of art that uses color effectively.
Evaluation, problem-solving, and decision-making skills are the abilities to make an informed
judgment about the merits of something the student hasn’t seen before. These skills include the abilities to
conduct research, make appropriate choices, solve problems with no single correct answer, and make and
justify persuasive arguments. As with analysis, evaluation does not consist of merely understanding and
restating arguments that have been presented in coursework; that would be simple comprehension.
Examples:
Judge the effectiveness of the use of color in a work of art.
Evaluate the validity of information on a particular Web site.
Research, identify and justify potential careers.
Choose the appropriate mathematical procedure for a given problem.
Identify an audit problem and recommend ways to address it.
Creative thinking skills are the abilities to invent, generate new ideas, be flexible, take intellectual
risks, and generate new ways of viewing a situation.
Example:
Develop modifications to a system that improve its performance.
Metacognition is learning how to learn and how to manage one’s own learning by understanding how
one learns. Because, as noted earlier, knowledge is growing at an exponential pace, there is increasing
recognition that we must not only educate our students with what we know today but also prepare them for a
lifetime of learning, often on their own. Metacognition is thus becoming an increasingly valued skill.
Metacognition includes the abilities to use efficient learning techniques, discuss and evaluate one’s problem-
solving strategies, form efficient plans for completing work, and evaluate the effectiveness of one’s actions.
Metacognition is often taught and assessed by having students write reflections on what and how they’ve
learned.
Examples:
Develop a personal study strategy that makes the most of one’s learning style.
Reflect on one’s writing process.
Reflect on one’s completed work.
Other productive dispositions or habits of mind include the abilities to work independently, set
personal goals, persevere, organize, be clear and accurate, visualize, be curious, and be open-minded to new
ideas.
Examples:
Develop and use effective time management skills.
Follow directions correctly.
Critical thinking skills can also include the abilities to seek truth, clarity and accuracy, distinguish
facts from opinions, and have a healthy skepticism about arguments and claims.
Suggested Readings
Anderson, L. W. (Ed.), Krathwohl, D. R. (Ed.), & Bloom, B. S. (2000). Taxonomy for learning, teaching, and
assessing: A revision of Bloom’s taxonomy of educational objectives. New York: Longman.
Angelo, T. A. (1991). Ten easy pieces: Assessing higher learning in four dimensions. In Classroom research:
	Early lessons from success (New Directions for Teaching and Learning, No. 46). San Francisco: Jossey-
	Bass.
Biggs, J. (2001). Assessing for quality in learning. In Suskie, L. (Ed.), Assessment to promote deep learning.
Washington: American Association for Higher Education.
Erwin, T. D. (2000). The NPEC sourcebook on assessment, volume 1: Definitions and assessment methods
for critical thinking, problem solving, and writing [Electronic version]. Washington: National Center
for Education Statistics. Retrieved May 1, 2003, from http://www.nces.ed.gov/npec/evaltests/
Facione, P. A. (1998). Critical thinking: What it is and why it counts. Retrieved May 1, 2003, from
http://www.calpress.com/pdf_files/what&why.pdf
Greenwood, A. (Ed.). (1994). The national assessment of college student learning: Identification of the skills
to be taught, learned, and assessed: A report on the proceedings of the second study design workshop.
Research and Development Report NCES 94-286. Washington: U.S. Department of Education, Office
of Educational Research and Improvement, National Center for Education Statistics.
Gronlund, N. E. (1999). How to write and use instructional objectives (6th ed.). Upper Saddle River, NJ:
Prentice Hall.
Herman, J. L., Aschbacher, P. R., & Winters, L. (1992). A practical guide to alternative assessment.
Alexandria, VA: Association for Supervision and Curriculum Development.
Marzano, R., Pickering, D., & McTighe, J. (1993). Assessing student outcomes: Performance assessment using
the dimensions of learning model. Alexandria, VA: Association for Supervision and Curriculum
Development.
Tittle, C. K., Hecht, T., & Moore, P. (1993). Assessment theory and research for classrooms: From
‘taxonomies’ to constructing meaning in context. Educational Measurement: Issues and Practice,
12(4), 13-19.
Items in boldface are particularly suitable for assessing student learning within a course.
Direct Evidence of What Students Are Learning
• Ratings by cooperative education/internship supervisors of student skills
• Employer ratings of satisfaction with the program and employee skills
• Pass rates on appropriate licensure/certification exams (e.g., Praxis, NLN) or exit exams (e.g., MFATs, Test of Critical Thinking Ability) that assess key learning outcomes
• “Blind” or externally-scored rubric (rating scale) scores on “capstone” projects such as research papers, class presentations, exhibitions, or performances
• Portfolios of student work
• Rubric (rating scale) scores for written work, oral presentations, or performances
• Scores on locally-designed multiple choice and/or essay tests, accompanied by test “blueprints” describing what the test assesses
• Score gains between entry and exit on published or local tests or writing samples
• Electronic discussion threads
• Student reflections on what they have learned over the course of the program
• Student reflections on their values, attitudes, and beliefs, if developing those are intended outcomes of the course or program
• Student publications and conference presentations

Indirect Evidence of Student Learning
(Signs that Students Are Probably Learning, But Exactly What They Are Learning is Less Clear)
• Graduate program admission rate
• Graduate program success (completion) rate
• Quality/reputation of graduate and professional programs into which students are accepted
• Placement into career positions
• Honors, awards, and scholarships awarded to students and graduates
• Transcript analyses
• List of the major learning outcomes of the program, distributed to all students in the program
• Percent of courses whose syllabi include a list of the major learning outcomes of the course
• Percent of courses whose syllabi state learning outcomes that include thinking skills (not just simple understanding of facts and principles)
• Average proportion of final grade based on assessments of thinking skills
• Ratio of paper-and-pencil tests to performance assessments
• Test “blueprints” (outlines of the concepts and skills covered on tests)
• Documentation of the match between course/program objectives and assessments
• Percent of freshman-level classes taught by full professors
• Number or percent of courses with service learning opportunities
• Number or percent of courses with collaborative learning opportunities
• Number or percent of courses taught using culturally-responsive teaching techniques
• Percent of class time spent in active learning
• Number of student hours spent in community service activities
• Percent of student majors participating in relevant co-curricular activities (e.g., club in discipline)
• Voluntary attendance at intellectual/cultural events germane to the course or program

Insights into Why Students Are or Aren’t Learning
• Length of time to degree
• Student/alumni satisfaction, collected through surveys, exit interviews, or focus groups
• Student feedback via Angelo & Cross’s Classroom Assessment Techniques
• Course portfolios
• Library holdings in the program’s discipline(s)
• Expenditures for faculty professional development
• Department-sponsored opportunities for faculty professional development
• Number and/or dollar value of grants awarded to faculty whose purpose is improved student learning

Evidence of Other Aspects of Academic Quality
• Specialized accreditation
• Retention and graduation rates
• Percent of students in the program who are students of color
• Percent of faculty in the program who are faculty of color
• Cost and cost-effectiveness of the program (e.g., budget, student/faculty ratios, average class size)
• Number and/or dollar value of grants awarded to faculty
• Number and/or dollar value of gifts to the department
A Rating Scale Rubric for an Oral Presentation
For each statement, check one box: Strongly Agree, Agree, Disagree, or Strongly Disagree.

The presenter…
Clearly stated the purpose of the presentation. □ □ □ □
Was well organized. □ □ □ □
Was knowledgeable about the subject. □ □ □ □
Answered questions authoritatively. □ □ □ □
Spoke clearly and loudly. □ □ □ □
Maintained eye contact with the audience. □ □ □ □
Appeared confident. □ □ □ □
Adhered to time constraints. □ □ □ □
Had main points that were appropriate to the central topic. □ □ □ □
Accomplished the stated objectives. □ □ □ □
Adapted with permission from a rubric used by the Department of Health Science, Towson
University
A Rating Scale Rubric for an Information Literacy Assignment
Please indicate the student’s skill in each of the following respects, as evidenced by this
assignment, by checking the appropriate box. If this assignment is not intended to elicit a
particular skill, please check the “N/A” box.
The rating scale for each item includes Outstanding (A), Adequate (C), Marginally Adequate (D),
Inadequate (F), and N/A.

1. Identify, locate, and access sources of information.
2. Critically evaluate information, including its legitimacy, validity, and appropriateness.
3. Organize information to present a sound central idea supported by relevant material in a logical order.
4. Use information to answer questions and/or solve problems.
5. Clearly articulate information and ideas.
6. Use information technologies to communicate, manage, and process information.
7. Use information technologies to solve problems.
8. Use the work of others accurately and ethically.
9. What grade are you awarding this assignment?
10. If you had to assign a final course grade for this student today, what would it be?
An Analytic Rubric for a Written Paper (100 points)

Introduction (10 points)
	10: The introduction smoothly pulls the reader into the topic, is organized, presents the main argument clearly, and states the author’s views.
	8: The introduction is organized but does not adequately present the main argument or does not state the author’s views.
	7: The introduction presents the main argument and the author’s views but is disorganized and does not flow smoothly.
	5: The introduction is disorganized and difficult to follow. The main argument and the author’s views are not introduced.

Content (20 points)
	20: Information is presented clearly, completely, and accurately across all sections. At least 3 major sections; at least 1 major section has 2-3 subsections.
	18: Information is unclear and difficult to understand in 1 section.
	16: Information is unclear and difficult to understand in 2-3 sections.
	12: The paper is unclear and difficult to understand across 4 or more sections.

Organization (20 points)
	20: Organization is clear; good framework. Headers, preview paragraphs, topic sentences, and transitions aid in understanding main points. Information is presented logically.
	18: Organization is unclear in 1 section (unfocused paragraphs, poor topic sentences, poor transitions). All other sections are logically organized.
	16: Organization is unclear in 2-3 sections OR headers and preview paragraphs or sentences are missing.
	12: Organization is unclear in 4 or more sections.

Conclusion/Original Thought (20 points)
	20: Specific ideas for improving research or other ideas are presented in an organized manner with logical rationales.
	18: Specific ideas are presented but the rationales for 1 idea may be weak.
	16: Ideas are presented but in a vague, generic format OR rationales for 2 or more ideas are weak.
	12: Fewer than 3 original ideas related to the topic are presented OR all ideas are not well explained.

Writing Style (10 points)
	10: Tone is professional, vocabulary and syntax are mature, and easy-to-understand terms are used throughout the paper.
	7: Syntax or vocabulary is complex, awkward, or filled with jargon in 1-2 sections of the paper OR words are used incorrectly in 1-2 sections of the paper.
	5: Syntax or vocabulary is complex, awkward, or filled with jargon in 3-4 sections of the paper OR words are used incorrectly in 3-4 sections of the paper.
	3: Writing style makes more than 4 sections of the paper difficult to read and understand.

Writing Use/Mechanics (10 points)
	10: The paper is free of spelling, syntax, formatting, and punctuation errors.
	7: The paper has fewer than 5 spelling, punctuation, formatting, or syntax errors.
	5: The paper has 6-15 spelling, punctuation, formatting, or syntax errors.
	3: More than 16 errors across the paper make it difficult to follow.

APA Rules (10 points)
	10: All APA rules are followed for citations, headers, numbers, series, quotes, references, etc.
	7: Fewer than 3 violations of APA rules, or 1-2 missing or incorrect citations and references.
	5: 4-10 violations of APA rules and/or 3-5 missing or incorrect citations and references.
	3: 11 or more violations of APA rules and/or 6 or more missing or incorrect citations and references.
Adapted with permission from a rubric used by the Department of Communication Sciences & Disorders, Towson University.
A Descriptive Rubric
for a Slide Presentation on Findings from Research Sources
Organization
	Well Done (5): Clearly, concisely written. Logical, intuitive progression of ideas & supporting information. Clear & direct cues to all information.
	Satisfactory (4-3): Logical progression of ideas & supporting information. Most cues to information are clear and direct.
	Needs Improvement (2-1): Vague in conveying viewpoint and purpose. Some logical progression of ideas & supporting information, but cues are confusing or flawed.
	Incomplete (0): Lacks a clear point of view and logical sequence of information. Cues to information are not evident.

Persuasiveness
	Well Done (5): Motivating questions & advance organizers convey main idea. Information is accurate.
	Satisfactory (4-3): Includes persuasive information.
	Needs Improvement (2-1): Includes persuasive information with few facts.
	Incomplete (0): Information is incomplete, out of date, and/or incorrect.

Introduction
	Well Done (5): Presents overall topic. Draws in audience with compelling questions or by relating to audience’s interests or goals.
	Satisfactory (4-3): Clear, coherent, and related to topic.
	Needs Improvement (2-1): Some structure but does not create a sense of what follows. May be overly detailed or incomplete. Somewhat appealing.
	Incomplete (0): Does not orient audience to what will follow.

Clarity
	Well Done (5): Readable, well-sized fonts. Italics, boldface, and indentations enhance readability. Text is appropriate length. Background and colors enhance readability.
	Satisfactory (4-3): Sometimes fonts are readable, but in a few places fonts, italics, boldface, long paragraphs, color, or background detract.
	Needs Improvement (2-1): Overall readability is difficult with lengthy paragraphs, too many fonts, dark or busy background, overuse of boldface, or lack of appropriate indentations.
	Incomplete (0): Text is very difficult to read. Long blocks of text, small fonts, inappropriate colors, or poor use of headings, indentations, or boldface.

Layout
	Well Done (5): Aesthetically pleasing. Contributes to message with appropriate use of headings and white space.
	Satisfactory (4-3): Uses white space appropriately.
	Needs Improvement (2-1): Shows some structure but is cluttered, busy, or distracting.
	Incomplete (0): Cluttered and confusing. Spacing and headings do not enhance readability.
Adapted with permission from a rubric developed by Patricia Ryan, Lecturer, Department of Reading, Special Education, and
Instructional Technology, Towson University
Critical Thinking Holistic Scoring Guide
by Peter and Noreen Facione
Source: http://www.calpress.com/rubric.html
Examples of Assignments
Beyond Essays, Term Papers, and Research Reports
4 points Determine the value of t needed to find a confidence interval of a given size.
4 points Given a proportion and a sample size, decide if the normal distribution can be
used instead of the binomial.
4 points Calculate a “sample error” or “error margin” for a proportion.
4 points Understand the effect of p on the standard error of a proportion.
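As a quick reminder of the standard results these items draw on (writing p̂ for the sample proportion, n for the sample size, and z_{α/2} for the normal critical value, none of which are spelled out in the items themselves), the standard error and margin of error of a proportion are

\[
SE(\hat{p}) = \sqrt{\frac{\hat{p}(1-\hat{p})}{n}}, \qquad
E = z_{\alpha/2}\sqrt{\frac{\hat{p}(1-\hat{p})}{n}}.
\]

Because p̂(1 − p̂) peaks at p̂ = 0.5, the standard error is largest when the proportion is near one half; a t critical value replaces z when the confidence interval is for a mean with an unknown population standard deviation, and the normal approximation to the binomial is commonly considered adequate when np and n(1 − p) are both at least about 5 to 10.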
2. If you have a chance to speak to your friends about this course, what will you say?
3. What suggestions would you give other students on ways to get the most out of this course?
7. What was the one most useful or meaningful thing you learned in this course?
9. In what area did you improve the most? What improvement(s) did you make?
10. What one assignment for this course was your best work? What makes it your best work? What did you learn
by creating it? What does it say about you as a writer/teacher/biologist/sociologist/etc.?
11. Describe something major that you’ve learned about yourself in this course.
12. List three ways you think you have grown or developed as a result of this course.
14. What have you learned in this course that will help you continue to grow as a
writer/teacher/biologist/sociologist/etc.?
16. What goals did you set for yourself in this course? How well did you accomplish them?
17. If you were to start this course over, what would you do differently next time?
18. What strategies did you use to learn the material in this course? Which were most effective? Why?
20. If you could change any one of the assignments you did for this course, which one would it be? What would
you change about it?
21. What problems did you encounter in this course? How did you solve them?
22. What one question about this course is uppermost on your mind?
23. What more would you like to learn about this subject/discipline?
24. In what area would you like to continue to strengthen your knowledge or skills?
25. Write one goal for next semester and tell how you plan to reach it.
Linda Suskie
June 19, 2003
SOME USEFUL REFERENCES ON ASSESSMENT
American Association for Higher Education. (n.d.). 9 principles of good practice for assessing student
	learning. Retrieved from http://www.aahe.org/assessment/principl.htm
Anderson, R. S., & Speck, B. W. (Eds.) (1998, Summer). Changing the way we grade student
performance: Classroom assessment and the new learning paradigm. New Directions for
Teaching and Learning, No. 74. San Francisco: Jossey-Bass.
Andrade, H. G. (2000). Using rubrics to promote thinking and learning. Educational Leadership, 57(5).
Retrieved June 2, 2003, from http://ascd.org/publications/ed_lead/200002/andrade.html
Angelo, T. A., & Cross, K. P. (1993). Classroom assessment techniques: A handbook for college teachers
(2nd ed.). San Francisco: Jossey-Bass.
Astin, A. W. (1996). Assessment for excellence: The philosophy and practice of assessment and
evaluation in higher education. Portland: Oryx and American Council on Education.
Banta, T. W., Lund, J. P., Black, K. E., & Oblander, F. W. (1996). Assessment in practice: Putting
principles to work on college campuses. San Francisco: Jossey-Bass.
Boud, D. (1995). Enhancing learning through self assessment. London, United Kingdom: Kogan Page.
Cashin, W. E. (1987). Improving essay tests (IDEA Paper No. 17). Manhattan, KS: Kansas State
University, Center for Faculty Evaluation and Development.
Catholic Community Forum. (2000, November 18). Assessing metacognition. Ballwin, MO: Author.
Retrieved June 2, 2003, from http://www.catholic-
forum.com/catholicteacher/english_METACOGNITION.htm
Chicago Board of Education. (2000). How to create a rubric from scratch: A guide for rugged
individualists. Chicago, IL: Author. Retrieved June 2, 2003, from
http://intranet.cps.k12.il.us/Assessments/Ideas_and_Rubrics/Create_Rubric/create_rubric.html
Chicago Board of Education. (2000). Performance assessment tasks. Chicago, IL: Author. Retrieved June
2, 2003, from
http://intranet.cps.k12.il.us/Assessments/Ideas_and_Rubrics/Assessment_Tasks/assessment_tasks
.html
Coalition of Essential Schools. (n.d.). How to analyze a curriculum unit or project and provide the
scaffolding students need to succeed. Oakland, CA: Author. Retrieved June 2, 2003, from
http://www.essentialschools.org/cs/resources/view/ces_res/85
Coalition of Essential Schools. (2002, May 14). Overview of alternative assessment approaches.
Oakland, CA: Author. Retrieved June 2, 2003, from
http://www.essentialschools.org/cs/resources/view/ces_res/127
Costa, A. L., & Kallick, B. (2000). Getting into the habit of reflection. Educational Leadership, 57(7), 60-
62.
Diamond, R. M. (1998). Designing and assessing courses and curricula: A practical guide. San
Francisco: Jossey-Bass.
ERIC Clearinghouse on Assessment and Evaluation. (1999, December 23). Scoring rubrics – Definitions
& construction. Washington, DC: Author. Retrieved June 2, 2003, from
http://ericae.net/faqs/rubrics/scoring_rubrics.htm
Gardiner, L. F., Anderson, C., & Cambridge, B. L. (Eds.) (1997). Learning through assessment: A
resource guide for higher education. Washington: American Association for Higher Education.
Haladyna, T. M. (1997). Writing test items to evaluate higher order thinking. Boston, MA: Allyn &
Bacon.
Herman, J. L., Aschbacher, P. R., & Winters, L. (1992). A practical guide to alternative assessment.
Alexandria, VA: Association for Supervision & Curriculum Development.
Huba, M. E., & Freed, J. E. (2000). Learner-centered assessment on college campuses: Shifting the focus
from teaching to learning. Boston: Allyn & Bacon.
Jones, E., & Voorhees, R., with Paulson, K. (2002). Defining and assessing learning: Exploring
competency-based initiatives. Washington, DC: National Center for Education Statistics.
Retrieved June 2, 2003, from http://nces.ed.gov/pubs2002/2002159.pdf
Mertler, C. A. (2001). Designing scoring rubrics for your classroom. Practical Assessment, Research, &
Evaluation, 7(25). Retrieved June 2, 2003, from http://ericae.net/pare/getvn.asp?v=7&n=25
Middle States Commission on Higher Education. (2003). Student learning assessment: Options and
resources. Philadelphia: Author.
Moskal, B. M. (2003). Recommendations for developing classroom performance assessments and scoring
rubrics. Practical Assessment, Research & Evaluation, 8(14). Retrieved June 2, 2003, from
http://ericae.net/pare/getvn.asp?v=8&n=14
Moskal, B. M. (2000). Scoring rubrics: What, when and how? Practical Assessment, Research, &
Evaluation, 7(3). Retrieved June 2, 2003, from http://ericae.net/pare/getvn.asp?v=7&n=3
Moskal, B. M., & Leydens, J. A. (2000). Scoring rubric development: Validity and reliability. Practical
Assessment, Research & Evaluation, 7(10). Retrieved June 2, 2003, from
http://ericae.net/pare/getvn.asp?v=7&n=10
Palomba, C. A., & Banta, T. W. (1999). Assessment essentials: Planning, implementing, improving. San
Francisco: Jossey-Bass.
Pickett, N. (1999, March 31). Guidelines for rubric development. San Diego, CA: San Diego State
University, Educational Technology Department. Retrieved June 2, 2003, from
http://edweb.sdsu.edu/triton/july/rubrics/Rubric_Guidelines.html
Prus, J., & Johnson, R. (1994). A critical review of student assessment options. In Bers, T. H., & Mittler,
M. L. (Eds.), Assessment and testing: Myths and realities (pp. 69-83) (New Directions for
Community Colleges, No. 88). San Francisco, CA: Jossey-Bass.
Relearning by Design, Inc. (2000). Rubric sampler. Ewing, NJ: Author. Retrieved June 2, 2003, from
http://www.relearning.org/resources/PDF/rubric_sampler.pdf
Rudner, L. M., & Boston, C. (1994). Performance assessment. ERIC Review, 3(1), 1-12.
Suskie, L. (2000, May). Fair assessment practices: Giving students equitable opportunities to demonstrate
learning. AAHE Bulletin, 52(9), 7-9.
Taggart, G. L., Phifer, S. J., Nixon, J. A., & Wood, M. (Eds.). (1998). Rubrics: A handbook for
construction and use. Nevada City, CA: Performance Learning Systems.
Walvoord, B. E., & Anderson, V. J. (1998). Effective grading: A tool for learning and assessment. San
Francisco, CA: Jossey-Bass.
Weaver, R. L., & Cotrell, H. W. (1985, Fall/Winter). Mental aerobics: The half-sheet response. Innovative
Higher Education, 10, 23-31.