Outcome Based Assessment 2003
Ahmet S. Yigit
Office of Academic Assessment
College of Engineering and Petroleum
Kuwait University
Fall 2003
Why Assessment?
"We give grades, don't we? That's assessment. Isn't that enough?"
"We don't have enough time to start another new project."
"'Outcomes,' 'Goals,' 'Objectives' - all this is educational jargon!"
"Isn't this another way of evaluating us, of finding fault with our
work?"
"Find a standardized test or something, and move on to more
important things."
"You want us to lower standards? Have us give more A's and B's?"
"Our goals can't be quantified like some industrial process."
"Let's just wait until the (dept chair, dean, president, etc.) leaves, and
it'll go away."
Why Assessment?
Continuous improvement
Total Quality Management applied in
educational setting
Accreditation/External evaluation
Competition
Industry push
Learning needs
Recent Developments
Fundamental questions raised (1980s)
How well are students learning?
How effectively are teachers teaching?
[Diagram: Then vs. Now]
Then: Process → Output
Now: Desired output → Process → Output → Measurement → Comparison (feedback to Process)
What is Assessment?
An ongoing process aimed at understanding and
improving student learning. It involves making
our expectations explicit and public; setting
appropriate criteria and high standards for
learning quality; systematically gathering,
analyzing, and interpreting evidence to determine
how well performance matches those
expectations and standards;
and using the resulting information to document,
explain, and improve performance.
American Association for Higher Education
Assessment is
Active
Collaborative
Dynamic
Integrative
Learner-Centered
Objective-Driven
Systemic
Assessment
is more than just a grade
is a mechanism for providing all parties with data
for improving teaching and learning
helps students to become more effective, self-assessing, self-directing learners
Levels of Assessment
Institution
Department
Program
Course/Module/Lesson
Individual/Group
institutional mission
departmental/program objectives
accreditation bodies (e.g., ABET)
professional societies
constituents (students, faculty, alumni, employers,
etc.)
continuous feedback
[Diagram: assessment development cycle, shown as an animation build]
Step #2: Identify data required & sources
Step #3: Review existing assessment methods
Step #4: Define additional methods and measures
→ Continuous Improvement
Development Process
Goals
Objectives
Outcomes
Tools
Improvement
Example:
State Objectives
Objectives
Question:
Examples:
Define Outcomes
Outcomes
Question:
Examples:
Objectives Summary
Each addresses one or more needs of one or
more constituencies
Understandable by constituency addressed
Number of statements should be limited
Should not be simply a restatement of outcomes
Outcomes Summary
Each describes an area of knowledge and/or
skill that a person can possess
Should be stated so that a student can
demonstrate it before graduation/end of term
Should be supportive of one or more
Educational Objectives
Do not have to include measures or
performance expectations
Review Tools
Tools
Questions:
Strategies/Practices
Practice
Curriculum
Courses
Instruction (Teaching methods)
Assessment
Policies
Admission and transfer policies
Reward systems
Extra-curricular activities
A Manufacturing Analogy
Mission: To produce passenger cars
[Diagram: continuous assessment loop]
Input from Constituencies (e.g., Students, Alumni, Employers)
→ Determine Outcomes Required to Achieve Objectives
→ Determine How Outcomes will be Achieved (Formal Instruction, Student Activities)
→ Determine How Outcomes will be Assessed
→ Establish Indicators for Outcomes to Lead to Achievement of Objectives
→ Assess Outcomes / Evaluate Objectives
→ (feedback to Constituencies)
Exercise
Given your University and your Program
missions, develop two educational objectives
that address the needs of one or two of
your constituencies
Given the program objectives you developed,
select ONE objective and develop a set of
measurable outcomes for it.
Be prepared to report to the full group
Assessment Design
Example Objectives
To teach students various analysis methods of
control systems
To teach students the basic principles of
classical thermodynamics
To motivate students to learn a new software
package on their own
To provide opportunities to practice team
building skills
Example Outcomes
Obtain linear models (state space and transfer
functions) of electro-mechanical systems for control
design (measurable)
Select the optimum heat exchanger configuration
from several alternatives based on economic
considerations (measurable)
Understand the concept of conservation of mass and
energy (not measurable)
Know how to use the first law of thermodynamics
(not measurable)
Writing Outcomes
Write outcomes using quantifiable action verbs and
avoid terms which are open to many interpretations
Bloom's Taxonomy
Cognitive domain of required thinking levels
Lower order thinking
knowledge, comprehension, application
Higher order thinking
analysis, synthesis, evaluation
Comprehension
Distinguish a particle from a rigid body
Application
Given the initial velocity, find the trajectory of
a projectile
Synthesis
Determine the required friction coefficient for a
given motion
Evaluation
Choose the best solution method for a given
kinetics problem
Assessment Design
(continued)
M (medium)
Demonstrating this knowledge or skill has considerable
impact on the overall performance of the student
L (low)
Demonstrating this knowledge or skill has only minor
impact on the overall performance of the student
Assessment Practices
Identify resources
Support personnel and facilities
Available instruments
Develop necessary tools (e.g., scoring rubrics)
Implement assessment
Analyze and interpret results
Feedback for improvement
Exercise
Choose a course you currently teach or would
like to teach
Complete the teaching goals inventory (TGI)
Write 2-3 general objectives for the course
Be prepared to report to the full group
Exercise
Consider the course you chose earlier
Develop one of the objectives into measurable
outcomes based on Bloom's taxonomy
Discuss with the whole group
Assessment Design
Assessment Methods
Program Assessment
Tests (standard exams, locally developed tests)
Competency-based methods (capstone courses)
Attitudes and perceptions (surveys, interviews,
focus groups)
Course/Classroom Assessment
Performance evaluations (oral presentations,
written reports, projects, laboratory, teamwork)
Classroom Assessment Techniques (minute paper,
background probe, concept maps)
Employer survey
Alumni survey
Faculty survey
Exit survey
Drop-out survey
Important Points
All assessment methods have advantages and
disadvantages
The ideal methods are those that are the best
compromise between program needs, satisfactory
validity, and affordability (resources)
Need to use a multi-method/multi-source approach to
improve validity
Need to pilot test to see if a method is appropriate for
your program/course
Validity
Relevance: the option measures the
educational outcome as directly as possible
Accuracy: the option measures the
educational outcome as precisely as possible
Utility: the option provides formative and
summative results with clear implications for
program/course evaluation and improvement
Exercise
Consider the outcomes you developed earlier
Specify relevant activities/strategies to
achieve these outcomes
Determine the assessment methods/tools to
measure each outcome
Assessment Practice
Strategies
Refine and maintain a structured process
Involve all constituents
Establish a viable framework
ME Program at KU (continued)
Program Outcomes (sample)
Practices
Assessment
Instructor course evaluation at selected courses (every
term) - Faculty
Exit survey (every term) - OAA
Alumni survey (every three years) - OAA
Employer survey (every four years) - OAA
Faculty survey (every two years) - OAA
Feedback
Faculty
Undergraduate Program Committee
Department council
Student advisory council
External advisory board
ME-455 (continued)
Course design
Make sure all course objectives are addressed
theoretical framework, hands on experience with packages, soft
skills
ME-455 (continued)
Course assessment
Make sure all course outcomes are measured
Use standard assessment tools (written report, oral
presentation, teamwork)
Develop and use self evaluation report (survey and essay)
Design appropriate quizzes to test specific outcomes
Ethics quiz
Team building skills quiz
ME-455 (continued)
Assessment results
Students were able to learn and use the software packages
for analysis and design
Students recognized the need for lifelong learning
Students were able to acquire information not covered in
class
Students are not prepared well with respect to
communication and teamwork skills
Students lack a clear understanding of ethical and
professional responsibilities of an engineer
Students are deficient in their ability to integrate and apply
previously learned material
ME-455 (continued)
Corrective measures
Communicate and discuss the deficiencies with students
Discuss the results within the area group and formulate
common strategies for corrective actions.
Increase opportunities to practice communication and
teamwork skills with curricular and extra-curricular activities
Communicate results to concerned parties
Introduce and explain the engineers' code of ethics at the
beginning of the course. Introduce more case studies.
Assessment Practice
Contact us
E-mail: [email protected]