Jens Bennedsen
Fredrik Georgsson
Juha Kontio
Turku University of Applied Sciences, Faculty of Business, ICT and Chemical Engineering,
Finland
ABSTRACT
On November 13, 2015 the CDIO Council approved an updated version of the self-evaluation
rubric. This paper presents the updated version of the rubric along with some general
thoughts on how to work with it. It also describes the process that started with a paper at the
2014 CDIO World Conference identifying inconsistencies in version 2.0 of the CDIO rubric
for self-evaluation and ended in the rubric proposed here.
KEYWORDS
INTRODUCTION
The outline of the paper is as follows: first, the updated rubric is presented; then an
introduction on how to think about the levels of the rubric is given, along with some
theoretical foundation. Lastly, the process of developing the new rubric is described.
THE UPDATED RUBRIC
Since this document is intended to serve as a description of the latest version of the CDIO
rubric for self-evaluation, it is listed here in its entirety. We have chosen to list it alongside
the old version of the rubric for comparison.
Level 2
Old: A plan to incorporate explicit statements of program learning outcomes is accepted by program leaders, engineering faculty, and other stakeholders.
New: A plan to incorporate explicit statements of learning outcomes at course/module level as well as program outcomes is accepted by program leaders, engineering faculty, and other stakeholders.

Level 1
Old: The need to create or modify program learning outcomes is recognized and such a process has been initiated.
New: The need to create or modify learning outcomes at course/module level and program outcomes is recognized and such a process has been initiated.

Level 0
Old: There are no explicit program learning outcomes that cover knowledge, personal and interpersonal skills, and product, process and system building skills.
New: There are no explicit learning outcomes at course/module level nor program outcomes that cover knowledge, personal and interpersonal skills, and product, process and system building skills.
Level 5
Old: The introductory course is regularly evaluated and revised, based on feedback from students, instructors, and other stakeholders.
New: The introductory course is regularly evaluated and revised as needed, based on feedback from students, instructors, and other stakeholders.

Level 4
Old: There is documented evidence that students have achieved the intended learning outcomes of the introductory engineering course.
New: No change.

Level 3
Old: An introductory course that includes engineering learning experiences and introduces essential personal and interpersonal skills has been implemented.
New: No change.

Level 2
Old: A plan for an introductory engineering course introducing a framework for practice has been approved.
New: A plan for an introductory engineering course introducing a framework for practice has been approved and a process to implement the plan has been initiated.

Level 1
Old: The need for an introductory course that provides the framework for engineering practice is recognized and a process to address that need has been initiated.
New: The need for an introductory course that provides the framework for engineering practice is recognized and a planning process initiated.

Level 0
Old: There is no introductory engineering course that provides a framework for practice and introduces key skills.
New: No change.
Level 5
Old: Internal and external groups regularly evaluate the impact and effectiveness of workspaces on learning and provide recommendations for improving them.
New: The program leaders, students, teachers and external stakeholders regularly evaluate the functionality and purposefulness of workspaces on learning and provide recommendations for improving them.

Level 4
Old: Engineering workspaces fully support all components of hands-on, knowledge, and skills learning.
New: No change.

Level 3
Old: Plans are being implemented and some new or remodelled spaces are in use.
New: Development plans of engineering workspaces are being implemented and some new or remodelled spaces are in use.

Level 2
Old: Plans to remodel or build additional engineering workspaces have been approved by the appropriate bodies.
New: Workspaces, their functionality and purposefulness for teaching are being evaluated by internal groups including stakeholders.

Level 1
Old: The need for engineering workspaces to support hands-on, knowledge, and skills activities is recognized and a process to address the need has been initiated.
New: No change.

Level 0
Old: Engineering workspaces are inadequate or inappropriate to support and encourage hands-on skills, knowledge, and social learning.
New: No change.
Table 8. Rubric of standard 8.
Table 12. Rubric of standard 12.
There are six levels in the rubric describing levels of maturity. As shown in Table 13, the
levels range from 0: there is no documented plan or activity related to the standard, to 5:
evidence related to the standard is regularly reviewed and used to make improvements. In
general, in order to be at level n, level n-1 should also be fulfilled. In this sense the levels of
the rubric form a hierarchy, as described in Figure 1.
Table 13. The six levels of maturity in the rubric

Level 5: Evidence related to the standard is regularly reviewed and used to make improvements.
Level 4: There is documented evidence of the full implementation and impact of the standard across the program components and constituents.
Level 3: Implementation of the plan to address the standard is underway across the program components and constituents.
Level 2: There is a plan in place to address the standard.
Level 1: There is an awareness of the need to adopt the standard and a process is in place to address it.
Level 0: There is no documented plan or activity related to the standard.
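To make the hierarchy rule concrete (to be at level n, level n-1 should also be fulfilled), the following minimal Python sketch computes the overall level for one standard from the set of level criteria a programme judges itself to satisfy. The function name and data layout are our own illustration, not part of the rubric itself.

```python
def maturity_level(fulfilled: set[int]) -> int:
    """Return the highest level n (0-5) such that the rubric criteria
    for every level 1..n are judged fulfilled.

    `fulfilled` holds the levels whose description the programme judges
    itself to satisfy; level 0 is the floor and needs no evidence.
    """
    level = 0
    for n in range(1, 6):   # walk up the hierarchy: 1, 2, ..., 5
        if n in fulfilled:
            level = n       # level n only counts on top of 1..n-1
        else:
            break           # a gap breaks the hierarchy
    return level

# A plan exists (2) and is being implemented (3) on top of the initial
# awareness (1), so the programme is at level 3 for this standard.
assert maturity_level({1, 2, 3}) == 3
# Documented evidence at level 4 without level 3 does not lift the score.
assert maturity_level({1, 2, 4}) == 2
```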
Figure 1. The levels of the rubric form a hierarchy, from level 0 at the bottom to level 5 at the top.
One problem with this view is that you could be tempted to see level 5 as a final state,
indicating that you have in some way "finished" your quality work when you self-assess
yourself at this level (as indicated in Figure 1). A similar trap exists at level 4: there is
documented evidence of full implementation, which tells us that a satisfactory implementation
of the standard has been reached, and you might be tempted to stop the development process
there. At this point we must stress that the correct interpretation of level 5 is that you have
ensured a satisfactory level of implementation (level 4) and that you have processes in place
that guarantee continued improvement, i.e. you can never state that you are finished when it
comes to improving yourself.
We suggest that it could be helpful to think about the levels of the self-assessment rubric as
shown in Figure 2: first we have to conceive what the standard is all about; during that
process we are at level 1. When we start designing how to address the implementation of the
standard, we are at level 2. When we start implementing the design, we are at level 3. After
level 3, we leave the linear implementation phases and enter an operation phase where we
repeatedly assess that we have an accepted level of implementation (level 4) while still
systematically addressing the shortcomings of our implementation (level 5). With this view of
self-assessment it is obvious that we will never be finished.
Figure 2. The levels of the rubric mapped onto phases: Conceive (level 1), Design (level 2), Implement (level 3) and Operate (levels 4 and 5), with level 0 as the starting point.
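A minimal Python sketch of this reading of Figure 2; the mapping and all names are ours, chosen for illustration only:

```python
# Hypothetical mapping of rubric levels to the phases of Figure 2.
# Levels 4 and 5 both belong to the never-ending Operate phase.
PHASE_BY_LEVEL = {
    0: "Not started",  # no documented plan or activity
    1: "Conceive",     # understand what the standard is about
    2: "Design",       # design how to address the standard
    3: "Implement",    # carry out the design
    4: "Operate",      # repeatedly confirm an accepted implementation
    5: "Operate",      # systematically address remaining shortcomings
}

def phase(level: int) -> str:
    """Return the phase a self-assessed rubric level corresponds to."""
    return PHASE_BY_LEVEL[level]

assert phase(4) == phase(5) == "Operate"
```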
Improving the quality of higher education systems, their universities and programmes is very
much in focus all over the world. In many (most?) countries, accreditation bodies are in place
to ensure the quality of a program or an institution. Such bodies exist in many shapes and
forms: private bodies like ABET (ABET, 2016), public bodies like the Danish Accreditation
Institution (The Danish Accreditation Institution, 2016), bodies covering one country like the
French CTI (CTI, 2016), and bodies covering many countries like EUR-ACE (ENAEE, 2016).
All of these have their own accreditation system. For a description of accreditation systems,
see (Bennedsen, Clark, Rouvrais, & Schrey-Niemenmaa, 2015).
The accreditation systems of today are mostly inspired by quality models like EFQM (EFQM)
or the Capability Maturity Model used for software development (Paulk, Curtis, Chrissis, &
Weber, 1993), where the focus is on process maturity and continuous improvement rather
than a measurement of the current status (although the evaluation of the current state is an
important part of the quality process).
Boele et al. (Boele, Burgler, & Kuiper, 2008) describe the EFQM model like this:
The EFQM (European Foundation for Quality Management) model basically looks at
an organization, its results, and the way the results lead to learning, improvement and
innovation. It was developed for firms but can be applied to any kind of organization.
Such a system typically builds on an assessment framework (see e.g. Rouvrais & Lassudrie,
2014) with an assessment process that describes how and when the assessment is done
(how data is collected and validated and how the planning is done). The process focuses on
the roles and responsibilities of the involved stakeholders, the inputs and the outputs. The
assessment process is supported by an assessment model. The assessment model is based
on a reference model that defines a set of best practices (or standards) related to the domain
to be assessed. It is the measurement against these standards that is important, as this is
the basis for improving quality. The measurement framework defines the maturity levels to
be considered and contains a set of assessment indicators which support the ratings against
the various standards. The CDIO rubric is therefore NOT an accreditation system; we have
only described the measurement framework, and even that without a set of indicators that
could be used to show at what level a given programme/institution is with respect to a given
standard.
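The terminology above can be summarised as a small data model. The following Python dataclasses are purely our illustration of how the pieces relate; all names are assumptions, not an API from any of the cited works:

```python
from dataclasses import dataclass, field

@dataclass
class Standard:
    """One best practice in the reference model (e.g. a CDIO standard)."""
    name: str
    description: str

@dataclass
class MeasurementFramework:
    """Maturity levels plus the indicators supporting ratings against
    each standard. The CDIO rubric defines the levels but, as noted
    above, deliberately leaves out the indicators."""
    levels: list[str]  # descriptions for levels 0..5
    indicators: dict[str, list[str]] = field(default_factory=dict)

@dataclass
class AssessmentModel:
    """Ties the reference model (the set of standards) to the
    measurement framework used to rate against them."""
    reference_model: list[Standard]
    measurement: MeasurementFramework
```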
We have chosen NOT to include these elements, since the rubric's main purpose is internal
use. It is therefore not important that it is reliable (i.e. that the rubric gives the same score
when applied by different individuals to the same programme and/or that it is possible to
compare self-evaluations from different institutions).
The process for updating the rubric has had several cycles. At the beginning, the authors
discussed CDIO self-evaluation and compared their experiences in using the CDIO
standards for self-evaluation. It became obvious that the CDIO standards with the rubrics
were in active use in the authors' universities, but we had all noticed some challenges with
the exact definition of the rubric levels, the usability of the rubrics as well as the coherence of
the rubrics. The discussion started a development process where each of the authors worked
with four standards and produced a new proposal for those rubrics. The standards were then
cross-checked, and in the end a first modified version of the rubrics was published at the
CDIO conference in Barcelona (Bennedsen, Georgsson & Kontio, 2014). The feedback
received in Barcelona showed that the rubrics still needed modification, and opinions from
other CDIO collaborators in particular were hoped for. We shared this opinion ourselves and
wanted to get feedback from the CDIO community. The CDIO Council asked the authors to
continue this development work, aiming at a new version of the CDIO rubrics for the 12
standards. The goal was set to produce CDIO standards with rubrics v. 2.1.
The next development cycle started with the aim of getting feedback in a more systematic
way. We wanted to evaluate the proposed improvements and modifications among the other
CDIO members: whether they considered the proposed changes necessary at all and
whether the newly proposed rubrics were more understandable. In addition, we wanted to
see whether there was a need to further modify and improve the rubrics. The data collection
had two phases: a web questionnaire and short semi-structured interviews with selected
CDIO collaborators. The web questionnaire was sent to all CDIO collaborators representing
the CDIO member universities at the end of 2014. In addition, more detailed comments were
acquired through short semi-structured interviews with selected CDIO collaborators and a
session at the 2014 fall meeting with experienced CDIO members. Based on the feedback,
an improved version of the CDIO rubrics was presented and processed in a workshop during
the CDIO conference in Chengdu 2015 (Georgsson, Kontio & Bennedsen, 2015). The
workshop in Chengdu once more processed and checked the rubrics and provided input for
the final improvements.
The third development cycle used the results of the Chengdu workshop and tuned the final
nuances of the rubrics. The final version of the rubrics was presented at the CDIO Council
meeting in Belfast in 2015. The proposed changes were accepted as presented in this paper.
The whole process of rubrics development is shown in Figure 3.
Figure 3. The rubrics development process: from the identified need, via Barcelona 2014 and Chengdu 2015, to Turku 2016 with CDIO standards with rubrics v. 2.1.
CONCLUSION
Based on the rubrics development, the overall change process within the CDIO framework
can be generalized into the following:
1. Have an idea of what to change.
2. Find others who are willing to discuss it.
3. Inform the Council about the wish to change.
4. Perform an analysis, that is, analyze the current presentation based on theory, existing
documents etc.
5. Conduct a survey or in some other way collect the opinions of the CDIO members.
6. Document the analysis and proposed changes, normally together with additional CDIO
collaborators who want to contribute. The style of the paper should be to clearly compare
what exists to what is proposed and, for every change, clearly justify why it is proposed.
7. Present at a CDIO conference, preferably in workshop format, where you collect
feedback on the proposed changes in a structured, comparative way.
8. Revise the suggestion based on the feedback and present it to the Council.
9. Once the change is accepted by the Council, report the final version at a CDIO world
conference.
REFERENCES
Bennedsen, J., Clark, R., Rouvrais, S., & Schrey-Niemenmaa, K. (2015). Retrieved from
http://www.weef2015.eu/Proceedings_WEEF2015/proceedings/papers/Contribution1092.pdf
Bennedsen, J., Georgsson, F., & Kontio, J. (2014). Evaluating the CDIO self evaluation.
Proceedings of the CDIO World Conference. Barcelona.
Boele, E. B., Burgler, H., & Kuiper, H. (2008). Using EFQM in higher education: Ten
years of experience with programme auditing at Hanzehogeschool Groningen.
Beiträge zur Hochschulforschung, Heft 1, 94-110.
CDIO. (2010, December 16). The CDIO Standards v 2.0 (with customised rubrics). Retrieved
January 19, 2016, from CDIO:
http://www.cdio.org/knowledge-library/documents/cdio-standards-v-20-customized-rubrics
CTI. (2016, February). Commission des Titres d'Ingénieur: an independent body responsible
for accrediting, developing and promoting engineering education and the engineering
profession in France and abroad. Retrieved from
http://www.cti-commission.fr/spip.php?page=sommaire-en
EFQM. (n.d.). The EFQM Excellence Model. Retrieved from
http://www.efqm.org/the-efqm-excellence-model
ENAEE. (2016). EUR-ACE System. Retrieved from http://www.enaee.eu/eur-ace-system/
Georgsson, F., Kontio, J., & Bennedsen, J. (2015). Updating the CDIO self-evaluation
rubrics. Proceedings of the CDIO World Conference 2015. Chengdu: CDIO.
Paulk, M. C., Curtis, B., Chrissis, M. B., & Weber, C. V. (1993). Capability maturity model,
version 1.1. IEEE Software, 10(4), 18-27.
Rouvrais, S., & Lassudrie, C. (2014). An Assessment Framework for Engineering Education
Systems. In A. Mitasiunas, T. Rout, R. V. O'Connor, & A. Dorling (Eds.), Software
Process Improvement and Capability Determination (pp. 250-255). Springer
International Publishing.
The Danish Accreditation Institution. (2016, January). The Danish Accreditation Institution.
Retrieved January 25, 2016, from http://en.akkr.dk/
BIOGRAPHICAL INFORMATION
Corresponding author

Jens Bennedsen
Aarhus University, School of Engineering
Inge Lehmanns Gade 10
DK-8000 Aarhus C, Denmark
+45 4189 3090
[email protected]

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.