Evaluating "Independent" Assessment of Capstone Projects by Mechanical Engineering Students in DIT
K. Delaney
School of Mechanical and Transport Engineering, Dublin Institute of Technology
Proceedings of Edulearn 2011, Barcelona (Spain), 4th-6th July 2011
Abstract
Capstone projects by mechanical engineering students in Dublin Institute of Technology (DIT) are
designed for students to showcase their skills and technical knowledge. To successfully complete
capstone projects, students must typically demonstrate skills of project planning, time management,
negotiation, component sourcing and an awareness of the multi-disciplinary nature of engineering in
addition to the application of technical knowledge. A key element is that students explain and justify
their work before a panel of technical experts.
The Mechanical Engineering Department at DIT introduced “independent” assessment for student
presentations on a pilot basis for the Bachelor of Engineering technology students in 2010. Under this
system, assessors of a particular project are not ordinarily involved with teaching or mentoring the
student, or with any other part of that student’s project, on a day to day basis. Consequently,
assessors are unaware of specific project details in advance of the assessment session and they must
rely on the student to describe and defend the project and the approach taken. More specifically, the
student’s supervisor does not attend the sessions at which students present their work. This approach
is closer to the post-college environment in which most engineers work and therefore simulates
workplace experiences for students.
A review has recently been conducted to evaluate the process from the perspective of both students and staff with a view to determining whether this structure should be continued. This paper
documents the rationale behind the concept, the logistical challenges, and an analysis of the feedback
received from both students and staff as part of the review. In addition, future plans to improve the
fairness, consistency and transparency of the assessment process for such projects are outlined.
Keywords: Capstone project assessment, independent assessment.
1 INTRODUCTION
Capstone projects are completed by final year Bachelor of Engineering Technology students in the
Mechanical Engineering Department of Dublin Institute of Technology (DIT) to give them an
opportunity to work on a “real life” project. These projects help learners make connections between
classroom experiences and real world engineering practice. In order to successfully complete the
projects students must apply technical knowledge, and demonstrate skills of project planning, time
management, negotiation, component sourcing and an awareness of the multi-disciplinary nature of
engineering. The importance of engineers developing such skills has been highlighted by several authors, such as Kamm [1]. These skills also feature on the list of programme outcomes expected of
level 7 graduates laid down by Engineers Ireland and used when accrediting engineering programmes
As illustrated in Figure 1, it is through the appropriate combination of such skills that students are
expected to become more effective engineers when they graduate and enter the workforce.
Capstone projects undertaken by the DIT mechanical engineering students can be broadly
categorized as design and make, design only, and investigative and/or software-based projects.
Regardless of which project category the students take on, the projects motivate students to
synthesize and apply knowledge that they have accumulated through their courses. The module
learning outcomes, as outlined in the module descriptor, are listed in Figure 2.
Figure 3: Assessment deliverables, generalised list of tasks, the number of weeks between
assessment sessions and the marks allocation (for the 2010-2011 academic year).
One learning outcome of the project module is that students be able to communicate technical results,
information and conclusions to others by means of formal presentations, drawings and reports. A key
element stressed during review milestones is that students be able to explain and justify their work
before a panel of technical experts. In previous years students presented their work in front of their
own supervisor, two other supervisors and also a group of about ten of their peers. Presenting to
peers gives students an opportunity to benchmark their own projects and progress and is an important
element of the assessment approach followed over several years. Assessment panels are arranged,
where possible, to ensure that staff members with several years' experience of supervising and
assessing students are on panels with less experienced staff members.
After the 2009-2010 academic year, project supervisors raised concerns that some students were overly reliant on their own supervisors to address and clarify issues raised during the question and answer phase of the assessment process. Other staff commented that they would like the opportunity
to see a broader range of students’ projects in addition to the students they were mentoring. To
address these issues the final year project committee for the Bachelor of Engineering Technology
students implemented changes. On a pilot basis a system of “independent” assessment was
introduced for the first two assessment sessions of the academic year 2010-2011.
It is important to clearly define what is meant by the term “independent” assessment. Under the
system piloted in DIT assessors are not ordinarily involved with teaching or mentoring the student, or
with any other part of the project they are assessing, on a day to day basis. Consequently, the
assessors are unaware of the project details in advance of the assessment session and rely on the
student to describe and defend the project and the approach taken. More specifically, in the format
that was piloted the student’s supervisor does not attend the sessions at which students present their
work. This approach is closer to the post-college environment in which most engineers will have to
work and therefore gives them simulated workplace experiences.
This paper summarises an evaluation of the process from the perspective of both students and staff
with a view to determining whether or not this initiative should be continued. Section 2 documents the
rationale behind the concept and section 3 details the challenges of implementing it. Section 4
explains the need for a review and summarises the feedback received as part of the review. Section 5
describes future plans to improve the fairness, consistency and transparency of the assessment
process for such projects and section 6 presents concluding remarks.
According to Grossman et al., “such experiences are designed to focus students’ attention on key
aspects of practice that may be difficult for novices but are second nature to more experienced
practitioners” [4]. It is clear that these are “approximations” of practice and the benefit for the students
is that the assessment sessions are designed to give them the opportunity to experience sitting in front
of an unknown panel. However, contrary to what graduates might experience in industry, the panels
are not “hostile”. Academic staff members assess the students in a structured and supportive manner.
The concept of independent assessment is also commonly used in manufacturing companies where
independent or “blind” assessors, often from outside the company itself, are engaged to assess
products based on specific criteria.
It is important to distinguish the form of “independent” assessment defined in this paper from the concept
of “blind marking” where the assessors are unaware of the identity of the person being assessed. The
usefulness of the latter was questioned by Brown [5] who wrote that “Double blind marking doubles
the administrative and assessment load yet evidence from studies at secondary level (e.g. Murphy [6];
Newton [7]) indicate that it is no more reliable than single marking and moderating based on borderline
and central samples for each grade”.
The introduction of independent assessment for the initial assessment sessions has several potential benefits for the project students, the academic staff and industry, despite the logistical challenges it creates for staff. For convenience, Table 1 summarises these benefits. Students
benefit by receiving opinions and feedback from additional staff members. Several students reported
finding this formative feedback to be advantageous. Such feedback is particularly useful in situations
where staff members without supervising experience are mentoring students.
Table 1: Potential benefits of “independent” assessment.
For students: Students can get feedback and advice from a larger pool of lecturers; increased confidence.
For lecturers and DIT: Particularly beneficial for staff new to supervising students on this programme.
For industry: Graduates have an ability to present in front of a full group of strangers.
In addition to the benefits listed in Table 1 there are also risks to this approach. A specific risk is that students may twist the facts and may be inclined to say things such as “my supervisor
told me to do….” or “we couldn’t do that because…”. During the post-assessment review sessions held by staff, concerns were raised about a small number of such situations. Other potential
risks/disadvantages of the approach are listed in Table 2.
Table 2: Potential risks/disadvantages of “independent” assessment (for students, for lecturers and DIT, and for industry).
Students are expected to submit their presentation and report files by 9am on the Monday of the week
during which the assessment sessions have been scheduled (students make their presentations on
Wednesday afternoons). The files are submitted through “webcourses”, course management software
in use throughout DIT. Receiving files two days before the presentation date gives staff an opportunity
to review the content of the student’s reports before the presentation. It also gives students experience
of meeting deadlines and time to practice their presentations.
Assessment panel members propose and review marks as a group before an agreed mark for the
presentation and report is returned to the project co-ordinator. Separately, supervisors read their own
students’ reports and also submit a mark to the project co-ordinator. The average of these marks is
recorded for the students. Where there is a significant difference between these two marks an
additional assessor is assigned to review the report of the student concerned (without knowing the
marks given by the previous assessors) and provide a neutral, deciding, opinion.
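A minimal sketch of this mark-reconciliation rule is given below, assuming a percentage marking scale and a hypothetical 15-point threshold for what counts as a "significant difference"; the paper does not specify either value, and the helper name reconcile_marks is illustrative only.

# Sketch (Python) of the mark-reconciliation step described above.
# Assumptions: marks are percentages and a gap of more than 15 points
# counts as a "significant difference"; neither is stated in the paper.

SIGNIFICANT_GAP = 15.0  # assumed threshold, in percentage points

def reconcile_marks(panel_mark: float, supervisor_mark: float) -> dict:
    """Average the panel and supervisor marks and flag cases that need
    an additional, independent assessor to give a deciding opinion."""
    recorded = (panel_mark + supervisor_mark) / 2.0
    needs_third_assessor = abs(panel_mark - supervisor_mark) > SIGNIFICANT_GAP
    return {"recorded_mark": recorded, "needs_third_assessor": needs_third_assessor}

# Example: a panel mark of 72% and a supervisor mark of 48% average to 60%,
# but the 24-point gap flags the report for an additional assessor.
print(reconcile_marks(72.0, 48.0))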
The most important feature for students, particularly for the first assessment, is the formative feedback
that they receive from the assessment panel. Sample feedback from 2010-2011 advised students to
restrict the broadness of their projects, refocus the direction of their projects and to research specific
literature. Other students were advised about the existence of specific equipment, previous projects
and the research interests of specific staff who may be able to help and advise them in more detail.
Sometimes this feedback is given to the students directly during the question and answer sessions
following their presentation. In addition a formal record of the feedback for the student’s presentation
and report is retained and given to the students’ supervisor after the assessment session.
1. 60% of students expressed a preference that their supervisors be present at all assessments.
2. About 25% of the students expressed the opinion that it was a benefit that their supervisor was not present at the first two assessment sessions, citing as reasons that they received better and more varied opinions which helped their projects, and the opportunity to improve their communication skills.
3. About 70% of the students felt that having their supervisor present at the third assessment
was beneficial. Students stated that this was so that their supervisor “knew how much work
they had done”.
4. Three students stated that they wanted to know their actual results and that knowing their
score was more important than the formative feedback and advice that they received. One
student felt that they could have received “better” feedback with no additional explanation.
5. Over 90% of the students stated that the assessment process, specifically the questions
asked of each student after their presentations and the overall handling of the assessment
process, was fair and consistent.
6. A majority of staff commented that supervisors should be present for the second and
subsequent assessments in future.
7. Approximately a fifth of the students suggested that they would like to have their final
presentation after their thesis submission. Over half of the students stated that they got
feedback from the third assessment presentation which helped them to optimise their thesis in
terms of content and structure. It is also noted that the exact timing of the latter assessments is dependent upon the timing of Easter each year. The time delay between the third
assessment presentation and submission of the final thesis was longer than usual in 2010-
2011.
8. In a surprising development, several students suggested that there should be an additional
assessment between the first and second assessments. The reason given was that the gap
between these assessments was excessively long. Students reported finding it difficult to
maintain focus on and enthusiasm for their projects with other assignments and more pressing
deadlines competing for their attention.
Figure 4: Assessment deliverables, generalised list of tasks, the number of weeks between
assessment sessions and the marks allocation (proposed for the 2011-2012 academic year).
In addition, the committee is considering whether students should undertake preliminary research on their chosen project area during second year. The benefit of such an approach is that students could
then “hit the ground running” and be able to make more substantial progress with their projects.
Several training sessions, such as those on using library resources and on health and safety, are already organised for project students by the project committee. For 2011-2012, additional formal sessions regarding
project report writing and project planning will be introduced to help students revise and apply what
they have learnt in previous years.
Students also commented on experiencing difficulties in accessing resources to actually manufacture
the components for their projects. Part of this problem can be traced to excessive numbers of students
arriving into the workshops with their completed drawings at the same time. It is hoped that the
modified timing of assessments, particularly having assessment 2 towards the end of semester 1, will
help students to focus on having their drawings prepared more promptly and help to stabilise the
demands on workshop resources. This is also an important learning experience for students.
A number of students complained that having to submit files at 9am on Monday morning was not
appropriate. The project committee has endeavoured to accommodate all students, even those without internet access at home, by setting the submission deadline at a time when students can access the DIT campus buildings. A significant number of files were submitted in the middle of the night, so in future
students will be asked to submit their files by 5pm on the Saturday before their presentations.
6 CONCLUDING REMARKS
“Independent” assessment was implemented and trialled on a pilot basis for the first two capstone
project assessments in 2010-2011. To evaluate the effectiveness of this trial a structured review of the
process was conducted. As part of this each student submitting a project was asked to complete a
survey. The vast majority of students who responded reported that they had received useful and
constructive feedback from their assessment teams. Staff members also commented that their
students benefitted from the comments and fresh perspective of the “independent” assessment teams.
Students responded that they felt the assessment process was fair and consistent for all students.
Acting directly on the proposals of current students, changes are already planned for the next
academic year and are close to receiving final approval from the relevant authorities in DIT. Even with
such changes it is likely that continuous refinement will be needed as the project assessment team
strives to improve the fairness, consistency and transparency of the assessment process for all
students.
Evaluating this specific capstone or “final year” project module in isolation from the overall engineering programme is difficult since (1) it is difficult for employers to relate skills development to specific modules,
(2) students’ lack of experience of alternative learning methods may affect student objectivity in
conducting a student evaluation and (3) the absence of a control group may affect the interpretation of
any results obtained. How such a formal evaluation may be performed is under consideration.
It is hoped that this experience of capstone project assessment in DIT may be beneficial for those
involved in assessing such projects both within DIT and also in other institutions.
ACKNOWLEDGEMENTS
Thanks to the students and supervisors involved with the capstone projects this year. A special thanks
to the Department of Mechanical Engineering for the support and opportunity to simulate industrial
experience for our students and to the Learning Teaching and Technology Centre (LTTC) for the
advice and encouragement to pursue such improvements.
REFERENCES
[1] Kamm, L.J., Real World Engineering: A guide to achieving career success. 1991, New York:
IEEE Press.
[2] Engineers Ireland, Accreditation Criteria. 2011 [cited 18 February 2011]; Available from: http://www.engineersireland.ie/media/engineersireland/services/Download%20the%20accreditation%20criteria%20%28PDF,%20240kb%29.pdf.
[3] McKenzie, L.J., et al., Capstone Design Courses and Assessment: A National Study, in
American Society of Engineering Education Annual Conference & Exposition. 2004.
[4] Grossman, P., et al., Teaching Practice: A Cross-Professional Perspective. Teachers College
Record, 2009. 111(9): p. 2055-2100.
[5] Brown, G., Assessment: A Guide for Lecturers. 2001, LTSN Generic Centre.
[6] Murphy, R.J.H., Removing the marks from examination scripts before remarking them. British
Journal of Educational Psychology, 1979. 49: p. 73-78.
[7] Newton, P.E., The reliability of marking of General Certificate of Secondary Education scripts:
Mathematics and English. British Educational Research Journal, 1996. 22: p. 405-420.