ELT News, June 2009
Training Script Raters For The KPG Exams
Within
the context of the KPG exams, the
systematic training of oral and
script raters is regarded as one of
the most significant factors
contributing to the validity and
reliability of the writing and the
speaking tests (Modules 2 and 4
respectively) and to the
sustainability of the examination
system as a whole. In a previous
issue of ELT
News the
main aims and defining features of
the oral examiner training programme
were presented. Today's text focuses
on the main principles and
characteristics of the training
programme for our body of script
raters.
Who are our script
raters?
Script
raters are experienced English
language teachers who have been
chosen from an initial pool of
applicants to the Ministry of
Education after a successful initial
screening interview. Most script
raters have been KPG oral examiners
as well. As new levels of language
competence were introduced in the
KPG exam battery, the need for a
larger number of script raters arose.
Thus, in subsequent phases of the
programme, after screening more
recent applications from potential
script raters, ELT professionals
with postgraduate studies in applied
linguistics and with experience in
marking scripts in other language
examinations were invited to become
part of the KPG script rater
programme. Unavoidably, the area of
residence has become a criterion for
selection since the KPG marking
centre for the English exam is
located in Athens and thus only
teachers living and working within
the Attiki area are able to join the
programme and become practicing
script raters.
What are the aims of
the script rater training programme?
The aims of the English script rater training programme include the development of:
- a body of 300 script raters who have been fully trained to assess all levels of the writing module offered by the KPG exam battery and whose performance has been evaluated on the job,
- a comprehensive and fully updated database of trained script raters which the Ministry of Education can draw from to make appointments for every exam period,
- comprehensive training handbooks for script raters, accompanied by samples of candidates' written production for self-training and awareness-raising purposes.
How are script raters
trained?
Training before the
marking process begins
The training of script raters started with the first exam administration and gradually became more systematic and principled from the second administration onwards. Today, the script rater training programme consists of a series of stages which all script raters are required to go through.
After screening the applications, groups of 'trainee' script raters are formed and invited to a four-hour induction seminar. During this seminar, trainees are provided with a detailed script rater information pack and are informed about the theory of language underlying the writing test, the content and structure of the test for each level, the expectations for written language production at each level, and the assessment criteria. The information pack also contains samples of candidate scripts from past examination periods, which the trainee script raters evaluate by applying the assessment criteria.
New script raters are then invited to the script rater training seminars which take place after each exam administration, and are requested to participate in all the tasks assigned during the seminar. New script raters must take part in at least two such seminars (apart from the induction seminar) before they are allowed to join the marking process.
The script rater seminar takes place two weeks after the actual exams. The preparation for this seminar entails quite a complicated and time-consuming process in which many individuals and parties are involved. The preparatory stages for the seminar are as follows:
- Immediately after the examination, members of the test development team prepare the expected outcomes of each writing task at every exam level in terms of (a) genre, communicative purpose and register/style, (b) expectations regarding coherence and cohesion, and (c) lexicogrammatical choices.
- At the same time, the committees at the Examination Centres around the country pack the English scripts and send them to the Rating Centre in Athens. There, the Rating Centre committee randomly selects 100 scripts for each level's writing task; the selected scripts have been produced by candidates who took the exam in cities and towns in different parts of the country.
- Once the 100 scripts per level have been gathered, 20 experts from the KPG English team, who are well aware of what the writing tasks aim to test and what the expected outcomes of each task are (as these have already been recorded), meet at the Rating Centre and, divided into groups according to test level, use the rating scale to evaluate the selected scripts. Each script is evaluated and rated by two 'expert' raters, just as in the regular rating process (a minimal sketch of this double-marking step appears after this list). A detailed discussion of candidates' performance on the particular tasks follows, with the purpose of (a) assessing task validity, (b) finalizing expected outcomes, (c) fine-tuning the rating scale, (d) selecting the scripts which best exemplify satisfactory and unsatisfactory performance, and (e) conferring about scripts that have resulted in rating discrepancies between experts.
- On the basis of this discussion, members of the KPG test development team update, refine and revise the Script Rater Guide, which includes information on the nature, structure and main features of the written test at all levels, the criteria for the assessment of candidates' written production, and the writing tasks of each recently administered exam together with detailed information on the expected output for each activity. Sample candidate scripts are included with the marks assigned by the test development team and brief commentaries justifying the assigned mark. Finally, sample candidate scripts are also provided without marks or commentaries, to be evaluated by script raters during the seminar.
- After the production of the script rater guide, our body of script raters (experienced and new) are invited to a one-day seminar/workshop. They are provided with a copy of the script rater guide, which they are requested to bring with them to the marking centre and consult when marking proper begins. During the seminar, script raters are informed of the writing tasks at each level and the expected language outcomes, and are shown candidate scripts together with the marks assigned by the test development team. The seminar then takes the form of a workshop in which script raters are provided with samples of candidates' answers and are asked to mark them and justify their marks in relation to the criteria for the assessment of written production. Problems or queries with the marking of scripts are discussed and clarified.
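
The article describes the double-marking step in procedural terms only; as a concrete illustration, here is a minimal Python sketch of how such a step might be recorded. The 0-20 mark scale, the 2-point discrepancy tolerance, the averaging of agreeing marks and the script identifiers are all hypothetical assumptions made for illustration, not details of the actual KPG procedure.

# A minimal sketch of double marking with a discrepancy flag.
# Assumed (not from the article): marks on a 0-20 scale, a 2-point
# tolerance, and averaging of the two marks when they agree.
from dataclasses import dataclass
from typing import Optional

DISCREPANCY_THRESHOLD = 2  # assumed tolerance, in scale points

@dataclass
class RatedScript:
    script_id: str
    mark_rater_a: int  # first rater's mark (assumed 0-20 scale)
    mark_rater_b: int  # second rater's mark

    @property
    def discrepancy(self) -> int:
        # Absolute difference between the two raters' marks
        return abs(self.mark_rater_a - self.mark_rater_b)

    def final_mark(self) -> Optional[float]:
        # Average the two marks when they agree closely enough;
        # return None to signal that the script needs conferring
        if self.discrepancy <= DISCREPANCY_THRESHOLD:
            return (self.mark_rater_a + self.mark_rater_b) / 2
        return None

# One script where the raters agree, one flagged for discussion
batch = [RatedScript("B2-0417", 14, 15), RatedScript("B2-0422", 9, 16)]
for s in batch:
    mark = s.final_mark()
    status = f"final mark {mark}" if mark is not None else "needs conferring"
    print(f"{s.script_id}: {s.mark_rater_a}/{s.mark_rater_b} -> {status}")

In this toy run the first script receives the averaged mark 14.5, while the 7-point gap on the second exceeds the assumed tolerance and flags it for the kind of conferring described above.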
Individualised
Training at the Marking Centre
The training of script raters continues at the marking centre on an individual basis. Centre coordinators closely monitor the marking process and offer on-the-spot advice, help and training to script raters. More specifically, our body of script rater coordinators consists of highly qualified and experienced associates, most of whom are members of the KPG test development team. Coordinators are present at the marking centre throughout the whole marking period. They work at the centre in predetermined shifts (there are two shifts per day, except Sunday when there is only one), taking care that there are at least 2-3 coordinators at the marking centre at any one time. The number of coordinators needed for each marking period depends on (a) the number of candidates in the particular exam period and (b) the number of script raters involved in the process. The duties of the coordinators are briefly listed below:
- They advise script raters whenever the latter face a problem with the application of the assessment criteria.
- They monitor raters' individual performance during the rating of a certain number of scripts (at least three in every packet of 25) at all levels of the exam. This procedure is repeated each time raters move on to the next level of the exam.
- They monitor the script raters' application of the assessment criteria in each of these scripts and keep records of their performance by filling in two different statistical sheets. The first includes more general comments, while the second is more detailed and asks for the coordinators' justified evaluation of individual script raters. Additionally, the coordinators monitor raters' performance through randomly chosen scripts that have already been marked, for which the raters are asked to justify their assigned marks. The coordinators discuss the application of the rating scale and keep records of the whole procedure. These records are analyzed after the rating period has ended, and details of the raters' actual performance are kept for further reference and for the evaluation both of the individuals and of the process itself.
- Whenever coordinators realize that particular script raters need more training in the correct application of the rating scale, they keep working with these raters until it is decided that no further assistance is needed.
- At the end of each rating period there follows a statistical analysis of all rating discrepancies between raters, which offers additional data for evaluating both the rating process and the performance of the participating raters (see the sketch after this list). These data are also compared with those produced in previous rating periods, so that the results can be used for the development of a comprehensive and fully updated database of trained script raters. Additionally, coordinators' and script raters' feedback forms offer suggestions for the improvement of the rating procedure itself.
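
As an illustration of the kind of discrepancy analysis mentioned in the last point above, here is a minimal Python sketch. The article does not specify which statistics are computed; the mean absolute discrepancy and Cohen's kappa on banded marks used below, like the mark scale, the band labels and the cut-offs, are assumptions chosen purely for illustration.

# A sketch of post-rating discrepancy analysis for one rater pair.
# Assumed (not from the article): a 0-20 scale, fail/pass/merit bands,
# and Cohen's kappa as the inter-rater reliability statistic.
from collections import Counter

def cohen_kappa(labels_a, labels_b):
    # Unweighted Cohen's kappa for two raters over the same scripts
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement from each rater's marginal label frequencies
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

def band(mark, scale_max=20):
    # Collapse a numeric mark into a coarse band (assumed cut-offs)
    if mark < scale_max * 0.5:
        return "fail"
    if mark < scale_max * 0.75:
        return "pass"
    return "merit"

# Example: paired marks from two raters over the same packet of scripts
rater_a = [14, 9, 17, 11, 6, 15, 12, 18]
rater_b = [15, 12, 16, 10, 8, 15, 11, 19]

mean_disc = sum(abs(a - b) for a, b in zip(rater_a, rater_b)) / len(rater_a)
kappa = cohen_kappa([band(m) for m in rater_a], [band(m) for m in rater_b])
print(f"mean absolute discrepancy: {mean_disc:.2f} points")
print(f"Cohen's kappa on banded marks: {kappa:.2f}")

Run over many rater pairs and compared across exam periods, statistics of this kind would support the evaluation of inter-rater reliability and the rater database described above.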
Introducing mentor
raters for on-site training of new
script raters
One of the strategies successfully used with new script raters has been the Mentorship scheme. A small number of experienced and trained raters were chosen to act as Mentors, that is, as a body mediating between the coordinators and the raters, whose job is to help new KPG raters familiarize themselves with the procedure and the assessment scale. The scheme has proved a successful way of training both new and experienced raters, giving them the opportunity to improve their rating skills.
The future of our
script rater training programme
A comprehensive Handbook for script raters has been prepared (RCeL Publications, forthcoming) and is to be used as a reference guide for practising and prospective KPG Evaluators. This Handbook, prefaced by Prof. B. Dendrinos and edited by Prof. Bessie Mitsikopoulou, brings together all the work carried out within the context of the English Script Rater Training Programme to date and provides detailed information about script rating (the criteria of evaluation, the rating grid, etc.), as well as sample papers.
Our work with the body of 300 script raters continues on a systematic basis. As the KPG examination system continues to develop, as more innovations are introduced and as more data are gathered through the monitoring of the script rating process and the systematic evaluation of inter-rater reliability, the training programme refines script raters' skills and prepares them to meet new demands and challenges, such as how to rate scripts of the integrated B level exam, which will be administered for the first time in May 2010.
Kia Karavas, Lecturer
Research Centre for
English Language Teaching, Learning
and Assessment
Faculty of English
Studies, University of Athens