Yes, it’s that time of year again. Like a dog with a frisbee, a really bad virus or a bad cheque (check), it’s back. Time to get the Internal Assessment (IA) portfolios together, submit final grades and then compile the sample for external moderation.

Many teachers dread the process, seeing external moderation not so much as a tweaking of set grades but rather as a ‘grade on the teachers’ themselves. Putting in grades and then being asked to submit samples for moderation can indeed be perceived as a control function, and no doubt there is a great deal of face involved in the initial IB meetings after the summer holiday when grades and teacher performance are looked at.

However – and this is a big ‘however’ – the reason for external moderation is primarily to guarantee a certain standard and avoid grade inflation. I have often heard, from very experienced examiners/moderators, that the key is consistency rather than perfection; as long as teachers apply the assessment criteria in the same manner across the student cohort, it’s reasonably straightforward to adjust all the grades up or down a mark or two. It’s inconsistency in setting marks that throws sand into the works.

Let’s face it: the metrics used in the mark descriptors are largely qualitative (e.g. ‘correctly labelled’) rather than quantitative (‘max 750 words’). This inevitably adds a measure of interpretation into the mix, and no two teachers will overlap 100%. I try to put some effort into making the moderator’s task a bit easier and into helping him/her see things my way. I do this by making sure that students’ portfolios are identical in style/format and by including a (sealed) letter justifying and underpinning the marks set for every criterion. This is not just simple consideration for a colleague but perhaps primarily a way of making clear to him/her why a given mark has been set.

[Image: IA pile. I had the Minion glue the cover sheets on.]

There has been a bit of discussion as to whether one should include written comments alongside the IA scripts sent to moderators. For the past few years I have simply left it at the discretion of the moderator: I write comments for each and every script, underpinning and justifying the marks set, and then put the notes in a sealed envelope, explaining that it is the moderator’s choice whether to look or not. As I’ve not received a whack on the knuckles from the IBO, I shall continue with this method.

Here are three examples (out of a sample of ten) of my comments.

John Smith (41 marks)

Criterion A (9): Cogent and well-developed diags that are clearly used in support of the iteration throughout the portfolio (i.e. in all three commentaries)
Criterion B (6): Definitions omnipresent; clear and correct terms that are well linked to article content
Criterion C (6): Relevant terms/concepts used well; clear links to underlying article content
Criterion D (8): C1 somewhat hazy in linking theory fully to article content; C2 and C3 show excellent analysis using core theory and diagrams
Criterion E (9): All three commentaries exhibit conclusions/judgements that are supported by the iteration, which in turn is squarely based on economic reasoning
Criterion F (3): Obviously he read the instructions! Full marks here.
Sum: 41

 

Alias Smith (24 marks)

Criterion A (4): C1: Relevant diagrams included, one explained incorrectly; C2: Written work does not match the graph, some explanations are confused; C3: Relevant diagrams are included, no arrows are included to show shifts or changes
Criterion B (4): C1: Some use of terminology, some definitions incomplete; C2: Some definitions are incomplete; C3: Key terms are defined, terminology is used appropriately throughout the commentary
Criterion C (4): C1: Student tries to link theory to article; C2: Student links concepts to article; C3: Links to article and quotes figures
Criterion D (4): C1: Student expresses understanding but in a limited way, sometimes using flawed application of theory; C2: The economic analysis is not always logical and consistent; C3: Consistently expresses ideas using economic theory
Criterion E (6): C1: Judgements are not well supported; C2: Judgements are made but are not well supported; C3: Makes some judgements but does not go into depth
Criterion F (2): Meets 4 rubric requirements
Sum: 24

 

I. M. A. Poncer (16 marks)

Criterion A (2): Hazy, incomplete, erroneous, microscopic and often made-up diags; mostly lacks cogent links to article content
Criterion B (3): Uses economic terms, but in a most haphazard and slovenly manner, often tossed in with an eye on “…use economic terms…” from IA Success for the Extremely Lazy
Criterion C (3): Application is often of the “scattergun approach”, where the student hopes at least one or two 12-gauge pellets hit the mark
Criterion D (3): Limited analysis and, again, weak attempts to link cogent use of economics/diags to core issues in the article; no real breakdown of issues using economics as a method
Criterion E (3): Evaluation is tepid, as any attempts herein are far too random for any serious consideration of different possible economic scenarios based on solid use of economic reasoning
Criterion F (2): In spending too much time on IA Success for the Extremely Lazy, the student managed to write C1 and C2 from the viewpoint of Section 1, micro
Sum: 16