Tuesday, October 6, 2015

School Results: Sotogrande International School IB, and the IBO's response to my complaint

MY IBO COMPLAINT, YEAR END 2015. My daughter attended Sotogrande International School for just over 8 years, and during that time we have been very happy with her development and her enthusiastic approach to learning and wanting to do well.
However, some serious downgrading of coursework by external examiners seemed to totally contradict the marks given by a number of teachers. My wife spoke with, emailed and dealt with the school over the summer, and I took it upon myself to tackle the IBO, making a formal complaint about the level of discrepancy between two of their experienced examiners: the teacher (himself an IB examiner) who marked my daughter's piece of coursework a 6, and the external examiner who gave the same piece a 3.
There have been many exchanges of emails between myself and the IBO over the last few months, but I have copied and pasted the two most relevant; please read them below.
International Baccalaureate discrepancies reveal problems. My daughter recently completed her school year and was disappointed with the 35 points she achieved, 4 below her predicted score. She assumed her exams simply hadn't gone well. However, on finding out that her exam results were good, she was horrified to discover that the low marks had come from her coursework.
'That cannot be,' she announced, 'my coursework was graded quite high by my teacher.'
So here you have a student who worked hard under what we believed was good tutelage and submitted her coursework well in time, allowing herself the possibility of making amendments if deemed necessary. The teacher made recommendations, adjustments were made, and 6 points were awarded by the tutor. This equates to an A. The work was then submitted to the IB, and the examiner awarded it a 3. The result was a complete shock for all of us. How can a teacher who is also an IB examiner get it so wrong, and if the teacher did get it wrong, who has failed? In my daughter's words: 'Had I any inkling that the work was to be scored so low, I would have stayed up day and night to improve it. Instead we trust our teachers to know the standards required.'
The teacher, an experienced IB examiner, is totally bemused, and in his response to the head of secondary replied: 'I am well aware of the standards required to do well and would not dream of submitting work of Level .. 3 standard.'
Here is an extract from the IBO's reply, after the school questioned the whopping gap in the marking:
'I do not believe there is enough evidence to suggest the marking of the essays is incorrect and warrants a re-mark, and unfortunately we do not offer a re-mark service for individual components. In order that we can resolve any injustices in the final grades awarded to candidates, we offer the category 1 EUR service to schools (re-marks). As you are aware, this service involves a candidate's externally-assessed components being re-marked by the most senior examiners available to ensure that the grade upon re-marking is fair and correct. If you feel that the grade awarded to any of your candidates is incorrect, I would advise you to request a category 1 re-mark of their work. You will not be charged if the re-mark results in a change of grade.'
We feel very aggrieved by this situation: when the teachers get it so wrong, it is the student who suffers. Surely the IBO should be looking at teachers' own marks for coursework, and where such huge discrepancies exist a formal inquiry should be set in place. My daughter is 2 marks short of either university because of coursework she was told was excellent.
I have tweeted my disdain over this situation on the IBO Twitter page many times in the last 2 weeks. It's very frustrating and annoying, and you have to ask who has failed the pupil: is it the teacher or the system? Whatever way you look at it, the student has been duped by the system.
Our youngest is just starting secondary, so it may be a good time to bail out of the International Baccalaureate system.
The final email (there were many more between these) is below:
Dear Francis, Your complaint has been forwarded to me to respond to as Head of Assessment Principles and Practice.
I appreciate that your concern is now about the apparent difference in marking standard between two of our examiners, and so I will discuss that issue. If you would like to discuss your daughter's grade then please let me know.
I completely agree with you that we need to be confident that students will get the same outcome whichever of our examiners marks their work. Occasionally, when a candidate is borderline between two grades, we may have a situation where a 1 mark difference between examiners results in a different grade, but that is substantially different from the situation you are describing, where two examiners have very different views of the quality of a student's work.
If I may I would like to explain the approach the IB takes to examiner reliability, which is the technical term used to describe the concern you raise. We have two different approaches to monitoring that examiners are marking to the correct standard, “seeding” and moderation. We are currently moving away from the moderation approach to what we call “dynamic sampling” which has many of the advantages of the “seeding” approach.
With “seeding” we ask the examiner to show that they have a good understanding of the correct standard by marking a number of “qualification scripts.” These have already been marked by the Principal Examiner (who sets the correct standard) and so we can compare their marks with those of the Principal to check they are accurate. They then start marking candidates' work, but every 10 scripts or so we give them a “seed” which the Principal Examiner has already marked, so that we can check they are still working to the correct standard. This means we can immediately stop any examiner we are concerned about.
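The seeding procedure described above can be sketched as a simple monitoring loop. This is a minimal illustration, not the IB's actual system: the tolerance value and the data shapes here are assumptions, invented for the sketch; only the idea of interleaved pre-marked seeds comes from the letter.

```python
def seed_ok(examiner_mark, principal_mark, tolerance=2):
    """An examiner 'passes' a seed if their mark is within `tolerance`
    of the mark the Principal Examiner already gave the same script.
    The tolerance of 2 is an assumed value for illustration."""
    return abs(examiner_mark - principal_mark) <= tolerance

def monitor_examiner(seed_results, tolerance=2):
    """seed_results: list of (examiner_mark, principal_mark) pairs, one
    per seed script encountered during marking. Returns the index of the
    first failed seed (the point at which the examiner would be stopped),
    or None if the examiner stayed on standard throughout."""
    for i, (examiner_mark, principal_mark) in enumerate(seed_results):
        if not seed_ok(examiner_mark, principal_mark, tolerance):
            return i  # stop this examiner immediately, per the letter
    return None
```

An examiner whose marks drift away from the Principal Examiner's is flagged at the first failing seed, which is what allows marking to be halted mid-session rather than corrected after the fact.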
For teacher marked internal assessment and a few essay based tasks, including the literature written assignment, we are currently using a moderation approach. This means that we provide examiners with clear guidance and examples of the marking standard but do not have a “qualification” process before they start. Instead we collect a sample of their marking and have it remarked by a senior examiner. By comparing the two sets of marks we decide whether we can accept the marks, need to remark all of the student scripts that they have done or whether a mathematical calculation can be made to bring them in line with the Principal Examiner.
As an example of the latter case, if we know that one examiner is always 5 marks more generous than the Principal Examiner, but is very consistent in always being that generous we would simply reduce their marks by that amount. This means that all candidates would get the same outcome whether marked by the “generous” examiner or the Principal Examiner.
If there is no clear pattern when we look at the examiner's “moderation sample” we will simply remark all of their work, and we do this every session. If, however, a particular examiner is consistently harsh or generous in their decisions, we can still get a reliable result by applying this correction factor. I agree that a better solution is to monitor the examiner as they carry out their marking (i.e. the seeding approach) to make sure they match the Principal Examiner's standard, and this is what dynamic sampling will allow us to do.
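The correction-factor logic in the two paragraphs above amounts to: if an examiner's offsets from the Principal Examiner are consistent, shift their marks by the average offset; if the offsets are erratic, remark everything. A minimal sketch, assuming an additive correction and an invented consistency threshold (the IB does not publish these details):

```python
from statistics import mean, pstdev

def correction_factor(examiner_marks, principal_marks, consistency_limit=1.0):
    """Compare an examiner's marks on the moderation sample with the senior
    examiner's remarks. If the examiner is consistently offset (low spread
    in the differences), return the additive correction to apply to all
    their marks; otherwise return None, meaning a full remark is needed.
    The consistency_limit threshold is an assumption for illustration."""
    offsets = [e - p for e, p in zip(examiner_marks, principal_marks)]
    if pstdev(offsets) <= consistency_limit:
        return -round(mean(offsets))  # e.g. always 5 too generous -> subtract 5
    return None
```

So an examiner who marked a sample [8, 9, 10] where the senior examiner gave [3, 4, 5] would get a factor of -5 applied to every candidate, matching the "5 marks more generous" example in the letter, while an erratic examiner yields None and triggers a remark.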
Neither approach can identify an occasion when an examiner marks one particular script incorrectly, but the rest of the scripts in line with our expectations. We do identify candidates who are receiving an overall grade which is radically different to their predicted grade and endeavour to have these reviewed by a senior examiner. We call this the “at risk” process and typically review candidate work which is two grades below their predicted grade. We also have the EUR process as a final opportunity for schools to bring such cases to our attention.
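The "at risk" safety net described above is essentially a filter over awarded versus predicted grades. A sketch under assumed data shapes (the two-grade threshold comes from the letter; everything else is illustrative):

```python
def at_risk(candidates, grade_gap=2):
    """Flag candidates whose awarded grade is `grade_gap` or more below
    their predicted grade, for review by a senior examiner (the letter's
    'at risk' process). candidates: list of (name, predicted, awarded)
    tuples, a shape assumed here for illustration."""
    return [name for name, predicted, awarded in candidates
            if predicted - awarded >= grade_gap]
```

This catches an overall collapse in a candidate's result, but, as the letter concedes, neither it nor seeding can spot a single script marked incorrectly by an otherwise reliable examiner.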
It is not appropriate for me to disclose the performance of specific examiners to you, and so I am not able to provide evidence to expand on the particular case you raise. However, I would reiterate that I agree with you that a 3 grade difference of opinion would be cause for concern and would justify further investigation. As you acknowledge, it is not appropriate for us to ask a teacher to mark their own students' work, but our experience is that even examiners who are very reliable markers find it difficult not to be influenced by their wider knowledge of the candidates when marking their own students. It is not uncommon for senior examiners to have a correction factor applied to their IA marks, sometimes to correct over-generous marking, sometimes to correct marks that are too harsh.
I hope this response will provide you with some confidence that the IB takes the reliability of marking very seriously and is continually monitoring and looking to improve. If you would like me to expand on any of these points, or explain part of the system further, I would be very happy to do so.
Dr Matt Glanville, Head of Assessment Principles and Practice International Baccalaureate Organization (UK) Limited
IB Assessment Centre
Peterson House, Malthouse Avenue, Cardiff Gate
Cardiff, CF23 8GL, United Kingdom
Tel: +44 29 2054 7818 | Fax: +44 29 2054 7703

1 comment:

  1. Thank you for posting this information. The IB coursework moderation is a strong negative, together with the impossibility of passing IB exams as an external student, making it one of the most costly diplomas. Today, there are alternative options to demonstrate the IB skills (and more), deep subject knowledge as well as breadth across many disciplines. Students can easily supplement their learning and demonstrate their attained skill level by turning to alternative exam boards. I used to be an IB supporter, but after my daughter nearly missed her Oxbridge offer when her coursework was downgraded in 2 subjects, I now recommend that parents not choose the IB if their students have the ambition to apply to UK universities.