EXAMS ANALYSED - why we have them, what they measure and how to succeed in them. Part 2

25 years ago, while studying for my M.Ed., I discovered that my background had contributed to my fascination with, and research into, exams. I had been carrying out teacher-led research for 16 years and had already discovered a great deal about exam success. It transpired that several key factors had been necessary to achieve this:
• Interested/skilled in Research & Statistical Analysis
• Keen to develop & measure effective learning & teaching
• Access to a large cohort of 11-18 year old students
• Taught P.E. and Chemistry & Science at O & A Level
• Knew the school & environment as student and teacher
• Able to empathise from the student perspective

I explained in Part 1 that my fascination with, and research into, exams was initiated by my mysterious success in them. Consequently, when I began teaching in 1974 I immediately began experimenting and researching with my students to discover what exams actually measure and how students could be helped to succeed in them. Having numerous students over many years able to assist in this research proved invaluable, as did having access to exam papers and marking schemes in many different subjects and at many levels. I was fortunate in having many colleagues who were willing to provide me with exam papers and mark schemes. I also worked for several exam boards as an examiner to research how exam papers were marked, discovering how rigidly and strictly the marking scheme was adhered to; the boards claimed this was essential if there was to be consistency.

The 6 Exam Skills

By the start of the eighties, the findings of these experiments and research had uncovered 6 key skills (or activities) that were central to all written exams.


This was an experience that even in the seventies was quite rare for most students; in the decades since then it has become extremely rare. The numerous discussions and surveys that I was able to carry out in school revealed that few students spent any time doing it at home (or in school), and clearly most found it difficult and struggled with it. Most students said that ‘silence’ was very rare in their lives and found it extremely uncomfortable.
To highlight the need to practise being ‘UNDER EXAM CONDITIONS’, I created posters similar to these and shared their message with students, staff and parents.


Assessment for Learning

Discovering these 6 exam skills (or activities) immediately had a huge effect on the older (O and A Level) students, as they were able, simply and accurately, to identify why they were ‘losing marks’ and which exam skills to practise. Crucially, this also meant they were able to ‘take control of their exam preparation’, and we stopped calling it ‘revision’ as that term now seemed inappropriate.
Furthermore, these 6 exam skills helped me to continue to focus on developing the students’ skills, using the exam papers and mark schemes as opportunities and resources to do so. In fact, the feedback from the students using these 6 exam skills helped to gradually develop five key questions that we realised could be used to analyse ALL WRITTEN EXAMS.


With so many students using this analysis I had a great deal of feedback and evidence on what exams were actually assessing and how I could more effectively help students (anyone actually) identify the difficulties that really needed to be overcome to achieve exam success.
In 1998, Paul Black and Dylan Wiliam published the booklet “Inside the Black Box”, which highlighted that students who learn using assessment for learning achieve significantly better results than matched control groups receiving normal teaching, prompting a huge emphasis on ‘Assessment for Learning’ in UK schools.
Until this point I had not realised that I had been using ‘Assessment for Learning’ throughout my teaching career. Basically:-
“‘Assessment for Learning’ (AfL) refers to activities undertaken by teachers, and by their students in assessing themselves, which provide information to be used as feedback to modify the teaching and learning activities in which they are engaged.”
The UK Assessment Reform Group (1999) identified ‘the big 5 principles of assessment for learning’:
1. The provision of effective feedback to students.
2. The active involvement of students in their own learning.
3. Adjusting teaching to take account of the results of assessment.
4. Recognition of the profound influence assessment has on the motivation and self-esteem of pupils, both of which are critical influences on learning.
5. The need for students to be able to assess themselves and understand how to improve.
Since, as a student, I had struggled to cope with most teachers in school, I had learnt most effectively outside lessons or in sport, using my own analysis, assessment and practice. It appears that ‘Assessment for Learning’ was central to my approach to teaching right from the start, in science and sport, simply because my own experience, thinking and research had discovered its effectiveness.

The Five Learning Requirements

My experiments in the seventies, when teaching science, chemistry and sport, provided me with a vast amount of invaluable feedback and learning, transforming my self-awareness and my understanding of the skills needed to succeed and of teacher effectiveness. I was frequently able to study and explore (experiment with) the key factors involved in the learning of my students (and myself) in lessons and in sport. Eventually I postulated that there were 5 key factors central to effective learning, which I called ‘The 5 Learning Requirements’.


By the early eighties, following my incessant experimenting and scrutiny, ‘The 5 Learning Requirements’ had become central to all my teaching and learning, with this poster being displayed and shared with students, staff and parents. I was determined to analyse the key factors in learning so that I could improve the learning of the students, and I integrated the 6 exam skills with the 5 learning requirements to construct a questionnaire to explore the most important learning difficulties encountered by students.


The evidence from hundreds of students, together with many discussions and interviews was invaluable and extremely revealing, providing a clear conclusion:

‘Having the motivation (self-discipline) to prioritise their study to overcome their difficulties’, was easily the greatest difficulty.

In recent years these factors have been linked with Mindsets (Carol Dweck), Grit (Angela Duckworth), non-cognitive skills (James Heckman), self-regulation and self-control (Walter Mischel).

Measuring Exam Success

I began this blog by pointing out that my success in exams was a huge mystery to me but radically changed my life. I considered passing the 11+ to be a success because my family knew of nobody who had achieved this before. When I got my O Level results I almost fainted with shock (fortunately I was in our kitchen with my family) because I’d passed so many more than anyone expected. However, these apparent successes were very personal and related to my background and environment, and had little statistical validity. In the seventies, when I began teaching, there seemed to be little collation and comparison of exam results. I felt that if I was to discover how to help my students achieve exam success, I first needed to discover a way of measuring it.

Year on Year

When I began teaching in the seventies, my Head of Department in science had used a very simple measure for many years: comparing the number of students who got O Level passes (grades 1-6 in those days) and A Level grades A to E. The following table shows that this approach had been the traditional one since O and A Levels were first introduced. In essence, it relies on the cohorts being much the same each year, and in the grammar schools they probably were. However, by the seventies comprehensive schools had arrived and the cohorts were varying a great deal. Using this simple approach, there was little change in numbers between 1974 and 1979.



When I became Head of Chemistry in 1978 I decided to introduce another measure of exam success by comparing the students’ grades in each of the different subjects. I felt this might allow for the variation in cohorts.


This simple extract shows how this analysis compared O level results between departments, which did help me to illustrate how students in my department were achieving better exam success than others. However, it does not really provide an analysis of the relative exam success of each student.
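To make the idea concrete, the inter-departmental comparison can be sketched in a few lines of code. This is a minimal illustration only: the grade-point mapping, subject names and results below are invented for the example, not taken from the original tables.

```python
# Hypothetical illustration of comparing exam success between
# departments by averaging students' grade points. All data invented.

GRADE_POINTS = {"A": 5, "B": 4, "C": 3, "D": 2, "E": 1, "U": 0}

# Each record: (student, subject/department, grade)
results = [
    ("Ann", "Chemistry", "B"), ("Ann", "History", "C"),
    ("Bob", "Chemistry", "A"), ("Bob", "History", "D"),
    ("Cat", "Chemistry", "C"), ("Cat", "History", "C"),
]

def department_averages(records):
    """Return the mean grade points achieved in each department."""
    totals = {}
    for _student, dept, grade in records:
        pts, n = totals.get(dept, (0, 0))
        totals[dept] = (pts + GRADE_POINTS[grade], n + 1)
    return {dept: pts / n for dept, (pts, n) in totals.items()}

print(department_averages(results))
# Chemistry averages 4.0 points and History about 2.67, so the
# department comparison is visible -- but, as noted above, it says
# nothing about the relative exam success of each student.
```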


By the time I began my Dip.Ed. in 1984, the 5 Learning Requirements, the 6 Exam Skills and the Marks Lost In Exams Analysis were all well established, and it had become clear that exam success was due more to the skills (or qualities, as some preferred to call them) of the student. Therefore I was determined to create an analysis that would focus on measuring the exam success of each student.
I had measured my own exam success by comparing my ‘passes’ with those of the people around me. So when I passed the 11+ and we knew of nobody who had achieved that, I considered myself to be successful. When I passed 8 out of 9 subjects at O Level (English Language was my only ‘failure’, which I passed in the resit), it was more than almost anyone else in our school year, so again I felt I was successful. From numerous discussions with students, parents and staff it appeared that virtually everyone used this simple (crude) measure, so I decided that I should begin collecting and collating these ‘student-centred’ measures of exam success.
As shown by the table of Examination Results, England and Wales, since the introduction of GCE O Levels the benchmark of 5 O Level passes was used (the grades varied over the years and between exam boards). When GCSEs replaced O Levels and CSEs in the late eighties, this benchmark was continued, with grades C and above being referred to as ‘passes’ despite the term ‘FAIL’ not appearing anywhere. By the time the Government introduced ‘Exam League Tables’ in 1991, I had spent over a decade analysing exam results; the availability of, and emphasis on, exam results helped me greatly in my data collection and analysis.
I completed my M.Ed. in Research and Evaluation in 1992, by which time, I had sufficient evidence to illustrate that in most schools:-



Throughout the nineties I was able to collect, analyse, compare and predict results for not only my school but also others.

It became very obvious that statistics and data analysis were not a strength for many people (including teachers), so as the data and analyses grew I experimented with a variety of ways of displaying them, trying to simplify them.
I created activities, quizzes, discussions and INSET using the data and the evidence from it. My Maths, Science and PSHE students regularly took part in these activities, which prompted requests from parents for presentations aimed at them. Here is an example of one of these activities:


If a typical tutor group contains 31 students, attempt to estimate (guess) the number of students that are likely (on average) to attain:-


Gradually through the nineties, schools (initially secondaries, then primaries) began trying to analyse exam results; all used either the year-on-year measures or inter-departmental (or inter-school) comparisons. Whenever I was given an opportunity to explain the student-centred analysis, the recipient was impressed.

Measuring Student Improvement

If you’ve read my blog https://succesfeelosophy.wordpress.com/now-i-understand-how-to-close-the-disadvantage-gap/ you will be aware that in the late seventies I introduced a phrase (now frequently used), “It’s a marathon not a sprint”, to explain why short-term measures are of very limited use and why it is important to measure frequently only the things that really matter in the long term. William Edwards Deming, ‘the father of quality’, emphasised a similar approach. Therefore, once my research had uncovered the Five Learning Requirements and Six Exam Skills at the start of the eighties, I began researching how to measure the skills that matter as regularly, simply and effectively as possible.
As a P.E. teacher and scientist, I had spent many years analysing, measuring and developing the key skills needed to succeed in various sports; I therefore applied this same approach to all my lessons and students. Consequently, I experimented repeatedly with activities, tests and quizzes, and studied students very carefully throughout the eighties to research how to measure their progress, the following report being an example.

In 1987, I was able to extend this research a great deal when I became Head of Year/Director of Learning for 250+ students. This meant I had a large cohort for 5 years on which I could experiment and research. Since there were no SATs in the eighties (Key Stage 2 SATs were introduced in 1994), I used cognitive abilities test (CAT) scores as a (crude) measure of exam skills, alongside reading age at 11. I had access to all 250+ students’ end-of-year exam scores in every subject for each year, through to their mock GCSEs and final GCSE grades. I used all this data to compile and analyse my students’ progress.
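The essence of this progress analysis can be sketched in code. This is a hypothetical illustration, not the original analysis: the baseline bands, point totals and students are invented. The idea is simply to compare each student's final outcome with the average achieved by students who started from the same baseline (here, a CAT score band at 11).

```python
# Value-added sketch: measure progress relative to students with the
# same starting point, rather than by raw results. All data invented.

from collections import defaultdict

# Each record: (student, CAT band at age 11, total GCSE points)
cohort = [
    ("S1", "low", 28), ("S2", "low", 34),
    ("S3", "mid", 40), ("S4", "mid", 46),
    ("S5", "high", 52), ("S6", "high", 58),
]

def progress_scores(records):
    """Each student's points minus the average for their starting band.

    Positive = more progress than average for that starting point."""
    by_band = defaultdict(list)
    for _name, band, points in records:
        by_band[band].append(points)
    band_avg = {b: sum(p) / len(p) for b, p in by_band.items()}
    return {name: points - band_avg[band] for name, band, points in records}

print(progress_scores(cohort))
# S1 scores -3.0 and S2 scores +3.0 against the 'low' band average of
# 31: S2 made more progress even though S5 and S6 got higher raw totals.
```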

Progress 8 –“measuring exam success”

“Progress 8 will be introduced for all schools in 2016 (based on 2016 exam results, with the Progress 8 score showing in performance tables published in late 2016/early 2017).
The Progress 8 measure is designed to encourage schools to offer a broad and balanced curriculum at KS4, and reward schools for the teaching of all their pupils. The new measure will be based on students’ progress measured across eight subjects:
From 2016, the floor standard will be based on schools’ results on the Progress 8 measure.” (DfE)

I think the similarity between Progress 8 and my attempt to measure the progress of my students in the nineties is obvious. However, I did not have the measurements of academic and exam achievement that are now available, although our objectives were the same. It is important to appreciate that these are only measurements of progress in exam success. With such comprehensive longitudinal data on so many students (I repeated it for my next cohort, another 250+ students) I had clear evidence from this extensive statistical analysis and research. It provided me with some clear conclusions that led to the creation of S.U.P.E.R.learning for success, which is explained in Part 3 (focusing on effective development of exam skills).
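For readers unfamiliar with Progress 8, here is a deliberately simplified sketch of the idea, not the official DfE calculation (which has detailed rules about qualifying subjects and uses national prior-attainment estimates): Attainment 8 sums points over eight subject slots with English and maths double-weighted, and Progress 8 compares a pupil's total with the expected total for pupils with the same KS2 prior attainment, divided by 10. All numbers below are invented.

```python
# Simplified Progress 8 sketch (illustrative only, not the DfE method).

def attainment8(english, maths, other_six):
    """english/maths: point scores; other_six: six other subject scores.

    English and maths are double-weighted in the official measure."""
    assert len(other_six) == 6
    return 2 * english + 2 * maths + sum(other_six)

def progress8(pupil_a8, expected_a8_for_prior_band):
    # Dividing by 10 (the number of weighted slots) makes the score read
    # roughly as 'grades above/below expectation per subject slot'.
    return (pupil_a8 - expected_a8_for_prior_band) / 10

a8 = attainment8(english=6, maths=5, other_six=[5, 5, 4, 4, 4, 4])
print(a8)                  # 48
print(progress8(a8, 45))   # 0.3: about a third of a grade above expectation
```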

EXAMS MEASURE the student’s ability to:
• CONCENTRATE on exam papers on their own, in complete silence, without a break
• READ exam papers
• UNDERSTAND the information and the questions in exam papers
• RECALL the information needed in exams
• Realise which information is needed to MATCH the question
• WRITE the answers clearly enough to match the mark scheme

Development and application of these SIX EXAM SKILLS depends on:
• Organising and planning exam preparation
• An environment at home to study without distractions
• Friends who have a positive attitude to study
• Having the motivation (self-discipline) to prioritise study to overcome the difficulties

