
Secondary Mathematics Subject Leader In-Depth Study Module 6

Created on 01 April 2010 by ncetm_administrator
Updated on 09 April 2013 by ncetm_administrator

Mathematics Subject Leader In-Depth Study Module 6 (2 hours)

Use pupil data (attainment, feedback etc.) in order to adapt current departmental practice to help bring about improvements

By studying this module you will consider ideas about what it means to use data to identify strengths and weaknesses so that you know where to adapt current departmental practice to help bring about improvements. In particular you will come to know more about:

  • what data is available nationally, locally and in school;
  • what use data is;
  • what data cannot be used to do;
  • sampling;
  • using data to explain, justify and raise questions;
  • answering the questions raised by data;
  • adapting departmental practice to improve learning.

The sections can be studied in the order presented here, or you can click on a section below to go straight to one that particularly interests you.

  1. Getting the terminology right (5 mins)
  2. What data is available? (30 mins)
  3. What use is data? (30 mins)
  4. Developing understanding of sampling (10 mins)
  5. Raising and answering questions (20 mins)
  6. Adapting departmental policy (15 mins)
  7. Reviewing the module (10 mins)


1. Getting the terminology right

There are many terms used to describe ideas surrounding assessment and the data that results from it, and the terms are often sources of confusion. Throughout this module, therefore, we shall use the following terminology:

Attainment is raw scores in national tests (SATs) or exams; “standards” usually means attainment.

Achievement is the progress that students make over a period of time. This can be informal, as when a teacher says someone has made ‘good achievement’, meaning their attainment has significantly increased over a certain period of time. Or it can be formal, as when measured by Value Added (VA), i.e. the improvement in attainment made from one Key Stage to another.
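The distinction between attainment and achievement can be made concrete with a small sketch. This is illustrative only: the level values and the expected-progress figure are invented, not the national tariff.

```python
# Illustrative sketch: levels and expected progress are invented values,
# not the national tariff.
ks2_level = {"Asha": 4, "Ben": 5, "Cara": 3}   # attainment at KS2
ks3_level = {"Asha": 6, "Ben": 6, "Cara": 4}   # attainment at KS3

EXPECTED_PROGRESS = 1.5  # hypothetical expected gain from KS2 to KS3

for pupil in ks2_level:
    attainment = ks3_level[pupil]                      # where the pupil is now
    achievement = ks3_level[pupil] - ks2_level[pupil]  # progress made
    value_added = achievement - EXPECTED_PROGRESS      # progress vs expectation
    print(pupil, attainment, achievement, value_added)
```

On this reading, Ben ties for the highest attainment yet shows less progress than Asha: attainment and achievement measure different things.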


2. What data is available?

The data that is available to subject leaders can seem confusing. What is available? How do the different types compare? How do I use data to improve my department? In this section we will consider what is available, compare some of the sources, and look at some of the issues in interpreting data and using data in the school context. Click the title of the area that you would like to study:

The different sources of data

Fischer Family Trust (FFT), RAISEonline, Local Authority (LA) data, Learner Achievement Tracker (LAT) and commercial schemes all have the same purpose. Data is provided that enables subject leaders to identify which students or groups of students are underachieving or performing better than expected. Schools can then use the data to:

  1. identify which groups of students need additional support in order to catch up;
  2. identify whether or not the strategies in place are effective, i.e. by monitoring the progress of students who are receiving additional support. In other words, data can measure the impact of what the department is doing to raise achievement;
  3. hold schools and subject departments accountable to external agencies, including Governors, OFSTED, the LA, parents and students.

There are five main sources of data available to schools in England:


RAISEonline includes data from Key Stages 2, 3 and 4, and provides analysis for KS2 → KS3 and KS3 → KS4 value added.

RAISEonline data contains attainment and achievement information at subject and student level for each school. A wide range of subject graphs, tables and charts is provided. It can be accessed through the website above, but teachers need a password to enter the site.

The school RAISEonline administrator should generate individual passwords for each member of staff so that when they leave the school, their passwords can be deleted. This is a data protection issue because RAISEonline contains pupil-level data. Subject Leaders must be aware of data protection issues when passing on passwords. Who should see the data on individual students, and for what purpose? For example, do governors need to see pupil-level data?

Some examples of the screens that are available from RAISEonline follow.
1. The RAISEonline Exceptions report from KS2 to KS3 provides useful Contextual Value Added data for different groups of students – see below.


The RAISEonline exceptions report above indicates that the progress made by students in maths is good overall, with a CVA in 2007 of 100.9 shaded green (circled). This green shading means achievements are significantly above the national average. The progress made by lower attaining students, those on FSM and those whose first language is not English (all identified in a rectangle) is given no shading, and is therefore satisfactory. If any groups were shaded blue, their progress would be significantly below what was expected of them. As most groups are shaded green in 2007, and overall in two of the last three years progress is significantly above average, this looks like a good department! There is a similar report in RAISEonline for GCSE value added.

Both Fischer Family Trust and RAISEonline reports make use of upwards or downwards arrows. These indicate whether or not value added is significantly improving over time or significantly falling over time.

2. The RAISEonline KS3 attainment report highlights the overall attainment of groups of students. A second page (not shown) provides attainment figures of different minority ethnic communities. When judging overall attainment OFSTED use the Average Points Score (APS), not %L5+.
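The difference between APS and %L5+ is easy to see in a short sketch. This is a sketch only: the pupil levels are invented, and the points-per-level mapping shown is illustrative rather than the official tariff.

```python
# Illustrative points mapping (the official tariff may differ).
points_for_level = {3: 21, 4: 27, 5: 33, 6: 39, 7: 45}

pupil_levels = [5, 5, 6, 4, 7, 5, 6]  # invented KS3 results for a class

aps = sum(points_for_level[lvl] for lvl in pupil_levels) / len(pupil_levels)
pct_l5_plus = 100 * sum(1 for lvl in pupil_levels if lvl >= 5) / len(pupil_levels)

print(round(aps, 1), round(pct_l5_plus, 1))
```

Note that if the Level 6 pupils here moved up to Level 7, APS would rise but %L5+ would not change, which is why APS captures overall attainment better than a threshold measure.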

3. The RAISEonline mathematics scatter plot (from the same school as in the tables above) enables subject leaders to form a view about the achievements of individual students and the department generally.

This graph is evidence of good achievement because there are many students performing in the top 10% nationally and an even greater number in the top 25% nationally. Lower attainers (circled) perform less well as indicated by very few of them achieving above the median line and a significant percentage in the bottom 25%. This relatively weaker performance of lower attaining students is also highlighted in the RAISEonline exceptions report shown earlier.

When this graph is accessed, teachers can select any dot or square and the details about that student will be displayed.

Fischer Family Trust (FFT)

Fischer Family Trust data covers KS2–3, KS3–4 and KS4–5.

Most Local Authorities (LAs) make Fischer Family Trust data available to their schools, although some do not. It is available in three parts:

  1. A pupil-level database provided by FFT to the LA, which in turn usually passes it to the school either electronically or, more commonly, on a DVD or CD. The school’s Data Manager or a senior member of staff is likely to have access to this database.
  2. FFT online is a new development and is constantly under review. This provides subject value added information at KS3, GCSE and post 16. It can be accessed online but teachers need a password to enter this site. Passwords are given to schools by the LA. A member of the school’s SMT is likely to have this password.
  3. FFT provides LAs with an FFT Self Review Document that greatly assists schools with judging their achievements in mathematics. LAs usually provide this document to their schools.

The FFT database and Self Review Document are provided to the school separately, sometimes over a secure internet site but more usually on a CD. A member of the school’s Leadership Team should be able to give you access to this information.

The FFT live website needs a password that is provided by the LA. Some LAs do not give these passwords to schools, some provide individual school passwords, and some provide a generic password so that schools can view the data of other schools. If no one in the school appears to have a password, you should consider requesting one from the LA data team.
This FFT report is for the same school as used to illustrate RAISEonline at KS3.

You will notice that, overall, the picture also conveys good achievements in mathematics. Notice also that the data combines students’ progress over the last three years, i.e. 2005-2007. In addition, more detail is provided about different groups of students, e.g. the three different ability groups are identified by gender in FFT but not in RAISEonline.


Learner Achievement Tracker (LAT)

Post-16 data only

The Learner Achievement Tracker (LAT) provided by the Learning and Skills Council (LSC) contains Value Added graphs and tables for schools for every post-16 subject including mathematics. Since March 2007, the LAT software has replaced the value added section contained within the sixth form PANDA report. The LAT can be accessed through the website, but teachers will need a password to enter the site. The password was sent to the Headteacher a long time ago and in many schools, this password has gone missing. If this is the case in your school, you need to contact the local LSC for a new one.

The Learner Achievement Tracker produced by the LSC contains valuable information about post-16 value added. These graphs and tables (same school as above) highlight the achievements of students in mathematics.

The achievements of students in mathematics in this exam year are close to inadequate because, when taking account of their prior GCSE attainment, half the students make very slow progress and all but one student is well below the median line.

The overall value added line for the school is very close to the outer edge of the accepted range (in blue).

The overall subject graph (for the same school as above) confirms overall value added in maths needs improving because the score is close to the limit of what should reasonably be expected of those students. However, overall summary judgements based on the results of a small number of students in one year should not be considered without looking at patterns of achievement over time or how students are progressing currently in class. Scrutiny of current work and lesson observation findings will help confirm your judgement.

Local Authority data for school improvement

Most Local Authorities have data teams that provide schools with data to support school improvement. This is usually given electronically in the form of a school profile or set of tables and charts. The Headteacher or member of the SMT will know how to access this information. At KS3 and GCSE this will normally include details about attainment and achievements in mathematics since it is a core subject.

LA data can place the school in the context of local conditions, e.g. Building Schools for the Future restructuring all secondary schools, or the LA itself being in special measures and offering poor support to schools.

The LA also has access to the school’s mathematics KS3 to GCSE Value Added graph with all students plotted on the graph, overall and by gender. This would generally only be available on request. Teachers should ask their LA for the “NCER report VA34A for maths”.

The LA also has post 16 subject value added graphs that show the progress made by individual students. This graph (from the same school as shown in the LAT) plots students on a national value added graph. About 50% of students will lie between the 2 dotted lines, 25% below the lower dotted line and 25% above the upper dotted line. The thick black line in the middle (median line) represents broadly average progress. In this school most students are at or below the lower quartile confirming low achievement.

Commercially available data providers

There are a number of commercial agencies that provide a data service. Some of the more common ones are:

NFER offers reading assessments and also CATs assessments.

Issues in interpreting data sources

When deciding on what improvements are required, mathematics departments should use the full range of data available to them. Data should be used to make judgements and draw conclusions to identify which groups of students are doing well and which ones are falling behind.

Differences between RAISEonline and FFT

RAISEonline and FFT provide similar data – both refer to attainment and achievement levels at both departmental and student level. FFT provides contextual value added, as does RAISEonline, but they use different methodologies. The main differences are in the way socio-economic circumstances for each pupil are identified. In the vast majority of schools, RAISEonline and FFT identify similar strengths and weaknesses within mathematics.

Sometimes additional important information can be extracted from FFT. For example, FFT provides data and information about value added in maths over a three-year period combined, whereas RAISEonline separates each of the last three years. FFT also provides more detailed information about the value added of different groups of children by gender and prior attainment group. RAISEonline, on the other hand, provides detailed information about the proportion of students making two levels of progress in mathematics at both KS3 and GCSE; this data is not available in FFT. These relatively new indicators, used mainly by the national strategy, are likely to become part of the DCSF performance tables in 2009 or later.

Both RAISEonline and FFT provide estimates at subject and student level for future performance at both KS3 and GCSE. Both are useful to help teachers and students set challenging targets. RAISEonline provides estimates at different levels for the top 75% of schools, top 50%, top 25% and top 10%. FFT also provides estimates at different levels: FFT B estimates are based on the previous performance of similar schools, whereas FFT D estimates are based on the performance of the top 25% of similar schools. Very broadly speaking, FFT D is similar to the RAISEonline top 50% estimate. Departments that usually attain figures above FFT D estimates should consider using RAISEonline top 25% or top 10% estimates, as these are more challenging. The DCSF has instructed all LAs to encourage schools to use FFT D estimates as a minimum aspiration.

OFSTED and all that

RAISEonline usually contains government statistics, although changes made due to appeals and corrections are slow to be reflected in RAISEonline. An OFSTED inspection team will use RAISEonline statistics, judgements from the LAT and the school’s own Self Evaluation Form (SEF) to form the basis of their initial judgements. Initial judgements are reported to the school before an inspection in a report called a Pre-Inspection Briefing (PIB). Here, mathematics attainment and value added are considered and judgements are formed. At this early stage, areas for further investigation during the inspection are identified. Prior to any inspection, the OFSTED team does not have access to FFT or LA data. If the school believes FFT or in-school analysis provides a more accurate reflection of the strengths and weaknesses of the department, then this should be referred to and made clear in the school or departmental SEF. If there are anomalies between RAISEonline and FFT, the department should attempt to identify why this has happened, e.g. RAISE might only be able to match 85% of students’ GCSE results to their KS3 results whereas FFT has matched 93%!

Data in School

This section begins to answer some of the questions that Subject Leaders come up against. What data might I find in school? What data have other departments found useful? Do I need to generate data?

What data is there in school?

New subject leaders are likely to find that someone in the Leadership Team (LT) has carried out a subject analysis of mathematics performance at KS3 and GCSE using a range of data including RAISEonline, FFT and internal data. Internal tracking data is becoming more sophisticated as IT systems improve. Typically this will be a student tracking system that uses the school’s management information system or an Excel spreadsheet. Tracking usually monitors the progress of students towards their targets. Targets can be set annually or by Key Stage. Teachers generally identify current levels of attainment, predicted levels of attainment, or whether or not a target will be met. In any event, the software usually produces a list of students identified as underachieving. The aim of data analysis is for the Leadership Team, Subject Leader and class teachers to identify underachieving students so that support can be put in place to allow them to catch up.
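The flagging logic in such tracking systems is usually simple. The sketch below shows one possible rule; the names, numeric sub-level encodings and tolerance are all invented for illustration.

```python
# Invented records: (name, target sub-level, current sub-level).
# Sub-levels are encoded as numbers so they can be compared.
tracking = [
    ("Dev",   5.3, 5.6),
    ("Erin",  6.0, 5.4),
    ("Farid", 4.7, 4.7),
    ("Gita",  5.7, 5.0),
]

TOLERANCE = 0.3  # hypothetical: how far below target counts as underachieving

underachieving = [name for name, target, current in tracking
                  if current < target - TOLERANCE]
print(underachieving)  # the list passed on for catch-up support
```

A real system would draw the same records from the school’s management information system or spreadsheet rather than a hard-coded list, but the comparison of current position against target is the core of it.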

What data do you collect and store?

The Subject Leader should make themselves aware of any analysis of the strengths and weaknesses of the department made over the last 3-5 years. This should include old OFSTED or Local Authority monitoring of the mathematics department. This will enable the Subject Leader to make more accurate judgements about improvements over time. A departmental file should be kept on standards and achievements, as well as any lesson monitoring information. It is sensible to keep information on students whilst they are in the school. When they have left, it would be appropriate to destroy their records in an appropriate manner, being mindful of data protection issues. At any point in time a Subject Leader should be able to refer to current records to answer questions about:
  • the current attainment of a particular student
  • the progress made by a particular student
  • the target grade/levels of each student for the end of the year or key stage
  • whether or not a student is performing above or below expectations
  • any particular strategy that a student is involved in e.g. attending after school revision or Saturday morning clubs.

Task - Review your documentation

What data is available in school? Do you have access to all of it? If not, ask.

What data are you holding in the department?

Check through and discard any data held unnecessarily, i.e. held for longer than five years, or for pupils who have left the school.

Does the data that you hold enable you to answer these five questions?

  1. What is the current attainment of a particular student?
  2. How much progress has been made by a particular student?
  3. What is the target grade/level of each student for the end of the year or key stage?
  4. Is a particular student performing above or below expectations?
  5. Is a student involved in any particular strategy for support or enrichment e.g. attending after school revision or Saturday morning clubs?

If it doesn’t, talk to your Leadership Team or to your Local Authority Data Team in order to obtain advice on improvements.


3. What use is data?

Many people involved in school improvement have highlighted the use of data over many years. In this section we will discuss how the various sources of data can be used in school to help improve the results in a mathematics department. It is important to bear in mind that the idea of analysing data is to make sure that each and every student is achieving at the best rate they can, and to pick up any problems and issues before they have a serious effect on students’ life chances. Click the title of the area that you would like to study:

What can you use data to do?

At subject level, data will indicate the strengths and weaknesses of the department. This will enable the Subject Leader to identify strategies for improvement and direct resources more effectively. For example, Learning Mentors or teaching assistants can be directed more to the area of need. Departmental development plans will be more sharply focussed on need and more directly linked to outcomes.

At student level, regular tracking data can indicate whether or not a student is falling behind. Catch up strategies can then be put in place. Tracking data can also be used to measure the impact of strategies that have been put in place.

Tracking data must not be confused with day-to-day assessment information that tells the teacher whether or not the student has made gains in mathematical knowledge, skills or understanding on a daily basis. Good use of this day-to-day assessment information will improve standards at a faster rate than any other strategy!

Data will not:

  • always be ‘right’: check and double check;
  • offer solutions or answers to questions;
  • take account of what is happening around the school e.g. significant refurbishment, staff absences or resignations.

Task - Look at the data you have and how you use it

Does it show how the students in your department are progressing?

Does it show what strategies are already in place to support pupils ‘at risk’ of underachieving? Don’t forget pupils are at risk of underachieving at all levels including those who gain high results in mathematics.

How often do you review the data – is it often enough to pick up students who are beginning to underachieve?

Consider how you can improve your tracking records. 

Is one data source more informative than another?

In about 95% of schools, FFT and RAISEonline judgements are not significantly different so using both data sets is appropriate for the majority of cases.

Tracking data is normally based on teachers’ own assessments. These can be determined from a range of activities and questions, and from a wider range of contexts than exists within tests. This data is therefore very important in determining standards and achievement between key stages. Test and examination data measure the attainment of students at a point in time – in a different week the attainment results may differ.

The data is only ever as good as the test, so care will be needed if you are basing judgements on tests that have been constructed in school. Experienced teachers’ day-to-day assessment of progress will often be more informative of real progress, especially if the department uses a moderation process to ensure that all are making similar judgements.

Task – how can you ensure that all your teachers are making similar judgements?

Which of the following might be useful in your department to ensure all teachers are making similar judgements?

  • a departmental scrutiny of work done during a four-week topic, agreeing together the level attained by particular students.
  • double marking of a test piece of work or exam followed by discussion of discrepancies.
  • discussion of the level shown in portfolios of work produced by a selection of students over time.

Consider how you can be as sure as possible that everyone in your department has a similar notion of what constitutes each level of attainment.

How can internal data be made really useful?

Internal data is used most effectively when:

  • maths teachers meet regularly to moderate work so that they have a good understanding of standards.
  • students are involved in discussing and understanding their own standards and in setting their own targets for improvement.
  • clear lines of accountability exist, so that Subject Leaders regularly talk to teachers about the students, members of the LT talk to Subject Leaders about the students, and the Headteacher talks to LT members about the students. These conversations are held in a supportive way so that all concerned feel supported in the move to raise attainment.
  • Subject Leaders and LT members are trained in the interpretation and the use and limitations of data.
  • the data identifies weaknesses in the students’ grasp of aspects of the subject so that the teacher can remedy the situation. Item level analysis from RAISEonline will support this process.

Task - Identifying problems and doing something about it

When the data indicates that some students are not making the progress that might have been expected of them, it will not tell you what to do about it! However, with careful scrutiny you should be able to discover if it is a problem with:

  • individual students themselves;
  • a teacher or teachers within the department; or
  • something that you can do nothing about.

How support is offered will depend on which category the problem lies within. Remember there are problems you can do nothing about, e.g. if you have tried all possibilities to recruit a new teacher and failed to secure adequate staffing, then teaching for some of your students will be less than ideal. However, there may be ways to support such groups: using a qualified teacher for one lesson a week and an unqualified teacher for the other two, or making use of your most experienced teaching assistant, are possibilities.

  • Consider what you are going to do to help support the underachieving students.
  • Ensure that your decisions are recorded so that you can use tracking data to decide which strategy makes most difference for your pupils in your school.


4. Developing understanding of sampling

All data are the result of sampling in some way. When discussing data it is important to understand something about sampling and the bias and error that can creep in.
  • The National Key Stage Tests examine a sample of the curriculum and attempt to change that sample year on year, although some aspects will appear every year. School constructed tests will, possibly, take less account of this issue than the NAA does in setting the tests.
  • The national key stage tests and other department tests attempt to examine the whole of the population in school but will only examine the pupils who attend on a particular day. If there are absentees on the day for whatever reason then the people who take the test will be a sample of the whole population.
  • Tests also occur on a certain day at a certain time; fire drills, accidents and other dramatic or traumatic incidents can make the sample time unrepresentative of the pupils’ normal school experience.

Considerations such as the ones above can introduce bias and error.

Bias and Error

The sample that is taken can introduce bias and error. A sample is expected to mirror the population from which it comes; however, there is no guarantee that any sample will be precisely representative of the population. Chance may dictate that a disproportionate number of untypical observations will be made. In practice, it is rarely known when a sample is unrepresentative.

The more dangerous error is the less obvious sampling error, against which there is very little protection. An example would be a sample in which the average height of a population is overstated by an inch or two because the sample contained a greater proportion of tall people than the population itself. It is this unobvious error that should be of most concern.

There are two basic causes of sampling error.

One is chance: that is, the error that occurs just because of bad luck. This may result in untypical choices. Unusual units in a population do exist and there is always a possibility that an abnormally large number of them will be chosen. The main protection against this kind of error is to use a large enough sample.

The second cause of sampling error is sampling bias. Sampling bias is a tendency to favour the selection of units that have particular characteristics, and is usually the result of a poor sampling plan. The most notable example is the bias of non-response, when for some reason some units have no chance of appearing. Suppose, for example, that you would like to know the average income of a community and decide to select a sample using telephone numbers, in a locality where only the rich and middle-class households have telephone lines. You will end up with an overstated average income, which will lead to the wrong policy decisions.

A non-sampling error is an error that results solely from the manner in which the observations are made. The simplest example of non-sampling error is inaccurate measurement due to malfunctioning instruments or poor procedures. For example, consider the observation of human weights. If persons are asked to state their own weights, no two answers will be of equal reliability. An individual’s weight fluctuates diurnally by several pounds, so the time of weighing will affect the answer. Responses will therefore not be of comparable validity unless all persons are weighed under the same circumstances.
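Both chance error and sampling bias can be demonstrated with a short simulation. Everything here is invented (a population of 1,000 “household incomes”); the point is that chance error shrinks as the sample grows, while the bias from a poor sampling plan does not.

```python
import random
import statistics

random.seed(0)

# Invented population: 800 modest incomes and 200 high incomes.
population = ([20_000 + random.gauss(0, 3_000) for _ in range(800)]
              + [60_000 + random.gauss(0, 8_000) for _ in range(200)])
true_mean = statistics.mean(population)

# Chance (sampling) error: larger random samples give smaller errors.
for n in (10, 100, 500):
    sample = random.sample(population, n)
    print(n, round(statistics.mean(sample) - true_mean))

# Sampling bias: only "households with telephones" (here, the richest 300)
# can ever be selected, so a larger sample cannot remove the bias.
with_phones = sorted(population)[-300:]
biased_sample = random.sample(with_phones, 200)
print("bias:", round(statistics.mean(biased_sample) - true_mean))
```

The biased estimate stays well above the true mean however many “households with telephones” are sampled, which is exactly the telephone-survey problem described above.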

Task - Consider how these causes of bias and error could occur in the data that you may use in school.

Folio 6.3.1 gives some suggestions.

The Fischer Family Trust uses data from three consecutive years in its comparisons: how could this help overcome some sources of error?
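A quick simulation suggests one answer: averaging three independent yearly figures shrinks the typical chance fluctuation by a factor of about √3. The numbers below are invented; only the variance-reduction effect is the point.

```python
import random
import statistics

random.seed(1)

TRUE_VA = 0.0     # the department's "real" value added (invented)
YEAR_NOISE = 1.0  # chance fluctuation in any single year's figure (invented)

single_years = [random.gauss(TRUE_VA, YEAR_NOISE) for _ in range(10_000)]
three_year_avgs = [statistics.mean(random.gauss(TRUE_VA, YEAR_NOISE)
                                   for _ in range(3))
                   for _ in range(10_000)]

# The three-year average fluctuates far less than any single year's figure.
print(round(statistics.stdev(single_years), 2))
print(round(statistics.stdev(three_year_avgs), 2))
```

A three-year figure is therefore less likely to label a department strong or weak on the strength of one unusual cohort; the trade-off is that it reacts more slowly to genuine change.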


5. Raising and answering questions

This section deals with the questions that data will raise for mathematics departments and how to begin to find answers to those questions. Click on the title of the section that you wish to study:

What outcomes can be explained by data?

The following are examples of questions that data may raise for the mathematics department:

  • Which students are falling behind and which ones are progressing well?
  • Are there any topics that seem to be taught less well (data here may include the statistics related to lesson observations or item level analysis from tests)?
  • Are there any groups of students that are underachieving say in a particular class?
  • Are the strategies to improve attainment and achievement effective?
  • If those eight mobile students are removed from the data what will be the effect on the results overall?

All activities within a school have an impact on standards and achievements, some more than others. The key is for Subject Leaders to understand the impact of what they do. This is a very difficult area and often not well researched or understood. For example, if a group of 15 boys in Y11 performs better than expected, is it because of the Learning Mentor, the extra revision classes, the day-to-day teaching, the support of parents, the involvement of the LA Subject Consultant working with the group, or was it a combination of all of these activities? If so, how can you find out which activity or combination of activities is best?

To try to find out which activities or strategies are most effective in raising achievement, student tracking must be well established. If tracking data is collated regularly and judgements are sound, then the department can begin to explain many of the outcomes of support strategies using existing data. If a group of students is identified as falling behind and then, in the next assessment, they are not, it is likely that the recovery plan the department put in place was effective.
This aspect has a clear link with Module 5 – Action Research.
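One simple way to begin is to compare progress between two tracking snapshots, grouped by the support each student received. All the figures below are invented, and in practice the comparison is rarely this clean because students are not randomly assigned to strategies.

```python
# Invented records: (strategy, level at autumn snapshot, level at spring snapshot)
records = [
    ("mentor",   4.0, 4.6), ("mentor",   4.2, 4.7),
    ("revision", 4.1, 4.4), ("revision", 4.3, 4.5),
    ("none",     4.2, 4.4), ("none",     4.0, 4.2),
]

progress = {}
for strategy, autumn, spring in records:
    progress.setdefault(strategy, []).append(spring - autumn)

# Average gain per strategy group between the two snapshots.
for strategy, gains in progress.items():
    print(strategy, round(sum(gains) / len(gains), 2))
```

Even a rough comparison like this gives the department something concrete to discuss when deciding which strategies to continue, and it links directly to the action-research approach of Module 5.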

Task – Answering Questions

 Use Folio 6.5.1 to review possible answers to questions raised by data.

What questions does your data raise? Work with a more experienced colleague to find answers to your questions.

What further information should subject leaders seek to answer the questions that may be raised by data?

There are many ways in which a subject leader can obtain information that supports or questions conclusions reached by data scrutiny. The following are some suggestions:

  • Information on standards observed from departmental work scrutiny, particularly of a topic identified as not done well in item-level analysis of Key Stage 3 tests or GCSE examinations. This is best done by the Subject Leader, Line Manager and all department teachers together. Including all the teachers ensures they are familiar with the process and aids their understanding of how judgements are reached; departmental members also begin to take ownership of the process.
  • Generally, most weaknesses identified in the data can be explained by what is happening in lessons, so lesson observation data is important. If achievement and standards are weak over a period of time, at a key stage, or with a particular group, say boys, then lesson observation monitoring should be flexible enough to focus on the weaknesses identified. For example, if boys’ attainment in KS3 mathematics is a weakness, then a question about boys’ progress should be added to the classroom observation prompt sheet, if it is not there already.
  • Feedback from the students themselves can give the Subject Leader very valuable information, whether through structured interviews, discussion with a group of students, or surveys. Students can usually pinpoint the problems with their learning very accurately if they are asked the right questions. A number of commercial surveys are available that provide good-quality analysis of feedback.
  • If a number of students have been identified as underachieving and strategies have been put in place to remedy the situation, ask the students whether they have noticed any difference in their learning experiences. This may help you judge the impact you are having on improving progress.
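The item-level analysis mentioned in the first suggestion above can also be sketched briefly. In this hypothetical example (the topic names and mark totals are invented), the weakest topic is found by computing the facility, the proportion of available marks gained, per topic across a paper:

```python
# Hypothetical item-level analysis sketch: total the marks gained and marks
# available per topic across a test paper, then compute each topic's
# facility (proportion of marks gained) to highlight the weakest topic.
# Topic names and mark data are invented for illustration.

from collections import defaultdict

# Each tuple: (topic, marks gained by the class, marks available to the class)
items = [
    ("fractions", 120, 200),
    ("fractions", 90, 150),
    ("algebra", 60, 200),
    ("geometry", 140, 200),
]

totals = defaultdict(lambda: [0, 0])
for topic, gained, available in items:
    totals[topic][0] += gained
    totals[topic][1] += available

facility = {topic: gained / available for topic, (gained, available) in totals.items()}
weakest = min(facility, key=facility.get)  # the topic with the lowest facility
```

A topic with a markedly lower facility than the rest is a natural candidate for the work scrutiny and lesson observation follow-up described above.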

Task - Looking for answers elsewhere

Often, the students themselves can identify how learning will improve so ask them!

Members of the department may have ideas about improvements but have never been asked so ask them!

The Subject Leader’s Line Manager should be able to support the Subject Leader in improving areas of weakness so ask them!

Ask other schools - Many LAs are placing schools in families or groups of similar schools. If you discover that a school similar to yours is performing much better in terms of attainment or CVA, make arrangements to visit it and find out what it is doing. A single tip or idea may justify the visit.

Ask the parents - Sometimes parents will have a view about how their child can improve their work. Invite parents into school as a group to discuss how parents and the school together might raise their children’s attainment. This might be linked to the use of software: for example, MyMaths or SAM Learning (both web-based) can be used at home, and parents may not know about this.

 Back to top

6. Adapting departmental practice to improve learning

Task – Making a plan to make a difference

Using all the data available to you, including that from lesson observations, pupil interviews, previous departmental development plans and so on, answer the following questions:

  • where are you now?
  • what needs improving?
  • what practices and systems need changing?
  • where will you start?

Make an Action Plan or new Subject Development Plan to move from where you are now to where you want to be. See Module 5 - Planning to bring about change for help with making your plan.

 Back to top

7. Reviewing the Module

In completing this module, you will have considered:

1. Getting the terminology right

2. What data is available?
  • the different sources of data
  • the issues in interpreting data
  • data in School
3. What use is data?
  • what can you use data to do?
  • is one source of data more informative than another?
  • how can internal data be made really useful?
4. Developing understanding of sampling

5. Using data to explain, justify and raise questions
  • what outcomes can be explained by data?
  • what further information should subject leaders seek to answer questions?
6. Adapting departmental practice to improve learning

Each of these sections is intended to add to your knowledge of how to use pupil data in order to adapt current departmental practice to help bring about improvements.

Review the ideas that you have gained from each of the sections and reflect on which of them will make a short-term difference to your department and which will add to your long-term plans.

Use Folio 6.7.1 Top 10 Tips in Using Data to make sure you have thought about all the important aspects of using data.

 Back to top


