


SURVEYS OF EDUCATIONAL ATTAINMENT: TIMSS AND PIRLS

Background

Following officials' meeting with the Cabinet Secretary on 6 October 2015 to discuss surveys being used in the Scottish education system, further briefing was offered on the value of two surveys, administered by the International Association for the Evaluation of Educational Achievement (IEA), from which Scotland withdrew in 2010. The Opposition have also signalled that they will table a Stage 2 amendment to the Education Bill compelling the Scottish Government to participate in TIMSS, PIRLS and PISA in future.

Overview of current position

The decision was taken to withdraw from the Trends in International Mathematics and Science Study (TIMSS) and the Progress in International Reading Literacy Study (PIRLS) on a number of grounds:

- A desire to rationalise the surveys undertaken by SG, with the Programme for International Student Assessment (PISA) remaining the focus for external comparison. PISA covers Reading, Maths and Science and reports every three years.
- Savings in school time and public money. Withdrawal from the 2011 rounds of TIMSS and PIRLS saved £850,000, and 12,000 students were not required to sit assessments. (Subsequent savings may be larger, because the surveys now fall in different years and there is potentially increasing reliance on electronic means of testing. Our working assumption is that TIMSS would cost £1m for a four-yearly cycle, and PIRLS £500k for a five-yearly cycle.)
- Coverage of countries at the time. TIMSS and PIRLS cover most, but not all, of the OECD, and also omit certain partner countries, such as China and Brazil. PISA 2018 is currently anticipating a new peak in participation of 80 countries.

In addition, we were comfortable concentrating on PISA for a number of reasons:

- PISA assesses against an externally validated set of skills, whereas TIMSS and PIRLS are influenced by a country's curriculum. It is thus arguable whether TIMSS and PIRLS offer a true comparison of globally relevant skills, as opposed to measuring a system's ability to teach its own curriculum.
- There are questions about whether TIMSS and PIRLS control for differences in the number of years studied, a greater difference at age 9-10 than at age 15 (PISA).
- PISA is targeted at age 15 so as to assess the impact of the whole of compulsory education for students in the developed world. Students at age 9-10 may be on differing paths compared to other countries, while hoping to end up at a similar level of ability.

PISA, PIRLS and TIMSS results compared

Clearly many countries find useful information in TIMSS and PIRLS. Their measurement of social background is arguably not as comprehensive as PISA's. However, they can provide assessment information at an earlier stage than age 15 ("4th grade" is equivalent to age 9-10), as well as contextual information on schools, teaching and the home environment. One justification for testing at an earlier stage would be to see whether a country is "on track" to ensure its school-leavers are competitive with the rest of the world.

We can test this assumption by looking at how countries' 15 year-olds performed in PISA 2012, and how this relates to what TIMSS and PIRLS said about that same group of pupils at age 9-10.

In Table 1 we compare PISA 2012 for Reading with PIRLS 2006 on the left-hand side. The PIRLS group was roughly a year older than the group which went on to sit PISA six years later. On the right-hand side the comparison is between PISA 2012 for Maths and TIMSS 2007. The TIMSS group was the same group which was surveyed by PISA in 2012. Scotland is included in this table as we were still participants in TIMSS and PIRLS at the time.

The countries are listed in order of their test performance, although this should not be seen as a ranking, as many countries are actually statistically similar.
We have attempted to separate them into broad categories: those who perform significantly above the average, similar to the average, and below the average. We have marked in bold those countries which appear to perform differently in another survey, and additionally highlighted them where this survey is the one which suggests "better" performance than the other survey.

Table 1: PISA 2012 compared to PIRLS 2006 and TIMSS 2007

Above average
- PISA (Reading) 2012: Hong Kong, Singapore, British Columbia (Canada), Ontario (Canada), Chinese Taipei, Quebec (Canada), Poland, Belgium (Flemish), New Zealand, Netherlands, Germany, Nova Scotia (Canada), Scotland, France, Norway
- PIRLS 2006 (4th grade): Russia, Hong Kong, Alberta (Canada), Singapore, British Columbia (Canada), Luxembourg, Italy, Hungary, Sweden, Germany, Netherlands, Belgium (Flemish), Bulgaria, Denmark, Nova Scotia (Canada), Latvia, US, England, Austria, Lithuania, Chinese Taipei, Quebec (Canada), New Zealand, Slovakia, Scotland, France, Slovenia, Poland, Spain, Israel, Iceland
- PISA (Maths) 2012: Singapore, Hong Kong, Chinese Taipei, Japan, Quebec (Canada), Netherlands, British Columbia (Canada), Alberta (Canada), Germany, Ontario (Canada), Austria, Australia, Slovenia, Denmark, New Zealand
- TIMSS 2007 (4th grade): Hong Kong, Singapore, Chinese Taipei, Japan, Kazakhstan, Russia, England, Latvia, Netherlands, Lithuania, US, Germany, Denmark, Quebec (Canada), Ontario (Canada), Australia, Hungary, Italy, Austria

Average
- PISA (Reading) 2012: England, US, Belgium (French), Denmark
- PIRLS 2006 (4th grade): Belgium (French), Norway
- PISA (Maths) 2012: Czech Republic, Scotland, England, Latvia, Norway
- TIMSS 2007 (4th grade): Alberta (Canada), British Columbia (Canada), Sweden, Slovenia, Slovakia

Below average
- PISA (Reading) 2012: Italy, Austria, Latvia, Hungary, Spain, Luxembourg, Israel, Sweden, Iceland, Slovenia, Lithuania, Russia, Slovakia, Romania, Bulgaria, Indonesia, Qatar
- PIRLS 2006 (4th grade): Romania, Indonesia, Qatar
- PISA (Maths) 2012: Italy, Russia, Slovakia, US, Lithuania, Sweden, Hungary, Dubai (UAE), Kazakhstan, Tunisia, Qatar
- TIMSS 2007 (4th grade): Scotland, New Zealand, Czech Republic, Norway, Dubai (UAE), Colombia, Tunisia, Qatar

We find there is some representation of OECD countries performing above the TIMSS and PIRLS averages (particularly for PIRLS) but below the average in PISA. This may be because the "average" in PISA is the OECD's average, rather than that of all participants, and so is perhaps a higher benchmark than in TIMSS or PIRLS. Nonetheless, Russia is a very strong performer, including "topping the table" in PIRLS. However, this performance was not borne out in PISA, where Russia was below average in 2012. Conversely, countries like New Zealand, Slovenia and Scotland appeared to do better in PISA than in TIMSS.

It may be that, for this latter group, results in TIMSS and PIRLS drove improvement activity which bore fruit 5-6 years later in the 2012 round of PISA. However, it is arguably true that some countries are simply better suited to TIMSS and PIRLS (Russia, for example, also performed well in TIMSS 2011 but not in PISA 2012 Maths), whereas countries such as New Zealand (and perhaps Scotland) seem to find PISA more forgiving. That said, the Far East countries such as Singapore do well in both surveys.

We conducted similar analysis comparing the 2011 PIRLS and TIMSS results at age 9-10 with PISA 2012 and, here again, there are examples of countries performing differently across surveys, particularly for TIMSS. These were different age groups, but even when we looked at the smaller number of countries which surveyed 14 year-olds in TIMSS in 2011, who then went on to sit PISA in 2012, we saw countries performing well in TIMSS but not PISA, and also vice versa. Again, Russia seems to do well in TIMSS but not PISA, and New Zealand may be in the opposite camp.

Overall, though, it seems that PIRLS and TIMSS were variable predictors of those pupils' future performance in PISA, and offered only limited confirmation of performance for similar-aged pupils.
We await the PISA 2015 results to see if this variable relationship continues. What it may suggest is that the IEA surveys and PISA are essentially measuring different things, and that countries are able to perform well in one survey and not the other.

Conclusion

While many countries do well in both PISA and other surveys, the IEA surveys taken at age 9-10 are not always an effective predictor of where a country's performance will be when students leave school. Combined with concerns about the validity of direct comparisons at age 9-10, this suggests that the main arguments for participation would be as follows:

- A snapshot of pupil performance, combined with the contextual information gathered by TIMSS and PIRLS.
- External verification of data gathered internally (principally the National Performance Framework).
- An additional international perspective which can balance PISA to some extent.

More generally, though, it would seem that TIMSS and PIRLS can be used only sparingly as a pointer to future PISA performance. Our withdrawal from them in 2010 was based on concerns about the validity of the comparisons they make on performance, and about our use of resources. Allowing ourselves to be mandated to participate in these surveys (or indeed in existing surveys such as PISA) would in no way address any concerns we have about their methodology or appropriateness for the Scottish education system, and would constrain our freedom of manoeuvre to collect the best data in the most efficient way.

Learning Analysis
November 2015