Using global evaluations, Malaysia is aiming to be on par with developed countries in education - but how far do we need to go?
UNVEILED just last Tuesday, the Preliminary Report Malaysia Education Blueprint 2013-2025 has already drawn much talk on the merits of its contents.
Being just a “preliminary report”, the blueprint’s policies and initiatives will not be confirmed until it is taken to the Cabinet in December.
While the Education Ministry has had long-term planning documents before, the striking thing about this blueprint is its honesty in rating the current education system.
A clear example of this is the acknowledgement of Malaysia’s unsatisfactory performance in two international assessments - the Trends in International Mathematics and Science Study (TIMSS) and the Programme for International Student Assessment (Pisa).
As such, a main goal of the blueprint is to transform the education system to make sure that every student can get quality education that matches global standards.
TIMSS is a project by the International Association for the Evaluation of Educational Achievement (IEA), which describes itself as an independent cooperative of education research institutions and government research agencies.
First conducted in 1995, TIMSS has been carried out every four years for fourth (Year Four) and eighth grade (Form Two) students around the world.
Malaysia has participated in the survey since 1999, but only with Form Two cohorts.
Meanwhile, Pisa is administered by the Organisation for Economic Co-operation and Development (OECD) every three years on 15-year-olds in both OECD and non-OECD countries.
The assessments have been conducted since 2000, with Malaysia taking part for the first time in 2009.
Each round of assessment focuses mainly on one of three areas – Reading, Mathematics or Science; the remaining two are still tested, but less thoroughly.
For example, in Pisa 2009, Reading was the main “domain” of focus, while this year’s assessments focused on Mathematics.
Students sit for a two-hour test (with a short break after an hour) with a mix of multiple choice and subjective questions.
Students also have to fill out a questionnaire on their socio-economic backgrounds and their attitude towards school, while school heads complete a survey offering an insight into the school’s environment and location.
The main difference between TIMSS and Pisa is the scope of questions students are tested on.
While TIMSS focuses on curriculum content that is shared by participating countries, Pisa places emphasis on testing the ability of students to apply their knowledge to real-world scenarios.
Researchers, however, say that there is a strong correlation between performance on the two tests; a student who does well in one assessment is likely to do well in the other.
Both TIMSS and Pisa offer students questions in the main language of instruction in their respective countries.
For TIMSS, Malaysia has been recording a steady decline in Mathematics, scoring 519 points in 1999 and dropping to 474 points by 2007.
Students’ performance in Science over the years is more mixed: 492 points in 1999, rising to 510 points in 2003, before falling to 471 points in 2007.
For both subjects, Malaysia dropped below the international average in 2007 after being above average in previous years.
Malaysia’s showing in Pisa 2009 is even more alarming.
Around 60% of students failed to meet the minimum Mathematics standards, meaning that they were unable to handle basic concepts and formulae.
For Reading, 44% of students fell below the minimum standards: they were unable to recognise the main idea of a text or to compare the information provided with everyday knowledge.
In Science, 43% of students failed to reach the minimum standard; while they had some Science knowledge, they could only apply it to a few familiar situations.
Overall, Malaysia scored around 100 points below top performers such as Singapore, Japan and South Korea in Pisa 2009 – and every 38 points is roughly equivalent to a year of schooling.
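To put that gap in perspective, the OECD’s rule of thumb cited above can be applied directly. This is only a back-of-envelope illustration, not a calculation from the blueprint itself:

```python
# Back-of-envelope illustration (not from the blueprint): converting a
# Pisa score gap into years of schooling, using the rule of thumb that
# 38 points correspond to roughly one school year.
score_gap = 100        # approximate gap behind top performers such as Singapore
points_per_year = 38   # equivalence cited in the article

years_behind = score_gap / points_per_year
print(f"A {score_gap}-point gap is roughly {years_behind:.1f} years of schooling")
```

In other words, on this rough measure Malaysian 15-year-olds trail the top performers by more than two and a half years of schooling.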
Why are we falling behind?
An interesting point raised by the blueprint is the extent to which the national curriculum overlaps with the content assessed by TIMSS.
While the overlap of curriculum content for Science remains at 90%, it is a different story for Mathematics — from a 98% shared curriculum content in 2003, this overlap dropped to 75% in 2007.
The blueprint brings up the discrepancy between students’ performance in national examinations compared to the international assessments.
For instance, in the 2008 Penilaian Menengah Rendah (PMR), 30% of students received an A in Mathematics; in contrast, only 2% of students who sat for the TIMSS 2007 were rated as “advanced level”.
There are also stark differences in the grading of Sijil Pelajaran Malaysia (SPM) English papers, where students receive a grade under both the local marking scheme as well as the external Cambridge English standards.
Under the Cambridge grading scale, 50% of last year’s SPM-takers failed to achieve minimum standards.
Under the Malaysian marking scale, only 20% were deemed to have failed the paper.
In between the blueprint’s commendable ideals and impressive plans, there is a particularly glaring goal — by 2015, all schools should meet basic infrastructure requirements.
These basic requirements include access to clean and treated water, at least 12 hours of electricity per day, as well as sufficient toilets, classrooms, tables and chairs for students and teachers.
According to statistics provided in the blueprint, the Education Ministry had spent RM20bil on school infrastructure alone between the years 2006 and 2010.
The money spent included the building of 400 new schools, expanding ICT infrastructure, maintenance work for existing schools, and providing 24-hour electricity service for 473 schools.
However, the blueprint also points out some rather dire facts.
Some 300 schools still lack adequate electricity access; 1,500 schools do not have clean drinking water; 2,700 schools have no computer laboratories; and 2,000 schools do not have functioning science laboratories.
While these unfortunate statistics only represent a fraction of the 10,000 schools nationwide, children studying in these schools will face an uphill battle to perform well academically.
However, the blueprint maintains that “only 10% of the Malaysian variance between schools in Pisa 2009 assessment can be explained by socio-economic factors, as compared to the OECD average of 55%”.
The breakdown of schools which participated in both TIMSS and Pisa, provided in the blueprint, suggests that the schools themselves are a major factor in students’ performance.
For example, only 16% of the schools which participated in TIMSS 2007 were classified as “poor”. Of the 152 schools which participated in Pisa 2009, 80% were classified as “poor”.
It is unclear to what extent the “poor” category matches the criteria of the different school bands set out by the National Key Result Areas.
When it comes to the actual teaching of Mathematics and Science here, the popular debate has been over the medium of instruction – rather than the way these subjects are taught in the first place.
Arguably, the foundation of a good Mathematics and Science education should start in primary school itself.
Under the Primary School Integrated Curriculum (KBSR) timetable in 1999, pupils were allocated 210 minutes of Mathematics per week — Science was not formally introduced as a subject until Year Four for 150 minutes a week.
Instruction in the Bahasa Malaysia and English languages took up a total of 690 minutes a week, while pupils spent 150 minutes on either Islamic or Moral Studies.
In the new Standard Curriculum for Primary Schools (KSSR) syllabus which started with Year One pupils in 2011, Science is taught for 60 minutes a week.
But time spent on Mathematics is reduced to 180 minutes a week — the same amount of time allocated for Islamic or Moral Studies.
Pupils also now spend 360 minutes for Bahasa Malaysia and 300 minutes for English language per week.
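The shift in weekly allocations between the two syllabi can be tallied from the figures above. This is a simple illustration using only the minutes reported in the article (Year One unless otherwise noted), not an official ministry comparison:

```python
# Weekly instruction minutes as reported in the article:
# KBSR (1999) vs KSSR (2011, starting with Year One pupils).
kbsr = {"Mathematics": 210, "Languages (BM + English)": 690,
        "Islamic/Moral Studies": 150}          # Science only from Year Four (150)
kssr = {"Mathematics": 180, "Languages (BM + English)": 360 + 300,
        "Islamic/Moral Studies": 180}          # Science from Year One (60)

for subject in kbsr:
    change = kssr[subject] - kbsr[subject]
    print(f"{subject}: {change:+d} minutes per week "
          f"({change / kbsr[subject]:+.0%})")
```

On these figures, weekly Mathematics time falls by 30 minutes (about 14%) even as the blueprint sets its sights on international Mathematics benchmarks.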
While we may be inclined to immediately measure our failings against better-performing countries, it would not be wise to adopt a one-size-fits-all approach, especially when it comes to education.
Topping the list in the Pisa 2009, Shanghai has been heralded as a benchmark that we should aspire to.
Though a sizeable city with a population of 13 million, Shanghai is hardly representative of the rest of China.
What may be a lesson to others is that the city has aimed to raise its standards across the board by upgrading all its schools and classrooms, rather than just pouring resources into a few privileged schools as it did before.
South Korea is another high achiever in international assessments; though this is a credit to the country’s education system, its reliance on after-school tuition and coaching cannot be ignored.
A recent Asian Development Bank report on “shadow education”, meaning the private tutoring industry, revealed that 90% of secondary school students in Korea receive private tutoring.
Meanwhile, studies dissecting Pisa and TIMSS scores in the United States claim that the country’s relatively poor showing was due to drastic inequality among its schools in terms of resources, qualified and experienced teachers, as well as the socio-economic background of the students themselves.
Academic critics of international assessments point out that such tests do not by themselves explain the complex reasons as to why students underperform in Mathematics and Science, and question the validity of a standardised assessment across diverse cultures and schooling systems.
Some further claim that the reason some countries perform poorly may be due to the students’ perception of the tests.
For instance, Scandinavian students have been said to attach little meaning to Pisa tests, while parents in Taiwan rally outside schools to motivate their children to do well on testing day.
Based on calls made to several schools here, it would seem that some principals are unaware that their students had indeed sat for this year’s Pisa test.
A ministry official who was in charge of invigilating the test meanwhile, says that Pisa questions were suitable for students.
However, she adds: “I think students will have to be at least ‘B’ standard to handle the questions ... Let’s say there are four classes in a form — only the top two classes will be able to handle the questions.”
There is also a concern among some educators that in the race to benchmark ourselves against international standards, examination achievement may come to be seen as an end in itself.
As one local education researcher puts it: “While the international assessments may be useful to see where and how we can improve, I hope that this just doesn’t lead to even more testing pressure for students and teachers — the tests should be used as guides rather than absolute indicators of a good education.”
A senior Science teacher agrees with this sentiment, while conceding that Malaysia’s showing in international assessments points to some serious problems in the system.
“On one hand we say that we’re trying to move away from excessive focus on examinations to create more holistic students; at the same time, we want to overhaul the education system based on test scores,” he adds.
The level of discussion and hype over the blueprint is proof that Malaysians do place a great deal of care and emphasis on education.
The blueprint even notes that 95% of students believe that their schooling is preparing them for dealing with life’s challenges.
The question now is whether we can work towards realising the blueprint’s goals to meet the expectations of our students, and truly deliver an education that they deserve.