
THE results of the 2025 Higher Secondary Certificate and equivalent examinations released a couple of weeks ago ‘jolted’ the nation. These conventional measures of student learning have been called a ‘debacle’, which should also serve as a ‘wake-up’ call for all. It’s a call for no-nonsense (re)thinking of education quality, student learning, and assessment that produces artefacts such as grades and GPAs.

The social shock was caused by some key statistics. The overall pass rate was 58.83 per cent, the lowest in the country in 21 years. It also marked a significant drop from last year, when it was 77.78 per cent. This means over 41 per cent of students failed the exam this year.


The results are not just about student failure. The numbers are a diagnosis of our education system, which itself has been questioned. It is also a failure of schools and teachers — and probably private tutors and coaching centres as well. Failed students and their parents will bear the brunt of the results.

The second important statistic is that most student failures occurred in English and mathematics. There may be nothing new about student performance on these two critical subjects. I will mainly talk about English, as this is my area of research interest.

While the HSC results stirred almost the entire nation, one section of the population would have been in a mood of silent celebration. Every untoward happening since Hasina’s fall in August 2024 has given this section of our society a precious moment of joy and a sense of retribution. Their collective logic and judgement might be something like, ‘We deserved nothing better after what was done to Hasina’, their undisputed embodiment of democracy, development, and progress. They would probably draw our attention to the HSC pass rate, which was never below 75 per cent during the rule of Hasina and her education minister Dipu Moni.

The interim government provided an unflattering explanation for the lower pass rate. The gist of the account is that the Hasina government manipulated numbers in every field, including education. During her regime, higher pass rates in public examinations were not true indicators of student learning. The numbers were manufactured to project a positive image of her government as a way of fighting its legitimacy crisis.

Often, directives were given to different departments to indicate higher levels of performance. Teachers and education boards were also brought under such directives. It showed a particular ontological and epistemological stance on numbers and statistics. For the regime, numbers were not only elastic but also loosely related to facts.

An article of mine that appeared in this newspaper on 29 August 2022, when Hasina was at the height of her power, began with our national attitude towards numbers:

‘As a nation, we probably need an honest interrogation of our attitudes towards numbers. The search may zero in on how we produce our numbers, what value we attach to them, and how ethical our numerical engagement is. We also need to reflect on how other nations might have been treating us given our stance on numbers and number-making.’

Indeed, one qualitative difference between Hasina’s government and the current one is that the latter does not brag about development or progress. We haven’t heard many stories of magical development since she fled the country. It is as if the agent who controlled the material, discursive, and digital display of development propaganda from a distance has switched it off.

The interim government’s explanation of the HSC pass rate appears trustworthy. As they don’t suffer from legitimacy issues, they don’t have to hide, inflate, or deflate numbers or manufacture truths. The education advisor claimed that the current pass rate was a true measure of students’ learning. Un-spiced truth may be hard to swallow, but it is better than the illusion of ‘fake truth’, also known as ‘post-truth’.

The current HSC pass rate may not be the product of political directives. However, that does not necessarily mean that there is a clear correspondence between the pass rate and students’ true achievement of learning. This point can be clarified by referring to the English subject.

My PhD research investigated secondary school students’ English learning experiences and outcomes in rural Bangladesh in 2006-2007. As part of the research, I examined students’ performance on the SSC examination, which showed that most students failed in English and mathematics. So, students’ performance on these two subjects in the recent HSC examination has a long history.

There are many views of English, English learning, standards of English, and students’ ability to communicate in English in Bangladesh. In my humble opinion, the overall English proficiency level is rising steadily. This is not because of any improvement in the quality of English teaching but because of the many opportunities for learning English that are now available outside the classroom. In fact, language learning beyond the classroom has emerged as a new area of research which has documented many success stories of students developing genuine proficiency in languages such as English, Arabic, Korean, and Japanese.

For English learning outside the curriculum in Bangladesh, the key question is whether students have access to online resources through their families and whether they are keen on utilising them. This access is acutely unequal given the socioeconomic and geographic divides in the country. Learning English or another language outside the curriculum is likely to develop students’ functional ability. Although this learning can help them at school and during school-leaving examinations, our English language assessment is unlikely to capture students’ ability to communicate in English in real-life contexts. The reading- and writing-focused pen-and-paper test, which excludes listening and speaking skills, cannot provide any sensible measure of English language ability.

Indeed, what is measured by school English tests or public examinations is a question that has never been asked seriously. How the exams assess what they assess is another question worth considering. In our research, we have argued that assessment in Bangladesh is often taken as an academic ritual which does not explicate the what, the how, or the why of assessing.

The rise and fall of numbers that claim to measure student learning should rightly concern us. But shouldn’t we also want to know what is measured, why and how it is measured, and whether the numbers have any educational, social, or practical value?

It will be hard to claim that school or school-leaving exams are a true measure of students’ English proficiency. Therefore, the grades students obtain — whether A, B, or C — may not say much about their relationship with the global language. Students learn English for 12 years at a huge cost in curricular space and scant national resources, but we hardly ever get to know how such investment relates to their lives or to society.

My language testing research has a critical, socio-political focus. I am not a fan of tests or testing. However, I find it unacceptable that the Bangladeshi education system has not done much about meaningfully assessing students’ English learning at different levels of education. Many Asian countries have taken their language assessment seriously, borrowing ideas, theories, and experiences from other parts of the world. Bangladesh needs to move forward fast and introduce English language tests that will tell us more about our students’ proficiency and their ability to use the language for educational and social goals.
Obaidul Hamid is an associate professor at the University of Queensland in Australia. He researches language, education, and society in the developing world. He is a co-editor of Current Issues in Language Planning.