Ever since I started these reviews back in 2009, I've been struck by just how many language testing stories get into the world press. But this year hit a new height, with the top investigative journalism programme Panorama going undercover to investigate cheating and fraud in test centres. Still, I'd prefer to start on a lighter-hearted note to get into the spirit of Christmas and the New Year.
Exam Stress
My first story this year involves my very own institution, the University of Leicester. And it's a story that follows on from the 2013 review remarkably well. Last year I reported on growing concerns about examination stress and on novel approaches to help students release pressure just before they take tests. This year, my University tackled the problem by setting up bubble-wrap popping stations around the campus, complemented by a petting zoo where nervous students could cuddle furry creatures to release that pent-up test anxiety. It might help the students, but I'm not sure what it'll do to the menagerie of bunnies, hamsters, lambs and goats. Being around stressed-out students certainly puts me out of sorts.
Obviously, this was a gift for the BBC news, but also for the satirical news review, The Now Show, which included this on the 23rd May. It is rather unkind to Leicester, which, as you will know from the photos on my personal page, is actually a very lovely area. That's my little bit of defence. Now to the clip.
The Panorama Story
In February this year the UK's leading investigative journalism television programme broke the news that one of their undercover teams had filmed cheating on the TOEIC test from Educational Testing Service (ETS), in a London college acting as a test centre. The cheating included ghosting (having someone sit the test for a real candidate who wouldn't be able to pass), and the invigilators (proctors) reading out the answers. The British Broadcasting Corporation had the story on their website and news bulletins for days. The reason this hit the headlines is of course because the government has made language testing a key criterion for issuing visas to enter and stay in the country. In this they are not alone in the world. The instant response of the Home Office to the story was to suspend the licence for the use of ETS tests for visa applications - which includes the Tier 4 student visa. This hit TOEFL as well as TOEIC. Needless to say, there was no reflection by politicians upon the use of language tests as a surrogate immigration control. Rather, this was held up as an example of the testing companies and the education sector not living up to their responsibilities to government. The Home Secretary, Theresa May, was quick to point the finger of blame at those who were frustrating government policy. This is an extract from her performance on the Today Programme, on 10th February.
This is an over-reaction. As I keep reminding visitors to this website, cheating is endemic, and has been ever since the high-stakes examination was invented. The temptation is greater when international mobility and work are at stake. Furthermore, the United States Government did not ban the use of IELTS scores when cheating was uncovered in Australia and China. But at the end of the day, the United Kingdom is merely harming itself, as stories circulated widely in Asia that students taking those tests were not welcome, even causing protests in China.
Here is the video of the statement on the ban as a result of the Panorama investigation from the Minister of State in the Home Office. I don't watch government debates on a regular basis, but when language testing is concerned, I make an exception! By October, of course, UK Universities were wondering why numbers of students arriving from China had dropped across the sector. Data on this is publicly available from HEFCE. Three Universities were also banned from recruiting international students because they hadn't noticed an unusually high number of fraudulent scores. The question, of course, is how on earth they were supposed to know. Whatever the answer, UK Higher Education was implicated in the fraud at one disreputable college, and the story went around the world. By the end of the year ETS was warning that the UK might see further reductions in international students. I would like to end this rather sad story with one further observation. This story has also played into a worrying narrative that I am starting to hear at academic conferences in Europe - that the North American approach to language testing is rather inferior to that of Europe, and that any non-European model of validation is fundamentally flawed. No analysis or argumentation. Just assertion, supported by teacher training events that function more as marketing tools than as vehicles for developing critical, independent assessment literacy.
"Please find enclosed your end of KS2 test results. We are very proud of you as you demonstrated huge amounts of commitment and tried your very best during this tricky week.
However, we are concerned that these tests do not always assess all of what it is that make each of you special and unique. The people who create these tests and score them do not know each of you - the way your teachers do, the way I hope to, and certainly not the way your families do.
They do not know that many of you speak two languages. They do not know that you can play a musical instrument or that you can dance or paint a picture. They do not know that your friends count on you to be there for them or that your laughter can brighten the dreariest day.
They do not know that you write poetry or songs, play or participate in sports, wonder about the future, or that sometimes you take care of your little brother or sister after school.
They do not know that you have travelled to a really neat place or that you know how to tell a great story or that you really love spending time with special family members and friends.
They do not know that you can be trustworthy, kind or thoughtful, and that you try, every day, to be your very best... the scores you get will tell you something, but they will not tell you everything.
So enjoy your results and be very proud of these but remember there are many ways of being smart."
It is stories like this that remind me that testing is too important to do badly, to do too frequently, or to take too seriously as "defining us" - as Fairtest have argued so eloquently with their t-shirt.
This must be a first for my annual review. It's not often that accommodations hit the headlines. What is an accommodation, first of all? An accommodation is a change to the administration or delivery of a test in order to mitigate the effects of some disability. Someone who suffers from dyslexia may get additional time; a learner who has poor eyesight may be provided with Braille copies of text, larger text on computer screens, or a reader. Another, with physical difficulties in writing, may be provided with an amanuensis, a scribe to whom they can dictate. The issue for language testers in the provision of accommodations lies in whether the accommodation changes the nature of the construct being measured, and whether the scores of a non-disadvantaged test taker would be increased if the accommodation were also available to them. But in our story, the official government examination "watchdog" (Ofqual) was collecting data on the number of accommodations granted to independent (fee paying) and state schools, on the assumption that the fee paying students were able to buy advantages. While this may be a matter of social concern, it once again points to the very narrow and uninformed concerns of political appointees. At least the media took a much more enlightened approach to accommodations than did the official agency. Listen to this item from the Today Programme, first broadcast on 1st August this year.
I'm sticking with a theme here - issues that don't often hit the media. In a nutshell, predictive validity is the extent to which scores on a test are able to predict future performance on some non-test criterion. This is pretty fundamental to all testing practice. If the scores bear little or no relationship to the quality of the decisions we make about what test takers can or cannot do, the whole activity is pointless, right? In the United States there is a legal requirement to establish the predictive power of a test for a stated purpose through Griggs v. Duke Power Co. (1971) and the Civil Rights Act of 1991 (see my recent work on legal aspects of this in: Fulcher, G. (2014). Language testing in the dock. In Kunnan, A. J. (Ed.) The Companion to Language Testing (pp. 1553-1570). London: Wiley-Blackwell). Yet, predictive power is chimerical in language testing research, and educational assessment more widely. And so it was that the report by Bill Hiss and Valerie Franks, entitled Defining Promise: Testing Optional, hit the top news slots. You can read the summary of their findings here, which amount to the claim that testing prior to college admission doesn't really separate out those who are likely to succeed from those who are not. Here's how the research hit the US national airwaves in February.
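For readers who like to see the arithmetic, predictive validity is usually reported as a correlation coefficient between test scores and the later criterion. Here is a minimal sketch in Python with entirely invented numbers (the scores, GPAs, and the helper name `pearson_r` are all hypothetical, for illustration only - not data from the Hiss and Franks report):

```python
# Hypothetical illustration of predictive validity: the correlation
# between admission test scores and a later criterion measure
# (here, first-year GPA). All numbers below are invented.

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Invented data for eight admitted students.
test_scores = [450, 520, 580, 610, 640, 700, 720, 760]
first_year_gpa = [2.1, 2.9, 2.6, 3.3, 2.8, 3.4, 3.1, 3.6]

r = pearson_r(test_scores, first_year_gpa)
print(f"predictive validity coefficient r = {r:.2f}")
print(f"variance in GPA explained (r squared): {r * r:.0%}")
```

The point of the sketch is the last line: even a respectable-looking coefficient leaves most of the variance in the criterion unexplained, which is exactly the kind of finding that fuels the "testing optional" argument.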
Last but not least, a touch of humour to finish off this year's review. Just in case any of the stories above have left you feeling down in the mouth, these will cheer you up. All readers who have been teachers will surely have stories of the hilarious responses to writing prompts that they've seen over their careers. This year the Telegraph produced The 32 worst exam howlers of all time. My personal favourite (you will undoubtedly have your own) is: 21. What is the main reason for Divorce? Answer: Marriage. There simply has to be a statistically significant correlation there! And finally, you will have noticed that this year I haven't (so far) mentioned the Gaokao. So I'll finish with the report that plans to give a "morality score" on the test, for good character and behaviour, apparently need more work, according to the experts. You bet they do.
I hope you've enjoyed my wry look at 2014. I wish you a Merry Christmas and a Happy New Year, and hope you'll come back to the website in 2015 to catch up on the language testing news, and to see the new features that will be available during the year.