Rebecca Judd, now 19, found her first term at the University of Ottawa last fall a cold shock. Her Grimsby, Ont., high school could have done much more to prepare her, she says.
“In high school, I definitely tried very hard for my marks, and I feel that sometimes I got the marks I deserved, but other times I did not try very hard at all, and still found that good marks were pretty effortless. In university, that is not how education works at all.”
Judd’s experience wouldn’t have been a surprise to the University of Waterloo’s engineering faculty, 500 kilometres down the highway.
For decades, Waterloo has been using a list of which Ontario high schools’ marks matched the marks their graduates got in engineering school — and which didn’t.
For admissions officers, it meant that they didn’t always have to take marks completely at face value. Universities don’t have much to go on, other than marks, when they make admission decisions, but the same mark from three different schools can mean three different things.
WATCH: Ontario university’s secret list revealed. Shallima Maharaj reports.
A generation ago, Waterloo’s engineers realized the solution was right under their noses — in their own data.
Waterloo could already tell which schools inflated their marks, and by how much, by comparing the final high school marks students were admitted with to their marks at the end of first year. For schools whose graduates were admitted often enough to reach conclusions, that information could be used to help make future admissions decisions.
So they made a list of which high schools’ graduates had small gaps — and which had large ones. They called the gap the “adjustment factor.” Armed with this information, admissions officers could deal with the next year’s crop of applicants more fairly.
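Waterloo has not published its exact formula, but the idea is simple arithmetic. A minimal sketch, assuming the adjustment factor is just a school’s average percentage-point drop between admission average and first-year average (all names and numbers below are hypothetical):

```python
# Hypothetical sketch -- Waterloo has not released its actual formula.
# Assumption: a school's "adjustment factor" is the average drop, in
# percentage points, between a graduate's final high school average
# and their average at the end of first-year engineering.

def adjustment_factor(records):
    """records: list of (high_school_avg, first_year_avg) pairs
    for one school's graduates. Returns the average drop."""
    drops = [hs - fy for hs, fy in records]
    return sum(drops) / len(drops)

# Three invented graduates of one school
school = [(95.0, 68.0), (91.0, 63.5), (88.0, 61.5)]
print(adjustment_factor(school))  # average drop in percentage points
```

A factor like this would only be meaningful for schools with enough recent graduates in the program, which is why, as the article notes, schools fall on and off the list from year to year.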
The existence of the list wasn’t a secret — but its contents were.
And at the top of the list, three years running, was the worst offender: Grimsby Secondary School.
GSS graduates who ended up in engineering at Waterloo saw their marks drop over 27 per cent, as opposed to 16 per cent for an average Ontario high school not on the list. Some 45 high schools’ graduates had drops below 16 per cent — Toronto’s L’Amoreaux Collegiate Institute, for example, had a drop of just under 10 per cent.
Waterloo’s list helps solve two problems that come from universities accepting high school marks uncritically.
First, students with inflated marks risk being placed in programs they’re not prepared for. As well, the university risks being unfair to graduates of schools with more rigorous marking standards.
“You’re not doing a student any favours if they get into university and aren’t well-prepared for the challenge,” says Greg Moran, who ran admissions for the University of Western Ontario in the 1990s. “They are going to suffer the failure and frustration that that involves.”
Waterloo updates the list every year. Schools fall off and others are added, depending on whether the faculty has seen enough recent examples to draw conclusions.
“There’s always pressure for the marks to be fairly high,” retired Edmonton science and chemistry teacher Lynden Dorval explains.
“At high schools, you always want your students to do well, so you’re always struggling with the issue of making it a little too easy, because everybody’s happy when the marks go up. The principal is happy, the superintendent is happy, the parents are happy, students are happy.”
Moran calls grading inconsistency between schools “a massive problem.”
“It’s a great expense to the province, the students, and to the university to bring them to university, and you want them to succeed.”
“The issue of whether or not their grades are a good indicator of the extent to which they’re well-prepared for university is an important issue.”
Bill Anderson, a chemical engineering professor who has been Waterloo engineering’s admissions director for the last 10 years, calls the list “one of our tools in the tool kit to select students that would be a good fit.”
“It plays a role, obviously, but it’s only one of the things we’re looking at.”
“Obviously, we look at grades. We look at the adjustment factor. We look at extracurricular activities, awards, participation in events. We also look at potential work experience, volunteer experience that looks like work. We also look for some indication that they know what engineering is about — what they’re getting into, why they’re getting into it.”
For students at about two-thirds of the schools on the list, the adjustment factor works in their favour — their school’s graduates had lower-than-average gaps.
Some 74 Ontario high schools were on Waterloo’s list at some point in the 2016, 2017 or 2018 admission cycles. Some were on the list for all three years, and others for only one. The average in the table below is the average for the years available, which varies.
At a board level, schools in the public boards in Toronto, York Region and Ottawa, and in the separate boards in Ottawa and York Region, had average adjustment factors below 16 per cent. Only one school, Toronto’s L’Amoreaux Collegiate Institute, was below 10 per cent.
Of the five private schools on the list, four had adjustment factors above 16 per cent.
Waterloo also applies the concept between provinces in Canada, and for international applicants. Students from New Brunswick had their marks drop about 26 per cent, while graduates of Quebec’s CEGEP stream saw theirs drop by only 5.2 per cent.
Internationally, students from Egypt, Saudi Arabia and Bangladesh had the biggest gaps, and those from Indonesia, Singapore and Malaysia had the smallest.
Global News obtained the list under Ontario access-to-information laws. We originally requested it in April 2016; Waterloo refused to release it on privacy grounds, leading to an appeal to Ontario’s information and privacy commissioner. After a ruling ordering the release of the list in June of this year, Waterloo opted not to appeal the decision to the courts and complied. They then voluntarily released data for the 2017 and 2018 admission cycles.
GSS has been slated for closure since local school board officials voted to combine three high schools into one in March of last year.
“Our teachers are highly trained professionals who grade students according to curriculum expectations and other means of assessment,” District School Board of Niagara spokesperson Kim Yielding wrote in an e-mailed statement.
“Grimsby and all DSBN secondary schools do an excellent job preparing students for post-secondary academic studies. However, the University of Waterloo applies their adjustment factor to only 10 per cent of Ontario’s high schools, and only those that have had multiple students successfully admitted into the program. This very small sample does not reflect the hard work of our students and teachers.”
Yielding declined an interview request, as did the school’s principal.
At Western, Moran used a very similar system to the one Waterloo now uses.
“We identified the more rigorous schools, and if a student is on the cusp of the decision of being admitted or not, we would give them the benefit of the doubt by crediting them one or two percentage points. Too much would risk another kind of unfairness coming into the formula.”
Western then sent its data to schools that seemed to have a grade inflation problem.
“It’s not a good situation, either for the students or for the universities,” Moran remembers. “So we fed it back to the schools, thinking that they would likely be interested in the results for their school.”
The response from schools was “not as much as we expected.”
He approached other universities, hoping they could pool their data.
“There wasn’t a lot of enthusiasm, and that initiative didn’t really go anywhere.”
Could comparisons be scaled up to a whole province?
“It’s a workaround that doesn’t really solve the problem,” Moran says. “Administratively, it’s not a very big deal at all. Certainly, the entrance grades are available … and the first-year grades could easily be submitted.”
“I would rather see more of a debate about whether standardized testing has a place to play in solving the problem more fundamentally.”
Anderson has his doubts about what it would accomplish.
“We’re not convinced the numbers are useful in a broader sense, because they are specific to our situation — the students that apply to us, the courses that they take here, things like that. So we’re not suggesting that they’re broadly applicable or useful,” Anderson says.
Grade inflation does students no favours in the long run, Judd says:
“A lot of people get these great marks in high school without a whole lot of effort,” she reflects. “They just roll with it. They think it’s all right, so long as things are OK for now, but I definitely noticed that this was a huge, huge problem that so many people were facing, and the sort of laxness of grading can definitely hurt students in the future.”
With files from Shallima Maharaj