The market reads too much into monthly job numbers, so the panic caused by StatsCan’s faulty announcement in July wasn’t justified, say industry observers.
“It’s based on a modest sample of the total Canadian population and there’s a practical limit to how many people you can poll,” says Avery Shenfeld, managing director and chief economist of CIBC World Markets.
“An estimate of one month of employment is not that statistically meaningful. You have to look at a six-month average and see whether there’s a trend in the data, and if so, what it is.”
Statistics Canada says the mistake was an isolated incident caused by human error. On Aug. 15, it corrected the July Labour Force Survey job numbers to 41,700, from the 200 it initially reported on Aug. 8. Economists had predicted 20,000 to 30,000 new jobs.
As well, only 18,100 full-time jobs were lost—not the 59,700 reported earlier.
Adrian Mastracci, portfolio manager of Vancouver-based KCM Wealth Management Inc., says he didn’t change his investment outlook when the erroneous 200-jobs report was released. Nor did he do so when StatsCan revised the numbers.
“I assume that some of the data releases will be changed or amended, so I average the monthly data over a 12-month period when advising clients on investment strategies,” he says. “It’s more realistic and more relevant, and to me, doing an average over a year will have numbers with fewer mistakes.”
Yet the loonie lost half a cent in trading on the day of the erroneous release.
An overview of the Labour Force Survey, co-written by its chief, Christel Le Petit, states that "monthly estimates will show more variability than trends observed over longer time periods." She adds that "analysts need to take into account the standard error [a measure of how much an estimate varies from sample to sample] and look at trends."
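The smoothing that Mastracci and Le Petit describe amounts to a simple moving average over the monthly estimates. A minimal sketch of a six-month version, using hypothetical monthly net job changes rather than actual LFS figures:

```python
# Hypothetical monthly net job changes, in thousands (NOT actual LFS data).
monthly = [25.4, -9.1, 41.7, 12.9, -28.2, 33.5, 0.2, 15.6]

def moving_average(series, window=6):
    """Average each run of `window` consecutive months.

    A single month's figure reflects sampling noise as much as the
    labour market; averaging over six months damps that noise and
    makes the underlying trend easier to see.
    """
    return [sum(series[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(series))]

smoothed = moving_average(monthly)
# Wild single-month swings (e.g. -28.2) barely move the smoothed series.
```

The same idea, with a 12-month window, is the averaging Mastracci applies before advising clients.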
Still, Shenfeld says more reliable job numbers come from the U.S. Bureau of Labor Statistics' monthly Current Population Survey (CPS) and its Current Employment Statistics Survey (known as the payroll or establishment survey), which measures employment data from employers. Both are released at the same time.
Canada has two similar monthly programs. The LFS samples 56,000 households, or 110,000 people, and is roughly the same size as the U.S.'s CPS. The other is the Survey of Employment, Payroll and Hours (SEPH), which has a sample of 15,000 businesses and uses payroll data from the Canada Revenue Agency. But the SEPH's monthly report on estimated earnings and hours worked is released about two months after the LFS.
“People don’t pay as much attention to it,” says Shenfeld. “It would be helpful if we had a more timely survey of employers and how many people are working for them.”
Why the error?
A report, commissioned by Chief Statistician Wayne Smith and released Aug. 28, revealed that a programmer detected the problem Aug. 8, after the numbers had been released. After confirming the error, officials decided to remove the erroneous data, notify the public, and release a correction on Aug. 15.
“This has never happened before in LFS history,” says Armine Yalnizyan, senior economist with the Canadian Centre for Policy Alternatives.
“The LFS is among the most carefully put together surveys in the world. It’s a reliable indicator of changes in the labour market over the longer term, and is more timely than any other economic indicator that we have, save the Consumer Price Index. So it naturally gets examined with a fine-tooth comb by everyone from the Bank of Canada to the markets.”
But mistakes happen — in this case, thanks to a computer error.
Earlier this spring, Statistics Canada started updating the program it uses to calculate the jobs numbers. The agency does this every decade to account for changes in Canada’s population and economy. Changes made prior to the July calculation introduced a fault in the program: some survey respondents who should have been classified as employed were not.
That happened because the team updating the computer system didn't fully understand how the labour force survey program worked, says the StatsCan report. The systems documentation was "outdated, inaccurate and erroneously supported the team's assumptions about the system." Further, "communications […] around this particular issue were inadequate."
The commissioned report also looked at whether the initial results could have been plausible. Turns out, they were. An analysis of 355 LFS surveys since 1985 showed that when total employment is growing (as it has been), the LFS has charted a net change of 200 jobs or fewer nearly one-fifth of the time.
Regardless, since the program is updated only once a decade, it's unlikely the error will be repeated, says Shenfeld.
Both he and Yalnizyan say the survey wasn't affected by cuts at StatsCan, which has seen significant budgetary and staff losses over the past two years. And, in 2011, the federal government mandated the replacement of the mandatory long-form census with the voluntary National Household Survey. Academics say the voluntary survey's data isn't as reliable as the full census results were. In August 2013, StatsCan delayed the release of numbers from the voluntary survey by a month after workers detected an error.
Shenfeld says the July 2014 glitch was a “big annoyance” for day traders, who will remember it — but most Canadians will not.
"Stats Canada showed a lot of integrity by admitting to its mistake, correcting it, and not trying to bury it in the next month's numbers and trusting that it wouldn't leak out," says Shenfeld.
Let "he who has never had a computer error cast the first stone."