In the wake of allegations that Washington state’s top cannabis lab was artificially inflating THC test results and improperly passing samples that should have failed microbial safety screenings, the state’s cannabis-testing sector has found itself facing a serious credibility crisis.
Only one lab, Peak Analytics, was alleged to have engaged in the questionable practices, which first came to light in a Leafly investigation. But the fact that the company’s shoddy test results went undetected by regulators for so long has spurred consumers and industry insiders alike to ask who, if anyone, was testing the testers. Some have openly wondered whether the cannabinoid levels on product labels are even worth the paper they’re printed on.
Now a consortium of Washington state testing laboratories is taking matters into its own hands in an effort to win back trust.
After the initial lab exposé went live, The Cannabis Alliance (TCA), a Washington trade association, invited all 18 of the state’s licensed cannabis-testing labs to a meeting. According to Nick Mosely, co-owner of Confidence Analytics, 10 labs attended the meeting, held in the central Washington city of Ellensburg.
“We need the trust of the producer-processors if we’re going to benefit our businesses and our business relationships,” Mosely, Confidence Analytics’ chief science officer and a TCA board member, told me. “Beyond that, obviously the consumers care and are interested in lab testing and what it means for the quality of the product.”
The group of testers arrived at a plan to measure themselves against one another. They settled on a round-robin model, in which each lab would receive blinded, pre-tested samples from one of TCA’s grower members. None of the labs would know which grower the sample came from or what its cannabinoid content was. Each lab would be asked to test the sample to its normal standards using high-performance liquid chromatography (HPLC), a common testing procedure. Results would be publicly available on TCA’s website. [Editor’s note: We’ve embedded the full report at the bottom of this article.]
Eight labs ended up participating in the round robin: Analytical 360, Confidence Analytics, Green Grower Labs, Medicine Creek Analytics, Molecular Testing Labs, Steep Hill, Washington Testing Technologies, and Trace Analytics. Each tested two flower strains, a butane hash oil (BHO) sample, and kief. All the samples were homogenized via a process called freeze-milling, a more reliable method than simply grinding the material in a traditional blender.
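To picture how a blinded round robin works, here’s a minimal sketch in Python. It is purely illustrative: the lab and sample names are placeholders, and TCA’s actual exchange involved physical samples and a human coordinator, not software.

```python
# Illustrative sketch of a blinded round robin; NOT TCA's actual procedure.
# Lab names, sample names, and the ID scheme are all hypothetical.
import uuid

LABS = ["Lab A", "Lab B", "Lab C"]                  # placeholder participants
SAMPLES = ["Flower 1", "Flower 2", "BHO", "Kief"]   # homogenized material

def blind(samples):
    """Give each sample a random ID; only the coordinator keeps the key."""
    return {uuid.uuid4().hex[:8]: s for s in samples}

def distribute(key, labs):
    """Every lab receives every blinded sample -- a full round robin."""
    return {lab: sorted(key) for lab in labs}

key = blind(SAMPLES)
shipments = distribute(key, LABS)
# Labs report results keyed by blind ID; the coordinator unblinds with
# `key` and compares each lab's numbers against the rest of the field.
```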
The thinking went like this: If the results were to come back in a tight cluster around the sample’s known cannabinoid content, that would be an indication the labs were operating under the same standards. A wide spread, on the other hand, would indicate bigger methodological issues at play.
The outcome? The resulting spread of percentages—looking at CBDA in Sour Tsunami and THCA in Dutch Treat—was less than three points, according to a report of the findings. That’s relatively good news.
“While the outcomes of this experiment lend credibility to those labs willing to collaborate and show that the variability between them (described as one standard deviation) is less than 1/10th the measurement,” the report says, “there is always room for improvement.”
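For the statistically minded, the report’s benchmark (one standard deviation under a tenth of the measurement) is equivalent to a relative standard deviation below 10%. Here’s a rough Python sketch of that check; the potency figures are invented for illustration and are not the round robin’s actual data.

```python
# Illustrative check of the report's benchmark: is one standard deviation
# of the labs' results under 1/10th of the measurement itself?
# The reported values below are hypothetical, not the real round-robin data.
from statistics import mean, stdev

# Eight labs' reported THCA percentages for one blinded flower sample
reported = [18.9, 19.4, 20.1, 19.8, 18.7, 20.3, 19.2, 19.6]

avg = mean(reported)       # consensus value across labs
spread = stdev(reported)   # one standard deviation
rsd = spread / avg         # relative standard deviation

print(f"mean: {avg:.2f}%  stdev: {spread:.2f}  RSD: {rsd:.1%}")
print("within benchmark" if rsd < 0.10 else "outside benchmark")
```

With these invented numbers, the spread is about half a percentage point on a roughly 19.5% measurement, an RSD near 3%, comfortably inside the 10% benchmark the report describes.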
Labs that reported higher THC values than their peers tended to do so for all samples (likewise with labs that reported lower values), which the report’s authors suggested was most likely due to methodological differences in how each lab performs test procedures. The report also notes that there was disagreement among labs when it came to minor cannabinoids, such as CBD and CBG.
Despite the differences, the report commended the eight participating labs for their mutual cooperation. “To have industry leaders, and business competitors, working together toward meaningful improvements to standards of practice is especially needed in a nascent industry where the unknowns are multivariate and the guidelines are still developing,” it said.
Doing a ton of free testing, of course, isn’t something labs are usually keen on. But Mosely said the aim—to get the state’s testing labs on the same page—was well worth it.
“There obviously is quite a bit of extra effort in order to get that consistency,” he said, adding that “everybody up and down the chain is benefitting. It starts with the labs.”
The round-robin testing, Mosely added, goes above and beyond the standard proficiency testing performed by RJ Lee, the state’s accrediting body for labs. Because RJ Lee is located out of state, it can’t handle physical cannabis, which makes its tests less applicable, Mosely said. The round robin, performed entirely within the state’s licensed cannabis system, better addresses the issue, he contended.
What the round-robin approach doesn’t do, however, is replace enforcement. Washington data scientist Jim MacRae, who has focused much of his work on tracking suspect results by the state’s cannabis labs, pointed out that all the labs involved knew they were participating in a round-robin test, even if they didn’t know the cannabinoid content of the samples they were testing.
“It’s these guys saying, ‘Here’s how good of a job we can do,’” MacRae told Leafly. “It’s certainly a cynic that would say they’re not doing this on a daily basis, but basically that test is an opportunity for them to show how good they can be.”
That fact, MacRae argued, underscores a fundamental problem with existing proficiency testing performed by the state’s designated lab auditor, RJ Lee: The testing is announced in advance.
“The labs know they’re being tested and evaluated, and they presumably put on their best face and do the best that they can,” MacRae said. Which means RJ Lee’s proficiency testing—as well as the round-robin tests performed by TCA—are great at evaluating a lab’s “capability” but not necessarily its “culpability,” as MacRae put it.
“Capability shows what they can do when they’re on their best behavior,” he said. “If what they display when they don’t know they’re being evaluated is significantly different than what they display when they know they’re being evaluated, then there’s a problem there.”
Mosely, of Confidence Analytics, agreed. But he said the effort to ensure consistency between testing labs isn’t meant to replace enforcement efforts.
“Proficiency testing is not intended as an enforcement mechanism,” he said. “Neither is round robin. The participants know they are participating. They do it to improve their methodology, not as a gotcha. A good proficiency testing program helps good labs do better.”
But what about labs whose problems have more to do with ethical deficiencies than methodological ones?
Identifying bad actors among the state’s licensed cannabis labs is a difficult process, one that requires collecting data from labs without tipping them off. That responsibility rests with the Washington State Liquor and Cannabis Board (LCB). And MacRae, for one, doesn’t think the agency is taking the job very seriously.
“The LCB has allowed this to continue on and on and on,” he said. “The bottom line is this: When labs do [this]—giving superior potency results, failing to fail samples—the LCB doesn’t seem to give a shit.”
The LCB does, however, operate a secret shopper program for labs, and it has tested 220 samples since the program began in late 2016, according to Brian Smith, the agency’s communications director. Regulators so far haven’t issued any violations based on those tests and are still reviewing the results, he said.
The LCB did recently suspend Peak Analytics’ testing license, but that action came in response to an audit by RJ Lee, which itself was prompted by an outside complaint against Peak—not by a secret shopper.
As Mosely points out, however, the LCB’s secret-shopping program can be difficult to administer.
“Secret shopping is expensive. You have to pay for the tests. You have to lot the samples in traceability. You have to send someone out undercover. One sample is not enough. You need n-power if you want to stand in court and make a case,” he explained. “Lots of ins, lots of outs.”
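Mosely’s “n-power” remark is, at bottom, a sample-size question: how many undercover tests would it take to demonstrate, to a legal standard, that a lab is inflating results? Here’s a back-of-the-envelope sketch using a standard one-sided test; the bias and noise figures are hypothetical, not drawn from the LCB’s program.

```python
# Rough sample-size sketch for Mosely's "n-power" point. The assumed bias,
# noise, and test are hypothetical, not the LCB's actual methodology.
import math
from statistics import NormalDist

def samples_needed(bias, noise_sd, alpha=0.05, power=0.80):
    """One-sided z-test sample size: n = ((z_a + z_b) * sd / bias)^2."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha)   # critical value, ~1.645 at alpha=0.05
    z_power = z.inv_cdf(power)       # ~0.842 for 80% power
    return math.ceil(((z_alpha + z_power) * noise_sd / bias) ** 2)

# Suppose a lab inflates THC by 1 percentage point and honest results
# vary lab-to-lab with a standard deviation of about 1 point:
print(samples_needed(bias=1.0, noise_sd=1.0))   # -> 7 samples, not 1
```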
In the meantime, the question of how to effectively stop labs from cheating remains unanswered. While the round robin represents a significant step forward for lab standardization, it doesn’t replace the type of consumer assurance that government oversight provides. Whether the LCB’s program will develop into something that does is anyone’s guess.
Embedded report: Cannabis Testing Laboratory Round Robin, Round 1 Results — The Cannabis Alliance (via Scribd)