As police staffing shortages and public safety concerns continue to dog Alberta police services, more municipal agencies say they’re turning to artificial intelligence for support.
“Our staffing numbers are critical,” Calgary Police Service Deputy Chief Cory Dayley said. “We need to do things differently, for sure, to provide the city what it needs.”
The Calgary Police Service (CPS) first introduced Microsoft Copilot, a form of generative AI, in 2024.
Dayley said that since then, about 800 employees, both professional staff and sworn members, have been using it on a shift-by-shift basis to help with their daily duties.
“It’s deployed to everybody in the service, they have access to AI,” Dayley said. “Stories we’re hearing are how members are using it for reports … to help create more efficient streamlined processes on the front line, in our investigative areas, and our analytical areas.”
As the CPS looks towards the future, report writing and analytics could be the first steps in a whole suite of technological upgrades for officers, with AI software also available for police body-worn cameras.
Just outside Calgary’s city limits on the Tsuut’ina First Nation, some of these upgrades are already being put to work.
“We started about two years ago with the acquisition of the software platform Axon,” said crime analyst Trish Pace with the Tsuut’ina Nation Police Service (TNPS).
Pace explained that officers have been using Axon’s Notes Module, a mobile app that allows officers to dictate their notes, as well as Draft One software, which uses generative AI and the audio from police body-worn cameras to produce officers’ draft reports.
“The reports that are coming forward are more complete, are better crafted and more detailed than what they were doing previously,” Pace said.
“It saves them time in drafting the bones of the report,” Pace added. “But, throughout the report, there will be prompts by the software to say, ‘You need to fill in more information about their arrest.’
“It forces the member to read through the report and find those areas that the software is saying, ‘We haven’t done this for you.’”
AI guardrails
It’s this human oversight that law experts believe is critical in this extremely fast-moving industry.
“The problem is, if the AI hallucinates,” Calgary criminal defence lawyer and former Crown prosecutor Balfour Der said. “It starts making up things or it starts filling in blanks with thoughts and ideas or words that weren’t really there to start with.
“It will certainly be a matter of concern in court because as you know, we hang on every word.”
There have already been issues in the justice system, including a case in 2025 in which a Toronto lawyer faced criminal contempt of court proceedings after including cases invented by AI in her submissions and then denying it.
“That’s the most important thing, the author has to double-check it’s accurate,” Der said. “If an officer comes and says this is an AI-generated report, then we would need to know what it’s based on.”
The Calgary Police Service said it formed an AI governance framework and an ethics steering committee early on to help navigate these concerns and guide when and how its employees use the technology.
“Privacy leads what we’re doing, the law leads what we’re doing here and it’s responsible and ethical use of AI at the forefront,” Dayley said. “There’s still human moderation, which then helps us to eliminate bias or things that, quite frankly, might just be untrue.”
The Tsuut’ina Nation Police Service said that along with human moderation, its reports must also be approved by a supervisor, and right now, Draft One transcription services are only used for lesser offences.
“Anything more serious — like an assault with a weapon, a robbery file, a homicide — those are still required to be done from scratch by the members themselves,” Pace said.
The company behind the police body-worn cameras said it also has a set of guiding principles in key areas, as well as an Ethics and Equity Advisory Council.
“Axon recognizes the immense promise of ethical AI innovation and aims to harness cutting-edge AI technology to revolutionize public safety, all the while prioritizing the rigorous mitigation of biases and other potential risks,” an Axon spokesperson said in an email to Global News.
“This includes building tools that preserve the crucial role of human decision-making with specific controls that keep officers in-the-loop.”
Meanwhile, AI consultant Stéphane Contré with Storm Analytics highlighted that criminals are also harnessing the power of AI and police must be up to date with how the technology works.
“Criminals are using AI right now to do very sophisticated attacks,” Contré said. “It’s incumbent upon police agencies to keep up and at least understand what the technology is and what the capabilities are so they can counteract it efficiently.”
AI testing and transparency
In 2025, the Edmonton Police Service (EPS) announced that it would be testing new AI facial recognition technology.
“The EPS has partnered with Axon Enterprises Incorporated and will be the first police service in the world to test Axon’s body-worn video camera recognition technology,” EPS acting superintendent Kurt Martin said in a December 2025 news conference.
Officials said the limited trial would assess the feasibility and functionality of the technology with up to 50 officers testing the facial recognition feature.
EPS said the cameras would use mug shot images from the police service’s database, and later, specially trained officers would review the footage to see if the facial recognition software worked as intended.
“Facial recognition on body-wearing cameras is one tool that we are exploring to potentially reduce investigative timelines,” Martin said. “If this technology can help us identify some of the suspects in our database who are wanted for serious criminal warrants, we want to explore that as an option.”
But some Alberta law experts believe there needs to be more transparency from police agencies and are watching closely to see how accurate the technology is.
“Depending on people’s demographic variables, there tends to be a susceptibility of many of these facial recognition technologies to not properly identify individuals from certain social backgrounds, for example, minorities,” Temitope Oriola, a professor of criminology and sociology at the University of Alberta, said in a December 2025 interview. “That is a thing that appears to be baked into the very functionality of a lot of these tools.”
Gideon Christian, UCalgary’s research chair in AI and law, also pointed out that while police should be able to use AI technology in various ways, more needs to be done to increase public awareness.
“My continued concern, especially when it comes to the use of these tools in public service, is the lack of transparency,” Christian said. “You do not help in addressing suspicion by not being transparent about the use.
“That lack of transparency is disturbing; it does not build trust.”
The Edmonton Police Service said that after the test period, any still images used for facial recognition would be deleted but the original video would be kept according to EPS rules.
The service said the results would be reviewed by the Edmonton Police Commission and the Chief’s Committee and that, based on those findings, it would consider whether to move forward with further testing in 2026.
EPS also said it submitted a privacy impact assessment to Alberta’s information and privacy commissioner.
Alberta privacy laws and AI
Finding the balance between enabling innovation and maintaining public trust is front and centre for the Office of the Information and Privacy Commissioner of Alberta (OIPC).
“It is being discussed everywhere, every meeting I’ve gone to, every conference I’ve gone to, this is the topic,” Alberta information and privacy commissioner Diane McLeod said. “It’s artificial intelligence and it’s about trust and the lack of trust.”
Just last year, Alberta modernized some of its privacy laws and repealed the Freedom of Information and Protection of Privacy (FOIP) Act, replacing it with the Access to Information Act (ATIA) and the Protection of Privacy Act (POPA).
The commissioner said that while these changes permitted the use of automated systems in Alberta, including AI and added some guardrails, there are still gaps when it comes to individuals’ rights.
“What we’re missing is certain rights with respect to individuals such as the ability to object, the ability to decline, the ability to appeal,” McLeod explained. “So some of those things we did recommend as inclusion in this legislation were not included.”
In a 2025 report, the OIPC also highlighted that Alberta’s privacy laws need to continue to be updated to ensure they offer sufficient privacy protections when personal information is used to train AI or to make decisions that impact Albertans.
“I think that if Albertans or Canadians have concerns about the use of artificial intelligence and the lack of adequate controls to protect their privacy and protect them from other harms, I think they should probably bring that up with their member of parliament and their legislative member as well,” McLeod said.
Meanwhile, Alberta’s private sector legislation, the Personal Information Protection Act (PIPA), is currently under review and the commissioner is hopeful that administrative monetary penalties will be added to further strengthen the OIPC’s abilities to penalize private sector organizations that do not follow the law.
“I am optimistic the government will proceed with that, because as you can appreciate, there are large corporations in the world that are processing highly sensitive personal information, including that of children,” McLeod said.
However, when it comes specifically to policing in Alberta, the provincial government said in an email to Global News that it believes there are sufficient regulations in place for police agencies.
“In Alberta, police services are independent and make their own operational decisions, including whether and how to test or use new technologies,” said Arthur Green, the press secretary of the Office of the Deputy Premier and Public Safety and Emergency Services.
“While there is currently no single, standalone law focused solely on artificial intelligence in policing, police services already operate within a well established legal framework that includes provincial privacy law, the Canadian Charter of Rights and Freedoms, criminal law and court oversight.”
In an email to Global News, the Department of Justice Canada also highlighted that Canada’s legal framework reflects the division of powers between federal and provincial governments and pointed to governance in the Charter of Rights and Freedoms.
“The Charter provides key governance through its principles as well as through jurisprudence, including protection in section 8 from unreasonable search and seizure,” said Department of Justice Canada senior media relations advisor Ian McLeod, “which has been interpreted to include protection of a reasonable expectation of privacy.”
In terms of federal policing, the justice department said that responsibility primarily falls to the Royal Canadian Mounted Police (RCMP).
“Federal police are subject to the Treasury Board Policy on Service and Digital,” Ian McLeod said. “This policy regulates the use of digital technologies, including artificial intelligence, by federal institutions to ensure responsible, transparent, and accountable adoption and management of these technologies.”
Private sector AI and federal legislation
While Canadians and industry wait to see what happens with possible federal AI legislation, experts believe a framework is needed now.
“It’s very important to have the regulation and the regulation seems to be one of the biggest problems that we have had in regards to artificial intelligence in Canada,” Christian from UCalgary said.
The urgency of this issue has once again come to the forefront following the tragic school shooting in Tumbler Ridge, B.C.
On Tuesday, Artificial Intelligence Minister Evan Solomon summoned OpenAI representatives to Ottawa following revelations the tech company had banned the shooter in June 2025 for misusing the AI chatbot “in furtherance of violent activities.”
The company said it did not inform police at that time because the activity did not meet the higher internal threshold of an “imminent” threat.
OpenAI ultimately contacted the RCMP after police said 18-year-old Jesse VanRootselaar killed eight people and wounded 25 others on Feb. 10 before taking her own life.
“I’m working closely with Public Safety, the justice minister, the heritage minister and the Bill on Privacy and Data that we have forthcoming to make sure Canadians are kept safe,” Solomon said at a news conference in Ottawa. “I will tell you this: all options are on the table.
“We are working very closely together to make sure that we have a suite of measures to protect Canadians, and I will say that suite of measures will be thorough and make sure that Canadians are kept safe.”
OpenAI told Global News Tuesday evening that the company appreciated the “frank discussion on how to prevent tragedies like this in the future.”
“Over the past several months, we have taken steps to strengthen our safeguards and made changes to our law enforcement referral protocol for cases involving violent activities, but the ministers underscored that Canadians expect continued concrete action and we heard that message loud and clear,” a spokesperson said.
“We’ve committed to follow up in the coming days with an update on additional steps we’re taking, as we continue to support law enforcement and work with the government on strengthening AI safety for all Canadians.”
AI policing path forward
While the Calgary Police Service eyes new AI technology, including possibly one day expanding its drone program, the service said it continues to collaborate with police agencies from across the country and internationally and is ready for further direction.
“We watch very closely what guidance we should be seeing from our province and our legislative bodies,” said Dayley, the CPS deputy chief. “We’re waiting for that information, we welcome it, but in the absence of those decisions yet, we also need to understand that technology is being utilized.”
Meanwhile, Pace at the Tsuut’ina Nation Police Service said the AI technology is essential for police agencies in need of officers.
“Right across the board, you’re seeing services that are asking for bodies,” Pace said. “Grande Prairie is brand new, the provincial police service is brand new, everybody’s fighting over the same bodies.
“We’re a small agency; we cannot grow fast enough to keep up with information and with our population that’s coming on nation.”
On the other side of the justice system, criminal defence lawyer Der said he will see how the technology is received in the courts and ultimately by Canadians.
“When we are dealing with people’s liberty, citizens’ liberty, we don’t want to cut corners,” Der said.
“We don’t have that kind of room for error.”
— with files from Sean Boynton, Jasmine King and The Canadian Press’s Paola Loriggio