‘Do you want some privacy with that?’

Burger King’s AI-powered headsets raise concerns with HR experts when it comes to surveillance, morale and performance management


Burger King’s plan to roll out an AI‑powered voice assistant called “Patty” to Canadian restaurants is being billed as a “game-changer” — but experts say the impact on workers will depend heavily on whether the tool functions as support or as “surveillance with better branding.”

Restaurant Brands International (RBI), which owns Burger King, Tim Hortons, Popeyes and Firehouse Subs, plans to roll out the new tech in the second half of 2026 after piloting the tool in 500 U.S. restaurants. The “BK Assistant” runs through worker headsets, listening to conversations with customers and prompting staff to improve service quality and efficiency, while flagging operational tasks such as updating menus, cleaning washrooms or refilling drink machines.

“It is extraordinary, it is a game-changer in terms of how you run restaurants,” said RBI executive chairman Patrick Doyle.

But from what is known so far about the new tech, experts such as Adam Seth Litwin have concerns.

“If anything, a system that listens in real time and feeds managerial judgments back through the headset risks feeling less like support and more like a very modern form of micromanagement,” says the associate professor of industrial and labour relations at Cornell University.

“And when you’re in a customer-service business, morale is not a side issue — it is part of the product.”

Pros: Real-time coaching, consistency

For employers, the appeal of AI‑enabled headsets starts with consistency across large workforces: guiding workers through tasks and supporting training, says Stacey Cadigan, partner at IT advisory firm ISG.

“One of the big benefits is being able to enable more operational discipline at scale, so really standardizing how a lot of the front-line work can get done.”

From an employee perspective, real-time coaching and instruction can be “extremely valuable,” especially for new hires, she says.

“In terms of being able to onboard workers faster and being able to reduce errors and challenges across locations and improve consistency, that’s certainly a main benefit.”

Pros: Measuring effectiveness

Another key advantage is the data generated. Cadigan points to task-level and performance data and insights that organizations can use “to help measure and track and understand how they're being able to improve some of the key metrics to tie to things like new‑hire proficiency or customer experience.”

Litwin agrees that the richest benefit may be the “data exhaust” from platforms like Patty.

“This kind of system can generate a very fine-grained record of what happens in real time — what gets said, what works, what slows people down and where the bottlenecks are,” he says.

“That gives the company a much better read on the effectiveness of cross-selling, promotions and other customer interactions.”

In theory, that can boost productivity by making interactions more efficient and helping management identify “where customer conversations stall, where employees need support, and where the operation is losing time or sales,” says Litwin, who is also a Stephen H. Weiss junior fellow and program director/director of graduate studies at the ILR School at Cornell University.

Employers experimenting with AI headsets are looking to close “real or perceived operational gaps such as productivity, efficiency, consistency and training,” says Aida Abraha, a PhD researcher at Osgoode Hall Law School who focuses on AI regulation, law and work.

“On paper, the benefits are enormous,” she says, citing real-time visibility into service patterns, continuous training for workers and a more standardized customer experience across locations, while alerting managers when inventory runs low, smoothing workflows and reducing wait times.

“But, as with most workplace technologies, the story isn’t just about capability — it’s about how the technology is used,” says Abraha.

Cautions: When coaching feels like surveillance

RBI’s Doyle has described Patty as putting “service under the microscope because it’s listening to employee interactions to uncover room for improvement,” including behaviours such as “Are they being friendly? Are they saying welcome? Are they saying thank you for coming to Burger King?” according to The Canadian Press.

That line between support and surveillance is where experts see the biggest risks. Standardization can carry hidden costs, such as reducing workers’ judgment and autonomy, says Abraha.

“Without thoughtful guardrails — protections that preserve worker dignity, fairness, safety and a degree of control over how workers do their jobs — tools like the AI-enabled headsets risk amplifying the very problems employers are trying to solve: worker engagement, productivity, efficiency and customer satisfaction.”

Employees may feel surveilled or micromanaged, says Cadigan.

“If it’s not handled correctly, the line between are they being coached or are they being monitored is an important one,” she says. “That can certainly backfire and have some real negative consequences in terms of decreasing morale or creating resistance among employees or damaging the brand.”

If workers experience Patty as a digital boss “hovering in the background,” Burger King may discover that the technology can be both clever and unwelcome at the same time, says Litwin.

“I can understand why workers or supervisors might feel annoyed, wary or even a bit panicked. A system that presents itself as coaching can easily be experienced as surveillance with better branding.”

Cautions: Performance management

A Burger King spokesperson said the device is not designed to track or evaluate whether employees say specific words or phrases, and described BK Assistant as “a coaching and operational support tool built to help our restaurant teams manage complexity and stay focused on delivering a great guest experience,” according to the Guardian.

“It’s not about scoring individuals or enforcing scripts. It’s about reinforcing great hospitality and giving managers helpful, real-time insights so they can recognize their teams more effectively.”

Maintaining that boundary will be important, according to Cadigan.

“Stepping past coaching and having real-time feedback and using it in that vein to become more of a 24-hour surveillance or monitor and using it directly in terms of a potential punitive or performance discussion, that crosses into a different threshold in terms of risk.”

Cons: Privacy concerns with ear coaches

Real‑time audio capture also raises significant privacy concerns, says Abraha.

“Depending on how they’re used, these devices have the potential to capture workers’ interactions with customers and co-workers, often without workers’ meaningful consent.”

Day‑to‑day conversations might be recorded and analyzed without clear transparency around what happens to the data, she says, such as using it for performance evaluations, disciplinary action, termination or to train a large language model (LLM) that could eventually replace workers with service chatbots.

That kind of use can also hurt adoption rates, says Cadigan, and may clash with employers that emphasize trust, autonomy or team environments.

“That becomes hard… to square up with the brand and some of the morale and engagement that you’re trying to create.”

Cons: Bias, over-standardization

There are also real risks of bias and discrimination, says Abraha, as the AI‑enabled headsets may not equally understand or evaluate all workers, especially those with different accents, speech patterns or cultural communication styles.

“Concepts like ‘politeness’ or ‘friendliness’ are not neutral but culturally shaped,” she says. “There’s a real risk of penalizing workers for their accent, speech pattern; it may result in unfair assessments or negative outcomes for workers, even if their performance is otherwise strong.”

Cadigan also recommends training managers on the new tech, so they’re not misusing or over‑relying on the data to discipline employees instead of using their own judgment.

“There’s always a danger of managers maybe feeling like now they've got the AI to do a lot of the coaching and relieving them of the task,” she says.

Cons: ‘Robotic interactions’

Another concern? For some employees, the constant voice in their ear could degrade performance if they feel overloaded; others might tune it out altogether, says Cadigan.

“It could result in more robotic interactions versus authentic interactions that they might normally do.”

Litwin is keen to see whether the company builds a visible upside for workers.

“I would want to know whether Burger King plans to use Patty to encourage, compliment or reward workers who do especially well,” he says. “Absent that, the technology looks less like coaching and more like measurement — and employees are usually sophisticated enough to notice the difference.”

If the gains all flow upward, Litwin says — meaning “better data, better control, better standardization — while the burdens flow downward to workers, then the company should not be surprised if the reaction is irritation rather than buy-in.”

Cons: Lower mental health, engagement

Workers’ mental health is another concern, as real-time coaching can easily blur into real-time surveillance, says Abraha.

“Continuous feedback can support training or it can create pressure on workers and the potential to impact their mental health and well-being.”

That can give workers the sense of being “always on,” increasing stress levels in ways that are similar to highly monitored environments like call centres, she says.

“Over time, it’s not difficult to imagine that this level of surveillance and pressure to perform continuously could lead to burnout, anxiety, and reduced job satisfaction.”

Litwin is skeptical that the new devices will improve engagement.

“When workers know a system is listening for tone, politeness and performance signals, the likely result is not calm professionalism,” says Litwin. “It is stress, self-consciousness and the sense that someone — or something — is always watching.”

Tips for HR: Designing for support

Experts say the difference between support and monitoring is as much about design and orientation as it is about technical capability.

Instead of looking at how to make workers more machine-like, says Litwin, companies like Burger King should be asking, “What do our people already do especially well, and how do we let them do more of that? What else do they need from us in order to do their jobs better and to be happier doing them?”

If the goal is really performance and service quality, the best technologies usually take work off people’s plates, “especially the parts that are tedious, frustrating, or just plain awful,” he says.

“In a sector that is low-paid and high-stress, that kind of design choice could be a genuine advantage.”

Tips for HR: Involving workers ‘crucial’

Abraha outlines several steps employers can take to keep AI headsets on the support side of that line.

First, involving workers and unions early in the process, before the procurement and implementation of these devices, is “crucial,” she says.

Litwin agrees: If Burger King doesn’t meaningfully involve managers and frontline workers in shaping the technology, he says, “it may be depriving itself of the operational knowledge needed to make the system effective.”

Second, employers, together with worker representatives, should establish clear guidelines around how the AI-headset device is used, including what data is collected, how it will be used, and who has access to it, says Abraha.

Third, if the primary purpose is training and support, she suggests defining “a clear and limited training period” for the use of AI-enabled headsets and offering workers the option to opt out if they choose not to use the technology.

Fourth, there should be ongoing audits of these devices “to ensure they are being used fairly, and that any issues related to bias, misuse or unintended impacts are identified and addressed,” says Abraha.

She notes that the Supreme Court of Canada has recognized “the importance of worker dignity and autonomy,” emphasizing that workers retain “a reasonable expectation of privacy at work and that employer surveillance must be justified, proportionate and reasonable.”

Tips for HR: Transparent data collection

Cadigan similarly stresses transparency around what is and isn’t being tracked, what data is collected and why, who will see it and how long it will be stored. That means framing it properly by saying, “This is a support. It's not a surveillance. This is a coach. We're not monitoring every word. It's not going to be punitive.”

Whether Patty feels like a helpful coach or a constant supervisor will be decisive, she says, as “the adoption will really hinge on that trust and the visibility more than the sophistication of the technology or the AI itself.”

Listening to workers and adjusting the rollout is also critical, says Cadigan.

“[It’s about] making sure that you have regular feedback from the workers and being prepared to adjust and act on that as needed.”
