What Alexa’s global rollout reveals about AI, culture and the Canadian workplace

Capital One Canada’s CPO says company culture is ‘the fabric of who we are’ and can’t be replaced by AI


When Amazon launched Alexa+ in Mexico, the company learned a crucial lesson for employers: connecting on a culture level goes far beyond language, local jokes and slang. 

In its pursuit of a tool that it says “feels like one of the family” in homes around the globe, Amazon deployed complex teams – including engineering, data science, and country-specific local teams – to figure out the nuances of those cultures and tread the line between authenticity and stereotyping. For example, in an article on Amazon’s news site, Alexa in the Americas director Carlos Perez said that while preparing Alexa+ for launch in Canada, development teams “found too many moose references in responses.” 

For Canadian employers deploying AI tools in their workforces for efficiency gains, the Alexa+ example reinforces a recurring argument against the breakneck pace of adoption: even the most finely tuned LLM cannot replace employee-level interaction. 

And that interaction is crucial for culture, says Susan Zettergren, chief people officer at Capital One Canada. 

“When you talk about an AI-enabled device, being able to understand local slang or word usage can be really important for that to work as a tool,” she says. 

“However, when you think about a workplace … culture is something that's built through humans and human interaction.” 

Disconnect mirrors AI culture gaps 

A disconnect between surface usage of AI tools and deeper understanding of what workers want from their work culture is already showing up in Canadian workplaces. Recent surveys cited by Canadian HR Reporter reveal that nine out of 10 AI users have abandoned workplace tools at least once out of frustration, while executives remain far more optimistic about adoption. 

Zettergren describes culture as “the fabric of who we are,” including the unspoken rules that shape how work gets done and decisions are made, as well as building trust. That, she says, is where AI tools quickly reach their limits.  

“While understanding and using AI is important for us as workers at this point in human history, it's not the only thing,” she says, emphasizing that AI adoption in the workplace is not “binary” – employers must balance tech with human connection.  

In many organizations, Zettergren notes, leaders risk “over-indexing on one thing,” especially when AI is framed as a quick solution to productivity pressure rather than a tool that needs integration into how people work. 

“In parallel to building those types of skills and having increasingly more sophisticated AI or LLM-enabled tools, [it’s about] how do you also continue to bring to bear the most important things around judgment, connection, empathy and context, which are all things that create a culture?” she says. 

Why top‑down AI strategies strain trust 

Amazon’s Alexa+ development depended on understanding limits: knowing where AI could adapt, and where human insight remained essential. For Zettergren, the same lesson applies to the workplace; as organizations continue their AI transformations, culture may become the clearest measure of whether the strategy is working, with leadership being the pressure point. 

“There is so much to be said for what you bring as a person, or who you are as a leader,” Zettergren says.  

“If you're too solely focused in one place, you're missing an important whole‑person approach to success.” 

Zettergren emphasizes that successful adoption requires clarity and restraint, along with consistent training and career advancement focused on learning the whole spectrum of skills from tech, AI use and governance to leadership and collaboration.  

“It's bigger than just the thing that you've delivered. It's about how you were able to do that work and how you were able to do it in a collaborative way inside the organization,” Zettergren says. For the tech side, she shares that Capital One uses departmental “champions” – team members who work to engage teams with AI strategies and use cases.  

This helps ensure that tech adoption doesn’t overshadow cultural concerns, especially as the technology continues to evolve. 

“And the way we're using it is continuing to evolve, so that it is part of our collective learning, in addition to those more soft skill spaces.” 

Early‑career anxiety around AI automation  

One of the clearest cultural signals around AI, Zettergren says, is emerging anxiety among early‑career workers – a dynamic echoed in broader workforce surveys.  

“If you think about a lot of the headlines and the storytelling about AI, it touches on junior or entry‑level task automation,” she says.  

“You can quickly see how someone new to the workforce might feel very scared.” 

The response from employers must go beyond technical training, says Zettergren. By combining focused tech skills training with essential soft skill and collaboration development, organizations can preserve their pipeline of future leaders and organizational knowledge.  

Without that assurance, employers risk eroding the very intellectual capital they depend on.  

“The opportunity is to build confidence through coaching, feedback and developing skills,” she says. 

“[It’s about] getting people comfortable with other places, where they might not be thinking about using AI – or maybe they're thinking about it, but they're not sure how to go about it – having the central team that's refining those use cases and sharing them out with folks on a regular basis.” 
