How IKEA Is Protecting Its Values Culture While Embracing AI


Companies everywhere are introducing AI at a pace that outstrips how quickly people can adapt. The pressure to modernize is real, and so is the strain it puts on established ways of working.

At Ingka Group, the largest IKEA retailer with operations in 32 countries, leaders recognized that tension early and set out to adopt AI in a way that wouldn’t put their culture at risk. The technology would move forward, but not without steady leadership and clear support for their people. This approach stands out because the company is adopting AI across its global workforce while keeping values at the center of every decision. It’s worth underscoring that IKEA’s approach reflects a deliberate social contract to put people first: a choice rooted in its values and history, not a universal standard, and not the only viable path for competing in an AI-driven market.

IKEA is not alone in taking this approach. A growing number of global organizations, including companies like Microsoft and IBM, are working to embed long-standing corporate values directly into their AI governance, product design, and leadership decision-making, recognizing that culture can’t be an afterthought in the age of intelligent systems. A strong culture does not automatically mean preserving jobs; it means being explicit about how technology and human work are balanced in service of the organization’s purpose.

IKEA’s people-first orientation is reinforced explicitly at the executive level. As Chief People & Culture Officer Ulrika Biesert emphasizes, “People have been at the heart of IKEA for over 80 years—and that’s exactly where they’ll stay.” It’s a disciplined approach that’s helping the organization modernize without losing the people who make it work.

Values as a Filter for AI

Before Ingka Group deploys a new AI tool, leaders examine the tasks within a job to understand where automation can help and what should remain human. “We look at the tasks within a job to understand where automation can help, so co-workers can focus on more meaningful work,” Biesert explains.

IKEA treats values such as togetherness, simplicity, and caring for people and the planet as practical decision criteria. Those values show up in real questions leaders use to evaluate any AI initiative:

  • Does this simplify or complicate the work?
  • Does this support co-workers and free up time for more meaningful work?
  • Does this align with fairness, inclusion, and sustainability?

That discipline isn’t only internal. Led by Chief Digital Officer Parag Parekh, the company signed on last year with the Partnership on AI (PAI) to help broaden standards around responsible technology and, in Biesert’s words, to “ensure that AI is developed and applied ethically, in line with our values of inclusiveness and caring for people and the planet.”

That same human-centric, values-first posture guides how Ingka evaluates partners. The company applies a Digital Ethics Group Rule that requires any AI partner or tool to be “robust, auditable, interpretable, fair, inclusive, and sustainable.”

Training Leaders Before Scaling Tools

One of Ingka’s most significant choices was preparing leaders before rolling out technology. During the company’s previous financial year (1 September 2023 to 31 August 2024), it trained approximately 30,000 co-workers, including around 500 senior leaders, on responsible AI so they can discuss the technology with their teams and support co-workers with care as AI evolves the way that work is done.

This is where some companies fall short. Not because employees can’t adapt to new technology, but because leaders talk out of both sides of their mouths. Employees can handle change when expectations are clear. What slows them down is value ambiguity: mixed signals about what the organization stands for, what is changing, and what will not be compromised. IKEA’s philosophy is simple: be explicit about its values and live them in practice, leading through “dialogue and everyday conversations” as work evolves.

It may not be flashy, but it’s one of the most effective cultural stabilizers available to executives navigating fast change.

Learning in Public: A Culture That Doesn’t Pretend to Have the Answers

Ingka has been testing AI in a range of practical areas: improving demand forecasts, supporting remote sales teams, and helping co-workers with everyday writing and planning. The tools vary, from the BILLY chatbot used by thousands of co-workers to Hej Copilot and the company’s own internal AI assistant (MyAI Porta), which helps with drafting, generating ideas, and managing co-worker workloads. They’re also experimenting with a GPT assistant to make digital customer conversations smoother.

What stands out about these pilots is the openness of leaders during the process about the fact that not everything works perfectly the first time. As Biesert put it, “We don’t have all the answers, but we learn by doing, testing, and improving together.”

That openness helps keep people engaged. When teams see leaders working through the learning curve instead of presenting a polished rollout, it makes the whole process easier to trust.

Sustainability Built Into the AI

AI has also strengthened IKEA’s sustainability efforts, particularly in food operations across its retail markets. Using AI-enabled measurement and smart scales, IKEA has:

  • Reduced food waste by 54%
  • Saved more than 20 million meals

They also evaluate energy-efficient model training and responsible data practices, ensuring AI implementation doesn’t increase environmental impact.

It’s a continuation of IKEA’s long-standing values-based approach: using AI in ways that are responsible and beneficial for the many people and the planet.

Five Leadership Practices Any Company Can Learn From

Across all of Ingka’s work, a handful of practices stand out:

  1. Build AI Literacy in Senior Leadership to Enable the Many to Transition. Ingka didn’t start with the broader workforce; it trained around 500 senior leaders first, specifically on responsible AI and how it connects to the IKEA values. Equip leaders to explain what’s changing, how to talk about uncertainty without fueling anxiety, and how to anchor decisions in shared values. Provide talking points, scripts, and FAQ sheets so leaders can guide their teams confidently through transformation, while supporting employees to upskill and grow. This enables what Ingka calls “leadership by all”: the belief that every co-worker is a leader and shares responsibility for stewarding change.
  2. Redesign Work by Studying “Tasks Within a Job,” Not Roles. This is a well-established transformation practice across large enterprises, and Ingka applies it rigorously by breaking jobs into micro-tasks and examining where automation can remove friction, where AI can augment human work, and where human contributions will remain essential. This approach lowers resistance by helping co-workers see how automation improves their day rather than threatening their role.
  3. Make Responsible AI a True Governance Practice. Create clear criteria for every AI tool or vendor before it enters the organization. Standards should go beyond compliance to include reliability, interpretability, fairness, inclusion, and sustainability. Build a simple “AI acceptance criteria” checklist and make it part of the intake process. When possible, participate in external standards groups to avoid designing governance in isolation and to keep your practices aligned with emerging norms.
  4. Use Everyday Conversations as the Primary Change-Management Tool. Instead of corporate broadcasts, Ingka relies on short, regular check-ins between managers and co-workers. These conversations surface confusion early, build trust, and give co-workers a safe place to express concerns as their jobs evolve. Micro-feedback loops are far more effective than top-down messaging during rapid change.
  5. Treat Pilots as Shared Learning Moments. Ingka openly acknowledges that pilots won’t always be perfect and encourages teams to share what they learned after each iteration. This reduces perfection pressure, increases psychological safety, and brings co-workers into the experimentation instead of waiting for a finished solution. Leaders model learning in public, and teams follow.

A Closing Note for Leadership Teams

One of the most striking elements of Ingka’s journey is how steady the human side of the organization has remained while the work changes. This steadiness has come from leaders showing up, listening to concerns, and staying close to the people doing the work.

Plenty of companies are moving fast on automation, often prioritizing efficiency and speed above all else. Which transformation models will prove most durable over time remains unresolved. Ingka’s experience illustrates one intentional approach: aligning AI adoption with a clearly articulated social contract so change is absorbed with fewer internal shocks.

For leadership teams navigating this wave of technological change, the lesson is not to copy IKEA’s choices, but to be equally explicit about their own: to state their values clearly, make tradeoffs consciously, and lead with consistency as work is redesigned. IKEA offers a useful case of what that clarity can look like in practice.

Originally published at Forbes