Viewpoint

March 17, 2026

The AI Paradox: Why automation must design for human dignity, not efficiency

By Adewole Ampitan

The greatest risk of artificial intelligence is not that it will replace humans, but that it will erode human understanding: we will become so enamoured with what machines can do that we forget to ask what humans need. This is the AI paradox.

The speed of AI adoption makes this paradox even more urgent. Recent global surveys show that more than 70% of organizations are already using AI in at least one business function, and the number continues to grow each year, making automation no longer experimental but infrastructural.

Now, here is my question: what does AI mean for the future of work, design, and human experience? My answer is the paradox itself: the more powerful our tools become, the more essential human-centered design becomes; and the more we automate, the more we must design for dignity.

What I Learned Building AI-Powered Design Tools

When I created DesignFlow Kit, an open-source UX toolkit, I made a deliberate choice to integrate AI capabilities throughout the workflow. The toolkit includes 30 AI automation workflows and 8 GPT-4o powered chat assistants, designed to help designers move from research to handoff more efficiently. Building these features taught me something unexpected: AI is not a substitute for design thinking, it is an enabler of it.

The AI assistants in DesignFlow Kit do not make design decisions. They synthesise research, generate options, and automate repetitive work, freeing designers to focus on the parts of the process that require human judgment: understanding users, interpreting context, and making ethical choices about how systems should behave.

The assumption that AI can replace the human elements of design is wrong. Algorithms cannot generate empathy, and empathy is essential to every designed product. Yet this dangerous assumption is spreading across the technology industry, and it is leading us toward products that are efficient but undignified. It leads us into the efficiency trap.

The Efficiency Trap

The technology industry has always been seduced by efficiency. We optimise for speed, scale, and automation because these are the things we can measure, and AI intensifies this seduction. We build systems that process users quickly because quick processing is easy to track and celebrate. When machines can complete in seconds tasks that once took hours, the temptation is to automate everything: to remove all friction, all delays, and even the steps that genuinely require human intervention.

However, efficiency is not the same as quality.

Speed is not the same as trust.

Automation is not the same as respect.

Designing for dignity means resisting the efficiency trap. It means building AI systems that explain themselves, that cede control, and that respect users enough to be transparent, even when transparency adds friction. Because when systems become invisible, users lose understanding. When decisions become automated, users lose control. When interactions become instantaneous, users lose confidence.

Across my career at Cyberspace, DOT, Interswitch, and now at Descinder, building products used by hundreds of millions of people across three countries, one thing has remained constant: the human element that every designed product requires. Hence, my recent focus on AI capabilities is guided by principles I have developed in practice over the past seven years, as demonstrated in DesignFlow Kit, my open-source toolkit for UI/UX designers.

What Dignity-Centered AI Looks Like

Transparency over opacity

Users should understand what AI is doing, even if they do not understand how it works. Opaque systems that produce correct outputs through inscrutable processes leave users feeling disempowered. We should design AI that explains itself. The GPT-4o assistants in DesignFlow Kit, for example, do not just produce outputs. They show their work. They reveal sources. They make their reasoning visible. This transparency builds trust with the user.

Context over optimisation

My ethnographic research in Lagos markets while building a product taught me that context changes everything. The user with a cracked phone, a spotty network, and divided attention is not the average user. AI optimises for patterns, but patterns are not context. The most efficient solution based on aggregate data may be the wrong solution for an individual user in a specific situation. AI systems must be designed to recognise and adapt to context, not just optimise for patterns.

What Is Left for Human Designers?

My last question is this:

If AI can automate research synthesis, generate design options, and produce code from prototypes, what is left for human designers?

For the past seven years, having built systems from scratch at the founding stage and inherited systems burdened by technical debt, I have arrived at the following answer:

AI cannot conduct ethnographic research, sit in a market for weeks watching and learning, or build the kind of trust that comes from direct human engagement. Likewise, it cannot make ethical judgments about how systems should treat people. It cannot fully understand context in a way that recognises the uniqueness of each user's situation, and it cannot provide the kind of advocacy that represents users in conversations where they cannot speak for themselves.

For example, when I worked at Interswitch on product design for underbanked merchants, AI could not have replaced the experience of sitting with Mama Rose and learning about her life. It could not have understood why she checked her balance three times after a successful transaction, nor translated that understanding into design decisions that prioritised confidence over speed.

UI/UX designers should start seeing AI as a tool, and themselves as craftspeople who wield it. The quality of the work depends on the craftsperson, not the tool.

The AI paradox is that as machines become more powerful, human-centered design becomes more essential. The more we automate, the more we must design for dignity. The more efficient our systems become, the more we must ensure they remain respectful, transparent, and controllable.

The machines are getting smarter. That makes human designers more important, not less. The question is whether we will use our tools to build systems that respect the humans using them, or systems that process people efficiently while leaving them anxious, confused, and disempowered.

Paying close attention to this paradox will safeguard the future of UI/UX design globally.

Adewole Ampitan, product design expert, wrote in from the United Kingdom.
