Good UX helps people do what they came to do. Bad UX corners them into something else.
You may have seen people calling the Digital Fairness Act a new EU "UX law" for 2026. That is a bit too tidy. As of March 2026, it is still a planned European Commission initiative, not a law in force. The Commission says it is under preparation and aims to strengthen protection and digital fairness for consumers, while simplifying rules for businesses [1]. It is expected to tackle dark patterns, addictive design, unfair personalisation, misleading influencer marketing, and issues around digital contracts.
Even so, the direction is clear enough to matter now. This is part of a wider EU push against manipulative digital design. Existing rules already cover some of this territory. The Digital Services Act prohibits dark patterns on in-scope online platforms, including interface design that deceives, manipulates, or materially distorts free and informed user decisions [2].
So here's the useful version for site owners, agencies, and product teams. Do not wait for a final legal text before taking it seriously. If your growth tactics rely on pressure, confusion, omission, or friction designed to wear people down, the risk is already there.
What is the Digital Fairness Act, in plain English?
The Digital Fairness Act is the EU's planned attempt to tighten up digital consumer protection where current rules still leave gaps. The Commission's review work says the goal is stronger protection and fairness for consumers online, with clearer and more workable rules for businesses [1]. The European Parliament's legislative tracker says the Commission is expected to present it in 2026 [4].
The likely focus areas are not hard to spot. They include dark patterns, addictive product design, unfair personalisation, poor subscription practices, and exploitative design aimed at vulnerable people, especially children. The European Parliament's 2025 briefing on dark patterns says the current EU framework is fragmented and lacks a unified definition, which creates legal uncertainty and uneven enforcement [3].
That matters because it changes the conversation. This is no longer only about whether a design pattern feels grubby. It is about whether it distorts free and informed choice. That is where EU policy is heading.
Why this matters before any new law lands
A lot of businesses treat this sort of story as future legal admin. Something to park until there is a final text, a headline fine, or a panicked email from legal.
That is the wrong read.
First, the EU already has live rules against some deceptive interface practices. Under the Digital Services Act, online platforms must not design or organise their interfaces in a way that deceives, manipulates, or materially distorts users' ability to make free and informed decisions [2]. The Commission's public material describes this as a ban on dark patterns.
Second, the policy mood is moving in one direction. The Commission's consumer law review work [5], the Parliament's digital fairness material, and Parliament resolutions on addictive design and child protection all point towards tougher treatment of manipulative design, especially where commercial systems exploit attention, urgency, or vulnerability [4].
Third, unfair UX is rarely only a legal problem. It is usually a trust problem, a support burden, a complaints problem, and often an accessibility problem too. The same teams who hide cancellation links or nudge people into the wrong choice often produce cluttered, confusing journeys that frustrate users across the board. That is my interpretation, based on the broader EU focus on free and informed decision-making and consumer vulnerability.
The UX patterns worth looking at now
If you run a marketing site, online shop, membership product, SaaS service, or app, these are the places I would audit first.
Cookie banners with a loaded choice
This is the classic. "Accept all" is bright, obvious, and one click away. "Reject" is smaller, lower contrast, hidden behind another step, or written in vaguer language.
This sort of consent design has been under scrutiny for years, and the DSA's public explanation explicitly names confusing and misleading consent buttons as an example of prohibited deceptive design on online platforms [2]. Even outside platform law, it is a useful red flag for any business trying to keep its UX honest.
If one option shouts and the other mutters, you do not have a fair choice.
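One way to keep yourself honest here is to treat banner fairness as a checkable property rather than a gut call. As a minimal sketch (the markup, class names, and heuristics below are invented for illustration, and markup alone cannot judge visual prominence, so this is a prompt for review, not a compliance tool), a script can flag the common case where "accept" is a styled button but "reject" is a buried link:

```python
from html.parser import HTMLParser


class ConsentBannerAudit(HTMLParser):
    """Heuristic check: are 'accept' and 'reject' offered as equal choices?

    Illustration only. It inspects markup; colour, size, and contrast
    still need a human (or a proper accessibility tool) to assess.
    """

    def __init__(self):
        super().__init__()
        self.controls = {}       # choice label -> element type found
        self._current_tag = None

    def handle_starttag(self, tag, attrs):
        if tag in ("button", "a"):
            self._current_tag = tag

    def handle_data(self, data):
        text = data.strip().lower()
        if self._current_tag and text:
            if "accept" in text:
                self.controls["accept"] = self._current_tag
            elif "reject" in text or "decline" in text:
                self.controls["reject"] = self._current_tag
            self._current_tag = None


def audit_banner(html: str) -> list[str]:
    """Return findings that suggest a loaded consent choice."""
    parser = ConsentBannerAudit()
    parser.feed(html)
    findings = []
    if "reject" not in parser.controls:
        findings.append("No visible reject control in the first layer.")
    elif parser.controls.get("accept") == "button" and parser.controls["reject"] == "a":
        findings.append("Accept is a button but reject is a plain link.")
    return findings


# Hypothetical banner markup, for illustration only.
banner = """
<div class="cookie-banner">
  <button class="primary">Accept all</button>
  <a href="/settings">Reject</a>
</div>
"""
print(audit_banner(banner))
```

A fair banner, with both choices as equal buttons in the first layer, produces no findings; the loaded one above does.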
Fake urgency and fake scarcity
Countdown timers that magically reset. "Only 1 left" messages with no basis in reality. Pressure copy that implies loss if the user does not act now.
These patterns aim to replace considered choice with panic. They are exactly the kind of manipulative design tactic lawmakers and regulators have in mind when they talk about dark patterns and consumer distortion. The Digital Fairness Act is expected to deal with exploitative commercial practices online [1], and the Parliament's dark patterns briefing reinforces that push for clearer rules and enforcement [3].
A fair journey informs. It does not stage-manage urgency.
Hidden costs late in checkout
A product looks cheap until the final step. Then come service fees, delivery surprises, admin costs, or defaults the user did not actively choose.
This is one of the oldest tricks in digital commerce and still one of the most corrosive. It erodes trust fast. The EU's consumer law review and Digital Fairness Fitness Check identified issues with digital contracts and price marketing as gaps to address [5].
If the real price only appears when the user has already invested time and effort, your conversion strategy is leaning on sunk cost.
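The honest alternative is mechanical: whatever the checkout will eventually charge is what the first step should show. As a minimal sketch (the fee names and figures are invented for illustration), the rule is simply that the up-front price includes every mandatory fee, and anything that only appears at the final step is a measurable surprise:

```python
def upfront_price(base: float, mandatory_fees: dict[str, float]) -> float:
    """The price a user should see at the first step: base plus every
    mandatory fee. Hypothetical fee structure, for illustration only."""
    return round(base + sum(mandatory_fees.values()), 2)


def late_surprise(shown_early: float, charged_at_checkout: float) -> float:
    """How much of the final charge only appeared at the last step.
    Anything above zero is the gap your conversion flow is hiding."""
    return round(charged_at_checkout - shown_early, 2)


fees = {"service": 2.50, "delivery": 4.99}
print(upfront_price(19.99, fees))       # 27.48
print(late_surprise(19.99, 27.48))      # 7.49
```

If `late_surprise` is anything other than zero for a mandatory cost, the journey is relying on sunk cost rather than informed choice.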
Easy sign-up, awkward cancellation
The subscription starts in seconds. Cancelling means hunting through settings, reading evasive copy, facing repeated prompts, or contacting support during narrow office hours.
The European Parliament's legislative tracker explicitly points to difficulties with cancellation and renewal of digital subscriptions as part of the problem space for the Digital Fairness Act [4].
Joining and leaving should not live on different planets.
Personalisation that exploits vulnerability
Recommendation systems and nudges are not neutral. They shape behaviour. That becomes more serious when systems target children, people in distress, or users showing signs of vulnerability.
The Commission and Parliament have both flagged unfair personalisation and addictive design as issues the Digital Fairness Act is expected to address [1]. Parliament has also called for stronger action on persuasive technologies and manipulative features affecting minors [3].
When a system is built to exploit weak moments rather than support informed choice, the design problem is obvious even before the legal one catches up.
Where accessibility and digital fairness overlap
This is the bit many teams miss.
Accessibility and fairness overlap all over the place.
A consent flow with weak hierarchy, poor focus handling, vague wording, and hidden controls is not only harder for disabled users. It is harder for everyone to understand and control. A subscription cancellation journey buried behind tiny links, unclear labels, and keyboard traps is not only manipulative. It is also exclusionary. A checkout that relies on pressure, clutter, and cognitive overload is not only grubby. It is harder to use, harder to trust, and easier to get wrong.
The EU material does not frame this purely as an accessibility issue, but it keeps returning to free and informed decisions, consumer vulnerability, and harmful interface design. That should ring bells for anyone who cares about usable front ends.
A polished interface is not automatically an honest one. In fact, the most effective manipulative patterns are often the neatest looking.
A quick fairness audit for your site
You do not need a legal team to start improving this. Start with a plain review of your highest-risk journeys.
- Look at your cookie banner. Are "accept" and "reject" equally clear, equally easy to find, and equally easy to activate?
- Look at your checkout. Is the full price visible early enough? Are extras clear? Are defaults honest?
- Look at your sign-up flows. Are marketing opt-ins off by default unless the user actively chooses them?
- Look at your cancellation and unsubscribe journeys. Can a person leave as easily as they joined?
- Look at your buttons and hierarchy. Are you helping the user make a choice, or steering them towards the answer you want?
- Look at any urgency messaging. Is it factual, provable, and proportionate, or theatre?
- Look at personalisation and recommendation features. Are they there to help, or to trap attention and spend?
- Look at key journeys with a keyboard only. If a user cannot move through the interface clearly and predictably, your control story is already in trouble.
- Look at copy and microcopy. Remove pressure language, guilt prompts, buried choices, and vague labels.
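Some of these checks can be partly automated as a first pass. As a rough sketch (the phrase list below is invented for illustration, and a match only means "check whether this claim is true and proportionate", not "dark pattern found"), a script can at least surface urgency and pressure language for human review:

```python
import re

# Illustrative pressure phrases. A real audit would tune this list to
# your own copy and languages; matching is a prompt to verify the claim,
# not proof of manipulation.
PRESSURE_PATTERNS = [
    r"only \d+ left",
    r"hurry",
    r"last chance",
    r"don't miss out",
    r"offer ends (soon|today)",
]


def flag_pressure_copy(text: str) -> list[str]:
    """Return the pressure patterns found in a page's copy, in list order."""
    found = []
    for pattern in PRESSURE_PATTERNS:
        if re.search(pattern, text, flags=re.IGNORECASE):
            found.append(pattern)
    return found


page_copy = "Hurry! Only 2 left at this price. Offer ends today."
print(flag_pressure_copy(page_copy))
```

Run it over templates and key landing pages, then ask of each hit: is this factual, provable, and proportionate, or theatre?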
None of this requires waiting for Brussels to publish a final law. It is basic interface honesty. The Commission's framing of digital fairness—strengthening protection while simplifying rules for businesses—supports this sort of early audit mindset [1].
Honest UX is good business
There is a lazy myth in some corners of digital work that manipulative UX is what "converts", and honest UX is a nice ethical extra.
I do not buy it.
Trick conversions are fragile. They produce buyer's remorse, support tickets, chargebacks, unsubscribes, distrust, and bad word of mouth. Clear journeys, visible choices, and plain language tend to produce better quality outcomes, because people understand what they are doing and why they are doing it. That is not moral posturing. It is operational common sense.
The EU's digital policy work leans heavily on trust, safety, transparency, and informed choice online. Businesses that clean this up early are less likely to end up scrambling later.
If your conversion rate depends on confusion, the problem is not user hesitation. The problem is the journey.
Summary
The Digital Fairness Act is not here yet. The warning signs are.
If your website, online shop, or product relies on confusion, pressure, or hidden friction, now is a good time to stop calling that optimisation and start calling it what it is.
Design debt.
If you want an external review of your consent flow, checkout UX, or high-risk journeys, I help businesses find accessibility, performance, and UX issues before they become a legal, commercial, or trust problem. See website design and build for new projects, or get in touch to discuss an audit.
Good UX respects the user. The law is catching up.
For more on consent and cookie banners, see cookie banners without breaking UX or accessibility. For accessibility and ethical design, see inclusive design in practice: beyond the checklist. For performance and trust, see performance myths: why score chasing fails and Core Web Vitals for business owners. You can also get in touch to discuss your site.
Sources
- [1] European Commission. Commission launches open consultation on the forthcoming Digital Fairness Act.
- [2] European Commission. The Digital Services Act.
- [3] European Parliament EPRS. Regulating dark patterns in the EU: Towards digital fairness.
- [4] European Parliament Legislative Train. Digital Fairness Act.
- [5] European Commission. Review of EU consumer law.