When AI Rules Change Between Products

There is a hidden trap in AI services: experience gained in one product does not transfer to a similar one.

For example, in music generators you can sometimes influence the structure of a track with text markup, using different types of brackets:

  • square brackets for song parts: [Intro], [Chorus], [Verse]...
  • round brackets for soft backing vocals or a canon: (Text...)
  • curly brackets, which may carry special meaning (different sources say different things)

Different models in competing services may use different rules.
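The bracket conventions above can be made concrete with a small sketch. A minimal Python example, assuming the (purely illustrative) convention that square, round, and curly brackets mark song parts, backing vocals, and special directives; real services document no such guarantee:

```python
import re

# Hypothetical bracket conventions seen in some music generators.
# Actual support varies by model and is not guaranteed.
BRACKET_KINDS = {
    "square": re.compile(r"\[([^\[\]]+)\]"),   # song parts: [Intro], [Chorus]
    "round":  re.compile(r"\(([^()]+)\)"),     # soft backing vocals or a canon
    "curly":  re.compile(r"\{([^{}]+)\}"),     # meaning differs between sources
}

def extract_markup(prompt: str) -> dict[str, list[str]]:
    """Group markup tokens in a prompt by bracket type."""
    return {kind: rx.findall(prompt) for kind, rx in BRACKET_KINDS.items()}

prompt = "[Intro] city lights (ooh-ooh) [Chorus] we run {whisper}"
print(extract_markup(prompt))
# {'square': ['Intro', 'Chorus'], 'round': ['ooh-ooh'], 'curly': ['whisper']}
```

The point of such a parser is not the regexes themselves: once the markup is extracted, an interface can show the user exactly which tokens the current model will honor.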

And here is the paradox: sometimes these hints improve the result, and sometimes they are ignored completely.

What worked perfectly yesterday may behave randomly today.
But the user is sure:
“I did everything right! Why doesn’t it work?”
It’s the same with image generation:
  • one service can follow a reference exactly
  • another picks up only the general style
  • a third adds its own vision, unexpectedly for the author

Again — the experience does not transfer.

What should the interface do?

  • show clearly what this exact model can do
  • highlight supported elements and syntax
  • warn about risky commands
  • do it during input, not in the FAQ
  • set expectations in advance

AI is still probabilistic and unpredictable.

UX should make the interaction clear and protect user expectations with care.

September 2025

Kartseva Daria