Personalised plateaus.
Personal compounds.
A short argument about the shape of the curve: why a personalised product flattens out by month three, while a personal product is still steepening at month thirty.
1 · Two definitions
Personalised software is a product that exposes settings. You configure it — pick a layout, a tone, a notification cadence — and the product respects those settings until you change them. Modern SaaS is overwhelmingly of this kind.
Personal software is a product that observes you and reshapes itself. You don’t configure it; you *use* it, and it watches what you do. By month three the product fits your particular shape; by month twelve it is not the same product as anyone else’s.
2 · Why personalised plateaus
Personalised products are bounded by what the configuration surface can express. There are perhaps a hundred meaningful settings in a typical productivity app. By the time the user has clicked through them, the product has done all the personalising it can.
A user who wants the product to know "I write more bluntly to internal teammates than to clients" is not going to find that setting. There is no checkbox for it. The product has hit a ceiling that the configuration surface defined.
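The ceiling is easy to see if you write the configuration surface down. A minimal sketch, with hypothetical setting names: every setting is an enumerated choice, so the whole product space is a finite grid, and a cross-cutting preference like "blunt internally, warm to clients" simply has no cell in it.

```python
from dataclasses import dataclass
from enum import Enum
from itertools import product

# Hypothetical settings for a typical productivity app.
class Layout(Enum):
    COMPACT = "compact"
    COMFORTABLE = "comfortable"

class Tone(Enum):
    FORMAL = "formal"
    CASUAL = "casual"

class Cadence(Enum):
    REALTIME = "realtime"
    DAILY = "daily"

@dataclass
class Settings:
    layout: Layout
    tone: Tone        # one global tone: no way to express "blunt internally, warm to clients"
    cadence: Cadence

# The entire configuration surface is the cross product of the enums.
surface = list(product(Layout, Tone, Cadence))
print(len(surface))  # 8 distinct configurations, ever
```

Adding settings only multiplies the grid; it never makes the grid conditional on context, which is exactly what the teammate-versus-client preference requires.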
3 · Why personal compounds
A personal product reads how you use it. Every reply is a sample. Every accepted draft is a calibration. Every rejected suggestion is a gradient. The product’s knowledge of you is bounded by the patience of its observer, not by a checkbox grid.
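One way to make "every rejected suggestion is a gradient" concrete is a per-user preference model nudged by each accept/reject signal. This is a minimal sketch, not a production learner; the feature names and the `PreferenceModel` class are illustrative assumptions.

```python
from collections import defaultdict

class PreferenceModel:
    """Per-user weights over draft features, updated online from usage."""

    def __init__(self, lr=0.2):
        self.lr = lr
        self.weights = defaultdict(float)  # feature -> learned preference in [-1, 1]

    def observe(self, features, accepted):
        # Each accepted draft pulls its features toward +1,
        # each rejected suggestion pushes them toward -1.
        target = 1.0 if accepted else -1.0
        for f in features:
            self.weights[f] += self.lr * (target - self.weights[f])

    def score(self, features):
        # Higher score = closer to this user's observed taste.
        return sum(self.weights[f] for f in features)

# Hypothetical usage: one user who accepts blunt drafts to teammates
# and rejects blunt drafts to clients.
model = PreferenceModel()
for _ in range(10):
    model.observe({"blunt+internal"}, accepted=True)
    model.observe({"blunt+client"}, accepted=False)
```

After a few weeks of signals, `model.score({"blunt+internal"})` is strongly positive and `model.score({"blunt+client"})` strongly negative: the model has learned the rule no checkbox could express, without anyone ever stating it.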
The interesting fact is the curve. By month three a personal product is roughly as good as a personalised one — slightly worse on cold start, slightly better on warm. By month nine the curves separate. By month thirty the personalised product has plateaued and the personal one is still steepening.
4 · The architectural consequence
If you’re building personal software, you commit to two unusual things. First: a tenant per user. The model has to live somewhere it can take on one person’s shape without averaging across millions. Second: a read-back. The user has to be able to inspect, edit, and delete what the model has learned, or trust collapses.
Both were, until recently, too expensive. As of 2025, neither is.
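The two commitments can be sketched together in a few lines. This is an illustrative skeleton under assumed names (`Tenancy`, `UserMemory`), not a prescribed design: one store per user, and a read-back surface where the user can inspect, edit, and delete anything the model has learned.

```python
class UserMemory:
    """One tenant's learned facts, fully inspectable by its owner."""

    def __init__(self):
        self._facts = {}  # fact_id -> learned statement

    def learn(self, fact_id, statement):
        self._facts[fact_id] = statement

    def inspect(self):
        # Read-back: the user sees everything the model holds about them.
        return dict(self._facts)

    def edit(self, fact_id, statement):
        if fact_id not in self._facts:
            raise KeyError(fact_id)
        self._facts[fact_id] = statement

    def forget(self, fact_id):
        # Delete is unconditional; a fact the user removes stays removed.
        self._facts.pop(fact_id, None)

class Tenancy:
    """A tenant per user: no shared weights, no cross-user averaging."""

    def __init__(self):
        self._tenants = {}

    def for_user(self, user_id):
        return self._tenants.setdefault(user_id, UserMemory())
```

The design choice worth noting is that `forget` never refuses: trust in a personal product depends on deletion being a right, not a request.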