For a decade, Elster was hailed as a triumph of e-government. Its software was free, secure, and ruthlessly efficient. The company’s engineers, many recruited from the same technical universities that fed Deutsche Bahn and Siemens, believed in a philosophy they called Perfektion durch Zwang (Perfection through Compulsion). If a user made a mistake, the software would not simply warn them—it would refuse to proceed. This was not a bug; it was a feature.
In the end, the most sophisticated tax software in Europe was undone by a simple truth: a plumber with a wet signature and a kind tax officer is infinitely more efficient than a flawless machine that says “no.”
Elster Software was dismantled in 2018, its assets nationalized and its team dispersed. But its ghost haunts every conversation about AI, automation, and governance today. Elster’s failure was a textbook case of Goodhart’s law applied to software: when a metric (strict schema validation) becomes the target, it ceases to be a good metric. By eliminating all ambiguity, Elster eliminated all discretion, and without discretion, a bureaucratic system cannot function.
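The design choice at the heart of this failure is easy to sketch. Below is a minimal, hypothetical illustration (the field names and rules are invented, not Elster's actual schema) contrasting a hard-fail validator in the Perfektion durch Zwang style, which refuses to proceed on any error, with a lenient one that surfaces the same problems but leaves room for human discretion:

```python
from dataclasses import dataclass

@dataclass
class TaxReturn:
    # Illustrative fields only; not modeled on any real filing schema.
    taxpayer_id: str
    income_eur: float

def validate(ret: TaxReturn) -> list[str]:
    """Collect every problem found, without deciding what to do about them."""
    errors = []
    if len(ret.taxpayer_id) != 11 or not ret.taxpayer_id.isdigit():
        errors.append("taxpayer_id must be exactly 11 digits")
    if ret.income_eur < 0:
        errors.append("income_eur must be non-negative")
    return errors

def submit_strict(ret: TaxReturn) -> str:
    # Perfektion durch Zwang: any validation error blocks submission outright.
    errors = validate(ret)
    if errors:
        raise ValueError("; ".join(errors))
    return "accepted"

def submit_lenient(ret: TaxReturn) -> tuple[str, list[str]]:
    # Discretionary alternative: flag the problems, but let a human decide.
    errors = validate(ret)
    if errors:
        return ("accepted with warnings", errors)
    return ("accepted", [])
```

The two submitters run the identical `validate` function; the policy difference lies entirely in what happens when it finds something. The strict path turns every imperfection into a dead end, which is exactly the behavior that made discretion impossible.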