As a language it’s terrible, but I don’t think stability has been the problem there. I also think it’s vital that one be able to view historical academic papers. You literally can no longer do anything with a Word document from 1990; if you have an old paper in Word, the only way to preserve it is to reformat it from scratch from a paper copy, if you even have one. TeX, by contrast, is “safe”: if you write something in it, you know that in 40 years it will probably still be possible for people to generate a formatted version.
The language itself reflects Knuth’s 1960s-era sensibilities, and it is a very bad design. But had it been unstable, it would still have been a bad design, and strictly less useful.
Perhaps it is indeed always possible to write future-proof code, but I think people generally expect that if a tool works today, there is little reason it should break tomorrow, and only experts know what’s brittle and what isn’t. One can, of course, just blame the users, but that smacks a bit of people saying “well, if you’re a really good programmer, you can write decent C code that doesn’t have undefined behavior in it; it’s your fault if you make a mistake.”
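To make that C analogy concrete, here is the kind of trap I mean (a standard textbook example, not anything specific to this discussion): the check below looks perfectly reasonable, but signed integer overflow is undefined behavior in C, so an optimizing compiler is allowed to assume it never happens and quietly remove the test.

    /* Hypothetical example: a "reasonable-looking" overflow check.
       Signed overflow is undefined behavior, so a compiler may assume
       x + 1 never overflows and fold the comparison to 0. */
    #include <limits.h>
    #include <stdio.h>

    static int will_overflow(int x) {
        return x + 1 < x;   /* UB when x == INT_MAX */
    }

    int main(void) {
        /* Often prints 1 without optimization and 0 at -O2:
           same source, different result. */
        printf("%d\n", will_overflow(INT_MAX));
        return 0;
    }

A non-expert has no obvious way to see that this is brittle; it compiles cleanly and appears to work until a new compiler or optimization level breaks it, which is exactly the expectation gap I mean.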
I think we should presume that most people want tools that don’t require a high degree of human perfection to use. We like strongly typed languages because they help the user even when the user is imperfect. I think a language ecosystem should, in general, be supportive in that way.