Until very recently, I had used HTML only as a simple markup language for presenting static information. It was quite satisfactory for posting announcements or writing, and easy to automate for formatting large collections of material from my research databases. Even though the databases might be updated daily, a given data dump was static as soon as it was generated.
I've gotten used to the incompatibilities between Python 2 and 3, which fazed me at first. But everything connected with the web seems subject to a wider margin of "not quite" than any proper compiled or scripting language I've seen.
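To give a concrete sense of the kind of Python 2/3 incompatibility I mean (a minimal example of my own choosing, not drawn from any particular project), the very same lines change meaning between the two versions:

```python
# Two classic Python 2 vs. 3 incompatibilities, in a few lines.

# 1. print: a statement in Python 2, a function in Python 3.
#    Python 2:  print "hello"     (valid)
#    Python 3:  print("hello")    (the Python 2 form is a SyntaxError)
print("hello")

# 2. Integer division: / truncates in Python 2, yields a float in Python 3.
#    Python 2:  7 / 2  ->  3
#    Python 3:  7 / 2  ->  3.5
print(7 / 2)   # 3.5 under Python 3
print(7 // 2)  # 3; // is explicit floor division in both versions
```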
The model of longevity in traditional scholarship — the world in which I cut my teeth — assumes that what you produce in your 20s will still be around and basically accessible when you're in your 70s, at least. Mathematical writings from 120 years ago are still cited with some frequency, and I possess philological books on the order of 1,000 years old that are still quite usable and remain in print for that reason. I can even make my way around 1,400-year-old Chinese dictionary manuscripts without too much pain. But the amount of tweaking needed to keep code alive is a steep price to pay for what I grant is its immense flexibility.
At bottom, I think this is a difference in economic outlook: When I work as a scholar, I labor to produce a description of something I can show to be true, and to present it in such a way that my thinking can be reproduced and challenged by anyone for a very long time to come. In exchange, I am paid essentially nothing for what I hope is intellectual substance of quality and intrinsic worth.
There is a different model of value involved in coding, and a different time frame.