Software engineering has evolved since the '60s; I don't think you need to worry about the bug count remaining constant across versions.
Yes, it has, but... Jevons' paradox (and the related Parkinson's law and Grosch's law) applies, for the same reason it applies to CPU speed, memory size, and the size and speed of secondary storage: the demand for more, and more complex, software exceeds the demand for software that runs correctly. Improving software engineering is therefore as likely to lead to more bugs (because it enables both less skilled programmers, on the one hand, and more ambitious projects, on the other) as it is to lead to fewer.
Hmm, maybe we can ameliorate that somewhat if we could find a way to apply a modified version of Amdahl's Law to decisions about software improvements. If there were a simple heuristic for deciding whether a feature is worth the expense of developing and maintaining it, based on (for example) feature-request frequency, or the frequency with which users combine existing features to get the same result, it might give software engineers some hints on how to prioritize which new features to develop first. It would be far less objective than Amdahl projections for hardware components, and getting people to use it (and keeping management from overriding such decisions) would be an uphill battle, but it might be worth looking into (though I expect it has been tried before at least once).
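To make the analogy concrete, here's a minimal sketch of what such an Amdahl-style heuristic might look like. Everything here is hypothetical: the function name, the candidate features, and the numbers are made up for illustration. The idea is just Amdahl's Law with p estimated from how much user time a feature's workflow occupies (e.g. inferred from request frequency or from users combining existing features) and s from how much faster the feature makes that workflow.

```python
def amdahl_feature_score(workflow_fraction: float, local_speedup: float) -> float:
    """Projected overall speedup for users if the feature ships.

    workflow_fraction: estimated fraction of total user time spent on the
        workflow this feature improves (a rough, subjective estimate).
    local_speedup: how many times faster that workflow becomes with the feature.

    This is just Amdahl's Law: 1 / ((1 - p) + p / s).
    """
    return 1.0 / ((1.0 - workflow_fraction) + workflow_fraction / local_speedup)

# Hypothetical candidates: a feature touching 30% of user time and making it
# 4x faster beats one touching only 5% of user time, even at a 20x speedup.
candidates = {
    "batch-export": amdahl_feature_score(0.30, 4.0),   # ~1.29x overall
    "macro-editor": amdahl_feature_score(0.05, 20.0),  # ~1.05x overall
}
best = max(candidates, key=candidates.get)
```

The interesting (and hard) part is estimating `workflow_fraction` honestly; the arithmetic itself is trivial, which is exactly why it could serve as a quick sanity check on prioritization arguments.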