mallard wrote:
Secondly, segmentation was not primarily created to break the 64KB barrier of 16-bit addressing. It's a feature that originated in mainframe systems in the 1960s and 70s, many of which already used address sizes greater than 16 bits.
In the case of the 8086, the reason (well, one of the reasons) was pretty much the opposite. In the 8080, the 8086's predecessor, the data paths were 8-bit, but it used two-byte (16-bit) addresses, and with the exception of the accumulator (A), the registers (B, C, D, E, H, and L) could be paired (BC, DE, and HL) for addresses and other values that wouldn't fit in a single byte. However, when they worked on the 8086, they ran into two problems: first, most of the hardware of the day was made for either 8-bit or 16-bit data paths; and second, doubling the number of data pins would have made the Dual-Inline Package form factor of the planned chip too large (though other manufacturers would do exactly that shortly afterward, for 16/32-bit chips such as the Motorola 68000 and 16-bit chips such as the Zilog Z8000). Furthermore, memory was so expensive that it was assumed that, during the projected 3-5 year production life of the 8086, no one would even try to use a full 4 GiB address space - not even mainframes of the time had that much.
They also wanted to make porting 8080 assembly source code easier, and requiring 32-bit addressing would break a lot of the assumptions such programs made.
So, they settled on segmentation - a well-established, if rather unpopular and often derided, technique - as a compromise that would let the majority of programs keep using a single 64 KiB address space, limit the address bus to a manageable 20 pins (1 MiB of physical address space), and allow the use of existing 16-bit hardware.
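To make the compromise concrete, here's a minimal sketch (in Python, just for illustration) of how 8086 real-mode segmentation forms a 20-bit physical address from two 16-bit values - the `phys` function name is mine, not anything from the post:

```python
def phys(segment, offset):
    # 8086 real mode: the 16-bit segment value is shifted left 4 bits
    # (i.e. multiplied by 16) and added to the 16-bit offset, producing
    # a 20-bit physical address - exactly the 20 address pins mentioned above.
    return ((segment << 4) + offset) & 0xFFFFF  # wraps at the 1 MiB boundary

# A program that never touches its segment registers sees a flat 64 KiB space:
print(hex(phys(0x0000, 0xFFFF)))  # 0xffff

# Many segment:offset pairs alias the same physical byte (segments overlap
# every 16 bytes), which is part of why the scheme was often derided:
print(hex(phys(0x1000, 0x0000)))  # 0x10000
print(hex(phys(0x0FFF, 0x0010)))  # 0x10000 - same byte, different pair
```

This is why an 8080 program's 16-bit pointer assumptions could survive the port: within one segment, addresses still look exactly like the old 64 KiB world.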
It was definitely an "it seemed like a good idea at the time" sort of thing, especially in light of the fact - which I have mentioned here numerous times - that the 8086 was seen as a stop-gap by Intel until a better embedded controller - the 8051 - and a high-end workstation processor - the ill-starred iAPX 432 - were completed. They were entirely blindsided by the PC, doubly so since they thought that home computers were a dying fad and had actually stopped selling chips to microcomputer builders in the hope that the whole thing would fade. They thought IBM was making a mistake with the PC, and assumed that Big Blue's long-term strategy was to drive the micros out and then slowly wind down the market and maneuver the buyers into using PCs as glorified smart terminals. This was in fact the plan, but needless to say, they both misread the situation drastically, in a way that is comparable to the plot of The Producers.