OSDev.org

The Place to Start for Operating System Developers
Post subject: Some confusion about VGA and display drivers
Posted: Sun Aug 06, 2017 3:15 pm

Joined: Fri Jan 04, 2013 6:56 pm
Posts: 98
Hi there,

It's been quite a while since I posted here! Some things regarding VGA got me a bit confused, and I thought this would be the best place to ask. In a general sense, I do understand how the VGA hardware works - how the analog signals are generated from video memory, etc. With older motherboards, I understand the whole process: the CPU would set a video mode and write to video memory using some bus protocol, and the VGA chip would (in cooperation with the video RAM and the RAMDAC) generate the output, clock, and vertical and horizontal retrace signals.
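(For concreteness, by "write to video memory" I mean something like the classic 256-colour mode 13h, where the frame buffer lives at physical address 0xA0000 - a minimal sketch, assuming the mode has already been set and that address is directly reachable:)

Code:
/* Plot one pixel in mode 13h (320x200, 256 colours). Assumes the mode was
 * already set (INT 0x10, AX=0x0013) and that 0xA0000 is mapped 1:1. */
#include <stdint.h>

static void mode13h_put_pixel(uint16_t x, uint16_t y, uint8_t colour_index)
{
    volatile uint8_t *vram = (volatile uint8_t *)0xA0000;
    vram[(uint32_t)y * 320u + x] = colour_index;   /* one byte per pixel */
}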

However, I don't understand the process on slightly more modern motherboards (I say slightly because, of course, most modern motherboards have HDMI or at least DVI). I have an old motherboard here, which has an integrated GPU. Based on googling, it seems that the integrated GPU usually just uses the system RAM as video RAM. Also, I don't think there are RAMDACs on the motherboard, but I'm not sure. Do CPUs with an integrated GPU just output the (analog) VGA signals directly? This seems a bit hard to believe when looking at the motherboard, considering how far the CPU is from the VGA connector.

Now suppose my CPU didn't have an integrated GPU, and I was using a discrete GPU instead. If I don't have the proper drivers, an OS obviously has to fall back to some commonly supported protocol or interface (right?). I would like to know more about this, but I can't find any information (probably because I don't know the proper terms).


Post subject: Re: Some confusion about VGA and display drivers
Posted: Sun Aug 06, 2017 6:22 pm

Joined: Mon Mar 25, 2013 7:01 pm
Posts: 5099
kutkloon7 wrote:
Do CPUs with an integrated GPU just output the (analog) VGA signals directly?

No. The video data is sent in some digital format to another component that converts it to VGA.

kutkloon7 wrote:
If I don't have the proper drivers, an OS obviously has to fall back to some commonly supported protocol or interface (right?). I would like to know more about this, but I can't find any information (probably because I don't know the proper terms).

The video card contains an option ROM with all of the code necessary for boot-time configuration of the video card.

For BIOS, the option ROM will initialize the card to emulate VGA and provide an INT 0x10 handler. Your OS can then attempt to use VESA VBE to configure the card, or fall back to programming it like ordinary VGA if the VBE functions are insufficient or nonexistent.
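To sketch what the VBE path looks like from the OS side: real-mode boot code calls INT 0x10 with AX=0x4F01 and a mode number in CX to fill a 256-byte "mode info block", checks bit 7 of the attributes (linear frame buffer available), then sets the mode with AX=0x4F02 (with bit 14 set in BX to request the linear frame buffer). The fields most OSes care about sit at fixed offsets per the VBE 2.0+ spec - roughly:

Code:
/* Partial layout of the VBE mode info block returned by INT 0x10, AX=0x4F01.
 * Only the fields an OS typically needs are named; the rest is padding. */
#include <stdint.h>

struct vbe_mode_info {
    uint16_t mode_attributes;     /* bit 7 set => linear frame buffer available */
    uint8_t  pad0[14];            /* legacy window/banking fields               */
    uint16_t bytes_per_scanline;  /* offset 0x10: pitch of one row in bytes     */
    uint16_t x_resolution;        /* offset 0x12 */
    uint16_t y_resolution;        /* offset 0x14 */
    uint8_t  pad1[3];             /* char cell size, number of planes           */
    uint8_t  bits_per_pixel;      /* offset 0x19 */
    uint8_t  pad2[14];            /* banking, memory model, colour masks        */
    uint32_t phys_base_ptr;       /* offset 0x28: physical address of the LFB   */
    uint8_t  pad3[212];           /* remainder of the 256-byte block            */
} __attribute__((packed));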

For (U)EFI, the option ROM will provide boot services (UGA or GOP) that the firmware will use to initialize the card. During boot, your OS can also use the boot services to configure the card.
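A minimal sketch of the GOP side, assuming GNU-EFI style headers (the protocol itself comes from the UEFI spec; error handling and library setup are trimmed):

Code:
#include <efi.h>
#include <efilib.h>

/* Locate the Graphics Output Protocol and read back the frame buffer that the
 * firmware (using the card's option ROM driver) has already set up.
 * Only callable while boot services are still available. */
EFI_STATUS query_gop(EFI_SYSTEM_TABLE *st,
                     EFI_PHYSICAL_ADDRESS *fb_base,
                     UINT32 *width, UINT32 *height, UINT32 *pixels_per_line)
{
    EFI_GUID gop_guid = EFI_GRAPHICS_OUTPUT_PROTOCOL_GUID;
    EFI_GRAPHICS_OUTPUT_PROTOCOL *gop;

    EFI_STATUS status = uefi_call_wrapper(st->BootServices->LocateProtocol, 3,
                                          &gop_guid, NULL, (VOID **)&gop);
    if (EFI_ERROR(status))
        return status;

    /* The current mode already describes a linear frame buffer; it stays valid
     * after ExitBootServices(), so the OS can keep drawing into it. */
    *fb_base         = gop->Mode->FrameBufferBase;
    *width           = gop->Mode->Info->HorizontalResolution;
    *height          = gop->Mode->Info->VerticalResolution;
    *pixels_per_line = gop->Mode->Info->PixelsPerScanLine;  /* pitch, in pixels */
    return EFI_SUCCESS;
}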


Post subject: Re: Some confusion about VGA and display drivers
Posted: Mon Aug 07, 2017 5:32 am

Joined: Fri Jan 04, 2013 6:56 pm
Posts: 98
Crystal clear, thanks (except for the UEFI parts, but I really don't want to go there; graphics hardware/protocols are quite a rabbit hole already, and UEFI seems an especially obscure topic in that I can't find much about it by googling).


Post subject: Re: Some confusion about VGA and display drivers
Posted: Mon Aug 07, 2017 11:40 am

Joined: Sat Jan 15, 2005 12:00 am
Posts: 8561
Location: At his keyboard!
Hi,

kutkloon7 wrote:
However, I don't understand the process on slightly more modern motherboards (I say slightly because, of course, most modern motherboards have HDMI or at least DVI). I have an old motherboard here, which has an integrated GPU. Based on googling, it seems that the integrated GPU usually just uses the system RAM as video RAM. Also, I don't think there are RAMDACs on the motherboard, but I'm not sure. Do CPUs with an integrated GPU just output the (analog) VGA signals directly? This seems a bit hard to believe when looking at the motherboard, considering how far the CPU is from the VGA connector.


It's best to think of the video hardware as multiple independent pieces, like:
  • One piece that is mostly a crude serial controller, which handles "DDC" (Display Data Channel) and is used to obtain the display's EDID and control display features. Note: If the card supports multiple monitors there would be multiple copies of this.
  • One piece that scans a pixel buffer and converts the pixels into VGA/HDMI/DVI signals; which is controlled by a pixel clock and various counters (front porch width, sync pulse width, back porch width, ...) - see the timing sketch after this list. Note: If the card supports multiple monitors there would be multiple copies of this.
  • One piece that handles memory management (DMA/bus mastering, and paging).
  • The GPU, which is a special purpose processor that mostly works on a "SIMD" model (to do the same operation on many pieces of data in parallel).
  • Assorted "fixed function" pieces (MPEG decoder, the encryption for HDMI, etc).
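To make the "pixel clock and various counters" part concrete, here's a back-of-the-envelope sketch using the classic 640*480 @ 60 Hz timings (horizontal: 16/96/48 pixels of front porch/sync/back porch; vertical: 10/2/33 lines):

Code:
#include <stdio.h>

int main(void)
{
    /* Each scanline is visible pixels + front porch + sync pulse + back porch. */
    unsigned h_total = 640 + 16 + 96 + 48;   /* 800 pixel clocks per line */
    /* Each frame is visible lines + front porch + sync pulse + back porch. */
    unsigned v_total = 480 + 10 + 2 + 33;    /* 525 lines per frame */

    double pixel_clock = (double)h_total * v_total * 60.0;  /* 60 Hz refresh */
    printf("pixel clock ~ %.2f MHz\n", pixel_clock / 1e6);  /* ~25.2 MHz; the spec value is 25.175 */
    return 0;
}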

VGA hardware only really had one of these pieces (the "scans a pixel buffer and converts the pixels into VGA/HDMI/DVI signals" piece). It was followed by cards that added some memory management (for blitting data to/from memory) and some "fixed function" pieces (for basic line drawing, etc.); then those "fixed function" pieces grew (up until we reached the "fixed function 3D pipeline"); then we started shifting to GPUs to replace most of the "fixed function 3D pipeline" with something more flexible/programmable.

Note that (for integrated video) a chip that contains CPUs and GPU may not contain all of the pieces for video (e.g. it could contain GPU but not contain the "scans a pixel buffer and converts the pixels into VGA/HDMI/DVI signals" piece).

kutkloon7 wrote:
Now suppose my CPU didn't have an integrated GPU, and I was using a discrete GPU instead. If I don't have the proper drivers, an OS obviously has to fall back to some commonly supported protocol or interface (right?). I would like to know more about this, but I can't find any information (probably because I don't know the proper terms).


There are only three possibilities:
  • VGA emulation; which is limited to extremely ugly video modes (e.g. 320*200 with 256 colours, 640*480 with 16 colours), doesn't support multiple monitors, and isn't guaranteed to work under UEFI. This is almost entirely useless for any practical purpose (most people would rather be blind than see this).
  • VESA VBE (VESA BIOS Extensions); which does support modern video modes, but doesn't support multiple monitors and won't work under UEFI.
  • UGA and GOP (which are roughly the UEFI equivalents of VBE); which support modern video modes and can support multiple monitors, but won't work on BIOS systems.

Note that UGA and GOP are boot-time services, which means that they can't be used after boot. The intention is that an OS's boot code sets up a video mode and a raw frame buffer (for each display) during boot (using UGA or GOP); and then (after the video mode is set up) continues using the raw frame buffer after boot (until a native video driver is started).

The same strategy works for VBE: the boot code sets up a video mode and a raw frame buffer during boot, and then continues using the raw frame buffer after boot (until a native video driver is started).

This leads to the idea of boot code as an abstraction layer; where the boot code sets up a video mode and raw frame buffer (using VBE if the boot code is designed for BIOS, and using GOP/UGA if the boot code is designed for UEFI); and the entire rest of the OS doesn't need to know or care what the firmware was (because it just continues using the raw frame buffer regardless, until/unless a native video driver is started). This has a massive advantage - you can add support for UEFI (and OpenFirmware and CoreBoot and whatever else you can think of) just by creating new boot code whenever you feel like it.
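To illustrate that abstraction layer, here's a sketch of the kind of handoff structure the boot code could fill in (the names are made up for illustration, not any particular boot protocol). Whether the values came from VBE or from GOP, the rest of the OS only ever sees the raw frame buffer:

Code:
#include <stdint.h>

/* Filled in by the boot code (from the VBE mode info block or GOP's Mode data). */
struct boot_framebuffer {
    uint64_t phys_base;      /* physical address of the linear frame buffer */
    uint32_t width, height;  /* visible resolution in pixels                */
    uint32_t pitch;          /* bytes per scanline (may exceed width * 4)   */
    uint32_t bpp;            /* bits per pixel, e.g. 32                     */
};

/* Kernel-side drawing; assumes 32 bpp and that phys_base is mapped 1:1. */
static void put_pixel(const struct boot_framebuffer *fb,
                      uint32_t x, uint32_t y, uint32_t colour)
{
    volatile uint8_t *row = (volatile uint8_t *)(uintptr_t)fb->phys_base
                            + (uint64_t)y * fb->pitch;
    ((volatile uint32_t *)row)[x] = colour;
}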

This also has a minor disadvantage - you can't change video modes (without a native video driver) without rebooting. Fortunately (assuming the OS provides a modern "resolution independent" video API) the only valid reasons to want to change video modes are that the user changes displays, or that it's necessary for "frame rate" in high-end 3D games; and (without a native video driver) you can't detect when a display has been unplugged or plugged in anyway, and (without a native video driver that supports the GPU, etc.) you can't really support high-end 3D games; so (without a native video driver) there's no sane reason to want to change video modes after boot at all.


Cheers,

Brendan

_________________
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.


Post subject: Re: Some confusion about VGA and display drivers
Posted: Thu Aug 10, 2017 3:50 pm

Joined: Thu May 17, 2007 1:27 pm
Posts: 999
kutkloon7 wrote:
However, I don't understand the process on slightly more modern motherboards (I say slightly because, of course, most modern motherboards have HDMI or at least DVI). I have an old motherboard here, which has an integrated GPU. Based on googling, it seems that the integrated GPU usually just uses the system RAM as video RAM. Also, I don't think there are RAMDACs on the motherboard, but I'm not sure. Do CPUs with an integrated GPU just output the (analog) VGA signals directly? This seems a bit hard to believe when looking at the motherboard, considering how far the CPU is from the VGA connector.

For old Intel CPUs, the whole graphics unit - the memory interface, the GPU, the CRTC (let's call the unit that converts the frame buffer and timings into a bit stream the "CRTC", even if there is no CRT monitor), the DAC, the DP/HDMI encoders and so on - is entirely part of the mainboard. For modern Intel GPUs (>= Ironlake? I don't remember the exact model) the graphics unit is split between the CPU and the mainboard: the CPU contains the memory interface, GPU and CRTC, while the mainboard contains the DAC and the DP/HDMI encoders. The two halves are connected via DMI.

Brendan wrote:
This also has a minor disadvantage - you can't change video modes (without a native video driver) without rebooting. Fortunately (assuming the OS provides a modern "resolution independent" video API) the only valid reasons to want to change video modes are that the user changes displays, or that it's necessary for "frame rate" in high-end 3D games; and (without a native video driver) you can't detect when a display has been unplugged or plugged in anyway, and (without a native video driver that supports the GPU, etc.) you can't really support high-end 3D games; so (without a native video driver) there's no sane reason to want to change video modes after boot at all.

We talked about this before, but one should keep in mind that connectors like DP and HDMI need link training to operate properly, so even unplugging and replugging such a connector (e.g. when using a KVM switch) might break the preset video mode. Furthermore, I'm not sure if VBE option ROMs really support modern (e.g. 4K) resolutions; IIRC I read about some limits in Intel's VBE code. Higher resolutions require a larger driver (e.g. because memory access patterns need to be tuned) and more stolen RAM on integrated cards.

_________________
managarm: Microkernel-based OS capable of running a Wayland desktop (Discord: https://discord.gg/7WB6Ur3). My OS-dev projects: [mlibc: Portable C library for managarm, qword, Linux, Sigma, ...] [LAI: AML interpreter] [xbstrap: Build system for OS distributions].


Post subject: Re: Some confusion about VGA and display drivers
Posted: Thu Aug 10, 2017 5:45 pm

Joined: Fri Oct 27, 2006 9:42 am
Posts: 1925
Location: Athens, GA, USA
Brendan wrote:
VGA hardware only really had one of these pieces (the "scans a pixel buffer and converts the pixels into VGA/HDMI/DVI signals" piece).


To clarify this statement a bit, permit me to give a bit of a history lesson.

What Brendan is alluding to here is that what we usually call 'VGA' today is a rather different thing from the original, even discounting the advances in technology. The original Video Graphics Array - not, as is usually said nowadays, 'Adapter' - was an ASIC located directly on the motherboards of the IBM PS/2 Models 50, 60, and 80, and was integrated into the system separately from the Micro Channel Architecture bus that the system used for MCA peripheral cards. They deliberately designed it this way, because the whole purpose of the PS/2 line was to shoot the PC clone market in the head - though it ended up hitting IBM's foot instead.

They made this kind of mistake more than once with the PC market, starting with when they created it in the first place. They had always expected that they could use the PC to rein in the small computer market, so they could then turn those machines into basically just glorified smart terminals. This didn't fly in 1981, and it didn't fly seven years later, either.

IBM assumed that the proprietary array could not be cloned, and that the few PC manufacturers who did license the technology would be unable to compete due to ruinous licensing requirements. They also thought that it would be impossible to push 640x480, 256-colour video through an adapter on the 16-bit AT bus, so they discounted the possibility of someone making a work-alike video system that wasn't integrated.

As I said, they made this sort of mistake a lot. Now, to be fair, they had a point; the original AT-bus VGA-compatible cards stank on ice, and the VGA-compatible monitors were far too expensive for most PC users; but the main effect this actually had was to give a new lease on life to CGA, MDA and HGC, and to the XT-class PCs in general. IBM, more than anything else, misjudged the carrying capacity of the market - most people were perfectly happy with a mono monitor and a Turbo 8088, and the high-performance folks who wanted 80286s and 80386s usually needed an AT-bus system (the bus now re-christened Industry Standard Architecture, in contrast to its proposed 32-bit successor, EISA) to run peripherals that had no MCA adapters available, anyway.

By the time VGA-compatible monitors and adapters were coming down in price circa 1992, both MCA and EISA were dead in the water (though ISA outlasted them both, and was often combined with other, later buses in the early and mid-1990s), and the resurgent market for 'Super VGA' turned to a new, special-purpose interface, VESA Local Bus, for the video cards. This gave way to PCI a year or so later, then to AGP in 1996, then most recently to PCI Express around 2004.

The point is, you can't really talk about 'VGA' at the hardware level, only at the level of hardware emulation of the original VGA - and that compatibility is usually at the BIOS level, not always at the register level. The actual hardware used by different adapters was, and remains, radically different from that used by IBM in 1987.

_________________
Rev. First Speaker Schol-R-LEA;2 LCF ELF JAM POEE KoR KCO PPWMTF
Ordo OS Project
Lisp programmers tend to seem very odd to outsiders, just like anyone else who has had a religious experience they can't quite explain to others.

