OSDev.org

The Place to Start for Operating System Developers
 Post subject: Driver driven display updates vs. today's poll-based systems
PostPosted: Fri Feb 02, 2018 2:34 pm 

Joined: Fri Oct 03, 2008 10:14 am
Posts: 23
Hi all,

Apologies in advance if this is off topic, but I have found it difficult to find a place to ask this that has both an audience and people whose opinions on it would be worth hearing.

So, with today's and yesterday's computer systems, we have an HDMI (back in the day, VGA) or other digital signal interface between the graphics card and the display device, typically an LCD. As far as I understand, the graphics card has some control over when the contents of the display are "refreshed" (not video RAM, which is a separate matter), but typically we always talk about some applied refresh rate, the frequency at which the display in effect updates itself.

I am no electrical engineer, so I don't know whether it is typically the graphics card that exclusively drives the display, or whether it is the display and/or its HDMI (for example) subsystem that automatically polls the graphics card at regular intervals for updates from the framebuffer. These details notwithstanding, I wonder:

Why, in this day and age, can't we switch over to a manual, software-initiated display (not just framebuffer!) update mechanism, supported at the hardware level all the way to the display? The application calls the display manager, the display manager calls the driver to execute a transaction that refreshes the display straight from video RAM. Does it have something to do with legacy code and habits of thinking on software developers' part?

To explain: we now have a variety of display applications, software that does wildly different things, from 3D games that need semi-regular updates of the world they render, to spreadsheets, word processors and text editors where nothing has to happen unless the application needs to update the document view somehow, often in response to user action. For some applications there is no clear need for the display or graphics card to drive a display update X times per second, indiscriminately refreshing it from video RAM.

Instead, we could imagine the graphics card driver exposing the new paradigm through a function that a privileged application (you typically don't want every user process to have a monopoly over the entire framebuffer, but that is not a crucial detail here) can call to signal to the graphics card and the rest of the display subsystem that it wants to refresh the display, having already updated the framebuffer by that time. The hardware would update the display from video RAM once, with the application using polling or an asynchronous callback to learn of the completed transaction, so that it knows, for example, the earliest point at which it can issue another signal, and so that v-sync problems become a non-issue.
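To make this concrete, here is a rough sketch in C of what such a driver call might look like. Every name in it is invented for illustration - I am not claiming any real driver exposes this interface - and the stub implementation is only there so the example is complete:

Code:
/* Sketch of the "push"-style driver call described above.
 * Every name here is hypothetical; the stub implementation exists
 * only so the example compiles and runs. */
#include <stdio.h>
#include <stddef.h>

struct damage_rect { unsigned x, y, w, h; };     /* region already drawn into video RAM */
typedef void (*display_done_fn)(void *ctx);      /* completion notification */

/* Request one scan-out of the listed regions; the callback fires when the
 * panel has latched the data, so the caller knows when it may submit the
 * next update and tearing is a non-issue. */
static int display_commit(const struct damage_rect *rects, size_t nrects,
                          display_done_fn on_done, void *ctx)
{
    /* A real driver would program the display engine here; this stub just
     * pretends the transfer finished immediately. */
    (void)rects; (void)nrects;
    if (on_done)
        on_done(ctx);
    return 0;
}

static void frame_done(void *ctx)
{
    (void)ctx;
    puts("frame latched by the panel; safe to draw the next one");
}

int main(void)
{
    struct damage_rect dirty = { 0, 0, 1920, 1080 };   /* whole screen this time */
    return display_commit(&dirty, 1, frame_done, NULL);
}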

This is thus an entirely push-driven paradigm that rests on client-initiated display update transactions, so to speak, eliminating the v-sync problem at its root and effectively giving us a variable refresh rate throughout the rendering pipeline, software and hardware.

The updates would thus always be initiated by the graphics card, on request from, and indirectly by, the driver; with a traditional OS kernel, the privileged software calling said driver would typically be a "display manager" (X.org on Linux, DWM on Windows, etc.). There is no periodic polling, fetching and updating of the display by the display hardware, as is the case with the display systems we are used to currently.

Perhaps, and this is speculation from someone who isn't too familiar with the electronics behind current systems (LCD, HDMI, DVI technologies), with the electronics that support our systems today we wouldn't have to waste power on indiscriminately updating a display when it does not need to be updated, or keep trying to solve the old problem of synchronizing framebuffer updates with the display's refresh cycle.

I hope I am making sense with this. With the advent of e-ink displays for certain kinds of computing applications, I imagine this is not that wild a proposition, if it holds water of course.

Imagine that your application, through the display manager tasked with multiplexing the display into a GUI (desktop environment), could signal when the portion of the display contents it was allocated has been updated, whether that is once a second or 60 times a second, with the display manager driving the display based on this information, without useless display refreshes. This seems to require the involvement of the display, the graphics card and the software architecture. Is there any merit in doing this? If not at the hardware level, then at least from a software engineering perspective?


 
 Post subject: Re: Driver driven display updates vs. today's poll-based sys
PostPosted: Fri Feb 02, 2018 3:20 pm 

Joined: Fri Feb 17, 2017 4:01 pm
Posts: 640
Location: Ukraine, Bachmut
The display doesn't poll the graphics card for updates. Instead, the display interface, say HDMI, on the host side (the source) drives the display (the sink), sending data, and of course clocks, through its protocol. It's controlled by software on the host side.

I am not sure that a display doesn't need refreshing. It's probably like RAM - if you forget to refresh it, the content will disappear.
What you are talking about is rendering into the framebuffer; that content may change often or not. But the display itself probably needs constant refreshing to keep showing it, whether the content changes or not. Of course it could have internal RAM inside, but the host has much more of it. That's probably why it has been laid out this way.

_________________
ANT - NT-like OS for x64 and arm64.
efify - UEFI for a couple of boards (mips and arm). Suspended due to the loss of all the target boards (russians destroyed our town).


 
 Post subject: Re: Driver driven display updates vs. today's poll-based sys
PostPosted: Fri Feb 02, 2018 3:51 pm 

Joined: Thu Jul 05, 2007 8:58 am
Posts: 223
The move away from fixed refresh rates has already somewhat happened. DisplayPort Adaptive-Sync, which is used in AMD's FreeSync, already has options for delayed frames, and NVIDIA provides similar functionality with G-Sync displays. These operate on a similar principle to what you seem to suggest, in that frames get pushed to the display only after the software/video card indicates they are completed. I can't find exact references on how this is handled precisely in the display itself, but the fact that FreeSync requires frame rates of at least 9 frames per second suggests to me that for normal (LCD, TFT, OLED) displays, some part of the driver circuitry may need regular updates.

The fact remains that, for both OLED and LCD (I am less familiar with how TFT works), some form of active power is needed to keep content on the display, either in the form of an oscillating waveform to drive the LCD interface (since a DC potential across LCD cells can cause accelerated wear), or in the form of power to drive the LEDs in an OLED display. I am guessing that lowering the frame rate much below 9 Hz wouldn't do much for power consumption, given that.

As for e-ink displays, these operate on a completely different principle and don't need active drive current except when changing pixels. I could not find datasheets for actual panels, but typical interfaces for these kinds of displays already don't have a fixed refresh rate, and also seem to support partial screen updates.

If you would like to learn more about this subject, I would suggest finding datasheets for display panels and display driver ICs and learning from those. In my experience these give the best picture of how and why this stuff works the way it does.


 
 Post subject: Re: Driver driven display updates vs. today's poll-based sys
PostPosted: Fri Feb 02, 2018 6:43 pm 

Joined: Sat Jan 15, 2005 12:00 am
Posts: 8561
Location: At his keyboard!
Hi,

amn wrote:
So, with today's and yesterday's computer systems, we have an HDMI (back in the day, VGA) or other digital signal interface between the graphics card and the display device, typically an LCD. As far as I understand, the graphics card has some control over when the contents of the display are "refreshed" (not video RAM, which is a separate matter), but typically we always talk about some applied refresh rate, the frequency at which the display in effect updates itself.

I am no electrical engineer, so I don't know whether it is typically the graphics card that exclusively drives the display, or whether it is the display and/or its HDMI (for example) subsystem that automatically polls the graphics card at regular intervals for updates from the framebuffer. These details notwithstanding, I wonder:


For current hardware, you can assume that the video card is mostly a pair of counters where each counter has a bunch of comparators. The first counter is incremented at a specific frequency (called the "pixel clock frequency"), and there's some comparators that do certain things when that counter reaches certain values (start horizontal blanking, start horizontal sync, end horizontal sync, end horizontal blanking, etc). At the end of this (for the comparator with the highest value) it resets the count back to zero and increments a "line counter". The line counter is mostly the same - a bunch of comparators that do similar things when the line counter reaches certain values (start vertical blanking, start vertical sync, ..., reset the line counter back to zero).
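As a rough illustration only (the timing values below are in the style of a 640x480 @ 60 Hz mode, and the code is just a software model of the behaviour, not any real register layout), the whole arrangement works something like this:

Code:
/* Rough software model of the counter/comparator scheme described above.
 * Timing numbers are 640x480-style examples, used only to show the mechanism. */
#include <stdio.h>

struct crtc_timing {
    unsigned active, sync_start, sync_end, total;
};

int main(void)
{
    struct crtc_timing h = { 640, 656, 752, 800 };   /* counted in pixel clocks */
    struct crtc_timing v = { 480, 490, 492, 525 };   /* counted in lines        */

    for (unsigned line = 0; line < v.total; line++) {        /* line counter        */
        for (unsigned pix = 0; pix < h.total; pix++) {       /* pixel-clock counter */
            int h_active = pix  < h.active;
            int v_active = line < v.active;
            int h_sync   = pix  >= h.sync_start && pix  < h.sync_end;
            int v_sync   = line >= v.sync_start && line < v.sync_end;

            if (h_active && v_active) {
                /* fetch pixel (line, pix) from the frame buffer and send it */
            }
            (void)h_sync; (void)v_sync;   /* these levels drive the sync outputs */
        }
        /* the comparator at h.total resets the pixel counter, bumps the line counter */
    }
    /* the comparator at v.total resets the line counter: one frame, repeated ~60x/sec */
    printf("one frame: %u x %u total clocks\n", h.total, v.total);
    return 0;
}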

The monitor receives the resulting signals, and based on the timing and sync pulse polarities it has to auto-guess what the video mode is. Apart from that (especially for old CRT monitors), the monitor is relatively simple and just does what the video card's signals tell it to do. More importantly (originally) the monitor didn't have any reason to store any pixel values anywhere - you'd have an electron beam that charged up a few dots of phosphor, and that phosphor would glow for a little while. How long the phosphor glowed was part of the monitor's design and determined the refresh rate - in a way, you can think of "refresh rate" as "how often the phosphor dots need to be refreshed". If you refreshed the display too fast the phosphor dots would get "overcharged" and you'd get a horrible "everything too bright" picture, and if you refreshed the display too slowly you'd get a "too dark" picture (and if it's very bad, you'd get a flickering mess). Ideally (to avoid these problems), the video card uses a refresh rate that the monitor is designed for ("60 Hz or suffer").

Modern monitors are designed to behave the same - in other words, they're designed to emulate phosphor dots that need refreshing. I'd assume this is mostly done with capacitors, where the video card's signals are used to charge the right pixel's capacitor/s quickly, and the capacitor/s (driving the LCD or LED or whatever) takes "about 1/60th of a second" to discharge.

In a perfect world (Hah!) the whole lot would be completely redesigned such that the monitor has its own memory/frame buffer that doesn't need to be constantly refreshed, where the video card only tells the monitor which areas were modified (as soon as they're modified), and where the monitor can use whatever refresh rate it likes (possibly splitting the screen into many zones and refreshing all the zones in parallel to get extremely high refresh rates without requiring insane amount of bandwidth from video card to monitor).
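If you wanted to make that imaginary link concrete, the "packet" the video card sends might look something like the following. To be clear, this is entirely hypothetical; nothing like it exists in HDMI or DisplayPort:

Code:
/* Hypothetical message for a "send only what changed" link.
 * Purely illustrative; not part of any real display protocol. */
#include <stdint.h>

struct dirty_region_update {
    uint16_t x, y;            /* top-left corner of the changed area           */
    uint16_t width, height;   /* size of the changed area                      */
    uint8_t  pixels[];        /* width*height pixels; the monitor stores them  */
                              /* in its own frame buffer and refreshes the     */
                              /* panel on whatever schedule it likes           */
};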

amn wrote:
Why, in this day and age, can't we switch over to a manual, software-initiated display (not just framebuffer!) update mechanism, supported at the hardware level all the way to the display?


Mostly because "small evolutionary steps" are more cost effective (e.g. easier for compatibility, etc) than "redesign the world".

However please note that how software works is not restricted to the way hardware works. Just because the video card sends pixel data to the monitor at a specific rate (e.g. 60 frames per second) does not mean that software must update the frame buffer at the same specific rate.

More specifically, with modern (GPU) hardware one idea would be to give each application its own texture, and then let the video driver merge/"compose" all the textures (under control of the GUI) to generate the frame buffer. In this case each application can update its texture (possibly using a "push" model) whenever it feels like it without caring about refresh rate at all.
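A very rough sketch of that control flow (invented types; a real compositor would do the blits on the GPU rather than in plain C on the CPU):

Code:
/* Minimal sketch of the "one texture per application" composition idea. */
#include <stdbool.h>
#include <stddef.h>
#include <stdio.h>

struct window {
    const char *name;
    bool dirty;     /* set when the app pushes a new texture, cleared after composing */
};

/* Driver-side composition step: runs when at least one window changed,
 * completely decoupled from whatever rate the panel is scanned out at. */
static void compose(struct window *wins, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        if (!wins[i].dirty)
            continue;                  /* untouched window: reuse its old texture */
        printf("re-blitting %s into the frame buffer\n", wins[i].name);
        wins[i].dirty = false;
    }
}

int main(void)
{
    struct window wins[] = {
        { "text editor", false },      /* idle: pushes nothing for seconds at a time     */
        { "3D game",     true  },      /* pushes a new texture whenever a frame is ready */
    };
    compose(wins, sizeof wins / sizeof wins[0]);
    return 0;
}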


Cheers,

Brendan

_________________
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.


 
 Post subject: Re: Driver driven display updates vs. today's poll-based sys
PostPosted: Sat Feb 03, 2018 5:03 pm 

Joined: Thu May 17, 2007 1:27 pm
Posts: 999
There are multiple points here:

First, the graphics card already "pushes" all data to the display. The graphics card provides the monitor with its display clock and not the other way around.

Secondly, partial display updates do not really make sense for graphics; you always want to update the whole screen at the same time, otherwise you'll get tearing on screen.

As partial updates do not make sense, having the graphics card push updates at non-constant rates increases the bandwidth and processing requirements instead of decreasing them: saying "I want to push a new frame now" requires additional communication. As davidv1992 already pointed out, AMD and NVIDIA already have technologies that allow non-constant refresh rates. It should be noted that DisplayPort is already the fastest interconnect in consumer-grade PCs (at ~6 Gbit/s per lane; compare that to your typical 1 Gbit/s Ethernet). It's not as if we can improve its bandwidth without non-trivial costs; higher resolution/refresh rate modes (e.g. 4K at 144 Hz) already saturate the ~25 Gbit/s of bandwidth that a 4-lane DisplayPort link provides.
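Rough numbers to illustrate that last point: 3840 x 2160 pixels x 24 bits per pixel x 144 Hz is already about 28.7 Gbit/s of raw pixel data, before blanking and protocol overhead, which is more than the ~25 Gbit/s the 4-lane link delivers.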

As Brendan pointed out, having an API that does not care about refresh rates and can push updates to the graphics card at any rate does indeed make sense, and I guess this is how all modern compositors already work: at least the Wayland compositors on Linux work exactly this way.
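For reference, the client-side pattern in Wayland looks roughly like this (connection setup, buffer allocation and the event loop are omitted, so this is only a fragment rather than a complete client):

Code:
/* Wayland commit/frame-callback pattern: the client pushes a frame when it
 * actually has new content and is told when the compositor has used it.
 * Display/compositor/buffer setup omitted for brevity. */
#include <stdint.h>
#include <wayland-client.h>

static void frame_done(void *data, struct wl_callback *cb, uint32_t time_ms)
{
    (void)data; (void)time_ms;
    wl_callback_destroy(cb);
    /* The compositor has consumed the previous frame: draw and commit the
     * next one here, whenever the application actually has new content. */
}

static const struct wl_callback_listener frame_listener = { .done = frame_done };

void push_frame(struct wl_surface *surface, struct wl_buffer *buffer)
{
    wl_surface_attach(surface, buffer, 0, 0);                 /* new content     */
    wl_surface_damage(surface, 0, 0, INT32_MAX, INT32_MAX);   /* what changed    */
    wl_callback_add_listener(wl_surface_frame(surface), &frame_listener, NULL);
    wl_surface_commit(surface);                               /* push atomically */
}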

_________________
managarm: Microkernel-based OS capable of running a Wayland desktop (Discord: https://discord.gg/7WB6Ur3). My OS-dev projects: [mlibc: Portable C library for managarm, qword, Linux, Sigma, ...] [LAI: AML interpreter] [xbstrap: Build system for OS distributions].


 