OSDev.org

The Place to Start for Operating System Developers
 Post subject: Set VGA mode and draw to screen in protected mode
PostPosted: Sun Nov 19, 2017 8:14 am 

Joined: Sun Nov 19, 2017 7:55 am
Posts: 5
I want to switch to VGA mode 13h in protected mode, but I don't know how.
I also need help with drawing to the screen.


 Post subject: Re: Set VGA mode and draw to screen in protected mode
PostPosted: Sun Nov 19, 2017 9:27 am 

Joined: Sun Sep 06, 2015 5:40 am
Posts: 47
If you can do stuff in Real Mode at any point, you can use int 10 with AH=00h. If not, you can either switch back to Real Mode and make the switch, or ask your bootloader to switch to the correct mode before it passes control to your kernel.

_________________
OS on Github | My Rust ACPI library


 Post subject: Re: Set VGA mode and draw to screen in protected mode
PostPosted: Sun Nov 19, 2017 10:55 am 

Joined: Mon Mar 25, 2013 7:01 pm
Posts: 5099
We do have some VGA resources in case you want to try setting the video mode without using the BIOS.


 Post subject: Re: Set VGA mode and draw to screen in protected mode
PostPosted: Sun Nov 19, 2017 11:03 am 

Joined: Sun Nov 19, 2017 7:55 am
Posts: 5
BaconWraith wrote:
If you can do stuff in Real Mode at any point, you can use int 10 with AH=00h. If not, you can either switch back to Real Mode and make the switch, or ask your bootloader to switch to the correct mode before it passes control to your kernel.


I have written this code for setting the VGA mode:
Code:
void set_vga_mode(unsigned char mode) {
    asm(
        "sti;\
        mov $0x00, %%ah;\
        int $0x10;\
        cli;"
        :
        : "a" (mode)
    );
}

I call it with mode=0x13 and then try to fill the screen with one color, but instead of the screen being cleared to that color, everything starts blinking.
The problem happens when trying to switch the mode; the fill doesn't even happen. :(
I'm testing it with QEMU.


 Post subject: Re: Set VGA mode and draw to screen in protected mode
PostPosted: Sun Nov 19, 2017 11:06 am 

Joined: Mon Mar 25, 2013 7:01 pm
Posts: 5099
If you're in protected mode, you can't use int 0x10.


 Post subject: Re: Set VGA mode and draw to screen in protected mode
PostPosted: Sun Nov 19, 2017 11:09 am 

Joined: Sun Nov 19, 2017 7:55 am
Posts: 5
Octocontrabass wrote:
If you're in protected mode, you can't use int 0x10.

Then I need to switch to real mode to set the mode, and then I can switch back to protected mode.


 Post subject: Re: Set VGA mode and draw to screen in protected mode
PostPosted: Sun Nov 19, 2017 11:46 am 

Joined: Sun Feb 20, 2011 2:01 pm
Posts: 110
Absolutely, drop into real mode.
If it's any help, here's a binary module I created to do precisely this.
Code:
BITS 16
section .code
org 0x1000
start:
jmp setvesa16
align 4
db "TERAOS16"
db "VESAVBE", 0
dd 0x1000
dd TeraOSentry32
dq TeraOSentry64
setvesa16:
push bx      ;width
push cx      ;height
push dx      ;depth
mov DWORD[maxModeBytes], 0
mov ax, ds
mov es, ax
lea di, [controllerBuffer]
mov ax, 0x4F00
int 10h
cmp ah, 0
jne .fail
cmp al, 0x4F
jne .fail
mov bx, [controllerBuffer+0x0E]
mov cx, [controllerBuffer+0x10]
mov fs, cx

.loop:
mov ax, [fs:bx]
cmp ax, 0xFFFF
je .next
mov cx, ax
lea di, [modeInfo]
mov ax, 0x4F01
int 0x10
mov ax, [es:di]
test ax, 1 << 4
jz .eof_loop
cmp BYTE[es:di+0x1B], 0x06
jne .eof_loop
xor eax, eax
mov ax, [es:di+0x12]
xor edx, edx
mov dx, [es:di+0x14]
cmp eax, 720
je .cmpy
cmp eax, 640
jne .eof_loop
.cmpy:
cmp edx, 480
jne .eof_loop
mul edx
mov [maxModeBytes], eax
mov [maxMode], cx
.eof_loop:
add bx, 2
jmp .loop
.next:
mov bx, [maxMode]
or bx, 0x4000
mov ax, 0x4F02
int 0x10
;Also store mode info for later use
lea di, [modeInfo]
mov cx, [maxMode]
mov ax, 0x4F01
int 0x10
pop dx
pop dx
pop dx
ret
.fail:
cli
hlt
ret

BITS 32
TeraOSentry32:
cli
pushad
sgdt [gdtold]
sgdt [gdtnew]
sidt [idtold]
;Now use existing GDT, but load it for non-paging
mov esi, [gdtold.base]
xor ecx, ecx
mov cx, [gdtold.limit]
add cx, 1
lea edi, [GDT]
rep movsb
lea eax, [GDT]
mov [gdtnew.base], eax
lgdt [gdtnew]
mov ax, 0x30   ;Data 16 segment
mov WORD[savess], 0x10
call 0x28:Entry16
lgdt [gdtold]
lidt [idtold]
;Register VBE info
mov eax, 0
lea ebx, [modeInfo]
int 0x69
popad
ret

BITS 64
bits32addr:
dq TeraOSentry64.bits32
dw 0x28
TeraOSentry64:
cli
push rbx
push rsi
push rcx
push rdi
push rax
sgdt [gdtold]
sgdt [gdtnew]
sidt [idtold]
;Now use existing GDT, but load it for non-paging
mov rsi, [gdtold.base]
xor rcx, rcx
mov cx, [gdtold.limit]
add cx, 1
lea rdi, [GDT]
rep movsb
lea rax, [GDT]
mov [gdtnew.base], rax
mov [saveesp], rsp
lgdt [gdtnew]
jmp far [bits32addr]
BITS 32
.bits32:
mov ax, 0x30
mov ds, ax
mov es, ax
mov fs, ax
mov gs, ax
mov ss, ax
;We are in compatibility mode. Now we need to move to legacy mode
mov eax, cr0
and eax, 0x7FFFFFFF
mov cr0, eax
jmp 0x28:.pmode
.pmode:
mov ax, 0x50   ;Data 16 segment
mov WORD[savess], 0x30
call 0x48:Entry16_64
mov eax, cr0
or eax, 0x80000000
mov cr0, eax
jmp 0x8:.end
BITS 64
.end:
mov rsp, [saveesp]
mov ax, 0x10
mov ds, ax
mov es, ax
mov fs, ax
mov gs, ax
mov ss, ax
lgdt [gdtold]
lidt [idtold]
;Register VBE info
mov rax, 0
mov rbx, modeInfo
int 0x69
pop rax
pop rdi
pop rcx
pop rsi
pop rbx
ret

BITS 16
idt_real:
   dw 0x3ff      ; 256 entries, 4b each = 1K
   dq 0         ; Real Mode IVT @ 0x0000

savcr0:
   dq 0         ; Storage location for pmode CR0.
saveesp:
   dq 0
savess:
   dw 0
savecs:
   dw 0

Entry16:
        ; We are already in 16-bit mode here!

   cli         ; Disable interrupts.
   mov [saveesp], esp
   ; Need 16-bit Protected Mode GDT entries!
   mov ds, ax
   mov es, ax
   mov fs, ax
   mov gs, ax
   mov ss, ax

   ; Disable paging (we need everything to be 1:1 mapped).
   mov eax, cr0
   mov [savcr0], eax   ; save pmode CR0
   and eax, 0x7FFFFFFe   ; Clear PG and PE to disable paging and protected mode.
   mov cr0, eax

   jmp 0:GoRMode      ; Perform Far jump to set CS.

GoRMode:
   mov sp, 0x9000      ; pick a stack pointer.
   mov ax, 0      ; Reset segment registers to 0.
   mov ds, ax
   mov es, ax
   mov fs, ax
   mov gs, ax
   mov ss, ax
   lidt [idt_real]
   sti         ; Restore interrupts -- be careful, unhandled int's will kill it.
   call setvesa16
   cli
   lidt [idtold]
   mov bx, [savecs]
   mov eax, [savcr0]
   mov cr0, eax
   jmp 0x8:.restored
   BITS 32
   .restored:
   mov ax, [savess]
   mov ds, ax
   mov es, ax
   mov fs, ax
   mov gs, ax
   mov ss, ax
   mov esp, [saveesp]
   retf

   BITS 16
Entry16_64:
        ; We are already in 16-bit mode here!

   cli         ; Disable interrupts.

   ; Need 16-bit Protected Mode GDT entries!
   mov ds, ax
   mov es, ax
   mov fs, ax
   mov gs, ax
   mov ss, ax
   mov [saveesp], esp

   ; Disable paging (we need everything to be 1:1 mapped).
   mov eax, cr0
   mov [savcr0], eax   ; save pmode CR0
   and eax, 0x7FFFFFFe   ; Clear PG and PE to disable paging and protected mode.
   mov cr0, eax

   jmp 0:GoRMode_64      ; Perform Far jump to set CS.

GoRMode_64:
   mov sp, 0x9000      ; pick a stack pointer.
   mov ax, 0      ; Reset segment registers to 0.
   mov ds, ax
   mov es, ax
   mov fs, ax
   mov gs, ax
   mov ss, ax
   lidt [idt_real]
   sti         ; Restore interrupts -- be careful, unhandled int's will kill it.
   call setvesa16
   cli
   lidt [idtold]
   mov bx, [savecs]
   mov esp, [saveesp]
   mov eax, [savcr0]
   mov cr0, eax
   jmp 0x28:.restored
   BITS 32
   .restored:
   mov ax, [savess]
   mov ds, ax
   mov es, ax
   mov fs, ax
   mov gs, ax
   mov ss, ax
   retf
   

section .data

controllerBuffer: TIMES 512 db 0

modeInfo: TIMES 256 db 0

maxMode: dw 0
maxModeBytes: dq 0

gdtold:
.limit: dw 0
.base: dq 0
idtold: dq 0,0

gdtnew:
.limit: dw 0
.base: dq 0

GDT:
TIMES 16 dq 0



The header has a signature, the module name, and the 32-bit and 64-bit entry points. Note the mode-transition code.
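
For reference, that header corresponds to roughly the following C layout (a sketch only: the struct and field names are invented here, and only the byte layout is taken from the assembly above):

Code:
#include <stdint.h>

/* Hypothetical C view of the module header emitted at org 0x1000 above. */
struct vbe_module_header {
    uint8_t  entry_jump[4];   /* jmp setvesa16, padded by the "align 4" */
    char     os_signature[8]; /* "TERAOS16" */
    char     module_name[8];  /* "VESAVBE" plus a terminating 0 */
    uint32_t load_address;    /* dd 0x1000 */
    uint32_t entry32;         /* dd TeraOSentry32 */
    uint64_t entry64;         /* dq TeraOSentry64 */
} __attribute__((packed));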

_________________
Whoever said you can't do OS development on Windows?
https://github.com/ChaiSoft/ChaiOS


 Post subject: Re: Set VGA mode and draw to screen in protected mode
PostPosted: Sun Nov 19, 2017 11:55 am 

Joined: Mon Mar 25, 2013 7:01 pm
Posts: 5099
GDavid wrote:
Then I need to switch to real mode to set the mode, and then I can switch back to protected mode.

No you don't. You can program the VGA registers directly from protected mode. (It's much nicer than switching back to real mode, if you ask me.)

We even have a list of values to program into those registers.
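
To make that concrete, here is a rough sketch of the usual programming sequence in C. The outb/inb helpers are ordinary GCC inline assembly, and the register values below are the commonly circulated dump for 320x200x256 - cross-check them against the table in the wiki's VGA resources before relying on them:

Code:
#include <stdint.h>

/* GCC/Clang port I/O helpers. */
static inline void outb(uint16_t port, uint8_t val) {
    asm volatile ("outb %0, %1" : : "a"(val), "Nd"(port));
}
static inline uint8_t inb(uint16_t port) {
    uint8_t v;
    asm volatile ("inb %1, %0" : "=a"(v) : "Nd"(port));
    return v;
}

/* Commonly circulated register dump for 320x200x256 (mode 13h).
 * Verify against the wiki's table before relying on it. */
static const uint8_t seq[5]   = {0x03,0x01,0x0F,0x00,0x0E};
static const uint8_t crtc[25] = {0x5F,0x4F,0x50,0x82,0x54,0x80,0xBF,0x1F,
                                 0x00,0x41,0x00,0x00,0x00,0x00,0x00,0x00,
                                 0x9C,0x0E,0x8F,0x28,0x40,0x96,0xB9,0xA3,0xFF};
static const uint8_t gc[9]    = {0x00,0x00,0x00,0x00,0x00,0x40,0x05,0x0F,0xFF};
static const uint8_t ac[21]   = {0x00,0x01,0x02,0x03,0x04,0x05,0x06,0x07,
                                 0x08,0x09,0x0A,0x0B,0x0C,0x0D,0x0E,0x0F,
                                 0x41,0x00,0x0F,0x00,0x00};

void vga_set_mode_13h(void) {
    outb(0x3C2, 0x63);                        /* miscellaneous output register */

    for (uint8_t i = 0; i < 5; i++) {         /* sequencer: index 0x3C4, data 0x3C5 */
        outb(0x3C4, i);  outb(0x3C5, seq[i]);
    }

    outb(0x3D4, 0x11);                        /* unlock CRTC registers 0-7 */
    outb(0x3D5, inb(0x3D5) & 0x7F);
    for (uint8_t i = 0; i < 25; i++) {        /* CRTC: index 0x3D4, data 0x3D5 */
        outb(0x3D4, i);  outb(0x3D5, crtc[i]);
    }

    for (uint8_t i = 0; i < 9; i++) {         /* graphics controller: 0x3CE/0x3CF */
        outb(0x3CE, i);  outb(0x3CF, gc[i]);
    }

    for (uint8_t i = 0; i < 21; i++) {        /* attribute controller: reset the
                                                 flip-flop, then index and data on 0x3C0 */
        (void)inb(0x3DA);
        outb(0x3C0, i);  outb(0x3C0, ac[i]);
    }
    (void)inb(0x3DA);
    outb(0x3C0, 0x20);                        /* re-enable the display */
}

/* In mode 13h the framebuffer is one byte per pixel at 0xA0000 (320x200). */
void putpixel_13h(int x, int y, uint8_t color) {
    ((volatile uint8_t *)0xA0000)[y * 320 + x] = color;
}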


 Post subject: Re: Set VGA mode and draw to screen in protected mode
PostPosted: Sun Nov 19, 2017 12:23 pm 

Joined: Sun Nov 19, 2017 7:55 am
Posts: 5
Octocontrabass wrote:
GDavid wrote:
Then I need to switch to real mode to set the mode, and then I can switch back to protected mode.

No you don't. You can program the VGA registers directly from protected mode. (It's much nicer than switching back to real mode, if you ask me.)

We even have a list of values to program into those registers.

What does the "index" stand for in the list?


 Post subject: Re: Set VGA mode and draw to screen in protected mode
PostPosted: Sun Nov 19, 2017 12:46 pm 

Joined: Mon Mar 25, 2013 7:01 pm
Posts: 5099
GDavid wrote:
What does the "index" stand for in the list?

VGA has dozens of registers, so most ports can be used to access several different registers. To select the register you want, you need to first write the index value.

Keep in mind the access pattern is not the same for every port. For example, to write a register behind port 0x3C0 you must first read port 0x3DA (you may discard the value), then write the index to port 0x3C0, and then write the data to port 0x3C0. However, to write a register behind port 0x3C4, you write the index to port 0x3C4 and then write the data to port 0x3C5.
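
As a small illustration of those two access patterns (a sketch; the helper names are mine, not from any particular codebase):

Code:
#include <stdint.h>

static inline void outb(uint16_t port, uint8_t val) {
    asm volatile ("outb %0, %1" : : "a"(val), "Nd"(port));
}
static inline uint8_t inb(uint16_t port) {
    uint8_t v;
    asm volatile ("inb %1, %0" : "=a"(v) : "Nd"(port));
    return v;
}

/* Attribute controller: read 0x3DA to reset the index/data flip-flop,
 * then write the index and the data to the same port, 0x3C0. */
static void vga_write_attribute(uint8_t index, uint8_t value) {
    (void)inb(0x3DA);
    outb(0x3C0, index);
    outb(0x3C0, value);
}

/* Sequencer: write the index to 0x3C4, then the data to 0x3C5. */
static void vga_write_sequencer(uint8_t index, uint8_t value) {
    outb(0x3C4, index);
    outb(0x3C5, value);
}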

If that doesn't make sense to you, take a look at the VGA resources to see if someone else can explain it better than me.


 Post subject: Re: Set VGA mode and draw to screen in protected mode
PostPosted: Sun Nov 19, 2017 1:36 pm 

Joined: Sun Nov 19, 2017 7:55 am
Posts: 5
Can someone please send me working code for setting up VGA mode 13h?


 Post subject: Re: Set VGA mode and draw to screen in protected mode
PostPosted: Sun Nov 19, 2017 3:44 pm 

Joined: Sat Jan 15, 2005 12:00 am
Posts: 8561
Location: At his keyboard!
Hi,

GDavid wrote:
Can someone please send me working code for setting up VGA mode 13h?


Sure:

Code:
    mov ax,0x0013
    int 0x10


This code only works in some situations:
  • It assumes that there's at least one video card with a monitor attached.
  • It assumes that the video card emulates VGA at the video ROM level.
  • It assumes that the OS is designed properly and therefore sets a video mode in real mode during boot.
  • It assumes that the computer was booted using BIOS (and not UEFI).

The first two problems can be mitigated a little by using "int 0x10, ax = 0x1A00" to check that a video card that emulates VGA at the video ROM level exists.

Note that there's a difference between "emulates VGA at the video ROM level" and "emulates VGA at the register level". Neither is guaranteed; but the latter is less likely and much harder to check (and should only be relied on in a native video card driver that's been written specifically for VGA cards manufactured by IBM in the 1980s and nothing else).

The other problem is that it might work. The ancient "320*200 with 256 colours" mode is so ugly (and everyone has become so used to resolutions like 1920*1600) that successfully switching to this video mode is like successfully spitting in the user's face - not supporting any video at all is probably an improvement.

Also don't forget that it's relatively easy to avoid all these problems (e.g. during boot, use VBE on BIOS systems or GOP/UGA on UEFI systems to set a video mode that isn't horrifically disgusting).
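
For the UEFI half of that suggestion, a minimal GOP mode-set sketch might look like the following (assuming an EDK2-style environment; gBS, gEfiGraphicsOutputProtocolGuid and the protocol headers come from EDK2, and the function name here is made up):

Code:
#include <Uefi.h>
#include <Library/UefiBootServicesTableLib.h>
#include <Protocol/GraphicsOutput.h>

/* Pick the first GOP mode matching the requested resolution and switch to it.
 * Returns the physical framebuffer base, or 0 on failure. A sketch only; real
 * code should also check the pixel format and hand the mode info to the kernel. */
EFI_PHYSICAL_ADDRESS SetGopMode(UINT32 Width, UINT32 Height)
{
    EFI_GRAPHICS_OUTPUT_PROTOCOL *Gop;
    EFI_STATUS                   Status;

    Status = gBS->LocateProtocol(&gEfiGraphicsOutputProtocolGuid, NULL, (VOID **)&Gop);
    if (EFI_ERROR(Status))
        return 0;

    for (UINT32 i = 0; i < Gop->Mode->MaxMode; i++) {
        EFI_GRAPHICS_OUTPUT_MODE_INFORMATION *Info;
        UINTN                                InfoSize;
        BOOLEAN                              Match;

        Status = Gop->QueryMode(Gop, i, &InfoSize, &Info);
        if (EFI_ERROR(Status))
            continue;
        Match = (Info->HorizontalResolution == Width &&
                 Info->VerticalResolution   == Height);
        gBS->FreePool(Info);
        if (Match && !EFI_ERROR(Gop->SetMode(Gop, i)))
            return Gop->Mode->FrameBufferBase;
    }
    return 0;
}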


Cheers,

Brendan

_________________
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.


 Post subject: Re: Set VGA mode and draw to screen in protected mode
PostPosted: Sun Nov 19, 2017 5:21 pm 

Joined: Fri Oct 27, 2006 9:42 am
Posts: 1925
Location: Athens, GA, USA
GDavid wrote:
Can someone please send me working code for setting up VGA mode 13h?


From protected mode? Unfortunately, no, nobody can, at least not without more information from you, a lot of work, and a high probability that it simply won't run.

This isn't us being jerks. This is simply the fact of the matter. Despite the widespread claim that VGA and VESA VBE are 'standards', the truth is that the VBE standard only defines real mode BIOS support, and VGA isn't a standard at all - it was a particular set of hardware from IBM back in 1987 that was widely copied or emulated, but was never copied exactly for a variety of reasons, some technical, some legal, and some historical.

A bit of a history lesson is in order, and a sordid tale it is (well, just a little, anyway). This is mostly from memory, so the OP should take it with a grain of salt; for others here, comments and corrections are welcome.

The original Video Graphics Array hardware was an ASIC hardwired to the IBM PS/2 motherboards, and required proprietary firmware which IBM never licensed. All of the VGA cards developed later by other companies were basically attempts to copy the hardware behavior of the actual VGA system, but none of them were exactly the same - they had no legitimate access to the ASIC used by IBM. Most of them only managed to mimic part of the register-level behavior, and many inexpensive ones only mimicked the IBM video ROM routines - and generally only the ones for real mode, at that (since the IBM 32-bit Video BIOS ROMs only applied to the actual, Micro-Channel-based PS/2 systems anyway, and pretty much no one making BIOSes for ISA, EISA, or later, PCI systems, bothered imitating them).

This left a serious gap in the video support, as even ROM-level compatibility was spotty at best. This was part of why, back in the early 1990s, adoption of VGA was rather slow.

In 1989, NEC and the other major PC manufacturers came together to form the Video Electronics Standards Association, to try and rein in this chaos with a new, improved VGA compatible system called Super VGA, but while they were pretty quick to get together on a new high-speed (for the time) bus form factor, VESA Local Bus, the standards development for SVGA compatibility dragged out, especially once it became clear how rapidly the technology was evolving - no one wanted to commit to something that would be outdated before it reached the consumers.

After a few years of putzing around with trying to agree on a register set that they could all implement or emulate, it was decided that they would focus on the ROM BIOS compatibility instead. By 1992, the first version of VESA BIOS Extensions was released, and... quickly found wanting. The version 2 update quickly followed in 1994, and in 1996 VBE 3 was released.

You will note that this standard hasn't been updated in 21 years. This is not a coincidence, as while they were working on VBE 3, two important things happened.

One was that 2D acceleration became a standard (but not standardized) part of most video cards, and 3D acceleration was rapidly coming up. While these accelerated cards all were supposedly VBE compliant (few really were), the standard said nothing about acceleration, so all of these extensions were specific to the card in question. This meant that to use the features people were actually buying these cards for, they would need a card-specific driver.

More significantly, Windows 95 was released right in the middle of the standard's development.

Now, prior to Windows 3.0, the standard version of Windows ran in real mode, and you had to specifically set it to run in either 16-bit or 32-bit protected mode to get anything more than the default 640KB of memory, or else use some kind of memory extender that could bank-switch real mode memory. Windows 3.1 dropped support for real mode, and became moderately successful in the business market, but it was still basically an add-on to MS-DOS. These versions of Windows basically had no support for advanced gaming, so games were still mostly written to start in MS-DOS, and would set the video mode using VBE (or more often, a card-specific driver) before launching a 32-bit extender. Incompatibility was such an issue that it wasn't uncommon for game developers to ship different versions of a game for different video cards, especially for versions which shipped as pack-ins for a video card.

VESA responded to this by adding a 32-bit BIOS, but only a handful of companies implemented it - they knew that to really get the most of the accelerated cards, the users would need a card-specific driver, so they focused on the drivers and VBE started to become an afterthought - especially since the standard still didn't address the acceleration features so crucial to newer software.

Then Windows 95 came out. Unlike earlier versions, it was a stand-alone OS, with only a stub of MS-DOS still present like an appendix in the form of the 'Command window'. While it could run older 16-bit real-mode DOS and 16-bit protected-mode Windows programs, the system itself always ran in 32-bit p-mode, and didn't play well with older DOS extenders as a rule. While game support in Win95 was pretty abysmal at first, as Microsoft's first attempt at a high-performance video API (WinG) was a disaster, the introduction of DirectX in late 1995 soon meant that even games didn't need the DOS extenders any more.

What's more, it rendered even the 32-bit VBE functions useless - Windows needed to have a driver, even if only a generic one, and the 32-bit VBE functions weren't really designed with Windows in mind. You could write a generic driver that used them, and Microsoft did, but because they were limited in both support on the cards, and support for critical features, any manufacturer that wanted their card to run well on a Windows system needed to have a Windows driver, or at least give Microsoft enough information to write one for them.

And since no one wanted to share proprietary information on their card's acceleration support, most wrote their own, and no one was publishing enough of a spec to write a driver - a few were reverse engineered eventually, as would later be done for the Nouveau drivers written to support nVIDIA cards under Linux, but by the time a given card was worked out, its replacement was coming to market.

The same was pretty much true for OS/2, as well, though by then OS/2 was fading fast anyway, and early Linux had effectively zero market share for the desktop (it would eventually increase, but never by much). Most card manufacturers didn't bother with a Linux or OS/2 driver.

Eventually, most of the bigger card manufacturers settled on chipsets from either nVidia, Intel, or ATI (later bought by AMD), and as Linux became (slightly) more important, those three companies would provide Linux drivers for their chipsets, but while Intel would publish their GPU chipsets' docs starting around 2004 (I think), it wasn't until around 2015 (with the release of amdgpu) that AMD would provide enough details for an open version of their drivers (and there is reason to believe that they still kept some parts secret, and that some features are only fully supported in their proprietary Catalyst drivers), while nVidia still haven't given a full spec or driver source code - they provide only a binary blob driver for Linux which they maintain themselves. For anything other than those three, you can consult Wicked-Pedo's list of FOSS graphics drivers, but it is really hit or miss.

Furthermore, by this time OpenGL (which was based on the GL standard originating in the late 1980s) had been ported to Linux, giving Linux a generalized graphics API similar to DirectX, which meant that Linux applications could use any card whose driver played nice with OpenGL, making proprietary drivers somewhat more palatable to at least a significant number of Linux users.

All of these things led to VBE getting more and more marginalized over time; further standardization never got any traction.

As a result, even for OS developers, the video BIOS is effectively useless once you are out of real mode. You can, with a lot of work, switch the system back to real mode, or in some cases use virtual-86 mode to run the real-mode BIOS.

Even if you do that, you are locking yourself into using legacy BIOS, at a time when that is rapidly going away - UEFI generally doesn't provide support for the VGA BIOS routines, since that was a function of the ROMs on the video card. This approach probably won't work at all on a motherboard from after 2015.

Conversely, in order to avoid the BIOS and switch the video mode while in 32-bit p-mode you could write a full-blown driver for whatever video card you are using based on whatever parts of the spec are publicly available and hope that the information you have is both complete and correct. At this point, the VGA support becomes moot - you might as well just focus on supporting the newer modes, since they are probably what you really need anyway.

Since such a driver would perforce be specific to the OS, you as the OS dev would be the only one who would have enough information to write it, at least until you wrote out a how-to for working with the kernel and the driver ABI (and good luck finding someone else interested enough to help you, rather than working on their own OS project).

The practical upshot of all this is that, if you need to set the VGA video mode but aren't ready to write a mostly-complete driver for the specific card you are using, it would be best to do it before switching to p-mode. Fortunately, the UEFI standard has built-in support for mode switching prior to loading the OS, and GRUB can be directed to mode-set prior to loading as well (which should work for both UEFI and BIOS, now that they are getting UEFI support sorted out in GRUB).

TL;DR Set the mode before you switch into protected mode, preferably in your boot loader settings - and even that is only a stopgap until you can write proper drivers for at least the Intel and AMD chipsets. Oh, and you probably shouldn't be writing your own boot loader, if only because of exactly this headache.

_________________
Rev. First Speaker Schol-R-LEA;2 LCF ELF JAM POEE KoR KCO PPWMTF
Ordo OS Project
Lisp programmers tend to seem very odd to outsiders, just like anyone else who has had a religious experience they can't quite explain to others.


 Post subject: Re: Set VGA mode and draw to screen in protected mode
PostPosted: Sun Nov 19, 2017 8:52 pm 

Joined: Sun Feb 20, 2011 2:01 pm
Posts: 110
Also, if you don't use any of the above solutions, you can switch to real mode, then return. This allows setting a VBE mode - the only way without a graphics card driver. You won't get any acceleration - but let's face it, most hobby OSes aren't going to be running Call of Duty! You can get access to the framebuffer, and from there it's a classic situation of writing to video memory. UEFI, if you go down that route, provides information about the FB, and also provides primitives to draw to the screen in BootServices. GRUB can be told to set a video mode - multiboot2 is especially good for this. However, if you choose a video mode through GRUB, you can't choose with any of your own logic - you'd have to modify your kernel image and reboot (ugh), or stick with what GRUB gives you (which may not even be the mode you requested) - even more ugh. That's why I personally would recommend producing a real mode video mode module, because it keeps the control in your OS (which, I think, is the whole point for most of us here). I can be a bit of a purist though... and UEFI is definitely worth it if you have access to it.
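
Once you have the framebuffer (from the VBE mode info block, multiboot2, or GOP), "writing to video memory" boils down to something like this sketch, assuming a 32 bpp linear framebuffer whose base address and pitch came from your mode-set:

Code:
#include <stdint.h>

/* Minimal linear-framebuffer drawing, assuming 32 bits per pixel.
 * fb    = virtual address the framebuffer is mapped at
 * pitch = bytes per scanline as reported by VBE/GOP (not always width * 4) */
static inline void putpixel(volatile uint8_t *fb, uint32_t pitch,
                            uint32_t x, uint32_t y, uint32_t argb) {
    *(volatile uint32_t *)(fb + y * pitch + x * 4) = argb;
}

static void fill_screen(volatile uint8_t *fb, uint32_t pitch,
                        uint32_t width, uint32_t height, uint32_t argb) {
    for (uint32_t y = 0; y < height; y++)
        for (uint32_t x = 0; x < width; x++)
            putpixel(fb, pitch, x, y, argb);
}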

_________________
Whoever said you can't do OS development on Windows?
https://github.com/ChaiSoft/ChaiOS


 Post subject: Re: Set VGA mode and draw to screen in protected mode
PostPosted: Mon Nov 20, 2017 6:43 am 

Joined: Sat Nov 21, 2009 5:11 pm
Posts: 852
The general algorithm that GigaOS uses is:

- Calculate vertical timing, then horizontal timing, using the Coordinated Video Timing standard.
- Ask display driver to find the one or two most closely matching pixel clock rates.
- For each returned result, calculate the timings again, and use the one that produces a refresh rate closest to the one specified.
- Use negative H-sync polarity and positive V-sync polarity, to indicate that Coordinated Video Timing is being used.
- Ask driver to set the mode using the calculated parameters. The VGA driver first tries to find a matching VBE mode. If unsuccessful, it will program the mode using the VGA interface if possible.

VBE calls involve going into real mode, but a better solution would be to use Virtual 8086 mode. It's a bit more involved, though.

