OSDev.org
https://forum.osdev.org/

How exactly do I generate the bitmask for font rendering?
https://forum.osdev.org/viewtopic.php?f=1&t=56289

Author:  ThatCodingGuy89 [ Tue May 24, 2022 10:30 pm ]
Post subject:  How exactly do I generate the bitmask for font rendering?

This post is about this page in the wiki: https://wiki.osdev.org/VGA_Fonts#Displaying_a_character

Specifically, the optimized version of the character-displaying routine. There is no explanation of how exactly this bitmask is generated (or hardcoded).

I don't know if I'm an idiot and have missed some computer-graphics term for "bitmask" that implies a particular structure, or if the page is just unclear about how to do this.

Author:  iansjack [ Wed May 25, 2022 2:40 am ]
Post subject:  Re: How exactly do I generate the bitmask for font rendering

I'm not quite sure what your problem is. The section "Decoding of bitmap fonts" shows you the structure of the character bitmap.

Edit: Ah - I think you're referring to the mask_table array. Sorry, I've no idea where that came from.

Author:  klange [ Wed May 25, 2022 3:01 am ]
Post subject:  Re: How exactly do I generate the bitmask for font rendering

While the math seems to have been left as an exercise for the reader, the mask table is basically the same data as the bitmap - just with whole bytes filled in instead of individual bits. Or rather, with whole pixels filled in - whatever that pixel size may be.

A brief explanation of the idea is that you have your bitmap representation of a glyph, made up of a byte per row, with bits representing columns, and you take this bitmap and expand it out so that each bit is now 8 bits. Note that for an 8-bit bitmap row, this means a mask row is 8 bytes - the example code seems to assume 32-bit values, so you need two of them, but I'm lazy and will use 64-bit values. So a row of 01101100b becomes 0x00FFFF00_FFFF0000. Except... we're probably on a little-endian machine, so this is actually backwards. It should be byte reversed to 0x0000FFFF_00FFFF00. We do this for every row in our bitmap and then the masking can be used to write multiple pixels at a time, which is probably faster than individual calls to a set-pixel function.

Author:  ThatCodingGuy89 [ Wed May 25, 2022 3:56 am ]
Post subject:  Re: How exactly do I generate the bitmask for font rendering

klange wrote:
So a row of 01101100b becomes 0x00FFFF00_FFFF0000. Except... we're probably on a little-endian machine, so this is actually backwards. It should be byte reversed to 0x0000FFFF_00FFFF00.


Ah, so 0b00100110 becomes 0x00FFFF00, 0x00FF0000. Just checking if I understood your explanation correctly.
