OSDev.org

The Place to Start for Operating System Developers
 Post subject: Re: Do i really need to study monstrously long intel manual?
PostPosted: Mon Feb 06, 2023 11:09 am 
Cyao
Member

Joined: Tue Jun 07, 2022 11:23 am
Posts: 78
Location: France
eekee wrote:
Given this thread topic, I think I ought to mention that a complexity-hating former friend hates Verilog, preferring VHDL. I have no idea how subjective or objective this is, but complexity-haters might want to compare the two languages to see how they get on. It didn't stop my friend from implementing all sorts of stuff in Verilog for an SBC, so perhaps it's not too bad.

Like your friend, I also hate Verilog (though I keep using it, since I don't want to rewrite my project). My main reason is that Verilog is very hard to read, especially when everything is mashed together and not properly indented. Also, in Verilog, code can simulate correctly yet still not work on real hardware (not all code is synthesizable). VHDL, on the other hand, is very strict (and, in my opinion, more readable):
Hardolaf#4710 from Discord wrote:
Verilog is what happens when you let Mentor Graphics into your standards committee.
VHDL is what happens when you allow the military to write the standard.

Also, here is a short post about Verilog: https://danluu.com/why-hardware-development-is-hard/ (after you read it, you will probably understand your friend).
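
To make the simulation-versus-synthesis point above concrete, here is a rough Verilog sketch (my own illustration with made-up module names, not code from any real project): the first module runs happily in a simulator but is not synthesizable, while the second maps directly onto flip-flops.

Code:
// Simulation-only: '#' delays and $display exist only in simulation;
// synthesis tools will reject or ignore this module.
module sim_only_example;
    reg clk = 0;
    initial begin
        #5 clk = 1;
        $display("clk=%b", clk);
    end
endmodule

// Synthesizable counterpart: a plain registered counter that maps to
// flip-flops and a small adder.
module synth_counter (
    input  wire       clk,
    input  wire       rst,
    output reg  [7:0] count
);
    always @(posedge clk) begin
        if (rst)
            count <= 8'd0;
        else
            count <= count + 8'd1;
    end
endmodule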

_________________
https://github.com/cheyao/Achieve-Core


 Post subject: Re: Do i really need to study monstrously long intel manual?
PostPosted: Mon Feb 06, 2023 3:53 pm 
rdos
Member

Joined: Wed Oct 01, 2008 1:55 pm
Posts: 3191
iansjack wrote:
rdos wrote:
A 12-core Threadripper processor is far more effective at general processing tasks than a CPU core in an FPGA.

I think you rather miss the point. I'm not trying to produce the next Threadripper or i9 any more than I'm trying to produce the next OS X or Windows.

I'm just having fun. :)


Me too. :-)

However, I see no fun in having to code for Linux/Unix-like platforms and use GCC, so I avoid the Pi and FPGA-based cores. If an FPGA-based CPU core could run x86 protected-mode assembly code, I might find it more worthwhile.

I would find it interesting to build a CPU core from scratch, write an assembler for it (from scratch), and maybe even a C compiler, but I certainly find no fun in adapting GCC to a new CPU.

Also, I've coded a lot of PCI drivers for my OS, and creating my own PCI device and writing a driver for it certainly is a lot of fun, and an interesting project.


 Post subject: Re: Do i really need to study monstrously long intel manual?
PostPosted: Mon Feb 06, 2023 4:02 pm 
rdos
Member

Joined: Wed Oct 01, 2008 1:55 pm
Posts: 3191
Cyao wrote:
Like your friend, I also hate Verilog (though I keep using it, since I don't want to rewrite my project). My main reason is that Verilog is very hard to read, especially when everything is mashed together and not properly indented. Also, in Verilog, code can simulate correctly yet still not work on real hardware (not all code is synthesizable). VHDL, on the other hand, is very strict (and, in my opinion, more readable):


I like Verilog. I don't know VHDL, but since Xilinx appeared to mainly use Verilog, I decided to learn Verilog and not VHDL.

Cyao wrote:
Also, here is a short post about Verilog: https://danluu.com/why-hardware-development-is-hard/ (after you read it, you will probably understand your friend).


I don't think FPGA programming is for dummies, much like OS programming isn't either. I have a fairly good knowledge of digital logic (it was part of the MSc I took in the 80s), so I don't make those kinds of mistakes. I know that the code is translated to flip-flops and hardware primitives, so I don't use multiply or divide in Verilog, and I certainly don't wonder how to output a string on a monitor. :-)
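
As a rough illustration of what I mean by respecting the hardware (a sketch only, with a made-up module name, not code from my project): a division by a constant power of two can be written as a shift, which is just rewiring bits, whereas a general division on variable operands would infer a large, slow divider circuit, if the tool accepts it at all.

Code:
// Sketch: dividing an unsigned value by 16 as a right shift.
// This costs almost nothing in logic; a general variable '/' would not.
module div_by_16 (
    input  wire [31:0] value,
    output wire [31:0] quotient
);
    assign quotient = value >> 4;   // value / 16 for unsigned inputs
endmodule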

I very much doubt that VHDL can guarantee that code can be translated to logic. For instance, how would it know whether a 64-bit adder can be implemented in a single cycle at a 750 MHz clock frequency on the target FPGA device?
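
To illustrate the point (a sketch only, not from my design): the registered adder below is trivial to describe in any HDL, but whether it meets a single-cycle 750 MHz constraint is something only synthesis and place-and-route can report for the specific target device; the language itself cannot know.

Code:
// Sketch: a registered 64-bit adder. The HDL says nothing about whether
// this closes timing at 750 MHz on a given FPGA; the post-route timing
// report does.
module adder64 (
    input  wire        clk,
    input  wire [63:0] a,
    input  wire [63:0] b,
    output reg  [63:0] sum
);
    always @(posedge clk)
        sum <= a + b;
endmodule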

Actually, the toughest part is clock domain crossings and how to declare them in the constraints files, and that has nothing to do with Verilog at all.
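
For example (a sketch with made-up names, not code from my project): a single-bit signal crossing into another clock domain is usually passed through a two-flop synchronizer like the one below, and the asynchronous path into it is then relaxed in the constraints file (e.g. with set_false_path or set_max_delay in a Xilinx XDC file), which is tool territory rather than Verilog.

Code:
// Sketch: two-flop synchronizer for a single-bit signal arriving from
// another clock domain. The path into sync_ff1 would typically be covered
// by a set_false_path or set_max_delay constraint in the XDC file.
module sync2 (
    input  wire clk_dst,    // destination clock domain
    input  wire async_in,   // driven from another clock domain
    output wire sync_out
);
    reg sync_ff1, sync_ff2;
    always @(posedge clk_dst) begin
        sync_ff1 <= async_in;   // may go metastable
        sync_ff2 <= sync_ff1;   // settles to a stable value
    end
    assign sync_out = sync_ff2;
endmodule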


 Post subject: Re: Do i really need to study monstrously long intel manual?
PostPosted: Fri Feb 17, 2023 7:59 am 
eekee
Member

Joined: Mon May 22, 2017 5:56 am
Posts: 812
Location: Hyperspace
Cyao wrote:
eekee wrote:
Given this thread topic, I think I ought to mention that a complexity-hating former friend hates Verilog, preferring VHDL. I have no idea how subjective or objective this is, but complexity-haters might want to compare the two languages to see how they get on. It didn't stop my friend from implementing all sorts of stuff in Verilog for an SBC, so perhaps it's not too bad.

Like your friend, I also hate Verilog (though I keep using it, since I don't want to rewrite my project). My main reason is that Verilog is very hard to read, especially when everything is mashed together and not properly indented. Also, in Verilog, code can simulate correctly yet still not work on real hardware (not all code is synthesizable). VHDL, on the other hand, is very strict (and, in my opinion, more readable):
Hardolaf#4710 from Discord wrote:
Verilog is what happens when you let Mentor Graphics into your standards committee.
VHDL is what happens when you allow the military to write the standard.

Also, here is a short post about Verilog: https://danluu.com/why-hardware-development-is-hard/ (after you read it, you will probably understand your friend).

Thanks, that does help me understand. Readability, suitability for the task, and the inelegance of a design that makes correct simulation (or any other task) much more complex than it needs to be would all have mattered to my friend.


rdos wrote:
I very much doubt that VHDL can guarantee that code can be translated to logic. For instance, how would it know whether a 64-bit adder can be implemented in a single cycle at a 750 MHz clock frequency on the target FPGA device?

I understood it as comparable to explicit types in a programming language. For example, some things expressible in C still result in undefined behaviour, but declaring variable types so that the compiler can catch errors is still worthwhile.

_________________
Kaph — a modular OS intended to be easy and fun to administer and code for.
"May wisdom, fun, and the greater good shine forth in all your work." — Leo Brodie

