OSDev.org

The Place to Start for Operating System Developers
 Post subject: MenuetOS 1.26.10
PostPosted: Wed Jul 05, 2017 1:39 am 

Joined: Sat Apr 02, 2011 11:19 am
Posts: 12
Recent improvements: support for 32 GB of RAM, SMP for 32 CPUs, an improved MIDI player, calculator, games, webcam, ..

64-bit, 100% x86 asm.

Videos: https://www.youtube.com/channel/UCgNlod ... ZUROMSsJGg
Site: http://www.menuetos.net


 Post subject: Re: MenuetOS 1.26.10
PostPosted: Wed Jul 05, 2017 7:51 am 

Joined: Sat Mar 01, 2014 2:59 pm
Posts: 1146
Why isn't the 64-bit version open-source?

_________________
When you start writing an OS you do the minimum possible to get the x86 processor in a usable state, then you try to get as far away from it as possible.

Syntax checkup:
Wrong: OS's, IRQ's, zero'ing
Right: OSes, IRQs, zeroing


 Post subject: Re: MenuetOS 1.26.10
PostPosted: Wed Jul 05, 2017 8:23 am 

Joined: Thu Oct 13, 2016 4:55 pm
Posts: 1584
onlyonemac wrote:
Why isn't the 64-bit version open-source?

+1 I'm curious too.


 Post subject: Re: MenuetOS 1.26.10
PostPosted: Wed Jul 05, 2017 8:35 am 

Joined: Fri Feb 17, 2017 4:01 pm
Posts: 640
Location: Ukraine, Bachmut
Speaking of open source: your OS, guys, has a way nicer GUI than any Linux ever had. :mrgreen:
Great work.

_________________
ANT - NT-like OS for x64 and arm64.
efify - UEFI for a couple of boards (MIPS and ARM). Suspended due to the loss of the whole park of target boards (Russians destroyed our town).


 Post subject: Re: MenuetOS 1.26.10
PostPosted: Wed Jul 05, 2017 10:09 am 

Joined: Sat Dec 27, 2014 9:11 am
Posts: 901
Location: Maadi, Cairo, Egypt
onlyonemac wrote:
Why isn't the 64-bit version open-source?

From what I gather, Ville didn't want another fork similar to what happened with KolibriOS. I'm not aware of the exact reasons, however.

Anyway, I'll try this release soon; Menuet has always been one of my inspirations. :)

_________________
You know your OS is advanced when you stop using the Intel programming guide as a reference.


 Post subject: Re: MenuetOS 1.26.10
PostPosted: Sun Jul 16, 2017 6:34 pm 

Joined: Fri Mar 07, 2008 5:36 pm
Posts: 2111
Location: Bucharest, Romania
Don't be impressed by poor engineering. Clearly, MenuetOS is developed by people who are unable to think clearly. Even the licensing issue is proof of this. Who makes an open source project but changes the license as soon as they have something that people might want to fork? Wasn't that the whole point of picking out such a license in the first place?

As far as languages go, assembly was a downright terrible choice. First of all, it's inherently unportable. Secondly, it's difficult to write and maintain because very few design patterns can be realistically used, the type system is severely limited, the code is too verbose, etc. The whole point of using it revolves around the mistaken idea that the resulting code will perform faster and/or have a smaller memory footprint. However, this is false on two grounds. The first is that assembly micro-optimizations are limited to a particular microarchitecture, whereas a compiler for a portable language can easily compile with micro-optimizations for any microarchitecture. The second is that compilers can twist code in ways humans will not, for maintainability reasons: if a 500-line spaghetti routine performs better than its cleaned-up version, no sensible human will attempt to write it.

Also, just look at this quote from their homepage: "The design goal has been to remove the extra layers between different parts of an OS, which normally complicate programming and create bugs." Abstractions are meant to do the exact opposite (usually at no cost!) so what they did was to go for the very things they were trying to avoid.

People often conflate picking a hard road with skill. The correlation usually goes the other way.

_________________
"Computers in the future may weigh no more than 1.5 tons.", Popular Mechanics (1949)
[ Project UDI ]


 Post subject: Re: MenuetOS 1.26.10
PostPosted: Mon Jul 17, 2017 4:37 am 

Joined: Thu Oct 13, 2016 4:55 pm
Posts: 1584
@Love4Boobies: basically I agree with you, but you are mistaken in two things. First, design patterns are not tied to any programming language, so they can be used in assembly as well. After all, all high-level languages such as C/C++ produce assembly source (or equivalent machine code) in the end. Second, while I agree that removing abstractions would not make code faster, removing extra layers (bloated middleware, if you like) could. In that case manually optimized assembly will perform better than any machine-optimized code (basically because compilers can only do micro-optimizations, while a human can also do large-scale algorithmic optimizations). Despite all of this, I've given up on my own OS in assembly and started a C version, because you're perfectly right about portability, readability and maintainability.


 Post subject: Re: MenuetOS 1.26.10
PostPosted: Mon Jul 17, 2017 1:27 pm 

Joined: Tue Mar 06, 2007 11:17 am
Posts: 1225
Making assembly source code that is easy to read is possible, but it needs more interest. If it helps you learn higher-level languages, and also lets you create programs directly in it that are really fast, it cannot be left out. CPU developers themselves do it, demo coders do it too, and it's within one's reach, so why not make the obscure part easier to use and find? Abandoning hand-written assembly would only conceal and lose more public information, even if unintentionally. For example, it's already hard for us to find information on how all the optimizations really work; the alternative is to lose interest and let a compiler written by someone else do everything for us while we learn much less.

At least a mainly-assembly programmer has the chance to learn those rare things when they come across them, more readily than someone using only pre-made development tools (which sit at the root of the crucial point, so they must be chosen with great care: it's our development and our quality), and to implement any other higher-level tool, much like a good mathematician trying to find out why things are the way they are.

And I downloaded the KolibriOS sources with TortoiseSVN; it is derived from the old MenuetOS from around 2005-2006. It has around 1 GB of code in FASM assembly, probably some NASM, C, C++... The kernel still seems to be written in FASM.



I think the portability gained with a compiler is lost to the lack of hardware standardization across a whole architecture. Only modern personal computers have non-uniform hardware from many vendors, unlike game consoles.

If you want that level of portability, the right choice would be to implement subsystems for the different existing driver types (DOS, Windows 3/9x/NT5/NT6/WDM, Linux, Mac, RadioShack Tandy) and then simply reuse those existing drivers. A compiler and good development techniques alone won't be enough when there are thousands of devices to be programmed by a small team that is mainly learning. I remember that at some point Linux was able to use binary driver blobs from other platforms, for example for old nVidia cards. The drivers could be emulated if necessary, but the point is to have the hardware work with whatever driver already exists. It would be just like Wine, except that it would be for using drivers in different formats on any machine. Or like Bochs, except that instead of running drivers inside fake hardware, we would drive real devices with emulated or natively-running drivers that effectively already exist. We could sandbox them to only that device and its associated resources (or emulate them as in Bochs), so there's no problem.

I think the intention in those cases is to use the compiler to paper over the lack of hardware standardization. Manufacturers are practically left to make the hardware devices and drivers, and then the hardware itself is abstracted away by using languages that are as high-level as possible, preferably interpreted languages compiled natively, to separate software and hardware fully. The trouble is that this is the difficult part to find information about and learn, so done excessively it favors monopoly by obscurity, just as has been happening since Windows XP and even more since Vista.

I think that x86 assembly at the application-programming level (ring 3, multimedia instructions, FPU, memory access and even I/O access) is good enough to serve as a language portable to other architectures if it's treated as a compiled language. It really looks like low-level JavaScript and HTML5 features, so it's reasonable to think it could be worked on a little more and turned into an assembled, interpreted or platform-independent compiled language, and thus reused on other architectures.

Being able to select between little and big endian would be the most difficult part, but as HTML5/JavaScript typed arrays and DataView show, it's achievable, and it's a specialty of the CPU.

As you can see, even the highest-level languages need to specify how to read low-level data, so a language alone won't make you portable for the most critical operations, or give you automatically/manually selectable maximum data type sizes. You always need to be aware of the architecture's structure and craft portable code, very much like in assembly.
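For illustration, a minimal C sketch of that last point (the function name and sample bytes are mine, not from the post): reading a 32-bit little-endian value byte by byte gives the same result on any host, much like a DataView read with an explicit endianness flag.
Code:
#include <stdint.h>
#include <stdio.h>

/* Read a 32-bit little-endian value from a byte buffer.
   The shift-and-or form behaves the same on little- and big-endian
   hosts, unlike casting the buffer to a uint32_t pointer. */
static uint32_t read_u32_le(const uint8_t *p)
{
    return (uint32_t)p[0]
         | (uint32_t)p[1] << 8
         | (uint32_t)p[2] << 16
         | (uint32_t)p[3] << 24;
}

int main(void)
{
    const uint8_t buf[4] = { 0x78, 0x56, 0x34, 0x12 };
    printf("0x%08x\n", (unsigned)read_u32_le(buf)); /* prints 0x12345678 everywhere */
    return 0;
}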

It would be a good and valid technology, considering that we could build any program with only an assembler, not even a linker (the linking structures can be built in assembly source code).

The best outcome would be to produce assembly code for each architecture that is still portable and human-readable, with the optimizations explained, and to accumulate an easy-to-use, genuinely reusable assembly library that is based by default on logical, mathematical and algebraic optimizations that will always stand. As you can imagine, that would make it possible to port the most modern applications, libraries and drivers to any existing or new OS, even if there's no compiler or language available to port them with. With assembly the number of tool requirements also decreases. And we could offer compiler-generated assembly from programs and libraries for porting to a minimal or old OS if we really cannot afford to convert everything to a hand-translated assembly version.

Maybe current optimizations depend on selecting the instructions that, for a given model, produce the fewest cache misses and the most reusable results between instructions, carefully placed in categorized sub-result slots whose validity other instructions can check.

Old optimizations probably depended on reusing state by hand at the programmer level, not at the internal microcode level, and on coming up with mathematical and algebraic simplifications and many other tricks, so they will always be good and portable.

_________________
Live PC 1: [image]  Live PC 2: [image]

YouTube:
http://youtube.com/@AltComp126/streams
http://youtube.com/@proyectos/streams

http://master.dl.sourceforge.net/projec ... 7z?viasf=1


Last edited by ~ on Tue Aug 01, 2017 12:03 pm, edited 1 time in total.

 Post subject: Re: MenuetOS 1.26.10
PostPosted: Mon Jul 17, 2017 6:55 pm 

Joined: Fri Mar 07, 2008 5:36 pm
Posts: 2111
Location: Bucharest, Romania
bzt wrote:
@Love4Boobies: basically I agree with you, but you are mistaken in two things. First, design patterns are not tied to any programming language, so they can be used in assembly as well. After all, all high-level languages such as C/C++ produce assembly source (or equivalent machine code) in the end. Second, while I agree that removing abstractions would not make code faster, removing extra layers (bloated middleware, if you like) could. In that case manually optimized assembly will perform better than any machine-optimized code (basically because compilers can only do micro-optimizations, while a human can also do large-scale algorithmic optimizations). Despite all of this, I've given up on my own OS in assembly and started a C version, because you're perfectly right about portability, readability and maintainability.


I once amused myself by implementing the syntactic sugar for exception handling in standard C, in the form of try-catch-finally blocks, using setjmp/longjmp and macros. I am no stranger to the difference between design patterns and explicit language support for them. However, you will notice that I chose my words quite carefully. Being able to use something in principle is not the same as being able to use it in practice. With regard to design patterns, what I said was this: "very few design patterns can be realistically used." For instance, I would count procedural abstraction as more than realistic. However, consider interface abstraction as an example. If you develop an OOP API for assembly, not only will there be more opportunities for errors at both ends (compilers automate a lot of the process, so they can get it right every time) but the resulting code will be much more difficult to maintain. In fact, a few assemblers have even tried to introduce explicit language support for classes and their common features (TASM comes to mind) and have all been unsuccessful for this exact reason.
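(For readers who haven't seen the trick mentioned at the start of this post, here is a minimal sketch of try/catch on top of setjmp/longjmp. The macro names and the single global handler are illustrative only, not the actual macros described above; a real version would keep a stack of jmp_bufs so blocks can nest, and would add a finally clause, which is left out here for brevity.)
Code:
#include <setjmp.h>
#include <stdio.h>

static jmp_buf exc_env;   /* one active handler; a real version keeps a stack */
static int     exc_code;  /* "exception" value carried across the longjmp */

#define TRY      if (setjmp(exc_env) == 0)
#define CATCH    else
#define THROW(c) (exc_code = (c), longjmp(exc_env, 1))

static void might_fail(int x)
{
    if (x < 0)
        THROW(42);        /* unwinds straight back to the enclosing TRY */
}

int main(void)
{
    TRY {
        might_fail(-1);
        puts("not reached");
    } CATCH {
        printf("caught exception %d\n", exc_code);
    }
    return 0;
}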

This problem is, of course, language-agnostic. For instance, you can very well write functional code in C but do you know why no one does it? Because, among the usual suspects, C lacks language support for closures, lambda functions, currying, and so on. While there is no strict requirement for any of them, it does make things messy. You could also write a two-page C program that does what AWK can do in a single line. My point is that even if a set of tools is capable of the same things in principle, that doesn't make them equally suited, as they are meant to be used by humans who have all sorts of intellectual limitations. The whole field of engineering evolved as a way to overcome this. Pick the right tool for the right task.

The point about mathematical optimization is nonsense because people can tune high-level code in just the same way they tune low-level code. Have you ever heard anyone say "I'm limited to bubble sort because I'm using Java" or something of the sort (pun intended)?

At any rate, your claim is empirically falsified: experienced assembly programmers are simply unable to optimize code as well as machines because the space of possibilities is way too large for them to navigate in their heads. Take something as trivial as register allocation. We have algorithms that can find optimal solutions to this problem (which is why the "register" keyword has lost its meaning as an optimization hint in C and C++; nowadays it's sometimes used just to prevent taking a variable's address). The best assembly programmers don't come close to their register allocation even for short programs. It's somewhat funny to me that your average Joe knows that Deep Blue beat Kasparov at chess, Watson beat Brad Rutter and Ken Jennings at Jeopardy!, and AlphaGo beat Lee Sedol at Go, yet people working in the industry quote 80s books about assembly's supposed advantages while being oblivious to the fact that compilers have been beating them since before all the accomplishments listed above.

Compilers for higher-level languages have a deep understanding of what the code they are compiling is trying to accomplish (lower-level languages don't just miss out on the syntactic sugar, you know, but also on the semantic interpretation that comes with it), so they perform all sorts of neat large-scale optimizations. On top of that, as I've already mentioned, human programmers will even go out of their way not to write efficient code at times, because they need to keep a clear understanding of what is going on; that is not an issue for compilers.

Don't even get me started on correctness. Have you ever tried writing something in Brainfuck? It's Turing-complete so it should be possible to write, say, a browser in it. Can you think of any reasons you might want to avoid it?

Now, is there ever a time when writing general-purpose assembly in a new code base is useful? Sure: small snippets for very tight loops. Despite how good today's optimizing compilers are, they still have a few weak spots (e.g., vectorization; people generally use intrinsics).

bzt wrote:
After all, all high-level languages such as C/C++ produce assembly source (or equivalent machine code) in the end.


Please don't say something like this at a job interview. :)

_________________
"Computers in the future may weigh no more than 1.5 tons.", Popular Mechanics (1949)
[ Project UDI ]


 Post subject: Re: MenuetOS 1.26.10
PostPosted: Tue Jul 18, 2017 6:18 am 

Joined: Thu Oct 13, 2016 4:55 pm
Posts: 1584
Love4Boobies wrote:
I once amused myself by implementing the syntactic sugar for exception handling in standard C, in the form of try-catch-finally blocks, using setjmp/longjmp and macros. I am no stranger to the difference between design patterns and explicit language support for them. However, you will notice that I chose my words quite carefully. Being able to use something in principle is not the same as being able to use it in practice. With regard to design patterns, what I said was this: "very few design patterns can be realistically used." For instance, I would count procedural abstraction as more than realistic. However, consider interface abstraction as an example. If you develop an OOP API for assembly, not only will there be more opportunities for errors at both ends (compilers automate a lot of the process, so they can get it right every time) but the resulting code will be much more difficult to maintain. In fact, a few assemblers have even tried to introduce explicit language support for classes and their common features (TASM comes to mind) and have all been unsuccessful for this exact reason.
I know that you have chosen your words carefully, but I don't agree. It has been shown that using design patterns in asm is not unrealistic. Actually, assemblers were not unsuccessful at all. Regarding OO ASM, here are a few links:
http://www.heiho.net/download/oo-asm.txt
http://x86asm.net/articles/oop-from-low-level-perspective/
http://www.drdobbs.com/embedded-systems/object-oriented-programming-in-assembly/184408319
I also implemented a compiler a few years ago that supported OO and interfaces and made implementing design patterns easy, yet still let you write functions and methods in assembly. It had a very simple layout: it took C-ish source, performed the checks that were difficult to express in ASM (or were compile-time checks) and output FASM source. It's not as hard as everybody thinks it is. There are no more opportunities for errors, just different ones. Defining an object in pure ASM is as easy as in C++; you just use a different syntax. Consider this:
Code:
CLASS EQU STRUC
ENDC EQU ENDS

CLASS ClassName
; data members
   member1 db ?
   member2 dw ?

; virtual table of member functions
   Constructor dd ?
   Destructor  dd ?
   memberfunction1 dd ?
   memberfunction2 dd ?
ENDC
It's not that difficult, is it?
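(For comparison, here is roughly the same layout written as a plain C struct of function pointers. This is only an illustration of the equivalence being claimed, not output of the compiler described above; note that a real C++ compiler would store a single pointer to a shared vtable rather than embedding the function pointers in every object.)
Code:
/* Same idea as the ASM STRUC above: data members followed by a
   table of "member function" pointers filled in at construction. */
struct ClassName {
    unsigned char  member1;
    unsigned short member2;

    /* virtual table of member functions */
    void (*Constructor)(struct ClassName *self);
    void (*Destructor)(struct ClassName *self);
    void (*memberfunction1)(struct ClassName *self);
    void (*memberfunction2)(struct ClassName *self);
};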

Love4Boobies wrote:
This problem is, of course, language-agnostic. For instance, you can very well write functional code in C but do you know why no one does it? Because, among the usual suspects, C lacks language support for closures, lambda functions, currying, and so on. While there is no strict requirement for any of them, it does make things messy. You could also write a two-page C program that does what AWK can do in a single line. My point is that even if a set of tools is capable of the same things in principle, that doesn't make them equally suited, as they are meant to be used by humans who have all sorts of intellectual limitations. The whole field of engineering evolved as a way to overcome this. Pick the right tool for the right task.
Agreed. But I also think there is a considerable effort being put into hiding the low-level details from future generations. I'm not sure what the reasons behind this are, but I think it's not good if, for example, a web programmer does not know how HTTP works. People tend to write inefficient code if they don't know what's really going on underneath; I've seen it several times, especially among Java programmers. I'm also curious how many JavaScript programmers are aware of the memory and speed impact of using closures on DOM events. Undeniably, the same algorithm written in pure JavaScript outperforms the jQuery version (here again, to write efficient code you need to know what's under the hood of jQuery; introducing another layer can be bad for performance).

Love4Boobies wrote:
The point about mathematical optimization is nonsense because people can tune high-level code in just the same way they tune low-level code. Have you ever heard anyone say "I'm limited to bubble sort because I'm using Java" or something of the sort (pun intended)?
It's not nonsense; you only think that because you have never heard of the demoscene. I would really like to see anybody write, for example, this 64k demo in a high-level language using only compiler optimizations to get the executable size down to 64k. I suggest you watch this documentary. It's mostly in Hungarian, but subtitled. They explain how they made the aforementioned demo at around 54:10. (Hint: they generate everything procedurally. I can't imagine a compiler being able to do that. For example: given a bitmap file, create an algorithm that generates it, so that you don't have to store the entire bitmap as data.)
Also, I agree with you that people can tune high-level code the same way they tune low-level code; therefore it is possible to implement ANY algorithm and design pattern in assembly. You have provided proof that you were wrong earlier.

Love4Boobies wrote:
At any rate, your claim is empirically falsified: experienced assembly programmers are simply unable to optimize code as well as machines because the space of possibilities is way too large for them to navigate in their heads. Take something as trivial as register allocation. We have algorithms that can find optimal solutions to this problem (which is why the "register" keyword has lost its meaning as an optimization hint in C and C++; nowadays it's sometimes used just to prevent taking a variable's address). The best assembly programmers don't come close to their register allocation even for short programs. It's somewhat funny to me that your average Joe knows that Deep Blue beat Kasparov at chess, Watson beat Brad Rutter and Ken Jennings at Jeopardy!, and AlphaGo beat Lee Sedol at Go, yet people working in the industry quote 80s books about assembly's supposed advantages while being oblivious to the fact that compilers have been beating them since before all the accomplishments listed above.
I don't want to disappoint you, but it is you who has been empirically falsified; see the links above.

Love4Boobies wrote:
Compilers for higher-level languages have a deep understanding of what the code they are compiling is trying to accomplish (lower-level languages don't just miss out on the syntactic sugar, you know, but also on the semantic interpretation that comes with it), so they perform all sorts of neat large-scale optimizations. On top of that, as I've already mentioned, human programmers will even go out of their way not to write efficient code at times, because they need to keep a clear understanding of what is going on; that is not an issue for compilers.
You are still stuck on code optimization; I'm talking about optimizations at a higher level. Let me give you yet another example. Suppose you need to read a config file with XML in it. You know that it's written by a program (hence its format is very strict) and that it only contains 3 tags. You could use a universal XML parser for that, which would mean large code (an additional library) and a large memory footprint (a universal XML parser will convert the XML into a tree representation). No compiler would be able to optimize that away. On the other hand, a human can choose to parse that config file with plain libc sscanf with minimal effort. That's perfectly viable, since the config file is strict and only contains a limited number of tags, and it results in small code (no additional libraries) and a small memory footprint (no tree representation involved). You see what I mean?
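(A minimal sketch of that second approach; the tag names and layout below are invented for the example, since the post doesn't give the actual config format.)
Code:
#include <stdio.h>

int main(void)
{
    /* Stand-in for the machine-written config file; exactly three known tags. */
    const char *cfg =
        "<config><width>800</width><height>600</height><depth>32</depth></config>";
    int width, height, depth;

    /* One format string instead of a general XML parser and a DOM tree. */
    if (sscanf(cfg,
               " <config> <width>%d</width> <height>%d</height>"
               " <depth>%d</depth> </config>",
               &width, &height, &depth) == 3)
        printf("%dx%d@%d\n", width, height, depth);
    return 0;
}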

Love4Boobies wrote:
Don't even get me started on correctness. Have you ever tried writing something in Brainfuck? It's Turing-complete so it should be possible to write, say, a browser in it. Can you think of any reasons you might want to avoid it?

Yes. Please compare the number of language elements available in Brainfuck to the number available in assembly or in C; you'll see which language has more power to express an algorithm.

Love4Boobies wrote:
Now, is there ever a time when writing general-purpose assembly in a new code base is useful? Sure: small snippets for very tight loops. Despite how good today's optimizing compilers are, they still have a few weak spots (e.g., vectorization; people generally use intrinsics).
Again, you are talking about micro-optimizations, and I'm talking about macro ones.
Love4Boobies wrote:
bzt wrote:
After all, all high-level languages such as C/C++ produce assembly source (or equivalent machine code) in the end.


Please don't say something like this at a job interview. :)
Which part do you question? That C/C++ compilers produce asm source, or that assembly and machine code have a one-to-one relation? I hope I never have to interview you, as I'm afraid I'd have to decline your application... :lol:

Basically there are two kinds of programmers: the one who copy'n'pastes and uses large libraries for everything, whom I call a coder; and the one who thinks it through deeply and comes up with optimal algorithms, whom I call a real programmer. Clearly you are a coder (no shame in that, and I don't mean to offend you in any way; people are different, that's all).


 Post subject: Re: MenuetOS 1.26.10
PostPosted: Tue Jul 18, 2017 7:18 am 

Joined: Fri Mar 07, 2008 5:36 pm
Posts: 2111
Location: Bucharest, Romania
I'm not sure what those links are meant to show. You claimed they suggest two things but I'm not satisfied with either:

1. They supposedly show the popularity of OOP in assemblers. However, they are just a bunch of tutorials for TASM (which offered OOP and is now dead and buried), MASM (which doesn't offer OOP), and HLA (Randy's garbage language that is only used by students while reading his book).

2. They supposedly have some sort of data indicating how assembly language programmers are able to produce better code with respect to some metric. I see nothing of the sort in any of the articles. At most, I see some severely outdated comments that Randy made about compilers from when that article was published, 27 years ago (remember when I predicted this in the last post?).

The class type definition doesn't show anything either. You know very well that the issue I have is interacting with the system in an error-prone manner. And if you're about to show me how that's done next, don't bother; it's obviously not rocket science. Pointers aren't rocket science either, yet they are the biggest source of bugs. Just because something is trivial doesn't mean it's easy to get right. ;)

Quote:
Also, I agree with you that people can tune high-level code the same way they tune low-level code; therefore it is possible to implement ANY algorithm and design pattern in assembly. You have provided proof that you were wrong earlier.


OK, I will say it for the third and final time: "very few design patterns can be realistically used." I had to paste it in my last reply as well. Do you understand that sentence? It's not saying that anything is impossible in assembly.

The demos you showed me are mostly C++ sprinkled with a bit of assembly. What of it and what does procedural generation have to do with the language?

Quote:
Which part do you question?


The part where you conflate languages and language implementations. A C compiler could output Pascal; there's nothing special about assembly. Or Lisp for a Lisp machine (where there would be no assembler). Or maybe the implementation isn't even a compiler; it might very well be an interpreter.

I get the feeling I'm talking to someone who is drunk. You keep losing track of what it is we're discussing. Try to keep the whole conversation in your head, please. Another example is when I explicitly mentioned optimizations that aren't micro and you accused me of the opposite. Or when I pointed out that the "algorithmic optimization" argument was moot: you agreed, and then one reply later forgot and claimed you were discussing some "macro optimization". Give one explicit example.

_________________
"Computers in the future may weigh no more than 1.5 tons.", Popular Mechanics (1949)
[ Project UDI ]


 Post subject: Re: MenuetOS 1.26.10
PostPosted: Tue Jul 18, 2017 5:07 pm 

Joined: Tue Mar 06, 2007 11:17 am
Posts: 1225
The question is: what language(s) do you prefer to use for your main, formal development? That would clear up why you don't like the idea of creating excellent human-readable, portable assembly to enrich that low-level development environment with a complete set of libraries and applications written in it, without needing compilers or linkers for them to be usable.

An example would be choosing which parts of a program are in dire need of optimization and which ones won't affect it. Concentrating effort on improving the algorithms isn't even a matter of code-level optimization or programming languages; it is a scientific study, almost always grounded in other disciplines and modeled mathematically and algorithmically.

There seem to be a few operating systems in assembly that are increasingly interesting, and which can interact with other languages like C and C++.

An x86 assembler could also output assembly for other processors, or very low level C code.

Assembly language needs to have existing programs, libraries and standards properly written in it, optimized but human-readable.

I can't think of any programming task in existence that cannot be modeled, designed, patterned in assembly in a way that is maintainable. It's perfectly possible, but you need programmers specifically interested in increasing the quality, portability and number of areas/applications/algorithms fully covered by human-readable assembly library code.

The technology can still be developed to create optimized, portable and fully reusable hand-made assembly based on assembly sources that look normal.

_________________
Live PC 1: [image]  Live PC 2: [image]

YouTube:
http://youtube.com/@AltComp126/streams
http://youtube.com/@proyectos/streams

http://master.dl.sourceforge.net/projec ... 7z?viasf=1


 Post subject: Re: MenuetOS 1.26.10
PostPosted: Tue Jul 18, 2017 5:29 pm 

Joined: Fri Mar 07, 2008 5:36 pm
Posts: 2111
Location: Bucharest, Romania
You should be able to think of many. There is this funny notion of egalitarianism between tools going around, possibly stemming from equal language expressivity. But to assume that assembly is just as suited for a random project as any other language is to imply silly propositions such as "proper type systems never avoid bugs" or "there are no benefits to avoiding mutation".

And, again, if your goal is to optimize, maintainability goes out the window from the get-go. If you aim for maintainability, then there are no benefits, so you might as well use a portable language.

_________________
"Computers in the future may weigh no more than 1.5 tons.", Popular Mechanics (1949)
[ Project UDI ]


 Post subject: Re: MenuetOS 1.26.10
PostPosted: Tue Jul 18, 2017 5:54 pm 

Joined: Tue Mar 06, 2007 11:17 am
Posts: 1225
If you are going to use assembly, you have to think of an application as another digital-electronics device being designed, one that could just as well reside in software or in silicon. If you start by adding complex abstraction to the design of a program in the usual large-scale way, it will become very unmaintainable.

You have to design assembly programs as a set of extended CPU instructions: very brief, specific routines suited to crunch the task the target application is supposed to perform. Those routines have to make the application easier to create and more readable, as if we were dealing with general-purpose opcodes. The logic of an instruction-like routine has to be minimal, doing strictly what is needed for a given logical atom, a single generic operation independent from the rest. That makes for code that is as reusable as the core native instructions.

Then the application needs a way to interact with its own components and to be fitted into a format understood by a given system.

Even if you use C, it won't help you much if you don't know how to manage stuff.

The starting assumption is that there would be as many human-readable programs written in pure assembly as there are C or C++ programs.

It's also assumed that the code will be human-readable, that there will be full documentation on how it works, and that the code will be generally optimized and, as with anything else, increasingly optimized over time by its developers.


The intention of writing things in assembly is not having to depend on more tools, and being able to port programs to systems where no compiler exists for them (systems that are too old or too new). The intention is also to be able to inspect in code what the whole structure of something looks like, since nothing is hidden from the developer.

But writing routines as if they were specialized CPU instructions, and even defining the instruction set of an application or type of application, would probably be extremely beneficial even if you use a very high-level language: it would simplify the program and make it more lasting than just writing it in a high-level language because of its supposed ease. In that case, though, the code will probably be very hard to port and understand outside the realm of the language it's written in. At least with assembly you can break the logic of a program into pieces so small that they can be understood quickly and reimplemented in any conceivable language, or even acted out as human steps to evaluate them manually in an efficient way.

_________________
Live PC 1: [image]  Live PC 2: [image]

YouTube:
http://youtube.com/@AltComp126/streams
http://youtube.com/@proyectos/streams

http://master.dl.sourceforge.net/projec ... 7z?viasf=1


 Post subject: Re: MenuetOS 1.26.10
PostPosted: Tue Jul 18, 2017 7:06 pm 

Joined: Fri Mar 07, 2008 5:36 pm
Posts: 2111
Location: Bucharest, Romania
Even when developing hardware, people still use higher-level languages, such as VHDL, Verilog, or the Xilinx tool, precisely because it's not beneficial to do everything manually: micro-optimizations, abstractions, error checking, etc. Modern techniques do actually scale much better, in fact.

You haven't answered any of my complaints (e.g., type systems help); you simply gave some indication of how you personally go about writing assembly programs and added the (wrong) assumption that there are as many maintainable assembly programs as there are C and C++ ones. The way you phrased the latter also suggests to me that you subscribe to this language egalitarianism I've mentioned. I wouldn't even dare say that C is as maintainable as C++. C is rather terrible, and C++ is somewhat better in that it provides some good tools for writing maintainable programs but places no hard constraints on code bases, which will no doubt lead to problems every now and then.

As for platforms for which there are no compilers, I don't think that argument holds up, because the cost of writing a C compiler is extremely low. One can slap together a half-assed C89 compiler in no more than a couple of days (and if you use something like LLVM, it will be a whole lot less half-assed than one might expect for such a short period of time). Or, better yet, port an existing one. It's better to invest a little effort in that than to live with a large-scale assembly program for years. The "no tools" argument is artificial; it would have held in the '80s and maybe the early '90s, but not today.

_________________
"Computers in the future may weigh no more than 1.5 tons.", Popular Mechanics (1949)
[ Project UDI ]

