OSDev.org

The Place to Start for Operating System Developers

 Post subject: Why So Much Complexity On Engineering?
PostPosted: Wed Apr 04, 2007 7:55 am 

Joined: Tue Mar 06, 2007 11:17 am
Posts: 1225
More complex doesn't mean better. And for us OS developers, so much complexity means that a project takes a lifetime to reach a truly mature stage.

So, why not invent new algorithms for things such as multitasking, filesystems and memory managers, in a way that is innovative, easy (brief) and effective, like in the first days of computing, when impressive things could be done in a really small program?

At least I try to "invent" such new algorithms, because after 2 years of learning such a bunch of (in my opinion) overcomplicated specifications, I am feeling uncomfortable and willing to look for alternatives.

It's not without reason that all of these (our) projects don't seem to go beyond simple networking, simple GUI, simple filesystem handling, and virtually simple everything. So, why not look for ways of shifting the balance, so that simple work produces great results, instead of great effort producing simple results?


PostPosted: Wed Apr 04, 2007 8:25 am 

Joined: Thu Nov 16, 2006 12:01 pm
Posts: 7612
Location: Germany
I think the so-called "simplicity" of earlier systems is a deception. One half of that "simplicity" was our lack of deeper understanding; the other half was a simple lack of features.

AmigaOS - a highly efficient multitasking microkernel. No memory protection, though.

Early MacOS - small, fast, slick. Unfortunately it didn't do preemptive multitasking, so one "bad" application could freeze your system.

Oversimplified and only two examples, but I think they show my point.

_________________
Every good solution is obvious once you've found it.


PostPosted: Wed Apr 04, 2007 8:39 am 

Joined: Tue Mar 06, 2007 11:17 am
Posts: 1225
Maybe a good deal of what is applied today is a workaround, and what is by now huge and way too complex could be stripped down to something that still provides security and functionality: mid-sized, but simple.

If that's so, we are well and truly lost in our efforts, and if we ever want this to be something more than a hobby, we won't be able to keep up, and it could eventually turn into something we can't afford anymore... :?


PostPosted: Wed Apr 04, 2007 9:03 am 

Joined: Thu Mar 15, 2007 8:48 am
Posts: 214
Internal simplicity manifests as external complication. You can either have simple internals that act like voodoo from the outside, or external simplicity that internally is a horror.

I prefer the former.


PostPosted: Wed Apr 04, 2007 10:34 am 

Joined: Tue Mar 06, 2007 11:17 am
Posts: 1225
Oh no, my friends, I think there are plenty of simplicity options not yet tried, like not redefining standards every couple of years. At least for our software that can be a goal, unlike Microsoft, which intentionally breaks older versions of its specifications with new features that are not only unnecessary but are also designed to cause incompatibility.

That's a perfectly avoidable plague. And if a design is really outstanding, it will have been thought and re-thought until it is both internally straightforward and a natural fit for the rest of the logical system. If that doesn't happen, then the design is so imperfect that it won't be nice to implement, and it would be a real pain not to look for a better solution (like the move from WAV to MP3: not very simple, but a very good example of what I mean: stable, very hard to make obsolete and very tight; it requires knowing all of its principles, but it's maybe 70% of the way to ideal).


PostPosted: Thu Apr 05, 2007 10:44 pm 

Joined: Thu Oct 21, 2004 11:00 pm
Posts: 248
ehird wrote:
Internal simplicity manifests as external complication. You can either have simple internals that act like voodoo from the outside, or external simplicity that internally is a horror.

I prefer the former.

New Jersey! Unix! Son of Asmodeus!

As you can obviously tell, I prefer that my abstractions work correctly and *simply* (with complicated implementation) rather than make users learn the black magic of my system.


PostPosted: Thu Apr 05, 2007 11:04 pm 

Joined: Thu Mar 08, 2007 11:08 am
Posts: 670
I don't think there's that much black magic in the basic Unix functionality at userlevel. In fact, I'd say Unix belongs to the class of "looks simple from outside, total mess inside", at least if I have to pick one of them.

_________________
The real problem with goto is not with the control transfer, but with environments. Properly tail-recursive closures get both right.


PostPosted: Fri Apr 06, 2007 1:52 am 

Joined: Tue Oct 17, 2006 6:06 pm
Posts: 1437
Location: Vancouver, BC, Canada
This is going to be pretty abstract, so bear with me :)

At the architecture level, complexity is defined as the interconnectedness of things. Things can be functions, modules, classes, whatever. A connection between them is a dependency of some kind -- call, communication protocol, common file format, etc.

The goal of a good design is to reduce the interconnectedness of things as much as possible. I think a lot of people lose sight of this and try instead to create something that looks simple on the outside but is nightmarishly complex on the inside. This is called "simplexity".

Although I can't directly relate it to OS design, I've seen this before in typical OO designs. People who understand the "letter" but not the "spirit" of OOD will look at the requirements, create a class for every noun, and proceed to absorb huge amounts of responsibility into each class. This ultimately leads to all kinds of crazy dependencies between classes. Sure, on the face of it, a system with only 20 classes seems simpler than one with 200, but verifying the correctness of any of those 20 will be more than 10 times harder than doing the same for one of the 200.
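
To make that concrete, here is a minimal C++ sketch (all class names are hypothetical, not taken from any real kernel) contrasting a "god class" that has absorbed scheduling, allocation and logging with the same responsibilities split into one-job classes, where the only connection is the one dependency the design actually needs:

Code:
#include <cstdio>
#include <cstddef>

// The "noun" version: one class, many responsibilities.  Every caller that
// touches the Kernel now depends on scheduling, allocation and logging all
// at once, so verifying any one piece means reading all of it.
class Kernel {
public:
    void log(const char* msg)     { std::printf("log: %s\n", msg); }
    void schedule()               { log("picking next task"); }
    void* allocate(std::size_t n) {                 // toy bump allocator
        if (used_ + n > sizeof(pool_)) return nullptr;
        void* p = &pool_[used_];
        used_ += n;
        return p;
    }
    // ...paging, IPC and drivers tend to end up in here too...
private:
    char pool_[4096];
    std::size_t used_ = 0;
};

// The decomposed version: each class has one job, and the connections
// between them are explicit and minimal.
class Logger {
public:
    void log(const char* msg) { std::printf("log: %s\n", msg); }
};

class Allocator {
public:
    void* allocate(std::size_t n) {                 // same toy bump allocator
        if (used_ + n > sizeof(pool_)) return nullptr;
        void* p = &pool_[used_];
        used_ += n;
        return p;
    }
private:
    char pool_[4096];
    std::size_t used_ = 0;
};

class Scheduler {
public:
    explicit Scheduler(Logger& log) : log_(log) {}  // its only dependency
    void schedule() { log_.log("picking next task"); }
private:
    Logger& log_;
};

int main() {
    Logger log;
    Allocator heap;
    Scheduler sched(log);   // checking Scheduler means checking Logger, nothing else
    sched.schedule();
    heap.allocate(16);
    return 0;
}

The point is not the line count; it's that the second version can be checked one small piece at a time, because each piece drags in only the dependencies it really has.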

I'm not sure I believe in the dichotomy of "simple internally, complex externally" and vice-versa. I think a well-designed system's interface reflects its internals while leaving out crucial details. In other words, you can simplify through abstraction, but not to the point where you are re-defining the problem you have to solve. That's when you get leaky abstractions. A good system will usually be quite complex when you look at the finest grain of detail, but it will not be needlessly complex (like COM, CORBA, or EJBs... yuck!).

Perhaps not surprisingly, all these attitudes make me a fan of microkernels, while simultaneously making me uncomfortable with how paging I/O is typically implemented in microkernels. :? It's always felt like an abstraction inversion to me...

Quote:
Oh no, my friends, I think there are plenty of simplicity options not yet tried, like not redefining standards every couple of years.


I'm sorry to say you're dreaming. Technical sensibility is rarely involved in the development of standards. A lot of them are pushed by vendors with an agenda (*cough* M$ *cough*) and don't necessarily work very well. Unless you plan to overthrow capitalism, I suspect these sorts of shenanigans will keep happening. I think this is why I've become less interested in studying commercially available technologies and more interested in OS and programming language research in recent years.

_________________
Top three reasons why my OS project died:
  1. Too much overtime at work
  2. Got married
  3. My brain got stuck in an infinite loop while trying to design the memory manager
Don't let this happen to you!


PostPosted: Fri Apr 06, 2007 3:27 am 

Joined: Tue Mar 06, 2007 11:17 am
Posts: 1225
I guess the efforts to make GNU software, and beyond that, software which is effectively public domain, are a sort of technological socialism, in which the intention is to have knowledge resources that are not controlled or dictated by the big corporations, which are more interested in making big money than in advancing the state of the art.

Surely there must be a way of getting a simple-inside, simple-outside product, with a reasonable amount of complexity but nothing of the "#define _number_1_ 1" kind. If not, well, it will be the same old story of having to learn strict logical thinking at the expense of a big (huge, actually) investment of time and everything else, for the whole cycle, over and over, each time more complex.


PostPosted: Fri Apr 06, 2007 11:37 am 

Joined: Tue Oct 17, 2006 6:06 pm
Posts: 1437
Location: Vancouver, BC, Canada
~ wrote:
I guess the efforts to make GNU software, and beyond that, software which is effectively public domain, are a sort of technological socialism


I've always called it Communism, but I get paid to develop software, so that's my bias. ;)

Quote:
If not, well, it will be the same old story of having to learn strict logical thinking at the expense of a big (huge, actually) investment of time and everything else, for the whole cycle, over and over, each time more complex.


I don't think the cycle of increasing complexity can really go on forever. It isn't sustainable. At a certain point, developers having to deal with this crazy technology can't be productive anymore. For example, MS is transitioning away from COM towards .NET. COM was terribly unwieldy and it was just about impossible to grasp any large COM-based system (I lived and breathed OLE DB for many years... it sucked). .NET in comparison is much easier to understand. If it ever becomes bloated and over-generalized like Java, then something else will gain favour among developers and replace it.

_________________
Top three reasons why my OS project died:
  1. Too much overtime at work
  2. Got married
  3. My brain got stuck in an infinite loop while trying to design the memory manager
Don't let this happen to you!


PostPosted: Fri Apr 06, 2007 8:13 pm 

Joined: Sat Mar 31, 2007 4:57 pm
Posts: 66
Location: Xanadu
Project XANA (my OS) is sort of an attempt at what ~ is saying here. It's based on Project Xanadu, which originated early on, when things were very simple, and progressed until the glory days of cruft and kludgery. Xanadu's goal was something that I very much agree with -- to make things conceptually intuitive, and therefore simple on both sides.

Herein lies the much-forgotten dichotomy: something can be conceptually intuitive (e.g., math) without being humanly intuitive (e.g., English), and vice versa. Computers operate on a conceptual level, not a human level, and therefore things that are conceptually intuitive work very well on computers. Humanly intuitive things tend to work less well on computers, but this is not of much importance, because human intuition is defined by what the humans in question learn and absorb. Humans come into this world knowing very little. We do not know how to speak, but speech quickly becomes the basis for all our thoughts. We do not know how to do math, but we quickly begin to favor division over repeatedly subtracting on our fingers. All tools of the past were conceptually simple for the medium in question, and human intuition was not as much a factor.

Essentially, in my view, good design is what is commonly called "cuspiness" or "hackery": a solution that is not immediately visible to any but the most trained eye, a solution that is conceptually simple and, once understood, humanly simple as well. That is not to say that one should create systems without the human in mind at all (DOS is an example of such a system, in my opinion, as is something like INTERCAL). On the contrary: one must create something that conforms well to the inherent structure of the human mind (think mnemonics, visualizations, arrows, color coding). However, it does NOT have to be simply a clone of everything popular and overdone.

I echo Ted Nelson's sentiment when he said that the current WIMP GUI paradigm is simply a poor simulation of paper, and I echo it again when I say that this is a bad thing. Paper is useful, yes, but it is NOT a computer. Nor is a desktop, or a typewriter, or even a 3D world. Computers are limitless. One should appeal in every user interface to the Turing nature of the Machine! By avoiding that (in the name of "user friendliness") you kill the thing inside you, inside the machine, that yearns for the infinite.

~John

P.S.: Sorry if I waxed philosophical (as I am wont), as my passion does not wane for this train of interlocution ;-).

_________________
"It is time to return real programming to users and even beginning users, to whom it has been denied since 1984."
- Theodore Holm Nelson


