OSDev.org

The Place to Start for Operating System Developers
 Post subject: Re: The Future of Programming, circa 1973
PostPosted: Tue Dec 02, 2014 2:17 pm 

Joined: Tue Mar 04, 2014 5:27 am
Posts: 1108
Arto wrote:
Wirth's law: Software is getting slower more rapidly than hardware becomes faster.


Wirth and co. stole it from Parkinson and rebranded it. :)


 Post subject: Re: The Future of Programming, circa 1973
PostPosted: Wed Dec 03, 2014 5:31 am 
no92 wrote:
If everyone works by himself, things tend to be very slow. In that scenario there would be different projects all doing basically the same thing. If everyone programming something (e.g. operating systems) were to work together with all the others and form one single big project, everything would be super fast.

It is the most important problem of our society. The goal is to make money, so nobody will spend a lot of time without some viable benefit (mostly in the form of money). That's why people will never unite and create something better than Windows. And, more generally, there will be no "real groundbreaking development", because then all the ordinary developer organizations would lose money.


 Post subject: Re: The Future of Programming, circa 1973
PostPosted: Wed Dec 03, 2014 2:02 pm 

Joined: Sat Mar 31, 2012 3:07 am
Posts: 4591
Location: Chichester, UK
It's a shame that he doesn't seem to appreciate the obvious advantages of style sheets and markup over manually changing things visually, or appreciate the simplicity of text files. He also fails to address the question of why these new ideas, having been tried, are not now in general use. When a seemingly good idea doesn't gain acceptance, you have to ask whether it really is a good idea.

The Xerox-inspired windowing environments are still with us. But despite the backing of the largest software company in the world, VisualAge-type programming environments only exist as shadows of the original. My take on that is that GUIs work and drag-and-drop programming doesn't. The argument that this is because of self-perpetuating inertia is naive.


 Post subject: Re: The Future of Programming, circa 1973
PostPosted: Thu Dec 04, 2014 12:34 pm 

Joined: Wed Jan 06, 2010 7:07 pm
Posts: 792
I switch back and forth between Bret Victor's perspective and yours. While what we do now has its advantages, it also has its disadvantages, and the same goes for these other ideas. I think what he's lamenting (and what I keep coming back to) is that many of the inconveniences and shortcomings we fight today were solved decades ago, and there hasn't been much work to bring the two approaches together. You can't deny the powerful effect of legacy code on the way things progress (C++ and Objective-C are good examples).

The biggest area that we could advance in without even touching the way we do stylesheets/markup/text files is debugging. We still use line-by-line step-through debugging (or even just a bunch of print statements, in a lot of dynamic languages!), while compilers internally have had complex data and control flow analysis for years. Why can't we look at the dataflow through a whole function while we edit it? Why don't we have better visualization tools for which paths are taken under what conditions? This is all near trivial for a computer to tell us, but we often sit here and stare at our text editors and debuggers for hours puzzling through it.
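
To make that concrete, here's a crude, purely textual approximation of what I mean (a sketch of mine in Rust, assuming a current toolchain, not anything from the talk): the standard dbg! macro dumps the file, line, and value of each tagged expression as data flows through a function. A real tool would show this live, across the whole function, while you edit.

Code:
// Crude textual approximation of a dataflow view: dbg! prints each
// intermediate value (with its source location) as it flows through
// the function, then passes the value along unchanged.
fn checksum(data: &[u8]) -> u32 {
    let sum: u32 = dbg!(data.iter().map(|&b| u32::from(b)).sum());
    let folded = dbg!(sum ^ (sum >> 16));
    dbg!(folded & 0xFFFF)
}

fn main() {
    checksum(b"hello");
}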

The other big area where we could improve is programming languages themselves. For a long time the majority of the industry has been stuck with either the C family, with its header files, literally undecidable parsing, and incredibly poor semantics around undefined behavior and unrestricted aliasing, or a Tower of Babel of exorbitantly dynamic interpreted/JIT languages with memory layout semantics so restrictive they can't even handle Minecraft without running into memory bandwidth problems.

Happily, we are starting to see a light at the end of the tunnel with language design at least. While web programming is still a hopeless, roiling mass of JavaScript frameworks and "transpiled" languages that completely reinvent themselves every two weeks (The Birth and Death of JavaScript is hilarious on that topic), there are a few new languages that actually bring some sanity.
  • Google's Go, while not really useful for OS dev and not my favorite language by any means, is a fantastic replacement for scripting languages in many ways: it's still garbage collected, but it has sane memory layout semantics (i.e. not "box everything and let the GC handle it!"), a sane type system (i.e. type inference rather than "tag all the values and dispatch everything at runtime!"), and is thus natively compiled (rather than "let's interpret it- no, let's JIT compile it to try to win back the performance we flushed down the toilet with a giant Rube Goldberg contraption!").
  • Mozilla's Rust is more at the level of C or C++, but with proper modules; sane semantics around undefined behavior, aliasing, etc. (and thus more optimization opportunities than C, harking back to the days of FORTRAN); static memory and thread safety, so your system libraries aren't riddled with remote code execution vulnerabilities (again, already solved years ago in languages like Ada; see the sketch just after this list); and a much simpler type system that retains more power than plain C (no multiple virtual inheritance, thank you).
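
Here's a tiny example of what I mean by static memory safety without a garbage collector (my own sketch against current Rust, nothing official): the borrow checker rejects, at compile time, exactly the kind of dangling-pointer bug that turns into a CVE in C.

Code:
// Sketch: compile-time memory safety with no garbage collector. The
// borrow checker refuses code that could leave a reference dangling.
fn main() {
    let mut names = vec![String::from("Wirth"), String::from("Hoare")];

    let first = &names[0];                  // shared borrow into the vector
    // names.push(String::from("Victor")); // rejected at compile time: the
    //                                     // push could reallocate and leave
    //                                     // `first` dangling
    println!("{}", first);

    names.push(String::from("Victor"));    // fine once the borrow has ended
    println!("{} names", names.len());
}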

What we need is to combine the simple semantics and formats we have with the analysis and visualization tools Bret Victor is talking about. Then maybe we can move forward.

_________________
[www.abubalay.com]


 Post subject: Re: The Future of Programming, circa 1973
PostPosted: Thu Dec 04, 2014 1:04 pm 

Joined: Sat Mar 31, 2012 3:07 am
Posts: 4591
Location: Chichester, UK
I'd certainly agree that there is always room for improvement in the tools we use to support proven, traditional programming languages. But it would be very short-sighted to suggest that there haven't been huge improvements in these tools since 1973. What there hasn't been is a take-up of drag-and-drop programming or other dumbing-down paradigms.

The arguments about binary vs. assembler, and assembler vs. higher-level languages, prove to me the opposite of the presenter's point. Despite natural inertia, those methods of programming quickly took over. There are always young programmers ready to adopt good new methods; but the methods must be genuinely useful, not just passing fads.

I've always believed in KISS. So when you consider programs interacting over the Internet, do you go for programs that somehow negotiate how they are going to talk to each other, or do you just define simple standard interfaces? I know which I prefer. Take the example of the railways in the UK. Originally we had two different gauges, and there were actually carriages where the gauge of the wheels could be adjusted, i.e. they negotiated with the track which gauge to use. But we don't do things that way nowadays; we agreed on a standard gauge and it just works.

Do I want a system where I can interact with text to decide how I want each element presented, or do I want one where I can change every instance of a particular element by editing just one item in a style sheet? It's a no-brainer to me, but I did work in the publishing industry where standard mark-up styles have worked for centuries.

And I have to confess to being a little less than impressed by all the cute little references to "He's got a company called Intel" and the like.


 Post subject: Re: The Future of Programming, circa 1973
PostPosted: Thu Dec 04, 2014 4:23 pm 

Joined: Wed Jan 06, 2010 7:07 pm
Posts: 792
There have certainly been improvements in our tools since 1973, but in many respects we're at the same level (or worse!). I think his point is that drag-and-drop and friends don't have to be dumbed-down paradigms. Personally, I would assert that they can also be very KISS (I don't like his idea of programs negotiating how to talk to each other, although it could make sense in some fields at a very high level).

Text is a very versatile and powerful format. But as IDEs and progressively higher-level languages show, it leaves much to be desired where humans are concerned. I would love to edit programs at a much more semantic level. For example, Jonathan Edwards's Subtext project has a very cool demo of editing conditional logic graphically through something called "schematic tables". It's very much not like your typical drag-and-drop tool; the representation is more orthogonal and straightforward than text could ever be. Bret Victor has another talk, Inventing on Principle, with a lot of other good examples of specific interfaces like schematic tables.
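
Subtext's representation is genuinely graphical, so text can't do it justice, but as a rough analogue (my own sketch in Rust, with made-up names, not Edwards's actual format): think of conditional logic as a table of condition/outcome rows instead of a nest of if/else branches.

Code:
// Rough textual analogue of a "schematic table": the conditional logic
// of a (made-up) shipping-cost function as condition -> outcome rows,
// evaluated top to bottom, instead of nested if/else.
fn shipping_cost(weight_kg: f64, express: bool) -> u32 {
    let table: [(fn(f64, bool) -> bool, u32); 3] = [
        (|_, express| express, 1500), // express parcels: flat 1500
        (|w, _| w <= 1.0,       300), // light standard parcels
        (|_, _| true,           800), // everything else (catch-all row)
    ];
    table
        .iter()
        .find(|(cond, _)| cond(weight_kg, express))
        .map(|&(_, cost)| cost)
        .unwrap() // safe: the last row always matches
}

fn main() {
    assert_eq!(shipping_cost(0.5, false), 300);
    assert_eq!(shipping_cost(2.0, true), 1500);
    assert_eq!(shipping_cost(2.0, false), 800);
}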

I don't know if it would make sense to represent entire programs graphically, but these examples tell me that graphical interfaces are at least useful in particular niches.

_________________
[www.abubalay.com]


 Post subject: Re: The Future of Programming, circa 1973
PostPosted: Thu Dec 04, 2014 4:26 pm 

Joined: Sat Jan 15, 2005 12:00 am
Posts: 8561
Location: At his keyboard!
Hi,

Rusky wrote:
What we need is to combine the simple semantics and formats we have with the analysis and visualization tools Bret Victor is talking about. Then maybe we can move forward.


Let's consider a "black box" that has input data and output data; and start by trying to use data visualisation for both the input data and the output data. If we're able to do that, I think we'll solve 95% of the problems with programming tools.

WARNING: That "black box" is a compiler.


Cheers,

Brendan

_________________
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.


 Post subject: Re: The Future of Programming, circa 1973
PostPosted: Thu Dec 04, 2014 4:53 pm 

Joined: Sat Mar 31, 2012 3:07 am
Posts: 4591
Location: Chichester, UK
Quote:
these examples tell me that graphical interfaces are at least useful in particular niches
That I wouldn't dispute. And there are various computer languages (Prolog, one of Bret Victor's examples, being an obvious case) that have niche uses too. But, pragmatically, so far nothing has seriously competed with the simplicity and versatility of traditional imperative languages produced as text files processed by a compiler. It's not because of lack of imagination or fear of something different; it's because that style of programming has so far proven to be extremely effective.

A lot of the talk reminded me of the glowing predictions made for AI in the days when the lecture was set. In that sense it does capture the naive optimism of those days quite well. But AI has turned out to be a much more difficult problem than was thought at that time. The same goes for 4th-generation languages; they have their particular applications but are always somewhat constrained in what they can produce. It's not because they are badly designed, it's because it is a very difficult problem to design such "intelligent" systems that can match the capabilities and versatility of simple imperative languages.

I've nothing against niche products but I wouldn't want to see us throwing away good programming tools because of some airy-fairy ideas about how the computer can do it all. It's just not that simple.


 Post subject: Re: The Future of Programming, circa 1973
PostPosted: Thu Dec 04, 2014 10:44 pm 

Joined: Wed Jan 06, 2010 7:07 pm
Posts: 792
I don't think he's advocating throwing things away, except maybe the bad parts of things, or things that are so bad they need to be replaced *ahem*C++*ahem*. He's lamenting that we haven't learned from the past. We all agree here that programming languages and tools can be improved; the talk is saying that a lot of the possible improvements have already been made, they just happen to have been made in systems that weren't as successful, and were then ignored by the people who built what we use now.

Think about how much avoidable drudge work gets done today. People do that work because the "mainstream" frameworks and tools, while very powerful and much improved over 1970s technology, lack the particular improvements that would help their particular field. We still reimplement data binding and database CRUD interfaces manually, write asynchronous code using layers upon layers of nested callbacks, find, report, and fix buffer overflows and null pointer dereferences, write code without autocompletion, and so on and so on. These are all mostly solved in some system somewhere today, but the solutions can't be used where they're needed: the whole web stack is awful for writing backends, JavaScript has no way to avoid callback hell, C and C++ have no way to avoid security vulnerabilities, etc.
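
To pick just one of those: the null-dereference problem was solved decades ago by option types (ML and Haskell have had them for ages). A minimal sketch of the idea in Rust, with made-up names of mine:

Code:
// Sketch: an option type turns a null pointer dereference from a runtime
// crash (or an exploitable bug) into a compile-time obligation: you
// cannot touch the value without handling the missing case.
fn find_user(id: u32) -> Option<&'static str> {
    match id {
        1 => Some("alice"),
        2 => Some("bob"),
        _ => None, // absence lives in the type; there is no null to forget
    }
}

fn main() {
    match find_user(3) {
        Some(name) => println!("found {}", name),
        None => println!("no such user"), // the compiler forces this arm
    }
}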

I often see these "conveniences" conflated with the downsides of the systems they were first implemented in: graphical programming is declared to be inferior, with no thought given to visualization tools or better organizational tools for text-based code. Memory safety is declared irreconcilable with performance, or with not having a garbage collector. Concurrency is declared "too hard." All of these claims are demonstrably false, refuted by decades-old technology, even if those systems did some other things wrong!

_________________
[www.abubalay.com]


 Post subject: Re: The Future of Programming, circa 1973
PostPosted: Fri Dec 05, 2014 4:32 am 

Joined: Wed May 15, 2013 5:49 pm
Posts: 44
Location: Berlin
iansjack wrote:
A lot of the talk reminded me of the glowing predictions made for AI in the days when the lecture was set. In that sense it does capture the naive optimism of those days quite well. But AI has turned out to be a much more difficult problem than was thought at that time.


Incidentally, a great book on this topic is Computer Power and Human Reason by Joseph Weizenbaum, the guy who developed ELIZA. (Anyone remember ELIZA?)

Written in 1976, it really gives you a good sense of the naivety of the era with regard to how general-purpose AI was supposedly just around the corner; and, further, how they intended to design it rationally from first principles, with little regard for first understanding how actual messy biological organisms function.

Living as we are after the AI winter of the 1990s, it's hard for us to imagine just how badly oversold the early AI hype was, so it's a good perspective to gain. Weizenbaum was the ultimate skeptic and critic of all that, and correctly predicted, at a time when it was a minority view, that the endeavor was misconceived and doomed.

_________________
Developer of libc11

