February 03, 2005
The Least Important Open Problem in Programming Languages

john wiseman of centralia

From Todd Proebsting's “Disruptive Programming Language Technologies” presentation:

The Least Important Open Problem in Programming Languages: Increasing program performance via compiler optimization

  • Moore’s Law suffices
  • Algorithms and design make the big difference
  • Challenge: Name a single significant software product that relied on compiler optimization for viability.
Posted by jjwiseman at February 03, 2005 11:11 PM
Comments

Challenge: Name a single significant software product that relied on compiler optimisation for viability:

Easy. Dooms 3, 2 and 1 ;)

Posted by: John Connors on February 4, 2005 01:40 AM

Erm, what's "significant"? How about predicting the weather -- or perhaps somewhat topically, seismic shocks and their effects?

Yes, it's probably true that all a better compiler gets you is a better constant factor. Sometimes those constant factors are important. (And besides, in case that's not a sufficiently Open Problem: I'd like something with the speed of Fortran and the ease of development of CL... :-)
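
(For concreteness, the usual compromise today is declarations. A minimal sketch in portable CL; the function name is mine, and how close any given compiler actually gets to Fortran is implementation-dependent:)

    ;; A dot product with full type and optimization declarations.
    ;; Given these hints a good native-code CL compiler can emit
    ;; something close to the Fortran loop; without them you keep
    ;; the ease of development but pay for generic arithmetic.
    (defun dot (x y)
      (declare (type (simple-array double-float (*)) x y)
               (optimize (speed 3)))
      (let ((sum 0d0))
        (declare (type double-float sum))
        (dotimes (i (length x) sum)
          (incf sum (* (aref x i) (aref y i))))))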

Posted by: Christophe Rhodes on February 4, 2005 02:30 AM

As I recall, there is a trope about an engineering school that rewrote its Fortran routines in Common Lisp and found that they ran faster.

Posted by: Marcin on February 4, 2005 02:49 AM

Challenge: Name a single significant software product that relied on compiler opt... [a dirty word according to the comment filter] for viability.

How about Google? With their huge server farms, I'm guessing that code tuning, whether it's done by hand or by the compiler, could have a significant impact on how much hardware they buy. I have no idea whether choosing not to tune code would make their business unviable, but it would make it more expensive.

Posted by: Ken Dyck on February 4, 2005 04:37 AM

So, name a product that would have failed if they hadn't put "-O3" (et al) on their GCC (et al) command line?

Or, name a product that would have failed if it weren't for the *past forty years* of compiler opt-imization research?

If the former, I suspect: none of them. If the latter, I suspect: most of them.

On the other hand, he doesn't call compiler opt-imization a closed problem, he calls it the least important open problem. [Disclaimer: I haven't Read The Fine PowerPoint.] Which in no way implies that, in the past, it might not have topped the list. :)

Off Topic notice: When previewing this post, I got an error, "MT::App::Comments=HASH(0x80604ac) Use of uninitialized value in sprintf at lib/MT/Template/Context.pm line 1187."

Posted by: Larry Clapp on February 4, 2005 05:03 AM

Challenge: Name a single significant software product that relied on compiler optimisation for viability.

Look at the big picture: all software requires hardware support, and modern hardware advances need compiler optimisations to be viable. Take superscalar chips, for example; without compiler optimisations (instruction scheduling) they lose their edge. Or take Cray's vector processors of the '70s and '80s: without vectorizing optimisations they would have had no technical advantage.
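
(A cheap way to watch this from the Lisp side, assuming an implementation with a native compiler such as CMUCL or SBCL; the function is only an illustration:)

    ;; SAXPY-style loop: with the declarations below the compiler
    ;; is free to choose registers and schedule the loads, multiplies
    ;; and adds so that a superscalar pipeline stays busy.
    (defun saxpy! (a x y)
      (declare (type single-float a)
               (type (simple-array single-float (*)) x y)
               (optimize (speed 3)))
      (dotimes (i (length x) y)
        (setf (aref y i) (+ (* a (aref x i)) (aref y i)))))

    ;; Inspect the instruction stream it actually emitted:
    (disassemble #'saxpy!)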

p.s. Love the blog.


Posted by: Barry Watson on February 4, 2005 05:38 AM

I've just remembered a book that described optimisations, aimed at compilers for embedded products, to reduce code size and power usage. These could make certain products viable w.r.t. cost (fewer memory chips) and usability (longer battery life).

Posted by: Barry Watson on February 4, 2005 05:51 AM

"* Challenge: Name a single significant software product that relied on compiler optimisation for viability."

If you accept BBT/Lego as a compiler optimisation, then I submit that Microsoft Word is a significant software product.

Posted by: Anonymous on February 4, 2005 07:10 AM

For many people, Java was not viable until the HotSpot compiler came along.

Posted by: Harold Carr on February 4, 2005 07:13 AM

I think Larry Clapp said it well: there have been years of compiler 0ptimization that we take advantage of every day.

At one point (I'm told...) you couldn't write any viable piece of software in anything but assembly; the overhead of compiling down from some high-level language was too much... Wasn't this Lisp's problem? As such I would put forward most of the commercial software written in the past decade or two as potential counter-examples.

I'm just glad the hardware folks never said: "Speeding up the chips is the least important open problem. The compiler writers and algorithm people will suffice to make things run faster." Yes I know it isn't quite the same ;-)

Fun argument, thanks for posting that!

Posted by: Nathan Bird on February 4, 2005 08:08 AM

Here's an example: McCLIM. It's great software---and it takes as much compiler optimisation as it can get. Recently Christophe Rhodes tested McCLIM on SBCL with a patch that made CLOS methods faster (at the expense of safety), and the speedup was reportedly enormous, making it much more usable. If it weren't for compiler optimisation, it probably wouldn't be usable at all.
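
(That speed-for-safety trade is one user code can ask for locally, too. A sketch, purely illustrative; the real patch worked on SBCL's CLOS dispatch internals, not on anything like this:)

    ;; SAFETY 0 tells the compiler it may drop the type and bounds
    ;; checks it would otherwise be obliged to keep.
    (defun blit-row (src dst)
      (declare (type (simple-array (unsigned-byte 8) (*)) src dst)
               (optimize (speed 3) (safety 0)))
      (dotimes (i (length src) dst)
        (setf (aref dst i) (aref src i))))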

Posted by: Peter Scott on February 4, 2005 08:29 AM

You don't seem to have a trackback feature. This set off a whole tangential train of thought for me, which I decided belonged on my own blog. Was fun to think about, thank you.

Posted by: Dan Knapp on February 4, 2005 09:03 PM

How about every half-decent computer game out there ;)

Posted by: Michael Walter on February 4, 2005 09:50 PM

I got it! ;-)

The L4 µ-kernel (http://os.inf.tu-dresden.de/L4/): they use compiler optimisation to reduce IPC cost...

Posted by: Andrea on February 5, 2005 02:49 AM

The original L4 was written in highly tuned assembler. Later implementations used C++.

Frankly, any program I have to wait for is too slow. Give me all the compiler opts you can (unless it's the compiler itself I'm waiting for).

Posted by: Paul Dietz on February 5, 2005 04:36 AM

Hmm. I wouldn't say that the patch that I made for CLOS methods in SBCL is a compiler improvement -- rather, it's a runtime library depessimization; it's much like finding a better data structure for an algorithm. That's the kind of thing that probably won't ever be dealt with by compilers, even given the Holy Grail of Sufficient Smartness.
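
(The classic instance of that kind of fix, with hypothetical names:)

    ;; Before: O(n) lookup in an alist.
    (defun lookup-slow (key alist)
      (cdr (assoc key alist :test #'equal)))

    ;; After: O(1) lookup in a hash table.  Same contract, different
    ;; data structure; no optimizer would have made this change.
    (defun lookup-fast (key table)
      (gethash key table))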

However, the challenge to name a significant software product is almost completely wrong! The challenge ought to be: show that no significant software product has ever failed because the compiler was not good enough. The fact that we get software doing stuff today doesn't in any way indicate that compiler advances are unimportant.

(And, as for Fortran to Common Lisp translations yielding faster code: of course this is possible, if during the rewrite one notices inherent algorithmic inefficiencies and corrects them; I'd go so far as to say that, given compiler technologies that exist currently, it's unlikely that a direct translation would yield a speedup in Common Lisp compared with Fortran for any non-trivial program.)

Posted by: Christophe Rhodes on February 5, 2005 03:12 PM

My first thought, of course, was desktop video (e.g. MPlayer). My second thought was that most of the video players I use these days write their codecs in assembly anyway, so they don't benefit from compiler optimization. My thoughts about FPS games etc. followed the same track.

There are probably many *hardware* projects that would have failed without compiler optimization, and perhaps most high-level languages until 1990 would have failed without compiler optimization. Even today, we have things like the STL and Blitz++ which would be useless without compiler optimization.
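
(The Lisp version of the same bargain, sketched; whether any given compiler actually open-codes this is implementation-dependent:)

    ;; The natural notation: no explicit loop, no index bookkeeping.
    ;; With the declarations a compiler like SBCL can specialize and
    ;; inline the whole thing; without optimization it is generic
    ;; dispatch all the way down, like the STL without inlining.
    (defun sum-of-squares (v)
      (declare (type (simple-array double-float (*)) v)
               (optimize (speed 3)))
      (reduce #'+ v :key (lambda (e) (* e e)) :initial-value 0d0))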

My conclusion: compiler optimization primarily enables people to do the same things with less effort, by making more natural notations efficiently executable, rather than enabling people to do things they couldn't have done before.

Posted by: Kragen Sitaker on February 7, 2005 02:18 PM