Saturday, November 28, 2009

Exactly Why We Are No Longer UNIX-ish

When we say that Linux is UNIX-like, what are we saying? At my college, we have a course named 'An Introduction to UNIX Using Linux'. Everywhere, I hear people use the phrase 'UNIX/Linux' when referring to UNIX-style systems. This is somewhat hilarious to me, as Linux and the surrounding community have, for the most part, left the UNIX philosophy behind. The UNIX philosophy goes as follows:

Make each program do one thing well. To do a new job, build afresh rather than complicate old programs by adding new features.

Expect the output of every program to become the input to another, as yet unknown, program. Don't clutter output with extraneous information. Avoid stringently columnar or binary input formats. Don't insist on interactive input.

Design and build software, even operating systems, to be tried early, ideally within weeks. Don't hesitate to throw away the clumsy parts and rebuild them.

Use tools in preference to unskilled help to lighten a programming task, even if you have to detour to build the tools and expect to throw some of them out after you've finished using them.

The above are all quotes from Doug McIlroy (the inventor of the UNIX pipe). He later summarized them this way: "This is the Unix philosophy: Write programs that do one thing and do it well. Write programs to work together. Write programs to handle text streams, because that is a universal interface."
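McIlroy's summary is easy to demonstrate with nothing but stock tools. A small sketch (the input words are purely illustrative): four programs, each doing one thing, composed through text streams.

```shell
# Find the most frequent word in a stream of text.
# Each program does one thing; pipes and plain text glue them together.
printf 'apple\nbanana\napple\ncherry\napple\n' |
  sort |        # order the lines so duplicates become adjacent
  uniq -c |     # collapse duplicates into counts
  sort -rn |    # highest count first
  head -n 1     # keep only the most frequent word (a count of 3 for "apple")
```

None of the four programs knows anything about word frequency; the job emerges entirely from the composition.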

Eric S. Raymond distilled another 17 rules by looking at the design of UNIX itself:
Rule of Modularity: Write simple parts connected by clean interfaces.
Rule of Clarity: Clarity is better than cleverness.
Rule of Composition: Design programs to be connected to other programs.
Rule of Separation: Separate policy from mechanism; separate interfaces from engines.
Rule of Simplicity: Design for simplicity; add complexity only where you must.
Rule of Parsimony: Write a big program only when it is clear by demonstration that nothing else will do.
Rule of Transparency: Design for visibility to make inspection and debugging easier.
Rule of Robustness: Robustness is the child of transparency and simplicity.
Rule of Representation: Fold knowledge into data so program logic can be stupid and robust.
Rule of Least Surprise: In interface design, always do the least surprising thing.
Rule of Silence: When a program has nothing surprising to say, it should say nothing.
Rule of Repair: When you must fail, fail noisily and as soon as possible.
Rule of Economy: Programmer time is expensive; conserve it in preference to machine time.
Rule of Generation: Avoid hand-hacking; write programs to write programs when you can.
Rule of Optimization: Prototype before polishing. Get it working before you optimize it.
Rule of Diversity: Distrust all claims for “one true way”.
Rule of Extensibility: Design for the future, because it will be here sooner than you think.

Linus Torvalds has admitted that the Linux kernel has grown heavy, calling it 'bloated and huge'. Much of this bloat is a matter of necessity, but the size is becoming unwieldy, and it undermines two of the points above: the Rule of Extensibility and the Rule of Parsimony. The kernel is not the only problem, though. The programs people are so fond of, and are willing to defend to their graves, break most of the points above as well.

In my last post, I spoke of our getting away from UNIX and taking a path I dislike. People say that Linux was never a UNIX clone and that the heritage is therefore irrelevant. Well, Linux began as an open-source, Minix-inspired kernel, and Minix was an operating system designed to teach students about UNIX design and operating systems in general. We can also say that GNU was an open-source UNIX implementation. If you would like to argue that point, why is it that our most modern UNIX (Mac OS X) uses GNU components?

We didn't have to jump on with X11, where bloat and complexity are present for no apparent reason whatsoever, and where nasty hacks have resulted in even more complexity. We didn't have to jump on with KDE, where a lack of extensibility, parsimony, clarity, modularity, simplicity, and repair led to the need for a rewrite from the ground up, and left many of us hating the new version. We didn't have to jump on with GNOME or XFCE, which are facing similar growing pains. We didn't have to sign on with any of this, but we did. We are now paying the price as our systems get heavier and heavier, and like the guys on the other side of the fence (Windows and such), we are going to start upgrading our machines just because our OS demands better hardware.

23 comments:

KimTjik said...

I've read the previous article and this one. Could you please give a brief, simple explanation of what a system to your liking would look like, and what it should be capable of?

Just being UNIX-ish isn't a virtue in itself if it doesn't accomplish anything of real importance.

Carl D said...

I don't mean to be pedantic but isn't it Linus Torvalds?

Ford said...

@Carl D,
Indeed, thanks for bringing the typo to my attention.

joeuser said...

"Rule of Economy: Programmer time is expensive; conserve it in preference to machine time.
...
Rule of Optimization: Prototype before polishing. Get it working before you optimize it."

Aren't these two rules the main factors for all the 'bloat' out there today??

Machines are very, very cheap these days. So, no need to spend expensive programmer time to optimize most code.

Dann said...

@ joeuser

There's absolutely no excuse for bad programming, even if machines are fast enough that there's no real noticeable effect.

Once you get into intensive programs such as 3D design or large graphical image manipulation (Adobe, Maya, Blender, GIMP, etc.), one bad block of code that doubles processing time can make a VERY NOTICEABLE difference.

I, for one, do not support acceptance of bad habits.

joeuser said...

For one, I said MOST code. And there is a big difference between 'bad' code and non-optimized code.

Second, we live in the real world where time = money. Optimizing code takes a considerable amount of time and I have yet to meet a programmer who can write optimal code on the first try.

In an ideal world, yes, all code would be perfect. But in reality, every project I have ever worked on needed the code yesterday. I am in no way making excuses for bad code; I just don't see the value in optimization in MOST cases.

decentralist.wordpress.com said...

So Linux solved a problem that UNIX had not solved, namely cheap, simple, high-quality computing for the common person, driving down prices in all facets of technology, even on the desktop. So what is the UNIX answer? FreeBSD? Solaris? Sure, great in some ways, but far behind Linux in usefulness. For relatively objective speed comparisons, I like the Phoronix Test Suite. Curious how Windows will compare on benchmarks when Windows support is added.

ruel24 said...

I, at once, agree and disagree with you. I remember when Linux really first took off. It was so pure, so rock solid. I was using Windows 98, and Red Hat Linux couldn't be shaken. That's not true today. There are as many bugs in Linux as in Windows, if not more. The KDE debacle could have been avoided from the outset.

However, at the same time, Linux is evolving. Linux strives to compete with Mac OS and Windows on the same level. Also, not everything is in the hands of one person, and each group has its own goals. In the end, it's up to the users to embrace or reject. I believe the number of people who resisted the move to KDE 4 for so long, and still do, is very telling to the KDE group. I doubt they'll make the same mistake again. Or maybe they're as dumb as the Windows-using masses?

Bruce said...

I think there is, at some level, a misconception that using a modern GUI means that 'we' are somehow totally out of sync with the original UNIX philosophy. A GUI can very often be a front-end to commands which might be used on a command line, but using a GUI makes it much easier for many users to carry out various complex operations. Example: compare the ease of burning a CD using k3b (or similar) with using cdrecord from the command line. I've done this in the past, but I can assure you that for the desktop user a GUI is really the way to go for this sort of work unless you have some unusual requirement.

Under Windows, to create many complex processes would normally require each developer of a certain type of application to write ALL the code to make 'things work', or obtain (normally purchase) 3rd party software libraries to perform these tasks. Contrast this with the Linux experience: Here the developer is usually able to simply use pre-existing code, whether at the binary or source level, and so perform complex operations. Surely this fits in easily within the UNIX philosophy the author of the original article quoted.

Ford said...

Look, I am not saying that a GUI is bad; I am saying that redundant code is bad. A front-end application is one thing, and a GUI UNIX tool is yet another, but when you have a GUI app setting network connections and a CLI tool setting network connections, and they conflict, we have a problem. No 'average desktop user' will understand how to solve that problem. Quit talking out of both sides of your mouth.

Stephen said...

I don't think Linux has entirely lost sight of the UNIX philosophy; it's just working behind the scenes now. If you've ever done any scripting in bash, the UNIX philosophy quickly becomes apparent. You take the output of ps, pipe it to grep, and then finally into less. Output and input get redirected, searched, processed, reshaped, and put back out again all over the place. The command-line tools that form the heart of Linux keep on working behind the scenes. The example of k3b that someone else gave is a perfect illustration of this: it actually uses all the command-line tools to get various jobs done, and wouldn't work without the dozen little programs running behind the scenes that make it all work.

Other programs utterly fail at the UNIX philosophy. Music players like Banshee and Amarok come to mind as programs that try too hard to do too much, rather than trying to do one thing well. But despite these odd cases, I really feel that the core programs that make up Linux still follow the UNIX philosophy, and that's why I still love Linux.
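The kind of pipeline Stephen describes can be sketched like this (the exact output depends on whatever happens to be running, so no specific processes are assumed):

```shell
# Count running processes per user, busiest first.
# ps emits text, sort orders it, uniq counts it; none of the four
# programs knows anything about the others.
ps -eo user= |    # one username per running process, no header
  sort |          # group identical users together
  uniq -c |       # count processes per user
  sort -rn        # busiest user first
```

Swap any stage for another tool (grep instead of uniq, head instead of sort) and the rest of the pipeline is unaffected, which is precisely the point of the text-stream interface.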

Shane kerns said...

Dude, KDE and GNOME are not Linux. Those are Linux window managers and in no way part of the core Linux operating system. KDE and GNOME may be slow, but Linux is not.

Ford said...

@Shane_kerns,
Linux is not an operating system. Linux is an operating system kernel, and Linus Torvalds himself said it's getting bloated. My point is that the amalgamated systems being released are becoming non-UNIX-like. They are becoming bloated and slow.

Mr. Eye said...

Which Linux distro still adheres to the UNIX philosophies?

KimTjik said...

@ Ford, don't misquote Linus to add weight to your argument (even though I don't yet know what the purpose of your argument is).

It's not that "Linus admitted", as if he were uttering this under heavy criticism. He just talked plainly and simply, as usual, to explain the current and coming state of kernel development. I understand if journalists spin these kinds of quotes out of context, but why should we?

Just as the kernel can be slimmed down to fit whatever you want (a libre kernel, one optimized to support only your actual hardware, or something else), an operating system using the Linux kernel can likewise be shaped on purpose to be bloated and heavy, or to consist of only one tool for every task. If you so wish, it's no big deal to stay in the framebuffer and have a perfectly functional operating system.

I can understand your argument from a certain angle, but generally you still have the choice to shape it into whatever you want. If choice, something the UNIX culture of the past didn't want to give, were eradicated, then I would worry. As it stands, this is, in my view, an intellectual mind game.

Ford said...

@KimTjik,
I understand that I always have the choice to do things how I want to do them, and to that end, I am working on my own distribution for my own use. It's taking a lot of time, because I have to write many of the programs myself. The kernel config is simple enough: almost everything is deselected, and drivers for my actual hardware are selected as modules.

@Mr. Eye,
Currently, Slackware is as close as it gets.
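The minimal kernel configuration Ford describes can be sketched with the standard kbuild targets (a sketch, assuming a vanilla kernel source tree; the actual driver selection is done by hand in menuconfig):

```shell
# Start from nothing rather than trimming the default config down.
make allnoconfig                    # answer "no" to every option
make menuconfig                     # hand-pick your hardware; mark drivers as <M> (modules)
make -j"$(nproc)"                   # build the kernel image and the selected modules
sudo make modules_install install   # install modules, kernel image, and boot entries
```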

solenskiner said...

hey, wrote a comment a bit too long for this field...
http://solen-skiner.blogspot.com/2009/11/how-we-can-once-again-become-unix-ish.html

underwriter said...

The UNIX philosophy is a good fit with GNU's Free Software philosophy, because there are not only legal restrictions, but also practical restrictions to an individual's freedom to control his or her information. It's like computer-controlled automation in cars and so on: every additional layer of complexity makes it easier for idiots who don't want to look under the hood, and harder for people who know what they're doing and want to tinker or tune things. Every step away from UNIX-like simplicity and towards Windows-like "I don't care what's going on underneath because it looks so pretty on the surface" is a step away from the kind of collaborative community that free software enables, and towards the kind of dumb consumerism that Windows enforces. Yes, GNU/Linux is copylefted so it's still free, thank God, no matter how bloated and complex...but I agree with the author that it's a pity when some of the most important advantages of a UNIX-like OS over rubbish like Windows are increasingly being forgotten, in order to cater to the crowds of ex-Windows users whose expectations of an OS are so bottom-scrapingly low that the UNIX philosophy doesn't even make sense to them.

Flossie said...

"Rule of Economy: Programmer time is expensive; conserve it in preference to machine time."

That depends on circumstances; machine time can be very expensive in a large, heavily loaded server environment. A 5% efficiency gain in the code could save thousands in hardware and power costs (or even millions if you're Google-sized). This rule ought to be:

"Balance the cost of programmer time against the cost of machine time to get the most efficient result for the application"

Eirik said...

You might be running away from UNIX ideals; that doesn't mean we all do. New tools that fit well with the old mindset are netcat, secure shell/secure copy, Red Hat's Newt toolkit, and expect, as well as a whole slew of XML tools (a step up, or down depending on your point of view, from untyped text files).

Now, as I see it, the UNIX mindset was always geared towards system/tool development, and not that tightly coupled with application programs. So Vim and Emacs are complex programs that allow you to do a lot of things. As are GIMP and other rich (multi)media applications.

We still have lame/oggenc and x264, which are used by a whole range of "rich" applications -- I think that is pretty "Unixy".

In general, the problem of simple interfaces to complex, rich data is a hard one. Smalltalk with its environment is one attempt at solving it. MS Office with embeddable COM objects (or Open/Star Office) is another (and far worse) attempt.

The newest is web 2.0, where a lot of smart people have seemingly come together and agreed to forget everything we've learned about the security risks of untrusted code executing on the client and operating on all its data, and have switched unsafe, quick, compact executable code for unsafe, slow JavaScript code, along with a blatant ignorance of security domains, trust, et al.

It allows wonderful mashups, account theft, and data theft. Couple it with a blind belief in the nobility and security of service providers (be they Google, Myspace, or Facebook) -- and you have the awful mess that is web 2.0.

</rant>

Korey said...

How would you apply unix principles to the graphical desktop?

How would you combine graphical applications in the same way command line applications pipe to each other?

How would you make a cohesive, user friendly environment out of small discrete parts?

tangentsoft said...

Your argument is incoherent. You claim Linux is no longer Unix-like. So what *is* Unix-like, according to your definition? Not Unix itself, that's for sure! Have you used a current version of OpenSolaris or FreeBSD? They commit all the "sins" you lay at Linux's feet. If Unix is also not Unix-like, what do you actually want? A return to 1980? 'Kay. Go download a copy of xv6 (ancient V6 Unix ported to x86 CPUs) and enjoy.

Ford said...

@tangentsoft,
My argument is not at all incoherent. There are plenty of lesser-known operating systems that more or less adhere to the UNIX philosophy. Darwin is one such system, MINIX 3 another, and AuroraUX yet another. BSD and Solaris have both changed their designs to try to compete with Linux and similar GNU systems. However, NetBSD still mostly adheres to the statements I made. Before you start accusing me, you may want to do a little research.
