Linux Still Not Ready for the Desktop
Recently I decided it might be easier to install a recent libxml on Linux rather than try to figure out how to get one on the Mac. I’d forgotten my password for the Linux box I hadn’t turned on in about half a year, and I didn’t seem to have it written down anywhere, so I decided I might as well upgrade. Linux is clearly improving, but is equally clearly not ready for an end user yet. If you like compiling and installing libxml from scratch, Linux is for you. If, on the other hand, “compiling and installing libxml from scratch” is unintelligible techie gibberish, it’s not.
I started by trying to install the latest and greatest Ubuntu, 6.10, Edgy Eft. No go. I got about halfway through the install only to be presented with the flashing underscore of death. This happened twice. I was worried about hardware problems, but I did manage to successfully reinstall 5.04, Hoary Hedgehog, the release I’d had on the same box previously.
I then tried to upgrade to 5.10. No go. I couldn’t find ISOs anywhere, and the upgrade instructions didn’t work. Apparently Ubuntu has pulled crucial files off their web site. I bet you can upgrade from Windows 2000 to Vista though. Ditto for Mac OS X Jaguar to Leopard.
I then downloaded ISOs for 6.06, which is at least still supported. This installed, though my father couldn’t have done it. (/dev/hda1? What’s that?) Setting up partition tables has improved since the early Debian days, but we still shouldn’t be asking users to do it at all, especially when there’s a perfectly good set of Linux partitions already on the hard drive. Of course, you shouldn’t erase the user’s data when you upgrade, which is exactly what the easy install would have done. Thankfully I’m tech savvy enough to decode all this, but I really shouldn’t have to.
Once I finally got everything up and running, I began installing various packages like Apache and PHP. This is actually easier than it was on my Mac, but only because I didn’t stick with Apple’s default version of Apache. On a Mac, Apache is a checkbox item. It’s a little more complex on Ubuntu. I also had to figure out a way to transfer and share files between my Mac and the Ubuntu box. This required installing more software (SSHD and netatalk). On a Mac or Windows machine, this would already be installed and ready to turn on.
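For anyone attempting the same setup, the whole thing reduces to a handful of apt-get invocations. The package names below are what I believe Ubuntu 6.06 uses, but they may differ in other releases:

```sh
# Web stack, remote login, and Mac file sharing on Ubuntu 6.06.
# Package names are approximate and may vary by release.
sudo apt-get update
sudo apt-get install apache2 php5 libapache2-mod-php5   # web server + PHP
sudo apt-get install openssh-server                     # sshd, for file transfer from the Mac
sudo apt-get install netatalk                           # AFP file sharing the Mac can see
```

That’s still a far cry from a checkbox, but it is at least repeatable.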
Once I finally got it all configured, I’m left with a system that has clearly improved and is equally clearly not ready for prime time. Ubuntu still can’t handle my 1600×1024 widescreen monitor (Windows and the Mac both can). I’m sure there’s a way to set up X to make this work. I’ve done it before, but right now I don’t have time. Instead I’ve just chopped off the left and right sides of my monitor into big black bars.
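For the record, the usual fix is to add a custom Modeline to /etc/X11/xorg.conf. This is only a sketch; the actual timing numbers have to be generated by gtf for the specific monitor:

```
# Generate candidate timings for 1600x1024 at 60 Hz:
#   $ gtf 1600 1024 60
# Then paste the Modeline that gtf prints into the Monitor section
# and reference the mode from the Screen section:

Section "Monitor"
    Identifier "Widescreen"
    # Modeline "1600x1024_60.00" ...   <- paste gtf output here
EndSection

Section "Screen"
    Identifier "Default Screen"
    Monitor    "Widescreen"
    SubSection "Display"
        Modes  "1600x1024_60.00" "1024x768"
    EndSubSection
EndSection
```

Which is exactly the kind of thing an end user should never have to see.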
The networking is better than it used to be, but still not nearly good enough. When I accidentally booted up without the Ethernet cable plugged in, it couldn’t find the network. That’s natural enough. However, once I plugged the cable back in, Linux still couldn’t find the network. I had to reboot before it would realize it was reconnected, and this is on a wired desktop. I can only imagine how it behaves on a laptop with a spotty wireless connection.
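In principle a reboot shouldn’t be necessary; bouncing the interface by hand re-runs the DHCP negotiation. A sketch, assuming the wired interface is eth0:

```sh
# Force the interface down and back up to re-request a DHCP lease.
# Assumes the wired NIC is eth0; adjust as needed.
sudo ifdown eth0
sudo ifup eth0
# or, more bluntly:
sudo dhclient eth0
```

But of course a typical user shouldn’t need to know any of that; the desktop should notice the link coming back on its own.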
The default menu layout has improved, but it mixes up menus with buttons, and requires you to navigate a hierarchical menu to find anything.
The file system view (Nautilus?) is still a disaster, but then that’s the case on Mac OS X and Windows too these days. File system interfaces reached their apex in MacOS 9, and have only degraded since then. This probably won’t improve until we abandon file systems completely, perhaps sometime in the 22nd century at the rate we’re going. 🙁
The screen savers are very pretty (my cat Marjorie loves them a little too much and almost pushed the monitor onto the floor while batting at them) but the desktop itself uses very funny fonts that seem to be about twice as wide as they are high.
Simple dialogs like the Synaptic Package Manager display too small by default, so some of the content is chopped off until you resize it. You’d never see something like this on a Mac, and usually not on Windows. There are lots of other little annoying inconsistencies throughout the package manager, like menu items that bring up dialogs but don’t end in an ellipsis (…). And while I’m on the subject of the package manager, why not just call it “Package Manager” or “Install Software”? What, exactly, does “Synaptic” tell anyone?
Another example: setting up a shared folder with Samba requires a non-standard file dialog that confuses opening and adding a file.
There are lots of glitches like this throughout the user interface. Programs don’t seem to follow any coherent guidelines. Every program goes its own way. I’m not just picking on a few bad programs, by the way. Almost every single program, dialog, or menu item I’ve run in just a couple of hours has had multiple serious issues of one kind or another. In fact, in this article I’m cherry picking just a few of the dozens, perhaps hundreds, of problems I could point to.
Some things clearly have improved. CDs now mount automatically. The network was detected when I installed without asking me for anything; but a lot of stuff still doesn’t just work; and what does work is quirky and weird. It’s not that applications don’t act like Mac applications or Windows applications. It’s that they don’t act like each other.
I’ve heard it claimed that Linux is good enough for users who just want to browse the Web and write simple office documents. That’s false. It clearly isn’t. It probably wouldn’t take too much effort to put Linux in a usable state; but I see no evidence that anyone is doing this. The entire desktop UI needs a talented and knowledgeable redesign. This is going to have to extend into all bundled applications, and I suspect some toes are going to need to be stepped on and some applications forked when their developers refuse to follow basic principles of user interface design and consistency. Until that happens, tasteful Unix geeks everywhere are going to continue to buy PowerBooks and run Mac OS X.
December 30th, 2006 at 6:19 pm
I can’t disagree with your overall judgment, but there’s a good reason to call the package manager “Synaptic”, namely that it isn’t the only package manager out there. It’s only monopolists and would-be monopolists that have the gall to call their word processors “Word” and their mail clients “Mail.app”, as if they were the only ones that anyone might want to use.
No, our mail clients are called Thunderbird, or Evolution, or Elm, or mutt, or pine, or …. (It’s true that there is a very low-level command-line client called “mail”, but that dates way back to the earliest days of Unix.)
Unix, and even Mac OS X, is all about pluggability.
December 30th, 2006 at 7:49 pm
Those application names are a big peeve of mine. They make it harder to search for information about them. Word, like you mentioned. Microsoft Money. Internet Explorer. (Which of course gets shortened to IE, sometimes, which isn’t the best search term either.)
December 30th, 2006 at 8:38 pm
Quick tip for installing libxml on OS X:
Install DarwinPorts
Open Terminal
Installing libxml is as easy as:
sudo port install libxml
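And a quick sanity check afterward (assuming the port installs the usual config script; the newer libxml2 port ships xml2-config instead):

```sh
# Confirm the library landed where DarwinPorts puts things
# (/opt/local by default) and report its version.
port installed libxml
xml2-config --version   # use xml-config for the older libxml port
```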
December 31st, 2006 at 12:33 am
Mr. Harold makes too many generalizations that are altogether wrong, just wrong:
“…not ready for an end user yet. If you like compiling and installing libxml from scratch, Linux is for you. If, on the other hand, ‘compiling and installing libxml from scratch’ is unintelligible techie gibberish, it’s not.”
Here he defines an “end user” as someone who cannot install libxml; therefore, the supposition is that you have to be a programmer or a scientist to successfully use Linux. I haven’t got a clue as to what libxml is all about, but one thing is for sure, I AM successfully and happily using Linux for my everyday computer tasks. I have SuSE 10.1 installed in my Aspire 1410 Laptop, and must say that I have complete mastery of mundane tasks such as web surfing, email, wordprocessing with OpenOffice 2.0, etc.
“…tried to upgrade to 5.10. No go. I couldn’t find ISOs anywhere, and the upgrade instructions didn’t work. Apparently Ubuntu has pulled crucial files off their web site. I bet you can upgrade from Windows 2000 to Vista though.”
This type of comment is really appalling. He claims to be trying to upgrade to a discontinued version of Ubuntu, and then compares it to a Windows upgrade. The correct analogy would have been upgrading from Windows ME to Windows 98SE, which, like 5.10, is a discontinued version. While the latest version of Ubuntu did not agree with his hardware, I can attest that I have successfully installed Ubuntu 6.06 on a five-year-old computer that had Win98. Is Linux NOT ready for the end user because the author could not install some old version of Ubuntu on his dilapidated hardware? What about Red Hat, or SuSE?
“…Almost every single program, dialog, or menu item I’ve run in just a couple of hours has had multiple serious issues of one kind or another. In fact, in this article I’m cherry picking just a few of the dozens, perhaps hundreds, of problems I could point to.”
Of course, Mr. Harold does not provide concrete examples to support this bitch-fest; I guess he’s counting on the fact that he has written some computer books to lend authoritative weight to his erroneous claims. What’s the most outrageous claim? Well, that Linux is not ready for prime time because he had a really hard time with Ubuntu. Here’s some news for Rusty: Ubuntu is NOT Linux… and Ubuntu does not represent other distros, which, by the way, are still not Linux.
“… I’ve heard it claimed that Linux is good enough for users who just want to browse the Web and write simple office documents. That’s false. It clearly isn’t.”
This whole statement is false in itself. Linux is more than adequate to run a wide spectrum of applications, from basic home stuff to the very sophisticated. I’m far from being a programmer but I’m using Linux to browse the web and write simple office documents; I’m also using Linux for email, web design, business accounting, image editing, and desktop publishing; and I know that someone else is using Linux to make movies, run weather models, and control unmanned vehicles.
I’m not using Ubuntu. Not everyone using Linux is using Ubuntu. Ubuntu Is Not Linux. Ubuntu is a Linux distribution like Red Hat, or SuSE, or Knoppix. Rather, Linux is a Unix variant just like Mac OS X.
It’s too bad that Mr. Harold had such a bad time with Ubuntu, enough to declare Linux unusable for all but high-end techies. Actually this seems very narrow-minded, since an objective critique of Linux in general would require an in-depth analysis of all the other distros, as they would compare against Windows and OS X in usability as well as presentation. This analysis would then delve into a comparative view of the inner workings of Unix vs Windows; but everyone knows that Windows is clearly an inferior consumer product that has failed to deliver on its promises of usability, security, and stability. Thankfully, nobody is getting paid a lot of money to make such claims for Linux; instead, these attributes are self-evident. I should know, this has been my own personal experience with Linux.
So what’s libxml anyway?
December 31st, 2006 at 2:58 am
I use Linux on a regular basis but I have also found it to be “…not ready for an end user yet.” There are too many things that just don’t work. I’m still looking for the distro on DVD that’s capable of playing mp3 files and DVDs, has wireless networking that just works, and an installation process that is easier than Windows. I’m sure that an “easier than Windows” installation process will be called unfair, but most people know someone who can help them with Windows issues but not with Linux. (No – newsgroups, websites, and Linux user groups don’t count – my grandmother doesn’t even know what they are, and most other people are willing to spend $100 to not have to deal with these things).
December 31st, 2006 at 3:13 am
Agree. Some months ago I tried SuSE 10 and had the same impressions. What puzzles me is the fact that people have been discussing for *years* whether Linux is ready or not for desktop users, but I’ve still seen no results.
December 31st, 2006 at 9:41 am
F.U.D.
I’m a windows developer, and no anti-Microsoft loon, but the other night I reinstalled XP on a laptop which had Ubuntu running quite happily.
It booted up after a good 45 minutes to an hour of install – which, I might add, included partitioning the hard drive in a much less friendly way than GParted – in 640*480. No wireless, no sound, no automagic detection of the graphics card, the world’s least secure browser installed by default. Then you have to get online with a different machine (no wireless :)), search the whole intarweb for drivers (my grandmother couldn’t do it: C:\WTF?), download ’em, burn ’em to CD, get ’em installed. Reboot, reboot, reboot, reboot.
Then I can start browsing the internet for things like Realplayer, XVID, M4A codecs, Flash player, a firewall, some antivirus software, and all the other things you need to keep Windows working for a year or so before it’s time for a format/reinstall. Reboot, reboot, reboot.
Aaaaaaaanyway… the damned thing wouldn’t activate, claiming that my product key had been used too often (which was news to me). So I put the Edgy CD in and installed from the LiveCD. 25 minutes later I had a working desktop running at 1280*1024, with a wireless connection auto-detected, a sound card, and everything humming along nicely. It included an office suite, Firefox 2.0 and readers for most common file types.
Then I installed automatix, told it to install all the codecs (including the MP3, MPG etc ones that everyone moans about), the security suite, the firefox plugins for common media, and Google Earth and went to bed. Job done. I will never *ever* install XP on a home machine again, it’s just too damned difficult.
December 31st, 2006 at 10:05 am
I picked on Ubuntu because the consensus seems to be that it’s the best desktop distro. If there’s a better one, holler.
Also, remember that the gold standard for ease of use is not Windows. If Windows is all you know, Linux’s problems will seem less important. Not nonexistent, mind you, but smaller than they really are.
And if anyone didn’t see the concrete examples, go back and read the article again. Sadly I don’t have time to list every single one of the dozens of mistakes I encountered, and past experience has shown me that even if I did no one would fix them. However I did list quite a few specific problems that need to be fixed.
The Linux community as a whole has been in a state of denial for years about their usability problems. Alan Whiteman’s response is just one example of this. Mostly I think the developers working on Linux and Linux apps simply don’t know what a good user interface is and don’t know either the general theory of user interface design or the specific rules that apply in a modern GUI. They’re like the auto mechanics who think a car is good enough because it can drive from New York to San Francisco, even though the steering wheel pulls to the left, the right front tire is two inches too large, it drips oil so badly it needs to be refilled with every tank of gas, and there’s this annoying rattle they can’t quite locate. Functional is not the same as usable.
Sadly no one who does know this material seems willing to work for free and no one seems willing to hire anyone who does know anything about this to work on improving the situation. I don’t think the problems are unfixable by any stretch of the imagination, but I am very pessimistic that they will be fixed.
December 31st, 2006 at 10:10 am
The package manager is a basic function of the operating system (as it’s broadly understood by end users). No non-technical end user would ever consider swapping out their package manager. A distro should pick one and stick to it.
Indeed this excessive customizability is one of Linux’s problems. Change the package manager. Change the GUI. Change the shell. Change the file system. Change the file manager. No non-geek would ever consider doing any of this. If the goal is to support end users, none of these should be presented as separate. There should be a unified experience that combines all of them into a single look-and-feel.
December 31st, 2006 at 10:50 am
I’ll grant you that the UI is inconsistent, and that it’s infuriating sometimes when you find a weird little quirk (like dialogs where the text overflows… WTF?).
All I’m saying is that your arguments get made over and over and over. They go like this: I had difficulty with [Distro X] and [Distro X] does not work like [My favourite OS] ergo Linux is not ready for the desktop.
This is just not true. I have installed Edgy on four machines in the last few months. I have had *zero* problems in doing so, and didn’t need to go outside the automated install process.
My Grandmother (bless) and my father are both running Edgy and they are having *zero* problems doing so. My grandmother particularly is a total n00b (no, Gran, the mouse has to be in contact with the desk when you move it, or it won’t work) but Ubuntu works just fine for her. It works better than the OS that was previously installed, and I know she isn’t going to get any more porn pop-ups.
There are two sweet-spots for Linux ownership. There is the hardcore techie who can happily build a system from scratch with LFS or Damn Small Linux or Gentoo. They love customising, they love choice, they love being able to hack the code of apps when they fail to meet expectations.
The other sweet-spot is the total beginner for whom a Linux system will be reliable, secure, and pretty damned obvious.
The people in the middle struggle, whether it’s because [Distro X] does not behave like [my favourite OS], or because hardware support isn’t there for their favourite gadget, or because the idea of typing an obscure command is so much scarier than messing with an obscure dialog.
They are the users on whom mainstream Linux distributions (Ubuntu being the obvious choice) must concentrate if they are to fix bug #1. That’s going to require standardising the most common application interfaces (which, after all, is one of the goals of a desktop distribution) and finding more ways to hide the obscene flexibility of the OS from people who don’t need it.
Linux, and Ubuntu specifically *is* ready for the desktop. I have desktops running it, operated by complete technophobes. The problem is that the majority of users are not ready for Linux in its current state. It *does* need to be made more user-friendly, it does need to be more consistent in its presentation of common applications, but the current install/setup process is hands-down the easiest, least stressful OS install I have ever encountered, and the day-to-day running of the machine (browsing, listening to music, word processing) works exactly like it does on any other OS – so where’s the beef?
Install Edgy, run Automatix, and – in the majority of cases – you have a fully featured machine in under an hour with no searching for drivers or applications, where Things Just Work: Word processing, Email, IM, Music, Video, Firefox 2.0 with Flash, Realplayer, and MPlayer, common fonts, 3D drivers, the lot – with no manual config. Beat that.
December 31st, 2006 at 1:12 pm
Um, go back and read the start of the article. I tried Edgy. It was completely non-functional. 0 out of 10. This is on a pretty old white box PC that’s been well supported by Linux for several prior releases of Ubuntu and other distros (aside from the widescreen monitor issues). If I had limited myself to reviewing Edgy the review would have been a lot shorter and a lot nastier than it was.
You also need to disabuse yourself of the notion that it’s only similarity with familiar operating systems that matters. It’s not, though that may help. There are general principles of user interface design that extend beyond one platform and indeed beyond computers in general. Developers ignore these rules at the peril of their users. Sadly Ubuntu seems to have ignored quite a few of them, as do Gnome and a lot of other parts of the Linux ecosystem. Indeed in many ways Gnome has been making things worse for a few years now after they lost the paid staff at Eazel that actually understood the psychology of human computer interaction.
December 31st, 2006 at 3:16 pm
You can’t use the Mac’s Software Update to transition from 10.2 (Jaguar) to 10.4 (Tiger). You have to buy a CD or DVD of Tiger.
With the physical CD or DVD, you can then upgrade your OS, preserving all configuration options, such as users, groups, network configs, disk partitions, etc. This uses the long-standing “Archive and Install” option that so few people (even Mac users) seem to know about.
Then you’d use Software Update to get the Combo updater from Apple’s update servers.
More generally, I see a LOT of posts that generalize a specific problem to a more general and conclusive problem. Since there are anecdotes on BOTH SIDES, the only general conclusion a rational person can draw is that reality lies somewhere along a continuum, rather than at a single point. That is, there are hardware configs that consistently exhibit zero failures, and there are configs that consistently exhibit zero success (such as ERH’s white box). I suggest that instead of citing the software as the problem, one characterize the hardware in detail, and find the cause of the problem (if possible).
On Mac OS X, System Profiler displays many details of the hardware, and can write an XML file with that info. This can be very helpful in finding out why one Mac works fine yet another won’t, given the exact same version of the OS. If there isn’t a program like System Profiler for Linux, maybe there should be.
December 31st, 2006 at 4:24 pm
Another thing I just thought of, regarding a system profiler.
There should be a web service that can match hardware configs, described in XML, to distros. Then ERH could run the system profiler on his known-good Ubuntu config, and ask a web service whether Edgy Eft or any other distro will work with that hardware.
This would only be useful if you have a machine that works well enough to get a live network connection, but there’s also a fallback feature: files. If you don’t have a network connection, you can write the XML to a file, move it to another machine with a network connection (even a different OS, like Windows or Mac OS X), then send the XML file to the web service and get a reply.
If you can settle on uniform XML tags, it should even work across distros, so a Ubuntu user can query the Red Hat service to see what’s compatible with the hardware config.
None of this is really a new idea. It may even be possible to integrate it into a package manager. Or since I’m not a Linux user, it may already exist in some package manager. Or if it doesn’t exist, it may be straightforward to add it there, since package managers are supposed to manage versions, releases, updates, etc., and this is really nothing more than “versioning the hardware”.
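A minimal sketch of the matching step in Python; the XML shape loosely imitates what a hardware profiler might emit, and the compatibility table is entirely made up for illustration:

```python
# Sketch of matching a hardware profile (XML) against per-release
# known-good device lists. Tag names mimic `lshw -xml` output loosely;
# the sample profile and the COMPATIBILITY table are invented examples.
import xml.etree.ElementTree as ET

SAMPLE_PROFILE = """
<node class="system">
  <node class="display"><product>Rage 128</product></node>
  <node class="network"><product>RTL-8139</product></node>
</node>
"""

# Hypothetical known-good hardware per release (not real data).
COMPATIBILITY = {
    "ubuntu-6.06": {"Rage 128", "RTL-8139"},
    "ubuntu-6.10": {"RTL-8139"},
}

def supported_releases(profile_xml, table):
    """Return releases whose known-good list covers every device in the profile."""
    root = ET.fromstring(profile_xml)
    devices = {p.text for p in root.iter("product")}
    return sorted(rel for rel, good in table.items() if devices <= good)

if __name__ == "__main__":
    print(supported_releases(SAMPLE_PROFILE, COMPATIBILITY))
```

The real service would of course need a far richer device taxonomy (driver versions, bus IDs, quirks), but the core query is just this subset test.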
December 31st, 2006 at 5:12 pm
Well, I’ve been using Ubuntu for a while now. It installs better and faster than XP. Hardware detection is better. Even Vista cannot find my SATA drive. (I have to switch to ‘compatible’ mode in the BIOS for it to find my drive.) Instant multimedia is troublesome though. I really had to visit a lot of forums to make .wmv and .mov formats work in Firefox.
Installing software has much improved with apt-get. Though I must agree that Linux is not yet ready for mainstream. But it will get there.
Elliotte, have you tried Xandros 3.02 or 4.0? I guess they would have fixed all your problems. Ubuntu has some rough edges compared to Xandros. Not that I like KDE, but that distro is probably closest to `prime time`.
December 31st, 2006 at 6:50 pm
I have not tried Xandros. I’ll keep it in mind for the next time I feel like reinstalling the OS. Perhaps it’s an improvement.
December 31st, 2006 at 8:07 pm
I wonder if Mr. Harold is counting on the general audience to be a Windows or Mac user, and completely naive in order to buy his arguments.
“… I picked on Ubuntu because the consensus seems to be that it’s the best desktop distro. If there’s a better one, holler.”
Whether one desktop is better than another is a purely subjective decision. I think that KDE under SuSE is better than Gnome under Debian, which is the basis of Ubuntu. Some Linux experts have voiced a preference for KDE over Gnome. Perhaps you should have tried Kubuntu.
“… The Linux community as a whole has been in a state of denial for years about their usability problems. Alan Whiteman’s response is just one example of this.”
Of course, this is just nonsense. If anything, it seems that Mr. Harold has put on some blinders and is very selective on what to opine. He ranted on the apparently disastrous state a popular distro was found, only to conclude that Linux is not “ready”. Absurd, completely absurd.
Nevertheless, this brings up an interesting fact. It seems that Microsoft has been releasing operating systems in an ‘unusable’ state since Windows debuted; this has not stopped the general consumer audience from enduring over 20 years of really, really bad programming.
“… Mostly I think the developers working on Linux and Linux apps simply don’t know what a good user interface is …”
The Linux community as a whole has done marvelous things with desktop development that are nothing short of astounding; in fact, they are taking the cutting edge away from Apple. Check this and other examples on YouTube: http://www.youtube.com/watch?v=f0v2PWsYrFM. I guess a small video is worth a thousand words.
“… Functional is not the same as usable.”
Again, this is subjective. My desktop is both functional and usable all at the same time, at least for *this* non-programmer user.
January 1st, 2007 at 11:04 am
After providing informal tech support for family using Ubuntu Linux, XP, and MacOS X, I’m not sure that any of them is ready for the desktop, because none of them is trouble-free.
OS X is great as long as you stick to the preinstalled programmes (of which only GarageBand seems genuinely useful), but it’s a nightmare as soon as you try to do anything slightly unusual. I’ve also had four relatives (in different places, with different wireless routers and upstream ISPs) all have severe problems getting WiFi working with OS X — it’s fine as long as you’re using open networks, but it gets very hard once you’re using any kind of WiFi security. Of the three, OS X also seems to crash and lock up most often, which is surprising since it’s based on FreeBSD, which is rock solid.
XP is harder to install than Ubuntu — it took a few tries, and a lot of hunting around the Web for device drivers not bundled with XP — but once I finally got it running, it seemed to be a lot more stable than any of the previous Windows incarnations, and (unlike with MacOS and Linux) third-party software that is not preinstalled usually just works. I haven’t seen many crashes or lockups, either at the OS or application level, but non-tech users find it very difficult to understand how the OS works, and for a consumer-level OS, it’s surprisingly hard to do easy stuff like find and install drivers. DLL hell doesn’t seem as bad as it used to be.
For me, on three separate machines, Ubuntu Linux was hands-down the easiest to install (MacOS comes preinstalled, so I cannot compare). On two notebooks and a desktop system, sound and video just worked properly without any intervention (it took me hours and lots of Googling to get to the same point with XP), and getting wireless up was as easy as on XP (much easier than MacOS), possibly because I was lucky in the choice of wireless card. It’s an old canard that Linux doesn’t crash much, but I’d say it’s about on a par with XP for OS-level crashes (i.e. almost never), though much better than OS X. It’s still hard with Ubuntu to install software or drivers that aren’t available via APT — life can get complicated fast. Also, every once in a while, APT gets messed up, and it’s necessary to manually restore old versions of some packages to resolve a conflict, something that a casual user could not reasonably know how to do (exactly the same thing happens in XP with DLL versions, so there’s no advantage or disadvantage there). With MacOS, you just buy a new version of OS X and wipe out everything, so it’s hardly reasonable to claim that it’s better for upgrades.
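For what it’s worth, restoring an older package version by hand looks roughly like this; the package name and version string are placeholders, not real ones:

```sh
# See which versions APT knows about, then force a specific one.
# "somepackage" and the version are made-up examples.
apt-cache policy somepackage
sudo apt-get install somepackage=1.0.2-1ubuntu1
```

Easy once you know it, but no casual user would ever guess the `pkg=version` syntax.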
The main problem with Ubuntu Linux, though, is that kids and teens like to have the same apps as their friends, not just equivalent apps. My younger daughter can use MSN over Web Messenger or one of the Linux clients, but she prefers to use the Windows client on XP with the extra features so that her setup is exactly the same as her friends’. My older daughter is very skilled with Audacity for sound editing, but she still uses GarageBand on a Mac when she has a choice. Personally, I use XP when I have to update the database in my Garmin Aviation GPS — I could try to make it work under Wine, I suppose, but now that there’s an XP computer upstairs, why bother?
Both of my kids are comfortable with Firefox and OpenOffice, though, no matter what the OS.
January 2nd, 2007 at 6:15 am
In the end all those articles are subjective. And as a subjective article, the content might be up to 100% true, but it can’t be generalized under such a wrong title as “Linux Still Not Ready for the Desktop”. Better would be: “Linux Still Not Ready for my Desktop”. There are thousands (millions?) of Linux users and most of them use Linux successfully with one of the desktops for it. (Otherwise they wouldn’t use Linux.)
Having tried a lot of Linux distributions over the years, I find that the latest Ubuntu, SuSE and other distributions made a major jump in usability. It’s not perfect though, and might not have reached the integrity and uniformity of Windows (which was built with GUI integration in mind) or the “nearly perfect” Mac OS. I would say that you can run into problems on all of these OSes when you don’t stay with the standard software packages supplied. It’s just like your car: if it’s not an off-roader, you can easily break it by leaving the street. And nobody really complains about this.
January 2nd, 2007 at 9:28 am
I wholeheartedly agree: Linux is not ready for the desktop for users who are not willing or able to work around its idiosyncratic properties!
Denial of such problems won’t help, Al.
I just wish Linux desktop programmers would read the Common User Access guidelines to get an idea of what is wrong with Linux.
See here: http://en.wikipedia.org/wiki/Common_User_Access
January 2nd, 2007 at 11:36 am
It’s not all in the eye of the beholder. It’s not all subjective. While some details may be subjective, or based only on prior experience and familiarity with existing interfaces, many are not. Human computer interaction is a well studied and well understood (among HCI professionals) science based on hypothesis, experiment, and fact. It is derived from certain fundamental principles of how humans perceive and interact with the external world, and owes a great deal to experimental psychology.
There is a large literature in this field which is sadly little known and little taught outside a very small professional community. Indeed when scientists such as Dr. Jakob Nielsen attempt to popularize this work for a broader audience, they are routinely pilloried by neophytes who think their subjective opinions are just as valid as the principles derived from observation, measurement, and experiment.
January 2nd, 2007 at 3:22 pm
The ease of installation of Linux is, IMHO, irrelevant for most non-geek users. They can’t easily install any OS because they don’t have the background knowledge to configure it. Hence they use whatever is available preinstalled: Windows or MacOS. Linux becomes “ready for the desktop” no earlier than the point where it is commonly pre-installed on a computer bought in a retail outlet.
Consider a machine pre-configured with current Linux, Ubuntu if you will, with all the right drivers and libraries sorted out by the hardware vendor. Is anything then still broken? Do the UI design faults noted by ERH in this blog post make it unusable for a typical user? Those are the questions that need answers, and we can’t do the experiment until vendors start shipping such systems.
January 4th, 2007 at 1:28 am
I think there is so much variability in hardware that it is very difficult to expect any operating system to work out of the box with arbitrary hardware.
Especially with “uncertified” hardware for PCs, and hardware whose makers don’t test with Linux. Macs of course had things much easier by using their own buses and plugs, so that manufacturers could only work with them. And the movement toward “smart plugs” (such as USB, where manufacturers have a range of generic IDs available) is a good one: even in the worst case, a simple functional driver is already available.
But the Mac “toaster” mentality, where you expect to plug it in and turn it on, is a beautiful myth or memory now. Nowadays you expect nothing but pain and difficulty. PCs aren’t appliances like a toaster; they are jets built on top of biplanes. Windows isn’t ready for the desktop by the criteria in this article, either.
My Dad and his mates bought computers together. I got my Dad to buy a Mac. One year later, they were all happy. Two years later, Dad was grumbling that his friends all had whizzbang new applications that were so cheap. Three years later, all his friends’ PCs had failed, whether from hardware problems, software issues, or viruses. Ten years later, Dad is still happy with his Macs.
January 5th, 2007 at 7:26 am
There are probably a brazilian[1] usability horror stories out there. I have a meta-usability horror story. Someone (who should be but shall not be embarrassed) working at Nokia on the 770 (a pocketable Linux box) told me that a misfeature could not be fixed because the specification would first have to be fixed. A different misfeature could not be fixed in the 770 because it was present across Nokia’s entire product line, usability by historicity being a higher goal than usability by actually being usable. “We can’t change that because we’ve always done it that way and all the current users know how to use it.”
[1] An unknowably large number, slowly increasing without limit.
January 10th, 2007 at 10:59 am
Epigrammatic summary:
1. Choice is good. Capitalists are for it, monopolists are against it (not that most capitalists wouldn’t be monopolists if they could).
2. Linux UI design historically sucks. Denial doesn’t help. Neither the technical nor the political problems will be fixed overnight.
3. Most people don’t install anything, so ease of installation is a red herring.
4. The random-hardware problem is a Big Deal. Microsoft solves it by pushing the work onto the hardware companies. Apple solves it by only running on Apple-branded hardware. Linux and friends have to do it the hard way.
5. Nobody’s going to preinstall a distro that doesn’t have hot multimedia stuff (DVD players, iTunes, etc. etc.), and no distro can carry that until they have valid licenses, which cost $$$$$. This problem is being worked on.
January 15th, 2007 at 1:11 pm
I agree with Mr. Harold. If all software doesn’t look and feel the same as it does on the Mac, then the software is _clearly_ not ready for the end user. The millions of people currently installing and running Linux on their PCs are _clearly_ in denial about how bad their software is. The solution for them all is to buy a proprietary operating system and expensive new hardware from Apple. Then they too can be tasteful Unix geeks.
January 26th, 2007 at 3:34 pm
I’m impressed with how the author only had to evaluate one distribution (Ubuntu) in order for him to feel qualified to generalize that all Linux distributions are “not ready for an end user yet”.
Equally impressive is how the author evaluates only one desktop environment (Gnome) and then concludes that he sees “no evidence that anyone is” making effort to put “Linux in a usable state”.
I wonder if he uses this same thorough approach in researching other technology for the computer books that he writes?
January 31st, 2007 at 7:00 am
“setting up a shared folder with Samba requires a non-standard file dialog that confuses opening and adding a file.”
Actually, you can share a folder by right-clicking the folder you want to share and selecting Share from the context menu that appears 🙂
“I had to reboot before it would realize it was reconnected, and this is on a wired desktop. I can only imagine how it behaves on a laptop with a spotty wireless connection.”
My laptop doesn’t have those problems 🙂
February 9th, 2007 at 4:48 pm
As many others have pointed out, if Windows XP were evaluated by the same standards the author applied to Ubuntu, many people would have to declare that Windows XP is not ready for the desktop either. Clearly, ready or not, Windows XP is used by millions of people every day.
Here are some things that struck me as wrong with your article:
“Synaptic Package Manager” is confusing, but “Outlook Express” is clear? How about “Safari”?
Can’t perform an online upgrade from 5.04 to 5.10? Try an online upgrade from Windows 95 to 98 or ME! Heck, try an internet upgrade from Windows XP to Vista, how does that work out for you?
Did you actually claim that Windows comes with SSHD and AppleTalk installed, let alone ready to use out of the box?
As for multimedia, if that is something you need to work out of the box, buy (yes buy, those codecs aren’t free) a distro that includes them. Ubuntu would love to ship with MP3 and WMV support enabled by default, but legally they can’t do that. They would love to ship with first-class graphics card drivers and wireless card drivers, but the people making them don’t always cooperate.
So the bottom line is that Linux is ready for the desktop, maybe not every desktop, but then again neither is Windows or Mac OS X. I’m guessing you didn’t try to install OS X on that old PC, or try installing Windows XP on your old G5, did you?
February 10th, 2007 at 9:41 am
Linux works for me. SUSE Linux takes care of my desktop needs.
I would love to have you (Elliotte Rusty Harold) and Bruce Eckel give openSUSE 10.2 a try.
February 10th, 2007 at 7:04 pm
Michael, you’re rationalizing and that’s a bad sign. The question of whether I can install OSs on unsupported hardware is irrelevant. In my experience, Linux doesn’t necessarily install on supported hardware. The concern about online upgrades for Windows is equally specious. The fact is, I can upgrade multiple old versions of Windows and Mac OS to the latest versions and not lose data. I can’t do that with Linux, online or otherwise, even within the same distro, much less between distros.
Oh, yes, out of the box I absolutely can share files between my Windows PCs and my Macs, no extra software or installs necessary.
As for multimedia, the reasons why Ubuntu chooses not to ship various software are irrelevant. The fact is they don’t. That’s what matters to end users.
These problems are very, very real, and they’re just the tip of the iceberg. Until the Linux folks admit they have a problem they can’t begin to fix them, and their market share will be limited to the small percentage of systems that are servers. 🙁
February 12th, 2007 at 9:58 am
Elliotte,
You seem to have missed my point completely, so I’ll try to clarify it:
Most of the hardware complaints against Linux, things like wireless cards, are about hardware that is not supported by Linux. People then expand that to claim that Linux has poor hardware support. I was trying to point out, and perhaps being too subtle, that Linux supports more hardware (in quantity) than either Windows or Mac OS X. There are some things, like certain wireless and video chipsets, where the manufacturer has not given anything to allow Linux to support their hardware, and I don’t know what you expect Linux to do about that.
I was also, again perhaps too subtly, trying to point out that your irritation that Canonical doesn’t support online upgrades from 5.04 to 5.10 is an illogical requirement, since neither Windows nor Mac OS X supports any online upgrades, even for releases less than two years old. As for finding ISOs, entering “Ubuntu 5.10 ISO” in Google gave me many options to choose from. And for what it’s worth, last weekend I upgraded from 6.10 (Edgy) to 7.04 (Feisty), making it my third upgrade, without issue and without losing any data. So one obviously can upgrade to the latest (even beta) versions of Ubuntu without a problem.
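For context, a release-to-release Ubuntu upgrade of that era could be driven from the command line as well as through the GUI update manager. A rough sketch (commands as commonly documented for Ubuntu at the time; the sed edit assumes a stock sources.list that only mentions the old release name):

```shell
# Sketch of an Edgy (6.10) -> Feisty (7.04) command-line upgrade.
# Point apt at the new release's repositories...
sudo sed -i 's/edgy/feisty/g' /etc/apt/sources.list
# ...then refresh the package index and pull in the new versions.
sudo apt-get update
sudo apt-get dist-upgrade
```

A dist-upgrade replaces system packages in place, which is why user data in /home survives the process.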
Out of the box, I’m sure you can share files between your Windows machine and Mac, but I’ll bet it’s using SMB, right? Well, Ubuntu supports file sharing over SMB too, among other protocols. Out of the box, Windows does not support SSH, and to my knowledge it does not have AppleTalk running by default either, both of which were part of your initial claim.
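Exporting a folder over SMB from Ubuntu of that era came down to installing Samba and adding a share stanza; a minimal sketch (the share name and path are placeholders, and the init.d invocation matches the pre-upstart service layout):

```shell
# Sketch: install the Samba server and export one folder over SMB.
sudo apt-get install samba
# Append a minimal guest-accessible share definition
# (assumes the directory /home/user/shared already exists).
cat <<'EOF' | sudo tee -a /etc/samba/smb.conf
[shared]
   path = /home/user/shared
   read only = no
   guest ok = yes
EOF
# Restart the daemon so it picks up the new share.
sudo /etc/init.d/samba restart
```

The Windows or Mac client then sees the share by browsing to the Ubuntu machine's hostname on the local network.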
As for multimedia, again, your complaint is illogical. You want expensive software to be pre-installed on a free distro, and can’t understand why people think you’re wrong to want that. If you want software that costs money, you have to spend money on it, and there are several Linux distros that will let you pay them for that software, such as Linspire and Xandros. But if you pick a free distro, you can’t rationally complain about not having expensive software included.
The “Linux folks” know there are problems, some real, some perceived, and they are working to fix them. Just recently the kernel maintainers offered free Linux driver development to any hardware company willing to give them any information they can use to create those drivers. Microsoft makes you pay for the privilege of having your driver certified for Windows, but Linux will write it for you, maintain it for you, and let you advertise it as supported, for FREE! What more can they do?

The KDE and GNOME “folks” are revising their human interface guidelines, even cooperating through freedesktop.org to make guidelines that span both desktops. The next release of Ubuntu (7.04) will tell you what codec you need to play a media file, let you automatically download and install it if it is available in their repos, then continue playing your file. If you see “no evidence” of effort to make Linux a better desktop OS, you aren’t even trying to look.
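The codec situation the commenter describes could also be handled manually: from Ubuntu 7.04 onward a single metapackage in the multiverse repository pulled in the commonly needed restricted formats. A sketch (package name as documented for that release; legality of the codecs varies by jurisdiction):

```shell
# Sketch: install MP3, WMV, Flash and other restricted-format
# support on Ubuntu 7.04 via the multiverse metapackage.
sudo apt-get install ubuntu-restricted-extras
```

This is exactly the kind of one-line fix a preinstalling vendor could ship already applied, which loops back to the earlier point about retail preinstallation.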
September 16th, 2007 at 5:30 pm
“The fact is, I can upgrade multiple old versions of Windows and Mac OS to the latest versions and not lose data. I can’t do that with Linux, online or otherwise, even within the same distro, much less between distros.”
Gee, I went from Ubuntu 6.10 to 7.04 with no problems. Did it over the internet without reinstalling. No data loss. What the HELL are you talking about?