
Saturday, August 25, 2012

Windows 8 and taste

One thing about most geeks that I've noticed: They have horrible taste. You can look at their homes, their clothes, their cars, the trinkets scattered about their cubicles, it's all a horrible mishmash of ugly. The way that Apple addressed this was via the Stalinesque concept of the Chief Designer. You may laugh at that description, but the Soviet-era Soyuz rocket and space capsule, designed under the supervision of Chief Designer Sergei Korolev, are still flying today fifty years after their design process began because he had exactly the same kind of qualifications as Apple's Chief Designer -- good engineering taste that balanced simplicity, cost, performance, and capability into a pleasing whole.

What brings this to mind is Windows 8, which I'm using to type this while eval'ing the RTM product. I'm not disclosing any NDA stuff here; it's pretty much the same product you downloaded earlier as the "Consumer Preview" with a few pieces of missing functionality filled in (and undoubtedly many bugs fixed). Windows 8 is Microsoft's attempt to re-invent the user interface, but it fails for two primary reasons: a lack of courage, and a lack of that chief designer.

The lack of courage part is where Microsoft flinched at the notion of completely re-inventing the desktop. As a result, they have the "classic" desktop available by hitting a button on the "Modern" desktop. The end result is a bizarre mishmash of two different desktop environments in one, and twice as much for users to learn, because the "Classic" desktop environment doesn't work *exactly* like the well-known Windows 7 desktop environment, while the "Modern" desktop... well, it's entirely different. Twice as much for end users to learn is user environment fail, period.

The lack of that chief designer, however, shows even more in the design of the "Modern" desktop. A good design is clean, looks simple (even if it isn't), everything's laid out in an obvious manner, there's a limited number of things for end users to learn in order to be productive, and, for lack of a better word, it is tasteful. It doesn't look like a mishmash of unrelated ideas from multiple independent teams all smashed together into one product.

That, however, doesn't describe the "Modern" desktop at all. One of the things I noted about Gnome 3 was that you basically had to know one gesture -- moving your mouse pointer to the top left of the screen (or touching the top left of the screen on a touchscreen) -- to make it useful to you. Everything else is pretty obvious: touch an icon, or touch and drag an icon (or click and click-and-drag with a mouse), or scroll up and down using the mouse wheel or two fingers. With the "Modern" desktop, every single corner of the screen does something -- and does something *different* (with the exception of the right-hand corners, which do the *same* thing). Furthermore, moving to a corner, waiting for the hover timeout, then moving your mouse up and down does something different again. And right-clicking does something *more* different still. The confusing number of things you can do -- indeed, need to know how to do to make the environment useful -- is well past the three things you need to know to use Gnome 3.

In essence, it's as if a bunch of geeks got together and decided to take every idea from every touchscreen environment ever created anywhere, and put them all into the same user interface. It's as if every geek critic of Gnome 3's tasteful design got together and designed their perfect touchscreen environment with every feature they could think of. It's as if Larry Wall designed the thing. Folks, Perl is many things, but clean and easy to use are not among them -- it's an ugly, nasty piece of work that will spit in your eye if you look at it wrong, just like the camel on the cover of the definitive book on the language. Like said camel, it also happens to be very useful (which is why I wrote the virtualization management infrastructure for our virtualized product line in Perl -- it was the most reasonable way to parse the output of all the low-level utilities that the various virtualization systems use for their low-level management), but nobody has ever suggested that end users be given Perl as their user interface to their computers.

So the question is, will Windows 8 succeed? Well, define "success". The majority of personal computers in the world next year will ship with Windows 8 pre-installed. And because everything in post-Gates Microsoft is an API, and Microsoft is quite open with their APIs (Apple, not Microsoft, is the "Do Evil" company in the post-Gates era), sooner or later someone is going to come up with a means to tame this mess. But I have to say that Windows 8 is, in the end, a disappointment to me. Microsoft had an opportunity to re-define how personal computers worked, and they have all the pieces needed in Windows 8 to do so. They just needed a tasteful Chief Designer with the power to impose order and taste upon this mess -- and, alas, it appears they have no Jony Ive or Sergei Korolev to do so.

-ELG

Monday, July 18, 2011

Re-inventing the Linux desktop

My opinion of the Linux desktop is pretty much well known by now -- I believe I mentioned "Windows 95 as re-implemented by a Soviet Union that never fell"? A.k.a. clunky, incoherent, and obsolete? But recently two distributions have been released with a re-imagining of what the Linux desktop should look like. So today I installed Fedora 15 and Ubuntu 11.04 into VirtualBox on my MacBook Pro (into VirtualBox because it's currently the only virtualization environment with 3D support for Linux guests), and examined the two new systems -- Gnome 3's Gnome Shell, and Ubuntu's Unity.

Gnome 3 was the first one I looked at, and by far the most revolutionary. Everything in the new Gnome Shell is set up to be doable with a couple of mouse swishes and one or two clicks. A swish to the upper left corner of the screen does three things -- does an Expose'-like scale of your windows so you can choose which window you want to activate by just clicking on it, sucks in a dock from the left side of the screen, and sucks in a screen list from the right side of the screen. You can grab a window and move it to the next screen in the screen list (there's always one blank screen at the end of the list -- sort of like the iPhone's applications screens), or you can click on the word "Applications" towards the upper left of your screen and see a list of applications in a format somewhat like an iPhone's application chooser. It is all easier to use than it sounds -- it really does make the whole thing swish swish swish point and click easy.

If you have applets running, like gDesklets, you can get to them by going to the bottom right corner of the screen. A sort of fuzzy menu bar then rises up from the bottom.

My general conclusion: Gnome 3 is currently incomplete -- it's barely configurable at all, for example -- but it puts together the best ideas in UIs that have come down the pike over the past few years into an easy-to-use whole. My workflow falls naturally out of the way Gnome 3 works. If I want a screen for my browser windows, for example: swish to top left, swish to the next screen on the right and click it, swish to the left and select my browser on the dock, and voila, it pops open on the new screen and *another* blank screen is created for the *next* thing I want to do. Close all the windows on a screen and it goes away, so I always have just the screens I need for my workflow -- no more, no less. I spent some time today doing software development with this system, and the usability compared to traditional Gnome is astounding.

Next up was Ubuntu's Unity. That, alas, turns out to be a disappointment. While Gnome 3 re-imagined the world to the point where some of the rumored Windows 8 functionality is going to be a clone of already-existing Gnome 3 functionality (like the iPhone-like application chooser), Unity simply attempts to clone Mac OS X without seeming too obvious about it. The problem is that simply moving the dock to the left side and modifying GTK+ to move application menus to the top like MacOS (except active and context-sensitive, can't forget that!) isn't enough to make a quantum leap in functionality. Frankly, it ends up looking like a bit of a mess. Gnome 3 re-imagined, Unity cloned, and like most clones, the clone isn't the equal of the original.

So: Does this mean the Linux desktop is usable now? Am I going to abandon my MacBook Pro and run a native Linux desktop again? Uhm... not hardly. I have a major investment in MacOS professional music software for which there is no Linux equivalent, and I still have significant difficulties viewing multimedia-based sites with Linux. Part of that is Microsoft and Apple's fault -- they release all these tools for "free" that produce (and view) multimedia content in their own proprietary formats like QuickTime and WMA, and don't release the viewing tools for Linux. I also can't read my corporate email using Linux -- neither of Evolution's Exchange plugins will handle proxied Exchange servers. That's sort of important too. Still, the fact that there is now a Linux UI which is actually innovative rather than a crude clone of other people's ideas is a sea change in an OS development process which has all too often ignored the desktop in favor of server optimizations. I don't know what happens next, but I suspect it'll be an improvement. Of course, given where the Linux UI started -- as an incoherent mess (inherited from MIT X11) -- that's sort of faint praise. So it goes.

-ELG

Tuesday, March 8, 2011

Virtualization solution #2: VirtualBox

So the next piece of virtualization software to try out was VirtualBox. Remember, my Linux is installed on an entirely separate hard drive pair that I mapped into VMware Player as two drives, then installed Scientific Linux 6 onto one of the RAID arrays previously configured on that drive pair for that purpose (a 20GB RAID1 pair). 'grub' handles that situation just fine: it skips the RAID header, loads the Linux kernel, and does its thing. At that point, I can see the remaining 1.8 terabytes of RAID'ed LVM volumes.

So I fire up VirtualBox and go to create a virtual machine via its GUI and... err... it doesn't allow me to assign physical drives to my VM. Which is one of those "WTF?!" moments, because the underlying QEMU that VirtualBox is based upon certainly has the *capability* to add physical drives into a virtual machine, but a bit of Googling around finds that you must do some cryptic command line hacking to make VirtualBox do it. The GUI won't do it.
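
For the record, the cryptic hacking in question looks something like this -- a sketch only, pulled from VirtualBox's documented internalcommands, where the device node, VMDK path, VM name, and controller name are all examples, and you need read/write permission on the raw device:

    # Wrap the raw physical disk in a VMDK descriptor VirtualBox can use
    VBoxManage internalcommands createrawvmdk \
        -filename ~/rawdisk-sdb.vmdk -rawdisk /dev/sdb

    # Then attach the wrapper to the VM like any other disk image
    VBoxManage storageattach "ScientificLinux6" --storagectl "SATA Controller" \
        --port 0 --device 0 --type hdd --medium ~/rawdisk-sdb.vmdk

Exactly the sort of thing the GUI ought to do for you, but doesn't.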

At that point, realizing that VMware was point and click and ridiculously easy to set up and did what I wanted it to do without said hacking, I said "F*** that" and uninstalled VirtualBox. It may be that VirtualBox could perform better than VMware Player. But my time is more valuable than any tiny increment of performance that VirtualBox could potentially give me compared to VMware Player.

Next up: I check out Windows Virtual PC and see what I can do with it. For one thing, it'll be cool to try Windows XP Mode, even if it turns out not to be useful for virtualizing Linux...

-- ELG

Friday, February 18, 2011

User interfaces and the Office 2010 problem

Microsoft has some excellent user interface guidelines. The committee that wrote these guidelines made many of them unnecessarily complex and verbose, and I disagree with quite a few of them because I feel programmers will abuse them to do things that shouldn't be done in user interfaces, but by and large, I believe that if you keep these guidelines in mind, you will have a better product.

How, then, do we explain the mess that is Microsoft Office 2010? I can't. Take the mess that is Outlook, for example. Okay, so you want to add an email account. What's the first thing that Outlook tells you to do? Err.... EXIT OUTLOOK! And that makes sense because... err... why?

Yes, I know someone's going to pipe in with technical reasons for why Microsoft does this. But my point is, the user doesn't care. If Microsoft wants to keep a separate database with email accounts so that any email client they support (Outlook, Windows Live, or any future client) can access it, fine and dandy. But don't require people to run another program just to set up their email accounts when they're already in their email program.

Then there is the horrifically slow speed of Outlook 2010. I'm running Office 2010 on a 3.07GHz quad-core Intel Core i7 Extreme 950, which is not a slow processor -- indeed, it's one of the fastest consumer processors on the market. My Windows Experience indexes are near the top for everything except disk I/O, where it's constrained by the mediocre limits of a 7200 RPM SATA drive. I also have 12 gigabytes of RAM. But Outlook 2010 is always lagging. I also have my MacBook, a dual-core Core i7 at 2.6GHz, with a 5400 RPM SATA drive. The MacBook is running Apple's Mail with the exact same accounts set up as the Windows machine. I hit the "sync" button on Outlook. I hit the "sync" button on Mail. And the winner is... the slower computer. By a landslide. Yes, Apple Mail on a MacBook blows Outlook 2010 out of the water in terms of responsiveness, displaying the count of new messages one by one in its left-side inbox tray long before Outlook 2010 finishes its clumsy sync cycle and starts deciding to display the counts.

Here's a clue: responsiveness does *not* necessarily require faster algorithms. Mail isn't doing anything extra-special compared to Outlook. Mail simply does more in parallel and returns results to the user as they come in, rather than batching them up. Responsiveness matters. Caching values so that you don't have to fetch them every time from the server, displaying results as they come in, and otherwise making it look fast is, in many cases, as good as being fast. I'm not privy to the innards of Outlook, but it appears to me that Outlook performs its sync cycle, tediously gathering new emails from each server, and then -- and only then -- decides to update counts and headers in the email window. I may be wrong about that, but I shouldn't even be guessing in the first place, because, like with Apple Mail, it should just happen.
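
In shell terms, the difference between the two approaches is roughly this -- a toy sketch, where fetch_mail and update_counts are hypothetical stand-ins for whatever the mail client actually does internally:

    # Batch style (what Outlook appears to do): the user sees nothing
    # until every server has been polled
    for server in imap1 imap2 imap3; do
        fetch_mail "$server"
    done
    update_counts

    # Incremental style (what Mail appears to do): poll all the servers
    # in parallel and update the display as each one reports in
    for server in imap1 imap2 imap3; do
        ( fetch_mail "$server" && update_counts "$server" ) &
    done
    wait

Same total work, wildly different perceived speed.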

And finally, let's look at Word 2010. Please. Word 2010 goes full-ribbon. Microsoft's new "ribbon interface" concept is a tabbed icon tray. It suffers from the same problem as Apple's Dock -- icons simply are not descriptive. They're so many hieroglyphics to me. Thing is, Apple's Dock is just one row of icons, and even that is confusing enough that I sometimes hit Preferences when I'm aiming for System Performance. Word 2010, on the other hand, is tabbed row after tabbed row of icons. The end result is utterly unusable to me because I can't find anything; any functionality I want is hidden in row after row of hieroglyphics. So I instead go into OpenOffice and create my documents there. Way to drive people to your competition, Microsoft!

There's two points here:

  1. Icons do *not* replace text. Don't even think about it. Icons are a *supplement* to text.
  2. Microsoft doesn't follow their own user interface guidelines, adding gratuitous complexity for the sake of complexity, hiding functionality in row after row of "ribbon" icons to the point of rendering their product basically unusable to anybody who doesn't live in it. Look at page 19 of their user interface document (linked above). Then look at Office 2010. Sigh, and think of what could have been if Microsoft had only followed their own guidelines.

Note that Windows 7 itself, while not an outstanding user interface, is understandable and usable. I might laugh at how you must use the search box to find anything in the Control Panel because there are so many of the freakin' things, but there is a logic to it that is easy to understand. Office 2010, on the other hand, has a logic to it... but one that reminds me of the story I presented here a few months back, about the physicist who made a logical interface for our product -- but an interface that only made sense if you were a physicist. Or as he exclaimed when we told him that our users wouldn't understand his UI: "Then your users are idiots!" Why... yes! And they're idiots with *money*, who want to get a job done, and who are willing to *pay* us if we give them something they can use to get the job done without having to be nuclear physicists.

Sadly, Microsoft's been stuck in their own hermetic world for so long with no strong leadership from the top to force all the various divisions to comply with things like, say, user interface standards, that there is just no consistency to their product line. Even Windows 7, probably the best Microsoft user interface since Windows 95 introduced their "new" user interface to the world, suffers from this syndrome a bit -- the various vintages of programs in their control panel, for example, have user interfaces that are all over the place. It's a shame, really, because Microsoft has the technology to do it right, and even the people like those who participated in writing their user interface design document. Just not the leadership.

I guess Microsoft chairman Uncle Fester figures that as long as the majority of people need to use his workstation OS because it's the standard, he really just needs to sit back and rake in the money with occasional gratuitously incompatible upgrades to force people to buy replacements for their old stuff. And he may be right. But unless your company is Microsoft, it behooves you to do as Microsoft says -- not as they do. Because your competition is going to come out with a clean, simple, easy to understand user interface for their product... and if your product looks like a mess, if it exposes technical details that customers don't want to know about rather than Just Working, well.

-ELG

Wednesday, January 26, 2011

Denial is more than a river in Egypt

A Linux fanboy asks, "Why do so few people use Linux on the desktop? After all, it's superior to Windows."

My response is: Are you joking? Surely you're joking, right?

First of all, people don't use operating systems. They use applications. And most end-user applications run on Windows. If, for example, I want to manage an ESXi server, I have to use vSphere Client to do that. And vSphere Client runs on... err... Windows. As does pretty much every other specialty application on the planet that people use, and even many non-specialty ones -- try viewing WMV or QuickTime videos on Linux. You can't do it. They simply don't work. You can (illegally) hack your Linux system to do this by copying components from Windows, but really, how many end users are going to do that? And they certainly aren't going to do it in a corporate environment, where systems are locked down to prevent end users from installing illegal software.

Secondly, as I've repeatedly pointed out, the Linux desktop environment is a mess. It's as if the Soviet Union had not fallen in the early 90's, had seen Windows 95, and decided to create a Soviet version of it. It's clunky, creaky, overly complex, makes little sense from an end user perspective, and things that are easy on the Mac or Windows are ridiculously difficult to do with the Linux desktop. For example, assigning one of the side mouse buttons to the window switcher is a simple checkbox in the Mac control panel, or in the Logitech mouse manager on Windows. To do the same on Linux, on the other hand, is an adventure.
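
To give you the flavor of that adventure, here is roughly what it takes -- a sketch, assuming a Debian/Ubuntu-style system, that the xbindkeys and xdotool packages are available, and that your side button turns out to be X11 button 8:

    # Install the two helper tools no desktop ships configured for this
    sudo apt-get install xbindkeys xdotool

    # Tell xbindkeys to fire a synthetic Alt-Tab when button 8 is pressed
    echo '"xdotool key alt+Tab"' >> ~/.xbindkeysrc
    echo '  b:8'                 >> ~/.xbindkeysrc

    # Start the daemon (and remember to add it to your session startup)
    xbindkeys

Two extra packages, a dot-file, and a daemon -- versus one checkbox on the Mac.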

So anyhow, what inspired this rant? Well, simple: I was annoyed at the slow speed of the QEMU console used by KVM and Xen. This appears to be an issue with QEMU's console driver; it does the same thing with both KVM on Fedora and Xen on OpenSUSE. QEMU's console driver screen-scrapes video memory, then stuffs it into a VNC session, but does this so ridiculously slowly as to render it basically unusable. I know it's possible to write virtualized console drivers that operate quickly -- VMware does it *over a network*, for cryin' out loud; install ESXi on one of your spare systems and point vSphere Client at it from a Windows system on your network if you disbelieve me -- but apparently nobody in the Xen or KVM communities cares, since this has been a problem for literally years. I guess it's because Xen and KVM are typically used to virtualize things like web and email servers, where nobody cares how fast the console is.

The workaround is to a) use ssh if you need CLI access to the system, and b) spawn off a VNC session if you need GUI access. For example, for my Fedora guest, I have this line in rc.local:

su -c "vncserver -geometry 1440x900 -depth 16" egreen > ~egreen/vnc-log 2>&1 &

From there on, I access it via a VNC viewer from my MacBook Pro or from the Linux host system.
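
Connecting is then a one-liner (the hostname and display number here are examples -- vncserver prints the display it actually picked when it starts):

    vncviewer fedora-guest:1    # i.e. host:display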

[flame on]
This problem has been there since the beginning of the QEMU project, but nobody cares to fix it because, well, it's "good enough" to get in and start vncserver, so what's the problem? This disregard for the end user experience is the #1 reason why Linux is a sad also-ran in the desktop operating system competition. Even netbooks have switched away from Linux to Windows, because the end-user experience with Linux is so pathetic that even Windows -- sad, pathetic WINDOWS -- does it better. It's the whole XKCD 619 problem. It's not that Linux lacks the technical capability to provide a decent end-user experience; it's that nobody *cares*, because what exists is good enough for geeks, and if you point out that end users don't like it, you get oodles of pushback from the Linux fanboys about how you don't *really* need feature X because there is workaround Y. Denial is more than a river in Egypt, folks. I've been using Linux for 15 years. My name is in the Linux source code. I've written at least half a million lines of userland code for Linux in the past 15 years, and while my kernel contributions are minor driver contributions, they're there. And my desktop is a Mac, because I find the Linux desktop environments to be so clumsy, clunky, and annoying. Q.E.D.
[flame off]

So anyhow, that's my gripe of the day. I'm sure I'd get some pushback from Linux fanboys on it if they bothered reading it, which of course they won't, because they're too busy making sure Linux runs well on 4096-core processors. All I'll point out is that refusing to admit you have a problem is a guarantee that the problem will continue. The Linux community is like a drunk that refuses to admit he has a drinking problem. Linux has a user interface problem, people -- and like the drunk who won't go to rehab because he refuses to admit he has a problem, Linux's user interface problem is not going to get fixed as long as Linux geeks continue to insist they have no problem.

-- ELG

Wednesday, September 29, 2010

And the winner is...

The iPhone 4.

The Droid X is an awesome piece of hardware. But my experiments with the various Android-based phones told me that Android is still a work in progress. Each of them had odd bits of user interface that seemed unfinished or clunky or just plain badly thought out. After my brief experience with the Google hiring process, it's pretty clear why that's true -- Google's hiring process, other than for a few superstars, has a built-in bias towards young mathematical types who recently took an algorithms course in college, and a built-in filter against those of us who've been around long enough to know what we like to do and what we're good at. Youth has its advantages, but also its disadvantages -- young, arrogant mathematical types rarely give much thought to user experience.

Which reminds me of an incident at a prior job. My office-mate and I combined had maybe 15 years of experience in the industry at the time, had actually used our product in production environments before joining the company that made it, and were now working on taking it to the next level with a new GUI and a new management infrastructure around our core data engine to make it easier to use in modern network environments. Our boss had about 20 years of experience in the industry. So my boss assigns one of these young arrogant math PhD types to mock up a user interface for our project; we gave him the basic architecture and workflow and told him, "make it easy to use." So he produces this mock-up and calls me in to take a look at it, and I scratch my head because I can't make heads or tails of what he's done -- it isn't oriented around the workflow of any site admin I've ever encountered. I call in my office-mate. He can't make heads or tails of it either. We ask this young brilliant mathematical type just out of college questions about how to do various site-admin-ish kinds of things, and he takes us on a long complicated set of procedures through a number of incomprehensible dialogs. My office-mate and I say, "This doesn't seem like an easy-to-use interface for site admins." He goes, "But it's obvious! It's simple!"

So we look at each other, think, "hmm, he seems really sure about this, maybe it's just us," and call in our boss, just telling him "You have to look at this" but not why we want him to look at it. By this time we have 35 years of experience in the room, people who've actually used the technology in question in production environments. He runs through the same thing as we did, and comes to the same conclusion. By this time the arrogant young mathematical type is in pure snit mode. How *dare* we question his impeccable user interface! I explain to him that there's 35 years of experience in this room who've actually used the technology in production environments, so if we can't make heads or tails of it our customers will be utterly lost. "Then your customers are idiots!" he shouts.

Indeed. Indeed. But they are *paying* idiots. Which is what Apple understands. The customer may not always be right, but the customer is what keeps you in business, and customers want something that doesn't require a math PhD to understand or use. And in that regard, the iPhone, despite its occasional glitches, is still the one to beat, and Android still has a lot of growing up to do. As for that math PhD guy? Eventually, after another design disagreement months later (on internals, not GUI -- we knew not to put him anywhere near the GUI by then), he stomped off and turned in his resignation because we didn't properly respect his brilliance. My manager's manager talked him into working in another area of the company on another product rather than resigning outright, and eventually he turned in his resignation from *that* position after his tastes in user interface proved equally daunting for them, for the same basic reason. Oddly enough, after a few years of seasoning elsewhere in the industry brushed the edges off his arrogance and turned it into less-obnoxious self-confidence, he turned out to be a decent engineer... just don't put him anywhere near a user interface, for cryin' out loud. Which Google would have done immediately, if the Android user interface is any indicator.

-ELG

Tuesday, June 29, 2010

That XKCD 619 feeling

A Linux advocate says:

The case for using Apple software or Microsoft Windows for something is so slim it tends to sound like techno lust (sooo shiny ...) or the machinations of a mad man (I HAVE TO HAVE IE!!!!!).

Ah yes, I'm getting that XKCD 619 feeling again, where Linux advocates say about usable user interfaces, "Why would anybody want that?!" I've been using and developing for Linux since 1995, so I'm not exactly a newbie. I have the latest Ubuntu on my big Linux development machine (the latest Fedora is similar, in my experience), and you know what the latest Ubuntu desktop with a high-end graphics card reminds me of? It's as if someone had described MacOS and Windows 7 to engineers in the old Soviet Union, and they sat down and wrote their own clunky half-a** clone based upon nothing but those descriptions. You can practically hear the clunks of heavy metal and the whirring of primitive gyroscopes as you operate it. I'm sorry, but anybody who says that Linux has the usability of MacOS or Windows 7 on the desktop is drinkin' some mighty strong kool-aid.

KDE is a bloated, incoherent, resource-hogging mess (consider it the Windows Vista of Linux desktops), and Gnome's primitive old-school Windows-95-meets-Motif style desktop is usable compared to the competition only if you have a high-end graphics card and can enable 3D Effects and their CCCP-style Expose' and Spaces clones (I say CCCP-style because they have significant usability issues compared to the real thing). And both are limited by "X", which has significant problems dealing with the modern world and hot-pluggable monitors. As in, it doesn't do it. On the day when you can plug an external monitor into your Linux laptop and have the desktop automagically just extend onto the new monitor, with no "dead spaces" and no problems dragging and dropping things between monitors, let me know. Right now, due to Xinerama basically being abandonware, the only multiple-monitor setup that works properly is nVidia's, and only with two same-sized monitors (otherwise "dead spaces" get created that can eat your windows so you can't get at them), and only if you manually set it up using nVidia's own setup program. Wow, how competitive with MacBooks (where it Just Works) or Windows 7 (one right-click to get Display Settings, then select "extends desktop" rather than "mirrors desktop" from the Displays options) is that? Err... not!

I use Linux where it is appropriate -- my web and email server is running Linux, and I'm developing on Linux for embedded servers that run Linux. But to say there's no reason to use anything other than Linux is just koolaid-drinking ... and, uhm, for the guy who says his HTC Evo 4G proves FOSS rocks, I might point out that the EVO 4G is running a proprietary closed-source "skin" (HTC's "Sense" UI). Yeah, that's "proof" alright... but maybe not of what the original commenter claimed :).

--ELG

Wednesday, June 2, 2010

Doing an installer right: Microsoft Office 2010

So out of curiosity I downloaded and installed Microsoft Office 2010 today (don't freak out about piracy, folks -- I'm a Microsoft TechNet subscriber and this copy is a quite legit eval copy). I haven't had a chance to use the software yet, but one thing I can say about the installer: Microsoft did it right.

A good installer must do the following things:

  1. It must be SIMPLE. People don't want to select lots of stuff, they just want to click one button and have it happen. With the Office 2010 installer you click the 'Install' button (or 'Upgrade' button if Office 2007 is installed on your system), it prompts you for the license key and validates it right then and there, you click 'Next', accept the license, and then it just does it. It's basically four clicks (assuming you can cut-and-paste the license key from the TechNet site, of course; if you have to type it in, there's a few keystrokes too). If your geeks or marketroids insist on all sorts of additional functionality, hide it behind a little "+" sign or something where people won't get freaked out by it. Users just want it to Just Work; they don't care about all that stuff.
  2. It must handle both upgrades and fresh installs in a clean manner. So if a prior version is already installed, it should give you the option of upgrading it and keeping your configuration settings as much as possible.
  3. If possible, it should offer to import settings from a prior program, or from a competing program, much as the latest IE will import settings from an install of Firefox or Safari.
  4. It must handle aborted installs gracefully. The installer should be idempotent -- you should be able to run it regardless of what state the system got left in, and it will just Do The Right Thing. If the process fails halfway through removing the old version of the software due to something out of your control -- like the moron behind the keyboard accidentally hitting the shutdown button when he was trying for another button -- you should be able to run the installer again and have it Do The Right Thing, knowing what part of the process was last successfully finished and continuing from there, or unwinding back to the original conditions and starting from scratch, but either way it should Just Work (see the sketch after this list).
  5. Once it starts actually installing, it should just do it, not bother you anymore, until the end of the process where, if a reboot is required, it can prompt you for that.

Microsoft has accomplished all of these things with the Office 2010 installer. And you should do the same when you write yours.
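
To make point 4 concrete, here's a minimal sketch of the resumable approach in shell -- the paths, step names, and post-install hook are all hypothetical:

    #!/bin/sh
    # Idempotent installer sketch: each step records itself on completion,
    # so a re-run after a crash resumes exactly where it left off.
    STATE=/var/lib/myapp/install-state

    done_step() { grep -qx "$1" "$STATE" 2>/dev/null; }
    mark_step() { echo "$1" >> "$STATE"; }

    mkdir -p "$(dirname "$STATE")"

    if ! done_step remove_old; then
        rm -rf /opt/myapp.old
        mark_step remove_old
    fi

    if ! done_step copy_files; then
        cp -r ./payload /opt/myapp
        mark_step copy_files
    fi

    if ! done_step register; then
        /opt/myapp/bin/register --quiet
        mark_step register
    fi

    rm -f "$STATE"    # clean finish: the next run starts from scratch

Run it, yank the power cord halfway through, run it again: it picks up at the first step that never marked itself done. That's the behavior the Office 2010 installer gets right.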

So let's state that principle one more time: End users want it to Just Work. Geeking out with oodles of settings and such might make marketroids drool with all the checkboxes they can fill in on the inevitable "competitive comparison checklist", and might make geeks drool over all the cool widgets they can play with, but for 99% of the people out there all you're doing is a) confusing them, and b) making your technical support people pull their hair out trying to deal with end users who want it to Just Work rather than have all these options to select. Especially now, with 2 terabyte hard drives selling for $130 at Fry's and most computers shipping with a minimum of 4 gigabytes of RAM, it doesn't make sense to do anything other than install the whole tamale in the default place. For 99.9% of your users, that's going to be all they want. For the other 0.1% of your users, put that little "+" if you want... just put it somewhere out of the way so someone has to *want* to click on it. And realize that in reality, nobody cares other than a few fellow geeks.

Thinking like an end user. That's what it takes to make a program that Just Works. That's something I've had to pound into my team's heads over and over and over again over the years: think like an end user, not like a geek... and Microsoft, at least, appears to have finally learned that lesson in at least this one instance. At which point I must congratulate them, because it's *hard* for geeks to think like end users, but in this one instance, at least, they managed it.

--ELG

Thursday, May 27, 2010

Configuring Compiz to emulate Spaces and Expose

If you have an OpenGL-capable video card and driver for Linux, like the GeForce 7900 GS in my big box, you can run Compiz on Ubuntu 10.04 and emulate Spaces and Expose'. So here's how to do it in Gnome (I do not recommend KDE on Ubuntu 10.04 due to some serious bugs I found):
  1. Install the latest proprietary driver via System->Administration->Hardware Drivers. Without this installed, my GeForce 7900 simply would not do 3D, and Compiz wouldn't run.
  2. Install the Compiz settings manager:
    • # apt-get install compizconfig-settings-manager simple-ccsm
  3. Identify the X11 mouse buttons you wish to use. Sorry, I identified those via trial and error (the xev sketch after this list would have saved me some of that). On my Logitech Anywhere MX mouse, here is the map from mouse action to X11 mouse button:
    • Button 1: Left click
    • Button 2: Right click
    • Button 3: The 'menu' button
    • Button 4: scroll wheel forward
    • Button 5: scroll wheel back
    • Button 6: scroll wheel left
    • Button 7: scroll wheel right
    • Button 8: backmost-arrow button (on left side of mouse)
    • Button 9: forwardmost-arrow button (on left side of mouse)
  4. Select Preferences->Appearances, select the theme you want, then click the Visual Effects tab. Select 'Normal' unless you want the wobble-on-window-move effect (which I hate because it makes it hard to accurately place the window). It should do some work, then ask you if you want to keep it. Say yes :). Then exit out of that.
  5. Select Preferences->CompizConfig Settings Manager
  6. Under Desktop, choose "Expo". This is half of the Spaces look-alike, though it doesn't *quite* work like Spaces. Under Expo Key, set it to whatever key you wish to use to enable Expo, either Windows-E (the usual setting; note that the Windows key is called 'Super' in this UI because the Compiz folks apparently hate Windows ;), or a function key of your choice. Note that I recommend using the 'Super' prefix for that function key, because otherwise you end up conflicting with applications -- normal keyboards don't have a 'fn' key like Mac keyboards do, so there's no way to get at the regular key code of a function key once the GUI has assigned it a function.
  7. Expo won't work with four workspaces all in a row, so right-click on the four workspaces in a row at the bottom right of your screen, select 'Preferences' from the resulting pop-up menu, and set it to a 2x2 grid.
  8. Now you need to set your arrow keys left/right/up/down to move you between the workspaces. Click 'Back' in the Compiz Settings UI, and select 'Desktop Wall'. Click the 'Bindings' tab. Expand the 'Move within wall' collection, and set the Move Left/Right/Up/Down keyboard shortcuts. I suggest Super-left, Super-right, Super-up, Super-down.
  9. Okay, click Back, and now let's set up our Expose' look-alike. This is under 'Windows' and is called 'Scale'.
  10. The question is, which one of these do you want to use? "Initiate Window Picker" allows "Expose" on all windows on the current workspace. Unfortunately, if you have a multi-monitor system, it puts all those windows onto the monitor where your mouse is currently residing. This gets very cluttered if you have two large monitors (I'm running a pair of 1050p monitors). In that case, I suggest using 'Initiate Window Picker for Windows on Current Output', which does it just on the monitor your mouse pointer is hovering over. I selected Mouse Button 9, the forward-arrow button on the left side of my mouse, to do this.
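
Incidentally, about the trial and error in step 3: the xev utility (part of the standard X utilities) will tell you the button numbers directly. Run it from a terminal, hover the pointer over the little window it opens, and click each button in turn; this filter is a sketch of how to cut its chatter down to the interesting lines:

    # Each click prints a line like:  state 0x0, button 9, same_screen YES
    xev | grep -A2 ButtonPress | grep button

That's how you find out that, say, your forwardmost thumb button is X11 button 9.
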
The end result: something that approximates what Expose' and Spaces would have looked like if implemented by someone in the Communist-era USSR who'd heard approximate descriptions of how they worked, but had never actually seen them. You can practically hear the clunks of heavy metal and the whirring of primitive gyroscopes as you activate them. The Expose' clone doesn't work well with multiple displays because of its bad habit of trying to collect all the windows onto the current display, thus requiring the 'on Current Output' kludge to avoid window overload. The Spaces clone requires more mouse clicks to move windows between "Spaces" (it doesn't simply exit to the workspace you just moved the window to; it stays activated until you double-click on a workspace), and doesn't have a handy icon to get to it quickly in case you're using a laptop that lacks a 9-button mouse (!). And like virtually all things dealing with Linux user interfaces, it takes a jillion-step process to get it configured and set up, with oodles of trial and error to figure out which X11 "mouse button" corresponds to which actual button on a mouse.

In short, Linux programmers still haven't figured out that users just want things to work. I've had to whack my own teams on the hands a few times when they brought back a design prototype that had oodles of screens, buttons, and widgets to tweak -- "No, I want one input box here, one submit button there; that's all the user cares about. He just wants to do the job, he doesn't want to adjust all the internal stuff you're exposing here." Complexity is the enemy of user interface usability and consistency -- something which geeks seem not to understand, given that I've had to shoot down too-complex user interfaces repeatedly over the past ten years. Sadly, there is nobody to do this for Linux (well, except what Nokia did for Maemo, which works really well at giving a consistent user interface to all Maemo apps, but Maemo is pretty specific to its particular environment) -- and thus Linux continues being an incoherent mess for the average end-user.

Still, for my purposes, it works fine at keeping my workflow working while I run a bunch of KVM virtual machines with VNC viewers into them. So I'm a bit less grumpy today. But I sure wish there was a benevolent dictator for the Linux user interface the way there is for the Linux kernel itself... it's frustrating, the technology is there, but nobody has actually turned it into a coherent whole, and the distribution vendors seem either overwhelmed by the situation or just don't care. Oh well, back to work...

-ELG

Sunday, November 1, 2009

The Windows 7 hoopla

So is Windows 7 a Mac killer? Or is Windows 7 lipstick on a pig? The answer is "No."

Let's look at the first one first. Windows 95 in many ways introduced "the" Windows user experience. It was a clean, reasonably logical user interface that was surprisingly good considering the limitations of the underlying platform -- limitations necessitated by the underlying hardware and the need for DOS compatibility until Windows-specific software arrived. It was Windows 95 that I evaluated, then went to my boss and said, "This is going to be big. We need to figure out some way to make money with it." That was a few months before a customer brought Linux to our attention (more on my reaction to that later -- it was not favorable, initially), but certainly I wasn't wrong when I said that to my boss.

It's been all downhill since then from a user interface standpoint, with each new release of Windows having yet more useless folderol to waste resources and confuse customers but no fundamental change in the UI. Windows 7 continues that tradition, adding lipstick to the pig that has become Microsoft's overly complex user interface by re-naming some things, changing text to icons on the menu bar, and somehow managing to make the Control Panel even more complex than it already was. People who claim Windows 7 could somehow be a "Mac Killer" are being ridiculous. Changing the text on the menu bar to icons does not make it a dock, and Windows 7 is even more confusing to set up and configure than its predecessors were if you're trying to integrate it into an already-existing network. I clicked away in the control panel for quite some time before finally typing "change workgroup" into the search bar. That took me to a place where I could change the workgroup (so it matched my home and office workgroup name, so my systems would appear in the network browser), but where is that located in the morass that is the Windows 7 control panel? I have absolutely no idea. I had clicked into the logical place earlier, and it changed my workgroup to "WORKGROUP", which isn't what I wanted at all.

Meantime, click on the open-apple icon and select 'System Preferences'. There's two possible places where you could set the workgroup -- 'Sharing' or 'Network'. I clicked on 'Sharing' and didn't find it, so I clicked on 'Network'. There's a button 'Advanced'; I clicked on it, saw the word 'WINS', and yep, there's my NetBIOS name and workgroup name. Three clicks once I got the Mac "control panel" up -- Network, Advanced, WINS -- to get me where I needed to be.

So from a user interface perspective, Windows 7 definitely is lipstick on a pig. It's just a bunch of lipstick on top of the original Windows 95 user interface, and like a toddler messing with mommy's lipsticks, the results are not all that great from a usability perspective. Frankly, I prefer the original, which was fast, clean, useful. However, those aren't the important changes that have been made in Windows 7. The important changes are under the hood. Windows 7, in my test, used approximately 3GB more disk space than Windows XP -- i.e., around 8GB rather than 5GB. Its memory usage for snappy performance is approximately 256MB more than Windows XP's (around 768MB vs. 512MB) if you disable Aero by switching to a 'Basic' theme, and since Aero is just lipstick, that's no big deal. In exchange you get a more secure operating system that has built-in functionality Windows XP lacks, such as the ability to record a DVD. I have not tested Windows 7 on a netbook yet, but I'm not seeing any reason why it wouldn't work -- even with Microsoft Office installed and various third-party Internet software (Firefox, Safari, Flash, etc.) I'm using only 14GB of disk space for my Windows 7 system, and even low-end netbooks come with 32GB SSD drives and 1GB of memory today.

So from that perspective, Windows 7 accomplishes what Microsoft wanted it to do -- it allows them to discontinue support for Windows XP, because it will run pretty much everywhere that XP is currently required due to the resource usage of Vista. It also accomplishes what most IT people want -- a more secure operating system that won't require them to spend half their time cleaning up after virus outbreaks, and which allows them to standardize on *one* operating system, rather than having a mishmash of various versions of Windows. On the other hand, it's pretty clear that Microsoft needs more than lipstick on a pig to clean up their user interface. They need a few iFools to lead the charge against useless UI complexity, including at least one iFool who has the status in the corporation to push back against the marketing droids and geeks who always want one...more...feature... that will never be used by actual customers but looks good on a marketing flyer or looks, like, really rad, dude. I wish them luck, because after fifteen years of putting lipstick on a pig, there's almost more lipstick than pig insofar as the Windows UI is concerned.

-- EG

Monday, September 21, 2009

iFool

My main computing platform, the one I use for all my software development (via VMware, which lets me develop for a half dozen variants of Linux), is a top-of-the-line Apple MacBook Pro 13.3". My phone is an iPhone 3G 16GB. Am I an iFool? Have I drunk the kool-aid? Shouldn't a hardcore Linux penguin be using an Android phone and running Linux on his laptop?

Well, folks, that would be fine and dandy except for one thing: I want to get things done, I don't want to spend all my time trying to get Linux drivers working on my work laptop. And for my phone, I want it to seamlessly sync music with my laptop, I want it to be able to be plugged into the iPod input of my car stereo and just work, I want it to seamlessly sync the address book and bookmarks and notes from my laptop without having to do anything special. In short, I want it to Just Work.

So for my laptop I bought a MacBook several years ago. It Just Worked. And was Unix to boot, so all my normal tools were available from the command line, and GUI versions of them (like Emacs, The Gimp, etc.) just worked. The entire developer toolkit came for free with the computer, so I could even type './configure ; make' in my project and compile it under MacOS if I wanted to do so. I chose the 13.3" form factor because it fits on an airline tray, where bigger laptops won't. I've upgraded to the latest and greatest where it gives me a real advantage -- most recently to the new aluminum MacBook Pro with the 7-hour battery and Firewire 800 and nVidia graphics chipset -- and whenever I upgrade, the new MacBook sucks in my accumulated years of files, notes, photos, and videos without a problem. It Just Works, meaning I can do my job, instead of fiddle with the technology all day long.

For my phone, I used a Palm Treo running the Palm OS for many years after it was obsolete. Yes, the email and web clients sucked. But it synced my addresses, notes, ToDo lists, and calendars without a problem. The problem came when I was stuck in an airport waiting room needing to check on what was up with my itinerary. The hoary old Palm web browser just completely choked on the airline's web site. So what to do?

Now, I have an HTC Wizard that I used on T-Mobile some years ago. It runs Windows Mobile 5. WM5 will not sync with my Mac without extra-cost software, and my experience with WM5 was that it was technically astute, but had the user interface from hell. The Treo's user interface was simple, plain, and easy to use one-handed, the WM5 user interface really wanted three hands -- one to hold the phone sideways, and two to thumb on the thumb-board.

Then I looked at Android. And Android, alas, reminded me sorely of Windows Mobile 5. It is technically astute, but its user interface was similarly designed by geeks for geeks, rather than designed for simplicity and ease of use. As with WM5, getting anything done requires two hands, and the user interface is complex and, well, ugly.

So I arrived at the iPhone by default. It syncs seamlessly with my MacBook, and the user interface, while more complex than that of my old Treo, is fairly simple to use to do ordinary things. When I plunk it into its WindowSeat in my Jeep I can do the most common operations with one finger of one hand (mostly selecting an iTunes playlist since I'm using it for music while hooked into my car stereo at that point). No fumbling with a stylus, no poking at tiny indecipherable little pictures with the point of said stylus, everything is big and bold and easy to reach out and touch.

So what are the lessons here? First of all, if you're designing a user interface, complexity is the enemy. The iPhone was blasted for its simple -- some say simplistic -- user interface, much as PalmOS was blasted for its user interface. Yet both manage to make a virtue of simplicity that makes their devices much easier to use. Geeks love adding complexity to products, as do product managers who are looking to satisfy marketing checklists. It is your job as a software development manager to push back on the continual drive for user interface complexity. I once had one of my engineers give me a design proposal that was five web pages' worth of highly technical stuff, all of which was useful to geeks but which would simply be gibberish to our user base. I sent it back to him with all five pages X'ed out and, on the launch page which led into that long series, I drew a selection box and a button beside it. The user selected the previously-downloaded configuration to upload, then the program just did what it was supposed to do. We probably missed out on some marketing checkboxes somewhere, but the end product was much more usable, because most users simply do not want, need, or care about all the technical details of what exactly is supposed to happen behind the scenes. They just want the computer to do the right thing. They want it to Just Work.

The second lesson is that lack of a marketing checkbox often means nothing in real life. The iPhone lacked cut-and-paste for the first two years of its life. This was a missing marketing checkbox, but it didn't hurt the iPhone's sales any. The iPhone became the best-selling smartphone in the USA despite not having a feature that everybody claimed was "necessary". The simplicity of use that Apple got from not having that feature was far more attractive to customers than the additional feature would have been, and when Apple finally developed a way to add cut-and-paste that would not impact the simplicity of the product, it was just icing on an already tasty cake as far as most iPhone customers were concerned.

The final lesson is that people just want to get work done. They want to get work done without having to fight the technology all the time, and without having to look at the internals of the technology to figure out what's wrong and how to fix it. I'm very good at debugging our products. When there's a defect that nobody else can figure out, I'm the guy who looks at it, goes and puts a few printf statements in the right place (source code debuggers are for wimps, heh!), says "Ah, I see," and then tells the appropriate programmer what went wrong and why and how to fix it. But while I'm doing that, I don't want to have to be debugging my laptop too. Both Windows and Linux force me to fix OS stuff all the time rather than actually do my work. MacOS just works. And that should be your product too -- the customer should install it, maybe type in a few setup options, and then it Just Works.

So am I an iFool? Well, yes. And you should be too. By which I do not mean that you should go out and buy an iPhone and a Mac (especially with the recent release of the Palm Pre, which appears to have learned some of the lessons of its predecessors), but, rather, that you should embrace the lessons that these products teach -- simplicity as a virtue, simplicity as not being a barrier to sales, and a product that just works, without a constant need to tweak or maintain it in order to keep it working. Do that, and your product has a chance to become the next iFool's purchase. Make it a typical, overly complex, difficult-to-manage product, on the other hand... well, then you're just another mess in a large marketplace full of messes, and will stand out about as well as a bowling pin at a bowling alley, just one more indistinguishable product in a marketplace full of indistinguishable products. Which isn't what you were setting out to do, right?

-E