A very good program containing a web browser, word processor, appointment book, and database, all set up to share data (especially the appointments and the database) over the LAN with other Lotus Notes users.

The Windows version is generally good. The Unix version sucks.

From my experience Lotus Notes is only marginally better than cc:Mail. While Notes has many more features and may be less buggy, it is very counterintuitive with ambiguous settings and options menus, and replication features that most users would never be able to figure out without proper documentation and training. The nonstandard interface made it especially difficult for our employees to migrate from Outlook. Unfortunately it's our only option for business email. Of course Notes also offers the calendar, to do list, and address books, but so do most other email programs in a much less confusing manner.

For personal correspondence I'll stick with telnet and Pine.

Robert Venturi, one of the greatest architects of the last few decades, is also famous for writing what are arguably two of the three seminal books in the modern theory of architecture. One of them, entitled Learning from Las Vegas1, takes the city of Las Vegas to task as an example of the failures and excesses of the transition away from the high modernist architectural aesthetic into a "postmodern" one emphasizing symbols, signs and decoration. Implicit in the book's title is the idea that one way we can learn to do things right (or at least better) is to take a closer look at our present failures and to learn from their mistakes.

I can think of no application that we can learn from more than Lotus Notes. Even in its latest, shiniest incarnation, from a design perspective Notes is simply an appalling piece of software on fundamentally every level. It is inflicted upon me and countless other employees of various organizations and corporations as the Official Email and Groupware Solution. I have been using it for years now and through two major releases, and it continues to astonish.

And I am not alone. There's an excellent site maintained by a software design/consulting firm which they have dubbed the "Interface Hall of Shame"2, in which they highlight prototypically horrific instances of application design. They have an entire section devoted to Notes. Here is the preface:

We wish we found IBM's Lotus Notes a long time ago. This single application could have formed the basis for the entire site. The interface is so problematic, that one might conclude that the designers had previously visited this site, and misread "Hall of Shame" as "Hall of Fame".

So here, in the spirit of Venturi, is a list of application design principles, in the hope that we can learn from Lotus' mistakes.

1. Don't fight the operating system

In most cases, the designers of an operating system (MacOS, Be, Windows) have developed a core set of objects, widgets, views and metaphors (drag & drop, right-click for menu, etc) that are de facto standards. As a designer, you are more or less obligated to use them when they've been provided. Application designers tend to want to reinvent the wheel whenever they feel that the tools provided by the OS are not exactly what they think they need. This is a mistake. Like it or not, you are stuck with the OS you're developing for, because your users are.
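To make the point concrete, here is a rough sketch in Python (chosen purely for illustration, not because Notes has anything to do with it) of what "use what the OS gives you" looks like: the standard tkinter file dialog simply delegates to the platform's own "Open" dialog on Windows and MacOS, so a single call buys the user the look, feel and keyboard behavior they already know.

```python
# A minimal sketch: ask for a file via the toolkit's wrapper around the
# platform's native "Open" dialog instead of hand-rolling a file browser.
import tkinter as tk
from tkinter import filedialog

def choose_attachment():
    root = tk.Tk()
    root.withdraw()  # no main window needed; we only want the dialog
    # On Windows and MacOS this shows the operating system's own file
    # dialog, so users get the navigation and shortcuts they already know.
    path = filedialog.askopenfilename(title="Attach file")
    root.destroy()
    return path or None

if __name__ == "__main__":
    print(choose_attachment())
```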

One reason that users find Notes so difficult to use is that it does not have the "look and feel" of Windows when it is running on Windows. Except for the top menus and a few standard dialogs, most of the buttons, menus, dialogs and popups have been reimplemented by Notes. Not only does this make Notes an extremely heavyweight application (on disk and in memory), but it also means that users cannot leverage any knowledge they have about the OS when they use the application. An important corollary to this, as nicely stated by the iarchitect folks, is...

2. Platform independence benefits the developer, not the user

Lotus representatives and certified Lotus developers will almost universally defend the ubiquitously odd, nonstandard interfaces in Notes by claiming that it is done so that it can be platform independent: they will claim that it is 'very important' to the user base that Lotus Notes looks and feels the same on every platform.

If you quickly glance down at your bullshit-o-meter, it will be pinned to the right as you hear this.

First of all, statistically speaking the vast majority of users have access to only a single OS platform, and the small fraction that do have access to more than one tend to use (or be provided with) major applications on only one of them instead of both. There are lots of reasons for this: it may be more expensive to license both, they just need to run the application somewhere, not necessarily everywhere, or it simply isn't available on both platforms. Whatever the reason, we're talking about a tiny percentage of the user base that could potentially be 'helped' by look-and-feel platform independence.

Second, the small fraction of folks that *do* use an application on multiple platforms simply will not care that the Windows version uses (for example) a Windows common file dialog for opening files, and the Mac version uses a (totally different) MacOS one. In fact, they are actually *helped* by this because those widgets are things they are already familiar with, since they are standard components of the OS.

Finally, all the additional GUI baggage to make the application look and feel the same everywhere is unnecessary bloat that sits on your hard drive, and takes up space in memory when the application is running.

Make no mistake: their interface is nonstandard because it makes it easier for them to write software for different platforms, not for you to use it on them. And the application should be designed for you.

3. All visual feedback should be meaningful

Our eyes are particularly adept at spotting movement, even peripherally. Presumably in the early ages of human development this conferred a huge evolutionary advantage over the guy who failed to notice the sabre-tooth tiger sneaking up on him off to the side in the bushes.

In the information age there are no sabre-tooth tigers, only sabre-tooth interfaces. Menus animate themselves into existence, paperclips dance in the corner, animated gifs rotate and flash and jump around. Documents crumple themselves up and go bouncey-bouncey-bouncey into the trash. Not only is none of this (and all of the extra code that is needed to drive it) strictly necessary, it is also distracting. If I'm (e.g.) pressing a button, I don't need it to depress itself in 3D just to tell me I'm pressing it. I *know* I'm pressing it. The eye-catching tricks should be saved for when the user's eye needs to be caught.

4. Do not make something into an icon unless it unambiguously looks like what it represents

Most application designers go icon crazy, and this in general is a horrible mistake. Icons are supposed to be a nice synthesis of ease-of-use (pictures being easily recognizable) and power use (frequently-used tasks bound to small buttons). The problem is that designers tend to stretch the metaphor well beyond its useful limits by creating icons for concepts that are simply not iconic. If you have ever found yourself in an application you use a lot holding the mouse over an icon to wait for the "tool tip" hint to come up so you can see what it does, then that is a sign that the icon did not deserve to be there in the first place.

For instance: Notes has an icon representing an opened jar of paste for the "Edit Paste" action. Three immediate observations:

  • you probably haven't seen or used an actual jar of paste since art class when you were in grade school
  • you probably wouldn't recognize something as a jar of paste anyway if it were represented as a 16x16 8-bit color icon
  • you probably instinctively hit Ctrl-V for 'paste', since the software industry has settled on this as a standard key binding for more than a decade now

Those who defend iconification will point at simple psychology experiments that indicate that people are much better and quicker at picking out patterns, shapes and colors than they are at choosing words from a list. However, one can also show that this ability decreases extremely rapidly as the number of graphical choices increases, as the pictures become more and more uniformly like each other (as icons tend to be) and especially as the degree of abstraction between what the icon depicts and what it represents increases. For graphics applications like Photoshop/Gimp/Illustrator, and WYSIWYG-aspiring editors like Word, certain broad classes of icons can work well, because if what you want is to draw an orange circle, you are thinking "orange" and "circle" and can pick an orange swatch from a color bank or a picture of a circle from a shapes tool much more swiftly and easily than if you had been given a text list of colors or shapes to choose from.

However, if you want to see the properties of an email attachment, there simply is no shape that naturally represents this. Notes has a yellow diamond in front of a blue square3, which the Notes designers decided means "Properties". I have been using Notes for 3 years now and I still forget what that little yellow diamond does. When I want the properties of an object, I naturally want to manipulate that object in some way to get them (the Notes designers also put Properties on the right-click menu, so in this case it's not a total abuse of metaphor).

Once you get above about 7 small pictures of abstract things, especially if they are not grouped into logically smaller "banks" of icons with related meanings, it starts to become a whole lot easier to pick out a word from a list (as in a menu) instead.

Here is a useful, albeit simpleminded, experiment. Find 'Apple' below:

Pear
Plum
Banana
Kiwi
Soy
Grapefruit
Peach
Blueberry
Apple
Papaya
Banana
Uglifruit
Pineapple
Grape
Tangerine
Kumquat
Mango

Found it almost immediately, right? And that list wasn't even alphabetized, which makes it much easier. Now let's choose an abstract shape like: &*.

Now find it:

!= ## ** $# $= != ## ** $# $= 

$_ }{ .. <> ^^ {} @# &* !! :: 

Most people find that to be much harder. It would be even more difficult if you had to think 'apple' while you were trying to look for it.

5. Use color exceedingly sparingly, and only to convey information

Everybody loves color. In the classic 1939 film "The Wizard of Oz", when Dorothy gets to Oz the film switches from black-and-white to technicolor, which heightens the effect of being transported into a different, magical world. TV, movies, computer screens, printers: everything seems to be embraced by consumers once it can display things in color4. So color must be a great thing, right?

Sort of. One of the reasons that color is so powerful is that our eyes are extremely adept at picking it out. Take a page of black-on-white text with 1,500 words on it, use a yellow highlighter on a few words or sentences, and those few things suddenly leap off the page. The eye is irresistibly drawn to them. So much so, in fact, that most people now find the original text to be harder to read because their eyes naturally want to jump to the (colorful) highlighted parts.

And this is the problem: overuse of color sends the user's eye darting all over the screen looking for somewhere to land. Color should be used only for those instances where you need that effect: charts where data needs to be distinguished, highlighting of specific things when you need to draw the user's eye (example: red-underlining misspelled words during a spell-check), and so on. As a general rule, you should have to justify every single instance of use of color in your program. Start with the idea that your entire application will be in grayscale. Color is a powerful tool when used appropriately. It is an unholy mess when it isn't.

Notes would actually score rather well in this regard, if you were able to turn off all of the ridiculous icons. But you can't, and they are awash in an angry fruit salad spectrum of colors.

6. I have an abundance of screen real estate. Let me use it.

Notes is designed to be a groupware solution that scales to large corporations, ones with potentially tens of thousands of users. However, when I choose an email address from a companywide address book I am shown a scrolling list that displays only six items at a time out of maybe 500, and this is yoked to a scroll bar where even the slightest drag sends me hurtling 50 or 100 names forward. I have a 1600x1200 display. Show me more names. Give me a "narrow to names containing string" function on the dialog box so I can pare down the list. The dinky little popup has grabbed the entire screen anyway, in the sense that I can't use anything else until I dismiss it, so while you are monopolizing the whole screen why not use more than 1/50 of it to help me find the information I need?
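What I am asking for is not exotic. Here is a hedged sketch, in Python and with invented names, of the "narrow to names containing string" behavior such a dialog should offer:

```python
# A sketch of the "narrow to names containing string" function the address
# book dialog should offer: refilter the whole directory on every keystroke
# instead of making me fling a scroll bar around. The names are invented.
def narrow(names, query):
    """Return only the names containing `query`, case-insensitively."""
    q = query.strip().lower()
    if not q:
        return list(names)
    return [name for name in names if q in name.lower()]

directory = ["Alice Anders", "Bob Birch", "Robert Moses", "Roberta Venturi"]
print(narrow(directory, "rob"))  # ['Robert Moses', 'Roberta Venturi']
```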

This is one that Microsoft is also guilty of in spades. Many of their standard GUI widgets are designed around the fixed-width concept. And admittedly it is easier for programmers to design the behavior of standard widgets if they know that they will contain a fixed number of items. But good interface design is not about making things easier for the programmers.

7. Yes, I'm sure.

And if I do realize I've made a mistake, let me 'undo' it. That's the great thing that separates the virtual from the physical: most virtual actions can in principle be undone. In most cases there is no excuse for not providing the user with the ability to unwind something he/she has just done. So just keep track of my sequence of actions and STOP ASKING ME if I'm sure I wanted to do what I just told you to do.
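"Keep track of my sequence of actions" is not hard, either. Here is a minimal sketch of the idea in Python, with hypothetical inbox/trash stand-ins: every action records its own inverse, so "undo" takes the place of "are you sure?".

```python
# A minimal sketch of "just keep track of my sequence of actions": every
# action pushes its inverse onto a stack, so "undo" replaces the nagging.
class UndoStack:
    def __init__(self):
        self._undo = []

    def do(self, action, inverse):
        action()                 # perform the action immediately, no questions
        self._undo.append(inverse)

    def undo(self):
        if self._undo:
            self._undo.pop()()   # run the most recent inverse

# Hypothetical usage: deleting a message records how to put it back.
inbox, trash = ["Peterson contract"], []
history = UndoStack()
history.do(lambda: trash.append(inbox.pop()),
           lambda: inbox.append(trash.pop()))
history.undo()
print(inbox)  # ['Peterson contract'] again, no confirmation dialogs involved
```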

Furthermore, even in the cases where I will be doing something unavoidably permanent, instead of warning me about it at every stage like I'm a six-year-old you should design the interface so that it's obvious that I'm actually about to do the thing I'm doing. Before computers, we all lived exclusively in a physical reality without an "undo", so people are accustomed to committing themselves to irreversible actions hundreds of times a day. Can you imagine any other context where the sort of nonsense we go through with software would be put up with?

Boss: The Peterson account just called and they're going with another vendor. You can go ahead and tear up their contract.

Assistant: Are you SURE you want me to tear up the Peterson contract?

Boss: Yes, I'm sure.

Assistant: It will be PERMANENTLY torn up.

Boss: I'm aware of that.

Assistant: Please get out of your chair, come over here and point at the Peterson contract so I can tear up the right one.

Boss: You're fired. Get the fuck out of my office.

8. Don't be monolithic

Our users can dial into work from home if they have a high-speed line. Even so, Notes takes a couple of minutes or so to start because it has to transfer more or less everything about you to your client locally. For an average user that has been using the application for a year or so this turns out to be around 50-100 megabytes of information. Notes needs to have this information locally even if you have mounted the drive on which your files reside.

And that's ridiculous. Applications should be lightweight, and an email/groupware application of all things should be designed with mobile use in mind. Shipping the whole kit-n-caboodle back and forth is yet another sign of laziness on the part of the designers: they are making it easier for themselves, not you.

9. Be fault tolerant

One of the more amazing bugs that is not fixed in the latest release of Notes (v.5) is the 'dazed and confused' bug: if Notes loses contact with its server or the network for any significant amount of time it will go into a sort of panic, and decide that nothing can be trusted. Its 'solution' to this is to refuse to let you exit the application, claiming that 'An unspecified network error has occurred'. To get out, you have to kill the application's process. Once you have done that, or if it crashes on its own, it will refuse to start up again unless you either log out and back in first, or reboot the machine.

Networks and filesystems are just plain unreliable, and that reality should be designed into software. There's simply no excuse for an application to crash or get into a bad state over them.
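Designing for that unreliability can be as simple as the following sketch (Python, with fetch_inbox and load_cached_inbox as hypothetical stand-ins): retry transient failures, then degrade to cached data instead of wedging the whole client.

```python
# A sketch of designing for unreliable networks: retry with backoff, then
# fall back to a local, read-only state instead of refusing to exit or
# demanding a reboot. fetch_inbox and load_cached_inbox are stand-ins.
import time

def fetch_with_fallback(fetch_inbox, load_cached_inbox, retries=3):
    delay = 1.0
    for attempt in range(retries):
        try:
            return fetch_inbox()        # normal path: talk to the server
        except OSError:
            time.sleep(delay)           # transient failure: wait and retry
            delay *= 2
    # The network is genuinely down; degrade gracefully to cached data
    # rather than panicking or locking the user out of the application.
    return load_cached_inbox()
```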

10. Let me customize my environment.

Everyone works differently. Applications sometimes need to establish arbitrary formalisms in order to get their job done, but there's usually a lot of wiggle room to give the user a chance to tailor the application's behavior to their work habits. Let them build macros and bind them to keystrokes or buttons. Let them customize the tool bars, removing items they don't need.

Lotus is unbelievably deficient in this regard. Part of this is that Lotus/IBM has adopted the "fortress of computing" model in which all control is given to a central set of "administrators" of the software. Lotus uses this, believe it or not, as an active selling point for the application: the "headaches" caused by users mucking about with their configurations are eliminated by passing all control away from the user. The fallacy at work here is that configuration and customization are the same ideas: they're not. There are ways that users can make themselves more productive and happy that can coexist with access control, information flow, maintenance of a broader look-and-feel, and other things that administrators would like to keep consistent across the user base.

When you turn this idea around and look at it from the other end, you get...

11. Enable the power user

Most applications need to be easy enough for novices to use, for whom the learning curve might otherwise be quite steep, yet powerful and fast enough that experts won't be slowed down or frustrated. There's a myth that this sort of thing is an unavoidable 'tradeoff', and necessarily involves compromise between power use and ease of use.

It simply isn't true for the most part. We have loads of devices to help with the transition to power use. To name just a few:

  • learning menus, or menus with 'expert' modes
  • user customization
  • a macro language sitting under the application, so power users can write scripts for common tasks and bind them to key combinations (see the sketch after this list)
  • keyboard-driven interfaces, with lots of common tasks bound to memorable (and standard!) key combinations.
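The macro-plus-keybinding device in particular takes almost no machinery to provide. A toy sketch in Python, with invented key names and actions:

```python
# A toy sketch of the macro-plus-keybinding idea: power users write small
# functions and bind them to key chords, while novices never need to know
# the keymap exists. The chord names and the action here are invented.
keymap = {}

def bind(chord):
    """Decorator that registers a user-written macro under a key chord."""
    def register(macro):
        keymap[chord] = macro
        return macro
    return register

@bind("Ctrl+Shift+R")
def file_to_read_later():
    print("moving selected messages to the 'read later' folder")

def dispatch(chord):
    action = keymap.get(chord)
    if action:
        action()

dispatch("Ctrl+Shift+R")
```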

One power application that suffers frequently at the hands of ease-of-use criticism is emacs. Emacs is the ultimate power application: virtually everything is keyboard driven, and there is a special flavor of Lisp (a programming language) that drives the whole application and in which most of its functionality is written (and is customizable). I am writing this in emacs right now. There is simply nothing that comes close in terms of power when it comes to text editing.

Yet after you've learned a few basic key commands, emacs is also remarkably easy to use. I can give you an index card with a handful of commands and you can be productive in emacs in almost no time, because ultimately the execution of the application's interface is extremely simple if you want to use it that way: one window, put your cursor down where you want the text to go, and type.

Notes, on the other hand, has no macro language5, no opportunity to bind keys to common functions, and fights you at every turn when you try to get things done.

12. If I need extensive training sessions to use an application, then the application is almost certainly at fault

It is amazing how many folks, even programmers, either do not comprehend this idea or simply do not appear to believe it. Admittedly, application design (all information design, really) is hard, and almost by definition is done to address some sort of task or problem in the virtual world which is probably complex, abstract and may have little or no equivalent in the real world. To an extent this abstraction can and should be moderated by the careful selection of analogous metaphors from the real world ('cut' and 'paste' from scissors-and-glue typesetting, drag-and-drop from physics, the desktop-trashcan-file-folder metaphor from familiar office furniture), but ultimately most things we do with computers are pure abstractions. The example I like to use when purists protest this is: what is the real-world analogue of apply-the-fast-fourier-transform?

Sadly, people who design software tend to take this lack of analogues, combine it with the 'enable the power user' principle in whatever form they have come to understand it (see #11), and interpret the result as license to make the application incredibly and needlessly complex. Part of the mistake here is the idea that power necessitates complexity in some sort of natural way. It *is* true that complexity increases with power, in the sense that the complexity of an application can be said to increase nominally with the number of independent things it can do. But application designers can do much to ensure this complexity is organized and managed so that it increases linearly and not exponentially with the number of features.

The Lotus designers have taken this to the extreme with the design of Notes. Even the simplest things in Notes (sending email, finding a particular note, changing the paltry number of things that you are permitted to customize) are made incredibly difficult. My favorite: to delete an email item in Notes, you must click on the email and hit the 'delete' key (right-clicking does bring up a list of choices to operate on the email as you would expect, but delete is not one of them). When you do this nothing happens, except that a tiny, unrecognizable blue icon appears to the left of the email. This blue speck means that your email is 'in the trash'. However it is not gone from your window the way you obviously want it to be: it continues to appear, just with a little blue dot beside it. You must then hit the F9 key to 'refresh' the contents of the window. Then you are asked if you want to delete the items that are 'in' your trash folder (even though they continue to appear in your inbox). If you say yes, they are then gone, and are unrecoverable. You can also delete items by dragging them directly to the trash folder. This works, but this way the item really does disappear from your inbox window, in violation of the original metaphor.

This is a classic example of bending the interface to fit some sort of abstract conceptual design, in this case the standard transactional ('OLTP') model: objects are marked for deletion, and the transaction only happens after it is 'committed' to the database.
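The model itself is perfectly sensible for a database; the sin is letting it leak into the inbox. A small Python sketch of the mark-then-commit idea (message names invented) shows how the "little blue dot" and the F9 step map onto it:

```python
# A sketch of the mark-then-commit model the Notes UI leaks to the user:
# deletions are only flagged until an explicit commit (the F9 "refresh"
# step). Reasonable inside a database; baffling as an inbox metaphor.
class MailStore:
    def __init__(self, messages):
        self.messages = list(messages)
        self.marked = set()

    def mark_for_deletion(self, msg):
        self.marked.add(msg)    # the "little blue dot": nothing else changes

    def commit(self):
        self.messages = [m for m in self.messages if m not in self.marked]
        self.marked.clear()     # only now do the messages actually go away

store = MailStore(["status report", "lunch?", "old newsletter"])
store.mark_for_deletion("old newsletter")
print(store.messages)   # still all three messages, just one flagged
store.commit()
print(store.messages)   # ['status report', 'lunch?']
```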

13. There is no excuse for the GUI not to run in its own thread at all times

'Multithreaded' applications can be doing more than one thing at once. Almost all modern operating systems support multithreading in some form, and most have full support for it built into the OS. So one guiding principle for application design should be: the GUI runs in its own thread. This is vital, as a single-threaded application that is busy calculating, talking to the network or doing disk I/O is not updating and repainting its windows in the interim. At best, this looks as if the application is crashing, or is about to crash. At worst, it confuses or hangs up your entire window manager because your window stops responding to events. Because Lotus is designed monolithically (see #8) it frequently hangs for several seconds, sometimes more than a minute, while it is fetching information from the network, phoning home, or whatever it seems to need to do. Meanwhile, you wait, and hope it is not about to disappear along with that long email you've spent the last half hour composing.
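For the record, keeping the GUI responsive while slow work happens elsewhere takes very little code. A minimal sketch in Python/Tk (slow_fetch is a hypothetical stand-in for the server round-trip): the blocking call runs on a worker thread and its result is handed back to the UI thread via a queue, so the window keeps repainting and responding the whole time.

```python
# A minimal sketch of "the GUI runs in its own thread": blocking work runs
# on a worker thread; the UI thread polls a queue and applies the update.
import queue, threading, time, tkinter as tk

results = queue.Queue()

def slow_fetch():
    time.sleep(5)                   # pretend this is the server round-trip
    return "42 new messages"

root = tk.Tk()
label = tk.Label(root, text="Checking mail...")
label.pack(padx=20, pady=20)

def worker():
    results.put(slow_fetch())       # blocking work stays off the UI thread

def poll():
    try:
        label.config(text=results.get_nowait())  # UI thread applies update
    except queue.Empty:
        root.after(100, poll)       # check again shortly; UI stays live

threading.Thread(target=worker, daemon=True).start()
poll()
root.mainloop()
```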


1The others being Complexity and Contradiction in Architecture, also by Venturi, and Vers Une Architecture by Le Corbusier.
2www.iarchitect.com/shame.htm
3Lucky Charms?
4The president of Motorola once declared rather abashedly that they had spent millions of dollars in R&D to improve the performance and usability of their pagers, but the one thing they did that tripled their sales was offer them in color.
5Actually it does, but it is typically not available to the end user, only to the 'Notes administrator'. The fortress-of-computing model again.

I've been using Lotus Notes (and Lotus Domino) for only about 2 months, but I can say that it is the best groupware tool on the market.

Lotus Notes has greatly evolved from v.4 to its current version, R6. Its evolution was triggered by the sudden surge of the internet. From a simple communication and organizing tool to a dynamic groupware application, Lotus Notes is most probably the hottest form-flow medium on the market. Its robust, stable and object-oriented design makes it easier for the user to utilize its full potential and capabilities.

On the reliability and versatility side: Lotus Notes doesn't only support native LotusScript. It can also handle C, BASIC, HTML, Java and JavaScript (provided you have some sort of SDK residing in its data repository, in the Notes data folder, I think). I'm sure there are more supported languages which I may have failed to mention. The bottom line here is that Lotus Notes and Domino can offer the best of all worlds to different programmers, whatever their programming language of choice.

Security is not a problem in Lotus Notes/Domino. You can define who can see, create or edit documents, views, folders, agents or databases. But of course, setting up security is the joint responsibility of the Domino Administrator and the Application Designer. From encrypted fields to controlled-access sections, this is one tough egg to break.

That's all.
