Bringing Design to Software
© Addison-Wesley, 1996


Design of the Conceptual Model

An interview with David Liddle

David Liddle was interviewed by a group from the Association for Software Design, consisting of Barry Polley, Andrew Singer, Suzanne Stefanac, and Terry Winograd.

In 1978, David Liddle was asked to head Xerox’s new System Development Division, to create products based on the research being done at the Xerox Palo Alto Research Center (PARC). Over the next few years, Liddle and his group developed the Star system, which was the forerunner of today's graphical user interfaces (GUIs), such as the Macintosh, Microsoft Windows, NeXT Step, and Motif. Although the Star did not have the commercial success of its later derivatives, its innovations were the model for consistent integration of now-familiar mechanisms for windows, icons, menus, drawings, and onscreen formatted text.

To a degree that was unusual at the time, and that has been matched only rarely since then, the Star development proceeded in a reflective way, with a good deal of attention to the development methods and to the role of those methods in defining the product. The design methods emphasized the user’s experience, as reflected in the conceptual model the user had of what the system was and of what it could do. In this way, the Star was as much an innovation in the design process as in the product itself. Rather than deciding what the system would do, then figuring out how to produce interfaces, the developers engaged psychologists and designers from the beginning in an extensive set of mockups, prototypes, and user tests to see what would work, and how it could work.

This interview of Liddle by a group from the Association for Software Design offers insights into the nature of software design, and into the specific ideas that were generated by the Star's programmers, psychologists, and designers, who pioneered a new kind of interaction with the computer.

§ § §

Association for Software Design: You were instrumental in the design and development of the Xerox Star, one of the first systems to use a graphical user interface. What was the overall context in which the Star work got started?

David E. Liddle: By the mid-seventies, developers were beginning to move away from the time-sharing model of computing, in which an employee sat at a terminal and shared processor time on a central machine. IBM's entry-level division was promoting the idea of individual computers, almost like tiny mainframes, for small businesses. But connectivity was not part of that picture.

Xerox had been thinking about an individual, self-contained unit that would be connected to a laser printer and to other self-contained units on a network. This concept was based on ideas coming out of the Palo Alto Research Center (PARC). Inspired by Alan Kay's vision of the Dynabook, we had built the Alto, which was the first personal computer with a bitmap screen, graphical interface, pointing device, and many other features that we now take for granted. Other groups at PARC were developing the original Ethernet and the first laser printers.

We were also familiar with and impressed by the work done in Doug Engelbart's Human Augmentation group at the Stanford Research Institute (SRI), where they had created the first on-screen editor, had integrated it with a unified hypertext system, and had built new interface devices such as the mouse and the chording keyset. Although theirs was still a time-sharing system built around a minicomputer shared by a workgroup of expert users, it set a new standard for the kinds of interaction that were possible, such as spatial editing, pointing, and a command language different from text typed in a command mode.

The PARC environment fostered an important collision of ideas: interconnected peer computers controlled by individuals; spatial, gestural, and nonverbal interaction techniques; and extremely high-quality typographical text and graphics.

ASD: What did it take to move this collision from the research laboratory into a product?

DEL: Within PARC, there had been no attempt to see that applications were integrated. With the Star, we were intent on identifying a single model that would work across all the different capabilities. In early 1975, I was asked to start a development project to produce a basic system platform to support future image-processing products. This platform included a new processor architecture, an operating system called Pilot, and a development environment called Mesa. That summer, I was asked to move the work out of the pure research realm of PARC, and to help start the System Development Division. We were charged with building products based on PARC's work—getting the ideas out of the laboratory and into commercial practice.

In the summer of 1978, we were commissioned to build a product line on this platform, including the first file and gateway servers; small-scale laser printers; the Ethernet; and the first graphical, bitmap-display, iconic user-interface product—the 8010 Star. The Star was launched in May of 1981. We had started on it in earnest in summer 1978, so there were 3 years of head-down, pick-and-shovel work, building Star, preceded by about 2 years of system development on the processor family and underlying software.

ASD: Given the novelty of the ideas, the development must have called for going beyond the standard process. How did you envision your task?

DEL: We tackled it in a very different way from most software development groups. Before we designed the Star, I commissioned a study that produced a document on the methodology for the design of user interfaces. We borrowed additional people from PARC to help us work on it. It laid out an approach—a thorough approach—to doing software design. That document and that methodology were the basis for the whole Star design process. It began with task analysis, looking at a fairly wide range of users. Next came the job of developing scenarios for uses of the imagined product based on the task. Then, it proposed a model for a graphical user interface, carefully distinguishing three aspects: the display of information, the control or command mechanisms, and the user's conceptual model.

The first component—information display—deals literally with what appears on the screen. It encompasses all those relatively minor aspects, like what window borders and buttons should look like, what fonts are used and where, what icon shapes are used, and so on. This component is important, but is not the crucial concern from the standpoint of usability. Information display is the least important of the three separate components.

The second component is the control mechanism—the machinery used to invoke commands. It is extremely important that command invocation be designed consistently across different applications. In terms of usability, this component is much more important than information display.

The most important component to design properly is the third, the user's conceptual model. Everything else should be subordinated to making that model clear, obvious, and substantial. That is almost exactly the opposite of how most software is designed.

ASD: Just what do you mean by a user's conceptual model?

DEL: This model represents what the user is likely to think, and how the user is likely to respond. For example, one of the events that changed civilization in the past decade was Dan Bricklin's choice of the spreadsheet metaphor and its underlying conceptual model (see Profile 11). Neither the information display (limited by the machines that were available, and by how much space his program, VisiCalc, required), nor the command invocation were great. However, that conceptual model was exceptionally durable. The desktop metaphor for managing objects—for filing and printing and mailing and the like—is another widely applicable metaphor.

ASD: Is the metaphorical connection to the physical world the key element of a user's conceptual model?

DEL: No. It is a mistake to think that either spreadsheets or desktops were intended to imitate accounting pads, office furniture, or other physical objects. The critically important role of these metaphors was as abstractions that users could then relate to their own work. The purpose of computer metaphors, in general, and particularly of graphical or icon-oriented ones, is to let people use recognition rather than recall. People are good at recognition, but tend to be poor at recall. People can see objects and operations on the screen, and can manage them quite well. But when you ask people to remember what string to type to perform some task, you are counting on one of their weakest abilities.

ASD: How did you apply your analysis of the users' model in developing the specification for the Star?

DEL: We ended up writing a 400-page functional specification before we ever wrote one line of code. It was long, because it included a screen view of every possible screen that a user would see. But we didn't just sit down and write it. We prototyped a little bit, did some testing with users to decide what made sense, and then wrote a spec for that aspect. Then we prototyped a bit more, tested it, and then spec'd it again, over and over until the process was done.

I got Bill Verplank, who had done human-factors work at MIT, to work with the testing. Verplank was in our Los Angeles development center, where he would bring in the users and make accurately time-stamped videos that superimposed a view of the screen, a view of the user, and a view of the user's hands. Verplank and his crew did 600 or 700 hours of video, looking at every single feature and function. From all these video recordings, we were able to identify and eliminate many problems. For example, we chose a two-button mouse because, in testing, we found that users demonstrated lower error rates, shorter learning times, and less confusion than when they used either one-button or three-button mice.

ASD: You were presenting the test users with new concepts, right? The ideas of icons, windows, and menus were not familiar to them.

DEL: Yes. The term menu-driven had been used prior to our work, but its meaning was different from what you and I think of today. Most mainframe applications were starting to switch from command lines to menu-oriented screens. Initially, software had to work on Teletypes, which were linear—the typewritten stream just moved down the page—so you couldn't have menus. Generally, you could hit a question mark and it would list all the commands you could give. You typed one of those commands and the system told you whether the command was acceptable. With the advent of the IBM 3270 terminal, you could display a whole field of text on the screen, could use control keys to move around, and could enter text anywhere on the screen. So the display could present a set of commands and let you choose one by moving the cursor over it. People already recognized the benefit of menus for making the possible commands more easily visible.

We did not call any of the mechanisms in the Star menus. Apple always called them menus; we never did. But the idea was there nonetheless, tied to the objects on the screen. Instead of being in command space, where you were always asking, "Okay, what's the next command?" you sat there with a restful screen full of icons. You pointed to one of them, and then you hit properties or options, and you were presented with all the operations that you could do with the object, as well as its characteristics (Figure 2.1).


Figure 2.1 Property Sheets The new flexibility that the Star provided by allowing users to manipulate objects directly on the screen required its designers to invent ways for users to examine and modify the properties of those objects. The property-sheet mechanism was applied uniformly to every kind of object; users could select any object, hit the PROPS key on the keyboard, and see a window displaying all the properties and options relevant to that particular object. (Source: Reprinted by permission from Jeff Johnson et al. Xerox Star, a retrospective. IEEE Computer 22:9 (September, 1989), p. 17.)

So we had an early object-oriented approach, the idea being that you inspect the particular icon and it displays to you the messages that it will accept. It does not accept anything else: It tells you what you can do relative to that object, in that particular context. You could never see a command alternative in a context in which you could not use it. This idea of progressive disclosure—providing a great deal of context for everything you did, and evaporating the context when you were finished—was extremely important.
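The pattern Liddle describes, in which an object reveals only the messages it will accept and the property sheet evaporates when the user is done, can be sketched in modern object-oriented terms. This is an illustrative sketch only, not the Star's actual implementation; the class and method names are invented for the example:

```python
class DocumentIcon:
    """An on-screen object that knows which operations apply to it."""

    def __init__(self, name, properties):
        self.name = name
        self.properties = properties  # e.g. {"font": "Times"}

    def available_operations(self):
        # Only operations valid for this object, in this context,
        # are ever offered to the user.
        return ["Open", "Copy", "Move", "Show Properties"]


def show_property_sheet(obj):
    # Progressive disclosure: display the object's properties and the
    # operations it accepts, and nothing else.
    lines = [f"Properties of {obj.name}:"]
    for key, value in obj.properties.items():
        lines.append(f"  {key} = {value}")
    lines.append("Operations: " + ", ".join(obj.available_operations()))
    return "\n".join(lines)


memo = DocumentIcon("Memo", {"font": "Times", "margins": "1 inch"})
print(show_property_sheet(memo))
```

The essential point is that the command space is derived from the selected object, so the user can never see, let alone invoke, a command that does not apply.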

In later systems, such as the Macintosh and Windows, people did strange things with icons, such as using them to represent an application program. The user should never need to operate directly on programs. An icon should represent a document of some kind—you double-click on it, and you are automatically in the right program for dealing with it. That was not what happened, because the later designers were retrofitting the Star's concepts over existing ideas. In the Macintosh, they just missed it; in Windows, they were retrofitting it over DOS.

ASD: What do you think in general of the later interfaces that were based on the Star?

DEL: The structure and design of NeXT’s Interface Builder was great, but the user interface of the NeXT machine is not that good. They did with Unix what Windows did with DOS—they didn't have the courage to break from Unix, so as a result you see lots of things that look like limestone tablets, but when you open them up, you have to deal with a Unix file or process, with all of its intrinsic ugliness. Instead of designing good, new, useful metaphors and end-user illusions, they took the simple design of the Star and made all the property sheets look like pre-Raphaelite frescos (see Figure 2.2). It's distracting. The graphics suffer from this Muscle Beach phenomenon—they're not harmonized with the task that you have at hand. When you're typing a humble business memo, and up pops a carved monument, it’s out of all proportion, and is disruptive.


Figure 2.2 Evolution of the Graphic Interface Interfaces that followed the Star, such as NeXT Step, adopted many of the conceptual elements of the Star, embellishing them with elaborate graphics. (Source: Reprinted by permission from NeXT Computer. NeXT User's Reference. 1990, p. 119).

ASD: You clearly were focused on how the software that you developed could create what is now often called the user experience. How did all that user involvement go over with programmers, who would be likely to enjoy designing limestone obelisks?

DEL: Development using real testing was not all that smooth. Sometimes, an idea would seem to be settled. We would all agree on a prototype; then, Bill would try it out with real users, and we would find out that they had trouble with it. People on the team sometimes got upset. We'd have to go in and renegotiate something we had all thought was settled. Still, all in all, people did agree to be responsive, within all possible reason.

Although most of the individuals on our team were technically programmers, they were first and foremost software designers. Many were researchers in computer science, who saw their activity as distinct from programming. Within our group, you were viewed as a philistine if you wanted to argue too much about implementation before the design was done. Our attitude was always, "Wait a minute, let's make this work for the user. If we find we can't implement something, then we'll go back to the drawing board. But we're not going to pick things that we think will be small or fast to build, and then bully the user into accepting them."

ASD: Didn't your approach lead to specifications that were difficult to fit into the capacities of the machines you had? Most applications developers focus on taking advantage of the specific power of the platform, but you were concentrating on the user.

DEL: That's where we parted company, in my view, from many development projects. For example, at around the same time, Charles Simonyi and his team at PARC were developing a text editor (or, as we would now call it, a word processor) named Bravo—the direct predecessor of Microsoft Word, which was developed when Simonyi went to Microsoft. Bravo was very, very carefully and cleverly designed around what would run fast and be small and easy to implement on the Alto. Its designers took the view that the users should be willing to learn more, so that they could take advantage of a program’s functionality. That was okay—it was a good balancing act. We, on the other hand, always worried more about the functionality than about implementation.

One consequence of our emphasis on the user was that, when Version 1.0 of Star was first released, it was too slow. We couldn't reach the performance levels that we wanted. The operating system, Pilot, ran fast, but the Star application software that ran on top of it created a bottleneck. We couldn't just speed up the machine. We were already pushing the limits of the hardware that we could produce at a viable cost.

ASD: How did corporate management respond to the project?

DEL: We were treated with benign neglect the whole time we were developing Star. Xerox's executives really weren't prepared to deal with the Star as a product, even when we had it almost finished. The Star was almost done, and we had no division willing to manufacture or market it! The staff in El Segundo were ready to make the boards; we could get the electronics done; but we couldn't convince Xerox to put it in a box and package it. Every 3 months, there would be an attempt to kill the project.

Just when things were looking pretty bad, I met Don Massaro, who was president of the Office Products Division. He thought that dedicated word-processing systems of the kind on the market in those days were unbelievably boring, and he didn’t believe that they would last. He already knew about PCs, and he had helped to get Apple financial credit when it started up. The top brass at Xerox had told him, "We've been trying to figure out what to do with this Star stuff. Why don't you take a look at it: If it's no good, we'll kill it." So I showed it to Massaro. All his people said to get rid of it, but he said, "No, this idea is great, I've got to have the product." So suddenly, at the last minute, we were working with someone who had a whole division, with a factory in Texas and a sales crew.

ASD: How did the Star fare as a product?

DEL: Nearly 100,000 Star systems were eventually shipped (later under the name Viewpoint), but obviously we didn't capture the market that was opened by the Macintosh, PCs, and so on. The people who used the Star liked it, but the market and Xerox's positioning in that market weren't right to take full advantage of the package we had put together.

ASD: One critique I’ve heard is that Star didn’t include a bookkeeping mode, in the vein of our present-day spreadsheets, and that this omission impaired its acceptance. Is there any validity to that story?

DEL: Well, it is true in a way. Spreadsheets didn't come out until a year after the Star was born. The Star actually did have smart forms in it, so you could do many tasks that are now done with spreadsheets, by interconnecting these forms and fields. But the smart forms were more difficult to use than spreadsheets, because they didn't offer a consistent metaphor.

The Star's data management was simple—it managed linear, typical records well. The problem was that we couldn't anticipate and develop all the different kinds of functions that people wanted. We didn't have a platform for third-party applications, which is what operating systems offer today. Everything had to be built by Xerox: the hardware, the operating system, the development environment, the servers, the system for creating and managing files, and the user applications. The Star was a highly integrated, user-oriented system—but a closed one. Its development, of course, was before the PC, and before there was any microcomputer software industry.

ASD: Was it ever part of the game plan to open up the system—to invite other companies to develop applications?

DEL: There was no option back then other than totally proprietary systems. There weren't any open systems. No software ran on any two brand names, so that alternative just didn't occur to us. The ultimate weakness of the Star as an overall system was our failure to provide an opening for the kind of diverse, robust application-software–development industry that we see today.

ASD: What do you think were the main strengths and contributions of the Star?

DEL: Two different dimensions of what we did still have loud echoes in today's world of software design. The first was the set of ideas embodied in the Star interface itself (see Profile 2): the physical desktop metaphor, direct manipulation, a general way of dealing with property sheets for objects within a consistent user model, what-you-see-is-what-you-get (WYSIWYG) editing, generic commands (such as Move and Copy), and so on. Later systems have improved on these ideas, although in certain aspects I still think that Star was a great advance over its successors.

The second dimension was the design methodology. We put the user's experience in the forefront, and developed our design principles from what we saw users doing. These principles were a significant part of the Star legacy.

Many people ask, "Why worry about specific principles of user-interface design? Why not regard design as an artistic process, with each product a fresh expression?" But that view isn't realistic. The repeated, serial use of products having gratuitously different user interfaces produces cognitive dissonance. The abrupt switching back and forth between the abstractions of your job and the abstractions of computing can be dispiriting and debilitating, reducing productivity and satisfaction significantly. Designers make their greatest user-interface errors when they don't think about users in terms of what those users are doing in their jobs.

We articulated specific principles, which later found their way into documents such as the Macintosh Human Interface Guidelines (see Profile 4). We applied them to each of the design elements in the Star, to get maximal unity and simplicity.

For example, we had to identify the core control mechanisms for commands. It was clear that these mechanisms had to be consistent across different applications. We ended up with a small set of generic commands that we built into the hardware of the Star keyboard. They were the basis for the idea of standard menu commands in the Macintosh. The uniformity of these commands across applications gave users a consistent interface—one of the features that gave the Macintosh platform a jump start.
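The uniformity Liddle describes, a small fixed command vocabulary that works identically on every kind of object, can be sketched as follows. The names and behaviors here are hypothetical, chosen only to illustrate the principle:

```python
# A fixed vocabulary of generic commands, shared by every application.
GENERIC_COMMANDS = ("COPY", "DELETE")

class TextSelection:
    """A selected run of text."""
    def copy(self):
        return "copy of text"
    def delete(self):
        return "text deleted"

class GraphicSelection:
    """A selected graphic object."""
    def copy(self):
        return "copy of graphic"
    def delete(self):
        return "graphic deleted"

def invoke(command, selection):
    # The same command key works on any kind of object; the object
    # itself decides what the operation means in its own context.
    if command not in GENERIC_COMMANDS:
        raise ValueError(f"not a generic command: {command}")
    handlers = {"COPY": selection.copy, "DELETE": selection.delete}
    return handlers[command]()

print(invoke("COPY", TextSelection()))
print(invoke("COPY", GraphicSelection()))
```

Because the command set never changes from application to application, the user learns it once; only the object-specific meaning of each command varies.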

ASD: You mentioned that your user-based approach included many prototyping cycles. Was that a new idea?

DEL: The Star development was the first project I know of that used a systematic and extensive prototyping cycle to develop the interface. It wasn't dominated by the code jockeys, as most projects were.

ASD: Do you recommend, then, that prototyping always precede any writing of code?

DEL: Absolutely, provided you can make prototypes that are good enough for users to test. If you cannot, then you can show the users screen views and other visual representations, so at least you can start with thoroughly specified user functionality.

ASD: What do you see happening in software design and development in the future?

DEL: The situation today is different from what we were tackling in the days of the Star. The industry is entering a new phase of development of computer products, following a cycle of technological maturity—one that has been repeated for many new technologies, such as the radio, the automobile, and the telephone.

In the first phase, the new technology is difficult to use, and its benefits are not yet obvious. It appeals mainly to those people who are fascinated with it for its own sake—the early adopters. You'll see clubs of enthusiasts who love to share stories on how they fought and overcame the trials and tribulations. They see most people as not worthy of using the technology. Ham radio is a good example, and, in the same vein, the legends of Silicon Valley are full of stories of the brave pioneers who tackled the MITS Altair or the Osborne.

In the second phase, the economic benefits are developed to the point where the hard-headed business managers will adopt that technology for practical uses. Their interest is in the bottom line—not whether the technology is fascinating or easy to use, but whether it will promote greater efficiency, productivity, and profits. (To continue with the radio metaphor, everyone is familiar with radios for truck and taxi dispatch, police, military, and so on.) This business class covers most of the major microcomputer applications sold today (except for the games market). In designing for it, the main consideration is cost effectiveness. If better design can speed up use, cut training time, or add to efficiency in any other such way, then it is important. If the new technology doesn't produce a measurable difference in one of these dimensions, then it is a frill.

The third phase reaches the public—the discretionary users who choose a product because it satisfies a need or urge. They don't care about cost–benefit analyses, but they do care about whether the product is likable, beautiful, satisfying, or exciting. Cellular telephones have reached this level in the radio market. Computer games have been there since the beginning—a game succeeds because people like to play it. They enjoy the experience that they get from the design of the graphics, the sounds, and the flow of play. An increasing portion of the computer market is shifting to the consumer end of the spectrum. The huge new markets of the future—the successors to the productivity markets dominated by IBM and Microsoft in the past—will be in this new consumer arena. Design that focuses on the user, rather than on the mechanisms, will move to center stage.

ASD: How would you describe software design in general?

DEL: Software design is the act of determining the user’s experience with a piece of software. It has nothing to do with how the code works inside, or how big or small the code is. The designer's task is to specify completely and unambiguously the user’s whole experience.

That is the key to the whole software industry, but, in most companies, software design does not exist as a visible function—it is done secretly, without a profession, without honor. I hope that this situation will be changed by organizations such as the Association for Software Design and by the new teaching programs that are springing up in universities (see Profile 9).

When the Wright brothers first flew an airplane, or when Benz drove the first automobile, the wonder was not that people could drive or fly easily, but that they could do so at all. These machines eventually left the enthusiast realm and became forces of change in society, because the principles of their use became more important than the technology of their construction. We have not yet piloted computers and software to that point, but we have made a good start. If we can keep our eye on the user, we can keep moving ahead in the right direction.

Suggested Readings

Jeff Johnson, Terry Roberts, William Verplank, David C. Smith, Charles Irby, Marian Beard, and Kevin Mackey. Xerox Star, a retrospective. IEEE Computer 22:9 (September, 1989), 11–26. (Reprinted in Baecker et al., 1995, 53–70).

About the Author

David Liddle is president of Interval Research Corp., in Palo Alto, California. He was a founder of Metaphor Computer Systems and was a vice-president at IBM. He is Consulting Professor of Computer Science at Stanford University, where he teaches courses on human–computer interaction and on the computing industry. He is also on the board of directors of a number of prominent software companies, is chair of the advisory board of the Santa Fe Institute, and serves on the engineering advisory committees at Stanford University and the University of Michigan.