Firelily Designs


Prototyping and the Software Development Cycle

The following article is drawn from a number of sources at the ACM CHI '92 conference. This is the first in a series.

Prototyping and iterative design have a reputation for being difficult to manage. The keys to managing a prototyping effort are becoming clear, however; they include knowing what you want to learn from the prototype, access to rapid prototyping techniques, and end-user involvement in development of the prototype. This article provides a framework for prototyping within the development process.

We have access to a wealth of information on how to make prototyping work. Prototyping and iterative design enable us to create vastly improved products in less time than with the waterfall methodology, provided that the role of prototyping is well understood, and that it is managed properly.

Prototyping does have its perils. These include:

  • Standardization--Prototypes, particularly in a high-technology prototyping environment, tend to be shaped by the tools that are available, rather than by users' needs.
  • Distraction--Work on the prototype can take attention away from the problems to be solved.
  • Seduction--Developers can be trapped in an endless loop of refinement.
  • Rejection--If the cost of implementing an idea is too high, it will be rejected too early in the cycle.
  • Obscured historical perspective--Prototypes tend to lose the reasoning that went into them--why decisions were made, for example, or which requirements led to a set of behaviors or functions.

Not prototyping also has its perils:

  • Competitive positioning--For each product area, there is a minimum set of functional requirements that a product must meet in order to be competitive. This set is always growing. Without extensive user testing of prototypes--before design closure--it is much more difficult to assess whether you have covered enough requirements to be competitive, or whether the functions you have included are necessary and sufficient to complete each user task. This user involvement in prototype evaluation is essential.
    Figure 1: How requirements become necessities over time

    Consider the modified Market Opportunity Map in Figure 1. Requirements are not static; their importance changes over time. As products satisfy new requirements, users become accustomed to those solutions, and the corresponding requirements become part of the minimum set that is expected for all products. The products that do well in the marketplace are those that satisfy requirements before they become part of that minimum set. The products that lead the market are those that pull requirements into that minimum set.

    If you target the upper-right quadrant (again, in Figure 1), you are aiming at where the market has been. When your product hits the street, the market will have left you behind.

  • Complexity--The underlying foundations of products are becoming increasingly complicated. Graphical user interfaces, in particular, can account for as much as 70% to 80% of the code in a product. The cost of correcting design mistakes has grown accordingly.
  • Isolationism--The development team may implement concepts without knowing if their solutions are usable. As a result, when later user testing shows that a design doesn't work, management may feel that development has invested too much to be able to throw it away and to start over.

    Development team members usually do not know who their users are, why they will buy the product, exactly what they will do with it, or how they will learn it. In short, they have no deep knowledge of the problem area--no history, no feeling for the culture, no experience in living with the problem on a day-to-day basis. Even if the development group tries to fix this problem by listening to users, it can fall prey to more subtle traps--picking too few users, or picking users who are easiest to talk to. The fewer users that contribute to product design, the less flexible and adaptable the design will be. If a user is easy for developers to talk to, it is because that user thinks like a developer. Such users are not typical!

  • Guideline-itis--Interface guidelines are general; they are written by people who know little about a specific product or its customers. A development group that relies exclusively on guidelines, rather than prototyping with extensive end-user involvement, is not likely to produce a product that satisfies anyone's needs. Guidelines are not a substitute for user testing.

    Apple, in its User-Centered Design course, goes so far as to say that you can follow every guideline in Human Interface Guidelines: The Apple Desktop Interface, and still have an unusable interface. Apple's human factors experts say that their guidelines are called just that--guidelines--because it would be impossible for them to anticipate every situation that any developer might encounter. They go one step further, and say that every good user interface violates at least one of the guidelines. To prove their point, they show how one user interface feature can follow and break the same guideline, and how another function can be implemented following different guidelines depending on the context. What advice do they offer? Test early, and test often.

  • Unusability testing--Usability work does not have complete buy-in from the development team, or from management. Too often, it is still budgeted as two weeks of testing at the end of two years of development.

Team leaders can avoid both sets of perils if they focus on these questions: What are you prototyping, and why? What do you expect to learn? When do you need it in order to affect the product? How will you know when to stop?

The answers to these questions come from users, and from nowhere else. You are prototyping a solution to users' problems. You learn what those problems are, and you stop when you have a proven solution. You must do all of this before closing your design.

Development groups must gather the specific answers for their projects at key points in the development cycle:

  • Before the project begins. This is the time to do a task analysis, so that each member of the development team understands the problem. It can also be helpful for some developers to learn how to do the users' jobs before starting development.
  • Out of the task analysis, the development team should build and validate scenarios that describe real users doing real tasks. These scenarios provide a model for the flow of work through a product; they also ground the product in reality by attaching real people to the tasks. Instead of arguing about what a user might want or need at a particular point, the team can say, "Fred starts with a list of accounts."
  • Armed with a task analysis and scenarios, the development team can build and refine prototypes of a solution. Each iteration of the prototype must be tested with real users, always including users who have not seen previous versions of the prototype.

    This is not a substitute for field testing a product, nor is field test a substitute for testing iterations of a prototype. At field-test time, the most you can learn is whether or not you have produced a useful product, and the most you can do is fine-tuning. Prototypes--iterative prototypes, with user testing--are your insurance that field test will confirm that you've done a good job.

How the Prototyping Cycle Works

Figure 2: Where ideas come from (ideas are more useful when they come from early in the cycle)
Figure 3: The iterative development cycle (a spiral repeating through four stages)

Product development tends to move along four axes (see Figure 2):

  • Ideas to product
  • Low technology to high technology
  • Drawings to code
  • Appearance and behavior to performance

Ideas for improving the product, whether in terms of requirements, understanding the users, or designing a solution, tend to occur within the early phases along each axis. Proper use of prototyping can help keep the development effort focused on those early phases until the solution is well defined.

The problem is not a lack of creativity in the later phases of development. Rather, the problem is that the cost of change increases exponentially with the passage of time.

The iterative development cycle is a spiral that passes repeatedly through four phases (see Figure 3):

  • Plan. In this phase, you are trying to understand your users and their needs, as well as how you want to address those needs.
  • Implement. During implementation, you build a prototype to test the solutions you developed during the planning phase.
  • Measure. Now it's time to see how users react. How long does it take them to understand your solution? How long does it take them to do their work? What problems do they encounter? It is important to understand that measurements must be both objective and subjective; time on task is important, but if a user takes more time because the tools enable him or her to do a superior job, then you must be able to weigh improved quality of work against increased time on task.
  • Learn. This is the analysis phase, where you decide which parts of your prototype are doing well, and which parts are not.

    There can be many reasons that a prototype does not achieve its goals. Sometimes these will be implementation issues, but more often the problem will be an incorrect or insufficient understanding of the users or the work they need to accomplish. As you make the transition back to planning, this fresh look--from a real user's perspective, derived from user testing in the measurement phase--is essential!
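The four-phase spiral can be sketched as a loop. Everything in this sketch is hypothetical and not from the article: the stub phase functions, the 0-to-10 quality scale, the 50/50 weighting, and the score threshold exist only to show how measurement and learning drive the decision to iterate or stop, including the trade-off between improved quality of work and increased time on task.

```python
def plan(design):
    # Plan: deepen understanding of users and their needs (stubbed).
    design["iteration"] += 1
    return design

def implement(design):
    # Implement: build a quick prototype of the planned solution (stubbed).
    return {"version": design["iteration"]}

def measure(prototype):
    # Measure: watch real users work with the prototype.
    # Returns (time_on_task_seconds, quality_of_work on a 0-10 scale).
    # Stubbed here so that each iteration improves the prototype.
    v = prototype["version"]
    return 600.0 / v, min(10.0, 2.0 * v)

def learn(time_on_task, quality, quality_weight=0.5):
    # Learn: weigh improved quality of work against increased time on
    # task; a higher combined score means a better solution.
    time_score = 1.0 - min(time_on_task, 600.0) / 600.0
    return quality_weight * (quality / 10.0) + (1.0 - quality_weight) * time_score

def run_spiral(goal=0.8):
    # Iterate until measurement and learning say the solution is good enough.
    design = {"iteration": 0}
    while True:
        design = plan(design)
        prototype = implement(design)
        time_on_task, quality = measure(prototype)
        if learn(time_on_task, quality) >= goal:
            return design["iteration"]
```

With these stubs, `run_spiral()` takes several iterations to cross the threshold; in a real project the measure step is user testing, and the "score" is a judgment made from both objective and subjective observations rather than a formula.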

The measure and learn phases need as much time as planning and implementation; for this to be feasible, the plan and implement phases must be done very quickly--hours and days, not weeks and months. This is where prototyping becomes invaluable.

Prototyping must be iterative, and to work well, must support rapid creation and modification of prototypes. The advantages of rapid prototyping include:

  • Fast cycles, with little or no code development
  • Early visualization of the product
  • Crisp definition of requirements
  • Early user testing
  • Enhanced communication within the development organization
  • Enhanced feedback to users

As the development group cycles through the phases, iterations will tend to become longer; but at the same time, the team is developing a fuller understanding of users and their requirements, and coming closer to a finished product. Also, as you iterate, portions of the prototype from previous cycles can be converted to real, working code. Iteration ends when you are satisfied (based on the measurement and learning phases) that the product is finished.

The temptation always exists to set a fixed time to end the prototyping cycle. Team leaders should resist it as long as possible. Imposing a time limit implies that you have learned all that you need to know, and that you have the resources to build a solution. How can you know this if you are still learning about the problem? And how can you expect to sell a product if you haven't done enough user testing to be certain that you have a solution?

Setting Your Focus

Prototypes can take many forms--products, metaphors, storyboards, painted screens, animated screens, and programs.

High-technology prototypes focus on presentation and GUI. While useful in this context, they tend to focus developers on these issues far too early in the design process. They also tend to foster an image of prototype-as-product, and carry an implicit assumption that the design is already complete. Instead, prototyping should be a joint exploration of function between developers and users. Prototypes should not be used to develop "look and feel"--this grows with the prototype, as we shall see.

Low-technology prototyping offers an alternative to complex and expensive prototyping tools. Low-technology prototypes are created with paper, scissors, pens or crayons, and manual animation. You can fake almost any user interface technique (using the Wizard of Oz technique: "Pay no attention to the man behind the screen."), and you can include the world outside the prototype. This makes it easy to test multiple ideas very quickly. Also, leaving detail out of these early prototypes forces evaluators to focus on the overall picture.

High-technology prototyping becomes more important as the problem becomes better understood, and the development cycle begins its transition to implementation. As prototyping tools evolve to be able to generate working code, they will enable faster iterations--and inclusion of late-breaking ideas--in later cycles of development.

We will look at prototyping technology in a future article. Right now, the important issue is what goes into the prototype--how it implements functions, and how it feeds results back to the user; how much information is explicit, and how much depends on implicit knowledge. Collectively, these fall into the categories of affordances and mental models, and these are the real building blocks of "look and feel."

  • Affordances are things that communicate information about the operation of a product. They are particularly important when a person is learning to use a product, but it's also true that a user can't use a product without the subtle, "back-channel" communication that indicates what the product is prepared to do, how it's doing it, what it does or doesn't understand, and what needs to happen next.

    Affordances are often culture-specific. As we move from text interfaces to include graphics and sound, "national language support" needs to be expanded to "national culture support." Icons, gestures, and sounds may also need to be translated.

  • Mental models are the private images that our minds use to interpret the world. Developers will create a "design model" that encompasses the structure and concepts behind the scenes. The prototype will present a "system model" that consists of cues, responses, feedback, and other mechanisms that communicate the design to the user.

    Users, on the other hand, do not have access to either of these mental models, and will of necessity develop their own. This "user model" is incomplete, continually evolving, and only occasionally (and coincidentally) related to either of the development models. Why?

    Users bring their own preconceptions and perceptual abilities to your product. They use only as much of your product as they need to, and learn only from what they use. What they learn, they learn through trial and error, or (more politely) "cause and effect." Unpredictable or inconsistent behavior in your product will distort or even destroy a user's mental model. And remember, this user model begins to form before the user begins using your product.

    If you can shape the user's mental model toward convergence with your system model, then the user's expectations will be more in line with the behavior of your product. Better still, though, is to mold your system model to fit the user's preconceived model.

You want to deliver the essence of your product to your users. Minimalism is a virtue in interface design--no excess verbiage, no pretty pictures, no nuisance panels. If it doesn't communicate currently available function or the current state of the user's data, get rid of it. Anticipate errors, and prevent them if at all possible. The user's every thought and gesture should be spent on the task, and never on housekeeping. And the user decides what to do next, not the product. The user is always right.

The "Invisibility Rule" summarizes this approach: If the user notices your design, then your design has failed. Or, to paraphrase Albert Einstein, your system model should be as simple as it needs to be, but no simpler.

Documentation--which is an integral part of both prototype and product--needs to contribute to and reinforce this simplicity. Beyond being task-oriented, it must lead the user through each task simply, with an emphasis on the user accomplishing real work while learning the product. If it is too hard to explain, then the product needs to be redesigned.

Consistency is a sticky issue, largely because it has been set up as a standard, without much regard for the underlying applications or what consistency really means. The real issues are affordances and mental models. If consistency provides an affordance by mimicking a known, similar behavior, one that also produces results similar to what you want, then use it. If it fits within the user's mental model, then use it. Otherwise, consistency becomes forced consistency, and leads to unpredictable behavior that destroys the user's mental model. At all costs, avoid forced consistency.

How do you evaluate affordances? How do you know that your design and system models mesh smoothly with your users' tasks and mental model? How do you know that your product solves your users' problems, and that it does so in a manner that lets them focus on their work, without even noticing your design? What process provides the best assurance of a world-class design, and helps avoid the cost and delay of rework? Prototype, test, and repeat. There is no other answer.


  1. Apple Computer, Inc., Human Interface Guidelines: The Apple Desktop Interface. Reading, MA: Addison-Wesley, 1987.
  2. Apple Computer, Inc., Macintosh User-Centered Design. Cupertino, CA: Apple Developer University.
  3. Evenson, S., Rheinfrank, J., Sutherland, D., Welker, K., and Wulff, W., Innovating interfaces: Concept creation and visualization. Worthington, OH: Fitch Richardson Smith, 1992. (CHI '92 tutorial)
  4. Muñoz, R., Miller-Jacobs, H. H., Spool, J. M., and Verplank, W., "In Search of the Ideal Prototype," CHI '92 Conference Proceedings. Reading, MA: Addison-Wesley, 1992.
  5. Norman, Donald A., The Design of Everyday Things. New York: Doubleday Currency, 1990. (Originally published by Basic Books, under the title The Psychology of Everyday Things.)
  6. Spool, Jared M., Product Usability Survival Techniques. New York: ACM, 1992. (CHI '92 tutorial)
  7. Tognazzini, Bruce, Tog on Interface. Reading, MA: Addison-Wesley, 1992.
  8. Wagner, Annette, and Tognazzini, Bruce, Designing Graphical Interfaces in the Real World. New York: ACM, 1992. (CHI '92 tutorial)

Copyright © 1992 by IBM Corporation. Published with permission.