2. The Coming of Age of Constructive Design Research
Constructive design research has a long history. Its roots go back to the 1950s and 1960s in Europe and North America, but most of its research practices have a more recent origin. Most constructive research programs have their immediate roots in the usability and user experience studies of the 1990s. Current research goes beyond user-centered design, however. Constructive design researchers borrow methods from design practice, value practice just as much as theory, prefer open conceptual frameworks that encourage creativity, and excavate design tradition for inspiration. Constructive design research is not a unified field, but works in conjunction with many types of institutions, including the natural and social sciences, contemporary art, business, and several technologies.
Most early writings on design research are built on rationalistic assumptions. Perhaps the most ambitious call for basing design on rationalistic thinking came from Herbert Simon, who proposed grounding design in systems and operations analysis. For him, design became an exercise in mathematics, and the task of design research was to describe the natural and human rationalities that govern it. 1 Such rationalistic assumptions were particularly strong in the 1950s and 1960s. At that time, the studio model of the Bauhaus became too limited to respond to the demands of increasingly complex and growing industries.
1.Simon (1996, pp. 2–9). Perhaps characteristically, Simon's ideas remained open in The Sciences of the Artificial. The best way to understand this book is perhaps to see it as an opening into a new domain, a prolegomenon, to borrow the words of his colleague Jim March (1978). His notes on design remained so abstract that there is simply no way to know where they would have led, had he built a complete research program on them. Simon's biographer, Hunter Crowther-Heyck (2005, p. 176), said that characterizing Simon's work as a collection of prolegomena is "uncharitable, but not entirely inaccurate." For more recent analyses of systems thinking in design, see Sato's notes in his 2009 paper (pp. 32–34), and Forlizzi (forthcoming).
However, rationalistic methods failed to get much of a following in design, probably because they barely tackled the human and artistic faces of design. One example was the "design methods movement," which bloomed for a few years in the 1960s, mainly in the United States and England. 2 Writing at the end of the 1990s, Swedish designer Henrik Gedenryd noted how this movement built on operations research and systems theory, trying to lay the foundations for design on
2.Archer (1968), Jones (1992, but first edition in 1970), Alexander (1968), Simon (1996) and Lawson (1980). The most influential of these writers is probably Jones, whose text still appears, even in doctoral-level research, as the definitive document of what design research is about. His rational approach to design was based on the notion of "complexity," claiming that new problems like urban traffic systems required methods and processes that could not be found in existing design traditions. Jones' (1970) solution was a matrix in which methods were classified by their place in the design process. The process was one in which a problem was first discovered and then transformed into design ideas, which were broken into subproblems. These subproblems were to be solved one by one and then combined into alternative designs, which were evaluated to find the best solution to the problem.
logic, rationality, abstraction, and rigorous principles. It portrays, or rather prescribes, design as an orderly, stringent procedure which systematically collects information, establishes objectives, and computes the design solution, following the principles of logical deduction and mathematical optimization models…. This view is still very much alive, and there is a good reason to believe that this won’t change for a long time.
However, discontent with this approach is widespread and quite old, even though no substantive replacement has yet been proposed. Experience from design practice and from studies of authentic design processes has consistently been that not only don't designers work as design methodology says they should, it is also a well-established fact that to do design in the prescribed manner just doesn't work.3
3.Gedenryd (1998, p. 1).
Leading rationalists like J.C. Jones and the mathematician-turned-architect Christopher Alexander quickly changed their earlier teachings about research. By the end of the 1960s, Alexander's advice was to "forget the whole thing," and Jones turned to music and poetry. In the end, both encouraged designers to experiment with art. 4
4.The now (in)famous new introduction to Jones' (1992) Design Methods not only said that the rationalistic program failed but also changed its form: it referred mostly to artists like John Cage and consisted of drawings, poetry, and fictional narratives. His book Designing Designing (Jones, 1991) developed this radical approach further, telling designers to discard rationalism and focus on imagination instead. It is hard not to agree with his call, but it is also worth noting that his solution does not have to be followed: it turns the very design process into a critique, even a travesty, of design and, by implication, of design research. Here, the problem is the same as in science, but more general: scientists and social scientists sometimes turn to poetry and the performing arts in an effort to break the conventions of their craft. However, more often than not, their work is not on a par with that of poets, performance artists, and dancers. It is far easier to behave as an artist than to be one. Similarly, Alexander (1971) said in an interview:
There is so little in what is called “design methods” that has anything useful to say about how to design buildings that I never even read the literature any more…. I would say forget it, forget the whole thing. Period. Until those people who talk about design methods are actually engaged in the problem of creating buildings and actually trying to create buildings, I wouldn’t give a penny for their efforts.
This statement should be taken with caution. One irony here is that Alexander was talking about architecture, a field where every building is unique, so the idea of a method is consequently slightly offbeat. Another problem lies in the notion of method. There are buildings that work perfectly well, even though their design could not be reduced to a particular method. That rationalistic methods failed does not mean that every method will fail forever. It depends on how "methods" are understood and even more on what foundations they build upon (see Chapter 7). Finally, Alexander talks about methods, not research, and these are different things.

However, Alexander points to an important issue — the connection between theory and doing. Given his background in mathematics, his interest in formalisms was understandable, but as the first person ever to receive a PhD in architecture at Harvard, his theories most likely reflected his background more than his practice. Over time, the realities of practice won.

The difficulty here is that his analysis equates a person with his practice in saying that only practitioners' texts have value. This view does not take into account the social basis of architecture (or design). By his measure, a person like Kees Overbeeke has no value to design research because he is not a designer. This claim is blatantly false and fails to take into account that Overbeeke's work has been necessary in expanding design. As we describe in detail in Chapter 3, his work has been essential to the welfare of one research program and has contributed to design, even though his background is in psychology. Following Alexander's blindness to the social background of a discipline would be plainly destructive.

Horst Rittel, who came from a similar background, developed the notion of "wicked problems," which is indebted to Herbert Simon's earlier work on the limits of rationality. This critique came from within the rationalist movement and was a part of the paradigm change of the 1960s that paved the way to the more philosophical criticisms of the 1980s. We will come back to these later in this chapter.

It must be noted that the design methodology movement continued to inspire design research quietly, in particular in design schools in England. With the exception of design studies, which attempts to understand designers' thinking (Lawson, 1980, 2004; Cross, 2007; Visser, 2006), this field went into hibernation for two decades (see also Bayazit, 2004, p. 21).
As Peter Downton noted, the rationalistic movement left a legacy of many useful means for improving design, but its problems went deep. 5 The rationalistic mentality faced many external problems. The 1960s saw the opening of the space era and Lyndon B. Johnson's Great Society, but it was also the high point of Branzi's first modernism. Soon after, the West was on course toward a second modernism. Along came a shift to consumer society, a general mistrust of authority, an explosive growth and diversification of higher education, and an awareness of looming ecological crises. Despite increasingly sophisticated methods aimed at handling complexity, human, social, and ecological problems proved to be "wicked" and unsolvable by rationalistic methods. 6
5.Downton (2005, p. 35) noted how
Writings concerned with what design should be, have focused on attempting to improve the design process by devising a rational method…. such formulations (labeled as “Design Methods”) were accompanied by virtual guarantees that their use would banish irrational design and herald the dawn of the era of rationality. Without wishing to decry such attempts, examination and attempted use over four decades have made it clear that they were ambitious and even misguided…. It is hard, perhaps impossible, to cite a single example of a building or urban design produced through the rigorous and unsullied use of one of these methods. They have left a legacy of many useful strategies and tools that can be used in research for design. The desire to promote means, if not methods, for “improving design” remains alive, although tempered with world-weary awareness, if not cynicism, of post-postmodernism. [italics removed].
6.For wicked problems, see Rittel and Webber (1973).
In a sense, the design methods movement arrived at design when it was already too late. To claim that technical expertise somehow automatically makes the world better was hardly credible to people who had lived through Auschwitz and Vietnam.
The failure of the movement was more than a matter of a changing mental landscape. The best-known attempt to lay design on rational foundations was the Hochschule für Gestaltung in Ulm, Germany. Starting as New Bauhaus in 1953 with roots in art and design, by 1956 its agenda had turned to teaching teamwork, science, research, and social consciousness in a modernist spirit. 7 The Ulm school is typically seen as the first serious attempt at turning design into a science of planning. 8
7.For a recent review of teamwork and collaboration in design, see Poggenpohl (2009b, p. 139ff), who notes that collaboration has a long though largely unwritten history in design and also reviews recent studies on managing information and communication, as well as issues related to the human dimensions of interdisciplinarity.
8.As Herbert Lindinger, himself trained in Ulm, noted in his introduction to a book he edited about Ulm in 1991, the school was established in 1952 as New Bauhaus. After 1956, the school first stressed teamwork, science, research, and multidisciplinary collaboration. From around 1958, scientists like Horst Rittel and Bruce Archer began to formulate design methodology, and artistic extravagance gave way to scientific caution and value-neutral design, both beliefs stemming from logical positivism. As Lindinger said, universal manifestos like "building a new culture" changed to working hypotheses; he dubbed the years from 1958 to 1962 the years of "planning mania." Soon, designers became a minority in the school. A crisis period followed, leading to a search for balance between theory and practice from around 1962 until about 1966, with Tomás Maldonado and Bruce Archer as leading lights (Lindinger, 1991, pp. 10–12).

As this history suggests, the school's position on theory and methodology was not consistent after 1956. As Michael Erlhoff (1991, p. 51) noted in the same collection, Ulm "took the case of modernity … back to the last phase of the Bauhaus, and carried abstraction forward into systematization. The HfG set out to be on the side of the modern age and found itself … subscribing to humanistic principles and so resisting the truth of its own modernity."

The point, quite simply, was that the modern tendency to see the world through abstract, scientific concepts may carry the promise of a rational society, but it also led to the horrors of the twentieth century. People at Ulm may have learned their methodology from logical positivists, but this dilemma was something they learned from the Frankfurt School of philosophers, most notably Theodor Adorno. Bonsiepe (1999, p. 13) listed some of the influences of the Ulm School with the demurrer "if my memory does not fail me." His list has a place for positivists, pragmatists, the Frankfurt School, and apparently the late Wittgenstein, as well as systems theory, concrete art/constructivism, Abraham Moles' aesthetics of information, and, as he said, to a lesser degree, surrealism.

The case for turning design into a science was never on solid ground, but it was strong enough to attract people like Reyner Banham (1991, pp. 58–59), for whom Ulm was like "a breath of painfully fresh air blowing down from the snowy Kuhberg" after London, where designers still believed in old shibboleths like "form follows function." It was a place where one could take intellectual risks because every claim, no matter how outrageous, was subjected to intense research and debate.

Andrea Branzi had the most notable alternative view of Ulm. For him, people working on the hill of Ulm were extraordinary artists who disguised themselves as ordinary artists (Branzi, 1988, p. 42). We come back to this argument in Chapter 6.
However, the Ulm experiment was short-lived. The long-time head of the Ulm school, Tomás Maldonado, reflected on his experience 15 years after the school was closed. 9 For him, the main cause of failure was sticking to "the theoretical generalities of a 'problem solving' which did not go beyond a 'discourse on method' of Cartesian memory."10 He wrote:
9.Here Maldonado (1972, p. 22) talked about Western rationalism in generic terms, but he captured its spirit perfectly.
What is really happening today is that men are being transformed into things so that it will be easier to administer them. Instead of working with men, one can work with schemes, numbers, and graphs that represent men. In that context, models became more important than the objects or the persons of which they were a mere replica. For many years now, the fetishism of models, especially in the fields of economics, politics, and military strategy, has typified the attitude of the late Enlightenment of the modern technocrats.
According to these people, perfection of the instructional and decision-making process is possible only if one succeeds in getting rid of all subjective interference with the construction and manipulation of the models used for obtaining that perfection.
By turning design into a science, one could get rid of “subjective interference” and pave the way to a world of plenty. Revolution would come by design, as Buckminster Fuller once prophesied (cf. Maldonado, 1972, pp. 27–29).
10.Maldonado (1984). This critique, somewhat paradoxically, also extends to art. In a recently republished paper, Otl Aicher argued that the Bauhaus gave too much priority to art at the expense of engineering and science. It built on a Platonic idea, in which art was the means to achieve knowledge of the ideal, spiritual, and abstract world that lies behind the things we see. Aicher (2009, pp. 177, 181) asked:
is design an applied art manifested in the elements of square, triangle, and circle, or is it a discipline that derives its criteria from the task at hand; from function, production, and technology?

He noted that this conflict remained unsolved at the Bauhaus "as long as the concept of art remained taboo, as long as an uncritical Platonism of pure form remained in force as a world principle." His example of such Platonism was Rietveld's chairs, which "turned out to be nothing more than Mondrians for sitting, ineffectual art objects with the pretext of wanting to be useful."
At Ulm, the models were designers like the Eameses. As Aicher wrote, "designers like Charles Eames were the first to show what it meant to develop products on the basis of their purpose, material, and methods of manufacture — on the basis of their function," rather than on the basis of geometry. "We all had good reasons to have reservations about the Bauhaus," he concluded (Aicher, 2009, pp. 181–182). In contrast, at Ulm, "the objective was not to extend art into everyday life, to apply it. The objective was an anti-art, a work of civilization, a culture of civilization" (pp. 178, 180–181). This realization, for its part, paved the way for user-centered design four decades later.
The driving force behind our curiosity, of our studies and of our theoretical effort consisted of our desire to furnish a solid methodological basis for design. One must admit that such a pretext was very ambitious: one attempted to force a change in the field of design which was very similar to the process which turned alchemy into chemistry. But our attempt was, as we know now, premature.11
11.Maldonado (1984, p. 5). In this text, Maldonado also refers to Herbert Simon's "limited rationality" thesis. We have omitted this sentence because we see it as another attempt to salvage rationalistic thinking and its "Cartesian" view of the world as a place of individual entities that can only be known by organizing painstaking observations into more abstract, meaningful entities.

Several intellectual movements have argued that Cartesian thinking presupposes those very things that make it possible in the first place. For example, we relate to things around us not only through ideas in our minds, but also with our bodies, and more often than not with other people. If one accepts the Cartesian worldview, many things are no longer considered. Out goes working with the body and hands; out goes sketching and prototyping; out goes basing design on social meanings; and out go dreams, beliefs, and emotions. Also no longer considered are working with people, studies of non-logical things like religion, integrating non-analytic tasks done by hand, and sketchy design processes designed for flexibility. For design research, this kind of rationalism provides a particularly narrow focus. (See also Maldonado, 1991.)
Indeed, how can anyone “solve” the problem of climate change through design? Modesty was in demand, given the scale of emerging environmental and social problems. Solving known problems rationally is a part of design, but can hardly provide anything like a solid foundation for it. Ultimately, the problem is one of creativity and critique, imagining something better than what exists, not the lack of rational justification (see Figure 2.1).
Figure 2.1
Rationalism faces post-Cartesian philosophy.
Small wonder that Gedenryd's conclusion about the usefulness of self-proclaimed rationalistic design processes was grim. 12 When he was writing his thesis, he was able to build not only on the disappointment of the rationalist program, but also on the rich debate about the limits of rationalism. For example, the Berkeley-based phenomenologist Hubert Dreyfus analyzed the assumptions at work in artificial intelligence. Despite their prowess in calculation, even the most sophisticated computers could not do things any child could, such as speak, understand ambiguity, or walk. Several computer scientists followed in the footsteps of Dreyfus' critique. 13 The 1980s was a decade when most humanities and social sciences turned to French social theory and philosophy, which further eroded belief in rationalism. 14 In the 1990s, Kees Dorst and Henrik Gedenryd followed Donald Schön's pragmatist perspective, arguing for seeing designers as sense-making beings rather than problem solvers. 15
12.Dreyfus (1972, 1993, 2001).
14.These criticisms pointed out that rationalism has limits that explain a good deal of its elegance. For example, when one does not have to deal with the body, or anything social, it is far easier to imagine people making rational decisions and, as important, obeying them. From a post-Cartesian perspective, rationalism was only possible because something in our lives made it possible: language, social action, our ability to talk and act in an orderly manner. From this perspective, rationalism is but a special case of a far more general way of thinking about humans. Rationalism works when the community believes in it and shares the same idea of what is relevant and what is not. This is the case in some closed, isolated communities, and certainly in some academic groups, but rarely anywhere else.
There were also several outspoken critics in the field coming from the social sciences and the human-centered corner of computer science. For example, Lucy Suchman studied how people use copy machines at Palo Alto Research Center. She demonstrated that rational reasoning has little to do with how people actually use the machines, and urged designers to take social action seriously. 16 Participatory designers and critical information systems researchers borrowed from Ludwig Wittgenstein's philosophy to understand how ordinary language works in the background of any system. 17 Groups at the University of Toronto, Stanford, Carnegie Mellon, MIT, and many other North American universities proved that technological research can be done on pragmatic grounds, without complex rationalistic methodology.
16.Suchman (1987). Another important writer who pointed out the importance of looking at social action was Edwin Hutchins (1996), who introduced the notion of "cognition in the wild," referring to the need to study thinking in real settings. Activity theorists added that there was also a need to look at the historical background in any attempt to understand action (Kuutti, 1996).

2.1. The User-Centered Turn: Searching the Middle Way

After the demise of the design methods movement, designers turned to the behavioral and social sciences in search of new beginnings. In several places, user-centered design gained a foothold. 18 In terms used by Nelson and Stolterman, the rationalists were idealists in their search for truth. When this search was over, the next place to look was the real world. 19
18.The focus on humans has been expressed in many ways. The design program's Web site at Stanford claims that the idea of human-centered design was invented at Stanford when John Arnold built the design program in the mid-1950s. This may be true, but one should also remember that in the United States, designers like Dreyfuss and Teague had already been working with the military for a long time while putting humans into the middle of design work. In Europe, the Ulm school was built on the same idea, and ICSID was already working to make humans the center point of its definition of design.

When computers became design material in the 1990s, humans became "users," which suggests that they are seen as parts of technical systems (see Bannon, 1991). Seen against the history of design, this was an extraordinary semantic reduction. At its narrowest, people came to be seen as barely more than biological information processing units in technical systems. When reading, say, ICSID's definition of industrial design, one is struck by the discordance with its humanistic spirit.
This step was not radical, given designers' self-image. Designers have long seen themselves as spokespeople for people in industry. The global organization for industrial designers, ICSID, defines the basic ethos of the occupation as follows:
Design is a creative activity whose aim is to establish the multi-faceted qualities of objects, processes, services and their systems in whole life cycles. Therefore, design is the central factor of innovative humanization of technologies and the crucial factor of cultural and economic exchange.20
20.ICSID, icsid.org/about/about/articles31.htm, retrieved October 22, 2009. The definition goes back to the turn of the 1950s and 1960s and is based on Tomás Maldonado's thinking. See Anceschi and Botta (2009, p. 23), and note 5 in their text.

Maldonado had his predecessors. Ulm's first principal, Max Bill (2009), was trained at the Bauhaus and used Bauhausian language when writing about design as a human discipline in 1954:
the task of the artist is not to express himself and his feelings in a subjective way; it is to create harmonious objects that will serve people…. artists, as part of their responsibility for human culture, have to grapple with the problems of mass production…. the basis of all production should be the unity of functions, including the aesthetic functions of an object … and the aim of all production should be to satisfy people's needs and aspirations.
For Maldonado and his colleagues in Ulm, the way forward was the then fashionable information theory and linguistics. Otl Aicher tells how one of the first books he acquired for the Ulm School's library was Charles Morris' Sign Theory. Its classification of information into semantics, syntax, and pragmatics became a theoretical foundation for him and for Maldonado. For Aicher, this classification revealed that the focus of design must be semantics, that is, communication, not the syntax of elementary geometry then prevalent in avant-garde graphic design and photography. In photography, for instance, this led to a study of photojournalists like Felix H. Man, Stefan Lorant, and Robert Capa, whose job was communication, not art.

As Aicher related (2009, pp. 183–185), studies of mathematical logic led him and Maldonado to realize that any answer they wanted to get to their questions depended on the method: "the spirit was a method, but not a substance. We experience the order of the world as the order of thought, as information."
As this definition shows, designers see themselves as advocates for people in industry. This self-image has more than a grain of truth, especially when designers are compared to engineers. 21 It also has deep historical roots. The importance of studying people was first forcefully introduced to design in post-war America, largely through practitioners like Henry Dreyfuss, one of the founding fathers of design ergonomics. In particular, Dreyfuss' books Designing for People and The Measure of Man influenced generations of designers. 22
21.For some of the paradoxes here, see Redström (2006).
22.Tilley and Dreyfuss' (2002) The Measure of Man in 1959 was a landmark that described the dimensions of Joe and Josephine, two average Americans. The origins of ergonomics in America — or human factors, as ergonomics is also called in the United States — lie in the war. As Russell Flinchum (1997, pp. 78, 84) noted, the exact history of how ergonomics came to be established in design is probably lost in old classified materials. Ergonomics in design was largely codified by Alvin Tilley, an engineer working in Dreyfuss' design firm. Tilley used a variety of sources creatively in The Measure of Man (Tilley and Dreyfuss, 2002), including military sources as well as material from Manhattan's fashion industry (Flinchum 1997, p. 87). As Flinchum also noted, the characters of Joe and Josephine were meant to be used as guidelines in preliminary investigations in design; they were never meant to be used as exact descriptions of humans (Flinchum 1997, p. 175).
However, it was in the 1990s that industrial design and the emerging interaction design went through the so-called user-centered turn. The key idea was that everyone has expertise of some kind and, hence, can inspire design. In retrospect, the most important ideas from this time built on usability and user-centered design.
Usability fell on the fertile ground of ergonomics and spread quickly. Its roots go back to the early 1980s, with companies such as Digital Equipment Corporation and IBM at the forefront. Early on, usability was divided into two camps: practical engineers and researchers whose backgrounds were usually in cognitive psychology. 23 Usability laboratories popped up in hi-tech companies and universities in North America, Japan, and Europe, and the academic community grew rapidly. Practitioners built on books like Usability Engineering by Jakob Nielsen, while the more academic field was reading books like Don Norman's The Design of Everyday Things. 24
24.Nielsen (1993), Norman (1998), with the original in 1988.
The problem with usability was that, while it did help to manage design problems with increasingly complex information technologies, it did little to inform design about the “context” — the environment in which some piece of design was meant to do its work. The image of a human being was that of an information processor, a cybernetic servomechanism. 25 Context was but a variable in these mechanisms. New, more open methods were developed, and they came from ethnography.
25.For a good analysis of where this worldview came from in computer science and psychology, see Crowther-Heyck's (2005, pp. 184–274) analysis of Herbert Simon and the early stages of artificial intelligence in America.

Few reliable sources exist about Japanese companies' user-centric practices from the 1970s and 1980s, but anecdotes reveal that they were on the front line with the Europeans and the Americans. For instance, John Thackara (1998, p. 20) admired Sharp's "humanware design" in the 1980s, telling how the company anchored its practice in it and reversed the traditional production-led Western ways in which design attempts to fit product specifications to match factories and laboratories. Instead, "Sharp employs sociologists to study how people live and behave, and then plans products to fill the gaps they discover…. new technology is used to create what consumers are discovered to 'want,'" he wrote.
The design industry started to hire ethnographers in the 1970s, first in the Midwest and the Chicago area and slightly later in California. 26 The best-known pioneers were Rick Robinson, who worked for Jay Doblin and later E-Lab, and E-Lab's marketing-oriented rival Cheskin. Interval Research in Palo Alto, funded by Microsoft co-founder Paul Allen, hired John Hughes and Bonnie Johnson to teach fieldwork. Several anthropologists were hired by major companies in the 1990s, including Apple (1994) and Intel (1996). Another inspiration was fieldwork done in design firms like IDEO and Fitch. These were quick and rough ethnographies done very early in the design process for inspiration, and they provided a vision that worked as "glue" in long and arduous product development processes. Yet another American precursor was Xerox PARC, where design was infused with ethnographic techniques, ethnomethodology, and conversation analysis. 27 Through PARC, ethnomethodology influenced a field called "computer-supported cooperative work" (CSCW). The aims of much of this work were summarized by Peggy Szymanski and Jack Whalen:
Plainly, as social scientists these researchers were committed to understanding the fundamentally socio-cultural organization of human reasoning and action …. moreover, these researchers were equally committed to naturalistic observation of that action — to leaving the highly controlled environment of the laboratory so that what humans did and how they did it could be studied in real-world habitats and settings, under ordinary, everyday conditions.28
In Europe, an important inspiration was Scandinavian participatory design, even though its radical political ideology was lost when it spread to industry. Although its direct influence was not felt much in design beyond the borders of Scandinavia, it had a degree of impact in software development and later in design in the United States. 29 It also had limited impact in art and design schools. Still, in retrospect, it managed to do two things typical of contemporary design research: working with people and using mock-ups. 30
29.For a history of the early years of participatory design in the United States, see Greenbaum (2009). More history can be found in Bannon (2009). Obstacles to participatory techniques in organizations were mapped by Grudin (2009).
30.For participatory design, see Ehn (1988a), Iversen (2005) and Johansson (2005). For contextual design, see Beyer and Holtzblatt (1998). For recent work in combining anthropology and design, see Halse et al. (2010). We will come back to participatory design in Chapters 5 and 7.

Eureka
Fieldwork Leads to an Information System
Jack Whalen
How can you design an information system that enables a firm’s employees to easily share their practical knowledge, and then put this knowledge to use each and every day to solve their most vexing problems? (See Figure 2.2.)
Figure 2.2
Observing technicians at work in Eureka project.
Most companies have tackled this problem by brute force, building massive repositories of their reports, presentations, and other officially authorized documents that they hope contain enough useful knowledge to justify the effort, or by placing their faith in artificial intelligence, designing expert systems that basically try to capture that same authorized knowledge in a box. Yet everyone recognizes that much of any organization’s truly valuable knowledge, its essential intellectual capital, is found in the undocumented ideas, unauthorized inventions and insights, and practicable know-how of its members. Most of this knowledge is embodied in the employee’s everyday work practice, commonly shared through bits of conversations and stories among small circles of colleagues and work groups, with members filling in the blanks from their own experience.
Researchers at Xerox’s renowned Palo Alto Research Center (PARC) came face to face with this reality only after they first took the artificial intelligence (AI) route, designing a sophisticated expert system for the company’s field service technicians to use when solving problems with customers’ copiers and printers. Its knowledge base was everything that was known about the machines — everything in “the book.” But the researchers soon discovered, after going into the field and observing technicians as they went about their daily rounds, that technicians often had to devise solutions to problems for which “the book” had no answer — what you could call “the black arts of machine repair.” A way to share this kind of knowledge throughout their community — an information system designed to work like those stories and conversations, and managed by the community itself — is what technicians needed most.
And so together the technicians and PARC researchers co-designed a peer-to-peer system for sharing previously undocumented solutions to machine problems that are invented by technicians around the world, and named it Eureka. From the very start, Eureka saved the company an estimated $20M annually and continues to do so, with Xerox being named “Knowledge Company of the Year” by KMWorld Magazine (and garnering several other IT and management awards) as a result (see Figure 2.3).
Figure 2.3
Research and design process in Eureka.
From such humble beginnings, the field has grown over the past few years into a community of industrial ethnographers sizeable enough to run an annual international conference, the Ethnographic Praxis in Industry Conference (EPIC). As its founder Ken Anderson explained, it was designed mainly to share learning among practitioners of design ethnography. Still, it also sought academic approval from the American Anthropological Association to make it more than a business conference where consultants run through their company portfolios (Figure 2.4). 31
31.Ken Anderson and Scott Mainwaring to Koskinen, August 19, 2010, at Hillsboro, Oregon.
Figure 2.4
Why study people? It is not difficult and provides better results than thinking.
The outcome of this work was a series of fieldwork techniques that became popular in the second half of the 1990s. American interaction designers also created a blend of analytic and communication techniques, such as "personas." These are constructed, detailed descriptions of individual characters, created both to highlight research results and to encourage developers to implement the design team's design. Through scenarios, designers study the viability of these concepts in different future situations. 32
32.The main statement of personas is in Cooper (1999). John Carroll has edited and written several books about scenarios (see especially Carroll, 2000).
However, user-centered design had its problems too. Ethnography mainly focused on the early stages of design, and usability on its very end, which limited their usefulness. User-centered design was software-oriented in its tone and spread only slowly to other fields of design. Both were largely seen as imports from sociology, anthropology, and psychology. They were also seen as research rather than design practices. Also, if stretched to a prophecy, user-centered design fails: as Roberto Verganti argued, most products on the market are designed without much user research. 33 For reasons like these, user-centered design failed to attract a following among more artistically oriented designers in particular.
33.Verganti (2009). One standard complaint about user-centered design is that it leads to unimaginative and conservative design. Although this is only a part of the story, there certainly is a grain of truth in this criticism. However, this criticism has its faults too: there are many examples of short-sighted, designer-driven design that has led to rubbish, and there are better ways to judge how effective a design approach is than by looking at traditional products like coffee pots and sofas. See Verganti (2009) for a defense of designer-driven design.
For good reasons, both usability and user-centered design are alive and well today. In particular, they placed people in the middle of design and gave credibility to designers' claims that they are the spokespersons of people in production. They also produced many successful designs and provided design researchers with ways to publish their research.

2.2. Beyond the User

Despite these limitations, user-centered design created powerful tools for understanding people and creating designs that work. However, it was just as obvious that it was not able to respond to many interests coming from the more traditional design world. User-centered design methods may have helped to explore context for inspiration, but they left too many important sources of imagination in design unused.
Constructive design researchers have had good reasons to go back to contemporary art and design in search of more design-specific methods and ways of working (see Figure 2.5). The past 15 years have seen a proliferation of openings that not only build on user-centered design, but also go beyond it. Several research groups have begun to address the problem of creativity with methodic, conceptual, technological, and artistic means. 34
Figure 2.5
Making people imagine is a problem in constructive design research.

2.2.1. Design Practice Provides Methods

One push beyond the user was methodic. The 1990s and 2000s saw the growth of "generative" research methods that put design practice at the core of the research process. These design-inspired methodologies include experience prototypes, design games, and many types of traditional design tools such as collages, mood boards, storyboards, pastiche scenarios, scenarios, "personas," and various types of role-plays. 35 There is no shortage of such methods: Froukje Sleeswijk Visser listed 44 user-centered methods in her doctoral thesis at Delft, and IDEO introduced a pack of cards with 52 methods (see Figure 2.6). 36
36.Sleeswijk Visser (2009, p. 63).
Figure 2.6
IDEO Method Cards. These cards describe design methods in words on one side; the other side gives an illustration of the method.
One striking feature of much of this work is the speed at which it has gained influence and has been adopted by its audience even beyond design. In the computer industry, scenarios and personas have become mainstream, while in industrial design, cultural probes, Make Tools, and action research have spread fast. 37 These methods have been quickly adapted to a wide variety of design work, often with a limited connection to the intentions of the original work. 38 Still, they have given designers ways to research issues like user experience. They also help to open the design process to multiple stakeholders. 39
39.For example, Sanders (2006).

2.2.2. Turn to Technology

Another important concept that has pushed design research beyond user studies can be loosely called the "sandbox culture." This is similar to engineering in Thomas Edison's Menlo Park, or to the hacker culture of Silicon Valley in the 1970s. One can, as engineers at the University of Toronto did, turn a (computer) mouse into a door sensor without going into the physics of sensors. The modus operandi of the most successful design firm in the world over the past two decades, IDEO, has been characterized as "technology brokering": finding problems and solving them by exploring technology creatively through engineering imagination, not scientifically. 40
The most famous sandbox culture existed at MIT's Media Lab under the leadership of Nicholas Negroponte, where the old scientific adage "publish or perish" became "demo or die."41 Other sandbox cultures that served as exemplars for design researchers were Toronto, Carnegie Mellon, the Interactive Telecommunications Program at the Tisch School of the Arts at New York University, and Stanford's design program. 42 They showed that it is possible to do research with things at hand, without complex justifications and theoretical grounds, and to just let imagination loose in the workshop. 43 This is typical of software design as well. 44 The prestige of these places has also given legitimacy to building new sandboxes in places like Technische Universiteit Eindhoven.
41.Our reference to "demo or die," as well as its attribution to the MIT Media Lab under Nicholas Negroponte, is from Peter Lunenfeld (2000).
42.Stanford’s “d.school” is an informal name. The full correct name of the institute is Hasso Plattner Institute of Design at Stanford, after its principal source of funds.
43.For example, there exists human–computer interaction, and at least these “computings”: mobile, urban, social, physical, collective, ubiquitous, embedded, proactive, and wearable. In interaction design, there are also many “interactions”: tangible (Wensveen, 2004), interactive space (interactivespaces.net), aesthetic (Graves Petersen et al., 2004), rich (Frens, 2006a), intuitive (Lucero, 2009), kinesthetic, embodied (Dourish, 2002), emergent (Matthews et al., 2008), and resonant (Overbeeke et al., 2006).

MIT Media Lab
Maybe the best-known sandbox has been the Media Lab at MIT in Cambridge, Massachusetts. It was created in 1985 with a mission to explore and develop media technologies. It had precursors at New York University, where the Tisch School of the Arts had run the Interactive Telecommunications Program (ITP) under Red Burns since the early 1970s.
However, while ITP focused more on media content and only gradually grew into technology, MIT focused on technology from the beginning. The Media Lab's mission was to explore and develop new media technologies and to conceive and illustrate new concepts by prototyping them. This is where the Media Lab started, and this is where it still stands. Its moment of glory was probably the second half of the 1990s, when the IT industry exploded with the Web and soon after with mobile technology (see Figure 2.7).
Figure 2.7
For a while, the Media Lab was one of the most closely followed research institutions in the world, as judged by the digital industries. Several other institutions were modeled after its example in Asia and Europe; the most famous of these was probably the short-lived Interaction Design Institute Ivrea in Italy.
When one walks into the building in Massachusetts, there are no classrooms or corridors, only workspaces in which people sit in the middle of wires, sensors, circuits, computers, lights, and "old materials" of many sorts, most of them organized in open spaces where it is possible to walk around and try things out.
Several famous concepts have come out of the Media Lab. Some of the most influential in the research world have been Hiroshi Ishii's interactive ping-pong table and his bottle interface for a music player.
From a constructive design research standpoint, the Media Lab well illustrates three points. First, doing is important for designers: one can create new worlds by doing. Second, design research needs design; design happens at the Media Lab, but it is not a priority. Duct-tape creations are enough, because prototypes are used to illustrate technological possibilities, not design possibilities. Third, a focus on technology means that technological research comes before writing. The Media Lab is famous for the prototypes it creates.
The co-founder of the Media Lab, Nicholas Negroponte, is said to have replaced the old academic adage “publish or perish” with “demo or die.” (See Figure 2.8.)
Figure 2.8
The main legacy of this culture is a set of research communities exploring new possibilities in information technology. For example, by now there are conferences specializing in ubiquitous and pervasive computing and in tangible interaction. For those constructive researchers who specialize in interactive technologies, these communities provide many types of publication possibilities. Also, by now, there are many design frameworks, ranging from resonant interaction to rich and intuitive interaction. Chapter 4 presents some of these frameworks in detail.

2.2.3. Enter User Experience

In the 1990s, design researchers created many types of concepts that paved the way to constructive research. Important trailblazing work was done at IDEO and SonicRim, where Uday Dandavate, Liz Sanders, Leon Segal, Jane Fulton Suri, and Alison Black emphasized the role of emotions in experience and started to build the groundwork for empathic methodologies. 45 In Europe, the leader was probably Patrick Jordan at Philips, who claimed that design should build on pleasure rather than usability. 46 Influential studies like Maypole followed his lead, usually building on concepts like need. 47
46.For a push toward hedonic psychology — psychology of pleasure — in the 1990s, see Patrick Jordan’s (2000) work. For Maypole, see Mäkelä et al. (2000).
47.Maypole was a project funded by the European Union. Its aim was to study communication patterns in families to suggest new technologies. Participants were the Helsinki University of Technology (and through it, the University of Art and Design Helsinki), IDEO Europe, Meru Research b.v., the Netherlands Design Institute (which coordinated the project), Nokia Research Center, and the Center for Usability Research and Engineering.

Maypole did field studies of communication behavior. Based on these studies, it developed scenarios and concepts, tested methods and tools, and built prototypes that were then studied in countries like Austria and Finland. For example, one study connected a digital camera to a laptop in a backpack, which allowed it to capture and send images immediately. See Mäkelä et al. (2000). For Maypole, see cordis.europa.eu/esprit/src/25425.htm, retrieved September 12, 2010; maypole.org; and meru.nl.
This hedonic and emotional movement was a useful correction to cognitive psychology, which had crept into design research through usability and through design studies focusing on what designers know and how they think. 48 However, the movement remained individualistic, its key constructs were difficult to understand, and it focused on measurable emotions at the expense of more finely tuned emotions like aesthetic feelings, which are crucial to design.
For reasons like this, the main conceptual innovation came to be user experience, which was open enough to avoid many of these problems. 49 It did not have unwanted connotations like the word "pleasure," and it was not contested like "aesthetics," a term with a long history in art history and philosophy. The concept has been so successful that leading universities, corporations, and design firms have built units to study user experience. Even the International Organization for Standardization (ISO) is trying to create a standard for user experience in industrial practice. Finally, pragmatist philosophy gave this concept credibility, depth, and openness. 50
49.For “user experience” in industry and universities, see Shedroff (2001), Forlizzi and Ford (2000) and Battarbee (2004). Theoretically, this notion is alternatively grounded in Dewey’s pragmatism (1980; see McCarthy and Wright, 2004), symbolic interactionism (Battarbee, 2004), ecological psychology (Djajadiningrat, 1998, pp. 29–61), or emotional psychology (Desmet, 2002).
50.Usually the main reference is John Dewey (1980) and especially his Art as Experience. Over the past few years, there has been more interest in William James, but references to Dewey still dominate research.

2.2.4. Design Tradition as Inspiration

Yet another push beyond user-centered design came from design itself. The key place was the Computer Related Design program at the Royal College of Art (RCA) in London. Its researchers explored new media in city space and alternative ways to design electronics. They explicitly built on art and design and had an agnostic tone when it came to science. 51
51.See Chapter 6.
For example, the main influence on the Presence Project, published in 2001, was an artistic movement called "situationism."52 What came to be known as "critical design," on the other hand, built on designers like Daniel Weil, who had questioned the design conventions of electronics. 53 Critical design was also influenced by Italian controdesigners and by the Dutch design concept Droog Design, which was itself inspired partly by Italian design. 54
52.Presence Project (2001).
54.For Droog, see Ramakers (2002) and Ramakers and Bakker (1998); for radical designers, see Celant in Ambasz (1972). Bosoni (2001) provides a long-term perspective on discourse in Italy.
Today, many design researchers seek inspiration from art and design, 55 and many are also active debaters and curators. 56 The art and design worlds are also converging commercially, with one-offs, limited editions, and prototypes becoming objects sold by major auction houses. 57 There are hundreds of books about designers' sketches in bookstores, effectively representing designers as artists. The media celebrate designers much as they have celebrated artists. Some company research centers have also run artist-in-residence programs. 58
55.There is no shortage of literature that maps the relationship of art and design. For a particularly knowledgeable analysis of the relationship between pop art and design, see Bocchietto (2008). A good recent example is Stefano Giovannoni's work for Alessi (see Morozzi, 2008). A less consistent account of surrealism in design is Wood (2007).
56.For example, Alessandro Mendini works as an all-around cultural personality whose work is available in numerous designs but is sometimes also exhibited as art (see Fiz, 2010), and Andrea Branzi continues to curate high-profile events in places such as Milan. See, for instance, Neues Europäisches Design, which Branzi curated with François Burkhardt in Berlin in 1991, and the more recent exhibition What Is Italian Design? The Seven Obsessions at Milan's Triennale (Branzi, 2008).
57.For one-offs and prototypes, see Lovell (2009). See also Konstantin Grcic’s Design Real (2010), which commented on this tendency by showing ordinary industrial products in a gallery. Ordinary products may lack the mystique of one-offs and prototypes, but not functionality and elegance. For how craft can be treated as art, see Ramakers and Bakker (1998) and Holt and Skov (2008).
58.For PAIR, an artist-in-residence program at Xerox’s Palo Alto Research Center, see Harris (1999).

2.3. Between Engineering, Science, Design, and Art

This history has left a legacy for constructive design research, which rests on several foundations. A good deal of early design research was built on rationalistic models that from the beginning faced many kinds of political and scientific difficulties. Constructive design research has turned away from this foundation. Researchers seek inspiration from engineering as well as from the social sciences and design traditions. What it is doing is important: it is bringing design back to the heart of research.
By now, constructive design research has gained a degree of maturity and autonomy. There have been several milestones in this maturation. Methods like scenarios, personas, Make Tools, and cultural probes played an important role in lowering the barriers to designers' entry into research. These methods have proved that many things in design practice can be turned into research methods fairly easily. After the end of the 1990s, conferences like Design+Emotion, Designing Pleasurable Products and Interfaces, and Nordes 59 gave designers an opportunity to explore design-related topics with little gatekeeping from other disciplines. A few influential books have served as precedents; noteworthy are Anthony Dunne and Fiona Raby's Design Noir and Dunne's Hertzian Tales. Several dozen doctoral theses build directly on design rather than borrow methodologies and concepts from other disciplines (see Figure 2.9). 60
Figure 2.9
The two-headed design researcher (homage to Henry Dreyfuss).
The development is uneven, and the strongest institutions have taken the lead. Among universities, the most research-driven are well-resourced schools such as Politecnico di Milano, the technical universities in Delft and Eindhoven, Carnegie Mellon University, and what was the University of Art and Design Helsinki (now a part of Aalto University). Among global companies, the leaders have included Intel, Microsoft, and Nokia, as well as some of the largest design firms, like IDEO. 61 Among the pioneers were Delft's IO Studiolab, which combined several studios at the end of the 1990s, the Smart Products Research Group at Helsinki's UIAH, Philips' visionary programs, and Intel's anthropological fieldwork. 62
62.For Philips, see De Ruyter and Aarts (2010) and Aarts and Marzano (2003); for Intel, see Cefkin (2010).
Underneath this canopy, a good deal of the design world went on as before. However, the strongest schools and companies set examples for others to follow. Once they did the trailblazing, others found it easier to follow suit.
Although constructive design research is coming of age, this is only one part of the story. This research is typically multidisciplinary and takes place in institutions over which designers have little control. Constructive design researchers typically collaborate with sociologists, anthropologists, and computer scientists. In these research groups, design researchers are often junior partners who need to follow the research models of their more established colleagues. 63 Some consequences of this collaboration have left a mark on constructive design research. For example, experimental research became an almost unquestioned choice for constructive design researchers, especially in technical universities and the technology industry.
63.This dilemma, and a drift to the applied science model, was already discussed by Herbert Simon in The Sciences of the Artificial. Apparently because of the prestige bestowed upon the sciences in the years following World War II, leading engineering schools of that time were clearly opting for the science-based model; see Simon (1996, p. 111).
Design also juggles with the worlds of art and culture. Even designers who work with industry typically have one foot in art and culture, as in the famous case of Olivetti. Designers working for Olivetti in Ivrea, about 100 kilometers west of Milan, also continued living and working in Milan with company approval. This arrangement made it possible for designers like Ettore Sottsass and Michele de Lucchi to alternate between industrial work and Milan's artistic and intellectual milieux. 64 Today, it is easy to see a similar balancing act in the work of researchers coming from the RCA, Sheffield Hallam University, and several Scandinavian universities. These researchers sometimes work at the university and sometimes as independent designers and artists. They also mix these worlds in various ways, especially by communicating their work through exhibitions rather than books. 65
64.Ambasz (1972), Branzi (1984). The Olivetti case is from Kicherer (1990, pp. 17, 25).
65.For example, see Freak Show: Strategies for (Dis)engagement in Design, an exhibition at the HelmRinderknecht gallery in Berlin.
Constructive design research has managed to gain a degree of autonomy and recognition on its own, but it has to find its way through an environment that sets many standards for research. In research today, coalitions are the norm, not the exception. These coalitions tend to be strategic and temporary, usually lasting for only one project and then disappearing as the parties move on to other projects. 66 To flourish in this environment, constructive design researchers need methodological and theoretical flexibility.
66.Nowotny et al. (2008). We come back to this point at length in Chapter 3.