CHAPTER 5

Technologies of Input and Output

We don’t—can’t—interact with our information directly but only through the tools that we use. The tools influence not just the ways we interact with information but also our ability to apprehend it. Though we may not usually think of it so, paper is a tool (and the ability to produce paper is a technology). So too are leak-proof pens, paperclips, staplers, and filing cabinets with hanging folders. They’re all tools for managing paper-based information.

Even if some of us can contemplate a gradual obsolescence of the paper-based book in favor of a digital delivery of information via tablets, pads, and palmtops, we should all acknowledge the profound impact that the bound book, as a tool, has had on our ability to work with information.16 Contrast the use of the book with an earlier use of scrolls and it’s easy to believe that quantitative changes—e.g., it’s faster to leaf through the pages of a bound book than to roll and unroll a scroll—might add up to qualitative differences in our ability to work with and understand our information.

We interact with our digital information via a computer in one device or another. Computer-based tools—devices, gadgets, desktop and web-based applications, etc.—are having a profound impact on our interactions with digital information. The technologies behind these tools can be grouped into two general areas according to whether they support the output of information from the computer or the input of information to the computer.

5.1   TECHNOLOGIES OF OUTPUT

How do we get our information from a computer? Some of us can recall mainframe computing days when the results of a “run” were communicated via paper printout to a designated bin in the anteroom of the computer area. The teletype was a big improvement. Now we have display screens of ever better resolution for laptop and palmtop use. We can elect to get our textual information spoken to us (e.g., as we walk or drive) via voice synthesis.17
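
To give a sense of how accessible speech output has become, here is a minimal sketch of text-to-speech in Python. It assumes the third-party pyttsx3 library (an offline speech-synthesis engine), one of several possible choices; nothing in the chapter depends on this particular library.

```python
# A minimal sketch: speaking textual information aloud via speech synthesis.
# Assumes the third-party pyttsx3 library is installed (pip install pyttsx3).
import pyttsx3


def speak(text, words_per_minute=160):
    """Read the given text aloud, e.g., a message received while walking or driving."""
    engine = pyttsx3.init()                       # use the platform's default TTS driver
    engine.setProperty("rate", words_per_minute)  # approximate speaking rate
    engine.say(text)                              # queue the utterance
    engine.runAndWait()                           # block until speech has finished


if __name__ == "__main__":
    speak("You have one new message: on my way home.")
```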

We may soon be looking at our information through glasses we wear, either via a tiny separate display embedded within the eyeglasses or as an “augmenting” overlay to the sights and sounds of the physical world.18

If we’d rather not wear glasses, we might accessorize in other ways such as through a “watch-watch” as discussed in Chapter 3 (Part 1). The “watch-watch” was originally discussed in 2007 as a way to connect a palmtop in pocket or purse with an item many of us habitually wear in any case—a wristwatch.19 The watch then provides a display surface for much that is timely beyond the time of day. Who is calling us right now? What meetings do we have today? What’s the traffic like on the 520 bridge? How are our stocks doing?20

Given these new devices, some have enthused that “the end of the smartphone era is coming.”21 Others are more skeptical. Google Glass, for example, has been likened to Apple’s ill-fated Newton; that is, it may be years ahead of the supporting technology needed to make it mainstream.22

More likely is a future of several devices, including a palmtop with a long battery life kept in pocket or purse as a kind of data hub connecting the other accessories we wear to the Internet.

Also on the output side of our interactions with computing devices are animations and an ability to “zoom” into and out of our information. Zooming and other animations provide a powerful way to apprehend a large volume of information.23

If “today is the world” (courtesy of Google Earth, Bing Maps, etc.), then perhaps tomorrow is our own personal space of information (PSI). Suppose, for example, that a Steinberg-like map of our personal world were used to organize and provide access to all our information. We might zoom into a particular area—a project we are working on now, a special party we are planning—only to zoom out and back in again to another area: plans for a ski trip over the weekend or for a summer vacation.


Figure 5.1: What if we could access our informational worlds via a customized, “zoomable,” 3-dimensional version of Steinberg’s New Yorker-centric map of the world?24 From Steinberg: illustration for the cover of The New Yorker, 1976. Copyright © 2013 Condé Nast. Used with permission.
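
At the heart of any such zoomable interface is a small piece of arithmetic: zooming in on an item while keeping the point under the cursor (or finger) fixed on the screen. The sketch below, in Python, illustrates that calculation; the class and method names are illustrative and not drawn from any particular toolkit.

```python
# A minimal sketch of the "zoom about a point" arithmetic behind zoomable interfaces.
# The names (ZoomableView, zoom_about) are illustrative, not from any particular toolkit.
from dataclasses import dataclass


@dataclass
class ZoomableView:
    scale: float = 1.0      # world units -> screen pixels
    offset_x: float = 0.0   # screen position of the world origin
    offset_y: float = 0.0

    def world_to_screen(self, wx, wy):
        return (wx * self.scale + self.offset_x, wy * self.scale + self.offset_y)

    def zoom_about(self, sx, sy, factor):
        """Zoom by `factor` while keeping the screen point (sx, sy) fixed,
        e.g., the project or party the user is focusing on."""
        self.offset_x = sx - factor * (sx - self.offset_x)
        self.offset_y = sy - factor * (sy - self.offset_y)
        self.scale *= factor


view = ZoomableView()
view.zoom_about(400, 300, 2.0)     # zoom in 2x on the point under the cursor
print(view.world_to_screen(0, 0))  # the world origin has moved accordingly
```

Repeated calls to zoom_about, interpolated over a few animation frames, give the smooth zooming in and out described above.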

5.2   TECHNOLOGIES OF INPUT

In the mainframe days of computing, we may have given our information to the computer via punched cards25 that were themselves punched on a keypunch.26 The keyboard, whether attached to a keypunch, teletype, terminal, or personal computer, was our primary means of input for decades. Then came the mouse—mostly as a complement to, not a replacement for, the keyboard. The promise of “keyboard-free” voice recognition seemed for decades to be a mirage on the horizon. But voice recognition continues to improve and may now be good enough for use even in noisy environments—good enough at least for simple messages (e.g., “on my way home”).
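
By way of illustration, capturing a short spoken message such as “on my way home” has become a few lines of scripting. The sketch below assumes the third-party SpeechRecognition Python package (with PyAudio for microphone access) and a cloud-based recognizer; it is one possible arrangement among many, not a recommendation.

```python
# A minimal sketch of speech-to-text for a short message such as "on my way home".
# Assumes the third-party SpeechRecognition package (plus PyAudio for microphone access).
import speech_recognition as sr


def listen_for_message():
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        recognizer.adjust_for_ambient_noise(source)  # helps a little in noisy environments
        audio = recognizer.listen(source)            # record until a pause is detected
    try:
        return recognizer.recognize_google(audio)    # send the audio to a cloud recognizer
    except sr.UnknownValueError:                     # speech was unintelligible
        return None
    except sr.RequestError:                          # recognition service unreachable
        return None


if __name__ == "__main__":
    print(listen_for_message())
```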

Voice recognition and handwriting recognition are not the only means of input. The gaming industry has implemented gesture recognition (and a form of face recognition) “in the large” for big gestures.27 Gesture recognition is already being used as an aid in stroke rehabilitation,28 where it enables self-paced training by the patient as a complement to sessions with a trained professional.

Now we are beginning to see technologies to support “in the small” gesturing and other touch-free ways of interacting with our digital information.29 Through eye-tracking technology30 we might communicate our changing focus with a glance. We might select with a wink.31
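
How might a glance or a steady gaze become a selection? One common approach is dwell detection: if the gaze stays within a small radius for long enough, the system treats it as a click. The sketch below illustrates the idea in Python over a generic stream of (x, y, timestamp) gaze samples; it assumes no particular eye-tracking SDK, and the names and thresholds are illustrative.

```python
# A minimal sketch of dwell-based selection from a stream of gaze samples.
# Assumes a generic stream of (x, y, timestamp-in-seconds) tuples; no particular SDK.
import math


def detect_dwell(samples, radius_px=30.0, dwell_s=0.5):
    """Return the point selected by a dwell (gaze held within radius_px for dwell_s
    seconds), or None if no dwell occurs."""
    anchor = None                # (x, y, start_time) of the current fixation candidate
    for x, y, t in samples:
        if anchor is None or math.hypot(x - anchor[0], y - anchor[1]) > radius_px:
            anchor = (x, y, t)             # gaze moved: start a new candidate fixation
        elif t - anchor[2] >= dwell_s:
            return (anchor[0], anchor[1])  # gaze held long enough: treat as a selection
    return None


# Example: 60 Hz samples hovering near (200, 120) long enough to count as a selection.
stream = [(200 + (i % 3), 120.0, i / 60.0) for i in range(60)]
print(detect_dwell(stream))
```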

I met a salesperson recently who spends up to two days out of three on the road. His office is his car for much of his day. Seconds matter. If he can’t respond quickly to a request for a quote, the business may go elsewhere. His palmtop is his lifeline. Even as he drives from one meeting to the next, he “hears” text messages and email via voice synthesis. His response is converted from speech to a text or email message.

This is task switching to be sure.32 Even if hands and eyes remain on the road, attention is divided among tasks: (1) driving (often in unfamiliar circumstances); (2) responding to a client with a quote; and (3) possibly also ancillary tasks to look up the information needed to give the quote. Potentially dangerous? Certainly. He might agree. But this is his reality. He considers, rightly or wrongly, that the risk is worth the savings in time.

Technologies of input and output are incorporated into efforts to support a more natural user interface (NUI).33 NUI promises—some might say threatens—to further separate us (some would say “liberate,” others “alienate”) from the immediate physical world so that we can interact, typically via an Internet connection, with digital information and with people far away.

We have already grown accustomed to people talking into thin air as they walk along. Sometimes they really are crazy. But more often, they are talking to someone via headset. In the near future we may encounter people not only talking into thin air but also gesticulating and apparently seeing things that aren’t there.

When we’re the ones looking, talking, and gesticulating, we may realize whole new levels of freedom and power in our interactions with information. For much of my day, I still carry a laptop around with me. I once felt liberated by this arrangement—how nice to be able to take my work with me from place to place rather than having to return to a desktop computer in my office! But more recently, I feel the burden—my backpack with my laptop inside is my constant companion and the thing I’m always worried I’ll forget. How liberating instead to carry my work with me via wearable accessories!

Still, even as these NUI technologies become mainstream, they don’t eliminate the need for PIM. On the contrary, technologies of input and output make the need to manage even more urgent. How, for example, do we protect privacy in an age when we are openly, continuously interacting with our information, leaving a record that others might inspect?

If technologies of NUI don’t eliminate PIM, they will certainly transform our ways of doing it. Technologies of NUI and ubiquitous computing will complete a revolution already underway, thanks to our palmtop devices and their greatly improved connectivity to the Web. To the good, we’ll be able to work on our information anywhere via exchanges that are faster, easier, and more “natural” than those we have currently (and with no need to carry a backpack or briefcase). To the not-so-good, as we realize these new abilities, we become even more separated from our actual physical environment, with consequences that range from the comical and crazy (in appearance) to the dangerous and deadly.

Technologies of NUI and their impacts on our practices of PIM are explored further in the context of the remaining chapters in Part 2.

16 See a short write-up on the history of the book: http://en.wikipedia.org/wiki/Book. For a humorous take on initial “user interface” challenges posed by the book when first introduced, see http://www.youtube.com/watch?v=pQHX-SjgQvQ.

17 http://en.wikipedia.org/wiki/Speech_synthesis.

18 http://en.wikipedia.org/wiki/Google_Glass; http://www.google.com/glass/start/; http://www.slate.com/blogs/future_tense/2012/11/28/microsoft_augmented_reality_glasses_patent_rival_to_apple_google_glass.html; see also Krevelen & Poelman, 2010 and http://www.businessinsider.com/this-is-what-apples-curved-glass-iwatch-might-look-like-2013-2.

19 Jones, 2007.

20 There was speculation not long ago that Apple had plans to realize a watch with such a configuration. Via Bluetooth (4.0), the iPod Nano as a wristwatch would serve as a display for information pulled in from a nearby iPhone. See http://www.cultofmac.com/189414/will-apple-save-the-wristwatch/, and http://www.digitaltrends.com/apple/baseless-speculation-the-new-ipod-nano-will-be-a-wristwatch-for-the-iphone-5/. However, there has been no more recent word on when or whether this will happen. Meanwhile, at least one other company is working on a wristwatch as a display for an iPhone or Android (http://getpebble.com/).

21 Carlson, 2012, see also http://www.businessinsider.com/the-end-of-the-smartphone-era-is-coming-2012-11#ixzz2LNdIR2AP.

22 Chen, 2013 (http://qz.com/61145/google-glass-will-be-the-next-apple-newton/). See also, http://www.tgdaily.com/opinion-features/69806-google-glass-vs-apples-iwatch-when-will-screenphones-become-obsolete#bZzBDWW2Tr5XCcPE.99; and http://techpinions.com/apple-iwatch-vs-google-glasses-and-the-next-ui-battle/14497.

23 See The Economist, 2012b and also http://en.wikipedia.org/wiki/Zooming_user_interface.

24 Taken from http://theruralsite.blogspot.com/2011/12/winter-colds-new-yorker-state-of-mind.html.

25 http://en.wikipedia.org/wiki/Punched_card.

26 A special kind of typewriter (see for example, http://en.wikipedia.org/wiki/Keypunch#IBM_029_Card_Punch).

27 http://en.wikipedia.org/wiki/Gesture_recognition, http://en.wikipedia.org/wiki/PlayStation_Move, http://en.wikipedia.org/wiki/Kinect. And of course, touchscreens support gestures on a small scale (select this, delete that, make this view bigger/smaller, etc.).

28 http://www.onwindows.com/Articles/Kinect-aids-patient-stroke-rehabilitation/7506/Default.aspx.

29 “Leap: Freehand 3D Computer Interaction Without Gloves,” http://research.microsoft.com/apps/video/default.aspx?id=173838&l=i; Knight, 2012, “What Comes After the Touch Screen?” See also http://www.economist.com/node/21548486.

30 M. Bell et al., 2009; The Economist, 2012a. For eye-tracking, http://www.tobii.com/.

31 There are also “cyborg” technologies being realized for a greater synthesis of the digital and the biological. Already here, in research prototypes for people with special needs (people unable to see or hear; people with artificial limbs), are computing interfaces that support the control of an artificial limb or synthetic vision. See http://en.wikipedia.org/wiki/Cyborg.

32 See Chapter 3 (Part 1), William Jones, 2012.

33 Bowman, McMahan, & Ragan, 2012; see also http://en.wikipedia.org/wiki/Natural_user_interface#Examples_of_interfaces_commonly_referred_to_as_NUI, http://whatis.techtarget.com/definition/natural-user-interface-NUI.
