Chapter 12
Adapting Economic Principles for Information

Although it may be characterized as the “dismal science,” the field of economics offers another lens for understanding and applying information assets, and for optimizing information-related investments. Business, information, and technology leaders now must develop and optimize business models, applications, information products, and digital solutions with the economics of information in mind. Unfortunately, classic macro- and microeconomic principles were developed to better understand and improve the consumption and value of traditional goods and services, not information assets.

As we have explored throughout this book, information behaves differently than other assets. Moreover, the type of user increasingly consuming information is not human—rather it is an application, machine, or some kind of device. Technology doesn’t feel degrees of satisfaction—a key economic determinant. Nor does technology tire as people do. And information itself doesn’t get used up, like metal or wood. The unique qualities of information assets and the way they are produced and consumed compel us to re-examine economic principles and models, and adjust them as necessary for this new age asset.

For decades, and even centuries, classical economic principles have been a critical determinant for making ongoing investments and decisions about expected ROI of products and services. Decisions concerning the production, sizing, packaging, pricing, marketing, and purchasing of goods and services all hinge on an understood canon of economic conventions. However, in the current information-driven society and increasingly digitalized world, sentiments are shifting from the economics of tangible assets to the economics of information—“infonomics”—and other intangible assets.

I have relegated the examination of information economics toward the end of this book, not just because it is the “-nomics” in the “infonomics” portmanteau, but because it is opening a portal to an unexplored universe of ideas. Thought leaders like Barb Wixom and Erik Brynjolfsson at MIT have researched and taught on monetizing information and the information economy, and UCSD’s Jim Short is researching data valuation. Others like Gartner’s Andrew White, Alan Duncan, Alan Dayley, and Brian Lowans, along with practitioners including John Ladley, James Price, Tony Fisher, Thomas Redman, Kelle O’Neal, Danette McGilvray, Theresa Kushner, Maria Villar, and Rob Hillard, have been pushing the envelope on how to manage information more like an asset. But even as Urs Birchler, Monika Bütler, James Surowiecki, and Hal Varian have offered advanced thinking on information’s role in economic phenomena, to my knowledge there’s a void in exploring and practicably applying the array of standard economic principles and models to information itself.

University of Illinois professor Nolan Miller cautions about the terms “economics of information” and “information economics,” noting they are monikers for an established field concerning information’s role in human behavior, especially decision making.1 “Infonomics,” on the other hand, is about how information behaves as an economic asset itself—the topic of this book, and, I believe, a greenfield subdiscipline for economists and a future core competency for CDOs, other information professionals, and enterprise architects.

Applying Economic Concepts to Information

While economists are concerned with the economics of information, CDOs, CIOs, and enterprise architects in particular should be concerned with infonomics. Yet infonomics is not formally part of their professions, and never really has been. Certainly, architects make design decisions based on information availability, volume, velocity, and variety, and may unwittingly weigh various economic factors about the supply and demand, cost, or marginal utility of information. CDOs and CIOs make strategic decisions based on similar factors, but each is largely untrained in principles like diminishing marginal utility, productive efficiency, externalities, elasticity, and excess burden, or the behavioral implications of asset availability—certainly not as they apply to information assets. Would knowing these concepts and how to apply them to information assets be beneficial to their employers? Absolutely.

Indeed, many if not most microeconomic and macroeconomic principles can be applied to information assets. Some are obvious and need little treatment, such as the diminishing marginal cost of information (a feature of any storage device or service). Yet others like Ricardian Rent, Nash Equilibrium, and Bekenstein Bounds are out of bounds for a book of this nature. Therefore, we’ll explore only those which have the greatest relevance in the context of information assets, and that require some reformulation to guide information producers and consumers such as CDOs, CIOs, and enterprise-, application-, and data architects. They include:

  • How the principle of supply and demand operates differently with information than with other assets.
  • How to understand and apply the forces of information pricing and elasticity.
  • How understanding the marginal utility of information for both human and technology-based consumers of information should drive business and architecture decisions.
  • How the opportunity costs of certain information assets must be factored into selecting and publishing them.
  • How the information production possibility frontier affects information-related behavior and investments.
  • How to use Gartner’s information yield curve to conceptually integrate the concepts of information monetization, management, and measurement for improved information-related and business strategies.

The Supply and Demand of Information

Information is an unruly asset. As pointed out earlier, it does not deplete when consumed, it can be used simultaneously, it is representative of some other entity or activity, it costs comparatively little to store or transmit, and it can instantly transform or disappear. Its supply can seem infinite and its demand insatiable, all at once. Yet most organizations strive to generate or collect ever more information, even as their ability to consume it struggles to keep up. Meanwhile, demand for it throughout your organization goes largely unfulfilled. This is an indication of information supply chain inefficiency, not so much an information ecosystem imbalance.

Supply and demand is the most fundamental of economic principles. Demand refers to the quantity of something consumers are willing to purchase at a given price, while supply represents the quantity producers are willing to make available at that price.

But wait: If the same unit of information can be sold (licensed) and delivered (transmitted) to multiple buyers (remember the non-rivalry principle?), isn’t the supply of available information infinite?2 How can its price achieve some market equilibrium? And why isn’t all information freely available? Clearly, classic supply and demand breaks down for information assets. Remember the Cost Value of Information (CVI) and Market Value of Information (MVI) models from chapter 11? They can be used not just for valuing information, but also to guide the behavior of suppliers and consumers of information assets. The CVI sets a minimum price for information based on the costs involved in producing/collecting, managing, and delivering it,3 while the MVI expresses how market saturation of the same information devalues it, thereby reducing its demand.

Executives, business leaders, and architects need to understand how this price equilibrium operates differently with information than with traditional goods and services. It is not based on balancing supply and demand, but rather on a more sophisticated function of information costs, viable uses, and market saturation. Consumers of information should not expect to buy it merely at some nominal cost premium. And suppliers of information must consider how attenuating market saturation will affect the number of buyers at any given price point (and vice versa), and how this in turn affects its attributable revenue stream. Achieving an information price equilibrium, it would seem, involves a bit of game theory well beyond classic supply and demand, and the scope of this book.
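This interplay of cost floor and saturation discount can be sketched in a few lines of code. The function names, the linear discount, and the dollar figures below are illustrative assumptions, not part of the CVI or MVI models themselves:

```python
def cvi_price_floor(production_cost, management_cost, delivery_cost):
    """CVI intuition: the minimum viable price is the supplier's
    all-in cost to produce/collect, manage, and deliver a unit."""
    return production_cost + management_cost + delivery_cost

def saturated_price(base_price, market_saturation):
    """MVI intuition: the more of the market that already holds the
    same information, the less a new buyer will pay. A simple linear
    discount stands in for whatever devaluation curve actually applies."""
    assert 0.0 <= market_saturation <= 1.0
    return base_price * (1.0 - market_saturation)

floor = cvi_price_floor(production_cost=2.0, management_cost=0.5, delivery_cost=0.1)
for saturation in (0.0, 0.5, 0.9):
    price = saturated_price(base_price=10.0, market_saturation=saturation)
    # A rational supplier stops licensing once price falls below cost.
    print(f"saturation={saturation:.1f} price={price:.2f} viable={price >= floor}")
```

At 90 percent saturation the price drops below the cost floor, which is precisely why a supplier of infinitely replicable information still limits supply.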

Information Pricing and Elasticity

Increasingly, organizations purchase much of their information rather than just produce it—either directly from data brokers or social media aggregators, or indirectly by trading for goods or services. Accordingly, this puts a microscope on the issue of information pricing. As we saw with the market value of information (MVI) model in chapter 11, and the topic of supply and demand in the previous section, pricing information can be more convoluted than pricing traditional goods and services. There are few “open arms-length marketplaces” (as accountants would say) with which to establish viable market valuations for most kinds of information. And most information bartering is off-book because information is not a formal capital asset.

But perhaps as important as—or more important than—fixing a price for a unit of information is how its price changes in response to other factors (i.e., its elasticity)—primarily information’s supply and demand.

The price elasticity of information supply measures how much a supplier (publisher, producer) of an information asset chooses to supply in response to a change in price. But since information is inherently replicable, why would a supplier like a data broker or book publisher limit the supply even with downward price pressure? Simple: to prevent the price from approaching zero, resulting in negligible or negative profits. In most cases, however, it makes more sense for information suppliers to increase supply up to the point at which market saturation creates significant downward price pressure. This is the theory and mechanics behind the MVI valuation method.

On the other hand, the demand elasticity of information suggests that at some price point, an information consumer will seek an alternative (substitute) source. When few alternative sources (e.g., for retail pricing data or IT research) are available, when the information asset is critical, and/or when switching costs are prohibitive, price premiums can be maintained—and vice versa. Therefore, enterprise efforts to identify and curate information sources can be important not just for improving information’s utility, but also for negotiating the price of currently licensed or bartered information sources. Moreover, the growing variety and availability of information sources (especially open data published by government organizations and others) compels information suppliers to differentiate with expanded information sets, improved quality, and/or additional services such as analytics or integration.
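Demand elasticity is readily computable with the standard arc (midpoint) formula from introductory economics. A minimal sketch—the data-feed subscription figures are invented for illustration:

```python
def price_elasticity_of_demand(q0, q1, p0, p1):
    """Arc elasticity: percent change in quantity demanded per
    percent change in price, using midpoints as the base."""
    pct_quantity = (q1 - q0) / ((q0 + q1) / 2)
    pct_price = (p1 - p0) / ((p0 + p1) / 2)
    return pct_quantity / pct_price

# A hypothetical data broker raises a feed's price from $100 to $120
# and loses only 30 of 500 subscribers--few substitutes exist and
# switching costs are high, so demand proves inelastic.
e = price_elasticity_of_demand(q0=500, q1=470, p0=100, p1=120)
print(f"elasticity = {e:.2f}")  # -> elasticity = -0.34
print("inelastic" if abs(e) < 1 else "elastic")  # -> inelastic
```

A magnitude below 1 means the price premium holds; as open data and substitute sources proliferate, the magnitude rises and the premium erodes.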

The Marginal Utility of Information

The supply, demand, and pricing of information are in part determined by its utility. Economists define utility as a general measure of happiness or pleasure. This is not to be confused with value, in which benefits and costs are juxtaposed.

Marginal utility is the additional satisfaction gained from consuming one additional unit of a good or service. Economists and marketers use it to determine how much, or how much more, of an item a consumer will buy or consume. In most situations, each subsequent unit of something consumed has less utility than the prior one; this is the law of diminishing marginal utility.

Positive marginal utility occurs when the consumption of an additional unit of a good or service increases the consumer’s total utility (for example, when a child receives an additional outfit for her doll). Conversely, negative marginal utility is when the consumption of an additional unit decreases the total utility (such as when that extra scoop of ice cream makes you feel unwell).
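Marginal utility is simply the first difference of total utility, which makes the law easy to see numerically. A toy sketch using the ice cream example—the satisfaction scores are invented:

```python
# Total utility (invented satisfaction scores) after each successive
# scoop of ice cream; the fifth scoop makes you feel unwell.
total_utility = [0, 10, 18, 23, 25, 24]

# Marginal utility of scoop n = total utility after n minus after n-1.
marginal_utility = [b - a for a, b in zip(total_utility, total_utility[1:])]
print(marginal_utility)  # -> [10, 8, 5, 2, -1]

# Law of diminishing marginal utility: each increment is smaller...
assert all(m2 < m1 for m1, m2 in zip(marginal_utility, marginal_utility[1:]))
# ...and the final unit exhibits negative marginal utility.
assert marginal_utility[-1] < 0
```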

Below are some highlights of how to consider and incorporate the marginal utility of information into business models and system designs.4

Marginal Utility of Information for People

Publishing or directly exposing people to repeated identical information leads to rapidly diminishing marginal utility and very little increase in total utility for each unit of information published. However, marginal utility can increase for the first few units of information assets, and negative marginal utility can be experienced much more quickly than when consuming tangible products or services.

Creators of information (such as journal publishers, advertisers, and data brokers) will find a high level of initial marginal utility among consumers for “original content” or new units of information, but not nearly as much for messages repeated beyond the first few. However, since information is not depleted when consumed and is non-rivalrous, this high level of initial marginal utility can be capitalized on simultaneously across multiple consumers. This is a key principle of information monetization. Information architects therefore should seek to design systems that increase information availability for multiple human and/or machine consumers.

Nonetheless, publishing the same information to the same individual repeatedly results in fast-diminishing marginal utility for humans, quickly leading to negative marginal utility and diminishing total utility. Information producers and publishers must therefore push information to individuals at a frequency and periodicity that match the preferred consumption characteristics of those individuals. Understanding information consumption patterns across increasingly granular segments of individuals can optimize the total utility of the information published. For example, publishing an enterprise performance dashboard of strategic indicators to executives on a daily basis can result in rapidly diminishing marginal information utility, and ultimately disuse—whereas publishing daily changes or a monthly dashboard may retain high levels of marginal information utility.

Marginal Utility of Information for Technologies

For technologies such as systems, applications, and devices, which increasingly are becoming the primary consumers of information assets, the marginal utility principle must be adjusted to accommodate the following two scenarios:

  1. Repeated identical units of information received by a component of technology offer zero marginal utility and constant total utility.
  2. If we consider the total cost to process or ignore repeated units of identical information, total utility could be perceived to decline with each successive piece of identical information received.

Therefore, information and solution architects should filter out redundant information and instead only collect or stream changes (or deltas). In remote IoT environments, where bandwidth and edge processing are limited, transmitting a sensor’s state changes rather than its continuous current status becomes an important consideration. For example, thermostats are designed to take periodic readings and only activate HVAC units upon temperature deltas of a prescribed magnitude (such as 2 degrees or more). If, instead, thermostats were designed to activate and deactivate HVAC units based on continuous and often identical temperature readings, these devices would soon suffer damage.
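The thermostat’s deadband logic can be sketched as follows. This is a deliberately simplified model—real thermostat firmware is more involved—but it shows how acting only on meaningful deltas ignores streams of near-identical readings:

```python
class Thermostat:
    """Activate the HVAC unit only on temperature deltas of a
    prescribed magnitude; near-identical readings change nothing."""

    def __init__(self, setpoint, deadband=2.0):
        self.setpoint = setpoint    # target temperature
        self.deadband = deadband    # minimum delta that triggers action
        self.hvac_on = False

    def read(self, temperature):
        delta = temperature - self.setpoint
        if abs(delta) >= self.deadband:
            self.hvac_on = True     # act only on a meaningful delta
        elif temperature == self.setpoint:
            self.hvac_on = False    # back at setpoint: switch off
        # otherwise: ignore the reading--no state change
        return self.hvac_on

t = Thermostat(setpoint=70.0, deadband=2.0)
for reading in (70.5, 70.4, 70.5, 72.3, 71.0, 70.0):
    print(reading, t.read(reading))
# The three near-identical readings cause no activation; only the
# 2.3-degree delta switches the unit on, and returning to the
# setpoint switches it off.
```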

Architecting for Optimized Information Utility

Three basic architectural approaches can avoid the undesirable conditions caused by streams of identical information:

  1. Transmit only distinct data if and when it is produced and filtered by the publisher. IoT devices, for example, might only send updates when their state changes.
  2. Transmit only differential data, which includes the delta between subsequent data points. Examples include the aptly named differential backups, and also accelerometer sensors and water leak detectors.
  3. Produce and transmit only derivative data. Examples include publishing a revision to a previously published book or article, or applying different algorithms to a piece of information in order to craft uniquely differentiated messages for particular customers.
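The first two approaches can be sketched as simple stream filters. The function names and the sample readings are illustrative only:

```python
def distinct(readings):
    """Approach 1: transmit a value only if and when it changes,
    dropping repeated identical readings (publisher-side filtering)."""
    last = object()  # sentinel: nothing transmitted yet
    for value in readings:
        if value != last:
            yield value
            last = value

def differential(readings):
    """Approach 2: after an initial baseline, transmit only the
    delta between subsequent readings."""
    it = iter(readings)
    prev = next(it)
    yield prev  # initial baseline
    for value in it:
        yield value - prev
        prev = value

stream = [20, 20, 20, 21, 21, 19]
print(list(distinct(stream)))      # -> [20, 21, 19]
print(list(differential(stream)))  # -> [20, 0, 0, 1, 0, -2]
```

Either filter spares downstream consumers the zero-marginal-utility repeats; the differential form additionally lets a receiver reconstruct the full series from far fewer meaningful bits, as differential backups do.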

Opportunity Cost of Information Choices

While marginal utility deals with the diminishing benefits of consuming additional units of the same thing, opportunity cost refers to the foregone benefits of choosing to consume one thing rather than another. In the sphere of infonomics, this constitutes selecting one information source over another, or performing minimal analysis in the interest of expense or expediency. As asserted in chapter 4, enterprise inertia often keeps organizations publishing pretty pie charts when deeper insights from advanced analytics like machine learning are readily available. As mentioned in chapter 3, most organizations are unaware of the variety of alternative information sources available, and continue to use less accurate, complete, or timely sources “because that’s the way we’ve always done it.” The economic principle behind this is called the endowment effect, in which people ascribe greater value to something merely because they possess it. The endowment effect quietly motivates business people to resist information and analytic progress.

It’s the job of the CDO to ensure that opportunity costs for information and analytics are understood and employed to drive related strategies. As new information sources and analytic capabilities continually present themselves, information and analytic opportunity cost assessment must be a periodic activity. Moreover, CDOs, business leaders, and enterprise architects must be aware of potential unintended consequences (or externalities, in economic parlance) when planning to shift to new information sources or analytic methods. Externalities are also a major consideration for any organization publishing information, either publicly or privately. Information can be used in an endless array of contexts, so CDOs especially must anticipate and prepare for how others might use (or misuse) information the organization shares.

The Production Possibilities of Information

Sometimes the inhibitor to improving the production or quality of information, or to implementing more advanced analytic capabilities, is the organization’s internal information production possibility frontier. That is, the organization would have to shift available budget and resources away from one activity, say developing ERP application functionality, to support such information-related improvements. This of course assumes the organization is operating at near-optimal information productive efficiency.

Production possibility frontiers are a common reason particularly for stagnant data governance efforts. People are busy. They don’t have the extra time to deal with developing or adhering to additional information principles and policies, without sacrificing other tasks (i.e., producing other things). However, there are several ways to overcome information-related production possibility frontiers:

  • Embed information-related activities into other activities to soften their impact on resources.
  • Automate information-related activities, or other activities, to free resources.
  • Use ROI modeling to justify additional resources, thereby expanding the production possibility frontier outward.
  • Or, as Michael Smith, principal contributing analyst at Gartner, and I recently posited: apply the future alternate economic benefits of an information asset to fund additional resources today.5 (Remember, information is non-rivalrous, meaning the same information can be used for multiple purposes. So, plan for new ways to use information, and funnel the expected revenue or savings into additional information-related resources.)

Improving Information Yield

Ultimately, each of these economic factors influences—or should influence—how and when you make information-related investments. As discussed in chapter 5, these investments come in the form of vision and strategy development, devising and incorporating information-specific and information initiative metrics, data governance, skills and processes, and of course technology. But how is an organization, or more specifically its CDO, to gauge the appropriate level of investment? When do—or should—investments in information asset management (IAM) pay off? What happens to IAM maturity when information assets—along with related procedures and technologies—are merely maintained, not improved? Conversely, what if you accelerate information-related investments? And how mature should you be, or need to be, especially relative to your competitors?

This leads us to an examination of information yield (a concept loosely inspired by the traditional yield curve6) for expressing the rate of improved value per unit of information-related investment. The information yield curve conceptually brings together information monetization, management, and measurement. As we all know, information asset management involves numerous moving parts. And as enumerated in chapter 5, many factors exert upward or downward pressure on information maturity. But what does this maturation curve look like? And where are you on it?

The information yield curve, which my Gartner colleagues Andrew White, Joe Bugajski, Frank Buytendijk, and I devised a few years ago, is intended to answer these questions. Not computationally, but much as the popular Gartner Hype Cycle7 sets maturity expectations for technology users and suppliers, the information yield curve sets expectations for how information-related investments affect IAM maturity—and thereby your information’s rate of return.

In short, low-maturity organizations will see accelerating improvements in the rate of return on their information assets from information-related investments, while high-maturity organizations will see decelerating rates of return as they approach an optimization ceiling. Similarly, low-maturity organizations that accelerate their information-related investments will experience a maturity bump that begins to level off. For example, moving to a more sophisticated analytics platform will have quicker returns initially until your competitors catch up, market dynamics change, or the platform becomes overwhelmed by the increased volume, velocity, and/or variety of information assets—such as we have observed in recent years with traditional data warehousing approaches.

Figure 12.1 Information Yield Curve

The forces exerting upward pressure to improve and accelerate maturity should be embraced and cultivated. Many of these can be found in the information management maturity challenges mentioned in chapter 5, such as information quality, accessibility, compliance, analytics, governance, and so forth. The downward forces that limit and tend to decelerate IAM maturity include the volume, velocity, and variety of information, the increasing ubiquity of information assets, the speed of business, regulatory mandates, organizational resistance/inertia, information hoarding and underutilization, and lack of information trust (metadata).

Your organization will find itself, or choose to be, on different relative positions along the curve. These positions can be classified as:

  • Industry Average (in the middle of the curve): You’re tracking along with the industry pack, making similar investments, dealing with downward pressures on par with others, and therefore receiving an average return on your information assets.
  • High Performer (above the curve): You didn’t get a jump on others in your industry, but your investments are paying off better and/or you’re doing a better job of mitigating downward value pressures. Therefore, your information yield is greater than most others’.
  • Low Performer (below the curve): You are not effectively deploying the IAM solutions you’ve invested in and/or are succumbing to downward pressures more so than other organizations. Therefore, your return on information assets is lower.
  • Leader (ahead on the curve): You got a head start on your competition and have continued to reap the rewards of greater information maturity, for now.
  • Laggard (behind on the curve): You’re off to a slow start in making information-related investments, and therefore are not yet generating as much value from the information available to you.

Indeed, these positions on the curve are subjective. Nonetheless, a combination of IAM maturity scoring8 and information asset valuation, along with certain relative industry performance indicators, can give you a sense of where you are on the curve—and therefore an indication of how well your information-related investments are increasing (or will increase) your information’s yield. In turn, these indicators, along with the infonomics considerations throughout this chapter, can and should influence business and information vision and strategy.

Vision and strategy are all about the future—defining it, getting there without stumbling or spending too much… and hopefully before your competitors. However, vision and strategy are not only influenced by economic principles, but also by trends. So in the final chapter we’ll examine various information, business, and technology trends affecting how you will be able to and should monetize, manage, and measure information assets over the next several years.

Notes

1 Nolan Miller, interview with author, 09 September 2016.

2 I’m not talking about generating new information here.

3 Supplier costs may include those to market it as well.

4 In our research report, “Applied Infonomics: Designing for Optimal Marginal Utility in a Digital World,” Dale Kutnick, Saul Brand, and I detail how the concept of marginal utility and diminishing marginal utility must be adapted for both human and machine-based consumption of information assets. (ref: “Applied Infonomics: Designing for Optimal Marginal Utility in a Digital World,” Douglas Laney, Dale Kutnick and Saul Brand, Gartner, 16 December 2016, www.gartner.com/document/3546617).

5 Douglas Laney, and Michael Smith, “How CIOs and CDOs Can Use Infonomics to Identify, Justify and Fund Initiatives,” Gartner, 29 March 2016, www.gartner.com/document/3267517.

6 The standard yield curve plots the yields or interest rates for bonds or other debt instruments of differing contract or maturity lengths.

7 “Gartner Hype Cycle: Interpreting the Hype,” Gartner, accessed 09 February 2017, www.gartner.com/technology/research/methodologies/hype-cycle.jsp.

8 Douglas Laney, and Michael Patrick Moran, “Toolkit: Enterprise Information Management Maturity Self-Assessment,” Gartner, 13 June 2016, www.gartner.com/document/3344417.