Chapter 5. The Mythical Metaman

 

“There is no silver bullet.”

 
 --Fred P. Brooks, Jr.
Chapter Contents

When You Eliminate the Impossible

The Long Tail of Business Services

Business Attractors for Attractive Businesses

The Death of Brownfield

Endnotes

The last four chapters described the Brownfield approach and showed how it can be used to reengineer complex environments. But this is just the initial application of Brownfield. By switching viewpoints from Greenfield to Brownfield and adopting the VITA architecture, further possibilities emerge. This chapter looks at the future of Brownfield and what it might ultimately mean for the IT industry.

By now, it should be easy to understand the applicability of Brownfield in terms of delivering large IT projects. So far, the focus has been on eating the IT elephant, per the title of the book. This perspective is deliberate. The best way to explain what Brownfield is about is to describe the problems it was designed to overcome. As a result, the book has so far primarily looked at how the Elephant Eater and the Brownfield approach can overcome environmental complexity and the communication problems of large projects.

The book has also examined how it is possible to create powerful new Inventories of information to store the environmental complexity, project requirements, and definition of a solution. In the more technical Part II, “The Elephant Eater,” we explain the new semantic technologies underpinning these capabilities.

Semantic technologies are relatively new. They aim to provide new mechanisms for computers and humans to understand the context and meaning of the information they are dealing with. This makes communication less ambiguous, in much the same way that we clarified the word “check” in Chapter 3, “Big-Mouthed Superhero Required.” This chapter looks at the further possibilities that the Brownfield technique offers when combined with these semantic technologies.

In The Mythical Man-Month, Fred Brooks describes a “silver bullet” as a single measure that would raise the productivity of the IT industry tenfold. Many IT industry movements have claimed to be silver bullets over the years. None has realized such potential. As Grady Booch reiterates, “Software engineering is fundamentally hard.” Brownfield does not change that. Brownfield is not a silver bullet.

It does, however, come closer to solving some of the intractable problems the IT industry has created for itself and businesses over the last 20 years. It is not a new way of writing programs, a new architectural approach for functional decomposition, or a mechanism for ensuring code reuse. Nor is it a productivity enabler at heart, although significant project productivity gains of 40% or more have been measured. Instead, it tries to answer a different question. The industry should not ask itself, “What can we do to be more efficient?” but, more fundamentally, “How do we turn today’s inflexible systems into tomorrow’s dynamic ones?” Brownfield is not a technology or an architecture—it is a fundamental change in the way we look at IT. It is how to cost-effectively realize the vision of business flexibility and interoperability that SOAs and Web 2.0 promise. And if it is such a good enabler of known end goals, what other futures might it enable?

When You Eliminate the Impossible

Brownfield is such a compelling idea that, once it’s in your head, you begin to wonder why it has never been used. Chapter 7, “Evolution of the Elephant Eater,” shows that Brownfield is the aggregation of a number of existing IT techniques and some new ones—and explains why its time has come.

Many of these new techniques and technologies have been brought within reach of mainstream use by the popularity of XML and by the investment currently being made in semantic technologies by Sir Tim Berners-Lee[1] and many others. These technologies are likely to become mainstream.

In this chapter, therefore, we do a very dangerous thing: We look into the future of the IT industry and extrapolate what the combination of powerful semantic technologies and Brownfield might create. In doing so, remember that all predictions about the future of the IT industry are invariably wrong—possibly even this one. To maintain a level of credibility, therefore, we exclusively extrapolate from known capabilities.

Software Archaeologist Digs Up Babel Fish

Much of this book has looked at the capability to discover information about the problem being solved, reengineer that information to generate the solution, and test that solution. The initial vision for the Elephant Eater was that the Views being eaten would be at the level of formal design documentation. (It seemed sensible that the Views should be at a similar level to the UML or XML data sources that are increasingly used to specify both high-level and low-level designs.) This would mean that everyone involved in the design could continue to work as normal but would actually be creating Views suitable for feeding into the Elephant Eater.

By using these design-level sources of information, the business analysts, architects, and specialists could easily perform mappings between design-level concepts and could maintain and generate solutions to their problems. For example, the data entities that the business analysts recorded in the use cases could be formally linked (or mapped) onto the Logical Data Model created by the architect. Although the Views would be maintained separately in different tools, if they ever moved out of synchronization, the Elephant Eater would force them back into line.

This solution works well for much of what must be done, even in elephantine projects. Sometimes, however, projects get even harder. The environment might be so large or complex that even simple mapping between concepts becomes laborious and time consuming. Could Brownfield be used to accelerate this process of mapping concepts?

 

On one huge and complex project, we needed to precisely mimic part of the behavior of a very old legacy system to ensure that our system worked end to end. We knew where the information was, but the legacy system was undocumented and essentially unknown. The team that maintained and understood the system was too busy to talk to us because it was working flat out on a critical project to keep that system going until we could replace it. It was a Catch-22 situation: We couldn’t free up the team’s time until we replaced its system, and we couldn’t replace the system because the team had no time to talk to us!

To make matters worse, the constraints hidden inside this same legacy system had caused all kinds of operational problems for the business when an earlier system went live alongside it. That was despite the earlier system having passed its lengthy, formal, and expensive testing with flying colors. Unless we wanted to suffer a similar fate, we needed a way to get at and understand the constraints and behavior of the legacy system so that our new system could work without disrupting the business.

We gave the task to the same team that had successfully parsed a host of other information for us, expecting a similar “one-off” answer to be proposed. What they came back with was far more exciting and powerful. Instead of simply analyzing the target system’s code for certain keywords in certain contexts (simple static analysis), they suggested creating a representation of the whole program within the Inventory. Once the whole program was in the Inventory, finding the information we were looking for would be much simpler.

 
 --R.H. and K.J.

The idea of absorbing whole programs into the Inventory resulted in new uses for Brownfield tooling. The technique is called software archaeology. As described in more detail in Chapter 9, “Inside the Elephant Eater,” software archaeology works by translating the legacy computer programs and configurations into a form that can be fed into the Inventory as a number of Views. As the legacy code is analyzed, it becomes possible to see its hidden structures, infer the original architecture, and even identify areas where that architecture has not been followed.

Ultimately, this technique could become the Babel fish[2] of computing architectures, enabling the shape of a legacy application to be determined by the archaeology tools and then regenerated using modern technologies. Although enhancements are still required to cope with the more dynamic aspects of some programs, these can be (and are being) addressed. The complexity of any program can be peeled away layer by layer until only the business logic remains. This isolated legacy business logic code can be analyzed and fed into a modern rules engine. The resulting business logic can then be advertised as a set of services, effectively turning a legacy application into an SOA.

This capability to automatically convert existing legacy spaghetti code efficiently into a flexible, modernized application is in its early days, but even at this stage, it looks hugely promising.
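To make the idea a little more concrete, here is a minimal sketch of this kind of code harvesting. It assumes, purely for illustration, that the legacy source is Python and uses Python’s standard ast module; a real legacy estate (COBOL, PL/I, and the like) would need its own parsers, and the actual Elephant Eater tooling is considerably richer. The point is only that program structure can be mechanically translated into triples suitable for an Inventory View:

```python
# A minimal software-archaeology sketch: harvest "who calls whom" facts
# from source code and express them as subject-predicate-object triples,
# ready to be loaded into an Inventory as a View.
# Assumption: the legacy code is Python; a real estate (COBOL, PL/I)
# would need its own parser, but the principle is identical.
import ast

def harvest_call_triples(source):
    """Return (caller, 'calls', callee) triples found in the source."""
    triples = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef):
            for child in ast.walk(node):
                if isinstance(child, ast.Call) and isinstance(child.func, ast.Name):
                    triples.append((node.name, "calls", child.func.id))
    return triples

legacy_source = """
def settle_trade(trade):
    validate(trade)
    post_to_ledger(trade)

def validate(trade):
    check_limits(trade)
"""

for triple in harvest_call_triples(legacy_source):
    print(triple)
# ('settle_trade', 'calls', 'validate')
# ('settle_trade', 'calls', 'post_to_ledger')
# ('validate', 'calls', 'check_limits')
```

Once facts like these sit in the Inventory, questions such as “what else touches this routine?” become simple queries rather than archaeology by hand.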

Radical Options for Business

As discussed in Chapter 3, Brownfield expands the scope of iterative development from being primarily focused on the user experience to encompassing the harder, less amorphous areas of integration and enterprise data. Brownfield effectively expands the IT industry’s capability to use iterative techniques so that solutions can be defined incrementally within complex environments without creating a solution that is incompatible with its surroundings. Because this happens in a controlled way, business analysts have more power and techies have less. This is a good thing because business moves back into the driver’s seat of IT projects.

Taking the approach further, perhaps ultimately this would result in businesses themselves becoming much more directly involved in specifying their own systems—and not through the artifice of being involved in designing and approving the user interface. The real business owners could look at real-time visualizations of their IT and business estate (like the ones in the last chapter), and use such visualizations to make decisions for both business and IT.

Ultimately, it might be possible for business leaders to walk around an editable visualization of their entire business (including their IT). Instead of just using the visualization to understand the business process and the IT environment that the business depends upon, they might be able to make changes to the representation. These changes would then create precise requirements for a business and IT reengineering project. In VITA terms, the visualization is generated as an Artifact, changed by the business leader, and fed back into the Elephant Eater as a View.

The implications of changing processes or IT systems could be immediately grasped from such multilayered displays. Feedback on the capability to enact the change could be calculated after analyzing the information in the Inventory. There’s no reason why this analysis couldn’t be done almost instantly. Suddenly, the capability to create a well-defined and implementable set of requirements would increase astronomically because all the constraints of the enterprise would be applied in real time. Coming up with good requirements for incrementally reengineering Brownfield sites is one of the hardest problems the IT industry has yet to solve. Perhaps this is the beginning of such a solution.
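As a sketch of what such analysis might involve (the real Inventory would weigh many kinds of relationships, not just one), impact assessment over dependency triples is essentially a transitive closure. The element names below are invented for illustration:

```python
# A toy sketch of impact analysis: given dependency triples from an
# Inventory, find everything transitively affected by a changed element.
from collections import defaultdict, deque

def impacted(triples, changed):
    """Walk 'depends-on' edges in reverse: who depends on what changed?"""
    dependents = defaultdict(list)
    for subject, predicate, obj in triples:
        if predicate == "depends-on":
            dependents[obj].append(subject)
    affected, queue = set(), deque([changed])
    while queue:
        for dependent in dependents[queue.popleft()]:
            if dependent not in affected:
                affected.add(dependent)
                queue.append(dependent)
    return affected

inventory = [
    ("billing-process", "depends-on", "customer-db"),
    ("crm-screen", "depends-on", "billing-process"),
    ("monthly-report", "depends-on", "customer-db"),
]
print(impacted(inventory, "customer-db"))
# {'billing-process', 'crm-screen', 'monthly-report'}
```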

Taking this idea one step further, it might be possible to automatically ripple the impacts of high-level changes down through the Inventory—not only to assess the impact and disruption, but to actually regenerate the solution that runs the business. Furthermore, this might not require an architect or specialist, and it might not even directly change a single line of code. In such situations, the changes made in the business visualization could result in an automated change to the processes and software that run the business.

If so, the business would be in a very powerful position indeed. But how would it know which decisions to take?

Brownfield might expand to model not just the business and IT environment of the enterprise itself (as seen in Chapter 4, “The Trunk Road to the Brain”), but also its competitive environment. Within IBM’s business analysis methods is the idea of modeling the business context and determining the external events that impact the business. If the black-box boundary that defines the edges of a business were drawn a little bigger, could it provide an industry context instead?

Such all-encompassing black boxes are essentially ecosystems. Currently, they are hard to model and comprehend, but the extensibility of the Inventory suggests that such ecosystem modeling might be possible by drawing information from a huge variety of sources and using them as Views. Ultimately, these sources of information might be real-time feeds, making the projections dynamic forecasts.

The ultimate vision of Brownfield, therefore, is that it should be capable of iteratively changing and tuning business services within the context of the overall industry ecosystem. The business should be able to literally redraw an element of the business or introduce a new customer service, and immediately understand both how to underpin it with new or existing IT services and how it will impact the whole organism.

This expansion of agile capabilities should be seen as a continuation of the progress already made with Brownfield, as shown in Figure 5.1.

Figure 5.1. The expanding sphere of Brownfield.

Such reactive organisms have been called many things. Followers of IBM’s major messages might recognize this kind of business “sense and respond” capability as On Demand. Sam Palmisano introduced this vision of On Demand when he took over as IBM CEO in 2002. Over the years, the vision became misinterpreted and was narrowed to a utility model for computing (buying computing power as needed, like electricity). The original vision, however, was one of business flexibility and integration.

As Palmisano predicted, such businesses would have a very powerful competitive advantage. Using the Brownfield approach, the business would potentially be able to rewire its Inventory using an intuitive visualization. By doing so, the business could regenerate systems and business processes. In such circumstances, the business leaders would be the true drivers of change, not the recipients of it.

Surely, then, this is the endpoint for Brownfield? Evolving the frozen elephant of yesteryear into the dancing elephants of tomorrow? Figure 5.2 graphically illustrates the point. In some ways, perhaps it is; this is certainly in line with the vision set out at the beginning of this book. Looking even farther down the Brownfield road, are there other sights to see?

Figure 5.2. Over time, the frozen elephant could evolve into the dancing elephant.

At Your Service

Chapter 3 described some of what can be done with an Inventory and Transforms. The beauty and power of the Inventory lie in its simplicity. The multilevel, interrelated structures that can be created on top of it are a much closer analog to the synaptic structures of the brain than conventional IT tooling offers. The kinds of structures that emerge within complex Inventories do not look that dissimilar to the structures we see in organic brains.

Because the language and structure of the Inventory have no artificial constraints (other than those imposed by choice), the VITA architecture is inherently extensible. Pretty much any formal documentation can be massaged into a form that can be held within its structure. Basically, triples are simple sentences, and pretty much anything can be expressed in those terms, given enough thought (and enough sentences).
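A concrete illustration may help here. The sketch below uses the open source rdflib library for Python (our choice for illustration, not a Brownfield requirement; the namespace URL is made up) to show how ordinary statements about an environment become machine-readable triples:

```python
# Triples really are simple sentences: subject, predicate, object.
# This sketch uses the open source rdflib library (pip install rdflib);
# the namespace URL is made up for illustration.
from rdflib import Graph, Literal, Namespace

EX = Namespace("http://example.org/inventory#")
g = Graph()

# "The billing system runs on the mainframe."
g.add((EX.BillingSystem, EX.runsOn, EX.Mainframe))
# "The billing system owns the customer table."
g.add((EX.BillingSystem, EX.owns, EX.CustomerTable))
# "The customer table holds four million rows."
g.add((EX.CustomerTable, EX.rowCount, Literal(4000000)))

# Enough such sentences, and you have an Inventory.
for subject, predicate, obj in g:
    print(subject, predicate, obj)
```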

The likely outcome of this capability is twofold. First, hardware, operating systems, middleware, and applications will start being geared to provide information about themselves in formats ready for eating by the Elephant Eater: They will essentially be augmented to create standardized Views that describe their own configuration. This will reduce the cost of doing an overarching full site survey, which currently would probably need to focus on the most difficult aspects of the problem to be cost-effective.

Second, that information will not need to be manually extracted from these sources, but will be available via open interfaces. The task of performing and updating the site survey of the existing environment will become automated and inexpensive rather than manual and prohibitive for simple areas.

Key aspects of this can already be seen in parts of the industry concerned with managing existing complexity—primarily the outsourcing and operations areas. IBM’s Tivoli® Application Dependency Discovery Manager (TADDM) is a good example. TADDM works from the bottom up, automatically creating and maintaining application infrastructure maps. The TADDM application maps are comprehensive and include complete run-time dependencies, configuration values, and accurate change histories. For industries in which an application infrastructure audit is part of the legislative requirement for risk management (such as Basel II[3] in the finance industry), such tools are exceptionally useful.

TADDM currently stops at the application container (which runs the applications), but there’s no particular reason why it needs to. The applications within the containers could be analyzed for further static relationships, even down to the code level.

Over time, therefore, we predict that the IT industry will move from the selective scything of information from Brownfield environments to a combine-harvesting approach in which the existing environment can be automatically eaten by the Elephant Eater to obtain a comprehensive description of its complexity.

The remodeling of both application and underlying middleware and hardware environments will become an iterative process of refinement.

The Long Tail of Business Services

The use of such standard harvesting tools will have a strong byproduct: The format of the Inventory itself and the classification of the information within it will begin to be standardized. Such formal classifications are called ontologies.

Ontologies are formal languages that can be used to unambiguously classify things, just like Venn diagrams (see Figure 5.3).

Figure 5.3. A five-set Venn diagram obsessed—like much of this book—with elephants.

In Figure 5.3, everything is a Thing—except, of course, for the empty class Nothing. (This part of the classification is identical to the Web Ontology Language, OWL, which is one of the technical standards used for defining things in the Inventory.) Five further classifications (in bold in the figure) overlap in a complex pattern. These overlaps enable anything to be placed unambiguously on the diagram, defining its relationship to all five classifications. A quick analysis of Figure 5.3 enables us to confirm, for example, that the animal Tyrannosaurus rex was large but did not have big ears and was a poor dancer.
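The same classification can be written down formally. Here is a sketch of Figure 5.3 in OWL terms, again using rdflib; the five class names are stand-ins mirroring the figure’s examples, and the namespace is made up:

```python
# A sketch of Figure 5.3's classification in OWL terms, using rdflib.
# The class names are illustrative stand-ins for the figure's five sets.
from rdflib import Graph, Namespace
from rdflib.namespace import OWL, RDF, RDFS

EX = Namespace("http://example.org/zoo#")
g = Graph()

# Everything is a Thing; five overlapping classifications sit beneath it.
for cls in (EX.Animal, EX.LargeThing, EX.BigEared, EX.GoodDancer, EX.GreyThing):
    g.add((cls, RDF.type, OWL.Class))
    g.add((cls, RDFS.subClassOf, OWL.Thing))

# Tyrannosaurus rex: a large animal, but no big ears and a poor dancer.
g.add((EX.TRex, RDF.type, EX.Animal))
g.add((EX.TRex, RDF.type, EX.LargeThing))

# Membership checks are then simple triple lookups.
print((EX.TRex, RDF.type, EX.LargeThing) in g)  # True
print((EX.TRex, RDF.type, EX.BigEared) in g)    # False
```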

As the coverage and depth of formal ontologies increase, computer systems will be able to change the way they communicate. Currently, computer systems communicate via interfaces that are defined by the format of the data they carry. In the future, as the use of Inventories and semantic technologies increases, interfaces will be defined not in terms of data formats, but in terms of meaning. The exact formats need not be known when a system is built, which ultimately enables connections between systems that were not anticipated when those systems were built. If two systems can agree on what they want to talk about, they can create their own interfaces. Getting systems to talk to each other would no longer be a laborious process.
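A toy example shows the difference between format-based and meaning-based interfaces. Here, two hypothetical systems use completely different field names, but because each maps its fields onto shared concepts, a record can be re-keyed automatically; all the names are invented:

```python
# A toy contrast between format-based and meaning-based interfaces.
# Each system publishes a mapping from its local field names to shared
# ontology concepts; the field and concept names are invented.
system_a = {"custNm": "customer-name", "ordTotAmt": "order-total"}
system_b = {"buyer": "customer-name", "amount_due": "order-total"}

def translate(record, source_schema, target_schema):
    """Re-key a record from one system's format to another's via shared meaning."""
    by_concept = {source_schema[field]: value for field, value in record.items()}
    to_field = {concept: field for field, concept in target_schema.items()}
    return {to_field[c]: v for c, v in by_concept.items() if c in to_field}

message = {"custNm": "Jumbo Ltd", "ordTotAmt": 99.50}
print(translate(message, system_a, system_b))
# {'buyer': 'Jumbo Ltd', 'amount_due': 99.5}
```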

As the discovery tools used for Brownfield purposes increase in coverage and depth, they will help standardize many of the concepts and relationships within the Inventory. In other words, the vocabulary of the short sentences, the triples, will start to become common. What will the implication of such standardization be?

Enabling the Semantic Web

Sir Tim Berners-Lee’s vision for the future of the Web is the semantic Web: “a single Web of meaning, about everything and for everyone.” Whereas today’s Web is billions of pages of information understood only by humans, the semantic Web would enable computers to understand the information on each page. You would no longer need to surf the Web to find the information you want; semantically enabled search engines would be able to find it for you. In addition, writing programs to understand information on the Web and respond to it would become much easier.

The hard part, of course, is not working out how to publish and use semantic data. The hard part is getting people to agree on the same classifications and definitions, and getting them to link their data. This is difficult because the benefits of doing so are not immediately clear. As Berners-Lee himself points out, the benefit of the semantic Web is realized only when large quantities of data are available in these semantically strong ontology forms. Only then does everything click, and the semantic Web provides a better way of interacting with customers or other organizations. Until that point, any investment in creating semantically strong Web resources is a burden to an organization. Currently, in commercial environments, it amounts to little more than a public show of faith in one particular future of the Web, with little or no business benefit.

If semantic technologies are used to create a site survey of an existing Brownfield IT environment for potential change and regeneration, then any newly generated function created from the Inventory could automatically include additional semantic information. Adding this information would not be a burden; it would be a free byproduct of creating an Inventory that contains semantic information in the first place. Suddenly, the kind of additional information that the semantic Web needs to work would be available at little additional cost. IBM has measured significant productivity benefits from the use of its Elephant Eater, so a strong business case could be made for such reengineering.

Thus, publishing semantically based Web services that describe not only their content, but also the relationship of their content to other content becomes highly cost-effective. (The information is already there in the Inventory for the taking.) As a result, the kind of Web services companies offer would change forever.

Dynamic Services

In today’s state-of-the-art implementations, IT services can be selected from a list of services in a directory. So if a company were looking for a trading partner that could offer a certain service, its inquiry to the directory service would have to be highly specific, and the directory would return only companies that had anticipated exactly that inquiry. In the future, semantic technologies would change such an interaction: The company would issue a request for a trading partner that offered a particular service, but instead of the directory identifying matching services, the potential trading partners would see whether they could assemble such a service. They would not need to have anticipated that particular request.
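In its simplest possible form, such semantic matching is set containment: A provider matches if its harvested capabilities cover the characteristics the request describes, whether or not it ever anticipated that combination. The capability names below are invented:

```python
# A toy sketch of semantic service matching: a request describes the
# characteristics it needs, and a provider matches if its harvested
# capabilities cover them, even if it never anticipated the combination.
providers = {
    "EnterpriseA": {"ship-goods", "refrigerated-transport", "track-and-trace"},
    "EnterpriseC": {"ship-goods", "track-and-trace"},
}

request = {"ship-goods", "refrigerated-transport"}

matches = [name for name, capabilities in providers.items()
           if request <= capabilities]  # subset test: all needs covered?
print(matches)  # ['EnterpriseA']
```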

Geert-Willem Haasjes, a senior IT architect colleague from IBM in the Netherlands, has incorporated this model of flexible integration into a high-level architecture. Assuming that the requestor was trusted to access such a described service, the request for a trading partner would be returned via a service that had been constructed in real time to answer the specific request. Integration would become a dynamically established activity rather than a statically linked and formally tested service interface advertised via a directory of such static services.

Figure 5.4 shows the basic operation of such a system. In this figure, Enterprise A has followed the Brownfield approach. The organization’s capabilities and constraints have been discovered and harvested in an Inventory. As a result, the services the business offers via the semantic Web exactly describe its capabilities (unlike those of its competitors).

Figure 5.4. Enterprise B wants something unusual; Enterprise A can provide it.

Enterprise B is looking for a service but does not know how or where it can be serviced. Today, someone in Enterprise B would probably use Google to find a company specializing in that service and then use other forms of contact to establish whether the service required is exactly what the business wants—and how much it might cost. In this new world, a semantic-based request is sent. Instead of describing the precise form of the IT service that is required, the request describes the actual characteristics of the service itself. The request is sent to a search engine, which identifies that Enterprise A has published a service that is compatible with the request.

Enterprise B might have asked for something so unusual that Enterprise A has never received such a request before. But Enterprise A knows it can meet the request because it knows about all its internal constraints. As a result, Enterprise A gets the business.

This kind of flexibility has a much bigger effect: Today’s approach of delivering a set of specific products or services to a customer would completely break down. Once upon a time, your choice of what to buy was largely constrained by the store’s capability to stock it. If you had a peculiar taste in music or literature, chances are, your local store wouldn’t meet it. You might be able to order what you wanted, but if you wanted it urgently, you had to travel to a bigger store, or possibly travel even farther to find a specialist shop.

That pretty much all changed with the e-business revolution. These days, if it’s made (and even if it’s not), it likely can be ordered immediately via the Internet. Shops are no longer constrained by their physical footprint, so they can offer a greater variety to a much larger potential customer set via the Internet.

Chris Anderson pointed out this effect, known as the long tail, in 2004 and then popularized it in his book, The Long Tail.[4]

The long tail says that the amount of business available via low-volume sales might outweigh the amount of business available via high-volume sales (see Figure 5.5). Many Internet retailers make the bulk of their profit from the long tail, where they can attract higher levels of gross profit.

Figure 5.5. Chris Anderson’s long tail.
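A back-of-envelope calculation shows why the tail can dominate. If item k in a catalogue sells in proportion to 1/k (a common Zipf-like assumption, used here purely for illustration, as are the catalogue sizes), the many slow sellers jointly outweigh the hits:

```python
# A back-of-envelope long-tail calculation under a 1/k ("Zipf-like")
# sales assumption; the catalogue sizes are invented for illustration.
def sales(first_rank, last_rank):
    """Total relative sales for catalogue ranks first_rank..last_rank."""
    return sum(1.0 / k for k in range(first_rank, last_rank + 1))

head = sales(1, 100)          # the 100 best-sellers a physical store stocks
tail = sales(101, 1_000_000)  # everything else, viable only online
print(f"head={head:.1f}, tail={tail:.1f}")  # tail comes out well above head
```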

So why not have a long tail of business services for customers? Using the previous model, the customers could essentially customize the offering that they want, including the supporting business processes within the supplier organization.

If the retail industry model holds strong, these highly personalized and customized—and, therefore, desirable—products could offer significantly greater profit.

Consider an example with the new BMW Mini: The number of options available almost ensures that each car is different from anyone else’s. The car thus reflects the customer’s individuality; the customer values this and sees it as a key differentiator in the market, enabling the company to charge a premium price for a small car.

Everything We Do Is Driven by You

Ultimately, therefore, the Brownfield vision is for businesses that are literally driven by their customers—not just at the macro level of determining what products and services their customers might like, but actually at the micro level of a customized offering that the customer has specifically asked for.

In such an environment, IT (as a separate department receiving requests from the business and acting on them) is dead. The translation layers and communication gaps between IT and the business have been removed. The IT is driven dynamically by the business and adapts automatically as required.

As a result, all businesses will need and use Inventories; they will be unable to compete without them. But when everyone has an Inventory, will this create a level playing field? Probably not. Whenever something unique is created, it is then copied and commoditized or standardized—and then something else comes along to improve on it. So in a world of Inventories, who would be king?

Business Attractors for Attractive Businesses

After the business, its processes, and its supporting IT are fed into the Inventory, the Inventory would contain thousands of Views. As predicted by Fred Brooks, Grady Booch, and a host of other IT thought leaders, the complexity of the Inventory would ensure that software engineering remained a complex pursuit.

Ultimately, the patterns, hidden knowledge, and capabilities in the Inventories will differentiate organizations. An organization’s capability to mine, analyze, improve, and reengineer its own Inventory will become vital. In a fast-moving world, businesses that can identify and establish structures that can withstand the chaos around them will survive.

Such islands of strength will be akin to the strange attractors of chaos theory. Strange attractors define the long-term, complex, yet stable patterns into which chaotic dynamic systems ultimately evolve. Business environments are essentially chaotic dynamic systems: Even if you know an awful lot about an industry and its key players, it is still almost impossible to predict its future. As with the weather, predicting the long-range behavior of business environments is difficult. Even in such complex and unpredictable systems, however, there are islands of order. These strange attractors are the places where long-lived, successful businesses will ultimately be drawn and thrive.

Unfortunately, only one kind of brain can cope with that amount of precise, complex, and interrelated information and model the surrounding chaos. It would be fitting if it were an elephant’s brain, because elephants never forget—but the truth is that computer programs will have to be written (or generated) to analyze and adapt the Inventories themselves.

In companies in which the business data, IT, and business processes are one—each coupled to the others in well-defined, symbiotic ways—the entire organization begins to resemble an organism rather than a conventional business. This is the embodiment of an enterprise within a complex program that is aware of its business structure and its place within a wider business ecosystem. Such a being would be capable of dealing with chaos and an uncertain future. Computer programs called Inventory optimizers could be used to constantly adapt the business to its ecosystem.

Just as chaos theory has taught us that complex patterns of stability hide within apparent disorder, such as weather patterns, these Inventory optimizers would look for stable business attractors within their chaotic, dynamically changing ecosystems. Businesses that conformed to such attractors would be uniquely capable of surfing the crest of constant change without losing their fundamental structure or order.

Gregory Stock described a similar concept in his book Metaman: The Merging of Humans and Machines into a Global Superorganism.[5] The metaman is a market-driven, hugely robust superorganism, comprising both human and technological components, that ultimately dominates mankind’s future. It is a fundamentally optimistic book, so perhaps that is not a bad note on which to end this part of the book. Brownfield takes over the world!

The Death of Brownfield

Of course, by the time the world gets to the metaman, such an ecosystem would long ago have reengineered every other system in the enterprise and optimized every process. As the world around it changed, the metaman would analyze the new landscape in real time and adapt itself accordingly. In such an environment, the IT would be reengineered at will—every day could bring a new environment.

Perhaps that’s the ultimate fate of Brownfield. If Brownfield succeeds, there will be no IT Brownfields anymore—no environmental constraints, no 500-person-year projects effecting change at a snail’s pace. We can hope... and, in the meantime, we shall keep trying.

Endnotes

1. Sir Tim Berners-Lee invented the Web, not the Internet, which came much earlier.

2. The Babel fish is an invention of the late and much-lamented Douglas Adams: a creature that, if you stuck it in your ear, would automatically translate anything said to you in any language.

3. Basel II offers recommendations on banking laws and regulations, intended to ensure that banks remain financially stable in the face of the risks they take.

4. Anderson, Chris. The Long Tail. New York: Hyperion, 2006.

5. Stock, Gregory. Metaman: The Merging of Humans and Machines into a Global Superorganism. New York: Simon & Schuster, 1993.
