Foreword

I met Daniel Linstedt for the first time during a speech at Lockheed Martin in the early 1990s. At the time, he was an employee of the company, working on government projects. He approached me because he wanted my opinion about a concept he had invented at the Department of Defense for storing large amounts of data. Back then, the term Big Data had not been coined yet. But from what Daniel explained to me, the concept for dealing with such huge amounts of data was born.

Back then, the end user had cried, “Give me my data!” But over time the end user became more sophisticated. The end user learned that it was not enough to get one’s data. What a person needed was the RIGHT data. And then the sophisticated end user cried, “Give me my accurate and correct data!”

The data warehouse represented the architectural solution to the need for a single version of the truth. The primary reason for the existence of the data warehouse was the corporate need for integrity and believability of data. As such, the data warehouse became the major architectural evolutionary leap beyond the early application systems.

But the data warehouse was not the end of architecture. Indeed, the data warehouse was only one stepping stone – architecturally speaking – in the progression of the evolution of architecture. It was Daniel’s idea that followed the data warehouse. In many ways the data warehouse set the stage for him.

Daniel used the term common foundational modeling architecture to describe a model based on three simple entities, focusing on business keys, their relationships, and descriptive information for both. By doing so, the model closely followed the way the business was using the data in the source systems. It allowed sourcing all kinds of data, regardless of its structure, in a fully auditable manner. This was a core requirement of government agencies at the time. And due to Enron and a host of other corporate failures, Basel, and SOX compliance, auditability was pushed to the forefront of the industry.

Not only that, the model was able to evolve with changing data structures. It was also easy to extend by adding more and more source systems. Daniel later called it the “Data Vault Model,” and it was groundbreaking.

The Data Vault became the next architectural extension of the data warehouse. But the Data Vault concept – like all evolutions – continued to evolve. He asked me what to do about it and, as a professional author, I gave him the advice to “publish the heck out of it.” But Daniel decided to take the long view. Over multiple years, he improved the Data Vault and evolved it into Data Vault 2.0. Today, this System of Business Intelligence includes not only a more sophisticated model, but also an agile methodology, a reference architecture for enterprise data warehouse systems, and best practices for implementation.

The Data Vault 2.0 System of Business Intelligence is groundbreaking, again. It incorporates concepts from massively parallel architectures, Big Data, real-time, and unstructured data. And after all this time, I’m glad that he followed my advice and has started to publish more on the topic.

This book represents the latest, most current step in the larger evolution of the Data Vault. It has been carefully and thoughtfully prepared by leaders in the thought and implementation of the Data Vault.

Bill Inmon

June 29, 2015
