So, I'm playing a little with words here. I'm certainly not advocating locking anybody or anything in a Data Vault. I want to share how you can lock in success as you design and deliver your new Data Vault. I assume you have your business people fully on board, as discussed in this recent blog. If not, I advise you to go back and do that first. This blog post is aimed specifically at assisting your development team.
Most of us are challenged by change, and developers are little different. They are typically very comfortable with a set of design approaches and tools learned in the past, and that comfort routinely frames their perspective on how to tackle the future. Combine the comfort of old ways with the tight timeframes and pressures of today's business requests, and teams seldom take the time to explore new options. As a result, it is easy for them to be weighed down by outdated, limiting approaches to data infrastructure.
What we've learned from the evolution of the Data Vault methodology and data warehouse automation (DWA) over the past decade is that some areas of the data warehouse development process are broken. Dan Linstedt and the other contributors to the Data Vault model in the early 2000s recognized early on that traditional data models could not meet the quality and agility goals of a data warehouse serving a modern data-focused business. I have provided some of this background in this recent white paper.
The Data Vault is constructed from some very carefully defined primitives, such as hubs, links and satellite tables, that must be defined and populated in specific ways to work as intended. If developers use old approaches or, worse still, make up new ones themselves, disaster will follow.
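To make that concrete, here is a minimal Python sketch of the kind of convention involved: in Data Vault 2.0 a hub is keyed by a hash of its business key, computed the same way by every load so that every source agrees on the surrogate. The table name, column names and the choice of MD5 below are illustrative assumptions, not prescriptions from any particular implementation.

```python
import hashlib
from datetime import datetime, timezone

def hub_hash_key(*business_keys: str, delimiter: str = "||") -> str:
    """Compute a Data Vault 2.0 style hash key from one or more business keys.

    One common convention: trim and upper-case each key, join the parts with a
    fixed delimiter, then hash, so the same business key always yields the same
    surrogate no matter which source system or load job produces it first.
    """
    normalized = delimiter.join(key.strip().upper() for key in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# Illustrative row for a hypothetical customer hub: a hub carries only the
# hash key, the business key, a load timestamp and the record source.
hub_customer_row = {
    "hub_customer_hk": hub_hash_key("C-1001"),
    "customer_id": "C-1001",
    "load_dts": datetime.now(timezone.utc),
    "record_source": "CRM",
}
print(hub_customer_row)
```

Get a detail like this wrong in one loading routine and not another, and the same customer ends up with two different keys, which is exactly the kind of quiet inconsistency the methodology is designed to prevent.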
In Data Vault 2.0, Linstedt has provided a methodology to drive best practice in the design of the data model and in the development of the functionality that populates it. Methodologies are great: I rely on a wonderful methodology for manually raising my computer screen to the ideal height as I write this post. But relying on developers to manually apply a methodology leads to inconsistent approaches to development; results in delays in future maintenance as other developers struggle to understand different coding styles; and ultimately leads to a loss of skills for your organization when your cleverest developer dies in a freak coding accident.
WhereScape® Data Vault Express addresses these issues by encoding templates for the Data Vault components, and by embedding best practices for population processes and development methods within an automated, metadata-driven design and development environment. Starting with the initial design collaboration between IT and business people, design choices are encoded in metadata and used to auto-generate the code and scripts that define the Data Vault tables and populate them with the correct data, ensuring design consistency and completeness, and conformance of code to a single set of standards. Traceability is enforced and maintenance eased. Additionally, as your developers work, everything is documented automatically, a task few enjoy or have the time to complete.
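By way of illustration only (this is not WhereScape's template language or API, just a hedged sketch of the general idea), metadata-driven generation means that a design decision captured once as data produces the same table definition for every developer who needs it:

```python
# A generic sketch of metadata-driven generation: a small metadata record
# describes a hub, and a single template turns it into DDL, so every hub
# any developer creates comes out in exactly the same shape.
HUB_DDL_TEMPLATE = """\
CREATE TABLE {name} (
    {name}_hk       CHAR(32)      NOT NULL,  -- hash key
    {bk_columns}
    load_dts        TIMESTAMP     NOT NULL,
    record_source   VARCHAR(50)   NOT NULL,
    PRIMARY KEY ({name}_hk)
);"""

def generate_hub_ddl(metadata: dict) -> str:
    """Render hub DDL from metadata; names, keys and standard columns are
    decided by the metadata, not by whoever happens to write the code."""
    bk_columns = "\n    ".join(
        f"{column:<15} {dtype:<13} NOT NULL,"
        for column, dtype in metadata["business_keys"]
    )
    return HUB_DDL_TEMPLATE.format(name=metadata["name"], bk_columns=bk_columns)

hub_customer_meta = {
    "name": "hub_customer",
    "business_keys": [("customer_id", "VARCHAR(20)")],
}
print(generate_hub_ddl(hub_customer_meta))
```

The point is not this particular snippet but the principle: when the generator, rather than the individual developer, applies the standard, consistency and documentation come for free.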
Locking in the Data Vault is all about maintaining consistency, ensuring complete documentation, and auto-generating best-practice model and code assets across design and development. As I discuss in the white paper Meeting the Six Data Vault Challenges and in this recent recorded webcast, data warehouse automation is the logical foundation. And while change is hard, development teams will benefit greatly from an openness to doing it differently.
Coming soon, some thoughts on Living in a Data Vault.
This blog is part of a series and has been published on WhereScape.com.