Data Modeling Zone 17

I am very happy to be a speaker at this year's Data Modeling Zone in Düsseldorf. As at the Global Data Summit, I will be talking about one of my favorite topics: temporal data in the data warehouse, especially in connection with Data Vault and dimensional modeling.

  • Written by Dirk Lerner
  • Hits: 3943

Global Data Summit

I am very pleased to be speaking at the Global Data Summit in Golden, Colorado this year. I will be talking about one of my favorite topics: temporal data in the data warehouse, especially in connection with Data Vault and dimensional modeling. The title is:

Bitemporal modeling for the Agile Data Warehouse

The talk is a 5x5 presentation, i.e. five slides in five minutes. Afterwards, participants will have the opportunity to discuss the topic in depth with me in a 90-minute whiteboard session.
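For readers who are new to the term, here is a minimal sketch of my own (it is not taken from the talk) of what "bitemporal" means in practice: every fact is qualified by two independent time axes, a valid-time interval describing when the fact is true in the real world, and a transaction-time interval describing when the data warehouse knew about it. The customer-address example and field names below are purely illustrative assumptions.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative bitemporal record: each fact carries two independent time axes.
@dataclass
class BitemporalRow:
    customer_id: int
    address: str
    valid_from: date      # when the address became true in the real world
    valid_to: date        # when it stopped being true (date.max = still valid)
    recorded_from: date   # when the warehouse learned about this version
    recorded_to: date     # when the warehouse superseded it (date.max = current knowledge)

# A correction never overwrites history: the old row is closed on the
# transaction-time axis and a corrected row is inserted alongside it.
rows = [
    BitemporalRow(42, "Old Street 1", date(2015, 1, 1), date.max,
                  date(2015, 1, 5), date(2016, 3, 1)),
    BitemporalRow(42, "New Street 9", date(2015, 1, 1), date.max,
                  date(2016, 3, 1), date.max),
]

# "What did the warehouse believe on 2015-06-01 about the address valid on 2015-02-01?"
as_of, valid_at = date(2015, 6, 1), date(2015, 2, 1)
answer = [r for r in rows
          if r.recorded_from <= as_of < r.recorded_to
          and r.valid_from <= valid_at < r.valid_to]
print(answer[0].address)  # -> Old Street 1
```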

  • Written by Dirk Lerner
  • Hits: 3282

Fact-Oriented Modeling (FOM) - Family, History and Differences

Months ago I talked to Stephan Volkmann, the student I mentor, about possible topics for a seminar paper. One suggestion was to write about information modeling, namely FCO-IM, ORM2 and NIAM, siblings in the Fact-Oriented Modeling (FOM) family. In my opinion, FOM is the most powerful technique for building conceptual information models, as I wrote in a previous blog post.

  • Written by Dirk Lerner
  • Hits: 6472

Data Model Scorecard

Objective review and data quality goals of data models

Have you ever asked yourself what score your data model would achieve? Could you imagine 90%, 95% or even 100% across 10 categories of objective criteria?

No?
Yes?

Either way, whether you answered “no” or “yes”, I recommend using something to test the quality of your data model(s). For years there have been methods to test and ensure quality in software development, such as ISTQB, IEEE, RUP, ITIL, COBIT and many more. In data warehouse projects I have observed test methods covering everything: loading processes (ETL), data quality, organizational processes, security, …
But data models? Never! Why is that?

  • Written by Dirk Lerner
  • Hits: 6333

The Data Doctrine

Message: Thank you for signing The Data Doctrine!

What a fantastic moment. I’ve just signed The Data Doctrine. What is The Data Doctrine? Following a philosophy similar to that of the Agile Manifesto, it offers us data geeks a data-centric culture:

Value Data Programmes1 Preceding Software Projects
Value Stable Data Structures Preceding Stable Code
Value Shared Data Preceding Completed Software
Value Reusable Data Preceding Reusable Code

While reading The Data Doctrine I found myself looking back at all the lost options and possibilities in data warehouse projects, caused by companies, project teams, or even individuals ignoring the value of data and bearing the consequences. I saw it in data warehouse projects struggling with the lack of stable data structures, in source systems as well as in the data warehouse itself. I saw it in fancy new systems where no one cares which data was generated, what was generated, or how. And, even worse for a data warehouse project, I saw the practice of keeping data locked away, with access limited to a few principalities of departmental castles.
None of this is the way to get value out of corporate data and to leverage it for value creation.

As I advocate flexible, lean and easily extendable data warehouse principles and practices, I will support the idea of The Data Doctrine in order to evolve the understanding of the need for data architecture as well as for data-centric principles.

So long,
Dirk

1 To emphasize the point, we (the authors of The Data Doctrine) use the British spelling “programme” to reinforce the difference between a data programme, which is a set of structured activities, and a software program, which is a set of instructions that tell a computer what to do (Wikipedia, 2016).

  • Written by Dirk Lerner
  • Hits: 3947

Bitemporal Data

If everything happened at the same time, there would be no need to store historical data. We, the consumers of data, would know everything at the same instant. Beyond all the other philosophical implications: if time did not exist, would data still be necessary?

(Un)fortunately, time does exist, and data architects, data modelers and developers have to deal with it in the world of information technology.

In this category I will collect all my blog posts on this fascinating topic of temporal data.
