Unterfeldstrasse 18, 8050 Zurich
+41 32 511 27 89

FAQs


Where can the Datavault Builder be run or installed?

The tool is made up of multiple Docker containers. It can be deployed locally on premise as well as in the cloud.

Can I manually edit objects on the database?

Yes, in general you are free to access the processing database and modify objects manually. The Datavault Builder reads the existing objects and their properties in real time, so manual changes (for instance, deletions) are directly reflected in the model. Please be aware of some limitations, such as staying within the naming patterns when manually creating objects.

Does the Datavault Builder have its own metadata repository?

No, the metadata is stored directly on the objects in the database.

How does the deployment between multiple environments (Dev/Test/Prod) work?

Since all objects are created directly on the database and the metadata is stored on them, the rollout can be done with a schema compare, simply moving objects between the environments.
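Conceptually, a schema compare diffs the object catalogs of two environments and derives what must be moved. A minimal sketch of the idea in Python (the object names and catalog shape are invented for illustration, not the tool's actual objects):

```python
# Hypothetical object catalogs of two environments: object name -> definition.
dev = {
    "hub_customer": "CREATE TABLE hub_customer (...)",
    "sat_customer_details": "CREATE TABLE sat_customer_details (...)",
}
prod = {
    "hub_customer": "CREATE TABLE hub_customer (...)",
}

def schema_diff(source, target):
    """Return objects to create in, and to drop from, the target environment."""
    to_create = sorted(set(source) - set(target))
    to_drop = sorted(set(target) - set(source))
    return to_create, to_drop

create, drop = schema_diff(dev, prod)
print(create)  # objects present in dev but missing in prod
print(drop)    # objects in prod that no longer exist in dev
```

In practice a schema-compare tool also detects changed definitions, but the create/drop/alter set it computes is what gets rolled out between environments.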

Can third-party tools be used for ETL processing?

Yes. One possibility is in staging: you can disable the staging loads in the Datavault Builder, transfer the data into the landing zone manually, and then kick off the Data Vault loads. In this scenario, only the loading process into the Data Vault itself is orchestrated by the included procedures of the Datavault Builder, which automatically compare the data for changes, add the hashes, load times, and load source, and update the tracking satellites.

A second possibility is to replace our Business Rules layer after the business object with an external ETL flow that implements and maintains the business logic. To do so, you can include a trigger in our jobs module, which will initiate the deployed SSIS / DTS workflows after a certain job has finished loading.

Can I model a fact-dimension output?

Yes, the Business Object generator helps you denormalize the Data Vault structure and prepare it for the output. You can select related Data Vault objects and add their attributes to the output; the Datavault Builder prepares the necessary joins in the background.
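The joins prepared in the background typically connect a hub and its satellites on the hash key. As a rough illustration of that denormalization pattern, here is a small Python sketch that builds such a flat SELECT (all table and column names are invented for the example and do not reflect the tool's actual naming or generated SQL):

```python
def build_denormalized_select(hub, satellites, key="hk"):
    """Build a flat SELECT over a hub and its satellites, joined on a shared key.

    Table and column names are hypothetical; real generators resolve them
    from the model metadata.
    """
    cols = [f"{hub}.{key}"] + [f"{s}.*" for s in satellites]
    joins = "".join(
        f" LEFT JOIN {s} ON {s}.{key} = {hub}.{key}" for s in satellites
    )
    return f"SELECT {', '.join(cols)} FROM {hub}{joins}"

sql = build_denormalized_select("hub_customer", ["sat_customer_details"])
print(sql)
```

The result is the familiar fact/dimension shape: one row per business key, with the descriptive attributes from the related satellites pulled alongside it.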

What are the generated jobs? SSIS Packages?

They are not SSIS packages. The generated jobs consist of the staging and Data Vault loads of one source system. On execution, a procedure in the core module reads out all contained loads, automatically calculates a running order, and generates dynamic SQL statements that perform the loading on the database.
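Calculating a running order from load dependencies is, in essence, a topological sort. A minimal sketch in Python using the standard library (the load names and dependency graph are invented for the example; the tool derives the real order from the model):

```python
from graphlib import TopologicalSorter

# Hypothetical load dependencies: each load maps to the loads it depends on.
dependencies = {
    "stage_customer": set(),
    "hub_customer": {"stage_customer"},
    "sat_customer_details": {"hub_customer"},
}

# static_order() yields the loads so that every dependency runs first.
running_order = list(TopologicalSorter(dependencies).static_order())
print(running_order)
```

Independent loads in such an order have no mutual dependencies, so an orchestrator is also free to run them in parallel.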

What kinds of sources do you support?

We can load data from any source that supplies a JDBC driver. In addition, support for flat files and REST APIs is included.