A second possibility is to replace our Business Rules layer after the business object with an external ETL flow that contains and maintains the business logic. To do so, you can add a trigger in our jobs module which initiates the deployed SSIS / DTS workflows once a given job has finished loading.
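As a minimal sketch of such a post-job hook, the snippet below hands control to an external SSIS package via the standard `dtexec` command-line utility. The job name, package path, and hook function are illustrative assumptions, not part of the Datavault Builder API.

```python
# Hypothetical post-job hook: after a job finishes loading, launch the
# deployed SSIS package that holds the business logic via dtexec.
import subprocess
from pathlib import Path


def build_ssis_command(package_path: str) -> list:
    """Build the dtexec invocation for an SSIS package file (.dtsx)."""
    return ["dtexec", "/f", package_path]


def on_job_finished(job_name: str, package_path: str) -> list:
    """Illustrative trigger callback; runs the package if it exists locally."""
    cmd = build_ssis_command(package_path)
    if Path(package_path).exists():
        subprocess.run(cmd, check=True)
    return cmd


cmd = on_job_finished("load_crm_source", r"C:\etl\business_rules.dtsx")
print(cmd)
```

In a real setup the callback would be wired to the jobs module's trigger mechanism rather than called by hand.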
The tool is made up of multiple Docker containers. It can be deployed on premises as well as in the cloud.
Yes, in general you are free to access the processing database and modify objects manually. The Datavault Builder reads the existing objects and their properties in real time, so manual changes (for instance, deletions) are directly reflected in the model. Please be aware of some limitations, such as staying within the naming patterns when creating objects manually.
No, the metadata is stored directly on the objects in the database.
Since all objects are created directly on the database and the metadata is stored on them, the rollout can be done with a schema compare, simply moving objects between the environments.
Yes. One possibility is in staging: you can disable the staging loads in the Datavault Builder, manually transfer the data into the landing zone, and then kick off the Data Vault loads. In this scenario, only the loading process into the Data Vault itself is orchestrated by the included procedures of the Datavault Builder, which automatically compare the data for changes, add the hashes, load times, and load sources, and update the tracking satellites.
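The following sketch illustrates the idea of a manual transfer into the landing zone, plus the kind of hash-based change detection the Data Vault load procedures then perform. All table and column names are invented for the example; the real landing-zone structures are created by the Datavault Builder.

```python
# Manual load into an assumed landing table, followed by computing a
# hash per record (the basis for change detection) and a load time.
import hashlib
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer_landing (customer_id TEXT, name TEXT)")

# Step 1: the manual transfer replaces the disabled staging load.
rows = [("C1", "Alice"), ("C2", "Bob")]
conn.executemany("INSERT INTO customer_landing VALUES (?, ?)", rows)


def hash_diff(*attrs):
    """Hash of all attribute values; unchanged rows produce the same hash."""
    return hashlib.md5("|".join(attrs).encode()).hexdigest()


# Step 2: the Data Vault load procedures stamp hash, load time, and source.
load_time = datetime.now(timezone.utc).isoformat()
staged = [
    (cid, hash_diff(cid, name), load_time, "manual_transfer")
    for cid, name in conn.execute("SELECT customer_id, name FROM customer_landing")
]
print(staged)
```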
Yes, the Business Object generator helps you denormalize the Data Vault structure and prepare it for output. You can select related Data Vault objects and add their attributes to the output. The Datavault Builder prepares the necessary joins in the background.
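As a toy illustration of the joins prepared in the background, the snippet below flattens a hub, its satellite, and a related hub connected via a link into one output row. The table and column names follow common Data Vault conventions but are made up for this example.

```python
# Denormalizing a small Data Vault model (hub + satellite + link + hub)
# into one flat business-object row, as the generator would do.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE hub_customer (customer_hk TEXT PRIMARY KEY, customer_id TEXT);
CREATE TABLE sat_customer (customer_hk TEXT, name TEXT);
CREATE TABLE hub_order    (order_hk TEXT PRIMARY KEY, order_id TEXT);
CREATE TABLE link_customer_order (customer_hk TEXT, order_hk TEXT);

INSERT INTO hub_customer VALUES ('hk1', 'C1');
INSERT INTO sat_customer VALUES ('hk1', 'Alice');
INSERT INTO hub_order    VALUES ('hk9', 'O-100');
INSERT INTO link_customer_order VALUES ('hk1', 'hk9');
""")

# The generator emits a join of this shape behind the scenes.
flat = conn.execute("""
    SELECT h.customer_id, s.name, o.order_id
    FROM hub_customer h
    JOIN sat_customer s       ON s.customer_hk = h.customer_hk
    JOIN link_customer_order l ON l.customer_hk = h.customer_hk
    JOIN hub_order o          ON o.order_hk = l.order_hk
""").fetchall()
print(flat)  # [('C1', 'Alice', 'O-100')]
```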
The generated jobs consist of the staging and Data Vault loads of one source system. On execution, a procedure in the core module reads out all contained loads, automatically calculates a running order, and generates dynamic SQL statements that perform the loading on the database.
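The "calculate a running order" step amounts to ordering loads by their dependencies, e.g. a staging load must finish before the Data Vault loads that read from it. A minimal sketch with invented load names, using a standard topological sort:

```python
# Ordering a job's loads so that every load runs after its dependencies.
from graphlib import TopologicalSorter

# Hypothetical dependency graph: each load maps to the loads it needs first.
dependencies = {
    "stage_customer": set(),
    "load_hub_customer": {"stage_customer"},
    "load_sat_customer": {"stage_customer", "load_hub_customer"},
}

running_order = list(TopologicalSorter(dependencies).static_order())
print(running_order)
```

The core module additionally generates the dynamic SQL per load; only the ordering logic is sketched here.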
We can load data from any source that supplies a JDBC driver. In addition, support for flat files and REST APIs is included.
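For flat files, ingestion boils down to parsing delimited records into staging rows. A minimal sketch, where the file content and column names are assumptions for the example:

```python
# Parsing a delimited flat file into dict rows ready for staging.
import csv
import io

# Stand-in for an actual file on disk.
flat_file = io.StringIO("customer_id,name\nC1,Alice\nC2,Bob\n")

rows = list(csv.DictReader(flat_file))
print(rows)
```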