The deployment module manages the state of the developed model. With it, the state can be exported, versioned, and imported into another environment. It also supports comparing the logical model of one environment with that of another. Finally, to bring detected changes from one environment to another, both direct deployment and change script generation are supported.


  1. Environment

    Shows the deployment state for an environment (default: local). You can also connect to remote environments to view their deployment state and perform deployments against the target environment.

  2. History

    Lists the state of ongoing or past deployments.

  3. Deployment Actions
    • Export local environment: Generate a state definition file containing the logical model state of the environment.

    • Compare: Start a new comparison of two environment states.

Export An Environment


When exporting an environment, a state archive is generated. This zip file contains information about the current data model and data flows on a logical level. As the logical state is exported as JSON files, it is human readable.


There are several reasons to export an environment. For one, the export allows the model to be checked into an enterprise versioning system (e.g. GIT). Furthermore, as the files are human readable, different versions can also be merged externally and reimported into another environment.


There are two options to export an environment's state.

  1. Using the frontend

    Open up the deployment module and click on Export local environment. Follow the dialogue to download the offered zip file.


    There are two options offered - unless needed otherwise, export the state in format version 2.0.

  2. Using the API

    The model can also be retrieved programmatically using the API deploymentExportModel.

    The API will return a base64 encoded zip file containing one file per logical object defining its current state.

    Check API docs on our knowledge base for more details on the payload.
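As an illustration of the API option: the response contains the zip as base64, so unpacking it programmatically takes only a few lines. The payload below is simulated locally, and the field names and file layout are assumptions - check the API docs for the actual response shape.

```python
import base64, io, json, zipfile

def unpack_state_export(b64_payload: str) -> dict:
    """Decode a base64-encoded state archive and return the parsed
    JSON content per contained object file."""
    data = base64.b64decode(b64_payload)
    objects = {}
    with zipfile.ZipFile(io.BytesIO(data)) as zf:
        for name in zf.namelist():
            if name.endswith(".json"):
                objects[name] = json.loads(zf.read(name))
    return objects

# Simulate a response payload locally: one hypothetical logical object file
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("hubs/customer.json", json.dumps({"id": "customer"}))
payload = base64.b64encode(buf.getvalue()).decode()

print(unpack_state_export(payload))
```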


The export file is case sensitive. If it is unzipped on Mac/Windows and contains objects that differ only in casing (e.g. a staging table id), it needs to be unzipped case sensitively.
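The collision can be illustrated with Python's standard library: a zip archive happily holds two entries differing only in casing, but extracting both onto a case-insensitive filesystem would silently clobber one (the entry names below are hypothetical):

```python
import io, zipfile

# Build an archive with two entries differing only in casing
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("staging/Customer.json", "{}")
    zf.writestr("staging/customer.json", "{}")

with zipfile.ZipFile(buf) as zf:
    names = zf.namelist()

print(len(names))  # 2 - both entries exist in the archive
# extractall() on a case-insensitive filesystem (the macOS/Windows default)
# would silently overwrite one file with the other; extract onto a
# case-sensitive volume, or process the entries in memory as shown here.
```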


When checking the files into GIT, it is recommended to use UNIX-based line endings.
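One way to enforce this is a .gitattributes file in the repository holding the exported state (a sketch; adjust the patterns to your actual file layout):

```
# Normalize exported state files to LF (UNIX) line endings
*.json text eol=lf
```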

Connect Remote Environment


A remote environment is another instance of Datavault Builder running.


Connecting a remote environment allows comparing the local state with the remote state, as well as checking currently ongoing deployment activities on the target environment.

  1. Initiate the dialogue to connect to a remote environment: either click on the remote tab in the environment navigation, or click on the compare action and then on connect remote environment.

  2. Fill out the fields and click on connect (or, if AD Auth is configured, click on the MS icon).

  1. URL

    Where the remote environment is running.

  2. Credentials

    For the user on the remote environment.

Initiate Comparison


An environment can either be a second instance of the Datavault Builder (e.g. Dev/Test/Prod) or an exported logical state of the implementation.


The goal is to get an overview of changes between two different versions of the model.

  1. Open up the deployment module and click onto compare.

  2. From the environment tabs, select the environment you would like to compare to.

  3. Depending on the environment, specify the connection details:
    • Remote environment: Use a connected remote environment or connect a new one

    • File: Upload previously exported state zip files or JSON files.

    • Folder: Drag and drop the root folder of the previously exported, unzipped state archive.

    • Empty model: Use this tab to generate a full delete against the local environment.

    • GIT: Directly compare to a state stored in a connected GIT repository (requires install-wide GIT configuration as described here)

  1. Arrow

    Toggle the deployment direction by clicking the arrow.

  2. Environment Tab

    Select the type of environment to connect to.

  3. Environment Details

    Specify the login properties or upload a folder to connect to the second environment state.

Using The Comparison View


Once connected to a second environment, the comparison view lets you see differences and define a selection of changes to rollout.

  1. Open up the deployment module and connect a second environment.

  2. Go through the list and select the objects you would like to roll out.

  3. Initiate the deployment through the environment actions.

  1. Source Environment

    The state which is compared from.

  2. Target Environment

    The state which is compared against - if this is an actual environment, then this is also the environment the selected changes can be deployed against.

  3. Expand all

    Open up all categories and show contained elements.

  4. Actions
    From left to right:
    • Disconnect target environment

    • Switch deployment direction

    • Recompare both environments

    • Initiate deployment

    • Select/Unselect all (filtered)

    • Commit to GIT (only available when targeting GIT)

  5. Filter List

    Apply a filter to only compare certain elements of the implementation

  6. Deployment package options

    The operation column shows you what action is needed to bring the target environment's element into the same state as the source environment's. With the checkbox you can select which elements should be included in the rollout. When clicking the checkbox, a dependency check is performed which supports you in rolling out dependent objects.

  7. Diff Viewer

    When clicking on an object in the list, the diff viewer shows you the states of both sides and, in case of differences, also the changes in the middle part.

Check into GIT


Check in changes into GIT.

  1. Open up the deployment module, connect as target to a specific GIT Branch and start comparison.

  2. Select the changes you would like to commit.

  3. Click onto the “Commit to GIT” action

  4. Enter a commit message and hit commit.


  • The commit will be pushed immediately to the central repository.

  • The committing user is automatically taken from the logged-in user (Full Name <Email>).

  • If a newer commit becomes available on the central repository after starting the comparison, the commit will fail and you need to recompare against the current branch head.

  • Only one DVB model may be present within the pre-configured directory.

  • Make sure not to add any independent files within the pre-configured directory, as these will be removed on the next commit and could cause errors on model import.

  • The newly committed state will always be written in format 2.0.

Deploying objects

  1. Deployment way

    Specify which way to roll out the previously built deployment package.

  2. Deployment direction

    Reminder of the direction of deployment.

  3. Actions

    Start direct deployment / export into change script or cancel the process.

Direct Deployment


Instantly start the rollout of a previously defined deployment package to propagate changes onto an environment.

  1. Open up the deployment module and connect a second environment.

  2. Initiate the deployment and select the option for direct deployment.

Script based deployment


Script-based deployment is usually the way to deploy in enterprise environments, where specific deployment packages are created and tested before being rolled out onto a productive environment. For the rollout, a package of API calls is generated which can then be executed against a target environment.

  1. Open up the deployment module and connect a second environment.

  2. Compose a deployment package and initiate the deployment.

  3. Select the option to export a deployment script package.

    This exports a zip file containing three files: one holds environment parameters, another is a shell script with the necessary API calls, and the third is a JSON file holding the changes as a package which can be put onto the target.

  4. There are two ways to deploy the package's content against a target:

  1. Sequential API calls:
    • In the environment file (rollout.env), specify at least the rollout target and the login user details. If you are rolling out source systems, you can also update their parameters according to the target environment's needs.

    • Install cURL and jq.

    • Invoke the rollout by calling the script as described in the file (./ rollout.env)

  2. Parallelized deployment run
    • Put the deployment package onto the target, which allows running the steps in parallel.

    • Take the content of the json file and make a call to API deploymentPutDeploymentPackage of the target environment.


Currently no error handling is in place. Therefore, you should test the rollout against a second environment first.

Alternative: Database structure rollout

As the Datavault Builder has no separate metadata repository, all objects are created directly on the processing database. For the rollout, the database objects and structures can alternatively be moved between the environments by using schema comparison tools.

  1. Deploy the database structures

  2. Deploy the content of the tables in the dvb_config-schema as well (as the environment configuration is stored in there)
    • config (adjust if necessary)

    • system_data & system_colors (adjust if necessary)

    • auth_users (adjust if necessary)

    • job_data, job_loads, job_schedules, job_sql_queries & job_triggers. At least the job _dvb_j_default has to be present on each environment.

    • source_types, source_type_parameters, source_type_parameter_groups, source_type_parameter_group_names, source_type_parameter_defaults > copy&paste as is.

  3. After the deployment, depending on the database type you use (e.g. MSSQL, EXASOL, ORACLE), trigger the refresh of the metadata by executing the following query in the command line of the GUI:

    SELECT dvb_cache.f_refresh_caches();
  4. If you have skipped to deploy parts of the dvb_config-schema, it may be necessary to:
    • create new users

    • update system configs (as most of the time in DEV-TEST-PROD have different configs) - Can also be done in the GUI

    • adjust jobs

Deployment Limitations

  • When an object is dropped in the development environment and recreated with a different definition (e.g. other data types for columns), these changes cannot be deployed directly.

  • When not committing all local changes to GIT, an invalid or inconsistent state may result.

  • History: The deployment history only shows the latest 20 deployments and is no longer visible in the UI after a recreation of the core service (docker-compose down & up).

  • Export: The staging table column order may differ between environments, leading to a changed order in the export.

  • Deployment: Direct comparison is only possible between environments on Version 7+.

  • New deployment state and packages can not be deployed against Versions below 7.

  • The unaltered default BR is dependent on its BO: if the corresponding BO gets deployed, the unaltered default BR is also updated (even if it is not selected for deployment).

  • Make sure files generated over the API are checked into the enterprise versioning system using UNIX line breaks.

  • GIT comparison is only offered against the head of a branch. Create a new branch based on a commit if you need to compare to a specific commit instead.

  • When the model size of both the source state and the target state exceeds 15000 objects, the definition of “equal” objects is no longer shown in the UI during a comparison.