The deployment module manages the state of the developed model. With the module, the state can be exported, versioned, or imported into another environment. It is also possible to compare the logical model of one environment with that of another environment. Finally, to bring detected changes from one environment to another, both direct deployment and change-script generation are supported.

Getting started

  1. Connect Environment

    Lets you connect to a remote environment or import an existing, locally stored logical state.

  2. Export Local Environment

    Retrieves the logical state from the connected environment and packages it into human-readable JSON files within a ZIP file.

Export an environment


When exporting an environment, a state archive is generated. This ZIP file contains the information about the current data integration flows on a logical level. As the logical state is exported as JSON files, the state is human readable.


There are several reasons to export an environment. For one, based on the export, the model can be checked into an enterprise version-control system. Furthermore, as the files are human readable, different versions can also be merged externally and reimported into another environment.

You can export either the local environment or a remote environment after connecting to it.
a) Open up the deployment module and click Export Local Environment. A ZIP file is generated automatically and offered for download.
b) Open up the deployment module and click Connect Environment. Once connected, press the export icon beside the connected environment to generate the state archive.


The export file is case sensitive. If it is unzipped on Mac/Windows and contains objects that differ only in casing (e.g. a staging table ID), it must be unzipped case-sensitively.

Connect environment


An environment can either be a second instance of the Datavault Builder (e.g. Dev/Test/Prod) or an exported logical state of the implementation.


The goal is to either connect to a remote Datavault Builder Instance or to upload a previously exported state for comparison.

  1. Open up the deployment module and click Connect Environment.

  2. From the environment tabs, select the environment you would like to compare to.

  3. Depending on the environment, specify the connection details:

    1. Remote environment: Specify a URL and login details.


      The URL must begin with http:// or https:// and should point to another Datavault Builder web GUI environment.

    2. File: Upload a previously exported state ZIP file or JSON files.

    3. Folder: Drag and drop the root folder of the previously exported, unzipped state archive.

    4. Empty model: Use this tab to either export or delete the local environment, depending on the arrow direction.

  1. Arrow

    Toggle the deployment direction by clicking the arrow.

  2. Environment Tab

    Select the type of environment to connect to.

  3. Environment Details

    Specify the login properties or upload a folder to connect to the second environment state.

Comparing environments


Once connected to a second environment, the compare view shows the differences and lets you build and execute custom deployment packages to roll out changes to the other environment.

  1. Open up the deployment module and connect a second environment.

  2. Go through the list and select the objects you would like to roll out.

  3. Initiate the deployment through the environment actions.

  1. Export Environment

    Download a state-archive from either of the two compared environments.

  2. Expand all

    Open up all categories and show contained elements.

  3. Environment actions

    From left to right:

    • Disconnect target environment

    • Switch deployment direction

    • Recompare both environments

    • Initiate deployment

  4. Filter List

    Apply a filter to compare only certain elements of the implementation.

  5. Deployment package options

    The operation column shows you what action is needed to bring the target environment's element into the same state as in the source environment. With the checkbox, you can select which elements should be included in the rollout. When you click the checkbox, a dependency check is performed that helps you roll out dependent objects.

  6. Diff Viewer

    When you click an object in the list, the diff viewer shows the state on both sides and, in case of differences, the changes in the middle part.
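Since the state files are plain JSON, a similar comparison can also be made outside the GUI, e.g. with jq and diff. The file names and contents below are illustrative; real files come from two exported state archives.

```shell
# Two versions of the same object from two environments (fabricated
# here so the sketch is self-contained).
set -e
cd "$(mktemp -d)"
mkdir dev prod
echo '{"hub_id":"h_customer","name":"Customer"}'     > dev/h_customer.json
echo '{"name":"Customer Hub","hub_id":"h_customer"}' > prod/h_customer.json

# Normalize key order with jq -S, then diff; a diff exit code of 1
# just means "differences found".
diff <(jq -S . dev/h_customer.json) <(jq -S . prod/h_customer.json) || true
```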

Deploying objects

  1. Deployment way

    Specify how to roll out the previously built deployment package.

  2. Deployment direction

    Reminder of the direction of deployment.

  3. Actions

    Start direct deployment / export into change script or cancel the process.

Direct Deployment


Instantly start the rollout of a previously defined deployment package to propagate changes onto an environment.

  1. Open up the deployment module and connect a second environment.

  2. Initiate the deployment and select the option for direct deployment.

Script based deployment


Script-based deployment is usually the way to deploy in enterprise environments, where specific deployment packages are created and tested before being rolled out to a productive environment. For the rollout, a package of API calls is generated, which can then be executed against a target environment.

  1. Open up the deployment module and connect a second environment.

  2. Compose a deployment package and initiate the deployment.

  3. Select the option to export a deployment script package.

    This exports a package named [date]_[time]_rollout_package (e.g. 20180803_135742_rollout_package).
    The package contains two files: one holds the environment parameters, the other the necessary API calls.
  4. Configure the files and set up the required tools to initiate the script-based rollout

    1. In the environment file (rollout.env), specify at least the rollout target and the login user details. If you are rolling out source systems, you can also update their parameters to match the target environment's needs.

    2. Install cURL and jq.

  5. Invoke the rollout by calling the script as described in the file (./ rollout.env)
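As a rough illustration of step 4.1, a rollout.env could contain entries along these lines. The variable names here are assumptions for illustration only; the generated file defines the actual keys.

```shell
# Hypothetical rollout.env — variable names are illustrative, the
# generated package defines the real ones.
TARGET_URL=https://prod.example.com
TARGET_USER=deploy_user
TARGET_PASSWORD=change_me
# Optionally override source system parameters here to match the
# target environment (e.g. production connection strings).
```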


Currently no error handling is in place. Therefore, you should test the rollout against a second environment first.
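Conceptually, the generated package replays the recorded API calls with cURL against the target. The sketch below only prints the commands instead of executing them; the endpoint path, payload, and variable names are assumptions, not the real generated script.

```shell
# Conceptual dry run of a script-based rollout: read the target from
# the environment file, then issue the recorded API calls with cURL.
set -e
TARGET_URL=https://prod.example.com   # normally sourced from rollout.env

api_call() {
  # Print the cURL command instead of executing it, so the sketch
  # runs without a live target environment.
  echo curl -sS -X POST "$TARGET_URL$1" \
       -H 'Content-Type: application/json' --data "$2"
}

# Hypothetical recorded call; the real package contains one call per
# deployed object.
api_call /api/hubs '{"hub_id":"h_customer"}'
```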

Database structure rollout

As the Datavault Builder has no separate metadata repository, all objects are created directly in the processing database. For the rollout, the database objects and structures can be moved between environments using schema-comparison tools.

By exporting the database structures, the current state of the implementation can also be versioned using standard version-control software such as SVN or Git.
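If the processing database is PostgreSQL, one option is a schema-only dump; the database name below is illustrative, and for MSSQL, EXASOL, or Oracle the equivalent schema-export tooling would be used instead.

```shell
# Sketch: schema-only export of the processing database for
# versioning. The database name is an illustrative assumption.
DB_NAME=datavaultbuilder
DUMP_CMD="pg_dump --schema-only --no-owner --file=structures.sql $DB_NAME"
echo "$DUMP_CMD"   # run against the processing DB, then commit structures.sql
```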

  1. Deploy the database structures

  2. Deploy the content of the tables in the dvb_config schema as well (the environment configuration is stored there)

    • config (adjust if necessary)

    • system_data & system_colors (adjust if necessary)

    • auth_users (adjust if necessary)

    • job_data, job_loads, job_schedules, job_sql_queries & job_triggers. At least the job _dvb_j_default has to be present on each environment.

    • source_types, source_type_parameters, source_type_parameter_groups, source_type_parameter_group_names, source_type_parameter_defaults: copy & paste as is.

  3. After the deployment, depending on the database type used (e.g. MSSQL, EXASOL, ORACLE), trigger a metadata refresh by executing the following query in the GUI's command line:

    SELECT dvb_core.f_refresh_all_base_views();

  4. If you skipped deploying parts of the dvb_config schema, it may be necessary to:

    • create new users

    • update system configs (DEV, TEST and PROD usually have different configs); this can also be done in the GUI

    • adjust jobs


  • When an object is dropped in the development environment and recreated with a different definition (e.g. different column data types), these changes cannot be deployed directly.