Preparing target database
To start the setup of the Datavault Builder, some objects (mainly tables for configuration and logging, plus metadata views) need to be installed on the target database.
When using a separate, non-containerized version of the database, this can be achieved by starting up the containerized version of the client database once, connecting into the container, and executing the install scripts.
Postgres, MS SQL, Oracle, Exasol
Run a helper container and connect to its shell using Bash or Windows PowerShell:
docker run -it --rm --entrypoint=/bin/bash --mount type=bind,ro=true,src=$(pwd)/secrets/authenticator_password.txt,dst=/run/secrets/authenticator_password docker.datavault-builder.com/datavaultbuilder/clientdb_<target_db>:7.4.3.0
Hint
where <target_db> is mssql, oracle, postgres or exasol.
The secret file for the authenticator user under secrets/authenticator_password.txt must exist.
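For example, to prepare the secret file and start the Postgres variant of the helper container (a minimal sketch; the password value is a placeholder, substitute your own):
# create the secret file expected by the container (printf avoids a trailing newline)
mkdir -p secrets
printf '%s' 'my_secret_password' > secrets/authenticator_password.txt
# start the helper container and open a shell inside it
docker run -it --rm --entrypoint=/bin/bash --mount type=bind,ro=true,src=$(pwd)/secrets/authenticator_password.txt,dst=/run/secrets/authenticator_password docker.datavault-builder.com/datavaultbuilder/clientdb_postgres:7.4.3.0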
Change into the folder containing the source code:
cd /dvb_sql/
Execute the shell script which deploys the source code. The script invocation varies by database; hypothetical sample calls for Postgres, Exasol and Oracle follow the list below:
- Postgres
./create_db.sh <host> <port> <database-name> <admin-user> <admin-password>
- MS SQL
Hint
If the named database does not exist, it will be created.
Make sure to either specify an instance_name or a port, but not both!
./create_db.sh <host>[/<instance_name>|,<port>] <sys-user> <password> <database-name>
- Sample
./create_db.sh internal.domain.com,1433 sysusername password123 dvb_warehouse
- Exasol
./create_db.sh <host>:<port> <sys-user> <password>
- Oracle
Hint
The sys user must have the sysdba role for this script to work.
./create_db.sh '<sys-user>/"<password>"@"<host>:<port>/<service>.<domain>" as sysdba'
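As with the MS SQL sample above, here are hypothetical sample calls for the other databases; all host names, ports, service names, users and passwords are placeholders:
# Postgres: host, port, database name, admin user and admin password as separate arguments
./create_db.sh internal.domain.com 5432 dvb_warehouse postgres password123
# Exasol: host and port are passed as a single argument
./create_db.sh internal.domain.com:8563 sys password123
# Oracle: the whole connect string is passed as one quoted argument
./create_db.sh 'sys/"password123"@"internal.domain.com:1521/ORCLPDB1.domain.com" as sysdba'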
Snowflake
Install the Snowflake Python module:
python -m pip install --upgrade pip
pip install --upgrade snowflake-connector-python
Execute the Python script to install the DVB structures:
python .\snowflake_by_schema_install.py -d <db_name> -w <warehouse_name> -a <snowflake_account> -u <username> -p "<password>" [ --dvb-connection-password "<authenticator_password>" ]
Hint
The Python script is available in the install package zip file, which you will receive from us. Make sure to use the correct casing for database, warehouse and username.
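A hypothetical sample call; the database, warehouse, account, username and both passwords below are placeholders:
python .\snowflake_by_schema_install.py -d DVB_WAREHOUSE -w COMPUTE_WH -a xy12345.eu-central-1 -u DVB_ADMIN -p "password123" --dvb-connection-password "authenticator123"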
Database-Specific Limitations
- Oracle:
Does not support transactions for DDL statements. Therefore, structures may remain in an inconsistent state when an error occurs.
Loading of CLOBs and BLOBs into the Data Vault is currently not supported.
Data Viewer / Data Preview does not (fully) support tables/columns with CLOBs and BLOBs.
- Snowflake:
Does not support transactions for DDL statements. Therefore, structures may remain in an inconsistent state when an error occurs.
Missing dependency metadata restricts data lineage calculation.
- Azure Synapse:
No field lengths longer than 8000 bytes, as large data types cannot participate in a columnstore index.
No transactions: everything runs in auto-commit mode.
TimeTZ is not loadable.
No foreign keys on Synapse, hence no validation of referential integrity.
- Snowflake Source Connector:
Limitations when processing timestamps, due to a handling issue in the JDBC driver.