For certain data sources, the JDBC drivers needed are no longer included in the Composer installation package. You must provide your own JDBC driver for the data sources listed in the table in this topic.
The following Composer connectors are distributed with a JDBC driver, but you can download and install newer versions using the information in this topic: Snowflake.
This approach gives you the flexibility to add a specific JDBC driver that meets your licensing, support, or operational needs. As a result, before you can connect to these data sources and visualize their data in Composer, you must first download and install a JDBC driver.
If the JDBC driver for the Composer connector is not configured, the connector server will not start and the connector cannot be enabled within Composer. See Manage Connectors and Connector Servers.
To use any of these connectors, perform the following steps to install the required JDBC driver after you have successfully installed the Composer microservices:
Download the required driver from the vendor's site to the corresponding Composer instance. Place the driver in the following folder:
<install_path>\lib\edc-<connector_name>\
For example, if Composer is installed in the default path c:\logi-composer, place the MySQL driver libraries in this folder:
c:\logi-composer\lib\edc-mysql\
In Windows environments, the truststore path passed as part of the JDBC URL must contain only forward slashes (/).
If the folder does not exist, create it in the location mentioned above. See the following table for links to the vendors' JDBC drivers. Make sure that the Composer administrator has read-level access rights to the JDBC driver (JAR) file.
| Connector | Link to JDBC Driver | License Type | Supported Version |
| --- | --- | --- | --- |
| Amazon Redshift | http://docs.aws.amazon.com/redshift/latest/mgmt/configure-jdbc-connection.html#download-jdbc-driver | Commercial | 220.127.116.117 |
| Dremio | https://www.dremio.com/drivers/ | LGPL | 4.1 |
| MemSQL | https://dev.mysql.com/downloads/connector/j/ | LGPL | 8.0.13 |
| MySQL | https://dev.mysql.com/downloads/connector/j/ | LGPL | 8.0.13 |
| Oracle | https://www.oracle.com/database/technologies/appdev/jdbc-ucp-183-downloads.html | Commercial | 18.104.22.168 |
| SAP Hana | https://developers.sap.com/trials-downloads.html | Commercial | 2.0 |
| SAP IQ | https://www.sap.com/index.html | Commercial | 16 |
| Snowflake | https://repo1.maven.org/maven2/net/snowflake/snowflake-jdbc/ | LGPL | |
| Teradata | https://downloads.teradata.com/download/connectivity/jdbc-driver | Commercial | 15.00.00.30 |
| Vertica | https://www.vertica.com/client-drivers/ | Commercial | All |
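The download-and-place steps above can be sketched as a short shell session for the MySQL connector. This is a minimal sketch, not the official procedure: the install path and JAR name are illustrative, and `touch` stands in for downloading the real vendor JAR (run it from a scratch directory):

```shell
# Illustrative values -- adjust for your environment.
COMPOSER_HOME="./logi-composer"            # e.g. /opt/logi-composer or c:\logi-composer
JAR="mysql-connector-java-8.0.13.jar"      # driver JAR obtained from the vendor site
DEST="$COMPOSER_HOME/lib/edc-mysql"

touch "$JAR"                               # stand-in for the actual vendor download
mkdir -p "$DEST"                           # create the connector folder if it is missing
cp "$JAR" "$DEST/"                         # place the driver JAR in the connector folder
chmod 644 "$DEST/$JAR"                     # ensure the Composer admin has read access
```

The connector server only picks up the JAR after the corresponding connector service is restarted, which the steps below walk through.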
On Linux platforms, use the following command to open the connector property file:
vi /etc/zoomdata/edc-<connector_name>.properties
If you are not logged in as a root user, enter:
sudo vi /etc/zoomdata/edc-<connector_name>.properties
If the properties file does not exist, this command creates it.
On Windows platforms, open the property file using a text editor that can edit Windows property files.
In either case, replace <connector_name> with the name of the connector you are configuring:
| Connector | Connector Property File Name |
| --- | --- |
| Amazon Redshift | edc-redshift.properties |
| MemSQL | edc-memsql.properties |
| Microsoft SQL Server | edc-mssql.properties |
| MySQL | edc-mysql.properties |
| Oracle | edc-oracle.properties |
| SAP Hana | edc-saphana.properties |
| SAP IQ | edc-sapiq.properties |
| Snowflake | edc-snowflake.properties |
| Teradata | edc-teradata.properties |
| Vertica | edc-vertica.properties |
In the edc-<connector_name>.properties file, add the following property:
If you need to add multiple paths, use a comma-separated list:
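For illustration only, a multiple-path entry in edc-mysql.properties might look like the fragment below. The property key `driver.path` is a placeholder, not the actual Composer property name; substitute the property named in the step above and your real JAR locations:

```properties
# Hypothetical property key -- substitute the actual Composer property name.
driver.path=/opt/drivers/mysql-connector-java-8.0.13.jar,/opt/drivers/second-driver.jar
```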
Save your changes to the properties file.
Restart the corresponding connector by running the appropriate command:
- For CentOS 7 and Ubuntu 18 or 20:
systemctl restart zoomdata-edc-<connector_name>
- For Windows:
PS C:\> Restart-Service -Name zoomdata-edc-<connector_name>
Log in as the supervisor, access the Connectors page, and verify that the connector is enabled so it appears in the data source list. After the JDBC driver has been configured and the connector has been enabled, users with the correct access privileges can use the connector to connect to the data store in a data source configuration.