
Data Warehouse for Bold Desk Using Bold ETL


This article outlines the steps to import Bold Desk data into a data warehouse or database for analytical purposes using Bold ETL. Bold Desk stores its data in PostgreSQL in a normalized format, so the data must be denormalized to support analytical queries and derive meaningful insights.
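As a minimal sketch of what denormalization means here, the following example joins two normalized tables into one flat table. The table and column names (agents, tickets, agent_id) are hypothetical, and SQLite stands in for PostgreSQL purely for illustration; the pattern is the same.

```python
import sqlite3

# Hypothetical normalized schema: tickets reference agents by id.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE agents (id INTEGER PRIMARY KEY, name TEXT)")
con.execute("CREATE TABLE tickets (id INTEGER, agent_id INTEGER, subject TEXT)")
con.execute("INSERT INTO agents VALUES (1, 'Alice'), (2, 'Bob')")
con.execute("INSERT INTO tickets VALUES (10, 1, 'Login issue'), (11, 2, 'Billing')")

# Denormalize: inline the agent name so analytical queries need no join.
con.execute("""
    CREATE TABLE tickets_flat AS
    SELECT t.id, t.subject, a.name AS agent_name
    FROM tickets t JOIN agents a ON a.id = t.agent_id
""")
flat = list(con.execute("SELECT id, subject, agent_name FROM tickets_flat ORDER BY id"))
```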

Step 1: Click the Bold ETL icon to open the Bold ETL site in a new tab.

select-bold-etl.png

etl-page.png

Step 2: Click Add Project in the left-side panel. Enter the project name and click the tick icon.

etl-add-project.png

project-test.png

Step 3: Click on the Project name. This will open the YAML editor where you can configure the source and destination connector settings.

click-test-project.png

Step 4: Click PostgreSQL in the left-side panel and add the template in the right-side panel to insert the sample configuration into the YAML editor.
Create the project's source YAML configuration as follows.

version: 1
encrypt_credentials: false
plugins:
  extractors:
    - name: PostgreSQL
      connectorname: PostgreSQL
      config:
        host: serverip
        port: 5432
        username: username
        database: databasename
        password: password
        drivername: postgresql+pg8000
      select:
        - tablename1
        - tablename2
      metadata:
        tablename1:
          replication_method: FULL_TABLE
          replication_key: lastmodifiedon
          replication_value: 2024-01-01 00:00:00
          interval_type: month
          interval_value: 1
        tablename2:
          replication_method: FULL_TABLE
          replication_key: lastmodifiedon
          replication_value: 2024-01-01 00:00:00
          interval_type: month
          interval_value: 1

where interval_type can be minutes, hours, days, month, or year.
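To illustrate how replication_value, interval_type, and interval_value presumably combine, the sketch below advances a replication cursor window by window. The advance helper is hypothetical (not part of Bold ETL); it only models the windowing described in the configuration above.

```python
from datetime import datetime, timedelta

def advance(start: datetime, interval_type: str, interval_value: int) -> datetime:
    """Return the start of the next replication window (illustrative only)."""
    if interval_type == "month":
        # timedelta has no months; roll year/month manually.
        month0 = start.month - 1 + interval_value
        return start.replace(year=start.year + month0 // 12, month=month0 % 12 + 1)
    if interval_type == "year":
        return start.replace(year=start.year + interval_value)
    return start + timedelta(**{interval_type: interval_value})

# With the sample config (replication_value: 2024-01-01, month / 1),
# successive runs cover January, then February, and so on.
cursor = datetime(2024, 1, 1)
windows = []
for _ in range(3):
    nxt = advance(cursor, "month", 1)
    windows.append((cursor, nxt))
    cursor = nxt
```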

Step 5: Save the project, and click Schedules in the left-side panel.
Step 6: On the Schedules page, select the Run Now option in the context menu of the data source grid.

schedule-run-now.png

Step 7: Logs are available in the Logs tab. Click the project name in the left-side panel and switch to the Logs tab.
table-created-warehouse.png

Step 8: Go to the Transformation tab, click Add Table, and enter the name of the table to move into the destination database.
transformation-tab.png

support-ticket-detail-table.png

Step 9: Tables are moved into a destination schema named after the project. For example, if the project name is test, the tables are moved into the test schema.
Transformation Preview

Step 10: Enter SQL scripts to apply transformations to the existing tables and write the results into a new table in the destination database.

sql-query-editor.png
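The kind of transformation entered in the SQL editor typically materializes a query result as a new table. The sketch below demonstrates that pattern with hypothetical table and column names, using SQLite in place of PostgreSQL purely so the example is self-contained.

```python
import sqlite3

# Hypothetical source table of support tickets.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE tickets (id INTEGER, status TEXT, created_on TEXT)")
con.executemany("INSERT INTO tickets VALUES (?, ?, ?)",
                [(1, "Open", "2024-01-05"), (2, "Closed", "2024-01-06"),
                 (3, "Open", "2024-01-07")])

# A transformation script of the kind entered in the editor:
# aggregate tickets by status into a new summary table.
con.execute("""
    CREATE TABLE ticket_status_summary AS
    SELECT status, COUNT(*) AS ticket_count
    FROM tickets
    GROUP BY status
""")
rows = dict(con.execute("SELECT status, ticket_count FROM ticket_status_summary"))
```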

Step 11: Click Save. Then, go to the Schedules page and select the Schedule option in the context menu of the data source grid. Configure the schedule and click Run.
schedule-copy-table.png

Step 12: On the configured schedule, data is transferred from the source tables to the destination database at the specified schedule interval. For instance, with the schedule interval set to every 5 minutes, the transfer runs every 5 minutes.

schedule-interval.png

Step 13: The first schedule run fetches the records from Jan 1, 2024, to Jan 31, 2024; the next run fetches the records from Feb 1, 2024, to Feb 29, 2024; and so on.

Step 14: Once the full data migration is complete, disable the schedule. You can then change the replication method from FULL_TABLE to INCREMENTAL, as in the following configuration.

version: 1
encrypt_credentials: false
plugins:
  extractors:
    - name: PostgreSQL
      connectorname: PostgreSQL
      config:
        host: serverip
        port: 5432
        username: username
        database: databasename
        password: password
        drivername: postgresql+pg8000
      select:
        - tablename1
        - tablename2
      metadata:
        tablename1:
          replication_method: INCREMENTAL
          replication_key: last_modified_on
          replication_value: 2024-03-13 00:00:00
        tablename2:
          replication_method: INCREMENTAL
          replication_key: last_modified_on
          replication_value: 2024-03-13 00:00:00
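Conceptually, an INCREMENTAL run fetches only rows whose replication_key is newer than the stored replication_value cursor. The helper below is a hypothetical illustration of the query such a run presumably issues; it is not Bold ETL's actual implementation, and the table and column names are placeholders.

```python
def build_incremental_query(table: str, replication_key: str,
                            replication_value: str) -> str:
    """Sketch of an incremental extraction query: select only rows
    modified after the stored cursor, ordered so the cursor can be
    advanced to the newest value seen."""
    return (f"SELECT * FROM {table} "
            f"WHERE {replication_key} > '{replication_value}' "
            f"ORDER BY {replication_key}")

query = build_incremental_query("tablename1", "last_modified_on",
                                "2024-03-13 00:00:00")
```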

Step 15: Bold ETL will generate a data source entry on the Bold Reports Data Sources page. Users can then create a report using the data source from Bold ETL.

support-ticket-datasource.png

Attached are the sample YAML file and transformation scripts.

Attachments
