Warehouse Importer

Push any kind of table-like metadata that follows our API requirements.

Reach out to us to retrieve your API secret.

In the end, warehouses all look alike: they have tables, columns, users and queries. So we've defined a common format, and as long as you follow it, we'll load your metadata into Catalog.

You just need to fill in the 7 files below and push them to our endpoint using the Catalog Uploader.

CSV formatting

Here's an example of a very simple CSV file:
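(The file below is purely illustrative; the table names and descriptions are hypothetical, not part of the spec.)

```csv
id,table_name,description
tbl-1,customers,Customer master data
tbl-2,orders,"Orders placed, one row per order"
```

Note that the second description contains a comma, so it is quoted.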

Some fields, such as tags, are typed as list[string].

In that case, several formats are accepted:

- lists: "['a', 'b']"
- tuples: "('a', 'b')"
- sets: "{'a', 'b', 'c'}"

- Empty lists are allowed: []
- Singletons are allowed: 'a'
- Multiple types are allowed: "['foo', 100, 19.8]"
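As a sketch of how such a field could be read back, Python's `ast.literal_eval` accepts all of the formats above (this helper is illustrative and not part of the Catalog Uploader):

```python
import ast

def parse_list_field(raw: str) -> list:
    """Parse a list[string] CSV field that may be written as a list,
    tuple, set, a single quoted value, or an empty string (optional)."""
    raw = raw.strip()
    if not raw:
        return []
    value = ast.literal_eval(raw)  # handles [], (...), {...} and 'a'
    if isinstance(value, (list, tuple, set)):
        return list(value)
    return [value]  # singleton such as 'a'

print(parse_list_field("['a', 'b']"))        # ['a', 'b']
print(parse_list_field("('a', 'b')"))        # ['a', 'b']
print(parse_list_field("'a'"))               # ['a']
print(parse_list_field("[]"))                # []
print(parse_list_field("['foo', 100, 19.8]"))  # ['foo', 100, 19.8]
```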

Please keep in mind that fields containing commas must be quoted (see details below).

Forbidden characters

  • The column separator is the comma (,)

  • The row separator is the carriage return

Quoting

Most string fields (table names, column names, etc.) should not contain commas or carriage returns. The problem generally comes from large text fields, such as SQL queries or descriptions. If you have any doubts, you can quote all your text fields.
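For instance, Python's standard `csv` module can quote every field for you (the file and column names here are hypothetical):

```python
import csv
import io

buf = io.StringIO()
# QUOTE_ALL wraps every field in double quotes, so embedded
# commas and carriage returns are always safe.
writer = csv.writer(buf, quoting=csv.QUOTE_ALL)
writer.writerow(["id", "table_name", "description"])
writer.writerow(["tbl-2", "orders", "Orders placed, one row per order"])
print(buf.getvalue())
```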

Files

πŸ”‘ Primary Key (must be unique)

πŸ” Foreign Key (must reference an existing entry)

❓ Optional (empty string in the CSV)

1. Database

Fields

id string πŸ”‘

database_name string
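A minimal database file could look like this (the ids and names are hypothetical):

```csv
id,database_name
db-1,analytics
db-2,raw
```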

2. Schema

Fields

id string πŸ”‘

database_id string β†’ database.id πŸ”

schema_name string

description string ❓

tags list[string] ❓

3. Table

Fields

id string πŸ”‘

schema_id string β†’ schema.id πŸ”

table_name string

description string ❓

tags list[string] ❓

type enum: { TABLE | VIEW | EXTERNAL | TOPIC }

owner_external_id string β†’ user.id ❓
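Putting the table fields together, a sketch of a table file could look like this (all values hypothetical); note the quoted description, the quoted tags list, and the empty optional fields on the second row:

```csv
id,schema_id,table_name,description,tags,type,owner_external_id
tbl-1,sch-1,orders,"All orders, one row each","['finance', 'core']",TABLE,user-7
tbl-2,sch-1,daily_sales,,[],VIEW,
```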

4. Column

Fields

id string πŸ”‘

table_id string β†’ table.id πŸ”

column_name string

description string ❓

data_type enum: { BOOLEAN | INTEGER | FLOAT | STRING | ... | CUSTOM }

ordinal_position positive integer ❓

5. Query

If you do not want to fill in the Query file, you can upload it with no data (the file itself is still required).

Fields

query_id string β†’ query.id

database_id string β†’ database.id πŸ”

database_name string β†’ database.name

schema_name string β†’ schema.name

query_text string

user_id string β†’ user.id πŸ”

user_name string β†’ user.name

start_time timestamp

end_time timestamp ❓

6. View DDL

If you do not want to fill in the View DDL file, you can upload it with no data (the file itself is still required).

Fields

database_name string

schema_name string

view_name string

view_definition string

7. User

If you do not want to fill in the User file, you can upload it with no data (the file itself is still required).

Fields

id string πŸ”‘

email string ❓

first_name string ❓

last_name string ❓
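Before pushing the files, it can help to check the foreign-key constraints (πŸ”) yourself. The sketch below is an illustrative helper, not part of the Catalog Uploader; the file and column names are the ones from this page:

```python
import csv

def load_ids(path: str, key: str = "id") -> set:
    """Collect the primary-key column of a CSV file into a set."""
    with open(path, newline="") as f:
        return {row[key] for row in csv.DictReader(f)}

def check_foreign_keys(child_path: str, fk_column: str, parent_ids: set) -> list:
    """Return the rows whose foreign key does not reference an existing entry."""
    with open(child_path, newline="") as f:
        return [row for row in csv.DictReader(f)
                if row[fk_column] not in parent_ids]

# e.g. every schema.database_id must exist in database.csv:
# broken = check_foreign_keys("schema.csv", "database_id", load_ids("database.csv"))
```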

Lineage

We compute lineage for your integration by analysing and parsing the Queries and View DDL when possible.

Alternatively, you can fill in the following lineage mappings for Tables and/or Columns and we will ingest them during each update.

1. Table Lineage

Fields

parent_path string πŸ”‘: path of the parent table

child_path string πŸ”‘: path of the child table

2. Column Lineage

Fields

parent_path string πŸ”‘: path of the parent column

child_path string πŸ”‘: path of the child column
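As an illustration, assuming a dot-separated database.schema.table path (the exact path format is an assumption here; check it with us before uploading), a table-lineage file could look like:

```csv
parent_path,child_path
analytics.public.orders,analytics.public.daily_sales
```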
