Create your CSV

CSV Uploads only work for custom integrations that can map their resources to Tables and Columns.

Generate a CSV of your data that contains one row for every column in your database, in the following format (a header row followed by example rows):

col_name,col_description,col_type,col_sort_order,database,schema,name,description,is_view,last_updated_time
id,ID of the user,int,1,secoda,public,users,the table of all users,false,None
name,name of the user,char,2,secoda,public,users,the table of all users,false,None

  • col_name -> The name of the column (ex. event_name)

    • ❗Required

    • Column names can be made up of numbers, letters, and any special characters except periods (ex. event.name 👎).

    • ℹī¸ Column names will be made lower case when brought into Secoda

  • col_description -> The description of the column

    • Column descriptions can be made up of numbers, letters, and any special characters.

    • ℹī¸ There is no additional validation or transformation on this field. The way you input it into the CSV is the way it will show in the UI.

  • col_type -> The type of data in the column (ex. character varying(25))

    • Column types can be made up of numbers, letters, and any special characters.

    • ℹī¸ There is no additional validation or transformation on this field. The way you input it into the CSV is the way it will show in the UI.

  • col_sort_order -> The default order that the columns should be sorted in

    • Sort order must be an integer.

    • ℹī¸ If sort order is not important, you can set the field to 0 and it will default to alphabetical order.

  • name -> The name of the table the column belongs to

    • ❗Required

    • Table names can be made up of numbers, letters, and any special characters except periods.

  • description -> The description for the table the column belongs to.

    • Table descriptions can be made up of numbers, letters, and any special characters.

    • ℹī¸ There is no additional validation or transformation on this field. The way you input it into the CSV is the way it will show in the UI.

  • database -> The name of the database the column and table belong to.

    • ❗Required

    • Database names can be made up of numbers, letters, and any special characters except periods.

  • schema -> The name of the schema the column and table belong to.

    • ❗Required

    • Schema names can be made up of numbers, letters, and any special characters except periods.

  • is_view -> Indicates whether the table is a view (generated from a query).

    • Is View is a boolean value expecting either true or false.

  • last_updated_time -> The timestamp of when the table was last updated.

    • Last updated time expects an integer Unix epoch value (seconds since 1970-01-01 UTC); see the conversion example after this list.
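
If you are building the CSV programmatically, a minimal Python sketch for producing that epoch value looks like this (the date used here is only a made-up example):

from datetime import datetime, timezone

# Hypothetical example: a table last altered on 2024-03-01 09:30 UTC.
last_altered = datetime(2024, 3, 1, 9, 30, tzinfo=timezone.utc)
last_updated_time = int(last_altered.timestamp())
print(last_updated_time)  # 1709285400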

For rows which contain columns belonging to the same table, the is_view, last_updated_time, name, and description fields should all be identical.

Ex. If there are 5 columns all belonging to the table TestTable, each of those columns' rows in the CSV would have a name of TestTable and an identical description.
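
As an illustration only (not something Secoda provides), here is a minimal Python sketch that writes a CSV in this layout using the standard csv module. The file name, example values, and timestamp are made up; the table-level fields are repeated on every row, col_name is lower-cased, and col_sort_order can be set to 0 when order does not matter.

import csv

# Table-level values: repeated on every row that belongs to the same table.
table = {
    "database": "secoda",
    "schema": "public",
    "name": "users",
    "description": "the table of all users",
    "is_view": "false",
    "last_updated_time": 1709285400,  # integer Unix epoch seconds (see the conversion example above)
}

# Column-level values: one entry per column in the table.
columns = [
    {"col_name": "ID", "col_description": "ID of the user", "col_type": "int", "col_sort_order": 1},
    {"col_name": "Name", "col_description": "name of the user", "col_type": "char", "col_sort_order": 2},
]

fieldnames = [
    "col_name", "col_description", "col_type", "col_sort_order",
    "database", "schema", "name", "description", "is_view", "last_updated_time",
]

with open("secoda_upload.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fieldnames)
    writer.writeheader()
    for col in columns:
        row = {**table, **col}
        # Secoda lower-cases column names on import; doing it here keeps the file predictable.
        row["col_name"] = row["col_name"].lower()
        writer.writerow(row)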

For example, with Snowflake, this CSV can be generated with a SQL statement like the one below. Replace <YOUR_DATABASE> with the name of your database, and adjust the schema filters in the WHERE clause to match your environment.

SELECT DISTINCT
    lower(c.column_name)                                    AS col_name,
    c.comment                                               AS col_description,
    lower(c.data_type)                                      AS col_type,
    c.ordinal_position                                      AS col_sort_order,
    lower(c.table_catalog)                                  AS database,
    lower(c.table_schema)                                   AS schema,
    lower(c.table_name)                                     AS name,
    t.comment                                               AS description,
    decode(lower(t.table_type), 'view', 'true', 'false')    AS is_view,
    DATE_PART(EPOCH, t.last_altered)                        AS last_updated_time
FROM <YOUR_DATABASE>.INFORMATION_SCHEMA.COLUMNS AS c
LEFT JOIN <YOUR_DATABASE>.INFORMATION_SCHEMA.TABLES t
    ON c.TABLE_NAME = t.TABLE_NAME
    AND c.TABLE_SCHEMA = t.TABLE_SCHEMA
WHERE c.TABLE_SCHEMA not in ('INFORMATION_SCHEMA')          -- add any schemas you want to exclude
    AND c.TABLE_SCHEMA not like 'STAGE_%'                   -- the remaining filters are examples; adjust them to your environment
    AND c.TABLE_SCHEMA not like 'HIST_%'
    AND c.TABLE_SCHEMA not like 'SNAP_%'
    AND lower(c.TABLE_SCHEMA) not like 'dbt_cloud_pr%'
    AND lower(c.TABLE_SCHEMA) not like 'ci_impact_resilience_%'
    AND lower(c.COLUMN_NAME) not like 'dw_%';
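
If you would rather script the export end to end, here is a minimal sketch assuming the snowflake-connector-python package and your own credentials; the account, user, password, database, and file name below are placeholders, not values from Secoda.

import csv
import snowflake.connector

# Placeholders: fill in your own account identifier, credentials, and database.
conn = snowflake.connector.connect(
    account="<YOUR_ACCOUNT>",
    user="<YOUR_USER>",
    password="<YOUR_PASSWORD>",
    database="<YOUR_DATABASE>",
)

query = """SELECT DISTINCT ..."""  # paste the full SELECT statement shown above

try:
    cur = conn.cursor()
    cur.execute(query)
    with open("secoda_upload.csv", "w", newline="") as f:
        writer = csv.writer(f)
        # Snowflake returns unquoted aliases in upper case; lower them to match the expected header row.
        writer.writerow([desc[0].lower() for desc in cur.description])
        writer.writerows(cur.fetchall())
finally:
    conn.close()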

Feel free to reach out to us for more guidance on how to generate CSVs.

At this point, you cannot add custom properties or tags using the CSV upload. However, you can add custom properties and tags after the initial extraction has been done using the Import/Export feature.
