Snowflake Command

The Snowflake Command activity uses the PUT command to upload data files from a local file system to an internal stage, and the COPY INTO <table> command to load the data from the staged files into an existing table.

Settings

The Settings tab has the following fields.

Field Required Description
Snowflake Connection Yes The name of the Snowflake connection.
Command Yes

You can manage the staging of files using the following commands:

PUT: Uploads data files from a local file system to a named internal stage, a table stage, or your user stage.
COPY INTO: Loads the data from the staged files into an existing table.
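
Example: Assuming a named stage my_stage and a target table mytable (both names are illustrative), a typical load runs the two commands in sequence:

PUT file:///home/<user>/Snowflake/mycsv.csv @my_stage;
COPY INTO mytable FROM @my_stage;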

Input

If you select the PUT command on the Settings tab, the Input tab displays the following fields.

Field Required Description
fileName Yes

Specify the URI of the file as a string.

Example: /home/<user>/Snowflake/mycsv.csv

internalStage Yes

Specify the internal stage location to which the files must be uploaded.

Examples:

User Stage: "@~/my_stage"
Table Stage: "@%mytable"
Named Stage: "@my_stage"
parallel No Specify the number of threads to be used to upload the files.
auto_compress No Specify whether Snowflake compresses the files in the GZIP format before they are uploaded.
source_compress No Specify the compression method that was already used on the files to be staged.
overwrite No Specify whether Snowflake must overwrite an existing file with the same name during the upload.

For more information, see the PUT section under File Staging in the Snowflake documentation.
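
As an illustration of how these fields map to the underlying command, the following is a sketch of the PUT statement the activity would run for the example values above; the stage name and option values are illustrative:

PUT file:///home/<user>/Snowflake/mycsv.csv @my_stage
  PARALLEL = 4
  AUTO_COMPRESS = TRUE
  OVERWRITE = FALSE;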

If you select the COPY INTO command on the Settings tab, the Input tab has the following fields.

Field Required Description
tableName Yes

Specify the name of the table into which the data must be loaded.

stageName Yes

Specify the name of the internal stage from which the data must be loaded.

Examples:

User Stage: "@~/my_stage"
Table Stage: "@%mytable"
Named Stage: "@my_stage"
files No Specify the list of files to be loaded.
pattern No Specify a regular expression pattern that matches the names of the files to be loaded.
format_Name No Specify the named file format to be used to load the data into the table.
type No Specify the format type of the files to be loaded into the table.
formatTypeOptions No

Specify the format type options that correspond to the file type specified. Provide the required parameters as a JSON-formatted string.

Example: "{\"field_delimiter\": \",\", \"compression\": \"GZIP\"}"

copyOptions No

Specify one or more of the following copy options:

on_error: Specify the error handling action to be performed when errors are encountered during loading.
size_limit: Specify the maximum size, in bytes, of data to be loaded.
purge: Specify whether the data files must be automatically deleted from the internal stage after the data is loaded successfully.
return_failed_only: Specify whether to return only the files that failed to load.
match_by_column_name: Specify whether to load semi-structured data into columns in the target table that match corresponding columns in the data.
enforce_length: Specify whether strings that exceed the target column length cause an error instead of being truncated.
truncatecolumns: Specify whether strings that exceed the target column length are truncated. This option is the inverse of the enforce_length option.
force: Specify whether to load all the files, regardless of whether they were loaded earlier.
load_uncertain_files: Specify whether to load files whose load status is unknown.
validation_mode No Specify a validation mode to validate the data files instead of loading them into the table.

For more information, see the COPY INTO section under Loading and Unloading in the Snowflake documentation.
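
As an illustration, the following is a sketch of a COPY INTO statement built from these fields; the table name, stage name, file name, and option values are illustrative:

COPY INTO mytable
  FROM @my_stage
  FILES = ('mycsv.csv.gz')
  FILE_FORMAT = (TYPE = CSV FIELD_DELIMITER = ',' COMPRESSION = GZIP)
  ON_ERROR = 'SKIP_FILE'
  PURGE = TRUE;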

Output

The Output tab displays the output schema of the activity as a tree structure. The output is read-only. The properties that are displayed in the schema correspond to the output of this activity and can be used as input by subsequent activities in the flow.

If you select the PUT command on the Settings tab, the Output tab has the following fields.

Field Description
message Specifies additional information or messages related to the upload operation.
source Specifies the source file path from where the data was uploaded.
source_compression Indicates the compression type of the source file.
source_size Specifies the size of the source file in bytes.
status Indicates the status of the upload operation. For example, "UPLOADED" means that the upload operation was successful.
target Specifies the name of the target file created after the upload operation.
target_compression Indicates the compression type of the target file.
target_size Specifies the size of the target file in bytes.
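
For illustration only, a single PUT output record might look like the following; all values are placeholders, not actual results:

{
  "source": "mycsv.csv",
  "target": "mycsv.csv.gz",
  "source_size": 1024,
  "target_size": 320,
  "source_compression": "NONE",
  "target_compression": "GZIP",
  "status": "UPLOADED",
  "message": ""
}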

If you select the COPY INTO command on the Settings tab, the Output tab has the following fields.

Field Description
file Specifies the file from which the data was loaded.
status Indicates the status of the data loading process. For example, "LOADED" means that the data was successfully loaded.
rows_parsed Indicates the number of rows that are parsed from the input file.
rows_loaded Indicates the number of rows that are successfully loaded into the target table.
error_limit Specifies the maximum number of errors allowed during the data loading process.
errors_seen Indicates the number of errors that are encountered during the data loading process.
first_error Provides the details of the first error encountered during the data loading process.
first_error_line Specifies the line number in the input file where the first error occurred.
first_error_character Specifies the character position in the input file where the first error occurred.
first_error_column_name Specifies the name of the column where the first error occurred.
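
For illustration only, a single COPY INTO output record might look like the following; all values are placeholders, not actual results:

{
  "file": "my_stage/mycsv.csv.gz",
  "status": "LOADED",
  "rows_parsed": 100,
  "rows_loaded": 100,
  "error_limit": 1,
  "errors_seen": 0,
  "first_error": null,
  "first_error_line": null,
  "first_error_character": null,
  "first_error_column_name": null
}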

Loop

If you want this activity to iterate multiple times within the flow, enter an expression that evaluates to the iteration details. Select a type of iteration from the Type menu. The default type is None, which means the activity does not iterate. For more information, see the "Using the Loop Feature in an Activity" topic in the TIBCO Cloud™ Integration documentation.