A Data Engineer is working on a continuous data pipeline that receives data from Amazon Kinesis Firehose and loads it into a staging table that will later be used in the data transformation process. The average file size is 300-500 MB.
The Data Engineer needs to ensure that Snowpipe is performant while minimizing costs.
How can this be achieved?
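For context, below is a minimal sketch of how such a pipeline is typically wired up in Snowflake. All object names (raw_db, staging, firehose_stage, firehose_pipe, staging_table, firehose_s3_int) and the JSON file format are hypothetical placeholders, not part of the question. Note that Snowpipe billing includes a per-file overhead in addition to compute time, and Snowflake's documentation generally recommends staging files of roughly 100-250 MB compressed; with Kinesis Firehose, the delivered file size is usually tuned via its buffering size and interval settings.

```sql
-- Hypothetical names throughout; adjust to your own database, schema, and bucket.
-- External stage pointing at the S3 location that Kinesis Firehose delivers to.
CREATE OR REPLACE STAGE raw_db.staging.firehose_stage
  URL = 's3://example-firehose-bucket/landing/'
  STORAGE_INTEGRATION = firehose_s3_int;

-- Snowpipe that continuously copies newly arrived files into the staging table.
-- AUTO_INGEST = TRUE relies on S3 event notifications to trigger each load.
CREATE OR REPLACE PIPE raw_db.staging.firehose_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw_db.staging.staging_table
  FROM @raw_db.staging.firehose_stage
  FILE_FORMAT = (TYPE = 'JSON');
```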