
Snowflake Exam ARA-R01 Topic 2 Question 20 Discussion

Actual exam question for Snowflake's ARA-R01 exam
Question #: 20
Topic #: 2

A company's Architect needs to find an efficient way to get data from an external partner, who is also a Snowflake user. The current solution is based on daily JSON extracts that are placed on an FTP server and uploaded to Snowflake manually. The files are changed several times each month, and the ingestion process needs to be adapted to accommodate these changes.

What would be the MOST efficient solution?

Suggested Answer: D

Using Snowpipe for continuous, automated data ingestion minimizes the need for manual intervention and ensures that data is available in Snowflake promptly after it is generated. Leveraging Snowflake's data sharing capabilities allows for efficient and secure access to the partner's data without the need for complex API integrations. Materialized views provide pre-aggregated data for fast access, which is ideal for dashboards that require high performance.

References:

* Snowflake Documentation on Snowpipe

* Snowflake Documentation on Secure Data Sharing

* Best Practices for Data Ingestion with Snowflake
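
For context, the data sharing and Snowpipe approaches described in the explanation come down to a few SQL statements. The sketch below is illustrative only and uses hypothetical account and object names (partner_acct, consumer_acct, source_db.public.daily_extract, partner_share, partner_db, json_stage, raw_json); none of these appear in the exam question.

```sql
-- Provider side (run by the partner account): expose the table through a secure share.
CREATE SHARE partner_share;
GRANT USAGE ON DATABASE source_db TO SHARE partner_share;
GRANT USAGE ON SCHEMA source_db.public TO SHARE partner_share;
GRANT SELECT ON TABLE source_db.public.daily_extract TO SHARE partner_share;
ALTER SHARE partner_share ADD ACCOUNTS = consumer_acct;

-- Consumer side (this company): mount the share as a read-only database and query it.
-- No FTP transfers, file uploads, or COPY jobs are involved, and changes the partner
-- makes to the shared data are visible immediately.
CREATE DATABASE partner_db FROM SHARE partner_acct.partner_share;
SELECT * FROM partner_db.public.daily_extract LIMIT 10;

-- If file-based ingestion were still required, Snowpipe can automate loads from an
-- external stage instead of manual uploads (raw_json assumed to be a single-VARIANT-column table):
CREATE PIPE ingest_partner_json AUTO_INGEST = TRUE AS
  COPY INTO raw_json
  FROM @json_stage
  FILE_FORMAT = (TYPE = 'JSON');
```

Because a share exposes the partner's live tables directly, the FTP hand-off and the manual ingestion pipeline disappear entirely, which is why a direct share is generally the most efficient option when both parties are already Snowflake customers.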


Contribute your Thoughts:

Nieves
12 days ago
I'm not sure about option B. I think option A could also work well if the partner is willing to create a share for us.
upvoted 0 times
...
Lottie
15 days ago
I agree with Lashawna. Using the data lake export feature would definitely improve efficiency and reduce manual work.
upvoted 0 times
...
Lashawna
16 days ago
I think option B would be the most efficient solution. It would streamline the process and make it easier for Snowflake to ingest the data.
upvoted 0 times
...
