Pass the Snowflake SnowPro Advanced Certification DEA-C01 Questions and answers with CertsForce

Question # 1:

A large table with 200 columns contains two years of historical data. When queried, the table is filtered on a single day. Below is the Query Profile:

[Query Profile image not included]

Using a size 2XL virtual warehouse, this query took over an hour to complete.

What will improve the query performance the MOST?

Options:

A.

Increase the size of the virtual warehouse.


B.

Increase the number of clusters in the virtual warehouse


C.

Implement the search optimization service on the table


D.

Add a date column as a cluster key on the table


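For readers who want to try the two table-level options hands-on, below is a minimal Snowpark Python sketch contrasting a clustering key with the search optimization service. The table SALES_HISTORY, the column SALE_DATE, and the connection placeholders are all hypothetical.

    from snowflake.snowpark import Session

    # Hypothetical connection parameters -- replace with real credentials.
    session = Session.builder.configs({"account": "...", "user": "...", "password": "..."}).create()

    # Option D: cluster the table on the date column so a single-day filter
    # can prune micro-partitions instead of scanning two years of data.
    session.sql("ALTER TABLE SALES_HISTORY CLUSTER BY (SALE_DATE)").collect()

    # Option C: alternatively, enable the search optimization service,
    # which targets highly selective point lookups.
    session.sql("ALTER TABLE SALES_HISTORY ADD SEARCH OPTIMIZATION").collect()
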
Question # 2:

A company has an extensive script in Scala that transforms data by leveraging DataFrames. A Data Engineer needs to move these transformations to Snowpark.

Which characteristics of data transformations in Snowpark should be considered to meet this requirement? (Select TWO)

Options:

A.

It is possible to join multiple tables using DataFrames.


B.

Snowpark operations are executed lazily on the server.


C.

User-Defined Functions (UDFs) are not pushed down to Snowflake


D.

Snowpark requires a separate cluster outside of Snowflake for computations


E.

Columns in different DataFrames with the same name should be referred to with square brackets.


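As an illustration of the behaviors the options describe, the sketch below joins two hypothetical tables (ORDERS and CUSTOMERS) and shows that nothing executes until an action such as collect() is called.

    from snowflake.snowpark import Session

    # Hypothetical connection parameters -- replace with real credentials.
    session = Session.builder.configs({"account": "...", "user": "...", "password": "..."}).create()

    orders = session.table("ORDERS")          # hypothetical table
    customers = session.table("CUSTOMERS")    # hypothetical table

    # DataFrames can be joined (option A); this only builds a query plan.
    joined = orders.join(customers, orders["CUSTOMER_ID"] == customers["ID"])
    recent = joined.filter(orders["ORDER_DATE"] >= "2023-01-01")

    # Snowpark evaluates lazily (option B): the SQL is pushed down to
    # Snowflake and executed only when an action like collect() is called.
    rows = recent.collect()
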
Question # 3:

Which Snowflake objects does the Snowflake Kafka connector use? (Select THREE).

Options:

A.

Pipe


B.

Serverless task


C.

Internal user stage


D.

Internal table stage


E.

Internal named stage


F.

Storage integration


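If you want to see for yourself which objects a connector has provisioned, one approach is to list the stages and pipes in the connector's schema after it has run. A sketch, assuming a hypothetical schema MY_DB.KAFKA_SCHEMA:

    from snowflake.snowpark import Session

    # Hypothetical connection parameters -- replace with real credentials.
    session = Session.builder.configs({"account": "...", "user": "...", "password": "..."}).create()

    # List the stages and pipes in the schema the connector writes to.
    for row in session.sql("SHOW STAGES IN SCHEMA MY_DB.KAFKA_SCHEMA").collect():
        print(row)
    for row in session.sql("SHOW PIPES IN SCHEMA MY_DB.KAFKA_SCHEMA").collect():
        print(row)
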
Question # 4:

What kind of Snowflake integration is required when defining an external function in Snowflake?

Options:

A.

API integration


B.

HTTP integration


C.

Notification integration


D.

Security integration


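To make the integration requirement concrete, here is a minimal sketch of defining an external function. The integration name, role ARN, and endpoint URL are placeholders; the example assumes AWS API Gateway as the proxy service.

    from snowflake.snowpark import Session

    # Hypothetical connection parameters -- replace with real credentials.
    session = Session.builder.configs({"account": "...", "user": "...", "password": "..."}).create()

    # An external function requires an API integration describing the proxy
    # service in front of the remote code (ARN and URL are placeholders).
    session.sql("""
        CREATE OR REPLACE API INTEGRATION my_api_int
          API_PROVIDER = aws_api_gateway
          API_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/my-role'
          API_ALLOWED_PREFIXES = ('https://abc123.execute-api.us-west-2.amazonaws.com/prod/')
          ENABLED = TRUE
    """).collect()

    session.sql("""
        CREATE OR REPLACE EXTERNAL FUNCTION remote_echo(msg VARCHAR)
          RETURNS VARIANT
          API_INTEGRATION = my_api_int
          AS 'https://abc123.execute-api.us-west-2.amazonaws.com/prod/echo'
    """).collect()
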
Question # 5:

Company A and Company B both have Snowflake accounts. Company A's account is hosted on a different cloud provider and region than Company B's account. Companies A and B are not in the same Snowflake organization.

How can Company A share data with Company B? (Select TWO).

Options:

A.

Create a share within Company A's account and add Company B's account as a recipient of that share.


B.

Create a share within Company A's account, and create a reader account that is a recipient of the share. Grant Company B access to the reader account.


C.

Use database replication to replicate Company A's data into Company B's account. Create a share within Company B's account and grant users within Company B's account access to the share.


D.

Create a new account within Company A's organization in the same cloud provider and region as Company B's account. Use database replication to replicate Company A's data to the new account. Create a share within the new account and add Company B's account as a recipient of that share.


E.

Create a separate database within Company A's account to contain only those data sets they wish to share with Company B. Create a share within Company A's account and add all the objects within this separate database to the share. Add Company B's account as a recipient of the share.


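For orientation, the sketch below shows the share-side DDL that several of these options rely on. The database, share, and account identifiers are all placeholders.

    from snowflake.snowpark import Session

    # Hypothetical connection parameters -- replace with real credentials.
    session = Session.builder.configs({"account": "...", "user": "...", "password": "..."}).create()

    # Create a share and grant it access to a database of shareable objects.
    session.sql("CREATE SHARE IF NOT EXISTS partner_share").collect()
    session.sql("GRANT USAGE ON DATABASE shared_db TO SHARE partner_share").collect()
    session.sql("GRANT USAGE ON SCHEMA shared_db.public TO SHARE partner_share").collect()
    session.sql("GRANT SELECT ON ALL TABLES IN SCHEMA shared_db.public TO SHARE partner_share").collect()

    # Add the consumer account (organization and account names are placeholders).
    session.sql("ALTER SHARE partner_share ADD ACCOUNTS = other_org.company_b").collect()
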
Question # 6:

A Data Engineer ran a stored procedure containing various transactions. During the execution, the session abruptly disconnected, preventing one transaction from committing or rolling back. The transaction was left in a detached state and created a lock on resources.

What action must the Engineer take to immediately run a new transaction?

Options:

A.

Call the system function SYSTEM$ABORT_TRANSACTION.


B.

Call the system function SYSTEM$CANCEL_TRANSACTION.


C.

Set the LOCK_TIMEOUT to FALSE in the stored procedure.


D.

Set the TRANSACTION_ABORT_ON_ERROR parameter to TRUE in the stored procedure.


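A sketch of the remediation flow, assuming the detached transaction's id has been identified from the lock listing (the id below is a placeholder):

    from snowflake.snowpark import Session

    # Hypothetical connection parameters -- replace with real credentials.
    session = Session.builder.configs({"account": "...", "user": "...", "password": "..."}).create()

    # Inspect current locks to find the transaction holding the resource.
    for row in session.sql("SHOW LOCKS IN ACCOUNT").collect():
        print(row)

    # Abort the detached transaction by its id (placeholder value shown).
    session.sql("SELECT SYSTEM$ABORT_TRANSACTION(1234567890123456789)").collect()
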
Question # 7:

A Data Engineer has developed a dashboard that will issue the same SQL SELECT statement to Snowflake every 12 hours.

How long will Snowflake use the persisted query results from the result cache, provided that the underlying data has not changed?

Options:

A.

12 hours


B.

24 hours


C.

14 days


D.

31 days


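When testing whether a query is being served from persisted results, the result cache can be switched off for a single session. A minimal sketch, with a placeholder table name:

    from snowflake.snowpark import Session

    # Hypothetical connection parameters -- replace with real credentials.
    session = Session.builder.configs({"account": "...", "user": "...", "password": "..."}).create()

    # Disable reuse of persisted results for this session only.
    session.sql("ALTER SESSION SET USE_CACHED_RESULT = FALSE").collect()
    session.sql("SELECT COUNT(*) FROM MY_TABLE").collect()   # forced to recompute

    # Restore the default behavior.
    session.sql("ALTER SESSION SET USE_CACHED_RESULT = TRUE").collect()
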
Question # 8:

A CSV file around 1 TB in size is generated daily on an on-premises server. A corresponding table, internal stage, and file format have already been created in Snowflake to facilitate the data loading process.

How can the process of bringing the CSV file into Snowflake be automated using the LEAST amount of operational overhead?

Options:

A.

Create a task in Snowflake that executes once a day and runs a COPY INTO statement that references the internal stage. The internal stage will read the files directly from the on-premises server and copy the newest file into the Snowflake table.


B.

On the on-premises server, schedule a SQL file to run using SnowSQL that executes a PUT to push a specific file to the internal stage. Create a task in Snowflake that executes once a day and runs a COPY INTO statement that references the internal stage. Schedule the task to start after the file lands in the internal stage.


C.

On the on-premises server, schedule a SQL file to run using SnowSQL that executes a PUT to push a specific file to the internal stage. Create a pipe that runs a COPY INTO statement that references the internal stage. Snowpipe auto-ingest will automatically load the file from the internal stage when the new file lands in the internal stage.


D.

On the on-premises server, schedule a Python file that uses the Snowpark Python library. The Python script will read the CSV data into a DataFrame and generate an INSERT INTO statement that will load directly into the table. The script will bypass the need to move a file into an internal stage.


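Several options revolve around the PUT-then-COPY pattern. Snowpark Python wraps the PUT command in session.file.put(), so a minimal sketch of one daily load looks like this (the file, stage, table, and file format names are placeholders):

    from snowflake.snowpark import Session

    # Hypothetical connection parameters -- replace with real credentials.
    session = Session.builder.configs({"account": "...", "user": "...", "password": "..."}).create()

    # Push the local CSV to the internal stage (wraps the PUT command).
    session.file.put("/data/daily_export.csv", "@my_internal_stage", auto_compress=True)

    # Load the staged file into the target table.
    session.sql("""
        COPY INTO my_table
        FROM @my_internal_stage
        FILE_FORMAT = (FORMAT_NAME = my_csv_format)
    """).collect()
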
Question # 9:

Which methods can be used to create a DataFrame object in Snowpark? (Select THREE)

Options:

A.

session.jdbc_connection()


B.

session.read.json()


C.

session.table()


D.

DataFrame.write()


E.

session.builder()


F.

session.sql()


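For reference, the sketch below exercises a few DataFrame-creation paths in Snowpark Python; the table and stage names are placeholders.

    from snowflake.snowpark import Session

    # Hypothetical connection parameters -- replace with real credentials.
    session = Session.builder.configs({"account": "...", "user": "...", "password": "..."}).create()

    df_table = session.table("MY_TABLE")                      # from an existing table
    df_sql = session.sql("SELECT CURRENT_TIMESTAMP() AS TS")  # from arbitrary SQL
    df_local = session.create_dataframe([(1, "a"), (2, "b")], schema=["id", "val"])  # from local values
    df_json = session.read.json("@my_stage/events.json")      # from a staged JSON file
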
Question # 10:

Which functions will compute a 'fingerprint' over an entire table, query result, or window to quickly detect changes to table contents or query results? (Select TWO).

Options:

A.

HASH(*)


B.

HASH_AGG(*)


C.

HASH_AGG(&lt;expr&gt;, &lt;expr&gt;)


D.

HASH_AGG_COMPARE(*)


E.

HASH_COMPARE(*)


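A minimal sketch of the fingerprinting idea, assuming a hypothetical table MY_TABLE: compute an order-insensitive aggregate hash over all rows, then compare it later to detect changes.

    from snowflake.snowpark import Session

    # Hypothetical connection parameters -- replace with real credentials.
    session = Session.builder.configs({"account": "...", "user": "...", "password": "..."}).create()

    def table_fingerprint() -> int:
        # HASH_AGG(*) aggregates a single hash value over every row.
        return session.sql("SELECT HASH_AGG(*) FROM MY_TABLE").collect()[0][0]

    before = table_fingerprint()
    # ... some time later ...
    after = table_fingerprint()
    if before != after:
        print("Table contents changed")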