
Pass the Snowflake SnowPro Advanced: Architect ARA-C01 exam with CertsForce questions and answers

Question # 31:

An Architect has a table called leader_follower that contains a single column named JSON. The table has one row with the following structure:

{
  "activities": [
    { "activityNumber": 1, "winner": 5 },
    { "activityNumber": 2, "winner": 4 }
  ],
  "follower": {
    "name": { "default": "Matt" },
    "number": 4
  },
  "leader": {
    "name": { "default": "Adam" },
    "number": 5
  }
}

Which query will produce the following results?

ACTIVITY_NUMBER   WINNER_NAME
1                 Adam
2                 Matt

Options:

A.

SELECT lf.json:activities.activityNumber AS activity_number,
       IFF(
           lf.json:activities.activityNumber = lf.json:leader.number,
           lf.json:leader.name.default,
           lf.json:follower.name.default
       )::VARCHAR
FROM leader_follower lf;

B.

SELECT value:activityNumber AS activity_number,
       IFF(
           value:winner = lf.json:leader.number,
           lf.json:leader.name.default,
           lf.json:follower.name.default
       )::VARCHAR AS winner_name
FROM leader_follower lf,
     LATERAL FLATTEN(input => json:activities) p;

C.

SELECT value:activityNumber AS activity_number,
       IFF(
           value:winner = lf.json:leader.number,
           lf.json:leader,
           lf.json:follower
       )::VARCHAR AS winner_name
FROM leader_follower lf,
     LATERAL FLATTEN(input => json:activities) p;
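
For context, here is a minimal sketch (not one of the answer choices) of what LATERAL FLATTEN emits for the activities array in this table; the aliases are illustrative only:

-- Sketch: each array element becomes one row; p.value holds that element.
SELECT p.index,
       p.value:activityNumber::NUMBER AS activity_number,
       p.value:winner::NUMBER         AS winner
FROM leader_follower lf,
     LATERAL FLATTEN(input => lf.json:activities) p;

Matching value:winner against json:leader.number or json:follower.number is what the options above differ on.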


Question # 32:

An Architect runs the following SQL query:

[The SQL query is shown as an image in the original and is not reproduced here.]

How can this query be interpreted?

Options:

A.

FILEROWS is a stage. FILE_ROW_NUMBER is the line number in the file.


B.

FILEROWS is the table. FILE_ROW_NUMBER is the line number in the table.


C.

FILEROWS is a file. FILE_ROW_NUMBER is the file format location.


D.

FILEROWS is the file format location. FILE_ROW_NUMBER is a stage.
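
Because the original query is shown only as an image, the sketch below is a hedged illustration of the general pattern the options refer to: selecting file metadata while querying a stage. The stage and file format names are hypothetical.

-- Sketch: METADATA$FILE_ROW_NUMBER is the row number within each staged file.
SELECT METADATA$FILENAME,
       METADATA$FILE_ROW_NUMBER,
       t.$1,
       t.$2
FROM @my_stage (FILE_FORMAT => 'my_csv_format') t;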


Question # 33:

A Snowflake Architect is designing a multi-tenant application strategy for an organization in the Snowflake Data Cloud and is considering using an Account Per Tenant strategy.

Which requirements will be addressed with this approach? (Choose two.)

Options:

A.

There needs to be fewer objects per tenant.


B.

Security and Role-Based Access Control (RBAC) policies must be simple to configure.


C.

Compute costs must be optimized.


D.

Tenant data shape may be unique per tenant.


E.

Storage costs must be optimized.
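
As a hedged illustration of the Account Per Tenant strategy (not part of the original question), each tenant is provisioned as its own account, typically by ORGADMIN; all names and values below are hypothetical.

USE ROLE ORGADMIN;

-- Sketch: one account per tenant, each with its own RBAC setup and,
-- if needed, its own unique data shape.
CREATE ACCOUNT tenant_acme
  ADMIN_NAME     = acme_admin
  ADMIN_PASSWORD = 'REPLACE_ME'
  EMAIL          = 'admin@acme.example'
  EDITION        = ENTERPRISE;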


Question # 34:

Which security, governance, and data protection features require, at a MINIMUM, the Business Critical edition of Snowflake? (Choose two.)

Options:

A.

Extended Time Travel (up to 90 days)


B.

Customer-managed encryption keys through Tri-Secret Secure


C.

Periodic rekeying of encrypted data


D.

AWS, Azure, or Google Cloud private connectivity to Snowflake


E.

Federated authentication and SSO
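
As a hedged aside (no claim is made here about which options are correct), periodic rekeying, one of the listed features, is turned on through an account parameter:

-- Sketch: enable periodic rekeying of encrypted data (run as ACCOUNTADMIN).
ALTER ACCOUNT SET PERIODIC_DATA_REKEYING = TRUE;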


Question # 35:

You are a Snowflake Architect in an organization. The business team has asked you to deploy a use case that requires loading some data which they can visualize through Tableau. Every day new data comes in, and the old data is no longer required.

What type of table will you use in this case to optimize cost?

Options:

A.

TRANSIENT


B.

TEMPORARY


C.

PERMANENT
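
For reference, a hedged sketch (hypothetical names) showing how table type and data retention are declared at creation time, which is what drives storage cost in this scenario:

-- Sketch: a transient table has no Fail-safe period, and its Time Travel
-- retention can be set as low as 0 days.
CREATE OR REPLACE TRANSIENT TABLE daily_dashboard_feed (
    event_date DATE,
    payload    VARIANT
)
DATA_RETENTION_TIME_IN_DAYS = 0;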


Question # 36:

An Architect has a design where files arrive every 10 minutes and are loaded into a primary database table using Snowpipe. A secondary database is refreshed every hour with the latest data from the primary database.

Based on this scenario, what Time Travel query options are available on the secondary database?

Options:

A.

A query using Time Travel in the secondary database is available for every hourly table version within the retention window.


B.

A query using Time Travel in the secondary database is available for every hourly table version within and outside the retention window.


C.

Using Time Travel, secondary database users can query every iterative version within each hour (the individual Snowpipe loads) in the retention window.


D.

Using Time Travel, secondary database users can query every iterative version within each hour (the individual Snowpipe loads) and outside the retention window.
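
For reference, a hedged sketch of Time Travel query syntax against a hypothetical table; which historical versions such a query can actually reach on the secondary database is what the options above distinguish:

-- Sketch: query the table as it was one hour ago, or at a specific timestamp.
SELECT * FROM my_table AT (OFFSET => -3600);

SELECT * FROM my_table
  AT (TIMESTAMP => '2024-06-01 09:00:00'::TIMESTAMP_LTZ);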


Question # 37:

Assuming all Snowflake accounts are using an Enterprise edition or higher, in which development and testing scenarios would copying of data be required and zero-copy cloning not be suitable? (Select TWO.)

Options:

A.

Developers create their own datasets to work against transformed versions of the live data.


B.

Production and development run in different databases in the same account, and Developers need to see production-like data but with specific columns masked.


C.

Data is in a production Snowflake account that needs to be provided to Developers in a separate development/testing Snowflake account in the same cloud region.


D.

Developers create their own copies of a standard test database previously created for them in the development account, for their initial development and unit testing.


E.

The release process requires pre-production testing of changes with data of production scale and complexity. For security reasons, pre-production also runs in the production account.
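
For reference, a hedged sketch (hypothetical names) of zero-copy cloning, which operates on objects within a single account:

-- Sketch: a developer's instant copy of a standard test database; the clone
-- initially shares the source's storage rather than duplicating it.
CREATE DATABASE dev_alice CLONE standard_test_db;

-- A clone cannot target a different account; moving production data into a
-- separate account requires sharing, replication, or physically copying it.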


Question # 38:

An Architect needs to design a data unloading strategy for Snowflake that will be used with the COPY INTO command.

Which configuration is valid?

Options:

A.

Location of files: Snowflake internal location. File formats: CSV, XML. File encoding: UTF-8. Encryption: 128-bit


B.

Location of files: Amazon S3. File formats: CSV, JSON. File encoding: Latin-1 (ISO-8859). Encryption: 128-bit


C.

Location of files: Google Cloud Storage. File formats: Parquet. File encoding: UTF-8. Compression: gzip


D.

Location of files: Azure ADLS. File formats: JSON, XML, Avro, Parquet, ORC. Compression: bzip2. Encryption: User-supplied key
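
For reference, a hedged sketch (hypothetical stage and table names) of a COPY INTO <location> unload; the valid combinations of location, format, encoding, compression, and encryption are what this question tests:

-- Sketch: unload query results to a named stage as gzip-compressed CSV.
COPY INTO @my_unload_stage/reports/
FROM my_table
FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)
HEADER = TRUE
OVERWRITE = TRUE;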


Question # 39:

When loading data into a table that captures the load time in a column with a default value of either CURRENT_TIME() or CURRENT_TIMESTAMP(), what will occur?

Options:

A.

All rows loaded using a specific COPY statement will have varying timestamps based on when the rows were inserted.


B.

Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were read from the source.


C.

Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were created in the source.


D.

All rows loaded using a specific COPY statement will have the same timestamp value.
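
For reference, a hedged sketch (hypothetical names) of a table whose load-time column is filled by a default expression during a COPY:

-- Sketch: load_ts is omitted from the column list, so its DEFAULT expression
-- supplies the value for every loaded row.
CREATE OR REPLACE TABLE raw_events (
    payload VARIANT,
    load_ts TIMESTAMP_LTZ DEFAULT CURRENT_TIMESTAMP()
);

COPY INTO raw_events (payload)
FROM (SELECT $1 FROM @my_stage/events/)
FILE_FORMAT = (TYPE = JSON);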


Question # 40:

The following DDL command was used to create a task based on a stream:

[The task DDL is shown as an image in the original and is not reproduced here.]

Assuming MY_WH is set to AUTO_SUSPEND = 60 and is used exclusively for this task, which statement is true?

Options:

A.

The warehouse MY_WH will be made active every five minutes to check the stream.


B.

The warehouse MY_WH will only be active when there are results in the stream.


C.

The warehouse MY_WH will never suspend.


D.

The warehouse MY_WH will automatically resize to accommodate the size of the stream.
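
Because the DDL is shown only as an image, the sketch below is a hedged illustration of what a stream-driven task typically looks like; object names other than MY_WH (which the question itself references) are hypothetical:

-- Sketch: the task wakes on its schedule, but the WHEN condition is evaluated
-- by cloud services, and the task body (which resumes MY_WH) runs only when
-- the stream contains data.
CREATE OR REPLACE TASK process_my_stream
  WAREHOUSE = MY_WH
  SCHEDULE  = '5 MINUTE'
WHEN
  SYSTEM$STREAM_HAS_DATA('MY_STREAM')
AS
  INSERT INTO target_table
  SELECT * FROM my_stream;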

