Pass the Snowflake SnowPro Advanced: Architect ARA-R01 Questions and Answers with CertsForce

Viewing page 1 out of 5 pages
Viewing questions 1-10
Question #1:

A user has the appropriate privilege to see unmasked data in a column.

If the user loads this column data into another column that does not have a masking policy, what will occur?

Options:

A.

Unmasked data will be loaded in the new column.


B.

Masked data will be loaded into the new column.


C.

Unmasked data will be loaded into the new column but only users with the appropriate privileges will be able to see the unmasked data.


D.

Unmasked data will be loaded into the new column and no users will be able to see the unmasked data.


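The behavior this question probes can be sketched with a hypothetical policy (table, column, and role names are invented for illustration):

```sql
-- Minimal masking-policy sketch; all names are hypothetical.
CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() = 'PII_READER' THEN val ELSE '***MASKED***' END;

ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask;

-- A role that passes the policy sees plaintext, so a CTAS run under that
-- role writes plaintext into the unprotected target column.
CREATE TABLE customers_copy AS SELECT email FROM customers;
```

Note that `customers_copy.email` has no policy attached, so whatever was written there is visible to any role with SELECT on the new table.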
Question #2:

Data is being imported and stored as JSON in a VARIANT column. Query performance was initially fine, but poor query performance has recently been reported.

What could be causing this?

Options:

A.

There were JSON nulls in the recent data imports.


B.

The order of the keys in the JSON was changed.


C.

The recent data imports contained fewer fields than usual.


D.

There were variations in string lengths for the JSON values in the recent data imports.


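As background: Snowflake extracts typed subcolumns from VARIANT data to enable pruning, and JSON nulls mixed into a path can prevent that extraction. One common mitigation, shown here as an illustration rather than as the graded answer, is to strip JSON nulls at load time (table and stage names are hypothetical):

```sql
-- STRIP_NULL_VALUES converts JSON nulls to SQL NULLs during loading,
-- which helps Snowflake keep extracting typed subcolumns for pruning.
CREATE FILE FORMAT json_fmt TYPE = JSON STRIP_NULL_VALUES = TRUE;

COPY INTO raw_events (payload)   -- hypothetical VARIANT target column
FROM @events_stage
FILE_FORMAT = (FORMAT_NAME = 'json_fmt');
```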
Question #3:

An Architect is designing a file ingestion recovery solution. The project will use an internal named stage for file storage. Currently, in the case of an ingestion failure, the Operations team must manually download the failed file and check for errors.

Which downloading method should the Architect recommend that requires the LEAST amount of operational overhead?

Options:

A.

Use the Snowflake Connector for Python, connect to remote storage and download the file.


B.

Use the get command in SnowSQL to retrieve the file.


C.

Use the get command in Snowsight to retrieve the file.


D.

Use the Snowflake API endpoint and download the file.


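For reference, the GET command runs from a SnowSQL session (it is not supported in Snowsight worksheets); the stage and path names below are hypothetical:

```sql
-- Download a failed file from an internal named stage to a local directory.
GET @ingest_stage/failed/orders_20240101.csv file:///tmp/recovered/;
```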
Question #4:

A company is trying to ingest 10 TB of CSV data into a Snowflake table using Snowpipe as part of its migration from a legacy database platform. The records need to be ingested in the MOST performant and cost-effective way.

How can these requirements be met?

Options:

A.

Use ON_ERROR = CONTINUE in the COPY INTO command.


B.

Use PURGE = TRUE in the COPY INTO command.


C.

Use PURGE = FALSE in the COPY INTO command.


D.

Use ON_ERROR = SKIP_FILE in the COPY INTO command.


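The copy options named in these answers slot into a COPY INTO statement like the following sketch (table, stage, and file format details are hypothetical):

```sql
COPY INTO legacy_data
FROM @migration_stage
FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
PURGE = TRUE          -- remove staged files after a successful load
ON_ERROR = CONTINUE;  -- alternatives include SKIP_FILE and ABORT_STATEMENT
```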
Question #5:

A healthcare company wants to share data with a medical institute. The institute is running a Standard edition of Snowflake; the healthcare company is running a Business Critical edition.

How can this data be shared?

Options:

A.

The healthcare company will need to change the institute’s Snowflake edition in the accounts panel.


B.

By default, sharing is supported from a Business Critical Snowflake edition to a Standard edition.


C.

Contact Snowflake and they will execute the share request for the healthcare company.


D.

Set the share_restriction parameter on the shared object to false.


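For context, the SHARE_RESTRICTIONS parameter comes into play when a Business Critical provider adds a non-Business Critical consumer account to a share; the share and account names below are invented:

```sql
-- Allow a Standard-edition consumer account to be added to a share
-- owned by a Business Critical provider account.
ALTER SHARE market_share ADD ACCOUNTS = medical_institute
  SHARE_RESTRICTIONS = FALSE;
```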
Question #6:

An Architect needs to improve the performance of reports that pull data from multiple Snowflake tables, join, and then aggregate the data. Users access the reports using several dashboards. There are performance issues on Monday mornings between 9:00am and 11:00am, when many users check the sales reports.

The size of the group has increased from 4 to 8 users, and waiting times to refresh the dashboards have increased significantly. Currently this workload is served by a virtual warehouse with the following parameters:

AUTO_RESUME = TRUE
AUTO_SUSPEND = 60
SIZE = Medium

What is the MOST cost-effective way to increase the availability of the reports?

Options:

A.

Use materialized views and pre-calculate the data.


B.

Increase the warehouse to size Large and set AUTO_SUSPEND = 600.


C.

Use a multi-cluster warehouse in maximized mode with 2 size Medium clusters.


D.

Use a multi-cluster warehouse in auto-scale mode with 1 size Medium cluster, and set MIN_CLUSTER_COUNT = 1 and MAX_CLUSTER_COUNT = 4.


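The auto-scale configuration described in option D might look like the following sketch (the warehouse name is hypothetical):

```sql
ALTER WAREHOUSE report_wh SET
  WAREHOUSE_SIZE = MEDIUM
  MIN_CLUSTER_COUNT = 1     -- scale in to a single cluster when demand is low
  MAX_CLUSTER_COUNT = 4     -- scale out under Monday-morning concurrency
  SCALING_POLICY = 'STANDARD'
  AUTO_SUSPEND = 60
  AUTO_RESUME = TRUE;
```

In auto-scale mode the warehouse only runs (and bills for) the clusters the concurrent query load actually requires, which is why it is often contrasted with simply resizing a single warehouse.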
Question #7:

A retailer's enterprise data organization is exploring the use of Data Vault 2.0 to model its data lake solution. A Snowflake Architect has been asked to provide recommendations for using Data Vault 2.0 on Snowflake.

What should the Architect tell the data organization? (Select TWO).

Options:

A.

Change data capture can be performed using the Data Vault 2.0 HASH_DIFF concept.


B.

Change data capture can be performed using the Data Vault 2.0 HASH_DELTA concept.


C.

Using the multi-table insert feature in Snowflake, multiple Point-in-Time (PIT) tables can be loaded in parallel from a single join query from the data vault.


D.

Using the multi-table insert feature, multiple Point-in-Time (PIT) tables can be loaded sequentially from a single join query from the data vault.


E.

There are performance challenges when using Snowflake to load multiple Point-in-Time (PIT) tables in parallel from a single join query from the data vault.


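Snowflake's multi-table INSERT is the feature referenced in options C and D; a minimal sketch with invented Data Vault table names:

```sql
-- One join query feeding two PIT tables in a single statement.
INSERT ALL
  INTO pit_customer (hub_key, snapshot_date, sat_key)
    VALUES (hub_key, snapshot_date, sat_key)
  INTO pit_customer_history (hub_key, snapshot_date, sat_key)
    VALUES (hub_key, snapshot_date, sat_key)
SELECT h.hub_key, s.snapshot_date, s.sat_key
FROM hub_customer h
JOIN sat_customer s ON s.hub_key = h.hub_key;
```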
Question #8:

Based on the Snowflake object hierarchy, what securable objects belong directly to a Snowflake account? (Select THREE).

Options:

A.

Database


B.

Schema


C.

Table


D.

Stage


E.

Role


F.

Warehouse


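The hierarchy in question shows up directly in the DDL: some objects are created at the account level with no database context, while others live inside a database and schema (all names below are hypothetical):

```sql
-- Account-level securable objects: no database context required.
CREATE DATABASE sales_db;
CREATE ROLE analyst_role;
CREATE WAREHOUSE analytics_wh WITH WAREHOUSE_SIZE = XSMALL;

-- Schema-level objects: contained in a database and schema.
CREATE SCHEMA sales_db.reporting;
CREATE TABLE sales_db.reporting.orders (id NUMBER);
CREATE STAGE sales_db.reporting.load_stage;
```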
Question #9:

A company has a Snowflake account named ACCOUNTA in AWS us-east-1 region. The company stores its marketing data in a Snowflake database named MARKET_DB. One of the company’s business partners has an account named PARTNERB in Azure East US 2 region. For marketing purposes the company has agreed to share the database MARKET_DB with the partner account.

Which of the following steps MUST be performed for the account PARTNERB to consume data from the MARKET_DB database?

Options:

A.

Create a new account (called AZABC123) in Azure East US 2 region. From account ACCOUNTA create a share of database MARKET_DB, create a new database out of this share locally in AWS us-east-1 region, and replicate this new database to AZABC123 account. Then set up data sharing to the PARTNERB account.


B.

From account ACCOUNTA create a share of database MARKET_DB, and create a new database out of this share locally in AWS us-east-1 region. Then make this database the provider and share it with the PARTNERB account.


C.

Create a new account (called AZABC123) in Azure East US 2 region. From account ACCOUNTA replicate the database MARKET_DB to AZABC123 and from this account set up the data sharing to the PARTNERB account.


D.

Create a share of database MARKET_DB, and create a new database out of this share locally in AWS us-east-1 region. Then replicate this database to the partner’s account PARTNERB.


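A cross-cloud replicate-then-share flow generally looks like the following sketch; the organization and account names are invented, and exact steps depend on how the accounts are set up:

```sql
-- On ACCOUNTA (AWS us-east-1): enable replication to the Azure account.
ALTER DATABASE market_db ENABLE REPLICATION TO ACCOUNTS myorg.azabc123;

-- On AZABC123 (Azure East US 2): create and refresh the secondary database.
CREATE DATABASE market_db AS REPLICA OF myorg.accounta.market_db;
ALTER DATABASE market_db REFRESH;

-- Then share from the Azure account to the partner in the same region.
CREATE SHARE market_share;
GRANT USAGE ON DATABASE market_db TO SHARE market_share;
GRANT USAGE ON SCHEMA market_db.public TO SHARE market_share;
ALTER SHARE market_share ADD ACCOUNTS = partnerb;
```

Direct shares only work within a region and cloud platform, which is why a replica in the consumer's region is needed at all.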
Question #10:

An Architect needs to design a Snowflake account and database strategy to store and analyze large amounts of structured and semi-structured data. There are many business units and departments within the company. The requirements are scalability, security, and cost efficiency.

What design should be used?

Options:

A.

Create a single Snowflake account and database for all data storage and analysis needs, regardless of data volume or complexity.


B.

Set up separate Snowflake accounts and databases for each department or business unit, to ensure data isolation and security.


C.

Use Snowflake's data lake functionality to store and analyze all data in a central location, without the need for structured schemas or indexes.


D.

Use a centralized Snowflake database for core business data, and use separate databases for departmental or project-specific data.

