
Pass the Snowflake SnowPro Advanced: Architect ARA-C01 Questions and Answers with CertsForce

Viewing page 5 out of 6 pages
Viewing questions 41-50
Question # 41:

A global company needs to securely share its sales and inventory data with a vendor using a Snowflake account.

The company has its Snowflake account in the AWS eu-west-2 Europe (London) region. The vendor's Snowflake account is on the Azure platform in the West Europe region. How should the company's Architect configure the data share?

Options:

A.

1. Create a share.
2. Add objects to the share.
3. Add a consumer account to the share for the vendor to access.


B.

1. Create a share.
2. Create a reader account for the vendor to use.
3. Add the reader account to the share.


C.

1. Create a new role called db_share.
2. Grant the db_share role privileges to read data from the company database and schema.
3. Create a user for the vendor.
4. Grant the db_share role to the vendor's users.


D.

1. Promote an existing database in the company's local account to primary.
2. Replicate the database to Snowflake on Azure in the West Europe region.
3. Create a share and add objects to the share.
4. Add a consumer account to the share for the vendor to access.
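
For reference, the classic Snowflake pattern for sharing across regions and cloud platforms is the replicate-then-share flow that option D outlines: replicate the database into an account on the consumer's cloud and region, then build an ordinary share there. A minimal sketch with hypothetical account and object names:

    -- In the AWS eu-west-2 (London) account: promote the local database
    -- to primary and enable replication to the company's Azure account
    ALTER DATABASE sales_db ENABLE REPLICATION TO ACCOUNTS myorg.azure_west_eu;

    -- In the Azure West Europe account: create and refresh the secondary
    CREATE DATABASE sales_db AS REPLICA OF myorg.london_acct.sales_db;
    ALTER DATABASE sales_db REFRESH;

    -- Still in the Azure account: create the share and add the vendor
    CREATE SHARE vendor_share;
    GRANT USAGE ON DATABASE sales_db TO SHARE vendor_share;
    GRANT USAGE ON SCHEMA sales_db.sales TO SHARE vendor_share;
    GRANT SELECT ON TABLE sales_db.sales.orders TO SHARE vendor_share;
    ALTER SHARE vendor_share ADD ACCOUNTS = vendor_acct;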


Question # 42:

An Architect is designing partitioned external tables for a Snowflake data lake. The data lake size may grow over time, and partition definitions may need to change in the future.

How can these requirements be met?

Options:

A.

Use the PARTITION BY () clause when creating the external table.


B.

Use partition_type = USER_SPECIFIED when creating the external table.


C.

Set METADATA$EXTERNAL_TABLE_PARTITION = MANUAL.


D.

Alter the table using ADD_PARTITION_COLUMN before defining a new partition column.
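
For reference, user-specified partitions decouple partition definitions from automatic metadata refresh and let partitions be added or redefined manually as the lake evolves. A minimal sketch of that approach, with hypothetical stage, table, and column names:

    -- External table whose partitions are registered manually
    CREATE EXTERNAL TABLE sensor_data (
        event_date DATE AS TO_DATE(SPLIT_PART(METADATA$FILENAME, '/', 2)))
      PARTITION BY (event_date)
      PARTITION_TYPE = USER_SPECIFIED
      LOCATION = @lake_stage/sensors/
      FILE_FORMAT = (TYPE = PARQUET)
      AUTO_REFRESH = FALSE;

    -- Partitions are then added (or later changed) explicitly
    ALTER EXTERNAL TABLE sensor_data
      ADD PARTITION (event_date = '2024-01-01') LOCATION '2024-01-01/';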


Question # 43:

[This question is based on an architecture diagram, and its five answer options (A-E) are images; neither the diagram nor the option images are reproduced here.]

Based on the architecture in the image, how can the data from DB1 be copied into TBL2? (Select TWO).


Question # 44:

A company wants to integrate its main enterprise identity provider with Snowflake using federated authentication.

The authentication integration has been configured and roles have been created in Snowflake. However, users are not automatically appearing in Snowflake when they are created, and their group membership is not reflected in their assigned roles.

How can the missing functionality be enabled with the LEAST amount of operational overhead?

Options:

A.

OAuth must be configured between the identity provider and Snowflake. Then the authorization server must be configured with the right mapping of users and roles.


B.

OAuth must be configured between the identity provider and Snowflake. Then the authorization server must be configured with the right mapping of users, and the resource server must be configured with the right mapping of role assignment.


C.

SCIM must be enabled between the identity provider and Snowflake. Once both are synchronized through SCIM, their groups will get created as group accounts in Snowflake and the proper roles can be granted.


D.

SCIM must be enabled between the identity provider and Snowflake. Once both are synchronized through SCIM, users will automatically be created and their group membership will be reflected as roles in Snowflake.
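
For reference, SCIM provisioning is enabled on the Snowflake side with a security integration; the identity provider then pushes users and groups through the SCIM API, with groups arriving as roles. A minimal sketch assuming an Azure AD-style setup (all names illustrative):

    -- Provisioner role the identity provider runs as
    CREATE ROLE IF NOT EXISTS aad_provisioner;
    GRANT CREATE USER ON ACCOUNT TO ROLE aad_provisioner;
    GRANT CREATE ROLE ON ACCOUNT TO ROLE aad_provisioner;

    -- SCIM integration: users and groups pushed by the IdP become
    -- Snowflake users and roles automatically
    CREATE SECURITY INTEGRATION aad_scim
      TYPE = SCIM
      SCIM_CLIENT = 'AZURE'
      RUN_AS_ROLE = 'AAD_PROVISIONER';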


Question # 45:

What is a characteristic of event notifications in Snowpipe?

Options:

A.

The load history is stored in the metadata of the target table.


B.

Notifications identify the cloud storage event and the actual data in the files.


C.

Snowflake can process all older notifications when a paused pipe is resumed.


D.

When a pipe is paused, event messages received for the pipe enter a limited retention period.
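
For context, Snowpipe's auto-ingest mechanism consumes cloud storage event notifications, and Snowflake retains event messages received while a pipe is paused only for a limited period (14 days) before discarding them. A minimal auto-ingest sketch with hypothetical names:

    -- Pipe that loads files as storage event notifications arrive
    CREATE PIPE raw_ingest
      AUTO_INGEST = TRUE
      AS
      COPY INTO raw_events FROM @landing_stage FILE_FORMAT = (TYPE = JSON);

    -- Resuming a paused pipe replays only messages still within retention
    ALTER PIPE raw_ingest SET PIPE_EXECUTION_PAUSED = FALSE;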


Question # 46:

A company is following the Data Mesh principles, including domain separation, and chose one Snowflake account for its data platform.

An Architect created two data domains to produce two data products. The Architect needs a third data domain that will use both of the data products to create an aggregate data product. The read access to the data products will be granted through a separate role.

Based on the Data Mesh principles, how should the third domain be configured to create the aggregate product if it has been granted the two read roles?

Options:

A.

Use secondary roles for all users.


B.

Create a hierarchy between the two read roles.


C.

Request a technical ETL user with the sysadmin role.


D.

Request that the two data domains share data using the Data Exchange.
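
For context, secondary roles activate all of a user's granted roles in a session at once, while object creation still comes from the primary role; that lets the third domain read through both product-read roles and write its aggregate under its own role. A minimal sketch with hypothetical names:

    -- Primary role owns new objects; secondary roles supply read access
    USE ROLE domain3_engineer;
    USE SECONDARY ROLES ALL;

    -- Aggregate product built from both upstream data products
    CREATE TABLE domain3_db.product.aggregate_sales AS
    SELECT o.region, SUM(o.amount) AS total_amount
    FROM domain1_db.product.orders o
    JOIN domain2_db.product.inventory i ON o.item_id = i.item_id
    GROUP BY o.region;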


Question # 47:

A company uses the COPY INTO <table> command with the following sequence:

• A file is staged on March 1

• The table is loaded on March 2

• On June 30, the company attempts to reload the same file into the same table, but the file is skipped

Which options can load the file? (Select TWO).

Options:

A.

Set the PURGE option to TRUE.


B.

Set the FORCE option to TRUE.


C.

Set the VALIDATION_MODE option to FALSE.


D.

Set the LOAD_UNCERTAIN_FILES option to TRUE.


E.

Set the ALLOW_DUPLICATE option to TRUE.
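
For context, COPY load history metadata expires after 64 days, so a file staged in March and retried at the end of June falls outside the window and is skipped as "uncertain". A sketch of the two copy options that can load it (hypothetical names):

    -- Reload regardless of load history; may duplicate already-loaded rows
    COPY INTO sales FROM @sales_stage FORCE = TRUE;

    -- Load files whose load status can no longer be determined
    -- (load history older than 64 days)
    COPY INTO sales FROM @sales_stage LOAD_UNCERTAIN_FILES = TRUE;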


Question # 48:

A company is designing a process for importing a large amount of IoT JSON data from cloud storage into Snowflake. New sets of IoT data are generated and uploaded approximately every 5 minutes.

Once the IoT data is in Snowflake, the company needs up-to-date information from an external vendor to join to the data. This data is then presented to users through a dashboard that shows different levels of aggregation. The external vendor is a Snowflake customer.

What solution will MINIMIZE complexity and MAXIMIZE performance?

Options:

A.

1. Create an external table over the JSON data in cloud storage.
2. Create a task that runs every 5 minutes to run a transformation procedure on new data, based on a saved timestamp.
3. Ask the vendor to expose an API so an external function can be used to generate a call to join the data back to the IoT data in the transformation procedure.
4. Give the transformed table access to the dashboard tool.
5. Perform the aggregations on the dashboard.


B.

1. Create an external table over the JSON data in cloud storage.
2. Create a task that runs every 5 minutes to run a transformation procedure on new data, based on a saved timestamp.
3. Ask the vendor to create a data share with the required data that can be imported into the company's Snowflake account.
4. Join the vendor's data back to the IoT data using a transformation procedure.
5. Create views over the larger dataset to perform the aggregations.


C.

1. Create a Snowpipe to bring the JSON data into Snowflake.
2. Use streams and tasks to trigger a transformation procedure when new JSON data arrives.
3. Ask the vendor to expose an API so an external function call can be made to join the vendor's data back to the IoT data in a transformation procedure.
4. Create materialized views over the larger dataset to perform the aggregations required by the dashboard.
5. Give the materialized views access to the dashboard tool.


D.

1. Create a Snowpipe to bring the JSON data into Snowflake.
2. Use streams and tasks to trigger a transformation procedure when new JSON data arrives.
3. Ask the vendor to create a data share with the required data that is then imported into the Snowflake account.
4. Join the vendor's data back to the IoT data in a transformation procedure.
5. Create materialized views over the larger dataset to perform the aggregations required by the dashboard.
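
For reference, the Snowpipe-plus-streams-and-tasks pattern described in options C and D could look like the following sketch (all names hypothetical; the vendor's share mounts as a read-only database):

    -- Continuous ingestion of the JSON files
    CREATE PIPE iot_pipe AUTO_INGEST = TRUE AS
      COPY INTO iot_raw FROM @iot_stage FILE_FORMAT = (TYPE = JSON);

    -- Capture newly loaded rows
    CREATE STREAM iot_stream ON TABLE iot_raw;

    -- Transform new rows and join the vendor's shared data
    CREATE TASK transform_iot
      WAREHOUSE = etl_wh
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('IOT_STREAM')
      AS
      INSERT INTO iot_enriched
      SELECT s.payload:device_id::STRING AS device_id,
             s.payload:reading::FLOAT    AS reading,
             v.vendor_attribute
      FROM iot_stream s
      JOIN vendor_share_db.public.device_reference v
        ON s.payload:device_id::STRING = v.device_id;

    ALTER TASK transform_iot RESUME;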


Question # 49:

How is the change of local time due to daylight saving time handled in Snowflake tasks? (Choose two.)

Options:

A.

A task scheduled in a UTC-based schedule will have no issues with the time changes.


B.

Task schedules can be designed to follow specified or local time zones to accommodate the time changes.


C.

A task will move to a suspended state during the daylight saving time change.


D.

A frequent task execution schedule like minutes may not cause a problem, but will affect the task history.


E.

A task schedule will follow only the specified time and will fail to handle lost or duplicated hours.
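
For context, a task schedule is either a fixed interval or a CRON expression with an explicit time zone; only the CRON form tracks local clock changes such as daylight saving. A minimal sketch with hypothetical names:

    -- Interval-based schedule: unaffected by local time changes
    CREATE TASK hourly_task
      WAREHOUSE = etl_wh
      SCHEDULE = '60 MINUTE'
      AS SELECT 1;

    -- CRON schedule: fires at 02:00 in the named zone, which observes DST
    CREATE TASK nightly_task
      WAREHOUSE = etl_wh
      SCHEDULE = 'USING CRON 0 2 * * * America/Los_Angeles'
      AS SELECT 1;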


Question # 50:

A Snowflake Architect is setting up database replication to support a disaster recovery plan. The primary database has external tables.

How should the database be replicated?

Options:

A.

Create a clone of the primary database then replicate the database.


B.

Move the external tables to a database that is not replicated, then replicate the primary database.


C.

Replicate the database ensuring the replicated database is in the same region as the external tables.


D.

Share the primary database with an account in the same region that the database will be replicated to.
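
For context, Snowflake documents that database replication fails when the primary database contains external tables, which is why they are typically moved to a separate, non-replicated database first. The basic replication flow afterward, with hypothetical names:

    -- In the source account, after relocating the external tables:
    ALTER DATABASE prod_db ENABLE REPLICATION TO ACCOUNTS myorg.dr_account;

    -- In the disaster recovery account:
    CREATE DATABASE prod_db AS REPLICA OF myorg.primary_account.prod_db;
    ALTER DATABASE prod_db REFRESH;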

