
Pass the Confluent Certified Developer (CCDAK) Questions and Answers with CertsForce

Question # 21:

Match the testing tool with the type of test it is typically used to perform.


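The matching pairs themselves are not reproduced here, but the Kafka ecosystem's usual pairings are: TopologyTestDriver for unit testing Kafka Streams topologies, MockProducer and MockConsumer for unit testing plain client logic, and an embedded or containerized cluster (e.g., Testcontainers) for integration tests. As a minimal sketch, here is a TopologyTestDriver unit test in Java; the topology, topic names, and class name are hypothetical:

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.apache.kafka.common.serialization.StringSerializer;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.TestInputTopic;
    import org.apache.kafka.streams.TestOutputTopic;
    import org.apache.kafka.streams.TopologyTestDriver;
    import org.apache.kafka.streams.kstream.Consumed;
    import org.apache.kafka.streams.kstream.Produced;

    public class UppercaseTopologyTest {
        public static void main(String[] args) {
            // Hypothetical topology: read "input", uppercase values, write "output".
            StreamsBuilder builder = new StreamsBuilder();
            builder.stream("input", Consumed.with(Serdes.String(), Serdes.String()))
                   .mapValues(v -> v.toUpperCase())
                   .to("output", Produced.with(Serdes.String(), Serdes.String()));

            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "topology-unit-test");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:1234"); // never contacted

            // TopologyTestDriver runs the topology in-process with no broker,
            // which is what makes it a unit-testing tool rather than an integration one.
            try (TopologyTestDriver driver = new TopologyTestDriver(builder.build(), props)) {
                TestInputTopic<String, String> in =
                        driver.createInputTopic("input", new StringSerializer(), new StringSerializer());
                TestOutputTopic<String, String> out =
                        driver.createOutputTopic("output", new StringDeserializer(), new StringDeserializer());
                in.pipeInput("key", "hello");
                System.out.println(out.readValue()); // prints HELLO
            }
        }
    }
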
Question # 22:

You deploy a Kafka Streams application with five application instances.

Kafka Streams stores application metadata using internal topics.

Auto-topic creation is disabled in the Kafka cluster.

Which statement about this scenario is true?

Options:

A.

The application will continue to work and internal topics will be created, even if auto-topic creation is disabled.


B.

The application will terminate with a non-retriable exception.


C.

The application will work, but application metadata will not be stored.


D.

The application will be put on hold until the internal topics are created manually.


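Background for this scenario: Kafka Streams creates its internal topics (repartition and changelog topics, prefixed with the application.id) through the AdminClient API when the application starts, independently of the broker's auto.create.topics.enable setting. A minimal sketch of a stateful topology that triggers such internal topic creation; the topic name and bootstrap address are placeholders:

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.Consumed;

    public class WordCountApp {
        public static void main(String[] args) {
            Properties props = new Properties();
            // The application.id prefixes every internal topic name, e.g.
            // wordcount-app-KSTREAM-AGGREGATE-STATE-STORE-0000000001-changelog
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-app");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

            StreamsBuilder builder = new StreamsBuilder();
            // groupByKey().count() is stateful, so Streams creates an internal
            // changelog topic for the state store via its own AdminClient,
            // regardless of the broker's auto.create.topics.enable setting.
            builder.stream("words", Consumed.with(Serdes.String(), Serdes.String()))
                   .groupByKey()
                   .count();

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
            streams.start();
        }
    }
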
Question # 23:

Match each configuration parameter with the correct deployment step in installing a Kafka connector.



Question # 24:

You are experiencing low throughput from a Java producer.

Metrics show a low I/O thread ratio (io-ratio) and a low I/O thread wait ratio (io-wait-ratio).

What is the most likely cause of the slow producer performance?

Options:

A.

Compression is enabled.


B.

The producer is sending large batches of messages.


C.

There is a bad data link layer (layer 2) connection from the producer to the cluster.


D.

The producer code has an expensive callback function.


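Background: the Callback passed to KafkaProducer.send() runs on the producer's single I/O (sender) thread, and the io-ratio and io-wait-ratio metrics measure what that thread is doing. If a callback does expensive work, the thread is busy in user code, so it neither performs I/O (low io-ratio) nor waits for I/O (low io-wait-ratio). A minimal sketch; the topic and bootstrap address are placeholders:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class CallbackExample {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("events", "key", "value"), (metadata, exception) -> {
                    // This lambda runs on the producer's I/O thread. Keep it cheap:
                    // anything slow here (blocking I/O, heavy computation) stalls
                    // every in-flight request and shows up as low io-ratio AND
                    // low io-wait-ratio. Hand expensive work to another executor.
                    if (exception != null) {
                        exception.printStackTrace();
                    }
                });
            }
        }
    }
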
Question # 25:

Your application consumes from a topic using a configured deserializer.

You want the application to be resilient to badly formatted records (poison pills).

You surround the poll() call with a try/catch block for RecordDeserializationException.

You need to log the bad record, skip it, and continue processing other records.

Which action should you take in the catch block?

Options:

A.

Log the bad record and seek the consumer to the offset of the next record.


B.

Log the bad record and call the consumer.skip() method.


C.

Throw a runtime exception to trigger a restart of the application.


D.

Log the bad record; no other action is needed.


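Background: the Java consumer has no skip() method. RecordDeserializationException carries the partition and offset of the record that failed to deserialize, so the catch block can log it and seek() one offset past it, as in this minimal sketch (topic, group id, and deserializers are placeholders):

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.errors.RecordDeserializationException;

    public class PoisonPillConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("group.id", "resilient-group");
            props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("orders"));
                while (true) {
                    try {
                        ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                        for (ConsumerRecord<String, String> record : records) {
                            System.out.println(record.value()); // normal processing
                        }
                    } catch (RecordDeserializationException e) {
                        // Log the bad record's coordinates, then seek one offset
                        // past it so the next poll() resumes with the next record.
                        System.err.printf("Skipping bad record at %s offset %d%n",
                                e.topicPartition(), e.offset());
                        consumer.seek(e.topicPartition(), e.offset() + 1);
                    }
                }
            }
        }
    }
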
Question # 26:

You use Kafka Connect with the JDBC source connector to extract data from a large database and push it into Kafka.

The database contains dozens of tables, and the current connector is unable to process the data fast enough.

You add more Kafka Connect workers, but throughput doesn't improve.

What should you do next?

Options:

A.

Increase the number of Kafka partitions for the topics.


B.

Increase the value of the connector's property tasks.max.


C.

Add more Kafka brokers to the cluster.


D.

Modify the database schemas to enable horizontal sharding.


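Background: Kafka Connect scales through tasks, not workers; additional workers stay idle unless the connector is allowed to create more tasks to schedule onto them. The JDBC source connector distributes tables across tasks, and tasks.max defaults to 1. A sketch of a standalone-style connector properties file; the connector name, connection URL, and column name are hypothetical:

    # Hypothetical JDBC source connector config (standalone .properties form).
    name=jdbc-source-example
    connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
    # Placeholder host and database:
    connection.url=jdbc:postgresql://db.example.com:5432/inventory
    mode=incrementing
    incrementing.column.name=id
    topic.prefix=jdbc-
    # tasks.max defaults to 1; with 10 tasks the connector can read up to
    # 10 tables in parallel, giving the extra workers work to schedule.
    tasks.max=10
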
Question # 27:

This schema excerpt is an example of which schema format?

package com.mycorp.mynamespace;

message SampleRecord {
  int32 Stock = 1;
  double Price = 2;
  string Product_Name = 3;
}

Options:

A.

Avro


B.

Protobuf


C.

JSON Schema


D.

YAML


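The message keyword, the C-style scalar types (int32, double, string), and the numbered field tags (= 1, = 2, = 3) mark this excerpt as Protocol Buffers (Protobuf). For contrast, an Avro schema for the same record would be written as JSON; a rough sketch:

    {
      "type": "record",
      "namespace": "com.mycorp.mynamespace",
      "name": "SampleRecord",
      "fields": [
        {"name": "Stock", "type": "int"},
        {"name": "Price", "type": "double"},
        {"name": "Product_Name", "type": "string"}
      ]
    }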