Pass the Confluent Certified Developer for Apache Kafka (CCDAK) Questions and Answers with CertsForce

Question # 11:

Which two statements are correct about transactions in Kafka? (Select two.)

Options:

A. All messages from a failed transaction will be deleted from a Kafka topic.
B. Transactions are only possible when writing messages to a topic with a single partition.
C. Consumers can consume both committed and uncommitted transactions.
D. Information about producers and their transactions is stored in the _transaction_state topic.
E. Transactions guarantee at-least-once delivery of messages.
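For context on how transactions surface in the producer API, here is a minimal sketch of a transactional producer. The broker address, topic name, and transactional.id are illustrative placeholders, not values from the question.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.KafkaException;
import org.apache.kafka.common.serialization.StringSerializer;

public class TransactionalProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        // The transactional.id identifies this producer to the broker-side transaction
        // coordinator, which keeps its state in the _transaction_state topic (illustrative value).
        props.put(ProducerConfig.TRANSACTIONAL_ID_CONFIG, "orders-producer-1");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.initTransactions();   // register with the transaction coordinator
            producer.beginTransaction();
            try {
                // Writes in one transaction can span multiple partitions and topics.
                producer.send(new ProducerRecord<>("orders", "key-1", "value-1"));
                producer.send(new ProducerRecord<>("orders", "key-2", "value-2"));
                producer.commitTransaction();
            } catch (KafkaException e) {
                // Aborted records are not deleted from the log; they are marked aborted and
                // skipped by read_committed consumers. Fatal errors (e.g. ProducerFencedException)
                // would require closing the producer instead of aborting.
                producer.abortTransaction();
                throw e;
            }
        }
    }
}
```

With isolation.level=read_committed a consumer sees only committed transactional records; with the default read_uncommitted it also sees records from open or aborted transactions.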
Question # 12:

Which two producer exceptions are examples of the class RetriableException? (Select two.)

Options:

A. LeaderNotAvailableException
B. RecordTooLargeException
C. AuthorizationException
D. NotEnoughReplicasException
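As background on the exception hierarchy, the sketch below checks in a producer send callback whether a failure is an instance of RetriableException. The broker address and topic name are illustrative.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.errors.RetriableException;
import org.apache.kafka.common.serialization.StringSerializer;

public class RetriableCallbackSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("demo-topic", "key", "value"), (metadata, exception) -> {
                if (exception == null) {
                    return; // send succeeded
                }
                if (exception instanceof RetriableException) {
                    // Transient broker/cluster conditions; the producer retries these
                    // automatically, so the callback only sees one once delivery.timeout.ms
                    // or the retries budget has been exhausted.
                    System.err.println("Retriable send failure: " + exception);
                } else {
                    // Non-retriable failures where retrying cannot help
                    // (e.g. the record itself is invalid or the client is not authorized).
                    System.err.println("Fatal send failure: " + exception);
                }
            });
            producer.flush();
        }
    }
}
```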
Question # 13:

Match the testing tool with the type of test it is typically used to perform.

[Image: matching exercise pairing each testing tool with the type of test it typically performs; the tool and test-type pairs are not reproduced in this text version.]


Question # 14:

You are developing a Java application using a Kafka consumer.

You need to integrate Kafka’s client logs with your own application’s logs using log4j2.

Which Java library dependency must you include in your project?

Options:

A. SLF4J implementation for Log4j 1.2 (org.slf4j:slf4j-log4j12)
B. SLF4J implementation for Log4j2 (org.apache.logging.log4j:log4j-slf4j-impl)
C. None; the right dependency is added transitively by the Kafka client dependency.
D. Just the application's own log4j2 dependency
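The Kafka Java clients log through the SLF4J facade, so the binding on the classpath decides where those logs end up. Below is a minimal sketch, assuming a log4j2 binding and configuration are present on the classpath; the broker address, group id, and topic name are illustrative.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class LoggingConsumerSketch {
    // The application logs through the same SLF4J facade as the Kafka client,
    // so with an SLF4J-to-log4j2 binding both streams share one log4j2 configuration.
    private static final Logger log = LoggerFactory.getLogger(LoggingConsumerSketch.class);

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "logging-demo");            // illustrative group id
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("demo-topic"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
            log.info("Polled {} records", records.count());
        }
    }
}
```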
Question # 15:

Which statement describes the storage location for a sink connector’s offsets?

Options:

A. The __consumer_offsets topic, like any other consumer
B. The topic specified in the offsets.storage.topic configuration parameter
C. In a file specified by the offset.storage.file.filename configuration parameter
D. In memory, which is then periodically flushed to a RocksDB instance
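As an aside on how such offsets can be inspected: a sink connector consumes through an ordinary consumer group (conventionally named connect-<connector name>), so its committed positions can be read back with the Admin API like any other group's. A minimal sketch, assuming a broker at localhost:9092 and a hypothetical connector named my-sink.

```java
import java.util.Map;
import java.util.Properties;

import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;

public class SinkConnectorOffsetsSketch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address

        try (Admin admin = Admin.create(props)) {
            // "my-sink" is a hypothetical connector name; Connect names the group
            // "connect-" + the connector name.
            Map<TopicPartition, OffsetAndMetadata> offsets =
                    admin.listConsumerGroupOffsets("connect-my-sink")
                         .partitionsToOffsetAndMetadata()
                         .get();
            offsets.forEach((tp, om) ->
                    System.out.printf("%s -> offset %d%n", tp, om.offset()));
        }
    }
}
```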
Question # 16:

You need to explain the best reason to implement the consumer callback interface ConsumerRebalanceListener before a consumer group rebalance.

Which statement is correct?

Options:

A. Partitions assigned to a consumer may change.
B. Previous log files are deleted.
C. Offsets are compacted.
D. Partition leaders may change.
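For reference, here is a minimal sketch of a ConsumerRebalanceListener that commits the current offsets before partitions are revoked, so whichever consumer takes over does not reprocess them. The broker address, group id, and topic are illustrative.

```java
import java.time.Duration;
import java.util.Collection;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRebalanceListener;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.StringDeserializer;

public class RebalanceListenerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "rebalance-demo");          // illustrative group id
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("demo-topic"), new ConsumerRebalanceListener() {
                @Override
                public void onPartitionsRevoked(Collection<TopicPartition> partitions) {
                    // Called before ownership of these partitions moves to another consumer:
                    // commit what has been processed so far to avoid duplicates.
                    consumer.commitSync();
                }

                @Override
                public void onPartitionsAssigned(Collection<TopicPartition> partitions) {
                    // The assignment may have changed; (re)initialize per-partition state here.
                    System.out.println("Assigned: " + partitions);
                }
            });

            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
            System.out.println("Polled " + records.count() + " records");
        }
    }
}
```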
Question # 17:

You have a consumer group with default configuration settings reading messages from your Kafka cluster.

You need to optimize throughput so the consumer group processes more messages in the same amount of time.

Which change should you make?

Options:

A. Remove some consumers from the consumer group.
B. Increase the number of bytes the consumers read with each fetch request.
C. Disable auto commit and have the consumers manually commit offsets.
D. Decrease the session timeout of each consumer.
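For illustration, these are the consumer settings that control how much data each fetch request returns. The values shown are arbitrary examples for the sketch, not recommendations.

```java
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;

public class FetchTuningSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Wait until at least this many bytes are available before the broker answers
        // a fetch (default is 1 byte), trading a little latency for larger batches.
        props.put(ConsumerConfig.FETCH_MIN_BYTES_CONFIG, 1_048_576);             // example: 1 MiB
        // Upper bounds on the data returned per fetch and per partition.
        props.put(ConsumerConfig.FETCH_MAX_BYTES_CONFIG, 104_857_600);           // example: 100 MiB
        props.put(ConsumerConfig.MAX_PARTITION_FETCH_BYTES_CONFIG, 10_485_760);  // example: 10 MiB
        // How long the broker may wait for fetch.min.bytes to accumulate.
        props.put(ConsumerConfig.FETCH_MAX_WAIT_MS_CONFIG, 500);

        System.out.println(props); // pass these props to a KafkaConsumer in a real application
    }
}
```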
Question # 18:

Your application is consuming from a topic with one consumer group.

The number of running consumers is equal to the number of partitions.

Application logs show that some consumers are leaving the consumer group during peak time, triggering a rebalance. You also notice that your application is processing many duplicates.

You need to stop consumers from leaving the consumer group.

What should you do?

Options:

A. Reduce the max.poll.records property.
B. Increase the session.timeout.ms property.
C. Add more consumer instances.
D. Split consumers into different consumer groups.
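For reference, the consumer settings that govern when a group member is considered dead or stuck are sketched below. The values are arbitrary examples; the right choice depends on how long your application actually takes to process each batch.

```java
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;

public class GroupStabilitySketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Heartbeat-based liveness: the member is evicted if no heartbeat arrives
        // within session.timeout.ms (heartbeats are sent from a background thread).
        props.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, 45_000);    // example value
        props.put(ConsumerConfig.HEARTBEAT_INTERVAL_MS_CONFIG, 15_000); // typically ~1/3 of the session timeout
        // Processing-based liveness: the member also leaves the group if it does not
        // call poll() again within max.poll.interval.ms; lowering max.poll.records
        // shrinks each batch so processing fits inside that window.
        props.put(ConsumerConfig.MAX_POLL_INTERVAL_MS_CONFIG, 300_000); // example value
        props.put(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, 200);         // example value

        System.out.println(props); // pass these props to a KafkaConsumer in a real application
    }
}
```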