
Pass the Confluent Certified Developer for Apache Kafka (CCDAK) questions and answers with CertsForce

Viewing page 2 out of 3 pages
Viewing questions 11-20
Question # 11:

What is accomplished by producing data to a topic with a message key?

Options:

A.

Messages with the same key are routed to a deterministically selected partition, enabling order guarantees within that partition.


B.

Kafka brokers allow you to add more partitions to a given topic, without impacting the data flow for existing keys.


C.

It provides a mechanism for encrypting messages at the partition level to ensure secure data transmission.


D.

Consumers can filter messages in real time based on the message key without processing unrelated messages.


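Key-based routing (as described in option A) can be sketched as follows. This is a simplified stand-in for Kafka's default partitioner: Kafka actually hashes the serialized key with murmur2, but CRC32 is used here only to keep the sketch self-contained.

```python
import zlib

def partition_for(key: bytes, num_partitions: int) -> int:
    # Deterministic key -> partition mapping: the same non-null key
    # always yields the same partition, so ordering is preserved
    # within that partition for all records sharing the key.
    return zlib.crc32(key) % num_partitions

p1 = partition_for(b"order-42", 6)
p2 = partition_for(b"order-42", 6)
assert p1 == p2  # same key -> same partition -> per-key ordering
```

Note the flip side: because the mapping depends on the partition count, adding partitions later changes where existing keys land, which is why option B does not hold.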
Question # 12:

You want to connect with username and password to a secured Kafka cluster that has SSL encryption.

Which properties must your client include?

Options:

A.

security.protocol=SASL_SSL
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username='myUser' password='myPassword';


B.

security.protocol=SSL
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username='myUser' password='myPassword';


C.

security.protocol=SASL_PLAINTEXT
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username='myUser' password='myPassword';


D.

security.protocol=PLAINTEXT
sasl.jaas.config=org.apache.kafka.common.security.ssl.TlsLoginModule required username='myUser' password='myPassword';


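For reference, a minimal client configuration for username/password (SASL/PLAIN) authentication over an encrypted connection might look like the following sketch; the credentials are placeholders, and `sasl.mechanism` must also be set in practice:

```properties
# SASL/PLAIN credentials carried over a TLS-encrypted connection
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="myUser" \
  password="myPassword";
```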
Question # 13:

You are implementing a Kafka Streams application to process financial transactions.

Each transaction must be processed exactly once to ensure accuracy.

The application reads from an input topic, performs computations, and writes results to an output topic.

During testing, you notice duplicate entries in the output topic, which violates the exactly-once processing requirement.

You need to ensure exactly-once semantics (EOS) for this Kafka Streams application.

Which step should you take?

Options:

A.

Enable compaction on the output topic to handle duplicates.


B.

Set enable.idempotence=true in the internal producer configuration of the Kafka Streams application.


C.

Set enable.exactly_once=true in the Kafka Streams configuration.


D.

Set processing.guarantee=exactly_once_v2 in the Kafka Streams configuration.


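As a sketch, exactly-once semantics in Kafka Streams is switched on with a single configuration entry (`exactly_once_v2` requires brokers on version 2.5 or newer); it transparently enables the idempotent producer and transactions under the hood, so the internal producer does not need to be configured separately:

```properties
# StreamsConfig: upgrade from the default at-least-once guarantee
processing.guarantee=exactly_once_v2
```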
Question # 14:

You want to enrich the content of a topic by joining it with key records from a second topic.

The two topics have a different number of partitions.

Which two solutions can you use?

Select two.

Options:

A.

Use a GlobalKTable for one of the topics where data does not change frequently and use a KStream–GlobalKTable join.


B.

Repartition one topic to a new topic with the same number of partitions as the other topic (co-partitioning constraint) and use a KStream–KTable join.


C.

Create as many Kafka Streams application instances as the maximum number of partitions of the two topics and use a KStream–KTable join.


D.

Use a KStream–KTable join; Kafka Streams will automatically repartition the topics to satisfy the co-partitioning constraint.


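The co-partitioning constraint behind this question can be illustrated with a small sketch: when two topics have different partition counts, the same key generally lands on different partition indexes, so a partition-local join would never see the matching records side by side. (CRC32 stands in for Kafka's murmur2 hash to keep the example self-contained.)

```python
import zlib

def partition_for(key: bytes, num_partitions: int) -> int:
    # Deterministic key -> partition mapping, as a default partitioner does.
    return zlib.crc32(key) % num_partitions

key = b"customer-17"
# For most keys, the partition index differs under 6 vs. 4 partitions,
# which is why the topics must be co-partitioned for a KStream-KTable
# join (or one side read as a GlobalKTable, which is fully replicated
# to every application instance).
print(partition_for(key, 6), partition_for(key, 4))
```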
Question # 15:

You are developing a Java application that includes a Kafka consumer.

You need to integrate Kafka client logs with your own application logs.

Your application is using the Log4j2 logging framework.

Which Java library dependency must you include in your project?

Options:

A.

SLF4J implementation for Log4j2 (org.apache.logging.log4j:log4j-slf4j-impl)


B.

SLF4J implementation for Log4j 1.2 (org.slf4j:slf4j-log4j12)


C.

Just the Log4j2 dependency of the application


D.

None, the correct dependency will be added transitively by the Kafka client


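For context, the Kafka client logs through the SLF4J API, so routing its output onto Log4j2 is done with a binding artifact such as the following (Maven coordinates; the version shown is illustrative and should match your Log4j2 version, and projects on SLF4J 2.x use `log4j-slf4j2-impl` instead):

```xml
<!-- SLF4J -> Log4j2 bridge; kafka-clients emits logs via the SLF4J API -->
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-slf4j-impl</artifactId>
  <version>2.20.0</version>
</dependency>
```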
Question # 16:

You are creating a Kafka Streams application to process retail data.

Match the input data streams with the appropriate Kafka Streams object.


Question # 17:

You are writing to a Kafka topic with producer configuration acks=all.

The producer receives acknowledgements from the broker but still creates duplicate messages due to network timeouts and retries.

You need to ensure that duplicate messages are not created.

Which producer configuration should you set?

Options:

A.

enable.auto.commit=true


B.

retries=2147483647
max.in.flight.requests.per.connection=5
enable.idempotence=false


C.

retries=2147483647
max.in.flight.requests.per.connection=1
enable.idempotence=true


D.

retries=0
max.in.flight.requests.per.connection=5
enable.idempotence=true


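As a sketch, an idempotent-producer configuration looks like this: the broker de-duplicates retried batches using the producer ID plus per-partition sequence numbers, so retries cannot create duplicates. Idempotence requires `acks=all` and at most 5 in-flight requests per connection:

```properties
enable.idempotence=true
acks=all
retries=2147483647
max.in.flight.requests.per.connection=5
```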
Question # 18:

Match each configuration parameter with the correct option.

To answer, choose a match for each option from the drop-down. Partial credit is given for each correct answer.



Question # 19:

A stream processing application is consuming from a topic with five partitions. You run three instances of the application. Each instance has num.stream.threads=5.

You need to identify the number of stream tasks that will be created and how many will actively consume messages from the input topic.

Options:

A.

5 created, 1 actively consuming


B.

5 created, 5 actively consuming


C.

15 created, 5 actively consuming


D.

15 created, 15 actively consuming


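Under the usual rule that Kafka Streams creates one stream task per input-topic partition (assuming a single input topic and one sub-topology), the arithmetic can be sketched as:

```python
partitions = 5
instances = 3
threads_per_instance = 5

tasks = partitions                                  # one task per partition
total_threads = instances * threads_per_instance    # 15 threads overall
active_threads = min(tasks, total_threads)          # threads that get a task
idle_threads = total_threads - active_threads       # the rest sit idle

print(tasks, active_threads, idle_threads)  # -> 5 5 10
```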
Question # 20:

You have a Kafka client application that has real-time processing requirements.

Which Kafka metric should you monitor?

Options:

A.

Consumer lag between brokers and consumers


B.

Total time to serve requests to replica followers


C.

Consumer heartbeat rate to group coordinator


D.

Aggregate incoming byte rate

