To continuously export data from Kafka into a target database, you should use
Answer : C
Kafka Connect Sink is used to export data from Kafka to external databases, and Kafka Connect Source is used to import data from external databases into Kafka.
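As an illustration, a sink connector is driven entirely by configuration. The sketch below uses the Confluent JDBC sink connector; the connector name, topic, and connection URL are illustrative, not from the question.

```json
{
  "name": "postgres-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "topics": "orders",
    "connection.url": "jdbc:postgresql://localhost:5432/shop",
    "insert.mode": "upsert",
    "pk.mode": "record_key",
    "auto.create": "true"
  }
}
```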
Which of the following is not an Avro primitive type?
Answer : D
date is a logical type, not a primitive type. The Avro primitive types are null, boolean, int, long, float, double, bytes, and string.
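In an Avro schema, a logical type annotates an underlying primitive type; date annotates int (days since the Unix epoch). A minimal sketch, with an illustrative record name:

```json
{
  "type": "record",
  "name": "Order",
  "fields": [
    { "name": "id", "type": "string" },
    { "name": "created", "type": { "type": "int", "logicalType": "date" } }
  ]
}
```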
An ecommerce website sells some custom-made goods. What's the natural way of modeling this data in Kafka Streams?
Answer : B
Mostly static data (such as a product catalog) is modeled as a table, whereas business transactions (such as orders) should be modeled as a stream.
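A sketch of this split in the Kafka Streams DSL; the topic names are illustrative, and the snippet assumes kafka-streams is on the classpath.

```java
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;

public class ModelingSketch {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // Mostly static data: model as a KTable, which keeps only the
        // latest value per key (an update overwrites the previous one).
        KTable<String, String> products = builder.table("product-catalog");

        // Business transactions: model as a KStream, where every event
        // is meaningful on its own and must not be collapsed.
        KStream<String, String> orders = builder.stream("orders");
    }
}
```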
How do you create a topic named test with 3 partitions and 3 replicas using the Kafka CLI?
Answer : C
As of Kafka 2.3, the kafka-topics.sh command can take --bootstrap-server localhost:9092 as an argument. You could also use the (now deprecated) option of --zookeeper localhost:2181.
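Putting the flags together, the command would look like this (assuming a broker listening on localhost:9092):

```shell
# Create topic "test" with 3 partitions and 3 replicas (Kafka >= 2.3)
kafka-topics.sh --bootstrap-server localhost:9092 \
  --create --topic test \
  --partitions 3 \
  --replication-factor 3
```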
Your streams application is reading from an input topic that has 5 partitions. You run 5 instances of your application, each with num.stream.threads set to 5. How many stream tasks will be created and how many will be active?
Answer : D
Kafka Streams creates one task per input partition, so 5 tasks are created and all 5 are active. Each task runs on a single thread; with 5 instances x 5 threads = 25 threads available, only 5 threads receive a task and the other 20 remain idle.
What happens if you write the following code in your producer? producer.send(producerRecord).get()
Answer : B
Calling Future.get() on the result of send() makes the send synchronous: the producer blocks until the broker acknowledges the record (or an error occurs), which severely limits throughput.
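The contrast can be sketched as follows; the broker address and topic are illustrative, and the snippet assumes kafka-clients is on the classpath and a broker is running.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;
import org.apache.kafka.common.serialization.StringSerializer;

public class SendModes {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);

        try (Producer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record =
                new ProducerRecord<>("test", "key", "value");

            // Synchronous: get() blocks until the broker replies, so every
            // send pays a full round trip -- throughput collapses.
            RecordMetadata meta = producer.send(record).get();

            // Asynchronous: pass a callback instead of blocking; records are
            // batched and pipelined, preserving throughput.
            producer.send(record, (metadata, exception) -> {
                if (exception != null) exception.printStackTrace();
            });
        }
    }
}
```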
Suppose you have 6 brokers and you decide to create a topic with 10 partitions and a replication factor of 3. The brokers 0 and 1 are on rack A, the brokers 2 and 3 are on rack B, and the brokers 4 and 5 are on rack C. If the leader for partition 0 is on broker 4, and the first replica is on broker 2, which broker can host the last replica? (select two)
Answer : B, E
When you create a new topic, partition replicas are spread across racks to maintain availability. Hence rack A, which does not yet hold a replica of this partition, will be selected for the last replica, so it can go to broker 0 or broker 1.