Confluent • CCAAK
Validates expertise in managing Apache Kafka clusters in production, covering Kafka fundamentals, cluster configuration, security, deployment architecture, Kafka Connect administration, observability, and troubleshooting.
Questions
624
Duration
90 minutes
Passing Score
70%
Difficulty
Associate
Last Updated
Feb 2026
The Confluent Certified Administrator for Apache Kafka (CCAAK) is a professional certification that validates the skills required to deploy, configure, monitor, and maintain Apache Kafka clusters in production environments. It covers a broad spectrum of administrative competencies, including broker and topic configuration, ZooKeeper management, security implementation (SSL/TLS, SASL, ACLs), Kafka Connect administration, Schema Registry, observability practices, and production troubleshooting. The certification is offered by Confluent, the company founded by the original creators of Apache Kafka, and is recognized across the industry as a credible benchmark for Kafka operations expertise.
The exam tests both conceptual understanding and scenario-based knowledge, requiring candidates to demonstrate proficiency with real-world challenges such as managing consumer group rebalances, diagnosing replication health, configuring listener protocols correctly, and resolving consumer lag. Candidates must understand the roles of brokers, partition leaders, and group coordinators, and how components such as Kafka Streams and ksqlDB interact within the broader ecosystem. The certification expires after two years, requiring recertification to remain current.
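As a concrete illustration of the listener configuration such scenarios involve, a minimal broker excerpt might look like the following sketch (hostnames, ports, and listener names are placeholders, not values from any particular deployment):

```properties
# Hypothetical server.properties excerpt.
# Each named listener maps to a security protocol via listener.security.protocol.map.
listeners=INTERNAL://0.0.0.0:9092,EXTERNAL://0.0.0.0:9093
advertised.listeners=INTERNAL://broker1.internal:9092,EXTERNAL://broker1.example.com:9093
listener.security.protocol.map=INTERNAL:PLAINTEXT,EXTERNAL:SSL
# Replication traffic between brokers uses the INTERNAL listener.
inter.broker.listener.name=INTERNAL
```

A common exam theme is the distinction between `listeners` (what the broker binds to) and `advertised.listeners` (what clients are told to connect to), since a mismatch between the two is a frequent cause of client connectivity errors.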
The CCAAK is designed for professionals who are responsible for the day-to-day administration and operation of Apache Kafka clusters. This includes platform engineers, site reliability engineers (SREs), DevOps engineers, and infrastructure administrators who manage Kafka in self-managed, Kubernetes-based, or cloud-hosted environments. Candidates typically have hands-on experience with Kafka CLI tooling and configuration files, and are comfortable diagnosing issues such as replication lag, consumer timeouts, and partition imbalances.
The certification is well-suited for professionals who want to formalize their Kafka administration skills and distinguish themselves in the job market. It is not intended for developers building Kafka-based applications (who would be better served by the CCDAK developer certification), but rather for those responsible for the health, security, and operational performance of Kafka infrastructure.
Confluent does not enforce formal prerequisites to register for the CCAAK exam. However, candidates are strongly recommended to have practical, hands-on experience running Kafka in a production or production-like environment before attempting the exam. This includes comfort with broker configuration files, CLI tools (kafka-topics, kafka-consumer-groups, kafka-configs, etc.), and experience troubleshooting common operational issues such as under-replicated partitions, consumer lag, and connectivity errors.
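To illustrate the CLI comfort described above, a few typical diagnostic invocations are sketched below (the bootstrap address, consumer group, and topic names are placeholders; on some distributions the tools carry a `.sh` suffix):

```shell
# List partitions whose in-sync replica set has fallen below the full replica count:
kafka-topics --bootstrap-server broker1:9092 --describe --under-replicated-partitions

# Show per-partition current offset, log-end offset, and lag for a consumer group:
kafka-consumer-groups --bootstrap-server broker1:9092 --describe --group my-group

# Inspect dynamic configuration overrides applied to a topic:
kafka-configs --bootstrap-server broker1:9092 --describe --entity-type topics --entity-name my-topic
```

Being able to read the output of these commands quickly, such as spotting growing lag or an empty ISR, is representative of the troubleshooting skills the exam assumes.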
Familiarity with Kafka's core components (brokers, producers, consumers, consumer groups, ZooKeeper or KRaft mode, Kafka Connect, and Schema Registry) is essential. Confluent recommends reviewing the official online study guide and, optionally, completing their formal training courses (available in live and self-paced formats) prior to sitting the exam. Most candidates report studying between 30 and 120 hours depending on their existing Kafka experience.
The CCAAK is a 90-minute, proctored, multiple-choice exam delivered online or at authorized testing centers worldwide. The exam consists of multiple-choice and multi-select questions, with the total question count reported at approximately 40-60 questions depending on the exam version. Remote delivery requires a webcam for proctor monitoring throughout the session. The exam is administered in English only, and results are displayed immediately upon completion.
The passing score is 70%, and the cost per attempt is $150 USD. The certification is valid for two years, after which recertification is required. Upon passing, candidates receive a Confluent digital badge and certificate and are authorized to use the certification title and logo in professional materials.
Earning the CCAAK demonstrates verified expertise in Apache Kafka administration, a skill set in high demand as organizations across financial services, e-commerce, telecommunications, and technology sectors scale their event streaming infrastructure. Kafka administrators and platform engineers with this credential are well-positioned for roles such as Senior Kafka Administrator, Platform Engineer, Data Infrastructure Engineer, and Site Reliability Engineer. The certification serves as a credible differentiator in both salary negotiations with current employers and job applications with new ones.
Apache Kafka skills consistently command above-average compensation in the data engineering and platform engineering space, with experienced Kafka administrators in North America typically earning $130,000-$180,000+ USD annually. The CCAAK pairs well with cloud certifications (AWS, GCP, Azure) and complements the Confluent Certified Developer for Apache Kafka (CCDAK) for professionals seeking full-stack Kafka expertise. As organizations increasingly adopt event-driven architectures, demand for credentialed Kafka operators continues to grow.
1. A data engineering team needs to configure a Kafka Connect sink connector to handle occasional data transformation errors without stopping the entire connector. They want failed records sent to a dead letter queue topic. Which configuration should they use? (Select one!)
2. An operations team monitors a Kafka cluster and observes that flush.ms=10000 is configured for several high-throughput topics. Disk I/O metrics show frequent fsync operations causing performance bottlenecks. Which three approaches will improve performance while maintaining durability? (Select three!)
3. A Kafka Connect distributed cluster runs 5 workers with a connector configured with tasks.max=8. The administrator applies a Single Message Transform chain with the following configuration: transforms=addSource,maskPII,routeTopic. The transforms.addSource.type uses InsertField to add a static source_system field, transforms.maskPII.type uses MaskField to mask the ssn field, and transforms.routeTopic.type uses RegexRouter to modify topic names. A record flows through the connector. In what order are the transformations applied? (Select one!)
4. An administrator performs a rolling restart on a 6-broker cluster to update JVM settings. The restart procedure includes stopping each broker with SIGTERM, waiting 30 seconds, starting the broker, and immediately proceeding to the next broker. After restarting the fourth broker, multiple topics show under-replicated partitions. What is the most likely cause? (Select one!)
5. A streaming platform uses Kafka Connect with AvroConverter and Schema Registry. The value.converter.schema.registry.url is configured as http://localhost:8081. When the team tries to create a new source connector, they receive authorization errors from Schema Registry. The connector requires the ability to register new schemas for evolving source data. Which configuration must be added to the connector? (Select one!)