Cloud & Engineering

Tabish Ghani

Automate Kafka Testing

Posted by Tabish Ghani on 12 April 2019

Testing, unit-testing, kafka, Fast-Data


Apache Kafka is very widely adopted and underpins some of the largest and most important systems in the world, processing trillions of messages per day. It serves as the pipeline backbone for many companies in the financial and tech industries.

Before I continue, I want to set some expectations. The point of this article is not to explain the intricacies or use cases of Kafka and its architecture, but rather to clearly illustrate one of the libraries that can be used to perform Kafka testing, along with our approach and experience — in particular, how Zerocode allowed us to perform integration, unit and end-to-end (E2E) testing. The article is intended for those already acquainted with Kafka and its applications, or who at the very least have a strong level of theoretical knowledge about it. To follow our setup, you will need to clone the repository and have a monitoring tool (Confluent Control Centre) installed.

Use case

Because of project confidentiality and our commitment to our client, I will dress up our use case as a generic scenario that you might face while testing Kafka. Please feel free to reach out to discuss your specific scenario via the comments or LinkedIn. The project had 30+ microservices producing and consuming messages to Kafka, performing certain transformations and validations on the messages in the process.

Our Approach

We decided to use Zerocode because of its step-chaining and its simple, horizontally scalable format, which permitted us to write tests as plain JSON with payload and response assertions (leveraging JSON Path).

Confluent Control Centre (running locally) was our platform of choice for gaining visibility into and monitoring of our test cases.

To simulate one of our test scenarios (i.e. where a message is produced by microservice A, then consumed and validated by microservice B), we produced a message to topic_A, leveraged KSQL to write those messages to topic_B, consumed from topic_B and performed the assertions, then executed another KSQL query to pass that payload on to topic_C, and repeated. Zerocode’s declarative JSON style allowed us to do this efficiently.
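As a sketch of what such a chained scenario looks like in Zerocode's Kafka DSL (topic names, keys and values here are illustrative, not our client's data — verify the exact DSL fields against the Zerocode README for your version):

```json
{
    "scenarioName": "Produce to topic_A, consume and validate from topic_B",
    "steps": [
        {
            "name": "produce_to_topic_A",
            "url": "kafka-topic:topic_A",
            "operation": "produce",
            "request": {
                "records": [
                    {
                        "key": "${RANDOM.NUMBER}",
                        "value": "Hello World"
                    }
                ]
            },
            "assertions": {
                "status": "Ok"
            }
        },
        {
            "name": "consume_from_topic_B",
            "url": "kafka-topic:topic_B",
            "operation": "consume",
            "request": {},
            "assertions": {
                "size": 1,
                "records": [
                    {
                        "value": "Hello World"
                    }
                ]
            }
        }
    ]
}
```

Each step runs in order, so the consume step's assertions only execute after the produce step has reported success.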



It is common knowledge that messages in Kafka are not globally ordered — ordering is only guaranteed within a single partition, not across a topic's partitions.


In our discussions with the Zerocode community, we learned this can be resolved with the option to define:

  1. in: /zerocode/kafka-testing/src/test/resources/kafka_servers/
  2. in: /zerocode/kafka-testing/src/test/resources/kafka_servers/


In Kafka, the client.id property allows you to easily correlate requests on the broker with the client instance which made them. Check out more examples and details:

And the group.id property defines the identity for a set of consumers belonging to the same consumer group. You can learn more here:


Zerocode allows you to configure this, e.g. client.id=zerocode-producer_${RANDOM.NUMBER}, along with various other placeholders. Keeping it unique assists with tracing and testing.

With client.id defined as in the example above, each test execution is assigned a unique ID.

Sample results below:

  1. 1st run - client.id=test_producer_1553209530873
  2. 2nd run - client.id=test_producer_1553209530889
  3. 3rd run - client.id=test_producer_1553209530893

This suffixed numeric ID is unique because it is the numeric equivalent of the current timestamp (epoch milliseconds).


Another approach suggested by the Zerocode community is to define it as a date, which makes it ideal for daily testing and tracing: client.id=test_producer_${LOCAL.DATE.TODAY:yyyy-MM-dd}


  1. 1st day - client.id=test_producer_2018-03-18
  2. 2nd day - client.id=test_producer_2018-03-19
  3. 3rd day - client.id=test_producer_2018-03-20

Please see the following link for additional placeholders, explained in the README file, suited to your project requirements.

The group.id is defined in the consumer properties file, as per Kafka’s requirement.

Defining it uniquely can help you get your end-to-end testing right. It can also enable rerunning your entire test suite/pack, i.e. making your CI build pipeline repeatable.

This uniqueness will allow the consumers to fetch both old and new messages (where that helps).
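As an illustrative sketch (the file name and property values are assumptions, not our project's configuration), a consumer properties file with a unique group.id per run might look like:

```properties
# Illustrative consumer properties for a repeatable test suite.
# A fresh group.id per run has no committed offsets, so with
# auto.offset.reset=earliest the consumer re-reads the topic from
# the beginning (old + new messages) on every CI run.
group.id=test_consumer_${RANDOM.NUMBER}
auto.offset.reset=earliest
enable.auto.commit=false
key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
value.deserializer=org.apache.kafka.common.serialization.StringDeserializer
```

The trade-off is that every run reprocesses the full topic, so this suits short-lived test topics rather than long-retention production ones.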

More examples and details:

Our Experience & Learning

Zerocode allowed us to achieve this with a Java (JUnit) runner and a JSON config file, with configurable Kafka server, producer and consumer properties.
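A minimal JUnit runner for such a setup might look like the sketch below. The package name, property file path and scenario file are hypothetical, and older Zerocode versions use @JsonTestCase instead of @Scenario — check the README for your version:

```java
package com.example.kafkatests; // hypothetical package

import org.jsmart.zerocode.core.domain.Scenario;
import org.jsmart.zerocode.core.domain.TargetEnv;
import org.jsmart.zerocode.core.runner.ZeroCodeUnitRunner;
import org.junit.Test;
import org.junit.runner.RunWith;

// Points Zerocode at the Kafka broker/producer/consumer properties file
@TargetEnv("kafka_servers/kafka_test_server.properties")
@RunWith(ZeroCodeUnitRunner.class)
public class KafkaProduceConsumeTest {

    @Test
    @Scenario("kafka/produce_consume_scenario.json") // hypothetical scenario file
    public void testProduceAndConsume() {
        // Intentionally empty: the JSON scenario drives all the test steps.
    }
}
```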

We used KSQL to move data from one topic to another to simulate the involvement of multiple microservices, as discussed above.

Some of the testing screenshots are shared below:




See the snapshot of mock KSQL queries we used to move data between different topics. 
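In the same spirit, a mock KSQL query that copies every message from one topic to another might look like the following (stream and column names are invented for illustration):

```sql
-- Register the source topic as a stream (schema is illustrative)
CREATE STREAM stream_a (id VARCHAR, payload VARCHAR)
  WITH (KAFKA_TOPIC='topic_A', VALUE_FORMAT='JSON');

-- Persistent query: every message arriving on topic_A is
-- continuously written to topic_B, simulating microservice B's input
CREATE STREAM stream_b
  WITH (KAFKA_TOPIC='topic_B', VALUE_FORMAT='JSON') AS
  SELECT * FROM stream_a;
```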


Please double-check that the following dependency is added to your project:
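For a Maven build, the coordinate looks like this (the version below is illustrative — use the latest release from the Zerocode README):

```xml
<dependency>
    <groupId>org.jsmart</groupId>
    <artifactId>zerocode-tdd</artifactId>
    <version>1.3.2</version>
    <scope>test</scope>
</dependency>
```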






Property configuration examples include:

  1. Producer Properties:
  2. Consumer Properties:
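For illustration (file names and values are assumptions, not our client's configuration), the producer and consumer property files might look like:

```properties
# kafka_servers/kafka_producer.properties (illustrative)
client.id=zerocode-producer_${RANDOM.NUMBER}
acks=all
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=org.apache.kafka.common.serialization.StringSerializer

# kafka_servers/kafka_consumer.properties (illustrative)
group.id=consumerGroup14
enable.auto.commit=false
key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
value.deserializer=org.apache.kafka.common.serialization.StringDeserializer
```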

Sample Test cases to run:


Below is a sample JSON configuration which we used in one of our test scenarios:
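The actual payloads are confidential, so the following is a representative sketch using Zerocode's assertion placeholders; the topic, field names and consumer settings are invented for illustration:

```json
{
    "scenarioName": "Validate transformed order event on topic_C",
    "steps": [
        {
            "name": "consume_transformed_event",
            "url": "kafka-topic:topic_C",
            "operation": "consume",
            "request": {
                "consumerLocalConfigs": {
                    "commitSync": true,
                    "showRecordsConsumed": true,
                    "maxNoOfRetryPollsOrTimeouts": 5
                }
            },
            "assertions": {
                "size": 1,
                "records": [
                    {
                        "key": "$NOT.NULL",
                        "value": {
                            "orderId": "$NOT.NULL",
                            "status": "VALIDATED"
                        }
                    }
                ]
            }
        }
    ]
}
```

The $NOT.NULL placeholder asserts that a field is present without pinning its value, which is handy when IDs are generated at runtime.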


 Test Results

Test Result 1

Test result 2


In terms of security, Zerocode offers the following:

  1. For OAuth2, please see a very short and precise blog in the DZone Security Zone: Please reach out to the community or Zerocode in case you need further information.
  2. For corporate proxy configuration, you can follow the README section here: 
  3. This applies to any HTTP API invocation, for instance REST, SOAP, etc.
  4. Working SAML/JWT examples are in the repo.
  5. If tokens are dynamic, it is still easy to inject them into headers at runtime, as explained in this blog:
  6. If you use OpenAM, Red Hat SSO or simple Basic Auth, you can refer to the examples in the README file. You can apply this manually per test case, or embed it into the HttpClient as a one-off (with less maintenance overhead).

  7. Custom HTTP client: Zerocode's HTTP client supports both HTTP and HTTPS connections out of the box, but you can override it to add or remove security features to match your project requirements.

See example:


Then it's very simple and straightforward to use, as below: just annotate your test class or suite class.
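For example (the SslTrustHttpClient and environment file names below are taken as assumptions from Zerocode's README — verify against your version), overriding the HTTP client is just an annotation on the suite or test class:

```java
import org.jsmart.zerocode.core.domain.TargetEnv;
import org.jsmart.zerocode.core.domain.UseHttpClient;
import org.jsmart.zerocode.core.httpclient.ssl.SslTrustHttpClient;
import org.jsmart.zerocode.core.runner.ZeroCodeUnitRunner;
import org.junit.runner.RunWith;

@UseHttpClient(SslTrustHttpClient.class) // or your own custom client class
@TargetEnv("app_host.properties")        // hypothetical environment file
@RunWith(ZeroCodeUnitRunner.class)
public class CustomHttpClientSuite {
    // Test methods annotated with @Scenario go here as usual.
}
```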


In similar fashion, you can inject any custom headers you need.



Feature & Future

In discussions with Zerocode's broad contributing community, on the feature-comparison front, the Zerocode team is in the process of collecting feedback and data from users to capture the benefits of, and reasons for preferring, Zerocode, e.g. migrating:

  1. From Postman(collections) to Zerocode
  2. From other Step-Definition based BDD tools to Zerocode etc.

We and the Zerocode team would love to hear your feedback on this.

Wrapping Up

Distributed testing is tricky, has no silver bullet, and is renowned for its unique corner cases. Countering them requires a lot of design thinking and a sound set of SDLC (Software Development Lifecycle) practices, from process design through to production. The goal of this blog is to share some insights on how to handle and scale testing as needed, using an existing, well-thought-out and well-documented option, Zerocode, aimed at preserving quality.

It is important to clearly outline that Zerocode is exceptional for:

  • Application Integration Testing
  • End to End testing
  • System Integration testing
  • Load/Stress testing
  • API mocking (using WireMock JSON DSLs)

All in a declarative way, reducing the hassle for developers and testers to zero.

Our criteria for choosing a suitable testing library or framework would be:

  • Ease of use
  • Low syntax overhead
  • Ease of handling and asserting payloads
  • Ease of extending the test runners
  • Ease of adding custom security features
  • Ease for manual testers to understand the test flow

We sincerely hope this helps the community to some extent, and helps in bridging and filling the gap with our kindred spirits at Zerocode.

Honourable References

For further detailed information on how to test Kafka, or REST APIs producing to and consuming from Kafka, please see the following link or get in touch via the comments below:


If you like what you read, join our team as we seek to solve wicked problems within Complex Programs, Process Engineering, Integration, Cloud Platforms, DevOps & more!


Have a look at our open positions at Deloitte. You can search and see which ones we have in Cloud & Engineering.


Have more enquiries? Reach out to our Talent Team directly and they will be able to support you best.

Leave a comment on this blog: