Releases: mostafa/xk6-kafka

v0.13.1

04 Aug 17:53
91cf77c

I released this patched version to fix two bugs reported by the awesome users of this extension.

Full Changelog: v0.13.0...v0.13.1

v0.13.0

14 Jul 12:57
47cdfe5

This is a bugfix release with mostly cosmetic changes to the Go codebase, reported by various linters via golangci-lint. I fixed the CI to use the golangci-lint-action and created a configuration file that enables all linters, including gosec, except a few. All constructors and functions are now unexported, and you can access their functionality via one of these classes: Writer, Reader, or Connection.

I addressed these issues:

Also, @thmshmm kindly fixed a bug in #129 that I had introduced in the byte array serializer. 🙏

What's Changed

Full Changelog: v0.12.0...v0.13.0

v0.12.0

02 Jul 21:02
d205d0b

This release is an effort toward better stability of the JavaScript API. As you will see in the following changelog, most of the changes are refactorings, as I envisioned in #89. I completely revamped the API, removed the old functions, and introduced new ones with a more native JavaScript syntax. These are the changes:

  1. Better JS API docs with all the constants, classes, and data structures, all documented here.
  2. Better logging with the ability to connect the global extension logger to the Reader and/or Writer objects (of the kafka-go library) to print the internal errors by using the connectLogger parameter.
  3. Better error handling by throwing exceptions (native to JS) rather than returning them (native to Go), so you can enclose them in try/catch blocks.
  4. All constants are now available for import at the module level, and there are many of them for different purposes.
  5. The writer and reader functions are removed and replaced by classes that can be instantiated like this:
    import { Writer, Reader } from "k6/x/kafka";
    
    const writer = new Writer({
      brokers: ["localhost:9092"],
      topic: "my-topic",
      autoCreateTopic: true,
    });
    
    const reader = new Reader({
      brokers: ["localhost:9092"],
      topic: "my-topic",
    });
  6. The produce and consume functions are now methods of their respective Writer and Reader classes:
    writer.produce({
      messages: [
        {
          key: "key",
          value: "value",
        }
      ]
    });
    
    const messages = reader.consume({limit: 10});
  7. Constructor and method parameters are now, for the most part, JSON objects.
  8. All of kafka-go's WriterConfig and ReaderConfig options are now consolidated and available as parameters to the Writer and Reader objects, so everything is customizable. Only some of them are tested, though, so feel free to open an issue if you find any problems.
  9. Topic-related functions are now exposed in a Connection class. You need to instantiate the class in the init context and use its methods for listing, creating, and deleting topics.
    import { Connection } from "k6/x/kafka";
    
    const connection = new Connection({
      address: "localhost:9092"
    });
    
    connection.listTopics();
    connection.createTopic({ topic: "my-topic" });
    connection.deleteTopic("my-topic");
  10. All the scripts are now updated with the new syntax.
  11. A few bugs are fixed and dependencies are updated.
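
Item 3 above in practice: since errors are now thrown rather than returned, produce and consume calls can be wrapped in ordinary try/catch blocks. A minimal sketch using the classes shown above (this is an illustration, not a script from the release itself):

```javascript
import { Writer, Reader } from "k6/x/kafka";

const writer = new Writer({
  brokers: ["localhost:9092"],
  topic: "my-topic",
  autoCreateTopic: true,
});

const reader = new Reader({
  brokers: ["localhost:9092"],
  topic: "my-topic",
});

export default function () {
  try {
    writer.produce({ messages: [{ key: "key", value: "value" }] });
    const messages = reader.consume({ limit: 1 });
    console.log("consumed " + messages.length + " message(s)");
  } catch (error) {
    // Go-side errors surface as native JS exceptions, so they can be
    // handled here instead of being returned as values.
    console.error("Kafka operation failed: " + error);
  }
}
```

This script only runs inside a k6 binary built with the extension, so it cannot be executed standalone.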

I hope you like the new changes and please report issues if you find any.

What's Changed

Full Changelog: v0.11.0...v0.12.0

v0.11.0

20 Jun 10:51

This release includes several changes and fixes to how the extension handles SASL and TLS configurations. It also includes a feature by @enamrik 🙏 that lets users choose their subject name strategies for naming schemas in Schema Registry. Details of what happened are available on the respective PRs.
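
As an illustration of what a SASL/TLS setup looks like, here is a sketch using the class-based syntax introduced later in v0.12.0. The field names (sasl, tls, algorithm) and the SASL_PLAIN constant follow the current API docs and are assumptions relative to this release; verify them against the version you run:

```javascript
import { Writer, SASL_PLAIN } from "k6/x/kafka";

// Assumed field names: check the JS API docs for the exact config shape.
const writer = new Writer({
  brokers: ["localhost:9093"],
  topic: "my-topic",
  sasl: {
    username: "user",
    password: "password",
    algorithm: SASL_PLAIN,
  },
  tls: {
    enableTls: true,
  },
});
```

This script only runs inside a k6 binary built with the extension.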

Previously the JS API docs were in the README, but after merging #87, the docs are available separately. The docs always refer to the latest changes on the main branch, as explained here.

The release process and branching have also changed.

What's Changed

New Contributors

Full Changelog: v0.10.0...v0.11.0

v0.10.0

04 Jun 16:32

In this release, I refactored almost everything. Significant changes include heavy refactoring, adding tests, and fixing bugs reported by users and contributors, which I am grateful for 🙏. I also added docstrings and comments wherever I could.

You might also notice that old tags are stale because I mistakenly committed a binary file and then wanted to remove it from the history, which resulted in overwriting the history and losing some bits and pieces 🤦. Hopefully, the commit logs are intact. The lesson learned: NEVER use bfg-repo-cleaner again. ⚠️

Also, I've mentioned in the docs that I won't guarantee backward compatibility of the APIs, so please update your scripts to reflect the latest changes, or things will start to fail. You can also stick with the old v0.8.0 release and its Docker image if you don't want the latest changes, as mentioned in #56.

What's Changed

Full Changelog: 832c83b...main

v0.9.0

05 May 11:54

What's Changed

New Contributors

Full Changelog: v0.8.0...v0.9.0

v0.8.0

22 Apr 16:20
80918cb

What's Changed

New Contributors

Full Changelog: v0.7.0...v0.8.0

Allow producing Kafka messages without a key

09 Dec 17:26
c7bd315

In this release, a few things have changed:

  1. The project is re-licensed under Apache 2.0, so it can be run in the cloud.
  2. Thanks to @eduardowitter, PR #30 is merged, which allows producing Kafka messages with no keys.
  3. The README is updated with some useful troubleshooting instructions.
  4. A blog article is published on the k6 blog to demonstrate the capabilities of the extension.
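
Producing a keyless message is a matter of omitting the key field. A sketch using the later class-based syntax for illustration (this release still used the function-based API):

```javascript
import { Writer } from "k6/x/kafka";

const writer = new Writer({
  brokers: ["localhost:9092"],
  topic: "my-topic",
});

export default function () {
  // No "key" field: the message is produced with a null key, so Kafka
  // picks the partition according to its own assignment policy.
  writer.produce({ messages: [{ value: "value-only message" }] });
}
```

This script only runs inside a k6 binary built with the extension.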

Kafka message compression support

06 Sep 08:17

This release adds support for message compression in Kafka. Messages written to and read from Kafka can now be compressed using any of these codecs:

  • Gzip
  • Snappy
  • Lz4
  • Zstd

There's a script that shows how to use the compression feature. The createTopic function is also updated so that it can create topics with their compression config set to the desired codec.
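
Both sides of the feature can be sketched like this, using the later class-based syntax for illustration; the compression parameter, the CODEC_SNAPPY constant, and the configEntries shape follow the current API docs and are assumptions relative to this release's function-based API:

```javascript
import { Writer, Connection, CODEC_SNAPPY } from "k6/x/kafka";

const connection = new Connection({ address: "localhost:9092" });

// Create the topic with its compression config set to snappy.
connection.createTopic({
  topic: "compressed-topic",
  configEntries: [
    { configName: "compression.type", configValue: "snappy" },
  ],
});

// Compress produced message batches with the same codec.
const writer = new Writer({
  brokers: ["localhost:9092"],
  topic: "compressed-topic",
  compression: CODEC_SNAPPY,
});
```

This script only runs inside a k6 binary built with the extension.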

Support for Confluent's KafkaAvroSerializer/KafkaAvroDeSerializer

29 Jun 12:52

In this release, basic support for Confluent's KafkaAvroSerializer/KafkaAvroDeserializer is added by @fmck3516 (#9). This helps support the Confluent wire format for Kafka.
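
For context, the Confluent wire format mentioned above prefixes each serialized Avro payload with a magic byte and a 4-byte big-endian schema ID. A minimal standalone sketch of the framing, in plain JavaScript and independent of the extension (the function names frame/unframe are mine, for illustration):

```javascript
// Confluent wire format: [0x00 magic byte][4-byte big-endian schema ID][payload]
function frame(schemaId, payloadBytes) {
  const out = new Uint8Array(5 + payloadBytes.length);
  const view = new DataView(out.buffer);
  view.setUint8(0, 0); // magic byte
  view.setUint32(1, schemaId, false); // big-endian schema ID
  out.set(payloadBytes, 5);
  return out;
}

function unframe(bytes) {
  const view = new DataView(bytes.buffer, bytes.byteOffset, bytes.byteLength);
  if (view.getUint8(0) !== 0) throw new Error("missing magic byte");
  return { schemaId: view.getUint32(1, false), payload: bytes.slice(5) };
}
```

The serializer registers (or looks up) the schema in Schema Registry to get the ID, then frames the encoded record; the deserializer reverses the process.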

Also, now that there are four different test scripts, they've been moved to the script/ directory.