Kafkit
Kafkit helps you write Kafka producers and consumers in Python with asyncio:
- Kafkit integrates aiokafka consumers and producers with the Confluent Schema Registry. The Deserializer class can deserialize messages with any schema that’s registered in a Confluent Schema Registry. The Serializer class can serialize Python objects against a single Avro schema, while the PolySerializer class is flexible enough to handle multiple schemas (see the first sketch after this list).
- Kafkit provides Python APIs for working with the Confluent Schema Registry’s HTTP API. The RegistryApi client includes high-level methods that manage subjects and their schemas in a registry. These methods are cached so that the RegistryApi client can be an integral part of your application’s schema management. Additionally, RegistryApi includes low-level HTTP methods (GET, POST, PUT, PATCH, DELETE) so you can work directly with the Confluent Schema Registry API if you want. kafkit.registry.aiohttp.RegistryApi is implemented with aiohttp, but that’s not the only implementation. Kafkit subscribes to the sans-IO architecture (gidgethub is a popular example), meaning that you can subclass kafkit.registry.sansio.RegistryApi to integrate with your favorite HTTP client library. The kafkit.registry.sansio.MockRegistryApi class is a mock client that you can use in your app’s unit tests (see the second sketch after this list).
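To illustrate the first item, here is a minimal sketch of serializing a Python object against a single Avro schema and deserializing it again through the Schema Registry. The import paths, the Serializer.register() classmethod, the callable serializer, the example schema, and the registry URL are assumptions drawn from the description above, not verbatim API documentation; check the API reference for the exact signatures.

```python
import asyncio

import aiohttp

# Import paths are assumptions based on the class names above.
from kafkit.registry import Deserializer, Serializer
from kafkit.registry.aiohttp import RegistryApi

# A hypothetical Avro schema used only for this sketch.
SCHEMA = {
    "type": "record",
    "name": "Greeting",
    "namespace": "example",
    "fields": [{"name": "message", "type": "string"}],
}


async def main() -> None:
    async with aiohttp.ClientSession() as session:
        # The aiohttp-based client described above; the URL is a placeholder.
        registry = RegistryApi(session=session, url="http://localhost:8081")

        # Serialize against a single Avro schema. The register() classmethod
        # and the callable serializer are assumptions from the prose above.
        serializer = await Serializer.register(registry=registry, schema=SCHEMA)
        data = serializer({"message": "Hello, Kafka!"})  # wire-format bytes

        # Deserialize a message whose schema is registered in the registry.
        deserializer = Deserializer(registry=registry)
        result = await deserializer.deserialize(data)
        print(result)


asyncio.run(main())
```

In a real application the serialized bytes would be passed to an aiokafka producer and the deserializer would be fed the raw message values received by an aiokafka consumer.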
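For the second item, this sketch shows the RegistryApi client on its own, with one high-level cached call and one low-level HTTP call. The method names register_schema() and get(), their signatures, and the URL path are assumptions based on the description above; because of the sans-IO design, the same calls should work against any RegistryApi subclass, including the aiohttp implementation shown here.

```python
import asyncio

import aiohttp

from kafkit.registry.aiohttp import RegistryApi

# A hypothetical schema used only for this sketch.
SCHEMA = {
    "type": "record",
    "name": "Greeting",
    "namespace": "example",
    "fields": [{"name": "message", "type": "string"}],
}


async def main() -> None:
    async with aiohttp.ClientSession() as session:
        registry = RegistryApi(session=session, url="http://localhost:8081")

        # High-level, cached schema management; the exact method name and
        # return value are assumptions drawn from the description above.
        schema_id = await registry.register_schema(SCHEMA)
        print("Registered schema ID:", schema_id)

        # Low-level HTTP access to the Confluent Schema Registry API
        # (GET/POST/PUT/PATCH/DELETE), in the gidgethub-inspired style
        # mentioned above; treat the path and return type as assumptions.
        subjects = await registry.get("/subjects")
        print("Subjects:", subjects)


asyncio.run(main())
```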
User guide
Project information
Kafkit is developed on GitHub at https://github.com/lsst-sqre/kafkit.