
Karapace
========

Karapace. Your Apache Kafka® essentials in one tool.

An `open-source <https://github.com/Aiven-Open/karapace/blob/master/LICENSE>`_ implementation of `Kafka REST <https://docs.confluent.io/platform/current/kafka-rest/index.html#features>`_ and `Schema Registry <https://docs.confluent.io/platform/current/schema-registry/index.html>`_.

|Tests| |Contributor Covenant|

.. |Tests| image:: https://github.com/Aiven-Open/karapace/actions/workflows/tests.yml/badge.svg?branch=main
   :target: https://github.com/Aiven-Open/karapace/actions/workflows/tests.yml?query=branch%3Amain

.. |Contributor Covenant| image:: https://img.shields.io/badge/Contributor%20Covenant-2.1-4baaaa.svg
   :target: CODE_OF_CONDUCT.md

Overview
--------

Karapace stores schemas in a central repository, which clients can access to serialize and deserialize messages. Each schema maintains its own version history, and versions can be checked for compatibility against one another.

Karapace REST provides a RESTful interface to your Apache Kafka cluster, allowing you to produce and consume messages and perform administrative cluster work, all over HTTP.
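Clients that use the registry typically do not embed the full schema in every message; instead they prepend a small header carrying the schema id, and look the schema up in the registry on the consuming side. A minimal sketch of this framing, assuming the commonly used wire format of a zero magic byte followed by a big-endian 4-byte schema id (the helper names here are illustrative, not Karapace APIs):

```python
import struct

MAGIC_BYTE = 0  # conventional first byte of a registry-framed message


def frame_message(schema_id: int, payload: bytes) -> bytes:
    """Prepend the magic byte and a big-endian 4-byte schema id to a payload."""
    return struct.pack(">bI", MAGIC_BYTE, schema_id) + payload


def unframe_message(message: bytes) -> tuple[int, bytes]:
    """Split a framed message back into (schema_id, payload)."""
    magic, schema_id = struct.unpack(">bI", message[:5])
    if magic != MAGIC_BYTE:
        raise ValueError(f"unexpected magic byte {magic}")
    return schema_id, message[5:]


framed = frame_message(1, b"\x02a")
print(unframe_message(framed))  # (1, b'\x02a')
```

A consumer reads the schema id from the header, fetches that schema from the registry (usually caching it), and uses it to deserialize the payload.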

Features
--------

  • Drop-in replacement, on both the client and the server side, for pre-existing Schema Registry / Kafka REST Proxy setups
  • Moderate memory consumption
  • Asynchronous architecture based on aiohttp
  • Supports Avro, JSON Schema, and Protobuf
  • Leader/replica architecture for high availability and load balancing
  • Observability with metrics and OpenTelemetry
  • Based mostly on the aiokafka (rdkafka) Kafka client library
  • Schema registry built on FastAPI

Compatibility details
---------------------

Karapace is compatible with Schema Registry 6.1.1 at the API level and supports all operations in that API. When a new version of Schema Registry is released, the goal is to support it within a reasonable time.

There are some caveats: schema normalization, and error messages exactly matching those of Schema Registry, cannot always be fully guaranteed.

Setup
-----

Using Docker
~~~~~~~~~~~~

To get up and running with the latest build of Karapace, a Docker image is available::

  # Fetch the latest build from the main branch
  docker pull ghcr.io/aiven-open/karapace:develop

  # Fetch the latest release
  docker pull ghcr.io/aiven-open/karapace:latest

Versions 3.7.1 and earlier are available from the ghcr.io/aiven registry::

  docker pull ghcr.io/aiven/karapace:3.7.1

An example setup, including configuration and a Kafka connection, is available as a Compose file::

  docker compose -f ./container/compose.yml up -d

Then you should be able to reach two sets of endpoints:

  • Karapace schema registry on http://localhost:8081
  • Karapace REST on http://localhost:8082

For local development with Docker, use the convenient make commands (see Development_ section below).

Configuration
^^^^^^^^^^^^^

Each configuration key can be overridden with an environment variable prefixed with ``KARAPACE_``; the exception is configuration keys that themselves start with the string ``karapace``. For example, to override the ``bootstrap_uri`` config value, set the environment variable ``KARAPACE_BOOTSTRAP_URI``. Here_ you can find an example configuration file to give you an idea of what you need to change.

.. _Here: https://github.com/Aiven-Open/karapace/blob/master/karapace.config.json
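To make the override rule concrete, here is a hypothetical resolver (not Karapace's actual config loader) that prefers the prefixed environment variable over the value from the configuration file:

```python
import os


def config_value(key: str, file_config: dict, default=None):
    """Resolve a config key: a KARAPACE_-prefixed env var wins over the file value."""
    env_name = "KARAPACE_" + key.upper()
    if env_name in os.environ:
        return os.environ[env_name]
    return file_config.get(key, default)


# Simulate an environment override of the file-provided bootstrap_uri.
os.environ["KARAPACE_BOOTSTRAP_URI"] = "kafka:9092"
file_config = {"bootstrap_uri": "localhost:9092", "client_id": "sr-1"}
print(config_value("bootstrap_uri", file_config))  # kafka:9092  (env var wins)
print(config_value("client_id", file_config))      # sr-1        (file value used)
```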

Using Sources
~~~~~~~~~~~~~

Install
^^^^^^^

You can do a source install using::

  pip install .

Troubleshooting notes:

  • An updated version of wheel (https://pypi.org/project/wheel/) is required.
  • Updated versions of Go and Rust are required.
  • Create and activate a virtual environment (venv) to manage dependencies.

Run
^^^

  • Make sure Kafka is running.

Start Karapace. This should start Karapace on http://localhost:8081::

  $ karapace karapace.config.json

Verify in a browser: http://localhost:8081/subjects should return an array of subjects, or an empty array if none exist yet. Or with curl::

  $ curl -X GET http://localhost:8081/subjects

To enable OIDC authentication on Karapace, configure the OIDC JWKS URL details::

  sasl_oauthbearer_jwks_endpoint_url = ""
  sasl_oauthbearer_expected_issuer = ""
  sasl_oauthbearer_expected_audience = ""
  sasl_oauthbearer_sub_claim_name = "sub"

There is a detailed section about OAuth2 authentication for karapace below.

To enable OIDC authorization on Karapace, configure the parameters below together with the ones above::

  sasl_oauthbearer_authorization_enabled: bool = False
  sasl_oauthbearer_client_id: str | None = None
  sasl_oauthbearer_roles_claim_path: str | None = None
  sasl_oauthbearer_method_roles: dict[str, list[str]] = {"GET": [], "POST": [], "PUT": [], "DELETE": []}

There is a detailed section about OAuth2 authorization for karapace below.

Start the Karapace REST proxy. This should start it on http://localhost:8082::

  $ karapace rest-proxy-karapace.config.json

To enable authentication and authorization on the REST proxy, configure ``sasl_mechanism`` in the config with a value such as PLAIN or OAUTHBEARER::

  sasl_mechanism = "OAUTHBEARER",
  sasl_oauth_token_provider = token_provider,
  security_protocol = "SASL_SSL",
  ssl_cafile = "ca.pem",

If ``sasl_mechanism`` is configured to PLAIN::

  sasl_mechanism = "PLAIN",
  security_protocol = "SASL_PLAINTEXT",
  sasl_plain_username = "your_username",
  sasl_plain_password = "your_password"

There is a detailed section about OAuth2 authentication for rest proxy below.

Verify with list topics::

  $ curl "http://localhost:8082/topics"

Schema Registry API reference
-----------------------------

To register the first version of a schema under the subject "test-key" using an Avro schema::

  $ curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"schema": "{\"type\": \"record\", \"name\": \"Obj\", \"fields\":[{\"name\": \"age\", \"type\": \"int\"}]}"}' \
    http://localhost:8081/subjects/test-key/versions
  {"id":1}
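The shell quoting in requests like the one above is fiddly because the schema is itself a JSON document embedded as a string inside the request body. From a script it is easier to let a JSON library do the double encoding; a small illustrative sketch (the schema and field names are just examples):

```python
import json

schema = {
    "type": "record",
    "name": "Obj",
    "fields": [{"name": "age", "type": "int"}],
}
# The registry expects "schema" to be a JSON *string*, so the schema document
# is JSON-encoded twice: once for the body, once for the embedded string.
body = json.dumps({"schema": json.dumps(schema)})
print(body)
```

POSTing this body to ``/subjects/test-key/versions`` with the ``application/vnd.schemaregistry.v1+json`` content type is equivalent to the curl call above, without any manual backslash escaping.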

To register a version of a schema using JSON Schema, one needs to use the schemaType property::

  $ curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"schemaType": "JSON", "schema": "{\"type\": \"object\",\"properties\":{\"age\":{\"type\": \"number\"}},\"additionalProperties\":true}"}' \
    http://localhost:8081/subjects/test-key-json-schema/versions
  {"id":2}

To list all subjects (including the one created just above)::

  $ curl -X GET http://localhost:8081/subjects
  ["test-key"]

To list all the versions of a given schema (including the one just created above)::

  $ curl -X GET http://localhost:8081/subjects/test-key/versions
  [1]

To fetch back the schema whose global id is 1 (i.e. the one registered above)::

  $ curl -X GET http://localhost:8081/schemas/ids/1
  {"schema":"{\"fields\":[{\"name\":\"age\",\"type\":\"int\"}],\"name\":\"Obj\",\"type\":\"record\"}"}

To get the specific version 1 of the schema just registered, run::

  $ curl -X GET http://localhost:8081/subjects/test-key/versions/1
  {"subject":"test-key","version":1,"id":1,"schema":"{\"fields\":[{\"name\":\"age\",\"type\":\"int\"}],\"name\":\"Obj\",\"type\":\"record\"}"}

To get the latest version of the schema under subject test-key, run::

  $ curl -X GET http://localhost:8081/subjects/test-key/versions/latest
  {"subject":"test-key","version":1,"id":1,"schema":"{\"fields\":[{\"name\":\"age\",\"type\":\"int\"}],\"name\":\"Obj\",\"type\":\"record\"}"}

In order to delete version 10 of the schema registered under subject "test-key" (if it exists)::

  $ curl -X DELETE http://localhost:8081/subjects/test-key/versions/10
  10

To delete all versions of the schema registered under subject "test-key"::

  $ curl -X DELETE http://localhost:8081/subjects/test-key
  [1]

Test the compatibility of a schema with the latest schema under subject "test-key"::

  $ curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"schema": "{\"type\": \"int\"}"}' \
    http://localhost:8081/compatibility/subjects/test-key/versions/latest
  {"is_compatible":true}

NOTE: if the subject's compatibility mode is transitive (BACKWARD_TRANSITIVE, FORWARD_TRANSITIVE or FULL_TRANSITIVE) then the compatibility is checked not only against the latest schema, but also against all previous schemas, as it would be done when trying to register the new schema through the subjects/<subject-key>/versions endpoint.
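The note above can be sketched as a pure function that selects which stored versions a candidate schema is checked against (illustrative only, not Karapace's internal logic):

```python
TRANSITIVE_MODES = {"BACKWARD_TRANSITIVE", "FORWARD_TRANSITIVE", "FULL_TRANSITIVE"}


def versions_to_check(mode: str, stored_versions: list[int]) -> list[int]:
    """Return the schema versions a candidate must be compatible with."""
    if not stored_versions:
        return []
    if mode in TRANSITIVE_MODES:
        # Transitive modes check against every previously registered version.
        return list(stored_versions)
    # Non-transitive modes check against the latest version only.
    return [stored_versions[-1]]


print(versions_to_check("BACKWARD", [1, 2, 3]))             # [3]
print(versions_to_check("BACKWARD_TRANSITIVE", [1, 2, 3]))  # [1, 2, 3]
```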

Get the current global backwards compatibility setting value::

  $ curl -X GET http://localhost:8081/config
  {"compatibilityLevel":"BACKWARD"}

Change compatibility requirements for all subjects where it's not specifically defined otherwise::

  $ curl -X PUT -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"compatibility": "NONE"}' http://localhost:8081/config
  {"compatibility":"NONE"}

Change the compatibility requirement to FULL for the test-key subject::

  $ curl -X PUT -H "Content-Type: application/vnd.schemaregistry.v1+json" \
    --data '{"compatibility": "FULL"}' http://localhost:8081/config/test-key
  {"compatibility":"FULL"}

REST proxy API reference
------------------------

List topics::

  $ curl "http://localhost:8082/topics"

Get info for one particular topic::

  $ curl "http://localhost:8082/topics/my_topic"

Produce a message backed by the schema registry::

  $ curl -H "Content-Type: application/vnd.kafka.avro.v2+json" -X POST \
    --data '{"value_schema": "{\"namespace\": \"example.avro\", \"type\": \"record\", \"name\": \"simple\", \"fields\": [{\"name\": \"name\", \"type\": \"string\"}]}", "records": [{"value": {"name": "name0"}}]}' \
    http://localhost:8082/topics/my_topic
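As with schema registration, the produce body embeds the Avro schema as a JSON string while the records stay plain JSON. A hypothetical sketch building the same body programmatically, again to avoid the shell escaping:

```python
import json

value_schema = {
    "namespace": "example.avro",
    "type": "record",
    "name": "simple",
    "fields": [{"name": "name", "type": "string"}],
}
records = [{"value": {"name": "name0"}}]

# "value_schema" is an embedded JSON string; "records" stays as plain JSON.
body = json.dumps({"value_schema": json.dumps(value_schema), "records": records})
print(body)
```

POSTing this body to ``/topics/my_topic`` with the ``application/vnd.kafka.avro.v2+json`` content type matches the curl call above.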

Create a consumer with consumer group "avro_consumers" and consumer instance "my_consumer"::

  $ curl -X POST -H "Content-Type: application/vnd.kafka.v2+json" -H "Accept: application/vnd.kafka.v2+json" \
    --data '{"name": "my_
