
Trifecta

Trifecta is a web-based and CLI tool that simplifies inspecting Kafka messages and Zookeeper data. Additionally, the CLI tool provides the capability to import/export data to/from ElasticSearch and MongoDB.

Install / Use

/learn @ldaniels528/Trifecta

Trifecta

Trifecta is a web-based and Command Line Interface (CLI) tool that enables users to quickly and easily inspect, verify and even query Kafka messages. In addition, Trifecta offers data import/export functions for transferring data between Kafka topics and many other Big Data Systems (including Cassandra, ElasticSearch, MongoDB and others).

Table of Contents

  • <a href="#motivations">Motivations</a>
  • <a href="#features">Features</a>
  • <a href="#development">Development</a>
    • <a href="#build-requirements">Build Requirements</a>
    • <a href="#external-dependencies">External Dependencies</a>
    • <a href="#building-the-code">Building the applications</a>
    • <a href="#testing-the-code">Running the tests</a>
    • <a href="#configuring-the-app">Configuring the application</a>
    • <a href="#running-the-app">Running the application</a>
  • <a href="#downloads">Downloads</a>
  • <a href="#whats-new">What's New</a>
  • <a href="#trifecta-ui">Trifecta UI</a>
    • <a href="#trifecta-ui-start">Starting Trifecta UI</a>
    • <a href="#trifecta-ui-configure">Configuring Trifecta UI</a>
    • <a href="#trifecta-ui-decoders">Default Decoders</a>
    • <a href="#trifecta-ui-inspect">Inspecting Kafka Messages</a>
    • <a href="#trifecta-ui-replicas">Replicas</a>
    • <a href="#trifecta-ui-query">Queries</a>
  • <a href="#trifecta-cli">Trifecta CLI</a>
    • <a href="#core-module">Core Module</a>
    • <a href="#kafka-module">Kafka Module</a>
      • <a href="#kafka-brokers">Kafka Brokers</a>
      • <a href="#kafka-topics">Kafka Topics</a>
      • <a href="#kafka-message-cursor">Navigable Cursor</a>
      • <a href="#kafka-consumer-group">Consumer Groups</a>
      • <a href="#kafka-inbound-traffic">Inbound Traffic</a>
      • <a href="#kafka-avro-module">Avro Integration</a>
      • <a href="#kafka-default-avro-decoder">Default Avro Decoders</a>
      • <a href="#kafka-search-by-key">Searching By Key</a>
      • <a href="#kafka-advanced-search">Advanced Search</a>
      • <a href="#kafka-search-by-query">Searching By Query</a>
    • <a href="#zookeeper-module">Zookeeper Module</a>
      • <a href="#zookeeper-list">Navigating directories and keys</a>
      • <a href="#zookeeper-get-put">Getting and setting key-value pairs</a>
    • <a href="#cassandra">Cassandra Module</a>
    • <a href="#elastic-search">Elastic Search Module</a>
      • <a href="#es-avro-to-json">Avro-to-Document support</a>
    • <a href="#mongodb-module">MongoDB Module</a>

<a name="motivations"></a>

Motivations

The motivations behind creating Trifecta are simple; testing, verifying and managing Kafka topics and Zookeeper key-value pairs is an arduous task. The goal of this project is to ease the pain of developing applications that make use of Kafka and ZooKeeper via a console-based tool using simple Unix-like (or SQL-like) commands.

<a name="features"></a>

Features

  • Avro integration
    • <a href="#kafka-avro-module">Kafka — Avro</a> support
    • <a href="#es-avro-to-json">Elastic Search — Avro</a> support
      • Copy Avro-encoded messages from Kafka to Elastic Search as JSON
    • Zookeeper — Avro support (coming soon)
  • <a href="#elastic-search">Elastic Search</a> integration (experimental)
  • <a href="#kafka-module">Kafka</a> integration
  • <a href="#zookeeper-module">Zookeeper</a> integration

<a name="development"></a>

Development

<a name="build-requirements"></a>

Build Requirements

<a name="external-dependencies"></a>

External Dependencies

In order to build from source, you'll need to download the dependencies above and issue the following command for each of them:

$ sbt publish-local

<a name="building-the-code"></a>

Building the applications

Trifecta's build process produces two distinct applications: the command-line interface (trifecta_cli) and the web-based user interface (trifecta_ui).

Building Trifecta CLI (Command-line interface)

$ sbt "project trifecta_cli" assembly

Building Trifecta UI (Typesafe Play application)

$ sbt "project trifecta_ui" dist
    

<a name="testing-the-code"></a>

Running the tests

$ sbt clean test    

<a name="configuring-the-app"></a>

Configuring the application

On startup, Trifecta reads $HOME/.trifecta/config.properties (or creates the file if it doesn't exist). This file contains the configuration properties and connection strings for all supported systems.

# common properties
trifecta.common.autoSwitching = true
trifecta.common.columns = 25
trifecta.common.debugOn = false
trifecta.common.encoding = UTF-8

# Kafka/Zookeeper properties
trifecta.zookeeper.host = localhost:2181

# supports the setting of a path prefix for multi-tenant Zookeeper setups
#trifecta.zookeeper.kafka.root.path = /kafka

# indicates whether Storm Partition Manager-style consumers should be read from Zookeeper
trifecta.kafka.consumers.storm = false

# Cassandra properties
trifecta.cassandra.hosts = localhost

# ElasticSearch properties
trifecta.elasticsearch.hosts = localhost:9200

# MongoDB properties
trifecta.mongodb.hosts = localhost
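As a concrete illustration, a minimal configuration file can be bootstrapped from the shell. This is only a sketch using a few of the properties shown above; it writes to a temporary directory instead of the real $HOME/.trifecta so it is side-effect free:

```shell
# Sketch: create a minimal Trifecta config.properties.
# The real file belongs at $HOME/.trifecta/config.properties;
# a temporary directory is used here to keep the example harmless.
CONF_DIR="$(mktemp -d)/.trifecta"
mkdir -p "$CONF_DIR"
cat > "$CONF_DIR/config.properties" <<'EOF'
trifecta.zookeeper.host = localhost:2181
trifecta.elasticsearch.hosts = localhost:9200
trifecta.mongodb.hosts = localhost
EOF
# count the properties we just wrote
grep -c '^trifecta\.' "$CONF_DIR/config.properties"
```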

<a name="configuring-kafka-consumers"></a>

Configuring Kafka Consumers

Trifecta currently supports 3 types of consumers:

  • Zookeeper Consumer Groups (Kafka 0.8.x)
  • Kafka-native Consumer Groups (Kafka 0.9.x)
  • Storm Partition Manager Consumers (Apache Storm-specific)

The most common type in use today is the Kafka-native consumer.

Kafka-native Consumer Groups

Kafka-native consumers require that the consumer IDs you want to monitor be registered via the trifecta.kafka.consumers.native property. Only registered consumer IDs (and their respective offsets) will be visible.

    trifecta.kafka.consumers.native = dev,test,qa

Zookeeper Consumer Groups

Zookeeper-based consumers are enabled by default; however, they can be disabled (which will improve performance) by setting the trifecta.kafka.consumers.zookeeper property to false.

    trifecta.kafka.consumers.zookeeper = false

Apache Storm Partition Manager Consumer Groups

Storm Partition Manager consumers are disabled by default; however, they can be enabled (at a performance cost) by setting the trifecta.kafka.consumers.storm property to true.

    trifecta.kafka.consumers.storm = true
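Taken together, the three consumer-related settings described above might appear in config.properties as follows (the values are illustrative, taken from the examples in this section):

```properties
# Kafka-native consumer groups to monitor (Kafka 0.9.x)
trifecta.kafka.consumers.native = dev,test,qa

# disable Zookeeper-based consumer groups for better performance
trifecta.kafka.consumers.zookeeper = false

# leave Storm Partition Manager consumers off unless needed
trifecta.kafka.consumers.storm = false
```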

<a name="running-the-app"></a>

Running the application

To start the Trifecta REPL:

$ java -jar trifecta_cli_0.20.0.bin.jar

Optionally, you can execute Trifecta instructions (commands) right from the command line:

$ java -jar trifecta_cli_0.20.0.bin.jar kls -l
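For convenience, the two invocation styles above can be wrapped in a small shell function. This is only a sketch; the trifecta function and TRIFECTA_JAR variable are illustrative names, not part of the distribution:

```shell
# Sketch: wrap the Trifecta CLI jar as an ordinary command.
# TRIFECTA_JAR and trifecta() are hypothetical conveniences, not official.
TRIFECTA_JAR="trifecta_cli_0.20.0.bin.jar"
trifecta() {
  java -jar "$TRIFECTA_JAR" "$@"
}
# Interactive REPL:  trifecta
# One-off command:   trifecta kls -l
```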

<a name="downloads"></a>

Downloads

Trifecta binaries are available for immediate download in the "<a href='https://github.com/ldaniels528/trifecta/releases'>releases</a>" section.

<a name="whats-new"></a>

What's New

v0.22.0

  • Reimplemented Publish and Query views (ported from v0.20.0)
  • Now offering simultaneous support for Kafka 0.8.x, 0.9.x and 0.10.x

v0.21.2

  • Added support for Kafka 0.10.0.0
  • Added support for AVDL
  • Updated to use MEANS.js 0.2.3.0 (Scalajs-Nodejs)

v0.20.0

  • Trifecta UI (CLI version)
    • Miscellaneous bug fixes
  • Trifecta UI (TypeSafe Play version)
    • Miscellaneous bug fixes

v0.19.2

  • Trifecta UI (TypeSafe version)
    • Fixed issue with out of memory errors while streaming messages

v0.19.1

  • Trifecta UI (CLI version)
    • Fixed issue with missing web resources

v0.19.0

  • Trifecta UI
    • Now a TypeSafe Play Application
    • Updated the user interface
    • Bug fixes

v0.18.1 to v0.18.20

  • Trifecta Core

    • Fixed issue with the application failing if the configuration file is not found
    • Upgraded to Kafka 0.8.2-beta
    • Kafka Query Language (KQL), formerly Big-Data Query Language (BDQL), now has a simplified grammar
      • The "<a href="#trifecta-ui-query">with default</a>" clause is no longer necessary
    • Upgraded to Kafka 0.8.2.0
    • Added configuration key to support multi-tenant Zookeeper setups
    • Added support for Kafka ~~v0.8.2.0~~ v0.9.0.0 consumers
  • Trifecta UI

    • Added capability to navigate directly from a message (in the Inspect tab) to its decoder (in the Decoders tab)
    • Decoder tab user interface improvements
    • Observe tab user interface improvements
      • The Consumers section has been enhanced to display topic and consumer offset deltas
      • Redesigned the Replicas view to report under-replicated partitions
      • The Topics section has been enhanced to display topic offset deltas
    • Query tab user interface improvements
      • Multiple queries can be executed concurrently
    • The embedded web server now supports asynchronous request/response flows
    • Added real-time message streaming capability to the Inspect tab
    • Swapped the Inspect and Observe modules
    • Added a new Brokers view to the Observe module
    • Reworked the Brokers view (Inspect module)
    • Fixed sort ordering of partitions in the Replicas view (Inspect module)
    • Fixed potential bug related to retrieving the list of available brokers
    • Now a TypeSafe Play Application w/ an updated user interface

<a name="trifecta-ui"></a>

Trifecta UI

Trifecta UI is a single-page web application (built with Angular.js) backed by a REST service layer with web-socket support; it provides a comprehensive and powerful set of features for inspecting Kafka topic partitions and messages.

<a name="trifecta-ui-start"></a>

Starting Trifecta UI
