---
description: Common information about Daggy and Getting Started
---

Free code signing on Windows provided by SignPath.io, certificate by SignPath Foundation

About Daggy


Daggy

Daggy - Data Aggregation Utility and C/C++ developer library for catching data streams

Daggy's main goals are to be serverless, cross-platform, simple, and easy to use.

Daggy can help developers, QA, DevOps, and other engineers debug, analyze, and control any data streams, including requests and responses, in distributed network systems, for example those based on a microservice architecture.

{% hint style="info" %} In short, Daggy runs local or remote processes at the same time, simultaneously reads their output, and streams and aggregates it under one session {% endhint %}

{% embed url="https://youtu.be/oeNSwv9oYDc" %} Daggy Screencast {% endembed %}


Introduction and goal concepts

The Daggy Project consists of:

  1. Core - a library for catching and aggregating streams
  2. Daggy - a console application that aggregates streams into files

Daggy High Level Design

Basic terms

The main goal of the Daggy Software System is to obtain data from environments, which are declared in sources, and turn it into streams that are delivered via providers into aggregators.

An environment contains data for streams. Out of the box, Core supports local and remote environments, but it can be extended with user-defined environments. A local environment is located on the same host as the Daggy Core instance. A remote environment is located on a different host from the Daggy Core instance. A user-defined environment can be located anywhere, e.g. in databases, on network disks, etc.

Sources are declarations of how to obtain data from environments. They describe which kind of data needs to be converted to streams and which provider is required.

Here is an example of sources that contains one local environment and one remote environment:

aliases:  
    - &my_commands
        pingYa:
            exec: ping ya.ru
            extension: log
        pingGoo:
            exec: ping goo.gl
            extension: log
        
    - &ssh_auth
        user: {{env_USER}}
        passphrase: {{env_PASSWORD}}
            
sources:
    local_environment:
        type: local
        commands: *my_commands
    remote_environment:
        host: 192.168.1.9
        type: ssh2
        parameters: *ssh_auth
        commands: *my_commands

The streams from the local environment are generated via the local provider (note type: local).

The streams from the remote environment are generated via the ssh2 provider (note type: ssh2).

Out of the box, Core provides the local and ssh2 providers. Both obtain the data for streams from processes: the local provider runs local processes and generates streams from their output channels (stdout and stderr), while the ssh2 provider runs remote processes via the ssh2 protocol and likewise generates streams from their output channels. Daggy Core can be extended with user-defined providers that generate streams from, for example, an HTTP environment.

Providers generate streams in parts via commands. Each part carries a unique seq_num value, assigned gaplessly and consistently, so the full data of a stream can be recovered by concatenating its parts in ascending seq_num order. Each stream is generated by a command.

The Core combines the streams from any number of providers into one Core Streams Session. The streams from a Core Streams Session can be aggregated by aggregators or viewed by the user.

Out of the box, the Core provides several types of aggregators:

  1. File - aggregates streams into files at runtime, as data arrives. This aggregator is used by the Daggy Console Application.
  2. Console - aggregates streams into console output. This aggregator is used by the Daggy Console Application.
  3. Callback - aggregates streams into ANSI C11 callbacks. This aggregator is used by the Core ANSI C11 Interface.

The Core library can be extended with user-defined aggregators.

Getting Started

Getting Daggy

Fedora

sudo dnf install daggy daggy-devel

Windows

Download the installer or the portable version from the releases page.

Linux

Download the rpm/deb package or the portable version from the releases page.

MacOS

Download the portable version from the releases page or install via Homebrew:

brew install --build-from-source synacker/daggy/daggy

Install from source with conan

{% hint style="info" %} Build requirements: Conan, cmake, git and a C++17/20 compiler. {% endhint %}

git clone https://github.com/synacker/daggy.git
mkdir build
cd build
conan install ../daggy --build=missing -o package_deps=True
conan build ../daggy

Install from source with cmake (recommended for maintainers)

{% hint style="info" %} System dependencies: qt6 (Core and Network), libssh2, libyaml-cpp, kainjow-mustache {% endhint %}

The tweak number must be set to zero. For example, if you get version 2.2.1, you need to set -DVERSION=2.2.1.0.

git clone https://github.com/synacker/daggy.git
mkdir build
cd build
cmake -DVERSION=2.2.0.0 ../daggy/src -DBUILD_SHARED_LIBS=ON
cmake --build .

Add as conan package dependency

Get daggy from conan-center.

{% code title="conanfile.py" %}

def requirements(self):
    self.requires("daggy/2.2.0")

{% endcode %}

Check installation of Daggy Core C++17/20 interface

{% code title="test.cpp" %}

#include <DaggyCore/Core.hpp>
#include <DaggyCore/Sources.hpp>
#include <DaggyCore/aggregators/CFile.hpp>
#include <DaggyCore/aggregators/CConsole.hpp>

#include <QCoreApplication>
#include <QTimer>

namespace {
constexpr const char* json_data = R"JSON(
{
    "sources": {
        "localhost" : {
            "type": "local",
            "commands": {
                "ping1": {
                    "exec": "ping 127.0.0.1",
                    "extension": "log"
                },
                "ping2": {
                    "exec": "ping 127.0.0.1",
                    "extension": "log",
                    "restart": true
                }
            }
        }
    }
}
)JSON";
}

int main(int argc, char** argv) 
{
    QCoreApplication app(argc, argv);
    daggy::Core core(*daggy::sources::convertors::json(json_data));

    daggy::aggregators::CFile file_aggregator("test");
    daggy::aggregators::CConsole console_aggregator("test");

    core.connectAggregator(&file_aggregator);
    core.connectAggregator(&console_aggregator);

    QObject::connect(&core, &daggy::Core::stateChanged, &core,
    [&](DaggyStates state){
        if(state == DaggyFinished)
            app.quit();      
    });

    QTimer::singleShot(3000, &core, [&]()
    {
        core.stop();
    });

    QTimer::singleShot(5000, &core, [&]()
    {
        app.exit(-1);
    });

    core.prepare();
    core.start();

    return app.exec();
}

{% endcode %}

Check installation of Daggy Core C11 interface

{% code title="test.c" %}

#include <stdio.h>
#ifdef _WIN32
#include <Windows.h>
#else
#include <unistd.h>
#endif

#include <DaggyCore/Core.h>

const char* json_data =
"{\
    \"sources\": {\
        \"localhost\" : {\
            \"type\": \"local\",\
            \"commands\": {\
                \"ping1\": {\
                    \"exec\": \"ping 127.0.0.1\",\
                    \"extension\": \"log\"\
                },\
                \"ping2\": {\
                    \"exec\": \"ping 127.0.0.1\",\
                    \"extension\": \"log\"\
                    }\
            }\
        }\
    }\
}"
;

/* Cross-platform sleep helper; the name is reconstructed from the
   truncated source listing, which breaks off at this point. */
void sleep_ms(unsigned int milliseconds)
{
#ifdef _WIN32
    Sleep(milliseconds);
#else
    usleep(milliseconds * 1000);
#endif
}

{% endcode %}