DataLens

DataLens is a modern business intelligence and data visualization system. It was developed and extensively used as the primary BI tool at Yandex and is also available as part of the Yandex Cloud platform. See also our roadmap, release notes, and community in Telegram.

Getting started

Installing Docker

DataLens requires Docker to be installed. Follow the official Docker installation instructions for your platform.

Note about Docker Compose:

  • The new Docker Compose plugin is available as the docker-compose-v2 package on Ubuntu 20.04/22.04/24.04 from the base APT repository.

  • The minimal supported version of the legacy docker-compose utility (as a separate package) is 1.29.0. It is included in the base APT repository as the docker-compose package only on Ubuntu 22.04.
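A quick way to confirm which Compose variant you have and whether the legacy binary meets the 1.29.0 minimum is a small shell check; this is a sketch, assuming `sort -V` (GNU coreutils) is available:

```shell
# Sketch: prefer the Compose v2 plugin; otherwise verify the legacy
# docker-compose binary is at least version 1.29.0.
version_ge() {
  # returns 0 if $1 >= $2, compared as dotted version numbers
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

if docker compose version >/dev/null 2>&1; then
  echo "Using Docker Compose v2 plugin"
elif command -v docker-compose >/dev/null 2>&1; then
  v=$(docker-compose --version | grep -oE '[0-9]+\.[0-9]+\.[0-9]+' | head -n1)
  version_ge "$v" "1.29.0" || echo "docker-compose $v is too old (need >= 1.29.0)"
else
  echo "Docker Compose not found"
fi
```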

Running containers

Clone repository:

git clone https://github.com/datalens-tech/datalens && cd datalens

For a quick start, use the following command to start the DataLens containers:

HC=1 docker compose up

This command launches all containers required to run DataLens, and the UI will be available at http://localhost:8080 (the default user and password are admin / admin).

<details> <summary>Using different port for UI</summary> If you want to use a different port (e.g. `8081`), you can set it using the `UI_PORT` env variable:
UI_PORT=8081 docker compose up
</details>
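The first startup may take a while as images are pulled and services initialize. As a sketch, you can poll the UI until it responds (assumes `curl` is available and the default port 8080):

```shell
# Sketch: wait until the DataLens UI answers on the given port.
wait_for_ui() {
  # wait_for_ui [port] [retries] -> 0 when the UI responds, 1 on timeout
  port="${1:-8080}" retries="${2:-30}"
  i=0
  while [ "$i" -lt "$retries" ]; do
    curl -fsS -o /dev/null "http://localhost:${port}" && return 0
    i=$((i + 1))
    sleep 2
  done
  return 1
}

# usage: wait_for_ui 8080 60
```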

However, for production usage we recommend generating a compose file with random secrets:

# generate random secrets with openssl, store them in the .env file, and prepare the production compose template
./init.sh --hc

# and then run production compose
docker compose -f ./docker-compose.production.yaml up -d

# you can also generate and run production compose file with one command
./init.sh --hc --up

The randomly generated admin password is stored in the .env file and printed to the terminal.
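If you need to retrieve that password later, a small dotenv reader works as a sketch; the key name `ADMIN_PASSWORD` is an assumption here, so check the generated `.env` file for the actual variable name:

```shell
# Sketch: read a value from a dotenv-style file.
read_env_var() {
  # read_env_var FILE KEY -> prints everything after the first '='
  grep "^${2}=" "$1" | head -n1 | cut -d= -f2-
}

# usage (ADMIN_PASSWORD is a hypothetical key name):
# read_env_var .env ADMIN_PASSWORD
```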

Note: you can list all script arguments by running ./init.sh --help

<details> <summary>Notice on Highcharts usage</summary>
  Highcharts is a proprietary commercial product. If you enable Highcharts in your DataLens instance (with `HC=1` variable), you should comply with Highcharts license (https://github.com/highcharts/highcharts/blob/master/license.txt).

  When Highcharts is disabled in DataLens, we use D3.js instead. However, currently only a few visualization types are compatible with D3.js. We are actively working on adding D3 support to additional visualizations and are going to completely replace Highcharts with D3 in DataLens.
</details> <details> <summary>How to enable Yandex Maps</summary>

Available since release v1.11.0

Use the following container parameters for launch:

| Parameter          | Description                               | Values    |
| ------------------ | ----------------------------------------- | --------- |
| YANDEX_MAP_ENABLED | Enable usage of Yandex Maps visualization | 1 or true |
| YANDEX_MAP_TOKEN   | Yandex Maps API key                       | <string>  |

YANDEX_MAP_ENABLED=1 YANDEX_MAP_TOKEN=XXXXXXXXX docker compose up

# or if you use init.sh script
./init.sh --yandex-map --yandex-map-token XXXXXXXXX --up
</details>

How to update

To update DataLens to the latest version, pull the git repository and restart the containers:

git pull

# if you use base compose file
docker compose up

# if you use init.sh script
./init.sh --up

All your user settings, connections, and created objects will be preserved as they are stored in the db-postgres docker volume. The update process does not affect your data.

Deploy with Helm chart in k8s cluster

For deployment in a Kubernetes cluster, you can use the Helm chart from an OCI-compatible package registry.

First install Helm release:

# generating rsa keys for auth service and temporal
AUTH_TOKEN_PRIVATE_KEY=$(openssl genpkey -algorithm RSA -pkeyopt "rsa_keygen_bits:4096" 2>/dev/null)
AUTH_TOKEN_PUBLIC_KEY=$(echo "${AUTH_TOKEN_PRIVATE_KEY}" | openssl rsa -pubout 2>/dev/null)
TEMPORAL_AUTH_PRIVATE_KEY=$(openssl genpkey -algorithm RSA -pkeyopt "rsa_keygen_bits:4096" 2>/dev/null)
TEMPORAL_AUTH_PUBLIC_KEY=$(echo "${TEMPORAL_AUTH_PRIVATE_KEY}" | openssl rsa -pubout 2>/dev/null)
BI_DYNAMIC_US_AUTH_PRIVATE_KEY=$(openssl genpkey -algorithm RSA -pkeyopt "rsa_keygen_bits:4096" 2>/dev/null)
BI_DYNAMIC_US_AUTH_PUBLIC_KEY=$(echo "${BI_DYNAMIC_US_AUTH_PRIVATE_KEY}" | openssl rsa -pubout 2>/dev/null)
UI_DYNAMIC_US_AUTH_PRIVATE_KEY=$(openssl genpkey -algorithm RSA -pkeyopt "rsa_keygen_bits:4096" 2>/dev/null)
UI_DYNAMIC_US_AUTH_PUBLIC_KEY=$(echo "${UI_DYNAMIC_US_AUTH_PRIVATE_KEY}" | openssl rsa -pubout 2>/dev/null)

helm upgrade --install datalens oci://ghcr.io/datalens-tech/helm/datalens \
--namespace datalens --create-namespace \
--set "secrets.AUTH_TOKEN_PRIVATE_KEY=${AUTH_TOKEN_PRIVATE_KEY}" \
--set "secrets.AUTH_TOKEN_PUBLIC_KEY=${AUTH_TOKEN_PUBLIC_KEY}" \
--set "secrets.TEMPORAL_AUTH_PRIVATE_KEY=${TEMPORAL_AUTH_PRIVATE_KEY}" \
--set "secrets.TEMPORAL_AUTH_PUBLIC_KEY=${TEMPORAL_AUTH_PUBLIC_KEY}" \
--set "secrets.BI_DYNAMIC_US_AUTH_PRIVATE_KEY=${BI_DYNAMIC_US_AUTH_PRIVATE_KEY}" \
--set "secrets.BI_DYNAMIC_US_AUTH_PUBLIC_KEY=${BI_DYNAMIC_US_AUTH_PUBLIC_KEY}" \
--set "secrets.UI_DYNAMIC_US_AUTH_PRIVATE_KEY=${UI_DYNAMIC_US_AUTH_PRIVATE_KEY}" \
--set "secrets.UI_DYNAMIC_US_AUTH_PUBLIC_KEY=${UI_DYNAMIC_US_AUTH_PUBLIC_KEY}"
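The four `openssl` key pairs above all follow the same pattern, so as a sketch they can be generated in a loop (2048-bit keys are used here to keep the example fast; the instructions above use 4096):

```shell
# Sketch: generate an RSA key pair per service and export both halves
# as environment variables named <SERVICE>_PRIVATE_KEY / <SERVICE>_PUBLIC_KEY.
for name in AUTH_TOKEN TEMPORAL_AUTH BI_DYNAMIC_US_AUTH UI_DYNAMIC_US_AUTH; do
  priv=$(openssl genpkey -algorithm RSA -pkeyopt "rsa_keygen_bits:2048" 2>/dev/null)
  pub=$(printf '%s\n' "$priv" | openssl rsa -pubout 2>/dev/null)
  export "${name}_PRIVATE_KEY=${priv}"
  export "${name}_PUBLIC_KEY=${pub}"
done
```

The exported variables can then be passed to `helm upgrade --install` via `--set "secrets.…"` exactly as in the command above.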

Note: the Helm template engine does not provide built-in functions for generating RSA key pairs, which is why the keys are created with openssl beforehand.

Update Helm release:

helm upgrade datalens oci://ghcr.io/datalens-tech/helm/datalens --namespace datalens

The admin login and password will be stored in the datalens-secrets Kubernetes secret resource.

Parts of the project

DataLens consists of five main parts:

  • UI is a single-page application with a corresponding Node.js server. It provides the user interface, proxies requests from users to backend services, and applies some light data post-processing for charts.
  • Backend is a set of Python applications and libraries. It is responsible for connecting to data sources, generating queries for them, and post-processing the data (including formula calculations). The result of this work is an abstract dataset that can be used in the UI for chart data requests.
  • UnitedStorage (US) is a Node.js service that uses PostgreSQL to store the metadata and configuration of all DataLens objects.
  • Auth is a Node.js service that provides the authentication/authorization layer for DataLens.
  • MetaManager is a Node.js service that provides workflow workers for exporting/importing workbooks.

What's already available

We are releasing DataLens with a minimal set of available connectors (ClickHouse, ClickHouse over YTsaurus, and PostgreSQL) as well as other core functionality such as data processing engine, user interface, and minimal auth layer. We are planning to add missing features based on our understanding of community priorities and your feedback.

Cloud Providers

Below is a list of cloud providers offering DataLens as a service:

  1. Yandex Cloud platform

Authentication

DataLens supports native authentication which is enabled by default.

Use the following command to start DataLens with authentication and auto-generated secrets for production:

./init.sh --up

Note: after initialization, the updated .env file contains auth access keys and the admin password. Keep that file safe and do not share its contents.

After that, you can log in to DataLens.
