
Dockerfiles

Docker container images for the main Big Data tools (Hadoop, Spark, Kafka, HBase, Cassandra, ZooKeeper, Zeppelin, Drill, Flink, Hive, Hue, Mesos, ...)

Install / Use

/learn @jorgeacf/Dockerfiles

README

Jorge Figueiredo - Dockerfiles

License: GPL v3

This repository contains my Dockerfiles for building containers for the most commonly used Big Data tools.

| Image Type | Docker Container | Image Size | Latest Version |
| :--------: | :--------------: | :--------: | :------------: |
| Big Data   | Cassandra        |            |                |
| Big Data   | Drill            |            |                |
| Big Data   | Flink            |            |                |
| Big Data   | Hadoop           |            |                |
| Big Data   | HBase            |            |                |
| Big Data   | Hive             |            |                |
| Big Data   | Hue              |            |                |
| Big Data   | Kafka            |            |                |
| Big Data   | Mesos            |            |                |
| Big Data   | Spark            |            |                |
| Big Data   | Zeppelin         |            |                |
| Big Data   | Zookeeper        |            |                |
| Build      | Maven            |            |                |
| Build      | SBT              |            |                |
| CI         | Jenkins          |            |                |
| CI         | Nexus            |            |                |
| Languages  | Node.js          |            |                |
| Languages  | Scala            |            |                |

How to build these Docker images

Each Docker image can be built by running make build from that image's root directory.

To build all images, run make build from the root directory of this repository.
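As a rough sketch of what a top-level "build everything" pass might look like, the loop below walks each category/image directory that contains a Dockerfile and builds a tagged image from it. The jorgeacf/ image-name prefix, the two-level directory layout, and the DRY_RUN switch are all assumptions for illustration, not taken from the repository's Makefiles.

```shell
# Hypothetical sketch of building every image in the repo.
# DRY_RUN=1 (the default here) only prints the docker commands.
DRY_RUN=${DRY_RUN:-1}

build_image() {
  dir="$1"
  # Assumed naming convention: image name = directory name under a user prefix.
  name="jorgeacf/$(basename "$dir")"
  cmd="docker build -t $name $dir"
  if [ "$DRY_RUN" = "1" ]; then
    echo "$cmd"
  else
    $cmd
  fi
}

# Build every <category>/<image>/ directory that contains a Dockerfile,
# e.g. bigdata/hadoop/ or ci/jenkins/.
for d in */*/; do
  if [ -f "${d}Dockerfile" ]; then
    build_image "${d%/}"
  fi
done
```

In the actual repository, each image directory's own make build target is the authoritative way to build that image; this loop only illustrates the idea of iterating the layout.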

How to use these images

Run make in the root directory of the image you want to run to see the available options, e.g.:

jorgeacf at localhost in ~/dev/dockerfiles/bigdata/hadoop on develop [!]
$ make

This is the make help for Apache Hadoop (2.7.2) docker image

  Run 'make build' to build the Hadoop docker image.
  Run 'make run-multi slaves=n' to start Hadoop with n slaves.
  Run 'make clean' to clean all Hadoop containers.
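To make the run-multi slaves=n target concrete, here is a hedged sketch of what it might expand to: one master container plus n linked slave containers. The image name, container names, and command arguments are assumptions for illustration (the function only prints the docker commands, it does not execute them).

```shell
# Hypothetical expansion of `make run-multi slaves=n`:
# one master container plus n slave containers.
# All names (jorgeacf/hadoop, hadoop-master, hadoop-slave-N) are assumed.
run_multi() {
  slaves="$1"
  echo "docker run -d --name hadoop-master jorgeacf/hadoop master"
  i=1
  while [ "$i" -le "$slaves" ]; do
    echo "docker run -d --name hadoop-slave-$i --link hadoop-master jorgeacf/hadoop slave"
    i=$((i + 1))
  done
}

run_multi 3   # prints one master command and three slave commands
```

A matching make clean would then stop and remove every container whose name starts with hadoop-.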


View on GitHub
GitHub Stars: 36
Forks: 12
Category: Development
Updated: 10mo ago

Languages

Shell

Security Score

87/100

Audited on May 22, 2025

No findings