# Dockerfiles

Docker container images for the main Big Data tools (Hadoop, Spark, Kafka, HBase, Cassandra, Zookeeper, Zeppelin, Drill, Flink, Hive, Hue, Mesos, ...).
Maintained by Jorge Figueiredo, this repository contains Dockerfiles to build containers for the most widely used Big Data tools.
| Image Type | Docker Container | Image Size | Latest Version |
| :--------: | :--------------: | :--------: | :------------: |
| Big Data   | Cassandra        |            |                |
| Big Data   | Drill            |            |                |
| Big Data   | Flink            |            |                |
| Big Data   | Hadoop           |            |                |
| Big Data   | HBase            |            |                |
| Big Data   | Hive             |            |                |
| Big Data   | Hue              |            |                |
| Big Data   | Kafka            |            |                |
| Big Data   | Mesos            |            |                |
| Big Data   | Spark            |            |                |
| Big Data   | Zeppelin         |            |                |
| Build      | Maven            |            |                |
| Build      | SBT              |            |                |
| CI         | Jenkins          |            |                |
| CI         | Nexus            |            |                |
| Languages  | Node.js          |            |                |
| Languages  | Scala            |            |                |
| Big Data   | Zookeeper        |            |                |
## How to build these Docker images

Each Docker image can be built by running `make build` from the root directory of that image.

To build all images, run `make build` from the root directory of this repository.
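To make the two build paths concrete, here is a dry-run sketch of how a root-level `make build` might fan out to the per-image directories. The directory names are taken from the example session below, and the loop itself is an assumption, not the repository's actual Makefile logic:

```shell
# Sketch only (an assumption, not the repo's real root Makefile):
# iterate over per-image directories and delegate to each one's Makefile.
# Dry-run: the commands are printed, not executed.
for dir in bigdata/hadoop bigdata/spark build/maven; do
  echo "make -C $dir build"
done
```

Building a single image is then just the inner command, e.g. `make -C bigdata/hadoop build`, which matches running `make build` from that image's directory.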
## How to use these images

Run `make` in the root directory of the image you want to run to see the available options, e.g.:
```
jorgeacf at localhost in ~/dev/dockerfiles/bigdata/hadoop on develop [!]
$ make

This is the make help for Apache Hadoop (2.7.2) docker image

Run 'make build' to build the Hadoop docker image.
Run 'make run-multi slaves=n' to start Hadoop with n slaves.
Run 'make clean' to clean all Hadoop containers.
```
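As an illustration of what `make run-multi slaves=n` sets up, here is a hypothetical dry-run of roughly equivalent `docker run` commands. The `jorgeacf/hadoop` image name and the master/slave container arguments are assumptions based only on the help text above, not taken from the repository:

```shell
# Hypothetical expansion of 'make run-multi slaves=2'. Image name and
# arguments are assumptions. Dry-run: commands are printed, not executed.
slaves=2
echo "docker run -d --name hadoop-master jorgeacf/hadoop master"
for i in $(seq 1 "$slaves"); do
  echo "docker run -d --name hadoop-slave-$i jorgeacf/hadoop slave"
done
```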