StreamPack
Multiprotocol (SRT, RTMP and others) live streaming broadcaster libraries for Android
StreamPack is a flexible live streaming library for Android made for both demanding video broadcasters and new video enthusiasts.
Hop On Board! 🚀
⭐ If you like this project, don’t forget to star it!
💖 Want to support its development? Consider becoming a sponsor.
🛠️ Contributions are welcome—feel free to open issues or submit pull requests!
Setup
Get the latest StreamPack artifacts on Maven Central:

```groovy
dependencies {
    implementation 'io.github.thibaultbee.streampack:streampack-core:3.1.2'
    // For UI (incl. PreviewView)
    implementation 'io.github.thibaultbee.streampack:streampack-ui:3.1.2'
    // For services (incl. screen capture/media projection service)
    implementation 'io.github.thibaultbee.streampack:streampack-services:3.1.2'
    // For RTMP
    implementation 'io.github.thibaultbee.streampack:streampack-rtmp:3.1.2'
    // For SRT
    implementation 'io.github.thibaultbee.streampack:streampack-srt:3.1.2'
}
```
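StreamPack is distributed via Maven Central. If your project does not already declare that repository, here is a minimal sketch for a `settings.gradle` using centralized repository declarations (an assumption about your project layout — adjust if you declare repositories in the root `build.gradle` instead):

```groovy
// settings.gradle — a sketch; your project may declare repositories elsewhere
dependencyResolutionManagement {
    repositories {
        google()        // Android framework artifacts
        mavenCentral()  // StreamPack artifacts
    }
}
```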
Features
- Video:
  - Source: cameras, screen recorder or a custom video source
  - Orientation: portrait or landscape
  - Codec: HEVC/H.265, AVC/H.264, VP9 or AV1
  - HDR (experimental, see https://github.com/ThibaultBee/StreamPack/discussions/91)
  - Configurable bitrate, resolution, frame rate (tested up to 60), encoder level and encoder profile
  - Video-only mode
  - Device video capabilities
  - Switching between video sources
  - Camera settings: auto-focus, exposure, white balance, zoom, flash,...
- Audio:
  - Source: microphone, device audio or a custom audio source
  - Codec: AAC (LC, HE, HEv2,...) or Opus
  - Configurable bitrate, sample rate, stereo/mono and data format
  - Processing: noise suppression or echo cancellation
  - Audio-only mode
  - Device audio capabilities
  - Switching between audio sources
- File: TS, FLV, MP4, WebM, fragmented MP4 or a custom output
  - Write to a single file or multiple chunked files
- Streaming: RTMP/RTMPS or SRT
  - Record to a file and stream at the same time
  - Support for enhanced RTMP
  - Ultra low latency based on SRT
  - Network adaptive bitrate mechanism for SRT
Quick start
If you want to create a new application, start from the StreamPack boilerplate template. In 5 minutes, you will be able to stream live video to your server.
Getting started
Getting started for a camera stream
- Request the required permissions in your Activity/Fragment. See the Permissions section for more information.
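  For reference, a minimal sketch of requesting the camera and microphone runtime permissions with the AndroidX Activity Result API (the permission names are standard Android ones; `LiveActivity` and the callback wiring are hypothetical — adapt them to your app):

  ```kotlin
  // A minimal sketch, assuming an AndroidX AppCompatActivity.
  import android.Manifest
  import androidx.activity.result.contract.ActivityResultContracts
  import androidx.appcompat.app.AppCompatActivity

  class LiveActivity : AppCompatActivity() {
      // Must be registered before the Activity is STARTED, e.g. as a property initializer
      private val requestPermissions =
          registerForActivityResult(ActivityResultContracts.RequestMultiplePermissions()) { results ->
              if (results.values.all { it }) {
                  // All permissions granted: safe to open the camera and microphone
              }
          }

      override fun onStart() {
          super.onStart()
          requestPermissions.launch(
              arrayOf(Manifest.permission.CAMERA, Manifest.permission.RECORD_AUDIO)
          )
      }
  }
  ```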
- Create a `View` to display the preview in your layout.

  As a camera preview, you can use a `SurfaceView`, a `TextureView` or any `View` that can provide a `Surface`.

  To simplify integration, StreamPack provides a `PreviewView` in the `streampack-ui` package.

  ```xml
  <layout>
      <io.github.thibaultbee.streampack.views.PreviewView
          android:id="@+id/preview"
          android:layout_width="match_parent"
          android:layout_height="match_parent"
          app:enableZoomOnPinch="true" />
  </layout>
  ```

  `app:enableZoomOnPinch` is a boolean that enables zoom on a pinch gesture.
- Instantiate the streamer (the main live streaming class).

  A `Streamer` is a class that represents a whole streaming pipeline from capture to endpoint (incl. encoding, muxing, sending). Multiple streamers are available depending on the number of independent outputs you want:

  - `SingleStreamer`: for a single output (such as live or record)
  - `DualStreamer`: for 2 independent outputs (such as an independent live and record)
  - For more outputs, you can use the `StreamerPipeline` class, which lets you create more complex pipelines with multiple independent outputs (such as audio in one file and video in another)

  The `SingleStreamer` and the `DualStreamer` come with factories for `Camera` and `MediaProjection` (for screen capture). Otherwise, you can set the audio and video sources manually.

  ```kotlin
  /**
   * Most StreamPack components are coroutine based.
   * Suspend functions and flows have to be called from a coroutine scope.
   * Android comes with coroutine scopes like `lifecycleScope` or `viewModelScope`.
   * Call suspend functions from a coroutine scope:
   * viewModelScope.launch {
   * }
   */
  val streamer = cameraSingleStreamer(context = requireContext())

  /**
   * To have multiple independent outputs (like for live and record), use a `cameraDualStreamer` or even the `StreamerPipeline`.
   *
   * You can also create the `SingleStreamer` or the `DualStreamer` and add the audio and video sources later with `setAudioSource`
   * and `setVideoSource`:
   * val streamer = SingleStreamer(context = requireContext())
   * streamer.setVideoSource(CameraSourceFactory()) // Same as streamer.setCameraId(context.defaultCameraId)
   * streamer.setAudioSource(MicrophoneSourceFactory())
   */
  ```

  For more information, check the Streamers documentation.
- Configure the audio and video settings.

  ```kotlin
  // Already instantiated streamer
  val streamer = cameraSingleStreamer(context = requireContext())

  // Creates a new audio and video config
  val audioConfig = AudioConfig(
      startBitrate = 128000,
      sampleRate = 44100,
      channelConfig = AudioFormat.CHANNEL_IN_STEREO
  )

  val videoConfig = VideoConfig(
      startBitrate = 2000000, // 2 Mb/s
      resolution = Size(1280, 720),
      fps = 30
  )

  // Sets the audio and video config
  viewModelScope.launch {
      streamer.setAudioConfig(audioConfig)
      streamer.setVideoConfig(videoConfig)
  }
  ```
- Inflate the preview with the streamer.

  ```kotlin
  // Already instantiated streamer
  val streamer = cameraSingleStreamer(context = requireContext())
  // Already inflated preview
  val preview = findViewById<PreviewView>(R.id.preview)

  /**
   * If the preview is a `PreviewView`
   */
  preview.setVideoSourceProvider(streamer) // The preview automatically starts

  /**
   * Otherwise, if the preview is a [SurfaceView], a [TextureView], a [Surface],... you can use:
   */
  streamer.startPreview(preview)
  ```
- Set the device orientation.

  ```kotlin
  // Already instantiated streamer
  val streamer = cameraSingleStreamer(context = requireContext())

  // Sets the device orientation
  streamer.setTargetRotation(Surface.ROTATION_90) // Or Surface.ROTATION_0, Surface.ROTATION_180, Surface.ROTATION_270
  ```

  StreamPack comes with 2 `RotationProvider`s that fetch and listen to the device rotation:

  - The `SensorRotationProvider` is backed by the `OrientationEventListener` and follows the device orientation.
  - The `DisplayRotationProvider` is backed by the `DisplayManager`; if the orientation is locked, it returns the last known orientation.

  ```kotlin
  // Already instantiated streamer
  val streamer = cameraSingleStreamer(context = requireContext())
  val rotationProvider = SensorRotationProvider(context = requireContext())

  // Sets the device orientation
  val listener = object : IRotationProvider.Listener {
      override fun onOrientationChanged(rotation: Int) {
          streamer.setTargetRotation(rotation)
      }
  }
  rotationProvider.addListener(listener)

  // Don't forget to remove the listener when you don't need it anymore
  rotationProvider.removeListener(listener)
  ```

  You can transform the `RotationProvider` into a `Flow` provider through `asFlowProvider`:

  ```kotlin
  // Already instantiated streamer
  val streamer = cameraSingleStreamer(context = requireContext())
  val rotationProvider = SensorRotationProvider(context = requireContext())

  // For coroutine based usage
  val rotationFlowProvider = rotationProvider.asFlowProvider()

  // Then, in a coroutine suspend function
  rotationFlowProvider.rotationFlow.collect { rotation ->
      streamer.setTargetRotation(rotation)
  }
  ```

  You can also create your own `targetRotation` provider.
- Start the live streaming.

  ```kotlin
  // Already instantiated streamer
  val streamer = cameraSingleStreamer(context = requireContext())

  // For RTMP/RTMPS. The Uri also supports SRT urls, file paths, content paths,...
  val descriptor = UriMediaDescriptor("rtmps://serverip:1935/s/streamKey")

  /**
   * Alternatively, you can use the object syntax:
   * - RtmpMediaDescriptor("rtmps", "serverip", 1935, "s", "streamKey") // For RTMP/RTMPS
   * - SrtMediaDescriptor("serverip", 1234) // For SRT
   */
  streamer.startStream(descriptor)

  // You can also use:
  // streamer.startStream("rtmp://serverip:1935/s/streamKey") // For RTMP/RTMPS
  ```
- Stop and release the streamer.

  ```kotlin
  // Already instantiated streamer
  val streamer = cameraSingleStreamer(context = requireContext())

  streamer.stopStream()
  // Disconnect from the server or close the file
  streamer.close()
  streamer.release()
  ```
For more detailed explanation, check out the documentation.
For a complete example, check out the demos/camera directory.
Getting started for a screen recorder stream
- Add the `streampack-services` dependency to your `build.gradle` file:

  ```groovy
  dependencies {
      implementation 'io.github.thibaultbee.streampack:streampack-services:3.1.2'
  }
  ```
- Request the required permissions in your Activity/Fragment. See the Permissions section for more information.
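  Screen capture with a foreground service typically also needs manifest entries. A sketch of the standard Android permissions involved (an assumption — the exact set depends on your target API level and the `streampack-services` setup):

  ```xml
  <!-- AndroidManifest.xml — a sketch; adjust to your target API level -->
  <uses-permission android:name="android.permission.RECORD_AUDIO" />
  <uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
  <!-- Required on Android 14+ (API 34) for media projection foreground services -->
  <uses-permission android:name="android.permission.FOREGROUND_SERVICE_MEDIA_PROJECTION" />
  ```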