
StreamPack: RTMP and SRT live streaming SDK for Android

StreamPack is a flexible live streaming library for Android made for both demanding video broadcasters and new video enthusiasts.

Hop On Board! 🚀

⭐ If you like this project, don’t forget to star it!

💖 Want to support its development? Consider becoming a sponsor.

🛠️ Contributions are welcome—feel free to open issues or submit pull requests!

Setup

Get the latest StreamPack artifacts from Maven Central:

dependencies {
    implementation 'io.github.thibaultbee.streampack:streampack-core:3.1.2'
    // For UI (incl. PreviewView)
    implementation 'io.github.thibaultbee.streampack:streampack-ui:3.1.2'
    // For services (incl. screen capture/media projection service)
    implementation 'io.github.thibaultbee.streampack:streampack-services:3.1.2'
    // For RTMP
    implementation 'io.github.thibaultbee.streampack:streampack-rtmp:3.1.2'
    // For SRT
    implementation 'io.github.thibaultbee.streampack:streampack-srt:3.1.2'
}

Features

  • Video:
    • Source: Cameras, Screen recorder or custom video source
    • Orientation: portrait or landscape
    • Codec: HEVC/H.265, AVC/H.264, VP9 or AV1
    • HDR (experimental, see https://github.com/ThibaultBee/StreamPack/discussions/91)
    • Configurable bitrate, resolution, frame rate (tested up to 60), encoder level, encoder profile
    • Video only mode
    • Device video capabilities
    • Switch between video sources
    • Camera settings: auto-focus, exposure, white balance, zoom, flash,...
  • Audio:
    • Source: Microphone, device audio or custom audio source
    • Codec: AAC (LC, HE, HEv2,...) or Opus
    • Configurable bitrate, sample rate, stereo/mono, data format
    • Processing: Noise suppressor or echo cancellation
    • Audio only mode
    • Device audio capabilities
    • Switch between audio sources
  • File: TS, FLV, MP4, WebM, Fragmented MP4 or custom output.
    • Write to a single file or multiple chunk files
  • Streaming: RTMP/RTMPS or SRT

Quick start

If you want to create a new application, start from the StreamPack boilerplate template. In five minutes, you will be able to stream live video to your server.

Getting started

Getting started for a camera stream

  1. Request the required permissions in your Activity/Fragment. See the Permissions section for more information.
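    As an illustrative sketch (not taken from the StreamPack docs, and the exact permission set depends on which features you use), a camera live stream typically declares the standard Android permissions below in AndroidManifest.xml. CAMERA and RECORD_AUDIO are runtime (dangerous) permissions and must also be requested from the user in your Activity/Fragment:

    ```xml
    <!-- Typical AndroidManifest.xml entries for a camera live stream. -->
    <!-- INTERNET is needed to reach the RTMP/SRT server; CAMERA and -->
    <!-- RECORD_AUDIO must additionally be granted at runtime. -->
    <uses-permission android:name="android.permission.INTERNET" />
    <uses-permission android:name="android.permission.CAMERA" />
    <uses-permission android:name="android.permission.RECORD_AUDIO" />
    ```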

  2. Create a View to display the preview in your layout

    As the camera preview, you can use a SurfaceView, a TextureView or any View that can provide a Surface.

    To simplify integration, StreamPack provides a PreviewView in the streampack-ui package.

    
    <layout>
        <io.github.thibaultbee.streampack.views.PreviewView android:id="@+id/preview"
            android:layout_width="match_parent" android:layout_height="match_parent"
            app:enableZoomOnPinch="true" />
    </layout>
    

    app:enableZoomOnPinch is a boolean that enables pinch-to-zoom on the preview.

  3. Instantiate the streamer (the main live streaming class)

    A Streamer represents a whole streaming pipeline from capture to endpoint (including encoding, muxing and sending). Multiple streamers are available depending on the number of independent outputs you want:

    • SingleStreamer: for a single output (such as live or record)
    • DualStreamer: for 2 independent outputs (such as independent live and record)
    • StreamerPipeline: for more than two outputs, or for more complex pipelines with multiple independent outputs (such as audio in one file and video in another)

    The SingleStreamer and the DualStreamer come with factories for the camera and MediaProjection (screen capture) sources. Alternatively, you can set the audio and video sources manually.

    /**
     * Most StreamPack components are coroutine based.
     * Suspend and flow have to be called from a coroutine scope.
     * Android comes with coroutine scopes like `lifecycleScope` or `viewModelScope`.
     * Call suspend functions from a coroutine scope:
     *  viewModelScope.launch {
     *  }
     */
    
    val streamer = cameraSingleStreamer(context = requireContext())
    
    /**
     * To have multiple independent outputs (like live and record), use a `cameraDualStreamer` or even the `StreamerPipeline`.
     *
     * You can also create the `SingleStreamer` or the `DualStreamer` and add the audio and video sources later with `setAudioSource`
     * and `setVideoSource`:
     * val streamer = SingleStreamer(context = requireContext())
     * streamer.setVideoSource(CameraSourceFactory()) // Same as streamer.setCameraId(context.defaultCameraId)
     * streamer.setAudioSource(MicrophoneSourceFactory())
     */
    
    

    For more information, check the Streamers documentation.

  4. Configure the audio and video settings

    val streamer = cameraSingleStreamer(context = requireContext()) // Already instantiated streamer
    
    // Creates a new audio and video config
    val audioConfig = AudioConfig(
        startBitrate = 128000,
        sampleRate = 44100,
        channelConfig = AudioFormat.CHANNEL_IN_STEREO
    )
    
    val videoConfig = VideoConfig(
        startBitrate = 2000000, // 2 Mb/s
        resolution = Size(1280, 720),
        fps = 30
    )
    
    // Sets the audio and video config
    viewModelScope.launch {
        streamer.setAudioConfig(audioConfig)
        streamer.setVideoConfig(videoConfig)
    }
    
  5. Connect the preview to the streamer

    val streamer = cameraSingleStreamer(context = requireContext()) // Already instantiated streamer
    val preview = findViewById<PreviewView>(R.id.preview) // Already inflated preview
    /**
     * If the preview is a `PreviewView`
     */
    preview.setVideoSourceProvider(streamer)
    // The preview automatically starts
    
    
    /**
     * Otherwise, if the preview is a [SurfaceView], a [TextureView], a [Surface],... you can use:
     */
    streamer.startPreview(preview)
    
  6. Set the device orientation

    // Already instantiated streamer
    val streamer = cameraSingleStreamer(context = requireContext())
    
    // Sets the device orientation
    streamer.setTargetRotation(Surface.ROTATION_90) // Or Surface.ROTATION_0, Surface.ROTATION_180, Surface.ROTATION_270
    

    StreamPack comes with two RotationProviders that fetch and listen to the device rotation:

    • the SensorRotationProvider, backed by the OrientationEventListener; it follows the physical device orientation.
    • the DisplayRotationProvider, backed by the DisplayManager; if the orientation is locked, it returns the last known orientation.
    val streamer = cameraSingleStreamer(context = requireContext()) // Already instantiated streamer
    val rotationProvider = SensorRotationProvider(context = requireContext())
    
    // Sets the device orientation
    val listener = object : IRotationProvider.Listener {
        override fun onOrientationChanged(rotation: Int) {
            streamer.setTargetRotation(rotation)
        }
    }
    rotationProvider.addListener(listener)

    // Don't forget to remove the listener when you don't need it anymore
    rotationProvider.removeListener(listener)
    

    You can transform a RotationProvider into a Flow-based provider via asFlowProvider().

     val streamer = cameraSingleStreamer(context = requireContext()) // Already instantiated streamer
     val rotationProvider = SensorRotationProvider(context = requireContext())
    
     // For coroutine based
     val rotationFlowProvider = rotationProvider.asFlowProvider()
     // Then in a coroutine suspend function
     rotationFlowProvider.rotationFlow.collect { rotation ->
         streamer.setTargetRotation(rotation)
     }
    

    You can also create your own targetRotation provider.
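    As a hedged sketch of what such a custom provider could look like: the snippet below mirrors only the addListener/removeListener/Listener surface shown above, with a locally defined listener interface so it is self-contained. The real io.github.thibaultbee.streampack IRotationProvider interface may expose more members, so treat the names here as illustrative assumptions.

    ```kotlin
    // Hypothetical listener interface, modeled after IRotationProvider.Listener.
    interface RotationListener {
        fun onOrientationChanged(rotation: Int)
    }

    // A minimal provider that always reports one fixed rotation, e.g. for a
    // kiosk device that never rotates.
    class FixedRotationProvider(private val rotation: Int) {
        private val listeners = mutableSetOf<RotationListener>()

        fun addListener(listener: RotationListener) {
            listeners += listener
            // Notify immediately with the current rotation, as a real provider would.
            listener.onOrientationChanged(rotation)
        }

        fun removeListener(listener: RotationListener) {
            listeners -= listener
        }
    }
    ```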

  7. Start the live stream

    // Already instantiated streamer
    val streamer = cameraSingleStreamer(context = requireContext())
    
    val descriptor =
        UriMediaDescriptor("rtmps://serverip:1935/s/streamKey") // For RTMP/RTMPS. Uri also supports SRT url, file path, content path,...
    /**
     * Alternatively, you can use object syntax:
     * - RtmpMediaDescriptor("rtmps", "serverip", 1935, "s", "streamKey") // For RTMP/RTMPS
     * - SrtMediaDescriptor("serverip", 1234) // For SRT
     */
    
    streamer.startStream(descriptor) 
    // You can also use:
    // streamer.startStream("rtmp://serverip:1935/s/streamKey") // For RTMP/RTMPS
    
  8. Stop and release the streamer

    // Already instantiated streamer
    val streamer = cameraSingleStreamer(context = requireContext())
    
    streamer.stopStream()
    streamer.close() // Disconnect from server or close the file
    streamer.release()
    

For a more detailed explanation, check out the documentation.

For a complete example, check out the demos/camera directory.
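As a recap, the steps above can be condensed into one hedged sketch. It is assembled only from the calls shown in the snippets above and assumes a Fragment whose layout contains the PreviewView; exact imports and suspend signatures may differ in your StreamPack version, so use it as a sketch rather than a drop-in implementation:

```kotlin
// Condensed sketch of steps 3-8, to run after permissions are granted.
viewLifecycleOwner.lifecycleScope.launch {
    // Step 3: instantiate the streamer
    val streamer = cameraSingleStreamer(context = requireContext())

    // Step 4: configure audio and video
    streamer.setAudioConfig(
        AudioConfig(startBitrate = 128000, sampleRate = 44100,
            channelConfig = AudioFormat.CHANNEL_IN_STEREO)
    )
    streamer.setVideoConfig(
        VideoConfig(startBitrate = 2000000, resolution = Size(1280, 720), fps = 30)
    )

    // Step 5: connect the PreviewView (from streampack-ui); the preview starts automatically
    requireView().findViewById<PreviewView>(R.id.preview).setVideoSourceProvider(streamer)

    // Step 6: set the device orientation (or wire up a RotationProvider as shown above)
    streamer.setTargetRotation(Surface.ROTATION_0)

    // Step 7: start streaming
    streamer.startStream(UriMediaDescriptor("rtmps://serverip:1935/s/streamKey"))

    // Step 8: later, stop and clean up
    streamer.stopStream()
    streamer.close()
    streamer.release()
}
```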

Getting started for a screen recorder stream

  1. Add the streampack-services dependency in your build.gradle file:

    dependencies {
        implementation 'io.github.thibaultbee.streampack:streampack-services:3.1.2'
    }
    
  2. Request the required permissions in your Activity/Fragment. See the [Permis
