Sponsored with 💖 by
Enterprise Grade APIs for Feeds & Chat. Try the iOS Chat tutorial 💬
To inspect detailed runtime logs, raise the logger's level:

```swift
// Enable trace-level logging for HaishinKit.
LBLogger.with(HaishinKitIdentifier).level = .trace
```
| Project name | Notes | License |
|---|---|---|
| HaishinKit for Android | Camera and Microphone streaming library via RTMP for Android. | BSD 3-Clause "New" or "Revised" License |
| HaishinKit for Flutter | Camera and Microphone streaming library via RTMP for Flutter. | BSD 3-Clause "New" or "Revised" License |
Supports two camera video sources: a picture-in-picture display that overlays the secondary camera's image on the primary camera, and a split display that arranges the two cameras horizontally or vertically.
*(Screenshots: Picture-In-Picture mode and Split mode.)*
```swift
// When using multi-camera functionality, make sure to call attachMultiCamera first.
// This ordering is required on iOS 14 and 15, among others.
if #available(iOS 13.0, *) {
    let front = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front)
    stream.attachMultiCamera(front)
}
let back = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)
stream.attachCamera(back)
stream.attachAudio(AVCaptureDevice.default(for: .audio))
```
| Features | PiPHKView | MTHKView |
|---|---|---|
| Engine | AVSampleBufferDisplayLayer | Metal |
| Publish | ✔ | ✔ |
| Playback | ✔ | ✔ |
| VisualEffect | ✔ | ✔ |
| MultiCamera | ✔ | ✔ |
| PictureInPicture | ✔ | - |
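PiPHKView is the view to reach for when AVSampleBufferDisplayLayer rendering or picture-in-picture is needed. A minimal sketch, assuming PiPHKView shares MTHKView's attach API (`stream` and `view` come from the surrounding examples):

```swift
import AVFoundation
import HaishinKit
import UIKit

// Render the stream with the AVSampleBufferDisplayLayer-backed view.
let pipView = PiPHKView(frame: view.bounds)
pipView.videoGravity = AVLayerVideoGravity.resizeAspectFill
pipView.attachStream(stream)
view.addSubview(pipView)
```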
Example projects are available for iOS with UIKit, iOS with SwiftUI, macOS, and tvOS. The macOS example requires an Apple Silicon Mac.
```sh
git clone https://github.com/shogo4405/HaishinKit.swift.git
cd HaishinKit.swift
carthage bootstrap --platform iOS,macOS,tvOS --use-xcframeworks
open HaishinKit.xcodeproj
```
| Version | Xcode | Swift |
|---|---|---|
| 1.7.0+ | 15.0+ | 5.9+ |
| 1.6.0+ | 15.0+ | 5.8+ |
| 1.5.0+ | 14.0+ | 5.7+ |
| - | iOS | tvOS | macOS | visionOS | watchOS |
|---|---|---|---|---|---|
| HaishinKit | 12.0+ | 12.0+ | 10.13+ | 1.0+ | - |
| SRTHaishinKit | 12.0+ | - | 13.0+ | - | - |
Make sure your app's Info.plist contains the camera and microphone usage descriptions (NSCameraUsageDescription, NSMicrophoneUsageDescription) on:

- iOS 10.0+
- macOS 10.14+
- tvOS 17.0+
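At runtime, the user must also grant access before capture devices can be attached. A short sketch using AVFoundation's standard authorization API (not specific to HaishinKit):

```swift
import AVFoundation

// Prompt for camera and microphone access before attaching capture devices.
AVCaptureDevice.requestAccess(for: .video) { granted in
    print("Camera access granted: \(granted)")
}
AVCaptureDevice.requestAccess(for: .audio) { granted in
    print("Microphone access granted: \(granted)")
}
```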
HaishinKit uses a multi-module configuration. If you want to use the SRT protocol, use SRTHaishinKit, which is available via Swift Package Manager (SPM) only.
| | HaishinKit | SRTHaishinKit |
|---|---|---|
| SPM | https://github.com/shogo4405/HaishinKit.swift | https://github.com/shogo4405/HaishinKit.swift |
| CocoaPods | See the Podfile example below. | Not supported. |
| Carthage | github "shogo4405/HaishinKit.swift" ~> 1.6.0 | Not supported. |

A Podfile for CocoaPods:

```rb
source 'https://github.com/CocoaPods/Specs.git'
use_frameworks!

def import_pods
  pod 'HaishinKit', '~> 1.6.0'
end

target 'Your Target' do
  platform :ios, '12.0'
  import_pods
end
```
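For SPM, a minimal Package.swift sketch (the package name MyStreamingApp is hypothetical, and the HaishinKit/SRTHaishinKit product names are assumptions):

```swift
// swift-tools-version:5.9
import PackageDescription

let package = Package(
    name: "MyStreamingApp",
    platforms: [.iOS(.v13)],
    dependencies: [
        // Pin to the release line matching your Xcode/Swift version (see the table above).
        .package(url: "https://github.com/shogo4405/HaishinKit.swift.git", from: "1.7.0")
    ],
    targets: [
        .target(
            name: "MyStreamingApp",
            dependencies: [
                .product(name: "HaishinKit", package: "HaishinKit.swift"),
                .product(name: "SRTHaishinKit", package: "HaishinKit.swift")
            ]
        )
    ]
)
```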
Make sure you set up and activate your AVAudioSession on iOS.

```swift
import AVFoundation

let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker, .allowBluetooth])
    try session.setActive(true)
} catch {
    print(error)
}
```
RTMP publish example:

```swift
let connection = RTMPConnection()
let stream = RTMPStream(connection: connection)
stream.attachAudio(AVCaptureDevice.default(for: .audio)) { error in
    // print(error)
}
stream.attachCamera(AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)) { error in
    // print(error)
}

let hkView = MTHKView(frame: view.bounds)
hkView.videoGravity = AVLayerVideoGravity.resizeAspectFill
hkView.attachStream(stream)
// Add the view to ViewController#view.
view.addSubview(hkView)

connection.connect("rtmp://localhost/appName/instanceName")
stream.publish("streamName")
```
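connect(_:) is asynchronous, so in practice publishing usually waits for the connection's status event. A sketch using the 1.x event-listener API (the class and handler names here are illustrative, not from the README):

```swift
import AVFoundation
import HaishinKit
import UIKit

final class LiveViewController: UIViewController {
    private let connection = RTMPConnection()
    private lazy var stream = RTMPStream(connection: connection)

    func startPublishing() {
        // Publish only after the server reports connectSuccess.
        connection.addEventListener(.rtmpStatus, selector: #selector(rtmpStatusHandler), observer: self)
        connection.connect("rtmp://localhost/appName/instanceName")
    }

    @objc
    private func rtmpStatusHandler(_ notification: Notification) {
        let e = Event.from(notification)
        guard let data = e.data as? ASObject, let code = data["code"] as? String else {
            return
        }
        if code == RTMPConnection.Code.connectSuccess.rawValue {
            stream.publish("streamName")
        }
    }
}
```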
RTMP playback example:

```swift
let connection = RTMPConnection()
let stream = RTMPStream(connection: connection)

let hkView = MTHKView(frame: view.bounds)
hkView.videoGravity = AVLayerVideoGravity.resizeAspectFill
hkView.attachStream(stream)
// Add the view to ViewController#view.
view.addSubview(hkView)

connection.connect("rtmp://localhost/appName/instanceName")
stream.play("streamName")
```
RTMP authentication:

```swift
// Pass the credentials in the connection URL.
let connection = RTMPConnection()
connection.connect("rtmp://username:password@localhost/appName/instanceName")
```
SRT publish example:

```swift
let connection = SRTConnection()
let stream = SRTStream(connection: connection)
stream.attachAudio(AVCaptureDevice.default(for: .audio)) { error in
    // print(error)
}
stream.attachCamera(AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)) { error in
    // print(error)
}

let hkView = HKView(frame: view.bounds)
hkView.videoGravity = AVLayerVideoGravity.resizeAspectFill
hkView.attachStream(stream)
// Add the view to ViewController#view.
view.addSubview(hkView)

connection.connect("srt://host:port?option=foo")
stream.publish()
```
SRT playback example:

```swift
let connection = SRTConnection()
let stream = SRTStream(connection: connection)

let hkView = MTHKView(frame: view.bounds)
hkView.videoGravity = AVLayerVideoGravity.resizeAspectFill
hkView.attachStream(stream)
// Add the view to ViewController#view.
view.addSubview(hkView)

connection.connect("srt://host:port?option=foo")
stream.play()
```
Capture settings:

```swift
stream.frameRate = 30
stream.sessionPreset = AVCaptureSession.Preset.medium

// Specifies the video capture settings.
stream.videoCapture(for: 0).isVideoMirrored = false
stream.videoCapture(for: 0).preferredVideoStabilizationMode = .auto
// stream.videoCapture(for: 1).isVideoMirrored = false
```
```swift
// Specifies the audio codec settings.
stream.audioSettings = AudioCodecSettings(
    bitRate: 64 * 1000
)

// Specifies the video codec settings.
stream.videoSettings = VideoCodecSettings(
    videoSize: .init(width: 854, height: 480),
    profileLevel: kVTProfileLevel_H264_Baseline_3_1 as String,
    bitRate: 640 * 1000,
    maxKeyFrameIntervalDuration: 2,
    scalingMode: .trim,
    bitRateMode: .average,
    allowFrameReordering: nil,
    isHardwareEncoderEnabled: true
)
```
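Both settings are plain value types, so individual fields can also be updated while streaming, for example to adapt bitrate to network conditions. A sketch, assuming the videoSettings and audioSettings properties remain mutable at runtime:

```swift
// Lower bitrates mid-stream, e.g. in response to congestion.
stream.videoSettings.bitRate = 320 * 1000
stream.audioSettings.bitRate = 32 * 1000
```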
```swift
// Specifies the recording settings. "0" means the same as the input source.
stream.startRecording([
    AVMediaType.audio: [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: 0,
        AVNumberOfChannelsKey: 0,
        // AVEncoderBitRateKey: 128000,
    ],
    AVMediaType.video: [
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoHeightKey: 0,
        AVVideoWidthKey: 0,
        /*
        AVVideoCompressionPropertiesKey: [
            AVVideoMaxKeyFrameIntervalDurationKey: 2,
            AVVideoProfileLevelKey: AVVideoProfileLevelH264Baseline30,
            AVVideoAverageBitRateKey: 512000
        ]
        */
    ]
])
```
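When the recording is finished, stop it. A hedged sketch, assuming stopRecording() is the counterpart of startRecording(_:) as in recent 1.x releases:

```swift
// Assumption: stopRecording() finalizes the file started by startRecording(_:).
stream.stopRecording()
```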
Multi-camera compositing settings:

```swift
stream.attachAudio(AVCaptureDevice.default(for: .audio))

// Picture-in-picture settings.
stream.multiCamCaptureSettings = MultiCamCaptureSetting(
    mode: .pip,
    cornerRadius: 16.0,
    regionOfInterest: .init(
        origin: CGPoint(x: 16, y: 16),
        size: .init(width: 160, height: 160)
    )
)

// Split settings.
stream.multiCamCaptureSettings = MultiCamCaptureSetting(
    mode: .split(direction: .east),
    cornerRadius: 0.0,
    regionOfInterest: .init(
        origin: .zero,
        size: .zero
    )
)
```
Screen capture example:

```swift
// iOS
let screen = IOUIScreenCaptureUnit(shared: UIApplication.shared)
screen.delegate = stream
screen.startRunning()

// macOS
stream.attachScreen(AVCaptureScreenInput(displayID: CGMainDisplayID()))
```
Looking for sponsors. Sponsoring will enable us to:

If you use any of our libraries for work, see if your employer would be interested in sponsorship; I have some special offers available and would greatly appreciate it. Thank you.
BSD-3-Clause