Sponsored with 💖 by
Enterprise Grade APIs for Feeds & Chat. Try the iOS Chat tutorial 💬
To enable trace logging (useful when reporting issues):
LBLogger.with(HaishinKitIdentifier).level = .trace
| Project name | Notes | License |
|---|---|---|
| HaishinKit for Android | Camera and Microphone streaming library via RTMP for Android. | BSD 3-Clause "New" or "Revised" License |
| HaishinKit for Flutter | Camera and Microphone streaming library via RTMP for Flutter. | BSD 3-Clause "New" or "Revised" License |
Supports two camera video sources: a picture-in-picture display that overlays the secondary camera's image on top of the primary camera, and a split display that arranges the two cameras horizontally or vertically.
(Screenshots: Picture-In-Picture display and Split display.)
// If you're using multi-camera functionality, please make sure to call the attachMultiCamera method first. This is required on iOS 14 and 15, among others.
if #available(iOS 13.0, *) {
    let front = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front)
    stream.attachMultiCamera(front)
}
let back = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)
stream.attachCamera(back)
stream.attachAudio(AVCaptureDevice.default(for: .audio))
| Features | HKView | PiPHKView | MTHKView |
|---|---|---|---|
| Engine | AVCaptureVideoPreviewLayer | AVSampleBufferDisplayLayer | Metal |
| Publish | ✔ | ✔ | ✔ |
| Playback | | ✔ | ✔ |
| VisualEffect | | ✔ | ✔ |
| PictureInPicture | | ✔ | |
| MultiCamera | | ✔ | ✔ |
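For example, to render with PiPHKView (the only view above that supports Picture-in-Picture), the attach flow mirrors the MTHKView snippet later in this README. A minimal sketch, assuming PiPHKView exposes the same attachStream and videoGravity API as the other views and that a stream already exists:

import AVFoundation
import HaishinKit
import UIKit

// PiPHKView renders through AVSampleBufferDisplayLayer.
let pipView = PiPHKView(frame: view.bounds)
pipView.videoGravity = AVLayerVideoGravity.resizeAspectFill
pipView.attachStream(stream)
view.addSubview(pipView)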
Example projects are available for iOS with UIKit, iOS with SwiftUI, macOS, and tvOS.
git clone https://github.com/shogo4405/HaishinKit.swift.git
cd HaishinKit.swift
carthage bootstrap --platform iOS,macOS,tvOS --use-xcframeworks
open HaishinKit.xcodeproj
| Version | Xcode | Swift |
|---|---|---|
| 1.6.0+ | 15.0+ | 5.8+ |
| 1.5.0+ | 14.0+ | 5.7+ |
| - | iOS | tvOS | macOS | visionOS | watchOS |
|---|---|---|---|---|---|
| HaishinKit | 12.0+ | 12.0+ | 10.13+ | - | - |
| SRTHaishinKit | 12.0+ | - | - | - | - |
Please make sure your Info.plist contains the usage descriptions required for camera and microphone access:

iOS 10.0+
* NSMicrophoneUsageDescription
* NSCameraUsageDescription

macOS 10.14+
* NSMicrophoneUsageDescription
* NSCameraUsageDescription
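For reference, a minimal sketch of the corresponding Info.plist entries; the description strings are placeholders to replace with your own wording:

<key>NSCameraUsageDescription</key>
<string>The camera is used for live streaming.</string>
<key>NSMicrophoneUsageDescription</key>
<string>The microphone is used for live streaming.</string>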
HaishinKit has a multi-module configuration. If you want to use the SRT protocol, use SRTHaishinKit, which is available via SPM only (see the Package.swift sketch after the table below).
|  | HaishinKit | SRTHaishinKit |
|---|---|---|
| SPM | https://github.com/shogo4405/HaishinKit.swift | https://github.com/shogo4405/HaishinKit.swift |
| CocoaPods | See the Podfile below. | Not supported. |
| Carthage | github "shogo4405/HaishinKit.swift" ~> 1.6.0 | Not supported. |

A Podfile for CocoaPods:

source 'https://github.com/CocoaPods/Specs.git'
use_frameworks!

def import_pods
  pod 'HaishinKit', '~> 1.6.0'
end

target 'Your Target' do
  platform :ios, '12.0'
  import_pods
end
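For SPM, a minimal Package.swift sketch; the product names (HaishinKit, SRTHaishinKit) and the package identity (HaishinKit.swift) are assumptions based on the repository layout, and in Xcode you can instead add the repository URL via File > Add Package Dependencies:

// swift-tools-version:5.8
import PackageDescription

let package = Package(
    name: "YourApp",
    platforms: [.iOS(.v12)],
    dependencies: [
        // Both modules live in the same repository.
        .package(url: "https://github.com/shogo4405/HaishinKit.swift.git", from: "1.6.0")
    ],
    targets: [
        .executableTarget(
            name: "YourApp",
            dependencies: [
                .product(name: "HaishinKit", package: "HaishinKit.swift"),
                // SRTHaishinKit is SPM-only; include it for SRT support.
                .product(name: "SRTHaishinKit", package: "HaishinKit.swift")
            ]
        )
    ]
)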
Make sure you set up and activate your AVAudioSession on iOS.
import AVFoundation
let session = AVAudioSession.sharedInstance()
do {
try session.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker, .allowBluetooth])
try session.setActive(true)
} catch {
print(error)
}
Real Time Messaging Protocol (RTMP).
let connection = RTMPConnection()
let stream = RTMPStream(connection: connection)
stream.attachAudio(AVCaptureDevice.default(for: .audio)) { error in
// print(error)
}
stream.attachCamera(AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)) { error in
// print(error)
}
let hkView = MTHKView(frame: view.bounds)
hkView.videoGravity = AVLayerVideoGravity.resizeAspectFill
hkView.attachStream(stream)
// add ViewController#view
view.addSubview(hkView)
connection.connect("rtmp://localhost/appName/instanceName")
stream.publish("streamName")
connection.connect("rtmp://server-ip-address[:port]/application/[appInstance]")
stream.publish("[prefix:[path1[/path2/]]]streamName")
For example:
connection.connect("rtmp://localhost/live")
stream.publish("streamName")
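Playback works over the same connection flow (see the Playback column in the view table above). A minimal sketch, assuming a stream was published under the same name elsewhere:

connection.connect("rtmp://localhost/appName/instanceName")
stream.play("streamName")

Attach the stream to a view (for example MTHKView, as shown above) to display the received video.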
var stream = RTMPStream(connection: connection)
stream.frameRate = 30
stream.sessionPreset = AVCaptureSession.Preset.medium
// Specifies the video capture settings.
stream.videoCapture(for: 0).isVideoMirrored = false
stream.videoCapture(for: 0).preferredVideoStabilizationMode = .auto
// stream.videoCapture(for: 1).isVideoMirrored = false
// Specifies the audio codec settings.
stream.audioSettings = AudioCodecSettings(
bitRate: 64 * 1000
)
// Specifies the video codec settings.
stream.videoSettings = VideoCodecSettings(
videoSize: .init(width: 854, height: 480),
profileLevel: kVTProfileLevel_H264_Baseline_3_1 as String,
bitRate: 640 * 1000,
maxKeyFrameIntervalDuration: 2,
scalingMode: .trim,
bitRateMode: .average,
allowFrameReordering: nil,
isHardwareEncoderEnabled: true
)
// Specifies the recording settings. "0" means the same as the input.
stream.startRecording([
AVMediaType.audio: [
AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
AVSampleRateKey: 0,
AVNumberOfChannelsKey: 0,
// AVEncoderBitRateKey: 128000,
],
AVMediaType.video: [
AVVideoCodecKey: AVVideoCodecH264,
AVVideoHeightKey: 0,
AVVideoWidthKey: 0,
/*
AVVideoCompressionPropertiesKey: [
AVVideoMaxKeyFrameIntervalDurationKey: 2,
AVVideoProfileLevelKey: AVVideoProfileLevelH264Baseline30,
AVVideoAverageBitRateKey: 512000
]
*/
]
])
// Set the 2nd argument to false so that HaishinKit does not configure the AVAudioSession automatically.
stream.attachAudio(AVCaptureDevice.default(for: .audio), automaticallyConfiguresApplicationAudioSession: false)
// Picture-in-Picture settings.
stream.multiCamCaptureSettings = MultiCamCaptureSetting(
mode: .pip,
cornerRadius: 16.0,
regionOfInterest: .init(
origin: CGPoint(x: 16, y: 16),
size: .init(width: 160, height: 160)
)
)
// Split settings.
stream.multiCamCaptureSettings = MultiCamCaptureSetting(
mode: .split(direction: .east),
cornerRadius: 0.0,
regionOfInterest: .init(
origin: .zero,
size: .zero
)
)
var connection = RTMPConnection()
connection.connect("rtmp://username:password@localhost/appName/instanceName")
// iOS
let screen = IOUIScreenCaptureUnit(shared: UIApplication.shared)
screen.delegate = stream
screen.startRunning()
// macOS
stream.attachScreen(AVCaptureScreenInput(displayID: CGMainDisplayID()))
HTTP Live Streaming (HLS). Your iPhone or Mac becomes an IP camera. A basic snippet; the stream can be watched at http://ip.address:8080/hello/playlist.m3u8
var stream = HTTPStream()
stream.attachCamera(AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back))
stream.attachAudio(AVCaptureDevice.default(for: .audio))
stream.publish("hello")
var hkView = MTHKView(frame: view.bounds)
hkView.attachStream(stream)
var httpService = HLSService(domain: "", type: "_http._tcp", name: "HaishinKit", port: 8080)
httpService.addHTTPStream(stream)
httpService.startRunning()
// add ViewController#view
view.addSubview(hkView)
Looking for sponsors. Sponsoring will enable us to continue developing and maintaining the project.
If you use any of our libraries for work, see if your employer would be interested in sponsorship. I have some special offers. I would greatly appreciate it. Thank you.
BSD-3-Clause
This is a release for Xcode 15. The minimum supported versions for iOS and tvOS will be 12.0. There are some changes to certain methods, but the core functionality remains the same as in version 1.5.8.
Full Changelog: https://github.com/shogo4405/HaishinKit.swift/compare/1.5.8...1.6.0