Latest 0.9.3
Homepage https://github.com/shogo4405/HaishinKit.swift
License New BSD
Platforms ios 8.0, osx 10.11, tvos 10.2
Dependencies Logboard
Authors shogo4405

HaishinKit (formerly lf)

  • Camera and Microphone streaming library via RTMP, HLS for iOS, macOS, tvOS.
  • Please use English or Japanese when filing Issues!

Features

RTMP

  • [x] Authentication
  • [x] Publish and Recording (H264/AAC)
  • [x] Playback (Technical Preview)
  • [x] Adaptive bitrate streaming
    • [x] Handling (see also #126)
    • [x] Automatic drop frames
  • [ ] Action Message Format
    • [x] AMF0
    • [ ] AMF3
  • [x] SharedObject
  • [x] RTMPS (see the connection sketch after this list)
    • [x] Native (RTMP over SSL/TLS)
    • [x] Tunneled (RTMPT over SSL/TLS) (Technical Preview)
  • [x] RTMPT (Technical Preview)
  • [x] ReplayKit Live as a Broadcast Upload Extension (Technical Preview)
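
Native RTMPS uses the same connection API as plain RTMP; only the URL scheme changes. A minimal sketch, assuming a server at server.example.com that accepts RTMPS on its default port:

let rtmpConnection: RTMPConnection = RTMPConnection()
let rtmpStream: RTMPStream = RTMPStream(connection: rtmpConnection)

// the rtmps:// scheme switches the underlying socket to SSL/TLS
rtmpConnection.connect("rtmps://server.example.com/appName/instanceName")
rtmpStream.publish("streamName")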

HLS

  • [x] HTTPService
  • [x] HLS Publish

Rendering

|              | HKView                     | GLHKView  | MTHKView |
| ------------ | -------------------------- | --------- | -------- |
| Engine       | AVCaptureVideoPreviewLayer | OpenGL ES | Metal    |
| Publish      | ○                          | ○         | ○        |
| Playback     | ×                          | ○         | ○        |
| VisualEffect | ×                          | ○         | ○        |
| Condition    | Stable                     | Stable    | Beta     |
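
All three views expose the same attach API, so they can be swapped depending on the rendering engine you want. A minimal sketch, assuming MTHKView supports the same videoGravity/attachStream interface shown for HKView in the RTMP Usage section below:

// MTHKView renders the stream with Metal instead of AVCaptureVideoPreviewLayer
let metalView = MTHKView(frame: view.bounds)
metalView.videoGravity = AVLayerVideoGravity.resizeAspectFill
metalView.attachStream(rtmpStream)
view.addSubview(metalView)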

Others

  • [x] Support tvOS 10.2+ (Technical Preview)
    • tvOS can't publish from the Camera and Microphone; only the playback feature is available.
  • [x] Hardware acceleration for H264 video encoding, AAC audio encoding
  • [x] Support "Allow app extension API only" option
  • [x] Support GPUImage framework (~> 0.5.12)
  • [ ] Objective-C Bridging

Requirements

| Version | iOS  | OSX    | tvOS  | Xcode | Swift | CocoaPods | Carthage |
| ------- | ---- | ------ | ----- | ----- | ----- | --------- | -------- |
| 0.9.0   | 8.0+ | 10.11+ | 10.2+ | 9.3+  | 4.1   | 1.5.0+    | 0.29.0+  |
| 0.8.0   | 8.0+ | 10.11+ | 10.2+ | 9.0+  | 4.0+  | 1.2.0+    | 0.20.0+  |
| 0.7.0   | 8.0+ | 10.11+ | 10.2+ | 8.3+  | 3.1   | 1.2.0+    | 0.20.0+  |

Cocoa Keys

iOS10.0+

  • NSMicrophoneUsageDescription
  • NSCameraUsageDescription
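
These usage descriptions provide the text shown in the system permission prompts. A minimal sketch of requesting camera and microphone access up front with the standard AVFoundation API (this is plain AVFoundation, not part of HaishinKit):

import AVFoundation

// triggers the prompts whose text comes from NSCameraUsageDescription
// and NSMicrophoneUsageDescription in Info.plist
AVCaptureDevice.requestAccess(for: .video) { granted in
    print("camera permission granted: \(granted)")
}
AVCaptureDevice.requestAccess(for: .audio) { granted in
    print("microphone permission granted: \(granted)")
}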

Installation

Please set up your project with Swift 4.1.

CocoaPods

source 'https://github.com/CocoaPods/Specs.git'
use_frameworks!

def import_pods
    pod 'HaishinKit', '~> 0.9.2'
end

target 'Your Target'  do
    platform :ios, '8.0'
    import_pods
end

Carthage

github "shogo4405/HaishinKit.swift" ~> 0.9.2

License

BSD-3-Clause

Donation

Bitcoin

17N3qWCKjwJrWrDuyeHaqWkZYnJqX7igXN

Prerequisites

Make sure you set up and activate your AVAudioSession.

import AVFoundation
let session: AVAudioSession = AVAudioSession.sharedInstance()
do {
    try session.setPreferredSampleRate(44_100)
    try session.setCategory(AVAudioSessionCategoryPlayAndRecord, with: .allowBluetooth)
    try session.setMode(AVAudioSessionModeDefault)
    try session.setActive(true)
} catch {
    // handle the error, e.g. print(error)
}

RTMP Usage

Real Time Messaging Protocol (RTMP).

let rtmpConnection: RTMPConnection = RTMPConnection()
let rtmpStream: RTMPStream = RTMPStream(connection: rtmpConnection)
rtmpStream.attachAudio(AVCaptureDevice.default(for: AVMediaType.audio)) { error in
    // print(error)
}
rtmpStream.attachCamera(DeviceUtil.device(withPosition: .back)) { error in
    // print(error)
}

let hkView = HKView(frame: view.bounds)
hkView.videoGravity = AVLayerVideoGravity.resizeAspectFill
hkView.attachStream(rtmpStream)

// add to the ViewController's view
view.addSubview(hkView)

rtmpConnection.connect("rtmp://localhost/appName/instanceName")
rtmpStream.publish("streamName")
// if you want to record a stream.
// rtmpStream.publish("streamName", type: .localRecord)
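
When the session is finished, close the stream and the connection. A minimal sketch, assuming the objects created above are still in scope:

// stop publishing and tear down the RTMP connection
rtmpStream.close()
rtmpConnection.close()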

Settings

let sampleRate: Double = 44_100

// see: #58
#if os(iOS)
let session: AVAudioSession = AVAudioSession.sharedInstance()
do {
    try session.setPreferredSampleRate(44_100)
    try session.setCategory(AVAudioSessionCategoryPlayAndRecord, with: .allowBluetooth)
    try session.setMode(AVAudioSessionModeDefault)
    try session.setActive(true)
} catch {
    // handle the error, e.g. print(error)
}
#endif

var rtmpStream = RTMPStream(connection: rtmpConnection)

rtmpStream.captureSettings = [
    "fps": 30, // FPS
    "sessionPreset": AVCaptureSession.Preset.medium.rawValue, // input video width/height
    "continuousAutofocus": false, // use camera autofocus mode
    "continuousExposure": false, //  use camera exposure mode
]
rtmpStream.audioSettings = [
    "muted": false, // mute audio
    "bitrate": 32 * 1024,
    "sampleRate": sampleRate, 
]
rtmpStream.videoSettings = [
    "width": 640, // video output width
    "height": 360, // video output height
    "bitrate": 160 * 1024, // video output bitrate
    // "dataRateLimits": [160 * 1024 / 8, 1], optional kVTCompressionPropertyKey_DataRateLimits property
    "profileLevel": kVTProfileLevel_H264_Baseline_3_1, // H264 Profile require "import VideoToolbox"
    "maxKeyFrameIntervalDuration": 2, // key frame / sec
]
// "0" means the same of input
rtmpStream.recorderSettings = [
    AVMediaType.audio: [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: 0,
        AVNumberOfChannelsKey: 0,
        // AVEncoderBitRateKey: 128000,
    ],
    AVMediaType.video: [
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoHeightKey: 0,
        AVVideoWidthKey: 0,
        /*
        AVVideoCompressionPropertiesKey: [
            AVVideoMaxKeyFrameIntervalDurationKey: 2,
            AVVideoProfileLevelKey: AVVideoProfileLevelH264Baseline30,
            AVVideoAverageBitRateKey: 512000
        ]
        */
    ],
]

// set the 2nd argument to false
rtmpStream.attachAudio(AVCaptureDevice.default(for: AVMediaType.audio), automaticallyConfiguresApplicationAudioSession: false)

Authentication

var rtmpConnection: RTMPConnection = RTMPConnection()
rtmpConnection.connect("rtmp://username:[email protected]/appName/instanceName")

Screen Capture

// iOS
rtmpStream.attachScreen(ScreenCaptureSession(shared: UIApplication.shared))
// macOS
rtmpStream.attachScreen(AVCaptureScreenInput(displayID: CGMainDisplayID()))

HTTP Usage

HTTP Live Streaming (HLS). Your iPhone/Mac becomes an IP camera. Basic snippet. You can watch the stream at http://ip.address:8080/hello/playlist.m3u8

var httpStream: HTTPStream = HTTPStream()
httpStream.attachCamera(DeviceUtil.device(withPosition: .back))
httpStream.attachAudio(AVCaptureDevice.default(for: AVMediaType.audio))
httpStream.publish("hello")

var lfView: LFView = LFView(frame: view.bounds)
lfView.attachStream(httpStream)

var httpService: HLSService = HLSService(domain: "", type: "_http._tcp", name: "lf", port: 8080)
httpService.startRunning()
httpService.addHTTPStream(httpStream)

// add to the ViewController's view
view.addSubview(lfView)
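
To stop serving, tear things down in reverse order. A minimal sketch, assuming removeHTTPStream(_:) is the counterpart of the addHTTPStream(_:) call shown above:

// detach the stream and stop the embedded HTTP server
httpService.removeHTTPStream(httpStream)
httpService.stopRunning()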

FAQ

How can I run the example project?

Please run the carthage update command. HaishinKit requires the Logboard module via Carthage.

carthage update

Do you provide support via email?

Yes. The consulting fee is $50 per incident. I don't recommend it; please consider using Issues instead.

Reference

Latest podspec

{
    "name": "HaishinKit",
    "version": "0.9.3",
    "summary": "Camera and Microphone streaming library via RTMP, HLS for iOS, macOS, tvOS.",
    "description": "HaishinKit (formerly lf). Camera and Microphone streaming library via RTMP, HLS for iOS, macOS, tvOS.",
    "homepage": "https://github.com/shogo4405/HaishinKit.swift",
    "license": "New BSD",
    "authors": {
        "shogo4405": "[email protected]"
    },
    "source": {
        "git": "https://github.com/shogo4405/HaishinKit.swift.git",
        "tag": "0.9.3"
    },
    "social_media_url": "http://twitter.com/shogo4405",
    "platforms": {
        "ios": "8.0",
        "osx": "10.11",
        "tvos": "10.2"
    },
    "ios": {
        "source_files": "Platforms/iOS/*.{h,swift}"
    },
    "osx": {
        "source_files": "Platforms/macOS/*.{h,swift}"
    },
    "tvos": {
        "source_files": "Platforms/tvOS/*.{h,swift}"
    },
    "source_files": "Sources/**/*.swift",
    "dependencies": {
        "Logboard": [
            "~> 1.1.6"
        ]
    }
}
