Latest 0.4.0
License MIT
Platforms ios 9.0
Dependencies AFNetworking, SynqHttpLib

This is the SYNQ mobile SDK for iOS. It lets you easily integrate your mobile app with the SYNQ platform and the SYNQ Video API.


Example

To run the example project, clone the repo and run pod install from the Example directory first. The example app exercises the main features of the SDK and shows how its parts are meant to be used.


Installation

SynqObjC is available through CocoaPods. If you do not have CocoaPods installed, install it with the following command:

$ gem install cocoapods

To integrate SynqObjC into your Xcode project, specify it in your Podfile:

pod "SynqObjC"

Then run the following command to install:

$ pod install

If you get an error saying [!] Unable to find a specification for <name of pod>, run pod repo update to refresh your local spec repositories, then run pod install again.

The SDK comprises two parts: SynqUploader, for uploading videos to SYNQ, and SynqStreamer, for streaming live video.


SynqUploader

This part consists of classes for fetching videos from the Photos library, exporting them, and uploading them to SYNQ. The SDK uses AFNetworking 3 for communicating with the server and a background-configured NSURLSession to manage video uploads, so uploads continue regardless of whether the app is in the foreground or background.
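The background-session mechanism the SDK relies on can be illustrated with a minimal sketch; the session identifier below is hypothetical, and the SDK manages its own session internally:

```objc
#import <Foundation/Foundation.h>

// Illustration only: a background-configured NSURLSession like the one the
// SDK uses internally. Upload tasks handed to such a session are run by the
// system and keep going even after the app moves to the background.
NSURLSessionConfiguration *config = [NSURLSessionConfiguration
    backgroundSessionConfigurationWithIdentifier:@"com.example.synq.upload"];
NSURLSession *session = [NSURLSession sessionWithConfiguration:config];
```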

Import the SynqUploader header

#import <SynqUploader/SynqUploader.h>

Set the SQVideoUploadDelegate to handle upload results

[[SynqUploader sharedInstance] setDelegate:self];

Create a SQVideoUpload object for each PHAsset to be uploaded

SQVideoUpload *video = [[SQVideoUpload alloc] initWithPHAsset:asset];
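For example, an upload list covering all videos in the Photos library could be built like this (the fetch options here are illustrative):

```objc
#import <Photos/Photos.h>
#import <SynqUploader/SynqUploader.h>

// Fetch all videos from the Photos library and wrap each PHAsset
PHFetchResult<PHAsset *> *assets =
    [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeVideo options:nil];
NSMutableArray<SQVideoUpload *> *videoArray = [NSMutableArray array];
[assets enumerateObjectsUsingBlock:^(PHAsset *asset, NSUInteger idx, BOOL *stop) {
    [videoArray addObject:[[SQVideoUpload alloc] initWithPHAsset:asset]];
}];
```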

Set upload parameters for the SQVideoUpload object

To do this, you must do two things:

  1. Create a video object in the SYNQ API, and
  2. Fetch upload parameters from the API for the created video object.

In the example project, the SynqHttpLib pod and the example server (SYNQ-Nodejs-example-server) perform these two steps in one call via the function createVideoAndGetParamsWithSuccess:.
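A sketch of that call, assuming a configured SynqHttpLib client as in the example project; the setUploadParameters: setter name is hypothetical, so consult the SQVideoUpload header for the actual parameter API:

```objc
// Assumes a configured SynqHttpLib client, as in the example project.
// setUploadParameters: is a hypothetical name for the parameter setter.
[client createVideoAndGetParamsWithSuccess:^(NSDictionary *jsonResponse) {
    [video setUploadParameters:jsonResponse];
} httpFailureBlock:^(NSURLSessionDataTask *task, NSError *error) {
    // Could not create the video or fetch parameters; handle the error
    NSLog(@"SYNQ error: %@", error.localizedDescription);
}];
```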


Add each SQVideoUpload object to an NSArray and call the upload function

[[SynqUploader sharedInstance] uploadVideoArray:videoArray
                            exportProgressBlock:^(double exportProgress) {
    // Report export progress to UI
    [self.progressView setProgress:exportProgress];
} uploadProgressBlock:^(double uploadProgress) {
    // uploadProgress is between 0.0 and 100.0; scale it before reporting to UI
    [self.progressView setProgress:uploadProgress / 100.0];
}];


The outcome of each upload is reported through the SQVideoUploadDelegate methods:

- (void) videoUploadCompleteForVideo:(SQVideoUpload *)video;

A video is successfully uploaded.

- (void) videoUploadFailedForVideo:(SQVideoUpload *)video;

There was an error uploading a video.

- (void) allVideosUploadedSuccessfully;

All videos were successfully uploaded.
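Taken together, a minimal delegate implementation might look like this (logging only; a real app would update its UI):

```objc
// Illustrative SQVideoUploadDelegate implementation
- (void)videoUploadCompleteForVideo:(SQVideoUpload *)video {
    NSLog(@"Upload complete for one video");
}

- (void)videoUploadFailedForVideo:(SQVideoUpload *)video {
    NSLog(@"Upload failed for one video");
}

- (void)allVideosUploadedSuccessfully {
    NSLog(@"All uploads finished");
}
```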

Play uploaded videos

The example project included in this repo contains functionality for playing back your uploaded videos. This consists of a table view that lists all uploaded videos. Selecting one of the videos will open a new view controller (an instance of AVPlayerViewController) with a video player configured to play the selected video. This example uses the HLS output format as source for the video playback. The various sources for video playback can be found under the "outputs" field of the video object:

"outputs": {
  "hls": {
    "url": "",
    "state": "complete"
  "mp4_360": {
    "url": "",
    "state": "complete"
  "mp4_720": {
    "url": "",
    "state": "complete"
  "mp4_1080": {
    "url": "",
    "state": "complete"
  "webm_720": {
    "url": "",
    "state": "complete"

Please note: the "url" field is only present when the state is "complete", i.e. when transcoding of the video file has finished. The state may also read "submitted" or "progressing", meaning transcoding is not yet complete and no output URL is available.
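Accordingly, check the output state before using the URL. A sketch, assuming the video object JSON has already been parsed into an NSDictionary named videoJson:

```objc
// videoJson is assumed to hold the parsed video object from the API
NSDictionary *hlsOutput = videoJson[@"outputs"][@"hls"];
if ([hlsOutput[@"state"] isEqualToString:@"complete"]) {
    NSString *urlString = hlsOutput[@"url"];
    // Safe to configure the player with urlString
} else {
    // "submitted" or "progressing": transcoding not finished, no URL yet
}
```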

Having obtained the URL for the needed output format, presenting the video player view is accomplished by configuring an instance of the AVPlayerViewController:

#import <AVKit/AVKit.h>
#import <AVFoundation/AVFoundation.h>

// Convert url string to URL
NSString *urlString;    // the url string fetched from the video object JSON
NSURL *videoUrl = [NSURL URLWithString:urlString];

// Configure AVPlayerViewController with an AVPlayer
AVPlayerViewController *avPlayerViewController = [[AVPlayerViewController alloc] init];
avPlayerViewController.player = [[AVPlayer alloc] initWithURL:videoUrl];

// Present the player view controller and start playback
[self presentViewController:avPlayerViewController animated:YES completion:^{
    [avPlayerViewController.player play];
}];



SynqStreamer

This part of the SDK comprises a framework with the core streamer functionality and a resource bundle containing a compiled storyboard with a fully configured view controller for video streaming. Functions for configuring the video stream and displaying the streamer view are exposed through the SynqStreamer.h header file.

Create an instance of SynqStreamer:

SynqStreamer *streamer = [[SynqStreamer alloc] init];

These are the steps needed to set up the video streamer:

  1. Create a video object configured for live streaming in the SYNQ API and get the returned parameters
  2. Set the stream URL in the SynqStreamer function - (void) setStreamUrl:(NSString *)streamUrl
  3. Get the streamer view by calling the SynqStreamer function - (AppNavigationController *) getStreamerViewWithNavigationController
  4. Present the streamer view (the view controller is embedded in a navigation controller). You can use the function presentViewController: animated: completion:
  5. The stream (rec) button is disabled by default. Once the stream URL returned in step 1 has been set, streaming is ready to start, and you can enable the button by calling SynqStreamer's - (void) setStreamButtonEnabled:(BOOL)enabled

The example app included in this repo shows how you can create the video object and get the stream URL using the SynqHttpLib in connection with our NodeJS example server. We simply call this function in SynqHttpLib:

[client createVideoAndGetStreamParamsWithSuccess:^(NSDictionary *jsonResponse) {
    // Get stream URL from parameters
    NSString *streamUrl = [jsonResponse objectForKey:@"stream_url"];
} httpFailureBlock:^(NSURLSessionDataTask *task, NSError *error) {
    // An error occurred, handle error
}];

Set the parameters to perform steps 2 and 5:

[streamer setStreamUrl:streamUrl];
[streamer setStreamButtonEnabled:YES];

Configure and present the streamer view, steps 3 and 4:

AppNavigationController *navController = [streamer getStreamerViewWithNavigationController];
[self presentViewController:navController animated:YES completion:nil];

Now you can start and stop the live video stream as you wish in the streamer view. There is also a settings view (press the cog icon) where you can configure video parameters like resolution, sample rate, audio channel count, etc.

Important note

This SDK depends on access to the SYNQ API to create video objects and to fetch the parameters needed when uploading or playing videos. To use the API, you need an API key. Each API key corresponds to a project; a project is a collection of videos and the settings for those videos, such as webhooks. Register for a free API key. Your app should never communicate directly with our API: keep your API key as a secret on your own server, and have that server authenticate requests from your app before making HTTP calls to SYNQ. If you do not have a server configured to access the SYNQ API, you can use our NodeJS example server as a way to get started. Please note that this server should never be used in production!


Requirements

This SDK requires iOS 9 or above.


Author

Kjartan Vestvik, [email protected]


License

SynqObjC is available under the MIT license. See the LICENSE file for more info.

Latest podspec

{
    "name": "SynqObjC",
    "version": "0.4.0",
    "summary": "SynqObjC is an Objective-C SDK that lets you easily add SYNQ video functionality to your mobile app",
    "description": "This SDK contains what you need to make your app interact with the SYNQ video API. The SDK contains functionality for accessing the videos on the device, and for uploading videos into the SYNQ infrastructure. The SDK also lets you add live video streaming functionality to your app, containing a pre-configured view controller with UI elements for controlling a live video stream.\nPlease note: this pod is an add-on to the SYNQ video API and is of no use unless you already have created a service for accessing the API, either directly or by using one of our SDKs.",
    "homepage": "",
    "social_media_url": "",
    "license": {
        "type": "MIT",
        "file": "LICENSE"
    },
    "authors": {
        "Kjartan Vestvik": "[email protected]"
    },
    "source": {
        "git": "",
        "tag": "0.4.0"
    },
    "screenshots": "",
    "platforms": {
        "ios": "9.0"
    },
    "vendored_frameworks": "SynqObjC/SynqStreamer.framework",
    "resources": "SynqObjC/Assets/SynqStreamerResources.bundle",
    "source_files": "SynqObjC/Classes/**/*",
    "public_header_files": "SynqObjC/Classes/*.h",
    "dependencies": {
        "AFNetworking": [
            "~> 3.0"
        ],
        "SynqHttpLib": [
            "~> 0.3"
        ]
    }
}
