Latest 0.0.1
Homepage https://github.com/gdfairclough/EmotionalSwift
License MIT
Platforms iOS 10.0, requires ARC
Authors

Swift Wrapper for Microsoft Cognitive Services Emotion API

Usage

This wrapper allows you to easily pull emotional data from the Microsoft Cognitive Services Emotion API.

In order to use the API, you will need to sign up for a Microsoft account and use your account to retrieve an API key.

This API key will be used to create an EmotionalDataRequester object:

let requester = EmotionalDataRequester(apiKey: key)

Once the requester object is created, it will be used to make a request to Microsoft’s servers. You must supply image data in
the call, along with a closure that will allow you to handle the response.
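One way to obtain that image data is to load a UIImage and encode it as JPEG. This is a minimal sketch, not part of the wrapper itself; the asset name "portrait" is illustrative, and UIImageJPEGRepresentation is the Swift 3 era API matching the pod's pushed_with_swift_version of 3.0:

```swift
import UIKit

// Load a bundled image (hypothetical asset name) and encode it as JPEG data
// at 80% quality. The resulting Data is what the requester call expects.
let image = UIImage(named: "portrait")
let data = image.flatMap { UIImageJPEGRepresentation($0, 0.8) }
```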

The requester call looks like this:

requester.requestEmotionalData(for: data) { result in
    switch result {
    case .success(let faces):
        // Work with the faces returned by the API
        break
    case .failure(let error):
        // The network request failed with an error
        break
    }
}

The array of Face structs returned in the success case is sorted from largest face to smallest face, in terms of the height and width of the face as calculated by the API.
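The wrapper performs this sort internally, but the largest-first ordering can be sketched as a sort on the face rectangle's area. The structs below are hypothetical mirrors of the wrapper's types, defined here only so the example is self-contained:

```swift
// Hypothetical mirrors of the wrapper's structs, for illustration only.
struct FaceRectangle { let width: Int; let height: Int }
struct Face { let faceRect: FaceRectangle }

// Largest-first: sort by the area of each face's bounding rectangle.
func sortedLargestFirst(_ faces: [Face]) -> [Face] {
    return faces.sorted {
        $0.faceRect.width * $0.faceRect.height >
        $1.faceRect.width * $1.faceRect.height
    }
}

let faces = [Face(faceRect: FaceRectangle(width: 40, height: 50)),
             Face(faceRect: FaceRectangle(width: 120, height: 150))]
let ordered = sortedLargestFirst(faces)
print(ordered.first?.faceRect.width ?? 0)  // 120
```

Because of this ordering, the most prominent face in the image is simply the first element of the returned array.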

The closure will be called with either a failure result or a success result. The failure result includes information about
the reason for the failure. The success result includes an array of Face objects that represent the faces in the image
and the emotional values for each face as determined by the API.

Accessing information that was calculated by the API is done through the Face struct.

face.scores
face.faceRect

scores will provide the emotional values for the face. The API returns a value between 0.0 and 1.0 for each of 8 emotions:

  • anger
  • contempt
  • disgust
  • fear
  • happiness
  • neutral
  • sadness
  • surprise

Each face in the picture provided to the API will get a score for each of the 8 emotions, and these scores are accessed on the Face struct's scores property.
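A common use of these values is to pick the face's dominant emotion. This is a sketch only; the Scores struct below is a hypothetical mirror of the wrapper's type, assuming it exposes each emotion as a Double property:

```swift
// Hypothetical mirror of the wrapper's Scores struct, for illustration only.
struct Scores {
    let anger, contempt, disgust, fear: Double
    let happiness, neutral, sadness, surprise: Double

    // Pair each emotion name with its value so the scores can be ranked.
    var all: [(emotion: String, value: Double)] {
        return [("anger", anger), ("contempt", contempt),
                ("disgust", disgust), ("fear", fear),
                ("happiness", happiness), ("neutral", neutral),
                ("sadness", sadness), ("surprise", surprise)]
    }
}

let scores = Scores(anger: 0.01, contempt: 0.0, disgust: 0.0, fear: 0.0,
                    happiness: 0.92, neutral: 0.05, sadness: 0.01, surprise: 0.01)

// The dominant emotion is the highest-scoring entry.
if let dominant = scores.all.max(by: { $0.value < $1.value }) {
    print(dominant.emotion)  // happiness
}
```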

faceRect will provide the positional information for the current Face struct, represented by a FaceRectangle struct. You will receive 4 pieces of positional information for each face:

  • width
  • height
  • top
  • left

You can use these items to determine where each face is located in the picture provided.
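For example, the four values can be turned into a CGRect for drawing an overlay on the original image. The FaceRectangle struct below is a hypothetical mirror of the wrapper's type, defined here so the sketch is self-contained; top and left are assumed to be pixel offsets from the image's top-left corner:

```swift
import CoreGraphics

// Hypothetical mirror of the wrapper's FaceRectangle struct, for illustration.
struct FaceRectangle {
    let width: Int
    let height: Int
    let top: Int
    let left: Int
}

// Convert the API's pixel coordinates into a CGRect suitable for drawing
// a bounding box over the face in the original image.
func cgRect(for rect: FaceRectangle) -> CGRect {
    return CGRect(x: rect.left, y: rect.top, width: rect.width, height: rect.height)
}

let box = cgRect(for: FaceRectangle(width: 100, height: 120, top: 40, left: 60))
print(box)  // (60.0, 40.0, 100.0, 120.0)
```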

For more information on Microsoft Cognitive Services' Emotion API, and to sign up for authentication keys, visit
Microsoft's website.

Latest podspec

    {
        "name": "EmotionalSwift",
        "version": "0.0.1",
        "summary": "Swift Wrapper for Microsoft Congitve Services Emotion API",
        "homepage": "https://github.com/gdfairclough/EmotionalSwift",
        "license": {
            "type": "MIT",
            "file": "LICENSE"
        },
        "authors": {
            "Dale Fairclough": "[email protected]"
        },
        "social_media_url": "http://twitter.com/faircoder",
        "platforms": {
            "ios": "10.0"
        },
        "requires_arc": true,
        "source": {
            "git": "https://github.com/gdfairclough/EmotionalSwift.git",
            "tag": "v0.0.1",
            "submodules": true
        },
        "source_files": "EmotionalSwift/**/*.{h,swift}",
        "pushed_with_swift_version": "3.0"
    }
