Latest 0.0.1
License MIT
Platforms ios 11.0
Dependencies TesseractOCRiOS
Frameworks UIKit, Foundation

OCRSlicer iOS

  • No OS X support.
  • Strict requirement on language files existing in a referenced "tessdata" folder.


Using CocoaPods, just add the following line to your Podfile:

pod 'OCRSlicer'

and run:

pod install
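Put together, a minimal Podfile might look like the following sketch. The target name `YourApp` is a placeholder for your own app target:

```ruby
# Minimal Podfile sketch for integrating OCRSlicer.
platform :ios, '11.0'   # the pod requires iOS 11.0 or later
use_frameworks!

target 'YourApp' do     # placeholder target name
  pod 'OCRSlicer'
end
```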


The library exposes two public methods:

One just detects text and returns images (slices) of the regions where text was found:

public func slice(image: UIImage, completion: @escaping ((_: [UIImage]) -> Void))

The other also runs OCR, returning each detected string paired with its slice:

public func sliceaAndOCR(image: UIImage, charWhitelist: String, charBlackList: String = "", completion: @escaping ((_: String, _: UIImage) -> Void))
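A hypothetical call site for both methods is sketched below. The `OCRSlicer()` initializer, the `inputImage` asset name, and the module import are assumptions not confirmed by this page; tessdata setup is omitted (remember the library strictly requires the language files to be present in a referenced "tessdata" folder):

```swift
import UIKit
import OCRSlicer  // assumption: the module name matches the pod name

let slicer = OCRSlicer()  // hypothetical initializer
let inputImage = UIImage(named: "receipt")!  // placeholder image

// Detect text regions only; the completion receives one UIImage per slice.
slicer.slice(image: inputImage) { slices in
    print("Found \(slices.count) text regions")
}

// Additionally run OCR; the completion is invoked with each recognized
// string and the slice it came from. charBlackList defaults to "".
slicer.sliceaAndOCR(image: inputImage, charWhitelist: "0123456789") { text, slice in
    print("Recognized: \(text) in a \(slice.size) slice")
}
```

Note that the completions are `@escaping`, so they may be invoked asynchronously after detection finishes.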

Screenshot of the example project, showing the original input image on top, and below it the detected text together with the slices of the original image where it was found.



OCRSlicer iOS is distributed under the MIT license.

Tesseract, maintained by Google, is distributed under the Apache 2.0 license.

Latest podspec

    "name": "OCRSlicer",
    "version": "0.0.1",
    "summary": "A quick solution to detect text in images providing slices of the image containing it and the String detected.",
    "description": "Detects the presence of texts on your UIImage, slices the words in different exportable images together with the string detected (using TesseractOCRiOS)",
    "homepage": "",
    "license": "MIT",
    "authors": {
        "Roberto Ferraz": ""
    "platforms": {
        "ios": "11.0"
    "source": {
        "git": "",
        "tag": "1.0.0"
    "source_files": [
    "exclude_files": "OCRSlicer/Example/**/*.*",
    "frameworks": [
    "libraries": "z",
    "dependencies": {
        "TesseractOCRiOS": [
            "~> 4.0.0"
