Swiftpack.co - mapbox/mapbox-speech-swift as Swift Package

mapbox/mapbox-speech-swift
Natural-sounding text-to-speech in Swift or Objective-C on iOS, macOS, tvOS, and watchOS
.package(url: "https://github.com/mapbox/mapbox-speech-swift.git", from: "2.0.0-rc.1")

Mapbox Speech


Mapbox Speech connects your iOS, macOS, tvOS, or watchOS application to the Mapbox Voice API. Take turn instructions from the Mapbox Directions API and read them aloud naturally in multiple languages. This library is specifically designed to work with mapbox-directions-swift as part of the Mapbox Navigation SDK for iOS.

This library is compatible with applications written in Swift. Version 2.0 was the last version of this library to support applications written in Objective-C or AppleScript.

Getting started

Specify the following dependency in your Carthage Cartfile:

github "mapbox/mapbox-speech-swift" ~> 1.0

Or in your CocoaPods Podfile:

pod 'MapboxSpeech', '~> 1.0'

Or in your Swift Package Manager Package.swift:

.package(url: "https://github.com/mapbox/mapbox-speech-swift.git", from: "1.0.0")

Then import MapboxSpeech (or @import MapboxSpeech; in Objective-C).

Usage

You’ll need a Mapbox access token in order to use the API. If you’re already using the Mapbox Maps SDK for iOS or the Mapbox macOS SDK, Mapbox Speech automatically recognizes your access token, as long as you’ve placed it in the MBXAccessToken key of your application’s Info.plist file.

Basics

The main speech synthesis class is SpeechSynthesizer. Create a speech synthesizer object using your access token:

import MapboxSpeech

let speechSynthesizer = SpeechSynthesizer(accessToken: "<#your access token#>")

Alternatively, you can place your access token in the MBXAccessToken key of your application’s Info.plist file, then use the shared speech synthesizer object:

// main.swift
let speechSynthesizer = SpeechSynthesizer.shared

With the speech synthesizer object in hand, construct a SpeechOptions or MBSpeechOptions object and pass it into the SpeechSynthesizer.audioData(with:completionHandler:) method.

// main.swift

let options = SpeechOptions(text: "hello, my name is Bobby")
speechSynthesizer.audioData(with: options) { (data: Data?, error: NSError?) in
    guard error == nil else {
        print("Error calculating directions: \(error!)")
        return
    }
    
    // Do something with the audio!
}
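
To actually play the synthesized audio, you can hand the returned data to any audio API. The snippet below is a minimal sketch rather than part of the library's documented API: it reuses the speechSynthesizer and completion handler signature from the example above, and assumes the returned data is in a format AVAudioPlayer can decode (such as MP3).

// main.swift

import AVFoundation

// Keep a strong reference to the player so playback isn't cut short by deallocation.
var audioPlayer: AVAudioPlayer?

let speechOptions = SpeechOptions(text: "In one mile, turn right onto Main Street.")
speechSynthesizer.audioData(with: speechOptions) { (data: Data?, error: NSError?) in
    guard let data = data, error == nil else {
        print("Error synthesizing speech: \(String(describing: error))")
        return
    }
    do {
        audioPlayer = try AVAudioPlayer(data: data)
        audioPlayer?.play()
    } catch {
        print("Error playing audio: \(error)")
    }
}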

GitHub

Stars: 11
Last commit: 2 weeks ago


Release Notes

v2.0.0-rc.1
2 weeks ago

Changes since v2.0.0-alpha.1:

  • Added support for serviceAccessToken (#45).
  • Disabled code coverage report.
