Swiftpack.co - mapbox/mapbox-speech-swift as Swift Package

mapbox/mapbox-speech-swift v2.0.0
Natural-sounding text-to-speech in Swift or Objective-C on iOS, macOS, tvOS, and watchOS
⭐️ 15
🕓 30 weeks ago
.package(url: "https://github.com/mapbox/mapbox-speech-swift.git", from: "2.0.0")

Mapbox Speech


Mapbox Speech connects your iOS, macOS, tvOS, or watchOS application to the Mapbox Voice API. Take turn instructions from the Mapbox Directions API and read them aloud naturally in multiple languages. This library is specifically designed to work with mapbox-directions-swift as part of the Mapbox Navigation SDK for iOS.

This library is compatible with applications written in Swift. Version 2.0 was the last version of this library to support applications written in Objective-C or AppleScript.

Getting started

Specify the following dependency in your Carthage Cartfile:

github "mapbox/mapbox-speech-swift" ~> 2.0

Or in your CocoaPods Podfile:

pod 'MapboxSpeech', '~> 2.0'

Or in your Swift Package Manager Package.swift:

.package(url: "https://github.com/mapbox/mapbox-speech-swift.git", from: "2.0.0")
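If you’re using Swift Package Manager, you’ll also need to add the MapboxSpeech product to your target’s dependencies. A minimal Package.swift might look like the following sketch (the package and target name "MyApp" is illustrative):

```swift
// swift-tools-version:5.5
import PackageDescription

let package = Package(
    name: "MyApp", // illustrative name
    dependencies: [
        .package(url: "https://github.com/mapbox/mapbox-speech-swift.git", from: "2.0.0"),
    ],
    targets: [
        .target(
            name: "MyApp",
            dependencies: [
                .product(name: "MapboxSpeech", package: "mapbox-speech-swift"),
            ]
        ),
    ]
)
```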

Then import MapboxSpeech or @import MapboxSpeech;.

Usage

You’ll need a Mapbox access token in order to use the API. If you’re already using the Mapbox Maps SDK for iOS or macOS, Mapbox Speech automatically recognizes your access token, as long as you’ve placed it in the MBXAccessToken key of your application’s Info.plist file.
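For reference, the corresponding Info.plist entry looks like this (the token value is a placeholder):

```xml
<key>MBXAccessToken</key>
<string>your access token here</string>
```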

Basics

The main speech synthesis class is SpeechSynthesizer. Create a speech synthesizer object using your access token:

import MapboxSpeech

let speechSynthesizer = SpeechSynthesizer(accessToken: "<#your access token#>")

Alternatively, you can place your access token in the MBXAccessToken key of your application’s Info.plist file, then use the shared speech synthesizer object:

// main.swift
let speechSynthesizer = SpeechSynthesizer.shared

With a speech synthesizer object in hand, construct a SpeechOptions (or, in Objective-C, MBSpeechOptions) object and pass it into the SpeechSynthesizer.audioData(with:completionHandler:) method.

// main.swift

let options = SpeechOptions(text: "hello, my name is Bobby")
speechSynthesizer.audioData(with: options) { (data: Data?, error: NSError?) in
    guard error == nil else {
        print("Error synthesizing speech: \(error!)")
        return
    }
    
    // Do something with the audio!
}
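The completion handler hands you raw audio data. One way to play it back is with AVFoundation’s AVAudioPlayer, sketched below; the SpeechPlayer class and speak(_:) method are illustrative, and the closure signature mirrors the snippet above:

```swift
import AVFoundation
import MapboxSpeech

class SpeechPlayer {
    // Keep a strong reference so the player isn't deallocated mid-utterance.
    var audioPlayer: AVAudioPlayer?
    let speechSynthesizer = SpeechSynthesizer(accessToken: "<#your access token#>")

    func speak(_ text: String) {
        let options = SpeechOptions(text: text)
        speechSynthesizer.audioData(with: options) { (data: Data?, error: NSError?) in
            guard let data = data, error == nil else { return }
            // AVAudioPlayer can decode the compressed audio the Voice API returns.
            self.audioPlayer = try? AVAudioPlayer(data: data)
            self.audioPlayer?.play()
        }
    }
}
```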


Release Notes

v2.0.0
30 weeks ago

Changes since v1.0.0:

  • MBXAccessToken is now used as the default access token; if it isn’t found in Info.plist, MGLMapboxAccessToken will be used. (#42)
  • Added support for serviceAccessToken (#45).
  • Disabled code coverage report.
