Swiftpack.co - facemoji/mocap4face as Swift Package

facemoji/mocap4face 0.3.0
Cross-platform SDK for facial motion capture, producing blendshapes and rigid head poses in 3D space in real time from photos or videos.
⭐️ 456
🕓 18 weeks ago
iOS macOS
.package(url: "https://github.com/facemoji/mocap4face.git", from: "0.3.0")

mocap4face by alter

mocap4face by alter is a free, multiplatform SDK for real-time facial motion capture based on the Facial Action Coding System (FACS). It provides real-time FACS-derived blendshape coefficients and a rigid head pose in 3D space from any mobile camera, webcam, photo, or video, enabling live animation of 3D avatars, digital characters, and more.

After fetching input from one of these sources, the mocap4face SDK produces data as ARKit-compatible blendshapes, i.e., morph-target weight values as a per-frame expression, shown in the video below. This is useful for, e.g., animating a 2D or 3D avatar so that it mimics the user's facial expressions in real time à la Apple Memoji, but without the need for a hardware TrueDepth camera.
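As a rough illustration of what consuming such per-frame output looks like, the sketch below applies a map of blendshape coefficients to a morph-target rig. The `MorphTargetRig` and `applyBlendshapes` names are illustrative only, not part of the mocap4face API; coefficients are assumed normalized to [0, 1].

```typescript
// Sketch: applying per-frame blendshape coefficients to a morph-target rig.
// These names are illustrative, not part of the mocap4face API.
type Blendshapes = Record<string, number>;

interface MorphTargetRig {
  // Maps ARKit-style blendshape names (e.g. "jawOpen") to morph-target weights.
  influences: Record<string, number>;
}

function applyBlendshapes(rig: MorphTargetRig, frame: Blendshapes): void {
  for (const [name, weight] of Object.entries(frame)) {
    // Clamp to [0, 1] so a noisy coefficient cannot over-drive the mesh.
    rig.influences[name] = Math.min(1, Math.max(0, weight));
  }
}

const rig: MorphTargetRig = { influences: {} };
applyBlendshapes(rig, { jawOpen: 0.42, eyeBlinkLeft: 1.2 });
console.log(rig.influences.jawOpen);      // 0.42
console.log(rig.influences.eyeBlinkLeft); // 1 (clamped)
```

In a real renderer the clamped weights would be forwarded to the engine's morph-target (shape-key) system each frame.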

With mocap4face, you can drive live avatars or NFT PFPs, build Snapchat-like lenses, AR experiences, face filters that trigger actions, VTubing apps, and more with as little energy impact and CPU/GPU use as possible. As an example, check out how the popular avatar live-streaming app REALITY is using our SDK.

Please star us ⭐⭐⭐ on GitHub—it motivates us a lot!

📋 Table of Contents

🤓 Tech Specs

✨ Key Features

  • 42 tracked facial expressions via blendshapes
  • Eye tracking including eye gaze vector
  • Tongue tracking
  • Light & fast, just 3MB ML model size
  • ≤ ±50° pitch, ≤ ±40° yaw, and ≤ ±30° roll tracking coverage
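The coverage limits quoted above can be expressed as a simple gate on the tracked head pose, e.g., to fade out an avatar when the head turns too far. The helper below is illustrative, not part of the SDK.

```typescript
// Sketch: gating tracker output by the coverage limits quoted above.
// isWithinCoverage is an illustrative helper, not an SDK function.
const COVERAGE = { pitch: 50, yaw: 40, roll: 30 }; // degrees

function isWithinCoverage(pitch: number, yaw: number, roll: number): boolean {
  return (
    Math.abs(pitch) <= COVERAGE.pitch &&
    Math.abs(yaw) <= COVERAGE.yaw &&
    Math.abs(roll) <= COVERAGE.roll
  );
}

console.log(isWithinCoverage(30, -20, 10)); // true: well inside coverage
console.log(isWithinCoverage(0, 45, 0));    // false: yaw beyond ±40°
```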

🤳 Input

  • Any RGB camera
  • Photo
  • Video

📦 Output

  • ARKit-compatible blendshapes
  • Head position and scale in 2D and 3D
  • Head rotation in world coordinates
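One common use of the 2D output is overlaying graphics on the camera feed. The sketch below composes a CSS-style transform from a 2D position, scale, and roll; the `HeadPose2D` shape is an assumption for illustration, so consult the SDK docs for the real result type.

```typescript
// Sketch: turning a 2D head position + scale + roll into a CSS-style
// transform for overlaying graphics on the camera feed.
// HeadPose2D is illustrative, not the SDK's actual result type.
interface HeadPose2D {
  x: number;      // head center, normalized to [0, 1]
  y: number;
  scale: number;  // relative head size
  rollDeg: number;
}

function overlayTransform(pose: HeadPose2D, viewW: number, viewH: number): string {
  const px = pose.x * viewW; // convert normalized coords to pixels
  const py = pose.y * viewH;
  return `translate(${px}px, ${py}px) rotate(${pose.rollDeg}deg) scale(${pose.scale})`;
}

const t = overlayTransform({ x: 0.5, y: 0.25, scale: 1.5, rollDeg: -10 }, 640, 480);
console.log(t); // "translate(320px, 120px) rotate(-10deg) scale(1.5)"
```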

⚡ Performance

  • 50 FPS on Pixel 4
  • 60 FPS on iPhone SE (1st gen)
  • 90 FPS on iPhone X or newer

💿 Installation


Prerequisites

  1. Create a dev account at studio.facemoji.co
  2. Generate a unique API key for your app
  3. Paste the API key to your source code
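Rather than hard-coding the key, one common pattern, not specific to this SDK, is to read it from the environment at startup. The variable name `FACEMOJI_API_KEY` below is an arbitrary example; how the key is then handed to the tracker is SDK-specific and not shown here.

```typescript
// Sketch: loading the API key from the environment instead of committing
// it to source control. FACEMOJI_API_KEY is an arbitrary example name.
function loadApiKey(): string {
  const key = process.env.FACEMOJI_API_KEY;
  if (!key) {
    throw new Error("Set FACEMOJI_API_KEY to the key from studio.facemoji.co");
  }
  return key;
}

process.env.FACEMOJI_API_KEY = "demo-key"; // stand-in for a real key
console.log(loadApiKey()); // "demo-key"
```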


iOS

  1. Open the sample Xcode project and run the demo on your iOS device
  2. To use the SDK in your project, either use the bundled XCFramework directly or use the Swift Package Manager (this repository also serves as a Swift PM repository)


Android

  1. Open the sample project in Android Studio and run the demo on your Android device
  2. Add this repository to the list of your Maven repositories in your root build.gradle, for example:
allprojects {
    repositories {
        // Any other repositories here...

        maven {
            name = "Facemoji"
            url = uri("https://facemoji.jfrog.io/artifactory/default-maven-local/")
        }
    }
}
  3. To use the SDK in your project, add implementation 'co.facemoji:mocap4face:0.3.0' to your Gradle dependencies


Web

  1. Open the sample project under js-example in an editor of your choice
  2. Run npm install && npm run dev to start a local server with the demo
  3. Run npm install && npm run dev_https to start a local server with self-signed HTTPS support
  4. Run npm install @facemoji/mocap4face in your own project to add mocap4face as a dependency

If the webcam button is not working, you might need to use HTTPS for the local dev server. Run npm run dev_https and accept the self-signed certificate in the browser to start the demo in HTTPS mode.
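The HTTPS requirement comes from the browser, not the SDK: getUserMedia is only available in a secure context (an https:// page or localhost). The pure helper below mirrors that browser rule and is only an illustration.

```typescript
// Sketch: browsers expose getUserMedia only in a secure context
// (https:// pages, localhost, or 127.0.0.1), which is why the plain-HTTP
// dev server cannot open the webcam. This pure helper mirrors that rule.
function needsHttpsUpgrade(protocol: string, hostname: string): boolean {
  const isLocal = hostname === "localhost" || hostname === "127.0.0.1";
  return protocol !== "https:" && !isLocal;
}

console.log(needsHttpsUpgrade("http:", "example.com"));  // true: camera blocked
console.log(needsHttpsUpgrade("http:", "localhost"));    // false: localhost is a secure context
console.log(needsHttpsUpgrade("https:", "example.com")); // false
```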

You can also run npm run build to create a production bundle of the demo app.

🚀 Use Cases

  • AR for NFT profile pics
  • Live avatar experiences
  • Snapchat-like lenses
  • AR experiences
  • VTubing apps
  • Live streaming apps
  • Face filters
  • AR games with facial triggers
  • Beauty AR
  • Virtual try-on
  • Play-to-earn games

❤️ Links

📄 License

This library is provided under the Facemoji SDK License Agreement—see LICENSE. Also make sure to check out our FAQ for more details.

The sample code in this repository is provided under the Facemoji Samples License.

🙏 Notices

OSS used in the mocap4face SDK: this library transitively uses open source software; see the full list of our OSS dependencies and license notices.

Original video by LaBeouf, Rönkkö & Turner.


Something's broken? Yell at me @ptrpavlik. Praise and feedback (and money) are also welcome.

Release Notes

18 weeks ago

0.3.0 - 2022-02-18


Changed

  • Android dependency coordinates changed from co.facemoji:mocap4face to alter:mocap4face
  • NPM dependency coordinates changed from facemoji/mocap4face to 0xalter/mocap4face


Improved

  • Android now supports non-OES-external OpenGL textures as input
  • Improved performance on JavaScript


Added

  • CameraWrapper for iOS and JavaScript for easier camera access
  • iOS M1 Simulator support
  • Protocol-buffers-based face tracker result serialization
    • Replaces the deprecated functions FaceTrackerResult.serialize() and deserializeResult()

Swiftpack is being maintained by Petr Pavlik | @ptrpavlik | @swiftpackco | API | Analytics