Core by Alter

Core by Alter is a cross-platform SDK consisting of a real-time 3D avatar system and facial motion capture built from scratch for web3 interoperability and the open metaverse. Easily pipe avatars into your game, app or website. It just works. Check out the included code samples to learn how to get started. Try the live demo.

Please star us ⭐⭐⭐ on GitHub—it motivates us a lot!

📋 Table of Contents

  • 🤓 Tech Specs
  • 🤪 Motion Capture
  • 💿 Installation

🤓 Tech Specs

🚉 Supported Platforms

  • iOS 13+
  • Android 8+
  • WebGL 2
  • macOS (WIP)
  • Windows (WIP)
  • Unity (Soon)
  • Unreal (Soon)

✨ Avatar Formats

  • Head only
  • A bust with clothing
  • Accessories only (e.g. for AR filters) (Soon)
  • Full body (Soon)

🌈 Variability

  • Human and non-human
  • From toddler to skeleton
  • All genders, including non-binary
  • Full range of diversity

🤪 Motion Capture

✨ Features

  • 42 tracked facial expressions via blendshapes
  • Eye tracking including eye gaze vector
  • Tongue tracking
  • Light & fast, just 3MB ML model size
  • ≤ ±50° pitch, ≤ ±40° yaw and ≤ ±30° roll tracking coverage
  • 3D reprojection to input photo/video
  • Platform-suited API and packaging with internal optimizations
  • Simultaneous back and front camera support

🤳 Input

  • Any webcam
  • Photo
  • Video
  • Audio

📦 Output

  • ARKit-compatible blendshapes
  • Head position and scale in 2D and 3D
  • Head rotation in world coordinates
  • Eye tracking including eye gaze vector
  • 3D reprojection to the input photo/video
  • Tongue tracking
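
For illustration, one frame of the output described above could be modeled roughly as the plain object below. This is only a sketch: the property names are hypothetical, and the actual result shape is defined by the SDK and its code samples.

// Hypothetical per-frame tracking result (illustrative only, not the SDK's real API).
const trackingResult = {
  // ARKit-compatible blendshape weights in the range 0..1 (42 expressions in total)
  blendshapes: { jawOpen: 0.42, eyeBlinkLeft: 0.08, tongueOut: 0.0 /* ... */ },
  // Head position and scale in 2D (pixels) and in 3D (scene units)
  head2D: { x: 512, y: 384, scale: 1.2 },
  head3D: { x: 0.0, y: 0.1, z: -0.5, scale: 1.0 },
  // Head rotation in world coordinates (degrees)
  rotation: { pitch: 5.0, yaw: -12.0, roll: 2.5 },
  // Eye gaze direction as a unit vector
  gaze: { x: 0.1, y: -0.05, z: 0.99 },
};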

⚡ Performance

  • 50 FPS on Pixel 4
  • 60 FPS on iPhone SE (1st gen)
  • 90 FPS on iPhone X or newer

💡 More information

If you only need the facial tracking technology, check out our mocap4face repository!

💿 Installation

Browser/JavaScript

To run the example, go to the js-example project and run the npm install and npm run dev commands.
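
Assuming the example lives in the repository's js-example directory, that amounts to:

cd js-example
npm install
npm run dev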

Do not forget to get your API key at studio.alter.xyz and paste it into the code. Look for "YOUR-API-KEY-HERE".

NPM Installation

Install the dependency with npm or yarn.

npm install @0xalter/alter-core

If you are using a bundler (such as Webpack), make sure to copy the assets from @0xalter/alter-core to your serving directory. See our Webpack config for an example of what needs to be copied.
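
With Webpack, one common way to do this is copy-webpack-plugin, sketched below. The assets path inside @0xalter/alter-core is an assumption here; check the package contents and the referenced Webpack config for the actual files that need to be copied.

// webpack.config.js (sketch): copy SDK assets next to the bundled output.
// The "assets" source path is an assumption; see the official Webpack config
// for the real list of files to copy.
const CopyPlugin = require("copy-webpack-plugin");

module.exports = {
  // ...your existing entry, output and loader configuration...
  plugins: [
    new CopyPlugin({
      patterns: [
        { from: "node_modules/@0xalter/alter-core/assets", to: "assets" },
      ],
    }),
  ],
};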
