github.com/alievk/avatarify-python
:arrow_forward: Demo
:arrow_forward: AI-generated Elon Musk
Photorealistic avatars for video-conferencing apps. Democratized.
Based on First Order Motion Model.
Created by: GitHub community.
Press Q, and now you drive a person that never existed. Every time you push the button, a new avatar is sampled.

To run Avatarify smoothly you need a CUDA-enabled (NVIDIA) video card. Otherwise it will fall back to the central processor and run very slowly. These are performance metrics for some hardware:
Of course, you also need a webcam!
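The GPU fallback described above can be sketched as a small device-selection helper. This is only an illustration, not Avatarify's actual code: `select_device` is a hypothetical name, and the sketch assumes PyTorch's standard `torch.cuda.is_available()` API.

```python
import importlib.util

def select_device() -> str:
    """Hypothetical helper: pick "cuda" when a CUDA-enabled GPU is usable."""
    # Only probe CUDA if torch is actually installed.
    if importlib.util.find_spec("torch") is not None:
        import torch
        if torch.cuda.is_available():
            return "cuda"
    # No NVIDIA GPU (or no torch): fall back to the CPU, which is much slower.
    return "cpu"
```

On a machine without an NVIDIA card this returns `"cpu"`, which is why frame rates drop so sharply.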
Download the model's weights from Dropbox, Yandex.Disk or Google Drive [228 MB, md5sum 8a45a24037871c045fbb8a6a8aa95ebc]
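After downloading, you can verify the archive against the md5sum listed above before placing it in the avatarify directory. A minimal sketch (the helper name is ours):

```python
import hashlib

def md5_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the md5 hex digest of a file, reading it in chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the published checksum, e.g.:
# assert md5_of("vox-adv-cpk.pth.tar") == "8a45a24037871c045fbb8a6a8aa95ebc"
```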
Linux uses v4l2loopback to create a virtual camera.
bash Miniconda3-latest-Linux-x86_64.sh
Clone avatarify and install its dependencies (sudo privilege is required):

git clone https://github.com/alievk/avatarify.git
cd avatarify
bash scripts/install.sh
Put the downloaded vox-adv-cpk.pth.tar file in the avatarify directory (don't unpack it).

(!) Note: we found out that in versions after v4.6.8 (March 23, 2020) Zoom disabled support for virtual cameras on Mac. To use Avatarify in Zoom you can choose from 2 options:
codesign --remove-signature /Applications/zoom.us.app
(!) Note: To run Avatarify on Mac a remote GPU connection is required.
We will use CamTwist to create a virtual camera for Mac.
brew cask install miniconda
Clone avatarify and install its dependencies:

git clone https://github.com/alievk/avatarify.git
cd avatarify
bash scripts/install_mac.sh
:arrow_forward: Video tutorial
This guide is tested for Windows 10.
git clone https://github.com/alievk/avatarify.git
cd avatarify
scripts\install_windows.bat
Put the downloaded vox-adv-cpk.pth.tar file in the avatarify directory (don't unpack it).

Run run_windows.bat. If installation was successful, two windows "cam" and "avatarify" will appear. Leave these windows open for the next installation steps. If there are multiple cameras (including virtual ones) in the system, you may need to select the correct one. Open scripts/settings_windows.bat and edit the CAMID variable. CAMID is the index number of a camera: 0, 1, 2, ...

Install and register only 1 virtual camera. The OBS-Camera camera should be available in Zoom (or other videoconferencing software). Steps 10-11 are required only once, during setup.
You can offload the heavy work to a server with a GPU and use your laptop just to retrieve renderings from it. See the wiki page with installation instructions.
Avatarify comes with a standard set of avatars of famous people, but you can extend this set simply by copying your avatars into the avatars folder.
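Loading the extended avatar set amounts to scanning the avatars folder for images. A sketch of how that might look (the function name and extension list are our assumptions, not Avatarify's actual loader):

```python
from pathlib import Path

IMAGE_EXTS = {".jpg", ".jpeg", ".png"}

def list_avatars(folder: str = "avatars") -> list:
    """Return image paths in the avatars folder, sorted for a stable hotkey order."""
    root = Path(folder)
    return sorted(p for p in root.iterdir()
                  if p.suffix.lower() in IMAGE_EXTS)
```

Because the result is sorted, the 1-9 hotkeys map to the same avatars on every run.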
Follow these tips for better visual quality:
Your web cam must be plugged in.
Note: run your video-conferencing app only after Avatarify is started.
It is assumed that there is only one web cam connected to the computer, at /dev/video0. The run script will create the virtual camera /dev/video9. You can change these settings in scripts/settings.sh.

You can use the command v4l2-ctl --list-devices to list all devices in your system. For example, if the web camera is /dev/video1, then the device id is 1.
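The device id can also be extracted programmatically from the `v4l2-ctl --list-devices` output. A sketch of the parsing step, assuming the usual format where `/dev/videoN` nodes appear indented under each device name (the helper is ours):

```python
import re

def webcam_ids(listing: str) -> dict:
    """Map each device name to the numeric ids of its /dev/videoN nodes."""
    devices, current = {}, None
    for line in listing.splitlines():
        m = re.search(r"/dev/video(\d+)", line)
        if m and current is not None:
            devices[current].append(int(m.group(1)))
        elif line.strip() and not line.startswith(("\t", " ")):
            # An unindented, non-empty line starts a new device entry.
            current = line.strip().rstrip(":")
            devices[current] = []
    return devices
```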
Run:
bash run.sh
The cam and avatarify windows will pop up. The cam window is for controlling your face position and avatarify is for the avatar animation preview. Please follow these recommendations to drive your avatars.
Note: To run Avatarify on Mac a remote GPU connection is required.
Please find where you downloaded avatarify and substitute the path /path/to/avatarify below.

cd /path/to/avatarify
bash run_mac.sh --worker-host gpu_server_address
In CamTwist, choose Desktop+ and press Select. In the Settings section choose Confine to Application Window and select python (avatarify) from the drop-down menu.

The cam and avatarify windows will pop up. The cam window is for controlling your face position and avatarify is for the avatar animation preview. Please follow these recommendations to drive your avatars.
If there are multiple cameras (including virtual ones) in your system, you may need to select the correct one in scripts/settings_windows.bat. Open this file and edit the CAMID variable. CAMID is the index number of a camera: 0, 1, 2, ...
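Picking the right CAMID can also be scripted. The sketch below rewrites the CAMID assignment in a settings file; this is our illustration of the edit the text describes, assuming the batch file sets the variable with a plain `set CAMID=N` line:

```python
import re
from pathlib import Path

def set_camid(settings_path: str, camid: int) -> None:
    """Rewrite the CAMID assignment in a Windows settings .bat file."""
    path = Path(settings_path)
    text = path.read_text()
    # Replace e.g. "set CAMID=0" with the requested camera index.
    new_text = re.sub(r"(?m)^(set\s+CAMID=)\d+", rf"\g<1>{camid}", text)
    path.write_text(new_text)
```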
cd C:\path\to\avatarify
run_windows.bat
In your videoconferencing app, choose the OBS-Camera camera.

The cam and avatarify windows will pop up. The cam window is for controlling your face position and avatarify is for the avatar animation preview. Please follow these recommendations to drive your avatars.
Note: To reduce video latency, in OBS Studio right click on the preview window and uncheck Enable Preview.
| Keys | Controls |
|---|---|
| 1-9 | Immediately switch between the first 9 avatars. |
| Q | Turn on a StyleGAN-generated avatar. Every time you push the button, a new avatar is sampled. |
| 0 | Toggle avatar display on and off. |
| A/D | Previous/next avatar in folder. |
| W/S | Zoom camera in/out. |
| U/H/J/K | Translate camera: H - left, K - right, U - up, J - down, by 5 pixels. Add Shift to adjust by 1 pixel. |
| Shift-Z | Reset camera zoom and translation. |
| Z/C | Adjust avatar target overlay opacity. |
| X | Reset reference frame. |
| F | Toggle reference frame search mode. |
| R | Mirror reference window. |
| T | Mirror output window. |
| L | Reload avatars. |
| I | Show FPS. |
| ESC | Quit. |
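The controls above amount to a key-to-action dispatch table. A minimal sketch of such a mapping (the action names are ours, not Avatarify's internals):

```python
KEYMAP = {
    "0": "toggle_avatar",
    "Q": "stylegan_avatar",
    "A": "prev_avatar", "D": "next_avatar",
    "W": "zoom_in", "S": "zoom_out",
    "X": "reset_reference",
    "F": "toggle_reference_search",
    "L": "reload_avatars",
    "I": "show_fps",
}
# Keys 1-9 select one of the first nine avatars directly.
for i in range(1, 10):
    KEYMAP[str(i)] = f"select_avatar_{i - 1}"

def action_for(key: str) -> str:
    """Look up the action bound to a key press; unknown keys do nothing."""
    return KEYMAP.get(key.upper(), "noop")
```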
These are the main principles for driving your avatar:
Alternatively, you can hit 'F' for the software to attempt to find a better reference frame itself. This will slow down the framerate, but while this is happening, you can keep moving your head around: the preview window will flash green when it finds your facial pose is a closer match to the avatar than the one it is currently using. You will see two numbers displayed as well: the first number is how closely you are currently aligned to the avatar, and the second number is how closely the reference frame is aligned.
You want to get the first number as small as possible - around 10 is usually a good alignment. When you are done, press 'F' again to exit reference frame search mode.
You don't need to be exact, and some other configurations can yield better results still, but it's usually a good starting point.
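The two numbers shown in search mode are alignment scores: lower means your pose is closer to the avatar's. A plausible sketch of such a score as the mean distance between corresponding facial landmarks (this is our illustration; Avatarify's actual metric may differ):

```python
def alignment_score(pose, reference) -> float:
    """Mean Euclidean distance between corresponding (x, y) landmarks.

    Lower is better; search mode keeps the reference frame whose
    score against the avatar is smallest.
    """
    assert len(pose) == len(reference)
    total = 0.0
    for (x1, y1), (x2, y2) in zip(pose, reference):
        total += ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
    return total / len(pose)
```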
Avatarify supports any video-conferencing app where the video input source can be changed (Zoom, Skype, Hangouts, Slack, ...). Here are a few examples of how to configure particular apps to use Avatarify.
Go to Settings -> Audio & Video and choose the avatarify (Linux), CamTwist (Mac) or OBS-Camera (Windows) camera.
<img src=docs/skype.jpg width=600>
Go to Settings -> Video and choose avatarify (Linux), CamTwist (Mac) or OBS-Camera (Windows) from the Camera drop-down menu.
<img src=docs/zoom.jpg width=600>
Go to your profile picture -> Settings -> Devices and choose avatarify (Linux), CamTwist (Mac) or OBS-Camera (Windows) from the Camera drop-down menu.
<img src=docs/teams.jpg width=600>
Make a call, allow the browser to use cameras, click on the Settings icon, and choose avatarify (Linux), CamTwist (Mac) or OBS-Camera (Windows) in the Video settings drop-down menu.
<img src=docs/slack.jpg width=600>
To remove Avatarify and its related programs follow the instructions in the Wiki.
Our goal is to democratize photorealistic avatars for video-conferencing. To make the technology even more accessible, we have to tackle the following problems:
Please make pull requests if you have any improvements or bug-fixes.
Q: Do I need any knowledge of programming to run Avatarify?
A: Not really, but you need some beginner-level knowledge of the command line. For Windows we recorded a video tutorial, so it’ll be easy to install.
Q: Why does it work so slowly on my Macbook?
A: The model used in Avatarify requires a CUDA-enabled NVIDIA GPU for its heavy computations. Macbooks don’t have such GPUs, so processing falls back to the CPU, which has far less computing power and cannot run Avatarify smoothly.
Q: I don’t have an NVIDIA GPU, can I run it?
A: You can still run it without an NVIDIA GPU, but with drastically reduced performance (<1 fps).
Q: I have an ATI GPU (e.g. Radeon). Why does it work so slowly?
A: To run the neural network, Avatarify uses the PyTorch library, which is optimized for CUDA. If PyTorch can’t find a CUDA-enabled GPU in your system, it will fall back to the CPU, where performance is much worse.
Q: How to add a new avatar?
A: It’s easy. All you need is to find a picture of your avatar and put it in the avatars folder. More.
Q: My avatar looks distorted.
A: You need to calibrate your face position. Please follow the tips or watch the video tutorial.
Q: Can I use a cloud GPU?
A: This is work in progress. See the relevant discussion.
Q: Avatarify crashed, what to do?
A: First, try to find your error in the troubleshooting section. If it is not there, try to find it in the issues. If you couldn’t find your issue there, please open a new one using the issue template.
Q: Can I use Avatarify for commercial purposes?
A: No. Avatarify and First Order Motion Model are licensed under Creative Commons Non-Commercial license, which prohibits commercial use.
Q: What video conferencing apps does Avatarify support?
A: Avatarify creates a virtual camera which can be plugged into any app where video input source can be changed (Zoom, Skype, Hangouts, Slack, ...).
Q: Where can I discuss Avatarify-related topics with the community?
A: We have Slack. Please join:
Please follow the Wiki page.