api-inference-community
A package with helper tools to build an API Inference docker app for Hugging Face API inference using huggingface_hub
This repository enables third-party libraries integrated with huggingface_hub to create their own Docker image so that the widgets on the Hub can work the same way the transformers ones do.
The hardware to run the API will be provided by Hugging Face for now.
The docker_images/common folder is intended to be a starting point for all new libs that want to be integrated.
1. Copy the docker_images/common folder into a new folder named after your library, e.g. docker_images/example.
2. Edit:
docker_images/example/requirements.txt
docker_images/example/app/main.py
docker_images/example/app/pipelines/{task_name}.py
to implement the desired functionality. All required code is marked with IMPLEMENT_THIS markup (see the pipeline sketch after this list).
3. Remove any pipeline files in docker_images/example/app/pipelines/ that are not used, their corresponding tests in docker_images/example/tests, and their imports in docker_images/example/app/pipelines/__init__.py.
4. Feel free to customize anything required by your lib anywhere you want. The only real requirement is to honor the HTTP endpoints, in the same fashion as the common folder, for all your supported tasks.
5. Edit example/tests/test_api.py to add TESTABLE_MODELS (an illustrative entry is sketched after this list).
6. Pass the test suite: pytest -sv --rootdir docker_images/example/ docker_images/example/
7. Submit your PR and enjoy!
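To make step 2 more concrete, here is a minimal sketch of what a docker_images/example/app/pipelines/{task_name}.py implementation might look like. The class name, the Pipeline import path, and the return format are assumptions for illustration only; follow the IMPLEMENT_THIS markers in the copied common folder for the exact contract.

```python
from typing import Dict, List

from app.pipelines import Pipeline  # base class shipped with the common template (assumed import path)


class TextClassificationPipeline(Pipeline):
    def __init__(self, model_id: str):
        # IMPLEMENT_THIS: load your library's model for `model_id` (e.g. from the Hub).
        # Kept as a stub here so the sketch stays library-agnostic.
        self.model_id = model_id

    def __call__(self, inputs: str) -> List[Dict[str, float]]:
        # IMPLEMENT_THIS: run inference with your model on `inputs`.
        # A fixed prediction is returned only to illustrate the expected output shape.
        return [{"label": "POSITIVE", "score": 1.0}]
```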
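For step 5, an illustrative TESTABLE_MODELS entry could look like the following. The model id is a placeholder and the exact type expected by the test suite may differ, so mirror what the copied common template already declares.

```python
from typing import Dict

# Map each implemented task to a real (ideally small) model on the Hub.
# The model id below is a placeholder, not a real repository.
TESTABLE_MODELS: Dict[str, str] = {
    "text-classification": "my-org/my-tiny-text-classification-model",
}
```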
Doing the first 7 steps is good enough to get started; however, in the process you can anticipate and correct some problems early on. Maintainers will help you along the way if you don't feel confident following those steps yourself.
./manage.py docker MY_MODEL should work and respond on port 8000. For instance, curl -X POST -d "test" http://localhost:8000 if the pipeline deals with simple text.
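If you prefer testing from Python instead of curl, an equivalent request might look like this (port 8000 and a plain-text body are the assumptions from the step above; adapt the body for binary tasks such as audio):

```python
import requests

# POST the raw input to the locally running app, as the curl example above does.
response = requests.post("http://localhost:8000", data="test")
print(response.status_code)
print(response.json())  # text tasks typically reply with JSON
```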
If it doesn't work out of the box, or Docker is slow for some reason, you can test locally (using your local Python environment) with:
./manage.py start MY_MODEL
When doing subsequent docker launches with the same model_id, the docker should start up very fast and not redownload the whole model. If you see the model/repo being downloaded over and over, it means the cache is not being used correctly.
You can edit the docker_images/{framework}/Dockerfile and add an environment variable (by default it assumes HUGGINGFACE_HUB_CACHE), or edit your code directly, to put the model files in the /data folder.
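If you go the code route, here is a minimal sketch, assuming your pipeline fetches weights with huggingface_hub; falling back to /data when HUGGINGFACE_HUB_CACHE is unset is an assumption of this sketch, and the repo id is a placeholder:

```python
import os

from huggingface_hub import snapshot_download

# Reuse the persistent /data volume as the download cache so subsequent docker
# launches of the same model_id do not re-download the whole repository.
cache_dir = os.environ.get("HUGGINGFACE_HUB_CACHE", "/data")
local_path = snapshot_download(repo_id="my-org/my-model", cache_dir=cache_dir)
print(local_path)
```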
Edit the tests/test_dockers.py file to add a new test for your new framework (def test_{framework}(self):, for instance). As a baseline, you should have one line per task in this test function, each with a real working model on the Hub. Those tests are relatively slow, but they automatically check that your API replies with the correct errors and that the cache works properly. To run those tests you can simply do:
RUN_DOCKER_TESTS=1 pytest -sv tests/test_dockers.py::DockerImageTests::test_{framework}
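As a sketch, the new test added to the DockerImageTests class could follow the pattern of the neighbouring framework tests. The helper names used below (framework_docker_test, framework_invalid_test) and the model id are assumptions, so copy whatever the existing tests in that file actually call:

```python
# Added inside the DockerImageTests class in tests/test_dockers.py.
def test_example(self):
    # One line per supported task, each pointing at a real, small model on the Hub
    # (the id below is a placeholder).
    self.framework_docker_test(
        "example",
        "text-classification",
        "my-org/my-tiny-text-classification-model",
    )
    # Also check that invalid models produce the expected errors (assumed helper).
    self.framework_invalid_test("example")
```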
If you ever come across a bug within the api-inference-community/ package (api-inference-community/{routes,validation,..}.py) or want to update it, the development process is slightly more involved.
If you can make the change only in api-inference-community without depending on it, that's also a great option. Make sure to add the proper tests to your PR.
Otherwise, develop locally using the manage.py command: install api-inference-community first (pip install -e .), then run:
./manage.py start --framework example --task audio-source-separation --model-id MY_MODEL
Split the changes into two PRs: the first one will be the api-inference-community part; the second one will be for your package-specific modifications and will only land once the api-inference-community tag has landed.
Another similar command, ./manage.py docker --framework example --task audio-source-separation --model-id MY_MODEL, will launch the server, but this time in a protected, controlled Docker environment, making sure the behavior will be exactly the one in the API.