
Security News
The Nightmare Before Deployment
Season’s greetings from Socket, and here’s to a calm end of year: clean dependencies, boring pipelines, no surprises.
fast-task-api
We at SocAIty developed FastTaskAPI to create and deploy our AI services in a way that is as easy and standardized as possible.
Built on top of FastAPI and runpod, it lets you build high-quality endpoints with proven stability.
Run services anywhere, be it local, hosted or serverless.
Package machine learning models in a standardized, production-ready way. Deploy them to your own infrastructure, to runpod or to socaity.ai.
FastTaskAPI creates threaded jobs and job queues on the fly.
Call the API and get back a job ID; retrieve the result with that job ID later.
Introduction
Get started:
Creating services for long-running tasks is hard.
This package solves that problem by providing a simple, well-known interface to deploy AI services anywhere.
The syntax is oriented toward the simplicity of FastAPI; other hazards are taken care of by our router.
The code is fast, lightweight, flexible and pure Python.
You can install the package with pip (from PyPI), or clone the repository.
pip install fast-task-api
Use the decorator syntax @app.task_endpoint to add an endpoint. This syntax is similar to fastapi's @app.get syntax.
from fast_task_api import FastTaskAPI, ImageFile

# define the app including your provider (fastapi, runpod..)
app = FastTaskAPI()

# add endpoints to your service
@app.task_endpoint("/predict")
def predict(my_param1: str, my_param2: int = 0):
    return f"my_awesome_prediction {my_param1} {my_param2}"

@app.task_endpoint("/img2img", queue_size=10)
def my_image_manipulator(upload_img: ImageFile):
    img_as_numpy = upload_img.to_np_array()
    # Do some hard work here...
    # img_as_numpy = img2img(img_as_numpy)
    return ImageFile().from_np_array(img_as_numpy)

# start and run the server
app.start()
If you execute this code, you should see the interactive documentation page at http://localhost:8000/docs.

If you have a long-running task, you can use the job-queue functionality.
import time
from fast_task_api import JobProgress

@app.task_endpoint(path="/make_fries", queue_size=100)
def make_fries(job_progress: JobProgress, fries_name: str, amount: int = 1):
    job_progress.set_status(0.1, f"started new fries creation {fries_name}")
    time.sleep(1)
    job_progress.set_status(0.5, f"I am working on it. Lots of work to do {amount}")
    time.sleep(2)
    job_progress.set_status(0.8, "Still working on it. Almost done")
    time.sleep(2)
    return f"Your fries {fries_name} are ready"
What will happen now is: the call is queued, the client immediately gets back a job ID, and the function runs in a background thread; status updates and the final result are retrieved with that job ID.
Note: in case of the "runpod" backend with "serverless" deployment this is not necessary, as the job mechanism is handled by the runpod deployment.
You can provide status updates by changing the values of the job_progress object. If you add a parameter named job_progress to your function, we pass that object in. When a client then asks for the status of the task, they get the messages and the progress value. fastSDK, for example, uses this to display a progress bar.
You can call the endpoints with a simple HTTP request and try them out in the browser, with curl or with Postman. For more convenience, the socaity package lets you use the endpoints like functions.
With fastSDK, you can use the endpoints like functions: fastSDK deals with the job ID and the status requests in the background. This makes it extremely useful for complex scenarios involving multiple models and endpoints. Check out the fastSDK documentation for more information.
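Whether you poll by hand or let a client library do it, the pattern is the same: keep asking for the job's status until it reports completion. A minimal, generic polling helper sketching that loop is below; the `finished` and `result` dictionary keys are illustrative assumptions, not FastTaskAPI's actual response schema (check the generated /docs page for the real field names).

```python
import time

def poll_until_done(get_status, interval_s=0.5, timeout_s=60.0):
    """Call get_status() repeatedly until the job reports completion.

    get_status is any callable returning a status dict, e.g. a lambda
    wrapping an HTTP request for a given job ID. The 'finished' and
    'result' keys are placeholders for the service's real schema.
    """
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        status = get_status()
        if status.get("finished"):
            return status.get("result")
        time.sleep(interval_s)
    raise TimeoutError("job did not finish within timeout")
```

In practice you would wrap the service's status endpoint in `get_status`; the helper itself stays the same regardless of transport.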
If you don't want the job-queue functionality, you can use the @app.endpoint syntax.
@app.endpoint("/my_normal_endpoint", methods=["GET", "POST"])
def my_normal_endpoint(my_param1: str, my_param2: int = 0):
    return f"my_awesome_prediction {my_param1} {my_param2}"
This returns a regular endpoint: no job_result object with a job ID is returned. The method also supports file uploads.
The library supports file uploads out of the box. Use the parameter type hints in your method definition to get the file.
from fast_task_api import MediaFile, ImageFile, AudioFile, VideoFile

@app.task_endpoint("/my_upload")
def my_upload(anyfile: MediaFile):
    return anyfile
FastTaskAPI supports all file types of media-toolkit, including the common ones: ImageFile, AudioFile and VideoFile.
import numpy as np
from fast_task_api import ImageFile, AudioFile, VideoFile

@app.task_endpoint("/my_file_upload")
def my_file_upload(image: ImageFile, audio: AudioFile, video: VideoFile):
    image_as_np_array = np.array(image)
    return [ImageFile().from_np_array(image_as_np_array) for i in range(10)]
You can call the endpoints with either raw bytes or base64-encoded strings.
fastSDK also supports MediaFiles and therefore natively supports file uploads and downloads. Once you have added the service in fastSDK, you can call it like a Python function:
mysdk = mySDK()  # follow the fastSDK tutorial to set this up correctly
task = mysdk.upload_image(my_image_file, my_audio_file, my_video_file)  # uploads the data to the service and returns a job
result = task.get_result()  # polls the get_job endpoint in the background until the server has finished
import httpx

with open('my_image_file.png', 'rb') as f:
    image_file_content = f.read()

my_files = {
    "image": ("file_name", image_file_content, 'image/png'),
    # ... add further files here
}
response = httpx.Client().post(url, files=my_files)  # url: the address of your endpoint
Note: in case of the runpod backend, you need to convert the file to a base64-encoded string.
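For the runpod case mentioned in the note above, the encoding step itself needs nothing beyond the standard library. A small sketch (the helper name is ours, not part of fast-task-api):

```python
import base64

def file_to_b64(path: str) -> str:
    """Read a file and return its content as a base64-encoded ASCII string."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("ascii")

# the returned string can then be placed in the request payload
# wherever the backend expects the file's content
```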
One property of media-toolkit is that it supports files from URLs. Instead of sending a file directly (as bytes) to the endpoints, you can also send a URL to the file location.
my_files = {
    "image": "https://my_cloud_storage/my_image_file.png",
    # ...
}
response = httpx.Client().post(url, files=my_files)
You can define a default upload size limit with the environment variable DEFAULT_MAX_UPLOAD_FILE_SIZE_MB. If it is None, there is no limit.
You can also define an individual limit for each task_endpoint like this:
@app.task_endpoint("/my_upload", max_upload_file_size_mb=10)
Finally, you can also define the limit for a specific file by using LimitedUploadFile:
@app.task_endpoint("/my_upload")
def my_upload(file: LimitedUploadFile = LimitedUploadFile(max_size_mb=10)):
    return file.content
Note: the maximum upload size is currently only respected by the fastapi backend.
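However the limit is configured, the check it implies is simple. A sketch of what enforcing DEFAULT_MAX_UPLOAD_FILE_SIZE_MB amounts to, assuming the library compares content length against the configured megabyte value (the helper is ours, for illustration):

```python
import os

def within_upload_limit(content: bytes) -> bool:
    """Return True if content fits the configured MB limit (no limit if unset)."""
    limit = os.environ.get("DEFAULT_MAX_UPLOAD_FILE_SIZE_MB")
    if limit is None:
        return True
    return len(content) <= float(limit) * 1024 * 1024
```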
Just run your script: app.start() spins up uvicorn on localhost:port.
Prerequisite: you have created a Python module "yourmodule.server" containing the code that starts the server. To start the fast-task-api server in Docker, add the following command at the end of your Dockerfile.
# Start the fast-task-api server which is instantiated in the module yourmodule.server
CMD ["python", "-m", "yourmodule.server"]
You can change the backend (hosting provider) either by setting it in the constructor or by setting the environment variable.
# Options: "fastapi", "runpod"
ENV FTAPI_BACKEND="runpod"
# Options: "localhost", "serverless"
ENV FTAPI_DEPLOYMENT="serverless"
Depending on the environment, it may also be necessary to specify the host and port.
# allows any IP from the computer to connect to the host
ENV FTAPI_HOST="0.0.0.0"
# allows the docker container to use the port
ARG port=8080
ENV FTAPI_PORT=$port
EXPOSE $port
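Putting the fragments above together, a minimal Dockerfile for a serverless runpod deployment could look like the sketch below. The base image, dependency installation and module path are placeholders you would adapt to your project.

```dockerfile
FROM python:3.10-slim

WORKDIR /app
COPY . .
RUN pip install fast-task-api

# backend and deployment target, as described above
ENV FTAPI_BACKEND="runpod"
ENV FTAPI_DEPLOYMENT="serverless"
ENV FTAPI_HOST="0.0.0.0"
ARG port=8080
ENV FTAPI_PORT=$port
EXPOSE $port

# start the fast-task-api server which is instantiated in yourmodule/server.py
CMD ["python", "-m", "yourmodule.server"]
```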
It is no longer required to write a handler function; the fast-task-api magic handles it :D Just change the ENV variables as described above. This brings you additional benefits:
When deployed with runpod, fast-task-api uses the built-in runpod job handling instead of its own job-queue implementation.
fastSDK allows you to easily invoke your FastTaskAPI services from Python in an efficient, parallel manner, so you can work with them natively as if they were Python functions. Read the fastSDK documentation to get started.
Cog comes with several cons:
The FastAPI documentation recommends using Starlette background tasks for long-running work like sending an e-mail. However, Starlette background tasks have several drawbacks.
They are a good solution for simple tasks, but you'll hit a wall pretty soon.
Celery is a great tool for running jobs in the background on distributed systems. However, it comes with several drawbacks:
The Socaity router is lightweight and provides background-task functionality abstracted away from the developer. This doesn't mean we advise against Celery; indeed, integrating Celery as a possible backend is planned.