
Get up and running with promptable vision models locally.
Osam (/oʊˈsɑm/) is a tool to run open-source promptable vision models locally (inspired by Ollama).
Osam provides:

- Promptable vision models that run locally: SAM, SAM2, EfficientSAM, and YOLO-World
- Local interfaces: a CLI (`osam run`), a Python API (`osam.apis.generate`), and an HTTP server (`osam serve`)
Install with pip:

```bash
pip install osam
```

For `osam serve`:

```bash
pip install osam[serve]
```
To run with EfficientSAM:

```bash
osam run efficientsam --image <image_file>
```

To run with YOLO-World:

```bash
osam run yoloworld --image <image_file>
```
Here are the models that can be downloaded:
| Model | Parameters | Size | Download |
|---|---|---|---|
| SAM 100M | 94M | 100MB | `osam run sam:100m` |
| SAM 300M | 313M | 310MB | `osam run sam:300m` |
| SAM 600M | 642M | 630MB | `osam run sam` |
| SAM2 Tiny | 39M | 150MB | `osam run sam2:tiny` |
| SAM2 Small | 46M | 170MB | `osam run sam2:small` |
| SAM2 BasePlus | 82M | 300MB | `osam run sam2` |
| SAM2 Large | 227M | 870MB | `osam run sam2:large` |
| EfficientSAM 10M | 10M | 40MB | `osam run efficientsam:10m` |
| EfficientSAM 30M | 26M | 100MB | `osam run efficientsam` |
| YOLO-World XL | 168M | 640MB | `osam run yoloworld` |
P.S. `sam` and `efficientsam` are equivalent to `sam:latest` and `efficientsam:latest`.
Usage from the command line:

```bash
# Run a model with an image
osam run efficientsam --image examples/_images/dogs.jpg > output.png

# Get a JSON output (a decoding sketch follows after the example images)
osam run efficientsam --image examples/_images/dogs.jpg --json
# {"model": "efficientsam", "mask": "..."}

# Give a prompt
osam run efficientsam --image examples/_images/dogs.jpg \
  --prompt '{"points": [[1439, 504], [1439, 1289]], "point_labels": [1, 1]}' \
  > efficientsam.png

osam run yoloworld --image examples/_images/dogs.jpg --prompt '{"texts": ["dog"]}' \
  > yoloworld.png
```
Input and output images ('dogs.jpg', 'efficientsam.png', 'yoloworld.png').
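The `--json` output shown above can be post-processed directly. Below is a minimal sketch that captures it and decodes the `mask` field; it assumes the field holds a base64-encoded PNG, the same encoding the curl example later on this page decodes with `base64 --decode`.

```python
import base64
import json
import subprocess

# Run the CLI shown above and capture its JSON output.
raw = subprocess.run(
    ["osam", "run", "efficientsam", "--image", "examples/_images/dogs.jpg", "--json"],
    capture_output=True,
    text=True,
    check=True,
).stdout

result = json.loads(raw)

# Assumption: "mask" is a base64-encoded PNG, mirroring the HTTP example
# below, which pipes `.mask` through `base64 --decode`.
with open("mask_from_json.png", "wb") as f:
    f.write(base64.b64decode(result["mask"]))
```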
Usage from Python:

```python
import numpy as np
import PIL.Image

import osam.apis
import osam.types

request = osam.types.GenerateRequest(
    model="efficientsam",
    image=np.asarray(PIL.Image.open("examples/_images/dogs.jpg")),
    prompt=osam.types.Prompt(points=[[1439, 504], [1439, 1289]], point_labels=[1, 1]),
)
response = osam.apis.generate(request=request)
PIL.Image.fromarray(response.mask).save("mask.png")
```
Input and output images ('dogs.jpg', 'mask.png').
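As a follow-up, here is a small sketch of one way to visualize the saved mask by overlaying it on the input image. The overlay logic, and the assumption that `mask.png` is a single-channel image with non-zero pixels marking the segmented region, are illustrative and not part of the osam API.

```python
import numpy as np
import PIL.Image

image = np.asarray(PIL.Image.open("examples/_images/dogs.jpg"))
# Assumption: mask.png (saved above) is single-channel, non-zero where segmented.
mask = np.asarray(PIL.Image.open("mask.png")) > 0

# Tint the masked pixels green; purely illustrative.
overlay = image.copy()
overlay[mask] = (0.5 * overlay[mask] + 0.5 * np.array([0, 255, 0])).astype(np.uint8)
PIL.Image.fromarray(overlay).save("overlay.png")
```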
Usage via the HTTP server:

```bash
# pip install osam[serve]  # required for `osam serve`

# Start the server
osam serve

# POST request
curl 127.0.0.1:11368/api/generate -X POST \
  -H "Content-Type: application/json" \
  -d "{\"model\": \"efficientsam\", \"image\": \"$(cat examples/_images/dogs.jpg | base64)\"}" \
  | jq -r .mask | base64 --decode > mask.png
```
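The same endpoint can also be called from Python. This is a minimal sketch using only the standard library, assuming the request and response fields shown in the curl example above (base64-encoded `image` in, base64-encoded `mask` out):

```python
import base64
import json
import urllib.request

# Build the same request the curl example sends; field names come from it.
with open("examples/_images/dogs.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("ascii")

payload = json.dumps({"model": "efficientsam", "image": image_b64}).encode("utf-8")
req = urllib.request.Request(
    "http://127.0.0.1:11368/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    result = json.loads(resp.read())

# Assumption: the "mask" field is a base64-encoded PNG, as in the curl example.
with open("mask.png", "wb") as f:
    f.write(base64.b64decode(result["mask"]))
```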