locallab

LocalLab: A lightweight AI inference server for running LLMs locally or in Google Colab with a friendly API.

0.7.2
PyPI
Maintainers: 1

🚀 LocalLab: Your Personal AI Lab


Run ChatGPT-like AI on your own computer! LocalLab is a server that runs AI models locally and makes them accessible from anywhere.

🤔 What is LocalLab?

LocalLab is like having your own personal ChatGPT that runs on your computer. Here's how it works:

  • LocalLab Server: Runs on your computer and loads AI models
  • Python Client: A separate package that connects to the server
  • Access From Anywhere: Use your AI from any device with the ngrok feature

No complicated setup, no monthly fees, and your data stays private. Perfect for developers, students, researchers, or anyone who wants to experiment with AI.

🧠 How LocalLab Works (In Simple Terms)

Think of LocalLab as having two parts:

  • The Server (what you install with pip install locallab)

    • This is like a mini-ChatGPT that runs on your computer
    • It loads AI models and makes them available through a web server
    • You start it with a simple command: locallab start
  • The Client (what you install with pip install locallab-client)

    • This is how your Python code talks to the server
    • It's a separate package that connects to the server
    • You use it in your code with: client = SyncLocalLabClient("http://localhost:8000")
```mermaid
graph TD
    A[Your Python Code] -->|Uses| B[LocalLab Client Package]
    B -->|Connects to| C[LocalLab Server]
    C -->|Runs| D[AI Models]
    C -->|Optional| E[Ngrok for Remote Access]
    style C fill:#f9f,stroke:#333,stroke-width:2px
    style D fill:#bbf,stroke:#333,stroke-width:2px
```

The Magic Part: With the --use-ngrok option, you can access your AI from anywhere: your phone, another computer, or any device you share the URL with.

🎯 Key Features

📦 Easy Setup         🔒 Privacy First       🎮 Free GPU Access
🤖 Multiple Models    💾 Memory Efficient    🔄 Auto-Optimization
🌍 Local or Colab     ⚡ Fast Response       🔧 Simple Server
🌐 Access Anywhere    🔌 Client Package      🛡️ Secure Tunneling

Two-Part System:

  • LocalLab Server: Runs the AI models and exposes API endpoints
  • LocalLab Client: A separate Python package (pip install locallab-client) that connects to the server

Access From Anywhere: With built-in ngrok integration, you can securely access your LocalLab server from any device, anywhere in the world, which makes it a good fit for teams, remote work, or accessing your models on the go.

🌟 Two Ways to Run

  • On Your Computer (Local Mode)

    💻 Your Computer
    └── 🚀 LocalLab Server
        └── 🤖 AI Model
            └── 🔧 Auto-optimization
    
  • On Google Colab (Free GPU Mode)

    โ˜๏ธ Google Colab
    โ””โ”€โ”€ ๐ŸŽฎ Free GPU
        โ””โ”€โ”€ ๐Ÿš€ LocalLab Server
            โ””โ”€โ”€ ๐Ÿค– AI Model
                โ””โ”€โ”€ โšก GPU Acceleration
    

📦 Installation & Setup


Windows Setup

  • Install Required Build Tools

  • Install Packages

    pip install locallab locallab-client
    
  • Verify PATH

    • If locallab command isn't found, add Python Scripts to PATH:
      # Find Python location
      where python
      # This will show something like: C:\Users\YourName\AppData\Local\Programs\Python\Python311\python.exe
      

    Adding to PATH in Windows:

    • Press Win + X and select "System"

    • Click "Advanced system settings" on the right

    • Click "Environment Variables" button

    • Under "System variables", find and select "Path", then click "Edit"

    • Click "New" and add your Python Scripts path (e.g., C:\Users\YourName\AppData\Local\Programs\Python\Python311\Scripts\)

    • Click "OK" on all dialogs

    • Restart your command prompt

    • Alternatively, use: python -m locallab start
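If you're unsure where your Python Scripts directory is, the standard-library sysconfig module can print it for you (a general Python tip, not a LocalLab-specific command):

```python
# Print the directory where pip places console scripts like `locallab`.
# Add this directory to PATH if the `locallab` command is not found.
import sysconfig

scripts_dir = sysconfig.get_path("scripts")
print(scripts_dir)  # e.g. C:\Users\YourName\AppData\Local\Programs\Python\Python311\Scripts
```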

๐Ÿ” Having issues? See our Windows Troubleshooting Guide

Linux/Mac Setup

1. Install the Packages

# Install both server and client packages
pip install locallab locallab-client

2. Configure the Server

# Run interactive configuration
locallab config

# This will help you set up:
# - Model selection
# - Memory optimizations
# - GPU settings
# - System resources

3. Start the Server

# Start with saved configuration
locallab start

# Or start with specific options
locallab start --model microsoft/phi-2 --quantize --quantize-type int8
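As a rough illustration of why the --quantize flag matters (the numbers below are back-of-the-envelope estimates, not LocalLab measurements): microsoft/phi-2 has about 2.7 billion parameters, and weight memory scales with bytes per parameter, so int8 weights need roughly a quarter of the memory of fp32.

```python
# Back-of-the-envelope estimate of model weight memory at different precisions.
# Parameter count for microsoft/phi-2 is ~2.7B; real usage adds activations/overhead.
params = 2.7e9

def weight_gb(bytes_per_param: float) -> float:
    """Approximate weight memory in GiB for the given precision."""
    return params * bytes_per_param / 1024**3

print(f"fp32: {weight_gb(4):.1f} GB")  # ~10.1 GB
print(f"fp16: {weight_gb(2):.1f} GB")  # ~5.0 GB
print(f"int8: {weight_gb(1):.1f} GB")  # ~2.5 GB
```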

💡 Client Connection & Usage

After starting your LocalLab server (either locally or on Google Colab), you'll need to connect to it using the LocalLab client package. This is how your code interacts with the AI models running on the server.

Synchronous Client Usage (Easier for Beginners)

from locallab_client import SyncLocalLabClient

# Connect to server - choose ONE of these options:
# 1. For local server (default)
client = SyncLocalLabClient("http://localhost:8000")

# 2. For remote server via ngrok (when using Google Colab or --use-ngrok)
# client = SyncLocalLabClient("https://abc123.ngrok.app")  # Replace with your ngrok URL

try:
    print("Generating text...")
    # Generate text
    response = client.generate("Write a story")
    print(response)

    print("Streaming responses...")
    # Stream responses
    for token in client.stream_generate("Tell me a story"):
        print(token, end="", flush=True)

    print("Chat responses...")
    # Chat with AI
    response = client.chat([
        {"role": "system", "content": "You are helpful."},
        {"role": "user", "content": "Hello!"}
    ])
    print(response.choices[0]["message"]["content"])

finally:
    # Always close the client
    client.close()

💡 Important: When connecting to a server running on Google Colab or with ngrok enabled, always use the ngrok URL (https://abc123.ngrok.app) that was displayed when you started the server.

Asynchronous Client Usage (For Advanced Users)

import asyncio
from locallab_client import LocalLabClient

async def main():
    # Connect to server - choose ONE of these options:
    # 1. For local server (default)
    client = LocalLabClient("http://localhost:8000")

    # 2. For remote server via ngrok (when using Google Colab or --use-ngrok)
    # client = LocalLabClient("https://abc123.ngrok.app")  # Replace with your ngrok URL

    try:
        print("Generating text...")
        # Generate text
        response = await client.generate("Write a story")
        print(response)

        print("Streaming responses...")
        # Stream responses
        async for token in client.stream_generate("Tell me a story"):
            print(token, end="", flush=True)

        print("\nChatting with AI...")
        # Chat with AI
        response = await client.chat([
            {"role": "system", "content": "You are helpful."},
            {"role": "user", "content": "Hello!"}
        ])
        # Extracting Content
        content = response['choices'][0]['message']['content']
        print(content)
    finally:
        # Always close the client
        await client.close()

# Run the async function
asyncio.run(main())
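The chat examples above index into the response to pull out the assistant's reply. If, as the async example suggests, the response is an OpenAI-style dict, a small helper makes that step reusable; the sample response below is illustrative, not actual LocalLab output:

```python
# Extract the assistant's message from an OpenAI-style chat response dict.
# The response shape used here is an assumption based on the examples above.
def extract_content(response: dict) -> str:
    return response["choices"][0]["message"]["content"]

sample = {
    "choices": [
        {"message": {"role": "assistant", "content": "Hello! How can I help?"}}
    ]
}
print(extract_content(sample))  # Hello! How can I help?
```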

๐ŸŒ Google Colab Usage with Remote Access

Step 1: Set Up the Server on Google Colab

First, you'll set up the LocalLab server on Google Colab to use their free GPU:

# In your Colab notebook:

# 1. Install the server package
!pip install locallab

# 2. Configure with CLI (notice the ! prefix)
!locallab config

# 3. Start server with ngrok for remote access
!locallab start --use-ngrok

# The server will display a public URL like:
# 🚀 Ngrok Public URL: https://abc123.ngrok.app
# COPY THIS URL - you'll need it to connect!

Step 2: Connect to Your Server

After setting up your server on Google Colab, connect to it using the LocalLab client package. The server will display an ngrok URL that you'll use for the connection.

Using the Client Connection Examples

You can now use the client connection examples from the Client Connection & Usage section above.

Just make sure to:

  • Use your ngrok URL instead of localhost
  • Install the client package if needed

For example:

# In another cell in the same Colab notebook:

# 1. Install the client package
!pip install locallab-client

# 2. Import the client
from locallab_client import SyncLocalLabClient

# 3. Connect to your ngrok URL (replace with your actual URL from Step 1)
client = SyncLocalLabClient("https://abc123.ngrok.app")  # ← REPLACE THIS with your URL!

# 4. Now you can use any of the client methods
response = client.generate("Write a poem about AI")
print(response)

# 5. Always close when done
client.close()

Access From Any Device

The power of using ngrok is that you can connect to your Colab server from anywhere:

# On your local computer, phone, or any device with Python,
# first install the client package: pip install locallab-client

from locallab_client import SyncLocalLabClient

client = SyncLocalLabClient("https://abc123.ngrok.app")  # ← REPLACE THIS with your URL!
response = client.generate("Hello from my device!")
print(response)
client.close()

💡 Remote Access Tip: The ngrok URL lets you access your LocalLab server from any device: your phone, tablet, another computer, or a teammate's machine. See the Client Connection & Usage section above for more examples of what you can do with the client.

💻 Requirements

Local Computer

  • Python 3.8+
  • 4GB RAM minimum (8GB+ recommended)
  • GPU optional but recommended
  • Internet connection for downloading models
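Before installing, you can sanity-check your Python version and free disk space for model downloads with the standard library. This is a generic sketch, not a LocalLab command; the 4 GB threshold is just an example, since model sizes vary:

```python
# Quick pre-flight check: Python 3.8+ and some free disk space for model files.
import shutil
import sys

def preflight(min_python=(3, 8), min_free_gb=4.0) -> bool:
    """Report whether this machine meets the example minimums."""
    ok_python = sys.version_info >= min_python
    free_gb = shutil.disk_usage(".").free / 1024**3
    print(f"Python {sys.version_info.major}.{sys.version_info.minor}: "
          f"{'OK' if ok_python else 'too old'}")
    print(f"Free disk: {free_gb:.1f} GB")
    return ok_python and free_gb >= min_free_gb

preflight()
```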

Google Colab

  • Just a Google account!
  • Free tier works fine

🌟 Features

  • Easy Setup: Just pip install and run
  • Multiple Models: Use any Hugging Face model
  • Resource Efficient: Automatic optimization
  • Privacy First: All local, no data sent to cloud
  • Free GPU: Google Colab integration
  • Flexible Client API: Both async and sync clients available
  • Automatic Resource Management: Sessions close automatically
  • Remote Access: Access your models from anywhere with ngrok integration
  • Secure Tunneling: Share your models securely with teammates or access from mobile devices
  • Client Libraries: Python libraries for both synchronous and asynchronous usage

๐ŸŒ Client-Server Architecture

```mermaid
graph LR
    A[Your Application] -->|Uses| B[LocalLab Client]
    B -->|API Requests| C[LocalLab Server]
    C -->|Runs| D[AI Models]
    C -->|Optional| E[Ngrok Tunnel]
    E -->|Remote Access| F[Any Device, Anywhere]
    style E fill:#f9f,stroke:#333,stroke-width:2px
    style F fill:#bbf,stroke:#333,stroke-width:2px
```

โžก๏ธ See All Features

📚 Documentation

Getting Started

Advanced Topics

Deployment

๐Ÿ” Need Help?

📖 Additional Resources

🌟 Star Us!

If you find LocalLab helpful, please star our repository! It helps others discover the project.

Made with ❤️ by Utkarsh Tiwari • GitHub • Twitter • LinkedIn
