
create-llama - npm Package Versions

0.1.1

Patch Changes

  • 7bd3ed5: Support Anthropic and Gemini as model providers (see the sketch after this list)
  • 7bd3ed5: Support new agents from LlamaIndexTS (LITS) 0.3
  • cfb5257: Display events (e.g. retrieving nodes) per chat message
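
The provider change above maps onto LlamaIndex's pluggable LLM classes. A minimal Python sketch of what selecting Anthropic or Gemini as the backend LLM could look like, assuming the llama-index-llms-anthropic and llama-index-llms-gemini integration packages are installed; the model name is illustrative and the code that create-llama actually generates may differ:

```python
# Hedged sketch only: pointing LlamaIndex's global Settings at Anthropic or
# Gemini instead of OpenAI. The model name below is illustrative, not taken
# from the generated template.
from llama_index.core import Settings
from llama_index.llms.anthropic import Anthropic  # pip install llama-index-llms-anthropic
from llama_index.llms.gemini import Gemini        # pip install llama-index-llms-gemini

Settings.llm = Anthropic(model="claude-3-opus-20240229")
# ...or use Gemini as the default LLM instead:
# Settings.llm = Gemini()
```
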
0.1.0 (published by marcusschiesser)

Minor Changes

  • f1c3e8d: Add Llama3 and Phi3 support using Ollama

Patch Changes

  • a0dec80: Use gpt-4-turbo model as default. Upgrade Python llama-index to 0.10.28
  • 753229d: Remove asking for AI models and use defaults instead (OpenAI's GPT-4 Vision Preview and Embeddings v3). Use the --ask-models CLI parameter to select models.
  • 1d78202: Add observability for Python
  • 6acccd2: Use poetry run generate to generate embeddings for FastAPI
  • 9efcffe: Use Settings object for LlamaIndex configuration (see the sketch after this list)
  • 418bf9b: refactor: use tsx instead of ts-node
  • 1be69a5: Add Qdrant support
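
A short, hedged sketch of the Settings-based configuration referenced above (llama-index >= 0.10 replaces ServiceContext with a global Settings object); the models mirror the defaults named in these notes, but the generated FastAPI backend may wire them differently:

```python
# Hedged sketch of the global Settings configuration (llama-index >= 0.10).
# Model choices follow the defaults mentioned in these release notes; the
# generated FastAPI backend may differ.
from llama_index.core import Settings
from llama_index.llms.openai import OpenAI
from llama_index.embeddings.openai import OpenAIEmbedding

Settings.llm = OpenAI(model="gpt-4-turbo")
Settings.embed_model = OpenAIEmbedding(model="text-embedding-3-large")
```

Per the notes above, embeddings for the FastAPI template are then generated with poetry run generate.
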
0.0.32 (published by marcusschiesser)

Patch Changes

  • 625ed4d: Support Astra VectorDB
  • 922e0ce: Remove UI question (use shadcn as default). Use html UI by calling create-llama with --ui html parameter
  • ce2f24d: Update loaders and tools config to yaml format (for Python)
  • e8db041: Let user select multiple datasources (URLs, files and folders)
  • c06d4af: Add nodes to the response (Python)
  • 29b17ee: Allow using agents without any data source
  • 665c26c: Add redirect to documentation page when accessing the base URL (FastAPI)
  • 78ded9e: Add Dockerfile templates for TypeScript and Python
  • 99e758f: Merge the non-streaming and streaming templates into one
  • b3f2685: Add support for agent generation for TypeScript
  • 2739714: Use a database (MySQL or PostgreSQL) as a data source
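
For the database data source in the last item, a hedged Python sketch of reading rows through LlamaIndex's DatabaseReader; the connection URI and query are placeholders, and the generated template may structure this differently:

```python
# Hedged sketch: using a MySQL/PostgreSQL database as a data source via
# LlamaIndex's DatabaseReader. The URI and query are placeholders.
from llama_index.core import VectorStoreIndex
from llama_index.readers.database import DatabaseReader  # pip install llama-index-readers-database

reader = DatabaseReader(uri="postgresql://user:password@localhost:5432/appdb")
documents = reader.load_data(query="SELECT id, title, body FROM articles")
index = VectorStoreIndex.from_documents(documents)
```
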
0.0.31 (published by marcusschiesser)

Patch Changes

  • 56faee0: Added Windows e2e tests
  • 60ed8fe: Added missing environment variable config for URL data source
  • 60ed8fe: Fixed tool usage by freezing llama-index package versions

0.0.0-20240320065238 (published by marcusschiesser)

0.0.30 (published by marcusschiesser)

Patch Changes

  • 3af6328: Add support for LlamaParse using TypeScript
  • dd92b91: Add fetching LLM and embedding models from server
  • bac1b43: Add Milvus vector database
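
A hedged sketch of what indexing into Milvus looks like in LlamaIndex Python terms; the URI, collection name, and embedding dimension are placeholders, and the generated template may differ:

```python
# Hedged sketch: storing embeddings in Milvus. URI, collection name, and
# dimension are placeholders (1536 matches OpenAI's text-embedding-3-small
# and ada-002 models).
from llama_index.core import SimpleDirectoryReader, StorageContext, VectorStoreIndex
from llama_index.vector_stores.milvus import MilvusVectorStore  # pip install llama-index-vector-stores-milvus

vector_store = MilvusVectorStore(uri="http://localhost:19530", collection_name="create_llama", dim=1536)
storage_context = StorageContext.from_defaults(vector_store=vector_store)
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)
```
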
0.0.29 (published by marcusschiesser)

Patch Changes

  • edd24c2: Add observability with OpenLLMetry (see the sketch after this list)
  • 403fc6f: Minor bug fixes to improve DX (missing .env value and updated error messages)
  • 0f79757: Ability to download community submodules
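
The observability item above refers to OpenLLMetry; a minimal, hedged sketch of initializing its tracing in a Python backend (the app name is a placeholder and the generated code may configure it differently):

```python
# Hedged sketch: enabling OpenLLMetry tracing (pip install traceloop-sdk).
# The app name is a placeholder; exporter endpoints and keys come from
# environment variables in a real deployment.
from traceloop.sdk import Traceloop

Traceloop.init(app_name="create-llama-backend")
```
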
0.0.28 (published by marcusschiesser)

Patch Changes

  • 89a49f4: Add more config variables to .env file
  • fdf48dd: Add "Start in VSCode" option to postInstallAction
  • fdf48dd: Add devcontainers to generated code

0.0.27 (published by marcusschiesser)

Patch Changes

  • 2d29350: Add LlamaParse option when selecting a PDF file or a folder (FastAPI only; see the sketch after this list)
  • b354f23: Add embedding model option to create-llama (FastAPI only)
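
A hedged Python sketch of the LlamaParse path mentioned above for PDF files; the file path is a placeholder, a LLAMA_CLOUD_API_KEY is assumed to be set in the environment, and the generated FastAPI code may differ:

```python
# Hedged sketch: parsing a PDF with LlamaParse (pip install llama-parse).
# Assumes LLAMA_CLOUD_API_KEY is set; the file path is a placeholder.
from llama_parse import LlamaParse

parser = LlamaParse(result_type="markdown")
documents = parser.load_data("./data/report.pdf")
```
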
0.0.26 (published by marcusschiesser)

Patch Changes

  • 09d532e: feat: generate llama pack project from llama index
  • cfdd6db: feat: add pinecone support to create llama
  • ef25d69: upgrade llama-index package to version v0.10.7 for create-llama app
  • 50dfd7b: update fastapi for CVE-2024-24762