ai-renamer - npm Package Compare versions

Comparing version 1.0.18 to 1.0.19

package.json
  {
-   "version": "1.0.18",
+   "version": "1.0.19",
    "license": "GPL-3.0",

@@ -9,3 +9,3 @@ "name": "ai-renamer",

  },
- "description": "A Node.js CLI tool that uses Ollama models (Llama, Gemma, Phi etc.) to intelligently rename files in a specified directory",
+ "description": "A Node.js CLI that uses Ollama and LM Studio models (Llava, Gemma, Llama etc.) to intelligently rename files by their contents",
  "author": {

@@ -21,2 +21,3 @@ "name": "Ozgur Ozer",

  "dependencies": {
+ "axios": "^1.7.2",
  "change-case": "^5.4.4",

@@ -23,0 +24,0 @@ "ollama": "^0.5.2",


readme.md
  # ai-renamer
- A Node.js CLI tool that uses Ollama models (Llama, Gemma, Phi etc.) to intelligently rename files in a specified directory
+ A Node.js CLI that uses Ollama and LM Studio models (Llava, Gemma, Llama etc.) to intelligently rename files by their contents

@@ -20,3 +20,3 @@ [![npm](https://img.shields.io/npm/v/ai-renamer.svg?style=flat-square)](https://www.npmjs.com/package/ai-renamer)

- You need to have [Ollama](https://ollama.com/download) and at least one LLM (Llama, Gemma etc.) installed on your system
+ You need to have [Ollama](https://ollama.com/download) or [LM Studio](https://lmstudio.ai/) and at least one LLM (Llava, Gemma, Llama etc.) installed on your system
@@ -39,27 +39,50 @@ Run with NPX

- ## Params
- ```bash
- npx ai-renamer --help
- Options:
-   -h, --help                        Show help  [boolean]
-       --version                     Show version number  [boolean]
-   -c, --set-case                    Set the case style (e.g. camelCase, pascalCase, snakeCase, kebabCase)  [string]
-   -m, --set-model                   Set the Ollama model to use (e.g. gemma2, llama3)  [string]
-   -x, --set-chars                   Set the maximum number of characters in the new filename (e.g. 25)  [number]
-   -l, --set-language                Set the output language (e.g. English, Turkish)  [string]
-   -s, --set-include-subdirectories  Include files in subdirectories when processing (e.g: true, false)  [string]
- ```
- To get the model name to use in `--set-model`
- ```bash
- ollama list
- ```
+ Ollama Usage
+ Ollama is the default platform so you don't have to do anything. You can just run `npx ai-renamer /images`. At the first launch it will try to auto-select the Llava model but if it couldn't do that you can specify the model.
+ ```bash
+ npx ai-renamer /path --platform=ollama --model=llava:13b
+ ```
+ LM Studio Usage
+ You need to set the platform as `lm-studio` and it will auto-select the loaded model in LM Studio.
+ ```bash
+ npx ai-renamer /path --platform=lm-studio
+ ```
+ If you're using a different port in Ollama or LM Studio you could simply specify the base URLs.
+ ```bash
+ npx ai-renamer /path --platform=ollama --base-url=http://127.0.0.1:11434
+ npx ai-renamer /path --platform=lm-studio --base-url=http://127.0.0.1:1234
+ ```
+ The values of the flags will be saved to your disk when you use them. You can find the config file at `~/ai-renamer.json`. If you're using a Mac it's `/Users/your-user-name/ai-renamer.json`. Also when you set a flag you don't have to use them again. The script gets the values from this config file.
+ ## Params
+ ```bash
+ npx ai-renamer --help
+ Options:
+   -h, --help                    Show help  [boolean]
+       --version                 Show version number  [boolean]
+   -p, --platform                Set the platform (e.g. ollama, lm-studio)  [string]
+   -u, --base-url                Set the API base URL (e.g. http://127.0.0.1:11434 for ollama)  [string]
+   -m, --model                   Set the model to use (e.g. gemma2, llama3)  [string]
+   -c, --case                    Set the case style (e.g. camelCase, pascalCase, snakeCase, kebabCase)  [string]
+   -x, --chars                   Set the maximum number of characters in the new filename (e.g. 25)  [number]
+   -l, --language                Set the output language (e.g. English, Turkish)  [string]
+   -s, --include-subdirectories  Include files in subdirectories when processing (e.g: true, false)  [string]
+ ```
+ `ai-renamer` uses `change-case` library for case styling

@@ -66,0 +89,0 @@
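The new readme text says that flag values persist to `~/ai-renamer.json`. A sketch of what that file might contain after a few runs, using the `default*` key names that the option-parsing diff writes via `saveConfig` (the values shown here are purely illustrative):

```json
{
  "defaultPlatform": "lm-studio",
  "defaultBaseURL": "http://127.0.0.1:1234",
  "defaultModel": "llava:13b",
  "defaultCase": "kebabCase",
  "defaultChars": 25,
  "defaultLanguage": "English",
  "defaultIncludeSubdirectories": "true"
}
```

Note that `defaultIncludeSubdirectories` would be stored as a string, since the flag is declared with `type: 'string'` and compared against `'true'` in the main module.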

@@ -1,14 +0,63 @@

- const ollama = require('ollama').default
+ const fs = require('fs')
+ const axios = require('axios')
- const listModels = async () => {
+ const ollamaApis = async ({ baseURL }) => {
    try {
-     const response = await ollama.list()
-     return response.models
+     const apiResult = await axios({
+       data: {},
+       method: 'get',
+       url: `${baseURL}/api/tags`
+     })
+     return apiResult.data.models
    } catch (err) {
-     return []
+     throw new Error(err?.response?.data?.error || err.message)
    }
  }
+ const lmStudioApis = async ({ baseURL }) => {
+   try {
+     const apiResult = await axios({
+       data: {},
+       method: 'get',
+       url: `${baseURL}/v1/models`
+     })
+     return apiResult.data.data
+   } catch (err) {
+     throw new Error(err?.response?.data?.error || err.message)
+   }
+ }
+ const listModels = async options => {
+   try {
+     const { platform } = options
+     if (platform === 'ollama') {
+       return ollamaApis(options)
+     } else if (platform === 'lm-studio') {
+       return lmStudioApis(options)
+     } else {
+       throw new Error('🔴 No supported platform found')
+     }
+   } catch (err) {
+     throw new Error(err.message)
+   }
+ }
+ const filterModelNames = arr => {
+   return arr.map((item) => {
+     if (item.id !== undefined) {
+       return { name: item.id }
+     } else if (item.name !== undefined) {
+       return { name: item.name }
+     } else {
+       throw new Error('Item does not contain id or name property')
+     }
+   })
+ }
  const chooseModel = ({ models }) => {
    const preferredModels = [
      'llava',
      'llama',

@@ -33,5 +82,6 @@ 'gemma',

- module.exports = async () => {
+ module.exports = async options => {
    try {
-     const models = await listModels()
+     const _models = await listModels(options)
+     const models = filterModelNames(_models)
      console.log(`⚪ Available models: ${models.map(m => m.name).join(', ')}`)

@@ -38,0 +88,0 @@
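A quick standalone check of the new `filterModelNames` helper illustrates why it exists: Ollama's `/api/tags` returns model entries keyed by `name`, while LM Studio's `/v1/models` returns entries keyed by `id`, and the helper normalizes both into a common `{ name }` shape. The function body below is copied from the diff; the sample inputs are illustrative.

```javascript
// filterModelNames, as added in 1.0.19: normalizes Ollama entries
// ({ name: ... }) and LM Studio entries ({ id: ... }) into { name }.
const filterModelNames = arr => {
  return arr.map((item) => {
    if (item.id !== undefined) {
      return { name: item.id }
    } else if (item.name !== undefined) {
      return { name: item.name }
    } else {
      throw new Error('Item does not contain id or name property')
    }
  })
}

// Entries from either platform come out in the same shape:
console.log(filterModelNames([{ name: 'llava:13b' }, { id: 'gemma-2b-it' }]))
// → [ { name: 'llava:13b' }, { name: 'gemma-2b-it' } ]
```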

@@ -31,13 +31,23 @@ const os = require('os')

    })
-   .option('set-case', {
-     alias: 'c',
+   .option('platform', {
+     alias: 'p',
      type: 'string',
-     description: 'Set the case style (e.g. camelCase, pascalCase, snakeCase, kebabCase)'
+     description: 'Set the platform (e.g. ollama, lm-studio)'
    })
-   .option('set-model', {
+   .option('base-url', {
+     alias: 'u',
+     type: 'string',
+     description: 'Set the API base URL (e.g. http://127.0.0.1:11434 for ollama)'
+   })
+   .option('model', {
      alias: 'm',
      type: 'string',
-     description: 'Set the Ollama model to use (e.g. gemma2, llama3)'
+     description: 'Set the model to use (e.g. gemma2, llama3)'
    })
-   .option('set-chars', {
+   .option('case', {
+     alias: 'c',
+     type: 'string',
+     description: 'Set the case style (e.g. camelCase, pascalCase, snakeCase, kebabCase)'
+   })
+   .option('chars', {
      alias: 'x',

@@ -47,3 +57,3 @@ type: 'number',

    })
-   .option('set-language', {
+   .option('language', {
      alias: 'l',

@@ -53,3 +63,3 @@ type: 'string',

    })
-   .option('set-include-subdirectories', {
+   .option('include-subdirectories', {
      alias: 's',

@@ -65,28 +75,38 @@ type: 'string',

-   if (argv['set-case']) {
-     config.defaultCase = argv['set-case']
+   if (argv.platform) {
+     config.defaultPlatform = argv.platform
      await saveConfig({ config })
    }
-   if (argv['set-model']) {
-     config.defaultModel = argv['set-model']
+   if (argv['base-url']) {
+     config.defaultBaseURL = argv['base-url']
      await saveConfig({ config })
    }
-   if (argv['set-chars']) {
-     config.defaultChars = argv['set-chars']
+   if (argv.model) {
+     config.defaultModel = argv.model
      await saveConfig({ config })
    }
-   if (argv['set-language']) {
-     config.defaultLanguage = argv['set-language']
+   if (argv.case) {
+     config.defaultCase = argv.case
      await saveConfig({ config })
    }
-   if (argv['set-include-subdirectories']) {
-     config.defaultIncludeSubdirectories = argv['set-include-subdirectories']
+   if (argv.chars) {
+     config.defaultChars = argv.chars
      await saveConfig({ config })
    }
+   if (argv.language) {
+     config.defaultLanguage = argv.language
+     await saveConfig({ config })
+   }
+   if (argv['include-subdirectories']) {
+     config.defaultIncludeSubdirectories = argv['include-subdirectories']
+     await saveConfig({ config })
+   }
    return { argv, config }
  }

@@ -1,6 +0,5 @@

- const ollama = require('ollama').default
  const changeCase = require('./changeCase')
+ const getModelResponse = require('./getModelResponse')
- module.exports = async ({ model, _case, chars, content, language, images, relativeFilePath }) => {
+ module.exports = async ({ model, _case, chars, images, content, baseURL, language, platform, relativeFilePath }) => {
    try {

@@ -25,10 +24,11 @@ const promptLines = [

-     const res = await ollama.generate({ model, prompt, images })
+     const modelResult = await getModelResponse({ model, prompt, images, baseURL, platform })
      const maxChars = chars + 10
-     const text = res.response.trim().slice(-maxChars)
+     const text = modelResult.trim().slice(-maxChars)
      const filename = await changeCase({ text, _case })
      return filename
    } catch (err) {
-     console.log(`🔴 Ollama error: ${err.message} (${relativeFilePath})`)
+     console.log(`🔴 Model error: ${err.message} (${relativeFilePath})`)
    }
  }
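The truncation step above is easy to misread: `slice(-maxChars)` keeps the *last* `chars + 10` characters of the model's reply, not the first. Extracting those two lines into a hypothetical standalone helper (`trimResponse` is not a name from the package) makes the behavior visible:

```javascript
// Keeps the tail of the model's reply: maxChars is the user's --chars
// limit plus a 10-character margin, and slice(-maxChars) takes the
// last maxChars characters of the trimmed string.
const trimResponse = (modelResult, chars) => {
  const maxChars = chars + 10
  return modelResult.trim().slice(-maxChars)
}

console.log(trimResponse('the-filename-is-a-photo-of-a-cat', 10))
// → '-is-a-photo-of-a-cat' (the last 10 + 10 = 20 characters)
```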

@@ -9,3 +9,3 @@ const path = require('path')

- module.exports = async ({ model, _case, chars, language, filePath, inputPath }) => {
+ module.exports = async ({ model, _case, chars, baseURL, language, platform, filePath, inputPath }) => {
    try {

@@ -35,3 +35,13 @@ const fileName = path.basename(filePath)

-     const newName = await getNewName({ model, _case, chars, content, language, images, relativeFilePath })
+     const newName = await getNewName({
+       model,
+       _case,
+       chars,
+       images,
+       content,
+       baseURL,
+       language,
+       platform,
+       relativeFilePath
+     })
      if (!newName) return

@@ -38,0 +48,0 @@

@@ -7,15 +7,26 @@ const fs = require('fs').promises

- module.exports = async ({ inputPath, defaultCase, defaultModel, defaultChars, defaultLanguage, defaultIncludeSubdirectories }) => {
+ module.exports = async ({ inputPath, defaultCase, defaultModel, defaultChars, defaultBaseURL, defaultLanguage, defaultPlatform, defaultIncludeSubdirectories }) => {
    try {
-     const model = defaultModel || await chooseModel()
-     console.log(`⚪ Chosen model: ${model}`)
+     const platform = defaultPlatform || 'ollama'
+     console.log(`⚪ Platform: ${platform}`)
+     let baseURL = defaultBaseURL
+     if (platform === 'ollama' && !baseURL) {
+       baseURL = 'http://127.0.0.1:11434'
+     } else if (platform === 'lm-studio' && !baseURL) {
+       baseURL = 'http://127.0.0.1:1234'
+     }
+     console.log(`⚪ Base URL: ${baseURL}`)
+     const model = defaultModel || await chooseModel({ baseURL, platform })
+     console.log(`⚪ Model: ${model}`)
      const _case = defaultCase || 'kebabCase'
-     console.log(`⚪ Chosen case: ${_case}`)
+     console.log(`⚪ Case: ${_case}`)
      const chars = defaultChars || 20
-     console.log(`⚪ Chosen chars: ${chars}`)
+     console.log(`⚪ Chars: ${chars}`)
      const language = defaultLanguage || 'English'
-     console.log(`⚪ Chosen language: ${language}`)
+     console.log(`⚪ Language: ${language}`)

@@ -28,3 +39,12 @@ const includeSubdirectories = defaultIncludeSubdirectories === 'true' || false

      const stats = await fs.stat(inputPath)
-     const options = { model, _case, chars, language, inputPath, includeSubdirectories }
+     const options = {
+       model,
+       _case,
+       chars,
+       baseURL,
+       language,
+       platform,
+       inputPath,
+       includeSubdirectories
+     }

@@ -31,0 +51,0 @@ if (stats.isDirectory()) {
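The base-URL defaulting logic added in this hunk can be exercised on its own. The helper below (`resolveBaseURL` is a hypothetical name, not one from the package) reproduces the branch from the diff: an explicit `--base-url` always wins, otherwise each platform falls back to its conventional local port.

```javascript
// Default base-URL selection as introduced in 1.0.19, extracted into a
// standalone helper for illustration: 11434 is Ollama's default port,
// 1234 is LM Studio's.
const resolveBaseURL = ({ platform, defaultBaseURL }) => {
  let baseURL = defaultBaseURL
  if (platform === 'ollama' && !baseURL) {
    baseURL = 'http://127.0.0.1:11434'
  } else if (platform === 'lm-studio' && !baseURL) {
    baseURL = 'http://127.0.0.1:1234'
  }
  return baseURL
}

console.log(resolveBaseURL({ platform: 'lm-studio' }))
// → 'http://127.0.0.1:1234'
console.log(resolveBaseURL({ platform: 'ollama', defaultBaseURL: 'http://192.168.1.5:11434' }))
// → 'http://192.168.1.5:11434'
```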
