localizer-ai
A powerful CLI tool for automating content localization using OpenAI GPT-4 or MistralAI. It supports multiple file types, preserves formatting, and enables context-aware translations.
Key features:
- Format-Aware Translation
- Smart Context Management
- Developer-Centric Features
- Project Structure Preservation
- Developer Experience
This tool combines the power of LLMs with specialized handling for development-focused content, making it superior to generic translation services or basic LLM implementations for technical documentation and code-related content.
⚠️ Early Stage Project: This tool is in its early stages of development. Known limitations are being actively addressed and will be improved in future versions. For best results, review translated output for critical content.
# Install globally
npm install -g localizer-ai
# Set up API key for OpenAI
npm config set -g openai-key YOUR_API_KEY
# Or set up API key for MistralAI
npm config set -g mistralai-key YOUR_API_KEY
# Generate a starter config file
localizer-ai create-config
# Translate using that config
localizer-ai translate
Create a `localizer-ai.config.json` file:
{
"source": "docs/en",
"fileTypes": [".md", ".txt"],
"locales": ["fr", "es", "de"],
"from": "en",
"destination": "localized",
"aiServiceProvider": "openAI",
"parallelProcessing": true,
"llmConfig": {
"temperature": 0.4,
"maxTokens": 1000
}
}
| Option | Description | Default |
| --- | --- | --- |
| source | Source directory path | Required |
| fileTypes | Array of file extensions | Required |
| locales | Target language codes | Required |
| from | Source language code | "en" |
| destination | Output directory | Same as source |
| aiServiceProvider | AI service to use | "openAI" |
| parallelProcessing | Enable parallel processing | true |
| llmConfig | AI model configuration | {} |
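To illustrate Project Structure Preservation, the example config above would produce roughly the following layout (the paths are illustrative, not taken from the tool's documentation):

```
docs/en/              # source ("from": "en")
  guide.md
localized/            # "destination"
  fr/guide.md         # translated copies mirror the source layout
  es/guide.md
  de/guide.md
```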
Note on default models:
- OpenAI: uses the `gpt-4o-mini` model by default
- MistralAI: uses the `open-mistral-nemo` model by default
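Because `llmConfig` is spread into the chat-completion call in the provider snippets below, it looks like the model can also be overridden from the config file; this is an assumption rather than a documented option, and the `"gpt-4o"` value is only an example:

```json
{
  "aiServiceProvider": "openAI",
  "llmConfig": {
    "model": "gpt-4o",
    "temperature": 0.4,
    "maxTokens": 1000
  }
}
```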
Context can be supplied per file to guide translations, as a JSON map from file paths to short descriptions:
{
"docs/api.md": "Technical API documentation",
"docs/guide.md": "User guide content"
}
For structured files, context can also be given per key, with `$fileContext` describing the file as a whole:
{
"docs/config.json": {
"api.endpoints": "API endpoint descriptions",
"$fileContext": "Configuration documentation"
}
}
The CLI is organized around a few core modules.

Command executor (`src/cli/commandExecutor.js`):
async function commandExecutor() {
displayWelcomeMessage();
const args = process.argv.slice(2);
// ... command handling logic
}
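A minimal sketch of the elided command handling, assuming only the two commands shown earlier; `createConfigFile` and `runTranslation` are hypothetical helper names:

```js
async function commandExecutor() {
  displayWelcomeMessage();
  const args = process.argv.slice(2);

  switch (args[0]) {
    case "create-config":
      await createConfigFile(); // hypothetical: writes a starter localizer-ai.config.json
      break;
    case "translate":
      await runTranslation(); // hypothetical: reads the config and starts replication
      break;
    default:
      console.log("Usage: localizer-ai <create-config|translate>");
  }
}
```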
Text translator (`src/utils/textTranslator.js`):
async function translateText({ content, from, to, localeContext, fileType }) {
// ... translation logic
}
File replicator (`src/utils/fileReplicator.js`):
async function replicateFiles(
sourcePath,
locales,
fileTypes,
from,
destinationPath
) {
// ... file replication logic
}
Requests to each provider are thin chat-completion wrappers. (`openai` below is the OpenAI SDK client, presumably configured with the key saved via `npm config set -g openai-key`; `llmConfig` comes from localizer-ai.config.json and is spread over the defaults.)
const askOpenAI = async ({ question, systemPrompt }) => {
const response = await openai.chat.completions.create({
model: "gpt-4",
messages: [
{ role: "system", content: systemPrompt },
{ role: "user", content: question },
],
temperature: 0.4,
...llmConfig,
});
return response.choices[0].message.content;
};
(`client` below is the MistralAI SDK client, presumably configured with the key saved via `npm config set -g mistralai-key`.)
const askMistralAI = async ({ question, systemPrompt }) => {
const chatResponse = await client.chat.complete({
model: "open-mistral-nemo",
messages: [
{ role: "system", content: systemPrompt },
{ role: "user", content: question },
],
...llmConfig,
});
return chatResponse.choices[0].message.content;
};
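The `aiServiceProvider` option presumably chooses between these two wrappers. A trivial dispatcher might look like this; the exact value used for MistralAI is an assumption, since only `"openAI"` appears in the docs above:

```js
async function askAI({ question, systemPrompt, provider = "openAI" }) {
  // Route the request to the configured provider
  return provider === "mistralAI"
    ? askMistralAI({ question, systemPrompt })
    : askOpenAI({ question, systemPrompt });
}
```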
The tool preserves the original formatting of supported file types. To stay within provider rate limits, requests to the AI service are funneled through a throttled queue:
class AIRequestQueue {
constructor(delayMs = 1500) {
this.queue = [];
this.isProcessing = false;
this.delayMs = delayMs;
}
async processQueue() {
// ... queue processing logic with rate limiting
}
}
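One way the elided queue logic could work, processing requests sequentially with a pause of `delayMs` between them (`enqueue` is a hypothetical method name):

```js
class AIRequestQueue {
  constructor(delayMs = 1500) {
    this.queue = [];
    this.isProcessing = false;
    this.delayMs = delayMs;
  }

  // Add a request-producing function to the queue and get its eventual result
  enqueue(requestFn) {
    return new Promise((resolve, reject) => {
      this.queue.push({ requestFn, resolve, reject });
      this.processQueue();
    });
  }

  async processQueue() {
    if (this.isProcessing) return;
    this.isProcessing = true;
    while (this.queue.length > 0) {
      const { requestFn, resolve, reject } = this.queue.shift();
      try {
        resolve(await requestFn());
      } catch (err) {
        reject(err);
      }
      // Throttle: wait before sending the next request
      await new Promise((r) => setTimeout(r, this.delayMs));
    }
    this.isProcessing = false;
  }
}
```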
To contribute, create a feature branch, make your changes, and open a pull request:
# Create a feature branch
git checkout -b feature/AmazingFeature
# Install dependencies and start the dev environment
npm install
npm run dev
# Commit and push your changes
git commit -m 'Add some AmazingFeature'
git push origin feature/AmazingFeature
# Run the test suite
npm test
The main public functions are documented with JSDoc:
/**
* Translates content from one language to another
* @param {Object} options Translation options
* @param {string} options.content Content to translate
* @param {string} options.from Source language
* @param {string} options.to Target language
* @returns {Promise<string>} Translated content
*/
async function translateText(options) {
// Implementation
}
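For example, a hypothetical call using the languages from the example config:

```js
const translated = await translateText({
  content: "# Getting Started\nWelcome to the docs.",
  from: "en",
  to: "fr",
});
```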
/**
* Replicates directory structure with translations
* @param {string} sourcePath Source directory
* @param {string[]} locales Target locales
* @param {string[]} fileTypes File types to process
* @returns {Promise<void>}
*/
async function replicateFiles(sourcePath, locales, fileTypes) {
// Implementation
}
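And a corresponding hypothetical call, mirroring the example configuration above:

```js
await replicateFiles("docs/en", ["fr", "es", "de"], [".md", ".txt"]);
```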
This project is licensed under the MIT License - see the LICENSE file for details.
Created by Takasi Venkata Sandeep