
# storescraper-ai
StoreScraper AI CLI is a powerful command-line tool for fetching, analyzing, and processing app reviews from Google Play Store and Apple App Store. It helps developers, product managers, and market researchers gain valuable insights from user feedback through AI-powered analysis.
The tool solves several key problems around collecting and analyzing app store feedback, and is aimed at developers, product managers, and market researchers who need insight into user reviews.
StoreScraper AI CLI is built with a modular architecture that separates concerns into distinct components:

- Command Layer (`src/commands/`)
- Scraper Layer (`src/utils/store-scraper.ts`)
- AI Layer (`src/ai/`)
- Settings Layer (`src/utils/settings.ts`)

Data flows through the tool as follows:

```
User Input → Command Layer → Scraper Layer → Data Processing → AI Analysis → Output Generation
```
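As a rough illustration of that flow, here is a minimal sketch of how a command could wire the layers together. The helper names (`fetchReviews`, `analyzeReviews`, `writeReport`) are hypothetical placeholders for this sketch, not the package's actual internal API; the real implementations live in the scraper and AI layers described above.

```typescript
// Minimal sketch of the layered flow (hypothetical helper names, not the real internals).
import { Command } from 'commander';

// Scraper Layer (placeholder): fetch raw review text for an app.
async function fetchReviews(appId: string, count: number): Promise<string[]> {
  // In the real package this is handled by src/utils/store-scraper.ts.
  return [];
}

// AI Layer (placeholder): turn reviews into an analysis summary.
async function analyzeReviews(reviews: string[]): Promise<string> {
  // In the real package this is handled by the OpenAI-backed use cases in src/ai/.
  return `Analyzed ${reviews.length} reviews`;
}

// Output Generation (placeholder): persist or print the result.
function writeReport(analysis: string): void {
  console.log(analysis);
}

// Command Layer: parse user input and drive the pipeline.
const program = new Command();
program
  .command('feedback <appId>')
  .option('-c, --count <count>', 'number of reviews to fetch', '50')
  .action(async (appId: string, options: { count: string }) => {
    const reviews = await fetchReviews(appId, Number(options.count)); // Scraper Layer
    const analysis = await analyzeReviews(reviews);                   // AI Analysis
    writeReport(analysis);                                            // Output Generation
  });

program.parseAsync(process.argv);
```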
Install globally to use the CLI, or locally to use it as a library:

```bash
# Global installation (CLI)
npm install -g nanogiants-storescraper-ai

# Local installation (library)
npm install nanogiants-storescraper-ai
```
You can use this package as a library in your Node.js, React, or NestJS applications:
Node.js example:

```typescript
import { StoreScraper } from 'nanogiants-storescraper-ai';

// Initialize with configuration
const scraper = new StoreScraper({
  openai: {
    apiKey: 'your-openai-api-key', // Required for AI features
    model: 'gpt-4-turbo' // Optional, defaults to gpt-4-turbo
  },
  defaultLanguage: 'en',
  defaultCountry: 'us',
  defaultReviewCount: 50
});

// Example: Fetch app info and reviews from Google Play
async function getAppInfo() {
  try {
    // Fetch app info and reviews
    const appInfo = await scraper.fetchGooglePlayApp('com.example.app', {
      reviewCount: 100,
      lang: 'en',
      country: 'us'
    });

    console.log(`App: ${appInfo.title}`);
    console.log(`Rating: ${appInfo.score}`);
    console.log(`Reviews: ${appInfo.reviews.length}`);

    // Analyze reviews with AI
    const analysis = await scraper.analyzeFeedback(
      appInfo.reviews.map(review => review.text),
      'english'
    );
    console.log('Analysis:', analysis);

    // Example: Bulk analysis of multiple apps from a CSV file
    const csvString = `Name;App Id;Mobile App Name
Company A;com.company.app;App Name
Company B;com.company.app2;App Name 2`;

    const result = await scraper.analyzeBulkFeedback(csvString, {
      outputPath: './output',
      platform: 'android',
      reviewCount: 50,
      language: 'en',
      country: 'us',
      generateDetailedReports: true
    });

    console.log(`Summary CSV saved to: ${result.summaryFilePath}`);
    console.log(`Processed ${result.summaryData.length} apps successfully.`);

    // Example: Validate Excel file before parsing
    const validationResult = scraper.validateTamExcel('./path/to/excel-file.xlsx');
    if (!validationResult.isValid) {
      console.error(`Error: ${validationResult.error}`);
      console.log('Available columns:', validationResult.columnNames.join(', '));
      return;
    }

    console.log('Excel file is valid!');
    console.log(`Name column: ${validationResult.nameColumn}`);
    console.log(`App column: ${validationResult.appColumn}`);
    if (validationResult.appNameColumn) {
      console.log(`App name column: ${validationResult.appNameColumn}`);
    }

    // Example: Parse Excel file to extract app IDs
    const parseResult = await scraper.parseTam('./path/to/excel-file.xlsx', {
      outputPath: './output',
      verbose: true
    });

    console.log(`Extracted ${parseResult.count} app IDs`);
    console.log(`Output saved to: ${parseResult.outputPath}`);

    // Access the extracted app data
    parseResult.data.forEach(app => {
      console.log(`${app.Name} - ${app['App Id']} (${app['Mobile App Name']})`);
    });
  } catch (error) {
    console.error('Error:', error);
  }
}

getAppInfo();
```
React example:

```jsx
import { useState, useEffect } from 'react';
import { StoreScraper } from 'nanogiants-storescraper-ai';

function AppAnalyzer() {
  const [appInfo, setAppInfo] = useState(null);
  const [loading, setLoading] = useState(false);
  const [error, setError] = useState(null);

  // Initialize scraper (preferably in a service or context)
  const scraper = new StoreScraper({
    openai: {
      apiKey: process.env.REACT_APP_OPENAI_API_KEY
    }
  });

  async function analyzeApp(appId) {
    setLoading(true);
    try {
      const info = await scraper.fetchGooglePlayApp(appId);
      setAppInfo(info);
    } catch (err) {
      setError(err.message);
    } finally {
      setLoading(false);
    }
  }

  return (
    <div>
      {/* Your UI components */}
    </div>
  );
}
```
NestJS example:

```typescript
import { Injectable } from '@nestjs/common';
import { StoreScraper } from 'nanogiants-storescraper-ai';
import { ConfigService } from '@nestjs/config';

@Injectable()
export class AppAnalysisService {
  private scraper: StoreScraper;

  constructor(private configService: ConfigService) {
    this.scraper = new StoreScraper({
      openai: {
        apiKey: this.configService.get<string>('OPENAI_API_KEY')
      }
    });
  }

  async analyzeApp(appId: string, platform: 'android' | 'ios') {
    if (platform === 'android') {
      return this.scraper.fetchGooglePlayApp(appId);
    } else {
      return this.scraper.fetchAppStoreApp(appId);
    }
  }

  async getFeedbackAnalysis(reviews: string[]) {
    return this.scraper.analyzeFeedback(reviews);
  }

  async validateExcelFile(filePath: string) {
    return this.scraper.validateTamExcel(filePath);
  }

  async parseExcelFile(filePath: string) {
    // First validate the Excel file
    const validationResult = await this.validateExcelFile(filePath);
    if (!validationResult.isValid) {
      throw new Error(`Invalid Excel file: ${validationResult.error}`);
    }

    // Then parse the Excel file
    return this.scraper.parseTam(filePath, {
      outputPath: './output/tam',
      verbose: false
    });
  }
}
```
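For completeness, a controller could consume this service as in the sketch below. The module wiring, file name, and route names here are assumptions for illustration, not part of the package:

```typescript
import { Controller, Get, Param, Query } from '@nestjs/common';
// Assumed file name for the service defined above.
import { AppAnalysisService } from './app-analysis.service';

// Hypothetical controller showing how AppAnalysisService might be injected and used.
@Controller('apps')
export class AppAnalysisController {
  constructor(private readonly appAnalysisService: AppAnalysisService) {}

  @Get(':appId')
  async getApp(
    @Param('appId') appId: string,
    @Query('platform') platform: 'android' | 'ios' = 'android',
  ) {
    // Delegates to the service, which fetches from Google Play or the App Store.
    return this.appAnalysisService.analyzeApp(appId, platform);
  }
}
```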
### Local Installation
```bash
git clone https://github.com/nanogiants/storescraper-ai.git
cd storescraper-ai
npm install
npm link
```
This package requires an OpenAI API key for AI analysis features. There are several ways to provide this:
1. Environment variable: set the `OPENAI_API_KEY` environment variable

   ```bash
   export OPENAI_API_KEY=your_api_key_here
   ```

2. Dotenv file: create a `.env` file in your project root with your API key

   ```
   OPENAI_API_KEY=your_api_key_here
   ```

3. Settings command: use the settings command to save your API key

   ```bash
   storescraper settings
   ```

When using as a library, you must provide the API key in the configuration:

```typescript
const scraper = new StoreScraper({
  openai: {
    apiKey: 'your_api_key_here'
  }
});
```
For security best practices, keep your API key out of version control and source code; prefer environment variables or the settings command.

Check out the examples directory for complete working examples of how to use this package programmatically.
Create a `.env` file in the root directory with your OpenAI API key:

```
OPENAI_API_KEY=your_api_key_here
```
Fetch and analyze reviews for a single app:
```bash
storescraper feedback [options]
```

Options:

- `-p, --platform <platform>`: Specify platform (android or ios)
- `-c, --count <count>`: Number of reviews to fetch (default: 50)
- `-o, --output <path>`: Output directory for the markdown file
- `-l, --lang <lang>`: Two letter language code (default: en)
- `-C, --country <country>`: Two letter country code (default: us)
- `-s, --sort <sort>`: Sort reviews by: newest, rating, or helpful (default: newest)
- `-a, --analyze <boolean>`: Analyze feedback with AI after fetching (true/false)

Example:

```bash
storescraper feedback -p android -c 100 -l de -C de -s newest -a true -o my-app
```
Output: a markdown file with the fetched reviews is written to the output directory, plus an AI analysis file when `-a true` is set.
Process multiple apps from a CSV file:
```bash
storescraper feedback-bulk [options]
```

The command includes an interactive file picker for selecting the input CSV file.

Options:

- `-i, --input <path>`: Path to the input CSV file with app IDs
- `-c, --count <number>`: Number of reviews to fetch (default: 50)
- `-l, --lang <code>`: Two letter language code (default: en)
- `-C, --country <code>`: Two letter country code (default: us)
- `-s, --sort <sort>`: Sort reviews by: newest, rating, or helpful (default: newest)
- `-a, --analyze <boolean>`: Analyze feedback with AI after fetching (true/false)
- `-o, --output <path>`: Output directory for the markdown files
- `-d, --delay <ms>`: Delay between requests in milliseconds (default: 1000)

Example:

```bash
storescraper feedback-bulk -i app-ids.csv -c 50 -l en -C us -a true -d 2000 -o output
```
Input CSV Format:
```
Name;App Id;Mobile App Name
App Name 1;com.example.app1;App Display Name 1
App Name 2;123456789;App Display Name 2
```
Output: one reviews markdown file (and an optional AI analysis file) per app, organized into per-app subdirectories of the output directory.
Extract app IDs from Excel files containing app URLs:
```bash
storescraper parse-tam [options]
```

Options:

- `-i, --input <path>`: Path to the input Excel file or directory
- `-o, --output <path>`: Path to save the output CSV file(s)
- `-d, --delimiter <char>`: Delimiter for the output CSV file (default: `;`)
- `-v, --validate-only`: Only validate the Excel file without parsing (default: false)

Example:

```bash
storescraper parse-tam -i tam-data.xlsx -o output
storescraper parse-tam -i tam-directory -o output
storescraper parse-tam -i tam-data.xlsx -v # Validate only
```
Expected Excel format: the file must contain columns for the company name and the app URL or ID; an optional mobile app name column is also recognized.

The command will first validate the Excel file to ensure it has the required columns. If validation fails, it will display the available columns and suggest possible matches.

Output CSV format: the generated file uses the `Name;App Id;Mobile App Name` columns expected by the feedback-bulk command.
Analyze existing review text:
```bash
storescraper analyze [options]
```

Options:

- `-f, --file <path>`: Path to file containing feedback to analyze
- `-o, --output <path>`: Path to save the analysis results
- `-l, --language <language>`: Language for the analysis output (e.g., english, german, spanish)

Example:

```bash
storescraper analyze -f reviews.txt -o analysis.md -l german
```
Configure default settings:
```bash
storescraper settings [options]
```

Options:

- `-r, --reset`: Reset settings to default values

All output is saved in the output directory by default, with the following structure:

```
output/
├── app-name-1/
│   ├── App-Name-1-android-timestamp.md   # Reviews
│   └── analysis-timestamp.md             # AI analysis
└── app-name-2/
    ├── App-Name-2-ios-timestamp.md       # Reviews
    └── analysis-timestamp.md             # AI analysis
```
Clone the repository:

```bash
git clone https://github.com/nanogiants/storescraper-ai.git
cd storescraper-ai
```

Install dependencies:

```bash
npm install
```

Create a local environment file:

```bash
cp .env.example .env
# Edit .env and add your OpenAI API key
```

Build the project:

```bash
npm run build
```

Link the package locally:

```bash
npm link
```
```
├── dist/                  # Compiled JavaScript files
├── src/                   # Source TypeScript files
│   ├── ai/                # AI-related functionality
│   │   ├── core/          # Core AI utilities
│   │   └── usecases/      # AI use cases
│   ├── commands/          # CLI commands
│   └── utils/             # Utility functions
├── output/                # Generated output files (gitignored)
├── .env                   # Environment variables
└── tsconfig.json          # TypeScript configuration
```
Key places to look when making changes:

- CLI commands live in the `src/commands` directory.
- AI use cases live in the `src/ai/usecases` directory.
- Scraping logic lives in `src/utils/store-scraper.ts`.
- Settings handling lives in `src/utils/settings.ts`.
- All source code lives in the `src` directory; compile it with `npm run build`.
- Test changes locally with `npm link` and running the CLI commands.

This project follows the Karma commitlint style for commit messages:

```
<type>(<scope>): <subject>
```

Types: feat, fix, docs, style, refactor, test, chore.

Example:

```
feat(feedback): add support for custom output formats
```
To add a new CLI command:

1. Create a new file in `src/commands/` (e.g., `my-command.ts`).
2. Register the command in `src/index.ts`.

Example:
```typescript
// src/commands/my-command.ts
import { Command } from 'commander';

export function initMyCommand(program: Command): void {
  program
    .command('my-command')
    .description('Description of my command')
    .option('-o, --option <value>', 'Description of option')
    .action(async (options) => {
      // Command implementation
    });
}
```

```typescript
// src/index.ts
import { initMyCommand } from './commands/my-command';

// ...
initMyCommand(program);
```
To add a new AI use case:

1. Create a new file in `src/ai/usecases/` (e.g., `my-usecase.ts`).
2. Extend the `UseCase` class.

Example:
```typescript
import { z } from 'zod';
import { UseCase, UseCaseConfig } from '../core/usecase';

const MyResponseSchema = z.object({
  // Define your schema
});

export type MyResponse = z.infer<typeof MyResponseSchema>;

const myUsecaseConfig: UseCaseConfig<MyResponse> = {
  name: 'My Usecase',
  description: 'Description of my usecase',
  schema: MyResponseSchema,
  temperature: 0.2,
};

export class MyUsecase extends UseCase<MyResponse> {
  constructor() {
    super(myUsecaseConfig);
  }

  async execute(input: string): Promise<MyResponse> {
    // Delegate to the base class implementation; calling this.execute here
    // would recurse infinitely.
    return super.execute(input);
  }
}
```
When working with OpenAI's API and Zod schemas, note that OpenAI's implementation doesn't support certain Zod validation fields (for example, numeric ranges or string length constraints). Instead, include that validation guidance in the field's description.
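As a rough illustration (the schema below is invented for this example, not taken from the package), validation hints can be moved into `.describe()` so they reach the model as part of the field description rather than as unsupported constraints:

```typescript
import { z } from 'zod';

// Instead of relying on constraints the API may ignore (e.g. z.number().min(1).max(5)),
// state the expectation in the description text.
const ReviewAnalysisSchema = z.object({
  sentimentScore: z
    .number()
    .describe('Overall sentiment score as an integer between 1 and 5'),
  topComplaints: z
    .array(z.string())
    .describe('Up to 5 short phrases summarizing the most common complaints'),
});

export type ReviewAnalysis = z.infer<typeof ReviewAnalysisSchema>;
```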
To cut a release:

```bash
git flow release start <version>
git flow release finish <version>
git push origin master --tags
npm publish
```
- `parse-tam` command to extract app IDs from Excel files
- `feedback-bulk` command to process multiple apps from a CSV file

Contributions are welcome and appreciated! Here's how you can contribute:
Before submitting a pull request, please ensure the project builds (`npm run build`) and your commit messages follow the convention described above.
Bug reports and feature requests are welcome on GitHub at https://github.com/nanogiants/storescraper-ai. This project is intended to be a safe, welcoming space for collaboration, and contributors are expected to adhere to the Contributor Covenant code of conduct.
MIT License Copyright (c) 2025 NanoGiants GmbH
Permission is hereby granted, free
of charge, to any person obtaining a copy of this software and associated
documentation files (the "Software"), to deal in the Software without
restriction, including without limitation the rights to use, copy, modify, merge,
publish, distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to the
following conditions:
The above copyright notice and this permission notice
(including the next paragraph) shall be included in all copies or substantial
portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF
ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO
EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR
OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
THE SOFTWARE.
Troubleshooting:

- API Rate Limiting: increase the delay between requests with the `-d` option, or fetch fewer reviews with `-c`.
- OpenAI API Errors: verify that your OpenAI API key is configured (see the configuration section above).
- CSV Parsing Issues: check that the file uses the expected `;` delimiter.
- Excel Parsing Issues: run `parse-tam` with `-v` to validate the file and see which columns were detected.
If you encounter issues not covered here, please open an issue on GitHub with details of the command you ran and the error you received.