
@unbuilt/cli
@unbuilt/cli is a command-line tool for analyzing the technologies used on websites. It can detect frameworks, libraries, and other technical details by analyzing the structure and content of web applications.
```shell
npm install -g @unbuilt/cli
```
For local (default) analysis mode, Playwright is required:
```shell
# Install Playwright browsers
npx playwright install chromium
```
This is necessary because Unbuilt uses a headless browser to visit websites and analyze their technology stack. The remote analysis mode doesn't require Playwright to be installed locally as it runs on the Unbuilt server.
Unbuilt CLI provides three main commands:
- `analyze`: Analyze a single website (the `analyze` keyword can be omitted, e.g. `unbuilt mysite.com`)
- `batch`: Analyze multiple websites from a CSV file
- `status`: Check the status of an ongoing remote analysis

```shell
unbuilt https://example.com
```
This will analyze the website locally using a headless browser and display the detected technologies.
| Option | Description |
|---|---|
| `-r, --remote` | Run the analysis remotely on the Unbuilt server (runs locally by default) |
| `-s, --save` | Save the analysis results to the database, which serves as a cache for future runs. By default, CLI analyses run without caching results. Available only to admins (set the `UNBUILT_API_KEY` environment variable when running). |
| `-n, --async` | Enable asynchronous execution: returns a job ID immediately instead of waiting for the result. The ID can be used to check status or retrieve results later with the `unbuilt status` command. Only available with `--remote`. |
| `-r, --refresh` | Force a fresh analysis by bypassing the cache. Ignores any previously saved results and rebuilds the cache with new data. |
| `-t, --timeout <seconds>` | Maximum time to wait for the analysis to complete (default: 120) |
| `-j, --json` | Output results in JSON format |
| `--session` | Use a copy of the current user's Chrome profile data for authentication and session preservation. The analysis runs with the user's existing login sessions and cookies without modifying the original profile. The temporary profile copy is cleaned up automatically when the browser context is closed. |
```shell
# Run a local analysis
unbuilt https://example.com

# Run a local analysis with the user's session data (logged-in state)
unbuilt https://example.com --session

# Run a remote analysis
unbuilt https://example.com --remote

# Run a local analysis and save the results to the DB (cache)
unbuilt https://example.com --save

# Run a local analysis with a longer timeout (in seconds)
unbuilt https://example.com --timeout 300

# Get JSON output
unbuilt https://example.com --json

# Run a remote analysis asynchronously; returns <analysisId>
unbuilt https://example.com --remote --async

# Use <analysisId> to get results
unbuilt status <analysisId>
```
The batch command allows you to analyze multiple websites from a CSV file.
Create a CSV file with a list of URLs to analyze. The file should have a column header named `url` (or the first column will be used if there's no header).
Example websites.csv:
```csv
url
https://example.com
https://example.org
https://example.net
```
```shell
unbuilt batch websites.csv
```
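Since the batch input is just a one-column CSV with a `url` header, it can also be generated from a plain list of URLs; a minimal sketch (the file name and URL list below are only examples):

```shell
# Write the 'url' header expected by `unbuilt batch`, then one URL per line.
# websites.csv and the URLs below are placeholders.
printf 'url\n' > websites.csv
for site in https://example.com https://example.org https://example.net; do
  printf '%s\n' "$site" >> websites.csv
done
```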
| Option | Description |
|---|---|
| `-j, --json` | Output results in JSON format |
| `-s, --save` | Save the analysis results to the database |
| `-o, --output <file>` | Save results to the specified output file |
| `-t, --timeout <seconds>` | Maximum time to wait for each analysis (default: 120) |
| `-c, --concurrent <number>` | Maximum number of concurrent analyses (default: 1) |
```shell
# Analyze all websites in the CSV file
unbuilt batch websites.csv

# Analyze multiple websites concurrently (4 at a time)
unbuilt batch websites.csv --concurrent 4

# Save the results to a CSV file
unbuilt batch websites.csv --output results

# Save the results to a JSON file
unbuilt batch websites.csv --output results.json --json

# Analyze websites and save to the database
unbuilt batch websites.csv --save

# Set a longer timeout for each analysis (in seconds)
unbuilt batch websites.csv --timeout 300
```
When using remote analysis in async mode, you can check the status of an ongoing analysis using the status command.
```shell
unbuilt status <analysisId>
```
| Option | Description |
|---|---|
| `-j, --json` | Output results in JSON format |

```shell
unbuilt status abc123-def456-ghi789
```
When using the `--session` option, the CLI copies the current user's Chrome profile data into a temporary profile, runs the analysis with that copy, and automatically removes the copy when the browser context is closed. This allows secure analysis of authenticated websites without modifying the original profile or requiring a fresh login. The option is particularly useful for analyzing pages and applications that are only visible to logged-in users.
By default, Unbuilt displays the analysis results in a human-readable format, showing the detected technologies grouped by category.
When using the --json flag, the output will be in JSON format, which is useful for programmatic processing or integration with other tools.
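The exact JSON schema isn't documented here, so downstream tooling should validate the saved output before processing it; a minimal sketch using Python's stdlib JSON pretty-printer (the file name and its contents are hypothetical stand-ins, not the CLI's real schema):

```shell
# Hypothetical stand-in for output captured via `unbuilt <url> --json > analysis.json`;
# the real field names may differ.
cat > analysis.json <<'EOF'
{"url": "https://example.com", "technologies": ["react"]}
EOF

# Validate and pretty-print the JSON before piping it into other tools.
python3 -m json.tool analysis.json
```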
The `--concurrent` parameter speeds up batch analysis but requires more system resources. When using the `--save` option, analysis results are stored in a database that serves as a cache for future runs.
Saving is available only to admins: set the `UNBUILT_API_KEY` environment variable when running, and contact aqson@me.com to get access.
```shell
# Basic local analysis
unbuilt https://example.com

# Local analysis with the user's logged-in session
unbuilt https://app.example.com --session

# Save JSON output to a file
unbuilt https://example.com --json > analysis.json

# Concurrent batch analysis written to an output file
unbuilt batch websites.csv --concurrent 4 --output tech-analysis

# Batch analysis saved to the database (admin only)
UNBUILT_API_KEY="xxxxx" unbuilt batch websites.csv --concurrent 4 --save
```
If you encounter issues with the tool, try the following:
- Ensure Playwright browsers are installed (`npx playwright install chromium`)
- When using `--session` with the analyze command, ensure Chrome is not currently running to avoid profile lock issues