
@openai/codex
Lightweight coding agent that runs in your terminal
This is the home of the Codex CLI, a coding agent from OpenAI that runs locally on your computer. If you are looking for the cloud-based agent from OpenAI, Codex Web, see https://chatgpt.com/codex.
Codex CLI is an experimental project under active development. It is not yet stable: it may contain bugs or incomplete features, and it may undergo breaking changes. We're building it in the open with the community, and we welcome your help: file issues or submit PRs (see the contributing section below)!
Install globally:
brew install codex
Or go to the latest GitHub Release and download the appropriate binary for your platform.
Next, set your OpenAI API key as an environment variable:
export OPENAI_API_KEY="your-api-key-here"
[!NOTE] This command sets the key only for your current terminal session. You can add the export line to your shell's configuration file (e.g., ~/.zshrc), but we recommend setting it for the session.
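If you do want the key available in every session, one option is to append the export line to your shell's configuration file. This is a sketch that assumes zsh; use ~/.bashrc or your shell's equivalent, and replace the placeholder key with your own:

```shell
# Append the export line so the key persists across terminal sessions.
# ~/.zshrc is an assumption here; adapt the path to your shell.
echo 'export OPENAI_API_KEY="your-api-key-here"' >> ~/.zshrc
```

Open a new terminal (or source the file) for the change to take effect.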
If you have a paid OpenAI account, run the following to start the login process:
codex login
If you complete the process successfully, you should have a ~/.codex/auth.json file that contains the credentials that Codex will use.
If you encounter problems with the login flow, please comment on https://github.com/openai/codex/issues/1243.
Using --profile to use other models

Codex also allows you to use other providers that support the OpenAI Chat Completions (or Responses) API.
To do so, you must first define custom providers in ~/.codex/config.toml. For example, the provider for a standard Ollama setup would be defined as follows:
[model_providers.ollama]
name = "Ollama"
base_url = "http://localhost:11434/v1"
The base_url will have /chat/completions appended to it to build the full URL for the request.
For providers that also require an Authorization header of the form Bearer SECRET, you can specify an env_key, which names the environment variable to read for the value of SECRET when making a request:
[model_providers.openrouter]
name = "OpenRouter"
base_url = "https://openrouter.ai/api/v1"
env_key = "OPENROUTER_API_KEY"
Providers that speak the Responses API are also supported by adding wire_api = "responses" as part of the definition. Accessing OpenAI models via Azure is an example of such a provider, though it also requires specifying additional query_params that need to be appended to the request URL:
[model_providers.azure]
name = "Azure"
# Make sure you set the appropriate subdomain for this URL.
base_url = "https://YOUR_PROJECT_NAME.openai.azure.com/openai"
env_key = "AZURE_OPENAI_API_KEY" # Or "OPENAI_API_KEY", whichever you use.
# Newer versions appear to support the responses API, see https://github.com/openai/codex/pull/1321
query_params = { api-version = "2025-04-01-preview" }
wire_api = "responses"
Once you have defined a provider you wish to use, you can configure it as your default provider as follows:
model_provider = "azure"
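Since the provider definition and the default-provider setting both live in ~/.codex/config.toml, the Azure example above combines with this setting in a single file. A sketch (the base_url subdomain, env var, and api-version are placeholders carried over from the example above):

```toml
# ~/.codex/config.toml - define a provider and make it the default.
model_provider = "azure"

[model_providers.azure]
name = "Azure"
base_url = "https://YOUR_PROJECT_NAME.openai.azure.com/openai"
env_key = "AZURE_OPENAI_API_KEY"
query_params = { api-version = "2025-04-01-preview" }
wire_api = "responses"
```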
[!TIP] If you find yourself experimenting with a variety of models and providers, then you likely want to invest in defining a profile for each configuration like so:
[profiles.o3]
model_provider = "azure"
model = "o3"
[profiles.mistral]
model_provider = "ollama"
model = "mistral"
This way, you can specify one command-line argument (e.g., --profile o3, --profile mistral) to override multiple settings together.
Run interactively:
codex
Or, run with a prompt as input (and optionally in Full Auto mode):
codex "explain this codebase to me"
codex --full-auto "create the fanciest todo-list app"
That's it - Codex will scaffold a file, run it inside a sandbox, install any missing dependencies, and show you the live result. Approve the changes and they'll be committed to your working directory.
Codex CLI is built for developers who already live in the terminal and want ChatGPT-level reasoning plus the power to actually run code, manipulate files, and iterate - all under version control. In short, it's chat-driven development that understands and executes your repo.
And it's fully open-source so you can see and contribute to how it develops!
Codex lets you decide how much autonomy you want to grant the agent. The following options can be configured independently:
approval_policy determines when you should be prompted to approve whether Codex can execute a command
sandbox determines the sandbox policy that Codex uses to execute untrusted commands

By default, Codex runs with approval_policy = "untrusted" and sandbox.mode = "read-only", which means that:

Codex runs trusted read-only commands (cat, ls, etc.) without asking, and prompts you before executing anything else.

Running Codex with the --full-auto option changes the configuration to approval_policy = "on-failure" and sandbox.mode = "workspace-write", which means that:

Codex can read and write files in the workspace (the current working directory, or whatever is specified via --cd).
Commands run without further prompting unless they fail, but network access is disabled inside the sandbox. (For example, Codex cannot npm install a dependency because that requires network access.)

Again, these two options can be configured independently. For example, if you want Codex to perform an "exploration" where you are happy for it to read anything it wants but you never want to be prompted, you could run Codex with approval_policy = "never" and sandbox.mode = "read-only".
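The same behavior can be pinned in ~/.codex/config.toml rather than passed as a flag. A sketch of the full-auto configuration, using the option names from this section (see codex-rs/config.md for the authoritative schema):

```toml
# Equivalent of running `codex --full-auto`, expressed in config.toml.
approval_policy = "on-failure"
sandbox.mode = "workspace-write"
```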
The mechanism Codex uses to implement the sandbox policy depends on your OS:
On macOS, Codex uses sandbox-exec with a profile (-p) that corresponds to the sandbox.mode that was specified.
On Linux, Codex uses Landlock/seccomp APIs to enforce the sandbox configuration.

Note that when running Linux in a containerized environment such as Docker, sandboxing may not work if the host/container configuration does not support the necessary Landlock/seccomp APIs. In such cases, we recommend configuring your Docker container so that it provides the sandbox guarantees you are looking for and then running codex with sandbox.mode = "danger-full-access" (or, more simply, the --dangerously-bypass-approvals-and-sandbox flag) within your container.
| Requirement | Details |
|---|---|
| Operating systems | macOS 12+, Ubuntu 20.04+/Debian 10+, or Windows 11 via WSL2 |
| Git (optional, recommended) | 2.23+ for built-in PR helpers |
| RAM | 4 GB minimum (8 GB recommended) |
| Command | Purpose | Example |
|---|---|---|
| codex | Interactive TUI | codex |
| codex "..." | Initial prompt for interactive TUI | codex "fix lint errors" |
| codex exec "..." | Non-interactive "automation mode" | codex exec "explain utils.ts" |
Key flags: --model/-m, --ask-for-approval/-a.
You can give Codex extra instructions and guidance using AGENTS.md files. Codex looks for AGENTS.md files in the following places, and merges them top-down:
~/.codex/AGENTS.md - personal global guidance
AGENTS.md at repo root - shared project notes
AGENTS.md in the current working directory - sub-folder/feature specifics

Run Codex headless in pipelines. Example GitHub Action step:
- name: Update changelog via Codex
run: |
npm install -g @openai/codex@native # Note: we plan to drop the need for `@native`.
export OPENAI_API_KEY="${{ secrets.OPENAI_KEY }}"
codex exec --full-auto "update CHANGELOG for next release"
The Codex CLI can be configured to leverage MCP servers by defining an mcp_servers section in ~/.codex/config.toml. It is intended to mirror how tools such as Claude and Cursor define mcpServers in their respective JSON config files, though the Codex format is slightly different since it uses TOML rather than JSON, e.g.:
# IMPORTANT: the top-level key is `mcp_servers` rather than `mcpServers`.
[mcp_servers.server-name]
command = "npx"
args = ["-y", "mcp-server"]
env = { "API_KEY" = "value" }
[!TIP] It is somewhat experimental, but the Codex CLI can also be run as an MCP server via codex mcp. If you launch it with an MCP client such as npx @modelcontextprotocol/inspector codex mcp and send it a tools/list request, you will see that there is only one tool, codex, that accepts a grab-bag of inputs, including a catch-all config map for anything you might want to override. Feel free to play around with it and provide feedback via GitHub issues.
Because Codex is written in Rust, it honors the RUST_LOG environment variable to configure its logging behavior.
The TUI defaults to RUST_LOG=codex_core=info,codex_tui=info and log messages are written to ~/.codex/log/codex-tui.log, so you can leave the following running in a separate terminal to monitor log messages as they are written:
tail -F ~/.codex/log/codex-tui.log
By comparison, the non-interactive mode (codex exec) defaults to RUST_LOG=error, but messages are printed inline, so there is no need to monitor a separate file.
See the Rust documentation on RUST_LOG for more information on the configuration options.
Below are a few bite-size examples you can copy-paste. Replace the text in quotes with your own task. See the prompting guide for more tips and usage patterns.
| ✨ | What you type | What happens |
|---|---|---|
| 1 | codex "Refactor the Dashboard component to React Hooks" | Codex rewrites the class component, runs npm test, and shows the diff. |
| 2 | codex "Generate SQL migrations for adding a users table" | Infers your ORM, creates migration files, and runs them in a sandboxed DB. |
| 3 | codex "Write unit tests for utils/date.ts" | Generates tests, executes them, and iterates until they pass. |
| 4 | codex "Bulk-rename *.jpeg -> *.jpg with git mv" | Safely renames files and updates imports/usages. |
| 5 | codex "Explain what this regex does: ^(?=.*[A-Z]).{8,}$" | Outputs a step-by-step human explanation. |
| 6 | codex "Carefully review this repo, and propose 3 high impact well-scoped PRs" | Suggests impactful PRs in the current codebase. |
| 7 | codex "Look for vulnerabilities and create a security review report" | Finds and explains security bugs. |
brew install codex
Or go to the latest GitHub Release and download the appropriate binary for your platform.
Admittedly, each GitHub Release contains many executables, but in practice, you likely want one of these:
codex-aarch64-apple-darwin.tar.gz
codex-x86_64-apple-darwin.tar.gz
codex-x86_64-unknown-linux-musl.tar.gz
codex-aarch64-unknown-linux-musl.tar.gz

Each archive contains a single entry with the platform baked into the name (e.g., codex-x86_64-unknown-linux-musl), so you likely want to rename it to codex after extracting it.
The GitHub Release also contains a DotSlash file for the Codex CLI named codex. Using a DotSlash file lets you make a lightweight commit to source control that ensures all contributors use the same version of an executable, regardless of what platform they use for development.
# Clone the repository and navigate to the root of the Cargo workspace.
git clone https://github.com/openai/codex.git
cd codex/codex-rs
# Install the Rust toolchain, if necessary.
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y
source "$HOME/.cargo/env"
rustup component add rustfmt
rustup component add clippy
# Build Codex.
cargo build
# Launch the TUI with a sample prompt.
cargo run --bin codex -- "explain this codebase to me"
# After making changes, ensure the code is clean.
cargo fmt -- --config imports_granularity=Item
cargo clippy --tests
# Run the tests.
cargo test
Codex supports a rich set of configuration options documented in codex-rs/config.md.
By default, Codex loads its configuration from ~/.codex/config.toml.
The --config flag can be used to set or override ad-hoc config values for individual invocations of codex.
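For reference, a minimal ~/.codex/config.toml using options that appear elsewhere in this README might look like the following. This is a sketch, not an exhaustive example; see codex-rs/config.md for the authoritative list of options:

```toml
# Minimal ~/.codex/config.toml sketch.
model = "o4-mini"               # the default model named in this README
approval_policy = "untrusted"   # prompt before running non-trusted commands
sandbox.mode = "read-only"      # the default sandbox policy
```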
In 2021, OpenAI released Codex, an AI system designed to generate code from natural language prompts. That original Codex model was deprecated as of March 2023 and is separate from the CLI tool.
Any model available with the Responses API. The default is o4-mini, but pass --model gpt-4.1 or set model = "gpt-4.1" in your config file to override.
Why doesn't o3 or o4-mini work for me? It's possible that your API account needs to be verified in order to start streaming responses and seeing chain-of-thought summaries from the API. If you're still running into issues, please let us know!
Codex runs model-generated commands in a sandbox. If a proposed command or file change doesn't look right, you can simply type n to deny the command or give the model feedback.
Not directly. It requires Windows Subsystem for Linux (WSL2) - Codex has been tested on macOS and Linux with Node 22.
Codex CLI does support OpenAI organizations with Zero Data Retention (ZDR) enabled. If your OpenAI organization has Zero Data Retention enabled and you still encounter errors such as:
OpenAI rejected the request. Error details: Status: 400, Code: unsupported_parameter, Type: invalid_request_error, Message: 400 Previous response cannot be used for this organization due to Zero Data Retention.
Ensure you are running codex with --config disable_response_storage=true or add this line to ~/.codex/config.toml to avoid specifying the command line option each time:
disable_response_storage = true
See the configuration documentation on disable_response_storage for details.
We're excited to launch a $1 million initiative supporting open source projects that use Codex CLI and other OpenAI models.
Interested? Apply here.
This project is under active development and the code will likely change pretty significantly. We'll update this message once that's complete!
More broadly we welcome contributions - whether you are opening your very first pull request or you're a seasoned maintainer. At the same time we care about reliability and long-term maintainability, so the bar for merging code is intentionally high. The guidelines below spell out what "high-quality" means in practice and should make the whole process transparent and friendly.
Create a topic branch from main - e.g. feat/interactive-prompt.
Keep the docs in sync - update README sections, help output (codex --help), or relevant example projects.
Run the checks locally before pushing (cargo test && cargo clippy --tests && cargo fmt -- --config imports_granularity=Item). CI failures that could have been caught locally slow down the process.
Make sure your branch is up to date with main and that you have resolved merge conflicts.

If you run into problems setting up the project, would like feedback on an idea, or just want to say hi - please open a Discussion or jump into the relevant issue. We are happy to help.
Together we can make Codex CLI an incredible tool. Happy hacking! :rocket:
All contributors must accept the CLA. The process is lightweight:
Open your pull request.
Paste the following comment (or reply recheck if you've signed before):
I have read the CLA Document and I hereby sign the CLA
The CLA-Assistant bot records your signature in the repo and marks the status check as passed.
No special Git commands, email attachments, or commit footers required.
| Scenario | Command |
|---|---|
| Amend last commit | git commit --amend -s --no-edit && git push -f |
The DCO check blocks merges until every commit in the PR carries the footer (with squash this is just the one).
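To see what the footer looks like, here is a self-contained demo of git commit -s in a throwaway repository (the name, email, and file are illustrative):

```shell
# Create a throwaway repo and make one signed-off commit.
repo="$(mktemp -d)"
cd "$repo"
git init -q
git config user.name "Example Dev"
git config user.email "dev@example.com"
echo "hello" > README.txt
git add README.txt
git commit -q -s -m "docs: add README"
# The -s flag appended a "Signed-off-by: Example Dev <dev@example.com>" footer:
git log -1 --format=%B
```

The amend command in the table above does the same thing for a commit you have already made.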
Releasing codex

For admins only.
Make sure you are on main and have no local changes. Then run:
VERSION=0.2.0 # Can also be 0.2.0-alpha.1 or any valid Rust version.
./codex-rs/scripts/create_github_release.sh "$VERSION"
This will make a local commit on top of main with version set to $VERSION in codex-rs/Cargo.toml (note that on main, we leave the version as version = "0.0.0").
This will push the commit using the tag rust-v${VERSION}, which in turn kicks off the release workflow. This will create a new GitHub Release named $VERSION.
If everything looks good in the generated GitHub Release, uncheck the pre-release box so it is the latest release.
Create a PR to update Formula/c/codex.rb on Homebrew.
Have you discovered a vulnerability or have concerns about model output? Please e-mail security@openai.com and we will respond promptly.
This repository is licensed under the Apache-2.0 License.