What is @lerna/exec?
@lerna/exec is part of the Lerna monorepo management toolset. It allows you to execute shell commands in the context of each package in a Lerna-managed monorepo, which is useful for running scripts, building, testing, or performing other tasks across multiple packages in a consistent manner.
What are @lerna/exec's main functionalities?
Execute Shell Commands
This feature allows you to run a shell command in each package managed by Lerna. For example, `lerna exec -- npm run build` will run the `npm run build` command in each package.
lerna exec -- <command>
Filter Packages
You can filter the packages on which to run the command using the `--scope` flag. For example, `lerna exec --scope my-package -- npm test` will run `npm test` only in the `my-package` package.
lerna exec --scope <package-name> -- <command>
Parallel Execution
This feature allows you to run commands in parallel across all packages. For example, `lerna exec --parallel -- npm install` will run `npm install` in all packages simultaneously.
lerna exec --parallel -- <command>
Other packages similar to @lerna/exec
npm-run-all
npm-run-all is a CLI tool for running multiple npm scripts in parallel or sequentially. It is not specifically designed for monorepos but can be used to run scripts across multiple packages by chaining commands.
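For example, assuming a package.json that defines lint and test scripts, a typical invocation might look like:
npm-run-all --parallel lint test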
concurrently
concurrently is a package that allows you to run multiple commands concurrently. It is useful for running multiple npm scripts at the same time, but it does not have the monorepo-specific features that @lerna/exec provides.
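For example, to run two hypothetical watch scripts at the same time:
concurrently "npm run watch-js" "npm run watch-css"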
nx
Nx is a smart, fast, and extensible build system with first-class monorepo support and powerful integrations. It offers more advanced features compared to @lerna/exec, such as task scheduling, caching, and more.
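For example, depending on the Nx version and workspace configuration, a build target can be run across all projects with something like:
npx nx run-many --target=build --all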
@lerna/exec
Execute an arbitrary command in each package
Install lerna for access to the lerna CLI.
Usage
$ lerna exec -- <command> [..args]
$ lerna exec -- rm -rf ./node_modules
$ lerna exec -- protractor conf.js
Run an arbitrary command in each package.
A double-dash (--) is necessary to pass dashed flags to the spawned command, but is not necessary when all the arguments are positional.
The name of the current package is available through the environment variable LERNA_PACKAGE_NAME:
$ lerna exec -- npm view \$LERNA_PACKAGE_NAME
You may also run a script located in the root directory, even within a complicated directory structure, through the environment variable LERNA_ROOT_PATH:
$ lerna exec -- node \$LERNA_ROOT_PATH/scripts/some-script.js
Options
lerna exec accepts all filter flags.
$ lerna exec --scope my-component -- ls -la
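Other filter flags work the same way; for example, to skip packages whose names match a hypothetical *-e2e glob:
$ lerna exec --ignore "*-e2e" -- npm test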
The commands are spawned in parallel, using the given concurrency (except with --parallel). The output is piped through, so it is not deterministic.
If you want to run the command in one package after another, use it like this:
$ lerna exec --concurrency 1 -- ls -la
--stream
Stream output from child processes immediately, prefixed with the originating package name. This allows output from different packages to be interleaved.
$ lerna exec --stream -- babel src -d lib
--parallel
Similar to --stream, but completely disregards concurrency and topological sorting, running a given command or script immediately in all matching packages with prefixed streaming output. This is the preferred flag for long-running processes such as babel src -d lib -w run over many packages.
$ lerna exec --parallel -- babel src -d lib -w
Note: It is advised to constrain the scope of this command when using the --parallel flag, as spawning dozens of subprocesses may be harmful to your shell's equanimity (or maximum file descriptor limit, for example). YMMV.
--no-bail
$ lerna exec --no-bail <command>
By default, lerna exec will exit with an error if any execution returns a non-zero exit code. Pass --no-bail to disable this behavior, executing in all packages regardless of exit code.
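For example, to run each package's test script and see every failure instead of stopping at the first non-zero exit code:
$ lerna exec --no-bail -- npm test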
--no-prefix
Disable package name prefixing when output is streaming (--stream or --parallel). This option can be useful when piping results to other processes, such as editor plugins.
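For example, to print each package's version as plain, unprefixed lines that can be piped to other tools:
$ lerna exec --stream --no-prefix -- node -p "require('./package.json').version"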
--profile
Profiles the command executions and produces a performance profile which can be analyzed using DevTools in a Chromium-based browser (direct url: devtools://devtools/bundled/devtools_app.html). The profile shows a timeline of the command executions where each execution is assigned to an open slot. The number of slots is determined by the --concurrency option, and the number of open slots is determined by --concurrency minus the number of ongoing operations. The end result is a visualization of the parallel execution of your commands.
The default location of the performance profile output is at the root of your project.
$ lerna exec --profile -- <command>
Note: Lerna will only profile when topological sorting is enabled (i.e. without --parallel and --no-sort).
--profile-location <location>
You can provide a custom location for the performance profile output. The path provided will be resolved relative to the current working directory.
$ lerna exec --profile --profile-location=logs/profile/ -- <command>