Platform agnostic command and shell execution tool, also allows UAC/sudo privilege elevation
command_runner's purpose is to run external commands from Python, just like subprocess, on which it relies, while solving various problems a developer may face.
It is compatible with Python 2.7+, tested up to Python 3.12 (backports some newer functionality to Python 3.5), and is tested on both Linux and Windows. It is also compatible with the PyPy Python implementation. ...and yes, keeping Python 2.7 compatibility has proven to be quite challenging.
command_runner is a replacement package for subprocess.Popen and subprocess.check_output. Its main promise is to never let a command block, and to always return results.
It works as a wrapper for subprocess.Popen and subprocess.communicate that solves blocking and output-capture problems (see the live_output=True argument). command_runner also promises to properly kill commands when timeouts are reached, including any subprocesses spawned by such commands. This specific behavior is achieved via the psutil module, which is an optional dependency.
Install with pip install command_runner
The following example will work regardless of the host OS and the Python version.
from command_runner import command_runner
exit_code, output = command_runner('ping 127.0.0.1', timeout=10)
pip install command_runner
or download the latest git release
In order to keep the promise to always provide an exit_code, special exit codes have been added for cases where none is given. Those exit codes are:
This allows you to use the standard exit code logic, without having to deal with various exceptions.
command_runner has an encoding argument which defaults to utf-8 for Unixes and cp437 for Windows platforms.
Using cp437 ensures that most cmd.exe output is encoded properly, including accents and special characters, on most locale systems.
Still, you can specify your own encoding for other usages, like PowerShell, where unicode_escape is preferred.
from command_runner import command_runner
command = r'C:\Windows\sysnative\WindowsPowerShell\v1.0\powershell.exe --help'
exit_code, output = command_runner(command, encoding='unicode_escape')
Earlier subprocess.Popen implementations didn't have an encoding argument, so command_runner handles encoding for those.
You can also disable command_runner's internal encoding in order to get raw process output (bytes) by passing encoding=False.
Example:
from command_runner import command_runner
exit_code, raw_output = command_runner('ping 127.0.0.1', encoding=False)
Note: for live output capture and threading, see stream redirection. If you want to run your application while command_runner gives back command output, the best way to go is queues / callbacks.
command_runner can display command output on the fly to stdout, i.e. show output on screen during execution. This is helpful when the command is long and we need to know the output while execution is ongoing. It is also helpful in order to catch partial command output when a timeout is reached or a CTRL+C signal is received. Example:
from command_runner import command_runner
exit_code, output = command_runner('ping 127.0.0.1', shell=True, live_output=True)
Note: using live output relies on stdout pipe polling, which has slightly higher CPU usage.
command_runner has a timeout argument which defaults to 3600 seconds.
This default ensures commands will not block the main script's execution.
Feel free to lower or raise that setting with the timeout argument.
Note that on timeout, command_runner kills the whole process tree that the command may have generated, even under Windows.
from command_runner import command_runner
exit_code, output = command_runner('ping 127.0.0.1', timeout=30)
Using shell=True
will spawn a shell which will spawn the desired child process.
Be aware that under MS Windows, no direct process tree is available.
We fixed this by walking processes during runtime. The drawback is that orphaned processes cannot be identified this way.
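The runtime process walk described above can be sketched with psutil (the same optional dependency the package itself relies on); kill_children here is a hypothetical helper for illustration, not part of the command_runner API:

```python
import subprocess
import sys

import psutil  # optional dependency command_runner uses for process tree handling

def kill_children(pid):
    """Hypothetical helper: walk a process's children (recursively, so
    grandchildren are included) and kill them, mirroring the runtime
    process walk described above."""
    try:
        parent = psutil.Process(pid)
    except psutil.NoSuchProcess:
        return
    for child in parent.children(recursive=True):
        try:
            child.kill()
        except psutil.NoSuchProcess:
            pass  # child already exited between enumeration and kill

# Spawn a process that would normally outlive this script
proc = subprocess.Popen([sys.executable, '-c', 'import time; time.sleep(30)'])
kill_children(proc.pid)  # reap any children it may have spawned
proc.kill()              # then kill the parent itself
proc.wait()
```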
command_runner has its own logging system, which will log all sorts of error messages.
If you need to disable its logging, just run with the argument silent=True.
Be aware that logging.DEBUG log levels won't be silenced, by design.
Example:
from command_runner import command_runner
exit_code, output = command_runner('ping 127.0.0.1', silent=True)
If you also need to disable the logging.DEBUG level, you can run the following code, which will allow only logging.CRITICAL messages, which command_runner never emits:
import logging
import command_runner
logging.getLogger('command_runner').setLevel(logging.CRITICAL)
command_runner allows two different process output capture methods: method='monitor', which is the default, and method='poller', which is required for live output capture (e.g. live_output=True, queues and callbacks). Example:
from command_runner import command_runner
exit_code, output = command_runner('ping 127.0.0.1', method='poller')
exit_code, output = command_runner('ping 127.0.0.1', method='monitor')
command_runner allows redirecting streams directly into the subprocess it spawns.
Example code:
import sys
from command_runner import command_runner
exit_code, output = command_runner("gzip -d", stdin=sys.stdin.buffer)
print("Uncompressed data", output)
The above program, when run with echo "Hello, World!" | gzip | python myscript.py
will show the uncompressed string Hello, World!
You can use whatever file descriptor you want, basic ones being sys.stdin for text input and sys.stdin.buffer for binary input.
command_runner can redirect stdout and/or stderr streams to different outputs:
Unless an output redirector is given for the stderr argument, stderr will be redirected to the stdout stream.
Note that both queue and callback function redirectors require method='poller' and will fail otherwise.
Possible output redirection options are:
By default, stdout writes into a subprocess.PIPE which is read by command_runner and returned in the output variable.
You may also pass any other subprocess.PIPE int values to the stdout or stderr arguments.
If stdout=False and/or stderr=False arguments are given, command output will not be saved.
stdout/stderr streams will be redirected to /dev/null or NUL depending on the platform.
Output will always be None. See split_streams for more details on using multiple outputs.
When given a string as the stdout and/or stderr argument, command_runner will consider the string to be a file path where stream output will be written live.
Examples:
from command_runner import command_runner
exit_code, output = command_runner('dir', stdout=r"C:/tmp/command_result", stderr=r"C:/tmp/command_error", shell=True)
from command_runner import command_runner
exit_code, output = command_runner('dir', stdout='/tmp/stdout.log', stderr='/tmp/stderr.log', shell=True)
Opening a file with the wrong encoding (especially opening a CP437-encoded file on Windows as UTF-8) might end up with a UnicodeDecodeError.
Queue(s) will be filled up by command_runner.
In order to keep your program "live", we'll use the threaded version of command_runner, which is basically the same except that it returns a future result instead of a tuple.
Note: with all the best will, there's no good way to achieve this under Python 2.7 without using more queues, so the threaded version is only compatible with Python 3.3+.
For Python 2.7, you must create your thread and queue reader yourself (see the footnote for a Python 2.7 compatible example).
Threaded command_runner plus queue example:
import queue
from command_runner import command_runner_threaded

output_queue = queue.Queue()
stream_output = ""
thread_result = command_runner_threaded('ping 127.0.0.1', shell=True, method='poller', stdout=output_queue)

read_queue = True
while read_queue:
    try:
        line = output_queue.get(timeout=0.1)
    except queue.Empty:
        pass
    else:
        if line is None:
            read_queue = False
        else:
            stream_output += line
            # ADD YOUR LIVE CODE HERE

# Now we may get exit_code and output since the result has become available at this point
exit_code, output = thread_result.result()
You might also want to read both stdout and stderr queues. In that case, you can create a read loop just like in the following example. Here we're reading both queues in one loop, so we need to observe a couple of conditions before stopping the loop, in order to catch all queue output:
import queue
from time import sleep
from command_runner import command_runner_threaded

stdout_queue = queue.Queue()
stderr_queue = queue.Queue()
thread_result = command_runner_threaded('ping 127.0.0.1', method='poller', shell=True, stdout=stdout_queue, stderr=stderr_queue)

read_stdout = read_stderr = True
while read_stdout or read_stderr:
    try:
        stdout_line = stdout_queue.get(timeout=0.1)
    except queue.Empty:
        pass
    else:
        if stdout_line is None:
            read_stdout = False
        else:
            print('STDOUT:', stdout_line)
    try:
        stderr_line = stderr_queue.get(timeout=0.1)
    except queue.Empty:
        pass
    else:
        if stderr_line is None:
            read_stderr = False
        else:
            print('STDERR:', stderr_line)
    # ADD YOUR LIVE CODE HERE

exit_code, output = thread_result.result()
assert exit_code == 0, 'We did not succeed in running the thread'
The callback function will get one argument: a str of the current stream reading. It will be executed on every line that comes from the streams. Example:
from command_runner import command_runner

def callback_function(string):
    # ADD YOUR CODE HERE
    print('CALLBACK GOT:', string)

# Launch command_runner
exit_code, output = command_runner('ping 127.0.0.1', stdout=callback_function, method='poller')
In some situations, you may want a command to be aborted on some external trigger.
That's where the stop_on argument comes in handy.
Just pass a function to stop_on; as soon as the function's result becomes True, execution will halt with exit code -251.
Example:
from command_runner import command_runner

def some_function():
    # Return True when execution must stop
    # (we_must_stop_execution is a placeholder for your own condition)
    return we_must_stop_execution

exit_code, output = command_runner('ping 127.0.0.1', stop_on=some_function)
By default, command_runner checks timeouts and outputs every 0.05 seconds.
You can increase or decrease this interval via the check_interval argument, which accepts floats.
Example: command_runner(cmd, check_interval=0.2)
Note that lowering check_interval will increase CPU usage.
command_runner can provide the subprocess.Popen instance of the currently running process as external data.
In order to do so, just declare a function and pass it as the process_callback argument.
Example:
from command_runner import command_runner

def show_process_info(process):
    print('My process has pid: {}'.format(process.pid))

exit_code, output = command_runner('ping 127.0.0.1', process_callback=show_process_info)
By default, command_runner returns a tuple like (exit_code, output) in which output contains both stdout and stderr stream outputs.
You can alter that behavior by using the argument split_streams=True.
In that case, command_runner will return a tuple like (exit_code, stdout, stderr).
Example:
from command_runner import command_runner
exit_code, stdout, stderr = command_runner('ping 127.0.0.1', split_streams=True)
print('exit code:', exit_code)
print('stdout', stdout)
print('stderr', stderr)
command_runner allows executing a callback function once it has finished its execution.
This can help when building threaded programs where a callback is needed, for example to disable GUI elements.
Example:
from command_runner import command_runner

def do_something():
    print("We're done running")

exit_code, output = command_runner('ping 127.0.0.1', on_exit=do_something)
command_runner can set its subprocess priority to 'low', 'normal' or 'high', which translates to niceness values 15, 0 and -15 on Linux, and to BELOW_NORMAL_PRIORITY_CLASS and HIGH_PRIORITY_CLASS on Windows.
On Linux, you may also pass niceness int values directly to priority.
You may also set the subprocess io_priority to 'low', 'normal' or 'high'.
Example:
from command_runner import command_runner
exit_code, output = command_runner('some_intensive_process', priority='low', io_priority='high')
When running long commands, one might want to know that the program is still running.
The following example will log a message every hour stating that we're still running our command:
from command_runner import command_runner
exit_code, output = command_runner('/some/long/command', timeout=None, heartbeat=3600)
command_runner takes any argument that subprocess.Popen() would take, in addition to its own arguments described above.
Note that ALL other subprocess.Popen arguments are supported, since they are passed directly to subprocess.
The command_runner package also allows privilege elevation. Becoming an admin is fairly easy with command_runner.elevate. You only have to import the elevate module, and then launch your main function with the elevate function.
from command_runner.elevate import elevate

def main():
    """My main function that should be elevated"""
    print("Who's the administrator, now ?")

if __name__ == '__main__':
    elevate(main)
The elevate function handles arguments (positional and keyword arguments): elevate(main, arg, arg2, kw=somearg) will call main(arg, arg2, kw=somearg).
The elevate module has a nifty is_admin() function that returns a boolean according to your current root/administrator privileges. Usage:
from command_runner.elevate import is_admin
print('Am I an admin ? %s' % is_admin())
Initially designed for Windows UAC, command_runner.elevate can also elevate privileges on Linux, using the sudo command. This is mainly designed for PyInstaller / Nuitka executables, as it's really not safe to allow automatic privilege elevation of a Python interpreter.
Example for a binary in /usr/local/bin/my_compiled_python_binary: you'll have to allow this file to be run with sudo without a password prompt.
This can be achieved in the /etc/sudoers file.
Example for Redhat / Rocky Linux, where adding the following line will allow the elevation process to succeed without password:
someuser ALL= NOPASSWD:/usr/local/bin/my_compiled_python_binary
The following example is a Python 2.7 compatible threaded implementation that reads stdout / stderr queue in a thread. This only exists for compatibility reasons.
import queue
import threading
from command_runner import command_runner

def read_queue(output_queue):
    """
    Read the queue as a thread
    Our problem here is that the thread can live forever if we don't check a global value, which is... well, ugly
    """
    stream_output = ""
    read_queue = True
    while read_queue:
        try:
            line = output_queue.get(timeout=1)
        except queue.Empty:
            pass
        else:
            # The queue reading can be stopped once 'None' is received.
            if line is None:
                read_queue = False
            else:
                stream_output += line
                # ADD YOUR LIVE CODE HERE

# Create a new queue that command_runner will fill up
output_queue = queue.Queue()
# Create a thread of read_queue() in order to read the queue while command_runner executes the command
read_thread = threading.Thread(
    target=read_queue, args=(output_queue,)
)
read_thread.daemon = True  # thread dies with the program
read_thread.start()

# Launch command_runner, which will be blocking. Your live code goes directly into the threaded function
exit_code, output = command_runner('ping 127.0.0.1', stdout=output_queue, method='poller')