Yet another makefile tool. This one is meant to be super fast, super easy to use, and cross-platform while leaving full power to the user at any time.
PowerMake is a utility that compiles C/C++/AS/ASM code, just like Make, Ninja, CMake, or xmake.
Its goal is to give full power to the user while being cross-platform and easier to use than Make.
PowerMake extends what is possible to do during the compilation by providing a lot of functions to manipulate your files and a lot of freedom in the way you implement your makefile.
PowerMake is entirely configurable, but for every behavior you haven't explicitly defined, PowerMake will do most of the job for you by detecting installed toolchains, translating compiler flags, etc.
Not by default. PowerMake does not build on top of make; it replaces make.
Since PowerMake 1.20.0, PowerMake is able to generate a GNU Makefile, but this will never be as powerful as PowerMake itself; this feature is only there if you really need a GNU Makefile for compatibility with old deployment/test scripts.
PowerMake was specifically designed for complex projects that have very complicated compilation steps, with a lot of pre-built tasks that need to be compiled on multiple operating systems with different options.
However, today, even for a 5-file project on Linux with GCC, PowerMake is more convenient than make: out of the box it provides a debug/release mode with different options and different build directories, it can generate a compile_commands.json file for better integration with VSCode, it detects better than make when files need to be recompiled, it provides default rebuild/clean/install options without any configuration, etc.
- Cross-platform
- Gives you complete control of what you are doing; nothing is hidden and any behavior can be overwritten
- Provides good automatic configurations
- Extremely fast
PowerMake is very young, so it changes a lot with each version and you may have to write some features by yourself (the whole point of PowerMake is that you can write missing features). In theory, backward compatibility is kept between versions, but this might not be true if you are using very specific features, especially undocumented ones.
Because PowerMake gives you full control, the tool can't know what you are doing during the compilation step. For example, if we want to import dependencies from another PowerMake, the only thing we can do for you is run that PowerMake where it stands and scan its output directory. This works well but has some limitations... Another example of this problem is that PowerMake can't know how many steps will be done during the compilation, so if you want PowerMake to print the percentage of the compilation completed, you have to manually specify the number of steps PowerMake will do.
All other Make-like utilities that I know of parse a file to extract directives from it.
PowerMake does the opposite. You write a Python script, you do whatever you like in this script, and you call PowerMake functions to help you compile your code.
This gives you complete control; you can retrieve files from the web, read and write files, and even train a neural network if you want, and at any time you can use PowerMake functions to help you in your compilation journey.
This also means that the philosophy is completely different from a tool like make.
When you read a GNU Makefile, you start at the end (the final target) and you read each target's dependencies recursively before executing the task.
In a GNU Makefile, the order of the targets doesn't reflect the order of execution at all.
A PowerMake is read as a script: the order of the instructions is of capital importance. The first step is read; if it needs to be executed, it's executed; then the next step is read, and so on.
[!WARNING]
In this documentation, the command `python` refers to Python >= 3.7.
On old systems, `python` and `pip` can refer to Python 2.7; in this case, use `python3` and `pip3`.
pip install -U powermake
Don't hesitate to run this command regularly to benefit from new features and bug fixes.
A version installed from sources might be untested and might not work at all.
# USE AT YOUR OWN RISK
pip install -U build twine
git clone https://github.com/mactul/powermake
cd powermake
sed -i "s/{{VERSION_PLACEHOLDER}}/0.0.0/g" pyproject.toml
rm -rf ./dist/
python -m build
pip install -U dist/powermake-*-py3-none-any.whl --force-reinstall
This example compiles all `.c` and `.cpp` files that are recursively in the same folder as the Python script and generates an executable named `program_test`.
[!WARNING]
PowerMake calculates all paths from its location, not from the location where it is run.
For example, `python ./folder/makefile.py` will do the same as `cd ./folder && python ./makefile.py`.
[!NOTE]
In this documentation, we often assume that your makefile is named `makefile.py`; it makes things easier to explain. Of course, you can give your makefile whatever name you like.
```python
import powermake

def on_build(config: powermake.Config):
    files = powermake.get_files("**/*.c", "**/*.cpp")
    objects = powermake.compile_files(config, files)
    print(powermake.link_files(config, objects))

powermake.run("program_test", build_callback=on_build)
```
[!NOTE]
This documentation is not complete; if you struggle to do something, do not hesitate to ask a question in the discussions section, as the feature you are looking for may simply be undocumented.
To benefit from the command line parser, you have to use the powermake.run function.
If no arguments are passed through the command line, the default behavior is to trigger the build callback.
You can also write `python makefile.py build`, `python makefile.py clean`, `python makefile.py install [install_location]`, or `python makefile.py test` to trigger one of the four different callbacks.
There is also the `python makefile.py config` command, which doesn't trigger a callback but enters an interactive mode for editing a configuration file.
Alternatively, you can also use the options `-b` or `--build`, `-c` or `--clean`, `-i` or `--install`, `-t` or `--test`, and `-f` or `--config`.
This alternative has a great advantage: you can combine multiple tasks. For example, running `python makefile.py -btci` will first trigger the clean callback, then the build callback, then the install callback, and finally the test callback.
[!IMPORTANT]
The order will always be config -> clean -> build -> install -> test.
You can also replace the `-b` argument with `-r` (using `-br` does the same as `-r`); this forces the makefile to recompile everything, without trying to figure out which files need to be recompiled.
There are many more options you can add, such as `-d` (`--debug`), `-q` (`--quiet`), `-v` (`--verbose`), etc.
All these options can be listed by running `python makefile.py -h`, or, if you haven't created a makefile yet, by directly calling the module with `python -m powermake -h` or just `powermake -h` if the pip installation is in your PATH.
[!IMPORTANT]
While `python makefile.py install` and `python makefile.py --install` take the `install_location` as an optional argument, this argument has been disabled for the `-i` option, because writing `-bic` would have triggered the install callback with the location `c`.
PowerMake infers the various toolchain programs to use from everything it knows.
Most of the time, just setting up the C compiler configuration (or the C++ compiler, or the linker, etc.) will be sufficient for PowerMake to determine the whole toolchain.
[!NOTE]
Only the unspecified fields are inferred; the fields explicitly assigned in the JSON configuration (see powermake.Config) are left unchanged. The only exception is when the CC, CXX, or LD env variables are specified (see below).
The environment variables CC, CXX, and LD can be used to override the C compiler, the C++ compiler, and the linker path.
For example, the command below will compile using the afl-* toolchain. The C++ compiler and the linker are inferred from the C compiler (but only if they are not specified in the json configuration).
CC=afl-gcc python makefile.py -rvd
This is especially useful to quickly compile with a different toolchain. For example, if you want, exceptionally, to compile an executable for Windows from Linux:
CC=x86_64-w64-mingw32-gcc python makefile.py -rvd
When possible, PowerMake tries to translate C/C++/AS/ASM/LD flags.
Most flags are unknown to PowerMake; in this case they are simply passed through to the compiler, and it's your job to ensure compatibility with the different targets through the use of if/else blocks.
However, the most common flags are automatically translated by PowerMake.
There are also some flags defined by PowerMake that don't exist in any compiler; these are sets of flags that are useful together in a given situation.
Here is the list of flags translated by PowerMake:
[!NOTE]
Only compiler flags are listed; they are also translated for the linker, but most of the time this just ends up removing the flag.
PowerMake Flag | Description |
---|---|
-w | Inhibit all warning messages. |
-Werror | Make all warnings into errors. |
-Wall | Activate warnings about all constructs that are unlikely to be intended. |
-Wextra | Enable flags that are likely to catch bugs even though they may warn on perfectly valid code. |
-Wpedantic | Warn when the code isn't ISO (typically when C/C++ extension are used). |
-pedantic | Same as -Wpedantic. |
-Wswitch | Warn when a switch on an enum lacks a case (enabled by -Wall) |
-Wswitch-enum | Like -Wswitch but warns even if there is a default case. |
-fanalyzer | When supported by the compiler, run the code through a static analyzer to detect some bugs. |
-Weverything | Enable as many warnings as possible, even the noisy and irrelevant ones. |
-Wsecurity | Enable all warnings that have even a small chance of catching a security issue. |
-O0 | Disable all optimizations. |
-Og | Enable optimizations that don't interfere with the debugger. This is better than -O0 for debugging because some warnings and analysis require some optimization. |
-O1 | Enable optimization but try to mitigate compile time. |
-O | Same as -O1. |
-O2 | Performs nearly all supported optimizations. |
-O3 | Optimize aggressively for speed. |
-Ofast | Enable all -O3 optimizations plus some optimizations that can break the program. |
-Os | Optimize for size. |
-Oz | Optimize aggressively for size rather than speed. |
-fomit-frame-pointer | Omit the frame pointer in functions that don’t need one. |
-m32 | If supported, switch to x86 architecture (you should prefer using config.set_target_architecture). |
-m64 | If supported, switch to x64 architecture (you should prefer using config.set_target_architecture). |
-march=native | Generate a program optimized for CPUs that have the same capabilities as the host. |
-mtune=native | Optimize a program for the specific CPU of the host, even if this program will run slower or not at all on any other machine. |
-mmx | Enable mmx vectorization. |
-msse | Enable sse vectorization. |
-msse2 | Enable sse2 vectorization. |
-msse3 | Enable sse3 vectorization. |
-mavx | Enable avx vectorization. |
-mavx2 | Enable avx2 vectorization. |
-g | Compile with debug symbols. |
-fPIC | Position Independent Code, required when compiling objects that will be bundled in a shared library. |
-fsecurity=1 | Enable all flags that can mitigate security issues with negligible impact on performance. Warnings included. |
-fsecurity=2 | Enable all flags that can mitigate security issues. |
-fsecurity | Same as -fsecurity=2. |
-ffuzzer | Enable the address sanitizer and the fuzzer. |
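As a sketch of how these translated flags are typically combined in a makefile: the flag names below come from the table above, while the target name and file pattern are arbitrary illustrative choices.

```python
import powermake

def on_build(config: powermake.Config):
    # These flags come from the translation table; PowerMake maps them to the
    # equivalent options of whichever compiler is actually used.
    config.add_c_cpp_flags("-Wall", "-Wextra", "-fanalyzer")
    if config.debug:
        config.add_c_cpp_flags("-Og", "-fsecurity")
    else:
        config.add_c_cpp_flags("-O3")

    files = powermake.get_files("**/*.c")
    objects = powermake.compile_files(config, files)
    powermake.link_files(config, objects)

powermake.run("flags_demo", build_callback=on_build)
```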
powermake.run(target_name: str, *, build_callback: callable, clean_callback: callable = default_on_clean, install_callback: callable = default_on_install, test_callback: callable = default_on_test, args_parsed: argparse.Namespace = None)
It's the entry point of most programs.
This function parses the command line and generates a powermake.Config object, containing all the information required for the compilation, from the compiler path to the level of verbosity to use.
Then, depending on the command line arguments, this function will call the clean callback, the build callback, the install callback, or all of them.
The `target_name` is a string that will be stored in the config and used for auto-naming. You should set this to the name of your executable or the name of your library.
The `build_callback` and the `clean_callback` take only one argument: the powermake.Config object generated.
Example:
```python
import powermake

def on_build(config: powermake.Config):
    print("The build callback was called !")
    print(f"Compiling the project {config.target_name}...")

def on_clean(config: powermake.Config):
    print("The clean callback was called !")
    print(f"Erasing the project {config.target_name}...")

powermake.run("my_project", build_callback=on_build, clean_callback=on_clean)
```
The `install_callback` takes 2 arguments: the powermake.Config object and a string `location` that can be `None` if the user hasn't specified anything on the command line.
[!TIP]
It's often a very good idea to use the `install_callback` as a "pre-install script" and then call `powermake.default_on_install`.
Example:
```python
import powermake

def on_build(config: powermake.Config):
    print("The build callback was called !")
    print(f"Compiling the lib {config.target_name}...")

def on_install(config: powermake.Config, location: str):
    if location is None:
        # No location is explicitly provided so we change the default for our convenience.
        location = "/usr/local/"

    # This ensures that the file "my_lib.h" will be exported into /usr/local/include/my_lib/my_lib.h
    # The .so or .a that corresponds will be copied into /usr/local/lib/my_lib.so
    config.add_exported_headers("my_lib.h", subfolder="my_lib")

    powermake.default_on_install(config, location)

powermake.run("my_lib", build_callback=on_build, install_callback=on_install)
```
The `args_parsed` argument should be left as None in most cases; to understand its purpose, see powermake.ArgumentParser.parse_args.
This is the most important object of this library.
It contains everything you need for your compilation journey. For example, it stores the C compiler alongside the path to the build folder.
Most of the time, this object is created by powermake.run and you don't need to worry about the constructor of this object (which is a bit messy...).
But one thing you have to know is that the construction of this object involves 4 steps:
- The command line and the environment variables (CC, CXX, LD, etc.) are parsed; this is notably what makes PowerMake compatible with the `scan-build` utility.
- The local configuration file `./powermake_config.json`, just next to the `makefile.py` (or whatever name your makefile has), is loaded.
- The global configuration file `~/.powermake/powermake_config.json` is loaded. (If you create an env variable named `POWERMAKE_CONFIG`, you can override this location.)
- Everything that is still undefined is auto-detected.

In theory, after the end of these 4 steps, all members of the `powermake.Config` object should be set.
In rare cases, if PowerMake was unable to detect a default compiler, the `c_compiler`, `cpp_compiler`, `archiver`, and `linker` members can be None.
In this situation, it's your responsibility to give them a value before the call to the powermake.compile_files function.
If you haven't done so yet, we recommend you try compiling your code without setting any `powermake_config.json`. In most cases, the automatic detection of your environment does a good job of finding your compiler/system/etc.
We provide a tool to interactively set up your configuration file; you can use it either by running `python -m powermake` or `python makefile.py config`. This tool cannot configure everything, so we provide here an example of a `powermake_config.json`.
Here, everything is set, but you should only set the bare minimum; in particular, you shouldn't set "host_architecture", it's much better to let the script detect it.
Please note that this example is incoherent, but it shows as many options as possible.
{
"nb_jobs": 8,
"compile_commands_dir": ".vscode/",
"host_operating_system": "Linux",
"target_operating_system": "Windows",
"host_architecture": "x64",
"target_architecture": "x86",
"c_compiler": {
"type": "gcc",
"path": "/usr/bin/gcc"
},
"cpp_compiler": {
"type": "clang++"
},
"as_compiler": {
"type": "gcc",
"path": "/usr/bin/gcc"
},
"asm_compiler": {
"type": "nasm",
"path": "/usr/bin/nasm"
},
"archiver": {
"type": "ar",
"path": "/usr/bin/ar"
},
"linker": {
"type": "gnu",
"path": "/usr/bin/cc"
},
"shared_linker": {
"type": "g++",
"path": "/usr/bin/g++"
},
"obj_build_directory": "./build/objects/",
"lib_build_directory": "./build/lib/",
"exe_build_directory": "./build/bin/",
"defines": ["WIN32", "DEBUG"],
"additional_includedirs": ["/usr/local/include", "../my_lib/"],
"shared_libs": ["mariadb", "ssl", "crypto"],
"c_flags": ["-fanalyzer", "-O3"],
"cpp_flags": ["-g", "-O0"],
"c_cpp_flags": ["-Wswitch"],
"as_flags": [],
"asm_flags": ["-s"],
"c_cpp_as_asm_flags": ["-Wall", "-Wextra"],
"ar_flags": [],
"ld_flags": ["-static", "-no-pie"],
"shared_linker_flags": ["-fPIE"],
"exported_headers": ["my_lib.h", ["my_lib_linux.h", "my_lib/linux"], ["my_lib/windows.h", "my_lib/windows"]]
}
All fields that can be set in the `powermake_config.json` have the same name in the `powermake.Config` object, so we have grouped them below.
Most of the `powermake.Config` members can be set in the JSON configuration, but there are 4 exceptions: `config.debug`, `config.rebuild`, `config.nb_total_operations`, and `config.target_name`.
config.debug: bool
This member is `True` if the makefile is run in debug mode (with the flag -d or the flag --debug).
Changing this at runtime will not do anything useful; please use config.set_debug instead.
[!IMPORTANT]
This member is one of the 4 exceptions that can't be set in the json configuration.
config.rebuild: bool
This member is `True` if the makefile is run in rebuild mode (with the flag -r or the flag --rebuild).
If you change its value at runtime, the following steps will change their behavior.
[!IMPORTANT]
This member is one of the 4 exceptions that can't be set in the json configuration.
config.nb_total_operations: int
This member is always 0 after the creation of a new powermake.Config object. If you set it to a value > 0 and PowerMake is not in quiet mode, it will display, at each step, the percentage of the compilation completed, calculated from the value of this member.
For example, if you compile `len(files)` files and then link all these files, you should set:
config.nb_total_operations = len(files) + 1
[!IMPORTANT]
This member is one of the 4 exceptions that can't be set in the json configuration.
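A minimal sketch of how this member is typically set inside a build callback (the target name and file pattern are arbitrary):

```python
import powermake

def on_build(config: powermake.Config):
    files = powermake.get_files("**/*.c")
    # One operation per compiled file, plus one for the final link,
    # so PowerMake can print a meaningful completion percentage.
    config.nb_total_operations = len(files) + 1

    objects = powermake.compile_files(config, files)
    powermake.link_files(config, objects)

powermake.run("progress_demo", build_callback=on_build)
```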
config.target_name: str
The name registered by powermake.run. It's used to determine the default name of executables and libraries.
[!IMPORTANT]
This member is one of the 4 exceptions that can't be set in the json configuration.
config.nb_jobs: int
This number determines across how many threads the compilation should be parallelized.
If unset or set to zero, this number is chosen according to the number of CPU cores you have.
Set this to 1 to disable multithreading.
This can be overridden from the command line.
config.compile_commands_dir: str | None
If this is set, powermake.compile_files will generate a `compile_commands.json` in the directory specified by this parameter.
config.host_operating_system: str
A string representing the name of your operating system.
[!TIP]
It's not recommended to set this in the json file, the autodetection should do a better job.
config.target_operating_system: str
A string representing the name of the operating system the executable is built for.
It's used to determine the subfolder of the build folder and for the functions `target_is_linux`, `target_is_windows`, etc.
On Linux, if you set this to "Windows" (or anything that starts with "win"), it will enable MinGW as the default toolchain.
[!WARNING]
Note that if you change this value in the script after the config is loaded, obj_build_directory, lib_build_directory and exe_build_directory will not be updated.
config.host_architecture: str
A string representing the architecture of your system, which can be "amd64", "x64", "x86", "i386", etc.
If you need an easier string to work with, use `config.host_simplified_architecture`, which can only be "x86", "x64", "arm32", or "arm64".
[!TIP]
It's not recommended to set this in the json file, the autodetection should do a better job.
config.target_architecture: str
A string representing the architecture of the executable, which can be "amd64", "x64", "x86", "i386", etc.
If you need an easier string to work with, use `config.target_simplified_architecture`, which can only be "x86", "x64", "arm32", or "arm64".
It's used to determine the subfolder of the build folder and to set the compiler architecture (if possible).
[!WARNING]
Note that if you change this value in the script after the config is loaded, the environment will not be reloaded and the compiler will keep the previous architecture, use config.set_target_architecture to reload the environment.
config.c_compiler: powermake.compilers.Compiler
This one is different in the json config and the loaded config.
In the json config, it's defined as an object with 2 fields, like this:
"c_compiler": {
"type": "gcc",
"path": "/usr/bin/gcc"
},
If the "path" field is omitted, the compiler corresponding to the type is searched in the PATH. For example, if "type" is "msvc", the compiler "cl.exe" is searched in the PATH.
If the "type" field is omitted, its value is determined based on the name of the executable and the rest of the toolchain.
The "type" field can have the value "gnu", "gcc", "clang", "mingw", "msvc", or "clang-cl". For example, for MinGW on Windows, which uses the gcc syntax:
"c_compiler": {
    "type": "gcc",
    "path": "C:\\msys64\\ucrt64\\bin\\gcc.exe"
}
[!NOTE]
For MinGW on Windows, you should simply set `C:\msys64\ucrt64\bin` in your PATH and PowerMake will be able to find it automatically.

The "path" field indicates where the executable of the compiler is. Note that PATH searching is always applied, so "gcc" works as well as "/usr/bin/gcc":
"c_compiler": {
    "type": "gcc",
    "path": "i386-elf-gcc"
}
When the `powermake.Config` object is loaded, the `c_compiler` member is no longer a dict; it's a virtual class that inherits from `powermake.compilers.Compiler` and which can generate compile commands. see [documentation is coming]
config.cpp_compiler: powermake.compilers.Compiler
The cpp_compiler behaves exactly like the c_compiler but the possible types are:
- gnu++
- g++
- clang++
- msvc
- mingw++

You can also use one of the c_compiler types, but in this case you must add a path, or the compilers will not be C++ compilers.
config.as_compiler: powermake.compilers.Compiler
This compiler is used to compile GNU Assembly (.s and .S files).
The as_compiler behaves exactly like the c_compiler but the possible types are:
- gnu
- gcc
- clang
- mingw

You can also use one of the asm_compiler types if you have to compile a .s or .S file with nasm or something like that.
config.asm_compiler: powermake.compilers.Compiler
This compiler is used to compile .asm files (generally Intel asm).
The asm_compiler behaves exactly like the c_compiler but the only type currently supported is:
- nasm

You can also use one of the as_compiler types if you have to compile a .asm file with a GNU assembler.
config.archiver: powermake.archivers.Archiver
The archiver is the program used to create a static library.
The configuration in the JSON behaves exactly like the c_compiler but the possible types are:
- gnu
- ar
- llvm-ar
- msvc
- mingw

Once loaded, `config.archiver` is a virtual class that inherits from `powermake.archivers.Archiver`.
config.linker: powermake.linkers.Linker
The configuration in the JSON behaves exactly like the c_compiler but the possible types are:
- gnu
- gnu++
- gcc
- g++
- clang
- clang++
- ld
- msvc
- mingw
- mingw++

Once loaded, `config.linker` is a virtual class that inherits from `powermake.linkers.Linker`.
config.shared_linker: powermake.shared_linkers.SharedLinker
The configuration in the json behaves exactly like config.linker but is used to link shared libraries.
Once loaded, `config.shared_linker` is a virtual class that inherits from `powermake.shared_linkers.SharedLinker`.
config.obj_build_directory: str
This is the directory in which all .o (or equivalent) will be stored.
[!TIP]
It's not recommended to set this in the json file; the automatic path generation should do a better job, ensuring that debug/release, Windows/Linux, or x86/x64 builds don't conflict with each other.
config.lib_build_directory: str
This is the directory in which all .a, .so, .lib, .dll, etc... will be stored.
[!TIP]
It's not recommended to set this in the json file; the automatic path generation should do a better job, ensuring that debug/release, Windows/Linux, or x86/x64 builds don't conflict with each other.
config.exe_build_directory: str
This is the directory in which the linked executable will be stored.
[!TIP]
It's not recommended to set this in the json file; the automatic path generation should do a better job, ensuring that debug/release, Windows/Linux, or x86/x64 builds don't conflict with each other.
config.defines: list[str]
This is a list of some defines that will be used during the compilation process.
[!TIP]
It's not recommended to set this in the json file; it makes much more sense to add these defines directly in the makefile with config.add_defines, if needed inside a conditional statement like `if config.target_is_windows():`
config.additional_includedirs: list[str]
This is a list of additional includedirs that will be used during the compilation process.
[!TIP]
It's not recommended to set this in the json file; it makes much more sense to add these includedirs directly in the makefile with config.add_includedirs, if needed inside a conditional statement like `if config.target_is_windows():`
config.shared_libs: list[str]
This is a list of shared libraries that will be used for the link.
[!TIP]
It's not recommended to set this in the json file; it makes much more sense to add these shared libs directly in the makefile with config.add_shared_libs, if needed inside a conditional statement like `if config.target_is_windows():` (see the sketch below).
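Here is a sketch of that recommendation. The defines, libraries, and include directory below (WIN32_LEAN_AND_MEAN, ws2_32, _GNU_SOURCE, pthread, ./include) are purely illustrative and depend on your project.

```python
import powermake

def on_build(config: powermake.Config):
    # Platform-specific settings belong in the makefile, not in the json config.
    if config.target_is_windows():
        config.add_defines("WIN32_LEAN_AND_MEAN")
        config.add_shared_libs("ws2_32")      # hypothetical Windows-only dependency
    else:
        config.add_defines("_GNU_SOURCE")
        config.add_shared_libs("pthread")     # hypothetical POSIX-only dependency
    config.add_includedirs("./include")       # assumed project layout

    files = powermake.get_files("**/*.c")
    objects = powermake.compile_files(config, files)
    powermake.link_files(config, objects)

powermake.run("platform_demo", build_callback=on_build)
```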
config.c_flags: list[str]
A list of flags that will be passed to the C compiler (not the C++ compiler).
If in the powermake known flags list, these flags are translated for the specific compiler.
If not, they are simply passed to the compiler.
[!TIP]
It's not recommended to set this in the json file; it makes much more sense to add these flags directly in the makefile with config.add_c_flags, if needed inside a conditional statement like `if config.target_is_windows():`
config.cpp_flags: list[str]
A list of flags that will be passed to the C++ compiler (not the C compiler).
If in the powermake known flags list, these flags are translated for the specific compiler.
If not, they are simply passed to the compiler.
[!TIP]
It's not recommended to set this in the json file; it makes much more sense to add these flags directly in the makefile with config.add_cpp_flags, if needed inside a conditional statement like `if config.target_is_windows():`
config.c_cpp_flags: list[str]
A list of flags that will be passed to the C compiler AND the C++ compiler.
If in the powermake known flags list, these flags are translated for the specific compiler.
If not, they are simply passed to the compiler.
In the `powermake.Config` object, this list doesn't correspond to a real list; it's just a property. You can read the value of `config.c_cpp_flags` (it's simply the concatenation of `c_flags` and `cpp_flags`), but you can't edit this property; you have to use config.add_c_cpp_flags and config.remove_c_cpp_flags.
[!TIP]
It's not recommended to set this in the json file; it makes much more sense to add these flags directly in the makefile with config.add_c_cpp_flags, if needed inside a conditional statement like `if config.target_is_windows():`
config.as_flags: list[str]
A list of flags that will be passed to the GNU assembly compiler.
If in the powermake known flags list, these flags are translated for the specific compiler.
If not, they are simply passed to the compiler.
[!TIP]
It's not recommended to set this in the json file; it makes much more sense to add these flags directly in the makefile with config.add_as_flags, if needed inside a conditional statement like `if config.target_is_windows():`
config.asm_flags: list[str]
A list of flags that will be passed to the Intel ASM compiler.
If in the powermake known flags list, these flags are translated for the specific compiler.
If not, they are simply passed to the compiler.
[!TIP]
It's not recommended to set this in the json file; it makes much more sense to add these flags directly in the makefile with config.add_asm_flags, if needed inside a conditional statement like `if config.target_is_windows():`
config.c_cpp_as_asm_flags: list[str]
A list of flags that will be passed to the C AND the C++ compiler AND the AS compiler AND the ASM compiler.
This behaves exactly like config.c_cpp_flags, with the same limitations.
config.ar_flags: list[str]
A list of flags that will be passed to the archiver.
If in the powermake known flags list, these flags are translated for the specific archiver.
If not, they are simply passed to the archiver.
[!TIP]
It's not recommended to set this in the json file; it makes much more sense to add these flags directly in the makefile with config.add_ar_flags, if needed inside a conditional statement like `if config.target_is_windows():`
config.ld_flags: list[str]
A list of flags that will be passed to the linker.
If in the powermake known flags list, these flags are translated for the specific linker.
If not, they are simply passed to the linker.
[!TIP]
It's not recommended to set this in the json file; it makes much more sense to add these flags directly in the makefile with config.add_ld_flags, if needed inside a conditional statement like `if config.target_is_windows():`
config.shared_linker_flags: list[str]
A list of flags that will be passed to the linker when linking a shared library.
This behaves exactly like config.ld_flags, with the same limitations.
config.exported_headers: list[str | tuple[str, str | None]]
This is a list of .h and .hpp files that need to be exported into an `include` folder during the installation process.
This list can directly contain strings; in this case, the file is exported at the root of the `include` folder.
This list can also contain 2-element lists: the first element is the file to export and the second element is the subfolder of the `include` folder in which the file should be exported.
[!TIP]
It's not recommended to set this in the json file; it makes much more sense to add these headers directly in the makefile with config.add_exported_headers, if needed inside a conditional statement like `if config.target_is_windows():`
These are all the methods you can call from the `powermake.Config` object.
You can access all members to read them, but you should use these methods if possible to set them, to ensure that everything stays coherent.
config.set_debug(debug: bool = True, reset_optimization: bool = False)
If `debug` is True, set everything to be in debug mode: it replaces the `NDEBUG` define with `DEBUG`, adds the `-g` flag, and, if possible, modifies the output dir to change from a release folder to a debug folder.
If `debug` is False, set everything to be in release mode (does the opposite of what's explained above).
If `reset_optimization` is set to True, then `debug=True` will set the optimization to `-O0` and `debug=False` will set the optimization to `-O3`.
[!NOTE]
If possible you should prefer using the command-line instead of this function.
config.set_optimization(opt_flag: str)
Remove all optimization flags set and add the opt_flag
config.set_target_architecture(architecture: str) -> None:
Reset the target architecture to the one specified.
This will reload the compilers so they produce code for the right architecture.
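A small sketch combining these setters inside a build callback; the architecture, flags, and file pattern are arbitrary choices for illustration.

```python
import powermake

def on_build(config: powermake.Config):
    # Force a 32-bit debug build regardless of the command line.
    # set_target_architecture reloads the toolchain for the new architecture.
    config.set_target_architecture("x86")
    config.set_debug(True, reset_optimization=True)
    # Override the optimization level chosen above.
    config.set_optimization("-Og")

    files = powermake.get_files("**/*.c")
    objects = powermake.compile_files(config, files)
    powermake.link_files(config, objects)

powermake.run("arch_demo", build_callback=on_build)
```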
config.target_is_windows()
Returns True
if the target operating system is Windows.
This uses the config.target_operating_system member.
config.target_is_linux()
Returns True
if the target operating system is Linux.
This uses the config.target_operating_system member.
config.target_is_mingw()
Returns True
if the target operating system is MinGW
This uses the config.target_operating_system member and the config.c_compiler member.
config.add_defines(*defines: str)
Add new defines to config.defines if they do not exist.
This method is variadic so you can put as many defines as you want.
The list order is preserved.
config.remove_defines(*defines: str)
Remove defines from config.defines if they exist.
This method is variadic so you can put as many defines as you want.
config.add_shared_libs(*shared_libs: str)
Add shared libraries to config.shared_libs if they do not exist.
This method is variadic so you can put as many libs as you want.
The list order is preserved.
config.remove_shared_libs(*shared_libs: str)
Remove shared libraries from config.shared_libs if they exist.
This method is variadic so you can put as many libs as you want.
config.add_includedirs(*includedirs: str)
Add additional includedirs to config.additional_includedirs if they do not exist.
This method is variadic so you can put as many includedirs as you want.
The list order is preserved.
config.remove_includedirs(*includedirs: str)
Remove additional includedirs from config.additional_includedirs if they exist.
This method is variadic so you can put as many includedirs as you want.
The list order is preserved.
config.add_c_flags(*c_flags: str)
Add flags to config.c_flags if they do not exist.
This method is variadic so you can put as many flags as you want.
The list order is preserved.
config.remove_c_flags(*c_flags: str)
Remove flags from config.c_flags if they exist.
This method is variadic so you can put as many flags as you want.
The list order is preserved.
config.add_cpp_flags(*cpp_flags: str)
Add flags to config.cpp_flags if they do not exist.
This method is variadic so you can put as many flags as you want.
The list order is preserved.
config.remove_cpp_flags(*cpp_flags: str)
Remove flags from config.cpp_flags if they exist.
This method is variadic so you can put as many flags as you want.
The list order is preserved.
config.add_c_cpp_flags(*c_cpp_flags: str)
Add flags to config.c_cpp_flags if they do not exist.
This method is variadic so you can put as many flags as you want.
The list order is preserved.
config.remove_c_cpp_flags(*c_cpp_flags: str)
Remove flags from config.c_cpp_flags if they exist.
This method is variadic so you can put as many flags as you want.
The list order is preserved.
config.add_as_flags(*as_flags: str)
Add flags to config.as_flags if they do not exist.
This method is variadic so you can put as many flags as you want.
The list order is preserved.
config.remove_as_flags(*as_flags: str)
Remove flags from config.as_flags if they exist.
This method is variadic so you can put as many flags as you want.
The list order is preserved.
config.add_asm_flags(*asm_flags: str)
Add flags to config.asm_flags if they do not exist.
This method is variadic so you can put as many flags as you want.
The list order is preserved.
config.remove_asm_flags(*asm_flags: str)
Remove flags from config.asm_flags if they exist.
This method is variadic so you can put as many flags as you want.
The list order is preserved.
config.add_c_cpp_as_asm_flags(*c_cpp_as_asm_flags: str)
Add flags to config.c_cpp_as_asm_flags if they do not exist.
This method is variadic so you can put as many flags as you want.
The list order is preserved.
config.remove_c_cpp_as_asm_flags(*c_cpp_as_asm_flags: str)
Remove flags from config.c_cpp_as_asm_flags if they exist.
This method is variadic so you can put as many flags as you want.
The list order is preserved.
config.add_ar_flags(*ar_flags: str)
Add flags to config.ar_flags if they do not exist.
This method is variadic so you can put as many flags as you want.
The list order is preserved.
config.remove_ar_flags(*ar_flags: str)
Remove flags from config.ar_flags if they exist.
This method is variadic so you can put as many flags as you want.
The list order is preserved.
config.add_ld_flags(*ld_flags: str)
Add flags to config.ld_flags if they do not exist.
This method is variadic so you can put as many flags as you want.
The list order is preserved.
config.remove_ld_flags(*ld_flags: str)
Remove flags from config.ld_flags if they exist.
This method is variadic so you can put as many flags as you want.
The list order is preserved.
config.add_shared_linker_flags(*shared_linker_flags: str)
Add flags to config.shared_linker_flags if they do not exist.
This method is variadic so you can put as many flags as you want.
The list order is preserved.
config.remove_shared_linker_flags(*shared_linker_flags: str)
Remove flags from config.shared_linker_flags if they exist.
This method is variadic so you can put as many flags as you want.
The list order is preserved.
config.add_exported_headers(*exported_headers: str, subfolder: str = None)
Add exported headers to config.exported_headers if they do not exist.
This method is variadic so you can put as many headers as you want.
The list order is preserved.
By default, there is no subfolder, but we recommend you use `config.target_name` for the `subfolder` argument.
config.remove_exported_headers(*exported_headers: str, subfolder: str = None)
Remove exported headers from config.exported_headers if they exist.
This method is variadic so you can put as many headers as you want.
The list order is preserved.
config.copy()
Returns a new config object which is a deep copy of the config. It should be used to compile different sets of files with different parameters.
config.empty_copy(local_config: str = None) -> powermake.Config
Generate a new fresh config object without anything inside. By default, even the local config file isn't loaded.
It can be very helpful if you have a local config file specifying a cross compiler but you want to have the default compiler at some point during the compilation step.
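A sketch of the intended usage of config.copy(), assuming a hypothetical src/ and src/legacy/ layout:

```python
import powermake

def on_build(config: powermake.Config):
    legacy_files = powermake.get_files("src/legacy/**/*.c")   # assumed layout
    main_files = powermake.filter_files(powermake.get_files("src/**/*.c"), "src/legacy/**/*.c")

    # Compile the legacy subset with its own flags, without touching the main config.
    legacy_config = config.copy()
    legacy_config.add_c_flags("-w")  # silence warnings for the old code only

    objects = powermake.compile_files(config, main_files)
    legacy_objects = powermake.compile_files(legacy_config, legacy_files)

    powermake.link_files(config, objects.union(legacy_objects))

powermake.run("copy_demo", build_callback=on_build)
```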
powermake.default_on_clean(config: powermake.Config)
This is the default callback used by powermake.run if the `clean_callback` is unspecified, but you can use it whenever you want.
It cleans the obj, lib, and exe build directories of the config.
powermake.default_on_install(config: Config, location: str)
This is the default callback used by powermake.run if the `install_callback` is unspecified, but you can use it whenever you want.
If you overwrite the `install_callback` (which is the normal way of adding exported headers), you should call this function inside your defined callback to have a coherent installation. See the example in powermake.run.
Each library compiled is copied and put in a directory named lib
.
Each header in config.exported_headers is copied and put in a directory named include
.
Each executable compiled is copied and put in a directory named bin
.
The final structure is as follows:
location
|_ include
|_ eventual_subfolders
|_ my_lib.h
|_ lib
|_ my_lib.a
|_ bin
|_ my_program
If `location` is None, the default is `./install/`.
powermake.get_files(*patterns: str) -> set
Returns a set of filepaths that match at least one of the patterns.
Authorized patterns are:
- `*` to match a filename; for example "foo/*.c" will match "foo/test.c" but not "foo/bar/test.c"
- `**/` to match recursive directories; for example "foo/**/test.c" will match "foo/test.c" and "foo/bar/test.c".

[!IMPORTANT]
"**.c" will not match "foo/test.c"; you have to write "**/*.c" for that.

This function is variadic.
powermake.filter_files(files: set, *patterns: str) -> set
From a given set of filepaths, remove every file that matches at least one of the patterns.
Returns a new filepaths set, filtered.
Authorized patterns are:
- `*` to match a filename; for example "foo/*.c" will match "foo/test.c" but not "foo/bar/test.c"
- `**/` to match recursive directories; for example "foo/**/test.c" will match "foo/test.c" and "foo/bar/test.c".

[!IMPORTANT]
"**.c" will not match "foo/test.c"; you have to write "**/*.c" for that.

This function is variadic.
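A short sketch combining powermake.get_files and powermake.filter_files, assuming a hypothetical tests/ directory you want to exclude from the build:

```python
import powermake

def on_build(config: powermake.Config):
    files = powermake.get_files("**/*.c", "**/*.cpp")
    # Exclude everything under tests/ (the directory name is an assumption).
    files = powermake.filter_files(files, "tests/**/*.c", "tests/**/*.cpp")

    objects = powermake.compile_files(config, files)
    powermake.link_files(config, objects)

powermake.run("filter_demo", build_callback=on_build)
```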
powermake.compile_files(config: powermake.Config, files: set, force: bool = None) -> set
This function is a wrapper of lower-level powermake functions.
From a set or a list of `.c`, `.cpp`, `.cc`, `.C`, `.s`, `.S`, and `.asm` filepaths and a powermake.Config object, runs the compilation of each file in parallel, with the appropriate compiler and options found in `config`.
If `force` is True, all files are recompiled, even if they are up to date.
If `force` is False, only the files that are not up to date are recompiled.
If `force` is None (default), the value of `config.rebuild` is used.

Returns a set of `.o` (or compiler equivalent) filepaths for the next step.
If `files` is a list, the function returns a list with the order preserved.
powermake.archive_files(config: powermake.Config, object_files: set, archive_name: str = None, force: bool = None) -> str
This function is a wrapper of lower-level powermake functions.
From a set of `.o` (or compiler equivalent) filepaths, possibly the one returned by powermake.compile_files, and a powermake.Config object, runs the command to create a static library with the appropriate archiver and options found in `config`.
If `archive_name` is None, `config.target_name` is concatenated with the prefix "lib", so if `config.target_name` is "XXX", the name will be "libXXX"; then the extension given by the type of archiver is added.
If `archive_name` is not None, only the extension is added; if you want to use this parameter and you want your lib to be named "libXXX", you have to explicitly write "libXXX".
If `force` is True, the archive is re-created, even if it's up to date.
If `force` is False, the archive is created only if not up to date.
If `force` is None (default), the value of `config.rebuild` is used.
Returns the path of the static library generated.
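A minimal sketch of building a static library with powermake.archive_files; the src/ layout and the -O2 flag are assumptions.

```python
import powermake

def on_build(config: powermake.Config):
    files = powermake.get_files("src/**/*.c")    # assumed layout
    config.add_c_flags("-O2")

    objects = powermake.compile_files(config, files)
    # With archive_name left to None, the library is named "lib" + config.target_name,
    # plus the extension chosen by the archiver (e.g. .a with ar).
    static_lib = powermake.archive_files(config, objects)
    print(static_lib)

powermake.run("mylib", build_callback=on_build)
```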
powermake.link_files(config: powermake.Config, object_files: set, archives: list = [], executable_name: str = None, force: bool = None) -> str
This function is a wrapper of lower-level powermake functions.
From a set of `.o` (or compiler equivalent) filepaths, possibly the one returned by powermake.compile_files, and a powermake.Config object, it runs the command to create an executable with the appropriate linker and options found in `config`.
If `executable_name` is None, `config.target_name` is used, with the extension given by the type of linker.
If `executable_name` is not None, its value is concatenated with the extension.
If `force` is True, the executable is re-created, even if it's up to date.
If `force` is False, the executable is created only if not up to date.
If `force` is None (default), the value of `config.rebuild` is used.
Returns the path of the executable generated.
powermake.link_shared_lib(config: Config, object_files: set, archives: list = [], lib_name: str = None, force: bool = None) -> str
This function is a wrapper of lower-level powermake functions.
From a set of `.o` (or compiler equivalent) filepaths, possibly the one returned by powermake.compile_files, and a powermake.Config object, runs the command to create a shared library with the appropriate shared linker and options found in `config`.
If `lib_name` is None, `config.target_name` is concatenated with the prefix "lib", so if `config.target_name` is "XXX", the name will be "libXXX"; then the extension given by the type of shared linker is added.
If `lib_name` is not None, only the extension is added; if you want to use this parameter and you want your lib to be named "libXXX", you have to explicitly write "libXXX".
If `force` is True, the lib is re-created, even if it's up to date.
If `force` is False, the lib is created only if not up to date.
If `force` is None (default), the value of `config.rebuild` is used.
Returns the path of the shared library generated.
[!TIP]
Don't forget to compile the `object_files` with the flag `-fPIC`.
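A minimal sketch of building a shared library, assuming your sources live under src/:

```python
import powermake

def on_build(config: powermake.Config):
    files = powermake.get_files("src/**/*.c")    # assumed layout
    # -fPIC is required for objects that end up in a shared library.
    config.add_c_cpp_flags("-fPIC")

    objects = powermake.compile_files(config, files)
    shared_lib = powermake.link_shared_lib(config, objects)
    print(shared_lib)

powermake.run("mylib", build_callback=on_build)
```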
powermake.delete_files_from_disk(*filepaths: str)
Remove each filepath and skip if it doesn't exist.
This function is variadic.
powermake.run_another_powermake(config: powermake.Config, path: str, debug: bool = None, rebuild: bool = None, verbosity: int = None, nb_jobs: int = None) -> list
Run a powermake from another directory and return a list of paths to all libraries generated.
If the parameters `debug`, `rebuild`, `verbosity`, and `nb_jobs` are left to None, the values in `config` are used.
These parameters are passed to the other powermake.
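A sketch of how the returned library list can be fed back to the linker; the path ../my_lib/makefile.py is an assumption about your project layout.

```python
import powermake

def on_build(config: powermake.Config):
    # Build a dependency living in its own directory and collect the libraries it produced.
    dep_libs = powermake.run_another_powermake(config, "../my_lib/makefile.py")

    files = powermake.get_files("**/*.c")
    objects = powermake.compile_files(config, files)
    # The generated libraries can be passed to the linker through the `archives` parameter.
    powermake.link_files(config, objects, archives=dep_libs)

powermake.run("app_using_dep", build_callback=on_build)
```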
run_command_if_needed(config: powermake.Config, outputfile: str, dependencies: Iterable[str], command: list[str] | str, shell: bool = False, force: bool | None = None, **kwargs: Any) -> str
Runs a command that generates a file, but only if this file needs to be re-generated.
Raises a RuntimeError if the command fails.
`outputfile` is the file generated by the command and `dependencies` is an iterable of every file that, if changed, should trigger a re-run of the command.
If `shell` is False:
- `command` should be a list like `argv`: the first element should be an executable and each following element will be a distinct parameter. This list is then directly passed to `subprocess.run`.

If `shell` is True:
- `command` should be a string representing the shell command.

If `force` is True, the command is run anyway.
If `force` is False, the command is only run if `outputfile` is not up to date with its dependencies.
If `force` is None (default), the value of `config.rebuild` is used.
`**kwargs` is passed to powermake.run
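A sketch of a typical use, assuming a hypothetical gen_version.py generator script; the paths are placeholders for whatever generation step your project needs.

```python
import powermake

def on_build(config: powermake.Config):
    # Hypothetical code-generation step: regenerate version.h only when version.txt changes.
    powermake.run_command_if_needed(
        config,
        outputfile="src/version.h",
        dependencies=["version.txt"],
        command=["python", "gen_version.py", "version.txt", "src/version.h"])

    files = powermake.get_files("src/**/*.c")
    objects = powermake.compile_files(config, files)
    powermake.link_files(config, objects)

powermake.run("codegen_demo", build_callback=on_build)
```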
powermake.needs_update(outputfile: str, dependencies: set, additional_includedirs: list) -> bool
[!NOTE]
This function is low-level.
Returns whether or not `outputfile` is up to date with all its dependencies.
If `dependencies` include C/C++ files and headers, all the headers these files include recursively will be added as hidden dependencies.
The `additional_includedirs` list is required to discover hidden dependencies. You must set this to the additional includedirs used during the compilation of `outputfile`. You can use config.additional_includedirs if needed.
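A minimal sketch of a manual check with powermake.needs_update; the paths below are purely illustrative.

```python
import powermake

def on_build(config: powermake.Config):
    # Low-level check: does data.o need to be rebuilt from data.c and the headers it includes?
    if powermake.needs_update("build/data.o", {"src/data.c"}, config.additional_includedirs):
        print("data.o is out of date")

powermake.run("needs_update_demo", build_callback=on_build)
```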
run_command(config: powermake.Config, command: list[str] | str, shell: bool = False, target: str | None = None, output_filter: Callable[[bytes], bytes] | None = None, **kwargs) -> int
[!NOTE]
This function is low-level.
Run a command regardless of what it does and if it's needed.
Returns the exit code of the command.
If `shell` is False:
- `command` should be a list like `argv`: the first element should be an executable and each following element will be a distinct parameter. This list is then directly passed to `subprocess.run`.

If `shell` is True:
- `command` should be a string representing the shell command.

`target` is currently only used to print the name of the file generated, but in the future it might be used to generate a Makefile.
`output_filter` is a callback that can be used to edit the output of the command before it's printed to the screen. Warning: the output of the command is in bytes, with no determined encoding. Leave this as `None` to just print the output of the command.
`**kwargs` is passed to powermake.run
powermake.Operation(outputfile: str, dependencies: set, config: Config, command: list)
[!NOTE]
This object is low-level.
This is a simple object to execute a command only if needed.
It can be used to easily parallelize multiple commands.
[!TIP]
You can use powermake.compile_files which does that for you, but only for C/C++/AS/ASM files.
[!WARNING]
Directly using powermake.Operation is deprecated
The command should be a list like `argv`: the first element should be an executable and each following element will be a distinct parameter.
This list is then directly passed to `subprocess.run`.
operation.execute(force: bool = False) -> str
Run the command if `outputfile` is not up to date.
If `force` is True, the command is run in any case.
[!NOTE]
This section is advanced.
powermake.ArgumentParser(prog: str = None, description: str = None, **kwargs)
This object is a superset of argparse.ArgumentParser; you can read the argparse documentation, it works exactly the same.
[!CAUTION]
Use this object and never argparse.ArgumentParser directly, or you will break some PowerMake features. Obviously the usual command line options will be broken, but you will also break other features like the powermake.run_another_powermake function. This object ensures that none of this is broken.
See the argparse documentation to understand how to add an argument.
parser.parse_args()
Returns a namespace containing each value parsed from the command line.
This namespace can be used to take decisions and should then be passed to powermake.run or powermake.generate_config.
Example:
```python
import powermake

def on_build(config: powermake.Config):
    ...

parser = powermake.ArgumentParser()
parser.add_argument("--foo")

args_parsed = parser.parse_args()
print(args_parsed.foo)

powermake.run("program_test", build_callback=on_build, args_parsed=args_parsed)
```
powermake.generate_config(target_name: str, args_parsed: argparse.Namespace = None)
This function behaves like the first part of powermake.run: it generates a config object according to the command line. The difference with powermake.run is that it stops at this point and returns the generated config.
It can be helpful if you want a global instance of the config.
In most cases you should leave `args_parsed` as None, and this function will automatically parse the command line.
[!CAUTION]
You have to call `powermake.run_callbacks` after the call of this function (but you can obviously do something between these 2 calls).
run_callbacks(config: Config, *, build_callback: callable, clean_callback: callable = default_on_clean, install_callback: callable = default_on_install, test_callback: callable = default_on_test)
This function only makes sense after a call to powermake.generate_config.
It takes a newly generated config and runs each callback according to the command line.
[!CAUTION]
If neither this function nor powermake.run is used, the powermake.run_another_powermake function will be partially broken
Example:
```python
import powermake

def on_build(config: powermake.Config):
    ...

config = powermake.generate_config("program_test")

powermake.run_callbacks(config, build_callback=on_build)
```
PowerMake is compatible with the clang scan-build utility.
You can run `scan-build python makefile.py -rd` to compile your code under static analysis.
Just remember that scan-build needs your program to be compiled in debug mode, hence the `-d` flag.
We recommend you try compiling your code with different static analyzers to catch as many problems as possible.
We especially recommend GCC and its `-fanalyzer` option; it's one of the most powerful analyzers we know of, and PowerMake will ensure that this flag is removed if unsupported.
PowerMake helps you compile with LLVM libFuzzer.
You can add the `-ffuzzer` argument to your compiler and your linker with config.add_c_cpp_flags and config.add_ld_flags.
If you are using clang or MSVC, this will enable the address sanitizer and the fuzzer. Otherwise, the argument is ignored.
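A minimal sketch of a fuzz-target makefile using these flags; the fuzz/ directory is an assumption about your layout.

```python
import powermake

def on_build(config: powermake.Config):
    # -ffuzzer enables the address sanitizer and libFuzzer on clang/MSVC
    # and is simply ignored by other compilers.
    config.add_c_cpp_flags("-ffuzzer")
    config.add_ld_flags("-ffuzzer")

    files = powermake.get_files("fuzz/*.c")      # assumed location of the fuzz target
    objects = powermake.compile_files(config, files)
    powermake.link_files(config, objects)

powermake.run("fuzz_target", build_callback=on_build)
```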
Since PowerMake 1.20.0, PowerMake is able to generate a GNU Makefile. You just have to run:
python makefile.py -m
This will rebuild the PowerMake (here in release mode) and generate a GNU Makefile. If you want your Makefile to be in debug mode, with a certain toolchain, or with whatever custom arguments, just run your PowerMake like you normally would and add the -m flag.
CC=x86_64-w64-mingw32-gcc python makefile.py -md
[!WARNING]
PowerMake tries its best to generate a valid Makefile; however, because of the PowerMake philosophy, PowerMake can't know exactly what you are doing in your makefile, and every function that is not provided by PowerMake can't be translated into the Makefile.
To get a good Makefile, you should never use the `subprocess` module directly, but instead use powermake.run_command or powermake.run_command_if_needed.
If you are using conditions and loops, that's not a problem at all, but you will not see any condition in the generated Makefile; what's in the Makefile depends on the commands actually generated during the initial PowerMake compilation. (That's why the -m flag also enables the -r flag, to be sure that every command is run.)
VSCode uses 3 important json files:
- `compile_commands.json`: used to know how each file is compiled, which defines and includedirs are used, etc. Can be generated by PowerMake.
- `tasks.json`: used to define how to compile the project. In our case, we want to run PowerMake in this file.
- `launch.json`: used to launch the debugger.

The `compile_commands.json` can easily be generated by PowerMake with the option `-o` (`--compile-commands-dir`).
python makefile.py -o .vscode
However, we suggest you simply generate this file whenever you compile your code with VSCode, by putting the `-o` option in the tasks.json as explained below.
Here is an example of a functional `.vscode/tasks.json`:
{
"tasks": [
{
"type": "cppbuild",
"label": "powermake_compile",
"command": "python",
"args": [
"makefile.py",
"-rvd", /* We rebuild each time so the warnings doesn't disappear, we build in debug mode and in verbose to verify if the good commands are ran. */
"-o",
"${workspaceFolder}/.vscode", /* We regenerate a new compile_commands.json each time to keep track of new files and modifications in the PowerMake */
"--retransmit-colors" /* This is because the vscode task terminal is not a shell but still supports colors, so we have to tell PowerMake to not remove them */
],
"options": {
"cwd": "${workspaceFolder}" /* Where to run `python makefile.py ...` */
}
},
{
/* This is fully optional, this task can be mapped to a shortcut (for example F6) so we can test the compilation of a single file */
"type": "cppbuild",
"label": "compile_single_file",
"command": "python",
"args": [
"makefile.py",
"-r",
"-s",
"${file}",
"--retransmit-colors"
],
"options": {
"cwd": "${workspaceFolder}"
}
},
],
"version": "2.0.0"
}
[!NOTE] You need the Microsoft C/C++ Extension Pack for this to work
Here is an example of a functional `.vscode/launch.json`:
{
"configurations": [
{
"name": "PowerMake Debug",
"type": "cppdbg",
"preLaunchTask": "powermake_compile",
"request": "launch",
"program": "${workspaceFolder}/build/Linux/x64/debug/bin/YOUR_PROGRAM", /* Replace this path by the path of your program compiled */
"args": [], /* If your program requires arguments, put them here */
"cwd": "${workspaceFolder}"
}
]
}
[!NOTE] You need the Microsoft C/C++ Extension Pack for this to work
documentation in progress...