Welcome to django_logging Documentation!

django_logging is a powerful yet simple Django package that extends and enhances Python's built-in logging without relying on any third-party libraries. Our goal is to keep things straightforward while providing flexible and customizable logging solutions that are specifically designed for Django applications.

One of the key advantages of django_logging is its seamless integration. Get started with django_logging in your existing projects without refactoring any code. Even if you're already using the built-in logging module, you can instantly upgrade to advanced features with just a simple installation; no extra changes or complicated setup required.

Imagine you have a Django project that was developed a few years ago and already uses Python's built-in logging. Refactoring the entire codebase to use another logging package would be a daunting task. With django_logging, you don't have to worry about that: simply install django_logging and enjoy all of its advanced features, including logging each level to a separate file with three extra formats (json, xml, flat), without making any changes to your existing code.
Getting started with django_logging is simple. Follow these steps to get up and running quickly:

First, install django_logging via pip:
$ pip install dj-logging
Add django_logging to your INSTALLED_APPS in your Django settings file:
INSTALLED_APPS = [
# ...
'django_logging',
# ...
]
Start your Django development server to verify the installation:
python manage.py runserver
When the server starts, you'll see an initialization message like this in your console:
INFO | 'datetime' | django_logging | Logging initialized with the following configurations:
Log File levels: ['DEBUG', 'INFO', 'WARNING', 'ERROR', 'CRITICAL'].
Log files are being written to: logs.
Console output level: DEBUG.
Colorize console: True.
Log date format: %Y-%m-%d %H:%M:%S.
Email notifier enabled: False.
By default, django_logging will log each level to its own file:
logs/debug.log
logs/info.log
logs/warning.log
logs/error.log
logs/critical.log
In addition, logs will be displayed in colorized mode in the console, making it easier to distinguish between different log levels.
That's it! django_logging is ready to use. For further customization, refer to the Settings section.

Once django_logging is installed and added to your INSTALLED_APPS, you can start using it right away. The package provides several features to customize and enhance logging in your Django project. Below is a guide on how to use the various features provided by django_logging.
At its core, django_logging is built on top of Python's built-in logging module. This means you can use the standard logging module to log messages across your Django project. Here's a basic example of logging usage:
import logging
logger = logging.getLogger(__name__)
logger.debug("This is a debug message")
logger.info("This is an info message")
logger.warning("This is a warning message")
logger.error("This is an error message")
logger.critical("This is a critical message")
These logs will be handled according to the configurations set up by django_logging, using either the default settings or any custom settings you've provided.
You can use the config_setup context manager to temporarily apply django_logging configurations within a specific block of code.

Example usage:
from django_logging.utils.context_manager import config_setup
import logging
logger = logging.getLogger(__name__)
def foo():
    logger.info("This log will use the configuration set in the context manager!")

with config_setup():
    # Your logging configuration changes here
    foo()

# Outside the with block, the logging configuration is restored to what it was before
AUTO_INITIALIZATION_ENABLE must be set to False in the settings to use the context manager. If it is True, attempting to use the context manager will raise a ValueError with the message: "You must set 'AUTO_INITIALIZATION_ENABLE' to False in DJANGO_LOGGING in your settings to use the context manager."
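For reference, a minimal sketch of the relevant setting (only AUTO_INITIALIZATION_ENABLE needs to change; the other options keep their defaults):

# settings.py -- minimal sketch for using the config_setup context manager
DJANGO_LOGGING = {
    "AUTO_INITIALIZATION_ENABLE": False,  # must be False for config_setup() to work
}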
To send specific logs via email, use the log_and_notify_admin function. Ensure that the ENABLE option in LOG_EMAIL_NOTIFIER is set to True in your settings:
from django_logging.utils.log_email_notifier.log_and_notify import log_and_notify_admin
import logging
logger = logging.getLogger(__name__)
log_and_notify_admin(logger, logging.INFO, "This is a log message")
You can also include additional request information (ip_address and browser_type) in the email by passing an extra dictionary:
from django_logging.utils.log_email_notifier.log_and_notify import log_and_notify_admin
import logging
logger = logging.getLogger(__name__)
def some_view(request):
    log_and_notify_admin(
        logger,
        logging.INFO,
        "This is a log message",
        extra={"request": request}
    )
LOG_EMAIL_NOTIFIER["ENABLE"]
must be set to True
. If it is not enabled, calling log_and_notify_admin
will raise a ValueError
:"Email notifier is disabled. Please set the 'ENABLE' option to True in the 'LOG_EMAIL_NOTIFIER' in DJANGO_LOGGING in your settings to activate email notifications."
Additionally, ensure that all Required Email Settings are configured in your Django settings file.
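As a rough sketch (values are placeholders; see the Required Email Settings section below for details), the configuration might look like this:

# settings.py -- sketch: enable the email notifier (placeholder values)
DJANGO_LOGGING = {
    "LOG_EMAIL_NOTIFIER": {
        "ENABLE": True,           # required for log_and_notify_admin
        "NOTIFY_ERROR": True,     # optional: notify on ERROR logs
        "NOTIFY_CRITICAL": True,  # optional: notify on CRITICAL logs
    },
}

# Required email settings (placeholders)
EMAIL_HOST = "smtp.example.com"
EMAIL_PORT = 587
EMAIL_HOST_USER = "your-email@example.com"
EMAIL_HOST_PASSWORD = "your-password"
EMAIL_USE_TLS = True
DEFAULT_FROM_EMAIL = "your-email@example.com"
ADMIN_EMAIL = "admin@example.com"  # recipient of log notification emails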
The execution_tracker decorator logs performance metrics for any function. It tracks execution time and the number of database queries for the decorated function (if enabled).
Example Usage:
import logging
from django_logging.decorators import execution_tracker
@execution_tracker(logging_level=logging.INFO, log_queries=True, query_threshold=10, query_exceed_warning=False)
def some_function():
    # function code
    pass
Arguments:
logging_level (int): The logging level at which performance details will be logged. Defaults to logging.INFO.

log_queries (bool): Whether to log the number of database queries for the decorated function (only when DEBUG is True in your settings). If log_queries=True, the number of queries will be included in the logs. Defaults to False.

query_threshold (int): If provided, the number of database queries will be checked, and a warning will be logged when it exceeds the given threshold. Defaults to None.

query_exceed_warning (bool): Whether to log a WARNING message if the number of queries exceeds the threshold. Defaults to False.
Example Log Output:
INFO | 'datetime' | execution_tracking | Performance Metrics for Function: 'some_function'
Module: some_module
File: /path/to/file.py, Line: 123
Execution Time: 0.21 second(s)
Database Queries: 15 queries (exceeds threshold of 10)
If log_queries is set to True but DEBUG is False, a WARNING will be logged:
WARNING | 'datetime' | execution_tracking | DEBUG mode is disabled, so database queries are not tracked. To include number of queries, set `DEBUG` to `True` in your django settings.
The django_logging.middleware.RequestLogMiddleware is a middleware that logs detailed information about each incoming request to the server. It is capable of handling both synchronous and asynchronous requests.
To enable this middleware, add it to your Django project's MIDDLEWARE setting:
MIDDLEWARE = [
#...
'django_logging.middleware.RequestLogMiddleware',
#...
]
Request Information Logging:
Logs request details such as the HTTP method, path, query parameters, and referrer, along with contextual information (IP address, request ID, user agent), at the start of each request.

Example log at request start:
INFO | 2024-10-03 16:29:47 | request_middleware | REQUEST STARTED:
method=GET
path=/admin/
query_params=None
referrer=http://127.0.0.1:8000/admin/login/?next=/admin/
| {'ip_address': '127.0.0.1', 'request_id': '09580021-6bff-4b82-99b5-c52406b2cc91',
'user_agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/129.0.0.0 Safari/537.36'}
Response Information Logging:
Logs response details such as the authenticated user, status code, content type, response time, and (when enabled) the SQL queries executed, after the request is processed.

Example log at request completion:
INFO | 2024-10-03 16:29:47 | request_middleware | REQUEST FINISHED:
user=[mehrshad (ID:1)]
status_code=200
content_type=[text/html; charset=utf-8]
response_time=[0.08 second(s)]
3 SQL QUERIES EXECUTED
Query1={'Time': 0.000(s), 'Query':
[SELECT "django_session"."session_key", "django_session"."session_data", "django_session"."expire_date" FROM "django_session"
WHERE ("django_session"."expire_date" > '2024-10-03 12:59:47.812918' AND "django_session"."session_key" = 'uq0nrbglazfm4cy656w3451xydfirh45') LIMIT 21]}
Query2={'Time': 0.001(s), 'Query':
[SELECT "auth_user"."id", "auth_user"."password", "auth_user"."last_login", "auth_user"."is_superuser", "auth_user".
"username", "auth_user"."first_name", "auth_user"."last_name", "auth_user"."email", "auth_user"."is_staff", "auth_user".
"is_active", "auth_user"."date_joined" FROM "auth_user" WHERE "auth_user"."id" = 1 LIMIT 21]}
| {'ip_address': '127.0.0.1', 'request_id': '09580021-6bff-4b82-99b5-c52406b2cc91',
'user_agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/129.0.0.0 Safari/537.36'}
Request ID:
Each request is assigned a request ID (taken from the X-Request-ID header if provided). This request ID is included in logs and can help with tracing specific requests.

SQL Query Logging:
Note: to enable query logging, set LOG_SQL_QUERIES_ENABLE to True in settings; for more details, refer to the Settings section.
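A minimal sketch of that setting (other options keep their defaults):

# settings.py -- sketch: include SQL queries in RequestLogMiddleware logs
DJANGO_LOGGING = {
    "LOG_SQL_QUERIES_ENABLE": True,
}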
Streaming Response Support:
User Information:
IP Address and User-Agent:
We use context variables in RequestLogMiddleware to store per-request information such as the client IP address, the request ID, and the user agent. These context variables can be accessed and used in other parts of the logging system or during the request processing lifecycle, as shown in the sketch below.
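For example, assuming the middleware is enabled, you can read the bound variables through the documented ContextVarManager; the exact keys (ip_address, request_id, user_agent) are taken from the example logs above and may vary:

# views.py -- sketch: inspecting the middleware's context variables in a view
from django.http import JsonResponse

from django_logging.contextvar import manager

def context_debug_view(request):
    # Returns whatever django_logging has bound for the current request,
    # e.g. ip_address, request_id, user_agent (per the example logs above).
    return JsonResponse(manager.get_contextvars())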
This middleware monitors the size of the log directory and checks it weekly. It triggers the logs_size_audit management command to assess the total size of the log files. If the log directory size exceeds a certain limit (LOG_DIR_SIZE_LIMIT), the middleware sends a warning email to the ADMIN_EMAIL asynchronously.

To enable this middleware, add it to your Django project's MIDDLEWARE setting:
MIDDLEWARE = [
#...
'django_logging.middleware.MonitorLogSizeMiddleware',
#...
]
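The limit and recipient come from your settings; a rough sketch with placeholder values:

# settings.py -- sketch: log directory size monitoring (placeholder values)
DJANGO_LOGGING = {
    "LOG_DIR_SIZE_LIMIT": 1024,  # MB; a warning email is sent when exceeded
}
ADMIN_EMAIL = "admin@example.com"  # recipient of the warning email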
django_logging includes a powerful ContextVarManager class, allowing you to manage context variables dynamically within your logging system. These variables are bound to the current context and automatically included in your log entries via the %(context)s placeholder in the log format.

The ContextVarManager provides several methods to manage context variables efficiently:
bind(**kwargs):
The bind method allows you to bind key-value pairs as context variables that will be available during the current context. These variables can be used to add contextual information to log entries.
Example:
from django_logging.contextvar import manager
import logging
logger = logging.getLogger(__name__)
# Binding context variables
manager.bind(user="test_user", request_id="abc123")
logger.info("Logging with context")
Log Output:
INFO | 2024-10-03 12:00:00 | Logging with context | {'user': 'test_user', 'request_id': 'abc123'}
unbind(key: str):
The unbind method removes a specific context variable by its key. It effectively clears the context variable from the log entry.
Example:
from django_logging.contextvar import manager
import logging
logger = logging.getLogger(__name__)
manager.unbind("user")
logger.info("Logging without the 'user' context")
Log Output:
INFO | 2024-10-03 12:05:00 | Logging without the 'user' context | {'request_id': 'abc123'}
batch_bind(**kwargs):
The batch_bind method binds multiple context variables at once and returns tokens that can be used later to reset the variables to their previous state. This is useful when you need to bind a group of variables temporarily.
Example:
from django_logging.contextvar import manager
import logging
logger = logging.getLogger(__name__)
tokens = manager.batch_bind(user="admin_user", session_id="xyz789")
logger.info("Logging with batch-bound context")
Log Output:
INFO | 2024-10-03 12:10:00 | Logging with batch-bound context | {'user': 'admin_user', 'session_id': 'xyz789'}
reset(tokens: Dict[str, contextvars.Token]):
The reset method allows you to reset context variables to their previous state using tokens returned by batch_bind.
Example:
from django_logging.contextvar import manager
import logging

logger = logging.getLogger(__name__)

# Tokens returned by batch_bind() are needed to reset the variables
tokens = manager.batch_bind(user="admin_user", session_id="xyz789")
manager.reset(tokens)
logger.info("Context variables have been reset")
Log Output:
INFO | 2024-10-03 12:15:00 | Context variables have been reset |
clear():
The clear method clears all bound context variables at once, effectively removing all contextual data from the log entry.
Example:
from django_logging.contextvar import manager
import logging
logger = logging.getLogger(__name__)
manager.clear()
logger.info("All context variables cleared")
Log Output:
INFO | 2024-10-03 12:20:00 | All context variables cleared |
get_contextvars():
The get_contextvars method retrieves the current context variables available in the context. This method is useful for inspecting or debugging the context.
Example:
from django_logging.contextvar import manager
import logging
logger = logging.getLogger(__name__)
current_context = manager.get_contextvars()
print(current_context) # Output: {'user': 'admin_user', 'session_id': 'xyz789'}
merge_contexts(bound_context: Dict[str, Any], local_context: Dict[str, Any]):
The merge_contexts method merges two dictionaries of context variables, giving priority to the bound_context. This is useful when you want to combine different sources of context data.
Example:
from django_logging.contextvar import manager
bound_context = {"user": "admin_user"}
local_context = {"request_id": "12345"}
merged_context = manager.merge_contexts(bound_context, local_context)
print(merged_context) # Output: {'user': 'admin_user', 'request_id': '12345'}
get_merged_context(bound_logger_context: Dict[str, Any]):
The get_merged_context method combines the bound logger context with the current context variables, allowing you to retrieve a single dictionary with all the context data.
Example:
from django_logging.contextvar import manager
bound_logger_context = {"app_name": "my_django_app"}
merged_context = manager.get_merged_context(bound_logger_context)
print(merged_context) # Output: {'app_name': 'my_django_app', 'user': 'admin_user'}
scoped_context(**kwargs):
The scoped_context method provides a context manager to bind context variables temporarily for a specific block of code. After the block completes, the context variables are automatically unbound.
Example:
from django_logging.contextvar import manager
import logging
logger = logging.getLogger(__name__)
with manager.scoped_context(transaction_id="txn123"):
    logger.info("Scoped context active")

logger.info("Scoped context no longer active")
Log Output:
INFO | 2024-10-03 12:30:00 | Scoped context active | {'transaction_id': 'txn123'}
INFO | 2024-10-03 12:30:10 | Scoped context no longer active |
This Django management command zips the log directory and emails it to a specified email address. The command is useful for retrieving logs remotely and securely, allowing administrators to receive log files via email.
The command zips the log directory configured in your settings (DJANGO_LOGGING['LOG_DIR']). To execute the command, use the following syntax:
python manage.py send_logs <email>
If you want to send the logs to admin@example.com, the command would be:

python manage.py send_logs admin@example.com

This will zip the log directory and send it to admin@example.com with the subject "Log Files".
This Django management command allows you to locate and prettify JSON log files stored in a log directory generated by django_logging. It takes JSON files from the log directory, formats them into a clean, readable structure, and saves the result in the pretty directory.

The command looks for .json files in the json subdirectory of your defined log directory; if that subdirectory doesn't exist, an error is displayed. Each .json file found there is processed and saved to a pretty subdirectory with the prefix formatted_, preserving the original files.

To execute the command, use the following syntax:
python manage.py generate_pretty_json
Running the command will process the following files:

logs/json/error.json ➡ logs/json/pretty/formatted_error.json
logs/json/info.json ➡ logs/json/pretty/formatted_info.json
This Django management command allows you to locate and reformat XML log files stored in a log directory generated by django_logging. It processes XML files by wrapping their content in a <logs> element and saves the reformatted files in a separate directory.

The command looks for .xml files in the xml subdirectory of your defined log directory; if that subdirectory doesn't exist, an error is displayed. Each .xml file found there is processed: its content is wrapped in a <logs> element, ensuring consistency in structure, and the result is saved to a pretty subdirectory with the prefix formatted_, preserving the original files.

To execute the command, use the following syntax:
python manage.py generate_pretty_xml
Running the command will process the following files:

logs/xml/error.xml ➡ logs/xml/pretty/formatted_error.xml
logs/xml/info.xml ➡ logs/xml/pretty/formatted_info.xml
This Django management command monitors the size of your log directory. If the total size exceeds the configured limit, the command sends a warning email notification to the admin. The size check helps maintain log storage and prevent overflow by ensuring administrators are informed when logs grow too large.
The log directory is determined by DJANGO_LOGGING['LOG_DIR'] or the default. If the directory doesn't exist, an error is logged and displayed. The size limit is configured via LOG_DIR_SIZE_LIMIT in settings.

To execute the command, use the following syntax:
python manage.py logs_size_audit
Running the command when the log directory exceeds the size limit will trigger an email to the administrator reporting, for example, a total size of 1200 MB (limit: 1024 MB). The warning is sent to the ADMIN_EMAIL configured in Django settings.

LogiBoard in the django_logging package provides an interface for uploading, extracting, and exploring log files that have been zipped and shared via email. This allows for easier log management.
Note: Superuser Access Only
Only superusers have access to the LogiBoard URL. If accessed by a non-superuser, an Access Denied page (provided by Lazarus) is shown.
Add to URLs: Include the following in your URL configuration to enable access to LogiBoard:
from django.urls import path, include
urlpatterns = [
# ...
path('django-logging/', include('django_logging.urls')),
# ...
]
LogiBoard will be accessible at the following link in your project after setting it up:
/django-logging/log-iboard/
Static Files: Run the following command to collect and prepare the static files necessary for LogiBoard's interface:
python manage.py collectstatic
The collectstatic command is required to gather and serve static assets (such as JavaScript, CSS, and images) used by LogiBoard. This ensures the front-end of the log upload and browsing interface works correctly.
Enable LogiBoard:
In your settings file, ensure the following setting is added under DJANGO_LOGGING:
DJANGO_LOGGING = {
# ...
"INCLUDE_LOG_iBOARD": True,
# ...
}
This setting ensures that LogiBoard is available in your project.
LogiBoard is designed to help administrators easily review log files that have been zipped and sent via email (generated by the send_logs management command). This is particularly useful for remotely retrieving log files from production systems or shared environments.

Navigate to /django-logging/log-iboard/ in your project to open the LogiBoard interface. Supported upload formats are .log, .txt, .json, and .xml.

LogiBoard makes it simple to manage and review logs, ensuring you can quickly access and analyze critical log data.
By default, django_logging uses a built-in configuration that requires no additional setup. However, you can customize the logging settings by adding the DJANGO_LOGGING dictionary configuration to your Django settings file.
DJANGO_LOGGING = {
"AUTO_INITIALIZATION_ENABLE": True,
"INITIALIZATION_MESSAGE_ENABLE": True,
"LOG_SQL_QUERIES_ENABLE": True,
"LOG_FILE_LEVELS": ["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"],
"LOG_DIR": "logs",
"LOG_DIR_SIZE_LIMIT": 1024, # MB
"LOG_FILE_FORMATS": {
"DEBUG": 1,
"INFO": 1,
"WARNING": 1,
"ERROR": 1,
"CRITICAL": 1,
},
"LOG_FILE_FORMAT_TYPES": {
"DEBUG": "normal",
"INFO": "normal",
"WARNING": "normal",
"ERROR": "normal",
"CRITICAL": "normal",
},
"EXTRA_LOG_FILES": { # for extra formats (JSON, XML)
"DEBUG": False,
"INFO": False,
"WARNING": False,
"ERROR": False,
"CRITICAL": False,
},
"LOG_CONSOLE_LEVEL": "DEBUG",
"LOG_CONSOLE_FORMAT": 1,
"LOG_CONSOLE_COLORIZE": True,
"LOG_DATE_FORMAT": "%Y-%m-%d %H:%M:%S",
"LOG_EMAIL_NOTIFIER": {
"ENABLE": False,
"NOTIFY_ERROR": False,
"NOTIFY_CRITICAL": False,
"LOG_FORMAT": 1,
"USE_TEMPLATE": True,
},
}
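As an illustration of how these options combine, the sketch below writes ERROR logs as structured JSON in a dedicated file (see the EXTRA_LOG_FILES and LOG_FILE_FORMAT_TYPES options described next); depending on how the package merges defaults, you may prefer to list all levels explicitly:

# settings.py -- sketch: ERROR logs as JSON (expected at logs/json/error.json)
DJANGO_LOGGING = {
    "LOG_FILE_FORMAT_TYPES": {
        "ERROR": "JSON",  # structured JSON output for the ERROR level
    },
    "EXTRA_LOG_FILES": {
        "ERROR": True,  # write ERROR logs to a dedicated logs/json/error.json
    },
}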
Here's a breakdown of the available configuration options:
AUTO_INITIALIZATION_ENABLE
Type: bool
Description: Enables automatic initialization of the logging configuration when Django starts.
Default: True

INITIALIZATION_MESSAGE_ENABLE
Type: bool
Description: Enables the initialization message shown in the console when logging is set up.
Default: True

INCLUDE_LOG_iBOARD
Type: bool
Description: Makes the LogiBoard interface available in your project.
Default: False

LOG_SQL_QUERIES_ENABLE
Type: bool
Description: Enables SQL query logging in RequestLogMiddleware logs. When enabled, SQL queries executed in each request will be included in the log output.
Default: False

LOG_FILE_LEVELS
Type: list[str]
Description: The log levels for which separate log files are written.
Default: ["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"]

LOG_DIR
Type: str
Description: The directory where log files are stored.
Default: "logs"

LOG_DIR_SIZE_LIMIT
Type: int
Description: The maximum allowed size of the log directory in MB. If the limit is exceeded and MonitorLogSizeMiddleware is enabled, a warning email will be sent to the admin weekly.
Default: 1024 MB (1 GB)

LOG_FILE_FORMATS
Type: dict[str, int | str]
Description: The log format for each log level. Values can be an int representing one of the predefined formats or a custom str format.
Default: 1 for all levels

LOG_FILE_FORMAT_TYPES
Type: dict[str, str]
Description: Specifies the format type (normal, JSON, XML, FLAT) for each log level. The keys are log levels, and the values are the format types.
Format Types:
normal: Standard text log.
JSON: Structured logs in JSON format.
XML: Structured logs in XML format.
FLAT: Logs with a flat format.
Default: normal for all levels

EXTRA_LOG_FILES
Type: dict[str, bool]
Description: Specifies whether extra log files in JSON or XML formats should be created for each log level. When set to True for a specific level, a dedicated directory (e.g., logs/json or logs/xml) will be created with files like info.json or info.xml. If False, JSON and XML logs will be written to .log files.
Default: False for all levels

LOG_CONSOLE_LEVEL
Type: str
Description: The minimum log level for console output.
Default: "DEBUG"

LOG_CONSOLE_FORMAT
Type: int | str
Description: The log format for console output. Accepts the same options as LOG_FILE_FORMATS.
Default: 1

LOG_CONSOLE_COLORIZE
Type: bool
Description: Whether to colorize console output.
Default: True

LOG_DATE_FORMAT
Type: str
Description: The date format used in log records.
Default: "%Y-%m-%d %H:%M:%S"

LOG_EMAIL_NOTIFIER
Type: dict
Description: Configures the email notifier for sending log-related alerts. It contains the following options:

ENABLE:
Type: bool
Description: Enables or disables the email notifier.
Default: False

NOTIFY_ERROR:
Type: bool
Description: Sends an email notification for ERROR log level events.
Default: False

NOTIFY_CRITICAL:
Type: bool
Description: Sends an email notification for CRITICAL log level events.
Default: False

LOG_FORMAT:
Type: int | str
Description: The log format used in notification emails.
Default: 1

USE_TEMPLATE:
Type: bool
Description: Whether to use the LAZARUS email template for notification emails.
Default: True

The django_logging package provides predefined log format options that you can use in configuration. Below are the available format options:
FORMAT_OPTIONS = {
1: "%(levelname)s | %(asctime)s | %(module)s | %(message)s | %(context)s",
2: "%(levelname)s | %(asctime)s | %(context)s | %(message)s",
3: "%(levelname)s | %(context)s | %(message)s",
4: "%(context)s | %(asctime)s - %(name)s - %(levelname)s - %(message)s",
5: "%(levelname)s | %(message)s | %(context)s | [in %(pathname)s:%(lineno)d]",
6: "%(asctime)s | %(context)s | %(levelname)s | %(message)s",
7: "%(levelname)s | %(asctime)s | %(context)s | in %(module)s: %(message)s",
8: "%(levelname)s | %(context)s | %(message)s | [%(filename)s:%(lineno)d]",
9: "[%(asctime)s] | %(levelname)s | %(context)s | in %(module)s: %(message)s",
10: "%(asctime)s | %(processName)s | %(context)s | %(name)s | %(levelname)s | %(message)s",
11: "%(asctime)s | %(context)s | %(threadName)s | %(name)s | %(levelname)s | %(message)s",
12: "%(levelname)s | [%(asctime)s] | %(context)s | (%(filename)s:%(lineno)d) | %(message)s",
13: "%(levelname)s | [%(asctime)s] | %(context)s | {%(name)s} | (%(filename)s:%(lineno)d): %(message)s",
14: "[%(asctime)s] | %(levelname)s | %(context)s | %(name)s | %(module)s | %(message)s",
15: "%(levelname)s | %(context)s | %(asctime)s | %(filename)s:%(lineno)d | %(message)s",
16: "%(levelname)s | %(context)s | %(message)s | [%(asctime)s] | %(module)s",
17: "%(levelname)s | %(context)s | [%(asctime)s] | %(process)d | %(message)s",
18: "%(levelname)s | %(context)s | %(asctime)s | %(name)s | %(message)s",
19: "%(levelname)s | %(asctime)s | %(context)s | %(module)s:%(lineno)d | %(message)s",
20: "[%(asctime)s] | %(levelname)s | %(context)s | %(thread)d | %(message)s",
}
You can reference these formats by their corresponding integer keys in your logging configuration settings.
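For example, a sketch that uses predefined format 5 for console output and a custom format string for ERROR log files (both integer keys and custom str formats are accepted, as described above):

# settings.py -- sketch: predefined format key vs. custom format string
DJANGO_LOGGING = {
    "LOG_CONSOLE_FORMAT": 5,  # predefined format 5 from FORMAT_OPTIONS above
    "LOG_FILE_FORMATS": {
        "ERROR": "%(levelname)s | %(asctime)s | %(message)s | %(context)s",  # custom str format
    },
}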
To use the email notifier, the following email settings must be configured in your settings.py:

EMAIL_HOST: The host to use for sending emails.
EMAIL_PORT: The port to use for the email server.
EMAIL_HOST_USER: The username to use for the email server.
EMAIL_HOST_PASSWORD: The password to use for the email server.
EMAIL_USE_TLS: Whether to use a TLS (secure) connection when talking to the email server.
DEFAULT_FROM_EMAIL: The default email address to use for sending emails.
ADMIN_EMAIL: The email address where log notifications will be sent. This is the recipient address used by the email notifier to deliver the logs.

Example Email Settings:
EMAIL_HOST = 'smtp.example.com'
EMAIL_PORT = 587
EMAIL_HOST_USER = 'your-email@example.com'
EMAIL_HOST_PASSWORD = 'your-password'
EMAIL_USE_TLS = True
DEFAULT_FROM_EMAIL = 'your-email@example.com'
ADMIN_EMAIL = 'admin@example.com'
These settings ensure that the email notifier is correctly configured to send log notifications to the specified ADMIN_EMAIL address.

Thank you for using django_logging. We hope this package enhances your Django application's logging capabilities. For more detailed documentation, customization options, and updates, please refer to the official documentation on Read the Docs. If you have any questions or issues, feel free to open an issue on our GitHub repository.
Happy logging!