
A universal bridge to connect, map, and automate data transfer between any two REST APIs
ApiLinker is an open-source Python package that simplifies the integration of REST APIs by providing a universal bridging solution. Built for developers, data engineers, and researchers who need to connect different systems without writing repetitive boilerplate code.
Install ApiLinker using pip (Python's package manager):
pip install apilinker
If you're using Windows, you might need to use:
py -m pip install apilinker
Make sure you have Python 3.8 or newer installed. To check your Python version:
python --version
# or
py --version
To install from source (for contributing or customizing):
# Clone the repository
git clone https://github.com/kkartas/apilinker.git
cd apilinker
# Install in development mode with dev dependencies
pip install -e ".[dev]"
# Install with documentation tools
pip install -e ".[docs]"
To verify ApiLinker is correctly installed, run:
python -c "import apilinker; print(apilinker.__version__)"
You should see the version number printed if installation was successful.
New to API integration? Follow this step-by-step guide to get started with ApiLinker.
pip install apilinker
Let's connect to a public API (Weather API) and print some data:
from apilinker import ApiLinker

# Create an API connection
linker = ApiLinker()

# Configure a simple source
linker.add_source(
    type="rest",
    base_url="https://api.openweathermap.org/data/2.5",
    endpoints={
        "get_weather": {
            "path": "/weather",
            "method": "GET",
            "params": {
                "q": "London",
                "appid": "YOUR_API_KEY"  # Get a free key at openweathermap.org
            }
        }
    }
)

# Fetch data from the API
weather_data = linker.fetch("get_weather")

# Print results
print(f"Temperature: {weather_data['main']['temp']} K")
print(f"Conditions: {weather_data['weather'][0]['description']}")
Save the above code as weather.py and run it:
python weather.py
Let's convert the temperature from Kelvin to Celsius:
# Add this to your script
def kelvin_to_celsius(kelvin_value):
    return kelvin_value - 273.15

linker.mapper.register_transformer("kelvin_to_celsius", kelvin_to_celsius)

# Get the temperature in Celsius
temp_kelvin = weather_data['main']['temp']
temp_celsius = linker.mapper.transform(temp_kelvin, "kelvin_to_celsius")
print(f"Temperature: {temp_celsius:.1f}°C")
pip install apilinker
Create a configuration file config.yaml:
source:
  type: rest
  base_url: https://api.example.com/v1
  auth:
    type: bearer
    token: ${SOURCE_API_TOKEN}  # Reference environment variable
  endpoints:
    list_items:
      path: /items
      method: GET
      params:
        updated_since: "{{last_sync}}"  # Template variable
      pagination:
        data_path: data
        next_page_path: meta.next_page
        page_param: page

target:
  type: rest
  base_url: https://api.destination.com/v2
  auth:
    type: api_key
    header: X-API-Key
    key: ${TARGET_API_KEY}
  endpoints:
    create_item:
      path: /items
      method: POST

mapping:
  - source: list_items
    target: create_item
    fields:
      - source: id
        target: external_id
      - source: name
        target: title
      - source: description
        target: body.content
      - source: created_at
        target: metadata.created
        transform: iso_to_timestamp
      # Conditional field mapping
      - source: tags
        target: labels
        condition:
          field: tags
          operator: exists
        transform: lowercase

schedule:
  type: interval
  minutes: 60

logging:
  level: INFO
  file: apilinker.log
Run a sync with:
apilinker sync --config config.yaml
Run a dry run to see what would happen without making changes:
apilinker sync --config config.yaml --dry-run
Run a scheduled sync based on the configuration:
apilinker run --config config.yaml
from apilinker import ApiLinker

# Initialize with config file
linker = ApiLinker(config_path="config.yaml")

# Or configure programmatically
linker = ApiLinker()

# Step 1: Set up your source API connection
linker.add_source(
    type="rest",                        # API type (REST is most common)
    base_url="https://api.github.com",  # Base URL of the API
    auth={                              # Authentication details
        "type": "bearer",               # Using bearer token authentication
        "token": "${GITHUB_TOKEN}"      # Reference to an environment variable
    },
    endpoints={                         # Define API endpoints
        "list_issues": {                # A name you choose for this endpoint
            "path": "/repos/owner/repo/issues",  # API path
            "method": "GET",            # HTTP method
            "params": {"state": "all"}  # Query parameters
        }
    }
)

# Step 2: Set up your target API connection
linker.add_target(
    type="rest",
    base_url="https://gitlab.com/api/v4",
    auth={
        "type": "bearer",
        "token": "${GITLAB_TOKEN}"
    },
    endpoints={
        "create_issue": {
            "path": "/projects/123/issues",
            "method": "POST"  # This endpoint will receive data
        }
    }
)

# Step 3: Define how data maps from source to target
linker.add_mapping(
    source="list_issues",   # Source endpoint name (from Step 1)
    target="create_issue",  # Target endpoint name (from Step 2)
    fields=[                # Field mapping instructions
        {"source": "title", "target": "title"},      # Map source title → target title
        {"source": "body", "target": "description"}  # Map source body → target description
    ]
)

# Step 4: Execute the sync (one-time)
result = linker.sync()
print(f"Synced {result.count} records")

# Step 5 (Optional): Set up scheduled syncing
linker.add_schedule(interval_minutes=60)  # Run every hour
linker.start_scheduled_sync()
from apilinker import ApiLinker
linker = ApiLinker()
ApiLinker uses a YAML configuration format with these main sections: source, target, mapping, schedule, and logging.
Both the source and target sections follow the same format:
source:  # or target:
  type: rest                            # API type
  base_url: https://api.example.com/v1  # Base URL
  auth:                                 # Authentication details
    # ...
  endpoints:                            # API endpoints
    # ...
  timeout: 30                           # Request timeout in seconds (optional)
  retry_count: 3                        # Number of retries (optional)
ApiLinker supports multiple authentication methods:
# API Key Authentication
auth:
  type: api_key
  key: your_api_key    # Or ${API_KEY_ENV_VAR}
  header: X-API-Key    # Header name

# Bearer Token Authentication
auth:
  type: bearer
  token: your_token    # Or ${TOKEN_ENV_VAR}

# Basic Authentication
auth:
  type: basic
  username: your_username  # Or ${USERNAME_ENV_VAR}
  password: your_password  # Or ${PASSWORD_ENV_VAR}

# OAuth2 Client Credentials
auth:
  type: oauth2_client_credentials
  client_id: your_client_id          # Or ${CLIENT_ID_ENV_VAR}
  client_secret: your_client_secret  # Or ${CLIENT_SECRET_ENV_VAR}
  token_url: https://auth.example.com/token
  scope: read write                  # Optional
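When configuring APIs programmatically, the same auth options can be passed to add_source or add_target as a dictionary. The bearer variant appears verbatim in the GitHub example in this guide; the OAuth2 sketch below assumes the dictionary keys mirror the YAML fields shown above:

from apilinker import ApiLinker

linker = ApiLinker()

# OAuth2 client credentials passed programmatically; the keys are assumed to
# mirror the YAML auth block above (token_url, scope, etc.)
linker.add_source(
    type="rest",
    base_url="https://api.example.com/v1",
    auth={
        "type": "oauth2_client_credentials",
        "client_id": "${CLIENT_ID_ENV_VAR}",
        "client_secret": "${CLIENT_SECRET_ENV_VAR}",
        "token_url": "https://auth.example.com/token",
        "scope": "read write"  # Optional
    },
    endpoints={
        "list_items": {"path": "/items", "method": "GET"}
    }
)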
Mappings define how data is transformed between source and target:
mapping:
  - source: source_endpoint_name
    target: target_endpoint_name
    fields:
      # Simple field mapping
      - source: id
        target: external_id
      # Nested field mapping
      - source: user.profile.name
        target: user_name
      # With transformation
      - source: created_at
        target: timestamp
        transform: iso_to_timestamp
      # Multiple transformations
      - source: description
        target: summary
        transform:
          - strip
          - lowercase
      # Conditional mapping
      - source: status
        target: active_status
        condition:
          field: status
          operator: eq  # eq, ne, exists, not_exists, gt, lt
          value: active
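The same mapping can be declared from Python with add_mapping. The sketch below assumes the programmatic field dictionaries accept the same keys as the YAML form (nested dot paths, transform lists, and condition blocks); the endpoint names are the placeholders from the snippet above:

# Programmatic version of the mappings above; field dict keys are assumed
# to mirror the YAML schema
linker.add_mapping(
    source="source_endpoint_name",
    target="target_endpoint_name",
    fields=[
        {"source": "id", "target": "external_id"},
        {"source": "user.profile.name", "target": "user_name"},
        {"source": "created_at", "target": "timestamp", "transform": "iso_to_timestamp"},
        {"source": "description", "target": "summary", "transform": ["strip", "lowercase"]},
        {
            "source": "status",
            "target": "active_status",
            "condition": {"field": "status", "operator": "eq", "value": "active"}
        }
    ]
)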
ApiLinker provides built-in transformers for common operations:
| Transformer | Description |
| --- | --- |
| iso_to_timestamp | Convert ISO date to Unix timestamp |
| timestamp_to_iso | Convert Unix timestamp to ISO date |
| lowercase | Convert string to lowercase |
| uppercase | Convert string to uppercase |
| strip | Remove whitespace from start/end |
| to_string | Convert value to string |
| to_int | Convert value to integer |
| to_float | Convert value to float |
| to_bool | Convert value to boolean |
| default_empty_string | Return empty string if null |
| default_zero | Return 0 if null |
| none_if_empty | Return null if empty string |
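Built-in transformers are referenced by name, either in a mapping's transform field or directly through the mapper, following the same pattern as the kelvin_to_celsius example earlier. A small illustration, assuming built-ins are resolvable by name through the mapper just like registered custom transformers (the input values are made up):

# Apply built-in transformers directly through the mapper
ts = linker.mapper.transform("2023-01-15T09:30:00Z", "iso_to_timestamp")
clean = linker.mapper.transform("  Hello World  ", "strip")
clean = linker.mapper.transform(clean, "lowercase")  # chain by calling again
print(ts, clean)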
You can also create custom transformers:
import re

def phone_formatter(value):
    """Format phone numbers to E.164 format."""
    if not value:
        return None
    digits = re.sub(r'\D', '', value)
    if len(digits) == 10:
        return f"+1{digits}"
    return f"+{digits}"

# Register with ApiLinker
linker.mapper.register_transformer("phone_formatter", phone_formatter)
The following example migrates issues from GitHub to GitLab:

from apilinker import ApiLinker

# github_token, gitlab_token, owner, repo, and project_id are assumed to be
# defined beforehand (for example, loaded from environment variables)

# Configure ApiLinker
linker = ApiLinker(
    source_config={
        "type": "rest",
        "base_url": "https://api.github.com",
        "auth": {"type": "bearer", "token": github_token},
        "endpoints": {
            "list_issues": {
                "path": f"/repos/{owner}/{repo}/issues",
                "method": "GET",
                "params": {"state": "all"},
                "headers": {"Accept": "application/vnd.github.v3+json"}
            }
        }
    },
    target_config={
        "type": "rest",
        "base_url": "https://gitlab.com/api/v4",
        "auth": {"type": "bearer", "token": gitlab_token},
        "endpoints": {
            "create_issue": {
                "path": f"/projects/{project_id}/issues",
                "method": "POST"
            }
        }
    }
)

# Custom transformer for labels
linker.mapper.register_transformer(
    "github_labels_to_gitlab",
    lambda labels: [label["name"] for label in labels] if labels else []
)

# Add mapping
linker.add_mapping(
    source="list_issues",
    target="create_issue",
    fields=[
        {"source": "title", "target": "title"},
        {"source": "body", "target": "description"},
        {"source": "labels", "target": "labels", "transform": "github_labels_to_gitlab"},
        {"source": "state", "target": "state"}
    ]
)

# Run the migration
result = linker.sync()
print(f"Migrated {result.count} issues from GitHub to GitLab")
See the examples directory for more use cases.
This example shows how to sync customer data from CRM to a marketing platform:
from apilinker import ApiLinker
import os

# Set environment variables securely before running
# os.environ["CRM_API_KEY"] = "your_crm_api_key"
# os.environ["MARKETING_API_KEY"] = "your_marketing_api_key"

# Initialize ApiLinker
linker = ApiLinker()

# Configure CRM source
linker.add_source(
    type="rest",
    base_url="https://api.crm-platform.com/v2",
    auth={
        "type": "api_key",
        "header": "X-API-Key",
        "key": "${CRM_API_KEY}"  # Uses environment variable
    },
    endpoints={
        "get_customers": {
            "path": "/customers",
            "method": "GET",
            "params": {"last_modified_after": "2023-01-01"}
        }
    }
)

# Configure marketing platform target
linker.add_target(
    type="rest",
    base_url="https://api.marketing-platform.com/v1",
    auth={
        "type": "api_key",
        "header": "Authorization",
        "key": "${MARKETING_API_KEY}"  # Uses environment variable
    },
    endpoints={
        "create_contact": {
            "path": "/contacts",
            "method": "POST"
        }
    }
)

# Define field mapping with transformations
linker.add_mapping(
    source="get_customers",
    target="create_contact",
    fields=[
        {"source": "id", "target": "external_id"},
        {"source": "first_name", "target": "firstName"},
        {"source": "last_name", "target": "lastName"},
        {"source": "email", "target": "emailAddress"},
        {"source": "phone", "target": "phoneNumber", "transform": "format_phone"},
        # Custom field creation with default value
        {"target": "source", "value": "CRM Import"}
    ]
)

# Register a custom transformer for phone formatting
def format_phone(phone):
    if not phone:
        return ""
    # Remove non-digits
    digits = ''.join(c for c in phone if c.isdigit())
    # Format as (XXX) XXX-XXXX for US numbers
    if len(digits) == 10:
        return f"({digits[0:3]}) {digits[3:6]}-{digits[6:10]}"
    return phone

linker.mapper.register_transformer("format_phone", format_phone)

# Execute the sync
result = linker.sync()
print(f"Synced {result.count} customers to marketing platform")
This example collects weather data hourly and saves to a CSV file:
from apilinker import ApiLinker
import csv
import datetime
import time
import os

# Create a function to handle the collected data
def save_weather_data(data, city):
    timestamp = datetime.datetime.now().strftime("%Y-%m-%d %H:%M:%S")
    # Create CSV if it doesn't exist
    file_exists = os.path.isfile(f"{city}_weather.csv")
    with open(f"{city}_weather.csv", mode='a', newline='') as file:
        writer = csv.writer(file)
        # Write header if file is new
        if not file_exists:
            writer.writerow(["timestamp", "temperature", "humidity", "conditions"])
        # Write data
        writer.writerow([
            timestamp,
            data['main']['temp'] - 273.15,  # Convert K to C
            data['main']['humidity'],
            data['weather'][0]['description']
        ])
    print(f"Weather data saved for {city} at {timestamp}")

# Initialize ApiLinker
linker = ApiLinker()

# Configure weather API
linker.add_source(
    type="rest",
    base_url="https://api.openweathermap.org/data/2.5",
    endpoints={
        "get_london_weather": {
            "path": "/weather",
            "method": "GET",
            "params": {
                "q": "London,uk",
                "appid": "YOUR_API_KEY"  # Replace with your API key
            }
        },
        "get_nyc_weather": {
            "path": "/weather",
            "method": "GET",
            "params": {
                "q": "New York,us",
                "appid": "YOUR_API_KEY"  # Replace with your API key
            }
        }
    }
)

# Create a custom handler for the weather data
def collect_weather():
    london_data = linker.fetch("get_london_weather")
    nyc_data = linker.fetch("get_nyc_weather")
    save_weather_data(london_data, "London")
    save_weather_data(nyc_data, "NYC")

# Run once to test
collect_weather()

# Then schedule to run hourly
linker.add_schedule(interval_minutes=60, callback=collect_weather)
linker.start_scheduled_sync()

# Keep the script running
try:
    print("Weather data collection started. Press Ctrl+C to stop.")
    while True:
        time.sleep(60)
except KeyboardInterrupt:
    print("Weather data collection stopped.")
ApiLinker can be extended through plugins. Here's how to create a custom transformer plugin:
from apilinker.core.plugins import TransformerPlugin

class SentimentAnalysisTransformer(TransformerPlugin):
    """A transformer plugin that analyzes text sentiment."""
    plugin_name = "sentiment_analysis"  # This name is used to reference the plugin
    version = "1.0.0"                   # Optional version information
    author = "Your Name"                # Optional author information

    def transform(self, value, **kwargs):
        # Simple sentiment analysis (example)
        if not value or not isinstance(value, str):
            return {"sentiment": "neutral", "score": 0.0}
        # Add your sentiment analysis logic here
        positive_words = ["good", "great", "excellent"]
        negative_words = ["bad", "poor", "terrible"]
        # Count positive and negative words
        text = value.lower()
        positive_count = sum(1 for word in positive_words if word in text)
        negative_count = sum(1 for word in negative_words if word in text)
        # Calculate sentiment score
        total = positive_count + negative_count
        score = 0.0 if total == 0 else (positive_count - negative_count) / total
        return {
            "sentiment": "positive" if score > 0 else "negative" if score < 0 else "neutral",
            "score": score
        }
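Before wiring the plugin into a mapping, you can sanity-check its transform method directly. A minimal sketch, assuming the plugin base class can be instantiated without arguments; the review text is made up for illustration:

# Quick standalone check of the plugin's transform logic
plugin = SentimentAnalysisTransformer()  # assumes a no-argument constructor
result = plugin.transform("The support was excellent but the docs are poor")
print(result)  # one positive and one negative word -> {"sentiment": "neutral", "score": 0.0}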
After creating your plugin, register it before using it:
from apilinker import ApiLinker

# Import your custom plugin
from my_plugins import SentimentAnalysisTransformer

# Initialize ApiLinker
linker = ApiLinker()

# Register the plugin
linker.plugin_manager.register_plugin(SentimentAnalysisTransformer)

# Configure APIs and mappings...
linker.add_mapping(
    source="get_reviews",
    target="save_analysis",
    fields=[
        {"source": "user_id", "target": "user_id"},
        # Use your custom plugin to transform the review text
        {"source": "review_text", "target": "sentiment_data", "transform": "sentiment_analysis"}
    ]
)
Common issues and fixes:

Package not found error
ERROR: Could not find a version that satisfies the requirement apilinker
Upgrade pip and retry the installation:
pip install --upgrade pip

Import errors
ImportError: No module named 'apilinker'
Confirm the package is installed, then reinstall if necessary:
pip list | grep apilinker
pip install --force-reinstall apilinker

API connection failures
ConnectionError: Failed to establish connection to api.example.com
Check the base_url in your configuration and your network connectivity.

Authentication errors
AuthenticationError: Invalid credentials
Verify that the referenced environment variables are set and that the auth type matches what the API expects.

Field not found errors
KeyError: 'Field not found in source data: user_profile'
Use dot notation for nested fields (for example, user.profile.name) and confirm the field exists in the source response.

Transformation errors
ValueError: Invalid data for transformer 'iso_to_timestamp'
Check that the source value is in the format the transformer expects.
from apilinker import ApiLinker
import time

linker = ApiLinker()

# Configure with retry settings
linker.add_source(
    type="rest",
    base_url="https://api.example.com",
    retry={
        "max_attempts": 5,
        "delay_seconds": 2,
        "backoff_factor": 2,  # Exponential backoff
        "status_codes": [429, 500, 502, 503, 504]  # Retry on these status codes
    },
    endpoints={
        "get_data": {"path": "/data", "method": "GET"}
    }
)

# Example of manual handling with wait periods
try:
    data = linker.fetch("get_data")
    print("Success!")
except Exception as e:
    if "rate limit" in str(e).lower():
        print("Rate limited, waiting and trying again...")
        time.sleep(60)  # Wait 1 minute
        data = linker.fetch("get_data")  # Try again
    else:
        raise e
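For APIs with strict quotas, the manual pattern above can be wrapped in a small helper that retries linker.fetch with exponential backoff. This is a generic sketch, not an ApiLinker feature; the max_attempts and base_delay parameters are illustrative:

import time

def fetch_with_backoff(linker, endpoint, max_attempts=5, base_delay=2.0):
    """Retry linker.fetch() with exponential backoff on rate-limit errors (generic helper)."""
    for attempt in range(max_attempts):
        try:
            return linker.fetch(endpoint)
        except Exception as exc:
            # Only back off on rate-limit style failures; re-raise anything else
            if "rate limit" not in str(exc).lower() or attempt == max_attempts - 1:
                raise
            wait = base_delay * (2 ** attempt)
            print(f"Rate limited, retrying in {wait:.0f}s (attempt {attempt + 1}/{max_attempts})")
            time.sleep(wait)

data = fetch_with_backoff(linker, "get_data")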
Documentation is available in the /docs directory and will be hosted online soon.
For developers who want to extend ApiLinker or understand its internals, we provide comprehensive API reference documentation that can be generated using Sphinx:
# Install Sphinx and required packages
pip install sphinx sphinx-rtd-theme myst-parser
# Generate HTML documentation
cd docs/sphinx_setup
sphinx-build -b html . _build/html
The generated documentation will be available in docs/sphinx_setup/_build/html/index.html
When working with APIs that require authentication, follow these security best practices:
Never hardcode credentials in your code or configuration files. Always use environment variables or secure credential stores.
API Key Storage: Use environment variables referenced in configuration with the ${ENV_VAR} syntax.
auth:
  type: api_key
  header: X-API-Key
  key: ${MY_API_KEY}
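A common pattern is to export the variable in the environment that runs the sync and let ApiLinker resolve it when the configuration is loaded. A minimal sketch using the MY_API_KEY variable and config.yaml file from the snippets above; the explicit check before constructing the linker is just a defensive habit, not an ApiLinker requirement:

import os
from apilinker import ApiLinker

# Fail fast if the credential is missing rather than sending an unauthenticated request
if "MY_API_KEY" not in os.environ:
    raise RuntimeError("Set MY_API_KEY in the environment before running this sync")

# ${MY_API_KEY} in config.yaml is resolved from the environment at load time
linker = ApiLinker(config_path="config.yaml")
result = linker.sync()
print(f"Synced {result.count} records")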
OAuth Security: For OAuth flows, ensure credentials are stored securely and token refresh is handled properly.
Credential Validation: ApiLinker performs validation checks on authentication configurations to prevent common security issues.
HTTPS Only: ApiLinker enforces HTTPS for production API endpoints by default. Override only in development environments with explicit configuration.
Rate Limiting: Built-in rate limiting prevents accidental API abuse that could lead to account suspension.
Audit Logging: Enable detailed logging for security-relevant events with:
logging:
  level: INFO
  security_audit: true
Contributions are welcome! Please see our Contributing Guide for details.
1. Create a feature branch: git checkout -b feature/amazing-feature
2. Install development dependencies: pip install -e ".[dev]"
3. Run the test suite: pytest
4. Commit your changes: git commit -m 'Add amazing feature'
5. Push the branch and open a pull request: git push origin feature/amazing-feature

If you use ApiLinker in your research, please cite:
@software{apilinker2025,
  author = {Kartas, Kyriakos},
  title = {ApiLinker: A Universal Bridge for REST API Integrations},
  url = {https://github.com/kkartas/apilinker},
  version = {0.2.0},
  year = {2025},
  doi = {10.21105/joss.12345}
}
This project is licensed under the MIT License - see the LICENSE file for details.