Splunk Search API Receiver

This receiver collects Splunk events using the Splunk Search API.

Supported Pipelines

  • Logs

Prerequisites

  • Splunk admin credentials
  • Configured storage extension

Use Case

Unlike other receivers, the SSAPI receiver is not built to collect live data. Instead, it collects a finite set of historical data and transfers it to a destination, preserving the timestamps from the source. For this reason, the SSAPI receiver only needs to run until all Splunk events have been migrated, which is signaled by the log message "all search results exported". Until that message (or an error) is printed, avoid cancelling the collector, as doing so interferes with the receiver's ability to protect against writing duplicate events.

Configuration

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| endpoint | string | required (no default) | The endpoint of the Splunk instance to collect from. |
| splunk_username | string | (no default) | The username used to authenticate to Splunk using basic auth. |
| splunk_password | string | (no default) | The password used to authenticate to Splunk using basic auth. |
| auth_token | string | (no default) | The token used to authenticate to Splunk using token auth. Mutually exclusive with basic auth (splunk_username and splunk_password). |
| token_type | string | (no default) | The type of token used to authenticate with auth_token. Accepted values are "Bearer" or "Splunk". |
| job_poll_interval | duration | 5s | How long to wait between API calls polling for search job completion. |
| searches.query | string | required (no default) | The Splunk search to run to retrieve the desired events. Queries must start with search and must not contain additional commands or time fields (e.g. earliesttime). |
| searches.earliest_time | string | required (no default) | The earliest timestamp to collect logs from. Only logs that occurred at or after this timestamp will be collected. Must be in 'yyyy-MM-ddTHH:mm' format (UTC). |
| searches.latest_time | string | required (no default) | The latest timestamp to collect logs from. Only logs that occurred at or before this timestamp will be collected. Must be in 'yyyy-MM-ddTHH:mm' format (UTC). |
| searches.event_batch_size | int | 100 | The number of events to query from Splunk in a single request. |
| storage | component | (no default) | The component ID of a storage extension to use when polling for logs. The storage extension prevents duplication of data after an exporter error by remembering which events were previously exported. This should be configured in all production environments. |

Example Configuration

receivers:
  splunksearchapi:
    endpoint: "https://splunk-c4-0.example.localnet:8089"
    tls:
      insecure_skip_verify: true
    splunk_username: "user"
    splunk_password: "pass"
    job_poll_interval: 5s
    searches:
      - query: 'search index=my_index'
        earliest_time: "2024-11-01T01:00:00.000-05:00"
        latest_time: "2024-11-30T23:59:59.999-05:00"
        event_batch_size: 500
    storage: file_storage

extensions:
  file_storage:
    directory: "./local/storage"
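
To use token authentication instead of basic auth, set auth_token and token_type in place of splunk_username and splunk_password, per the configuration table above. A minimal sketch (the SPLUNK_TOKEN environment variable is illustrative, not part of the receiver):

receivers:
  splunksearchapi:
    endpoint: "https://splunk-c4-0.example.localnet:8089"
    auth_token: "${env:SPLUNK_TOKEN}" # hypothetical env var holding the token
    token_type: "Bearer"
    job_poll_interval: 5s
    searches:
      - query: 'search index=my_index'
        earliest_time: "2024-11-01T01:00:00.000-05:00"
        latest_time: "2024-11-30T23:59:59.999-05:00"
    storage: file_storage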

How To

Migrate historical events to Google Cloud Logging

  1. Identify the Splunk index to migrate events from. Create a Splunk search to capture the events from that index. This will be the searches.query you pass to the receiver.
    • Example: search index=my_index
    • Note: Queries must begin with the explicit search command and must not include additional commands or time fields (e.g. earliesttime)
  2. Determine the timeframe you want to migrate events from, and set the searches.earliest_time and searches.latest_time config fields accordingly.
    • To migrate events from December 2024, EST (UTC-5):
      • earliest_time: "2024-12-01T00:00:00.000-05:00"
      • latest_time: "2024-12-31T23:59:59.999-05:00"
    • Note: By default, Google Cloud Logging (GCL) will not accept logs with a timestamp older than 30 days. Contact Google to modify this rule.
  3. Repeat steps 1 & 2 for each index you wish to collect from. (A multi-search sketch appears after the full configuration below.)
  4. Configure a storage extension to store checkpointing data for the receiver.
  5. Configure the rest of the receiver fields according to your Splunk environment.
  6. Add a googlecloud exporter to your config. Configure the exporter to send to a GCP project where your service account has the Logging Admin role. To check the permissions of service accounts in your project, go to the IAM page.
  7. Disable the sending_queue field on the GCP exporter. The sending queue introduces an asynchronous step to the pipeline, which will jeopardize the receiver's ability to checkpoint correctly and recover from errors. For this same reason, avoid using any asynchronous processors (e.g., batch processor).

After following these steps, your configuration should look something like this:

receivers:
  splunksearchapi:
    endpoint: "https://splunk-c4-0.example.localnet:8089"
    tls:
      insecure_skip_verify: true
    splunk_username: "user"
    splunk_password: "pass"
    job_poll_interval: 5s
    searches:
      - query: 'search index=my_index'
        earliest_time: "2024-12-01T00:00:00.000-05:00"
        latest_time: "2024-12-31T23:59:59.999-05:00"
        event_batch_size: 500
    storage: file_storage
exporters:
  googlecloud:
    project: "my-gcp-project"
    log:
      default_log_name: "splunk-events"
    sending_queue:
      enabled: false

extensions:
  file_storage:
    directory: "./local/storage"

service:
  extensions: [file_storage]
  pipelines:
    logs:
      receivers: [splunksearchapi]
      exporters: [googlecloud]
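
If you are collecting from multiple indexes (step 3), supply one searches entry per query. A sketch of the searches fragment, where my_other_index is a hypothetical second index:

    searches:
      - query: 'search index=my_index'
        earliest_time: "2024-12-01T00:00:00.000-05:00"
        latest_time: "2024-12-31T23:59:59.999-05:00"
      - query: 'search index=my_other_index' # hypothetical second index
        earliest_time: "2024-12-01T00:00:00.000-05:00"
        latest_time: "2024-12-31T23:59:59.999-05:00"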

You are now ready to migrate events from Splunk to Google Cloud Logging.
