Splunk Integration
Enrich your Splunk security events with real-time IP threat intelligence from ipinsights.io.
Overview
Splunk is a leading data analytics and SIEM (Security Information and Event Management) platform used by security teams worldwide for log management, real-time monitoring, advanced threat detection, and incident investigation. Its powerful Search Processing Language (SPL) and extensible app framework make it ideal for integrating external threat intelligence feeds.
By integrating ipinsights.io you can automatically enrich every Splunk event containing an IP address with real-time threat intelligence — including reputation scores, geolocation, ASN data, Tor/proxy/VPN detection and blocklist membership — giving your SOC analysts instant context directly within Splunk dashboards and searches.
The integration uses a custom search command built with the Splunk SDK for Python, or alternatively a scripted lookup, to seamlessly call the ipinsights.io API at search time and enrich events with threat data. Results are cached in a KV store collection for performance and visualised on custom dashboards.
Architecture Overview
The diagram below shows the end-to-end data flow when security events are enriched with threat intelligence:
- Data sources (firewalls, IDS/IPS, web servers, etc.) send raw events to the Splunk indexer.
- A scheduled or ad-hoc SPL search extracts unique source IP addresses from security events.
- The custom search command (or scripted lookup) calls the ipinsights.io API for each unique IP.
- The JSON response containing threat score, geolocation, ASN data and blocklist status is parsed and merged into the event.
- Enriched results are written to a KV store collection for caching and fast lookups.
- Custom dashboards, alerts and reports visualise the enriched data for SOC analysts.
Prerequisites
- Splunk Enterprise 8.x+ or Splunk Cloud
- Admin access to the Splunk instance (required for installing apps and custom commands)
- Python 3 available on all search heads (Splunk 8.x+ ships with Python 3 by default)
- The Splunk SDK for Python — pip install splunk-sdk (included when bundled with the app)
- An ipinsights.io API key — available on your profile page (or register for free)
- Outbound HTTPS (port 443) access from Splunk search heads to https://ipinsights.io
Step 1 — Create a Custom Search Command
The heart of the integration is a custom streaming search command that takes an IP field name as an argument, calls the ipinsights.io API for each unique IP, and appends enrichment fields to every event.
Python Script — ipinsights_lookup.py
Create the following Python script in the app's bin/ directory. It uses the
Splunk SDK's StreamingCommand to process events in a streaming fashion:
The command caches results in memory for the duration of the search to avoid redundant
API calls for the same IP address. Adjust CACHE_TTL as needed.
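A minimal sketch of such a command is shown below. It uses the /api/v1/lookup endpoint and X-API-Key header shown later in this guide; the response field names (country, asn, is_tor, is_proxy, on_blocklist) and the credential realm "ipinsights" are illustrative assumptions, so adjust them to the actual API schema and your credential entry:

```python
#!/usr/bin/env python3
# bin/ipinsights_lookup.py — streaming enrichment command (sketch).
import json
import os
import sys
import time
import urllib.request

# Make the bundled SDK in the app's lib/ directory importable.
sys.path.insert(0, os.path.join(os.path.dirname(__file__), "..", "lib"))
from splunklib.searchcommands import dispatch, StreamingCommand, Configuration, Option

API_URL = "https://ipinsights.io/api/v1/lookup?ip={ip}"
CACHE_TTL = 300  # seconds an in-memory entry stays fresh within one search

@Configuration()
class IpInsightsLookupCommand(StreamingCommand):
    ip_field = Option(doc="Name of the field holding the IP address", require=True)

    def __init__(self):
        super().__init__()
        self._cache = {}  # ip -> (timestamp, response dict)

    def _api_key(self):
        # Read the key from Splunk's encrypted credential store, never from the script.
        for cred in self.service.storage_passwords:
            if cred.realm == "ipinsights":
                return cred.clear_password
        raise ValueError("No ipinsights API key found in storage/passwords")

    def _lookup(self, ip, key):
        now = time.time()
        hit = self._cache.get(ip)
        if hit and now - hit[0] < CACHE_TTL:
            return hit[1]
        req = urllib.request.Request(API_URL.format(ip=ip), headers={"X-API-Key": key})
        try:
            with urllib.request.urlopen(req, timeout=5) as resp:
                data = json.load(resp)
        except Exception as exc:
            self.logger.warning("ipinsights lookup failed for %s: %s", ip, exc)
            data = {}
        self._cache[ip] = (now, data)
        return data

    def stream(self, records):
        key = self._api_key()
        for record in records:
            ip = record.get(self.ip_field)
            data = self._lookup(ip, key) if ip else {}
            for field in ("threat_score", "country", "asn",
                          "is_tor", "is_proxy", "on_blocklist"):
                record[field] = data.get(field, "")
            yield record

dispatch(IpInsightsLookupCommand, sys.argv, sys.stdin, sys.stdout, __name__)
```

Once registered in commands.conf, the command is invoked in SPL as, for example, | ipinsightslookup ip_field=src_ip.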
Step 2 — Create the App Directory Structure
Splunk apps follow a standard directory layout. Create the following structure under your Splunk installation:
Create the Structure
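Based on the files referenced throughout this guide, a layout like the following under $SPLUNK_HOME/etc/apps/ would work; the app directory name ipinsights_ta matches the paths used in the Troubleshooting section:

```
ipinsights_ta/
├── bin/
│   ├── ipinsights_lookup.py       # custom streaming search command
│   └── ipinsights_scripted.py     # optional scripted lookup (Step 4)
├── lib/
│   └── splunklib/                 # bundled Splunk SDK for Python
├── default/
│   ├── app.conf
│   ├── commands.conf
│   ├── collections.conf
│   ├── transforms.conf
│   └── savedsearches.conf
└── metadata/
    └── default.meta
```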
Step 3 — Configure the App
default/app.conf
Define the app metadata so Splunk recognises it on the Apps page:
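An illustrative app.conf; the label, author and version strings are placeholders to adapt:

```ini
[install]
is_configured = true

[ui]
is_visible = true
label = ipinsights.io Threat Intelligence

[launcher]
author = Your Name
description = Enrich events with IP threat intelligence from ipinsights.io
version = 1.0.0

[package]
id = ipinsights_ta
```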
default/commands.conf
Register the custom search command so it can be invoked in SPL:
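A commands.conf stanza along these lines registers the command; the stanza name is what you type in SPL (| ipinsightslookup), and chunked = true selects the SDK's version 2 protocol:

```ini
[ipinsightslookup]
filename = ipinsights_lookup.py
chunked = true
python.version = python3
```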
default/collections.conf
Define a KV store collection for caching enrichment results:
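A possible definition, using the ipinsights_cache collection name referenced elsewhere in this guide (the enrichment field names are illustrative). The companion kvstore stanza in default/transforms.conf is what lets you query the collection with lookup, inputlookup and outputlookup:

```ini
# default/collections.conf
[ipinsights_cache]
field.ip = string
field.threat_score = number
field.country = string
field.asn = string
field.is_tor = bool
field.is_proxy = bool
field.on_blocklist = bool
field.last_updated = time

# default/transforms.conf
[ipinsights_cache]
external_type = kvstore
collection = ipinsights_cache
fields_list = ip, threat_score, country, asn, is_tor, is_proxy, on_blocklist, last_updated
```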
metadata/default.meta
Grant global read access so the command is available across all apps:
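A minimal default.meta that exports the app's knowledge objects to all apps and restricts writes to admins:

```ini
[]
access = read : [ * ], write : [ admin ]
export = system
```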
Store the API Key Securely
Use Splunk's credential storage (passwords.conf) to keep your API key encrypted at rest:
Never hard-code your API key in scripts. The custom command retrieves it from
storage/passwords at runtime.
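One way to create the encrypted entry is a POST to the storage/passwords REST endpoint on the search head. The realm ("ipinsights") and username ("api_key") below are illustrative conventions; whatever you choose, the custom command must look up the same realm at runtime:

```shell
curl -k -u admin \
  https://localhost:8089/servicesNS/nobody/ipinsights_ta/storage/passwords \
  -d name=api_key -d realm=ipinsights -d password=YOUR_API_KEY
```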
Step 4 — Alternative: Scripted Lookup
If you prefer Splunk's lookup mechanism over a custom search command, you can create a
scripted lookup that is invoked automatically when you reference it in
SPL with the lookup command.
default/transforms.conf
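A transforms.conf stanza along these lines defines the scripted (external) lookup; fields_list must name the input field plus every output field the script emits (the enrichment field names here are illustrative):

```ini
[ipinsights_scripted]
external_cmd = ipinsights_scripted.py ip
external_type = python
python.version = python3
fields_list = ip, threat_score, country, asn, is_tor, is_proxy, on_blocklist
```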
Python Script — bin/ipinsights_scripted.py
The scripted lookup reads from stdin and writes enriched CSV rows to stdout:
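A sketch of such a script is below. It uses the endpoint and header from the Troubleshooting section; the enrichment field names are illustrative, and for brevity it reads the API key from an environment variable, which the Best Practices section advises against on shared systems, so prefer storage/passwords in production:

```python
#!/usr/bin/env python3
# bin/ipinsights_scripted.py — scripted lookup: reads CSV rows on stdin,
# writes the same rows with enrichment columns filled in on stdout.
import csv
import json
import os
import sys
import urllib.request

API_URL = "https://ipinsights.io/api/v1/lookup?ip={ip}"
FIELDS = ["threat_score", "country", "asn", "is_tor", "is_proxy", "on_blocklist"]

def fetch(ip, api_key, timeout=5):
    """Call the API for one IP; return {} on any failure so the lookup degrades gracefully."""
    req = urllib.request.Request(API_URL.format(ip=ip), headers={"X-API-Key": api_key})
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return json.load(resp)
    except Exception:
        return {}

def merge_row(row, data):
    """Copy each enrichment field into the CSV row (empty string when absent)."""
    for field in FIELDS:
        row[field] = str(data.get(field, ""))
    return row

def main(api_key):
    reader = csv.DictReader(sys.stdin)
    if not reader.fieldnames:  # no rows handed over by Splunk
        return
    writer = csv.DictWriter(sys.stdout, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        data = fetch(row["ip"], api_key) if row.get("ip") else {}
        writer.writerow(merge_row(row, data))

if __name__ == "__main__" and not sys.stdin.isatty():
    main(os.environ.get("IPINSIGHTS_API_KEY", ""))
```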
Using the Scripted Lookup in SPL
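With the transforms.conf stanza in place, the lookup can be invoked like any other; the index and field names below (security, src_ip) are examples to substitute with your own:

```
index=security sourcetype=firewall
| lookup ipinsights_scripted ip AS src_ip OUTPUT threat_score, country, is_tor
| where threat_score > 70
```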
Step 5 — Create a Scheduled Search for Enrichment
Instead of enriching at search time (which can be slow for large result sets), create a scheduled search that runs periodically, enriches unique IPs, and stores the results in the KV store for fast lookups.
SPL — Scheduled Enrichment Search
This search finds unique source IPs from your security index, filters out IPs that are already in the cache, enriches the remainder via the custom command, and writes the results back to the KV store.
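A sketch of that search, assuming an index named security and a src_ip field (substitute your own names):

```
index=security earliest=-15m
| stats count by src_ip
| rename src_ip AS ip
| lookup ipinsights_cache ip OUTPUT threat_score AS cached_score
| where isnull(cached_score)
| ipinsightslookup ip_field=ip
| eval last_updated=now()
| table ip, threat_score, country, asn, is_tor, is_proxy, on_blocklist, last_updated
| outputlookup append=true ipinsights_cache
```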
default/savedsearches.conf
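The same search can be scheduled in savedsearches.conf, here every 15 minutes (the stanza name, index and interval are illustrative):

```ini
[ipinsights_scheduled_enrichment]
enableSched = 1
cron_schedule = */15 * * * *
dispatch.earliest_time = -15m
dispatch.latest_time = now
search = index=security | stats count by src_ip | rename src_ip AS ip \
| lookup ipinsights_cache ip OUTPUT threat_score AS cached_score \
| where isnull(cached_score) | ipinsightslookup ip_field=ip \
| eval last_updated=now() | outputlookup append=true ipinsights_cache
```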
Cache Expiry Search
Add a second scheduled search to expire stale cache entries (e.g. older than 24 hours):
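One way to do this: read the whole collection, keep only entries newer than 24 hours, and write it back without append so stale rows are dropped:

```
| inputlookup ipinsights_cache
| where last_updated >= relative_time(now(), "-24h")
| outputlookup ipinsights_cache
```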
Step 6 — Create Dashboard Panels
Use the cached enrichment data to build informative dashboard panels. Below are SPL queries for common visualisations.
High-Risk IPs Table
Display a table of IPs with the highest threat scores:
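For example, the top 20 cached IPs by threat score:

```
| inputlookup ipinsights_cache
| where threat_score > 70
| sort - threat_score
| table ip, threat_score, country, asn, on_blocklist
| head 20
```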
Threat Score Distribution
Show the distribution of threat scores across all enriched IPs:
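A simple histogram over 10-point score buckets, suitable for a column chart:

```
| inputlookup ipinsights_cache
| bin threat_score span=10
| stats count by threat_score
| sort threat_score
```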
Top Countries Chart
Visualise the top source countries for threat IPs:
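For instance, the ten most common countries among IPs scoring above 50 (the threshold is an example):

```
| inputlookup ipinsights_cache
| where threat_score > 50
| top limit=10 country
```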
Tor / Proxy Detection Panel
Identify IPs using anonymisation services:
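Assuming the cache stores the illustrative boolean fields is_tor and is_proxy:

```
| inputlookup ipinsights_cache
| where is_tor="true" OR is_proxy="true"
| table ip, threat_score, country, is_tor, is_proxy
```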
Step 7 — Set Up Alerts
Create a Splunk alert that triggers whenever a high-risk IP (threat score above 70) is detected in your security events.
Alert SPL Query
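Following the best practice of reading from the cache rather than calling the API at alert time (index and field names are examples):

```
index=security
| dedup src_ip
| lookup ipinsights_cache ip AS src_ip OUTPUT threat_score, country, on_blocklist
| where threat_score > 70
| table _time, src_ip, threat_score, country, on_blocklist
```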
savedsearches.conf Snippet
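An illustrative alert stanza implementing the options listed below (email recipient, schedule and index are placeholders):

```ini
[ipinsights_high_risk_alert]
enableSched = 1
cron_schedule = */5 * * * *
dispatch.earliest_time = -5m
dispatch.latest_time = now
counttype = number of events
relation = greater than
quantity = 0
alert.severity = 4
alert.suppress = 1
alert.suppress.fields = src_ip
alert.suppress.period = 1h
action.email = 1
action.email.to = soc@example.com
search = index=security | dedup src_ip \
| lookup ipinsights_cache ip AS src_ip OUTPUT threat_score \
| where threat_score > 70
```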
Alert Configuration Options
- Trigger condition: When the number of results is greater than 0.
- Throttle: Suppress alerts for the same src_ip for 1 hour to reduce noise.
- Severity: Set to Critical (level 4) for high-risk IPs.
- Actions: Send email, create a notable event (Splunk ES), or trigger a webhook to your ticketing system.
- Adjust the threat_score threshold to match your risk appetite (e.g. 50 for medium, 80 for critical-only).
Verification
After deploying the app and restarting Splunk, verify the integration is working correctly:
Test the Custom Command
Run a simple search in the Splunk Search & Reporting app:
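For example, enrich a single generated event with a well-known IP:

```
| makeresults
| eval ip="8.8.8.8"
| ipinsightslookup ip_field=ip
| table ip, threat_score, country, asn, is_tor
```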
You should see enrichment fields populated with threat intelligence data for the test IP.
Check the KV Store
Verify cached results exist in the KV store collection:
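A quick way to confirm the collection is being populated (last_updated is the illustrative timestamp field used by the scheduled search):

```
| inputlookup ipinsights_cache
| stats count AS cached_ips, max(last_updated) AS newest_entry
| eval newest_entry=strftime(newest_entry, "%F %T")
```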
Validate Enriched Data
Join live security events with cached enrichment data:
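For example, annotate recent events with their cached scores and keep only those that matched the cache (index and field names are examples):

```
index=security earliest=-1h
| lookup ipinsights_cache ip AS src_ip OUTPUT threat_score, country, on_blocklist
| where isnotnull(threat_score)
| table _time, src_ip, threat_score, country, on_blocklist
```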
Best Practices
- Batch processing & rate limiting: Always dedup IPs before calling the custom command to minimise API calls. The scheduled search approach ensures you only look up each IP once.
- KV store caching: Use the KV store as your primary lookup source for dashboards and ad-hoc searches. Reserve real-time API calls for scheduled enrichment only.
- Error handling: The custom command logs warnings for failed API calls. Monitor index=_internal sourcetype=splunkd component=ExternalSearchCommand for errors.
- API key security: Store your API key in Splunk's storage/passwords (encrypted at rest), never in plaintext configuration files or environment variables on shared systems.
- Dedicated search heads: For large deployments, run the enrichment scheduled search on a dedicated search head to avoid impacting interactive users.
- Index-time vs. search-time: Prefer search-time enrichment (as shown in this guide) over index-time to keep raw data intact and allow the enrichment to be updated as threat intelligence evolves.
- Cache TTL: Set an appropriate cache expiry (e.g. 24 hours) to balance freshness with API usage. High-frequency environments may benefit from shorter TTLs.
- Splunk Cloud: If running on Splunk Cloud, submit the app via the App Inspect process and ensure your inputs.conf does not use file-based inputs.
Troubleshooting
Custom command not found
- Verify the app directory exists at $SPLUNK_HOME/etc/apps/ipinsights_ta/.
- Check that commands.conf is in the default/ directory and the filename matches the script.
- Restart Splunk after deploying the app: $SPLUNK_HOME/bin/splunk restart.
- Check btool: $SPLUNK_HOME/bin/splunk btool commands list ipinsightslookup --debug.
Python path or import errors
- Ensure the Splunk SDK for Python is bundled in the lib/ directory of the app.
- Verify Python 3 is configured: check python.version = python3 in commands.conf.
- Test the script directly: $SPLUNK_HOME/bin/splunk cmd python3 $SPLUNK_HOME/etc/apps/ipinsights_ta/bin/ipinsights_lookup.py.
- Check index=_internal "ipinsights_lookup" ERROR for detailed error messages.
Permission denied errors
- Ensure the script is readable and executable by the Splunk user: chmod 755 bin/ipinsights_lookup.py.
- Check ownership: chown -R splunk:splunk $SPLUNK_HOME/etc/apps/ipinsights_ta/.
- On SELinux systems, verify the correct context: ls -Z bin/ipinsights_lookup.py.
API connectivity issues
- Test from the search head: curl -H "X-API-Key: YOUR_KEY" "https://ipinsights.io/api/v1/lookup?ip=8.8.8.8".
- Check firewall/proxy rules allow outbound HTTPS to https://ipinsights.io.
- Look for timeout errors in index=_internal sourcetype=splunkd "ipinsights" "timeout".
- If behind a corporate proxy, set HTTP_PROXY/HTTPS_PROXY environment variables for the Splunk process.
Slow searches or timeouts
- Always dedup IPs before calling the custom command to reduce API calls.
- Use the KV store cache — run | lookup ipinsights_cache ip instead of calling the API at search time.
- Increase the scheduled search frequency if the cache is frequently empty.
- Check your API rate limit — if you're hitting 429 responses, request a higher limit below.
Missing or empty enrichment fields
- Verify the IP field name matches exactly (field names are case-sensitive in Splunk).
- Check that the API key is valid and has not expired on your profile page.
- Ensure the IP values are valid IPv4 or IPv6 addresses (not hostnames or CIDR ranges).
- Test with a known IP: | makeresults | eval ip="8.8.8.8" | ipinsightslookup ip_field=ip.
API Key: You can find your API key on your profile page. Don't have an account yet? Register for free.
Request Higher API Limit
Running a high-volume Splunk deployment? If the default rate limit isn't enough for your environment, submit a request below and we'll review it.