SIEM Integration

Sumo Logic Integration

Forward database audit events to Sumo Logic for cloud-native log management. Leverage Sumo Logic's analytics and visualization for comprehensive database security monitoring.

HTTP Source

Direct log ingestion via a Sumo Logic HTTP Source on a Hosted Collector for real-time data streaming.

Field Extraction

Automatic JSON parsing and field extraction for powerful search queries.

Dashboards

Build interactive dashboards for database security visibility and reporting.

Configuration Reference

1 Connection Settings

| Field | Type | Required | Default | Description |
|---|---|---|---|---|
| name | string | Yes | - | A unique name for this SIEM connection (e.g., "sumologic-prod") |
| provider | select | Yes | sumo-logic | SIEM provider - select "Sumo Logic" |
| enabled | boolean | No | true | Enable or disable event forwarding |
| http_source_url | string | Yes | - | Sumo Logic HTTP Source URL for log collection |

2 Sumo Logic-Specific Settings

| Field | Type | Required | Default | Description |
|---|---|---|---|---|
| source_category | string | No | dbaudit/events | Source category for log organization |
| source_name | string | No | dbaudit | Source name identifier |
| source_host | string | No | - | Override source host (defaults to DB Audit server hostname) |
| fields | object | No | - | Additional metadata fields to attach to logs |
| compress | boolean | No | true | Enable gzip compression for HTTP requests |

3 Event Filtering

| Field | Type | Required | Default | Description |
|---|---|---|---|---|
| event_types | multiselect | No | all | Event types to forward: audit_events, alerts, ai_detections, policy_violations, classification_findings |
| severity_filter | multiselect | No | all | Filter by severity: critical, warning, info |
| database_filter | array | No | - | Limit to specific databases (empty = all databases) |
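A minimal sketch of how these three filters might combine before an event is forwarded. The `should_forward` helper and its signature are illustrative, not part of DB Audit; an empty filter matches everything, per the defaults above.

```python
def should_forward(event, event_types=None, severity_filter=None, database_filter=None):
    """Return True if the event passes all configured filters.

    A None/empty filter means "match everything", mirroring the
    defaults in the Event Filtering table (all types, all
    severities, all databases). Hypothetical helper for illustration.
    """
    if event_types and event["event_type"] not in event_types:
        return False
    if severity_filter and event["severity"] not in severity_filter:
        return False
    if database_filter and event["source"]["database"] not in database_filter:
        return False
    return True
```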

4 Batching & Reliability

| Field | Type | Required | Default | Description |
|---|---|---|---|---|
| batch_size | number | No | 100 | Number of events per batch (1-1000) |
| flush_interval_seconds | number | No | 30 | Maximum time between flushes (5-300 seconds) |
| retry_attempts | number | No | 3 | Number of retry attempts on failure |
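Sumo Logic's HTTP Source accepts multiple messages per request as newline-delimited JSON, which pairs naturally with the batching and compression settings above. A minimal Python sketch of building one batch request body (`build_batch_payload` is a hypothetical helper, not DB Audit's internal code):

```python
import gzip
import json

def build_batch_payload(events, compress=True):
    """Serialize a batch of events as newline-delimited JSON and
    optionally gzip-compress it, setting the matching HTTP headers.
    A sender would POST this body to the http_source_url, flushing
    whenever batch_size events accumulate or flush_interval_seconds
    elapses."""
    body = "\n".join(json.dumps(e) for e in events).encode("utf-8")
    headers = {"Content-Type": "application/json"}
    if compress:
        body = gzip.compress(body)
        headers["Content-Encoding"] = "gzip"
    return body, headers
```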

Setup Instructions

Step 1: Create HTTP Source

Create an HTTP Source in Sumo Logic to receive DB Audit events.

# Create HTTP Source in Sumo Logic:
# 1. Navigate to Manage Data → Collection
# 2. Click "Add Collector" → "Hosted Collector"
# 3. Name it "DB Audit Collector"
# 4. Click "Add Source" → "HTTP Logs & Metrics"
# 5. Configure the source:
#    - Name: DB Audit Events
#    - Source Category: dbaudit/events
#    - Enable Timestamp Parsing
#    - Timestamp Format: Auto
# 6. Copy the HTTP Source URL

# Example URL format:
# https://endpoint1.collection.us2.sumologic.com/receiver/v1/http/ZaVnC4dhaV1...

# Regional endpoints:
# - US1: https://endpoint1.collection.sumologic.com
# - US2: https://endpoint2.collection.us2.sumologic.com
# - EU: https://endpoint4.collection.eu.sumologic.com
# - AU: https://endpoint6.collection.au.sumologic.com
# - JP: https://endpoint7.collection.jp.sumologic.com
              
Step 2: Test HTTP Source

Verify the HTTP Source can receive logs.

# Test Sumo Logic HTTP Source
curl -X POST "YOUR_HTTP_SOURCE_URL" \
    -H "Content-Type: application/json" \
    -H "X-Sumo-Category: dbaudit/events" \
    -H "X-Sumo-Name: dbaudit" \
    -d '{
      "timestamp": "2024-01-15T10:30:45.123Z",
      "event_type": "test",
      "message": "Test event from DB Audit"
    }'

# Expected response: empty (HTTP 200)

# Verify in Sumo Logic:
# Log Search: _sourceCategory=dbaudit/events event_type=test
              
Step 3: Create Field Extraction Rule (Optional)

Create a field extraction rule for easier querying.

# Create Field Extraction Rule in Sumo Logic
# Manage Data → Logs → Field Extraction Rules → Add

Name: DB Audit Event Fields
Scope: _sourceCategory=dbaudit/events

Parse Expression:
| json auto
| json field=source "database", "db_type", "host" as db_name, db_type, db_host
| json field=actor "user", "client_ip", "application" as user, client_ip, application
| json field=action "type", "object", "rows_affected" as action_type, object, rows_affected
| json field=classification "contains_pii" as contains_pii

# Or use auto-parsing:
| json auto keys "event_type", "severity", "source.database", "actor.user", "action.type"
              

Tip: Field extraction rules make querying much easier by pre-parsing JSON fields.

Step 4: Configure in DB Audit

Add the Sumo Logic integration in the DB Audit dashboard.

  1. Navigate to Integrations → SIEM in DB Audit
  2. Click Add SIEM Integration
  3. Select Sumo Logic as the provider
  4. Paste your HTTP Source URL
  5. Configure source category and name
  6. Select event types to forward
  7. Test the connection and save
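Put together, a connection using the settings from the Configuration Reference might look like the following. This is an illustrative sketch of the fields only, not an exact DB Audit export format, and the URL is a placeholder:

```json
{
  "name": "sumologic-prod",
  "provider": "sumo-logic",
  "enabled": true,
  "http_source_url": "https://endpoint2.collection.us2.sumologic.com/receiver/v1/http/YOUR_TOKEN",
  "source_category": "dbaudit/events",
  "source_name": "dbaudit",
  "compress": true,
  "event_types": ["audit_events", "alerts", "policy_violations"],
  "severity_filter": ["critical", "warning"],
  "batch_size": 100,
  "flush_interval_seconds": 30,
  "retry_attempts": 3
}
```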

Event Format

Events are sent to Sumo Logic in JSON format with all audit data.

{
  "timestamp": "2024-01-15T10:30:45.123Z",
  "event_type": "audit_event",
  "severity": "warning",
  "source": {
    "database": "production-postgres",
    "db_type": "postgresql",
    "host": "db.example.com"
  },
  "actor": {
    "user": "app_user",
    "client_ip": "10.0.1.50",
    "application": "backend-api"
  },
  "action": {
    "type": "SELECT",
    "object": "customers",
    "schema": "public",
    "statement": "SELECT * FROM customers WHERE...",
    "rows_affected": 1500
  },
  "classification": {
    "contains_pii": true,
    "data_types": ["email", "phone"]
  }
}
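Sumo Logic's json auto parsing exposes nested event fields under dotted paths (e.g. %"actor.user", %"source.database"), which is how the sample queries below reference them. A small Python sketch of that flattening applied to an event like the one above (the `flatten` helper is illustrative):

```python
def flatten(obj, prefix=""):
    """Flatten nested JSON objects into dotted keys, the shape that
    json auto parsing exposes for querying (e.g. "actor.user").
    Illustrative helper; lists and scalars are kept as-is."""
    out = {}
    for key, value in obj.items():
        path = f"{prefix}{key}"
        if isinstance(value, dict):
            out.update(flatten(value, path + "."))
        else:
            out[path] = value
    return out
```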
        

Sample Sumo Logic Queries

Use these queries to search and analyze DB Audit events in Sumo Logic.

// Sumo Logic Log Search Queries

// All DB Audit events
_sourceCategory=dbaudit/events

// Filter by severity
_sourceCategory=dbaudit/events
| json auto
| where severity = "critical"

// Large data access patterns
_sourceCategory=dbaudit/events
| json field=action "type", "rows_affected" as action_type, rows_affected
| where action_type = "SELECT" and rows_affected > 10000
| count by %"actor.user"

// User activity timeline
_sourceCategory=dbaudit/events
| json field=actor "user" as user
| where user = "app_user"
| timeslice 1h
| count by _timeslice

// Failed authentication attempts
_sourceCategory=dbaudit/events
| json auto
| where event_type = "auth_failure"
| count by %"actor.client_ip"
| sort by _count desc

// PII access monitoring
_sourceCategory=dbaudit/events
| json field=classification "contains_pii" as pii
| where pii = "true"
| count by %"actor.user", %"source.database"

// Database activity summary
_sourceCategory=dbaudit/events
| json field=source "database" as database
| json field=action "type" as action_type
| count by database, action_type
| transpose row database column action_type
        

Sample Scheduled Search

Create scheduled searches to detect suspicious database activity.

// Sumo Logic Scheduled Search / Monitor
// Manage Data → Monitoring → Monitors → Add

Name: Suspicious Database Data Exfiltration
Description: Detects queries returning large amounts of potentially sensitive data

Query:
_sourceCategory=dbaudit/events
| json field=action "type", "rows_affected" as action_type, rows_affected
| json field=classification "contains_pii" as pii
| where action_type = "SELECT" and rows_affected > 10000 and pii = "true"
| count by %"actor.user", %"source.database"
| where _count > 0

Trigger Type: Critical
Alert Condition: Greater than 0 results
Time Range: Last 15 minutes
Run Frequency: Every 5 minutes

Alert Payload:
- Send to Slack webhook
- Create ServiceNow incident
- Send email notification
        

Troubleshooting

Connection refused

Verify the HTTP Source URL is correct and the collector is active. Check network connectivity.

Logs not appearing

Check the source category filter in Log Search. It may take a few minutes for logs to be indexed.

JSON fields not parsing

Use the json auto operator or create a Field Extraction Rule to parse nested JSON fields.

Rate limiting

If you see throttling errors, reduce batch_size or contact Sumo Logic support for higher limits.
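A common client-side mitigation is to back off exponentially between retries. A minimal sketch of such a delay schedule, capped by the retry_attempts setting above (the doubling schedule and cap are assumptions for illustration, not documented DB Audit behavior):

```python
def backoff_delays(retry_attempts, base=1.0, cap=30.0):
    """Exponential backoff schedule: base, 2*base, 4*base, ...,
    capped at `cap` seconds. A sender would sleep for each delay
    after a throttled (HTTP 429) response before retrying the batch.
    Hypothetical helper for illustration."""
    return [min(base * (2 ** i), cap) for i in range(retry_attempts)]
```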

Ready to Integrate with Sumo Logic?

Start forwarding database audit events to Sumo Logic in minutes.