Cloud Data Warehouse

BigQuery Connector

Connect DB Audit to Google BigQuery for comprehensive query monitoring, data access tracking, and compliance reporting with native Cloud Audit Logs integration.

Native Deep Integration

Cloud Audit Native

Direct GCP Cloud Logging integration

IAM Identity Context

Full user/service account attribution

Zero Performance Impact

GCP-side logging only

What You Can Monitor

Query Audit Logging

Track every SQL query with full query text, job statistics, bytes processed, and user attribution.

IAM Access Tracking

Monitor data access by service accounts, users, and groups with full GCP identity context.

Dataset Changes

Detect and alert on dataset creation, table modifications, and schema changes in real-time.

Cost & Usage Analysis

Track bytes scanned, slot usage, and query costs for security and billing correlation.
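As a sketch of the kind of cost attribution this enables, the query below sums bytes processed per user over the past 7 days via the bq CLI. It assumes the `region-us` INFORMATION_SCHEMA; the project ID and region are placeholders for your own values.

```shell
# Hypothetical example: per-user bytes processed over the last 7 days.
# Replace your-project-id and region-us with your own project and location.
bq --project_id=your-project-id query --use_legacy_sql=false '
SELECT
  user_email,
  COUNT(*) AS job_count,
  ROUND(SUM(total_bytes_processed) / POW(10, 12), 3) AS tb_processed
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
GROUP BY user_email
ORDER BY tb_processed DESC'
```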

Requirements

GCP Project: Active GCP project with BigQuery enabled
Cloud Logging: Cloud Audit Logs enabled for BigQuery
Authentication: Service account or user credentials
Required Roles: roles/bigquery.dataViewer, roles/bigquery.jobUser, roles/logging.viewer

Configuration Reference

GCP Connection

| Field | Type | Required | Description |
| --- | --- | --- | --- |
| name | string | Yes | A unique name to identify this connection |
| project_id | string | Yes | GCP project ID containing BigQuery datasets |
| location | string | No | BigQuery dataset location (US, EU, or a specific region) |
| credentials_json | password | No | Service account JSON key (stored encrypted) |
| credentials_path | string | No | Path to service account JSON key file |

Log Collection

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| log_collection_type | select | cloud_logging | Method for collecting audit logs |
| log_filter | string | - | Cloud Logging filter for custom log selection |
| polling_interval | number | 60 | Seconds between log collection polls |
| history_days | number | 7 | Days of history to query |
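A `log_filter` value is an ordinary Cloud Logging filter expression, so you can dry-run a candidate filter with gcloud before putting it in your DB Audit config. The example below is a sketch that narrows collection to completed BigQuery jobs; the project ID is a placeholder, and the `jobservice.jobcompleted` method name is the legacy BigQuery audit log entry, so verify it matches the log format in your project.

```shell
# Sketch: dry-run a candidate log_filter with gcloud before configuring it.
# Replace your-project-id with your own project.
gcloud logging read '
  resource.type="bigquery_resource"
  AND protoPayload.serviceName="bigquery.googleapis.com"
  AND protoPayload.methodName="jobservice.jobcompleted"' \
  --project=your-project-id \
  --limit=5 \
  --format=json
```

If the command returns the entries you expect, the same filter string can be pasted into the `log_filter` field unchanged.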

Log Collection Methods

DB Audit supports multiple methods for collecting audit data from BigQuery.

Cloud Audit Logs (Recommended)

Collect audit events from GCP Cloud Logging for comprehensive BigQuery activity tracking.

All API operations
IAM identity context
Admin and data access logs
Long retention
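The IAM identity context mentioned above lives in each audit entry's `protoPayload.authenticationInfo` field. As a quick sketch (project ID is a placeholder), this lists the distinct principals that have recently touched BigQuery:

```shell
# Sketch: list which principals (users and service accounts) appear
# in recent BigQuery audit log entries.
gcloud logging read 'resource.type="bigquery_resource"' \
  --project=your-project-id \
  --limit=20 \
  --format="value(protoPayload.authenticationInfo.principalEmail)" | sort -u
```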

INFORMATION_SCHEMA

Query BigQuery INFORMATION_SCHEMA views for job and query history.

Detailed job stats
Bytes processed
Slot utilization
No Cloud Logging required
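To see what this method surfaces, the following sketch pulls recent job history directly from INFORMATION_SCHEMA with the bq CLI; the project ID and `region-us` qualifier are placeholders for your own project and dataset location.

```shell
# Sketch: recent job history straight from INFORMATION_SCHEMA,
# no Cloud Logging required.
bq --project_id=your-project-id query --use_legacy_sql=false '
SELECT job_id, user_email, statement_type, total_bytes_processed, total_slot_ms
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 DAY)
ORDER BY creation_time DESC
LIMIT 20'
```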
Step 1: Create Service Account

Create a dedicated service account with minimal read permissions.

# Create a service account for DB Audit
gcloud iam service-accounts create dbaudit-reader \
  --display-name="DB Audit BigQuery Reader" \
  --project=your-project-id

# Grant BigQuery data viewer role
gcloud projects add-iam-policy-binding your-project-id \
  --member="serviceAccount:dbaudit-reader@your-project-id.iam.gserviceaccount.com" \
  --role="roles/bigquery.dataViewer"

# Grant BigQuery job user role (to run queries)
gcloud projects add-iam-policy-binding your-project-id \
  --member="serviceAccount:dbaudit-reader@your-project-id.iam.gserviceaccount.com" \
  --role="roles/bigquery.jobUser"

# Grant logging viewer role (for audit logs)
gcloud projects add-iam-policy-binding your-project-id \
  --member="serviceAccount:dbaudit-reader@your-project-id.iam.gserviceaccount.com" \
  --role="roles/logging.viewer"

# Create and download service account key
gcloud iam service-accounts keys create dbaudit-key.json \
  --iam-account=dbaudit-reader@your-project-id.iam.gserviceaccount.com
Step 2: Enable BigQuery Audit Logs

Enable data access audit logs for BigQuery in your GCP project.

# Enable BigQuery audit logging in GCP Console or via gcloud
# Data access logs must be explicitly enabled

# View current audit config
gcloud projects get-iam-policy your-project-id \
  --format=json | jq '.auditConfigs'

# Enable BigQuery data access logs by editing the project IAM policy.
# Fetch the current policy, merge the auditConfigs entry below into it,
# then re-apply it:
#   gcloud projects get-iam-policy your-project-id --format=json > policy.json
#   (add the auditConfigs entry to policy.json)
#   gcloud projects set-iam-policy your-project-id policy.json
cat > audit-config.json << 'EOF'
{
  "auditLogConfigs": [
    {
      "logType": "ADMIN_READ"
    },
    {
      "logType": "DATA_READ"
    },
    {
      "logType": "DATA_WRITE"
    }
  ],
  "service": "bigquery.googleapis.com"
}
EOF

# Or enable via GCP Console:
# IAM & Admin > Audit Logs > BigQuery API
# Enable: Admin Read, Data Read, Data Write

# Verify audit logs are flowing
gcloud logging read 'resource.type="bigquery_resource"' \
  --project=your-project-id \
  --limit=5
Step 3: Configure DB Audit Collector

Add your BigQuery project to the DB Audit configuration.

databases:
  - name: production-bigquery
    type: bigquery
    project_id: your-project-id
    location: US

    # Authentication (use service account)
    credentials_path: /etc/dbaudit/certs/dbaudit-key.json
    # Or inline:
    # credentials_json: ${GCP_CREDENTIALS_JSON}

    # Log collection settings
    log_collection:
      type: cloud_logging
      polling_interval: 60

    # Monitoring options
    options:
      track_queries: true
      track_data_access: true
      track_admin_events: true
      track_schema_changes: true
      history_days: 7
Step 4: Test Connection

Verify your configuration before deploying.

# Test BigQuery access with bq CLI
bq --project_id=your-project-id ls

# Test with service account
gcloud auth activate-service-account \
  --key-file=dbaudit-key.json
bq --project_id=your-project-id query \
  --use_legacy_sql=false \
  'SELECT COUNT(*) FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT'

# Test Cloud Logging access
gcloud logging read 'resource.type="bigquery_resource"' \
  --project=your-project-id \
  --limit=5

# Verify DB Audit collector can connect
dbaudit-collector test-connections --config /etc/dbaudit/config.yaml

Compliance Support

DB Audit's BigQuery connector helps you meet audit requirements for major compliance frameworks.

SOC 2: Access logging
HIPAA: PHI access audit
PCI DSS: Cardholder data
GDPR: Data access rights

Troubleshooting

Permission denied on dataset

Verify the service account has roles/bigquery.dataViewer role. Check IAM policy bindings at project level.
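One way to check the bindings is to flatten the project IAM policy and filter on the collector's service account. This sketch assumes the `dbaudit-reader` account from the setup steps; substitute your own account and project ID.

```shell
# Sketch: list the project-level roles actually held by the
# DB Audit service account (names are placeholders from the setup step).
gcloud projects get-iam-policy your-project-id \
  --flatten="bindings[].members" \
  --filter="bindings.members:serviceAccount:dbaudit-reader@your-project-id.iam.gserviceaccount.com" \
  --format="table(bindings.role)"
```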

No audit logs found

Ensure BigQuery data access logs are enabled in IAM > Audit Logs. Logs may take a few minutes to appear after queries.

Service account key invalid

Regenerate the service account key and update your configuration. Ensure the JSON file is valid and permissions are correct (600).
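A rotation might look like the sketch below, again assuming the `dbaudit-reader` account and key path from the setup steps; delete the old key only after the collector is confirmed working with the new one.

```shell
# Sketch: rotate the service account key. List existing keys, mint a new
# one, lock down file permissions, then remove the old key once the
# collector has been updated and restarted.
gcloud iam service-accounts keys list \
  --iam-account=dbaudit-reader@your-project-id.iam.gserviceaccount.com

gcloud iam service-accounts keys create dbaudit-key.json \
  --iam-account=dbaudit-reader@your-project-id.iam.gserviceaccount.com
chmod 600 dbaudit-key.json

# After copying the new key into place and restarting the collector:
gcloud iam service-accounts keys delete OLD_KEY_ID \
  --iam-account=dbaudit-reader@your-project-id.iam.gserviceaccount.com
```

Here `OLD_KEY_ID` stands in for the key ID shown by the `keys list` command.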

Ready to Audit Your BigQuery Datasets?

Start monitoring your BigQuery projects in minutes with native Cloud Audit Logs integration.