Overview
The OneSignal + Databricks integration supports two flows:
- Export: Send OneSignal message events to Databricks for analytics and reporting.
- Import: Send custom events from Databricks to OneSignal to trigger Journeys and personalize campaigns.
Export and import are configured separately. You can set up one without the other.
Export OneSignal message events to Databricks
Sync all your message events from OneSignal into your Databricks lakehouse for near real-time analytics and visibility.
Requirements
- Professional Plan or higher
- Custom Events enabled (for event imports)
- Databricks Platform: AWS, Azure, or GCP
- Databricks Plan: Premium or higher
- Databricks Unity Catalog (recommended for governance)
- Databricks SQL Warehouse for querying
- Delta Lake event tables (for custom event import)
1. Collect SQL warehouse details
2. Choose authentication method
Databricks supports two authentication methods. Choose the method that best fits your deployment and security requirements.
- OAuth M2M
- Personal Access Token
When to use:
- Direct connections (not using PrivateLink)
- Organizations with OAuth infrastructure
Create a service principal
Go to Workspace Settings > Identity and Access > Service Principals.
Generate a secret
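If you want to verify the new credentials before connecting OneSignal, here is a minimal sketch that exchanges the service principal's client ID and secret for a workspace access token via Databricks' OAuth token endpoint. All values below are placeholders.

```python
# Minimal sketch: exchange service principal credentials for an OAuth M2M
# access token. Hostname, client ID, and secret below are placeholders.
import requests

WORKSPACE_HOST = "dbc-a1b2c3d4-e5f6.cloud.databricks.com"  # your workspace hostname
CLIENT_ID = "<service-principal-application-id>"
CLIENT_SECRET = "<service-principal-secret>"

resp = requests.post(
    f"https://{WORKSPACE_HOST}/oidc/v1/token",
    auth=(CLIENT_ID, CLIENT_SECRET),
    data={"grant_type": "client_credentials", "scope": "all-apis"},
)
resp.raise_for_status()
print("Token acquired, expires in", resp.json().get("expires_in"), "seconds")
```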
Recommendation: If you’re using PrivateLink or want the simplest setup, choose Personal Access Token. For direct connections with existing OAuth infrastructure, OAuth M2M works well.
3. Assign permissions
4. Connect OneSignal
Configure the integration
- Sync Frequency: as often as every 15 minutes
- Dataset/Table Names: pre-set as `onesignal_events_<app-id>` and `message_events` (editable)
- Event Types: choose which to sync; select all or just what you need
Initial data can take 15–30 minutes to appear in Databricks. While you wait, send messages via push, email, in-app, or SMS to trigger the events you selected.
5. View data in Databricks
- Open your Catalog in Databricks.
- Once syncing completes, your configured schema will appear.
- Access and query the `message_events` table (see the example query below).
- Click into tables for a sample data preview.
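For example, once the schema appears you can run a quick sanity check from a Databricks notebook. The schema name below is a placeholder; use the one configured during setup.

```python
# Sanity check from a Databricks notebook, where `spark` is predefined.
# Replace the schema name with your configured onesignal_events_<app-id>.
recent = spark.sql("""
    SELECT event_kind, event_timestamp, onesignal_id, external_id
    FROM onesignal_events_YOUR_APP_ID.message_events
    ORDER BY event_timestamp DESC
    LIMIT 20
""")
recent.show(truncate=False)
```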

If you run into issues like missing schemas, permission errors, or malformed events, contact [email protected].
Message events and properties
Message event kinds
Property: `event_kind`
Type: String
The kind of message and event (e.g., `message.push.received`, `message.push.sent`).
| Message Event (OneSignal) | event_kind | Description |
|---|---|---|
| Push Sent | message.push.sent | Push notification successfully sent. |
| Push Received | message.push.received | Delivered push (see Confirmed Delivery). |
| Push Clicked | message.push.clicked | User clicked the push. |
| Push Failed | message.push.failed | Delivery failure. See message reports. |
| Push Unsubscribed | message.push.unsubscribed | User unsubscribed from push. |
| In-App Impression | message.iam.impression | In-App message shown. |
| In-App Clicked | message.iam.clicked | In-App message clicked. |
| In-App Page Viewed | message.iam.page_displayed | In-App page shown. |
| Email Sent | message.email.sent | Email successfully sent. |
| Email Received | message.email.received | Email accepted by recipient’s mail server. |
| Email Opened | message.email.opened | Email opened. See Email Reports. |
| Email Link Clicked | message.email.clicked | Link in email clicked. |
| Email Unsubscribed | message.email.unsubscribed | Recipient unsubscribed. |
| Email Reported As Spam | message.email.reported_as_spam | Marked as spam. See Email Deliverability. |
| Email Bounced | message.email.bounced | Bounce due to permanent delivery failure. |
| Email Failed | message.email.failed | Delivery failed. |
| Email Suppressed | message.email.suppressed | Suppressed due to suppression list. |
| SMS Sent | message.sms.sent | SMS sent. |
| SMS Delivered | message.sms.delivered | SMS successfully delivered. |
| SMS Failed | message.sms.failed | SMS failed to deliver. |
| SMS Undelivered | message.sms.undelivered | SMS rejected or unreachable. |
Event data schema
For each message event generated by a user, the following metadata will be attached to the record.
| Column Name | Type | Description |
|---|---|---|
| event_id | UUID | Unique identifier for the event |
| event_timestamp | Timestamp | Time of event occurrence |
| event_kind | String | The event kind (see table above) |
| subscription_device_type | String | Device type (e.g., iOS, Android, Web, Email, SMS) |
| language | String | Subscription language code |
| version | String | Integration version |
| device_os | String | Device operating system version |
| device_type | Number | Numeric device type |
| token | String | Push token, phone number, or email |
| subscription_id | UUID | Subscription ID |
| subscribed | Boolean | Subscription status |
| onesignal_id | UUID | OneSignal user ID |
| last_active | String | Last active timestamp |
| sdk | String | OneSignal SDK version |
| external_id | String | External user ID that should match the integration user ID |
| app_id | UUID | App ID from OneSignal |
| template_id | UUID | Template ID (if applicable) |
| message_id | UUID | Message batch/request ID |
| message_name | String | Name of the message |
| message_title | String | Message title (English only) |
| message_contents | String | Truncated message body (English only) |
| failure_reason | String | Reason for failure (for push failed and email failed events) |
| _created, _id, _index, _fivetran_synced | Internal use | Fivetran sync metadata |
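As an example of working with these columns, the sketch below (schema name is a placeholder) produces a daily event count per event kind:

```python
# Daily rollup by event_kind; `spark` is predefined in Databricks notebooks.
daily = spark.sql("""
    SELECT DATE(event_timestamp) AS event_date,
           event_kind,
           COUNT(*) AS events
    FROM onesignal_events_YOUR_APP_ID.message_events
    GROUP BY DATE(event_timestamp), event_kind
    ORDER BY event_date DESC, events DESC
""")
daily.show()
```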
Notes
- Syncs after saving/activating may take an additional 15–30 minutes to complete.
- Deactivating the integration may still result in one final sync after deactivation.
- To ensure efficient data synchronization, our system automatically creates and manages staging datasets. These datasets, named with a pattern like `fivetran_{two random words}_staging`, temporarily store data during processing before it’s integrated into your main schema. They are essential for maintaining a streamlined workflow and should not be deleted, as they will be automatically recreated.
Import events from Databricks
Send behavioral event data from Databricks to OneSignal to:
- Trigger Journeys based on user activity
- Personalize messaging based on behavioral data
Requirements
- Access to Event Streams for outbound message events (Plan limitations and overages apply)
- Access to Custom Events for inbound event syncing (Plan limitations and overages apply)
- Updated Account Plan (not available on free apps)
- Databricks workspace with SQL Warehouse or compute cluster
- Personal Access Token with appropriate permissions
- Event data tables containing behavioral data in Delta Lake format
- Unity Catalog (recommended for data governance)
Create Databricks Personal Access Token
Generate a Personal Access Token for OneSignal to access your Databricks workspace:
- Navigate to User Settings in your Databricks workspace
- Click Developer tab and then Access tokens
- Click Generate new token
- Enter a comment like “OneSignal Integration” and set an expiration (90 days recommended)
- Save the generated token (you’ll need this for OneSignal)
Configure SQL Warehouse access
Ensure OneSignal can query your event data via SQL Warehouse:
- Navigate to SQL Warehouses in your Databricks workspace
- Select or create a SQL Warehouse for OneSignal access
- Note the Server Hostname and HTTP Path from the connection details
- Ensure the warehouse has access to your event data tables
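To confirm the warehouse is reachable with the values you plan to give OneSignal, here is a minimal sketch using the open-source databricks-sql-connector package. All connection values are placeholders.

```python
# pip install databricks-sql-connector
# Connectivity check with the same hostname, HTTP path, and token you will
# enter in OneSignal. All values below are placeholders.
from databricks import sql

with sql.connect(
    server_hostname="dbc-a1b2c3d4-e5f6.cloud.databricks.com",
    http_path="/sql/1.0/warehouses/abc123def456",
    access_token="dapiXXXXXXXXXXXXXXXX",
) as conn:
    with conn.cursor() as cur:
        cur.execute("SELECT current_catalog(), current_schema()")
        print(cur.fetchone())
```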
Add integration in OneSignal
In OneSignal, go to Data > Integrations and click Add Integration. Select Databricks and provide:
- Server Hostname: Your Databricks SQL Warehouse hostname
- HTTP Path: SQL Warehouse HTTP path
- Personal Access Token: Token created in step 1
- Catalog (optional): Unity Catalog name if using Unity Catalog
Configure event data source
Specify the Databricks table containing your event data:
- Database/Schema: Database or schema name containing event tables
- Table: Table name with event records (e.g., `user_events`)
- Event Query: Optional SQL query to filter or transform event data

Whether you point at a table or a query, the event data should expose:
- Event name/type (String)
- User identifier (String)
- Event timestamp (Timestamp)
- Additional event properties
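For illustration, here is a hypothetical Event Query that returns these fields. The table and property columns (analytics.user_events, plan, cart_value) are assumptions; adapt them to your schema. Validating the query with spark.sql first is an easy way to catch errors before pasting it into OneSignal.

```python
# Hypothetical Event Query. The inner SQL is what you would paste into the
# Event Query field; running it via spark.sql in a notebook validates it.
event_query = """
    SELECT event_name,                                      -- event name/type
           user_id,                                         -- user identifier
           event_timestamp,                                 -- event timestamp
           to_json(struct(plan, cart_value)) AS event_data  -- extra properties
    FROM analytics.user_events
    WHERE event_timestamp > current_timestamp() - INTERVAL 1 DAY
"""
spark.sql(event_query).show(5, truncate=False)
```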
Event data mapping
Map your Databricks columns to OneSignal’s custom events format:
| OneSignal Field | Databricks Column (example) | Description | Required |
|---|---|---|---|
| name | event_name | Event identifier | Yes |
| external_id | user_id | User identifier | Yes |
| timestamp | event_timestamp | When event occurred | No |
| properties | event_data | Additional event properties | No |
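Purely as an illustration of the mapping above (not an API payload specification), a single Databricks row translates to a custom event roughly like this; all values are made up:

```python
# Illustrative only: how one source row maps to OneSignal's custom event fields.
row = {
    "event_name": "cart_abandoned",
    "user_id": "user-123",
    "event_timestamp": "2024-01-15T10:30:00Z",
    "event_data": {"cart_value": 59.99, "items": 3},
}

custom_event = {
    "name": row["event_name"],            # required: event identifier
    "external_id": row["user_id"],        # required: user identifier
    "timestamp": row["event_timestamp"],  # optional: when the event occurred
    "properties": row["event_data"],      # optional: additional properties
}
print(custom_event)
```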
Advanced configuration
Unity Catalog Integration
Leverage Unity Catalog for governed data access:
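For example, here is a sketch of the Unity Catalog grants the connecting principal needs. Catalog, schema, table, and principal names are placeholders; run as a user with sufficient privileges.

```python
# Grant the OneSignal service principal read access to the event data.
# All object and principal names below are placeholders; `spark` is the
# notebook's predefined SparkSession.
for stmt in (
    "GRANT USE CATALOG ON CATALOG analytics TO `onesignal-integration`",
    "GRANT USE SCHEMA ON SCHEMA analytics.events TO `onesignal-integration`",
    "GRANT SELECT ON TABLE analytics.events.user_events TO `onesignal-integration`",
):
    spark.sql(stmt)
```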
Delta Lake Optimization
Optimize event tables for better query performance:
- Partitioning: Partition by date (`event_date`) for faster time-based queries
- Z-Ordering: Z-order by `user_id` and `event_name` for better filtering
- Delta Lake Features: Use liquid clustering for automatic optimization
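A sketch of these optimizations on a hypothetical events table (names are placeholders; note that Z-ordering and liquid clustering are alternatives, not used together on one table):

```python
# Z-order an existing table for faster filtering on these columns.
spark.sql("""
    OPTIMIZE analytics.events.user_events
    ZORDER BY (user_id, event_name)
""")

# Alternatively, enable liquid clustering and let Databricks manage layout.
spark.sql("""
    ALTER TABLE analytics.events.user_events
    CLUSTER BY (event_date, user_id)
""")
```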
Streaming Event Processing
For real-time event processing, consider:
- Structured Streaming: Process events as they arrive
- Delta Live Tables: Build robust event processing pipelines
- Auto Loader: Continuously ingest new event files
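As a sketch of the Auto Loader option (paths and table name are placeholders), a stream that continuously ingests new JSON event files into a Delta table:

```python
# Continuously ingest new event files with Auto Loader; `spark` is the
# notebook's SparkSession. Paths and table name below are placeholders.
(spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", "/Volumes/analytics/events/_schema")
    .load("/Volumes/analytics/events/raw")
    .writeStream
    .option("checkpointLocation", "/Volumes/analytics/events/_checkpoint")
    .toTable("analytics.events.user_events"))
```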