- A Unified Data Stack on Snowflake
- Architecture
- Key Features
- Demo Vignettes
- Data Models
- Project Structure
- Setup
- General Usage
- Additional Resources
- Questions, Feedback, and Contribution
This project demonstrates a modern data stack applied to a real-life scenario: incident management, a process that typically involves multiple steps and close collaboration between representatives in tools like Slack. It uses a modern stack on Snowflake to bring AI-powered automation and real-time analytics to this well-known business process.
An end-to-end incident management platform* built on Snowflake that demonstrates modern data engineering practices using dbt, AI/ML capabilities, and real-time visualization through Streamlit. The system ingests incident data from Slack via Snowflake's OpenFlow connector and provides intelligent categorization, tracking, and reporting capabilities.
*A small cross-section of a much larger process, shown to illustrate the art of the possible.
```
┌─────────────────┐     ┌──────────────┐     ┌─────────────────┐
│   Slack App     │────▶│   OpenFlow   │────▶│   Snowflake     │
│                 │     │  Connector   │     │  Landing Zone   │
└─────────────────┘     └──────────────┘     └─────────────────┘
                                                       │
                                                       ▼
┌─────────────────┐     ┌─────────────────┐    ┌──────────────────┐
│   Streamlit     │◀────│   Snowflake     │◀───│       dbt        │
│   Dashboard     │     │   Curated Zone  │    │ Transformations  │
└─────────────────┘     └─────────────────┘    └──────────────────┘
```
- AI-Powered Classification: Automatically categorizes incidents using Snowflake's AI functions
- Image Processing: AI analysis of attached images for incident categorization
- Real-time Data Pipeline: Ingests Slack messages and converts them to structured incident records
- Interactive Dashboard: Streamlit-based dashboard for monitoring and managing incidents
- dbt Data Transformations: Data modeling with dbt, one of the most popular data build tools, running on Snowflake's managed runtime
This project demonstrates several key modern data platform capabilities through realistic incident management scenarios.
Scenario: Real-time incident reporting through Slack
- Users report incidents in dedicated Slack channels
- OpenFlow connector automatically ingests messages to Snowflake
- System processes both text and image attachments
- Demonstrates: Real-time data ingestion, unstructured data handling
Demo Flow:
- Post incident message in Slack: "Payment gateway is returning 500 errors"
- Attach screenshot of error page (use the sample image JPEG provided)
- If Tasks are not set up, run the dbt Project end-to-end (see the sketch after this list)
- Watch as system creates structured incident record with AI classification
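When Tasks are not in place, the deployed dbt project can be run end-to-end from a worksheet with Snowflake's EXECUTE DBT PROJECT command. A minimal sketch, assuming a project object named after the setup variables (adjust the fully qualified name to your deployment):

```sql
-- Sketch: run all models of the deployed dbt project end-to-end.
-- The project object name below is an assumption; replace it with your deployed project.
EXECUTE DBT PROJECT incident_management.dbt_project_deployments.incident_management
  args = 'build';
```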
Scenario: Working with dbt Projects in Snowflake
- Remote deployment and execution of dbt Projects using Snowflake CLI
- Using variables in `dbt_project.yml`
- Support for dbt command options such as "select"
- Passing arguments via Task config to dbt commands executed using `EXECUTE DBT PROJECT` (see the sketch below)
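As a rough illustration of the last two points, a Snowflake Task could wrap EXECUTE DBT PROJECT and pass dbt command options through its definition. This is a hedged sketch: the task name, schedule, and model selector are assumptions, not part of this project.

```sql
-- Hypothetical Task that refreshes the incident models on a schedule.
-- All names, the schedule, and the --select argument are illustrative only.
CREATE OR REPLACE TASK incident_management.dbt_project_deployments.refresh_incident_models
  WAREHOUSE = incident_management_dbt_wh
  SCHEDULE  = '60 MINUTE'
AS
  EXECUTE DBT PROJECT incident_management.dbt_project_deployments.incident_management
    args = 'run --select bronze_zone.incidents+';

-- Resume the task so the schedule takes effect
ALTER TASK incident_management.dbt_project_deployments.refresh_incident_models RESUME;
```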
Scenario: Scalable data transformation pipeline
- Raw Slack data transformed into analytics-ready models
- Incremental processing for performance (see the sketch after the model list below)
- Data quality testing and documentation
- Demonstrates: Modern data engineering practices, data lineage
Key Models:
- `bronze_zone.incidents`: AI-enhanced incident processing with image analysis
- `gold_zone.active_incidents`: Business-ready incident tracking
- `gold_zone.monthly_incident_trends`: Aggregated analytics
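The incremental processing mentioned above might be expressed, for a model like `bronze_zone.incidents`, roughly as follows. This is a minimal sketch; the source reference and column names are assumptions, not the project's actual code (see incidents.sql for the real implementation):

```sql
-- Minimal dbt incremental-model sketch; source and column names are illustrative only.
{{ config(materialized='incremental', unique_key='incident_id') }}

select
    msg.id         as incident_id,
    msg.text       as description,
    msg.created_at as reported_at
from {{ source('slack', 'slack_messages') }} as msg
{% if is_incremental() %}
  -- On incremental runs, only process messages that arrived since the last run
  where msg.created_at > (select max(reported_at) from {{ this }})
{% endif %}
```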
Scenario: Intelligent incident classification and processing
```sql
-- Automatic incident categorization from Slack messages
ai_classify(sri.text, ['payment gateway error', 'login error', 'other']):labels[0] as category

-- Smart incident matching using natural language
ai_filter(
  prompt('The text category {0} is logically relatable to this record\'s category {1}',
         text_category, roi.category)
)

-- AI analysis of attached screenshots
case
  when sri.attachment_file is not null then
    ai_classify(sri.attachment_file, ['payment gateway error', 'login error', 'other']):labels[0]
  else ai_classify(sri.text, ['payment gateway error', 'login error', 'other']):labels[0]
end as category
```

Demonstrates: AI-native data processing, multimodal analysis, intelligent data enrichment
See the complete implementation in `incidents.sql`.
Scenario: Real-time operational dashboard
- Live incident monitoring and management
- Interactive attachment viewing
- Trend analysis and reporting
- Demonstrates: Data visualization, real-time analytics, user experience
Dashboard Features:
- Real-time incident metrics with priority-based color coding
- Sample trend charts powered by Snowflake analytics (see the example query after this list)
- In-app image viewing for incident attachments
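Under the hood, the trend charts are driven by queries against the gold-zone models. An illustrative example, where the column names are assumptions:

```sql
-- Illustrative query behind a monthly trend chart; column names are assumptions.
select incident_month, category, incident_count
from incident_management.gold_zone.monthly_incident_trends
order by incident_month, category;
```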
Complete Incident Lifecycle:
- Report: "Our login page is showing 'Invalid credentials' for valid users" + screenshot
- Ingest: OpenFlow captures message and image from Slack
- Process: dbt transforms raw data:
- AI classifies as "login error" (critical priority)
- Image analysis confirms authentication issue
- Creates incident record INC-2025-XXX
- Monitor: Streamlit dashboard shows new critical incident
- Resolve: Team addresses issue, updates status
- Analyze: System tracks resolution time, updates trends
This demonstrates a complete modern data stack handling real-world operational scenarios with AI enhancement and real-time visibility.
- `incidents`: Core incident records with AI-powered classification
- `incident_comment_history`: Chronological updates and comments
- `incident_attachments`: File attachments with metadata (as ingested from Slack)
- `users`: User directory for assignees and reporters
- `active_incidents`: Currently open incidents with enriched data
- `closed_incidents`: Resolved incidents with resolution metrics (sketched below)
- `monthly_incident_trends`: Aggregated monthly statistics
- `weekly_incident_trends`: Weekly trend analysis
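The resolution metrics behind `closed_incidents` could, for example, be derived along these lines. This is a sketch only; the actual logic and column names live in closed_incidents.sql:

```sql
-- Sketch of resolution-time metrics; column names are illustrative, not the real schema.
select
    incident_id,
    category,
    datediff('hour', reported_at, closed_at) as hours_to_resolve
from incident_management.bronze_zone.incidents
where closed_at is not null;
```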
```
incident-management/
├── data/                              # Sample data and test files
│   ├── csv/                           # CSV seed data
│   │   ├── incidents.csv              # Sample incident records
│   │   ├── incident_comment_history.csv
│   │   └── users.csv                  # User directory
│   └── images/                        # Sample incident attachments
│       ├── invalid_credentials.jpeg
│       └── payment_gateway_outage.jpeg
├── src/
│   ├── incident_management/           # dbt project
│   │   ├── models/
│   │   │   ├── bronze_zone/           # Raw data models
│   │   │   │   ├── incidents.sql      # Core incident processing
│   │   │   │   ├── incident_attachments.sql
│   │   │   │   ├── users.sql
│   │   │   │   └── v_*.sql            # Slack message views
│   │   │   ├── gold_zone/             # Analytics-ready models
│   │   │   │   ├── active_incidents.sql
│   │   │   │   ├── closed_incidents.sql
│   │   │   │   ├── monthly_incident_trends.sql
│   │   │   │   └── weekly_incident_trends.sql
│   │   │   └── schema.yml             # Model documentation
│   │   ├── macros/                    # Custom dbt macros
│   │   ├── dbt_project.yml            # dbt configuration
│   │   └── profiles.yml               # Database connections
│   ├── scripts/                       # Deployment and setup scripts
│   │   ├── create_snowflake_yaml.sh   # Generate snowflake.yml from template
│   │   ├── dbtdeploy.sh               # dbt deployment automation
│   │   ├── dbtexec.sh                 # dbt execution wrapper
│   │   └── snowflake.yml.template
│   ├── sql/                           # Raw SQL scripts
│   │   ├── 01_dbt_projects_stack.sql  # dbt Projects infrastructure setup
│   │   ├── 02_slack_connector.sql     # Slack connector setup
│   │   └── snowflake.yml
│   └── streamlit/                     # Dashboard application
│       ├── main.py                    # Main dashboard
│       ├── app_utils.py               # Utility functions
│       └── snowflake.png              # Assets
├── requirements.txt                   # Python dependencies
└── README.md                          # This file
```
- Snowflake account with ACCOUNTADMIN privileges to create users and roles
- Snowflake CLI (latest version)
- OpenFlow SPCS deployment and a Small runtime
- Git
- uv for Python virtual environment and dependency management
This project demonstrates dbt Projects deployment and execution from a local dev machine. Once deployed, it is easy to switch to Snowflake Workspaces to manage further redeployments and test executions.
Before starting the installation, ensure you have:
- Snowflake CLI installed (installation guide)
- Snowflake connection configured in `~/.snowflake/config.toml`
- ACCOUNTADMIN privileges in your Snowflake account
- OpenFlow SPCS: Use these installation instructions to set up an OpenFlow SPCS deployment and a runtime within your Snowflake account
- Clone this repo

```bash
git clone --branch v1.0.0 https://github.com/Snowflake-Labs/unified-data-stack-for-incident-management.git
cd unified-data-stack-for-incident-management
```

- Configure Environment Variables
Copy the template and configure your environment:
```bash
cp env.template .env
```

Edit the `.env` file and configure all variables except these two (generated later):

- `DBT_SNOWFLAKE_PASSWORD`
- `DBT_SNOWFLAKE_PRIVATE_KEY_PATH`

Note: If you change any of the below parameters, be sure to change `profiles.yml` as well:

```
DBT_PROJECT_DATABASE=incident_management
DBT_PROJECT_SCHEMA=dbt_project_deployments
MODEL_SCHEMA=bronze_zone
DBT_SNOWFLAKE_WAREHOUSE=incident_management_dbt_wh
DBT_PROJECT_ADMIN_ROLE=dbt_projects_engineer
```
- Setup Snowflake Infrastructure (Automated)
Use the Makefile to automate the entire Snowflake setup:
Option A: Complete Setup (Recommended)
```bash
# Run complete installation
make install ENV_FILE=.env CONN=<your-connection-name>
```
Option B: Step-by-Step Setup
```bash
# Step 2.1: Generate snowflake.yml configuration
make generate-yaml ENV_FILE=.env

# Step 2.2: Setup dbt Projects infrastructure
make setup-dbt-stack CONN=<your-connection-name>

# Step 2.3: Setup Slack connector infrastructure
make setup-slack-connector CONN=<your-connection-name>
```
Note: Replace `<your-connection-name>` with the connection name defined in your `~/.snowflake/config.toml`.

Note: Role setup commands require ACCOUNTADMIN privileges.
- Setup User Authentication (Optional)
Execute this step if you plan to run the Streamlit app remotely and/or deploy dbt Projects remotely:
a. Generate PAT Token: From Snowsight UI generate a PAT for the dbt Projects service user tied to the role created in previous step. Note and copy the token for use during authentication.
b. Generate Key-Pair Authentication: Follow the steps here:

```bash
# Generate private and public key
openssl genrsa -out rsa_private_key.pem 2048
openssl rsa -in rsa_private_key.pem -pubout -out rsa_public_key.pem
```

Then update the user in Snowflake (replace placeholder):

```sql
ALTER USER <user name> SET RSA_PUBLIC_KEY='
-----BEGIN PUBLIC KEY-----
MIIBIjANBgkqhkiG9w0BAQEFAA...
-----END PUBLIC KEY-----';
```
c. Update Environment Variables: Update `.env` with:

```
DBT_SNOWFLAKE_PASSWORD=<your-PAT-token>
DBT_SNOWFLAKE_PRIVATE_KEY_PATH=<path-to-private-key-file>
```
- Configure Slack Connector (Post-Installation)
After running the Makefile setup, configure the Slack connector in your OpenFlow SPCS runtime:
a. Review Prerequisites: Check the pre-requisites
b. Create Slack App: Create a Slack app in your workspace using the given manifest
c. Update External Access: Update the External Access Integration object to add the `slack.com` domain for egress from SPCS containers

d. Configure Connector: Use these instructions to configure the connector. Ensure you use the database and role parameters created by the Makefile.

e. Start and Test: Start the connector and add the Slack app to a channel. Verify these tables are created: `SLACK_MEMBERS`, `SLACK_MEMBERS_STAGING`, `DOC_METADATA`, `FILE_HASHES`, `SLACK_MESSAGES`
- Test the Setup

Drop a test message in your Slack channel and verify that a record appears in the `SLACK_MESSAGES` table (it might take a few seconds); see the example query below.
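A quick way to check is a count query from a Snowflake worksheet, run in the database and schema where the connector writes its tables:

```sql
-- Confirm the connector has started writing Slack messages
SELECT COUNT(*) AS message_count FROM SLACK_MESSAGES;
```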
For detailed information about available Makefile targets:
```bash
make help
```

Common Commands:
```bash
# Check prerequisites
make check-prereqs

# Complete automated setup
make install ENV_FILE=.env CONN=myconnection

# Individual steps
make generate-yaml ENV_FILE=.env
make setup-dbt-stack CONN=myconnection
make setup-slack-connector CONN=myconnection
```

Troubleshooting:
- Ensure your Snowflake connection is properly configured in `~/.snowflake/config.toml`
- Verify you have ACCOUNTADMIN privileges
- Check that all environment variables are set in your `.env` file
- Run `make check-prereqs` to verify prerequisites
- Start the Streamlit Application

1.1 Install Python Dependencies

```bash
uv venv
source .venv/bin/activate
uv pip install -r requirements.txt
```

See `requirements.txt` for the complete list of dependencies.

1.2 Now run the Streamlit app locally

```bash
cd src/streamlit
streamlit run main.py
```

See `main.py` for the complete dashboard implementation.

- Access the Dashboard

  - Open your browser to `http://localhost:8501`
  - The dashboard will display real-time incident metrics, trends, and detailed tables
- Metrics Overview: Critical, high-priority, active, and recently closed incident counts
- Active Incidents Table: Real-time view of open incidents with priority indicators
- Recently Closed Incidents: Track resolution times and outcomes
- Attachment Viewer: Click on incidents with attachments to view images
- Trend Analysis: Monthly and weekly incident patterns (when data is available)
- Report New Incidents: Post messages in configured Slack channels
- Automatic Processing: The system will:
- Extract incident details from messages
- Classify incident type and priority using AI
- Process attached images for additional context
- Create structured incident records
- Track Progress: Monitor incidents through the Streamlit dashboard
- Snowflake AI Functions Documentation
- dbt Projects on Snowflake
- Streamlit Documentation
- OpenFlow Connectors
Please feel free to reach out with any questions to [email protected]
Feedback and contributions are very welcome:
- GitHub Issues for bug reports and feature requests
- Pull Requests for direct contributions

