Mission Underwriting Managers
Full-time
Remote
United States

About the Role

The Data Engineer will support the development and optimization of Mission’s data infrastructure as part of our modern data platform initiative. This hands-on role will focus on implementing scalable data pipelines, enabling a centralized enterprise data warehouse, and supporting business reporting needs. The ideal candidate will collaborate across technology, operations, product, and analytics teams to deliver high-quality, governed, and reusable data assets in alignment with Mission’s long-term architecture.


What You'll Do

  • Contribute to the implementation of scalable data pipelines to ingest, transform, and store data from third-party vendors and internal systems using APIs, files, and databases.
  • Support the development of a cloud-based data warehouse solution in partnership with architects, ensuring clean, normalized, and performant data models.
  • Establish and monitor reliable data ingestion processes across systems with varying grain and cadence, ensuring data quality and completeness.
  • Collaborate with API and integration teams to develop secure, robust data exchange processes with external vendors and internal services.
  • Set up and manage data connections from the warehouse to BI tools (e.g., Power BI, Looker, Tableau) to enable self-service analytics and dashboarding.
  • Document data flows, schemas, and definitions, and help drive data governance and standardization efforts across the organization.
  • Implement data validation, cleansing, and enrichment processes to ensure high-quality data for financial reporting and analytics.
  • Ensure compliance with data standards, regulatory requirements (e.g., NAIC, SOX), and data security best practices.
  • Collaborate with senior team members to apply data engineering best practices, including version control, CI/CD for data pipelines, testing, and observability.
  • Assist with building dashboards in BI tools.

Required Qualifications

  • 2-5 years of experience in data engineering, data warehousing, or a related field.
  • Strong SQL and performance tuning skills, ideally in Microsoft SQL Server environments.
  • Experience working with data warehouses or data marts using dimensional modeling.
  • Familiarity with data ingestion via RESTful APIs, JSON/XML, flat files, message queues, or change data capture (CDC).
  • Understanding of data governance, metadata management, and access control.
  • Experience with version control (e.g., Git) and basic CI/CD workflows.
  • Experience with BI tools like Power BI, Looker, or Tableau.
  • Bachelor’s degree in Computer Science, Engineering, Information Systems, or equivalent experience.


Preferred Qualifications

  • Experience in the commercial insurance industry or with regulated data environments.
  • Hands-on experience with Azure data services.
  • Familiarity with data cleanup, fuzzy matching, or enrichment workflows.
  • Experience with Python, dbt, or Azure Functions.
  • Strong communication skills and ability to work cross-functionally.
  • Ability to travel up to 10% of the time.


Knowledge, Skills and Abilities

  • Understanding of data warehousing principles, dimensional modeling, and ETL/ELT architecture, with hands-on experience building analytics-ready structures, preferably in Azure SQL using SQL Server Management Studio (SSMS).
  • Ability to interpret business concepts in the P&C insurance industry to inform data modeling.
  • Proficient in building scalable data pipelines using Azure Data Factory and SQL.
  • Skilled in integrating with diverse data sources and managing ingestion at different grains.
  • Familiar with core Azure services supporting cloud-native data engineering workflows.
  • Able to model and prepare data for downstream BI tools and support dashboard creation.

Additional Information

This is a remote position. Planned in-office activities may be required on occasion (typically 2-4 times per year).

You must live in the United States and be authorized to work in the United States without requirement of employment sponsorship/visa.