Senior Data Engineer

by Orca Intelligence Inc.

Location: London, ON
Date Posted: Dec 18, 2025

Job Description

Senior Data Engineer

 

Orca has been providing freight audit and analytics services to its growing client base since 2016. We have experienced tremendous growth since inception and are proud to offer our solution to some of North America’s largest organizations. Orca processes billions of dollars in freight invoices each year and provides invaluable reporting and analysis to help our clients operate more efficiently. Orca was awarded London’s prestigious ‘Medium Business of the Year’ award in 2024 by the London Chamber of Commerce for tremendous growth and community involvement.

 

Job Summary

The successful candidate will be responsible for leading the rebuild of Orca’s data platform. Working with our existing Azure SQL, SQL Server on Azure VMs, and Azure Fabric environment, this role will assess what is working, identify gaps and risks, and design and implement improved data architecture and pipelines. The candidate will work closely with external data architecture partners, our internal analytics team, and key stakeholders to deliver a modern, scalable, and reliable data platform that supports both internal operations and client-facing analytics.

 

The candidate will act as the technical owner of Orca’s data platform, ensuring that data is accurate, timely, and accessible, while reducing manual processes through automation. The role may also take on broader responsibilities, including performance tuning, reliability, and general data operations.

 

Responsibilities

The successful candidate will:

  • Lead the assessment of Orca’s current data landscape (Azure SQL, SQL Server on Azure VMs, Azure Fabric, and related integrations) and document strengths, weaknesses, and risks
  • Design and implement a modernized data platform architecture in partnership with external advisors and internal stakeholders
  • Build, maintain, and optimize ETL/ELT pipelines to ingest, transform, and deliver data for analytics and operational needs
  • Standardize and automate manual data processes, reports, and extracts wherever possible
  • Own database performance and reliability, including indexing, query optimization, capacity planning, and environment management
  • Implement and manage backup, restore, and disaster recovery processes for critical data systems
  • Collaborate closely with our team of data and business analysts to ensure they have clean, well-modeled, and well-documented data for Power BI and other reporting tools
  • Establish and maintain data quality checks, monitoring, and alerting for key data flows
  • Contribute to data governance practices, including naming conventions, documentation, and data access controls
  • Provide technical guidance and best practices to internal teams on data-related topics and support broader data initiatives as required

 

Qualifications

The ideal candidate will possess the following qualifications:

  • 6+ years in data engineering / data platform roles, ideally in a SaaS or analytics-heavy environment.
  • Strong expertise with Azure SQL and SQL Server (performance tuning, indexing, query optimization).
  • Hands-on experience with Azure data tools such as:
      o Azure Data Factory, Synapse, or Fabric/Dataflows
      o Azure Storage / Data Lake
  • Excellent SQL skills and strong comfort with at least one scripting language (Python preferred; PowerShell or .NET a plus).
  • Proven track record of:
      o Designing and implementing data pipelines end-to-end.
      o Migrating from legacy data environments to more modern, centralized platforms.
      o Automating manual data workflows.

 

The following qualifications are not required but would be considered a strong asset:

  • Experience with Azure Fabric specifically (lakehouse, medallion architecture, etc.).
  • Exposure to Power BI models and how engineers can best support BI teams.
  • Experience introducing CI/CD for data (e.g., Git-based deployments of pipelines and database changes).
  • Background covering DBA responsibilities or collaborating closely with DBAs (or a prior role as a DBA who shifted into data engineering).
  • Experience in a regulated or audited environment (SOC, ISO, etc.) and handling PII or financial data.

 

Valued Traits

The ideal candidate will possess the following traits:

  • Systems thinking – sees how source systems, pipelines, databases, and reports fit together and designs solutions that work end-to-end.
  • Pragmatism – balances “ideal” architecture with practical constraints, making incremental improvements that deliver value quickly.
  • Ownership mindset – treats the data platform as their product, taking responsibility for reliability, performance, and long-term maintainability.
  • Accountability & execution – reliably follows through on commitments, drives initiatives to completion, and is comfortable being the person ultimately responsible for making the data platform work.
  • Strong communication skills – can explain technical concepts clearly to non-technical stakeholders and collaborate effectively with analysts, engineers, and leadership.
  • Bias toward automation – naturally looks for ways to reduce manual work, standardize processes, and build reusable components.
  • Attention to detail – cares about data quality, edge cases, and the impact of changes on downstream consumers and clients.
  • Curiosity and learning orientation – stays current with Azure data services and best practices, and continuously looks for better ways to solve problems.
  • Collaboration and humility – works well with external consultants and internal teams, and is open to feedback and different perspectives.
  • Customer focus – understands that better data directly impacts client experience and business outcomes, and prioritizes work accordingly.
  • Integrity and reliability – can be trusted with sensitive data and production systems, and follows through on commitments.

 

Compensation

Orca provides a competitive compensation package:

  • Competitive base salary
  • RRSP contribution matching
  • Fantastic benefits package
  • Healthcare Spending Account
  • Lifestyle Spending Account
  • Complimentary cell phone plan
  • Profit Based Bonus Plan

Requirements added by the job poster

  • Commute to this job’s location
  • Accept a background check
  • 2+ years of work experience with Microsoft Azure

Apply via LinkedIn