AI Engineer

Location: Calgary, AB (Hybrid/Remote)
Type: Full-time

Position Overview

MetaFactor is hiring a Calgary-based AI Engineer to design, build, and operationalize advanced AI systems. The role spans several categories of work:

  • Designing and architecting scalable AI solutions based on customer requirements, technology evolution, and best practices.
  • Taking models from experimentation to production-ready, scalable solutions, with a strong emphasis on LLMs, deep learning, and GPU-optimized workloads.
  • Developing custom software centered on AI implementations, including programming interfaces in Go, C#, and Rust, and integrations with machine learning libraries and frameworks in Python and SQL.

The ideal candidate holds a Bachelor's or Master's degree in Instrumentation Engineering, Electrical Engineering, Process Control Engineering, Computer Science, or a related field, and has at least 2 years of relevant experience in the oil and gas and/or power industries.

Key Responsibilities
  • Design, build, and maintain end-to-end AI data pipelines using tools such as Azure Data Factory, Databricks, Airflow, or similar orchestration frameworks.
  • Architect, design, and build distributed data architectures with cloud-based and lake-house systems.
  • Develop, test, and deploy dbt models for data transformation and lineage tracking.
  • Integrate and optimize data from streaming sources (e.g., Event Hubs, Kafka, IoT streams) and batch systems.
  • Build robust data models and performance-optimized datasets in Databricks, Snowflake, and Azure Synapse.
  • Work closely with analysts and data scientists to enable reliable, governed access to curated datasets.
  • Implement CI/CD and version control best practices for data code (Git, dev/test/prod environments).
  • Ensure data quality, observability, and compliance with organizational and regulatory standards.
  • Monitor and optimize compute and storage costs across platforms.
  • Assist customers with Microsoft Copilot implementations in accordance with best practices.

Preferred Skills & Experience
  • 3+ years of experience in AI design and development, data engineering, analytics engineering, or ETL development.
  • Strong SQL skills and experience with Python for data manipulation and automation.
  • Experience with C# and Go for custom data integration applications.
  • Azure data services (Data Factory, Synapse, Event Hubs, Data Lake).
  • Databricks (Spark, Delta Live Tables).
  • Snowflake.
  • Apache Airflow or similar orchestration tools.
  • Experience with infrastructure as code (Terraform, ARM, or Bicep) and containerized workflows (Docker, Kubernetes).
  • Familiarity with data governance, metadata management, or Purview.
  • Data observability and system monitoring tools such as Splunk.
  • dbt (Data Build Tool).
  • Experience working with data warehousing, data lakes, and ELT/ETL frameworks.
  • Understanding of streaming vs. batch architectures and when to use each.
  • Strong version control (Git) and CI/CD familiarity.
  • Excellent communication and documentation skills.

Due to the large volume of applications, we are asking for additional information to help with the decision-making process. Please provide your resume and a short video introducing yourself and explaining why you are the best fit for the role.

Please submit your resume and a link to your video presentation to the email address below.