
DevOps/Data Engineer

Mission.dev · 8 days ago
Remote
Mid Level
Contract

About the role

About Mission

Mission is a platform for hiring, vetting, and managing software development talent. It enables our clients to connect with the world’s best talent to build mission-critical software products.

About the client

As a premier Google Cloud partner in data and analytics, our client delivers cutting-edge cloud data solutions to world-class organizations. By combining deep expertise in machine learning, data engineering, and analytics, they help businesses push the boundaries of modern technology.

Our client is a full-stack data consultancy with a clear mission — to become the market leader in Modern Data Stack consulting. Their portfolio spans from fast-growing North American scale-ups to well-established global tech companies, all benefiting from their advanced AI and data capabilities.

About the role

We are seeking a DevOps or Data Engineer with hands-on experience in cloud infrastructure, data migration, and containerized environments. The ideal candidate has previously worked on Azure-to-GCP migrations, understands both DevOps automation and data engineering pipelines, and can handle the transition of microservices and data systems with minimal downtime.

What You’ll Do

  • Configure and manage Google Kubernetes Engine (GKE) clusters.
  • Deploy and manage infrastructure using Terraform.
  • Migrate FastAPI/Python microservices to GKE (including container image rebuilds).
  • Implement automated scaling for streaming pipelines and services.
  • Design and configure network topology including VPCs, subnets, and firewall rules for WebRTC traffic.
  • Tune WebRTC stack architecture for GCP, including STUN/TURN server configuration.
  • Plan and execute service mesh implementations.
  • Manage GPU VM specifications and allocation strategies for compute-intensive workloads.
  • Handle Azure to GCP microservice migration, ensuring performance and reliability.
  • Configure Google Cloud Storage (GCS) and migrate existing Azure Blob Storage data (a transfer sketch follows this list).
  • Execute the Azure Event Hub → Google Pub/Sub migration for real-time data streaming (see the bridge sketch below).
  • Build automated data migration pipelines from Azure Cosmos DB → PostgreSQL (see the batch-copy sketch below).
  • Handle vector store data migrations, ensuring semantic search capabilities are preserved.
  • Perform data validation and reconciliation to confirm a successful end-to-end migration (see the reconciliation sketch below).
  • Work with both Azure Cosmos DB and PostgreSQL, understanding schema mapping and data consistency.
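For the Blob Storage to GCS item above, a minimal transfer sketch in Python, assuming the azure-storage-blob and google-cloud-storage client libraries; the connection string, container, and bucket names are placeholders, and a large-scale migration would more likely lean on Storage Transfer Service than hand-rolled copies:

    # Copy every blob from an Azure container into a GCS bucket.
    from azure.storage.blob import BlobServiceClient
    from google.cloud import storage

    azure_service = BlobServiceClient.from_connection_string("<AZURE_CONN_STR>")
    container = azure_service.get_container_client("src-container")  # placeholder name

    gcs = storage.Client()  # uses Application Default Credentials
    bucket = gcs.bucket("dst-bucket")  # placeholder name

    for props in container.list_blobs():
        data = container.download_blob(props.name).readall()
        bucket.blob(props.name).upload_from_string(data)
        print(f"copied {props.name} ({len(data)} bytes)")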
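For the Event Hub to Pub/Sub item, a minimal forwarding bridge, assuming the azure-eventhub and google-cloud-pubsub libraries; the project, topic, and hub names are placeholders, and durable checkpointing would additionally need a checkpoint store such as BlobCheckpointStore:

    # Re-publish Azure Event Hub events onto a Google Pub/Sub topic.
    from azure.eventhub import EventHubConsumerClient
    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path("my-gcp-project", "my-topic")  # placeholders

    def on_event(partition_context, event):
        # Forward the raw payload; message attributes could carry hub metadata.
        future = publisher.publish(topic_path, data=event.body_as_str().encode("utf-8"))
        future.result()  # wait for the publish ack before checkpointing
        partition_context.update_checkpoint(event)  # in-memory only without a store

    consumer = EventHubConsumerClient.from_connection_string(
        "<EVENT_HUB_CONN_STR>", consumer_group="$Default", eventhub_name="my-hub"
    )
    with consumer:
        consumer.receive(on_event=on_event, starting_position="-1")  # from the start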
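For the Cosmos DB to PostgreSQL item, a batch-copy sketch, assuming the azure-cosmos and psycopg2 libraries and a hypothetical documents(id, body jsonb) staging table; an incremental pipeline would read the Cosmos change feed rather than doing a full scan:

    # Bulk-copy Cosmos DB documents into a PostgreSQL JSONB staging table.
    import json
    import psycopg2
    from psycopg2.extras import execute_values
    from azure.cosmos import CosmosClient

    INSERT_SQL = "INSERT INTO documents (id, body) VALUES %s ON CONFLICT (id) DO NOTHING"

    cosmos = CosmosClient("<COSMOS_URI>", credential="<COSMOS_KEY>")
    container = cosmos.get_database_client("appdb").get_container_client("items")

    conn = psycopg2.connect("dbname=target user=migrator")  # placeholder DSN
    with conn, conn.cursor() as cur:
        batch = []
        for doc in container.read_all_items():
            batch.append((doc["id"], json.dumps(doc)))
            if len(batch) >= 500:  # flush in fixed-size batches
                execute_values(cur, INSERT_SQL, batch)
                batch.clear()
        if batch:
            execute_values(cur, INSERT_SQL, batch)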
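And for the validation item, a reconciliation sketch using only the standard library: it compares record counts and per-record content hashes, assuming both sides can be normalized to JSON documents keyed by id:

    # Compare record counts and content hashes between source and target.
    import hashlib
    import json

    def record_hash(doc: dict) -> str:
        # Canonical JSON so key order does not affect the digest.
        canonical = json.dumps(doc, sort_keys=True, separators=(",", ":"))
        return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

    def reconcile(source_docs, target_docs) -> bool:
        src = {d["id"]: record_hash(d) for d in source_docs}
        dst = {d["id"]: record_hash(d) for d in target_docs}
        missing = src.keys() - dst.keys()
        mismatched = {k for k in src.keys() & dst.keys() if src[k] != dst[k]}
        print(f"source={len(src)} target={len(dst)} "
              f"missing={len(missing)} mismatched={len(mismatched)}")
        return not missing and not mismatched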

Required Skills & Experience

  • 5+ years of experience in DevOps or Data Engineering roles.
  • Proven success in multi-cloud migrations (Azure → GCP).
  • Strong experience with Terraform, GKE, Docker, and Kubernetes networking.
  • Experience in Python, FastAPI, and modern data pipelines.
  • Familiarity with vector databases and semantic search frameworks.
  • Knowledge of service mesh architectures (e.g., Istio, Linkerd).
  • Understanding of auto-scaling, load balancing, and GPU-based compute workloads.

This is a short-term engagement running until mid-December, with a chance to renew! Full overlap with the EST time zone.

About Mission.dev

Software Development
51-200 employees

Mission is a global network where senior software and product talent come together to learn, share, and join our marketplace to work on curated teams alongside vetted talent.