Data Platform Engineer [In Person, Toronto]
About Terminal
Terminal is Plaid for Telematics in commercial trucking. Companies building the next generation of insurance products, financial services and fleet software for trucking use our Universal API to access GPS data, speeding data and vehicle stats. We are a fast-growing, venture-backed startup supported by top investors including Y Combinator, Golden Ventures and Wayfinder Ventures. Our exceptionally talented team is based in Toronto, Canada.
For more info, check out our website: https://withterminal.com
Note: This role is only available to Toronto/GTA-based candidates.
About The Role
We’re looking for an engineer who thrives on building scalable data platforms and enjoys tackling complex backend challenges. This isn’t just a data engineering role: you’ll design and optimize the data platform that powers Terminal’s API, managing everything from data streaming and storage to analytics features at petabyte scale.
You will lead the build-out of the data platform toward its long-term vision. You’ll not only design and deliver complex systems, but also drive technical strategy, set architectural direction, and influence how the data platform evolves. You’ll partner closely with product, engineering teams, and leadership to ensure we’re building the right abstractions and reusable components that scale with our growth. You know when to slow down and build the right solution versus when to move quickly against a deadline. This is a role with broad ownership, where your leadership and judgment will raise the technical bar across the team and directly impact how customers succeed with high-volume telematics data.
What You Will Do
- Own projects aimed at enhancing data replication, storage, enrichment, and reporting capabilities.
- Build and optimize efficient streaming and batch data pipelines that support our core product and API.
- Design scalable storage solutions for handling petabytes of IoT and time-series data.
- Develop and maintain real-time data systems to ingest growing data volumes.
- Implement distributed tracing, data lineage and observability patterns to improve monitoring and troubleshooting.
- Manage the infrastructure that powers current and future services, ensuring scalability, reliability, and zero downtime for highly available (HA) services.
- Write clean, maintainable code in Java and Python for various platform components.
- Shape architectural decisions to ensure scalability and reliability throughout the data platform.
The Ideal Candidate Will Have
- 6+ years of experience in platform engineering or data engineering.
- 4+ years of experience designing and optimizing data pipelines at TB to PB scale.
- Proficiency in Java, with a focus on clean, maintainable code.
- Strong system design skills with a focus on big data and real-time workflows.
- Experience with lakehouse architectures (e.g., Iceberg, Delta, Paimon).
- Experience with real-time data processing tools such as Kafka, Flink and Spark.
- Knowledge of distributed systems and large-scale data challenges.
- Strong problem-solving skills and a collaborative mindset.
Nice-to-have:
- Experience with orchestration/workflow engines (e.g., Step Functions, Temporal).
- Experience with serverless and/or event-driven architectures (e.g., AWS Lambda, SQS).
- Experience with JavaScript/TypeScript (for cross-team work).
Tech Stack
- Languages: Java, Python
- Framework: Spring Boot
- Storage: AWS S3, AWS DynamoDB, Apache Iceberg, Redis
- Streaming: AWS Kinesis, Apache Kafka, Apache Flink
- ETL: AWS Glue, Apache Spark
- Serverless: AWS SQS, AWS EventBridge, AWS Lambda, AWS Step Functions
- Infrastructure as Code: AWS CDK
- CI/CD: GitHub Actions
Benefits
- Strong compensation and equity packages
- Brand new MacBook and computer equipment
- Top-tier health/dental benefits and a flexible healthcare spending account
- Personal spending account for professional development, fitness and wellness
- Four weeks paid time off + statutory holidays
- In-person culture with an office located in downtown Toronto
About Terminal (YC S23)
Terminal is a Universal API for Telematics Data in commercial trucking. Companies building insurance products, financial services and fleet software for commercial trucking use our Universal API to access GPS data, safety data, dash cam videos and vehicle statistics. Terminal is backed by world-class investors including Y Combinator & Golden Ventures.