Jobs.ca

Data Engineer

Brampton
Mid Level
Full-Time

Top Benefits

Competitive Salary
Healthcare Benefits Package
Career Growth

About the role

Charger Logistics Inc. is a world-class asset-based carrier with locations across North America. With over 20 years of experience providing the best logistics solutions, Charger Logistics has transformed into a world-class transport provider and continues to grow.

Charger Logistics invests time and support into its employees to provide them with the room to learn, grow their expertise, and work their way up. We are seeking a Data Engineer with expertise in SQL, Python, DBT, and RisingWave to join our modern data team.

Responsibilities:

  • Design high-performance SQL pipelines across PostgreSQL, BigQuery, Snowflake, and MongoDB.
  • Develop Python applications for data ingestion, transformation, and automation.
  • Implement RisingWave streaming pipelines for real-time analytics.
  • Build Apache Kafka architectures for high-throughput data processing.
  • Orchestrate workflows using Apache Airflow on Google Cloud Platform.
  • Optimize queries and implement data quality checks across multiple platforms.
  • Mentor team members and collaborate with business stakeholders.
  • Deploy CI/CD workflows using Git for reliable pipeline management.

Requirements

Required Qualifications:

  • Bachelor's degree in Computer Science, Engineering, or related field.
  • 5+ years of data engineering experience with SQL, Python, and RisingWave.
  • Must have AlloyDB and CDC experience (DataStream/Debezium).
  • Expert DBT skills across BigQuery, Snowflake, and AlloyDB.
  • Expert SQL skills: CTEs, window functions, optimization across PostgreSQL, BigQuery, Snowflake.
  • Advanced Python: pandas, SQLAlchemy, API integration, streaming data processing.
  • Production experience with Apache Kafka, Apache Airflow, and Google Cloud Platform.
  • Experience with MongoDB, dimensional modeling, and both batch/streaming ETL pipelines.
  • Strong Git and collaborative development experience.

Technical Skills:

  • Core: SQL (advanced), Python, RisingWave (required).
  • Cloud: Google Cloud Platform, BigQuery, GCP native services.
  • Streaming: Apache Kafka, real-time data processing.
  • Orchestration: Apache Airflow (production experience).
  • Databases: PostgreSQL, Snowflake, MongoDB.
  • Tools: Git, Docker, CI/CD pipelines.

Preferred Qualifications:

  • GCP certifications, Terraform/CloudFormation experience.
  • Previous experience with RisingWave is strongly preferred.
  • Data visualization tools (Looker, Tableau, Power BI).
  • DataOps and analytics engineering best practices.
  • ClickHouse experience is preferred.

What You'll Build:

  • Scalable SQL pipelines across multiple database systems.
  • Python-based ETL/ELT solutions spanning cloud and on-premise.
  • Real-time streaming pipelines using RisingWave and Kafka.
  • GCP-native data solutions with automated quality checks.
  • Airflow-orchestrated workflows with CI/CD deployment.

Benefits

  • Competitive Salary
  • Healthcare Benefits Package
  • Career Growth

About Charger Logistics Inc

Transportation, Logistics, Supply Chain and Storage
501-1000

Charger Logistics strives to offer the best client-focused logistics solution. We start with flexibility. By offering various safe and efficient solutions for all product sizes, weights, and sensitivities, our limits are minimal. Additionally, our network, various locations throughout North America, and fleet size allow us to offer our clients what they need every time.

Charger Logistics was founded in the early 2000s and has grown by leaps and bounds since then: from owning a single truck to owning a fleet of over eight hundred trucks and two thousand trailers, including reefers, dry vans, chassis, flatbeds, step decks, and more! A lot has changed; however, our commitment to our clients will never change.
