About the role
We are seeking a Senior Data Engineer with proven expertise in Databricks, Snowflake, and AWS. This fully remote contract role requires hands-on experience designing, developing, and optimizing scalable, secure, and high-performance data pipelines across modern cloud ecosystems.
Responsibilities
- Design, develop, and maintain data pipelines and ETL workflows using Databricks and Snowflake (a minimal sketch follows this list).
- Implement data ingestion, transformation, and orchestration solutions across structured and unstructured data sources.
- Develop and optimize data models, warehouse schemas, and partitioning strategies for analytical performance.
- Build and maintain AWS-based data infrastructure (e.g., S3, Lambda, Glue, Redshift, IAM, CloudFormation).
- Ensure data security and compliance through encryption/decryption processes and governance frameworks (e.g., encrypt/decrypt guest reservation data).
- Implement CI/CD pipelines for data engineering using tools like GitHub Actions, AWS CodePipeline, or Azure DevOps.
- Collaborate with data scientists, analysts, and architects to align infrastructure with business intelligence needs.
- Monitor, troubleshoot, and resolve data pipeline performance or reliability issues.
- Document technical solutions and follow best practices for code versioning, testing, and deployment.
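As a concrete illustration of the pipeline work described above, here is a minimal PySpark sketch of a Databricks-to-Snowflake flow. All specifics are hypothetical: the S3 path, table names, and Snowflake credentials are placeholders, and the `snowflake` write format assumes a Databricks cluster where the Snowflake Spark connector is available.

```python
# Hypothetical pipeline: raw S3 JSON -> cleaned Delta table -> Snowflake.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # already provided in Databricks notebooks

# Ingest raw reservation events from S3 (bucket/path are placeholders).
raw = spark.read.json("s3://example-bucket/raw/reservations/")

# Basic cleanup: drop duplicates, normalize types, filter bad records.
cleaned = (
    raw.dropDuplicates(["reservation_id"])
       .withColumn("check_in", F.to_date("check_in"))
       .filter(F.col("reservation_id").isNotNull())
)

# Persist to a Delta table for downstream lakehouse consumers.
cleaned.write.format("delta").mode("overwrite").saveAsTable("staging.reservations")

# Push the curated data to Snowflake via the Spark connector.
sf_options = {
    "sfUrl": "example_account.snowflakecomputing.com",  # placeholder account
    "sfUser": "ETL_USER",
    "sfPassword": "...",          # use a secrets manager in practice
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "ETL_WH",
}
(cleaned.write.format("snowflake")
        .options(**sf_options)
        .option("dbtable", "RESERVATIONS")
        .mode("overwrite")
        .save())
```

In a real deployment, credentials would come from a secrets manager (e.g., Databricks secret scopes) rather than inline options.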
Must Have
- 5+ years of experience in data engineering, building and maintaining cloud-based data solutions.
- Hands-on experience with Snowflake (mandatory):
  - Expertise in Snowflake SQL, data modeling, staging, warehouse optimization, time travel, and data sharing (time travel is illustrated in the first sketch after this list).
  - Experience integrating Snowflake with Databricks and AWS data services.
- Strong proficiency in Databricks (PySpark, Delta Lake, notebook development).
- Solid knowledge of AWS services such as S3, Glue, Athena, Lambda, Step Functions, and Redshift.
- Proficiency with Python and SQL for data manipulation and ETL logic.
- Strong understanding of ETL/ELT frameworks, data lakehouse architectures, and data governance principles.
- Experience with data encryption, decryption, and key management best practices (a KMS-based example appears in the second sketch after this list).
- Excellent communication, documentation, and collaboration skills.
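The Snowflake bullet above calls out time travel; the following is a minimal sketch of how it is typically queried, using the snowflake-connector-python package. The account, credentials, table name, and query ID are all placeholders.

```python
# Hypothetical time-travel queries via snowflake-connector-python.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",   # placeholder credentials
    user="ANALYST",
    password="...",
    database="ANALYTICS",
    schema="PUBLIC",
    warehouse="QUERY_WH",
)
cur = conn.cursor()

# Query the table as it looked one hour ago (OFFSET is in seconds).
cur.execute("SELECT COUNT(*) FROM reservations AT(OFFSET => -3600)")
print("rows one hour ago:", cur.fetchone()[0])

# Inspect the table as it was just before a bad DML statement,
# given that statement's query ID (placeholder below).
cur.execute("""
    SELECT * FROM reservations
    BEFORE(STATEMENT => '01a2b3c4-0000-0000-0000-000000000000')
""")

cur.close()
conn.close()
```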
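For the encryption and key-management bullet, here is a hedged sketch of field-level encryption with AWS KMS via boto3. The key alias and field values are hypothetical; note that direct kms.encrypt is limited to 4 KB of plaintext, so larger payloads would use envelope encryption with a generated data key.

```python
# Hypothetical field-level encryption of guest reservation data with AWS KMS.
import boto3

kms = boto3.client("kms", region_name="us-east-1")
KEY_ID = "alias/reservations-key"  # placeholder key alias

def encrypt_field(plaintext: str) -> bytes:
    """Encrypt a sensitive field (e.g., a guest name) under the KMS key."""
    resp = kms.encrypt(KeyId=KEY_ID, Plaintext=plaintext.encode("utf-8"))
    return resp["CiphertextBlob"]

def decrypt_field(ciphertext: bytes) -> str:
    """Decrypt a field; KMS infers the key from the ciphertext metadata."""
    resp = kms.decrypt(CiphertextBlob=ciphertext)
    return resp["Plaintext"].decode("utf-8")

token = encrypt_field("Jane Guest")
print(decrypt_field(token))  # -> "Jane Guest"
```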
Nice to Have
- Experience with Airflow, dbt, or AWS Glue Workflows for orchestration (a minimal Airflow sketch follows this list).
- Familiarity with Terraform or CloudFormation for infrastructure as code.
- Exposure to Azure Data Factory, Google BigQuery, or Kafka streaming pipelines.
- Knowledge of CI/CD automation for data pipelines.
- Certifications (e.g., AWS Certified Data Analytics – Specialty, Databricks Certified Data Engineer, SnowPro Core).
- Experience working in agile environments with DevOps and Git-based workflows.
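As a minimal illustration of the orchestration item above, here is a sketch of an Airflow DAG (Airflow 2.4+ syntax) chaining two placeholder tasks. The DAG id and task bodies are hypothetical; in practice each callable would trigger the Databricks and Snowflake steps sketched earlier.

```python
# Hypothetical Airflow DAG chaining the ingest and load steps described above.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest_to_delta():
    ...  # e.g., trigger the Databricks job that builds staging.reservations

def load_to_snowflake():
    ...  # e.g., run the Snowflake load/merge step

with DAG(
    dag_id="reservations_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+ argument name
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_to_delta", python_callable=ingest_to_delta)
    load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)
    ingest >> load  # load runs only after ingest succeeds
```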
Matheo Theodossiou
About Myticas Consulting
Myticas Consulting is an industry-leading IT staffing and recruitment organization with offices in major North American cities, including Ottawa, Ontario; Chicago, Illinois; and Atlanta, Georgia.
Our team of IT recruitment and sales professionals has more than 50 years of combined experience providing high-quality telecommunications engineering, ERP, and IT software development resources to both public- and private-sector enterprise environments.
At Myticas, we strive to provide IT resources of unmatched quality that stand the test of time and exceed project expectations and deliverables.
Our motto is not only to work with you, but to understand what success really looks like for your organization and to deliver the IT staffing that matters most to improving both the functionality and efficiency of your environment.
With our evolving expertise in telecom, ERP, DW/BI, and EAI resources, and a guaranteed replacement program for all our partners, we make the win-win scenario simple.