Sr. Data Engineering Specialist (AWS)
About the role
Sr. Data Engineering Specialist (AWS)
Description
You’ll be joining a pivotal moment in our organization’s evolution as we establish the foundational data infrastructure that will power wealth management decisions for years to come. As a Sr. Data Engineering Specialist, you won’t just be building pipelines – you’ll be architecting the data backbone that enables personalized wealth experiences for our clients and data‑driven insights for our advisors.
We are building a practice of excellence that integrates deep wealth domain knowledge with modern cloud‑native data engineering, focused on delivering curated datasets and data products that empower digital advisors, client personalization, regulatory compliance, and advanced analytics. This is an opportunity to work alongside seasoned practitioners with real‑world wealth management expertise, collaborate closely with DnA teams across the Wealth lines of business, and contribute to solutions that have tangible impact on investment, insurance, and advisory services. If you’re driven by purpose, passionate about data craftsmanship, and eager to be part of a forward‑looking, data‑centric culture, we’d love to have you onboard.
What Makes This Opportunity Different:
- Greenfield Innovation: Lead the establishment of our AWS DataOps and MLOps practices from the ground up, with the autonomy to implement best practices and cutting‑edge technologies
- Domain Expertise Advantage: Work alongside seasoned wealth management practitioners who understand the nuances of client portfolios, risk assessment, and regulatory requirements – translating complex business needs into elegant technical solutions
- Enterprise Impact: Your data products will directly influence investment strategies, client outcomes, and business growth across our wealth line of business
- Technical Excellence: Leverage the full AWS ecosystem to build scalable, secure, and compliant data solutions that meet the stringent requirements of wealth management
- Career Acceleration: Gain specialized expertise in wealth domain data engineering – a highly valued and differentiated skill set in the financial services industry
This role drives the development and implementation of modern data engineering practices, policies, and cloud‑native solutions to support Wealth Management’s DataOps platform on AWS. The specialist will manage the full data lifecycle, ensuring secure, efficient, and cost‑effective collection, transformation, storage, and access of data across hybrid and cloud‑native environments.
As a senior member of the Wealth Data Engineering team, the role integrates data from legacy investment platforms and new cloud‑based systems to build a unified, governed data tier. It applies DataOps and DevOps principles using AWS‑native services to enable real‑time insights, automation, and reliable delivery of curated datasets for analytics, ML, and regulatory use cases.
**This is a HYBRID role.**
Main Responsibilities
- Design and implement secure, scalable, and cost‑efficient data pipelines using AWS Glue, Step Functions, Apache Airflow, and Lambda to support batch and streaming ingestion from cloud‑native, third‑party (e.g., Snowflake), and on‑premises sources, both relational and non‑relational.
- Build and maintain a centralized data lake using AWS foundational components such as S3, Lake Formation, and Apache Iceberg to enable federated access and fine‑grained security across Wealth business domains.
- Integrate data from on‑premises and cloud systems using AWS DMS, Glue connectors, and Redshift federated queries.
- Design CI/CD pipelines for infrastructure‑as‑code and data deployments using AWS CloudFormation, CDK, CodePipeline, and/or Terraform.
- Monitor data health, quality, and lineage using AWS Glue Data Catalog and CloudWatch, and integrate with third‑party observability tools such as Monte Carlo or OpenMetadata.
- Collaborate with data scientists, analysts, and product owners to ensure curated datasets are discoverable, trustworthy, and reusable across analytics and machine learning use cases.
- Define and implement data quality rules, metadata capture, and schema versioning for curated assets using AWS Glue DataBrew and Data Catalog.
- Build AWS‑native services and solutions to support governance, access control, encryption, and auditing to meet financial industry compliance standards.
- Translate Wealth business requirements into data models using Redshift, Aurora (PostgreSQL), and DynamoDB for varied analytical and operational workloads.
- Champion DataOps principles by building reusable components, self‑service capabilities, and automated testing, version control, and deployment processes for data assets.
- Create and maintain an optimal data pipeline architecture across AWS Cloud and hybrid infrastructure.
- Contribute to container orchestration (ECS, Fargate) and data lake security (Lake Formation).
- Build analytics tools that use the data pipeline to provide actionable insights into customer behaviour, operational efficiency, and other key business performance metrics, using AWS Redshift and Power BI.
- Work with experts in data strategy and data governance to suggest quality enhancements.
- Create and maintain applications in multiple programming languages.
- Design, configure, and implement cloud‑native software that solves real business problems using cloud technology.
- Create documentation and operating instructions for cloud computing operations.
- Establish development environments and continuous integration pipelines, applying modern application patterns and efficient coding practices.
- Translate user needs into technical specifications by eliciting, conceptualizing, and refining technical requirements from users.
- Operate at a group/enterprise‑wide level and serve as a specialist resource to senior leaders and stakeholders.
- Apply expertise and think creatively to address unique or ambiguous situations and to solve problems that can be complex and non‑routine.
- Implement changes in response to shifting trends.
- Broader work or accountabilities may be assigned as needed.
Technical Qualifications
- 7+ years of experience in data engineering or cloud‑based data platform roles, including a minimum of 5 years with AWS data solutions
- 7+ years of experience with each of the following skills/areas:
- AWS Services: S3, Glue, Redshift, Lake Formation, Lambda, Step Functions, CloudFormation, CodePipeline, IAM, DMS, Athena, CloudWatch, DataBrew, Data Catalog.
- SQL and NoSQL database design (Redshift, Aurora PostgreSQL, DynamoDB).
- Metadata management, data lineage, and data observability tooling.
- Infrastructure‑as‑Code and DevOps tools: Terraform, CDK, Git, Jenkins, CodeBuild.
- Python, PySpark, SQL, and shell scripting for data engineering.
- Familiarity with ML workflows on Amazon SageMaker is a plus.
Methodologies (Intermediate to Advanced): 5–7+ years of experience in each of the following areas:
- Agile methodology and Scrum‑based delivery in cross‑functional teams.
- DataOps, MLOps, and DevOps principles applied to data engineering.
- Knowledge of financial services data models and regulatory compliance in wealth management (e.g., KYC, suitability, client reporting).
Core Skills
- Excellent problem‑solving, communication, and stakeholder engagement abilities.
- Ability to manage complexity, ambiguity, and shifting priorities across domains.
- Strong time management skills.
- Proven ability to work with teams across the organization, both horizontally and vertically.
Salary
$94,600.00 - $176,000.00
Pay Type: Salaried
The above represents BMO Financial Group’s pay range and type.
Salaries will vary based on factors such as location, skills, experience, education, and qualifications for the role, and may include a commission structure. Salaries for part‑time roles will be pro‑rated based on number of hours regularly worked. For commission roles, the salary listed above represents BMO Financial Group’s expected target for the first year in this position.
BMO Financial Group’s total compensation package will vary based on the pay type of the position and may include performance‑based incentives, discretionary bonuses, as well as other perks and rewards.
Benefits
BMO also offers health insurance, tuition reimbursement, accident and life insurance, and retirement savings plans. To view more details of our benefits, please visit: https://jobs.bmo.com/global/en/Total-Rewards
About Us
At BMO we are driven by a shared Purpose: Boldly Grow the Good in business and life. It calls on us to create lasting, positive change for our customers, our communities and our people. By working together to innovate and push boundaries, we transform lives and businesses, and power economic growth around the world.
To find out more visit us at https://jobs.bmo.com/ca/en