About the role
About QuadReal Property Group
QuadReal Property Group is a global real estate investment, operating and development company headquartered in Vancouver, British Columbia, with $94 billion in assets under management. From its foundation in Canada as a full-service real estate operating company, QuadReal has expanded its capabilities to invest in equity and debt in both the public and private markets. QuadReal invests directly, via programmatic partnerships and through operating platforms in which it holds an ownership interest.
QuadReal seeks to deliver strong investment returns while creating sustainable environments that bring value to the people and communities it serves. Now and for generations to come.
QuadReal: Excellence lives here.
Reporting to the Team Lead, Data Platform, you will help build, scale, and optimize our enterprise data platform. This is a hands-on engineering position for someone who thrives in code, designing and delivering high-quality pipelines, APIs, and integrations using modern Python-based tools. You will work closely with Data Governance, IT, and business teams to ensure our solutions are scalable, maintainable, secure, and trusted. In addition to development work, you will contribute to platform standards and collaborate with team members to deliver high-impact, production-ready solutions.
Responsibilities
Engineering & Development
- Design, code, and deploy robust, maintainable data pipelines and APIs using Python and modern frameworks such as dbt, dltHub, and Apache Airflow.
- Automate environment provisioning with Infrastructure as Code (IaC), using Terraform to manage Azure storage, compute, and orchestration resources.
- Implement data quality checks, validation, and lineage to maintain enterprise trust and compliance.
Architecture & Standards
- Participate in platform architecture discussions, bringing a coder’s perspective to designing scalable, performant data solutions.
- Champion coding excellence: consistent naming conventions, clear documentation, version control discipline, and clean, readable Python.
- Ensure adherence to data governance and security standards, integrating with Microsoft Purview for classification, tagging, and access control.
Collaboration & Mentorship
- Work closely with Data Governance, Data Solutions, Advanced Analytics, and IT Security teams to ensure business alignment.
- Participate in code reviews and knowledge-sharing sessions with peers.
- Support a culture of continuous learning and experimentation within the team.
Qualifications and Experience
- Bachelor’s degree in Computer Science, Information Systems, Business Technology Management, or related field.
- 2–4 years of experience in data engineering or similar roles, with a strong programming background.
- Hands-on experience building enterprise-wide data pipelines and transformations using Python and dbt.
- Advanced Python skills — comfortable writing clean, efficient, production-grade code for pipelines, APIs, and data transformations.
- Proficiency in SQL for complex querying and data manipulation.
- Hands-on experience with workflow orchestration (e.g., Apache Airflow) and modern data stack tools (dbt, dltHub, Microsoft Fabric).
- Experience using Infrastructure as Code (IaC) to define, version, and deploy reproducible cloud data environments and platform components, with hands-on Terraform experience (ideally in Azure).
- Familiarity with modern data architectures (data lakehouse, data mesh, warehousing best practices).
- Strong communication and collaboration skills.
- Experience with governance tools (Microsoft Purview) and CI/CD workflows (Azure DevOps, GitHub Actions) is an asset.
What Success Looks Like in 12 Months
- 2–3 production-grade pipelines delivered, meeting performance SLAs and governance standards.
- Reduction in pipeline latency and improved data freshness across key domains.
- Contribution to at least one major platform standard or automation initiative adopted enterprise-wide.
- Recognition as a collaborative, reliable contributor to the data platform team.
This role is ideal for a data engineer who loves to code — someone who can move quickly from concept to working software, is comfortable tackling complex data challenges, and wants to shape the future of our enterprise data platform.
#LI-TV1
#LI-Hybrid
Note to Recruiters: QuadReal does not accept unsolicited resumes from any source other than directly from a candidate. Any unsolicited resumes sent to QuadReal, directly or indirectly, will be considered QuadReal property. QuadReal will not pay a fee for any placement resulting from the receipt of an unsolicited resume. A recruiting agency must first have a valid, written, and fully executed agency agreement for engaged services in order to submit resumes.
QuadReal Property Group will provide reasonable accommodation at any time throughout the hiring process for applicants with disabilities or for those needing job postings in an alternate format. If you require accommodation, please advise the Talent Acquisition team member you are working with and include the following: Job posting #, your name and your preferred method of contact.
About QuadReal
Headquartered in Vancouver, Canada, QuadReal Property Group is a global real estate investment, operating and development company. QuadReal manages the real estate and mortgage programs of British Columbia Investment Management Corporation (BCI), one of Canada’s largest asset managers with a $153.4 billion portfolio.
QuadReal manages a $37.6 billion portfolio spanning 23 Global Cities across 17 countries. The company seeks to deliver strong investment returns while creating sustainable environments that bring value to the people and communities it serves. Now and for generations to come.
QuadReal: Excellence lives here.