About the role
The Opportunity
The Data Operations Analyst I is a hybrid role that blends foundational SQL Server administration with modern cloud-based data operations in Microsoft Fabric. This role supports day-to-day operations of on‑premises Microsoft SQL Server databases (availability, performance, security, backup/restore) while building and operating lightweight data pipelines and workflows in Fabric (OneLake/Lakehouse, Data Pipelines, Dataflows Gen2, Notebooks). The analyst partners with senior DBAs, data specialists, and analysts to deliver reliable data platforms, automate routine operations, and provide excellent customer service.
Responsibilities include:
- Maintain and optimize on‑premises SQL Server environments by keeping instances patched, ensuring reliable backups and restores, and tuning performance to ensure an optimized and efficient environment.
- Protect the data platform across SQL Server and Fabric by enforcing least‑privilege/RBAC, managing credentials and secrets appropriately, and operating within Canadian data residency and privacy requirements while assisting with ITIA/POPA audit evidence when required.
- Build and operate Microsoft Fabric data workflows using Data Pipelines, Dataflows Gen2, and Notebooks, with monitoring and alerting that sustain data‑freshness and overall pipeline reliability.
- Manage Lakehouse/OneLake data assets by creating and maintaining Delta schemas, administering permissions, and promoting changes consistently through dev/test/prod environments.
- Support Power BI operations by overseeing dataset refresh health, resolving gateway/connectivity issues, and helping govern workspaces in partnership with BI administrators.
- Maintain clear, up‑to‑date operational documentation (ERDs, data‑flow diagrams, and lineage) supported by runbooks and SOPs, and keep ITSM incident and change records accurate and current, aligning with evolving Microsoft Fabric standards.
- Meet disaster‑recovery and business‑continuity objectives by performing routine restore tests and DR exercises and keeping recovery playbooks current and aligned to organizational BCPs.
- Deliver light data engineering by designing, implementing, and maintaining SQL‑based transformations and reusable data objects, applying baseline data‑quality controls and operational logging to ensure reliable, auditable data flows.
- Support the design and implementation of automation for recurring operational tasks using PowerShell or Python. Work with senior team members to schedule and orchestrate these automations through SQL Server Agent or Microsoft Fabric, helping to reduce manual workload and improve service reliability.
- Provide high‑quality customer support by triaging incidents, executing service requests on time, communicating status and impact clearly, and completing thorough resolution documentation. Participate in scheduled after‑hours patching windows and the on‑call rotation to meet service commitments.
- Implement changes through ITSM/CAB with required approvals and scheduling, follow documented technical policies, security controls, and architecture standards, and complete a brief post‑implementation review when required.
You Bring
- Completion of a post-secondary information technology degree or diploma, with a focus in database management.
- Microsoft SQL Server certification, or successful completion within the first 3 months.
- Microsoft Certified: Azure Database Administrator Associate (DP-300), or successful completion within 8 months.
- Microsoft Certified: Fabric Analytics Engineer Associate (DP-600) is considered an asset.
- Prior experience in a DBA/operations role supporting on‑premises Microsoft SQL Server environments, with awareness of Microsoft cloud environments.
- Proficiency with T‑SQL for administrative tasks such as health checks, data fixes, and parameterized queries, along with basic automation using SQL Server Agent jobs and alerts.
- Solid grasp of backup/restore strategies (full/diff/log), retention, encryption where applicable, and experience with restore procedures.
- Performance monitoring and troubleshooting using tools such as SSMS.
- Security administration basics including logins, users, roles, least‑privilege access, and patching routines.
- Windows Server fundamentals for databases and participation in after‑hours patching/maintenance windows.
- Experience participating in incident/change management (ITSM), including clear resolution notes and following CAB/approved windows for production changes.
- Ability to document environments with ERDs, data‑flow diagrams, lineage, and maintain clear runbooks/SOPs.
- Experience with PowerShell (or Python) for small administrative scripts.
- Familiarity with Microsoft Fabric concepts (Lakehouse/OneLake, Data Pipelines, Dataflows Gen2, Notebooks) and Power BI service operations (dataset refresh/gateway basics).
- Ability to respond calmly, professionally and with a sense of urgency regarding escalated incidents.
- Ability to establish effective, collaborative working relationships, build trust, and interact with others in a positive manner.
- Self-motivated with the ability to work independently and in a collaborative team environment.
- Ability to adapt well in an ever-changing environment.
- Ability to see the big picture and broader implications of issues/solutions.
- Ability to perform under pressure, handle interruptions and changes, and plan assignments and monitor performance according to priorities.
- Ability to provide consistent follow-through with the team on issues/concerns to ensure appropriate visibility and escalation where needed.
We Offer
Along with a competitive compensation program and City paid health and dental premiums, this position also includes:
- Excellent health, dental, paramedical, and benefits plan
- First-in-class pension plan
- Career development and tuition reimbursement
- Employee discounts, annual adult Genesis Place pass, social events, and health & wellness initiatives
Continuous learning through training and development is encouraged, as are flexible work arrangements where possible. We recognize that our people work best when they feel engaged in their environment and appreciated for their efforts, and our overall benefits package reflects that.
Additional Information
The position is full time (37.5 hours per week) and a comprehensive benefits package is included.
- Please provide a cover letter along with your resume as a means of introducing yourself and your interest in this role.
About City of Airdrie
Airdrie is one of the fastest growing communities in Canada. With over 70,000 residents, Airdrie is a dynamic and growing city located just 15 minutes north of Calgary, Alberta. The City of Airdrie is the municipal government responsible for the delivery of services to Airdrie residents and businesses.