Analytics Engineer
About the role
Who you are
- We are seeking an experienced and versatile Analytics Engineer to join our dynamic team
- The ideal candidate will have a strong background in data engineering, analytics, and machine learning, with the ability to drive data-driven decision-making across the organization
- Education: Bachelor’s degree in Computer Science, Statistics, or a related field; Master’s degree preferred
- Experience: 2+ years of experience in data analytics or a related field, with significant exposure to AI and Machine Learning applications in analytics
- SQL Expertise: Advanced SQL skills with experience in writing and optimizing complex queries on large-scale datasets
- dbt Proficiency: Hands-on experience with dbt (data build tool) and its features for building, testing, and documenting data models
- Data Modeling: Expert-level knowledge of data modeling and data warehouse concepts (e.g., star schema, normalization, slowly changing dimensions)
- Snowflake & AI Capabilities: Experience with Snowflake’s Data Cloud platform and familiarity with its advanced AI capabilities (Snowflake Intelligence – Cortex Analyst, Cortex Agents, Cortex Search, AISQL, etc.) are highly preferred
- Business Intelligence Tools: Strong skills in Looker data visualization and LookML (including familiarity with Looker’s conversational AI and data agent capabilities) or similar BI tools
- AI Agents & Automation: Experience with AI agents or generative AI tools to optimize workflows and service delivery (such as creating chatbots or automated analytic assistants) is a plus
- Real-Time & Streaming Data: Experience with real-time data processing and streaming technologies (e.g., Kafka, Kinesis, Spark Streaming) for handling continuous data flows
- Programming: Proficient in Python for data analysis and manipulation (pandas, NumPy, etc.), with the ability to write clean, efficient code; a short illustrative sketch follows this list. Experienced with shell scripting and command-line tools for automating workflows and data processing tasks
- ETL/Orchestration: Familiarity with ETL processes and workflow orchestration tools such as Apache Airflow for automating data pipelines, along with Docker for local development and testing
- Cloud Platforms: Experience with cloud platforms and services (especially AWS or GCP) for data storage, compute, and deployment
- Version Control & CI/CD: Solid understanding of code versioning (Git) and continuous integration/continuous deployment (CI/CD) processes in a data engineering context
- Agile Methodology: Familiarity with agile development methodologies and ability to work in a fast-paced, iterative environment
- Soft Skills: Excellent communication and presentation skills, with critical thinking and problem-solving abilities. Proven track record of working effectively on cross-functional teams and translating business needs into technical solutions
- Data Governance & Ethics: Experience implementing data governance best practices, ensuring data quality and consistency. Knowledge of data ethics, bias mitigation strategies, and data privacy regulations (e.g., GDPR, CCPA) with a commitment to compliance
- Community & Open Source: Contributions to open-source projects or active participation in data community initiatives
- AI/ML Skills: Experience applying Artificial Intelligence/Machine Learning techniques in analytics (e.g., building predictive models for forecasting, churn prediction, or fraud detection). Practical experience deploying models and using MLOps/DataOps practices for lifecycle management
- Statistical Background: Solid foundation in statistics and probability, with ability to apply various modeling techniques and design A/B tests or experiments
- Additional Programming: Knowledge of additional programming or query languages (e.g., R, Scala, Julia, Spark SQL) that can be applied in analytics workflows
- Certifications: Certifications in relevant data technologies or cloud platforms (such as Snowflake, AWS, GCP, or Looker) demonstrating your expertise
- The ideal candidate will be a self-starter with a passion for data and analytics, capable of navigating complex datasets to uncover valuable insights
- They should be comfortable working in a fast-paced environment, adapting to new technologies, and driving innovation in our data practices
- They should communicate their findings effectively to both technical and non-technical audiences
- The ability to stay current with industry trends and continuously learn new technologies is essential in this role
- If you are a data enthusiast with a track record of delivering impactful analytics solutions and a desire to push the boundaries of what's possible with data, we want to hear from you!
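To give a flavor of the Python skills above, here is a minimal, hypothetical pandas sketch of a typical analytics transformation. The dataset, column names, and product lines are invented for illustration; real work would read from the warehouse rather than generate synthetic data.

```python
import numpy as np
import pandas as pd

# Synthetic example data: monthly active subscribers per product line.
rng = np.random.default_rng(seed=42)
months = pd.date_range("2024-01-01", periods=12, freq="MS")
df = pd.DataFrame({
    "month": months.repeat(3),
    "product": ["domains", "fiber", "telecom_software"] * 12,
    "active_subscribers": rng.integers(1_000, 5_000, size=36),
})

# Average month-over-month growth per product, computed within each group.
summary = (
    df.sort_values("month")
      .assign(mom_growth=lambda d: d.groupby("product")["active_subscribers"].pct_change())
      .groupby("product")["mom_growth"]
      .mean()
)
print(summary)
```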
What the job involves
- In this role, you will apply your advanced analytics expertise to extract actionable insights from raw data
- Data Modeling & Pipelines: Design, develop, and maintain complex data models in our Snowflake data warehouse. Use dbt to build efficient data pipelines and transformations for our data platform
- Snowflake Intelligence Integration: Leverage Snowflake Intelligence features (e.g., Cortex Analyst, Cortex Agents, Cortex Search, AISQL) to implement conversational data queries and AI-driven insights directly within our data environment. Develop AI solutions that harness these capabilities to extract valuable business insights (a minimal sketch of calling Cortex from Python follows this list)
- Advanced SQL & Analysis: Design and build advanced SQL queries to retrieve and manipulate complex data sets. Dive deep into large datasets to uncover patterns, trends, and opportunities that inform strategic decision-making
- Business Intelligence (BI): Develop, maintain, and optimize Looker dashboards and LookML to effectively communicate data insights. Leverage Looker’s conversational analytics and data agent features to enable stakeholders to interact with data using natural language queries
- Cross-Functional Collaboration: Communicate effectively with stakeholders to understand business requirements and deliver data-driven solutions. Identify opportunities for implementing AI/ML/NLP technologies in collaboration with product, engineering, and business teams
- Programming & Automation: Write efficient Python code for data analysis, data processing, and automation of recurring tasks. Use shell scripting and command-line tools to support data workflows and system tasks. Ensure code is well-tested and integrated into automated workflows (e.g., via Airflow job scheduling)
- Visualization & Presentation: Create compelling visualizations and presentations to deliver analytical insights and actionable recommendations to senior management and cross-functional teams. Tailor communication of complex analyses to diverse audiences
- Innovation & Best Practices: Stay up to date with industry trends, emerging tools, and best practices in data engineering and analytics (with a focus on dbt features, Snowflake’s latest offerings, and BI innovations). Develop and implement innovative ideas to continuously improve our analytics stack and practices
- This role offers the opportunity to make a significant impact on our organization's data strategy and contribute to critical business decisions through advanced analytics. The successful candidate will play a key role in shaping our data culture and driving the adoption of cutting-edge data technologies and methodologies
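For the Snowflake Intelligence bullet above, here is a minimal sketch of issuing a Cortex call from Python via the Snowflake connector. The connection values, model name, and prompt are placeholders, and the actual integration pattern used here may differ.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Placeholder credentials; in practice these would come from a secrets
# manager or environment variables, never hard-coded.
conn = snowflake.connector.connect(
    account="your_account",
    user="your_user",
    password="your_password",
    warehouse="ANALYTICS_WH",
)

prompt = "Summarize last quarter's churn drivers in two sentences."
try:
    cur = conn.cursor()
    # SNOWFLAKE.CORTEX.COMPLETE is a documented Cortex LLM function;
    # 'mistral-large' is one of several supported model names.
    cur.execute(
        "SELECT SNOWFLAKE.CORTEX.COMPLETE('mistral-large', %s)",
        (prompt,),
    )
    print(cur.fetchone()[0])
finally:
    conn.close()
```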
Benefits
- Flexible work hours and an asynchronous approach
- Remote-first
- "Distributed Workplace Bonus"
- Health, vision, dental benefits
- Annual Flexible Health Spending and Lifestyle Spending dollars
- Learning and development programs with a flexible, annual learning reimbursement fund
- Mental health benefits
- Employee resource groups
- Family/fertility planning support, parental and adoptive leave
- Three weeks of leave, plenty of personal days, and annual paid gratitude days over the holiday season
- Stock options
- Retirement savings plan contributions
About Tucows
We do a lot, but at our core, we’re in the business of keeping people connected and keeping the Internet open. We’re made up of three companies: Tucows Domains, Ting, and Wavelo.
As Tucows Domains, we help people find their place online as the world’s largest domain name wholesaler and the third-largest domain registrar globally.
As Ting Internet, we deliver high-speed fiber internet service to communities across the United States.
As Wavelo, we believe the future of telecom is simple. We build modern, flexible software for CSPs (communication service providers) globally.
Together, we are making the Internet better.
Tucows’ investor info (NASDAQ: TCX, TSX: TC) can be found at https://www.tucows.com/investors/
#JoinTheHerd at tucows.com/careers
[It's pronounced two-cows, btw.]