About the role
Who you are
- We're searching for passionate individuals eager to contribute to Alpaca's rapid growth
- If you align with our core values—Stay Curious, Have Empathy, and Be Accountable—and are ready to make a significant impact, we encourage you to apply
- Core Experience: 3+ years of experience in data analytics or data engineering with a strong focus on the "T" (transformation) in ELT
- Expert SQL Skills: High fluency in SQL for complex queries and data manipulation on large datasets
- Analytics Engineering Fundamentals: Deep understanding of data modeling, transformation principles, and data engineering best practices (e.g., source control, code reviews, testing)
- dbt Experience: Proven experience building scalable transformation layers using formalized SQL modeling tools, preferably dbt (a model sketch follows this list)
- Work Ethic: Comfortable with ambiguity, able to take ownership with minimal oversight, and adaptable in a fast-paced environment
- Experience with data ingestion tools (e.g., Airbyte) and orchestration tools (e.g., Airflow)
- Experience with semantic layer modeling (e.g., Cube, dbt Semantic Layer)
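To make the dbt requirement concrete, here is a minimal sketch of the kind of transformation model described above: an incremental daily aggregation over a staged payments table. All model, column, and file names (stg_payments, amount_cents, and so on) are hypothetical and not drawn from Alpaca's actual schema.

```sql
-- models/marts/fct_daily_payments.sql (hypothetical path)
-- Minimal dbt model sketch: rolls raw payment events up into a
-- daily fact table, rebuilding only recent days on incremental runs.

{{ config(
    materialized='incremental',
    unique_key='payment_date'
) }}

with payments as (

    select * from {{ ref('stg_payments') }}

),

daily as (

    select
        cast(created_at as date) as payment_date,
        count(*)                 as payment_count,
        -- sum in integer cents to preserve cent-level precision
        sum(amount_cents)        as total_amount_cents
    from payments
    {% if is_incremental() %}
    -- on incremental runs, only reprocess days at or after the
    -- latest date already in the target table
    where created_at >= (select max(payment_date) from {{ this }})
    {% endif %}
    group by 1

)

select * from daily
```

Summing in integer cents rather than floating-point dollars is one common way to meet the cent-level precision standard mentioned below.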
What the job involves
- We are seeking an Analytics Engineer to own and execute the vision for our data transformation layer
- You will be at the heart of our data platform, which processes hundreds of millions of events daily from a wide array of sources, including transactional databases, API logs, CRMs, payment systems, and marketing platforms
- You will join our 100% remote team and work closely with Data Engineers (who manage data ingestion) and Data Scientists and Business Users (who consume your data models)
- Your primary responsibility will be to use dbt and Trino on our GCP-based, open-source data infrastructure to build robust, scalable data models
- These models are critical for stakeholders across the company—from finance and operations to the executive team—and are delivered via BI tools, reports, and reverse ETL systems
- Own the Transformation Layer: Design, build, and maintain scalable data models using dbt and SQL to support diverse business needs, from monthly financial reporting to near-real-time operational metrics
- Set Technical Standards: Establish and enforce best practices for data modeling, development, testing, and monitoring to ensure data quality, integrity (up to cent-level precision), and discoverability (a sketch of such a test follows this list)
- Enable Stakeholders: Collaborate directly with finance, operations, customer success, and marketing teams to understand their requirements and deliver reliable data products
- Integrate and Deliver: Create repeatable patterns for integrating our data models with BI tools and reverse ETL processes, enabling consistent metric reporting across the business
- Ensure Quality: Champion high standards for development, including robust change management, source control, code reviews, and data monitoring as our products and data evolve
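As a sketch of the quality standards above, a singular dbt test (a SQL file under tests/ that returns failing rows) could enforce cent-level reconciliation between a mart and its source ledger. All model and column names here are again hypothetical.

```sql
-- tests/assert_payments_reconcile_to_ledger.sql (hypothetical path)
-- Singular dbt test sketch: returns any day where the transformed
-- totals drift from the source ledger by even one cent.

select
    f.payment_date,
    f.total_amount_cents,
    l.ledger_amount_cents
from {{ ref('fct_daily_payments') }} f
join {{ ref('stg_ledger_daily') }} l
    on f.payment_date = l.ledger_date
where f.total_amount_cents != l.ledger_amount_cents
```

dbt treats any returned row as a test failure, so a one-cent discrepancy is enough to block the build, which is one way to back the change management and monitoring practices this role would champion.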
Benefits
- Competitive Salary & Stock Options
- Health Coverage: Benefits start on day 1. In the US, this includes medical, dental, and vision. In Canada, this includes supplemental health care. Internationally, this includes a stipend to offset medical costs
- New Hire Home-Office Setup: One-time USD $500
- Monthly Stipend: USD $150 per month via a Brex Card
- Work with awesome people, clients and partners from around the world
About Alpaca
Alpaca is a developer-first API brokerage platform that supports hundreds of businesses globally. Alpaca offers stock, options, ETF, and crypto trading, real-time market data, and end-to-end brokerage infrastructure through modern APIs.
Alpaca has raised over $120m in funding and is backed by top investors in the industry globally, including Portage Ventures, Spark Capital, Tribe Capital, Social Leverage, Horizons Ventures, Unbound, SBI Group, Eldridge, Positive Sum, Elefund, and Y Combinator.