About the role
About Deck
We're building the data infrastructure for the internet. Deck makes scattered, login-protected data instantly accessible through straightforward APIs and integrations. No fluff—just the tools businesses need to make smarter decisions faster.
Our team combines sharp thinkers and builders from top tech companies, united by one belief: great ideas thrive on great data.
Our Product
Deck connects businesses to the data they need without the hassle. From energy data to bill payments, our platform makes sure your systems talk to each other. Backed by top VCs and led by proven founders, we're powering tomorrow's data platforms.
The Role
We’re looking for a Site Reliability & Quality Engineer to ensure the stability, performance, and integrity of our production services. You’ll work to optimize our web scraping infrastructure, guarantee data accuracy, and strengthen the resilience of our systems using advanced monitoring and automation tools.
What We’re Looking For
- Strong ability to improve the efficiency and reliability of scraping processes.
- Problem-solving mindset for web scraping challenges and continuous improvement.
- Experience with infrastructure for large-scale Python web scraping (e.g., Docker, Kubernetes).
- Proficiency with monitoring and observability tooling for scraping performance (e.g., Prometheus, Grafana, Datadog).
Why Join Deck?
- Competitive pay for the right skills
- Proven leadership with a track record of big results
- Outsized impact with significant ownership and autonomy
- World-class team tackling challenging problems
- Growth opportunities to expand expertise and advance your career
Check out our Constitution; if you don't dislike it, apply!