You will be the point person, champion, and owner of our data warehouse and processing pipelines, including ETL, performance, maintenance, and development. Working closely with our product, engineering, and business stakeholders, you will help prioritize and balance data needs with building a robust data infrastructure.
- Design, build, deploy, and maintain high-quality data pipelines
- Build and deploy the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
- Identify, design, and implement internal process improvements: automating manual processes, monitoring processes, optimizing data delivery, building deployment pipelines, re-designing infrastructure for greater scalability, etc.
- Work closely with our business analysts to optimize data delivery for BI and reporting needs
- Scope, estimate, and prioritize work to deliver business value and improve the data infrastructure
We are a very collaborative, close-knit group that is inclusive, fun, and hard-working. Our number-one priority after producing a great product is keeping that spirit alive. We have a firm, non-negotiable no-jerk policy. We offer generous time-off and work-from-home policies and strive to create an environment where you can do your best work.
- A bright, motivated, passionate engineer who wants to work with the same and is open-minded about applying the best tool for the job and solving real problems for real users
- You have 3-5 years of experience scaling and optimizing schemas and performance-tuning SQL and ETL pipelines, along with strong operational and development experience with at least one RDBMS (PostgreSQL strongly preferred)
- You are proficient in Python scripting and web frameworks (e.g., Flask)
- You have experience with the JVM
- You have experience building deployment pipelines with Jenkins or Kubernetes, and with AWS, Redshift, and infrastructure as code (Terraform and Ansible)
- You have experience with analytics tracking tools such as Google Analytics, Segment, or Heap
- Experience with BI tools such as Looker, Power BI, or Tableau is a strong plus
- You have a Bachelor's degree in Computer Science or equivalent experience