Lead Data Engineer
Responsibilities:
- Design, build, and maintain production-grade data pipelines using Airflow and AWS services such as Lambda and DynamoDB.
- Own data ingestion from internal systems and third-party integrations (e.g., Google, Bing, external APIs).
- Manage data storage and movement across S3, Snowflake, Snowpipe, and DynamoDB.
- Write and maintain custom Python code that runs reliably in production.
- Work across dev, staging, and production environments with proper deployment and rollback practices.
- Partner with analytics, data science, and product teams to design reliable, usable data models.
- Review code, mentor junior engineers, and help establish best practices for data quality, reliability, and observability.
- Identify and fix performance, cost, or reliability issues in existing pipelines.
Skills:
- Strong experience building and maintaining production data pipelines.
- Deep comfort with Python for data engineering (not just scripting).
- Hands-on experience with Airflow in a real production environment.
- Experience working in AWS, including S3 and managed services.
- Solid SQL skills and experience working with analytical warehouses (Snowflake strongly preferred).
- Experience operating services across multiple environments (dev / prod).
Benefits:
- 💻 Opportunity to work from home
- 🌟 Excellent work environment
- 🩺 Medical, dental, and vision insurance
- 🌴 Up to 15 days of paid time off
- 🎉 11 company observed holidays
- 👶 8 weeks of paid parental leave
- 💰 401k plan with company match
- 🛡️ Life insurance
- 📈 Professional growth opportunity
- 🌐 Most importantly, an inclusive company culture established by an incredible team!
Expires March 30, 2026 06:02