Data Engineer at futureproof consulting

  • Contract
  • Remote, Worldwide

Data Engineer

This is a senior-level position requiring deep technical expertise, strong problem-solving skills, and the ability to collaborate effectively with cross-functional teams in a regulated environment.

Responsibilities

  • Design, build, and optimize scalable and reliable data pipelines and data platforms that support advanced analytics, experimentation, and data-driven decision-making across manufacturing and business domains
  • Develop and optimize ETL/ELT pipelines for structured and semi-structured data
  • Work extensively with SQL and Python or Scala to process and transform large datasets
  • Build and manage data solutions on AWS cloud infrastructure
  • Leverage big data technologies such as Spark, Kafka, and Hadoop
  • Design and maintain data models and data warehousing solutions
  • Orchestrate data workflows using tools such as Apache Airflow
  • Ensure data integrity, quality, and accuracy through validation, cleaning, and preprocessing techniques
  • Optimize data collection, processing, and storage for performance, scalability, and cost efficiency
  • Design and conduct experiments to test hypotheses and validate data-driven solutions
  • Collaborate with analytics, data science, engineering, and business stakeholders to understand data needs and align solutions with organizational goals
  • Create templates, dashboards, and visualizations to communicate insights clearly to stakeholders
  • Apply and support data governance, security, and compliance standards, especially in a regulated pharmaceutical environment

Skills

  • 10+ years of experience in Data Engineering or similar roles
  • Strong expertise in SQL and Python or Scala
  • Hands-on experience with AWS (candidates with strong AWS exposure will be prioritized)
  • Solid experience with Spark, Kafka, and Hadoop
  • Proven experience designing and implementing ETL/ELT pipelines
  • Strong understanding of data modeling, data warehousing concepts, and analytics use cases
  • Experience with workflow orchestration tools such as Apache Airflow
  • Strong analytical thinking and problem-solving skills
  • Excellent communication and collaboration skills
  • Ability to work independently in a remote, distributed team

Contract Details

  • 💼 Project Duration: 10+ months
  • 💼 Work Model: Remote (collaboration during US time zones required)

Published about 16 hours ago • Expires March 15, 2026 06:02