Overview

We are looking for a Senior Data Infrastructure Engineer to lead the design, build, and optimization of a modern data platform. The role involves hands-on work with cloud-based data technologies, building data lakes from scratch, and managing large-scale data pipelines while ensuring high performance, cost efficiency, and reliability. You will collaborate closely with data engineering, data science, analytics, and product teams to support business needs.

Key Responsibilities:
- Design and build scalable data lakes / platforms using technologies such as Snowflake, Databricks, BigQuery, or Redshift
- Develop and optimize large-scale data pipelines for batch and streaming use cases
- Ensure high performance, scalability, and cost efficiency across data systems
- Work with complex data workflows, AI models, transformations, and orchestration
- Apply best practices in data modeling, monitoring, security, and governance

Requirements:
- 5+ years in data engineering or data infrastructure roles
- Proven experience building modern data platforms or data lakes from scratch
- Strong Python programming skills and experience with Spark / PySpark
- Knowledge of distributed systems and cloud-based architectures
- Experience with ETL/ELT processes and handling data at scale

Nice to Have:
- Experience with cloud providers (AWS, GCP, Azure)
- Familiarity with orchestration tools (Airflow, Dagster)
- Knowledge of data governance, security, and access control
- Experience supporting analytics, BI, or machine learning workloads

What We Offer:
- Ownership of end-to-end modern data platforms
- Opportunity to tackle high-impact, large-scale data challenges
- Collaborative, professional engineering environment
- Competitive compensation and benefits

Job Requirements

- Design and build scalable data lakes / platforms using technologies such as Snowflake, Databricks, BigQuery, or Redshift
- Develop and optimize large-scale data pipelines for batch and streaming use cases
- Ensure high performance, scalability, and cost efficiency across data systems
- Work with complex data workflows, AI models, transformations, and orchestration
- Apply best practices in data modeling, monitoring, security, and governance