Overview
Data Engineer – Cyber Research & Development Team (Networks & Cryptography)

We are seeking an experienced Data Engineer to join a newly established and innovative Research and Development team in the field of Cyber Security. The role focuses on internet communications, cryptography, and large-scale data processing, and offers the opportunity to work on cutting-edge technologies from early research stages through to full product delivery.

The team operates across the domains of Cyber Security, Cryptography, Big Data, and Data Engineering, tackling complex data-driven engineering and algorithmic challenges. Our work emphasizes advanced data processing, storage, and analysis, and integrates Big Data architectures, data pipelines, cryptographic systems, networking, and cyber security into a unique and multidisciplinary environment.

Key Responsibilities

* Design, develop, and maintain large-scale data pipelines and data processing systems
* Work with Big Data and streaming technologies in complex production environments
* Collaborate with researchers and engineers on data-driven and algorithmic challenges
* Take part in the full development lifecycle, from technological ideation to production delivery
* Develop scalable, efficient, and secure data solutions for cyber and networking domains
Job Requirements
* Bachelor's degree in Computer Science, Software Engineering, Statistics, or a related field
* At least 3 years of experience in Data Engineering or Big Data roles
* Hands-on experience with Big Data technologies such as Hadoop, Spark, Kafka, Flink, or Beam
* Proven experience building data pipelines and ETL processes in complex environments
* Strong knowledge of SQL and NoSQL databases (e.g., PostgreSQL, MongoDB, Cassandra)
* Experience working with cloud platforms such as AWS, GCP, or Azure
* Passion for advanced technologies, curiosity about new domains, creativity, and problem-solving skills

Advantages

* Experience with streaming data and real-time data processing systems
* Familiarity with messaging technologies such as Apache Kafka or RabbitMQ
* Knowledge of Machine Learning or Data Science
* Experience with Docker and Kubernetes for deploying data systems
* Experience with Data Warehousing and BI tools such as Snowflake, Tableau, or Power BI
* Programming experience in Python, Scala, or Java
* Experience with version control systems, especially Git