Overview
Mentee Robotics is redefining humanoid automation with an AI-first approach, integrating cutting-edge perception, reasoning, and dexterous manipulation into a fully autonomous humanoid robot that continuously adapts and learns. Our flagship product, Menteebot v3, is designed to integrate seamlessly into industrial, logistics, and retail environments, performing complex tasks with human-like adaptability.

We are looking for an experienced Senior Software Engineer to join our SW Engineering team. This role is central to our "data-focused" strategy: you will build the core data infrastructure that fuels our AI-first approach and be responsible for the entire lifecycle of our robotics data.

Responsibilities
• Design and implement high-performance, scalable software solutions, primarily using Python.
• Design and build robust, scalable ETL pipelines to ingest and transform multi-modal robotics data, including video, sensor streams (joint states), teleoperation logs, and open-source datasets.
• Architect and maintain our core data lake infrastructure, creating a "single source of truth" for tagged and versioned robotics data.
• Work closely with AI researchers to define labeling schemas and ensure data quality.
• Champion best practices in data engineering and software development within a cutting-edge robotics environment.

Requirements
• 5+ years of experience as a Software Engineer, Data Engineer, or ML Infrastructure Engineer.
• Extensive experience and strong proficiency in Python (a must-have).
• Deep understanding and hands-on experience with ETL and data pipeline development (a must-have).
• Proven experience working with data lake technologies.
• Experience with cloud platforms (e.g., AWS, GCP, Azure).
• Solid understanding of database systems (SQL/NoSQL).

Advantages
• Familiarity with robotics data (e.g., ROS bags, sensor time-series) or multi-modal data (video, text, sensor fusion).
• Experience with data annotation/labeling platforms (e.g., Label Studio, V7, or custom-built tools).
• A strong understanding of the data-centric challenges in modern AI (e.g., active learning, data curation for foundation models).
• Familiarity with containerization and orchestration (Docker, Kubernetes).
• Experience with stream processing technologies (e.g., Kafka).
• Deep understanding of Linux.