Data Engineer

Tel Aviv · Full-time · Intermediate

About The Position

We are looking for a seasoned data engineer to work closely with our data scientists and ML engineers, playing a key role in designing, developing, and automating the data collection, processing, and modeling pipelines for our smart-manufacturing products, which build on artificial intelligence and advanced machine learning.

The ideal candidate is a versatile, self-directed team player with demonstrated experience and knowledge in cloud, automation, and data technologies.

What you'll do:

  • Create and maintain an optimal data pipeline architecture
  • Assemble large, complex data sets that meet functional and non-functional business requirements
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Work with stakeholders, including the Product, Data Science, and Solution teams, to resolve data-related technical issues and support their data infrastructure needs
  • Build data tools that help the analytics and data science teams develop and optimize our product into an innovative industry leader
  • Work closely with the Data Science team

We'd love to hear from you if you have:

  • 5+ years of hands-on experience as a backend (server-side) developer
  • 3+ years of hands-on experience with Python
  • Experience working with Docker containers and Kubernetes
  • Experience with MLOps practices and tools (e.g. Kubeflow, MLflow)
  • Experience developing large-scale data systems, ETL pipelines, or stream-processing systems
  • Familiarity with data manipulation and Python ML frameworks (such as NumPy, Pandas, and scikit-learn)
  • Proficiency with relational (SQL) databases (such as SQL Server, PostgreSQL, MySQL)
  • Experience with at least one NoSQL or big data store (e.g. Redis, Cassandra, Parquet)
  • B.Sc. in Computer Science or equivalent from a leading university

It's a plus if you have:
  • Experience with Airflow, Kafka, Spark, or Databricks
  • Experience with service and microservice methodologies on Docker and/or Kubernetes
  • Experience with Microsoft Azure data services (Blob Storage, Data Warehouse, analytics services, Event Hubs)
  • Experience with Node.js
  • Experience with CI/CD methodologies and tools (e.g. Jenkins)

Apply for this position