monday.com
  • 97 active jobs

  • Published: March 31, 2022
Category:
Job Type:
Level of education: High school
Level of Hebrew: Medium
Location of job: Tel Aviv / Ramat Gan
Relevant years of experience required: 3 years

Description

monday.com is looking for an experienced Data Infra Engineer with strong technical skills and a passion for data, who can design, implement & scale our next generation of data solutions and ecosystem. This is an opportunity to join us at a very exciting phase of designing & building the BigBrain 2.0 data systems. You'll join the BigBrain group, based at our headquarters in Tel Aviv, Israel.

About The Role:

As a Data Infra Engineer on the BigBrain data engineering team, you will be part of a growing team that builds a top-notch, hyper-scale big-data ecosystem serving the company's most influential data needs. You will design, develop, implement and support robust, scalable solutions, be responsible for core data components & systems such as the Data Lake and batch & streaming solutions, build high-scale data pipelines, and manage distributed systems.

Our Team:

The BigBrain group is taking big data into its own hands! We manage the data backbone of our company, which allows monday.com to thrive off of A/B testing, analytics tools, dashboards, and much more. Unlike external BI systems, BigBrain is the engine of the organization: we make decisions based on its insights, and change and set strategies based on its data.

Requirements

- Experience building and designing large-scale applications.
- Experience with Big Data architectures.
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience working in a cloud environment, preferably AWS.
- 3+ years of experience with a programming language (Python, Java, Scala, Ruby, etc.).

Advantage:

- Experience with Linux/Unix-based operating systems.
- Experience with designing data pipelines & data lakes.
- Experience with SQL and database architecture.
- Experience with workflow and data processing pipeline tools such as Airflow.

Apply
(Check your spam folder)
