Guesty

  • Published: August 4, 2022
Category:
Job Type:
Level of education: High school
Level of Hebrew: Medium
Location of job: Tel Aviv / Ramat Gan
Relevant years of experience required: 2 years

Description

Guesty provides a property management software product and support services for the $100 billion Airbnb and vacation rental market. We are one of the fastest-growing high-tech start-ups in Tel Aviv; we are profitable and showing very promising results.

Our team is growing, and we are looking for a talented DataOps Engineer.

The DataOps mission is to build and develop the organization's data backbone architecture, enabling the reliable ingestion and storage of large-scale data, and to develop internal frameworks and practices that equip engineers across the organization with the best tools and practices for their data-related operations needs.

We need our engineers to be independent, versatile, and enthusiastic about taking on new problems across the vast tech stack we work with. If this kind of working environment sounds exciting to you, and you understand that engineering is about building the most effective and elegant solution within a given set of constraints, consider applying for this position.

Responsibilities

- Designing and optimizing scalable data pipelines to keep pace with the company's fast growth.
- Full responsibility for our data platforms (DWH, streaming solutions, reporting services, ETL processes).
- Developing internal data tools and frameworks, such as orchestration tools, to support other engineering teams' data operations.
- Designing and optimizing data models and access patterns with both performance and cost-efficiency in mind.
- Improving the performance and reliability of our data systems against steadily increasing loads and varieties of work.

Requirements

- Proven experience designing and building distributed, large-scale production systems.
- Proven experience working with large-scale data processing (preferably Kafka).
- Experience with various SQL and NoSQL data stores, such as MongoDB, Postgres, Elasticsearch, Redis, and DynamoDB.
- Experience designing and optimizing high-volume data pipelines.
- Hands-on experience building large-scale solutions with data platforms and big data frameworks (e.g., Spark, Snowflake, Airflow, Kinesis).

Apply
