Hunters
Published: March 19, 2024
Level of education: Undergraduate
Level of Hebrew: Medium
Location: Tel Aviv / Ramat Gan
Relevant years of experience required: More than 3 years

Description

Hunters SOC Platform is a Human-Driven, AI-Powered SIEM alternative that revolutionizes the way SOCs operate. Hunters automates the entire TDIR process, replacing repetitive human work with machine-powered detection, enrichment, correlation, prioritization, triage, and investigation, freeing analysts to proactively protect their organizations. Hunters utilizes an open security data lake architecture, ensuring complete and cost-effective coverage of the entire security stack.

Enterprises like Booking.com, Snowflake, and ABInBev leverage Hunters SOC Platform to empower their security teams. Hunters is backed by leading VCs and strategic investors including Stripes, YL Ventures, DTCP, Cisco Investments, Bessemer Venture Partners, U.S. Venture Partners (USVP), Microsoft’s venture fund M12, Blumberg Capital, Snowflake, Databricks, and Okta.

We are looking for a Senior Big Data Engineer with proven experience as a Backend Engineer to play a critical role on one of Hunters' Big Data teams. As a leading member, you will take part in building real-time detection & transformation engines that handle petabytes of data on top of our cloud infrastructure, as well as building backend components in our platform.

Responsibilities:

- Work as a member of a senior Agile Scrum team to design, develop, and maintain large-scale data stream processing pipelines and backend components on top of Flink, Spark, K8S, and AWS.
- Design, develop, and implement solutions to handle scaling with good performance & cost that are easy to maintain.
- Solve a wide variety of complex problems on a high scale.
- Collaborate with engineers across Hunters' R&D group and Product Managers to improve our platform.
- Develop and implement features and components in our platform that give internal and external users the ability to build data transformations and detection pipelines at scale.
- Develop and implement data quality checks to ensure the accuracy and completeness of data, and work on externalizing monitoring capabilities to provide full visibility into all stages of data ingestion.
- Optimize and tune the performance of Flink and warehouse jobs to handle large volumes of data.
- Stay up-to-date with the latest technologies and trends in big data processing and distributed computing.
- Participate in code reviews and ensure adherence to coding standards and best practices.

Requirements:

- 7+ years of experience as a Backend Engineer
- 5+ years of development experience in Scala with the functional programming paradigm.
- 3+ years of hands-on experience in Big Data engineering with a focus on Scala/Java, Spark/Flink/Kafka, and cloud architecture (EMR/K8S).
- Experience with modern Data lakes/warehouses such as Snowflake and Databricks.
- Deep technical expertise in distributed systems, stream processing, and data modeling of large data sets.
- Proven track record of delivering high-quality, scalable, and secure systems in a fast-paced working environment.
- Experience with data governance practices, data security, and performance & cost optimization, as well as with containers and AWS services such as S3, EKS, and more.
- Strong problem-solving skills and ability to work independently.
- A team player with excellent communication skills.
- B.Sc. in Computer Science or equivalent.

Advantages:

- Experience with Hadoop MapReduce.
- Experience with effect systems such as ZIO.
- Production experience working with SaaS environments.
- Experience in data modeling.

Apply