
Elementor
Published: November 22, 2020
Description
Responsibilities
- Design, architect, and build Elementor's data architecture from the ground up
- Implement best-in-class solutions for different use cases
- Utilize a combination of cloud-based and open-source tools and frameworks to solve our most complex data problems
- Develop tailor-made solutions as part of the data pipelines
- Implement data pipelines and data architecture that are scalable, fault-tolerant, and support high throughput and low latency, while considering security aspects at all times
- Consider cost optimization (FinOps) at all stages of the implementation
- Provide solutions that are best-in-class in terms of industry standards and best practices
Requirements
- BSc in Computer Science / Software Engineering or an equivalent/related academic degree - A Must
- Hold a relevant certification in big data architecture on one or more of the public clouds (AWS Solutions Architect / Big Data | GCP Data Engineer) - Nice to Have
- At least 4 years of industry experience in a similar role - A Must
- Familiar with off-the-shelf ETL tools (e.g., Matillion, Stitch, Fivetran) and orchestration frameworks (e.g., Airflow, AWS AppFlow, AWS Step Functions & Data Pipeline, AWS Glue, GCP Cloud Composer & Dataflow) - A Must
- Extensive experience in big data architectures at scale in the cloud (AWS | GCP) - A Must
- Deep understanding of and experience with data lakes & data warehouses in the cloud (S3, Redshift, Cloud Storage, BigQuery) - A Must
- Deep understanding of and experience with RDS and NoSQL databases (MySQL, Elasticsearch, MongoDB, Redis) - A Must
- You speak fluent Python but can understand other languages as well - A Must
- Familiar with Bash scripting and the Linux OS - A Must
- Experience with near-real-time data streaming solutions (mainly different types of queueing and streaming, pub/sub paradigms: Kafka, AWS SNS/SQS, Kinesis Firehose & Streams, GCP Cloud Pub/Sub & Dataflow) - Nice to Have
- You are accustomed to talking to internal stakeholders, understanding their needs, and building the absolute best-in-class data architecture to solve their problems while keeping the company's short- and medium-term needs in view