- Published: August 15, 2021
- Team8 Portfolio Company
Noogata is industrializing enterprise AI, enabling enterprises to infuse AI into key processes easily and at scale, at a fraction of the cost and complexity of current-generation solutions. Our mission is to empower enterprise data consumers across all business functions to leverage AI-powered features, seamlessly plugged into their system of choice. Noogata's technology allows companies to deploy our prefabricated, modular AI Boards for common industry use cases, then customize and scale them in production without custom development or in-house data-science capabilities.
Noogata is looking for world-class data fanatics to join us, build our product, and grow our vision. We're looking for customer-focused, self-driven individuals who want to solve big, complex problems with a huge impact. Our team of Software Engineers, Data Scientists, ML Engineers, and Customer Success Managers is passionate about delivering the promise of AI in the business world. So if you're all that and looking to lead, invent, and grow professionally, join us on our journey!
- Be part of our core engineering team developing our AI automation technology, with end-to-end ownership (from architecture to deployment) of core microservices in our technology stack.
- Build and own robust backend infrastructure (APIs, microservices, processing jobs) that streamlines the data workflow of large datasets for machine learning.
- Design system architectures, and evaluate and select tools and open-source projects to use within our architecture.
- Participate in our Agile, delivery-focused development process. Be part of a startup company: help set priorities and make design decisions based on your experience and insights.
- Automate, scale, and secure!
- 4+ years of development experience in backend and data engineering in Python or an equivalent language
- A fast learner and generalist: while you are a backend developer at heart, you enjoy working on different parts of the stack to get things done
- Experience working with big data technologies (e.g., BigQuery, Kafka, Spark, Dask, Beam) at scale
- Experience running Docker/Kubernetes in production
- Experience with cloud platforms (GCP, AWS, Azure), working on production workloads of large scale and complexity
- Experience working on enterprise software and data products; knowledge of or experience with machine-learning technologies is a big advantage
- Experience working in an Agile/Scrum environment with cross-functional engineering teams