What you'll be doing:
Scope and build tools, microservices, workflows, and distributed applications to accelerate data mining and AI training.
Design and implement solutions for streaming, resilience, logging, security, authentication, workflow orchestration, and data management.
Deploy AI models.
Design and develop Retrieval-Augmented Generation (RAG) workflows enabling hybrid and agentic patterns.
Analyze and operationalize complex distributed systems for speed-of-light performance.
What we need to see:
Master’s degree in Computer Science or Electrical Engineering (or equivalent experience).
8+ years of experience developing high-performance, scalable software systems.
Strong programming skills in Python or Golang.
Proficiency with key technologies such as Kubernetes, Helm, Hive, Parquet, SQL, and vector databases (e.g., Milvus).
Strong architectural skills with a proactive, problem-solving mentality.
Experience in data mining or AI development.
Experience building ETL pipelines and working with big data engines.
Exceptional collaboration skills to work with system software and AI expert teams.
Eagerness to learn and adopt new technologies such as NVIDIA RAPIDS.
Ways to Stand Out from the Crowd:
Prior experience with large-scale real-time streaming, augmented reality, or data curation.
Prior background with Spark.
Exposure to the latest advances in AI, including Large Language Models, Vision-Language Models, and Retrieval-Augmented Generation (RAG).
Innovative results, including patents, publications, or open source contributions.
You will also be eligible for equity and benefits.