AI/ML Engineer - Retrieval-Augmented Generation (RAG)

Intriq

Software Engineering, Data Science
Velké Němčice, Czechia · Brno, Czechia
Posted on Sep 21, 2025

We’re looking for a mid- to senior-level engineer to join our team and help shape the future of AI-powered applications. This role is ideal for someone who thrives at the intersection of machine learning, large language models (LLMs), and retrieval-augmented generation (RAG).

What You’ll Do
  • Design, build, and optimize retrieval-augmented generation (RAG) pipelines to enhance LLM performance (a minimal sketch of the core retrieval step follows this list).
  • Develop and integrate semantic search, vector databases, and retrieval strategies to ground AI systems in reliable sources.
  • Experiment with different embeddings, ranking algorithms, and retrieval techniques to balance accuracy, speed, and scalability.
  • Fine-tune and adapt LLMs for domain-specific use cases, ensuring outputs are factual, context-aware, and explainable.
  • Collaborate closely with product and research teams to translate business needs into robust AI/ML solutions.
  • Define and implement evaluation frameworks for trustworthiness, relevance, and latency.
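To make the RAG work above concrete, here is a minimal illustrative sketch of the retrieval-and-grounding step: embed a small corpus, index it, retrieve the most relevant passages for a query, and assemble a grounded prompt. It assumes the sentence-transformers library for embeddings and FAISS as the vector index (FAISS is one of the options listed below); the model name, documents, and query are hypothetical placeholders, not a description of our stack.

```python
# Minimal RAG retrieval sketch (illustrative only; assumes faiss and
# sentence-transformers are installed, and uses a placeholder corpus).
import numpy as np
import faiss
from sentence_transformers import SentenceTransformer

documents = [
    "RAG grounds LLM answers in retrieved source documents.",
    "Vector databases store embeddings for fast similarity search.",
    "Evaluation should cover relevance, trustworthiness, and latency.",
]

# Embed the corpus and build an in-memory FAISS index.
model = SentenceTransformer("all-MiniLM-L6-v2")  # hypothetical model choice
embeddings = model.encode(documents, normalize_embeddings=True)
index = faiss.IndexFlatIP(embeddings.shape[1])  # inner product == cosine on normalized vectors
index.add(np.asarray(embeddings, dtype="float32"))

# Retrieve the top-k passages for a query and assemble a grounded prompt.
query = "How do we reduce hallucinations in LLM outputs?"
query_vec = np.asarray(model.encode([query], normalize_embeddings=True), dtype="float32")
scores, ids = index.search(query_vec, 2)
context = "\n".join(documents[i] for i in ids[0])
prompt = f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {query}"
print(prompt)
```

In production this index would live in a managed vector database and sit behind reranking and evaluation layers, which is exactly the kind of work described above.
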
What We’re Looking For
  • Strong background in machine learning, NLP, or information retrieval.
  • Experience with LLMs, RAG systems, and vector databases (e.g., Pinecone, Weaviate, FAISS, Milvus).
  • Proficiency in Python and modern ML frameworks (PyTorch, TensorFlow, Hugging Face).
  • Familiarity with MLOps practices for deploying and scaling ML-powered systems.
  • Ability to work independently and take ownership of complex projects from research to production.
  • Bonus: Experience with fine-tuning LLMs, prompt engineering, or ranking models.
Why Join Us
  • Work on cutting-edge AI/ML systems at the frontier of retrieval and generation.
  • Help build trustworthy, production-ready AI applications that reduce hallucinations and improve explainability.
  • Collaborate with a passionate team where your expertise directly shapes the product roadmap.