Luminal uses a search-based compiler to optimize AI models, making deployment faster and simpler.
Achieving this requires rethinking the AI stack from the ground up. As demand for inference grows, teams will need to run models across a wider range of hardware, not just the platforms with the most mature software support. Luminal makes AI workloads faster, more portable, and easier to deploy by automatically optimizing models for the target hardware. Our mission is to make state-of-the-art AI production-ready on any compute platform. We have already closed multiple contracts with non-NVIDIA hardware platforms.
Luminal is backed by Y Combinator and Felicis, as well as top-tier angels such as Paul Graham (co-founder of Y Combinator), Guillermo Rauch (founder of Vercel), and many others.
What You’ll Do
Design and build core compiler infrastructure in Rust
Build search-based optimization systems for discovering faster kernels
Develop backend code generation for multiple targets (NVIDIA, Trainium, AMD, TPU, etc.)
Implement compiler passes for fusion, scheduling, memory planning, and kernel selection
Profile real models, identify bottlenecks, and improve latency and throughput
Help shape Luminal’s engineering culture from the ground up
What We’re Looking For
Experience with compiler frameworks: LLVM, MLIR, or custom compilers
Hands-on experience with at least one of: PTX/SASS, GCN/RDNA assembly, or other GPU ISAs
Familiarity with ML compilers: torch.compile (or custom PyTorch backends), XLA, TVM, etc.
Nice to Haves
Background in distributed systems or multi-device compilation
Experience with egg/egglog is a strong plus
Experience with Rust
Experience developing and deploying AI models in production environments
Role Description
This is a full-time, on-site role for a Founding Compiler Engineer located in downtown San Francisco. You will be responsible for helping design and build the core compiler. Day-to-day tasks will include writing CUDA kernels, conducting model performance reviews, and shitposting on social media.