Poolside is looking for a Member of Engineering focused on pre-training and CUDA to join our team building Large Language Models. You will work hands-on with our pre-training team, optimizing large-scale training runs through custom kernel development, with access to thousands of GPUs to validate your changes.
What You'll Do
- Profile large-scale training workloads and identify communication and computation bottlenecks
- Develop custom kernels to improve training performance
- Collaborate with researchers to make novel research ideas scale efficiently
- Enhance and maintain our training and inference codebases
- Write high-quality Python (PyTorch), Cython, and C/C++ code, and refactor existing code
- Collaborate with the team to plan next steps and stay in close communication
What We're Looking For
- Understanding of Large Language Models (LLMs)
- Basic knowledge of Transformers
- Knowledge of distributed training
- Strong CUDA background and experience with GPU programming
- Development experience with NCCL, CUTLASS, cuBLAS, etc.
- Understanding of NVLink, NVSwitch, NVSHMEM
- Strong software engineering background and programming experience
- Experience with Linux
- Strong algorithmic skills
- Proficiency in Python with PyTorch or JAX
- Proficiency in C/C++
- A mindset of using modern tools and always looking to improve
- Strong critical thinking and willingness to question code-quality policies when warranted
Technical Stack
- Languages & Frameworks: Python, PyTorch, JAX, C, C++, Cython
- GPU & Systems: CUDA, NCCL, CUTLASS, cuBLAS, Linux
Team & Environment
You'll join a multidisciplinary blend of research, engineering, and business experts.
Benefits & Compensation
- Fully remote work & flexible hours
- 37 days/year of vacation & holidays
- Health insurance allowance for you & your dependents
- Company-provided equipment
- Well-being, always-be-learning & home office allowances
- Frequent team get-togethers
Work Mode
This role is fully remote and open to candidates in Europe and North America.
Poolside is an equal opportunity employer committed to a diverse and inclusive people-first culture.