Skyworks is expanding its Machine Learning & Data Science team and seeks an LLM / Machine Learning Engineering Summer Intern to help build AI‑powered applications. You will work on internal chatbots, LLM-driven tools, and Model Context Protocol (MCP) integrations to streamline engineering and manufacturing workflows. This internship provides hands-on experience with modern LLM architectures, data pipelines, chatbot systems, and production AI tooling.
What You'll Do
- Develop and enhance internal chatbots powered by large language models.
- Build and connect MCP servers to interface with internal engineering and manufacturing tools.
- Experiment with LLMs and transformer architectures for text understanding and automation.
- Support development of LLM-driven applications, retrieval workflows, embeddings, and agent tools.
- Prepare and process structured and unstructured data for model training and inference.
- Assist with building APIs and model-serving components.
- Collaborate closely with ML engineers, software developers, and domain experts.
- Communicate findings clearly to technical and non‑technical stakeholders.
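To give a flavor of one responsibility above, the embedding-based retrieval step can be sketched in plain Python/NumPy. This is an illustrative toy only, not Skyworks code: real pipelines here would use the LLM frameworks and vector databases named later in this posting, and the `retrieve` helper and sample vectors below are hypothetical.

```python
import numpy as np

def cosine_sim(a, b):
    # Cosine similarity between two embedding vectors
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def retrieve(query_vec, doc_vecs, top_k=2):
    # Rank document embeddings by similarity to the query embedding
    scores = [cosine_sim(query_vec, d) for d in doc_vecs]
    return sorted(range(len(doc_vecs)), key=lambda i: scores[i], reverse=True)[:top_k]

# Toy 2-D "embeddings"; production systems use model-generated vectors
docs = np.array([[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]])
print(retrieve(np.array([1.0, 0.1]), docs, top_k=2))  # → [0, 2]
```

In practice a vector database (e.g., FAISS or Chroma) replaces the brute-force loop, but the ranking idea is the same.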
What We're Looking For
- Currently pursuing a Bachelor’s, Master’s, or PhD in Computer Science, Electrical Engineering, Data Science, Applied Mathematics, or a related field.
- Strong proficiency in Python and ML libraries (PyTorch, TensorFlow, NumPy, Pandas, scikit‑learn).
- Experience with LangChain, LlamaIndex, or similar LLM frameworks.
- Familiarity with LLMs and transformer models (GPT, LLaMA, Mistral, etc.).
- Experience or strong interest in chatbots and conversational AI.
- Understanding of data preparation, preprocessing, and model evaluation.
- Strong problem-solving, communication, and teamwork abilities.
- Ability to work in a fast-paced, collaborative environment.
- Available from May/June through August 2026.
Nice to Have
- Experience with Model Context Protocol (MCP) tools or agent integrations.
- Experience with vector databases (FAISS, Pinecone, Chroma, Milvus).
- Experience with backend frameworks like FastAPI or Flask.
- Experience with cloud platforms (Azure, AWS, or GCP).
- Experience with Docker or containerized development.
- Exposure to engineering or industrial datasets.
- Interest in applying LLMs to real-world engineering and manufacturing workflows.
Technical Stack
- Languages & Core Libraries: Python, PyTorch, TensorFlow, NumPy, Pandas, scikit‑learn
- LLM Frameworks & Models: LangChain, LlamaIndex, GPT, LLaMA, Mistral
- Tools & Protocols: Model Context Protocol (MCP)
- Vector Databases: FAISS, Pinecone, Chroma, Milvus
- Backend & Cloud: FastAPI, Flask, Azure, AWS, GCP, Docker
Benefits & Compensation
- Compensation: USD $26.00 - $47.50 per hour.
Skyworks is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability, protected veteran status, or any other characteristic protected by law.