ONLINE COURSE ON
LARGE LANGUAGE MODELS: A HANDS-ON APPROACH (3:1)
CCE-PROFICIENCE MAY – JULY 2026
Duration
3 months, May – July 2026
Schedule
Wednesday and Friday
7:00 PM to 8:30 PM
Lab timings: Tuesday and Thursday, 7:00 PM to 8:30 PM
Course offered
Online
Exam Period
31 July to 9 August 2026
Classes Start
4 May 2026 (tentative)
Objectives of the Course
This course provides hands-on engineering of Large Language Models (LLMs), focusing on the challenges of building, optimizing, and deploying them. Students will cover the entire lifecycle: Transformer foundations, GPU scaling, inference optimization, fine-tuning, retrieval-augmented generation (RAG), agentic tool use, and deployment on edge devices. Students will learn through real-world case studies, labs, and projects. By the end, students will be equipped to design production-grade LLM systems for both research and industry applications.
Syllabus
Transformer foundations, GPUs, pretraining basics, inference optimization, quantization, fine-tuning, instruction tuning, reasoning and alignment, retrieval-augmented generation, tool use and agents, multimodal LLMs, evaluation, and edge deployment.
Minimum Qualification required by the candidates
B.E./B.Tech./M.Sc. (Computer Science)/MCA
Pre-requisites
Basics of NLP, Deep Learning, Python Programming
Software/Hardware to be used for the Lab
Python and CUDA (open-source software); Google Colab and AWS SageMaker (cloud platforms)
Reference Books
- Hands-On Large Language Models: Language Understanding and Generation by Jay Alammar and Maarten Grootendorst
- Large Language Models: A Deep Dive — Bridging Theory and Practice by Uday Kamath et al.
- AI Engineering by Chip Huyen
- Various technical blogs
Know The Facilitators

Yoginder Kumar Negi
Supercomputer Education and Research Centre (SERC),
Indian Institute of Science, IISc Bangalore
Course Fee
| Particulars | Amount (₹) |
| Course Fee | 20,000 |
| Application Fee | 300 |
| GST @ 18% | 3,654 |
| Total | 23,954 |

