ECTS - Applied Large Language Models

Applied Large Language Models (CMPE454) Course Detail

Course Name | Course Code | Season | Lecture Hours | Application Hours | Lab Hours | Credit | ECTS
Applied Large Language Models | CMPE454 | Area Elective | 3 | 0 | 0 | 3 | 5
Pre-requisite Course(s)
N/A
Course Language English
Course Type Elective Courses
Course Level Natural & Applied Sciences Master's Degree
Mode of Delivery Face To Face
Learning and Teaching Strategies Lecture, Question and Answer, Drill and Practice, Problem Solving.
Course Coordinator
Course Lecturer(s)
  • Asst. Prof. Dr. Arda Sezen
Course Assistants
Course Objectives To equip students with a solid understanding of Generative AI technologies, particularly focusing on Large Language Models (LLMs), Transformer architectures, and widely used development ecosystems such as Hugging Face. Students will explore foundational concepts, hands-on experiments, and real-world applications of LLMs.
Course Learning Outcomes The students who succeeded in this course;
  • Understand the foundations and impact of Generative AI and LLMs.
  • Gain practical experience with Hugging Face tools, NLP pipelines, and tokenization.
  • Analyze the Transformer architecture and its role in autoregressive training.
  • Examine scaling laws, pre-training approaches, and reinforcement learning with human feedback.
  • Apply parameter-efficient fine-tuning and integration of LLM components and systems.
  • Explore the use of LLM agents and compound systems in real-world tasks.
Course Content LLMs, Generative AI, tokenization, embeddings, NLP pipelines, attention, Transformer architecture, autoregressive training, LLM scaling and fine-tuning, RLHF, LLM system design, Hugging Face ecosystem, and deployment.
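The course content above includes attention and the Transformer architecture. As a preview of the kind of hands-on experiment the course describes, the following is a toy scaled dot-product attention computation in pure Python; it is not taken from the course materials, and the vectors and values are purely illustrative:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for a single query vector.

    query: list[float]; keys, values: list[list[float]].
    Returns the attention-weighted mixture of the value vectors.
    """
    d = len(query)
    # Similarity of the query to each key, scaled by sqrt(d).
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)  # weights sum to 1
    dim = len(values[0])
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(dim)]

# The query matches the first key most closely, so the output
# leans toward the first value vector.
q = [1.0, 0.0]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[10.0, 0.0], [0.0, 10.0]]
out = attention(q, K, V)
```

Because the attention weights sum to one, the output is always a convex combination of the value vectors; real Transformers apply the same computation in parallel over many queries and learned projections.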

Weekly Subjects and Related Preparation Studies

Week | Subjects | Preparation
1 | Course Introduction, Introduction to Generative AI | Course Book – Chapter 1
2 | Introduction to Hugging Face Tools and Models, Natural Language Processing and Language Models | Natural Language Processing with Transformers, Revised Edition – Chapter 1
3 | Tokens and Tokenization, Word Embeddings | Course Book – Chapter 2
4 | Experimenting with Word Embeddings, Tokens, and NLP | Course Book – Chapter 2
5 | Language Models and Attention | Course Book – Chapter 3
6 | The Transformer Architecture | Course Book – Chapter 3; Build a Large Language Model – Chapter 4
7 | Midterm Exam |
8 | Autoregressive Training | Transformer LLMs Lecture Notes; Natural Language Processing with Transformers, Revised Edition – Chapter 10
9 | LLM Architecture Variants, Scaling Laws in Training LLMs, Training Data for Larger LLMs | Natural Language Processing with Transformers, Revised Edition – Chapters 8 and 11
10 | Part I: Pre-training, continued pre-training, and task training; Part II: Reinforcement Learning with Human Feedback (chat preparation) | Course Book – Chapter 10; Build a Large Language Model – Chapter 5
11 | Parameter-Efficient Fine-Tuning Methods | Course Book – Chapter 12
12 | LLM Components | Lecture Notes; Course Book – Chapter 5
13 | LLM Compound Systems | Course Book – Chapter 7
14 | LLM Agents | Course Book – Chapter 7
15 | Review |
16 | Final Exam |
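Weeks 3–4 cover tokens, tokenization, and experimenting with them. The merge step behind byte-pair-encoding tokenizers (the family used by most LLM vocabularies) can be sketched in a few lines of pure Python; this is a simplified illustration, not the algorithm as implemented in any particular tokenizer library:

```python
from collections import Counter

def most_frequent_pair(tokens):
    # Count every adjacent token pair and return the most common one.
    pairs = Counter(zip(tokens, tokens[1:]))
    return pairs.most_common(1)[0][0] if pairs else None

def merge_pair(tokens, pair):
    # Replace every occurrence of the chosen pair with one merged token.
    merged, i = [], 0
    while i < len(tokens):
        if i + 1 < len(tokens) and (tokens[i], tokens[i + 1]) == pair:
            merged.append(tokens[i] + tokens[i + 1])
            i += 2
        else:
            merged.append(tokens[i])
            i += 1
    return merged

def bpe(text, num_merges):
    """Greedy BPE-style tokenization: start from characters and
    repeatedly merge the most frequent adjacent pair."""
    tokens = list(text)
    for _ in range(num_merges):
        pair = most_frequent_pair(tokens)
        if pair is None:
            break
        tokens = merge_pair(tokens, pair)
    return tokens

tokens = bpe("low lower lowest", 2)
```

After two merges the shared prefix "low" emerges as a single token, which is exactly why BPE vocabularies compress frequent substrings; production tokenizers learn merges on large corpora and then apply the learned merge table at inference time.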

Sources

Course Book
1. Hands-On Large Language Models: Language Understanding and Generation, 1st Edition, by Jay Alammar and Maarten Grootendorst, O'Reilly Media, Oct. 15, 2024.
Other Sources
2. NVIDIA Deep Learning Institute: https://www.nvidia.com/en-us/training/
3. Natural Language Processing with Transformers, Revised Edition, by Lewis Tunstall, Leandro von Werra, and Thomas Wolf, O'Reilly Media, July 5, 2022.
4. Build a Large Language Model (From Scratch), 1st Edition, by Sebastian Raschka, Manning, Sept. 2024.
5. Hugging Face web page: https://huggingface.co/
6. PyTorch web page: https://pytorch.org/
7. TensorFlow web page: https://www.tensorflow.org/

Evaluation System

Requirements | Number | Percentage of Grade
Attendance/Participation | - | -
Laboratory | - | -
Application | - | -
Field Work | - | -
Special Course Internship | - | -
Quizzes/Studio Critics | - | -
Homework Assignments | 1 | 20
Presentation | - | -
Project | - | -
Report | - | -
Seminar | - | -
Midterm Exams/Midterm Jury | 1 | 35
Final Exam/Final Jury | 1 | 45
Total | 3 | 100
Percentage of Semester Work | | 55
Percentage of Final Work | | 45
Total | | 100

Course Category

Core Courses X
Major Area Courses
Supportive Courses
Media and Management Skills Courses
Transferable Skill Courses

The Relation Between Course Learning Competencies and Program Qualifications

# | Program Qualifications / Competencies | Level of Contribution (1–5)
1 Applies knowledge of mathematics, science, and engineering
2 Designs and conducts experiments, analyzes and interprets experimental results.
3 Designs a system, component, or process to meet specified requirements.
4 Works effectively in interdisciplinary fields.
5 Identifies, formulates, and solves engineering problems.
6 Has awareness of professional and ethical responsibility.
7 Communicates effectively.
8 Recognizes the need for lifelong learning and engages in it.
9 Has knowledge of contemporary issues.
10 Uses modern tools, techniques, and skills necessary for engineering applications.
11 Has knowledge of project management skills and international standards and methodologies.
12 Develops engineering products and prototypes for real-world problems.
13 Contributes to professional knowledge.
14 Conducts methodological and scientific research.
15 Produces, reports, and presents a scientific work based on original or existing knowledge.
16 Defends the original idea generated.

ECTS/Workload Table

Activities | Number | Duration (Hours) | Total Workload
Course Hours (Including Exam Week: 16 x Total Hours) | 16 | 3 | 48
Laboratory | - | - | -
Application | - | - | -
Special Course Internship | - | - | -
Field Work | - | - | -
Study Hours Out of Class | 16 | 2 | 32
Presentation/Seminar Preparation | - | - | -
Project | - | - | -
Report | - | - | -
Homework Assignments | 1 | 18 | 18
Preparation of Midterm Exams/Midterm Jury | 1 | 12 | 12
Preparation of Final Exams/Final Jury | 1 | 15 | 15
Total Workload | | | 125