ECE 364 - Data Science and Engineering

Spring 2025

Title                        | Rubric | Section | CRN   | Type | Hours | Times       | Days | Location                            | Instructor
Programming Methods for ML   | ECE364 | CSP     | 77448 | LEC  | 3     | 1400 - 1515 | F    |                                     | Nickvash Kani
Programming Methods for ML   | ECE364 | ML      | 77256 | LEC  | 3     | 1230 - 1350 | T R  | 2017 Electrical & Computer Eng Bldg | Nickvash Kani
Data Science and Engineering | ECE364 | ZJ1     | 77436 | LCD  | 3     | -           |      |                                     | Nickvash Kani

Official Description

Focuses on auto-differentiation tools like PyTorch used with basic machine learning algorithms (linear regression, logistic regression, deep nets, k-means clustering), and extensions in custom methods to fit specific needs. Auto-differentiation tools are essential for data analysis and a solid understanding is increasingly important in many disciplines. In contrast to existing courses which focus on algorithmic and theoretical aspects of Machine Learning, the focus here is on implementation with auto-diff tools. Course Information: Prerequisite: MATH 257.

Website

https://courses.grainger.illinois.edu/ece364/fa2025

Goals

This course covers how to use auto-differentiation tools like PyTorch for basic linear algebra operations, how to leverage them for basic machine learning algorithms (linear regression, logistic regression, deep nets, k-means), and how to extend them with custom methods. Auto-differentiation tools are key for data analysis, and a solid understanding is increasingly important across disciplines. Unlike courses that focus on the algorithmic and theoretical aspects of machine learning, here we focus on deploying auto-diff tools in different areas of interest. Core concepts in machine learning are introduced and developed both mathematically and practically in software, within a popular auto-differentiation framework like PyTorch.

Topics

Outline of covered topics:

  1. Introduction (weeks 1-2): introduction to the software, tensors, tensor views, and functions
  2. Linear algebra, differentiation, and optimization (weeks 3-5): introduction to basic linear algebra, calculus, differentiation, and differentiation w.r.t. vectors, as well as their implementation in auto-diff tools and their use in basic optimization algorithms
  3. Linear Regression (weeks 6-8): careful discussion of linear regression, its implementation in auto-diff tools, and its combination with auto-diff tool primitives like datasets, data loaders, etc.
  4. Logistic Regression, Deep Nets and Temporal Data (weeks 9-11): mathematical derivation of logistic regression, discussion of its combination with auto-diff tool primitives, and extension to deep nets, image data, and temporal data
  5. Clustering (week 12): discussion of basic unsupervised learning algorithms like k-means clustering and their implementation with PyTorch primitives
  6. Extending PyTorch and Special Topics (weeks 13-14): discussion of distributed training, development of extensions via Python, C++, and CUDA, or other special topics based on student and instructor interest
  7. Review lectures (one week total): one week is reserved to review material before the midterm and the final

Detailed Description and Outline

Detailed Outline:


Class 1. Intro and Software Install: get PyTorch and other software up and running.
Class 2. PyTorch Tensors, Views, Indexing: get to know the concept of a tensor, a view, and basic indexing.
Class 3. PyTorch Storage (advanced indexing, CPU/GPU, data types): learn how tensors are stored, what a view means for storage, and when advanced indexing creates a copy or a view.
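The tensor, view, and storage concepts from classes 1-3 can be sketched in a few lines (an illustrative snippet, not course material; variable names are ours):

```python
import torch

# Create a small tensor; views share the same underlying storage.
x = torch.arange(6)            # tensor([0, 1, 2, 3, 4, 5])
v = x.view(2, 3)               # a view: no data is copied
v[0, 0] = 100                  # mutating the view mutates x as well
print(x[0])                    # tensor(100)

# Basic indexing returns views; advanced (list/boolean) indexing copies.
row = v[1]                     # view of the second row
copy = x[[0, 2, 4]]            # advanced indexing: a new, independent tensor
copy[0] = -1
print(x[0])                    # still tensor(100); the copy is independent

# Storage/layout introspection: shape and per-dimension strides.
print(v.shape, v.stride())     # torch.Size([2, 3]) (3, 1)
```

Because `v` is a view, the write through `v` is visible through `x`, while the advanced-indexed `copy` has its own storage.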

Class 4. PyTorch Functions: get to know different functions and how they can be applied to tensors elementwise; discuss stacking of functions.
Class 5. Linear Algebra and Differentiation w.r.t. Vectors/Matrices: get to know linear algebra basics and differentiation of functions with respect to vectors and matrices.
Class 6. PyTorch Matrix Operations: get to know matrix and vector functions as well as batching (e.g., bmm); stacking and the computation graph.
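The elementwise-function stacking and batched matrix products from classes 4-6 might look like this (an illustrative sketch; the shapes are chosen arbitrarily):

```python
import torch

torch.manual_seed(0)

# Elementwise functions apply independently to every entry and compose freely.
t = torch.linspace(-1.0, 1.0, steps=5)
y = torch.exp(torch.sin(t)) + t.relu()      # stacked elementwise operations

# Batched matrix multiply: bmm multiplies corresponding pairs of matrices.
A = torch.randn(4, 2, 3)                    # batch of 4 matrices, each 2x3
B = torch.randn(4, 3, 5)                    # batch of 4 matrices, each 3x5
C = torch.bmm(A, B)                         # batch of 4 products, each 2x5
print(C.shape)                              # torch.Size([4, 2, 5])
```

For 3-D inputs, `torch.bmm(A, B)` and the `@` operator agree; `@` additionally broadcasts over batch dimensions.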

Class 7. Automatic Differentiation: understand how to compute simple derivatives manually and with the grad or backward functions; tensors with gradients.
Class 8. Automatic Differentiation: understand the computation of Hessians and Jacobians.
Class 9. Automatic Differentiation: understand efficient computation of Hessian-vector products.
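The three autograd classes (7-9) can be illustrated on the toy function f(x) = x·x, whose gradient is 2x and whose Hessian is 2I (a sketch using standard PyTorch calls; `f` is our own example):

```python
import torch
from torch.autograd.functional import hessian

def f(x):
    return x.dot(x)               # f(x) = x . x, gradient 2x, Hessian 2I

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
f(x).backward()                   # accumulates df/dx into x.grad
print(x.grad)                     # tensor([2., 4., 6.])

H = hessian(f, torch.tensor([1.0, 2.0, 3.0]))
print(H)                          # 2 * identity

# Hessian-vector product without materializing H: differentiate (grad . v).
v = torch.tensor([1.0, 0.0, 0.0])
x2 = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
(g,) = torch.autograd.grad(f(x2), x2, create_graph=True)
(hv,) = torch.autograd.grad(g.dot(v), x2)
print(hv)                         # tensor([2., 0., 0.]), i.e. H @ v
```

The last step is the efficiency point of class 9: the product Hv costs two gradient passes rather than the full Hessian.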

Class 10. Primal Optimization: learn optimization basics (gradient descent and stochastic gradient descent).
Class 11. Linear Regression 1: learn how to code the objective and how to solve it with gradient descent and stochastic gradient descent; manual computation of gradients; parameters.
Class 12. Linear Regression 2: derive how to solve the objective exactly; introduce PyTorch solvers (LU, Cholesky, etc.).
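Classes 10-12 contrast iterative and exact solutions of least squares; a minimal sketch with synthetic data (all names, constants, and hyperparameters are illustrative):

```python
import torch

torch.manual_seed(0)

# Synthetic regression data: y = X w_true + small noise.
X = torch.randn(100, 3)
w_true = torch.tensor([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * torch.randn(100)

# (a) Gradient descent on the mean squared error objective.
w = torch.zeros(3, requires_grad=True)
for _ in range(500):
    loss = ((X @ w - y) ** 2).mean()
    loss.backward()
    with torch.no_grad():
        w -= 0.1 * w.grad         # plain gradient step
        w.grad.zero_()

# (b) Exact solution of the normal equations X^T X w = X^T y.
w_exact = torch.linalg.solve(X.T @ X, X.T @ y)

print(w.detach())                 # both estimates are close to w_true
print(w_exact)
```

`torch.linalg.solve` performs an LU-based solve; a Cholesky solve of the normal equations would fit here equally well.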

Class 13. PyTorch Optimizers: learn about PyTorch optimizers.
Class 14. PyTorch Dataset: learn about Python classes; the Dataset class.
Class 15. PyTorch DataLoaders: get to know existing dataloaders and write a new dataloader from scratch.
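Classes 14-15 cover the Dataset and DataLoader primitives; a minimal custom Dataset might look like this (an illustrative sketch; `PairDataset` is our own name, not course code):

```python
import torch
from torch.utils.data import Dataset, DataLoader

# A minimal map-style Dataset wrapping in-memory tensors.
class PairDataset(Dataset):
    def __init__(self, inputs, targets):
        self.inputs, self.targets = inputs, targets

    def __len__(self):
        return len(self.inputs)

    def __getitem__(self, idx):
        return self.inputs[idx], self.targets[idx]

ds = PairDataset(torch.arange(10.0).unsqueeze(1), torch.arange(10.0))
loader = DataLoader(ds, batch_size=4, shuffle=False)

for xb, yb in loader:
    print(xb.shape, yb.shape)     # batches of 4, 4, then 2
```

The DataLoader handles batching, shuffling, and (optionally) parallel loading; the Dataset only defines length and per-item access.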

Class 16. Logistic Regression: derive logistic regression and implement it with model classes, auto-diff, and dataloaders.
Class 17. Logistic Regression on Images: understand the multi-layer perceptron (MLP) and convolutional neural nets.
Class 18. Deep Nets 1: learn how to scale to larger data (other layers, e.g., dilated convolution, residual blocks, normalization).
Class 19. Deep Nets 2: learn about recurrent blocks, transformers, etc.
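A compact version of the logistic-regression training loop from class 16, written with the model-class and optimizer primitives covered earlier (synthetic data; all hyperparameters are illustrative):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Logistic regression as a one-layer module producing logits.
model = nn.Linear(2, 1)
loss_fn = nn.BCEWithLogitsLoss()             # sigmoid + cross-entropy, fused
opt = torch.optim.SGD(model.parameters(), lr=0.5)

# Separable toy data: label is 1 when x0 + x1 > 0.
X = torch.randn(200, 2)
y = (X.sum(dim=1) > 0).float().unsqueeze(1)

for _ in range(200):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()

acc = ((model(X) > 0).float() == y).float().mean()
print(acc)                                   # near 1.0 on this separable data
```

Replacing `nn.Linear` with an MLP or convolutional stack is exactly the extension made in classes 17-19.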

Class 20. Temporal Data: get to know the long short-term memory (LSTM) module and its use.
Class 21. Clustering: understand the basics of k-means.
Class 22. Distributed Training: learn how to train deep nets on multiple GPUs or computers.
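The k-means algorithm from class 21 can be written entirely with tensor primitives; the sketch below uses a simple deterministic farthest-point initialization rather than the random initialization usually presented, and does not handle empty clusters:

```python
import torch

torch.manual_seed(0)

def kmeans(x, k, iters=20):
    # Farthest-point initialization: start from x[0], then repeatedly add the
    # point farthest from the centroids chosen so far.
    centroids = x[:1].clone()
    for _ in range(k - 1):
        d = torch.cdist(x, centroids).min(dim=1).values
        centroids = torch.cat([centroids, x[d.argmax()].unsqueeze(0)])
    # Lloyd's iterations: assign points to the nearest centroid, then recenter.
    for _ in range(iters):
        assign = torch.cdist(x, centroids).argmin(dim=1)
        centroids = torch.stack([x[assign == j].mean(dim=0) for j in range(k)])
    return centroids, assign

# Two well-separated blobs around (-5, -5) and (+5, +5).
x = torch.cat([torch.randn(50, 2) - 5.0, torch.randn(50, 2) + 5.0])
centroids, assign = kmeans(x, k=2)
print(centroids)                  # one centroid near each blob mean
```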

Class 23. Extending PyTorch (Python): learn how to write your own layers and functions.
Class 24. Extending PyTorch (C++): learn how to write your own layers and functions using C++.
Class 25. Extending PyTorch (C++/CUDA): learn how to write your own layers and functions using CUDA (requires a GPU).
Class 26. Review (scheduled at an appropriate point in the term).
Class 27. Review.
Class 28. Midterm (scheduled at an appropriate point in the term).
Class 29. Final.
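Extending PyTorch in Python (class 23) typically means subclassing `torch.autograd.Function`; a minimal example with a hand-written backward pass (the `Cube` function is our illustration, not course code):

```python
import torch

# Forward computes x^3; backward supplies the analytic derivative 3x^2.
class Cube(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x ** 3

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        return grad_out * 3 * x ** 2

x = torch.tensor(2.0, requires_grad=True)
y = Cube.apply(x)
y.backward()
print(y.item(), x.grad.item())    # 8.0 12.0
```

The C++ and CUDA extensions of classes 24-25 follow the same forward/backward contract, with the kernels compiled instead of interpreted.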


Computer Usage

Students complete assignments on their laptops and/or the Google Cloud Platform.

Lab Projects

The course may have an optional or required final project in which students apply their knowledge from the semester to a dataset and machine learning task of their choice.

Topical Prerequisites

Linear algebra (MATH 257)

Texts

No textbook is used.

Required, Elective, or Selected Elective

This course is a technical elective for both electrical engineers and computer engineers. The course counts as a department-approved software lab course.

Instructional Objectives

After completing this course, students should:

  • Be familiar with key primitive PyTorch data types, e.g. tensors, datasets, dataloaders, modules, and be able to design and implement common or custom instances of each type for a given data source, machine learning model, or optimization algorithm. (1, 2, 6, 7)
  • Be able to explain the mathematical motivation and process of backpropagation for accumulating gradients in computational graphs. (6, 7)
  • Be able to apply auto-differentiation tools effectively using PyTorch to update trainable parameters in a computational graph. (1, 2)
  • Understand how auto-differentiation tools, computational graphs, and machine learning algorithms can be used together to efficiently solve data-driven engineering problems. (1, 2, 6, 7)
  • Understand the application settings for linear regression models, mathematical principles of their solutions, and efficient implementations of their solutions via auto-differentiation in PyTorch. (1, 2, 6, 7)
  • Understand the application settings for logistic regression models, mathematical principles of their solutions, efficient implementations of their solutions via auto-differentiation in PyTorch, and the key differences between logistic and linear regression. (1, 2, 6, 7)
  • Be able to explain how neural networks, e.g. multi-layer perceptrons, convolutional neural networks, and recurrent neural networks, extend linear and logistic regression models to more complex machine learning tasks that require non-linear solutions. (1, 6, 7)
  • Understand how objective functions may be chosen for desired engineering problems alongside appropriate machine learning algorithms. (1, 2, 6, 7)
  • Be able to identify the key differences and use cases of common neural network architectures and models for engineering and data science problems. (1, 6)
  • Be able to design and implement existing or custom solutions for popular neural network architectures, e.g. convolutional neural networks and transformers, using PyTorch and auto-differentiation for engineering and data science problems. (1, 2, 6)
  • Be comfortable reading PyTorch and other software package documentation as necessary to apply alternative solutions or for experimentation within class homeworks or projects. (7)
  • Understand how PyTorch may be extended with implementations in C, C++, other Python libraries, and/or CUDA. (1, 2, 7)

Last updated

6/25/2025 by Corey Ethan Snyder