Neural Networks Spring 2026
7. Covered Topics
Each entry below lists the category, topic, key concepts, and reading reference.

Category: Foundations
Topic: Single Neuron Model
Key concepts: Inputs, weights, bias; net value; activation (transfer) functions
Reference: Ch. 3.1
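
A minimal sketch of the single-neuron computation, assuming a sigmoid activation (the values and the choice of transfer function are illustrative):

```python
import numpy as np

def sigmoid(net):
    """Logistic activation (transfer) function."""
    return 1.0 / (1.0 + np.exp(-net))

x = np.array([0.5, -1.2, 3.0])   # inputs
w = np.array([0.8,  0.1, -0.4])  # one weight per input
b = 0.2                          # bias

net = np.dot(w, x) + b           # net value: weighted sum of inputs plus bias
a = sigmoid(net)                 # neuron output after the activation function
print(net, a)
```
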
Category: Geometry
Topic: Geometric Interpretation
Key concepts: Net value as a hyperplane; decision boundaries
Reference: Ch. 3.1.1
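
The net value w.x + b = 0 defines a hyperplane, and its sign tells which side of the decision boundary a point falls on. A small sketch with made-up weights:

```python
import numpy as np

w = np.array([1.0, -2.0])  # normal vector of the hyperplane w.x + b = 0
b = 0.5

for x in (np.array([2.0, 0.0]), np.array([0.0, 2.0])):
    net = np.dot(w, x) + b
    side = "positive" if net > 0 else "negative"
    print(x, "net =", net, "->", side, "side of the decision boundary")
```
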
Category: Regression
Topic: Linear Regression
Key concepts: Model formulation; neural networks for regression
Reference: Ch. 3.1
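
A linear-regression neuron is a single unit with an identity activation. As a sketch (toy data, closed-form least-squares fit; gradient-based training is covered under Training Concepts):

```python
import numpy as np

# Toy data: targets generated from y = 2*x1 - 3*x2 + 1 plus noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = 2 * X[:, 0] - 3 * X[:, 1] + 1 + 0.1 * rng.normal(size=100)

# Augment the inputs with a constant 1 so the bias becomes one more weight.
X_aug = np.hstack([X, np.ones((100, 1))])
w, *_ = np.linalg.lstsq(X_aug, y, rcond=None)   # least-squares fit
print("weights and bias:", w)                    # close to [2, -3, 1]

y_hat = X_aug @ w                                # model output: identity activation
```
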
Category: Error Metrics
Topic: Error Calculation
Key concepts: Mean Squared Error (MSE); Mean Absolute Error (MAE); SVM (hinge) loss; cross-entropy (with softmax); sample-wise error; minibatch, batch, and epoch
Reference: Ch. 3.1
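
Minimal sketches of these error calculations with made-up predictions; the multiclass hinge form shown is one common variant:

```python
import numpy as np

# Regression errors averaged over a (mini)batch of predictions.
# One pass over all batches of the training set is an epoch.
y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.1, 1.8, 3.5])
mse = np.mean((y_pred - y_true) ** 2)    # Mean Squared Error
mae = np.mean(np.abs(y_pred - y_true))   # Mean Absolute Error

# Sample-wise classification errors for one sample with 3 classes (true class = 0).
scores = np.array([2.0, 1.0, -1.0])
target = 0
hinge = np.sum(np.maximum(0, scores - scores[target] + 1)) - 1  # multiclass SVM (hinge) loss
probs = np.exp(scores - scores.max())
probs /= probs.sum()                     # softmax turns scores into probabilities
cross_entropy = -np.log(probs[target])   # cross-entropy of the true class

print(mse, mae, hinge, cross_entropy)
```
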
Category: Training Concepts
Topic: Gradient Descent
Key concepts: Numerical derivatives; centered difference
Reference: Ch. 3.1, Ch. 3.2
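
A sketch of a gradient-descent loop where the derivative is estimated numerically with the centered-difference formula f'(w) ~ (f(w + h) - f(w - h)) / (2h), on a made-up one-parameter loss:

```python
def loss(w):
    # Toy one-parameter loss with its minimum at w = 3.
    return (w - 3.0) ** 2

def centered_difference(f, w, h=1e-5):
    """Numerical derivative: (f(w + h) - f(w - h)) / (2h)."""
    return (f(w + h) - f(w - h)) / (2 * h)

w = 0.0      # initial weight
lr = 0.1     # learning rate
for _ in range(50):
    grad = centered_difference(loss, w)
    w -= lr * grad          # gradient-descent update
print(w)                    # approaches 3.0
```
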
Category: Multi-Layer Networks
Topic: Multi-layer Neurons
Key concepts: Weight matrices; bias in the weight matrix; layered architectures; matrix-based formulation; augmented input representation
Reference: Ch. 5.1
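
A sketch of the matrix-based formulation: each layer is one weight matrix, and the bias column is absorbed into that matrix by augmenting the layer's input with a constant 1 (the layer sizes here are made up):

```python
import numpy as np

def sigmoid(net):
    return 1.0 / (1.0 + np.exp(-net))

rng = np.random.default_rng(0)
x = rng.normal(size=3)            # 3 inputs

# Layer 1: 4 neurons. Weight matrix is 4 x (3 + 1); the extra column is the bias.
W1 = rng.normal(size=(4, 4))
# Layer 2: 2 neurons. Weight matrix is 2 x (4 + 1).
W2 = rng.normal(size=(2, 5))

a0 = np.append(x, 1.0)            # augmented input [x, 1]
a1 = sigmoid(W1 @ a0)             # layer-1 outputs
a1_aug = np.append(a1, 1.0)       # augment again for the next layer
a2 = sigmoid(W2 @ a1_aug)         # layer-2 (network) outputs
print(a2.shape)                   # (2,)
```
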
Topic: Computational Graphs
Key concepts: Forward propagation; backward propagation; chain rule; local derivatives
Reference: Ch. 5.3
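
A sketch of forward and backward propagation on a tiny computational graph, net = w*x + b, a = sigmoid(net), L = (a - y)^2; each backward line multiplies the upstream gradient by a local derivative (the chain rule):

```python
import math

x, y = 2.0, 1.0
w, b = 0.5, -1.0

# Forward propagation
net = w * x + b                     # node 1: net value
a = 1.0 / (1.0 + math.exp(-net))    # node 2: sigmoid
L = (a - y) ** 2                    # node 3: squared error

# Backward propagation (chain rule with local derivatives)
dL_da = 2 * (a - y)                 # local derivative of L w.r.t. a
da_dnet = a * (1 - a)               # local derivative of sigmoid w.r.t. net
dL_dnet = dL_da * da_dnet
dL_dw = dL_dnet * x                 # local derivative of net w.r.t. w is x
dL_db = dL_dnet * 1.0               # local derivative of net w.r.t. b is 1
print(dL_dw, dL_db)
```
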
Category: PyTorch
Topic: PyTorch pipeline
Key concepts: Data transforms; datasets; dataloaders; model definition; loss; optimizer; scheduler; training loop; evaluation loop; metrics; plots; saving/loading a checkpoint
Reference: https://colab.research.google.com/github/farhadkamangar/CSE5368/blob/master/PyTorch_MNIST_FullyConnected_Pipeline.ipynb#scrollTo=7e1c9a65
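
A condensed sketch of the pipeline steps listed above (model definition, loss, optimizer, scheduler, training and evaluation loops, checkpointing). Random tensors stand in for the MNIST data; the linked notebook covers the full version with transforms, real datasets, metrics, and plots:

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Stand-in "dataset": 256 fake 28x28 images with random labels.
X = torch.randn(256, 1, 28, 28)
y = torch.randint(0, 10, (256,))
train_loader = DataLoader(TensorDataset(X, y), batch_size=32, shuffle=True)

model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128), nn.ReLU(), nn.Linear(128, 10))
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.9)

for epoch in range(2):                      # training loop
    model.train()
    for xb, yb in train_loader:
        optimizer.zero_grad()
        loss = loss_fn(model(xb), yb)
        loss.backward()
        optimizer.step()
    scheduler.step()

    model.eval()                            # evaluation loop (here on the same data)
    correct = 0
    with torch.no_grad():
        for xb, yb in train_loader:
            correct += (model(xb).argmax(dim=1) == yb).sum().item()
    print(f"epoch {epoch}: accuracy {correct / len(X):.2f}")

torch.save(model.state_dict(), "checkpoint.pt")       # save a checkpoint
model.load_state_dict(torch.load("checkpoint.pt"))    # load it back
```
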
Category: Convolutional Neural Networks (CNN)
Topic: Creating and analyzing CNNs with PyTorch
Key concepts: Convolutional filters, padding, and stride; convolutional, pooling, flattening, and fully connected layers; shape of the weight matrix; output shape of each layer; number of parameters
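
A sketch of the shape and parameter bookkeeping for a small CNN: with kernel size K, padding P, and stride S, each spatial dimension becomes floor((W + 2P - K) / S) + 1, and a convolutional layer has (K*K*C_in + 1)*C_out parameters. The layer sizes below are made up; PyTorch is used to confirm the arithmetic:

```python
import torch
from torch import nn

def conv_out(size, kernel, padding, stride):
    """Output size of one spatial dimension: floor((W + 2P - K) / S) + 1."""
    return (size + 2 * padding - kernel) // stride + 1

model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1, stride=1),  # weights: 8 x 1 x 3 x 3, plus 8 biases
    nn.ReLU(),
    nn.MaxPool2d(2),                                       # halves each spatial dimension
    nn.Flatten(),
    nn.Linear(8 * 14 * 14, 10),
)

x = torch.randn(1, 1, 28, 28)
print(conv_out(28, kernel=3, padding=1, stride=1))         # 28: this padding keeps the size
print(model(x).shape)                                      # torch.Size([1, 10])

for name, p in model.named_parameters():
    print(name, tuple(p.shape), p.numel())                 # weight shapes and parameter counts
print("total parameters:", sum(p.numel() for p in model.parameters()))
```
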
Last updated: 2026-03-21