ViewTube

381 results

Andreas Maier
Deep Learning: Regularization - Part 2 (WS 20/21)

Deep Learning - Regularization Part 2 This video discusses classical regularization techniques such as early stopping using a ...

14:19

871 views

5 years ago

CampusX
Ridge Regression Part 1 | Geometric Intuition and Code | Regularized Linear Models

Dive into the fundamentals of Ridge Regression with the first part of our series. We'll provide a clear geometric intuition, backed by ...

19:58

161,838 views

4 years ago
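The Ridge Regression entry above promises a geometric intuition for the penalized fit. As a rough companion sketch (not taken from the video), the one-feature, no-intercept case has a simple closed form: the penalty strength is added to the denominator of the ordinary least-squares slope, shrinking it toward zero. All names here are illustrative.

```python
def ridge_1d(xs, ys, lam):
    """Closed-form ridge fit for one feature, no intercept:
    w = sum(x*y) / (sum(x^2) + lam).
    lam=0 recovers ordinary least squares; larger lam shrinks w
    toward zero. Illustrative helper, not code from the video."""
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    return sxy / (sxx + lam)

xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]           # exact slope 2, no noise
print(ridge_1d(xs, ys, 0.0))   # OLS slope: 2.0
print(ridge_1d(xs, ys, 14.0))  # shrunk toward zero: 1.0
```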

CodeEmporium
Optimizers - EXPLAINED!

From Gradient Descent to Adam. Here are some optimizers you should know. And an easy way to remember them. SUBSCRIBE ...

7:23

144,848 views

5 years ago

CodeEmporium
Activation Functions - EXPLAINED!

We start with the whats/whys/hows. Then delve into details (math) with examples. Follow me on M E D I U M: ...

10:05

151,602 views

5 years ago

Andreas Maier
Deep Learning: Regularization - Part 5 (WS 20/21)

Deep Learning - Regularization Part 5 This video discusses multi-task learning. For reminders to watch the new video follow on ...

6:49

698 views

5 years ago

د. معتز سعد | Dr. Motaz Saad
Deep Learning - L1 & L2 Regularization
18:41

2,397 views

2 years ago
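The L1 & L2 entry above covers the two classic penalty terms. A minimal pure-Python sketch of the penalties themselves (illustrative helpers, not code from the lecture): L1 sums absolute weights and tends to drive some weights exactly to zero, while L2 sums squared weights and shrinks all of them smoothly.

```python
def l1_penalty(ws, lam):
    """L1 (lasso) term: lam * sum(|w|); encourages sparse weights."""
    return lam * sum(abs(w) for w in ws)

def l2_penalty(ws, lam):
    """L2 (ridge / weight decay) term: lam * sum(w^2); shrinks all weights."""
    return lam * sum(w * w for w in ws)

ws = [0.5, -1.5, 0.0]
print(l1_penalty(ws, 0.1))  # 0.1 * (0.5 + 1.5 + 0.0) = 0.2
print(l2_penalty(ws, 0.1))  # 0.1 * (0.25 + 2.25 + 0.0) = 0.25
```

Either term is simply added to the training loss; the factor `lam` trades data fit against weight size.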

Friday Talks Tübingen
PENEX: AdaBoost-Inspired Neural Network Regularization - [Klaus-Rudolf Kladny]

In this talk, I introduce PENEX, a new loss function for deep neural networks inspired by multi-class AdaBoost. PENEX acts as an ...

12:14

64 views

2 months ago

CodeEmporium
Batch Normalization - EXPLAINED!

What is Batch Normalization? Why is it important in Neural networks? We get into math details too. Code in references. Follow me ...

8:49

129,193 views

5 years ago
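The Batch Normalization entry above promises the math details; at its core, the operation standardizes each activation across the batch. A minimal sketch for a batch of scalars (illustrative; the small `eps` guards against division by zero, and the learnable scale and shift parameters of the full technique are omitted):

```python
def batch_norm(xs, eps=1e-5):
    """Normalize a batch of scalar activations to roughly zero mean
    and unit variance. Sketch only: omits the learnable gamma/beta
    and the running statistics used at inference."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    return [(x - mean) / (var + eps) ** 0.5 for x in xs]

print(batch_norm([1.0, 2.0, 3.0]))  # ≈ [-1.2247, 0.0, 1.2247]
```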

BioniChaos
AI Model Optimization: From Chance-Level Performance to 86% Accuracy

#AI #MachineLearning #DeepLearning #ModelTraining #DataScience #DataAugmentation #Shuffling #Overfitting ...

8:44

60 views

1 year ago

Andreas Maier
Deep Learning: Regularization - Part 4

Deep Learning - Regularization Part 4 This video discusses initialization techniques and transfer learning. Full Transcript ...

10:01

517 views

5 years ago

Dr Juan Klopper
Regularization in deep learning

Start this series from the beginning ...

16:44

190 views

7 years ago

Vu Hung Nguyen (Hưng)
09 Regularization

Regularization Explained: Prevent Overfitting & Improve ML Models Overview: Dive deep into regularization techniques, crucial ...

7:39

3 views

3 months ago

Matt Yedlin
Regularization L2, L1

This is a video that talks about regularization L2, L1. Attribution-NonCommercial-ShareAlike CC BY-NC-SA Authors: Matthew ...

12:40

3,480 views

5 years ago

Matt Yedlin
Regularization - Data Augmentation

This is a video that introduces regularization - Data Augmentation. Attribution-NonCommercial-ShareAlike CC BY-NC-SA Authors: ...

5:09

620 views

5 years ago

CIS 522 - Deep Learning
Introduction to Week 5

This video was recorded as part of CIS 522 - Deep Learning at the University of Pennsylvania. The course material, including the ...

5:54

555 views

4 years ago

Women in AI Research WiAIR
NeurIPS 2025 in San Diego. Deep Learning Secrets

In this post #NeurIPS2025 episode, Linara Adilova shares the work on relative flatness in deep learning — a concept that could ...

8:22

249 views

4 weeks ago

CampusX
Early Stopping In Neural Networks | End to End Deep Learning Course

Early stopping is a method in Deep Learning that allows you to specify an arbitrarily large number of training epochs and stop ...

12:00

74,076 views

3 years ago
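The Early Stopping entry above states the rule in words: allow an arbitrarily large number of epochs and halt once validation loss stops improving. A minimal pure-Python sketch of that rule (the `patience` convention and all names are illustrative, not from the course):

```python
def early_stopping(val_losses, patience=3):
    """Return the epoch index at which training stops: the first
    epoch where validation loss has failed to improve for
    `patience` consecutive epochs, else the last epoch."""
    best, best_epoch, waited = float("inf"), 0, 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, best_epoch, waited = loss, epoch, 0
        else:
            waited += 1
            if waited >= patience:
                return epoch  # stop; best checkpoint was at best_epoch
    return len(val_losses) - 1

losses = [1.0, 0.8, 0.7, 0.72, 0.71, 0.73, 0.74]
print(early_stopping(losses, patience=3))  # stops at epoch 5
```

In practice one also restores the weights saved at the best epoch rather than the stopping epoch.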

ICTP Quantitative Life Sciences
Implicit Regularization of Gradient Descent for Wide Two-layer Relu Neural Networks

Lénaïc Chizat (University of Paris-Saclay, France), Youth in High-Dimensions (smr 3602).

18:10

118 views

4 years ago

CodeEmporium
DropBlock - A BETTER DROPOUT for Neural Networks

Dropout is a common method of regularization in neural networks. However, it doesn't work too well in Convolution Neural ...

7:45

6,265 views

7 years ago
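The DropBlock entry above builds on standard dropout, which is worth pinning down first. A minimal sketch of inverted dropout on a list of activations (illustrative; DropBlock itself zeroes contiguous spatial blocks in feature maps rather than independent units):

```python
import random

def dropout(xs, p, training=True, rng=None):
    """Inverted dropout (sketch): during training, zero each
    activation with probability p and scale survivors by 1/(1-p)
    so the expected activation is unchanged; at inference it is
    a no-op, so no rescaling is needed at test time."""
    if not training or p == 0.0:
        return list(xs)
    rng = rng or random
    keep = 1.0 - p
    return [x / keep if rng.random() < keep else 0.0 for x in xs]

acts = [1.0, 2.0, 3.0, 4.0]
print(dropout(acts, p=0.5, rng=random.Random(0)))  # some zeroed, survivors doubled
print(dropout(acts, p=0.5, training=False))        # unchanged
```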

GreyAtom EduTech
Machine Learning Tutorial Chap 5| Part-2 L1 Regularization | Rohit Ghosh Machine Learning | GreyAtom

Get access to FREE Data Science courses, projects, e-books, and more... Start learning now! https://bit.ly/3009dgI Welcome to ...

15:11

1,236 views

6 years ago