ViewTube

240 results

MLWorks
🚀 Cuda Programming Day 5: Layer Normalization | Neural Network | Transformer Architecture (23:16, 96 views, 3 weeks ago)
Welcome to CUDA Programming Day 5! Today, we step into the world of deep learning and explore how CUDA powers one of the ...

Alxandria - Science should be free
Stabilizing Training: Why Batch Norm and Layer Norm Exist (2:30, 0 views, 9 days ago)
Layer Normalization was introduced to remove this dependency on batch-level statistics. Instead of normalizing across the batch, ...
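
The snippet above states the core idea: statistics are taken per sample rather than per batch. A minimal NumPy sketch (function and variable names are illustrative, not from the video):

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Mean and variance are computed per sample over the feature axis,
    # so the output does not depend on the rest of the batch.
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

x = np.array([[1.0, 2.0, 3.0],
              [10.0, 20.0, 30.0]])
y = layer_norm(x)
# Each row is normalized independently to roughly zero mean, unit variance.
```

Note this omits the learned gain and bias that a full LayerNorm layer applies after normalization.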

ApexFlow
Neural Network & GPT Lecture 1.22 Layer normalization, Dropout, and Summary (11:27, 4 views, 4 weeks ago)
Chinese-language guide. Credits to Andrej Karpathy. References: https://www.youtube.com/watch?v=VMj-3S1tku0 ...

Data Science Made Easy
What is Batch Normalization and Layer Normalization? (1:10, 10 views, 3 weeks ago)
Batch Normalization and Layer Normalization are techniques used in deep learning to stabilize and accelerate training by ...

Latent Space
[NeurIPS Best Paper] 1000 Layer Networks for Self-Supervised RL — Kevin Wang et al, Princeton (28:19, 10,340 views, 5 days ago)
... architectural tricks that made it work (*residual connections, layer normalization, and a shift from regression to classification*), ...

AI Atlas
Transformer Architecture Explained: From Attention to ChatGPT, BERT & LLMs (Deep Dive) (19:52, 35 views, 4 days ago)
Delve into the core components of the Transformer Block: * Skip Connections (Add) and Layer Normalization (Norm): Essential ...

Tales Of Tensors
I Followed One Token Through a Transformer (Every Step) (8:17, 496 views, 9 days ago)
... encoding sentencepiece tokenizer embedding layer positional embeddings rotary positional embeddings layer normalization ...

Aleksei Ivanov
Building Google's DeepMind RegressLM from scratch (30% improvement) (32:07, 169 views, 2 weeks ago)
Link: https://arxiv.org/abs/1706.03762 * Title: Layer Normalization. Link: https://arxiv.org/abs/1607.06450 * Title: Dropout: A Simple ...

Skill Advancement
Why Batch Normalization Fails in Transformers: The Padding Problem Explained (7:37, 41 views, 9 days ago)
The LayerNorm Solution: Why Layer Normalization is superior for Transformers because it operates "horizontally" across features, ...
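
The "horizontal" versus "vertical" distinction in the snippet above comes down to which axis the statistics are pooled over. A small NumPy illustration (array shapes and names are my own):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))  # (batch, features), e.g. 4 padded sequences

# BatchNorm: one mean per feature, pooled "vertically" across the batch
# (axis 0) -- padding positions contaminate these shared statistics.
bn_mean = x.mean(axis=0)  # shape (8,)

# LayerNorm: one mean per sample, taken "horizontally" across features
# (axis 1) -- each position is normalized independently of the batch.
ln_mean = x.mean(axis=1)  # shape (4,)
```

The shapes make the difference concrete: BatchNorm statistics couple every sample in the batch, while LayerNorm statistics never leave the individual sample.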

Code&Learn AI
LLM Evolution: Transformer to Sparsity (7:08, 96 views, 2 weeks ago)
Root Mean Square Layer Normalization. Link to Paper • RoPE: Su et al., 2021. RoFormer: Enhanced Transformer with Rotary ...
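
RMSNorm, referenced in the snippet above, drops LayerNorm's mean-centering and rescales only by the root mean square of the features. A hedged sketch (the learned gain is omitted for brevity):

```python
import numpy as np

def rms_norm(x, eps=1e-6):
    # Rescale by the root mean square of the features; unlike LayerNorm,
    # no mean is subtracted, saving one reduction per call.
    rms = np.sqrt(np.mean(x * x, axis=-1, keepdims=True) + eps)
    return x / rms

x = np.array([[3.0, 4.0]])
y = rms_norm(x)
# RMS of [3, 4] is sqrt(12.5) ~= 3.5355, so y ~= [[0.8485, 1.1314]]
```

After rescaling, the features of each sample have unit root mean square, which is the invariant RMSNorm maintains.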

Pravalika Kuppireddy
ECE 5831 Demo | Plant Disease Recognition: Comparison of CNNs and Transfer Learning Models (12:52, 7 views, 2 weeks ago)

ML Guy
The Core Building Block Behind GPT (Explained Visually) (6:18, 236 views, 2 weeks ago)
Residual Connections and Layer Normalization: why deep Transformers are stable and trainable. Rather than treating the ...

TIT Technocrats Academic
Deep learning (4:48, 13 views, 3 weeks ago)
Units 3, 4, 5.

iSEO AI
The Deep Learning Bible — Book Summary by Goodfellow, Bengio & Courville (13:57, 9 views, 3 weeks ago)
In this video, we summarize “Deep Learning” by Ian Goodfellow, Yoshua Bengio, and Aaron Courville — the definitive textbook ...

Adapticx AI
Optimization, Regularization, GPUs (28:49, 5 views, 3 weeks ago)
In this episode, we explore the three engineering pillars that made modern deep learning possible: advanced optimization ...

IgnoVex
Batch Normalization Technique | Deep Learning | Machine Learning | Artificial Intelligence | IgnoVex (7:11, 22 views, 6 days ago)
Batch Normalization is a key technique that makes deep neural networks faster, more stable, and easier to train. In this video, we ...

1 Minute Glossary - AI ML
Batch Normalization Explained in 60 Seconds | What is Batch Normalization? (1:21, 0 views, 6 hours ago)
Batch Normalization is a technique used in deep learning to stabilize and accelerate neural network training by normalizing layer ...

GOSIM Foundation
【GOSIM HANGZHOU 2025】Shiwei Liu: The Curse of Depth in Large Language Models (26:29, 15 views, 3 weeks ago)
We identify the issue as stemming from the prevalent use of Pre-Layer Normalization (Pre-LN) and introduce LayerNorm Scaling ...

CosmoX
B-Trans: Optimizing LLM Generation Diversity and RLVR Performance with Population Bayesian Transformers (5:22, 4 views, 3 days ago)
B-Trans (Population Bayesian Transformers) converts a standard large language model (LLM) into a Bayesian model, so that a single set of weights can produce diverse ...

Torial: Custom Video Explanations
How to Reduce Internal Covariate Shift in 5-10 Layer CNNs Using Batch Normalization During Backpr... (5:13, 18 views, 4 weeks ago)
Batch normalization is a crucial technique used in deep learning models to improve their performance and stability. When training ...