
Commit 97a95ed

Update overfit_batches docs (#19622)
1 parent: b3275e0

File tree

1 file changed: +7 −3 lines


docs/source-pytorch/debug/debugging_intermediate.rst

+7 −3
@@ -20,6 +20,7 @@ Machine learning code requires debugging mathematical correctness, which is not
 **************************************
 Overfit your model on a Subset of Data
 **************************************
+
 A good debugging technique is to take a tiny portion of your data (say 2 samples per class),
 and try to get your model to overfit. If it can't, it's a sign it won't work with large datasets.
 
@@ -28,14 +29,17 @@ argument of :class:`~lightning.pytorch.trainer.trainer.Trainer`)
 
 .. testcode::
 
-    # use only 1% of training data (and turn off validation)
+    # use only 1% of training data
     trainer = Trainer(overfit_batches=0.01)
 
    # similar, but with a fixed 10 batches
     trainer = Trainer(overfit_batches=10)
 
-When using this argument, the validation loop will be disabled. We will also replace the sampler
-in the training set to turn off shuffle for you.
+    # equivalent to
+    trainer = Trainer(limit_train_batches=10, limit_val_batches=10)
+
+Setting ``overfit_batches`` is the same as setting ``limit_train_batches`` and ``limit_val_batches`` to the same value, but in addition will also turn off shuffling in the training dataloader.
+
 
 ----
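For context on the behavior the updated paragraph describes, below is a minimal sketch (not part of the commit) of how ``overfit_batches`` is typically used end to end. The ``TinyModel`` module and the random tensors are invented purely for illustration.

import torch
from torch.utils.data import DataLoader, TensorDataset
import lightning.pytorch as pl


class TinyModel(pl.LightningModule):
    """Hypothetical minimal module used only to illustrate overfit_batches."""

    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(8, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = torch.nn.functional.mse_loss(self.layer(x), y)
        self.log("train_loss", loss)
        return loss

    def validation_step(self, batch, batch_idx):
        x, y = batch
        self.log("val_loss", torch.nn.functional.mse_loss(self.layer(x), y))

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)


# Random tensors stand in for a real dataset.
dataset = TensorDataset(torch.randn(256, 8), torch.randn(256, 1))
train_loader = DataLoader(dataset, batch_size=8, shuffle=True)
val_loader = DataLoader(dataset, batch_size=8)

# Train and validate on the same fixed 10 batches; per the updated docs,
# shuffling is also turned off in the training dataloader.
trainer = pl.Trainer(overfit_batches=10, max_epochs=20, logger=False, enable_checkpointing=False)
trainer.fit(TinyModel(), train_loader, val_loader)

If the training loss does not approach zero on such a tiny subset, that is the signal the docs describe: the model is unlikely to fit the full dataset either.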
