NeurIPS Competition Uncertainty Estimation

NeurIPS Competition Summary
uncertainty
metrics
Published

August 8, 2021

Notes on the NeurIPS Competition

I am participating in the NeurIPS Bayesian Deep Learning Competition and would like to journal my notes here.

Idea

The broad idea is to combine an evidential loss function, dropout, and test-time augmentation (TTA).
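
At inference time, a minimal sketch of how the dropout + TTA pieces could combine, assuming dropout is kept active at test time (MC dropout); the helper names, augmentation list, and pass counts here are illustrative assumptions, not the final pipeline:

```python
import torch
import torch.nn as nn


def enable_mc_dropout(model: nn.Module) -> None:
    """Keep only dropout layers stochastic at test time (MC dropout),
    while batchnorm etc. stay in eval mode."""
    model.eval()
    for m in model.modules():
        if isinstance(m, nn.Dropout):
            m.train()


@torch.no_grad()
def predict_mc_tta(model, x, augmentations, n_passes=10):
    """Average softmax outputs over TTA views and MC-dropout passes.
    `augmentations` is a list of callables (e.g. flips, crops); all
    names here are assumptions for illustration."""
    enable_mc_dropout(model)
    probs = []
    for aug in augmentations:          # test-time augmentation views
        for _ in range(n_passes):      # stochastic forward passes
            probs.append(torch.softmax(model(aug(x)), dim=-1))
    return torch.stack(probs).mean(0)  # predictive mean over all passes
```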

Journal

  • 8th August
    • Working on superconvergence.
  • 13th August
    • Rolled back to pytorch-cifar, with modifications.
  • 14th August
    • Training with the evidential loss reaches only 83% accuracy in 300 epochs.
    • Why is the evidential loss reducing accuracy?
  • 15th August
    • Training with CE, AdamW, and OneCycleLR: can we improve training speed? (AdamW + OneCycleLR sketch after this list.)
  • 17th August
    • Dirichlet loss function (loss sketch after this list).
  • 18th August
    • Dirichlet + Mixup: best results, touched 90% (mixup sketch below).

Reference

1. pytorch-cifar

| Model    | Data    | Criterion       | Optim | Scheduler     | Epochs | Accuracy (%) | Link | Notes                    |
|----------|---------|-----------------|-------|---------------|--------|--------------|------|--------------------------|
| Resnet18 | pytorch | cross-entropy   | SGD   | annealing-200 | 200    | 94           | 1    |                          |
| Resnet20 | pytorch | cross-entropy   | SGD   | annealing-200 | 200    | 89           | 1    |                          |
| Resnet20 | tf      | cross-entropy   | SGD   | annealing-200 | 200    | 90           | 1    |                          |
| Resnet20 | tf      | evidential      | SGD   | annealing-200 | 600    | 73/??/83     | 1    | added RandomErasing      |
| Resnet20 | tf      | label smoothing | SGD   | annealing-200 | 200    | ??           | ??   |                          |
| Resnet20 | tf      | cross-entropy   | AdamW | 1 cycle       | 30     | 83           | 1    |                          |
| Resnet20 | tf      | cross-entropy   | AdamW | 1 cycle       | 100    | 88           | 1    | max_lr=0.01              |
| Resnet20 | tf      | cross-entropy   | AdamW | 1 cycle       | 30     | 50           | 1    | max_lr=0.1               |
| Resnet20 | tf      | cross-entropy   | AdamW | 1 cycle       | 30     | 80           | 1    | max_lr=0.05              |
| Resnet20 | tf      | evidential      | AdamW | 1 cycle       | 30     | 69           | 1    | max_lr=0.05              |
| Resnet20 | tf      | evidential      | AdamW | annealing-200 | 200    | 75           | 1    | max_lr=0.01              |
| Resnet20 | tf      | cross-entropy   | AdamW | 1 cycle       | 200    | 89           | 1    | max_lr=0.05              |
| Resnet20 | tf      | cross-entropy   | AdamW | 1 cycle       | 200    | 89           | 1    | max_lr=0.05, RandomErase |