Contrastive Language-Image Pretraining
Connecting text and images.
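In brief, CLIP trains an image encoder and a text encoder jointly so that matching image-text pairs score higher than all mismatched pairings in a batch, using a symmetric contrastive loss. The sketch below is a minimal, illustrative version of that objective, assuming you already have batched embeddings from the two encoders; the function and variable names are ours, not the reference implementation's.

```python
import torch
import torch.nn.functional as F

def contrastive_image_text_loss(image_embeds, text_embeds, temperature=0.07):
    """Symmetric contrastive loss over a batch of paired image/text embeddings.

    image_embeds, text_embeds: (batch, dim) tensors from separate encoders,
    where row i of each tensor comes from the same image-caption pair.
    """
    # Normalize so the dot product is cosine similarity.
    image_embeds = F.normalize(image_embeds, dim=-1)
    text_embeds = F.normalize(text_embeds, dim=-1)

    # (batch, batch) similarity matrix; entry (i, j) compares image i to text j.
    logits = image_embeds @ text_embeds.t() / temperature

    # The correct text for image i sits on the diagonal.
    targets = torch.arange(logits.size(0), device=logits.device)

    # Cross-entropy in both directions: image-to-text and text-to-image.
    loss_i2t = F.cross_entropy(logits, targets)
    loss_t2i = F.cross_entropy(logits.t(), targets)
    return (loss_i2t + loss_t2i) / 2
```

In the paper itself the temperature is a learned logit scale rather than a fixed constant, and the two encoders are trained end to end on a very large corpus of image-text pairs; the sketch above covers only the per-batch loss.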
Deep learning is advancing rapidly as thousands of new papers are published every year. Exploring best practices and state-of-the-art techniques can often feel like drinking from a fire hose. Dead Neuron serves as a concise guide to research, where our collection of notebooks distills key ideas and implementation details from influential papers to help you learn how to build better neural networks.
Local minima in loss landscapes are connected by high-accuracy pathways.
Learning optimal transformation pipelines for data augmentation.
Ensembles where new members are trained to correct previous mistakes.
Training a small model on the outputs of a larger and more accurate model.
A phenomenon where generalization gets worse, then better, with larger models and bigger datasets.
A class of generative latent variable models inspired by nonequilibrium thermodynamics.
Neural network ensembles have been effectively used to improve generalization by combining the predictions of multiple independently trained models. However, the growing scale and complexity of deep neural networks have made these methods prohibitively expensive and time-consuming to implement. Low-cost ensemble methods have become increasingly important as they can alleviate the need to train multiple models from scratch while retaining the generalization benefits that...