Adversarial

Notes: Generalization and Equilibrium in GANs

5 minute read

Published:

Notes: Generalization and Equilibrium in GANs

This post is about an interesting paper by Arora et al. 2017. They explain why GAN generators and discriminators may fail to reach the correct equilibrium. The paper points out that the distance metric used to define the training objective may not be suitable in practice, and that the theoretical assumptions behind that objective may not hold when training on real-world data. Finally, they present a new distance metric, motivated by pseudorandomness, to address this issue.
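
To make the excerpt concrete: the central object in the paper is the neural-net distance, roughly the largest gap in expectation that any discriminator in a finite-capacity class $\mathcal{F}$ can detect between two distributions. A simplified form (the paper works with a slightly more general version using a measuring function) is:

$$
d_{\mathcal{F}}(\mu, \nu) = \sup_{D \in \mathcal{F}} \Big| \mathbb{E}_{x \sim \mu}[D(x)] - \mathbb{E}_{x \sim \nu}[D(x)] \Big|
$$

Generalization is then argued with respect to this weaker distance rather than Jensen-Shannon divergence or Wasserstein distance.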

Notes: BEGAN

1 minute read

Published:

Notes: Boundary Equilibrium GAN

This post provides a summary of the paper by Berthelot et al. 2017. They propose a robust GAN architecture with a standard training procedure. To achieve stable convergence, they introduce an equilibrium concept that balances the Generator and the Discriminator. The results are much improved in terms of both image diversity and visual quality.
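
A rough sketch of the equilibrium mechanism described in the paper (notation approximate): with $L(\cdot)$ the discriminator's pixel-wise autoencoder reconstruction loss and $\gamma$ a chosen diversity ratio, the balance between real and generated losses is maintained by a proportional control term $k_t$:

$$
\mathcal{L}_D = L(x) - k_t \, L(G(z)), \qquad
\mathcal{L}_G = L(G(z)), \qquad
k_{t+1} = k_t + \lambda_k \big( \gamma \, L(x) - L(G(z)) \big)
$$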

Atrous conv

Notes: DeepLab Segmentation

1 minute read

Published:

DeepLab: Semantic Image Segmentation

This post is a summary of the semantic segmentation paper by Chen et al. 2016. They combine a deep convolutional network with a fully connected CRF to produce more accurate segmentation results.
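
Since this entry is tagged under atrous convolution, here is a minimal sketch of the operation in TensorFlow 1.x; the shapes and rate below are illustrative assumptions, not values taken from the post:

```python
import tensorflow as tf

# Illustrative feature map and 3x3 filter bank (shapes are placeholders).
features = tf.placeholder(tf.float32, [None, 64, 64, 256], name="features")
filters = tf.get_variable("atrous_w", shape=[3, 3, 256, 256])

# rate=2 inserts holes into the filter, enlarging the receptive field
# without downsampling the feature map.
out = tf.nn.atrous_conv2d(features, filters, rate=2, padding="SAME")
```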

CRF

Notes: DeepLab Segmentation

1 minute read

Published:

DeepLab: Semantic Image Segmentation

This post is a summary of the semantic segmentation paper by Chen et al. 2016. They combine a deep convolutional network with a fully connected CRF to produce more accurate segmentation results.

Deep Learning

Pytorch Tutorial

5 minute read

Published:

Pytorch Tutorial for Practitioners

Notes: Generalization and Equilibrium in GANs

5 minute read

Published:

Notes: Generalization and Equilibrium in GANs

This post is about an interesting paper by Arora et al. 2017. They explain why GAN generators and discriminators may fail to reach the correct equilibrium. The paper points out that the distance metric used to define the training objective may not be suitable in practice, and that the theoretical assumptions behind that objective may not hold when training on real-world data. Finally, they present a new distance metric, motivated by pseudorandomness, to address this issue.

Notes: BEGAN

1 minute read

Published:

Notes: Boundary Equilibrium GAN

This post provides a summary of the paper by Berthelot et al. 2017. They propose a robust GAN architecture with a standard training procedure. To achieve stable convergence, they introduce an equilibrium concept that balances the Generator and the Discriminator. The results are much improved in terms of both image diversity and visual quality.

Notes: Understanding Deep Learning Requires Rethinking Generalization

2 minute read

Published:

Notes: Understanding Deep Learning Requires Rethinking Generalization

In this post I provide a summary of the paper by Zhang et al. that won a Best Paper Award at ICLR 2017. It is quite informative for understanding why some neural networks can generalize well while others cannot. They provide detailed results examining generalization error across a variety of tests.

Tips: Tensorflow-Wrap

2 minute read

Published:

Tensorflow-Wrap

This post shows how to set up TensorBoard summaries for popular CNN architecture layers in TensorFlow. This not only helps with debugging but also provides insight into the inner workings of deep neural nets.
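
As a flavour of what the post covers, here is a minimal sketch of attaching TensorBoard summaries to a conv layer in TensorFlow 1.x; the function and scope names are placeholders, not the wrapper's actual API:

```python
import tensorflow as tf

def conv_with_summaries(x, in_channels, out_channels, name):
    """3x3 conv layer that also logs its weights and activations to TensorBoard."""
    with tf.variable_scope(name):
        w = tf.get_variable("w", [3, 3, in_channels, out_channels])
        b = tf.get_variable("b", [out_channels], initializer=tf.zeros_initializer())
        act = tf.nn.relu(tf.nn.conv2d(x, w, strides=[1, 1, 1, 1], padding="SAME") + b)
        # Histograms show how the weights and activations evolve during training.
        tf.summary.histogram("weights", w)
        tf.summary.histogram("activations", act)
        return act

# Example usage: one layer, then merge every summary op and point a writer
# at a log directory for TensorBoard to read.
images = tf.placeholder(tf.float32, [None, 32, 32, 3], name="images")
conv1 = conv_with_summaries(images, 3, 64, "conv1")
merged_summaries = tf.summary.merge_all()
writer = tf.summary.FileWriter("./logs")
```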

Notes: DeepLab Segmentation

1 minute read

Published:

DeepLab: Semantic Image Segmentation

This post is a summary of the semantic segmentation paper by Chen et al. 2016. They combine a deep convolutional network with a fully connected CRF to produce more accurate segmentation results.

GAN

Notes: Generalization and Equilibrium in GANs

5 minute read

Published:

Notes: Generalization and Equilibrium in GANs

This post is about an interesting paper by Arora et al. 2017. They explain why GAN generators and discriminators may fail to reach the correct equilibrium. The paper points out that the distance metric used to define the training objective may not be suitable in practice, and that the theoretical assumptions behind that objective may not hold when training on real-world data. Finally, they present a new distance metric, motivated by pseudorandomness, to address this issue.

Notes: BEGAN

1 minute read

Published:

Notes: Boundary Equilibrium GAN

This post provides a summary of the paper by Berthelot et al. 2017. They propose a robust GAN architecture with a standard training procedure. To achieve stable convergence, they introduce an equilibrium concept that balances the Generator and the Discriminator. The results are much improved in terms of both image diversity and visual quality.

Pytorch

Pytorch Tutorial

5 minute read

Published:

Pytorch Tutorial for Practitioners

Segmentation

Notes: DeepLab Segmentation

1 minute read

Published:

DeepLab: Semantic Image Segmentation

This post is a summary of the semantic segmentation paper by Chen et al. 2016. They combine a deep convolutional network with a fully connected CRF to produce more accurate segmentation results.

Tensorboard

Tips: Tensorflow-Wrap

2 minute read

Published:

Tensorflow-Wrap

This post shows how to set up TensorBoard summaries for popular CNN architecture layers in TensorFlow. This not only helps with debugging but also provides insight into the inner workings of deep neural nets.

Tensorflow

Tips: Tensorflow-Wrap

2 minute read

Published:

Tensorflow-Wrap

This post shows how to set up TensorBoard summaries for popular CNN architecture layers in TensorFlow. This not only helps with debugging but also provides insight into the inner workings of deep neural nets.

Theory

Notes: Generalization and Equilibrium in GANs

5 minute read

Published:

Notes: Generalization and Equilibrium in GANs

This post is about an interesting paper by Arora et al. 2017. They explain why GAN generators and discriminators may fail to reach the correct equilibrium. The paper points out that the distance metric used to define the training objective may not be suitable in practice, and that the theoretical assumptions behind that objective may not hold when training on real-world data. Finally, they present a new distance metric, motivated by pseudorandomness, to address this issue.

Notes: BEGAN

1 minute read

Published:

Notes: Boundary Equilibrium GAN

This post provides a summary of the paper by Berthelot et al. 2017. They propose a robust GAN architecture with a standard training procedure. To achieve stable convergence, they introduce an equilibrium concept that balances the Generator and the Discriminator. The results are much improved in terms of both image diversity and visual quality.

Notes: Understanding Deep Learning Requires Rethinking Generalization

2 minute read

Published:

Notes: Understanding Deep Learning Requires Rethinking Generalization

In this post I provide a summary of the paper by Zhang et al. that won a Best Paper Award at ICLR 2017. It is quite informative for understanding why some neural networks can generalize well while others cannot. They provide detailed results examining generalization error across a variety of tests.

python

Pytorch Tutorial

5 minute read

Published:

Pytorch Tutorial for Practitioners

Tips: Tensorflow-Wrap

2 minute read

Published:

Tensorflow-Wrap

This post shows how to set up TensorBoard summaries for popular CNN architecture layers in TensorFlow. This not only helps with debugging but also provides insight into the inner workings of deep neural nets.