Differentiable unrolled alternating direction method of multipliers for OneNet

Zoltán Milacski, Barnabás Póczos, András Lőrincz

Research output: Contribution to conference › Paper

Abstract

Deep neural networks achieve state-of-the-art results on numerous image processing tasks, but this typically requires training problem-specific networks. Towards multi-task learning, the One Network to Solve Them All (OneNet) method was recently proposed, which first pretrains an adversarial denoising autoencoder and subsequently uses it as the proximal operator in Alternating Direction Method of Multipliers (ADMM) solvers for multiple imaging problems. In this work, we highlight training and ADMM convergence issues of OneNet, and resolve them by proposing an end-to-end learned architecture that trains the two steps jointly using Unrolled Optimization with backpropagation. In our experiments, our solution achieves superior or on-par results compared to the original OneNet and Wavelet sparsity on four imaging problems (pixelwise inpainting-denoising, blockwise inpainting, scattered inpainting and super-resolution) on the MS-Celeb-1M and ImageNet data sets, even with a much smaller ADMM iteration count.
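As a rough illustration of the unrolled-ADMM idea described in the abstract, the sketch below implements a fixed number of differentiable ADMM iterations for a masked (pixelwise-inpainting-style) observation model in PyTorch, with a small residual CNN standing in for the learned proximal operator. All names (Denoiser, UnrolledADMM), the penalty-parameter handling, and the network architecture are illustrative assumptions, not the authors' OneNet implementation, which uses an adversarially pretrained denoising autoencoder and problem-specific data terms.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Denoiser(nn.Module):
    """Small residual CNN standing in for the learned proximal operator
    (the paper pretrains an adversarial denoising autoencoder instead)."""
    def __init__(self, channels=3, width=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, channels, 3, padding=1),
        )

    def forward(self, v):
        return v + self.net(v)  # residual correction of the input

class UnrolledADMM(nn.Module):
    """K differentiable ADMM iterations for y = mask * x_true (pixelwise inpainting).
    Gradients flow through every iteration, so the denoiser and the
    penalty parameter rho are trained jointly, end to end."""
    def __init__(self, denoiser, K=5, rho=1.0):
        super().__init__()
        self.denoiser = denoiser
        self.K = K
        self.log_rho = nn.Parameter(torch.log(torch.tensor(rho)))  # keep rho > 0

    def forward(self, y, mask):
        rho = self.log_rho.exp()
        x, z, u = y.clone(), y.clone(), torch.zeros_like(y)
        for _ in range(self.K):
            # x-update: closed form of argmin_x ||mask*x - y||^2 + (rho/2)||x - z + u||^2
            x = (mask * y + 0.5 * rho * (z - u)) / (mask + 0.5 * rho)
            # z-update: proximal step replaced by the learned denoiser
            z = self.denoiser(x + u)
            # dual (scaled multiplier) update
            u = u + x - z
        return z

# Toy training step with a pixelwise reconstruction loss.
model = UnrolledADMM(Denoiser(), K=5)
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
x_true = torch.rand(4, 3, 64, 64)
mask = (torch.rand_like(x_true) > 0.5).float()  # random pixelwise mask
y = mask * x_true                                # corrupted observation
loss = F.mse_loss(model(y, mask), x_true)
loss.backward()
opt.step()
```

The x-update above is the elementwise closed-form solution of the quadratic subproblem for a diagonal binary mask; other imaging problems (e.g. super-resolution) would swap in their own data term. Because the loop is unrolled for a fixed K, backpropagation reaches both the denoiser weights and rho, which is the joint training the abstract refers to.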

Original language: English
Publication status: Published - 2020
Event: 30th British Machine Vision Conference, BMVC 2019 - Cardiff, United Kingdom
Duration: Sep 9, 2019 - Sep 12, 2019

Conference

Conference: 30th British Machine Vision Conference, BMVC 2019
Country: United Kingdom
City: Cardiff
Period: 9/9/19 - 9/12/19

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition


  • Cite this

    Milacski, Z., Póczos, B., & Lőrincz, A. (2020). Differentiable unrolled alternating direction method of multipliers for OneNet. Paper presented at 30th British Machine Vision Conference, BMVC 2019, Cardiff, United Kingdom.