Illumination-robust variational optical flow using cross-correlation

József Molnár, Dmitry Chetverikov, Sándor Fazekas

Research output: Contribution to journal › Article

29 Citations (Scopus)

Abstract

We address the problem of variational optical flow for video processing applications that need fast operation and robustness to drastic variations in illumination. Recently, a solution [1] has been proposed based on the photometric invariants of the dichromatic reflection model [2]. However, this solution is only applicable to colour videos with brightness variations. Greyscale videos, as well as colour videos with colour illumination changes, cannot be adequately handled. We propose two illumination-robust variational methods based on cross-correlation that are applicable to colour and grey-level sequences and robust to brightness and colour illumination changes. First, we present a general implicit nonlinear scheme that assumes no particular analytical form of the energy functional and can accommodate different image components and data metrics, including cross-correlation. We test the nonlinear scheme on standard synthetic data with artificial brightness and colour effects added and conclude that cross-correlation is robust to both kinds of illumination changes. Then we derive a fast linearised numerical scheme for cross-correlation based variational optical flow. We test the linearised algorithm on challenging data and compare it to a number of state-of-the-art variational flow methods.
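The abstract's central claim is that cross-correlation, used as the data metric, tolerates both brightness and colour illumination changes. The following minimal sketch (not the paper's variational scheme, whose functional and discretisation are not given here) illustrates why: zero-mean normalised cross-correlation (ZNCC) of two patches is invariant to an affine intensity change I' = a·I + b, which is a common model of a brightness shift. The function name `zncc` and the 7x7 patch size are illustrative assumptions.

```python
# Illustrative sketch only: ZNCC as an illumination-robust patch similarity.
# The affine change model I' = a*I + b is an assumption used to demonstrate
# the invariance property; it is not the paper's numerical scheme.
import numpy as np

def zncc(p, q, eps=1e-12):
    """Zero-mean normalised cross-correlation of two equal-sized patches."""
    p = p - p.mean()
    q = q - q.mean()
    denom = np.sqrt((p * p).sum() * (q * q).sum()) + eps
    return (p * q).sum() / denom

rng = np.random.default_rng(0)
patch = rng.random((7, 7))         # a 7x7 image patch
brightened = 1.8 * patch + 0.3     # affine brightness change: gain and offset

print(zncc(patch, patch))          # 1.0: perfect match
print(zncc(patch, brightened))     # ~1.0: ZNCC cancels gain and offset
```

Because the mean subtraction removes the offset b and the normalisation cancels the gain a, a matching cost built on such a correlation term does not penalise uniform brightness changes between frames, which is the property the proposed variational methods exploit.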

Original language: English
Pages (from-to): 1104-1114
Number of pages: 11
Journal: Computer Vision and Image Understanding
Volume: 114
Issue number: 10
DOIs
Publication status: Published - Oct 1 2010

Keywords

  • Cross-correlation
  • Illumination-robustness
  • Variational optical flow

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
