Toward joint approximate inference of visual quantities on cellular processor arrays

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Authors:

  • Julien N.P. Martel
  • Miguel Chau
  • Piotr Dudek
  • Matthew Cook


The interacting visual maps (IVM) algorithm introduced in [1] performs joint approximate inference of several visual quantities, such as optic flow, gray-level intensity, and ego-motion, from the sparse input of a neuromorphic dynamic vision sensor (DVS). We show that features of the model, such as the intrinsic parallelism and distributed nature of its computation, make it a natural candidate to benefit from the cellular processor array (CPA) hardware architecture. We have now implemented the IVM algorithm on a general-purpose CPA simulator; here we present results of our simulations and demonstrate that the IVM algorithm indeed fits the CPA architecture naturally. Our work indicates that extended versions of the IVM algorithm could benefit greatly from a dedicated hardware implementation, eventually yielding a high-speed, low-power visual odometry chip.
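The CPA model the abstract alludes to restricts each processing element to its own state and nearest-neighbour communication. As a rough illustration of that computational style (not the authors' IVM implementation; the update rule, mixing weights `alpha`/`beta`, and the NaN event encoding are all assumptions for this sketch), a synchronous grid update with local smoothing plus sparse event-driven corrections might look like:

```python
import numpy as np

def cpa_step(state, events, alpha=0.2, beta=0.5):
    """One synchronous update of an H x W grid of cells.

    Each cell mixes its current value with the mean of its four
    neighbours (nearest-neighbour communication only, as on a CPA),
    then cells that received a sparse measurement relax toward it.
    `events` holds NaN wherever no DVS event arrived at this step.
    This is an illustrative sketch, not the IVM update rule itself.
    """
    # 4-neighbour mean via edge-padded shifts (purely local access)
    padded = np.pad(state, 1, mode="edge")
    neigh = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
             padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
    smoothed = (1 - alpha) * state + alpha * neigh

    # sparse correction: only cells with an event move toward it
    mask = ~np.isnan(events)
    smoothed[mask] = (1 - beta) * smoothed[mask] + beta * events[mask]
    return smoothed
```

Iterating such a step lets local measurements diffuse across the array, which is the kind of distributed, per-pixel computation that maps one-to-one onto a CPA's processing elements.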

Bibliographical metadata

Original language: English
Title of host publication: IEEE International Symposium on Circuits and Systems, ISCAS 2015
ISBN (Electronic): 978-1-4799-8391-9
Publication status: Published - 2015
Event: IEEE International Symposium on Circuits and Systems, ISCAS 2015 - Lisbon, Portugal
Event duration: 24 May 2015 - 27 May 2015


Conference: IEEE International Symposium on Circuits and Systems, ISCAS 2015