Toward joint approximate inference of visual quantities on cellular processor arrays

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

  • Authors:
  • Julien N.P. Martel
  • Miguel Chau
  • Piotr Dudek
  • Matthew Cook

Abstract

The interacting visual maps (IVM) algorithm introduced in [1] performs joint approximate inference of several visual quantities, such as optic flow, gray-level intensities and ego-motion, using sparse input from a neuromorphic dynamic vision sensor (DVS). We show that features of the model, such as the intrinsic parallelism and distributed nature of its computation, make it a natural candidate to benefit from the cellular processor array (CPA) hardware architecture. We have implemented the IVM algorithm on a general-purpose CPA simulator, and here we present simulation results demonstrating that the IVM algorithm indeed maps naturally onto the CPA architecture. Our work indicates that extended versions of the IVM algorithm could benefit greatly from a dedicated hardware implementation, eventually yielding a high-speed, low-power visual odometry chip.
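To illustrate why this kind of model suits a CPA, the sketch below shows a per-pixel state (intensity and optic-flow estimates) updated synchronously across the whole array, mixing each pixel's state with that of its 4-connected neighbours and with incoming DVS events. This is only a minimal conceptual sketch, not the authors' IVM update rules: the array size, the update coefficients, the event input format, and the omission of the ego-motion map are all assumptions made for illustration.

```python
import numpy as np

H, W = 64, 64  # hypothetical array size, one processing element per pixel

# Per-pixel state held "in place" on the array, as a CPA would hold it:
intensity = np.zeros((H, W))        # reconstructed gray-level estimate
flow = np.zeros((H, W, 2))          # per-pixel optic-flow estimate (vx, vy)

def neighbor_mean(x):
    """Average over the 4-connected neighbours; a stand-in for the local
    register transfers a CPA performs between adjacent processing elements."""
    return 0.25 * (np.roll(x, 1, 0) + np.roll(x, -1, 0) +
                   np.roll(x, 1, 1) + np.roll(x, -1, 1))

def step(events, alpha=0.1, beta=0.05):
    """One synchronous update of all pixels.

    `events` is a hypothetical (H, W) array of signed DVS event counts for
    the current time slice. Each pixel mixes its own state with its
    neighbours' (a smoothness term) and with the evidence carried by the
    events (a data term) -- the same local, parallel structure that makes
    the model a good fit for a cellular processor array.
    """
    global intensity, flow
    # Smoothness: pull each estimate toward its neighbours.
    intensity += alpha * (neighbor_mean(intensity) - intensity)
    flow[..., 0] += alpha * (neighbor_mean(flow[..., 0]) - flow[..., 0])
    flow[..., 1] += alpha * (neighbor_mean(flow[..., 1]) - flow[..., 1])
    # Data term: events signal a local change in intensity.
    intensity += beta * events

# Example: drive the array with random sparse events for a few steps.
rng = np.random.default_rng(0)
for _ in range(10):
    ev = (rng.random((H, W)) < 0.01).astype(float)  # ~1% of pixels fire
    step(ev)
```

The key property the sketch tries to convey is locality: every quantity a pixel needs for its update is either its own state or its immediate neighbours', so all pixels can be updated in parallel in constant time per step on a CPA.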

Bibliographical metadata

Original language: English
Title of host publication: IEEE International Symposium on Circuits and Systems, ISCAS 2015
Publisher: IEEE
Pages: 2061-2064
ISBN (Electronic): 978-1-4799-8391-9
DOIs
Publication status: Published - 2015
Event: IEEE International Symposium on Circuits and Systems, ISCAS 2015 - Lisbon, Portugal
Event duration: 24 May 2015 – 27 May 2015
http://www.scopus.com/inward/record.url?eid=2-s2.0-84946225388&partnerID=40&md5=58467212fe9944b83fe45c3ea4058d6b

Conference

Conference: IEEE International Symposium on Circuits and Systems, ISCAS 2015
Country: Portugal
City: Lisbon
Period: 24/05/15 – 27/05/15
Internet address