Frontiers in Artificial Intelligence (Jun 2022)

Signal Perceptron: On the Identifiability of Boolean Function Spaces and Beyond

  • Miguel-Angel Mendez Lucero,
  • Rafael-Michael Karampatsis,
  • Enrique Bojorquez Gallardo,
  • Vaishak Belle

DOI: https://doi.org/10.3389/frai.2022.770254
Journal volume & issue: Vol. 5

Abstract

In a seminal book, Minsky and Papert defined the perceptron as a limited implementation of what they called “parallel machines.” They showed that some binary Boolean functions, including XOR, cannot be represented by a single-layer perceptron, because it can only learn linearly separable functions. In this work, we propose a new, more powerful implementation of such parallel machines. This new mathematical tool is defined using analytic sinusoids, instead of linear combinations, to form an analytic signal representation of the function that we want to learn. We show that this reformulated parallel mechanism can learn, with a single layer, any non-linear k-ary Boolean function. Finally, to illustrate its practical applications, we show that it outperforms the single-hidden-layer multilayer perceptron in both Boolean function learning and image classification tasks, while also being faster and requiring fewer parameters.
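As a rough illustration of the core idea that a single layer of sinusoidal features can capture non-linearly-separable Boolean functions such as XOR, the following minimal Python sketch uses a cosine (Fourier-style) expansion over {0,1}^k fitted by least squares. The cosine basis and the least-squares solver are illustrative assumptions, not the paper's exact analytic-signal formulation or training procedure.

    import numpy as np
    from itertools import product

    # Illustrative sketch (not the paper's exact formulation): represent a
    # k-ary Boolean function with a single layer of sinusoidal features
    # cos(pi * (w . x)), one for each frequency vector w in {0,1}^k, and
    # fit the amplitudes by least squares.
    k = 2
    X = np.array(list(product([0, 1], repeat=k)))       # all 2^k Boolean inputs
    y = np.logical_xor(X[:, 0], X[:, 1]).astype(float)  # XOR target

    W = np.array(list(product([0, 1], repeat=k)))       # 2^k frequency vectors
    Phi = np.cos(np.pi * X @ W.T)                       # sinusoidal feature matrix
    alpha, *_ = np.linalg.lstsq(Phi, y, rcond=None)     # amplitudes (one "layer")

    pred = (Phi @ alpha > 0.5).astype(float)
    print(pred)  # matches y exactly: XOR is captured without a hidden layer

Because the 2^k cosine features span the space of functions on {0,1}^k, the fit is exact here, which is the sense in which a single sinusoidal layer can represent any k-ary Boolean function.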

Keywords