Extracting formulae in many-valued logic from deep neural networks
Authors
Yani Zhang and Helmut Bölcskei

Reference
IEEE Transactions on Signal Processing, Vol. 74, 2026.
Abstract
We propose a new perspective on deep rectified linear unit (ReLU) networks, namely as circuit counterparts of Łukasiewicz infinite-valued logic—a many-valued (MV) generalization of Boolean logic. An algorithm for extracting formulae in MV logic from (trained) deep ReLU networks is presented. The algorithm respects the network topology, in particular compositionality, thereby honoring algebraic information present in the training data. While the two existing methods for turning truth functions in MV logic into formulae are for the univariate case only, the algorithm we propose applies to multivariate functions. Moreover, it is demonstrated—through numerical results and in one specific case analytically—that our algorithm, in the univariate case, can deliver shorter formulae than the other two methods. We also establish the representation benefits of deep networks from a mathematical logic perspective.

Keywords
Mathematical logic, many-valued logic, McNaughton functions, deep neural networks
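The structural link the abstract alludes to can be made concrete: the basic Łukasiewicz connectives are piecewise-linear functions on [0, 1] and can therefore be written directly in terms of ReLUs. The following is a minimal illustrative sketch of this correspondence, not the extraction algorithm described in the paper; the function names are our own.

```python
def relu(x):
    # Rectified linear unit: max(0, x)
    return max(0.0, x)

def neg(x):
    # Lukasiewicz negation: 1 - x
    return 1.0 - x

def strong_disj(x, y):
    # Lukasiewicz strong disjunction: min(1, x + y), realized with one ReLU
    return 1.0 - relu(1.0 - x - y)

def strong_conj(x, y):
    # Lukasiewicz strong conjunction: max(0, x + y - 1), itself a ReLU
    return relu(x + y - 1.0)

# Truth values live in [0, 1]; restricted to {0, 1}, the strong
# disjunction and conjunction reduce to Boolean OR and AND.
assert strong_disj(1.0, 0.0) == 1.0
assert strong_conj(1.0, 0.0) == 0.0
```

On {0, 1} the connectives behave classically, while on the full interval they yield the truncated sums characteristic of MV logic; this is why formulae in Łukasiewicz logic correspond to the (continuous, piecewise-linear) McNaughton functions that ReLU networks compute.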
Copyright Notice: © 2026 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.
This material is presented to ensure timely dissemination of scholarly and technical work. Copyright and all rights therein are retained by authors or by other copyright holders. All persons copying this information are expected to adhere to the terms and constraints invoked by each author's copyright. In most cases, these works may not be reposted without the explicit permission of the copyright holder.