Calendar
27
May
PhD seminar: Frej Dahlin
Title
Packed differentiable logic gates
Abstract
Logic gate networks, in contrast to neural networks, allow for extremely fast evaluation on typical consumer-grade hardware. However, fitting them with non-differentiable optimization is regarded as intractable; for this reason, differentiable logic gate networks were introduced in 2022. Each gate in such a network is a superposition of the 16 possible two-input logic gates. During training, the probability distribution over each gate is updated; at inference, each gate is collapsed into its most likely candidate. One drawback is the high training cost of these networks. I will give a short introduction and present a new 'bit-packed' representation of differentiable logic gates, which yields a 4x improvement in memory footprint and processing time during training.
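To illustrate the mechanism described in the abstract, here is a minimal sketch of a single differentiable logic gate: each of the 16 two-input Boolean gates gets a standard real-valued relaxation (e.g. AND becomes a*b), and the gate's output is a softmax-weighted mixture over all 16, which collapses to an argmax at inference. The function names and structure here are illustrative, not the speaker's actual implementation, and the bit-packed representation the talk introduces is not shown.

```python
import math

def gate_outputs(a, b):
    """Real-valued relaxations of the 16 two-input Boolean gates:
    for inputs a, b in [0, 1], each entry is the probability that
    the gate outputs 1 under independent Bernoulli inputs."""
    ab = a * b
    return [
        0.0,                   # FALSE
        ab,                    # AND
        a - ab,                # A AND NOT B
        a,                     # A
        b - ab,                # NOT A AND B
        b,                     # B
        a + b - 2 * ab,        # XOR
        a + b - ab,            # OR
        1 - (a + b - ab),      # NOR
        1 - (a + b - 2 * ab),  # XNOR
        1 - b,                 # NOT B
        1 - b + ab,            # A OR NOT B
        1 - a,                 # NOT A
        1 - a + ab,            # NOT A OR B
        1 - ab,                # NAND
        1.0,                   # TRUE
    ]

def soft_gate(a, b, logits):
    """Training-time gate: softmax-weighted mixture of all 16 gates.
    The 16 logits are the trainable parameters of this gate."""
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    outs = gate_outputs(a, b)
    return sum((w / s) * o for w, o in zip(exps, outs))

def hard_gate(a, b, logits):
    """Inference-time gate: collapse to the most likely candidate."""
    k = logits.index(max(logits))
    return gate_outputs(a, b)[k]
```

Since `soft_gate` is smooth in its logits, a network of such gates can be trained by gradient descent; the training cost comes from evaluating all 16 relaxations per gate, which is what a packed representation aims to reduce.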