Neural Networks as Cellular Sheaves: Bidirectional Flow and the Sheaf Laplacian
We’ve been taught that a neural network is like a one-way street. Information enters at the beginning, gets crunched through layers, and spits out an answer at the end. It’s a 'feedforward' world where earlier layers don't know what later layers are doing until the 'backpropagation' phase kicks in.
But what if a neural network wasn't a conveyor belt, but a cellular sheaf—a mathematical structure where every part of the network is constantly trying to stay 'consistent' with every other part?
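To make that 'consistency' idea concrete, here is a minimal numpy sketch of a sheaf Laplacian on a tiny graph. It assumes identity restriction maps (the so-called constant sheaf, where the sheaf Laplacian reduces to the ordinary graph Laplacian); the node count, stalk dimension, and step size are illustrative choices, not anything from the episode.

```python
import numpy as np

# A tiny cellular sheaf on a path graph with 3 nodes and 2 edges.
# Each node carries a 2-dimensional stalk (its local state); each
# (edge, node) incidence gets a restriction map. Here the maps are
# identities, i.e. the constant sheaf.
d = 2                      # stalk dimension (assumed)
n = 3                      # number of nodes (assumed)
edges = [(0, 1), (1, 2)]

# Coboundary delta: for each edge (u, v), the row block computes
# F_v x_v - F_u x_u, the "disagreement" across that edge.
delta = np.zeros((len(edges) * d, n * d))
for i, (u, v) in enumerate(edges):
    F_u = np.eye(d)        # restriction map from node u into edge i
    F_v = np.eye(d)        # restriction map from node v into edge i
    delta[i*d:(i+1)*d, u*d:(u+1)*d] = -F_u
    delta[i*d:(i+1)*d, v*d:(v+1)*d] = F_v

L = delta.T @ delta        # the sheaf Laplacian

# Sheaf diffusion: x <- x - alpha * L @ x repeatedly nudges every
# node toward agreement with its neighbors. Fixed points are the
# globally consistent states (the kernel of L, the global sections).
x = np.array([1.0, 0.0, 0.0, 1.0, -1.0, 2.0])
alpha = 0.3
for _ in range(200):
    x = x - alpha * L @ x

# For the constant sheaf, every node converges to the mean state.
print(np.round(x.reshape(n, d), 3))
```

Running the diffusion drives the three node states to a common value; with non-identity restriction maps, the same update instead converges to states that agree only after being translated through each edge's maps, which is the bidirectional, consistency-seeking flow the episode is about.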
Welcome to the show. Today we're exploring a radical rethink of deep learning in our episode 'Neural Networks as Cellular Sheaves: Bidirectional Flow and the Sheaf Laplacian'.
DeepCamp AI