Gradient routing is a method for controlling where learning happens in neural networks. It applies masks that limit the flow of gradients during backpropagation; by supplying different masks for different data points, the user can induce specialized subcomponents within a model. As part of SPAR, I am working on applying this method to AI safety research.
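To make the idea concrete, here is a minimal toy sketch (my own illustration, not the actual gradient routing implementation): a hand-written backward pass for a tiny 1-input, 2-hidden-unit, 1-output linear network, where a per-example mask zeroes the gradients reaching chosen hidden units, so each example's learning is routed to the unmasked units.

```python
# Toy sketch of gradient routing (illustrative only):
# a per-example mask multiplies the gradients of each hidden unit,
# so updates for that example only reach the unmasked units.

w1 = [0.1, 0.1]   # input -> hidden weights, one per hidden unit
w2 = [0.1, 0.1]   # hidden -> output weights, one per hidden unit

def train_step(x, y, mask, lr=0.1):
    """One SGD step on squared error; mask[i] = 0 blocks learning in unit i."""
    h = [w * x for w in w1]                        # hidden activations
    y_hat = sum(wo * hi for wo, hi in zip(w2, h))  # network output
    err = y_hat - y                                # d(loss)/d(y_hat) for 0.5*err**2
    # Compute all gradients first, applying the routing mask per unit.
    g2 = [err * h[i] * mask[i] for i in range(2)]
    g1 = [err * w2[i] * x * mask[i] for i in range(2)]
    for i in range(2):
        w2[i] -= lr * g2[i]
        w1[i] -= lr * g1[i]
    return y_hat

# Route "task A" examples to unit 0 and "task B" examples to unit 1.
for _ in range(100):
    train_step(1.0, 2.0, mask=[1.0, 0.0])    # task A: only unit 0 learns
    train_step(-1.0, 3.0, mask=[0.0, 1.0])   # task B: only unit 1 learns
```

After training, each hidden unit has only ever received gradient from its assigned subset of the data, which is the specialization effect the paragraph above describes. The network shape, masks, and tasks here are hypothetical choices made for the sake of a small runnable example.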