I think I know enough about how neural networks work, even though I could not tell you in any detail what the exact layer structure is, which activation functions they use, how the attention mechanism is built, or what training procedure they use. But why does it matter how well I understand the details?