Non-bijunctive attention collapse for LLM inference — POWER8 hardware AES (vcipher) + AltiVec vec_perm. Hebbian path selection, cross-head diffusion, O(1) KV prefiltering.
Updated Mar 31, 2026 - C
Accelerate LLM inference by collapsing attention paths: hardware-assisted selective pruning built on POWER8 vector instructions and in-core crypto operations.
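No code survives on this page, so below is a minimal C sketch of how the two named hardware primitives could combine, assuming GCC on a POWER8 target built with `-mcpu=power8 -maltivec -mcrypto`. Only the intrinsics themselves (`__builtin_crypto_vcipher`, `vec_perm`, `vec_splats`) are real; the names `aes_mix`, `kv_bucket`, `diffuse_heads`, and `BUCKET_BITS`, the two-round mixing depth, and the filter width are illustrative assumptions, not taken from the repository.

```c
/* Sketch: vcipher as a constant-time filter hash for KV prefiltering,
 * plus vec_perm for byte-level cross-head diffusion. All names here
 * are hypothetical; build with: gcc -O2 -mcpu=power8 -maltivec -mcrypto */
#include <altivec.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define BUCKET_BITS 10                        /* hypothetical 1024-entry filter */
#define BUCKET_MASK ((1u << BUCKET_BITS) - 1)

typedef vector unsigned long long v2u64;
typedef vector unsigned char      v16u8;

/* One hardware AES round: diffuses all 16 state bytes in one instruction. */
static inline v2u64 aes_mix(v2u64 state, v2u64 round_key)
{
    return __builtin_crypto_vcipher(state, round_key);
}

/* Map a 16-byte key digest to a small bucket index in constant time.
 * A 2^BUCKET_BITS-entry table of bucket signatures can then reject most
 * KV-cache entries before any dot-product work is done. */
static uint32_t kv_bucket(const uint8_t key_digest[16], v2u64 round_key)
{
    v2u64 state;
    memcpy(&state, key_digest, 16);
    state = aes_mix(state, round_key);        /* two rounds chosen here as */
    state = aes_mix(state, round_key);        /* enough mixing for a filter */

    uint64_t lanes[2];
    memcpy(lanes, &state, 16);
    return (uint32_t)((lanes[0] ^ lanes[1]) & BUCKET_MASK);
}

/* Cross-head diffusion: interleave bytes of two heads' score vectors.
 * Indices 0-15 select from head_a, 16-31 from head_b (GCC adjusts
 * vec_perm element numbering on little-endian POWER8 targets). */
static inline v16u8 diffuse_heads(v16u8 head_a, v16u8 head_b)
{
    const v16u8 pat = {  0, 16,  2, 18,  4, 20,  6, 22,
                         8, 24, 10, 26, 12, 28, 14, 30 };
    return vec_perm(head_a, head_b, pat);
}

int main(void)
{
    const uint8_t digest[16] = { 1, 2,  3,  4,  5,  6,  7,  8,
                                 9, 10, 11, 12, 13, 14, 15, 16 };
    const v2u64 round_key = { 0x0123456789abcdefULL, 0xfedcba9876543210ULL };
    printf("bucket   = %u\n", kv_bucket(digest, round_key));

    v16u8 a = vec_splats((unsigned char)0xAA);
    v16u8 b = vec_splats((unsigned char)0x55);
    v16u8 mixed = diffuse_heads(a, b);
    printf("mixed[0] = 0x%02x\n", mixed[0]);
    return 0;
}
```

The likely appeal of vcipher in this role is that a single AES round diffuses all 16 input bytes in a few cycles, so two rounds plus a 64-bit fold give a usable constant-time filter hash without scalar multiply chains; vec_perm then provides arbitrary byte routing across heads at one instruction per shuffle.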