The500Feed.Live

Everything going on in AI - updated daily from 500+ sources

📄 Research · May 14, 2026

Forgetting That Sticks: Quantization-Permanent Unlearning via Circuit Attribution

Standard unlearning evaluations measure behavioral suppression in full precision, immediately after training, even though every deployed language model is quantized first. Recent work has shown that 4-bit post-training quantization can reverse machine unlearning; we show this is not a tuning artefact...


Source

http://arxiv.org/abs/2605.15138v1