Thought Anchors: Which LLM Reasoning Steps Matter?