Iterative Vectors: In-Context Gradient Steering without Backpropagation

Yiting Liu, Zhi-Hong Deng
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:38290-38312, 2025.

Abstract

In-context learning has become a standard approach for utilizing language models. However, selecting and processing suitable demonstration examples can be challenging and time-consuming, especially when dealing with large numbers of them. We propose Iterative Vectors (IVs), a technique that explores activation space to enhance in-context performance by simulating gradient updates during inference. IVs extract and iteratively refine activation-based meta-gradients, applying them during inference without requiring backpropagation at any stage. We evaluate IVs across various tasks using four popular models and observe significant improvements. Our findings suggest that in-context activation steering is a promising direction, opening new avenues for future research.
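
The mechanism the abstract sketches — steering a model through its activations rather than through demonstration tokens — can be made concrete with a small example. The snippet below is a minimal sketch of single-step activation steering, not the authors' Iterative Vectors implementation: it assumes a Hugging Face GPT-2 model as a stand-in, an arbitrarily chosen layer and scale (LAYER, ALPHA), and a toy sentiment prompt, and it extracts one few-shot-minus-zero-shot activation delta rather than iteratively refining it. Note that no backpropagation is involved at any point:

    # Minimal single-step activation-steering sketch -- NOT the paper's
    # exact Iterative Vectors method. Assumes torch + transformers, with
    # GPT-2 as a stand-in model; LAYER and ALPHA are illustrative guesses.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tok = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2").eval()

    LAYER, ALPHA = 6, 4.0  # hypothetical steering layer and strength

    @torch.no_grad()
    def block_output(prompt: str) -> torch.Tensor:
        """Last-token hidden state at block LAYER (forward pass only)."""
        ids = tok(prompt, return_tensors="pt").input_ids
        hs = model(ids, output_hidden_states=True).hidden_states
        return hs[LAYER + 1][0, -1]  # hs[0] is the embedding output

    # Stand-in "meta-gradient": few-shot minus zero-shot activation delta.
    few_shot = "great -> positive\nawful -> negative\nsuperb ->"
    zero_shot = "superb ->"
    steer = block_output(few_shot) - block_output(zero_shot)

    def add_steer(module, inputs, output):
        # Inject the scaled steering vector into the block's output.
        hidden = output[0] if isinstance(output, tuple) else output
        hidden = hidden + ALPHA * steer
        return (hidden,) + output[1:] if isinstance(output, tuple) else hidden

    handle = model.transformer.h[LAYER].register_forward_hook(add_steer)
    try:
        ids = tok(zero_shot, return_tensors="pt").input_ids
        out = model.generate(ids, max_new_tokens=3, do_sample=False,
                             pad_token_id=tok.eos_token_id)
        print(tok.decode(out[0]))
    finally:
        handle.remove()  # always detach the hook

Per the abstract, IVs differ from this one-shot delta by iteratively refining the extracted activation-based meta-gradients before applying them at inference, but the forward-pass-only extraction and injection shown here is the basic mechanism of in-context activation steering.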

Cite this Paper

BibTeX
@InProceedings{pmlr-v267-liu25j,
  title     = {Iterative Vectors: In-Context Gradient Steering without Backpropagation},
  author    = {Liu, Yiting and Deng, Zhi-Hong},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {38290--38312},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/liu25j/liu25j.pdf},
  url       = {https://proceedings.mlr.press/v267/liu25j.html}
}
Endnote
%0 Conference Paper
%T Iterative Vectors: In-Context Gradient Steering without Backpropagation
%A Yiting Liu
%A Zhi-Hong Deng
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-liu25j
%I PMLR
%P 38290--38312
%U https://proceedings.mlr.press/v267/liu25j.html
%V 267
APA
Liu, Y. & Deng, Z. (2025). Iterative Vectors: In-Context Gradient Steering without Backpropagation. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:38290-38312. Available from https://proceedings.mlr.press/v267/liu25j.html.
