Bibliography

Set10

Burr Settles. 2010. Active Learning Literature Survey. Computer Sciences Technical Report 1648, University of Wisconsin–Madison.

LG94

David D. Lewis and William A. Gale. 1994. A sequential algorithm for training text classifiers. In Proceedings of the 17th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR'94), 3–12.

LUO05

Tong Luo, Kurt Kramer, Dmitry B. Goldgof, Lawrence O. Hall, Scott Samson, Andrew Remsen, and Thomas Hopkins. 2005. Active Learning to Recognize Multiple Types of Plankton. J. Mach. Learn. Res. 6, 589–613.

Set07

Burr Settles, Mark Craven, and Soumya Ray. 2007. Multiple-instance active learning. In Proceedings of the 20th International Conference on Neural Information Processing Systems (NIPS'07). Curran Associates Inc., Red Hook, NY, 1289–1296.

HOL08

Alex Holub, Pietro Perona, and Michael C. Burl. 2008. Entropy-based active learning for object recognition. In 2008 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, IEEE, 1–8.

ZLW17

Ye Zhang, Matthew Lease, and Byron C. Wallace. 2017. Active discriminative text representation learning. In Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence (AAAI’17). AAAI Press, 3386–3392.

HR18

Jeremy Howard and Sebastian Ruder. 2018. Universal Language Model Fine-tuning for Text Classification. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (ACL 2018), 328–339.

AZK+20

Jordan T. Ash, Chicheng Zhang, Akshay Krishnamurthy, John Langford, and Alekh Agarwal. 2020. Deep Batch Active Learning by Diverse, Uncertain Gradient Lower Bounds. In International Conference on Learning Representations (ICLR 2020).

YLB20

Michelle Yuan, Hsuan-Tien Lin, and Jordan Boyd-Graber. 2020. Cold-start Active Learning through Self-supervised Language Modeling. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP). Association for Computational Linguistics, 7935–7948.

SNP21

Christopher Schröder, Andreas Niekler, and Martin Potthast. 2021. Uncertainty-based Query Strategies for Active Learning with Transformers. arXiv preprint arXiv:2107.05687.