Title: Active learning under the Bernstein condition for general losses
Author: Shayestehmanesh, Hamid
Type: Thesis
Dates: 2020-09-01; 2020; 2020-08-31
URI: http://hdl.handle.net/1828/12075
Language: en
Rights: Available to the World Wide Web
Subjects: Machine Learning; Active Learning; Theoretical Machine Learning; Bernstein Condition

Abstract: We study online active learning under the Bernstein condition for bounded general losses and propose a method for online variance estimation. Our algorithm is based on IWAL (Importance Weighted Active Learning) and uses this online variance estimation technique to shrink the hypothesis set. We provide a fallback guarantee for the algorithm and prove that, when R(f*) is small, it converges faster than passive learning, where R(f*) denotes the risk of the best hypothesis in the hypothesis class. Finally, in the special case of zero-one loss, we achieve an exponential improvement in label complexity over passive learning.
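The abstract describes an IWAL-based learner that combines importance-weighted label queries with an online variance estimate to shrink the hypothesis set. The thesis itself is not reproduced in this record, so the Python sketch below only illustrates the general pattern under stated assumptions: the 1-D threshold hypothesis class, the zero-one loss, the disagreement-based query rule, the slack constants, and the names `iwal_sketch` and `RunningVariance` are all placeholder choices for illustration, not the thesis's actual algorithm or guarantees.

```python
"""Illustrative sketch of an IWAL-style active learner with an online
variance estimate.  All modelling choices here are placeholder assumptions,
not the algorithm analysed in the thesis."""
import math
import random


class RunningVariance:
    """Welford-style online mean/variance of observed quantities."""

    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def variance(self):
        return self.m2 / self.n if self.n > 0 else 0.0


def zero_one_loss(prediction, label):
    return 0.0 if prediction == label else 1.0


def iwal_sketch(stream, thresholds, p_min=0.1, slack=0.5):
    """Run an IWAL-style loop over `stream` of (x, y) pairs.

    thresholds: candidate 1-D threshold classifiers h(x) = sign(x - t).
    Returns surviving thresholds, importance-weighted cumulative losses,
    and the number of labels actually queried.
    """
    alive = list(thresholds)
    iw_risk = {t: 0.0 for t in alive}              # importance-weighted cumulative loss
    var_est = {t: RunningVariance() for t in alive}
    queries, seen = 0, 0

    for x, y in stream:
        seen += 1
        preds = {t: (1 if x >= t else -1) for t in alive}
        # Query with higher probability when surviving hypotheses disagree.
        disagreement = 1.0 if len(set(preds.values())) > 1 else 0.0
        p = max(p_min, disagreement)
        if random.random() < p:
            queries += 1
            for t in alive:
                # Importance weight 1/p keeps the risk estimate unbiased.
                loss = zero_one_loss(preds[t], y) / p
                iw_risk[t] += loss
                var_est[t].update(loss)
        # Prune hypotheses whose average estimated risk exceeds the best by an
        # empirical-Bernstein-style slack; lower observed variance means a
        # tighter bound (placeholder constants, for illustration only).
        best_t = min(alive, key=lambda t: iw_risk[t])
        best_avg = iw_risk[best_t] / seen
        kept = []
        for t in alive:
            bonus = slack * math.sqrt(var_est[t].variance() / seen) + slack / seen
            if iw_risk[t] / seen <= best_avg + bonus:
                kept.append(t)
        alive = kept

    return alive, iw_risk, queries


if __name__ == "__main__":
    random.seed(0)
    true_threshold = 0.3
    data = [(x, 1 if x >= true_threshold else -1)
            for x in (random.uniform(0, 1) for _ in range(500))]
    alive, risks, queried = iwal_sketch(data, thresholds=[i / 10 for i in range(11)])
    print("surviving thresholds:", alive)
    print("labels queried:", queried, "of", len(data))
```

In this sketch the variance-dependent slack is what lets low-risk, low-variance hypothesis sets shrink quickly, which is the intuition behind using an online variance estimate under a Bernstein-type condition; the actual thresholds, constants, and guarantees in the thesis will differ.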