The efficient processing of massive datasets has become critically important across a wide range of areas. Traditional methods, especially direct approaches, often face significant computational challenges due to the high dimensionality and sheer volume of such data. In response, iterative refinement methods based on randomized sampling and sketching have emerged as powerful tools for solving algorithmic tasks in large-scale data science. Beyond addressing scalability, iterative refinement allows for flexibility in algorithm design, adapting to the specifics of the problem such as the geometry of the underlying data or the type of noise. In this talk, I will discuss several recently proposed algorithms for regression problems with adversarial corruptions or challenging geometry.
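The abstract does not spell out the specific algorithms; as a point of reference, a classic example of the iterative, row-sampling approach it alludes to is the randomized Kaczmarz method for linear regression. The sketch below is illustrative only (the function name and parameters are my own), not the method presented in the talk: each step samples one row of the system, with probability proportional to its squared norm, and projects the current iterate onto that row's hyperplane.

```python
import numpy as np

def randomized_kaczmarz(A, b, iters=2000, seed=0):
    """Approximately solve a consistent linear system Ax = b.

    Each iteration samples row i with probability proportional to
    ||a_i||^2 and projects the iterate onto {x : a_i . x = b_i}.
    Illustrative sketch only, not the talk's algorithm.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    probs = np.sum(A**2, axis=1)
    probs /= probs.sum()
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        a_i = A[i]
        # Orthogonal projection onto the hyperplane defined by row i
        x += (b[i] - a_i @ x) / (a_i @ a_i) * a_i
    return x
```

Only one row of A is touched per iteration, which is what makes this family of methods attractive for data too large to factor directly; handling adversarial corruptions in b requires modified sampling or quantile-based update rules beyond this basic sketch.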
Based on joint work with Halyun Jeong, Deanna Needell, and Jackie Lok.