Verónica Álvarez will defend her doctoral thesis on Monday, 25 September

  • The defense will take place at the Faculty of Informatics in Donostia

Verónica Álvarez graduated in Mathematics from the Universidad de Salamanca in 2019 and obtained a master's degree in Mathematical Research from the Universidad Politécnica de Valencia in 2020.

She joined the Basque Center for Applied Mathematics - BCAM in July 2019. Her research interests include statistics, data science, and machine learning.

Her doctoral thesis, Supervised Learning in Time-dependent Environments with Performance Guarantees, has been supervised by Santiago Mazuelas (BCAM) and Jose Antonio Lozano, Scientific Director of BCAM.

The defense will take place at the Faculty of Informatics of the UPV/EHU in Donostia on Monday, 25 September at 11:00 a.m.

On behalf of all BCAM members, we would like to wish Verónica the best of luck in her thesis defense.

 

PhD Thesis Title:

Supervised Learning in Time-dependent Environments with Performance Guarantees

Abstract:

In practical scenarios, it is common to learn from a sequence of related problems (tasks). Such tasks are usually time-dependent in the sense that consecutive tasks are often significantly more similar to each other than tasks that are far apart in the sequence. Time-dependency is common in multiple applications such as load forecasting, spam mail filtering, and face emotion recognition. For instance, in the problem of load forecasting, the consumption patterns in consecutive time periods are significantly more similar because human habits and weather factors change gradually over time. Learning from a sequence of tasks holds promise to enable accurate performance even with few samples per task by leveraging information from different tasks. However, harnessing the benefits of learning from a sequence of tasks is challenging since the tasks are characterized by different underlying distributions.

Most existing techniques are designed for situations where the tasks’ similarities do not depend on their order in the sequence. Existing techniques designed for time-dependent tasks adapt to changes between consecutive tasks by accounting for a scalar rate of change through a carefully chosen parameter such as a learning rate or a weight factor. However, the tasks’ changes are commonly multidimensional, i.e., the time-dependency often varies across the different statistical characteristics describing the tasks. For instance, in the problem of load forecasting, the statistical characteristics related to weather factors often change differently from those related to generation.

In this dissertation, we establish methodologies for supervised learning from a sequence of time-dependent tasks that effectively exploit information from all tasks, provide multidimensional adaptation to the tasks’ changes, and offer computable, tight performance guarantees. We develop methods for supervised learning settings where tasks arrive over time, including techniques for supervised classification under concept drift (SCD) and techniques for continual learning (CL). In addition, we present techniques for load forecasting that can adapt to time changes in consumption patterns and assess intrinsic uncertainties in load demand. The numerical results show that the proposed methodologies can significantly improve the performance of existing methods on multiple benchmark datasets. This dissertation makes theoretical contributions leading to efficient algorithms for multiple machine learning scenarios that provide computable performance guarantees and performance superior to state-of-the-art techniques.
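The following is a minimal illustrative sketch, not code from the dissertation, of the contrast the abstract draws between scalar and multidimensional adaptation: an exponentially weighted estimate of each task's mean statistic is updated either with a single forgetting factor (one rate of change for everything) or with one factor per coordinate, so statistics that drift at different speeds can be tracked at different rates. All names, parameter values, and the synthetic drift pattern are hypothetical.

```python
import numpy as np


def track_mean(samples_per_task, forgetting):
    """Track the mean statistic of a sequence of tasks with exponential forgetting.

    `forgetting` may be a scalar (scalar rate of change, as in many existing
    techniques) or a vector with one rate per coordinate (multidimensional
    adaptation). Hypothetical illustration only.
    """
    estimate = None
    estimates = []
    for X in samples_per_task:          # X: (n_samples, n_features) for one task
        task_mean = X.mean(axis=0)
        if estimate is None:
            estimate = task_mean
        else:
            # coordinates with a larger forgetting factor adapt faster to change
            estimate = (1 - forgetting) * estimate + forgetting * task_mean
        estimates.append(np.array(estimate, copy=True))
    return np.array(estimates)


rng = np.random.default_rng(0)
# two statistics drifting at different speeds: the first changes fast, the second slowly
true_means = np.stack([np.linspace(0, 5, 20), np.linspace(0, 0.5, 20)], axis=1)
tasks = [rng.normal(m, 1.0, size=(30, 2)) for m in true_means]

scalar_est = track_mean(tasks, forgetting=0.3)                    # one rate for both coordinates
vector_est = track_mean(tasks, forgetting=np.array([0.8, 0.1]))   # per-coordinate rates

print("scalar-rate tracking error:   ", np.abs(scalar_est - true_means).mean())
print("per-coordinate tracking error:", np.abs(vector_est - true_means).mean())
```

In this toy setting, the per-coordinate factors can follow the fast-drifting statistic closely while averaging out noise on the slow one, which a single scalar rate must trade off; the dissertation's methods address this multidimensional adaptation with performance guarantees.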