Title: Data Compression and Large Language Models – University of Aveiro, PT
Abstract: The use of large language models (LLMs) has seen a meteoric rise since the introduction of the transformer neural network architecture. In one of its configurations, known as decoder-only, a transformer operates as a next-token (or next-symbol) predictor. Because prediction is a fundamental process in modern data compressors, it is natural to ask whether LLMs can be effective and efficient predictors for data compression systems. In this talk, we will present and discuss some ideas related to aspects of both data compressors and LLMs.
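The link between prediction and compression that the abstract alludes to can be sketched with a toy example: an adaptive predictor assigns a probability p to each symbol before seeing it, and an ideal entropy coder (e.g., an arithmetic coder) then spends about -log2(p) bits on that symbol. The function below is a hypothetical illustration using a simple Laplace-smoothed order-0 byte model, not the method presented in the talk; an LLM would play the role of a far more powerful predictor.

```python
import math
from collections import Counter

def predictive_code_length(data: bytes) -> float:
    """Ideal compressed size, in bits, achieved by an adaptive
    order-0 byte predictor paired with a perfect entropy coder:
    each symbol costs -log2(p) bits, where p is the probability
    the model assigned to it before observing it.
    (Toy illustration; a real compressor would use an arithmetic
    coder, and an LLM would supply much sharper probabilities.)"""
    counts = Counter()
    total_bits = 0.0
    for i, sym in enumerate(data):
        # Laplace-smoothed estimate from the symbols seen so far
        # (256 possible byte values, each with a pseudo-count of 1).
        p = (counts[sym] + 1) / (i + 256)
        total_bits += -math.log2(p)
        counts[sym] += 1
    return total_bits
```

As expected, highly predictable input costs far fewer bits than input the model cannot anticipate, which is exactly why a better predictor yields a better compressor.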
Short BIO: https://www.cienciavitae.pt/portal/6012-FBC9-2C32
Seminar in the scope of the 31st Portuguese Conference on Pattern Recognition (RECPAD2025, https://sites.google.com/view/recpad2025)
See more here: https://www.ua.pt/en/noticias/4/93247
