Many people consider Claude Shannon (see "A Mathematical Theory of Communication," Bell System Technical Journal, vol. 27, 1948) the founder of information theory.
In this introductory talk I tell the story of how his deep questions led to further work at Bell Labs by Slepian, Landau and Pollak in 1960–1964. In their work something magical happens: one finds a concrete second-order differential operator that commutes with a naturally arising integral one.
This has crucial numerical consequences: computing the eigenfunctions of a differential operator is substantially simpler than computing those of an integral operator. Because the two operators commute and each has a simple spectrum, they share the same eigenfunctions.
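The same phenomenon has a finite-dimensional analogue, due to Slepian's later work on discrete prolate spheroidal sequences: the N-by-N "sinc kernel" matrix commutes exactly with a simple tridiagonal matrix, so the cheaply computed eigenvectors of the tridiagonal matrix are also eigenvectors of the kernel. A minimal numerical sketch (the size N and bandwidth W below are illustrative choices, not values from the talk):

```python
import numpy as np

N, W = 8, 0.2  # illustrative matrix size and bandwidth

# Sinc-kernel matrix: S[m, n] = sin(2*pi*W*(m - n)) / (pi*(m - n)),
# with S[m, m] = 2*W on the diagonal (the limit as m -> n).
idx = np.arange(N)
diff = idx[:, None] - idx[None, :]
safe = np.where(diff == 0, 1, diff)  # avoid 0/0; diagonal is overwritten below
S = np.where(diff == 0, 2 * W, np.sin(2 * np.pi * W * diff) / (np.pi * safe))

# Tridiagonal matrix that commutes exactly with S (Slepian, 1978);
# diagonalizing it is far cheaper and better conditioned than attacking S.
T = np.zeros((N, N))
T[idx, idx] = ((N - 1) / 2 - idx) ** 2 * np.cos(2 * np.pi * W)
off = (idx[:-1] + 1) * (N - 1 - idx[:-1]) / 2
T[idx[:-1], idx[:-1] + 1] = off
T[idx[:-1] + 1, idx[:-1]] = off

# The commutator vanishes up to round-off.
comm_norm = np.linalg.norm(S @ T - T @ S)

# Shared eigenvectors: the eigenvector basis of T also diagonalizes S.
_, V = np.linalg.eigh(T)
M = V.T @ S @ V
off_diag_norm = np.linalg.norm(M - np.diag(np.diag(M)))
```

Since the eigenvalues of T are distinct here, its eigenvectors are forced to be eigenvectors of S as well, which is exactly the numerical payoff of the commutativity described above.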
In recent joint work with Alexei Zhedanov and Luc Vinet we extend this original result (which deals with Fourier analysis) to many other situations arising in signal processing, and we explore a number of open questions ranging from algebra to analysis, geometry and other parts of mathematics. This work has already found applications in medical imaging, geodesy and other fields.