Machine Learning at the Wireless Edge
H. Vincent Poor is the Michael Henry Strater University Professor at Princeton University, where his interests include information theory, machine learning and network science, and their applications in wireless networks, energy systems, and related areas. His publications in these areas include the forthcoming book Machine Learning and Wireless Communications (Cambridge University Press). Dr. Poor is a member of the U.S. National Academy of Engineering and the U.S. National Academy of Sciences, and a foreign member of the Royal Society and other national and international academies. He received the IEEE Alexander Graham Bell Medal in 2017.
Wireless networks can be used as platforms for machine learning, taking advantage of the fact that data is often collected at the edges of the network, and also mitigating the latency and privacy concerns that backhauling data to the cloud can entail. This talk will present an overview of some results on distributed learning at the edges of wireless networks, in which machine learning algorithms interact with the physical limitations of the wireless medium. Two topics will be considered: federated learning, in which end-user devices interact with edge devices such as access points to implement joint learning algorithms; and decentralized learning, in which end-user devices learn by interacting in a peer-to-peer fashion without the benefit of an aggregating edge device. Open topics for future research will also be discussed briefly.
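Joint learning of the kind described above is commonly built on federated averaging: each device takes a few gradient steps on its private data, and an edge aggregator averages the resulting models. The abstract does not prescribe a specific algorithm; the following numpy sketch, with hypothetical data, learning rate, and round counts, only illustrates the basic pattern.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: five edge devices, each holding private
# linear-regression data generated from a common ground-truth model.
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(5):
    X = rng.normal(size=(40, 2))
    y = X @ true_w + 0.1 * rng.normal(size=40)
    clients.append((X, y))

def local_update(w, X, y, lr=0.1, epochs=5):
    """A few local gradient steps on one device's private data."""
    w = w.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

# Federated averaging: devices train locally on data that never leaves
# them; the edge aggregator only sees and averages model parameters.
w_global = np.zeros(2)
for _ in range(20):
    local_ws = [local_update(w_global, X, y) for X, y in clients]
    w_global = np.mean(local_ws, axis=0)
```

Replacing the server-side average with peer-to-peer averaging over a device graph gives the flavor of the decentralized setting mentioned in the abstract, where no aggregating edge device is available.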
Universal decoding - freeing ourselves from traditional error-correcting codes
Muriel Médard is the Cecil H. and Ida Green Professor in the Electrical Engineering and Computer Science (EECS) Department at MIT, where she leads the Network Coding and Reliable Communications Group in the Research Laboratory of Electronics at MIT. She obtained three bachelor’s degrees (EECS 1989, Mathematics 1989 and Humanities 1991), as well as her M.S. (1991) and Sc.D. (1995), all from MIT. She is a Member of the US National Academy of Engineering (elected 2020), a Fellow of the US National Academy of Inventors (elected 2018), a Member of the American Academy of Arts and Sciences (elected 2021), and a Fellow of the Institute of Electrical and Electronics Engineers (elected 2008). She holds an Honorary Doctorate from the Technical University of Munich (2020).
She was co-winner of the MIT 2004 Harold E. Egerton Faculty Achievement Award and was named a Gilbreth Lecturer by the US National Academy of Engineering in 2007. She received the 2022 IEEE Koji Kobayashi Computers and Communications Award, the 2017 IEEE Communications Society Edwin Howard Armstrong Achievement Award and the 2016 IEEE Vehicular Technology James Evans Avant Garde Award. She received the 2019 Best Paper award for IEEE Transactions on Network Science and Engineering, the 2018 ACM SIGCOMM Test of Time Paper Award, the 2009 IEEE Communication Society and Information Theory Society Joint Paper Award, the 2009 William R. Bennett Prize in the Field of Communications Networking, the 2002 IEEE Leon K. Kirchmayer Prize Paper Award, as well as eight conference paper awards. Most of her prize papers are co-authored with students from her group.
She has served as technical program committee co-chair of ISIT (twice), CoNEXT, WiOpt, WCNC, and of many workshops. She has chaired the IEEE Medals committee and served as member and chair of many committees, including as inaugural chair of the committee for the Millie Dresselhaus Medal. She was Editor-in-Chief of the IEEE Journal on Selected Areas in Communications and has served as editor or guest editor of many IEEE publications, including the IEEE Transactions on Information Theory, the IEEE Journal of Lightwave Technology, and the IEEE Transactions on Information Forensics and Security. She was a member of the inaugural steering committees for the IEEE Transactions on Network Science and Engineering and for the IEEE Journal on Selected Areas in Information Theory. She currently serves as the Editor-in-Chief of the IEEE Transactions on Information Theory. She was elected President of the IEEE Information Theory Society in 2012 and serves on its Board of Governors, on which she has served for eleven years.
Muriel received the inaugural 2013 MIT EECS Graduate Student Association Mentor Award, voted by the students. She set up the Women in the Information Theory Society (WithITS) and the Information Theory Society Mentoring Program, for which she was recognized with the 2017 Aaron Wyner Distinguished Service Award. She served as undergraduate Faculty in Residence for seven years in two MIT dormitories (2002–2007). She was elected by the faculty and served as a member and later chair of the MIT Faculty Committee on Student Life, and as inaugural chair of the MIT Faculty Committee on Campus Planning. She was chair of the Institute Committee on Student Life. She was recognized as a Siemens Outstanding Mentor (2004) for her work with high school students. She has served since 2015 on the Board of Trustees of the International School of Boston, for which she is treasurer.
She is the inventor of over seventy US and associated international patents, the vast majority of which have been licensed or acquired. For technology transfer, she has co-founded CodeOn, for which she consults, and Steinwurf, for which she is Chief Scientist.
Muriel has supervised over 40 master's students, over 20 doctoral students, and over 25 postdoctoral fellows.
Much of the recent debate on the adoption of 5G wireless technology has centered on the issue of standards. To maintain data integrity in the face of network unreliability, systems rely on error-correcting codes. System standardization is predicated on co-designing these error-correcting codes and, most importantly, their generally complex decoders, into efficient, dedicated and customized chips.
In this talk, we describe “Guessing Random Additive Noise Decoding,” or GRAND, by Duffy, Médard and their research groups, which renders universal, optimal, code-agnostic decoding possible for low to moderate redundancy settings. Moreover, recent work with Yazicigil and her group has demonstrated that such decoding can be implemented with extremely low latency in silicon. GRAND enables a new exploration of codes, in and of themselves, independently of tailored decoders, over a rich family of code designs, including random ones.
Surprisingly, even the simplest code constructions, such as those used merely for error checking, match or outperform state-of-the-art codes when optimally decoded with GRAND. Without the need for highly tailored codes and bespoke decoders, we can envisage using GRAND to avoid the issue of fixing code choices that 5G encountered, and instead to provide an open platform for coding and decoding.
Fundamental limits of deep generative neural networks
Helmut Bölcskei was born in Mödling, Austria on May 29, 1970, and received the Dipl.-Ing. and Dr. techn. degrees in electrical engineering from Vienna University of Technology, Vienna, Austria, in 1994 and 1997, respectively. In 1998 he was with Vienna University of Technology. From 1999 to 2001 he was a postdoctoral researcher in the Information Systems Laboratory, Department of Electrical Engineering, and in the Department of Statistics, Stanford University, Stanford, CA. He was in the founding team of Iospan Wireless Inc., a Silicon Valley-based startup company (acquired by Intel Corporation in 2002) specialized in multiple-input multiple-output (MIMO) wireless systems for high-speed Internet access, and was a co-founder of Celestrius AG, Zurich, Switzerland. From 2001 to 2002 he was an Assistant Professor of Electrical Engineering at the University of Illinois at Urbana-Champaign. He has been with ETH Zurich since 2002, where he is a Professor of Mathematical Information Science in the Department of Electrical Engineering, also associated with the Department of Mathematics. He was a visiting researcher at Philips Research Laboratories Eindhoven, The Netherlands, ENST Paris, France, and the Heinrich Hertz Institute Berlin, Germany. His research interests are in applied mathematics, machine learning theory, mathematical signal processing, data science, and statistics.
He received the 2001 IEEE Signal Processing Society Young Author Best Paper Award, the 2006 IEEE Communications Society Leonard G. Abraham Best Paper Award, the 2010 Vodafone Innovations Award, and the ETH "Golden Owl" Teaching Award. He is a Fellow of the IEEE and a 2011 EURASIP Fellow, was a Distinguished Lecturer (2013-2014) of the IEEE Information Theory Society and an Erwin Schrödinger Fellow (1999-2001) of the Austrian National Science Foundation (FWF), was included in the 2014 Thomson Reuters List of Highly Cited Researchers in Computer Science, was the 2016 Padovani Lecturer of the IEEE Information Theory Society, and received a 2021 Rothschild Fellowship from the Isaac Newton Institute for Mathematical Sciences, Cambridge University, UK. He served as an associate editor of the IEEE Transactions on Information Theory, the IEEE Transactions on Signal Processing, the IEEE Transactions on Wireless Communications, and the EURASIP Journal on Applied Signal Processing. He was Editor-in-Chief of the IEEE Transactions on Information Theory during the period 2010-2013 and served on the editorial boards of the IEEE Signal Processing Magazine, “Foundations and Trends in Communication and Information Theory”, and “Foundations and Trends in Networking”. He was TPC co-chair of the 2008 IEEE International Symposium on Information Theory and the 2016 IEEE Information Theory Workshop and served on the Board of Governors of the IEEE Information Theory Society. He has been a delegate for faculty appointments of the president of ETH Zurich since 2008.
Deep neural networks have been employed very successfully as generative models for complex natural data such as images and natural language. In practice, this is realized by training deep networks to transform simple low-dimensional distributions, such as uniform or Gaussian ones, into high-dimensional data distributions. The aim of this talk is to develop an understanding of the fundamental representational capabilities of deep generative neural networks. Specifically, we show that every d-dimensional probability distribution of bounded support can be generated through deep ReLU networks out of a 1-dimensional uniform input distribution. What is more, this is possible without incurring a cost, in terms of approximation error as measured in Wasserstein distance, relative to generating the d-dimensional target distribution from d independent random variables. This is enabled by a space-filling approach which elicits the importance of network depth in driving the Wasserstein distance between the target distribution and its neural network approximation to zero. Finally, we show that the number of bits needed to encode the corresponding generative networks equals the fundamental limit for encoding probability distributions as dictated by quantization theory.
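The claim that a single uniform input suffices rests on a classical measure-theoretic observation: the binary digits of one U[0,1] variable can be de-interleaved into several independent uniform variables. The numpy sketch below illustrates this observation at finite precision; it is only an illustration of the underlying idea, not the ReLU space-filling construction of the talk.

```python
import numpy as np

def split_uniform(z, bits=16):
    """De-interleave the first 2*bits binary digits of z ~ U[0,1] into
    two (finite-precision) independent uniform variables."""
    # Binary digits of z, most significant first.
    digits = ((z * 2.0 ** (2 * bits)).astype(np.int64)[:, None]
              >> np.arange(2 * bits - 1, -1, -1)) & 1
    weights = 2.0 ** -np.arange(1, bits + 1)
    x = digits[:, 0::2] @ weights  # digits 1, 3, 5, ...
    y = digits[:, 1::2] @ weights  # digits 2, 4, 6, ...
    return x, y

rng = np.random.default_rng(1)
z = rng.uniform(size=100_000)
x, y = split_uniform(z)
```

Empirically, both outputs have uniform marginals and are uncorrelated, consistent with one uniform source carrying the randomness of two. The talk's depth result can be read as the network analogue: deep ReLU networks can realize such digit-extracting, space-filling maps, which shallow networks cannot do efficiently.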