Intelligent Resource Management for Computation Offloading and Content Proactive Caching in Edge Computing

Publisher: Wen Tianming        Release Time: 2019-06-04

Speaker:    Prof. Jun Cai

Time:        10:30-11:30, June 25, 2019

Location:    SIST 1C-502

Host:       Prof. Yang Yang & Dr. Kunlun Wang

Abstract:

Nowadays, smart mobile devices play increasingly important roles in our daily lives for business, learning, and entertainment. However, many emerging mobile applications, such as face recognition, natural language processing, interactive gaming, and augmented reality, are typically computation-intensive and energy-consuming. To resolve the conflict between resource-hungry mobile applications and resource-limited mobile devices, edge computing has recently been proposed as a supplement to conventional remote public clouds. By bringing computing capabilities close to end users, it can satisfy the requirements of pervasive, prompt, and flexible computation and communication augmentation services. However, the limited computation and communication resources at the edge severely hinder its practical deployment, which calls for more efficient resource management mechanisms.

In this talk, the speaker will review the resource management issues involved in developing an effective and efficient edge computing system, together with a range of advanced techniques he has developed in recent years. Particular emphasis will be placed on computation task offloading (uplink) and proactive content caching (downlink), taking human/machine intelligence into consideration. After presenting the procedure, design challenges, and objectives of edge computing, the speaker will explain how the newly developed techniques can achieve high network efficiency (social welfare maximization), ensure the fulfillment of desired quality-of-service (QoS) requirements, and prevent untruthful strategic behaviors by smart mobile users. Three examples of his work, designed to support distributed decision making, delay-sensitive task offloading, and social-aware proactive caching, will be discussed in some depth.
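As a rough illustration of the kind of offloading decision discussed above, the sketch below compares local execution against offloading to an edge server using a weighted latency-plus-energy cost. It is a minimal, hypothetical example, not the speaker's actual formulation; all parameter names and values (CPU rates, power figures, uplink rate) are assumptions chosen for illustration only.

```python
# Minimal illustrative sketch (hypothetical, not the speaker's formulation):
# a mobile device decides whether to offload a computation task to an edge
# server by comparing a weighted latency/energy cost of local execution
# against the cost of uploading and executing the task at the edge.

from dataclasses import dataclass


@dataclass
class Task:
    cycles: float     # CPU cycles required to complete the task
    data_bits: float  # input data size to upload if offloaded


def local_cost(task, local_cpu_hz, local_power_w, w_time=0.5, w_energy=0.5):
    """Weighted latency/energy cost of executing the task on the device."""
    latency = task.cycles / local_cpu_hz
    energy = local_power_w * latency
    return w_time * latency + w_energy * energy


def offload_cost(task, uplink_bps, tx_power_w, edge_cpu_hz,
                 w_time=0.5, w_energy=0.5):
    """Weighted cost of uploading the task and executing it at the edge."""
    tx_latency = task.data_bits / uplink_bps
    exec_latency = task.cycles / edge_cpu_hz
    energy = tx_power_w * tx_latency  # device only pays energy for transmission
    return w_time * (tx_latency + exec_latency) + w_energy * energy


def decide(task, local_cpu_hz, local_power_w, uplink_bps, tx_power_w, edge_cpu_hz):
    """Return 'offload' if the edge option has lower weighted cost, else 'local'."""
    c_local = local_cost(task, local_cpu_hz, local_power_w)
    c_off = offload_cost(task, uplink_bps, tx_power_w, edge_cpu_hz)
    return "offload" if c_off < c_local else "local"


if __name__ == "__main__":
    # Hypothetical face-recognition task: 2 Gcycles of work, 1 MB of input data.
    task = Task(cycles=2e9, data_bits=8e6)
    choice = decide(task,
                    local_cpu_hz=1e9, local_power_w=0.9,  # modest mobile CPU
                    uplink_bps=20e6, tx_power_w=0.2,      # wireless uplink
                    edge_cpu_hz=10e9)                     # faster edge server
    print("Decision:", choice)
```

In the techniques previewed above, this per-device comparison would be replaced by mechanisms that coordinate many users, account for strategic behavior, and allocate shared edge resources; the sketch only conveys the basic latency/energy trade-off behind offloading.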

Bio:

Jun Cai received the Ph.D. degree in electrical engineering from the University of Waterloo, Canada, in 2004. From 2004 to 2006, he was a Natural Sciences and Engineering Research Council of Canada (NSERC) Postdoctoral Fellow at McMaster University, Canada. Between 2006 and 2018, he was with the Department of Electrical and Computer Engineering, University of Manitoba, Canada, where he was a Full Professor. In 2019, he joined Concordia University, Canada, as PERFORM Centre Research Chair and Full Professor. His current research interests include mechanism design, game theory, queueing theory, machine learning, and their applications in wireless communication networks, including edge computing, IoT, 5G and beyond, and e-health. He has published over 130 papers in top journals and international conferences. He received the Best Paper Award from Chinacom in 2013, the Rh Award for outstanding contributions to research in applied sciences from the University of Manitoba in 2012, and the Outstanding Service Award from IEEE Globecom in 2010. He has served as TPC Chair for many conferences, including the IEEE VTC-Fall 2019 Radio Access Technology and Heterogeneous Networks Track, IEEE GreenCom 2018, the IEEE CCECE 2017 Communications and Networks Track, the Chinacom 2013 Workshop on M2M Communications, the IEEE VTC-Fall 2012 Wireless Applications and Services Track, the IEEE Globecom 2010 Wireless Communications Symposium, and the IWCMC 2008 General Symposium. He has served on the Editorial Boards of Wireless Communications and Mobile Computing and IET Communications, and was the Chair of the IEEE Communications Society Chapter of the IEEE Winnipeg Section.


SIST-Seminar 18167