With the explosive growth of smart devices and the advent of many new applications, mobile traffic volume has been growing exponentially. The myriad technological advances proposed for 5G networks still mostly focus on capacity increase, which is constrained by the limited spectrum resources as well as the diminishing profits for operators, and will therefore always lag behind the growth of mobile traffic. To confront these challenges in network development and to support many emerging applications, such as AR/VR, IoT, eHealth, autonomous driving, and gaming, novel distributed architectures that bring network functions (such as computing and caching) and contents to the edge have emerged, namely mobile edge computing and caching. However, on the way towards efficient and intelligent computing and caching at the network edge, many open problems lie ahead. On the computing side, how to flexibly utilize the distributed computing resources at the network edge, such as mobile computing or fog computing, is of great significance. Moreover, what to offload to the edge nodes and when to offload also call for research attention. On the caching side, what, when, where, and how to cache contents to reduce the demand for radio resources are vital questions. Last but not least, how to efficiently integrate computing and caching at the edge nodes and exploit the synergy between them also requires breakthroughs.
This workshop aims to consolidate timely and solid work on the current state of the art, in terms of both fundamental research ideas and network engineering, geared towards exploiting intelligent caching and computing at the network edge. Topics of interest related to edge caching and computing include (but are not limited to):
- System modelling: computation modelling, content modelling, energy consumption modelling, etc.;
- Novel transmission technologies for learning-based applications at the network edge;
- Scheduling schemes for efficient training and inference in edge learning / edge AI;
- Timely data acquisition mechanisms to support delay-sensitive edge processing;
- Coded computing for edge intelligence;
- Enabling technologies: e.g., SDN, NFV, C-RAN, D2D, cloud/fog computing and networking, etc.;
- Emerging applications via edge intelligence: vehicular networking, massive IoT, smart grid, healthcare, intelligent manufacturing, etc.;
- Novel network architectures: convergence of computing, communications and caching, content/information-centric networking, cognitive computing and networking, big data analytics;
- Context-aware schemes: incentive mechanisms for computing and caching, pricing, game-theoretic approaches, network economics, etc., caching placement and delivery;
- Mobility management for mobile edge computing and proactive caching, and ways to exploit mobility for more computing and caching opportunities;
- Energy efficiency aspects: energy harvesting, energy storage, energy transfer, etc.;
- Security and privacy issues;
- Prototyping, test-beds, and field trials.
Paper Submission Deadline: June 30, 2019
Paper Acceptance Notification: August 15, 2019
Camera-Ready: September 15, 2019
Please follow the general guidelines of IEEE Globecom. The submission link can be found here.
Andreas Molisch, University of Southern California
General Co-Chairs
Luiz DaSilva, Trinity College Dublin, Ireland
Zhisheng Niu, Tsinghua University, China
Program Co-Chairs
Sheng Zhou, Tsinghua University, China
Zheng Chang, University of Jyväskylä, Finland
Publicity Co-Chairs
Jie Gong, Sun Yat-sen University, China
Jie Xu, University of Miami, U.S.
Zhiyuan Jiang, Shanghai University, China
Technical Program Members
TBD