References of Sharma, Shree Krishna
Towards Massive Machine Type Communications in Ultra-Dense Cellular IoT Networks: Current Issues and Machine Learning-Assisted Solutions
Sharma, Shree Krishna UL; Wang, Xianbin

in IEEE Communications Surveys and Tutorials (2019)

The ever-increasing number of resource-constrained Machine-Type Communication (MTC) devices is leading to the critical challenge of fulfilling diverse communication requirements in dynamic and ultra-dense wireless environments. Among the different application scenarios that the upcoming 5G and beyond cellular networks are expected to support, such as enhanced Mobile Broadband (eMBB), massive Machine Type Communications (mMTC) and Ultra-Reliable and Low Latency Communications (URLLC), mMTC brings the unique technical challenge of supporting a huge number of MTC devices in cellular networks, which is the main focus of this paper. The related challenges include Quality of Service (QoS) provisioning, handling highly dynamic and sporadic MTC traffic, huge signalling overhead and Radio Access Network (RAN) congestion. In this regard, this paper aims to identify and analyze the involved technical issues, to review recent advances, to highlight potential solutions and to propose new research directions. First, starting with an overview of mMTC features and QoS provisioning issues, we present the key enablers for mMTC in cellular networks. Along with the highlights on the inefficiency of the legacy Random Access (RA) procedure in the mMTC scenario, we then present the key features and channel access mechanisms in the emerging cellular IoT standards, namely, LTE-M and Narrowband IoT (NB-IoT). Subsequently, we present a framework for the performance analysis of transmission scheduling with QoS support along with the issues involved in short data packet transmission. Next, we provide a detailed overview of the existing and emerging solutions towards addressing the RAN congestion problem, and then identify potential advantages, challenges and use cases for the applications of emerging Machine Learning (ML) techniques in ultra-dense cellular networks. Out of several ML techniques, we focus on the application of the low-complexity Q-learning approach in the mMTC scenario along with the recent advances towards enhancing its learning performance and convergence. Finally, we discuss some open research challenges and promising future research directions.
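
As a rough, illustrative companion to the Q-learning discussion in this survey (not the paper's algorithm), the sketch below applies the standard stateless Q-learning update to random-access slot selection, with a reward of +1 for a collision-free slot and -1 otherwise; the device count, slot count and learning parameters are arbitrary assumptions.

```python
import random

N_DEVICES, N_SLOTS = 10, 15          # assumed device population and RA slots per frame
ALPHA, EPS, ROUNDS = 0.1, 0.1, 2000  # learning rate, exploration rate, RA frames

# One Q-value per (device, RA slot); each device learns independently.
Q = [[0.0] * N_SLOTS for _ in range(N_DEVICES)]

def pick_slot(q_row):
    """Epsilon-greedy slot choice for a single MTC device."""
    if random.random() < EPS:
        return random.randrange(N_SLOTS)
    return max(range(N_SLOTS), key=lambda s: q_row[s])

for _ in range(ROUNDS):
    choices = [pick_slot(Q[d]) for d in range(N_DEVICES)]
    for d, s in enumerate(choices):
        reward = 1.0 if choices.count(s) == 1 else -1.0   # collision-free vs collided
        Q[d][s] += ALPHA * (reward - Q[d][s])             # stateless Q-learning update

print("collided slots in final frame:",
      sum(1 for s in set(choices) if choices.count(s) > 1))
```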

Collaborative Distributed Q-Learning for RACH Congestion Minimization in Cellular IoT Networks
Sharma, Shree Krishna UL; Wang, Xianbin

in IEEE Communications Letters (2019), 23(4), 600-603

Due to infrequent and massive concurrent access requests from the ever-increasing number of machine-type communication (MTC) devices, the existing contention-based random access (RA) protocols, such as slotted ALOHA, suffer from the severe problem of random access channel (RACH) congestion in emerging cellular IoT networks. To address this issue, we propose a novel collaborative distributed Q-learning mechanism for the resource-constrained MTC devices in order to enable them to find unique RA slots for their transmissions so that the number of possible collisions can be significantly reduced. In contrast to the independent Q-learning scheme, the proposed approach utilizes the congestion level of RA slots as the global cost during the learning process and thus can notably lower the learning time for the low-end MTC devices. Our results show that the proposed learning scheme can significantly minimize the RACH congestion in cellular IoT networks.
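
A minimal sketch of the idea summarized above, assuming each device also observes a broadcast congestion level per RA slot and folds it into its Q-update as a global cost; the exact cost shaping and all parameter values here are illustrative assumptions rather than the letter's formulation.

```python
import random

N_DEVICES, N_SLOTS = 12, 16
ALPHA, EPS, ROUNDS = 0.1, 0.05, 1500

Q = [[0.0] * N_SLOTS for _ in range(N_DEVICES)]

def choose(q_row):
    if random.random() < EPS:
        return random.randrange(N_SLOTS)
    return max(range(N_SLOTS), key=lambda s: q_row[s])

for _ in range(ROUNDS):
    picks = [choose(Q[d]) for d in range(N_DEVICES)]
    load = [picks.count(s) for s in range(N_SLOTS)]        # devices per RA slot
    for d, s in enumerate(picks):
        local = 1.0 if load[s] == 1 else -1.0              # own collision outcome
        global_cost = (load[s] - 1) / N_DEVICES            # shared congestion level of the slot
        Q[d][s] += ALPHA * ((local - global_cost) - Q[d][s])

print("collided RA slots after learning:",
      sum(1 for s in range(N_SLOTS) if picks.count(s) > 1))
```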

Quantum Machine Learning for 6G Communication Networks: State-of-the-Art and Vision for the Future
Nawaz, Sayed Junaid; Sharma, Shree Krishna UL; Wyne, Shurjeel et al

in IEEE Access (2019)

The upcoming 5th Generation (5G) of wireless networks is expected to lay a foundation of intelligent networks with the provision of some isolated Artificial Intelligence (AI) operations. However, fully-intelligent network orchestration and management for providing innovative services will only be realized in Beyond 5G (B5G) networks. To this end, we envisage that the 6th Generation (6G) of wireless networks will be driven by on-demand self-reconfiguration to ensure a many-fold increase in the network performance and service types. The increasingly stringent performance requirements of emerging networks may finally trigger the deployment of some interesting new technologies such as large intelligent surfaces, electromagnetic-orbital angular momentum, visible light communications and cell-free communications, to name a few. Our vision for 6G is a massively connected complex network capable of rapidly responding to the users' service calls through real-time learning of the network state as described by the network-edge (e.g., base-station locations, cache contents, etc.), air interface (e.g., radio spectrum, propagation channel, etc.), and the user-side (e.g., battery-life, locations, etc.). The multi-state, multi-dimensional nature of the network state, requiring real-time knowledge, can be viewed as a quantum uncertainty problem. In this regard, the emerging paradigms of Machine Learning (ML), Quantum Computing (QC), and Quantum ML (QML) and their synergies with communication networks can be considered as core 6G enablers. Considering these potentials, starting with the 5G target services and enabling technologies, we provide a comprehensive review of the related state-of-the-art in the domains of ML (including deep learning), QC and QML, and identify their potential benefits, issues and use cases for their applications in the B5G networks. Subsequently, we propose a novel QC-assisted and QML-based framework for 6G communication networks while articulating its challenges and potential enabling technologies at the network-infrastructure, network-edge, air interface and user-end. Finally, some promising future research directions for the quantum- and QML-assisted B5G networks are identified and discussed.

Slicing based Resource Allocation for Multiplexing of eMBB and URLLC Services in 5G Wireless Networks
Korrai, Praveenkumar UL; Lagunas, Eva UL; Sharma, Shree Krishna UL et al

in Slicing based Resource Allocation for Multiplexing of eMBB and URLLC Services in 5G Wireless Networks (2019)

Margin-based Active Online Learning Techniques for Cooperative Spectrum Sharing in CR Networks
Korrai, Praveenkumar UL; Lagunas, Eva UL; Sharma, Shree Krishna UL et al

in International Conference on Cognitive Radio Oriented Wireless Networks (CROWNCOM), Poznan, Poland, June 2019 (2019)

Emerging Edge Computing Technologies for Distributed IoT Systems
Alnoman, Ali; Sharma, Shree Krishna UL; Ejaz, Waleed et al

in IEEE Network (2019)

The ever-increasing growth of connected smart devices and Internet of Things (IoT) verticals is leading to the crucial challenges of handling the massive amount of raw data generated by distributed IoT systems and providing timely feedback to the end-users. Although the existing cloud computing paradigm has an enormous amount of virtual computing power and storage capacity, it might not be able to satisfy delay-sensitive applications since computing tasks are usually processed at the distant cloud-servers. To this end, edge/fog computing has recently emerged as a new computing paradigm that helps to extend cloud functionalities to the network edge. Despite several benefits of edge computing including geo-distribution, mobility support and location awareness, various communication and computing related challenges need to be addressed for future IoT systems. In this regard, this paper provides a comprehensive view on the current issues encountered in distributed IoT systems and effective solutions by classifying them into three main categories, namely, radio and computing resource management, intelligent edge-IoT systems, and flexible infrastructure management. Furthermore, an optimization framework for edge-IoT systems is proposed by considering the key performance metrics including throughput, delay, resource utilization and energy consumption. Finally, a Machine Learning (ML) based case study is presented along with some numerical results to illustrate the significance of ML in edge-IoT computing.
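
The optimization framework itself is not reproduced here; as a toy illustration only, the snippet below makes a per-task edge-versus-cloud placement decision by comparing a weighted delay/energy cost under a simple linear model, with all weights, rates and constants being assumptions.

```python
W_DELAY, W_ENERGY = 0.7, 0.3                   # assumed weights on delay (s) and energy (J)

def cost(delay_s, energy_j):
    return W_DELAY * delay_s + W_ENERGY * energy_j

def place_task(bits, edge_cps, cloud_cps, uplink_bps,
               cycles_per_bit=1000, tx_power_w=0.5, edge_j_per_cycle=1e-9):
    """Return 'edge' or 'cloud' for one task under a simple linear delay/energy model."""
    cycles = bits * cycles_per_bit
    edge_delay = cycles / edge_cps                         # processed at the edge node
    edge_energy = edge_j_per_cycle * cycles
    tx_delay = bits / uplink_bps                           # upload before cloud processing
    cloud_delay = tx_delay + cycles / cloud_cps
    cloud_energy = tx_power_w * tx_delay                   # device radio energy during upload
    return "edge" if cost(edge_delay, edge_energy) <= cost(cloud_delay, cloud_energy) else "cloud"

print(place_task(bits=2e6, edge_cps=2e9, cloud_cps=20e9, uplink_bps=5e6))
```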

Device Grouping for Fast and Efficient Channel Access in IEEE 802.11ah based IoT Networks
Bhandari, Sabin; Sharma, Shree Krishna UL; Wang, Xianbin

in Proc. IEEE Int. Conf. Communications 2018 (2018, July)

The recent advances in Internet of Things (IoT) have led to numerous emerging applications ranging from eHealthcare to industrial control, which often demand stringent Quality of Service (QoS) requirements such as low latency and high system reliability. However, the ever-increasing number of connected devices in ultra-dense IoT networks and the dynamic traffic patterns increase the channel access delay and packet collision rate. In this regard, this paper proposes a sector-based device grouping scheme for fast and efficient channel access in IEEE 802.11ah based IoT networks such that the total number of connected devices within each sector is dramatically reduced. In the proposed framework, the Access Point (AP) divides its coverage area into different sectors, and then each sector is further divided into distinct groups based on the number of devices and their location information available from the cloud-center. Subsequently, individual groups within a sector are assigned to specific Random Access Window (RAW) slots, and the devices within distinct groups in different sectors access the allocated RAW slots by employing a spatial orthogonal access mechanism. The performance of the proposed sectorized device grouping scheme has been analyzed in terms of system delay and network throughput. Our simulation results show that the proposed scheme can significantly enhance the network throughput while simultaneously decreasing the system delay as compared to the conventional Distributed Coordination Function (DCF) and the IEEE 802.11ah grouping scheme.
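
As a schematic sketch of the grouping idea (not the paper's exact procedure), the code below bins devices into sectors by their angle around the AP, splits each sector into fixed-size groups, and maps every group to a RAW slot; the sector count, group size, slot count and random device locations are assumptions.

```python
import math, random
from collections import defaultdict

N_SECTORS, GROUP_SIZE, N_RAW_SLOTS = 4, 8, 16
random.seed(0)

def sector_of(x, y):
    """Sector index from the device angle around an AP placed at the origin."""
    angle = math.atan2(y, x) % (2 * math.pi)
    return int(angle / (2 * math.pi / N_SECTORS))

devices = [(random.uniform(-100, 100), random.uniform(-100, 100)) for _ in range(60)]

sectors = defaultdict(list)
for dev_id, (x, y) in enumerate(devices):
    sectors[sector_of(x, y)].append(dev_id)

# Split each sector into groups and assign every group a RAW slot (round-robin).
slot = 0
for sec in sorted(sectors):
    members = sectors[sec]
    groups = [members[i:i + GROUP_SIZE] for i in range(0, len(members), GROUP_SIZE)]
    for g_idx, group in enumerate(groups):
        print(f"sector {sec} group {g_idx} -> RAW slot {slot % N_RAW_SLOTS} ({len(group)} devices)")
        slot += 1
```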

Distributed Caching Enabled Peak Traffic Reduction in Ultra-Dense IoT Networks
Sharma, Shree Krishna UL; Wang, Xianbin

in IEEE Communications Letters (2018), 22(6), 1252-1255

The proliferation of massive machine-type communications devices and their random and intermittent transmissions have brought the new challenge of sporadic access-network congestion in ultra-dense Internet of Things (IoT) networks. To address this issue, we propose an innovative approach of peak traffic reduction within the access network by utilizing the distributed cache of IoT devices to coordinate their sporadic transmissions. The proposed technique is realized by employing a novel uplink transmission scheduling based on delay adaptation, in which distributed IoT devices adjust their transmission timings by utilizing embedded caching. An optimization problem is formulated for the minimization of peak data rate demand subject to delay tolerance levels, and is solved for the 3GPP-based traffic models by employing a gradient descent-based algorithm. Our results show that the proposed scheme can significantly reduce the peak data traffic in ultra-dense IoT networks.
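
The letter's optimization and gradient-descent solution are not reproduced here; the sketch below only conveys the delay-adaptation idea with a greedy stand-in, shifting each cached packet to the least-loaded uplink slot inside its delay-tolerance window, using made-up arrivals.

```python
import random

random.seed(0)
N_SLOTS = 20                       # scheduling horizon (uplink slots)
load = [0] * N_SLOTS               # aggregate traffic per slot after delay adaptation

# Each arrival: (arrival_slot, delay_tolerance_in_slots); one unit of traffic each.
arrivals = [(random.randrange(N_SLOTS), random.randint(0, 5)) for _ in range(200)]

for slot, tol in arrivals:
    window = range(slot, min(slot + tol, N_SLOTS - 1) + 1)
    best = min(window, key=lambda s: load[s])   # cache locally, transmit in the emptiest slot
    load[best] += 1

no_shift = [0] * N_SLOTS
for slot, _ in arrivals:
    no_shift[slot] += 1

print("peak load without delay adaptation:", max(no_shift))
print("peak load with delay adaptation   :", max(load))
```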

Cooperative sensing delay minimization in cloud-assisted DSA networks
Sharma, Shree Krishna UL; Wang, Xianbin

in Proc. IEEE PIMRC 2017 (2018, February 15)

Dynamic Spectrum Access (DSA) is considered as a promising solution to address the problem of spectrum scarcity in future wireless networks. However, the main challenges associated with this approach are to acquire accurate spectrum usage information in a timely manner and to deal with the dynamicity of channel occupancy. Although Cooperative Sensing (CS) can provide significant advantages over individual device-level sensing in terms of sensing efficiency and the achievable throughput, the acquired channel occupancy information may become outdated in dynamic channel conditions due to the involved latency. In this regard, we propose to utilize a collaborative cloud-edge processing framework to minimize the CS delay in DSA networks. In this framework, the cloud-center can estimate channel occupancy parameters such as duty cycle based on the available historical sensing data by using a suitable spectrum prediction technique, and subsequently this prior knowledge can be utilized to adapt the sensing mechanism employed at the edge-side of a DSA network. Motivated by this, we formulate and solve the problem of minimizing CS delay in cloud-assisted DSA networks. A two-stage bisection search method is employed to solve this CS delay minimization problem. Our results show that the proposed cloud-assisted CS scheme can significantly reduce the CS delay in DSA networks.
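
The two-stage method itself is not reproduced; as a loose illustration of the bisection idea, the snippet below searches for the smallest sensing time that meets a detection-probability target under a standard energy-detection approximation, with the sampling rate, SNR and target values being assumptions.

```python
from math import erfc, sqrt
from statistics import NormalDist

def qfunc(x):
    return 0.5 * erfc(x / sqrt(2))

def detection_prob(tau_s, fs_hz, snr_lin, pf_target):
    """Energy-detection P_d approximation for sensing duration tau_s."""
    n = tau_s * fs_hz
    thr = NormalDist().inv_cdf(1 - pf_target)              # Q^{-1}(P_f)
    return qfunc((thr - sqrt(n) * snr_lin) / sqrt(2 * snr_lin + 1))

def min_sensing_time(fs_hz=1e6, snr_lin=0.05, pf=0.1, pd_target=0.9,
                     lo=1e-6, hi=1e-1, iters=60):
    """Bisection over tau: the smallest sensing time meeting the P_d target."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if detection_prob(mid, fs_hz, snr_lin, pf) >= pd_target:
            hi = mid                                        # requirement met, try shorter
        else:
            lo = mid
    return hi

print(f"minimum sensing time ~ {min_sensing_time() * 1e3:.2f} ms")
```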

Latency Minimization in Wireless IoT Using Prioritized Channel Access and Data Aggregation
Bhandari, Sabin; Sharma, Shree Krishna UL; Wang, Xianbin

in Proc. IEEE Global Communications (GLOBECOM) Conf. 2018 (2018, January)

Future Internet of Things (IoT) networks are expected to support a massive number of heterogeneous devices/sensors in diverse applications ranging from eHealthcare to industrial control systems. In highly-dense deployment scenarios such as industrial IoT systems, providing reliable communication links with low latency becomes challenging due to the involved system delay including data acquisition and processing latencies at the edge-side of IoT networks. In this regard, this paper proposes a priority-based channel access and data aggregation scheme at the Cluster Head (CH) to reduce channel access and queuing delays in a clustered industrial IoT network. First, a prioritized channel access mechanism is developed by assigning different Medium Access Control (MAC) layer attributes to the packets coming from two types of IoT nodes, namely, high-priority and low-priority nodes, based on the application-specific information provided from the cloud-center. Subsequently, a preemptive M/G/1 queuing model is employed by using separate low-priority and high-priority queues before sending aggregated data to the Cloud. Our results show that the proposed priority-based method significantly improves the system latency and reliability as compared to the non-prioritized scheme.
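
As a numerical companion only, the snippet below evaluates the textbook mean response times of a preemptive-resume M/G/1 priority queue for two traffic classes with exponential service; the arrival and service rates are invented, and this is not the paper's model or parameterization.

```python
def preemptive_mg1_response_times(lam, es, es2):
    """Mean response time per class in a preemptive-resume M/G/1 priority queue.
    Classes are ordered from highest (index 0) to lowest priority.
    lam: arrival rates, es: mean service times, es2: second moments of service time."""
    rho = [l * s for l, s in zip(lam, es)]
    times, sigma_prev = [], 0.0
    for k in range(len(lam)):
        sigma_k = sigma_prev + rho[k]
        residual = sum(lam[i] * es2[i] for i in range(k + 1)) / 2.0
        times.append(es[k] / (1 - sigma_prev)
                     + residual / ((1 - sigma_prev) * (1 - sigma_k)))
        sigma_prev = sigma_k
    return times

# Two classes with exponential service (E[S^2] = 2 / mu^2); all rates are assumptions.
lam = [20.0, 30.0]                 # high-priority, low-priority packet arrivals per second
mu = [200.0, 100.0]                # service rates at the cluster head
es = [1 / m for m in mu]
es2 = [2 / m ** 2 for m in mu]
t_hi, t_lo = preemptive_mg1_response_times(lam, es, es2)
print(f"high-priority mean delay {t_hi * 1e3:.2f} ms, low-priority {t_lo * 1e3:.2f} ms")
```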

Satellite Communications in the 5G Era
Sharma, Shree Krishna UL; Chatzinotas, Symeon UL; Arapoglou, Pantelis-Daniel

Book published by IET (2018)

Dynamic Spectrum Sharing in 5G Wireless Networks With Full-Duplex Technology: Recent Advances and Research Challenges
Sharma, Shree Krishna UL; Bogale, Tadilo Endeshaw; Le, Long Bao et al

in IEEE Communications Surveys and Tutorials (2018), 20(1), 674-707

Full-duplex (FD) wireless technology enables a radio to transmit and receive on the same frequency band at the same time, and it is considered to be one of the candidate technologies for the fifth generation (5G) and beyond wireless communication systems due to its advantages, including potential doubling of the capacity and increased spectrum utilization efficiency. However, one of the main challenges of FD technology is the mitigation of strong self-interference (SI). Recent advances in different SI cancellation techniques, such as antenna cancellation, analog cancellation, and digital cancellation methods, have led to the feasibility of using FD technology in different wireless applications. Among potential applications, one important application area is dynamic spectrum sharing (DSS) in wireless systems, particularly 5G networks, where FD can provide several benefits and possibilities such as concurrent sensing and transmission (CST), concurrent transmission and reception, improved sensing efficiency and secondary throughput, and the mitigation of the hidden terminal problem. In this direction, first, starting with a detailed overview of FD-enabled DSS, we provide a comprehensive survey of recent advances in this domain. We then highlight several potential techniques for enabling FD operation in DSS wireless systems. Subsequently, we propose a novel communication framework to enable CST in DSS systems by employing a power control-based SI mitigation scheme and carry out the throughput performance analysis of this proposed framework. Finally, we discuss some open research issues and future directions with the objective of stimulating future research efforts in the emerging FD-enabled DSS wireless systems.
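
A tiny, hedged illustration of the kind of throughput evaluation such a framework entails: the achievable rate when residual self-interference (what remains after imperfect cancellation of the node's own transmission) is treated as additional noise; the link gains, noise floor and suppression figures are assumptions.

```python
from math import log2

def fd_rate(p_rx_desired_w, own_tx_power_w, noise_w, si_suppression_db):
    """Shannon rate (bit/s/Hz) with residual self-interference treated as noise."""
    residual_si = own_tx_power_w * 10 ** (-si_suppression_db / 10)   # leakage after SI cancellation
    return log2(1 + p_rx_desired_w / (noise_w + residual_si))

for supp_db in (70, 90, 110):      # increasing self-interference cancellation capability
    r = fd_rate(p_rx_desired_w=1e-7, own_tx_power_w=1.0, noise_w=1e-10, si_suppression_db=supp_db)
    print(f"{supp_db} dB SI suppression -> {r:.2f} bit/s/Hz")
```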

Simultaneous Wireless Information and Power Transfer (SWIPT): Recent Advances and Future Challenges
Perera, T. D. Ponnimbaduge; Jayakody, Dushantha Nalin; Sharma, Shree Krishna UL et al

in IEEE Communications Surveys and Tutorials (2018), 20(1), 264-302

Initial efforts on wireless power transfer (WPT) have concentrated toward long-distance transmission and high power applications. Nonetheless, the lower achievable transmission efficiency and potential health concerns arising due to high power applications have caused limitations in their further developments. Due to the tremendous energy consumption growth with ever-increasing connected devices, alternative wireless information and power transfer techniques have become important not only for theoretical research but also for operational cost savings and for the sustainable growth of wireless communications. In this regard, radio frequency energy harvesting (RF-EH) for a wireless communications system presents a new paradigm that allows wireless nodes to recharge their batteries from the RF signals instead of fixed power grids and the traditional energy sources. In this approach, the RF energy is harvested from ambient electromagnetic sources or from the sources that directionally transmit RF energy for EH purposes. Notable research activities and major advances have occurred over the last decade in this direction. Thus, this paper provides a comprehensive survey of the state-of-the-art techniques, based on advances and open issues presented by simultaneous wireless information and power transfer (SWIPT) and WPT assisted technologies. More specifically, in contrast to the existing works, this paper identifies and provides a detailed description of various potential emerging technologies for the fifth generation communications with SWIPT/WPT. Moreover, we provide some interesting research challenges and recommendations with the objective of stimulating future research in this emerging domain.
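
For orientation only, the snippet below evaluates the common power-splitting SWIPT receiver model, in which a fraction rho of the received RF power is harvested (with conversion efficiency eta) and the remainder is decoded; the received power, noise level, bandwidth and efficiency values are assumptions.

```python
from math import log2

def swipt_power_splitting(p_rx_w, rho, eta=0.5, noise_w=1e-9, bandwidth_hz=1e6):
    """Harvested power (W) and information rate (bit/s) for splitting ratio rho."""
    harvested_w = eta * rho * p_rx_w
    rate_bps = bandwidth_hz * log2(1 + (1 - rho) * p_rx_w / noise_w)
    return harvested_w, rate_bps

for rho in (0.2, 0.5, 0.8):        # more harvesting, less decoding as rho grows
    e, r = swipt_power_splitting(p_rx_w=1e-6, rho=rho)
    print(f"rho={rho:.1f}: harvested {e * 1e6:.2f} uW, rate {r / 1e6:.2f} Mbit/s")
```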

Location-aware and Superimposed-Pilot based Channel Estimation of Sparse HAP Radio Communication Channels
Nawaz, Sayed Junaid; Mansoor, Babar; Sharma, Shree Krishna UL et al

in Proc. IEEE VTC-Spring 2017 (2017, November 16)

A superimposed (arithmetically added) Pilot (SiP) sequence based channel estimation method for beamforming assisted multi-antenna High Altitude Platform (HAP) land mobile radio communication systems is proposed, which exploits the prior available information of users' spatial location, density of users, and beam-width of the HAP directional antenna. A thorough characterization of HAP sparse multipath radio propagation channels is presented in the first part of the paper, where the mathematical relationship of the HAP antenna beam-width with the channel's delay span and the optimal length of the SiP base sequence is presented. Further, a location-information-aided and low-power SiP sequence based Stage-wise Orthogonal Matching Pursuit (StOMP) algorithm is proposed for the estimation of channels from single-antenna user terminals to a beamforming assisted large-scale multiple-antenna HAP. A thorough analysis on the basis of the Normalized Channel Mean Square Error (NCMSE) and Bit Error Rate (BER) performance of the proposed method is presented, where the effects of the channels' sparsity level, Pilot-to-Information power Ratio (PIR), beam-width of the HAP's directional antenna, number of HAP antenna elements, density of interfering users, and spatial location of the active user terminal are thoroughly studied. A comparison of the proposed method with a notable reference technique available in the literature is also presented.
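
The paper's location-aware StOMP with superimposed pilots is not reproduced here; the sketch below only shows the generic orthogonal matching pursuit family it belongs to, recovering a sparse channel vector from a known random pilot matrix, with all dimensions, sparsity level and noise chosen arbitrarily.

```python
import numpy as np

rng = np.random.default_rng(0)

def omp(A, y, sparsity):
    """Generic orthogonal matching pursuit: estimate a sparse x from y = A x + noise."""
    residual, support = y.copy(), []
    x_hat = np.zeros(A.shape[1], dtype=complex)
    for _ in range(sparsity):
        idx = int(np.argmax(np.abs(A.conj().T @ residual)))   # column most correlated with residual
        if idx not in support:
            support.append(idx)
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coeffs
    x_hat[support] = coeffs
    return x_hat

M, N, K = 64, 128, 4                       # pilot length, channel taps, nonzero taps
A = (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2 * M)
h = np.zeros(N, dtype=complex)
h[rng.choice(N, K, replace=False)] = rng.standard_normal(K) + 1j * rng.standard_normal(K)
y = A @ h + 0.01 * (rng.standard_normal(M) + 1j * rng.standard_normal(M))

h_hat = omp(A, y, K)
print("NCMSE:", float(np.linalg.norm(h - h_hat) ** 2 / np.linalg.norm(h) ** 2))
```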

Relay Selection Strategies for SWIPT-Enabled Cooperative Wireless Systems
Gautam, Sumit UL; Lagunas, Eva UL; Sharma, Shree Krishna UL et al

in IEEE International Symposium on Personal, Indoor and Mobile Radio Communications (PIMRC), Montreal, Canada, Oct. 2017 (2017, October)

Multi-antenna based one-bit spatio-temporal wideband sensing for cognitive radio networks
Merlano Duncan, Juan Carlos UL; Sharma, Shree Krishna UL; Chatzinotas, Symeon UL et al

in Proceedings of IEEE International Conference on Communications (ICC) 2017 (2017, July 31)

Cognitive Radio (CR) communication has been considered as one of the promising technologies to enable dynamic spectrum sharing in the next generation of wireless networks. Among several possible enabling techniques, Spectrum Sensing (SS) is one of the key aspects for enabling opportunistic spectrum access in CR Networks (CRN). From practical perspectives, it is important to design a low-complexity wideband CR receiver having a low-resolution Analog-to-Digital Converter (ADC) working at a reasonable sampling rate. In this context, this paper proposes a novel spatio-temporal wideband SS technique by employing multiple antennas and one-bit quantization at the CR node, which subsequently enables the use of a reasonable sampling rate. In our analysis, we show that for the same sensing performance requirements, the proposed wideband receiver can have lower power consumption than the conventional CR receiver equipped with a single antenna and a high-resolution ADC. Furthermore, the proposed technique exploits the spatial dimension by estimating the direction of arrival of Primary User (PU) signals, which is not possible by the conventional SS methods and can be of a significant benefit in a CRN. Moreover, we evaluate the performance of the proposed technique and analyze the effects of one-bit quantization with the help of numerical results.
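
Not the paper's detector: purely to illustrate sensing from one-bit samples across antennas, the snippet sign-quantizes the I/Q samples at each antenna, forms the sample covariance of the quantized data, and compares its maximum/minimum eigenvalue ratio with and without a primary-user signal; the array size, arrival angle and SNR are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
N_ANT, N_SAMP = 4, 2000

def one_bit(x):
    """One-bit I/Q quantization: keep only the signs of the real and imaginary parts."""
    return np.sign(x.real) + 1j * np.sign(x.imag)

def max_min_eig_ratio(samples):
    R = samples @ samples.conj().T / samples.shape[1]       # sample covariance across antennas
    eig = np.linalg.eigvalsh(R)                             # ascending real eigenvalues
    return float(eig[-1] / eig[0])

noise = (rng.standard_normal((N_ANT, N_SAMP))
         + 1j * rng.standard_normal((N_ANT, N_SAMP))) / np.sqrt(2)
steering = np.exp(1j * np.pi * np.arange(N_ANT) * np.sin(np.deg2rad(30)))   # PU arriving at 30 deg
pu = np.outer(steering, (rng.standard_normal(N_SAMP)
                         + 1j * rng.standard_normal(N_SAMP)) / np.sqrt(2))

print("eigenvalue ratio, noise only :", round(max_min_eig_ratio(one_bit(noise)), 2))
print("eigenvalue ratio, PU present :", round(max_min_eig_ratio(one_bit(noise + pu)), 2))
```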

Impact of residual transceiver impairments on MMSE filtering performance of Rayleigh-product MIMO channels
Sharma, Shree Krishna UL; Papazafeiropoulos, Anastasios; Chatzinotas, Symeon UL et al

in Proc. IEEE International Workshop on Signal Processing Advances in Wireless Communications (SPAWC 2017) (2017, July)

Recent studies have demonstrated the presence of residual transceiver hardware impairments even after employing calibration and compensation techniques in different wireless systems. The effect of these impairments becomes more severe in the systems involving a large number of inexpensive Radio Frequency (RF) chains such as massive Multiple Input Multiple Output (MIMO) systems due to the requirement of cost-efficient implementation. However, most of the existing studies consider ideal transceivers without incorporating the effect of residual hardware impairments. In this regard, this paper studies the impact of additive residual transceiver hardware impairments on the Minimum Mean Square Error (MMSE) filtering performance of Rayleigh-Product (RP) MIMO channels. Using principles from Random Matrix Theory (RMT), the MMSE filtering performance of the RP channels is analyzed and a tight lower bound is derived by taking the effects of residual additive transceiver impairments into account. Moreover, some useful insights on the performance of the considered system with respect to various parameters such as the transmit Signal to Noise Ratio (SNR), the number of scatterers and the severity of impairments on both the transmit and receive sides are provided.
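
A Monte-Carlo style sketch under stated assumptions, not the paper's analytical bound: the Rayleigh-product channel is generated as H = H1 H2 / sqrt(S) with S scatterers, residual transmit-side distortion is modelled as extra coloured noise whose power scales with the per-stream transmit power via kappa, and the per-stream MMSE SINRs follow from the standard MMSE error matrix; the kappa and SNR values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)

def rayleigh_product(nr, nt, n_scat):
    """Double-scattering (Rayleigh-product) channel H = H1 H2 / sqrt(S)."""
    h1 = (rng.standard_normal((nr, n_scat)) + 1j * rng.standard_normal((nr, n_scat))) / np.sqrt(2)
    h2 = (rng.standard_normal((n_scat, nt)) + 1j * rng.standard_normal((n_scat, nt))) / np.sqrt(2)
    return h1 @ h2 / np.sqrt(n_scat)

def mmse_sum_rate(H, snr_lin, kappa):
    """Sum of per-stream MMSE rates with transmit distortion treated as coloured noise."""
    nr, nt = H.shape
    p = snr_lin / nt                                        # power per transmit stream
    C = np.eye(nr) + kappa ** 2 * p * (H @ H.conj().T)      # noise + forwarded distortion
    E = np.linalg.inv(np.eye(nt) + p * H.conj().T @ np.linalg.solve(C, H))
    sinr = 1.0 / np.real(np.diag(E)) - 1.0
    return float(np.sum(np.log2(1 + sinr)))

H = rayleigh_product(nr=4, nt=4, n_scat=3)
for kappa in (0.0, 0.05, 0.15):                             # increasing residual impairment level
    print(f"kappa = {kappa:.2f}: sum rate ~ {mmse_sum_rate(H, snr_lin=100.0, kappa=kappa):.2f} bit/s/Hz")
```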

Cloud-Assisted Device Clustering for Lifetime Prolongation in Wireless IoT Networks
Bhandari, Sabin; Sharma, Shree Krishna UL; Wang, Xianbin

in Proc. IEEE 30th Canadian Conference on Electrical and Computer Engineering (CCECE) 2017 (2017, June 15)

One of the crucial challenges in the recently emerging Internet of Things (IoT) applications is how to handle the massive heterogeneous data generated from a large number of resource-constrained sensors. In this context, cloud computing has emerged as a promising paradigm due to its enormous storage and computing capabilities, thus leading to the IoT-Cloud convergence. In such a framework, IoT devices can be grouped into several clusters and each cluster head can send the aggregated information to the cloud via a gateway for further processing. Although a number of clustering methods have been proposed for the conventional Wireless Sensor Networks (WSNs), it is important to consider specific IoT characteristics while adapting these techniques for wireless IoT networks. One of the important features of IoT networks that can be exploited while developing clustering techniques is the collaboration among heterogeneous IoT devices. In this regard, the network-wide knowledge at the cloud center can be useful to provide information about the device relations to the IoT gateway. Motivated by this, we propose and evaluate a cloud-assisted device interaction-aware clustering scheme for heterogeneous IoT networks. The proposed method considers the joint impact of residual energy and device closeness factor for the effective selection of cluster heads. Our results show that the proposed clustering scheme can significantly prolong the network lifetime, and enhance the overall throughput of a wireless IoT network.
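
Only a schematic of the selection rule sketched in the abstract: each candidate device is scored by a weighted sum of its normalized residual energy and a "closeness" factor, here taken as the normalized inverse mean distance to the other cluster members; the weights and the closeness definition are assumptions.

```python
import math, random

random.seed(3)
devices = [{"id": i,
            "x": random.uniform(0, 50), "y": random.uniform(0, 50),
            "energy_j": random.uniform(0.2, 1.0)} for i in range(10)]

W_ENERGY, W_CLOSE = 0.6, 0.4                      # assumed weights

def closeness(d, others):
    """Inverse of the mean distance from d to every other device in the cluster."""
    dists = [math.hypot(d["x"] - o["x"], d["y"] - o["y"]) for o in others if o is not d]
    return 1.0 / (sum(dists) / len(dists))

e_max = max(d["energy_j"] for d in devices)
close = {d["id"]: closeness(d, devices) for d in devices}
c_max = max(close.values())

scores = {d["id"]: W_ENERGY * d["energy_j"] / e_max + W_CLOSE * close[d["id"]] / c_max
          for d in devices}
cluster_head = max(scores, key=scores.get)
print("elected cluster head:", cluster_head, "score:", round(scores[cluster_head], 3))
```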

Live Data Analytics with Collaborative Edge and Cloud Processing in Wireless IoT Network
Sharma, Shree Krishna UL; Wang, Xianbin

in IEEE Access (2017)

Recently, big data analytics has received important attention in a variety of application domains including business, finance, space science, healthcare, telecommunication and Internet of Things (IoT). Among these areas, IoT is considered as an important platform in bringing people, processes, data and things/objects together in order to enhance the quality of our everyday lives. However, the key challenges are how to effectively extract useful features from the massive amount of heterogeneous data generated by resource-constrained IoT devices in order to provide real-time information and feedback to the end-users, and how to utilize this data-aware intelligence in enhancing the performance of wireless IoT networks. Although there are parallel advances in cloud computing and edge computing for addressing some issues in data analytics, they have their own benefits and limitations. The convergence of these two computing paradigms, i.e., the massive virtually shared pool of computing and storage resources from the cloud and the real-time data processing by edge computing, could effectively enable live data analytics in wireless IoT networks. In this regard, we propose a novel framework for coordinated processing between edge and cloud computing/processing by integrating advantages from both the platforms. The proposed framework can exploit the network-wide knowledge and historical information available at the cloud center to guide edge computing units towards satisfying various performance requirements of heterogeneous wireless IoT networks. Starting with the main features, key enablers and the challenges of big data analytics, we provide various synergies and distinctions between cloud and edge processing. More importantly, we identify and describe the potential key enablers for the proposed edge-cloud collaborative framework, the associated key challenges and some interesting future research directions.
