Predictive resource management using deep learning in next generation Passive Optical Networks.

dc.contributor.author: Hatem, John Abied Mitri
dc.contributor.department: Department of Computer Science
dc.contributor.faculty: Faculty of Arts and Sciences
dc.contributor.institution: American University of Beirut
dc.date.accessioned: 2020-03-27T18:42:58Z
dc.date.available: 2020-03-27T18:42:58Z
dc.date.issued: 2018
dc.date.submitted: 2018
dc.description: Thesis. M.S. American University of Beirut. Department of Computer Science, 2018. T:6904
dc.description: Advisor: Dr. Ahmad R. Dhaini, Assistant Professor, Computer Science; Committee members: Dr. Shady Elbassuoni, Assistant Professor, Computer Science; Dr. Haidar Safa, Professor, Computer Science; Dr. Fatima K. Abu Salem, Associate Professor, Computer Science.
dc.description: Includes bibliographical references (leaves 59-63)
dc.description.abstract: Over the last decade, the Passive Optical Network (PON) has emerged as the leading solution to the first-mile bottleneck, making it an ideal candidate for next-generation broadband access networks. Meanwhile, machine learning, and more specifically deep learning, has proven highly effective at complex classification and prediction problems. Recent advances in hardware and cloud technologies offer all the capabilities needed to employ deep learning to enhance PON performance. In PON systems, the Optical Line Terminal (OLT) allocates upstream bandwidth by polling the Optical Network Units (ONUs) in a cyclic manner with control messages, enabling Dynamic Bandwidth Allocation (DBA). In this thesis, we propose a novel DBA approach, called Deep-DBA, that employs deep learning to predict the bandwidth demand of end-users so that the overhead of PON's request-grant mechanism is reduced, thereby increasing bandwidth utilization. More specifically, we employ a Long Short-Term Memory recurrent neural network that predicts the bandwidth demands of ONUs for several future cycles by peep-holing only a few previous cycles. Consequently, the OLT does not need to poll the ONUs during the predicted cycles, reducing the overhead of control messages and idle times in the network. The gain achieved through Deep-DBA makes it possible to provision more users and/or services on the same network while ensuring fairness among ONUs and supporting quality of service. Extensive simulations highlight the merits of the new DBA approach and offer insights for this new line of research. Results show that with Deep-DBA, the control-message overhead and total overhead in the upstream direction are reduced by up to 70% compared to existing schemes.
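The prediction setup described in the abstract — forecasting several future cycles of per-ONU demand from only a few previous cycles — amounts to sliding-window sequence learning. A minimal sketch of the window construction follows; the function name, window sizes, and synthetic demand trace are illustrative assumptions, not taken from the thesis:

```python
import numpy as np

def make_windows(demand, n_in, n_out):
    """Slice a per-ONU demand trace into (input, target) windows.

    demand: 1-D array of bandwidth requests, one entry per polling cycle.
    n_in:   number of past cycles the predictor observes.
    n_out:  number of future cycles it must forecast (the cycles an OLT
            could skip polling under a Deep-DBA-style scheme).
    """
    X, y = [], []
    for t in range(len(demand) - n_in - n_out + 1):
        X.append(demand[t : t + n_in])           # observed history
        y.append(demand[t + n_in : t + n_in + n_out])  # cycles to predict
    return np.array(X), np.array(y)

# Synthetic demand trace for one ONU: 100 cycles (illustrative only).
rng = np.random.default_rng(0)
trace = 50 + 10 * np.sin(np.arange(100) / 5) + rng.normal(0, 2, 100)

X, y = make_windows(trace, n_in=5, n_out=3)
print(X.shape, y.shape)  # → (93, 5) (93, 3)
```

Pairs like these would then train a sequence model (an LSTM in the thesis) so that each prediction covers `n_out` polling cycles during which no request-grant exchange is needed.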
dc.format.extent: 1 online resource (x, 63 leaves) : illustrations (some color)
dc.identifier.other: b2312913x
dc.identifier.uri: http://hdl.handle.net/10938/21565
dc.language.iso: eng
dc.subject.classification: T:006904
dc.subject.lcsh: Optical communications.
dc.subject.lcsh: Machine learning.
dc.subject.lcsh: Artificial intelligence.
dc.title: Predictive resource management using deep learning in next generation Passive Optical Networks.
dc.type: Thesis

Files

Original bundle

Name: t-6904.pdf
Size: 4.18 MB
Format: Adobe Portable Document Format