Study on the Effect of Edge Computing on Latency in IoT Applications
DOI:
https://doi.org/10.37676/jki.v3i1.572

Keywords:
Study, Edge Computing, Latency, IoT Applications

Abstract
This article examines the effect of implementing edge computing on latency in Internet of Things (IoT) applications. Edge computing moves data processing closer to the data source to reduce latency and improve efficiency. This research compares the performance of IoT applications with and without edge computing. The results show a significant reduction in latency and an increase in application response speed.
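The comparison described in the abstract can be illustrated with a simple timing harness. The sketch below is not taken from the article; it is a minimal illustration, assuming hypothetical placeholder endpoints (edge-gateway.local for a nearby edge node and cloud.example.com for a remote cloud service). It measures mean round-trip time over repeated HTTP requests to each endpoint, which is one common way to quantify the kind of latency difference the study reports.

```python
# Minimal latency-comparison sketch (illustrative only).
# The endpoint URLs below are assumed placeholders, not addresses from the article;
# substitute the actual edge-node and cloud-service endpoints of a given deployment.
import time
import statistics
import urllib.request

EDGE_URL = "http://edge-gateway.local:8080/telemetry"   # assumed edge endpoint
CLOUD_URL = "https://cloud.example.com/api/telemetry"   # assumed cloud endpoint


def measure_latency(url: str, samples: int = 20) -> float:
    """Return the mean round-trip time in milliseconds over `samples` requests."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=5) as response:
            response.read()  # consume the body so the full response is timed
        times.append((time.perf_counter() - start) * 1000.0)
    return statistics.mean(times)


if __name__ == "__main__":
    edge_ms = measure_latency(EDGE_URL)
    cloud_ms = measure_latency(CLOUD_URL)
    print(f"Edge mean latency:  {edge_ms:.1f} ms")
    print(f"Cloud mean latency: {cloud_ms:.1f} ms")
    print(f"Relative reduction: {100.0 * (cloud_ms - edge_ms) / cloud_ms:.1f}%")
```

In practice, such measurements are usually repeated across different payload sizes and network conditions before drawing conclusions about response speed.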
License
Copyright (c) 2024 Vettyca Diana Saputri, Rizki Annisa Febriani
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.