An illuminated 5G sign hangs behind a weave of electronic cables on the opening day of MWC Barcelona in Barcelona, Spain, on Monday, Feb. 25, 2019. At the wireless industry's biggest conference, over 100,000 people are set to see the latest innovations in smartphones, artificial intelligence devices and autonomous drones exhibited by more than 2,400 companies. Photographer: Angel Garcia/Bloomberg
5G is ushering in a new breed of genius networks built to handle the increased levels of complexity, prediction and real-time decision making required to deliver the performance gains promised not just in enhanced mobile broadband applications but also in IoT and mission-critical use cases. At the core of this evolutionary step is the use of machine learning algorithms.
The ability to be more dynamic with real-time network optimization capabilities such as resource loading, power budget balancing and interference detection is what made networks smart in the 4G era. 5G adds support for new antenna capabilities, high-density and heterogeneous network topologies, and uplink and downlink channel allocation and configuration based on payload type and application. While there are many uses of machine learning across all layers of a 5G network from the physical layer through to the application layer, the base station is emerging as a key application for machine learning.
A Nokia OYJ ultra-deployable 5G Massive MIMO millimeter wave antenna is displayed at the company's booth during the Mobile World Congress Americas event in Los Angeles, California, U.S., on Friday, Sept. 14, 2018. The conference features prominent executives representing mobile operators, device manufacturers, technology providers, vendors and content owners from across the world. Photographer: Patrick T. Fallon/Bloomberg
One of the hallmarks of a next generation 5G base station is the use of advanced antenna capabilities. These capabilities include, but are not limited to, massive multiple-input multiple-output (MIMO) antenna arrays, beamforming, and beam steering.
Massive MIMO is the use of antenna arrays with a large number of active elements. Depending on the frequency band in which it is deployed, massive MIMO designs can employ from 24 active antenna elements to as many as several hundred. One of the uses of MIMO in general is to be able to transmit and receive parallel and redundant streams of information to address errors introduced by interference. However, another use specific to massive MIMO is beamforming and in more advanced systems, beam steering. Beamforming is the ability to utilize a set of phased arrays to create a beam of energy that can be used to focus and extend signal transmission and reception to and from the base station to a particular mobile device. Beam steering is the ability to then control that beam to follow the device in a fully mobile environment within the coverage footprint of that antenna array. When massive MIMO is fully brought to bear and beamforming and beam steering optimally employed, network operators and consumers alike benefit from increased network capacity and expanded coverage through increased data streams, decreased interference, extended range and more optimized power efficiency.
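The core mechanism behind beamforming described above is simple to sketch: each element of a phased array applies a phase shift so that transmissions add coherently in one direction and cancel elsewhere. The following is a minimal illustration for a uniform linear array (the element count, spacing, and angles are illustrative assumptions, not parameters from any particular deployment):

```python
import numpy as np

def steering_vector(n_elements, spacing_wavelengths, angle_deg):
    """Per-element phase weights that point a uniform linear array
    toward angle_deg (0 degrees = broadside)."""
    angle = np.deg2rad(angle_deg)
    element_idx = np.arange(n_elements)
    # Phase progression so each element's contribution adds coherently
    # in the target direction.
    phase = 2 * np.pi * spacing_wavelengths * element_idx * np.sin(angle)
    return np.exp(1j * phase)

def array_gain(weights, n_elements, spacing_wavelengths, look_angle_deg):
    """Normalized gain of the weighted array toward look_angle_deg."""
    response = steering_vector(n_elements, spacing_wavelengths, look_angle_deg)
    return np.abs(np.vdot(weights, response)) / n_elements

# A hypothetical 64-element array with half-wavelength spacing,
# beam steered toward a device at 20 degrees.
w = steering_vector(64, 0.5, 20.0)
print(array_gain(w, 64, 0.5, 20.0))   # ~1.0: full gain on the steered beam
print(array_gain(w, 64, 0.5, -40.0))  # small: energy suppressed off-beam
```

The sharp contrast between on-beam and off-beam gain is exactly what yields the extended range and reduced interference the article describes; beam steering amounts to recomputing these weights as the device moves.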
But how does machine learning help with this? Imagine a race between a boat with 10 oars and a boat with 20 oars. The boat with 10 oars has a coxswain who not only coordinates rhythm but also makes real-time corrections to heading and cadence based not just on what is currently happening but also on what is predicted to happen further down the course. In contrast, the boat with 20 oars has a coxswain who cannot coordinate rhythm and only makes corrections based on general information about what has already occurred. Clearly the former will win the race, while the latter's oars are not only making minimal progress but in some cases are actually interfering with each other. The same is true of massive MIMO. To fully realize the benefits of massive MIMO, beamforming and beam steering, machine learning is being utilized at the base station to provide real-time and predictive analysis and modeling to better schedule, coordinate, configure and select which arrays to use and when.
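The predictive element of the rowing analogy can be sketched concretely: instead of selecting the beam that serves where a device is now, a simple motion model extrapolates where it will be and switches beams ahead of time. Everything below is a toy illustration (the eight-beam codebook, coverage angles, and first-order motion model are assumptions for the sketch, far simpler than a production scheduler):

```python
import numpy as np

N_BEAMS = 8  # hypothetical codebook of 8 fixed beams covering -60..60 degrees
BEAM_CENTERS = np.linspace(-60, 60, N_BEAMS)  # boresight angle of each beam

def best_beam(angle_deg):
    """Index of the codebook beam whose boresight is closest to the UE angle."""
    return int(np.argmin(np.abs(BEAM_CENTERS - angle_deg)))

def predict_beam(position_history, dt=1.0):
    """Extrapolate device motion one step ahead and pick the beam for the
    predicted position, so the array can switch before the device arrives."""
    p = np.asarray(position_history, dtype=float)
    velocity = (p[-1] - p[-2]) / dt          # simple first-order motion model
    predicted = p[-1] + velocity * dt        # position one step in the future
    angle = np.degrees(np.arctan2(predicted[0], predicted[1]))  # x across, y out
    return best_beam(angle)

# A device moving laterally across the cell: history of (x, y) metres.
history = [(-20.0, 100.0), (-10.0, 100.0), (0.0, 100.0)]
print(predict_beam(history))  # → 4, the beam covering roughly 6 degrees ahead
```

A real base station would feed far richer inputs (channel measurements, historical load, RF conditions) into a learned model rather than a linear extrapolation, but the structure is the same: predict first, then configure the array.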
A 5G K9 robot distributes hand sanitiser to a visitor in a shopping mall in Bangkok on June 4, 2020, as sectors of the economy reopen following restrictions to halt the spread of the COVID-19 novel coronavirus. (Photo by Mladen ANTONOV/AFP via Getty Images)
The new 5G network standard requires higher density deployments of smaller cells working with larger macro cells and multiple air interface protocols. The vision is for smaller cells to be designed for indoor locations or dense urban environments where GPS positioning is not always reliable and the radio frequency (RF) environment is far from predictable. Understanding the location of the devices interacting with the network is essential not only to application layer use cases but also to real-time network operation and optimization. It is therefore critical not only to accurately locate user equipment but also to track it as it moves within the coverage footprint.
To this end, machine learning is being applied to estimate user equipment location using RF data and triangulation techniques. While this is not a new concept, machine learning algorithms are yielding material improvements over previous means in accuracy, precision, and the viability of widespread use. This is even more significant in that these improvements are being achieved in an environment that is orders of magnitude more complex and dynamically variable than ever before.
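One common family of such techniques is RF fingerprinting: learn the mapping from received signal strength to position from known reference points, then invert it for new measurements. A minimal sketch using k-nearest-neighbour regression follows (the station layout, path-loss constants, and grid are invented for illustration; real systems use far richer channel features):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layout: three base stations at known (x, y) positions in metres.
STATIONS = np.array([[0.0, 0.0], [100.0, 0.0], [50.0, 90.0]])

def rssi(position, noise_db=0.0):
    """Log-distance path-loss model: received power falls with log distance."""
    d = np.linalg.norm(STATIONS - position, axis=1)
    return -40.0 - 30.0 * np.log10(d) + noise_db * rng.standard_normal(3)

# Offline phase: fingerprint a training grid of known locations.
grid = np.array([[x, y] for x in range(10, 100, 10) for y in range(10, 90, 10)],
                dtype=float)
fingerprints = np.array([rssi(p) for p in grid])

def locate(measurement, k=4):
    """k-nearest-neighbour regression in signal space: average the positions
    of the k fingerprints whose RSSI signature best matches the measurement."""
    dist = np.linalg.norm(fingerprints - measurement, axis=1)
    nearest = np.argsort(dist)[:k]
    return grid[nearest].mean(axis=0)

true_pos = np.array([42.0, 55.0])
estimate = locate(rssi(true_pos, noise_db=1.0))
print(estimate)  # error typically a few metres on this grid
```

The learned-fingerprint approach is attractive indoors precisely because it absorbs the unpredictable multipath environment into the training data rather than trying to model it analytically.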
One of the driving considerations for the development of 5G is to have one framework to address the varied and often conflicting requirements of three use cases: Enhanced Mobile Broadband (eMBB), massive IoT, and mission-critical applications.
Previously served by purpose-built, disparate networks, these use cases will now be supported by the 5G network architecture while continuing to require capabilities that are at odds with each other. Networks designed to support eMBB use cases must be optimized for high speed, low to medium latency, and profitable capacity. Massive IoT networks, on the other hand, need to be low cost and narrow bandwidth, with low control plane overhead and high reliability. Mission-critical networks, meanwhile, require high speed, low latency and high reliability.
In order to make this vision a reality, 5G has been designed for high variability and flexibility both in the control plane and in channel configuration. As such, it is essential that 5G networks have the ability to predict payload type and use case based on changing conditions, such as historical loading data, RF conditions, location and a wide range of other factors, in order to efficiently and dynamically configure and utilize 5G channel resources.
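The prediction step described above can be sketched as a traffic classifier that maps observed flow characteristics to one of the three use-case classes and then to a channel configuration. This is a toy nearest-centroid model; the feature values, centroids, and configuration table are illustrative assumptions (the subcarrier spacings are representative of 5G NR numerologies, not any operator's actual settings):

```python
import numpy as np

# Hypothetical traffic features: (mean packet size in bytes, packets per
# second, latency budget in ms). Centroids loosely reflect the three classes.
CLASSES = ["eMBB", "massive IoT", "mission critical"]
CENTROIDS = np.array([
    [1200.0, 800.0, 50.0],   # eMBB: large packets, high rate, relaxed latency
    [60.0,   0.5,   500.0],  # massive IoT: tiny, infrequent, delay tolerant
    [200.0,  200.0, 1.0],    # mission critical: moderate rate, ~1 ms budget
])

# Illustrative per-class channel configuration.
CONFIG = {
    "eMBB": {"subcarrier_spacing_khz": 30, "scheduling": "throughput"},
    "massive IoT": {"subcarrier_spacing_khz": 15, "scheduling": "power-save"},
    "mission critical": {"subcarrier_spacing_khz": 120, "scheduling": "preemptive"},
}

def classify(features):
    """Nearest-centroid prediction of the use case from observed traffic,
    with per-feature scaling so no single feature dominates the distance."""
    scale = CENTROIDS.max(axis=0)
    d = np.linalg.norm((CENTROIDS - features) / scale, axis=1)
    return CLASSES[int(np.argmin(d))]

flow = np.array([150.0, 250.0, 2.0])   # bursty, latency-sensitive traffic
label = classify(flow)
print(label, CONFIG[label])
```

A production network would train on historical loading data, RF conditions and location rather than hand-set centroids, but the pipeline shape is the same: classify the demand, then allocate and configure channel resources to match.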
Consequently, machine learning is being used not only to predict user equipment characteristics and capabilities, probable use case requirements, and RF conditions, but also, potentially, the type of content most likely to be requested, using edge caching techniques to bring that content closer to the end user. For example, historical trend data might show that, given a base station's proximity to a university and the current trending titles on Netflix or Disney+, specific movies should be staged closer to that base station at certain times of day to reduce network congestion, buffering, and latency. Similarly, a base station located close to an intersection that gets congested at certain times of the day might need to prioritize traffic and V2X sensor data to aid ADAS or autonomous driving applications.
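The movie-staging example amounts to forecasting per-title demand at a base station and pre-placing the likely winners in its edge cache. A minimal sketch of that loop, using an exponentially weighted forecast (the class, titles and cache size are invented for illustration; real content-delivery systems use far more sophisticated popularity models):

```python
from collections import defaultdict

class EdgeCachePlanner:
    """Rank titles by an exponentially weighted request forecast and
    pre-place the top-k at the base station's edge cache. A sketch of the
    trend-driven caching idea, not any vendor's implementation."""

    def __init__(self, cache_slots=3, alpha=0.5):
        self.cache_slots = cache_slots
        self.alpha = alpha               # weight on the most recent hour
        self.forecast = defaultdict(float)

    def observe(self, hourly_requests):
        """Update the per-title forecast from one hour of request counts."""
        for title, count in hourly_requests.items():
            prev = self.forecast[title]
            self.forecast[title] = self.alpha * count + (1 - self.alpha) * prev

    def plan(self):
        """Titles to push to the edge cache for the next hour."""
        ranked = sorted(self.forecast, key=self.forecast.get, reverse=True)
        return ranked[: self.cache_slots]

planner = EdgeCachePlanner(cache_slots=2)
planner.observe({"movie_a": 120, "movie_b": 40, "series_c": 90})
planner.observe({"movie_a": 60, "movie_b": 200, "series_c": 80})
print(planner.plan())  # → ['movie_b', 'series_c'], the highest-forecast titles
```

Note how the recency weighting lets a suddenly trending title (movie_b here) displace a steadier one, which is exactly the behavior needed when demand follows time-of-day patterns around a campus.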
As an industry, we are at a critical evolutionary point as 5G and machine learning combine to put us on a path toward generational leaps in network capability and efficiency, brought about by increasingly complex functionality and adaptability. But it is an evolution, not a revolution, and these are the very early days. These 5G machine learning applications are just the beginning of the potential that can be unleashed, not just at the physical layer enabled by the base station but through to the application layer, as these two foundational technologies are brought together and we enter the era of genius networks.
Read more from the original source:
5G And Machine Learning: Taking Cellular Base Stations From Smart To Genius - Forbes