Glossary

Learn about advanced telecom and cloud-based networking functions and technologies that are enabling the latest wave of innovations in the telecom industry.

AI model training starts with data. While the actual size of the dataset depends on the project, all machine learning projects require high-quality, well-annotated data to succeed. A basic rule of computer science applies: garbage in, garbage out. To start model training, the AI is given a set of training data and asked to make decisions based on that information. As it makes mistakes, the model can be adjusted to help the AI become more accurate.

Once the AI has completed basic training, it can begin the validation phase. In this phase, the model is validated against a new dataset, and adjustments are made depending on the results. Then comes the testing phase, a real-world test: the AI is given a dataset that does not include any tags or targets (the labels that helped it interpret the training data). If the model makes accurate decisions based on this unstructured information, it has passed its training phase. If not, the training process is repeated until the AI performs as expected.
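
As a minimal sketch of this train/validate/test loop (assuming scikit-learn; the dataset, split ratios, and model choice are illustrative, not prescribed by the entry above):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)  # stand-in for a labeled training dataset

# Hold out data for validation and for a final real-world-style test.
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.4, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)                                # basic training
print("validation accuracy:", model.score(X_val, y_val))  # adjust the model if this is poor
print("test accuracy:", model.score(X_test, y_test))      # final check on unseen data
```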

Classification systematically groups observations into categories, much as biologists categorize plants, animals, and other lifeforms into taxonomies. Classification is a supervised form of learning in which a machine learning model learns from data that humans have already labeled. The training set includes a fixed set of labels or categories for the computer to learn from. The machine can then classify new data into those predetermined categories by spotting patterns in the training data.
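
A small illustration of supervised classification (assuming scikit-learn; the features, labels, and category names below are invented for the example):

```python
from sklearn.tree import DecisionTreeClassifier

# Hypothetical labeled training set: [weight_kg, has_fur] -> category.
X_train = [[4.0, 1], [30.0, 1], [0.5, 0], [250.0, 0]]
y_train = ["cat", "dog", "bird", "reptile"]  # the fixed set of categories

clf = DecisionTreeClassifier().fit(X_train, y_train)

# The trained model assigns new, unlabeled observations to the learned categories.
print(clf.predict([[5.5, 1]]))  # predicted category for a new observation
```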

Clustering means organizing similar objects into groups within a machine learning algorithm. Clustering has many uses in data science, such as image processing, knowledge discovery in data, and unsupervised learning. Cluster analysis, or clustering, works by scanning an unlabeled dataset and measuring specific features of each data point. The analysis then places data points with matching features into the same group. Using clustering to break down large, intricate datasets can simplify complex data.

Clustering is a form of unsupervised learning, in contrast to classification, which is supervised. In clustering there are no training sets and no labels. Depending on which data characteristics matter, some data points will be similar to others; these groups are clusters. They tell us that the pieces of data are similar based on the parameters that were set.
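
For illustration, a minimal unsupervised clustering sketch (assuming scikit-learn's KMeans; the two-dimensional points are invented, and no labels are supplied):

```python
from sklearn.cluster import KMeans

# Unlabeled data points; no training labels are provided.
points = [[1.0, 1.1], [1.2, 0.9], [0.8, 1.0],   # one natural group
          [8.0, 8.2], [7.9, 8.1], [8.3, 7.8]]   # another natural group

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(points)
print(kmeans.labels_)           # cluster assignment for each point
print(kmeans.cluster_centers_)  # center of each discovered group
```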

Root cause analysis is the process of discovering the underlying causes of issues so they can be identified and resolved. Generally, when a network issue occurs, telecom operators analyze the list of alarms and drill down to plan recovery or maintenance actions. However, this process can take many minutes or hours from the first incident to a prepared resolution. To reduce the time spent solving problems and provide customers with seamless communications service, AI can perform root cause analysis automatically. AI can analyze massive amounts of data and quickly identify problems, quantify the impact on subscribers, and share root cause analysis data with the network operations team. This ensures that any incident has minimal impact on end users.
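
As a highly simplified sketch of automated alarm analysis (the alarm records and the "earliest alarm in a burst" heuristic are illustrative assumptions, not a production root cause algorithm):

```python
from datetime import datetime, timedelta

# Hypothetical alarm feed: (timestamp, network element, message).
alarms = [
    (datetime(2024, 1, 1, 3, 0, 0), "core-router-1", "link down"),
    (datetime(2024, 1, 1, 3, 0, 5), "bts-17", "backhaul unreachable"),
    (datetime(2024, 1, 1, 3, 0, 7), "bts-18", "backhaul unreachable"),
]

# Group alarms that occur within a short window of each other, then treat
# the earliest alarm in the group as the probable root cause.
window = timedelta(seconds=30)
alarms.sort(key=lambda a: a[0])
first = alarms[0]
incident = [a for a in alarms if a[0] - first[0] <= window]

print("probable root cause:", first[1], "-", first[2])
print("impacted elements:", [a[1] for a in incident[1:]])
```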

Anomaly alerts are alerts raised when machine learning detects that certain conditions have been reached, for example, when a key performance indicator crosses a quality threshold. If the conditions of the rule are met, an alert is created and the associated action is triggered. Alerts can also be triggered on metrics whose normal behavior is not known in advance or changes as the data changes: an anomaly alert traces the historical values of the metric and is triggered when the metric takes an abnormal value compared to its past.
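
A minimal sketch of both alert styles described above, using a fixed threshold rule and a simple deviation-from-history check (the KPI values and thresholds are invented; real systems use more robust anomaly models):

```python
from statistics import mean, stdev

history = [98.1, 97.9, 98.4, 98.0, 98.2, 97.8]  # past KPI values, e.g. call success %
latest = 91.3

# Static rule: alert when the KPI crosses a known quality threshold.
if latest < 95.0:
    print("threshold alert: KPI below 95%")

# Anomaly rule: alert when the KPI deviates strongly from its own history.
mu, sigma = mean(history), stdev(history)
if abs(latest - mu) > 3 * sigma:
    print(f"anomaly alert: {latest} is far from historical mean {mu:.1f}")
```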

In machine learning, a baseline model provides a baseline metric to be used as a reference point. It can be a straightforward rule-based approach, a basic statistical model, or a simple machine learning model with minimal complexity. It often represents the minimum level of performance achievable without advanced techniques. By comparing more sophisticated models against this baseline, their value and effectiveness can be evaluated.
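
A short sketch of baseline comparison (assuming scikit-learn; DummyClassifier simply predicts the most frequent class, giving the minimum-performance reference point described above):

```python
from sklearn.datasets import load_iris
from sklearn.dummy import DummyClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

baseline = DummyClassifier(strategy="most_frequent").fit(X_tr, y_tr)
model = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)

# A sophisticated model is only valuable if it clearly beats the baseline.
print("baseline accuracy:", baseline.score(X_te, y_te))
print("model accuracy:   ", model.score(X_te, y_te))
```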

Forecasting is a field of artificial intelligence (AI) used to predict the future based on past data. Artificial intelligence and machine learning (ML) are applied to time series data to estimate future developments. In the telecom industry, forecasting is used to predict subscriber behavior, estimate network capacity requirements, and anticipate changes in key performance indicators over time. Predictive analytics helps operators provide better services by using data and machine learning techniques to predict future results from historical data.
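
As a deliberately simple forecasting sketch, fitting a linear trend to past values and extrapolating it (the subscriber counts are invented; production forecasting would use richer time series models):

```python
import numpy as np

# Hypothetical monthly subscriber counts (thousands).
history = np.array([100, 104, 109, 113, 118, 124])
t = np.arange(len(history))

# Fit a straight-line trend to the past and project it forward.
slope, intercept = np.polyfit(t, history, 1)
future_t = np.arange(len(history), len(history) + 3)
forecast = slope * future_t + intercept

print("next 3 months:", np.round(forecast, 1))
```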

Time series data is a sequence of values over time. Usually, each point is a pair: the moment at which the metric was measured and the value of the metric at that moment. Once plotted, a time series shows how the value has behaved over time up to the last data point. Time series data is a record, not a forecast; however, it contains information that can help predict what to expect in the future. In the telecom industry, the key performance indicators (KPIs) best suited for analysis as time series data are those that are time-oriented and repeatedly sampled.
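
A time series is just an ordered sequence of (timestamp, value) pairs; here is a minimal sketch of one hourly-sampled KPI (the metric name and values are illustrative):

```python
from datetime import datetime

# Each point pairs the moment of measurement with the metric's value then.
throughput_mbps = [
    (datetime(2024, 1, 1, 0), 412.0),
    (datetime(2024, 1, 1, 1), 398.5),
    (datetime(2024, 1, 1, 2), 401.2),
    (datetime(2024, 1, 1, 3), 655.7),  # a record of history, not a forecast
]

for ts, value in throughput_mbps:
    print(ts.isoformat(), value)
```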

Anomaly detection is a data mining and analytics process that identifies data points deviating from a dataset’s normal behavior. Within datasets are patterns that represent standard behavior. An unexpected change within these patterns, or an event that does not conform to the expected pattern, is considered an anomaly. Machine learning is increasingly used to automate anomaly detection. In the telecom industry, detecting an anomaly can surface critical issues and save hours of manual work, helping operators proactively protect the customer experience.
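
A minimal anomaly detection sketch using a z-score rule (the data is invented; real deployments typically learn seasonal patterns rather than comparing against a single global mean):

```python
from statistics import mean, stdev

values = [10.2, 10.4, 9.9, 10.1, 10.3, 25.0, 10.0, 10.2]  # one obvious outlier

mu, sigma = mean(values), stdev(values)
# Flag points that deviate from the dataset's normal behavior.
anomalies = [v for v in values if abs(v - mu) > 2 * sigma]
print("anomalies:", anomalies)
```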

Artificial intelligence is the simulation of human intelligence processes by machines. Specific applications of AI include natural language processing, speech recognition, and machine vision. In the telecom industry, AI proactively improves the customer experience and ensures network performance by analyzing massive amounts of data to identify potential problems before they occur, enabling telecom operators to fix the issues and prevent outages.
