Capitalised Terms in EVC-UFV V1.0 have the meaning defined in Table 1.
A dash “-” preceding a Term in Table 1 indicates the following readings, according to the font:
- Normal font: the dashed Term should be read after the non-dashed Term that precedes it. For example, “Risk” and “- Assessment” yield “Risk Assessment”.
- Italic font: the dashed Term should be read before the non-dashed Term that precedes it. For example, “Descriptor” and “- Financial” yield “Financial Descriptor”.
All MPAI-specified Terms are defined online.
Term | Definition |
--- | --- |
Activation Function | A mathematical function that determines whether a neuron should be activated, based on the input to the neuron (see sketch after Table 1).
Block | |
Data Augmentation | A technique that increases the size of the training dataset by adding new training examples obtained by altering some features of the original training data (see sketch after Table 1).
Densely Residual Laplacian | |
– Module | (DRLM) A set of RUs where each RU is followed by a Concatenation Layer. |
– Network | A Deep Learning Model that combines dense connections, residual learning, and Laplacian pyramids to enhance image restoration tasks like super-resolution and denoising. |
Dependency Graph | (DepGraph) A framework that simplifies the Structured Pruning operation of neural networks.
Dilation | The spacing between the elements of a convolutional filter; a Dilation greater than 1 enlarges the receptive field without adding Parameters (see sketch after Table 1).
Epoch | One complete cycle through all the Training data when training a Machine Learning Model (see sketch after Table 1).
Fine Tuning | The process of re-training a Model, previously trained on a dataset A, on a new dataset B.
Frame | |
– Video | An image drawn from the sequence of images composing a video.
Inference | The process of running a Model on an input to produce an output. |
Laplacian | |
– Attention Unit | (LC) A set of Convolutional Layers with a square filter size and a Dilation greater than or equal to the filter size.
– Pyramid | A multi-scale representation of an image in which each level stores the difference between the image at one Resolution and its Up-sampled lower-Resolution version (see sketch after Table 1).
Layer | A set of Parameters at a particular depth in a Neural Network.
– Concatenation | A Layer that joins the outputs of two or more preceding Layers along a given dimension.
– Convolutional | A Layer of a Neural Network Model that applies a convolutional filter to the input.
Learning | |
– Deep | A type of Machine Learning that uses artificial Neural Networks with many Layers to learn patterns from data. |
– Machine | A class of algorithms that enables computers to learn from data, thus enabling them to make predictions, called inferences, on new data.
– Rate | A value determining the step size taken at each iteration while moving toward a minimum of the Loss Function.
– Sparsity | A learning strategy that detects the most relevant features of a Model, within the set of all the Model features, for a particular learning task.
Loss Function | A mathematical function that measures the distance between the output of a Machine Learning Model and the actual value.
Model | |
– Deep Learning | An algorithm that is implemented with a multi-Layered Neural Network. |
– Machine Learning | An algorithm able to identify patterns or make predictions on datasets not experienced before. |
– Pre-trained | A Model that has been trained on a Dataset possibly different from the one on which the Model is to be used.
Neural Network | Also Artificial Neural Network, a set of interconnected data processing nodes whose connections are affected by Weights. |
Neuron | A data processing node in a Neural Network. |
Parameter | The multiplier of the input to a Neural Network neuron, learned via Training.
Patch | A square cut-out of a frame whose side is often a multiple of 2 (e.g., 8, 16, 32 pixels) (see sketch after Table 1).
Patience | In the ReduceLROnPlateau scheduler, the number of Epochs for which Training continues without improvement of the error metric before the Learning Rate is reduced (see sketch after Table 1).
Pre-training | A phase of Neural Network Model Training in which a Model is trained on an often generic dataset, allowing it to learn a more generic representation of the task.
Pruning | The process of removing less important parameters (such as weights or neurons) from a neural network to reduce its size and computational requirements, while preserving the model's performance (see sketch after Table 1).
– Group | A set of input and output components of Layers that must be pruned together.
– Growing Regularization | A technique that forces the model to drive a whole dimension toward low values before Pruning is applied, so that the removed dimension is already close to zero and its removal does not impact the model result.
– Learning-Based | A set of Pruning techniques that require variations of the learning process in order to be implemented.
– Recovery | A method that involves retraining a pruned neural network to regain any lost accuracy. |
– Structured | A method that removes entire components like neurons, filters, or channels, resulting in a smaller dense model architecture. |
– Unstructured | A method that removes individual redundant neurons or weights, creating a sparse model representation that, however, does not compute faster on common hardware.
Rectified Linear Unit | (ReLU) An Activation Function whose output equals the input if the input is positive, and zero otherwise (see sketch after Table 1).
Residual | |
– Block | A Block composed of concatenated DRLM modules, where each module is followed by a Concatenation Layer and a Convolutional Layer.
– Function | The function F(x) = H(x) - x that the Layers of a Residual Neural Network learn, where H(x) is the desired mapping and x is the Layer input.
– Neural Network | (ResNet) A Neural Network whose Layers learn Residual Functions with reference to the inputs to each Layer. |
– Unit | (RU) A set of alternating ReLU and Convolutional Layers (see sketch after Table 1).
Resolution | |
– Visual | The dimension in pixels, expressed as width × height (e.g., 1920×1080), indicating how many pixels make up an image or a video frame. |
Saliency Value | A value representing the ability of an image or a video frame to grab the attention of a human. |
Sampling | |
– Down- | The process of reducing the Visual Resolution. |
– Up- | The process of increasing the Visual Resolution. |
Super Resolution | A technique generating high-Resolution Visual Data from low-Resolution Visual Data (see sketch after Table 1).
Training | The process of letting a Model experience examples of the inputs that the Trained Model might encounter, of the outputs that it should produce, or both (see sketch after Table 1).
– Set | The dataset used to train a Model. |
Validation | The process of evaluating a Trained Model on a dataset (called Validation Set) that the Model has not experienced during Training.
– Score | The error of a Model on the Validation Set. |
– Set | The dataset used to check the performance of a Model, e.g., to decide when to stop the Training.
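
The sketches below are non-normative illustrations of some of the Terms in Table 1. This first one, a minimal sketch assuming PyTorch, shows the Rectified Linear Unit Activation Function: positive inputs pass through unchanged, all others map to zero.

```python
import torch

# ReLU: output equals the input where positive, zero elsewhere.
x = torch.tensor([-2.0, -0.5, 0.0, 1.5, 3.0])
print(torch.relu(x))  # -> 0.0, 0.0, 0.0, 1.5, 3.0
```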
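A minimal Data Augmentation sketch, assuming the torchvision transforms API; the particular transforms and parameters are illustrative choices, not prescribed by this document.

```python
import torch
from torchvision import transforms

# Each application yields a randomly altered copy of the input image,
# effectively enlarging the training dataset.
augment = transforms.Compose([
    transforms.RandomHorizontalFlip(p=0.5),
    transforms.RandomCrop(24),
    transforms.ColorJitter(brightness=0.2),
])
image = torch.rand(3, 32, 32)  # a dummy RGB image
print(augment(image).shape)  # torch.Size([3, 24, 24])
```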
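A sketch of Dilation in a Convolutional Layer, assuming PyTorch's nn.Conv2d: a 3x3 filter with Dilation 2 covers a 5x5 receptive field without adding Parameters.

```python
import torch
import torch.nn as nn

# dilation=2 inserts one-pixel gaps between the taps of the 3x3 kernel,
# enlarging the effective receptive field to 5x5; padding=2 preserves size.
conv = nn.Conv2d(in_channels=1, out_channels=1, kernel_size=3, dilation=2, padding=2)
x = torch.randn(1, 1, 16, 16)
print(conv(x).shape)  # torch.Size([1, 1, 16, 16])
```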
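A sketch of a Laplacian Pyramid built with plain bilinear re-sampling, assuming PyTorch; this is an approximation, as classical constructions use Gaussian filtering.

```python
import torch
import torch.nn.functional as F

def laplacian_pyramid(img, levels=3):
    """Each level stores the detail lost when the image is Down-sampled and
    Up-sampled back; the final entry is the coarsest image itself."""
    pyramid, current = [], img
    for _ in range(levels):
        down = F.interpolate(current, scale_factor=0.5, mode="bilinear", align_corners=False)
        up = F.interpolate(down, size=current.shape[-2:], mode="bilinear", align_corners=False)
        pyramid.append(current - up)  # difference = one Laplacian level
        current = down
    pyramid.append(current)
    return pyramid

print([p.shape for p in laplacian_pyramid(torch.randn(1, 3, 64, 64))])
# 64x64, 32x32, 16x16 detail levels, plus the 8x8 base image
```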
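A sketch of cutting a Video Frame into Patches, assuming PyTorch's Tensor.unfold; incomplete border Patches are simply dropped.

```python
import torch

frame = torch.randn(3, 1080, 1920)  # one RGB Video Frame
# Non-overlapping 32x32 Patches: unfold the height, then the width.
patches = frame.unfold(1, 32, 32).unfold(2, 32, 32)
print(patches.shape)  # torch.Size([3, 33, 60, 32, 32]): a 33x60 grid of Patches
```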
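A sketch of Patience with PyTorch's ReduceLROnPlateau scheduler; the constant dummy metric stands in for a real Validation Score and is an assumption of this example.

```python
import torch

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
# After `patience` Epochs without improvement of the monitored metric,
# the Learning Rate is multiplied by `factor`.
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.1, patience=5)

for epoch in range(20):
    val_loss = 1.0  # stagnating dummy metric; a real loop would use the Validation Set
    scheduler.step(val_loss)
    print(epoch, optimizer.param_groups[0]["lr"])  # lr drops once Patience is exhausted
```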
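A sketch contrasting Unstructured and Structured Pruning, using PyTorch's built-in torch.nn.utils.prune; the DepGraph framework mentioned in Table 1 is a separate library and is not used here.

```python
import torch
import torch.nn.utils.prune as prune

layer = torch.nn.Conv2d(8, 16, kernel_size=3)

# Unstructured: zero the 30% of individual weights with the smallest L1
# magnitude, yielding a sparse but same-shaped weight tensor.
prune.l1_unstructured(layer, name="weight", amount=0.3)

# Structured: remove half of the output channels (dim=0) by L2 norm,
# i.e., drop entire filters at once.
prune.ln_structured(layer, name="weight", amount=0.5, n=2, dim=0)

print(float((layer.weight == 0).float().mean()))  # fraction of zeroed weights
```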
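A sketch of a Residual Unit, assuming PyTorch; the exact arrangement of ReLU and Convolutional Layers varies across architectures, so this is one plausible instance rather than the normative definition.

```python
import torch
import torch.nn as nn

class ResidualUnit(nn.Module):
    """Alternating Convolutional and ReLU Layers with a skip connection,
    so the Layers learn a Residual Function F(x) that is added back to x."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, x):
        return x + self.body(x)  # output = x + F(x)

print(ResidualUnit(8)(torch.randn(1, 8, 16, 16)).shape)  # torch.Size([1, 8, 16, 16])
```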
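A sketch of Down-sampling and Up-sampling, assuming PyTorch's F.interpolate; a Super Resolution model would replace the fixed bicubic Up-sampling with a learned mapping.

```python
import torch
import torch.nn.functional as F

frame = torch.randn(1, 3, 1080, 1920)  # one RGB Video Frame, NCHW layout

# Down-sampling: halve the Visual Resolution.
low = F.interpolate(frame, scale_factor=0.5, mode="bicubic", align_corners=False)
# Up-sampling: return to the original Visual Resolution.
high = F.interpolate(low, scale_factor=2.0, mode="bicubic", align_corners=False)

print(low.shape, high.shape)  # (1, 3, 540, 960) and (1, 3, 1080, 1920)
```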
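A sketch of Training over several Epochs with a Loss Function and a Validation Set, assuming PyTorch; the toy random data and hyper-parameters are assumptions of this example.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy regression data split into a Training Set and a Validation Set.
x, y = torch.randn(200, 10), torch.randn(200, 1)
train_loader = DataLoader(TensorDataset(x[:160], y[:160]), batch_size=32)
val_loader = DataLoader(TensorDataset(x[160:], y[160:]), batch_size=32)

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = torch.nn.MSELoss()  # the Loss Function

for epoch in range(5):  # one Epoch = one full pass over the Training Set
    model.train()
    for xb, yb in train_loader:
        optimizer.zero_grad()
        loss_fn(model(xb), yb).backward()
        optimizer.step()
    model.eval()
    with torch.no_grad():  # Validation data never drives weight updates
        val_score = sum(loss_fn(model(xb), yb).item() for xb, yb in val_loader)
    print(f"epoch {epoch}: Validation Score {val_score:.3f}")
```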