Fig. 8 | Visual Computing for Industry, Biomedicine, and Art
From: Visual analytics tool for the interpretation of hidden states in recurrent neural networks

Examples of incorrectly classified sequences in a model trained for multi-class classification. The heatmap matrix helps identify which classes the model confuses and shows how the EP evolves over the course of training, leading to an inaccurate result. In the visualizations on the left, the prediction is highly uncertain across all five classes; this is visible in the class contributions, which are similarly transparent for every class. In the visualizations on the right, the model is fairly certain in classifying the sequence as acquisition (violet), although the correct class is earns (yellow). Underlying data source: Reuters [25]