
Projects Combining Models Trained with Different Embeddings, a New View Step in Project Configuration, and More in Annotation Lab

Annotation Lab now supports projects that combine models trained with multiple embeddings for pre-annotation, as well as predefined Demo projects that can be imported with the click of a button for easy experimentation and feature testing. The Project Configuration page has a new “View” step for configuring the layout of the Labeling page. The release also includes stabilization work and fixes for bugs reported by our user community. Below are the details of what has been included in this release.

 

Features

 

Projects can reuse and combine models trained with different embeddings for pre-annotation

 

[Image: multiple embeddings support]

 

In previous versions, a project configuration could not combine models trained with different embeddings; the preview page showed an error when such models were added to the project configuration. This limitation has been removed: models with different embeddings can now be combined and deployed as part of the same pre-annotation server. The Customize Configuration page lists all added models together with their embeddings, which makes it easier to delete the labels of a specific model.
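Conceptually, a pre-annotation server running several models now just merges each model's predictions into one result set per task. A minimal Python sketch of that idea, assuming a hypothetical prediction format (the model names, label sets, and merge helper below are illustrative, not Annotation Lab's actual API):

```python
# Illustrative sketch: merging pre-annotations produced by two models trained
# with different embeddings into a single list of predictions for one task.
# All names and structures here are hypothetical examples.

def merge_preannotations(results_by_model):
    """Combine entity predictions from several models into one list,
    tagging each entity with the model that produced it."""
    merged = []
    for model_name, entities in results_by_model.items():
        for ent in entities:
            merged.append({**ent, "model": model_name})
    return merged

# Hypothetical output of two NER models deployed on the same server,
# each trained with different embeddings
results = {
    "ner_onto_100 (glove_100d)": [
        {"label": "PERSON", "start": 0, "end": 4, "text": "John"},
    ],
    "ner_clinical (embeddings_clinical)": [
        {"label": "PROBLEM", "start": 14, "end": 22, "text": "headache"},
    ],
}

merged = merge_preannotations(results)
for ent in merged:
    print(ent["label"], "from", ent["model"])
```

Keeping the source model attached to each entity mirrors the Customize Configuration page's per-model label listing, which is what makes deleting one model's labels straightforward.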

 

Demo Projects can be imported for experiments

[Image: import demo projects]

 

To let users access and experiment with already configured and populated projects, we have added the option to import predefined Demo projects. This helps users understand the various features offered by the Annotation Lab. Demo projects can be imported from the Import Project window by clicking the Import Demo Projects option. This imports four demo projects:

  • Open Source NER Model Onto100 + Relation Extraction;
  • Clinical NER + Assertion Status + Relation Extraction;
  • Recipes analysis for automatic extraction of ingredients, quantities, cooking time, and temperature;
  • Emotion detection classifier applied to Twitter data, showcasing classification results.

In addition, all these projects are assigned to a new demo_user account with the Annotator role, with tasks assigned to this user.

 

Visual Update of the Annotation Screen Layout from the View Tab

A new tab – “View” – has been added to the project setup wizard after the “Content Type” selection tab. This gives users the ability to set different layouts based on their needs and preferences.

[Image: View step]

 

Support for Granular License Scopes

This version brings support for more granular license scopes such as Healthcare: Inference, Healthcare: Training, OCR: Inference, and OCR: Training. This is in line with the latest developments of the John Snow Labs licenses.

Easy Reuse and Editing of Pre-annotations

For improved usability, when pre-annotations are available for a task, they are now shown by default on the labeling screen. Users can filter them by confidence score and then either accept the visible annotations as a new submitted completion or start editing them as part of a new completion. This simplifies the workflow, as users no longer have to manually navigate to the Versions tab and create a new completion based on the predictions.
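The confidence filter described above amounts to keeping only predictions above a chosen threshold. A minimal sketch, assuming a hypothetical prediction format and threshold value (not Annotation Lab's internal representation):

```python
# Illustrative sketch: filtering pre-annotations by confidence score before
# accepting them as a completion. Labels, texts, and the 0.6 threshold are
# made-up example values.

def filter_by_confidence(predictions, threshold):
    """Keep only predictions whose confidence meets the threshold."""
    return [p for p in predictions if p["confidence"] >= threshold]

predictions = [
    {"label": "DRUG", "text": "aspirin", "confidence": 0.97},
    {"label": "DOSAGE", "text": "100 mg", "confidence": 0.62},
    {"label": "DRUG", "text": "ibuprofen", "confidence": 0.41},
]

# Only the high-confidence predictions remain visible for acceptance
visible = filter_by_confidence(predictions, threshold=0.6)
print([p["text"] for p in visible])
```

Raising the threshold trades recall for precision: fewer pre-annotations survive, but those that do need less manual correction.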

 

[Image: modify pre-annotation result]

 

Easy Export of Large Visual NER Projects

For Visual NER projects, the size of a task is considerably larger than in Text projects. As a result, exporting the annotations for those projects was problematic, as the resulting archive can quickly exceed 500 MB. In older versions of the Annotation Lab this raised an error and the export failed. From version 4.2.0, users can export large NER/Visual NER projects bigger than 500 MB.

Miscellaneous

  • Smaller Project Tiles on the Projects Dashboard

    The project tiles were made more compact in this version to increase the number of project cards that can be displayed on the screen at one time.

 


 

  • Confusion Matrix in Training Logs for NER projects

    The training logs previously included only the ‘TPR’ metric, which is difficult to interpret, especially when the data is imbalanced or contains multiple class labels. With the addition of a confusion matrix, it is easier to understand the performance of the model and judge whether it is underfitting or overfitting.

 

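To see why a confusion matrix is more informative than a single TPR number on imbalanced, multi-class data, consider this small Python sketch (the labels and predictions are made-up example values, not real training output):

```python
# Illustrative sketch: building a confusion matrix from gold and predicted
# labels. Each cell [i][j] counts tokens whose gold label is labels[i] and
# whose predicted label is labels[j].
from collections import Counter

def confusion_matrix(gold, predicted, labels):
    """Count (gold, predicted) label pairs into a labels x labels matrix."""
    counts = Counter(zip(gold, predicted))
    return [[counts[(g, p)] for p in labels] for g in labels]

labels = ["PER", "ORG", "O"]
gold      = ["PER", "PER", "ORG", "O", "O", "O"]
predicted = ["PER", "O",   "ORG", "O", "O", "PER"]

matrix = confusion_matrix(gold, predicted, labels)
for lab, row in zip(labels, matrix):
    print(lab, row)
```

The off-diagonal cells immediately show which classes are confused with which (here, a missed PER and a spurious PER), information that a single aggregate rate hides.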

 

Get & Install It Here.

Full Feature Set Here.
