
    Task Review Workflows in the Annotation Lab


    A new generation of the NLP Lab is now available: the Generative AI Lab. Check details here https://www.johnsnowlabs.com/nlp-lab/

    With the release of Annotation Lab v1.3.0, our users have access to new task management features. A project admin can now assign a task to one or more Annotators for labeling and to one or more Reviewers for double-checking the correctness of the Annotators' work. Reviewers can either accept the work or reject it; in both cases, they can leave feedback to help the Annotators improve. It is also possible to assign a Reviewer to many tasks in bulk and to filter tasks by Reviewer. Below each task name on the task list screen, the names of the assigned Annotators and Reviewers are displayed.

    [Screenshot: task list showing the assigned Annotators and Reviewers]

    When a Reviewer or Annotator logs into their account, only the tasks assigned to them are visible. This is useful for avoiding overlapping work and for helping each user focus on their own tasks.

    [Screenshot: task list filtered to the logged-in user's assignments]

    Inside the Labeling page, Reviewers will see an option to provide feedback on any “submitted” completions. If a task has no submitted completions, the Reviewer will see an empty completions section, since unsubmitted completions are not ready for review.

    Depending on their judgment, Reviewers can either approve the work or request changes, and they can also leave comments.

    [Screenshot: review options on the Labeling page]

    After the Reviewer submits a review, a green icon (approved) or a red icon (changes requested) appears in the completions area. Clicking the icon shows the details of the review.

    [Screenshot: review status icons in the completions area]

    Another major improvement in this release concerns task import, which is now controlled by a new overwrite checkbox on the import page. Until now, when a user imported a task that had the same task id as an existing task in Annotation Lab, the existing task was replaced. This case is now handled by the new checkbox.

    If the checkbox is checked and the data being imported already exists in the system (same task id and same input content), the existing task is updated rather than a new task being created.

    [Screenshot: import with the overwrite checkbox checked]

    If a duplicate task is imported and the checkbox is not checked, the user sees a warning message in the import status box.

    [Screenshot: import warning with the overwrite checkbox unchecked]
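
    For readers who script their task imports, the behavior described above boils down to a simple decision rule. The following is a minimal Python sketch of that rule, not Annotation Lab's actual implementation; the task structure (an "id" plus an input "data" payload) and the existing_tasks lookup are assumptions made for illustration.

    # Illustrative sketch of the overwrite checkbox behavior; not the real Annotation Lab code.
    # Assumption: each task is a dict with an "id" and an input "data" payload.
    def plan_import(task, existing_tasks, overwrite):
        existing = existing_tasks.get(task["id"])
        if existing is not None and existing["data"] == task["data"]:
            # Same task id and same input content already exist in the system.
            if overwrite:
                return "update the existing task"          # overwrite checked
            return "show warning in the import status box"  # overwrite unchecked
        return "import as a new task"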

    Previously, it was not possible to identify the version of the current Annotation Lab deployment from the UI. Starting with this release, the version text is displayed below the logo.

    [Screenshot: version text displayed below the logo]

    Since the previous release, only one completion per Annotator can be marked as ground truth at a time. The same behavior is now applied when importing tasks: only the latest ground-truth completion from each Annotator is kept marked as ground truth.
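
    In data terms, this amounts to keeping the ground-truth flag only on the most recent ground-truth completion from each Annotator. Below is a hypothetical Python sketch of that rule; the field names (created_by, created_at, ground_truth) are assumptions for illustration, not the actual task schema.

    # Keep only the latest ground-truth completion per Annotator (illustrative sketch;
    # created_by, created_at and ground_truth are assumed field names).
    def keep_latest_ground_truth(completions):
        latest = {}  # annotator -> most recent ground-truth completion
        for c in completions:
            if not c.get("ground_truth"):
                continue
            who = c["created_by"]
            if who not in latest or c["created_at"] > latest[who]["created_at"]:
                latest[who] = c
        for c in completions:
            # Clear the flag everywhere except on the latest one per Annotator.
            c["ground_truth"] = c is latest.get(c["created_by"])
        return completions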

    When a task is imported, it is automatically assigned to any Annotators whose completions are present in that task. The same applies to all tasks already present in the system when upgrading to this or a later version of Annotation Lab, so any older tasks that Annotators are working on are automatically assigned to, and therefore visible to, them. This saves Project Owners the trouble of assigning each task to the respective Annotators.
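
    The automatic assignment can be pictured as collecting the authors of the completions already stored in a task, roughly as in the sketch below (again illustrative; "completions", "created_by", and "assignees" are assumed field names, not the actual schema).

    # Assign the task to every Annotator who already has a completion in it.
    def auto_assign(task):
        task["assignees"] = sorted({c["created_by"] for c in task.get("completions", [])})
        return task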

    To learn more about the Annotation Lab and the available features, please visit the documentation page.

    Dia Trambitas is a computer scientist with a rich background in Natural Language Processing. She has a Ph.D. in Semantic Web from the University of Grenoble, France, where she worked on ways of describing spatial and temporal data using OWL ontologies and reasoning based on semantic annotations. She then shifted her interest to text processing and data extraction from unstructured documents, a subject she has been working on for the last 10 years. She has extensive experience working with different annotation tools and leading document classification and NER extraction projects in verticals such as Finance, Investment, Banking, and Healthcare.
