In the ever-evolving world of Natural Language Processing, staying ahead of the curve is paramount. With the release of NLP Lab 5.3, we’re taking a leap forward by integrating one of the most advanced families of Large Language Models (LLMs) available today: OpenAI’s GPT models. This isn’t just another feature update; it’s a game-changer. Imagine having the capability to tap into the vast expertise of ChatGPT, broadening your prompt possibilities and refining your entity extraction process like never before. As we delve deeper into this exciting integration, you’ll discover how ChatGPT is set to redefine the way you approach pre-annotation tasks, offering unparalleled flexibility and ease of interaction. Join me on this journey to explore the transformative potential of ChatGPT prompts for entity recognition in NLP Lab 5.3.
Entity Extraction and Pre-Annotation via GPT Prompting
The highlight of this release is the integration with an external service provider, OpenAI, to expand and deepen the range of prompts available for pre-annotation (in addition to the Zero Shot entity and relation prompts already supported). This feature:
- Broadens Prompt Possibilities: By integrating with OpenAI’s LLMs, users can tap into a more diverse set of prompts, leveraging external expertise to craft pre-annotations, either as an alternative pre-annotation solution or when pretrained models are not available.
- Efficient Entity Extraction: Because current LLMs, the GPT family included, are not very good at entity recognition tasks, NLP Lab adds a post-processing step to the results returned by the LLM. This improves entity identification and helps precisely locate the entities in the given text. These entities, carefully curated and aligned with NLP Lab’s pre-annotation requirements, pave the way for a more efficient and streamlined annotation experience.
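The details of NLP Lab’s post-processing step are internal, but its core idea, mapping the entity strings an LLM returns back to exact character offsets in the source text, can be sketched roughly as follows. This is a minimal illustration under our own assumptions (function name, input format, and the decision to drop hallucinated entities are ours, not NLP Lab’s actual implementation):

```python
import re

def locate_entities(text, entity_spans):
    """Map entity strings returned by an LLM to character offsets in `text`.

    `entity_spans` is a list of (label, entity_string) pairs, e.g. the parsed
    output of a GPT prompt. Returns a list of annotation dicts with start/end
    offsets. Entity strings the model hallucinated (i.e. that do not occur
    verbatim in the text) are silently dropped.
    """
    annotations = []
    for label, chunk in entity_spans:
        # Case-insensitive literal search, to tolerate minor casing
        # differences between the LLM's answer and the source text.
        for match in re.finditer(re.escape(chunk), text, flags=re.IGNORECASE):
            annotations.append({
                "label": label,
                "text": text[match.start():match.end()],
                "start": match.start(),
                "end": match.end(),
            })
    return annotations

text = "Dr. Smith prescribed aspirin to the patient in Boston."
spans = [("DRUG", "aspirin"), ("CITY", "Boston"), ("DRUG", "ibuprofen")]
print(locate_entities(text, spans))  # "ibuprofen" is dropped: not in the text
```

Note that character-level offsets are exactly what annotation tools need for highlighting, which is why a plain LLM answer (a list of strings) is not enough on its own.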
The following sections explain in detail how to define and use GPT prompts.
Setting Up the Integration with Open AI service
Integrating “ChatGPT” into the NLP Lab has been designed to be a straightforward process, ensuring users can harness the power of external expertise seamlessly. It consists of three easy steps:
Integrations Page: Navigate to the Integrations Page located within the System Settings. This is the hub where all external service providers, including OpenAI’s GPT models, can be defined and managed.
Define the Service Provider: To initiate the integration, users are required to provide specific details:
- Service Provider Name: This is the identifier for the external service, which in this case would be “ChatGPT” or any other name you prefer to use.
- Secret Key: Every external service comes with a unique Secret Key that ensures secure communication between the platforms. Enter the Secret Key associated with your OpenAI subscription here. To ensure the integration process is error-free, users can validate the provided Secret Key directly within the form. This validation step confirms that the connection is secure and that the key is correct.
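Under the hood, validating a key amounts to making an authenticated call against the OpenAI API and checking the response. The sketch below shows one way such a check could work, using OpenAI’s public model-listing endpoint; the helper names are illustrative assumptions, not NLP Lab’s code:

```python
import urllib.request
import urllib.error

OPENAI_MODELS_URL = "https://api.openai.com/v1/models"

def build_validation_request(secret_key):
    """Build an authenticated GET request against OpenAI's /v1/models endpoint."""
    return urllib.request.Request(
        OPENAI_MODELS_URL,
        headers={"Authorization": f"Bearer {secret_key}"},
    )

def validate_secret_key(secret_key, timeout=10):
    """Return True if OpenAI accepts the key, False if it is rejected."""
    try:
        with urllib.request.urlopen(
            build_validation_request(secret_key), timeout=timeout
        ) as resp:
            return resp.status == 200
    except urllib.error.HTTPError as err:
        if err.code in (401, 403):  # invalid or unauthorized key
            return False
        raise  # rate limits or outages are not a verdict on the key itself
```

A cheap, read-only call like listing models is a common choice for key validation because it does not consume completion tokens.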
Project Association: Once a successful connection with “ChatGPT” (or any external LLM service provider) is established, it doesn’t end there. The integrated service will now be available for association with selected projects. This means users can decide which projects will benefit from the “ChatGPT” integration and enable it accordingly.
The OpenAI integration allows users to tap into a vast reservoir of external expertise, enhancing the depth and breadth of their projects. We’ve ensured that the integration process is as intuitive as possible, allowing users to focus on what truly matters: crafting refined and effective pre-annotations.
ChatGPT Prompt Definition and Testing
Users can generate LLM prompts on the dedicated Prompt page in the Hub of Resources. For ChatGPT Prompts, NLP Lab offers a dedicated definition interface. Here’s what to expect when creating a new LLM prompt:
– Name the Prompt: Within this new tab, users will first be asked to provide a name for their prompt. This name is used as the label for the entities the prompt pre-annotates. At this point, we recommend creating one prompt per target entity.
– Select the Service Provider: Next, users can choose the specific service provider they’ve previously set up via the Integrations Page.
– Test in Real-time: A standout feature is the ability to test ChatGPT prompts at creation time. As you craft your prompt, you can immediately see how it performs on some test data. This not only allows for immediate feedback but also ensures that the final prompt aligns perfectly with the user’s objectives.
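To make the steps above concrete, here is roughly what an entity-extraction prompt and its post-processing look like when calling the OpenAI API directly via the official `openai` Python SDK. This is a hedged sketch: the prompt wording, the JSON reply format, and the model name are our own illustrative assumptions, not NLP Lab’s internal prompt:

```python
import json

# Illustrative system prompt: ask the model to answer with JSON only.
SYSTEM_PROMPT = (
    "You are an entity extractor. Given a text, return a JSON array of "
    'objects like {"label": "DRUG", "text": "aspirin"}. Return JSON only.'
)

def parse_entities(model_output):
    """Parse the model's JSON reply, tolerating surrounding prose or code fences."""
    start = model_output.find("[")
    end = model_output.rfind("]")
    if start == -1 or end == -1:
        return []
    try:
        entities = json.loads(model_output[start:end + 1])
    except json.JSONDecodeError:
        return []
    # Keep only well-formed entity objects.
    return [e for e in entities
            if isinstance(e, dict) and "label" in e and "text" in e]

def extract_entities(client, text, model="gpt-4o-mini"):
    """Send the extraction prompt and parse the reply.

    `client` is an openai.OpenAI instance; the model name is an assumption.
    """
    reply = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": text},
        ],
    )
    return parse_entities(reply.choices[0].message.content)
```

The tolerant parsing in `parse_entities` matters in practice: chat models often wrap their answer in prose or Markdown fences even when asked for JSON only, which is one reason a post-processing step is needed before pre-annotation.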
This streamlined approach ensures that integrating and testing external prompts is as intuitive and efficient as possible.
Consistent Workflow with LLM Prompts
Even with the introduction of new features in NLP Lab’s 5.3 release, users can take comfort in the consistent experience offered when working with prompts. The addition of external service provider prompts brings a fresh layer to the annotation process, yet the core workflow you’re familiar with stays the same.
– Familiarity Amidst Innovation: Despite the new integrations, the process of using available prompts remains as straightforward as ever. Whether you’re working with traditional prompts or the newly introduced ones, the experience is smooth and consistent.
– Seamless Transition: Our commitment to user-centric design means that even as we innovate, we prioritize the ease of use you’ve come to expect. Transitioning to or incorporating external prompts is made effortless, with the interface and steps for prompt creation, selection, and integration remaining intuitive and unchanged.
With NLP Lab 5.3, you get the best of both worlds: exciting new features and the comfort of a familiar workflow.
Note: Pre-annotation of tasks using LLM Prompts does not require the deployment of the pre-annotation server. The pop-up to deploy the pre-annotation server is only shown if the project configuration includes both LLM prompts and Spark NLP models.
As we reflect on the introduction of ChatGPT integration in NLP Lab 5.3, it’s clear that this addition marks a significant step in the evolution of entity recognition. This integration brings together the strengths of both platforms, aiming to enhance the overall efficiency and precision of pre-annotation tasks. By incorporating ChatGPT’s capabilities, we’re offering our users an expanded toolkit to navigate the complexities of NLP. Our goal remains consistent: to equip users with reliable tools that can aid in their NLP projects. With the integration of ChatGPT, we’re optimistic about the potential improvements in entity recognition and look forward to seeing how our users leverage this new feature in their workflows.
Getting Started is Easy
The NLP Lab is a free text annotation tool that can be deployed in a couple of clicks on the AWS, Azure or OCI Marketplaces, or installed on-premise with a one-line Kubernetes script.
Get started here: https://nlp.johnsnowlabs.com/docs/en/alab/install