We are thrilled to announce the release of NLP Lab version 5.5, marking another significant step forward in its Natural Language Processing capabilities. This update introduces integration with the Azure OpenAI Service, which provides organizations with access to OpenAI’s models plus enterprise-grade security, compliance, and governance features that are crucial for large organizations. This integration covers not only the already available NER pre-annotation and synthetic task generation but also a new feature: Prompt-based Text Classification with GPT models.
Integration with Azure OpenAI Service
NLP Lab 5.5 offers Azure OpenAI Service integration support, expanding its pre-annotation and synthetic data generation capabilities. This new feature allows users to connect with a broader array of external service providers, providing an additional pre-annotation method in the absence of specific pre-trained models.
Begin by registering the external service provider with NLP Lab. Currently, NLP Lab supports Azure OpenAI and OpenAI as service providers. Here’s a quick guide for this:
- On your NLP Lab instance, go to the “Settings” page and then click on the “Integrations” tab.
- Click the “Add” button, and fill in the required fields: “Service Provider Name” and “Secret Key.”
- Click “Validate” to ensure a successful connection between the service provider and NLP Lab.
- Click “Integrate.”
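Under the hood, the “Validate” step presumably issues a lightweight probe against the provider’s endpoint. A minimal sketch of what such a check could assemble for Azure OpenAI is shown below; the helper name, deployment name, and probe message are illustrative assumptions, while the URL shape follows Azure OpenAI’s public REST API:

```python
# Sketch only: NLP Lab performs this validation internally when you click
# "Validate". Field names follow Azure OpenAI's REST API; everything else
# (function name, deployment, placeholder key) is an illustrative assumption.

def build_validation_request(endpoint: str, api_key: str, deployment: str,
                             api_version: str = "2023-05-15") -> dict:
    """Assemble a minimal chat-completion request used to probe the connection."""
    if not endpoint.startswith("https://"):
        raise ValueError("Azure OpenAI endpoints must use HTTPS")
    url = (f"{endpoint.rstrip('/')}/openai/deployments/{deployment}"
           f"/chat/completions?api-version={api_version}")
    return {
        "url": url,
        "headers": {"api-key": api_key, "Content-Type": "application/json"},
        "body": {"messages": [{"role": "user", "content": "ping"}],
                 "max_tokens": 1},
    }

req = build_validation_request("https://myorg.openai.azure.com",
                               "sk-...", "gpt-35-turbo")
```

A failed probe (wrong key, unreachable endpoint) is what surfaces as a validation error in the UI before the provider can be integrated.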
Creating and Testing NER Prompts:
After integrating the external service provider, you can create and test the NER prompt.
- Navigate to “Hub” and then to the “Prompts” page.
- Click the “+ Add Prompt” button.
- Select the “Create external LLM Prompt” tab.
- Provide a name for the prompt, select the entity from the type dropdown, and choose the desired service provider from the service provider dropdown.
- Enter the appropriate prompt in the “Prompt” text box.
- Before saving the prompt, test it by entering text in the “Test Prompt” text box and clicking on the “Test” button.
- Experiment with different temperature values during testing to ensure the desired results.
- After testing and confirming that the prompt is working as expected, click on the “Save” button to save the prompt.
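To make the test step concrete, here is a sketch of what an NER prompt and its round trip might look like. The prompt wording, the expected JSON reply format, and the parsing helper are illustrative assumptions; NLP Lab handles this exchange internally when you click “Test”:

```python
import json

# Sketch: an entity-extraction prompt and the conversion of the model's
# reply into labelled spans. All names and formats here are assumptions,
# not NLP Lab internals.

NER_PROMPT = (
    "Extract every DRUG entity from the text below. "
    "Reply with a JSON list of objects with 'text', 'start' and 'end' keys.\n\n"
    "Text: {document}"
)

def parse_entities(model_reply: str, label: str = "DRUG") -> list:
    """Turn the model's JSON reply into labelled spans for pre-annotation."""
    spans = json.loads(model_reply)
    return [{"label": label, **span} for span in spans]

# Example reply a GPT model might return for "Patient was given aspirin."
reply = '[{"text": "aspirin", "start": 18, "end": 25}]'
entities = parse_entities(reply)
```

Iterating on the prompt text until replies parse cleanly into spans is exactly what the “Test Prompt” box lets you do before saving.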
Using GPT Prompts in a Project
NER prompts can be used in a project just like regular Zero Shot prompts. You can find the prompt under the “Prompt” tab or the “Reuse Resource Page” in the project configuration. Add and use it for pre-annotation, similar to Zero-Shot prompts, rules, or models.
Generate synthetic tasks using Azure OpenAI
In addition to creating new prompts, Azure OpenAI can also be used to generate synthetic tasks. Here’s a quick guide:
Setting up and Validating the New Service Provider:
- From the task page, click on the “Import” button and navigate to the “Generate Synthetic Task” page.
- Provide an appropriate prompt in the “Write Prompt” text box and click on the settings icon located on the right side of the page.
- Enter the API endpoint URL and secret key, then click “Validate.”
- After validating the connection, set the desired temperature and the number of tasks to generate.
- Click on the “Generate” button to create synthetic tasks.
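Conceptually, generation turns one LLM completion into several importable tasks. The sketch below shows one plausible way to do that; the request fields follow the OpenAI-style chat API, but the splitting rule and helper names are assumptions, not NLP Lab’s actual implementation:

```python
# Sketch: one completion, many tasks. The one-example-per-line convention
# is an illustrative assumption about how a synthetic-task generator
# could work.

def build_generation_request(prompt: str, n_tasks: int,
                             temperature: float) -> dict:
    return {
        "messages": [{"role": "user",
                      "content": f"{prompt}\nProduce {n_tasks} examples, "
                                 "one per line."}],
        "temperature": temperature,
    }

def split_into_tasks(completion: str) -> list:
    """One generated line becomes one importable annotation task."""
    return [{"text": line.strip()}
            for line in completion.splitlines() if line.strip()]

completion = "The patient denied chest pain.\nBP was 120/80 on admission."
tasks = split_into_tasks(completion)
```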
It’s important to note that the user interface experience remains unchanged for import and pre-annotation. The application continues to provide the same user-friendly flow and experience as it always has.
Adjustable Temperature Parameter for External LLM Prompts
In this version, users can fine-tune the diversity of their output when creating GPT prompts through a “temperature” parameter: a higher temperature yields more varied responses, while a lower temperature produces more predictable results. While creating GPT LLM prompts, users can experiment with different temperature settings to see how they affect the output, and can later edit the prompt and update the temperature as needed. This adjustability lets users strike the right balance between creativity and consistency for their specific use case.
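The temperature setting maps directly onto a single field of the OpenAI-style request payload. A minimal sketch, where the field name and its 0–2 range follow the public OpenAI chat-completions API but the helper itself is illustrative:

```python
# Sketch: the only field that changes between a conservative and a
# creative run is "temperature". OpenAI-style APIs accept 0 to 2;
# the helper name is an illustrative assumption.

def completion_request(prompt: str, temperature: float) -> dict:
    if not 0.0 <= temperature <= 2.0:
        raise ValueError("temperature must be between 0 and 2")
    return {"messages": [{"role": "user", "content": prompt}],
            "temperature": temperature}

predictable = completion_request("Extract all DATE entities ...",
                                 temperature=0.0)
diverse = completion_request("Extract all DATE entities ...",
                             temperature=1.2)
```

For pre-annotation work, values near 0 are usually preferable, since repeatable outputs are easier to review and correct.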
NEW: Prompt-based Classification available with OpenAI and Azure OpenAI Services
NLP Lab 5.5 introduces Text Classification with LLM prompts using external services. This new feature empowers users to access a more diverse array of prompts, enabling them to harness external expertise for crafting pre-annotations. This feature becomes particularly valuable when pre-trained models are not readily available, serving as an alternative pre-annotation solution.
Classification prompts are supported by both Azure OpenAI and OpenAI service integrations.
Definition and Testing of Classification LLM Prompts:
Users can generate LLM prompts through the dedicated Prompt page within the Hub of Resources. Within the dedicated External LLM Prompt creation interface, the following details can be added to define the new LLM prompt:
- Name: Provide a clear and descriptive name that precisely conveys the prompt’s type and purpose.
- Type: Specify the type as “Classification.”
- Service Provider: Users can choose the specific service provider they have previously configured via the Integrations Page.
- Test in Real-time: ChatGPT/Azure OpenAI prompts can be tested in real time right on the creation page. As prompts are crafted, users can immediately gauge their performance on test data, getting instant feedback and ensuring the final prompt aligns with their objectives.
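To illustrate, here is a sketch of a classification prompt and the normalization of the model’s free-form reply into one of the project’s labels. The label set, prompt wording, and helper are illustrative assumptions:

```python
# Sketch: a classification prompt and label normalization. NLP Lab
# performs this mapping internally; names and labels are illustrative.

LABELS = ["positive", "negative", "neutral"]

CLS_PROMPT = (
    "Classify the following review as one of: {labels}.\n"
    "Answer with the label only.\n\nReview: {document}"
)

def normalize_label(model_reply: str, labels=LABELS) -> str:
    """Map a free-form reply onto a known label (case/punctuation tolerant)."""
    cleaned = model_reply.strip().strip(".").lower()
    for label in labels:
        if label in cleaned:
            return label
    raise ValueError(f"unexpected reply: {model_reply!r}")

label = normalize_label("Positive.")
```

Normalization matters because GPT models may answer with extra punctuation or casing; tolerating that keeps pre-annotations consistent with the project’s taxonomy.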
The advantage of external prompts lies in their simplicity, while their power comes from the domain understanding built into the underlying GPT models. Prompt crafting is user-friendly and quick, allowing for rapid testing and integration into custom projects as needed.
We continue to prioritize usability, ensuring a seamless transition to, or addition of, external prompts. Incorporating them is straightforward: the interface and steps for prompt creation, selection, and integration retain the same intuitive, user-friendly design as Entity Extraction with External Prompts. While classification prompts bring a layer of innovation to the annotation process, the core workflow remains unchanged and aligned with the existing experience.
The integration with Azure OpenAI Service and the addition of GPT Prompt-Based Classification not only enrich the existing pre-annotation capabilities of NLP Lab but also facilitate its usage in secured and private enterprise environments.
The introduction of prompt-based text classification offers a powerful alternative for pre-annotation, especially in scenarios where pre-trained classification models are not available. With its user-friendly interface and enhanced functionalities, NLP Lab 5.5 stands out as a comprehensive and adaptable platform for professionals seeking advanced NLP solutions.
The release of NLP Lab 5.5 is a testament to our continuous efforts to evolve our NLP technology and solutions and their increasing relevance in addressing the sophisticated requirements of today’s data-driven world.
Getting Started is Easy
The NLP Lab is a free text annotation tool that can be deployed in a couple of clicks on the AWS, Azure, or OCI Marketplaces, or installed on-premise with a one-line Kubernetes script.
Get started here: https://nlp.johnsnowlabs.com/docs/en/alab/install
Start your journey with NLP Lab and experience the future of data analysis and model training today!