Question Classifier


Last updated 9 months ago

1. Definition

The Question Classifier node uses user-defined classification descriptions to let an LLM categorise user input.

2. Scenarios

Common use cases include customer service conversation intent classification, product review classification, and bulk email classification.

The workflow example below is designed to automate customer support for smart devices. This system efficiently manages user queries by logically connecting nodes that classify, process, and respond to issues.

In this scenario, we set up three classification labels/descriptions:

  • CLASS 1: Questions related to hardware

  • CLASS 2: Questions related to software

  • CLASS 3: Other questions

When a user submits a question, a model such as GPT-4 analyzes the query and assigns it to one of these categories:

  • Query: "My smart camera's lens is blurry, how can I fix it?" —> "Questions related to hardware"

  • Query: "How do I reinstall the app for my smart light system?" —> "Questions related to software"

  • Query: "What payment methods do you accept?" —> "Other questions"
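The three class labels above can be thought of as feeding a classification prompt sent to the model. The sketch below shows one plausible way such a prompt could be assembled; the template wording and the `build_classifier_prompt` function are illustrative assumptions, not the node's actual internals.

```python
# Class names and descriptions, as configured in the example above.
CLASSES = {
    "CLASS 1": "Questions related to hardware",
    "CLASS 2": "Questions related to software",
    "CLASS 3": "Other questions",
}

def build_classifier_prompt(query: str) -> str:
    """Assemble a prompt asking the model to answer with one class only."""
    label_lines = "\n".join(f"- {name}: {desc}" for name, desc in CLASSES.items())
    return (
        "Classify the user query into exactly one of the classes below.\n"
        f"{label_lines}\n\n"
        f"Query: {query}\n"
        "Answer with the class description only."
    )

print(build_classifier_prompt("My smart camera's lens is blurry, how can I fix it?"))
```

In practice the model's reply ("Questions related to hardware" here) becomes the node's classification result.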

3. How to configure

Configuration Steps:

  1. Input Variables: The content to be classified. Since this customer-service example is a Chatflow, the input variable is sys.query, which represents the user's input within a conversational application.

  2. Choose Inference Model: The question classifier relies on a large language model for classification and inference; choosing the right model improves its accuracy.

  3. Write Classification Labels: You can manually add multiple classifications by writing keywords or descriptive statements for each category, helping the large language model better understand the classification criteria.

Advanced Settings:

Instructions: In Advanced Settings - Instructions, you can add supplementary instructions, such as more detailed classification criteria, to enhance the classifier's capabilities.

Memory: When enabled, each input to the question classifier includes the chat history of the conversation, helping the LLM understand context and improving question comprehension in interactive dialogues.

Memory Window: When the memory window is disabled, the system dynamically decides how much chat history to pass based on the model's context window; when enabled, you can precisely control the number of chat-history messages passed.
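The two memory-window modes described above can be sketched as follows. The function name, message representation, and character-based budget are simplifying assumptions for illustration; real systems count tokens, not characters.

```python
def select_history(history: list[str], window_enabled: bool,
                   window_size: int, context_limit: int) -> list[str]:
    """Choose which chat-history messages to pass to the classifier.

    Simplified stand-in for the behaviour described above: with the
    window enabled, the user fixes the message count; with it disabled,
    the system keeps as many recent messages as fit a context budget
    (approximated here by character length).
    """
    if window_enabled:
        return history[-window_size:]          # exact user-chosen count
    kept, used = [], 0
    for msg in reversed(history):              # walk newest-first
        if used + len(msg) > context_limit:
            break                              # budget exhausted
        kept.insert(0, msg)                    # restore chronological order
        used += len(msg)
    return kept

history = ["hello", "my light app crashed", "how do I reinstall it?"]
print(select_history(history, window_enabled=True, window_size=2, context_limit=0))
```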

Output Variable:

class_name

The name of the category the input was assigned to. You can reference this result variable in downstream nodes as needed.

For example:

  • Query: "How do I reinstall the app for my smart light system?"

  • Classification Result: The question classifier processes this input and assigns it to "Questions related to software". The class_name variable would then hold the value "Questions related to software".
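Downstream use of class_name often amounts to routing the conversation to a matching branch. The sketch below is a hypothetical routing step; the branch identifiers are made up for illustration, and in a real workflow this role is typically played by an IF/ELSE node.

```python
def route(class_name: str) -> str:
    """Pick a downstream branch from the classifier's output variable."""
    branches = {
        "Questions related to hardware": "hardware_support",
        "Questions related to software": "software_support",
    }
    # Anything unmatched (e.g. "Other questions") falls through to a default.
    return branches.get(class_name, "general_support")

print(route("Questions related to software"))   # software_support
```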