
What is LLM prompt?



Simply put, the message you send to the LLM is referred to as a prompt, so you can think of a prompt as a message. When the LLM receives your prompt/message, it processes it and returns an output. In this document and on the NsfwGPT.ai website, the terms prompt, message, and input all refer to the same thing.
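For example, the round trip of sending a prompt and receiving an output can be sketched in code. The snippet below uses the widely used OpenAI-style chat API purely as an illustration; the client setup and model name are placeholder assumptions, and NsfwGPT.ai performs this step for you whenever you type into the chat box.

```python
# Minimal sketch: a "prompt" is just the message sent to the LLM.
# Illustrative only -- the client setup and model name are assumptions.
from openai import OpenAI

client = OpenAI()  # assumes an API key is configured in the environment

prompt = "Describe a rainy evening in two sentences."  # this text is the prompt

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
)

# The LLM processes the prompt and returns an output (its reply).
print(response.choices[0].message.content)
```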

System Prompt

You may have noticed that on the chatbot creation page, there is a System Prompt input box. So, what is the difference between your Message (User Prompt) and the System Prompt?

The User Prompt and the System Prompt play different roles when working with an LLM. The User Prompt is the input text provided by the user to guide the LLM in generating a corresponding response or output. It typically contains the user's questions, statements, or instructions, and it serves as the starting point for the user's conversation with the LLM. The content and wording of the User Prompt directly influence the response the LLM generates.

The System Prompt, on the other hand, is additional text provided to the LLM as context or background information. It can be used to set the topic of the conversation, simulate a particular role or setting, or supply previous conversation history. Its purpose is to shape the overall style, tone, or content of the LLM's responses. By using different System Prompts, you can steer the LLM toward replies related to specific topics or characters.

In summary, the User Prompt is the primary input provided by the user and directly guides the LLM's response, while the System Prompt is supplementary information that sets the background, style, or role of the conversation. Used together, they shape how the LLM generates its replies.

(Screenshot: chatbot creation page of NsfwGPT.ai)
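As a rough illustration of how the two kinds of prompts are combined, the sketch below passes the System Prompt and the User Prompt as separate messages in the OpenAI-style chat format. The persona text, model name, and client setup are placeholder assumptions; on NsfwGPT.ai, the System Prompt is simply the text you enter in the System Prompt box when creating a chatbot.

```python
from openai import OpenAI

client = OpenAI()  # assumes an API key is configured in the environment

messages = [
    # System Prompt: background, role, and style for the whole conversation.
    {
        "role": "system",
        "content": (
            "You are Mira, a sarcastic space pirate. Stay in character "
            "and keep every reply under three sentences."
        ),
    },
    # User Prompt: the user's actual question or instruction.
    {"role": "user", "content": "What do you think of my new cargo ship?"},
]

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=messages,
)

# The reply is shaped by both prompts: the System Prompt sets the persona
# and tone, while the User Prompt determines what the reply is about.
print(response.choices[0].message.content)
```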