What is LLM prompt?

Simply put, the message you send to an LLM is called a prompt, so you can think of a prompt as a message. When the LLM receives your prompt, it processes it and returns an output. In this documentation and on the Dittin AI website, the terms prompt, message, and input all refer to the same thing.
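
For example, a prompt is nothing more than the text you hand to the model. The tiny Python sketch below only illustrates this terminology; the variable names are arbitrary and not part of any Dittin AI API.

```python
# A prompt is simply the text sent to the LLM.
# "prompt", "message", and "input" all refer to this same string.
prompt = "Write a two-sentence greeting for a new Dittin AI user."

# Chat-style interfaces usually wrap it as a message before sending it:
message = {"role": "user", "content": prompt}
print(message)
```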

System Prompt

You may have noticed that Dittin AI provides a System Prompt input box. So what is the difference between your message (the User Prompt) and the System Prompt?

The User Prompt and the System Prompt play different roles when using an LLM. The User Prompt is the input text provided by the user to guide the LLM in generating a response. It typically contains the user's questions, statements, or instructions, and it serves as the starting point of the conversation with the LLM. The content and wording of the User Prompt directly influence the response the LLM generates.

The System Prompt, on the other hand, is additional text given to the LLM as context or background information. It can be used to set the topic of the conversation, establish a particular role or scenario, or supply earlier conversation history. Its purpose is to shape the overall style, tone, or content of the LLM's responses. By using different System Prompts, you can steer the LLM toward replies about specific topics or characters.

In summary, the User Prompt is the primary input provided by the user and directly guides the LLM's response, while the System Prompt is supplementary information that sets the background, style, or role of the conversation. Used together, they shape how the LLM generates its responses.
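
To make the distinction concrete, here is a minimal Python sketch of how a System Prompt and a User Prompt are typically combined in a chat-style request. Dittin AI's own API is not documented on this page, so the message format, the role names ("system", "user"), and the send_to_llm helper below are assumptions used purely for illustration.

```python
# Illustrative sketch only; not Dittin AI's actual API.
system_prompt = (
    "You are Luna, a cheerful tour guide for a fantasy city. "
    "Stay in character and reply in a friendly, playful tone."
)
user_prompt = "What should I see on my first day in the city?"

# Chat-style APIs commonly combine both prompts into one list of messages.
messages = [
    {"role": "system", "content": system_prompt},  # background, persona, style
    {"role": "user", "content": user_prompt},      # the user's actual message
]

def send_to_llm(messages):
    """Hypothetical helper: a real client would send these messages to the
    model's chat endpoint and return the generated reply."""
    raise NotImplementedError

# reply = send_to_llm(messages)
```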
