Docs - Float16

Chat



Intro

ChatGPT is one of the well-known use cases that draws our attention to the capabilities of LLMs.

How it works?

The chat capability of an LLM is not a magic trick. It's based on a prompt template that segments user turns and assistant (LLM) turns and combines them. That's why an LLM can recall information from previous chats.
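The idea above can be sketched in a few lines of Python. This is an illustrative template only: the role tags (`<|system|>`, `<|user|>`, `<|assistant|>`) are placeholders, not the exact format any particular model uses, and each model family defines its own template.

```python
# Minimal sketch of a chat prompt template (illustrative tag format,
# not any specific model's template). Each turn is tagged with its role,
# and the whole history is concatenated into one prompt, which is why
# the model can "recall" earlier turns.

def build_chat_prompt(system, turns):
    """Combine a system prompt and alternating user/assistant turns
    into a single prompt string."""
    parts = [f"<|system|>\n{system}"]
    for role, text in turns:
        parts.append(f"<|{role}|>\n{text}")
    parts.append("<|assistant|>\n")  # cue the model to produce the next reply
    return "\n".join(parts)

history = [
    ("user", "My name is Ann."),
    ("assistant", "Nice to meet you, Ann!"),
    ("user", "What is my name?"),
]
prompt = build_chat_prompt("You are a helpful assistant.", history)
print(prompt)
```

Because the earlier turn "My name is Ann." is included verbatim in the prompt, the model can answer the final question correctly; nothing is remembered outside the prompt itself.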

The system prompt is an important part because it lets us control the chat's style and objectives.

SYSTEM PROMPT
You are an assistant and need to collect data from the user, including their name, age, and gender. You need to guide and help the user input the correct data.
USER PROMPT
What about pricing?
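In an OpenAI-style chat API, the system and user prompts above map onto a `messages` list, one entry per role. The sketch below only builds and prints that payload; the `model` value is a placeholder, not an actual Float16 model name.

```python
# Hedged sketch: how the system/user prompts above become an
# OpenAI-style chat request body. "your-model-name" is a placeholder.
import json

payload = {
    "model": "your-model-name",  # placeholder model id
    "messages": [
        {
            "role": "system",
            "content": (
                "You are an assistant and need to collect data from the "
                "user, including their name, age, and gender. You need to "
                "guide and help the user input the correct data."
            ),
        },
        {"role": "user", "content": "What about pricing?"},
    ],
}
print(json.dumps(payload, indent=2))
```

Guided by the system message, the assistant should steer the off-topic pricing question back toward collecting the user's name, age, and gender.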

A software developer could consider this ability to be a new type of capability for programming languages.

Prompt examples

• Basic Prompt Chat #1 (FloatPrompt): Change previous number into another language
• Basic Prompt Chat #2 (FloatPrompt): Chatbot to collect information
• Basic Prompt Chat #3 (FloatPrompt): Chatbot to collect information and refuse non-relevant tasks