Formatting


Intro

Text formatting is an important capability of LLMs. Because an LLM can understand the text we give it, it can rearrange or reformat that text into another structure.

Text processing is not an easy task when we rely solely on programming languages.

LLMs can help significantly with text-processing tasks, including formatting.

How it works?

By describing the format in the prompt, we can shape the structure of the output to match what we need.

If we do not specify the structure, the LLM will fall back to the most generic output structure for a general audience, which may not be precise enough for domain-specific needs.

The best approach is to describe the desired output structure in detail. For example, consider the following prompt:

"In a quiet village, a young girl named Mia discovered a hidden key in her grandmother's attic. Curious, she followed a map etched on the key, leading her to an ancient oak tree in the forest. As she turned the key in a concealed lock, a door opened, revealing a magical world filled with talking animals and shimmering rivers. Mia befriended a wise fox who guided her through enchanting adventures. When she returned home, she knew the magic was real, for the key glowed warmly in her hand, a reminder of the wonders just beyond the ordinary."

Format the text into 3 sectors.

A software developer can think of this ability as a new kind of capability that goes beyond what programming languages offer on their own.
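The sketch below shows how such a structured prompt could be sent through an OpenAI-compatible client. It is only an illustration: the endpoint URL, API key, model id, and the three sector names (Beginning, Middle, Ending) are placeholder assumptions, not values taken from this documentation.

```python
# A minimal sketch, assuming an OpenAI-compatible endpoint.
# The base_url, api_key, model id, and sector names below are placeholders.
from openai import OpenAI

client = OpenAI(
    base_url="https://your-float16-endpoint/v1",  # placeholder endpoint
    api_key="YOUR_API_KEY",                       # placeholder key
)

story = (
    "In a quiet village, a young girl named Mia discovered a hidden key in "
    "her grandmother's attic. ..."  # abbreviated; use the full story above
)

# Spell out the desired structure instead of letting the model pick one.
prompt = f"""{story}

Format the text into 3 sectors:
1. Beginning
2. Middle
3. Ending

Write each sector as a short paragraph under its own heading."""

response = client.chat.completions.create(
    model="your-model-name",  # placeholder model id
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```

The more precisely the prompt describes the target structure, the less the model has to guess, which is exactly the point made above.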

Prompt example

• Basic Prompt Formatting #1 (FloatPrompt): CSV to JSON
• Basic Prompt Formatting #4 (FloatPrompt): Formatting text into 3 sectors
• Basic Prompt Formatting #2 (FloatPrompt): JSON to case report
• Basic Prompt Formatting #3 (FloatPrompt): User requirement to functional and non-functional requirements
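As a rough illustration of the CSV-to-JSON example, a prompt along the following lines could be used. The endpoint, API key, model id, and CSV rows are invented placeholders, not values from the linked FloatPrompt example.

```python
# A minimal sketch of the CSV-to-JSON idea. Everything below (endpoint, key,
# model id, CSV content) is a placeholder invented for illustration.
from openai import OpenAI

client = OpenAI(
    base_url="https://your-float16-endpoint/v1",  # placeholder endpoint
    api_key="YOUR_API_KEY",                       # placeholder key
)

csv_text = """name,age,city
Mia,12,Quiet Village
Fox,7,Enchanted Forest"""

# Describe the target structure explicitly, as recommended above.
prompt = f"""Convert the following CSV into a JSON array of objects.
Use the header row as the keys and return only the JSON, with no explanation.

{csv_text}"""

response = client.chat.completions.create(
    model="your-model-name",  # placeholder model id
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```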