LLM as a service quick start
Setting up API Key
After accessing LLM as a service, you need to set up your API key. Learn how to set your API key here.
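If you keep the key in an environment variable (a common pattern, not something the platform requires; the variable name FLOAT16_API_KEY below is our own choice), you can read it in Python like this:
import os

# Assumes the key was exported first, e.g. export FLOAT16_API_KEY="<float16-api-key>"
FLOAT16_API_KEY = os.environ["FLOAT16_API_KEY"]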
Quickly test API
To quickly try the API using cURL, use the following command:
curl -X POST https://api.float16.cloud/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <float16-api-key>" \
  -d '{
    "model": "seallm-7b-v3",
    "messages": [
      {
        "role": "system",
        "content": "You are a helpful assistant."
      },
      {
        "role": "user",
        "content": "สวัสดี"
      }
    ]
  }'
Paste this in your terminal to see the response.
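Because the endpoint follows the OpenAI chat-completions schema, a successful response should look roughly like the sketch below (all field values are illustrative, not actual output):
{
  "id": "chatcmpl-...",
  "object": "chat.completion",
  "model": "seallm-7b-v3",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "สวัสดีครับ ..."
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {"prompt_tokens": ..., "completion_tokens": ..., "total_tokens": ...}
}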
Using the Chat API
Our API is OpenAI-compatible, so you can integrate it with your chat UI using the OpenAI or LangChain libraries.
OpenAI
Install the OpenAI package:
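pip install openai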
Use this Python code snippet (example using the SeaLLM-7B-v3 model):
import openai

FLOAT16_BASE_URL = "https://api.float16.cloud/v1/"
FLOAT16_API_KEY = "<your API key>"

client = openai.OpenAI(
    api_key=FLOAT16_API_KEY,
    base_url=FLOAT16_BASE_URL,
)

# Streaming chat loop:
messages = [{"role": "system", "content": "You are truly awesome."}]
while True:
    content = input("User: ")
    messages.append({"role": "user", "content": content})
    print("Assistant: ", end="", flush=True)
    content = ""
    for chunk in client.chat.completions.create(
        messages=messages,
        model="seallm-7b-v3",
        stream=True,
    ):
        delta_content = chunk.choices[0].delta.content
        if delta_content:
            print(delta_content, end="", flush=True)
            content += delta_content
    messages.append({"role": "assistant", "content": content})
    print("\n")
For more information on the OpenAI library, visit the OpenAI docs.
LangChain
To use Float16.cloud with LangChain, follow these steps:
Install the LangChain package:
pip install langchain langchain_community
or
conda install langchain langchain_community -c conda-forge
Use this Python code snippet (example using the SeaLLM-7B-v3 model):
from langchain_community.chat_models import ChatOpenAI
from langchain.schema import HumanMessage

FLOAT16_BASE_URL = "https://api.float16.cloud/v1/"
FLOAT16_API_KEY = "<your API key>"

chat = ChatOpenAI(
    model="seallm-7b-v3",
    api_key=FLOAT16_API_KEY,
    base_url=FLOAT16_BASE_URL,
    streaming=True,
)

# Simple invocation:
print(chat.invoke([HumanMessage(content="Hello")]))

# Streaming invocation:
for chunk in chat.stream("Write me a blog about how to start to raise cats"):
    print(chunk.content, end="", flush=True)
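To use templated prompts instead of raw strings, the chat model also composes with LangChain's expression language. A minimal sketch, assuming a recent langchain_core (the template text and the question variable are our own illustration):
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    ("user", "{question}"),
])
# Pipe the prompt into the `chat` model configured above.
chain = prompt | chat
print(chain.invoke({"question": "How do I start raising cats?"}).content)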
For more information on the LangChain library, visit the LangChain docs.