llm
LLMSDK
Bases: PerplexityConfig, OpenAIConfig
Methods:

Name | Description |
---|---|
get_search_result | Query Perplexity and return a search-backed result. |
get_oai_reply | Get a chat completion reply from the OpenAI (or Azure OpenAI) endpoint. |
get_oai_reply_stream | Stream a chat completion reply as ChatCompletionChunk objects. |
model
model: ChatModel = Field(
default="gpt-4.1",
title="LLM Model Selection",
description="This model should be OpenAI Model.",
frozen=False,
deprecated=False,
)
system_prompt
system_prompt: str = Field(
default="\n 角色定位:我被設定為一個知識豐富、語氣專業但親切的助手,目的是幫你解決問題、提供準確資訊,或一起創作內容。\n 行為準則:我會避免給出虛假、自相矛盾或無依據的答案,並且如果我不知道某件事,我會直接說明或幫你找答案。\n 互動風格:我應該簡潔、直接,有需要時會主動提出追問幫你釐清目標,特別是技術或寫作相關的任務。\n ",
title="System Prompt",
description="This is the system prompt for the LLM.",
frozen=False,
deprecated=False,
)
api_type
api_type: str = Field(
default="openai",
description="The api type from openai for calling models.",
examples=["openai", "azure"],
validation_alias=AliasChoices("OPENAI_API_TYPE"),
frozen=False,
deprecated=False,
)
base_url
base_url: str = Field(
...,
description="The base url from openai for calling models.",
examples=["https://api.openai.com/v1", "https://xxxx.openai.azure.com"],
validation_alias=AliasChoices("OPENAI_BASE_URL", "AZURE_OPENAI_ENDPOINT"),
frozen=False,
deprecated=False,
)
api_key
api_key: str = Field(
...,
description="The api key from openai for calling models.",
examples=["sk-proj-...", "141698ac..."],
validation_alias=AliasChoices("OPENAI_API_KEY", "AZURE_OPENAI_API_KEY"),
frozen=False,
deprecated=False,
)
api_version
api_version: str = Field(
default="2025-04-01-preview",
description="The api version from openai for calling models.",
examples=["2025-04-01-preview"],
validation_alias=AliasChoices("OPENAI_API_VERSION"),
frozen=False,
deprecated=False,
)
pplx_api_key
pplx_api_key: str = Field(
...,
description="The api key from perplexity for calling models.",
examples=["pplx-..."],
validation_alias=AliasChoices("PERPLEXITY_API_KEY"),
frozen=False,
deprecated=False,
)
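Taken together, the validation aliases above mean the SDK can resolve its credentials directly from environment variables. A minimal sketch of that setup; the import path and the bare `LLMSDK()` constructor call are assumptions based on the "Source code in src/sdk/llm.py" note, so they are shown commented out:

```python
import os

# The validation aliases let LLMSDK() read required fields from the environment.
os.environ["OPENAI_API_TYPE"] = "openai"
os.environ["OPENAI_BASE_URL"] = "https://api.openai.com/v1"
os.environ["OPENAI_API_KEY"] = "sk-proj-..."  # placeholder, not a real key
os.environ["PERPLEXITY_API_KEY"] = "pplx-..."  # placeholder, not a real key

# from src.sdk.llm import LLMSDK  # import path assumed from the source note
# sdk = LLMSDK()  # required fields (base_url, api_key, pplx_api_key) resolve from env
```

For Azure deployments, the aliases `AZURE_OPENAI_ENDPOINT` and `AZURE_OPENAI_API_KEY` can be set instead, alongside `OPENAI_API_TYPE=azure`.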
get_search_result
Source code in src/sdk/llm.py
get_oai_reply
Source code in src/sdk/llm.py
get_oai_reply_stream
get_oai_reply_stream(
prompt: str, image_urls: Optional[list[str]] = None
) -> AsyncGenerator[ChatCompletionChunk, None]
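Since the method returns an AsyncGenerator, callers consume it with `async for`. A self-contained sketch of that consumption pattern, using a stub generator in place of the real client (with real ChatCompletionChunk objects, the text delta lives at `chunk.choices[0].delta.content`):

```python
import asyncio
from typing import AsyncGenerator

# Stub standing in for get_oai_reply_stream: yields text pieces the way the
# real generator yields ChatCompletionChunk deltas.
async def fake_stream(prompt: str) -> AsyncGenerator[str, None]:
    for piece in ["Hel", "lo", "!"]:
        yield piece

async def main() -> str:
    # Accumulate deltas as they arrive, then join them into the full reply.
    parts: list[str] = []
    async for chunk in fake_stream("hi"):
        parts.append(chunk)
    return "".join(parts)

result = asyncio.run(main())
print(result)  # Hello!
```

The same loop works for the real method once an LLMSDK instance is available; only the chunk-to-text extraction differs.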