This endpoint works the same as the OpenAI Chat Completions API, but it uses context to improve the results. See https://platform.openai.com/docs/api-reference/chat/create.

Context Control

The only difference in the request params is that you can manage matched context items with max_matches, max_total_matches_tokens, max_single_match_tokens, and min_match_relevancy_score. A minimal sketch of such a request is shown below.
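Here is a sketch of a request that uses these parameters. The endpoint URL, model name, and parameter values are illustrative assumptions; only the parameter names come from this section.

```python
import requests

# Illustrative endpoint URL and API key; check your Godly dashboard for the real values.
GODLY_API_URL = "https://api.godly.ai/v1/chat/completions"
API_KEY = "YOUR_API_KEY"

payload = {
    # Standard OpenAI Chat Completions fields
    "model": "gpt-3.5-turbo",
    "messages": [
        {"role": "user", "content": "What is our refund policy?"}
    ],
    # Godly context-control parameters (names taken from this doc, values illustrative)
    "max_matches": 5,                   # use at most 5 context items
    "max_total_matches_tokens": 1500,   # cap on the combined size of all matches
    "max_single_match_tokens": 500,     # cap on any single context item
    "min_match_relevancy_score": 0.75,  # drop weakly relevant matches
}

response = requests.post(
    GODLY_API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=30,
)
print(response.json()["choices"][0]["message"]["content"])
```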

Filter Context Using Tags

You can use the tags and tags_exclude params to only find context item matches that have (or don't have) the specified tags.
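For example, a request body that restricts matching to specific tags might look like the following sketch (the tag values are made up for illustration; send the payload as in the example above):

```python
payload = {
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "How do I reset my password?"}],
    # Only match context items tagged "support", and skip anything tagged "internal".
    "tags": ["support"],
    "tags_exclude": ["internal"],
}
```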

Context Placement

By default, the relevant context is added as a system message. If you need more fine-grained control, you can include the string [context] within any of the messages and the context will be added inline.
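A sketch of inline placement, assuming the same request flow as above (message wording is illustrative):

```python
payload = {
    "model": "gpt-3.5-turbo",
    "messages": [
        {
            "role": "system",
            # The matched context items replace the [context] placeholder inline.
            "content": "Answer using only the following documentation:\n[context]",
        },
        {"role": "user", "content": "How do I rotate my API key?"},
    ],
}
```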

Context Items Prompt

By default, Godly uses the content of the last message to retrieve relevant context items. However, to improve accuracy you can pass the original request in the context_search_prompt param and Godly will use that for the context item search.
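For instance, when the last message is a short follow-up, you might pass the user's original, fully worded question for the context search. The message contents below are illustrative:

```python
payload = {
    "model": "gpt-3.5-turbo",
    "messages": [
        {"role": "system", "content": "You are a helpful support assistant."},
        # The last message is vague on its own...
        {"role": "user", "content": "And what about annual plans?"},
    ],
    # ...so supply the original question for the context item search instead.
    "context_search_prompt": "What discounts are available on annual plans?",
}
```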

Response

The only difference from the OpenAI response is that it includes matches, an array of the context items that were applied. You can use this for debugging, or to improve the user experience by surfacing references alongside responses.
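Continuing the first example, you could inspect the applied context items like this. The exact fields on each match are not documented here, so the sketch just prints each item:

```python
data = response.json()

answer = data["choices"][0]["message"]["content"]

# "matches" lists the context items that were applied to this request.
# Inspect the actual response to see which fields each match contains.
for match in data.get("matches", []):
    print(match)
```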
