cohere/client.py: 15 additions & 32 deletions
@@ -227,7 +227,6 @@ def generate(
     def chat(
         self,
         message: Optional[str] = None,
-        query: Optional[str] = None,
         conversation_id: Optional[str] = "",
         model: Optional[str] = None,
         return_chatlog: Optional[bool] = False,
@@ -246,30 +245,33 @@ def chat(
         """Returns a Chat object with the query reply.

         Args:
-            query (str): Deprecated. Use message instead.
             message (str): The message to send to the chatbot.
-            conversation_id (str): (Optional) The conversation id to continue the conversation.
-            model (str): (Optional) The model to use for generating the next reply.
-            return_chatlog (bool): (Optional) Whether to return the chatlog.
-            return_prompt (bool): (Optional) Whether to return the prompt.
-            return_preamble (bool): (Optional) Whether to return the preamble.
-            chat_history (List[Dict[str, str]]): (Optional) A list of entries used to construct the conversation. If provided, these messages will be used to build the prompt and the conversation_id will be ignored so no data will be stored to maintain state.
-            preamble_override (str): (Optional) A string to override the preamble.
-            user_name (str): (Optional) A string to override the username.
-            temperature (float): (Optional) The temperature to use for the next reply. The higher the temperature, the more random the reply.
-            max_tokens (int): (Optional) The max tokens generated for the next reply.
+
             stream (bool): Return streaming tokens.
+            conversation_id (str): (Optional) To store a conversation then create a conversation id and use it for every related request.
+
+            preamble_override (str): (Optional) A string to override the preamble.
+            chat_history (List[Dict[str, str]]): (Optional) A list of entries used to construct the conversation. If provided, these messages will be used to build the prompt and the conversation_id will be ignored so no data will be stored to maintain state.
+
+            model (str): (Optional) The model to use for generating the response.
+            temperature (float): (Optional) The temperature to use for the response. The higher the temperature, the more random the response.
             p (float): (Optional) The nucleus sampling probability.
             k (float): (Optional) The top-k sampling probability.
             logit_bias (Dict[int, float]): (Optional) A dictionary of logit bias values to use for the next reply.
+            max_tokens (int): (Optional) The max tokens generated for the next reply.
+
+            return_chatlog (bool): (Optional) Whether to return the chatlog.
+            return_prompt (bool): (Optional) Whether to return the prompt.
+            return_preamble (bool): (Optional) Whether to return the preamble.
+
+            user_name (str): (Optional) A string to override the username.

         Returns:
             a Chat object if stream=False, or a StreamingChat object if stream=True

         Examples:
             A simple chat message:
                 >>> res = co.chat(message="Hey! How are you doing today?")
                 >>> print(res.text)
-                >>> print(res.conversation_id)
             Continuing a session using a specific model:
                 >>> res = co.chat(
                 >>>     message="Hey! How are you doing today?",
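The reorganized docstring above documents the call shape of `chat()`. The following is a minimal, self-contained sketch of that call shape; `StubClient` and its canned `Chat` response are hypothetical stand-ins added here for illustration (the real `cohere.Client` performs an HTTP request and returns a richer object):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Chat:
    # Minimal stand-in for the SDK's Chat response object.
    text: str
    prompt: Optional[str] = None


class StubClient:
    # Hypothetical stub mirroring the chat() signature documented above;
    # it echoes the message instead of calling the API.
    def chat(self, message=None, conversation_id="", model=None,
             temperature=None, return_prompt=False, stream=False):
        prompt = f"User: {message}" if return_prompt else None
        return Chat(text=f"(echo) {message}", prompt=prompt)


co = StubClient()
res = co.chat(
    message="Hey! How are you doing today?",
    conversation_id="session-1",
    model="command",
    return_prompt=True,
)
print(res.text)    # the reply text
print(res.prompt)  # populated only because return_prompt=True
```

With a real client, `conversation_id` lets the service maintain state across related requests, while passing `chat_history` instead builds the prompt client-side and skips server-side storage.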
@@ -295,25 +297,6 @@ def chat(
             >>> print(res.text)
             >>> print(res.prompt)
         """
-        if chat_history is not None:
-            should_warn = True
-            for entry in chat_history:
-                if "text" in entry:
-                    entry["message"] = entry["text"]
-
-                if "text" in entry and should_warn:
-                    logger.warning(
-                        "The 'text' parameter is deprecated and will be removed in a future version of this function. "
-                        + "Use 'message' instead.",
-                    )
-                    should_warn = False
-
-        if query is not None:
-            logger.warning(
-                "The chat_history 'text' key is deprecated and will be removed in a future version of this function. "
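The deleted block above was a back-compat shim that copied the deprecated `chat_history` `"text"` key into `"message"` and warned once (note the two warning strings in the original appear swapped between the `chat_history` and `query` branches). A standalone sketch of that migration behavior, with `normalize_chat_history` as a hypothetical helper name:

```python
import logging

logger = logging.getLogger(__name__)


def normalize_chat_history(chat_history):
    # Copy the deprecated "text" key into "message" for each entry,
    # emitting the deprecation warning only once per call.
    should_warn = True
    for entry in chat_history:
        if "text" in entry:
            entry["message"] = entry["text"]
            if should_warn:
                logger.warning(
                    "The chat_history 'text' key is deprecated and will be "
                    "removed in a future version. Use 'message' instead."
                )
                should_warn = False
    return chat_history


history = normalize_chat_history([{"text": "Hello"}, {"message": "Hi there"}])
```

Entries already using `"message"` pass through untouched, so mixed histories keep working during the deprecation window.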