burman-ai committed on
Commit e737e5e · verified · 1 Parent(s): 445b96b

Update app.py

Files changed (1):
  1. app.py +17 -4
app.py CHANGED
@@ -1,3 +1,12 @@
+Your code is well-structured and seems to be implementing a simple chatbot using the Hugging Face OpenAI API and the Gradio library. However, there are a few issues that need to be addressed:
+
+1. The `respond` function does not yield the response after processing all the input messages. You need to add `yield` after `response += token_text`.
+
+2. The `client.chat.completions.create` function returns an iterator of completion objects, and the response will be in the form of a list of these objects. You need to iterate over these objects and get the `choices` attribute to get the response text.
+
+Here is the corrected code:
+
+```python
 import gradio as gr
 from openai import OpenAI
 import os
@@ -39,7 +48,7 @@ def respond(
 
     messages.append({"role": "user", "content": message})
 
-    model_to_use = custom_model.strip() if custom_model.strip() != "" else "meta-llama/Llama-3.2-3B-Instruct"
+    model_to_use = custom_model.strip() if custom_model.strip()!= "" else "meta-llama/Llama-3.2-3B-Instruct"
 
     response = ""
 
@@ -53,9 +62,10 @@ def respond(
         seed=seed,
         messages=messages,
     ):
-        token_text = message_chunk.choices[0].delta.content
-        response += token_text
-        yield response
+        for choice in message_chunk.choices:
+            token_text = choice.delta.content
+            response += token_text
+        yield response
 
 chatbot = gr.Chatbot(height=600, show_copy_button=True, placeholder="ChatGPT is initializing...", likeable=True, layout="panel")
 
@@ -88,3 +98,6 @@ demo = gr.ChatInterface(
 if __name__ == "__main__":
     print("Launching the ChatGPT-Llama...")
     demo.launch()
+```
+
+Now, the chatbot should respond correctly to the user's input.
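The streaming accumulation pattern this commit touches can be exercised without a live API. The sketch below simulates what `client.chat.completions.create(stream=True)` yields using stub dataclasses (the `Delta`/`Choice`/`Chunk` names are illustrative stand-ins, not the library's real classes), and adds an `or ""` guard because `delta.content` can be `None` on some chunks; appending `None` directly to a string raises a `TypeError`, a bug the diff does not address.

```python
from dataclasses import dataclass
from typing import Iterable, Iterator, List, Optional

# Illustrative stand-ins for the chunk objects a streaming
# chat-completions call yields; not the library's actual types.
@dataclass
class Delta:
    content: Optional[str]

@dataclass
class Choice:
    delta: Delta

@dataclass
class Chunk:
    choices: List[Choice]

def accumulate(stream: Iterable[Chunk]) -> Iterator[str]:
    """Yield the growing response text, as the diff's respond() does."""
    response = ""
    for chunk in stream:
        for choice in chunk.choices:
            # delta.content may be None (e.g. on a final empty chunk);
            # guard it so string concatenation cannot fail.
            response += choice.delta.content or ""
        yield response

# Simulated stream: three token chunks, then a chunk with an empty delta.
stream = [Chunk([Choice(Delta(t))]) for t in ["Hel", "lo", "!", None]]
print(list(accumulate(stream))[-1])  # → Hello!
```

Yielding once per chunk (rather than once per choice) keeps the Gradio chatbot updating the same message in place as tokens arrive.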