Traceback error when trying to get only the answer as output (OpenAI GPT-3.5 Turbo)

I am taking the Artificial Intelligence training from Bolt. As part of it, I was trying to write code that returns only the answer to a question, rather than the model name and other specifications. I was asked to use output=response['choices'][0]['message']['content'] to define the output variable with only the answer. This is my code:
import openai
import os
import sys

q = input("What is your question/instruction? \n")
openai.api_key = os.environ['OPENAI_API_KEY']

response = openai.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{
        "role": "user",
        "content": q
    }])

output = response['choices'][0]['message']['content']
print(output)

But I am getting the following error in the output:
[screenshot of the traceback]

What am I doing wrong? Kindly help.

@huzaifa.bohori
Can you replace the output line with the line below, in the same format, and check if you are able to get the output?

output = response.choices[0].message.content
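To see why the dict-style line raises an error with the newer SDK, here is a minimal sketch using a hypothetical stand-in object (no real API call; the data is made up) that mimics the shape of the object returned by openai.chat.completions.create() in openai>=1.0:

```python
from types import SimpleNamespace

# Stand-in mimicking the response object of openai>=1.0 (hypothetical data).
response = SimpleNamespace(
    choices=[SimpleNamespace(message=SimpleNamespace(content="Paris"))]
)

# Dict-style indexing fails: the response is an object, not a dict.
try:
    answer = response["choices"][0]["message"]["content"]
except TypeError as e:
    print("dict-style access failed:", e)

# Attribute access works.
answer = response.choices[0].message.content
print(answer)  # Paris
```

This is why switching from response['choices'] to response.choices fixes the traceback on the new SDK.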

Thank you so much! It worked


It is still giving an error.

Hi @Rudrani.r24

Please change line no. 35 to: output = response['choices'][0]['message']['content']
Do try this and let us know if it solves the issue.
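For context, dict-style indexing like the line above matches the response shape of the legacy SDK (openai<1.0), whose response behaved like a dictionary. A minimal sketch with hypothetical data (no real API call):

```python
# Stand-in for a legacy (openai<1.0) chat completion response,
# which supported dict-style indexing (hypothetical data).
response = {
    "choices": [
        {"message": {"role": "assistant", "content": "Paris"}}
    ]
}

output = response["choices"][0]["message"]["content"]
print(output)  # Paris
```

Which access style works therefore depends on which version of the openai package the Repl has installed.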

Also, regarding the Repls which you had shared with the Bolt support team, all three projects are working fine:
Telegram chatbot - the bot token ID was wrong. Replace it with your actual bot token ID.
Image generator - the project is working fine; no changes required.
Recipe generator - working fine; no changes required.

If you still face any issue with any of the projects, please feel free to get back to us.

Facing an issue again

Hi @Rudrani.r24

The error you are facing is because of an incorrect API key. Please use the latest API key that you generated. If that still does not solve the issue, you can share the Repl with support@boltiot.com.
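One way to fail fast with a clear message when the key is missing or empty is to check the environment variable before making any API call. A minimal sketch (the helper name load_api_key is my own, not part of the openai library):

```python
import os

def load_api_key(env_var="OPENAI_API_KEY"):
    """Return the API key from the environment, raising a clear error if unset."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(
            f"{env_var} is not set or empty. Generate a fresh key and export it "
            f"before running the script, e.g. export {env_var}=sk-..."
        )
    return key

# Usage (sketch): openai.api_key = load_api_key(), then make requests.
```

This turns a confusing authentication traceback into an explicit message about the missing key.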

Thanks for rooting out the issues.


Hi @Rudrani.r24

I checked the Repl and it is working fine. If you have any other queries, please feel free to reach out.