I am using an LLM through an API, and it returns a streaming response.

I need to read the response (a generator object) into a DataFrame with the columns:

Sno, Name, parent name, Hno, age, and enrollment

Printing the chunks works fine, but it doesn't give me the full line in one piece; when I append response.content to a list, I get the text a few characters at a time.

Could you suggest how to read the 'response' generator variable into a DataFrame[Sno, Name, Parentname, Hno, age, gender, Enrollment]?

Here is what I tried:

stream = client.chat.completions.create(
    model="meta-llama/Llama-3.2-11B-Vision-Instruct-Turbo",
    messages=msg,
    temperature=0.4,
    top_p=0.7,
    top_k=50,
    repetition_penalty=1,
    stop=["<|eot_id|>", "<|eom_id|>"],
    stream=True,
)

ls_text = []
for chunk in stream:
    ls_text.append(chunk.choices[0].delta.content)
    print(chunk.choices[0].delta.content or "", end="", flush=True)

  • Why don't you ask the LLM in the prompt to give you the result in a format that is easier for Pandas, like CSV or JSON? – rehaqds
  • Thanks @rehaqds. I converted it to JSON using the LLM itself. – Sekar

1 Answer

# Accumulate the streamed chunks into a single string
response_text = ""
for chunk in stream:
    if chunk.choices and getattr(chunk.choices[0].delta, "content", None) is not None:
        content = chunk.choices[0].delta.content
        print(content, end="", flush=True)
        response_text += content

print(response_text)
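
Once the full text is collected, you can turn it into a DataFrame. As the comments suggest, the easiest route is to prompt the model to return the rows as JSON (or CSV). The sketch below is an assumption-based example: it assumes the model was asked to reply with a JSON array of objects keyed Sno, Name, Parentname, Hno, age, gender, Enrollment, so adjust the parsing to whatever format your prompt actually produces.

import json
import pandas as pd

# Models often wrap JSON in Markdown fences; strip them before parsing (str.removeprefix/removesuffix need Python 3.9+).
cleaned = response_text.strip().removeprefix("```json").removesuffix("```").strip()

# Assumed shape of the model output:
# [{"Sno": 1, "Name": "...", "Parentname": "...", "Hno": "...",
#   "age": 14, "gender": "F", "Enrollment": "..."}, ...]
records = json.loads(cleaned)

df = pd.DataFrame(records, columns=["Sno", "Name", "Parentname", "Hno",
                                    "age", "gender", "Enrollment"])
print(df.head())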
