Why is streaming not supported for Titan and Cohere models on Bedrock?


Here is my code that enables streaming tokens. It works for LLaMA-2 but not for Titan or Cohere. Do these models not support streaming yet?


import json
import boto3

brt = boto3.client('bedrock-runtime')

accept = '*/*'
contentType = 'application/json'
response = brt.invoke_model_with_response_stream(
    modelId='amazon.titan-text-express-v1',
    body=body,
    accept=accept,
    contentType=contentType
)

stream = response.get('body')
if stream:
    for event in stream:
        chunk = event.get('chunk')
        if chunk:
            print(json.loads(chunk.get('bytes').decode()))
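The snippet above never shows how body was built, and for Titan Text models the request body must follow the inputText / textGenerationConfig shape. A minimal sketch of such a body (the prompt and parameter values here are illustrative, not from the original question):

```python
import json

# Request body sketch for amazon.titan-text-express-v1.
# Field names follow the Titan Text request schema; values are illustrative.
body = json.dumps({
    "inputText": "Explain what Amazon Bedrock is in one sentence.",
    "textGenerationConfig": {
        "maxTokenCount": 512,   # upper bound on generated tokens
        "temperature": 0.5,     # sampling temperature
        "topP": 0.9,            # nucleus-sampling cutoff
        "stopSequences": [],    # optional stop strings
    },
})
```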
JCJJ
asked 6 months ago · 537 views
1 Answer

Cohere models do support streaming, but you need to pass the "stream": true parameter in the JSON request body to enable it, in addition to using the invoke_model_with_response_stream API.

You can try the following code:

body = json.dumps({
    "prompt": prompt,
    "max_tokens": max_token_count,
    "temperature": temperature,
    "p": top_p,
    "stop_sequences": stop_sequences,
    "stream": True,
})

response = brt.invoke_model_with_response_stream(
    modelId='cohere.command-text-v14', 
    body=body,
    accept=accept, 
    contentType=contentType
)

stream = response.get('body')
if stream:
    for event in stream:
        chunk = event.get('chunk')
        if chunk:
            print(json.loads(chunk.get('bytes').decode()))
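The loop above prints each decoded chunk as it arrives. To assemble the full completion you can accumulate the text carried in each chunk. A sketch of such a helper, assuming each chunk's JSON payload carries the generated text under a "text" key (the exact field name varies by model, so inspect one decoded chunk first); the fake_stream events below are fabricated locally for illustration:

```python
import json

def collect_stream_text(stream, key="text"):
    """Accumulate generated text from a Bedrock response stream.

    Assumption: each event's chunk payload is a JSON object with the
    generated text under `key` (field name differs per model family).
    """
    pieces = []
    for event in stream:
        chunk = event.get("chunk")
        if chunk:
            payload = json.loads(chunk.get("bytes").decode())
            if key in payload:
                pieces.append(payload[key])
    return "".join(pieces)

# Simulated events in the same envelope shape Bedrock uses
# ({"chunk": {"bytes": ...}}), just to exercise the helper locally:
fake_stream = [
    {"chunk": {"bytes": json.dumps({"text": "Hello, "}).encode()}},
    {"chunk": {"bytes": json.dumps({"text": "world!"}).encode()}},
]
print(collect_stream_text(fake_stream))  # → Hello, world!
```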
AWS
EXPERT
answered 6 months ago
reviewed by EXPERT 2 months ago
