Why is streaming not supported for Titan and Cohere models on Bedrock?


Here is my code that enables streaming tokens. It works for LLaMA 2 but not for Titan or Cohere. Do these models not support streaming yet?


import json
import boto3

# brt is the Bedrock runtime client; body is the model-specific JSON payload
brt = boto3.client('bedrock-runtime')

accept = '*/*'
contentType = 'application/json'
response = brt.invoke_model_with_response_stream(
    modelId='amazon.titan-text-express-v1',
    body=body,
    accept=accept,
    contentType=contentType
)

stream = response.get('body')
if stream:
    for event in stream:
        chunk = event.get('chunk')
        if chunk:
            print(json.loads(chunk.get('bytes').decode()))
JCJJ
asked 6 months ago

1 Answer

Cohere models do support streaming, but you need to pass the "stream": true parameter in the JSON body to enable it, in addition to using the invoke_model_with_response_stream API.

You can try the following code:

body = json.dumps({
    "prompt": prompt,
    "max_tokens": max_token_count,
    "temperature": temperature,
    "p": top_p,
    "stop_sequences": stop_sequences,
    "stream": True,
})

response = brt.invoke_model_with_response_stream(
    modelId='cohere.command-text-v14', 
    body=body,
    accept=accept, 
    contentType=contentType
)

stream = response.get('body')
if stream:
    for event in stream:
        chunk = event.get('chunk')
        if chunk:
            print(json.loads(chunk.get('bytes').decode()))
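
For reference, the chunk-handling loop above can be factored into a small helper. The sketch below runs against simulated events rather than a live Bedrock call, and the "text" payload key and the event shapes here are assumptions for illustration; the actual payload fields differ per model family (e.g. Titan and Cohere use different response schemas):

```python
import json

def collect_stream_text(events, text_key="text"):
    """Concatenate generated text from Bedrock-style stream events.

    Each event is expected to look like {"chunk": {"bytes": b'{...}'}},
    mirroring the shape yielded by invoke_model_with_response_stream.
    The payload field holding the text varies by model, so it is a parameter.
    """
    pieces = []
    for event in events:
        chunk = event.get("chunk")
        if chunk:
            payload = json.loads(chunk["bytes"].decode())
            if text_key in payload:
                pieces.append(payload[text_key])
    return "".join(pieces)

# Simulated events standing in for a real response["body"] stream:
fake_events = [
    {"chunk": {"bytes": json.dumps({"text": "Hello"}).encode()}},
    {"chunk": {"bytes": json.dumps({"text": ", world"}).encode()}},
    {"other": {}},  # non-chunk events (e.g. metadata) are skipped
]
print(collect_stream_text(fake_events))  # → Hello, world
```

In a real call you would pass response.get('body') in place of fake_events; the helper only assumes each event's chunk bytes decode to JSON.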
AWS EXPERT
answered 6 months ago
