Why is streaming not supported for Titan and Cohere models on Bedrock?

Here is my code that enables streaming tokens. It works for Llama 2, but not for Titan or Cohere. Do these models not support streaming yet?


import json

import boto3

# Bedrock Runtime client
brt = boto3.client('bedrock-runtime')

accept = '*/*'
contentType = 'application/json'
response = brt.invoke_model_with_response_stream(
    modelId='amazon.titan-text-express-v1',
    body=body,
    accept=accept,
    contentType=contentType
)
 

stream = response.get('body')
if stream:
    for event in stream:
        chunk = event.get('chunk')
        if chunk:
            print(json.loads(chunk.get('bytes').decode()))
JCJJ
asked 6 months ago · 537 views

1 Answer
Cohere models do support streaming, but you need to pass the "stream": true parameter in the JSON request body to enable it, in addition to using the invoke_model_with_response_stream API.

You can try the following code:

body = json.dumps({
    "prompt": prompt,
    "max_tokens": max_token_count,
    "temperature": temperature,
    "p": top_p,
    "stop_sequences": stop_sequences,
    "stream": True,
})

response = brt.invoke_model_with_response_stream(
    modelId='cohere.command-text-v14', 
    body=body,
    accept=accept, 
    contentType=contentType
)

stream = response.get('body')
if stream:
    for event in stream:
        chunk = event.get('chunk')
        if chunk:
            print(json.loads(chunk.get('bytes').decode()))
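For reference, each streamed event's chunk bytes decode to a small JSON fragment. A minimal sketch of accumulating the generated text from those fragments, assuming the text appears under generations[0].text for Cohere chunks and under outputText for Titan chunks (check the model's response schema in the Bedrock docs for your model version):

```python
import json

def collect_stream_text(stream):
    """Accumulate generated text from a Bedrock response stream.

    Assumes each chunk decodes to JSON carrying the text under
    'generations'[0]['text'] (Cohere) or 'outputText' (Titan).
    """
    pieces = []
    for event in stream:
        chunk = event.get('chunk')
        if not chunk:
            continue
        payload = json.loads(chunk['bytes'].decode())
        if 'generations' in payload:        # Cohere-style chunk
            pieces.append(payload['generations'][0].get('text', ''))
        elif 'outputText' in payload:       # Titan-style chunk
            pieces.append(payload['outputText'])
    return ''.join(pieces)

# The helper only needs an iterable of events, so it can be tried
# without AWS credentials using fabricated chunks:
fake_stream = [
    {'chunk': {'bytes': json.dumps({'generations': [{'text': 'Hello'}]}).encode()}},
    {'chunk': {'bytes': json.dumps({'generations': [{'text': ' world'}]}).encode()}},
]
print(collect_stream_text(fake_stream))  # Hello world
```

With a real call, pass response.get('body') directly as the stream argument.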
AWS
EXPERT
answered 6 months ago
EXPERT-verified 2 months ago
