All Content tagged with TensorFlow on AWS

Quickly and easily get started with deep learning in the cloud

20 results
I have been working on building a QML model for the classification of an image dataset, which is approximately 194 MB in size. While executing the code on the AWS Braket Jupyter instance, I encountere...
2 answers · 0 votes · 819 views · asked 6 months ago
Hi AWS team, I am currently running the Greengrass qualification test. But the TensorFlowLiteImageClassification test item has consistently failed to pass. My OS is Debian on an aarch64 architecture...
3 answers · 0 votes · 115 views · asked 6 months ago
How can you deploy a TensorFlow model to an async endpoint in Amazon SageMaker while including an inference.py script?
2 answers · 0 votes · 176 views · AWS · asked 6 months ago
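As a minimal sketch of what this question is asking about (not a verified answer): the SageMaker Python SDK lets you attach a custom `inference.py` to a `TensorFlowModel` and make the endpoint asynchronous by passing an `AsyncInferenceConfig` to `deploy()`. The bucket names, paths, framework version, and IAM role below are placeholders.

```python
# Sketch: deploy a TensorFlow model to a SageMaker asynchronous endpoint
# with a custom inference.py. All names/paths/roles are placeholders.
from sagemaker.tensorflow import TensorFlowModel
from sagemaker.async_inference import AsyncInferenceConfig

model = TensorFlowModel(
    model_data="s3://my-bucket/model/model.tar.gz",            # placeholder artifact
    role="arn:aws:iam::123456789012:role/SageMakerRole",       # placeholder role
    framework_version="2.12",                                  # assumed TF version
    entry_point="inference.py",   # custom pre/post-processing script
    source_dir="code",            # directory containing inference.py
)

async_config = AsyncInferenceConfig(
    output_path="s3://my-bucket/async-output/",  # where async results are written
)

predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.m5.xlarge",
    async_inference_config=async_config,  # this makes the endpoint asynchronous
)
```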
When I attempt to deploy a model to an endpoint with initial_instance_count=1, it works fine. Attempting to deploy the same model to an endpoint with initial_instance_count=2, however, always results ...
1 answer · 0 votes · 105 views · asked 8 months ago
Hi, Is there more documentation/examples for *TensorFlow* on Trn1/Trn1n instances? Documentation at https://awsdocs-neuron.readthedocs-hosted.com/en/latest/frameworks/tensorflow/index.html ha...
3 answers · 0 votes · 638 views · asked a year ago
I'm considering launching an instance to work on one of my TensorFlow models since my current PC doesn't perform efficiently. My PC has 32 GB of RAM, a 20-core i7 processor, and an RTX 3050 Ti GPU with 20 GB. I...
1 answer · 0 votes · 718 views · asked 2 years ago
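A minimal sanity check related to this question, assuming TensorFlow is already installed on the launched instance (for example via a Deep Learning AMI): confirm that TensorFlow actually sees the GPU before starting training.

```python
# Sketch: verify GPU visibility on the instance before training.
import tensorflow as tf

gpus = tf.config.list_physical_devices("GPU")
print("GPUs visible to TensorFlow:", gpus)
print("Built with CUDA support:", tf.test.is_built_with_cuda())
```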
Hello everyone! I'm trying to deploy an emotion recognition model (format: `model.h5`, a Keras model). I have tried a couple of ways, but it isn't working out for me. I tri...
0 answers · 0 votes · 150 views · asked 2 years ago
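One common path for this kind of question, sketched under several assumptions (not a confirmed fix): export the Keras `.h5` model as a TensorFlow SavedModel, package it as `model.tar.gz`, upload it to S3, and deploy it with `TensorFlowModel`. All bucket names, paths, framework version, and the IAM role are placeholders.

```python
# Sketch: package a Keras .h5 model for the SageMaker TensorFlow Serving container.
import tarfile
import tensorflow as tf
from sagemaker.tensorflow import TensorFlowModel

# Export the Keras model as a SavedModel under a numbered version directory,
# which is the layout TensorFlow Serving expects.
model = tf.keras.models.load_model("model.h5")
model.save("export/1")

# Package the SavedModel as model.tar.gz.
with tarfile.open("model.tar.gz", "w:gz") as tar:
    tar.add("export/1", arcname="1")

# After uploading model.tar.gz to S3 (e.g., with the AWS CLI or boto3):
tf_model = TensorFlowModel(
    model_data="s3://my-bucket/emotion/model.tar.gz",        # placeholder S3 path
    role="arn:aws:iam::123456789012:role/SageMakerRole",     # placeholder role
    framework_version="2.12",                                # assumed TF version
)
predictor = tf_model.deploy(initial_instance_count=1, instance_type="ml.m5.large")
```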
I have started the Docker API for loading the model. The container is running successfully, but in the browser it shows "This site can't be reached" (http://13.189.100.89:8051/v1/models/my_model) refused ...
0 answers · 0 votes · 202 views · asked 2 years ago
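A minimal sketch for checking whether a TensorFlow Serving container is reachable over REST. The host, port, and model name below are placeholders; note that TensorFlow Serving's default REST port is 8501, so the container must be started with that port published (for example `docker run -p 8501:8501 ...`) and the instance's security group must allow inbound traffic on it.

```python
# Sketch: query TensorFlow Serving's model-status REST endpoint.
import requests

HOST = "13.189.100.89"   # placeholder host from the question
PORT = 8501              # TF Serving's default REST port
MODEL = "my_model"

resp = requests.get(f"http://{HOST}:{PORT}/v1/models/{MODEL}", timeout=10)
print(resp.status_code)
print(resp.json())       # model version status, if the server is reachable
```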
I trained a deep neural network using SageMaker and deployed the endpoint successfully. But when trying to get predictions on test data, I am getting "SSLError: SSL validation failed for https://...
1 answer · 0 votes · 4.6K views · asked 2 years ago
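For reference, a minimal sketch of invoking a deployed endpoint with boto3. SSL validation errors like the one quoted are typically environmental (certificate bundle, proxy, or system clock) rather than a problem with the call itself, so this is only a baseline invocation; the region, endpoint name, and payload are placeholders.

```python
# Sketch: invoke a SageMaker endpoint via the sagemaker-runtime API.
import json
import boto3

runtime = boto3.client("sagemaker-runtime", region_name="us-east-1")  # assumed region

response = runtime.invoke_endpoint(
    EndpointName="my-tf-endpoint",                         # placeholder endpoint name
    ContentType="application/json",
    Body=json.dumps({"instances": [[0.1, 0.2, 0.3]]}),     # placeholder test record
)
print(json.loads(response["Body"].read()))
```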
Can a customized DLAMI be used for both GPU- and CPU-backed instances? When creating a GPU-compatible AMI using https://aws.amazon.com/releasenotes/aws-deep-learning-ami-ubuntu-18-04/, they don't work on ...
1 answer · 0 votes · 604 views · AWS · asked 2 years ago
Hi Team, We have trained our ML model using the Keras framework (version 2.11.0), which is used for prediction, and we want to run this model on an edge device (Linux), so we made use of AWS SageMaker Edg...
0 answers · 0 votes · 87 views · asked 2 years ago
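This is not the SageMaker Edge Manager flow the question refers to, but a commonly used alternative sketch for running a Keras model on a Linux edge device: convert it to TensorFlow Lite and run it with the TFLite interpreter on the device. Paths, shapes, and input data are placeholders.

```python
# Sketch: convert a Keras .h5 model to TFLite and run it with the interpreter.
import numpy as np
import tensorflow as tf

# On the development machine: convert the trained Keras model to a .tflite flatbuffer.
keras_model = tf.keras.models.load_model("model.h5")
converter = tf.lite.TFLiteConverter.from_keras_model(keras_model)
with open("model.tflite", "wb") as f:
    f.write(converter.convert())

# On the edge device: load and run the model with the TFLite interpreter.
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=inp["dtype"]))
interpreter.invoke()
print(interpreter.get_tensor(out["index"]))
```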
I'm using SageMaker to train the data. It has the pre-trained model `tensorflow-od1-ssd-resnet50-v1-fpn-640x640-coco17-tpu-8`. **Create the SageMaker model instance. Note that we need to pass the Predictor cl...
0 answers · 0 votes · 151 views · asked 2 years ago
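A minimal sketch of the step the excerpt describes, assuming the SageMaker JumpStart flow for this pre-trained object-detection model: retrieve the inference image and model artifact, then create a `Model` with `predictor_cls` so that `deploy()` returns a `Predictor`. The region, role, and instance type are placeholders.

```python
# Sketch: create a SageMaker Model for a JumpStart pre-trained model,
# passing the Predictor class so deploy() returns a Predictor.
from sagemaker import image_uris, model_uris
from sagemaker.model import Model
from sagemaker.predictor import Predictor

model_id = "tensorflow-od1-ssd-resnet50-v1-fpn-640x640-coco17-tpu-8"
model_version = "*"
instance_type = "ml.p3.2xlarge"                              # placeholder instance type
role = "arn:aws:iam::123456789012:role/SageMakerRole"        # placeholder role

image_uri = image_uris.retrieve(
    region=None,
    framework=None,
    image_scope="inference",
    model_id=model_id,
    model_version=model_version,
    instance_type=instance_type,
)
model_data = model_uris.retrieve(
    model_id=model_id, model_version=model_version, model_scope="inference"
)

model = Model(
    image_uri=image_uri,
    model_data=model_data,
    role=role,
    predictor_cls=Predictor,   # deploy() will return a Predictor instance
)
predictor = model.deploy(initial_instance_count=1, instance_type=instance_type)
```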