After you've trained and exported a TensorFlow model, you can use Amazon SageMaker to perform inferences using your model. You can either:

- Deploy your model to an endpoint to obtain real-time inferences from your model.
- Use batch transform to obtain inferences on an entire dataset stored in Amazon S3.

In the case of batch transform, it's becoming increasingly necessary to perform fast, optimized batch inference on large datasets. In this post, you learn how to use Amazon SageMaker batch transform to perform inferences on large datasets. The example in this post uses a TensorFlow Serving (TFS) container to do batch inference on a large dataset of images. You also see how to use the new pre- and post-processing feature of the Amazon SageMaker TFS container. This feature enables your TensorFlow model to make inferences directly on data in S3 and also save post-processed inferences to S3.

The dataset in this example is the "Challenge 2018/2019" subset of the Open Images V5 Dataset. This subset consists of 100,000 images in JPG format for a total of 10 GB. The model you use is an image-classification model based on the ResNet-50 architecture that has been trained on the ImageNet dataset and exported as a TensorFlow SavedModel. Use this model to predict the class of each image (for example, boat, car, bird).
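To give a feel for the pre- and post-processing feature, here is a minimal sketch of an `inference.py` module of the kind the SageMaker TFS container can load alongside the SavedModel. The `input_handler`/`output_handler` function names follow the container's documented interface; the exact content types and the JPG payload shape are assumptions for illustration, not the post's full implementation.

```python
# inference.py -- sketch of pre/post-processing handlers for the
# SageMaker TensorFlow Serving container (assumptions noted above).
import base64
import io
import json


def input_handler(data, context):
    """Pre-process each request before it reaches TensorFlow Serving.

    data: file-like object holding the raw request body (here, JPG bytes)
    context: request metadata (content type, accept header, etc.)
    """
    if context.request_content_type == "application/x-image":
        # Wrap the raw image bytes in the JSON structure that the TFS
        # REST predict API expects: base64 data under the "b64" key.
        raw = data.read()
        encoded = base64.b64encode(raw).decode("utf-8")
        return json.dumps({"instances": [{"b64": encoded}]})
    raise ValueError(
        "Unsupported content type: {}".format(context.request_content_type)
    )


def output_handler(data, context):
    """Post-process the TFS response before it is written back to S3.

    Here we pass the prediction through unchanged; a real handler might
    map class indices to labels (boat, car, bird) at this point.
    """
    return data.content, context.accept_header
```

With handlers like these, batch transform can read JPG objects directly from S3, convert each one into a TFS predict request, and save the (optionally post-processed) responses back to S3.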