How to handle this error: "Failed to get input map for signature: serving_default" [closed]

+0
−5

Closed as off topic by Mithical‭ on Aug 4, 2023 at 11:52

This question is not within the scope of Code Golf.

This question was closed; new answers can no longer be added. Users with the reopen privilege may vote to reopen this question if it has been improved or closed incorrectly.

```python
import requests
import json
import numpy as np
import base64
import cv2

# Replace this with the actual image path you want to test
image_path = 'H_L_.jpg'

# Read and preprocess the image
image = cv2.imread(image_path)
image = cv2.resize(image, (256, 256))
image = image.astype(np.float32) / 255.0
image = np.expand_dims(image, axis=0)

# Convert the NumPy array to bytes before encoding
encoded_image = base64.b64encode(image.tobytes()).decode('utf-8')

# Prepare the JSON request with the signature name
data = {
    "signature_name": "serving_default",
    "instances": [{"input_1": encoded_image}]  # Adjust the input key based on your model's signature
}

# Replace these labels with your actual labels
labels = ['Potato___Early_blight', 'Potato___Late_blight', 'Potato___healthy']

# Send the inference request to TensorFlow Serving
url = 'http://localhost:8501/v1/models/model:predict'  # Replace 'model' with the actual model name and version
headers = {"content-type": "application/json"}
response = requests.post(url, data=json.dumps(data), headers=headers)

# Process the response
if response.status_code == 200:
    predictions = response.json()['predictions'][0]
    predicted_class_idx = np.argmax(predictions)
    predicted_label = labels[predicted_class_idx]
    print("Predicted Label:", predicted_label)
    print("Class Probabilities:", predictions)
else:
    print("Error: Unable to get predictions. Status code:", response.status_code)
    print("Response content:", response.content)
```
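This error commonly means the request does not match the model's `serving_default` SignatureDef: either the input key (`input_1` here) is wrong, or the signature expects a plain float tensor rather than base64-encoded raw bytes. A minimal sketch of building the payload as nested lists instead of base64, assuming (hypothetically) that the model's signature takes a `(batch, 256, 256, 3)` float input:

```python
import json
import numpy as np

# Hypothetical preprocessed batch: 1 image, 256x256x3, floats in [0, 1].
# In the original script this would come from cv2.imread + resize + scaling.
image = np.random.rand(1, 256, 256, 3).astype(np.float32)

# For a plain float-tensor signature, send the array as nested lists;
# TensorFlow Serving's REST API decodes JSON numbers into the tensor directly.
payload = {
    "signature_name": "serving_default",
    "instances": image.tolist(),
}
body = json.dumps(payload)

print(len(payload["instances"]))     # batch size: 1
print(len(payload["instances"][0]))  # height: 256
```

To confirm the real input key and shape, inspect the exported model with `saved_model_cli show --dir <model_dir> --all`, or query the serving metadata endpoint at `http://localhost:8501/v1/models/model/metadata` (with `model` replaced by the actual model name).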

1 comment thread

Advice for posting (1 comment)

0 answers