[Jan 23-24 | MindSpore 5th Two-Day Training Camp] Practical Assignment 6: Making the Fantasy Come True
Assignment: deploy a ResNet50 inference service with MindSpore Serving and access it.
Reference page: https://www.mindspore.cn/tutorial/inference/zh-CN/master/serving_example.html#
Given that Practical Assignment 5 on Serving ended in failure... https://bbs.huaweicloud.com/forum/thread-104719-1-1.html
...the whole process below is purely Zhang Xiaobai's fantasy. Once Zhang Xiaobai manages to install the Ascend version of Serving, or the Huawei folks ship a CPU version of Serving, these tasks might actually get done smoothly.
It's a pity I won't get to meet the surprise gift, but everyone can still see my surprise simulated assignment... call it a little surprise before the holiday.
(1) Environment preparation: install MindSpore and MindSpore Serving (for Ascend)
1. First complete the MindSpore for Ascend installation (!!!).
2. Then complete the MindSpore Serving for Ascend installation.
3. Verify the Serving installation (see the import check sketched below).
4. Configure the Serving environment variables.
For the steps above, see post #45 of Assignment 5 at https://bbs.huaweicloud.com/forum/thread-104719-1-1.html. None of it has been verified, so if it's wrong we're wrong together, and if it's right we're right together.
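As a stand-in for step 3, a minimal sanity check (just Zhang Xiaobai's guess at what "verify" would look like, assuming both wheels were installed into the current Python environment) could be:

# Minimal post-install sanity check: a sketch, assuming the MindSpore and
# MindSpore Serving wheels are installed in the current Python environment.
import mindspore
import mindspore_serving

print("MindSpore version:", mindspore.__version__)
print("mindspore_serving imported from:", mindspore_serving.__file__)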
(2) Model generation
For this part, see Assignment 2, where it seems to have been done already. The generated MindIR file is: resnet-90_18810.mindir
Of course, that is Zhang Xiaobai's poisonous-mushroom model, not the cat-and-dog kind... No problem, we can simply feed it photos of poisonous mushrooms for inference... or download the ckpt file of a cat-and-dog model and convert it to MindIR format instead (a conversion sketch follows).
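If you take the second route and re-export a checkpoint, a hedged sketch of the conversion could look like this; the network object, checkpoint path and output name are placeholders for whatever you actually trained:

# Sketch: load a checkpoint into a network and export it as a batch-1,
# 224x224 MindIR file. `net`, `ckpt_path` and `out_name` are placeholders.
import numpy as np
from mindspore import Tensor, export, load_checkpoint, load_param_into_net


def ckpt_to_mindir(net, ckpt_path, out_name="resnet50_1b_imagenet"):
    load_param_into_net(net, load_checkpoint(ckpt_path))
    dummy_input = Tensor(np.zeros([1, 3, 224, 224], np.float32))
    export(net, dummy_input, file_name=out_name, file_format="MINDIR")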
(3) Customizing the servable methods
Following the add and subtract examples, first create the following directory structure:
test_dir
├── resnet50/
│   ├── servable_config.py
│   └── 1/
│       └── resnet-90_18810.mindir   (adjust to the actual file name)
├── test_image/
├── master_with_worker.py
└── client.py
servable_config.py
# Copyright 2020 Huawei Technologies Co., Ltd
# Licensed under the Apache License, Version 2.0 (the "License").
# ============================================================================
"""Resnet50 ImageNet config python file"""
import os
import ast
import numpy as np
import mindspore.dataset as ds
import mindspore.dataset.transforms.c_transforms as TC
import mindspore.dataset.vision.c_transforms as VC
from mindspore_serving.worker import register

cur_dir = os.path.abspath(os.path.dirname(__file__))
print("current dir:", cur_dir)

with open(os.path.join(cur_dir, "imagenet1000_clsidx_to_labels.txt"), "r") as fp:
    idx_2_label = ast.literal_eval(fp.read())
    idx_2_label[1000] = "empty"


def preprocess_eager(image):
    """
    Define preprocess, input is image numpy, return preprocess result.
    Return type can be numpy, str, bytes, int, float, or bool.
    Use MindData Eager; this image processing can also use other image processing
    libraries, like numpy, PIL or cv2.
    """
    image_size = 224
    mean = [0.485 * 255, 0.456 * 255, 0.406 * 255]
    std = [0.229 * 255, 0.224 * 255, 0.225 * 255]

    decode = VC.Decode()
    resize = VC.Resize([image_size, image_size])
    normalize = VC.Normalize(mean=mean, std=std)
    hwc2chw = VC.HWC2CHW()

    image = decode(image)
    image = resize(image)
    image = normalize(image)
    image = hwc2chw(image)
    return image


def preprocess_pipeline(instances):
    """
    Define preprocess pipeline. The function arg is multiple instances; every
    instance is a tuple of inputs. This example has one input and one output.
    Use MindData Pipeline.
    """

    def generator_func():
        for instance in instances:
            image = instance[0]
            yield (image,)

    resnet_ds = ds.GeneratorDataset(generator_func, ["image"], shuffle=False)

    image_size = 224
    mean = [0.485 * 255, 0.456 * 255, 0.406 * 255]
    std = [0.229 * 255, 0.224 * 255, 0.225 * 255]
    resnet_ds = resnet_ds.map(operations=VC.Decode(), input_columns="image", num_parallel_workers=8)
    trans = [
        VC.Resize([image_size, image_size]),
        VC.Normalize(mean=mean, std=std),
        VC.HWC2CHW()
    ]
    resnet_ds = resnet_ds.map(operations=TC.Compose(trans), input_columns="image", num_parallel_workers=2)

    for data in resnet_ds.create_dict_iterator():
        image_result = data["image"]
        yield (image_result,)


def postprocess_top1(score):
    """
    Define postprocess. This example has one input and one output.
    The input is the numpy tensor of the score, and the output is the label str of top one.
    """
    max_idx = np.argmax(score)
    return idx_2_label[max_idx]


def postprocess_top5(score):
    """
    Define postprocess. This example has one input and two outputs.
    The input is the numpy tensor of the score. The first output is the str joined by
    the labels of the top five, and the second output is the score tensor of the top five.
    """
    idx = np.argsort(score)[::-1][:5]  # top 5
    ret_label = [idx_2_label[i] for i in idx]  # fixed: index into the mapping, not the whole dict
    ret_score = score[idx]
    return ";".join(ret_label), ret_score


# NOTE: servable_file must match the MindIR file placed under the version directory,
# e.g. resnet-90_18810.mindir in Zhang Xiaobai's case.
register.declare_servable(servable_file="resnet50_1b_imagenet.mindir", model_format="MindIR")


@register.register_method(output_names=["label"])
def classify_top1(image):
    """Define method `classify_top1` for servable `resnet50`.
    The input is `image` and the output is `label`."""
    x = register.call_preprocess_pipeline(preprocess_pipeline, image)
    x = register.call_servable(x)
    x = register.call_postprocess(postprocess_top1, x)
    return x


@register.register_method(output_names=["label"])
def classify_top1_v1(image):
    """Define method `classify_top1_v1` for servable `resnet50`.
    The input is `image` and the output is `label`.
    """
    x = register.call_preprocess(preprocess_eager, image)
    x = register.call_servable(x)
    x = register.call_postprocess(postprocess_top1, x)
    return x


@register.register_method(output_names=["label", "score"])
def classify_top5(image):
    """Define method `classify_top5` for servable `resnet50`.
    The input is `image` and the output is `label` and `score`.
    """
    x = register.call_preprocess_pipeline(preprocess_pipeline, image)
    x = register.call_servable(x)
    label, score = register.call_postprocess(postprocess_top5, x)
    return label, score
The example code uses a file called imagenet1000_clsidx_to_labels.txt. Opening it up, it looks like a lookup table from class index to concrete label names (for the mushroom model we would need a mapping file of our own; a sketch follows the excerpt below).
{0: 'tench, Tinca tinca', 1: 'goldfish, Carassius auratus', 2: 'great white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias', 3: 'tiger shark, Galeocerdo cuvieri', 4: 'hammerhead, hammerhead shark', 。。。 996: 'hen-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa', 997: 'bolete', 998: 'ear, spike, capitulum', 999: 'toilet tissue, toilet paper, bathroom tissue'}
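Since Zhang Xiaobai's model classifies mushrooms rather than the 1000 ImageNet classes, the same idea would need a custom mapping file written in the Python-literal format that servable_config.py reads back with ast.literal_eval(). A sketch with made-up class names and a hypothetical file name:

# Sketch: write an index-to-label file in the literal-dict format that
# servable_config.py parses with ast.literal_eval(). The class names and the
# file name "mushroom_clsidx_to_labels.txt" are made up for illustration.
idx_2_label = {0: "edible mushroom", 1: "poisonous mushroom"}

with open("mushroom_clsidx_to_labels.txt", "w") as fp:
    fp.write(repr(idx_2_label))

The file name referenced inside servable_config.py would then need to be changed to match.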
(4) Deploying the service
Run master_with_worker.py:
# Copyright 2020 Huawei Technologies Co., Ltd
# Licensed under the Apache License, Version 2.0 (the "License").
# ============================================================================
"""Start Servable resnet50"""
import os
from mindspore_serving import master
from mindspore_serving import worker


def start():
    servable_dir = os.path.abspath(".")
    worker.start_servable_in_master(servable_dir, "resnet50", device_id=0)
    master.start_grpc_server("127.0.0.1", 5500)
    master.start_restful_server("127.0.0.1", 1500)


if __name__ == "__main__":
    start()
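Once master_with_worker.py is (imaginarily) running, a quick way to confirm that the two servers from the script above are actually listening; this check is purely Zhang Xiaobai's own addition, not part of the tutorial:

# Sketch: confirm the gRPC (5500) and RESTful (1500) ports opened by
# master_with_worker.py are reachable on 127.0.0.1. Not part of the tutorial.
import socket

for name, port in (("gRPC", 5500), ("RESTful", 1500)):
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(2)
        reachable = sock.connect_ex(("127.0.0.1", port)) == 0
        print(name, "server on port", port, ":", "reachable" if reachable else "not reachable")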
(5) Executing inference efficiently
Run client.py:
# Copyright 2020 Huawei Technologies Co., Ltd
# Licensed under the Apache License, Version 2.0 (the "License").
# ============================================================================
"""Client for resnet50"""
import os
from mindspore_serving.client import Client


def read_images():
    """Read images from directory test_image"""
    images_buffer = []
    for path, _, file_list in os.walk("./test_image/"):
        for file_name in file_list:
            image_file = os.path.join(path, file_name)
            print(image_file)
            with open(image_file, "rb") as fp:
                images_buffer.append(fp.read())
    return images_buffer


def run_classify_top1():
    """Client for servable resnet50 and method classify_top1"""
    print("run_classify_top1-----------")
    client = Client("localhost", 5500, "resnet50", "classify_top1")
    instances = []
    for image in read_images():
        instances.append({"image": image})
    result = client.infer(instances)
    print(result)


def run_classify_top1_v1():
    """Client for servable resnet50 and method classify_top1_v1"""
    print("run_classify_top1_v1-----------")
    client = Client("localhost", 5500, "resnet50", "classify_top1_v1")
    instances = []
    for image in read_images():
        instances.append({"image": image})
    result = client.infer(instances)
    print(result)


def run_classify_top5():
    """Client for servable resnet50 and method classify_top5"""
    print("run_classify_top5-----------")
    client = Client("localhost", 5500, "resnet50", "classify_top5")
    instances = []
    for image in read_images():  # read multiple images
        instances.append({"image": image})  # input `image`
    result = client.infer(instances)
    print(result)
    for result_item in result:  # result for every image
        label = result_item["label"]  # result `label`
        score = result_item["score"]  # result `score`
        print("label result:", label)
        print("score result:", score)


def run_classify_top5_async():
    """Async client for servable resnet50 and method classify_top5"""
    print("run_classify_top5_async-----------")
    client = Client("localhost", 5500, "resnet50", "classify_top5")
    instances = []
    for image in read_images():  # read multiple images
        instances.append({"image": image})  # input `image`
    result_future = client.infer_async(instances)
    result = result_future.result()
    print(result)
    for result_item in result:  # result for every image
        label = result_item["label"]  # result `label`
        score = result_item["score"]  # result `score`
        print("label result:", label)
        print("score result:", score)


def run_restful_classify_top1():
    """RESTful client for servable resnet50 and method classify_top1"""
    print("run_restful_classify_top1-----------")
    import base64
    import requests
    import json
    instances = []
    for image in read_images():
        base64_data = base64.b64encode(image).decode()
        instances.append({"image": {"b64": base64_data}})
    instances_map = {"instances": instances}
    post_payload = json.dumps(instances_map)
    ip = "localhost"
    restful_port = 1500
    servable_name = "resnet50"
    method_name = "classify_top1"
    result = requests.post(f"http://{ip}:{restful_port}/model/{servable_name}:{method_name}",
                           data=post_payload)
    print(result.text)


if __name__ == '__main__':
    run_classify_top1()
    run_classify_top1_v1()
    run_classify_top5()
    run_restful_classify_top1()
    run_classify_top5_async()
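As a small usage variation on client.py (again purely imagined until the environment works), a hypothetical helper that pairs each file under ./test_image/ with its top-1 label could look like this; it reuses the same Client API as client.py and assumes the Serving master is running locally:

# Sketch: pair every image under ./test_image/ with its classify_top1 result.
# `classify_files` is a hypothetical helper, not part of the tutorial.
import os
from mindspore_serving.client import Client


def classify_files(image_dir="./test_image/"):
    client = Client("localhost", 5500, "resnet50", "classify_top1")
    file_names, instances = [], []
    for path, _, file_list in os.walk(image_dir):
        for file_name in file_list:
            file_names.append(file_name)
            with open(os.path.join(path, file_name), "rb") as fp:
                instances.append({"image": fp.read()})
    for file_name, item in zip(file_names, client.infer(instances)):
        print(file_name, "->", item.get("label"))


if __name__ == "__main__":
    classify_files()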
Of course, Zhang Xiaobai suspects that actually using the model would run into problems as well. All of this will have to wait until MindSpore for Ascend works before there is anything more to say.
So everything here is fantasy; only the Spring Festival is real.