How do I save the inference model required for the competition?

Could the experts here please advise: I'm using the SSD model from the official PaddlePaddle model library. After training, I read the model back and save it as an inference model, but score.py keeps failing to load it. I suspect the target_vars and feeded_var_names arguments to fluid.io.save_inference_model are not set correctly; running the score script always reports an output size mismatch. How should these parameters be set?

All comments (5)
Sorted by time
dreamaker_2010
#2 · Replied 2019-08

I ran into the same issue. In the official PaddlePaddle SSD model library, you need to call save_inference_model instead of save_persistables.

The target_vars and feeded_var_names can be obtained from the build_program function.

I discussed this at length here ->

https://github.com/PaddlePaddle/Paddle/issues/19343

Hope this helps

张呓语
#3 · Replied 2019-08

Many thanks for your reply. Your suggestion works very well, and I successfully built a model whose output fits score.py perfectly. It works fine when I train a new model and save it, but I still run into problems when I try to load a model that was saved with fluid.io.save_persistables. I have made sure I use the same feeded_var_names and target_vars.

dreamaker_2010
#4 · Replied 2019-09

I am not sure I understood your exact concern. Anyway, here is a reply to what I think you are asking.

Whenever a model is saved using save_inference_model, it creates a __model__ file, which describes how inference should run. score.py uses load_inference_model, which looks for this file. But save_persistables does not create any such file, so a model saved with save_persistables cannot be used directly for inference. That's why a model saved using save_persistables will not work with score.py.

So you have two options now:

1. Train a new model and this time use save_inference_model instead of save_persistables

2. If you already have a trained model that you want to use, but it was saved using save_persistables, write a separate script that loads that model with load_persistables and then saves it again with save_inference_model. Special case -> if the model is MobileNet-SSD, you can simply use the pretrained_model flag in the official PaddlePaddle code and, inside the save_model function, call save_inference_model instead of save_persistables.

Hope this helps.

rose20135188
#5 · Replied 2019-09

Taking a look, and learning a bit along the way.

luckyToMe2
#6 · Replied 2020-01

Taking a look.
