Need advice: the example I ran reports an error.
Paddle Serving Q&A · Serving deployment · 1100 · 2

I ran the house-price prediction example from the PaddleServing project on GitHub, and it fails with the error below. Please advise.

The HTTP request looks like this:
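(The request screenshot did not survive.) As a hedged reconstruction, a request carrying the same 13 normalized features that appear later in the server log, aimed at the endpoint the server prints at startup, would look roughly like this; the feed key `"x"` is an assumption based on the standard uci_housing example:

```python
import json

# The 13 normalized uci_housing features, copied from the failing request
# that general_model.cpp logs later in this thread.
feed = {"x": [0.0137, -0.1136, 0.2553, -0.0692, 0.0582, -0.0727, -0.1583,
              -0.0584, 0.6283, 0.4919, 0.1856, 0.0795, -0.0332]}
payload = {"feed": [feed], "fetch": ["price"]}

print(json.dumps(payload))
# This JSON would be POSTed to the address printed at startup, e.g.:
#   curl -H "Content-Type: application/json" -d '<payload>' \
#        http://172.17.0.4:9494/uci/prediction
```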

And Docker reports this error:

python3 -m paddle_serving_server.serve --model uci_housing_model --thread 10 --port 9494 --name uci
/usr/local/lib/python3.6/runpy.py:125: RuntimeWarning: 'paddle_serving_server.serve' found in sys.modules after import of package 'paddle_serving_server', but prior to execution of 'paddle_serving_server.serve'; this may result in unpredictable behaviour
warn(RuntimeWarning(msg))
This API will be deprecated later. Please do not use it
This API will be deprecated later. Please do not use it
web service address:
http://172.17.0.4:9494/uci/prediction
Going to Run Comand
/usr/local/lib/python3.6/site-packages/paddle_serving_server/serving-cpu-avx-openblas-0.6.0/serving -enable_model_toolkit -inferservice_path workdir -inferservice_file infer_service.prototxt -max_concurrency 0 -num_threads 10 -port 12000 -precision fp32 -use_calib False -reload_interval_s 10 -resource_path workdir -resource_file resource.prototxt -workflow_path workdir -workflow_file workflow.prototxt -bthread_concurrency 10 -max_body_size 67108864
* Serving Flask app "serve" (lazy loading)
* Environment: production
WARNING: This is a development server. Do not use it in a production deployment.
Use a production WSGI server instead.
* Debug mode: off
I0100 00:00:00.000000 343 op_repository.h:68] RAW: Succ regist op: GeneralCopyOp
I0100 00:00:00.000000 343 op_repository.h:68] RAW: Succ regist op: GeneralDistKVInferOp
I0100 00:00:00.000000 343 op_repository.h:68] RAW: Succ regist op: GeneralDistKVQuantInferOp
I0100 00:00:00.000000 343 op_repository.h:68] RAW: Succ regist op: GeneralInferOp
I0100 00:00:00.000000 343 op_repository.h:68] RAW: Succ regist op: GeneralReaderOp
I0100 00:00:00.000000 343 op_repository.h:68] RAW: Succ regist op: GeneralResponseOp
I0100 00:00:00.000000 343 op_repository.h:68] RAW: Succ regist op: GeneralTextReaderOp
I0100 00:00:00.000000 343 op_repository.h:68] RAW: Succ regist op: GeneralTextResponseOp
I0100 00:00:00.000000 343 service_manager.h:79] RAW: Service[LoadGeneralModelService] insert successfully!
I0100 00:00:00.000000 343 load_general_model_service.pb.h:333] RAW: Success regist service[LoadGeneralModelService][PN5baidu14paddle_serving9predictor26load_general_model_service27LoadGeneralModelServiceImplE]
I0100 00:00:00.000000 343 service_manager.h:79] RAW: Service[GeneralModelService] insert successfully!
I0100 00:00:00.000000 343 general_model_service.pb.h:1507] RAW: Success regist service[GeneralModelService][PN5baidu14paddle_serving9predictor13general_model23GeneralModelServiceImplE]
I0100 00:00:00.000000 343 factory.h:155] RAW: Succ insert one factory, tag: PADDLE_INFER, base type N5baidu14paddle_serving9predictor11InferEngineE
W0100 00:00:00.000000 343 paddle_engine.cpp:29] RAW: Succ regist factory: ::baidu::paddle_serving::predictor::FluidInferEngine->::baidu::paddle_serving::predictor::InferEngine, tag: PADDLE_INFER in macro!
--- Running analysis [ir_graph_build_pass]
--- Running analysis [ir_graph_clean_pass]
--- Running analysis [ir_analysis_pass]
--- Running analysis [ir_params_sync_among_devices_pass]
--- Running analysis [adjust_cudnn_workspace_size_pass]
--- Running analysis [inference_op_replace_pass]
--- Running analysis [memory_optimize_pass]
--- Running analysis [ir_graph_to_program_pass]
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0823 01:29:21.256803 372 naming_service_thread.cpp:209] brpc::policy::ListNamingService("127.0.0.1:12000"): added 1
This API will be deprecated later. Please do not use it
W0823 01:29:21.263702 372 predictor.hpp:129] inference call failed, message: [E112]1/1 channels failed, fail_limit=1 [C0][E1014]Got EOF of fd=9 SocketId=1@127.0.0.1:12000@58270 [R1][E111]Fail to connect SocketId=8589934594@127.0.0.1:12000: Connection refused [R2][E112]Fail to select server from list://127.0.0.1:12000 lb=la
E0823 01:29:21.263754 372 general_model.cpp:423] failed call predictor with req: insts { tensor_array { float_data: 0.0137 float_data: -0.1136 float_data: 0.2553 float_data: -0.0692 float_data: 0.0582 float_data: -0.0727 float_data: -0.1583 float_data: -0.0584 float_data: 0.6283 float_data: 0.4919 float_data: 0.1856 float_data: 0.0795 float_data: -0.0332 elem_type: 1 shape: 1 shape: 13 } } fetch_var_names: "price" log_id: 0
This API will be deprecated later. Please do not use it
Illegal instruction
[2021-08-23 01:29:21,264] ERROR in app: Exception on /uci/prediction [POST]
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 2447, in wsgi_app
    response = self.full_dispatch_request()
  File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1952, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1821, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/usr/local/lib/python3.6/site-packages/flask/_compat.py", line 39, in reraise
    raise value
  File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1950, in full_dispatch_request
    rv = self.dispatch_request()
  File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1936, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/usr/local/lib/python3.6/site-packages/paddle_serving_server/serve.py", line 428, in run
    return web_service.get_prediction(request)
  File "/usr/local/lib/python3.6/site-packages/paddle_serving_server/web_service.py", line 251, in get_prediction
    feed=request.json["feed"], fetch=fetch, fetch_map=fetch_map)
  File "/usr/local/lib/python3.6/site-packages/paddle_serving_server/web_service.py", line 352, in postprocess
    for key in fetch_map:
TypeError: 'NoneType' object is not iterable
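The final TypeError can be reproduced in isolation: when the RPC call fails (as the E112/E1014 lines above show), fetch_map comes back as None, and the loop at web_service.py line 352 iterates it unconditionally. A minimal sketch of that failure mode (this is an illustration, not the actual Paddle Serving source):

```python
def postprocess(feed, fetch, fetch_map):
    # Sketch of the failing loop: iterates fetch_map without a None check.
    result = {}
    for key in fetch_map:   # raises TypeError when fetch_map is None
        result[key] = fetch_map[key]
    return result

try:
    # A failed inference call hands back None instead of a dict.
    postprocess(feed=[{"x": [0.0] * 13}], fetch=["price"], fetch_map=None)
except TypeError as exc:
    print(exc)  # 'NoneType' object is not iterable
```

Note that the None itself traces back to the "Illegal instruction" crash of the serving worker: the binary path in the log names an avx build, which likely means the host CPU lacks AVX support, and that crash is why the subsequent connection to 127.0.0.1:12000 is refused.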

All comments (2)
FutureSI
#2 · replied 2021-09

It looks like the return value around line 352 is None, which makes the subsequent iteration fail.

FutureSI
#3 · replied 2021-09

The earlier data processing may be at fault; the service did not return a valid result.
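If the server side cannot be fixed right away, a defensive None check before the iteration at least surfaces the real RPC failure instead of masking it with a TypeError. A hedged sketch (`safe_postprocess` is a hypothetical name, not a Paddle Serving API):

```python
def safe_postprocess(feed, fetch, fetch_map):
    # Guard: a failed inference call returns None rather than a dict.
    if fetch_map is None:
        return {"result": "inference call failed; check the serving process log"}
    # Normal path: copy each fetched variable out of the map.
    return {key: fetch_map[key] for key in fetch_map}

print(safe_postprocess(feed=[], fetch=["price"], fetch_map=None))
print(safe_postprocess(feed=[], fetch=["price"], fetch_map={"price": [18.9]}))
```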
