While upgrading an older project to version 2.1, I ran into a run-time problem (I also hit an error after the upgrade, which I'll cover separately): with the same code, the same AI Studio premium 16 GB GPU environment, and the same 2.0.2 release, one notebook project ran at 25ms/step while another took 142ms/step.
Project link: https://aistudio.baidu.com/aistudio/projectdetail/1222066
Test code:
import paddle
from paddle.vision.transforms import ToTensor
train_dataset = paddle.vision.datasets.MNIST(mode='train', transform=ToTensor())
test_dataset = paddle.vision.datasets.MNIST(mode='test', transform=ToTensor())
lenet = paddle.vision.models.LeNet()
# LeNet subclasses paddle.nn.Layer (a network); paddle.Model wraps it with training functionality
model = paddle.Model(lenet)
# Configure the optimizer, loss, and metric needed for training
model.prepare(
    paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters()),
    paddle.nn.CrossEntropyLoss(),
    paddle.metric.Accuracy(topk=(1, 2))
)
# Start training
callback = paddle.callbacks.VisualDL(log_dir='vdl/hapi')
model.fit(train_dataset, epochs=1, batch_size=64, log_freq=400, callbacks=callback)
# Start evaluation
model.evaluate(test_dataset, log_freq=100, batch_size=64)
The run output was:
Normal:
step 400/938 - loss: 0.0781 - acc_top1: 0.9106 - acc_top2: 0.9590 - 25ms/step
step 800/938 - loss: 0.0352 - acc_top1: 0.9392 - acc_top2: 0.9748 - 25ms/step
step 938/938 - loss: 0.0474 - acc_top1: 0.9443 - acc_top2: 0.9776 - 25ms/step
Eval begin...
The loss value printed in the log is the current batch, and the metric is the average value of previous step.
step 100/157 - loss: 0.0043 - acc_top1: 0.9680 - acc_top2: 0.9905 - 22ms/step
step 157/157 - loss: 0.0056 - acc_top1: 0.9733 - acc_top2: 0.9921 - 22ms/step
Eval samples: 10000
Abnormal:
step 400/938 - loss: 0.0647 - acc_top1: 0.9061 - acc_top2: 0.9570 - 142ms/step
step 800/938 - loss: 0.0470 - acc_top1: 0.9361 - acc_top2: 0.9732 - 144ms/step
step 938/938 - loss: 0.0718 - acc_top1: 0.9416 - acc_top2: 0.9759 - 143ms/step
Eval begin...
The loss value printed in the log is the current batch, and the metric is the average value of previous step.
step 100/157 - loss: 0.0229 - acc_top1: 0.9677 - acc_top2: 0.9919 - 144ms/step
step 157/157 - loss: 0.0032 - acc_top1: 0.9739 - acc_top2: 0.9939 - 144ms/step
Eval samples: 10000
I tried again, and after restarting the project the speed went back to normal. Could the GPU card assigned earlier have been faulty?
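To tell a slow card apart from slow code, it can help to time steps with a harness that is independent of the framework's own per-step log. The sketch below is a minimal, hypothetical helper (the names `time_steps` and `step_fn` are mine, not part of the Paddle API): `step_fn` stands in for one training step, and a few warmup iterations are discarded so one-off startup cost (kernel compilation, data-pipeline spin-up) does not skew the average.

```python
import time

def time_steps(step_fn, n_steps, warmup=5):
    """Return the average wall time per step in milliseconds.

    step_fn : zero-argument callable representing one training step
              (a hypothetical stand-in for one batch of model.fit).
    warmup  : steps discarded before timing starts, so one-off
              startup work does not inflate the average.
    """
    for _ in range(warmup):
        step_fn()
    start = time.perf_counter()
    for _ in range(n_steps):
        step_fn()
    elapsed = time.perf_counter() - start
    return elapsed / n_steps * 1000.0  # ms per step
```

Timing the same step function in both environments would show whether the ~5x gap (25ms vs 142ms) follows the card or the code; a stable gap on an otherwise identical setup points at the hardware allocation, consistent with it disappearing after the project was restarted.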