Question about dc_gan
PaddleGAN · Q&A

I'm studying Paddle's DCGAN model: https://github.com/PaddlePaddle/models/blob/develop/fluid/PaddleCV/gan/c_gan/dc_gan.py

I don't understand why, as training progresses, D-Loss keeps decreasing while DG-Loss keeps increasing.

D-Loss is the sum of two losses: d_loss_1 (the discriminator judging generated images) and d_loss_2 (the discriminator judging real images); printing them shows that both are decreasing. My understanding is that as G improves, D finds it harder and harder to tell generated images from real ones, so shouldn't d_loss_1 be increasing?

DG-Loss is computed by feeding G's output to D and taking the cross-entropy of the result against 1.0 (real). As G improves, D's output on generated images should move closer to 1.0, so shouldn't DG-Loss be decreasing?
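For reference, the losses in question are sigmoid cross-entropies. Here is a minimal numpy sketch (the logit values are made up for illustration) of how d_loss_1, d_loss_2, and DG-Loss relate to the same discriminator output, just with different target labels:

```python
import numpy as np

def sigmoid(x):
    # Map a raw logit to a probability in (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def bce(p, label):
    # Binary cross-entropy between predicted probability p and target label
    return -(label * np.log(p) + (1 - label) * np.log(1 - p))

# Hypothetical discriminator scores (logits), chosen for illustration only
fake_logit = np.array([-1.0])   # D's raw score on a generated image
real_logit = np.array([2.0])    # D's raw score on a real image

p_fake = sigmoid(fake_logit)
p_real = sigmoid(real_logit)

d_loss_1 = bce(p_fake, 0.0)     # D wants to label fakes as 0
d_loss_2 = bce(p_real, 1.0)     # D wants to label reals as 1
d_loss = d_loss_1 + d_loss_2    # the printed D-Loss is this sum

dg_loss = bce(p_fake, 1.0)      # G wants D to label fakes as 1
```

Note that d_loss_1 and dg_loss are computed from the same probability p_fake but with opposite labels, so they always pull in opposite directions; at the theoretical equilibrium where D outputs 0.5, each cross-entropy term equals ln 2 ≈ 0.693.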

Training output attached (I modified the code so that D-Loss prints d_loss_1 and d_loss_2 separately):

Epoch ID=0 Batch ID=0 D-Loss=[0.9413396, 0.5080029] DG-Loss=0.509056150913
 gen=[-0.00035174983, -0.077959724, 0.078784086, -0.012807296, 0.0117818555]
Epoch ID=0 Batch ID=10 D-Loss=[0.8772299, 0.48127288] DG-Loss=0.54744720459
 gen=[-0.0009413836, -0.66255087, 0.6291047, -0.14234722, 0.14739646]
Epoch ID=0 Batch ID=20 D-Loss=[0.8324908, 0.45882177] DG-Loss=0.583523392677
 gen=[-0.008304366, -0.8018948, 0.87974715, -0.19100502, 0.15497963]
Epoch ID=0 Batch ID=30 D-Loss=[0.8116391, 0.44748503] DG-Loss=0.601478457451
 gen=[-0.041910317, -0.91254705, 0.8673099, -0.30829585, 0.2164322]
Epoch ID=0 Batch ID=40 D-Loss=[0.76693845, 0.44238] DG-Loss=0.630416870117
 gen=[-0.047455773, -0.9264406, 0.9431238, -0.35612628, 0.26445058]
Epoch ID=0 Batch ID=50 D-Loss=[0.75674987, 0.42781287] DG-Loss=0.648641347885
 gen=[-0.02838057, -0.9710748, 0.96821845, -0.3725119, 0.33226904]
Epoch ID=0 Batch ID=60 D-Loss=[0.76286024, 0.43666524] DG-Loss=0.652921080589
 gen=[0.000898495, -0.9864888, 0.98864514, -0.48036155, 0.4667]
Epoch ID=0 Batch ID=70 D-Loss=[0.7524568, 0.42278215] DG-Loss=0.654020428658
 gen=[-0.020926915, -0.9979449, 0.99059135, -0.4506929, 0.3986177]
Epoch ID=0 Batch ID=80 D-Loss=[0.724985, 0.4244343] DG-Loss=0.668385267258
 gen=[-0.023699235, -0.99804264, 0.9946177, -0.49786776, 0.41184533]
Epoch ID=0 Batch ID=90 D-Loss=[0.76010084, 0.42260265] DG-Loss=0.648693323135
 gen=[-0.0075442, -0.99438566, 0.9917485, -0.43339494, 0.41605687]
Epoch ID=0 Batch ID=100 D-Loss=[0.72195876, 0.43098372] DG-Loss=0.671732723713
 gen=[0.04261195, -0.9865943, 0.99113953, -0.42338407, 0.48123085]
Epoch ID=0 Batch ID=110 D-Loss=[0.71280944, 0.4158618] DG-Loss=0.676549553871
 gen=[0.05457441, -0.9935098, 0.9940411, -0.3648114, 0.48018718]
Epoch ID=0 Batch ID=120 D-Loss=[0.7139101, 0.40882674] DG-Loss=0.676476120949
 gen=[0.045805052, -0.99484634, 0.9944248, -0.33952788, 0.4486676]
Epoch ID=0 Batch ID=130 D-Loss=[0.71287894, 0.41997856] DG-Loss=0.676730930805
 gen=[-0.0003026225, -0.99673676, 0.98210657, -0.30898783, 0.33982575]
Epoch ID=0 Batch ID=140 D-Loss=[0.7167694, 0.41415712] DG-Loss=0.666431367397
 gen=[-0.35601985, -0.9985628, 0.92145324, -0.6360779, -0.12504809]
Epoch ID=0 Batch ID=150 D-Loss=[0.71382934, 0.4273099] DG-Loss=0.682226538658
 gen=[-0.7400998, -0.99973583, 0.86852026, -0.9349872, -0.5909076]
Epoch ID=0 Batch ID=160 D-Loss=[0.7125248, 0.42491335] DG-Loss=0.674280107021
 gen=[-0.48998603, -0.9991625, 0.9959037, -0.7804571, -0.28527778]
Epoch ID=0 Batch ID=170 D-Loss=[0.7040516, 0.4312653] DG-Loss=0.680618286133
 gen=[-0.03240129, -0.9993214, 0.9945996, -0.3453997, 0.2913468]
Epoch ID=0 Batch ID=180 D-Loss=[0.7196784, 0.42506886] DG-Loss=0.680206894875
 gen=[-0.2716768, -0.9996766, 0.99067736, -0.58626896, -0.011236521]
Epoch ID=0 Batch ID=190 D-Loss=[0.71472067, 0.42008954] DG-Loss=0.676885068417
 gen=[-0.5257619, -0.9998761, 0.992847, -0.80208606, -0.34479865]
Epoch ID=0 Batch ID=200 D-Loss=[0.72426105, 0.43688476] DG-Loss=0.671522378922
 gen=[-0.43495327, -0.9997729, 0.9985096, -0.7647353, -0.21348156]
Epoch ID=0 Batch ID=210 D-Loss=[0.71536505, 0.41209638] DG-Loss=0.673253953457
 gen=[-0.19274533, -0.99963796, 0.99923164, -0.5829041, 0.16430806]
Epoch ID=0 Batch ID=220 D-Loss=[0.71826077, 0.4478671] DG-Loss=0.674174904823
 gen=[-0.33436066, -0.99987197, 0.9992421, -0.7071303, -0.036703788]
Epoch ID=0 Batch ID=230 D-Loss=[0.71523637, 0.42580894] DG-Loss=0.672444820404
 gen=[-0.45866695, -0.99989897, 0.9974155, -0.80335426, -0.2277862]
Epoch ID=0 Batch ID=240 D-Loss=[0.71179247, 0.4282166] DG-Loss=0.67279946804
 gen=[-0.53995585, -0.999988, 0.998615, -0.86624557, -0.34926727]
Epoch ID=0 Batch ID=250 D-Loss=[0.70760953, 0.449031] DG-Loss=0.681282043457
 gen=[-0.49773955, -0.9999845, 0.9991327, -0.8559687, -0.2878714]
Epoch ID=0 Batch ID=260 D-Loss=[0.71524, 0.4216944] DG-Loss=0.681333124638
……

I'd appreciate an explanation. Thanks!

All comments (1)

joodo
#2 · replied 2019-02

~