The relevant code in `G_pretrain.m` is summarized below:

```matlab
prob_k = zeros(9,1);
for k = 1:9
    row = floor((k-1)/3) + 1;
    col = mod(k-1,3) + 1;
    for i = 1:nBatches
        batch = pos_data(:,:,:,opts.batchSize*(i-1)+1:min(end,opts.batchSize*i));
        batch(col,row,:,:) = 0;   % zero out the k-th mask position
        if opts.useGpu
            batch = gpuArray(batch);
        end
        res = vl_simplenn(net_fc, batch, [], [], ...
            'disableDropout', true, ...
            'conserveMemory', true, ...
            'sync', true);
        f = gather(res(end).x);
        if ~exist('feat','var')
            feat = zeros(size(f,1),size(f,2),size(f,3),n,'single');
        end
        feat(:,:,:,opts.batchSize*(i-1)+1:min(end,opts.batchSize*i)) = f;
    end
    % numerically stable softmax over the channel dimension
    X = feat;
    E = exp(bsxfun(@minus, X, max(X,[],3)));
    L = sum(E,3);
    Y = bsxfun(@rdivide, E, L);
    prob_k(k) = sum(Y(1,1,1,:));
end
[~,idx] = min(prob_k);
```
The mask corresponding to the `idx` selected by **[~,idx]=min(prob_k)** does not produce the largest loss for the D network, but the smallest. The mask chosen this way therefore differs from the selection method described in the paper.
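To make the min-vs-max distinction concrete, here is a minimal NumPy sketch of the per-mask scoring loop above. The shapes and random scores are toy placeholders (not the real network outputs); only the stable-softmax normalization and the `argmin`/`argmax` selection mirror the MATLAB excerpt:

```python
import numpy as np

def masked_scores(X):
    """Numerically stable softmax over the class axis (axis=2),
    mirroring the E / L / Y computation in the MATLAB excerpt."""
    E = np.exp(X - X.max(axis=2, keepdims=True))
    return E / E.sum(axis=2, keepdims=True)

# Hypothetical 1x1xC network outputs for N samples, one batch per mask k.
rng = np.random.default_rng(0)
prob_k = np.empty(9)
for k in range(9):
    X = rng.normal(size=(1, 1, 2, 5))   # toy sizes: 2 classes, 5 samples
    Y = masked_scores(X)
    prob_k[k] = Y[0, 0, 0, :].sum()     # summed positive-class probability

idx_min = np.argmin(prob_k)  # what the code does: lowest positive score
idx_max = np.argmax(prob_k)  # the alternative selection being discussed
```

Whether `argmin` or `argmax` is correct depends on whether `prob_k` is read as the discriminator's confidence (lower confidence means larger D loss) or as the loss itself, which is exactly the ambiguity this issue raises.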
@22wei22