
Improve the pruning module of paddle #2284

Closed
NHZlX opened this issue May 26, 2017 · 1 comment · Fixed by #2354

NHZlX commented May 26, 2017

Pruning is a model compression technique, which is important for embedded deployments. Paddle currently has a pruning implementation:
https://github.com/PaddlePaddle/Paddle/blob/develop/paddle/parameter/ParameterUpdaterHook.cpp#L36
However, there are several problems:

  • Before training, the user needs to generate a mask file for each layer and specify the mask file directory when building the network. It is hard to specify the mask path for each layer when the network is large.
  • Paddle's Python V2 API does not have a clear pruning interface.
  • A document is needed to illustrate the pruning effect and its usage.

So, the following improvements are planned:

  • The user only needs to provide a sparsity ratio for each parameterized layer of the network, and the network will automatically calculate the mask (see the sketch after this list).
  • Improve the Python V2 API interface for pruning.
  • Offer a pruning demo or document on the CIFAR-10 dataset.
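
For illustration only, here is a rough NumPy sketch (not Paddle's actual implementation) of how a 0/1 mask could be derived automatically from a sparsity ratio by zeroing the smallest-magnitude weights; compute_pruning_mask is a hypothetical helper name:

import numpy as np

def compute_pruning_mask(weights, sparsity_ratio):
    """Return a 0/1 mask that zeroes out the smallest-magnitude weights.

    sparsity_ratio is the fraction of weights to prune; 0.9 keeps only
    the 10% of weights with the largest absolute values.
    """
    flat = np.abs(weights).ravel()
    num_pruned = int(sparsity_ratio * flat.size)
    if num_pruned == 0:
        return np.ones_like(weights)
    # The num_pruned-th smallest absolute value becomes the threshold.
    threshold = np.partition(flat, num_pruned - 1)[num_pruned - 1]
    return (np.abs(weights) > threshold).astype(weights.dtype)

# Example: prune 90% of a 64x10 fully connected weight matrix.
w = np.random.randn(64, 10).astype('float32')
mask = compute_pruning_mask(w, sparsity_ratio=0.9)
w_pruned = w * mask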

NHZlX commented Jun 2, 2017

The Python V2 API training still has a bug, so the demo is still in progress.
A sparsity ratio is specified for each layer, and the network keeps or discards parameters according to that ratio.
#2354
A Python V2 pruning API example (pruning the final fully connected layer of ResNet) is shown below:

import paddle.v2 as paddle
from resnet import resnet_cifar10
from paddle.v2.attr import Hook
from paddle.v2.attr import ParamAttr

# A sparsity ratio of 0.9 means 90% of this parameter's weights will be zeroed.
hk = Hook('pruning', sparsity_ratio=0.9)

def main():

    datadim = 3 * 32 * 32
    classdim = 10
    paddle.init(use_gpu=False, trainer_count=1)

    image = paddle.layer.data(
        name="image", type=paddle.data_type.dense_vector(datadim))

    net = resnet_cifar10(image)

    # Attach the pruning hook to the weights of the final fc layer.
    out = paddle.layer.fc(
        input=net, size=classdim, act=paddle.activation.Softmax(), param_attr=ParamAttr(update_hooks=hk))

    lbl = paddle.layer.data(
        name="label", type=paddle.data_type.integer_value(classdim))
    cost = paddle.layer.classification_cost(input=out, label=lbl)
    
    parameters = paddle.parameters.create(cost)
    
    momentum_optimizer = paddle.optimizer.Momentum(
        momentum=0.9,
        regularization=paddle.optimizer.L2Regularization(rate=0.0002 * 128),
        learning_rate=0.001,
        learning_rate_schedule='constant')


    # Create trainer
    trainer = paddle.trainer.SGD(
        cost=cost, parameters=parameters, update_equation=momentum_optimizer)

    #......

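For reference only, the rest of the loop might look roughly like the standard Paddle V2 CIFAR-10 example; the reader, batch size, and number of passes below are illustrative and not part of the proposed pruning API:

    # Illustrative continuation of main() above: feed CIFAR-10 batches to
    # the trainer so the pruning hook runs during parameter updates.
    reader = paddle.batch(
        paddle.reader.shuffle(paddle.dataset.cifar.train10(), buf_size=50000),
        batch_size=128)
    trainer.train(
        reader=reader,
        num_passes=10,
        feeding={'image': 0, 'label': 1})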