
TorchPruner

TorchPruner is a tool for pruning PyTorch models.

Install

pip install -e .

Usage

A demo can be found in the examples directory.

config_list

config_list = [
    {
        'sparsity': 0.5,
        'op_types': ['Linear', 'Conv2d'],
        'block_size': 4 # for block prune
    }, {
        'sparsity': 0.25,
        'op_types': ['Conv2d'],
        'op_names': ['block.0'],
        'block_size': 2
    }, {
        'exclude': True,
        'op_names': ['fc']
    }
]
  • sparsity : The sparsity to apply to each layer matched by this config.
  • op_types : Operation types to be pruned.
  • op_names : Operation names to be pruned.
  • exclude : If set to True, the layers matched by op_types and op_names are excluded from pruning.
  • block_size : For block pruning, the block size along the in_ch dimension.

(Note that if a layer matches multiple configs in config_list, the later config overrides the earlier ones.)
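The override rule can be sketched as follows. resolve_config is a hypothetical helper, not part of the library, shown only to illustrate how a later matching entry wins over an earlier one:

```python
def resolve_config(layer_name, layer_type, config_list):
    """Return the effective config for one layer, or None if excluded/unmatched.

    Hypothetical helper: configs are scanned in order, and a later
    matching entry overrides an earlier one.
    """
    effective = None
    for cfg in config_list:
        type_ok = 'op_types' not in cfg or layer_type in cfg['op_types']
        name_ok = 'op_names' not in cfg or layer_name in cfg['op_names']
        if type_ok and name_ok:
            effective = None if cfg.get('exclude') else cfg
    return effective

config_list = [
    {'sparsity': 0.5, 'op_types': ['Linear', 'Conv2d'], 'block_size': 4},
    {'sparsity': 0.25, 'op_types': ['Conv2d'], 'op_names': ['block.0'], 'block_size': 2},
    {'exclude': True, 'op_names': ['fc']},
]
```

With this config_list, 'block.0' matches both the first and second entries, so the second one applies; 'fc' matches the first and third, so it is excluded.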

one-shot

pruner = {
    'random': RandomPruner(model, config_list),
    'level': LevelPruner(model, config_list),
    'block': BlockPruner(model, config_list)
}['block']

pruner.compress()
pruner.show_sparsity()
pruner.parameters_size()
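For intuition, level (magnitude) pruning keeps the largest-magnitude weights until the target sparsity is reached. The sketch below is illustrative only and is not the library's implementation (ties at the threshold may make the realized sparsity differ slightly):

```python
import numpy as np

def level_prune_mask(weight, sparsity):
    """Boolean mask keeping the largest-magnitude entries.

    A minimal sketch of magnitude ('level') pruning: the k smallest
    magnitudes are masked out, where k = sparsity * number of weights.
    """
    k = int(weight.size * sparsity)   # number of weights to prune
    if k == 0:
        return np.ones_like(weight, dtype=bool)
    flat = np.abs(weight).ravel()
    threshold = np.partition(flat, k - 1)[k - 1]   # k-th smallest magnitude
    return np.abs(weight) > threshold

w = np.array([[0.1, -0.8], [0.3, -0.05]])
mask = level_prune_mask(w, 0.5)   # prunes the two smallest magnitudes
```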

AGP iteration

pruner = {
    'random': RandomPruner(model, config_list),
    'level': LevelPruner(model, config_list),
    'block': BlockPruner(model, config_list)
}['block']

scheduler = AGPScheduler(pruner, config_list, finetuner, evaluator, 
                         total_iteration=args.agp_iteration, finetune_epoch=args.agp_finetune_epoch, lr_scheduler=None)
scheduler.compress()

pruner.show_sparsity()
pruner.parameters_size()
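AGP (automated gradual pruning) ramps the sparsity from an initial to a final value over the pruning iterations. Whether AGPScheduler uses exactly the standard cubic schedule is an assumption; the function below only illustrates that formula:

```python
def agp_sparsity(step, total_steps, initial=0.0, final=0.5):
    """Gradual pruning schedule: s_t = s_f + (s_i - s_f) * (1 - t/n)^3.

    Assumption: shown only to illustrate how sparsity typically ramps
    up over AGP iterations; not necessarily AGPScheduler's exact formula.
    """
    t = min(step, total_steps)
    return final + (initial - final) * (1 - t / total_steps) ** 3
```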

The finetuner takes an epoch and the model as inputs; the evaluator takes the model as input.
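A minimal sketch of the two callbacks. DummyModel is a stand-in introduced here for illustration; in practice the finetuner would train and the evaluator would score a real nn.Module:

```python
class DummyModel:
    """Stand-in for an nn.Module (illustration only)."""
    accuracy = 0.0

def finetuner(epoch, model):
    # One fine-tuning epoch; here just a placeholder update.
    model.accuracy = min(1.0, model.accuracy + 0.1)

def evaluator(model):
    # Return a scalar metric for the scheduler to track.
    return model.accuracy

model = DummyModel()
for epoch in range(3):
    finetuner(epoch, model)
score = evaluator(model)
```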

Features

  1. Supported pruners:
  • RandomPruner: random pruning
  • LevelPruner: magnitude pruning
  • BlockPruner: block pruning along the in_channel axis
  2. Supports one-shot pruning and AGP iterative pruning.
  3. Supports evaluating the parameter size of a sparse model, including quantized size, sparse-encoding size, and parameter size.
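To illustrate what block pruning along the in_channel axis means, the sketch below groups weights into blocks of block_size along the input-channel axis (axis 1 of a [out_ch, in_ch] matrix) and drops whole blocks by their L1 norm. This is a conceptual sketch, not the library's implementation:

```python
import numpy as np

def block_prune_mask(weight, sparsity, block_size):
    """Mask for block pruning: whole blocks along in_ch are kept or dropped
    by their L1 norm, so zeros appear in contiguous runs of block_size.
    """
    out_ch, in_ch = weight.shape
    assert in_ch % block_size == 0
    blocks = np.abs(weight).reshape(out_ch, in_ch // block_size, block_size)
    scores = blocks.sum(axis=2)               # one L1 score per block
    k = int(scores.size * sparsity)           # number of blocks to prune
    if k == 0:
        return np.ones_like(weight, dtype=bool)
    thr = np.partition(scores.ravel(), k - 1)[k - 1]
    keep = scores > thr                       # drop the k weakest blocks
    return np.repeat(keep, block_size, axis=1)

w = np.array([[1.0, 1.0, 0.1, 0.1],
              [0.2, 0.2, 2.0, 2.0]])
mask = block_prune_mask(w, 0.5, block_size=2)
```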
