
Commit: add
WAMAWAMA committed Nov 10, 2022
1 parent b7fce25 commit c7b777a
Showing 1 changed file: README.md, with 2 additions and 2 deletions.
```diff
@@ -114,8 +114,8 @@ An overview of this repo (let's call `wama_modules` as `wm`)
 |`wm.Transformer` |Some self-attention or cross-attention modules, which can be used to build ViT, DETR or TransUnet | `TransformerEncoderLayer` `TransformerDecoderLayer` |
 
 
-- How to build your networks modularly and freely? 👉 See 'Guideline 1: Build networks modularly' below ~
-- How to use pretrained model with `wm.thirdparty_lib`? 👉 See 'Guideline 2: Use pretrained weights' below ~
+- How to build your networks modularly and freely? 👉 See ['Guideline 1: Build networks modularly'](https://github.com/WAMAWAMA/wama_modules#4-guideline-1-build-networks-modularly) below ~
+- How to use pretrained model with `wm.thirdparty_lib`? 👉 See ['Guideline 2: Use pretrained weights'](https://github.com/WAMAWAMA/wama_modules#5-guideline-2-use-pretrained-weights) below ~
```
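For context on the `wm.Transformer` row in the diff: `TransformerEncoderLayer` and `TransformerDecoderLayer` are self-/cross-attention building blocks, and the computation at their core is scaled dot-product attention. The sketch below is a minimal NumPy illustration of that math only; it is not the `wm.Transformer` implementation, whose actual API does not appear in this diff.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d)) V.

    q, k, v: arrays of shape (seq_len, d). For self-attention,
    all three are projections of the same input sequence.
    """
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)                  # (seq_len, seq_len) similarities
    scores -= scores.max(axis=-1, keepdims=True)   # shift for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # softmax over the key axis
    return weights @ v                             # weighted sum of values

# Toy self-attention: query, key, and value all come from the same input.
x = np.random.default_rng(0).normal(size=(4, 8))   # 4 tokens, 8 features
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (4, 8) — same shape as the input sequence
```

An encoder layer wraps this attention with learned Q/K/V projections, a feed-forward block, and residual connections; a decoder layer adds cross-attention where `q` comes from the decoder and `k`, `v` from the encoder output.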



