Description
I read the paper; it gave a clear picture of how GFNet tackles the computational cost and complexity of self-attention in a Vision Transformer. I would like to take this a step further and apply it to a segmentation task: using the Global Filter as an attention mechanism in a U-Net model, similar to an Attention Gate.
Algorithm pseudocode

Function: AttentionFilter(g, x)

Inputs:
- $g$ - input global feature map
- $x$ - input feature map

Outputs:
- $out$ - filtered output
- $G1_{freq}$ - gate frequencies
- $X1_{freq}$ - weights frequencies

Pseudocode:
Return:
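The body of AttentionFilter is not spelled out above, so here is one possible sketch of the idea, under stated assumptions: both feature maps are moved to the frequency domain with a 2D FFT (as in GFNet's global filter), the gating branch produces attention coefficients (the sigmoid over the filtered gate spectrum is my assumption, borrowed from the Attention Gate formulation), and those coefficients modulate the skip-connection spectrum before the inverse FFT. NumPy is used for illustration; the learnable complex weights `w_g` and `w_x` and the function names are hypothetical, not from the paper or repo.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def attention_filter(g, x, w_g, w_x):
    """Hypothetical Global-Filter-as-Attention-Gate sketch.

    g, x   : real feature maps, shape (H, W, C)
    w_g, w_x : complex frequency-domain filters, shape (H, W//2 + 1, C)
               (learnable parameters in a real model)
    Returns (out, G1_freq, X1_freq) as in the spec above.
    """
    # Move both branches to the frequency domain (2D real FFT over H, W).
    G1_freq = np.fft.rfft2(g, axes=(0, 1))   # gate frequencies
    X1_freq = np.fft.rfft2(x, axes=(0, 1))   # weights frequencies
    # Attention coefficients from the gating branch (assumption:
    # sigmoid over the real part of the filtered gate spectrum).
    alpha = sigmoid(np.real(G1_freq * w_g))
    # Modulate the skip-connection spectrum, the way an Attention Gate
    # re-weights U-Net skip features spatially, then invert the FFT.
    out = np.fft.irfft2(alpha * (X1_freq * w_x), s=g.shape[:2], axes=(0, 1))
    return out, G1_freq, X1_freq
```

In a U-Net, `g` would be the upsampled decoder feature and `x` the encoder skip connection, with `out` replacing the plain skip concatenation; swapping the sigmoid gating for plain element-wise filtering recovers the original GFNet global filter.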