Commit 39a79f40 authored by Phil Wang

give locality to feedforwards even in cross attention block

Parent fa600e10
Branches
Tag 0.0.78
No related merge requests
@@ -851,7 +851,7 @@ class Alphafold2(nn.Module):
             layers.append(nn.ModuleList([
                 intercept_fn(context = False, attn = prenorm_cross(cross_attn_fn())),
-                prenorm(FeedForward(dim = dim, dropout = ff_dropout)),
+                prenorm(LocalFeedForward(dim = dim, hidden_dim = dim * 4, dropout = ff_dropout)),
                 intercept_fn(context = True, attn = prenorm_cross(cross_attn_fn())),
                 prenorm(FeedForward(dim = dim, dropout = ff_dropout)),
             ]))
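For context, this hunk swaps the plain position-wise FeedForward that follows the cross attention for a LocalFeedForward, so that feedforward now mixes information across neighboring positions instead of transforming each position independently. The definition of LocalFeedForward is not part of this diff; the following is a minimal sketch of what such a module could look like, assuming the locality comes from 1-D convolutions (the kernel_size = 3, GELU activation, and exact layering are illustrative assumptions, not the repository's confirmed implementation):

import torch
from torch import nn

class LocalFeedForward(nn.Module):
    # hypothetical sketch: a feedforward whose projections are 1-d convolutions,
    # so each position also attends to its immediate neighbors (kernel_size is assumed)
    def __init__(self, dim, hidden_dim, dropout = 0., kernel_size = 3):
        super().__init__()
        padding = kernel_size // 2
        self.net = nn.Sequential(
            nn.Conv1d(dim, hidden_dim, kernel_size, padding = padding),
            nn.GELU(),
            nn.Dropout(dropout),
            nn.Conv1d(hidden_dim, dim, kernel_size, padding = padding),
        )

    def forward(self, x):
        # x: (batch, seq_len, dim) -> Conv1d expects (batch, channels, seq_len)
        x = x.transpose(1, 2)
        x = self.net(x)
        return x.transpose(1, 2)

x = torch.randn(1, 128, 256)                          # (batch, residues, dim)
ff = LocalFeedForward(dim = 256, hidden_dim = 256 * 4)  # hidden_dim = dim * 4, matching the diff
assert ff(x).shape == x.shape

With a kernel size of 1 this would reduce to an ordinary position-wise feedforward; any kernel size greater than 1 is what gives the block its locality.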
@@ -3,7 +3,7 @@ from setuptools import setup, find_packages
 setup(
   name = 'alphafold2-pytorch',
   packages = find_packages(),
-  version = '0.0.89',
+  version = '0.0.90',
   license='MIT',
   description = 'AlphaFold2 - Pytorch',
   author = 'Phil Wang, Eric Alcaide',