Cerebras PyTorch Layer API#
The Cerebras PyTorch Layer API implements a subset of PyTorch APIs with custom implementations that take advantage of our high-performance kernels and provide extra functionality compared to the native PyTorch versions. The extra functionality is optional and opt-in; if you don't use it, the Layer API is equivalent to the native PyTorch version. The following replacements are available (see the sketch after this list for drop-in usage):
- modelzoo.common.pytorch.layers.MultiheadAttention is the replacement for torch.nn.MultiheadAttention
- modelzoo.common.pytorch.layers.TransformerDecoderLayer is the replacement for torch.nn.TransformerDecoderLayer
- modelzoo.common.pytorch.layers.TransformerDecoder is the replacement for torch.nn.TransformerDecoder
- modelzoo.common.pytorch.layers.TransformerEncoderLayer is the replacement for torch.nn.TransformerEncoderLayer
- modelzoo.common.pytorch.layers.TransformerEncoder is the replacement for torch.nn.TransformerEncoder
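The sketch below shows the intended drop-in pattern: swap the `torch.nn` import for the `modelzoo.common.pytorch.layers` import and leave the rest of the model code unchanged. It is a minimal illustration that assumes the Cerebras layers accept the same constructor arguments as their torch.nn counterparts shown here; check the Cerebras Model Zoo source for the exact signatures and any extra opt-in options.

```python
import torch

# Native PyTorch version:
# from torch.nn import TransformerEncoderLayer, TransformerEncoder

# Cerebras drop-in replacements (assumed to mirror the torch.nn constructor
# arguments used below; consult the Model Zoo for the authoritative API):
from modelzoo.common.pytorch.layers import (
    TransformerEncoderLayer,
    TransformerEncoder,
)

# Build a small encoder exactly as you would with torch.nn.
encoder_layer = TransformerEncoderLayer(d_model=512, nhead=8, dim_feedforward=2048)
encoder = TransformerEncoder(encoder_layer, num_layers=6)

src = torch.rand(10, 32, 512)  # (sequence, batch, d_model)
out = encoder(src)
print(out.shape)
```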
Supported PyTorch Ops#
If your model implementation requires additional PyTorch Ops beyond the layer APIs above, Cerebras also supports the following PyTorch operations.
Attention
The following list of supported PyTorch ops is very preliminary. We cannot guarantee that mixing and matching them in your models will work. Support is only provided for the way they are used in the Cerebras Model Zoo.
nn#
- torch.nn.CrossEntropyLoss
  Note: Known limitation: ignore_index can only be -100
- torch.nn.Embedding
  Note: Known limitation: num_embeddings < 65536
- torch.nn.functional.gelu
  Note: Known limitation: may have precision issues when approximate != "tanh"
- torch.nn.NLLLoss
  Note: Known limitation: ignore_index can only be -100
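The snippet below is an illustrative sketch of using these ops within the stated limitations: the embedding table stays below 65536 entries, the loss functions keep the default ignore_index of -100, and gelu uses the tanh approximation. The vocabulary size, hidden size, and tied-embedding projection are placeholder choices for the example, not Model Zoo code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

vocab_size = 32000  # stays below the num_embeddings < 65536 limit
hidden = 768

embedding = nn.Embedding(num_embeddings=vocab_size, embedding_dim=hidden)

# ignore_index defaults to -100, which is the only supported value.
ce_loss = nn.CrossEntropyLoss(ignore_index=-100)
nll_loss = nn.NLLLoss(ignore_index=-100)

tokens = torch.randint(0, vocab_size, (4, 16))  # (batch, sequence)
# Use the tanh approximation to avoid the precision caveat noted above.
hidden_states = F.gelu(embedding(tokens), approximate="tanh")

# Project back to the vocabulary with tied embedding weights (for illustration).
logits = hidden_states @ embedding.weight.t()   # (batch, sequence, vocab)

targets = torch.randint(0, vocab_size, (4, 16))
targets[:, :2] = -100                           # positions marked -100 are ignored

loss = ce_loss(logits.view(-1, vocab_size), targets.view(-1))

# NLLLoss expects log-probabilities; this is equivalent to the line above.
log_probs = F.log_softmax(logits, dim=-1)
loss_nll = nll_loss(log_probs.view(-1, vocab_size), targets.view(-1))

print(loss.item(), loss_nll.item())
```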