cerebras.modelzoo.common.utils.model.lora.LoRA_Linear#
- class cerebras.modelzoo.common.utils.model.lora.LoRA_Linear[source]#
Bases: torch.nn.Linear, cerebras.modelzoo.common.utils.model.lora.LoRALayer
LoRA linear layer. Adapted from https://github.com/microsoft/LoRA/blob/main/loralib/layers.py.
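The sketch below illustrates the low-rank update such a layer applies; it is a simplified, standalone illustration of the loralib-style computation referenced above, not the actual class implementation, and all tensor names and sizes are chosen for the example.

```python
# Illustrative sketch of the low-rank adaptation applied by a LoRA linear
# layer (simplified; see the loralib reference linked above for the real code).
import torch

in_features, out_features, r, lora_alpha = 16, 32, 4, 8
x = torch.randn(2, in_features)

W = torch.randn(out_features, in_features)   # frozen pretrained weight
lora_A = torch.randn(r, in_features)         # trainable low-rank factor A
lora_B = torch.zeros(out_features, r)        # trainable low-rank factor B (zero-initialized)
scaling = lora_alpha / r                     # effective scale of the LoRA branch

# Base projection plus the scaled low-rank correction B @ A applied to x.
y = x @ W.T + (x @ lora_A.T @ lora_B.T) * scaling
print(y.shape)  # torch.Size([2, 32])
```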
Methods
forward
reset_parameters
train
- __init__(in_features: int, out_features: int, r: int = 0, lora_alpha: int = 1, lora_dropout: float = 0.0, fan_in_fan_out: bool = False, merge_weights: bool = True, **kwargs)[source]#
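A minimal usage sketch, assuming the standard loralib-style semantics for the constructor arguments shown above; the hyperparameter values and the comments describing their effect are illustrative assumptions, not documented behavior.

```python
import torch
from cerebras.modelzoo.common.utils.model.lora import LoRA_Linear

# Hypothetical configuration; values are for illustration only.
layer = LoRA_Linear(
    in_features=768,
    out_features=768,
    r=8,                 # rank of the low-rank update (0 would disable LoRA)
    lora_alpha=16,       # scaling numerator; loralib uses lora_alpha / r as the scale
    lora_dropout=0.05,   # dropout on the LoRA branch input
    merge_weights=True,  # loralib-style merging of B @ A into the base weight at eval time
)

layer.train()                      # training mode keeps LoRA weights unmerged
out = layer(torch.randn(4, 768))   # same call interface as torch.nn.Linear
print(out.shape)                   # torch.Size([4, 768])
```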
- __call__(*args: Any, **kwargs: Any) → Any #
Call self as a function.
- static __new__(cls, *args: Any, **kwargs: Any) → Any #