cerebras.modelzoo.common.utils.model.lora
Functions
Sets merge_weights=False in LoRA parameters.

Gets LoRA parameters for a particular module.

Creates a Low Rank Adaptation (LoRA) model from a non-LoRA model (sketched below).
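To make that conversion concrete, here is a minimal sketch assuming nothing beyond standard PyTorch. LoRALinearSketch and convert_linear_to_lora are hypothetical names for illustration only, not this module's API; the real conversion additionally honors the configuration options (rank, scaling, dropout, target_modules, weight merging) described under Classes below, and dropout is omitted here for brevity.

```python
import torch
import torch.nn as nn


class LoRALinearSketch(nn.Module):
    """Hypothetical wrapper: y = W x + (alpha / r) * B(A(x)), with W frozen."""

    def __init__(self, base: nn.Linear, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = base
        self.base.weight.requires_grad = False  # freeze the pretrained weight
        if self.base.bias is not None:
            self.base.bias.requires_grad = False
        self.lora_A = nn.Linear(base.in_features, r, bias=False)
        self.lora_B = nn.Linear(r, base.out_features, bias=False)
        nn.init.zeros_(self.lora_B.weight)  # the low-rank update starts at zero
        self.scaling = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scaling * self.lora_B(self.lora_A(x))


def convert_linear_to_lora(model: nn.Module, r: int = 8, alpha: int = 16) -> nn.Module:
    """Recursively replace every nn.Linear in `model` with a LoRA wrapper."""
    for name, child in model.named_children():
        if isinstance(child, nn.Linear):
            setattr(model, name, LoRALinearSketch(child, r=r, alpha=alpha))
        else:
            convert_linear_to_lora(child, r=r, alpha=alpha)
    return model


model = convert_linear_to_lora(
    nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 8))
)
print(model(torch.randn(2, 16)).shape)  # torch.Size([2, 8])
```

Because lora_B is initialized to zero, the converted model initially computes exactly what the original did; training then updates only the small A and B matrices.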
Classes
Base LoRA layer, from https://github.com/microsoft/LoRA/blob/main/loralib/layers.py.

LoRA embedding layer, from https://github.com/microsoft/LoRA/blob/main/loralib/layers.py.

LoRA linear layer, from https://github.com/microsoft/LoRA/blob/main/loralib/layers.py.
Configuration options for LoRA conversion:

r: Rank of the LoRA matrix projections.
alpha: Scaling factor (see the LoRA paper for additional details).
dropout: Dropout to apply to LoRA updates.
fan_in_fan_out:
merge_weights: Determines whether LoRA weights should be merged/folded into the underlying layers.
target_modules: A list of module names that must all exist in layers that will be converted to LoRA. For example, setting target_modules to ["TransformerDecoderLayer", "Linear"] would mean that all linear layers that are children of a TransformerDecoderLayer would be converted to LoRA (see the matching sketch below).
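One plausible reading of the target_modules rule is: a layer qualifies only if every listed name appears among the class names on its path from the model root. The sketch below demonstrates that reading with the docstring's own example; find_target_layers and the toy TransformerDecoderLayer are hypothetical illustrations, not this module's API.

```python
import torch.nn as nn


class TransformerDecoderLayer(nn.Module):  # toy stand-in for a real decoder layer
    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(8, 8)


def find_target_layers(module, target_modules, name="", type_path=()):
    """Hypothetical helper: collect (qualified_name, layer) pairs whose
    class-name path contains every entry of target_modules."""
    type_path = type_path + (type(module).__name__,)
    found = []
    if all(t in type_path for t in target_modules):
        found.append((name, module))
    for child_name, child in module.named_children():
        full_name = f"{name}.{child_name}" if name else child_name
        found.extend(find_target_layers(child, target_modules, full_name, type_path))
    return found


model = nn.Sequential(TransformerDecoderLayer(), nn.Linear(8, 2))
hits = find_target_layers(model, ["TransformerDecoderLayer", "Linear"])
print([n for n, _ in hits])  # ['0.proj'] -- the top-level Linear does not match
```

Only '0.proj' is selected: the standalone Linear at the top level has no TransformerDecoderLayer ancestor, so it is left untouched, exactly as the example above describes.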