March 28, 2024, 9:21 p.m. | /u/kessa231

r/MachineLearning | www.reddit.com

I was just skimming the official LoRA implementation code and I'm curious about something.
For example, in their Embedding module they declare and use the LoRA parameters like this:

self.lora_A = nn.Parameter(self.weight.new_zeros((r, num_embeddings)))
self.lora_B = nn.Parameter(self.weight.new_zeros((embedding_dim, r)))
...
# un-merge: (lora_B @ lora_A) is (embedding_dim, num_embeddings), transposed
# here to match weight's (num_embeddings, embedding_dim) layout
self.weight.data -= (self.lora_B @ self.lora_A).transpose(0, 1) * self.scaling
...
# F.embedding expects the table as (num_embeddings, r), hence lora_A.T;
# the lookup gives (..., r) and lora_B.T projects up to (..., embedding_dim)
after_A = F.embedding(
    x, self.lora_A.transpose(0, 1), self.padding_idx, self.max_norm,
    self.norm_type, self.scale_grad_by_freq, self.sparse
)
result += (after_A @ self.lora_B.transpose(0, 1)) * self.scaling
...
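To make the shapes concrete, here is a minimal self-contained check (my own sketch with made-up sizes, not from the repo) showing that the transposes line up:

import torch
import torch.nn.functional as F

num_embeddings, embedding_dim, r, scaling = 10, 8, 4, 1.0
weight = torch.zeros(num_embeddings, embedding_dim)  # nn.Embedding's layout
lora_A = torch.randn(r, num_embeddings)
lora_B = torch.randn(embedding_dim, r)

# merge delta: (embedding_dim, num_embeddings) -> transpose to match weight
delta = (lora_B @ lora_A).transpose(0, 1) * scaling
assert delta.shape == weight.shape

# forward: the lookup table must be (num_embeddings, r), hence lora_A.T
x = torch.tensor([[1, 2, 3]])
after_A = F.embedding(x, lora_A.transpose(0, 1))    # (1, 3, r)
out = (after_A @ lora_B.transpose(0, 1)) * scaling  # (1, 3, embedding_dim)
assert out.shape == (1, 3, embedding_dim)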

So why don't they just declare them like this instead, and use them without the transposes?

self.lora_A = nn.Parameter(self.weight.new_zeros((r, embedding_dim)))
self.lora_B …
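For comparison, a transpose-free variant along those lines would presumably look something like this. This is my own sketch, not the official code; it assumes lora_B takes the (num_embeddings, r) shape so it can serve directly as the lookup table, and the class name and merge() helper are hypothetical:

import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical transpose-free variant (a sketch, not the official loralib code):
# lora_B holds the (num_embeddings, r) lookup table, lora_A the (r, embedding_dim)
# projection, so neither the merge nor the forward pass needs a transpose.
class LoRAEmbeddingNoTranspose(nn.Embedding):
    def __init__(self, num_embeddings, embedding_dim, r, lora_alpha=1, **kwargs):
        super().__init__(num_embeddings, embedding_dim, **kwargs)
        self.lora_A = nn.Parameter(self.weight.new_zeros((r, embedding_dim)))
        self.lora_B = nn.Parameter(self.weight.new_zeros((num_embeddings, r)))
        self.scaling = lora_alpha / r
        self.weight.requires_grad = False

    def forward(self, x):
        result = super().forward(x)
        # look up rows of lora_B directly: (..., r)
        after_B = F.embedding(
            x, self.lora_B, self.padding_idx, self.max_norm,
            self.norm_type, self.scale_grad_by_freq, self.sparse
        )
        # project up: (..., r) @ (r, embedding_dim), no transpose needed
        return result + (after_B @ self.lora_A) * self.scaling

    def merge(self):
        # (num_embeddings, r) @ (r, embedding_dim) already matches weight's shape
        self.weight.data += (self.lora_B @ self.lora_A) * self.scaling

Note that with this layout the token id flows through lora_B before lora_A. My guess is the official code instead keeps lora_A as the first matrix the input meets across all layer types (as in its Linear layer, where lora_A is (r, in_features) and lora_B is (out_features, r)); since nn.Embedding stores its weight as (num_embeddings, embedding_dim) rather than (out_features, in_features) like nn.Linear, that consistency costs a couple of transposes.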
