Aharneish Abburu (Aharneish)

AI & ML interests: None yet

Organizations

imp papers
- DoRA: Weight-Decomposed Low-Rank Adaptation
  Paper • 2402.09353 • Published • 32
- MobileLLM: Optimizing Sub-billion Parameter Language Models for On-Device Use Cases
  Paper • 2402.14905 • Published • 134
- Resonance RoPE: Improving Context Length Generalization of Large Language Models
  Paper • 2403.00071 • Published • 24
- AtP*: An efficient and scalable method for localizing LLM behaviour to components
  Paper • 2403.00745 • Published • 14
models 59
- Aharneish/dpo_out • Updated
- Aharneish/strategy-head • Updated
- Aharneish/emotion-head-goemotions • Updated
- Aharneish/finetuned_model-1 • Updated
- Aharneish/anime-trail • Updated • 1
- Aharneish/finetuning-trail • Updated
- Aharneish/email_response_reader-lm-1.5b • Updated
- Aharneish/hindi-llama • Updated • 1
- Aharneish/Llama-2-demo • Updated
- Aharneish/llama-demo • Updated
datasets 9
- Aharneish/toclaude • Preview • Updated • 9
- Aharneish/sprit-qa-t • Viewer • Updated • 8.39k • 2
- Aharneish/cleaned_qa • Viewer • Updated • 8.39k • 7
- Aharneish/dataset_spiritual • Preview • Updated • 6
- Aharneish/sharded_data • Updated • 4
- Aharneish/training • Viewer • Updated • 438k • 7
- Aharneish/spirit-qa • Viewer • Updated • 8.39k • 14
- Aharneish/spirit • Viewer • Updated • 778k • 50
- Aharneish/dataset-qa-indian-classical-text • Preview • Updated • 14