🤖 AI & LLM

mamba-architecture

State-space model with O(n) complexity vs Transformers' O(n²). 5× faster inference, million-token sequences, no KV cache. Selective SSM with hardware-aware design. Mamba-1 (d_state=16) and Mamba-2 (d_state=128, multi-head). Models 130M-2.8B on HuggingFace.
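The O(n) claim comes from the selective SSM recurrence: a fixed-size state is updated once per token, so there is nothing like a growing KV cache. A minimal NumPy sketch of that recurrence is below — illustrative only, not the hardware-aware fused kernel the Mamba repo ships; the function name `selective_ssm` and all projection shapes here are simplified assumptions, with `d_state=16` mirroring Mamba-1.

```python
import numpy as np

def selective_ssm(x, A, B_proj, C_proj, dt_proj):
    """Sketch of a selective state-space recurrence (simplified).
    x: (seq_len, d_model) input. A: (d_model, d_state) state decay.
    B_proj, C_proj, dt_proj: input-dependent ("selective") projections."""
    seq_len, d_model = x.shape
    d_state = A.shape[1]
    h = np.zeros((d_model, d_state))   # recurrent state: fixed size, no KV cache
    y = np.empty_like(x)
    for t in range(seq_len):           # one O(d_model * d_state) step per token
        dt = np.log1p(np.exp(x[t] @ dt_proj))            # softplus step size, per channel
        B = x[t] @ B_proj                                # (d_state,) input gate
        C = x[t] @ C_proj                                # (d_state,) readout
        dA = np.exp(dt[:, None] * A)                     # discretized decay
        h = dA * h + (dt[:, None] * x[t][:, None]) * B   # selective state update
        y[t] = h @ C                                     # output for token t
    return y

rng = np.random.default_rng(0)
d_model, d_state, seq_len = 4, 16, 8                     # d_state=16 as in Mamba-1
x = rng.standard_normal((seq_len, d_model))
A = -np.exp(rng.standard_normal((d_model, d_state)))     # negative real part for stability
out = selective_ssm(x, A,
                    rng.standard_normal((d_model, d_state)) * 0.1,
                    rng.standard_normal((d_model, d_state)) * 0.1,
                    rng.standard_normal((d_model, d_model)) * 0.1)
print(out.shape)  # (8, 4): one output vector per token
```

Because the per-token cost is constant in sequence length, generating token n+1 costs the same as token 1 — which is where the faster-inference and million-token-sequence claims come from.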

#architecture #llm-optimization #machine-learning #transformers #mamba
Quick Install
npx skills add davila7/claude-code-templates
Repository: davila7/claude-code-templates
Stars: 22,267
Last Updated: Mar 7, 2026
Related Skills

find-skills

Helps users discover and install agent skills based on their queries.

vercel-labs/skills

agent-browser

A CLI tool for AI agents to automate browser tasks like navigation, form filling, and data scraping.

vercel-labs/agent-browser

browser-use

Automates browser interactions for web testing, form filling, screenshots, and data extraction.

browser-use/browser-use

skill-creator

A guide for creating effective AI skills that extend Claude's capabilities with specialized knowledge, workflows, or tool integrations.

anthropics/skills

brainstorming

A skill for brainstorming and exploring user intent before implementing creative work.

obra/superpowers