Self-hosted AI coding assistant

Tabby is a self-hosted AI coding assistant, offering an open-source and on-premises alternative to GitHub Copilot. It boasts several key features:

  • Self-contained, with no need for a DBMS or cloud service.
  • OpenAPI interface, easy to integrate with existing infrastructure (e.g., Cloud IDE).
  • Supports consumer-grade GPUs.
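To illustrate the OpenAPI interface, here is a minimal sketch of building a completion request payload. The field names follow the shape of Tabby's `/v1/completions` endpoint as commonly documented, but treat them as assumptions and consult your server's `/swagger-ui` page for the authoritative schema:

```python
import json


def build_completion_request(prefix: str, language: str = "python", suffix: str = "") -> dict:
    # Payload shape assumed from Tabby's /v1/completions schema --
    # verify field names against your server's /swagger-ui page.
    return {
        "language": language,
        "segments": {"prefix": prefix, "suffix": suffix},
    }


payload = build_completion_request("def fib(n):\n    ")
print(json.dumps(payload))
```

POST this JSON to a running server's completion endpoint to receive inline suggestions; the same schema is what the IDE plugins speak under the hood.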

Open Live Demo


🔥 What's New

  • 07/02/2025 v0.30 supports indexing GitLab Merge Requests as context!
  • 05/25/2025 💡Interested in joining the Agent private preview? DM us on X for early waitlist approval!🎫
  • 05/20/2025 Enhance Tabby with your own documentation📃 through REST APIs in v0.29! 🎉
  • 05/01/2025 v0.28 transforming Answer Engine messages into persistent, shareable Pages
  • 03/31/2025 v0.27 released with a richer @ menu in the chat side panel.
  • 02/05/2025 LDAP Authentication and better notification for background jobs coming in Tabby v0.24.0!
  • 02/04/2025 VSCode 1.20.0 upgrade! @-mention files to add them as chat context, and edit inline with a new right-click option are available!
Archived
  • 01/10/2025 Tabby v0.23.0 featuring enhanced code browser experience and chat side panel improvements!
  • 12/24/2024 Introduce Notification Box in Tabby v0.22.0!
  • 12/06/2024 Llamafile deployment integration and enhanced Answer Engine user experience are coming in Tabby v0.21.0!🚀
  • 11/10/2024 Switching between different backend chat models is supported in Answer Engine with Tabby v0.20.0!
  • 10/30/2024 Tabby v0.19.0 featuring recent shared threads on the main page to improve their discoverability.
  • 07/09/2024 🎉Announce Codestral integration in Tabby!
  • 07/05/2024 Tabby v0.13.0 introduces Answer Engine, a central knowledge engine for internal engineering teams. It seamlessly integrates with dev team's internal data, delivering reliable and precise answers to empower developers.
  • 06/13/2024 VSCode 1.7 marks a significant milestone with a versatile Chat experience throughout your coding experience. Come and try the latest chat in the side panel and editing via chat commands!
  • 06/10/2024 Latest 📃blogpost drop on an enhanced code context understanding in Tabby!
  • 06/06/2024 Tabby v0.12.0 release brings 🔗seamless integrations (Gitlab SSO, Self-hosted GitHub/GitLab, etc.), to ⚙️flexible configurations (HTTP API integration) and 🌐expanded capabilities (repo-context in Code Browser)!
  • 05/22/2024 Tabby VSCode 1.6 comes with multiple choices in inline completion, and the auto-generated commit messages🐱💻!
  • 05/11/2024 v0.11.0 brings significant enterprise upgrades, including 📊storage usage stats, 🔗GitHub & GitLab integration, 📋Activities page, and the long-awaited 🤖Ask Tabby feature!
  • 04/22/2024 v0.10.0 released, featuring the latest Reports tab with team-wise analytics for Tabby usage.
  • 04/19/2024 📣 Tabby now incorporates locally relevant snippets(declarations from local LSP, and recently modified code) for code completion!
  • 04/17/2024 CodeGemma and CodeQwen model series have now been added to the official registry!
  • 03/20/2024 v0.9 released, highlighting a full feature admin UI.
  • 12/23/2023 Seamlessly deploy Tabby on any cloud with SkyServe 🛫 from SkyPilot.
  • 12/15/2023 v0.7.0 released with team management and secured access!
  • 11/27/2023 v0.6.0 released!
  • 10/15/2023 RAG-based code completion is enabled by default in v0.3.0🎉! Check out the blogpost explaining how Tabby utilizes repo-level context to get even smarter!
  • 11/09/2023 v0.5.5 released! With a redesign of UI + performance improvement.
  • 10/24/2023 Major updates for Tabby IDE plugins across VSCode/Vim/IntelliJ!
  • 10/04/2023 Check out the model directory for the latest models supported by Tabby.
  • 09/18/2023 Apple's M1/M2 Metal inference support has landed in v0.1.1!
  • 08/31/2023 Tabby's first stable release v0.0.1 🥳.
  • 08/28/2023 Experimental support for the CodeLlama 7B.
  • 08/24/2023 Tabby is now on JetBrains Marketplace!

👋 Getting Started

You can find our documentation here.

Run Tabby in 1 Minute

The easiest way to start a Tabby server is by using the following Docker command:

docker run -it \
  --gpus all -p 8080:8080 -v $HOME/.tabby:/data \
  tabbyml/tabby \
  serve --model StarCoder-1B --device cuda --chat-model Qwen2-1.5B-Instruct

For additional options (e.g., inference type, parallelism), please refer to the documentation page.
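Once the container is up, you can verify the server programmatically. A minimal sketch, assuming the server exposes a `/v1/health` endpoint at the default port (check `/swagger-ui` on your deployment to confirm the path):

```python
import urllib.request


def health_request(base_url: str = "http://localhost:8080") -> urllib.request.Request:
    # /v1/health is assumed from Tabby's API docs; adjust if your
    # deployment uses a different base URL or port.
    return urllib.request.Request(f"{base_url.rstrip('/')}/v1/health")


req = health_request()
# Sending it requires a running server, e.g.:
#   with urllib.request.urlopen(req) as resp:
#       print(resp.read())
```

A 200 response with version and model details indicates the server is ready to accept completion requests from IDE plugins.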

🤝 Contributing

The full guide is at CONTRIBUTING.md.

Get the Code

git clone --recurse-submodules https://github.com/TabbyML/tabby
cd tabby

If you have already cloned the repository, you can run git submodule update --recursive --init to fetch all submodules.

Build

  1. Set up the Rust environment by following this tutorial.

  2. Install the required dependencies:

# For MacOS
brew install protobuf

# For Ubuntu / Debian
apt install protobuf-compiler libopenblas-dev
  3. Install useful tools:
# For Ubuntu
apt install make sqlite3 graphviz
  4. Now, you can build Tabby by running the command cargo build.

Start Hacking!

... and don't forget to submit a Pull Request

🌍 Community

  • 🎤 Twitter / X - engage with TabbyML for all things possible
  • 📚 LinkedIn - follow for the latest from the community
  • 💌 Newsletter - subscribe to unlock Tabby insights and secrets

🔆 Activity

Git Repository Activity

🌟 Star History

Star History Chart