VS Code extension for intelligent coding assistance using CodeShell
CodeShell VSCode Extension provides an intelligent coding assistant for Visual Studio Code, supporting multiple programming languages like Python, Java, C++, JavaScript, and Go. It aims to boost developer productivity by offering features such as code completion, explanation, optimization, comment generation, and conversational Q&A, all powered by the CodeShell large language model.
How It Works
The extension integrates with a self-hosted CodeShell model service. It supports two primary deployment methods for the model: llama_cpp_for_codeshell for quantized models (e.g., codeshell-chat-q4_0.gguf), which can run inference on CPU or on Metal (Apple Silicon), and text-generation-inference (TGI) for larger models (e.g., CodeShell-7B, CodeShell-7B-Chat) that require NVIDIA GPUs. This modular approach lets users choose the backend that best suits their hardware.
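Concretely, the two backends can be started along these lines. This is a sketch, not taken from the README: the build paths, ports, volume layout, and Hugging Face model id are assumptions.

```shell
# Option 1: quantized model on CPU/Metal via the llama.cpp-based fork.
# Assumes llama_cpp_for_codeshell has been cloned and built, and the
# GGUF weights downloaded from Hugging Face into ./models (paths assumed).
./server -m ./models/codeshell-chat-q4_0.gguf --host 127.0.0.1 --port 8080

# Option 2: larger models on NVIDIA GPUs via text-generation-inference.
# Assumes Docker with GPU support; the model id is an assumption.
docker run --gpus all -p 9090:80 -v "$PWD/data:/data" \
  ghcr.io/huggingface/text-generation-inference:latest \
  --model-id WisdomShell/CodeShell-7B-Chat
```

Either way, the extension is then pointed at the resulting local HTTP endpoint in its settings.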
Quick Start & Requirements

- Package the extension: run npm exec vsce package to generate a .vsix file, then install it via VS Code's "Install from VSIX..." command.
- Deploy the model service: run a quantized model with llama_cpp_for_codeshell, or deploy with text-generation-inference. Model weights must be downloaded separately from Hugging Face.

Highlighted Details

- Quantized 4-bit models (e.g., those with an -int4 suffix) can run on CPU, with Metal support for Apple Silicon.

Maintenance & Community
Licensing & Compatibility
Limitations & Caveats
The README implies that specific model configurations (e.g., codeshell-chat-q4_0.gguf with llama.cpp, or larger models with TGI) must be correctly selected in the VS Code extension settings for proper operation. Deployment of the model service itself requires technical expertise and potentially significant hardware resources.
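Before pointing the extension at the service, it can help to confirm the endpoint actually responds. A minimal sanity check, assuming a llama.cpp-style server is listening on 127.0.0.1:8080 (the /completion route and JSON fields follow upstream llama.cpp's server API, which the fork is assumed to share):

```shell
# Request a short completion from the local model service; a JSON
# response indicates the backend is up, after which the same address
# can be entered in the extension's settings.
curl -s -X POST http://127.0.0.1:8080/completion \
  -H 'Content-Type: application/json' \
  -d '{"prompt": "def fib(n):", "n_predict": 16}'
```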