Local AI Code Reviewer
A lightweight, local-first code review tool that uses AI to provide quick feedback on your code changes. Built with TypeScript and powered by Ollama, it analyzes git diffs and offers actionable suggestions to improve your code.
Features
- Local-First: Runs entirely on your machine; no API keys or internet required
- Git Integration: Automatically analyzes your uncommitted changes
- AI-Powered: Uses the deepseek-coder model for intelligent code review
- Rich Output: Clear, colorized terminal output with actionable suggestions
- Fast: Quick setup and instant feedback
- Private: Your code never leaves your machine
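In practice the pipeline has two halves: read the uncommitted diff from the target project, then hand it to a local model for review (sketched further below). The diff-gathering half can be as small as the following sketch; the helper name is illustrative, not the project's actual API:

```typescript
import { execSync } from "node:child_process";

// Illustrative helper: collect uncommitted changes (staged and unstaged) as one diff.
// The real git.ts may differ; this only shows the general idea.
export function getUncommittedDiff(projectPath: string): string {
  const diff = execSync("git diff HEAD", {
    cwd: projectPath,
    encoding: "utf8",
    maxBuffer: 10 * 1024 * 1024, // tolerate large diffs
  });

  if (diff.trim().length === 0) {
    throw new Error("No uncommitted changes found to review.");
  }
  return diff;
}
```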
Prerequisites
Before you begin, ensure you have:
- Windows 11
- Node.js v22.0.0 or higher
- npm v11.0.0 or higher
- Ollama installed
Installation

1. Clone the repository:

   ```bash
   git clone <repository-url>
   cd code-reviewer
   ```

2. Install dependencies:

   ```bash
   npm install
   ```

3. Pull the required model:

   ```bash
   ollama pull deepseek-coder:latest
   ```

4. Verify the installation:

   ```bash
   ollama list   # Should show deepseek-coder
   ```
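If you prefer to confirm from code that Ollama is reachable (it listens on http://localhost:11434 by default), a small check along these lines works. This is an illustrative snippet, not part of the tool itself:

```typescript
// Illustrative check that the local Ollama server is up and the model is pulled.
// Assumes the default Ollama port (11434); adjust if you changed it.
async function checkOllama(): Promise<void> {
  const res = await fetch("http://localhost:11434/api/tags");
  if (!res.ok) {
    throw new Error(`Ollama responded with ${res.status}; is the server running?`);
  }
  const { models } = (await res.json()) as { models: { name: string }[] };
  const hasModel = models.some((m) => m.name.startsWith("deepseek-coder"));
  if (!hasModel) {
    throw new Error("deepseek-coder not found; run `ollama pull deepseek-coder:latest`.");
  }
  console.log("Ollama is running and deepseek-coder is available.");
}

checkOllama().catch((err) => {
  console.error(err.message);
  process.exit(1);
});
```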
Usage

1. Make some changes in your project.

2. Run the review tool:

   ```bash
   npx tsx src/index.ts /path/to/your/project
   ```

3. Review the suggestions in your terminal.
Example output:

```text
Code Review Assistant
─────────────────────

Reviewing project: /path/to/your/project

Summary
Found 3 issues in your changes...

Detailed Analysis

[app.ts]
ERROR (line 15)
Unused variable 'config'
→ Impact: Increases code size and reduces maintainability
→ Fix: Remove unused variable or implement intended usage
...
```
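The issue format shown above maps naturally onto a small set of TypeScript types. A hedged guess at what src/types.ts might contain; the field names are inferred from the sample output, not taken from the actual source:

```typescript
// Hypothetical shapes inferred from the sample output above.
export type Severity = "error" | "warning" | "info";

export interface ReviewIssue {
  file: string;        // e.g. "app.ts"
  line: number;        // e.g. 15
  severity: Severity;  // e.g. "error"
  message: string;     // e.g. "Unused variable 'config'"
  impact: string;      // why the issue matters
  fix: string;         // suggested remediation
}

export interface ReviewResult {
  projectPath: string;
  summary: string;     // e.g. "Found 3 issues in your changes..."
  issues: ReviewIssue[];
}
```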
Project Structure

```text
src/
├── index.ts      # CLI entry point
├── reviewer.ts   # Core review logic
├── git.ts        # Git operations
├── ollama.ts     # Model interface
└── types.ts      # TypeScript types
```
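The model interface is the thinnest layer: it only needs to send the diff, wrapped in a review prompt, to Ollama's local generate endpoint and return the text. A minimal sketch under that assumption; the prompt wording and function name are illustrative, not the project's actual code:

```typescript
// Illustrative model interface: send a prompt to the local Ollama server
// and return the generated review text via Ollama's /api/generate endpoint.
export async function reviewWithModel(diff: string): Promise<string> {
  const prompt =
    "You are a code reviewer. Review the following git diff and list concrete issues " +
    "with their impact and a suggested fix:\n\n" + diff;

  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "deepseek-coder:latest",
      prompt,
      stream: false, // wait for the full response instead of streaming tokens
    }),
  });

  if (!res.ok) {
    throw new Error(`Ollama request failed with status ${res.status}`);
  }
  const data = (await res.json()) as { response: string };
  return data.response;
}
```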
Configuration

You can customize the review process by adding a .reviewconfig folder to your project:

```text
your-project/
├── .reviewconfig/
│   ├── standards.md   # Project coding standards
│   └── patterns.md    # Preferred patterns
└── ...
```
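One plausible way the tool can use these files is to read them when present and prepend them to the review prompt, so the model sees your standards alongside the diff. A hedged sketch of that idea; the file handling details here are assumptions, not the project's confirmed behavior:

```typescript
import { readFile } from "node:fs/promises";
import { join } from "node:path";

// Illustrative: load optional .reviewconfig/*.md files and fold them into one string.
async function loadReviewConfig(projectPath: string): Promise<string> {
  const parts: string[] = [];
  for (const file of ["standards.md", "patterns.md"]) {
    try {
      const text = await readFile(join(projectPath, ".reviewconfig", file), "utf8");
      parts.push(`## ${file}\n${text}`);
    } catch {
      // The files are optional; skip silently if one does not exist.
    }
  }
  return parts.join("\n\n");
}

// Usage idea: prepend the config (if any) to the model prompt.
// const config = await loadReviewConfig(projectPath);
// const prompt = config ? `${config}\n\n${basePrompt}` : basePrompt;
```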
Troubleshooting

Common issues and solutions:

- Ollama not responding
  - Check if Ollama is running in your system tray
  - Verify with `ollama list`
  - Restart Ollama if needed

- Model issues
  - Ensure deepseek-coder is installed: `ollama pull deepseek-coder:latest`
  - Check available memory (8 GB minimum recommended)
  - Close unnecessary applications

- Node version errors
  - Check that Node.js is v22 or higher: `node -v`
  - Check that npm is v11 or higher: `npm -v`
Limitations
While powerful for quick code reviews, be aware that:
- Local models have limited capabilities compared to cloud alternatives
- Resource usage depends on your hardware
- Complex analyses might require cloud-based solutions
- Model responses can vary in quality
Best Practices

- Use for:
  - Quick feedback on code changes
  - Personal projects
  - Learning and experimentation

- Consider cloud alternatives for:
  - Team-wide code review
  - Production deployment
  - Complex analysis needs
License