The Ultimate Comparison of Claude Code Alternatives: A Complete Analysis of the 10 Strongest CLI AI Programming Tools
A compilation of installation, model support, advantages, and disadvantages for OpenCode, Crush, Qwen Code, Gemini CLI, Kimi CLI, Aider, Codex CLI, Grok CLI, Claude Code, and Copilot CLI, enabling you to easily select the most suitable AI coding partner.
1. Gemini CLI ⭐ 81,300+ Stars
Official Repository : https://github.com/google-gemini/gemini-cli
Core Advantages
Completely Free : 60 requests per minute, 1,000 requests per day for personal Google accounts
Massive Context : Native support for 1 million token context window with Gemini 2.5 Pro
Multimodal Capabilities : Supports various inputs including text, images, PDFs, and videos
Google Search Integration : Real-time access to web pages and latest information
MCP Support : Built-in Model Context Protocol support
Fully Open Source : Apache 2.0 license, transparent code
Supported Models
Gemini 2.5 Pro (Default, Free)
Gemini 2.5 Flash (Fast)
Other models available via Vertex AI
Installation Method
npm install -g @google/gemini-cli
gemini --version
Authentication Configuration
Method 1: Personal Google Account (Free, Recommended)
Method 2: Google AI Studio API Key
export GOOGLE_API_KEY="your_api_key"
gemini
Method 3: Vertex AI (Enterprise Users)
export GOOGLE_GENAI_USE_VERTEXAI=true
export GOOGLE_API_KEY="your_vertex_api_key"
gemini
Basic Usage
gemini
gemini -p "Create a REST API"
gemini -p "Analyze code architecture" --output-format json
Advanced Configuration
Project Context: create a GEMINI.md file in the project root (or run /init to have Gemini CLI summarize the project and generate one automatically).
# Project Guidelines
## Tech Stack
- React 18 with TypeScript
- Tailwind CSS for styling
- Vitest for testing
## Code Style
- Use functional components
- Follow Airbnb style guide
- Add JSDoc comments
MCP Server Configuration (~/.gemini/settings.json)
{
  "mcpServers": {
    "github": {
      "transport": "stdio",
      "command": "npx",
      "args": ["@modelcontextprotocol/server-github"]
    }
  }
}
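A quick way to verify that the server above is picked up is sketched below. This assumes the GitHub MCP server reads a GITHUB_PERSONAL_ACCESS_TOKEN variable and that your Gemini CLI version provides the /mcp command for listing servers; check both against the docs.
```bash
# Token the GitHub MCP server is expected to read (assumption; check the server's README)
export GITHUB_PERSONAL_ACCESS_TOKEN="ghp_your_token"

# Start Gemini CLI in the project, then list configured MCP servers from the prompt
gemini
# > /mcp
```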
Free Quota
Personal Account : 60 RPM / 1,000 RPD
Completely Free : No credit card required
2. Aider ⭐ 35,200+ Stars
Core Advantages
Most Mature Open Source Solution : Has the most active community
Top Model Support : Works with Claude, DeepSeek, OpenAI, Gemini, etc.
Intelligent Codebase Mapping : Creates a map of the entire codebase
Deep Git Integration : Automatically commits changes with meaningful messages
Architect/Editor Mode : Supports dual-mode collaboration between reasoning and editing models
Supported Models
Anthropic Claude Series (Recommended)
OpenAI GPT Series (including o1, o3-mini)
DeepSeek R1 & Chat V3
Google Gemini
Local models (Ollama, LM Studio, etc.)
Installation Method
python -m pip install aider-install
aider-install
Basic Usage
cd /to/your/project
aider --model deepseek --api-key deepseek=<key>
aider --model sonnet --api-key anthropic=<key>
aider --model o3-mini --api-key openai=<key>
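The Architect/Editor mode listed under Core Advantages pairs a reasoning model with a separate editing model. A minimal sketch, assuming Aider's --architect and --editor-model options (the model names here are only illustrative):
```bash
# The architect model plans the change; a cheaper editor model applies the edits
aider --architect \
      --model o3-mini \
      --editor-model gpt-4o-mini \
      --api-key openai=<key>
```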
3. OpenCode ⭐ 31,500+ Stars
Core Advantages
Ultimate Terminal Experience : Built by Neovim users
Model-Agnostic Architecture : Supports Anthropic, OpenAI, Google, and even local models
Client/Server Architecture : Can be driven remotely, supports mobile app control
Modern TUI Design : Built with Zig + SolidJS
Supported Models
Anthropic Claude Series (Recommended)
OpenAI GPT Series
Google Gemini
Local LLM Models
Installation Method
curl -fsSL https://opencode.ai/install | bash
npm i -g opencode-ai@latest
scoop bucket add extras; scoop install extras/opencode
choco install opencode
brew install opencode
paru -S opencode-bin
Basic Usage
opencode auth login
cd /path/to/project
opencode
/init
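For the client/server architecture mentioned under Core Advantages, recent releases expose a headless server that other clients can drive remotely. The command and flag below are an assumption; confirm against opencode --help before relying on them.
```bash
# Assumed headless mode: run the server and connect to it from another client
opencode serve --port 4096
```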
4. Qwen Code ⭐ 14,900+ Stars
Core Advantages
Optimized for Qwen3-Coder : Open-sourced by Alibaba, optimized for Chinese scenarios
Large Context Window : Native support for 256K tokens, expandable up to 1M tokens
Free Usage Options : Available for free via OpenRouter or ModelScope
Codebase Understanding : Goes beyond traditional context window limits
Workflow Automation : Automates tasks like PR processing, complex rebasing, formatting, etc.
Supported Models
Qwen3-Coder-480B-A35B-Instruct (Flagship)
Qwen3-Coder-Flash (Fast)
Qwen3-Coder-Plus
Other models supported via OpenAI compatible API
Installation Method
curl -qL https://www.npmjs.com/install.sh | sh
npm install -g @qwen-code/qwen-code@latest
git clone https://github.com/QwenLM/qwen-code.git
cd qwen-code
npm install
npm install -g .
brew install qwen-code
qwen --version
Configuration and Usage
qwen
cd your-project/
qwen
Token Usage Limit (~/.qwen/settings.json)
{
  "sessionTokenLimit": 32000
}
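To use the "OpenAI compatible API" option from the Supported Models list, configuration is typically done through environment variables. The variable names below follow Qwen Code's documented OpenAI-compatible setup, but treat them as assumptions and double-check the project README; the endpoint URL is a placeholder.
```bash
# Point Qwen Code at any OpenAI-compatible backend (ModelScope, OpenRouter, etc.)
export OPENAI_API_KEY="your_api_key"
export OPENAI_BASE_URL="https://your-provider.example.com/v1"   # placeholder endpoint
export OPENAI_MODEL="qwen3-coder-plus"

qwen
```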
5. Crush ⭐ 14,400+ Stars
Core Advantages
Elegant User Experience : Developed by the Charm team
LSP Enhancement : Provides additional context using LSP
MCP Extensibility : Adds functionality via MCP (http, stdio, and sse)
Multi-platform Support : Supports macOS, Linux, Windows, FreeBSD, OpenBSD, and NetBSD
Session Management : Supports multiple working sessions and contexts per project
Supported Models
Wide range of LLM support
OpenAI or Anthropic compatible APIs
Can switch LLMs within a session while retaining context
Installation Method
brew install charmbracelet/tap/crush
go install github.com/charmbracelet/crush@latest
npm install -g @charmland/crush
yay -S crush-bin
nix run github:numtide/nix-ai-tools#crush
winget install charmbracelet.crush
scoop bucket add charm https://github.com/charmbracelet/scoop-bucket.git
scoop install crush
sudo mkdir -p /etc/apt/keyrings
curl -fsSL https://repo.charm.sh/apt/gpg.key | sudo gpg --dearmor -o /etc/apt/keyrings/charm.gpg
echo "deb [signed-by=/etc/apt/keyrings/charm.gpg] https://repo.charm.sh/apt/ * *" | sudo tee /etc/apt/sources.list.d/charm.list
sudo apt update && sudo apt install crush
Usage Method
The fastest way to get started is to grab an API key from your preferred provider (e.g., Anthropic, OpenAI, Groq, or OpenRouter) and launch crush; it will prompt you for the key.

| Environment Variable | Provider |
| --- | --- |
| ANTHROPIC_API_KEY | Anthropic |
| OPENAI_API_KEY | OpenAI |
| OPENROUTER_API_KEY | OpenRouter |
| GEMINI_API_KEY | Google Gemini |
| CEREBRAS_API_KEY | Cerebras |
| HF_TOKEN | Hugging Face Inference |
| VERTEXAI_PROJECT | Google Cloud Vertex AI (Gemini) |
| VERTEXAI_LOCATION | Google Cloud Vertex AI (Gemini) |
| GROQ_API_KEY | Groq |
| AWS_ACCESS_KEY_ID | AWS Bedrock (Claude) |
| AWS_SECRET_ACCESS_KEY | AWS Bedrock (Claude) |
| AWS_REGION | AWS Bedrock (Claude) |
| AWS_PROFILE | AWS Bedrock (custom profile) |
| AWS_BEARER_TOKEN_BEDROCK | AWS Bedrock |
| AZURE_OPENAI_API_ENDPOINT | Azure OpenAI models |
| AZURE_OPENAI_API_KEY | Azure OpenAI models (optional when using Entra ID) |
| AZURE_OPENAI_API_VERSION | Azure OpenAI models |
Configuration Example (~/.crush/config.json)
{
  "$schema": "https://charm.land/crush.json",
  "providers": {
    "anthropic": {
      "type": "anthropic",
      "base_url": "https://api.anthropic.com/v1",
      "api_key": "$ANTHROPIC_API_KEY",
      "models": [
        {
          "id": "claude-sonnet-4-20250514",
          "name": "Claude Sonnet 4"
        }
      ]
    }
  }
}
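Putting the pieces together, a minimal first run might look like this (Anthropic is used as the example provider; any variable from the table above works the same way):
```bash
# Export a key for your chosen provider, then start Crush inside the project
export ANTHROPIC_API_KEY="your_api_key"
cd /path/to/project
crush
```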
6. Kimi CLI ⭐ 2,000+ Stars
Core Advantages
Powerful Kimi K2 Model : 1T parameter MoE model with 32B active parameters
Shell Mode : Ctrl-K toggle, allows direct execution of shell commands
Deep Zsh Integration : Seamlessly integrates into the Zsh workflow
ACP Support : Agent Client Protocol, can integrate with Zed editor
MCP Tool Access : Connects to external services like Linear, GitHub, etc.
Chinese Priority : Optimized for Chinese scenarios
Supported Models
Kimi K2 Instruct (Recommended)
Kimi K1.5
Kimi For Coding (Member Exclusive)
Custom models (via configuration)
Installation Method
curl -LsSf https://astral.sh/uv/install.sh | sh
uv tool install --python 3.13 kimi-cli
kimi --version
uv tool upgrade kimi-cli --no-cache
Basic Usage
cd /path/to/project
kimi
Press Ctrl-K inside a session to toggle shell mode and run shell commands directly (see Core Advantages above).
7. Codex CLI (OpenAI) ⭐ 49,700+ Stars
Note: Although Codex CLI is open source, it belongs to the OpenAI product ecosystem and requires an OpenAI API key or a ChatGPT subscription.
Core Advantages
Powered by GPT-5-Codex : GPT-5 version optimized for agentic coding
Extremely Fast Inference : Optimized for coding tasks
Multimodal Input : Supports text, screenshots, and diagrams
Three Approval Modes : Suggest, Auto Edit, Full Auto
Cloud Collaboration : Seamless switching between IDE, terminal, cloud, GitHub, and mobile
Built with Rust : Optimized for speed and efficiency
Supported Models
GPT-5-Codex (Recommended, Default)
GPT-5
o1-mini
o3-mini
Other OpenAI Responses API models
Installation Method
npm install -g @openai/codex
brew install --cask codex
codex --upgrade
codex --version
Authentication Configuration Requires one of the following :
ChatGPT Plus Subscription ($20/month)
ChatGPT Pro Subscription ($200/month)
ChatGPT Business / Edu / Enterprise
OpenAI API Key (Pay-as-you-go)
Basic Usage
codex
> Add error handling to all API endpoints
> Refactor this component to use hooks
> Fix the TypeScript errors in this file
codex --suggest
codex --auto-edit
codex --full-auto
> /mode auto-edit
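The approval modes above can also be combined with a one-shot prompt. The sketch below assumes your Codex CLI version accepts the task as a positional argument alongside the --full-auto flag shown above; verify with codex --help.
```bash
# Assumed syntax: task text passed directly, with full-auto approvals
codex --full-auto "add input validation to the signup endpoint"
```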
Advanced Features Configuration File (~/.codex/config.toml)
[default]
model = "gpt-5-codex"
auto_approve = false
max_tokens = 4096
[sandbox]
enable = true
container = "docker"
AGENTS.md File
Create AGENTS.md in the project root:
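The contents are project-specific; as a rough illustration only (the stack and conventions below are hypothetical, in the same spirit as the GEMINI.md example earlier):
```bash
# Hypothetical starter content for AGENTS.md
cat > AGENTS.md <<'EOF'
# Agent Guidelines

## Project
- Node.js 20 + TypeScript monorepo (hypothetical stack)

## Conventions
- Run `npm test` before committing
- Keep functions small and add JSDoc comments
EOF
```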
8. Grok CLI
Core Advantages
Most Intelligent Model : Grok 4 ranks first in several benchmarks
Grok Code Fast 1 : Optimized for agentic coding, fast response (92 tokens/sec)
Native Tool Usage : Code interpreter, terminal, file editing, etc.
X Platform Integration : Access to real-time information and trends on X
Open Source : MIT license, community-driven
High-Speed Code Editing : Modifies multiple files in a single operation
Supported Models
Grok 4 (Latest Flagship)
Grok Code Fast 1 (Optimized for coding)
Grok 3.5 (Fast version)
Grok 2 (Classic version)
Installation Method
npm install -g @vibe-kit/grok-cli
bun add -g @vibe-kit/grok-cli
grok --version
Configuration
export GROK_API_KEY=your_api_key_here
echo "GROK_API_KEY=your_api_key_here" > .env
Basic Usage
grok
grok --api-key your_api_key_here
Configuration File Create ~/.grok/user-settings.json:
{
  "apiKey": "your_api_key_here",
  "baseURL": "https://api.x.ai/v1",
  "defaultModel": "grok-code-fast-1",
  "models": [
    "grok-code-fast-1",
    "grok-4-latest",
    "grok-3-latest",
    "grok-3-fast",
    "grok-3-mini-fast"
  ]
}
9. GitHub Copilot CLI ⭐ Official from GitHub (Microsoft)
Core Advantages
Deep GitHub Integration : Out-of-the-box access to repositories, issues, and pull requests
Agentic Capabilities : AI collaborator can plan and execute complex tasks
MCP Extensibility : Built-in GitHub MCP server supports custom extensions
Full Control : Previews every action, requires explicit approval
Enterprise Grade : Supports team management and usage monitoring
Supported Models
Claude Sonnet 4.5 (New, Recommended)
GPT-4o
GPT-4 Turbo
Claude Haiku 4.5 (Fast version)
Gemini 2.5 Pro (Experimental)
Installation Method
npm install -g @github/copilot
copilot --version
Basic Usage
cd /path/to/project
copilot
Describe the task at the prompt; Copilot previews each action and asks for approval before executing (see Core Advantages above).
Subscription Plans

| Plan | Price | Features |
| --- | --- | --- |
| Copilot Pro | $10/month | CLI + IDE plugins, personal use |
| Copilot Pro+ | $39/month | Higher quota, Claude Sonnet 4.5 |
| Copilot Business | $19/user/month | Team management, usage monitoring |
| Copilot Enterprise | $39/user/month | Enterprise customization, private knowledge base |
10. Claude Code ⭐ Official from Anthropic
Core Advantages
Official Product : Developed by Anthropic, deeply integrated with Claude models
Built-in Memory : Retains information on conversations, commands, and style guides
Most Powerful Model : Claude Sonnet 4.5 is one of the most intelligent coding models available
Seamless Experience : Fully integrated with Claude API, no extra configuration needed
Enterprise Support : Professional technical support and SLA guarantees
Supported Models
Claude Sonnet 4.5 (Latest and Best)
Claude Opus 4.1/4 (Flagship)
Claude Sonnet 4 (Standard)
Claude Haiku (Fast response)
Installation Method
npm install -g @anthropic-ai/claude-code
curl -fsSL https://claude.ai/install.sh | bash
brew install --cask claude-code
irm https://claude.ai/install.ps1 | iex
claude --version
claude
Authentication Configuration
claude
/login
export ANTHROPIC_API_KEY="your_api_key"
claude
Basic Usage
claude
claude --model claude-opus-4
claude --spending-limit 100
claude -p "Refactor this component using TypeScript"
Pricing
Claude Sonnet 4.5:
Input: $3 / 1M tokens
Output: $15 / 1M tokens
Cached Input: $0.30 / 1M tokens
Claude Opus 4:
Input: $15 / 1M tokens
Output: $75 / 1M tokens
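As a rough worked example using the rates above: a session that sends 200K input tokens and returns 20K output tokens on Claude Sonnet 4.5 costs about 0.2 × $3 + 0.02 × $15 = $0.60 + $0.30 = $0.90, and substantially less if most of the input is served from the $0.30 / 1M cached-input tier.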
Disadvantages
Higher Cost : Can be expensive for large teams
Closed Source : Cannot view or modify source code
Anthropic Dependency : Only supports Claude models
Comprehensive Feature Comparison Table

| Feature | Gemini CLI | Aider | OpenCode | Qwen Code | Crush | Kimi CLI | Copilot CLI | Claude Code |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| GitHub Stars | 81.3K | 35.2K | 29.7K | 14.9K | 14.4K | 2.0K | Closed | Closed |
| Open/Closed Source | Open | Open | Open | Open | Open | Open | Closed | Closed |
| License | Apache 2.0 | Apache 2.0 | MIT | Apache 2.0 | MIT | MIT | Proprietary | Proprietary |
| Installation Ease | Easy | Medium | Easy | Easy | Easy | Easy | Easy | Easy |
| Free Tier | ✅ Generous | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ | ❌ |
| Multi-Model Support | ❌ Only Gemini | ✅ Full | ✅ Full | ✅ Limited | ✅ Full | ❌ Only Kimi | ✅ Multiple | ❌ Only Claude |
| Local Models | ❌ | ✅ | ✅ | ✅ | ✅ | ❌ | ❌ | ❌ |
| LSP Integration | ⭐⭐⭐ | ⭐⭐⭐ | ⭐⭐⭐ | ⭐⭐⭐ | ⭐⭐⭐⭐⭐ | ⭐⭐⭐ | ⭐⭐⭐⭐ | ⭐⭐⭐⭐ |
| Session Management | ⭐⭐⭐ | ⭐⭐⭐ | ⭐⭐⭐⭐ | ⭐⭐⭐⭐ | ⭐⭐⭐⭐⭐ | ⭐⭐⭐⭐ | ⭐⭐⭐⭐ | ⭐⭐⭐⭐⭐ |
| Context Window | 1M | 200K+ | 128K+ | 256K-1M | 200K+ | 200K+ | 128K+ | 200K+ |
| Windows Support | ⭐⭐⭐⭐ | ⭐⭐⭐ | ⭐⭐⭐ | ⭐⭐⭐⭐ | ⭐⭐⭐⭐ | Developing | ⭐⭐⭐⭐⭐ | ⭐⭐⭐⭐ |
| Enterprise Support | Limited | Community | Community | Community | Community | Business | Official | Official |
| Documentation Quality | ⭐⭐⭐⭐ | ⭐⭐⭐⭐⭐ | ⭐⭐⭐⭐ | ⭐⭐⭐ | ⭐⭐⭐⭐ | ⭐⭐⭐⭐ | ⭐⭐⭐⭐⭐ | ⭐⭐⭐⭐⭐ |
| Community Activity | ⭐⭐⭐⭐⭐ | ⭐⭐⭐⭐⭐ | ⭐⭐⭐⭐⭐ | ⭐⭐⭐ | ⭐⭐⭐⭐ | ⭐⭐⭐ | ⭐⭐⭐⭐⭐ | ⭐⭐⭐⭐⭐ |
Selection Guide
Selection by Use Case
Personal Learning / Small Projects
Gemini CLI - Completely free, powerful features
Qwen Code - Free tier + Chinese friendly
Aider - Open source + flexible configuration
Reasoning : Zero or low cost, moderate learning curve.
Open Source Project Maintenance
Aider - Most active community, strongest Git integration
OpenCode - Excellent terminal experience
Crush - Elegant and easy to use
Reasoning : Open source tools support open source projects, community-driven.
Startups / Small Teams (5-10 people)
Aider + Claude Sonnet 4 - $45-90/month (for the whole team)
GitHub Copilot CLI Business - $95-190/month
Qwen Code - Lowest cost option
Reasoning : Balance cost and performance, easy to manage.
Mid-to-Large Enterprises (50+ people)
GitHub Copilot CLI Enterprise - Complete ecosystem
Claude Code - Best performance
Aider - Self-hosted option
Reasoning : Enterprise-grade support, compliance assurance.
Selection by Tech Stack
Python Developers
Recommended: Aider, Qwen Code, Claude Code
Reason: Best support for Python, best test coverage.
JavaScript/TypeScript Developers
Recommended: GitHub Copilot CLI, OpenCode, Gemini CLI
Reason: Best integration with the frontend ecosystem.
Multi-language Teams
Recommended: Aider, Crush, Claude Code
Reason: Comprehensive multi-language support.
AI/ML Developers
Recommended: Qwen Code, Gemini CLI, Claude Code
Reason: Best support for data science libraries.
Selection by Workflow Style
Heavy Terminal Users
OpenCode - Ultimate terminal experience
Kimi CLI - Deep Zsh integration
Crush - Elegant TUI design
IDE Dependent Users
GitHub Copilot CLI - Seamless VS Code integration
Claude Code - Rich editor plugins
Gemini CLI - VS Code extension
Command Line Newcomers
Gemini CLI - Easiest to get started
Crush - User-friendly interface
GitHub Copilot CLI - Comprehensive documentation
Selection by Region
Users in Mainland China
Qwen Code - Best localization
Kimi CLI - Domestic vendor
Aider + Domestic Models - Flexible configuration
Some tools may require a proxy
Consider API latency issues
Prioritize domestic vendors or services with domestic nodes
International Users
Gemini CLI - Globally available + Free
GitHub Copilot CLI - Global coverage
Claude Code - Best performance
Production Readiness
Fully Mature (can be used directly in production):
✅ Gemini CLI - Google official, 81K+ stars
✅ Aider - 35K+ stars, time-tested
✅ GitHub Copilot CLI - Microsoft official, enterprise-grade
✅ Claude Code - Anthropic official, stable and reliable
✅ OpenCode - 29K+ stars, active development
Basically Mature (Suitable for most scenarios) :
✅ Crush - 14K+ stars, developed by Charm team
✅ Qwen Code - 14.9K+ stars, developed by Alibaba
Rapidly Developing (Use cautiously for critical projects) :
Kimi CLI - 2K+ stars, developed by Moonshot AI, continuous improvement
Best Practice Recommendations
Beginner Onboarding Route
Phase 1: Free Trial (1-2 Weeks)
Install Gemini CLI (Completely Free)
Learn basic commands and interactive methods
Try simple code generation tasks
Experience Google Search integration
npm install -g @google/gemini-cli
gemini
Phase 2: Exploring Open Source (2-4 Weeks)
Install Aider
Configure the Claude Sonnet 4 model
Use it in a real project
Learn Git integration and version control
pip install aider-chat
export ANTHROPIC_API_KEY=your-key
aider
Phase 3: Advanced Choices (After 1 Month)
If pursuing performance → Claude Code
If cost is a concern → Continue with Gemini CLI or Aider
If using GitHub → GitHub Copilot CLI
If Chinese is primary → Qwen Code or Kimi CLI
Team Adoption Strategy
Evaluation Phase (Month 1)
Pilot 2-3 tools
Test on non-critical projects
Gather team feedback
Evaluate cost vs. performance
Standardization Phase (Months 2-3)
Select the main tool
Establish team standards and best practices
Train team members
Set up cost monitoring
Rollout Phase (Month 4 onwards)
Full team adoption
Continuously optimize workflows
Share success stories
Regularly assess ROI
Security and Privacy Considerations
Data Security Comparison

| Tool | Code Storage | Data Encryption | Local Execution | Enterprise Compliance |
| --- | --- | --- | --- | --- |
| Gemini CLI | Cloud | ✅ | ❌ | ⭐⭐⭐⭐ |
| Aider | Local | N/A | ✅ Optional | ⭐⭐⭐⭐⭐ |
| OpenCode | Local | N/A | ✅ Optional | ⭐⭐⭐⭐⭐ |
| Qwen Code | Cloud | ✅ | ❌ | ⭐⭐⭐⭐ |
| Crush | Local | N/A | ✅ Optional | ⭐⭐⭐⭐⭐ |
| Kimi CLI | Cloud | ✅ | ❌ | ⭐⭐⭐⭐ |
| Copilot CLI | Cloud | ✅ | ❌ | ⭐⭐⭐⭐⭐ |
| Claude Code | Cloud | ✅ | ❌ | ⭐⭐⭐⭐⭐ |
Enterprise Usage Advice
Highly Sensitive Projects:
✅ Recommended: Aider + Local Models (Ollama)
✅ Recommended: OpenCode + Self-hosting
Caution: Any cloud-based service
General Business Projects :
✅ Recommended: GitHub Copilot CLI Enterprise (with DPA)
✅ Recommended: Claude Code (Anthropic has strong privacy commitments)
✅ Recommended: Gemini CLI (Google Enterprise version)
Open Source Projects:
✅ Any tool can be used
✅ Prioritize open source tools that support the open source community
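For the "Aider + Local Models (Ollama)" option recommended above, here is a minimal sketch, assuming Ollama is installed locally and using an illustrative model name (check Aider's Ollama documentation for the exact model prefix and variables):
```bash
# Run a local model with Ollama and point Aider at it, so no code leaves the machine
ollama pull llama3.1                              # illustrative model choice
export OLLAMA_API_BASE=http://127.0.0.1:11434     # assumption: Aider reads this variable
aider --model ollama_chat/llama3.1
```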
Community Resources and Support
Frequently Asked Questions
Q1: Which tool should I choose to get started?
A: It is recommended to start with free tools:
Complete Novice → Gemini CLI (Free + Simple)
Experienced → Aider (Powerful + Open Source)
Enterprise User → Try GitHub Copilot CLI trial
Q2: How big is the gap between the open source tools and the commercial ones?
Performance Difference: Top models (Claude, GPT) have a slight edge.
Feature Difference : Open source tools can achieve similar results through flexible configuration.
Cost Difference : Open source tools allow choosing cheaper or free options, offering better cost control.
Conclusion : For most scenarios, open source tools are entirely sufficient.
Q3: Which tool should a team choose?
A: It depends on the team situation:
Small Teams (<10 people) → Aider (Most economical, $45-90/month)
Medium Teams (10-50 people) → GitHub Copilot Business ($190-950/month)
Large Teams (50+ people) → Copilot Enterprise or Claude Code (Enterprise-grade)
Q4: How to control costs?
Use free tools (Gemini CLI, Qwen Code free tier)
Enable prompt caching (can save 90% of input costs)
Use Aider's --map-tokens to limit context
Only use expensive models when necessary
Set a monthly spending cap
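As a concrete example of the repo-map tip above (the budget value is arbitrary):
```bash
# Cap the size of Aider's repository map to limit input tokens per request
aider --map-tokens 1024
```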
Q5: Will my code be used for model training?
Aider/OpenCode/Crush : Use your own API; depends on the provider.
GitHub Copilot : Enterprise version guarantees it won't be used for training.
Claude Code : Anthropic commits not to train on user data.
Gemini CLI : Google commits not to train (check privacy settings).
Suggestion : Check each provider's privacy policy; enterprise users should choose services with a DPA.
Summary
Overall Recommendations
Best Free Option: Gemini CLI
Completely Free (60 RPM / 1,000 RPD)
1 million token context
Google ecosystem integration
81.3K stars
Best Value for Money
Gemini CLI: Completely Free
Aider: $1.80-9/month, most powerful features
Fully open source
Best for Enterprises: GitHub Copilot CLI Enterprise
Enterprise-grade support
Deep GitHub integration
Reasonable pricing ($39/user/month)
Best Performance: Claude Code
Claude Sonnet 4.5
Leading on SWE-Bench
Official optimization
Best for Chinese Users: Qwen Code
Native Chinese support (14.9K stars)
Large context window (256K-1M)
Free usage options
Best Terminal Experience: OpenCode
Modern TUI design (29.7K stars)
Client/Server architecture
Ultimate performance
Most Beginner-Friendly
Easy to get started
Clear documentation
Friendly interface
2025 Recommendation Matrix
(Chart: performance on the vertical axis, cost on the horizontal axis from free/low cost to high cost. From highest to lowest performance: Claude Code and Copilot CLI; Aider; Qwen Code and Kimi CLI; OpenCode and Crush; Gemini CLI.)
Final Advice
If you can only choose one tool:
Limited Budget → Gemini CLI (Completely Free)
Seeking Flexibility → Aider (Most mature open source)
Pursuing Performance → Claude Code (Most powerful model)
Enterprise Use → GitHub Copilot CLI Enterprise
Chinese Priority → Qwen Code
The best tool is the one you actually use. Recommendation:
Start by trying free tools (Gemini CLI, Qwen Code)
Experience the flexibility of open source tools (Aider, OpenCode)
Decide whether to upgrade to a paid option based on actual needs
Don't hesitate to use multiple tools simultaneously