
Dolphin 2.6 Mixtral 8x7B 🐬

cognitivecomputations/dolphin-mixtral-8x7b

This is a 16k context fine-tune of Mixtral-8x7b. It excels at coding tasks thanks to extensive training on coding data and is known for its obedience, though it lacks DPO tuning.

The model is uncensored: alignment and bias have been stripped out, so it requires an external alignment layer for ethical use. Users are cautioned to use this highly compliant model responsibly, as detailed in the blog post on uncensored models at erichartford.com/uncensored-models.
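Since the model ships without built-in alignment, the usual approach is to supply your own guardrails via the system prompt when calling it through OpenRouter's OpenAI-compatible chat completions endpoint. A minimal sketch, assuming the standard `OPENROUTER_API_KEY` environment variable; the system prompt text here is an illustrative placeholder, not an official recommendation:

```python
import json
import os
import urllib.request

# The model slug from this page.
MODEL = "cognitivecomputations/dolphin-mixtral-8x7b"

# A user-supplied system prompt acting as the "external alignment layer"
# the model description recommends (wording is a hypothetical example).
payload = {
    "model": MODEL,
    "messages": [
        {"role": "system",
         "content": "You are a helpful assistant. Decline harmful requests."},
        {"role": "user",
         "content": "Write a Python function that reverses a string."},
    ],
}

def build_request(api_key: str) -> urllib.request.Request:
    """Build (but do not send) the chat completions request."""
    return urllib.request.Request(
        "https://openrouter.ai/api/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

if __name__ == "__main__" and os.environ.get("OPENROUTER_API_KEY"):
    req = build_request(os.environ["OPENROUTER_API_KEY"])
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

The request body follows the standard OpenAI chat format, so any OpenAI-compatible SDK can be pointed at the same endpoint instead of raw `urllib`.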

#moe #uncensored

Modalities

Context: 33K tokens

Knowledge Cutoff: Dec 31, 2023

Activity

Recent activity on Dolphin 2.6 Mixtral 8x7B 🐬 (total usage per day on OpenRouter): not enough data to display yet.