Qwen 2.5 Open Source Model Resources - Full Series 7B/14B/72B Versions
Qwen 2.5 open source model resources, providing the full series of 7B, 14B, and 72B versions. Each version has been optimized to support long-context input and multilingual processing, and is suited to different application scenarios and hardware configurations.
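As a quick illustration of how these checkpoints are typically used, here is a minimal loading sketch with the Hugging Face transformers library. The model ID Qwen/Qwen2.5-7B-Instruct and the generation settings are assumptions for the example, not details taken from this page.

```python
# Minimal sketch: loading a Qwen 2.5 checkpoint with Hugging Face transformers.
# The model ID "Qwen/Qwen2.5-7B-Instruct" is an assumption; substitute the
# checkpoint that matches the downloaded files and your hardware.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-7B-Instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # pick fp16/bf16 automatically where supported
    device_map="auto",    # spread layers across available GPUs/CPU
)

# Qwen 2.5 instruct models ship with a chat template; build the prompt through it.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize the Qwen 2.5 model series in one sentence."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```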
File Size: 52.8 GB
Upload Date: 2024-01-12
Downloads: 22,300
Rating: 4.8/5.0
Download Resources
By downloading this resource, you agree to our Terms of Service and Privacy Policy.
Related Resources
Grok-1 ultra-large-scale AI language model, a 314B-parameter sparse expert system. Built on a Mixture-of-Experts (MoE) architecture, it offers strong language understanding and generation, supports complex reasoning and long-text processing, and represents the current frontier of AI language models.
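To illustrate the "sparse expert" idea behind an MoE architecture like this, here is a small top-2 routing sketch; the layer sizes, expert count, and top-2 rule are illustrative assumptions, not Grok-1's actual configuration.

```python
# Illustrative top-2 mixture-of-experts routing (NumPy). Shapes and the
# number of experts are made up for the example; only the routing idea
# (each token activates a few experts, not the whole network) reflects
# how sparse MoE layers work in general.
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 64, 8, 2

tokens = rng.normal(size=(4, d_model))                    # 4 token vectors
router_w = rng.normal(size=(d_model, n_experts))          # router projection
experts = rng.normal(size=(n_experts, d_model, d_model))  # one FFN-like matrix per expert

def softmax(x):
    x = x - x.max(axis=-1, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=-1, keepdims=True)

logits = tokens @ router_w        # (4, n_experts) routing scores
probs = softmax(logits)

outputs = np.zeros_like(tokens)
for i, (tok, p) in enumerate(zip(tokens, probs)):
    chosen = np.argsort(p)[-top_k:]          # indices of the top-k experts
    weights = p[chosen] / p[chosen].sum()    # renormalize their gate weights
    for w, e in zip(weights, chosen):
        outputs[i] += w * (tok @ experts[e])  # only k of n_experts run per token

print(outputs.shape)  # (4, 64): same shape as the input tokens
```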
Complete GPT-4 model weight files for high-precision natural language processing tasks. Contains 175B parameters, supports multilingual understanding and generation, and is suited to complex reasoning and creative tasks.
Chinchilla large language model, optimized for training efficiency. It validated compute-optimal scaling laws, demonstrating the relationship between parameter count and training-data size and enabling more efficient training.
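The parameter/data relationship from that work is often summarized as roughly 20 training tokens per parameter, with training compute approximated as C ≈ 6·N·D FLOPs. The back-of-the-envelope sketch below treats both figures as rough rules of thumb rather than exact constants.

```python
# Back-of-the-envelope Chinchilla-style sizing. The ~20 tokens/parameter
# ratio and the C ≈ 6 * N * D FLOPs approximation are rough rules of thumb
# from the scaling-law literature, not exact constants.
TOKENS_PER_PARAM = 20       # approximate compute-optimal ratio
FLOPS_PER_PARAM_TOKEN = 6   # C ≈ 6 * N * D for dense transformer training

def compute_optimal_plan(n_params: float) -> dict:
    """Given a parameter count N, estimate the training-token budget D
    and total training compute C suggested by the Chinchilla rule."""
    tokens = TOKENS_PER_PARAM * n_params
    flops = FLOPS_PER_PARAM_TOKEN * n_params * tokens
    return {"params": n_params, "tokens": tokens, "train_flops": flops}

# Chinchilla itself: ~70B parameters trained on ~1.4T tokens.
for n in (7e9, 70e9):
    plan = compute_optimal_plan(n)
    print(f"{plan['params']:.0e} params -> ~{plan['tokens']:.1e} tokens, "
          f"~{plan['train_flops']:.1e} training FLOPs")
```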