福模

Free Open-Source AI Model Downloads · Local AI Tool Resource Platform

Large Language Models

Qwen 2.5 Open Source Model Resources - Full Series 7B/14B/72B Versions

Qwen 2.5 open source model resources, providing the full series of 7B/14B/72B versions. Each version has been optimized to support long-context input and multilingual processing, and is suitable for different application scenarios and hardware configurations.
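For readers who want to run the downloaded weights directly, below is a minimal loading sketch. It assumes the Hugging Face transformers library (with accelerate installed so that device_map works) and uses the Qwen/Qwen2.5-7B-Instruct hub ID as an illustrative model name; substitute the local directory of the downloaded weights if you are loading offline. The 14B and 72B variants load the same way but require correspondingly more GPU memory, which is why the series targets several hardware configurations.

```python
# Minimal sketch: loading a Qwen 2.5 checkpoint with Hugging Face transformers.
# Assumptions: transformers with Qwen2 support and accelerate are installed, and
# either the hub ID "Qwen/Qwen2.5-7B-Instruct" is reachable or you pass a local
# path to the downloaded weights instead of the hub ID.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-7B-Instruct"  # or a local path to the downloaded weights

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # halves memory vs. fp32; use float16 on older GPUs
    device_map="auto",           # spread layers across available GPUs / CPU
)

# Build a chat-style prompt with the tokenizer's chat template, then generate.
messages = [{"role": "user", "content": "用一句话介绍一下你自己。"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The same pattern applies to the larger checkpoints; only the memory footprint and the choice of dtype/quantization change with the hardware available.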

Tags: Qwen, Open Source Model, Multilingual, Long Text

File Size: 52.8 GB

Upload Date: 2024-01-12

Downloads: 22,300

Rating: 4.8/5.0

Download Resources

By downloading this resource, you agree to our Terms of Service and Privacy Policy.

Related Resources

Grok-1 Ultra-Large-Scale AI Language Model - 330B-Parameter Sparse Expert System

Grok-1 ultra-large-scale AI language model, a 330B-parameter sparse expert system. Built on a Mixture-of-Experts (MoE) architecture, it offers strong language understanding and generation, supports complex reasoning and long-text processing, and represents the current frontier of AI language models.

Tags: Grok, Ultra-Large Model, MoE
220 GB · 2025-02-27
GPT-4 Model Full Weights File - High-Precision NLP Model

Complete GPT-4 model weights file for high-precision natural language processing tasks. Contains 175B parameters, supports multilingual understanding and generation, and is suitable for complex reasoning and creative tasks.

Tags: GPT, NLP, Weights File
340 GB · 2025-01-15
Chinchilla Large Language AI Model - Training Efficiency Optimization

Chinchilla large language AI model, a language model optimized for training efficiency. It validated scaling laws by demonstrating the relationship between parameter count and training data volume, enabling more efficient training.

Tags: Chinchilla, Large Language Model, Training Efficiency
25.6 GB · 2025-02-11