DeepSeek Major Update: Support for Million-Token Context

Users recently reported that AI company DeepSeek has begun a phased test of its latest version, which supports a context length of up to one million tokens. The update marks another leap in the model's ability to handle long-form text.

As recently as last August, DeepSeek V3.1 had raised its context length to 128K tokens. The new test version pushes that limit far higher, reflecting DeepSeek's strength in model optimization and compute management.

Knowledge Base Updated to May 2025

Beyond the larger context window, DeepSeek has also substantially refreshed the model's internal knowledge base, which is now current through May 2025. This means that even without internet access, the model can accurately recount news from as recently as April 2025, a notable gain in timeliness and accuracy.

  • Support for up to one million tokens in context
  • Knowledge base updated to May 2025
  • Ability to generate April 2025 news content offline
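To put the headline figure in perspective, a rough back-of-the-envelope estimate helps. The sketch below assumes the common heuristic of roughly 4 English characters (about 0.75 words) per token; actual ratios vary by tokenizer and language, and none of these constants are DeepSeek-specific figures:

```python
# Rough scale of a one-million-token context window.
# CHARS_PER_TOKEN and WORDS_PER_TOKEN are generic English-text
# heuristics, not DeepSeek tokenizer specifics.

CONTEXT_TOKENS = 1_000_000
CHARS_PER_TOKEN = 4       # heuristic
WORDS_PER_TOKEN = 0.75    # heuristic

approx_chars = CONTEXT_TOKENS * CHARS_PER_TOKEN        # ~4 million characters
approx_words = int(CONTEXT_TOKENS * WORDS_PER_TOKEN)   # ~750,000 words

# Comparison against the previous 128K-token limit:
growth = CONTEXT_TOKENS / 128_000                      # ~7.8x more context

print(f"~{approx_chars:,} chars, ~{approx_words:,} words, "
      f"{growth:.1f}x the 128K window")
```

By this estimate, a million-token window could hold several full-length novels of text at once, roughly an eightfold jump over the 128K limit.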

This update not only makes the model more practical but also opens new possibilities for developers and enterprise users, signaling the continued expansion of large models into real-world applications.