Disclosure: Some of the links you’ll encounter are affiliate links. If you click and buy something, I’ll get a commission that helps me pay my bills and write about what I love, at no extra cost to you. Thank you!
DeepSeek R1 by DeepSeek AI cut deep into the AI world, leaving many an AI company bleeding stock value and many more regular people wondering what this tool can do.
I’ve gathered the most interesting statistics and facts about this AI chatbot right below. Enjoy!
Key DeepSeek R1 and DeepSeek AI Statistics, Facts and Trends for 2025
- DeepSeek R1 is priced at roughly 1/30th of similar OpenAI models. It costs $2.19 per million output tokens versus OpenAI’s o1 model, which costs $60.00 per million output tokens. (Source)
- DeepSeek R1 is currently #1 in the Apple App Store. It took DeepSeek R1 only 5 days to claim the top spot and move ahead of the ChatGPT app (currently #2). (Source)
- More than 6 million people currently use DeepSeek R1, only a week after it went live. (Source)
- There are currently 3 million+ DeepSeek R1 app downloads. (Source)
- DeepSeek AI is only 20 months old (founded May 2023 in China). DeepSeek launched as a spin-off from the High-Flyer hedge fund, prioritizing fundamental AI research over quick profit, similar to early OpenAI, which began as a true nonprofit. (Source)
- High-Flyer’s peak of ~$15 billion in assets gave DeepSeek robust funding. This enabled the team behind DeepSeek AI to do high-level experimentation without immediate pressure to generate revenue. (Source)
- There are ~200 employees at DeepSeek compared to OpenAI’s 3,500. (Source)
- DeepSeek R1 offers models of up to 671B parameters, easily rivaling the best models from OpenAI and Anthropic AI. (Source)
- The cost to train DeepSeek R1 was ~1/10 of comparable Western models. (Source)
- The estimated cost to train DeepSeek R1 is only $5.5 million, many times lower than the money needed to train comparable Western AI models. For example, it took ~$100 million to train GPT-4. (Source)
- Most DeepSeek R1 models are MIT-licensed. (Source)
- DeepSeek’s latest AI model triggered a global tech selloff, putting over $1 trillion in market capitalization at risk. (Source)
- The January 2025 launch of DeepSeek R1 drew international attention and caused a 13% pre-market drop in Nvidia stock. (Source)
- DeepSeek currently has 68 research papers on arXiv. (Source)
- DeepSeek R1’s Chatbot Arena rank is #4 with a score of 1357, just behind ChatGPT-4o in the #3 spot. (Source)
- DeepSeek relies on reinforcement learning (RL) over extensive supervised fine-tuning, producing advanced reasoning skills (especially in math and coding). (Source)
- Multi-Head Latent Attention (MLA): This subdivides attention mechanisms to speed training and enhance output quality, compensating for fewer GPUs. (Source)
- DeepSeek preemptively gathered 10,000 Nvidia H100 GPUs, then focused on software-based efficiency to compete with larger Western labs once US export controls tightened. (Source)
- Predominantly Recent Graduates: Most DeepSeek researchers finished their degrees in the past two years, fostering rapid innovation through fresh perspectives and minimal corporate baggage. (Source)
- DeepSeek R1’s largest model boasts 671 billion parameters (DeepSeek-V3). It easily rivals top-tier Western LLMs at a far lower operating cost. Only a targeted subset of parameters is activated per task, drastically cutting compute costs while maintaining high performance. (Source)
- The “R1-Distill” models compress the large models into smaller ones. This makes advanced AI accessible to those with limited hardware. (Source)
- Countering US Export Controls: Despite chip embargoes, DeepSeek innovated with custom GPU communication and memory optimizations, challenging the effectiveness of US export controls and chip bans. (Source)
- DeepSeek’s continued push in RL, scaling, and cost-effective architectures has the potential to reshape the global LLM and AI market if the model proves a viable alternative to its Western counterparts. (Source)
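To put the pricing gap from the first bullet in perspective, here’s a quick back-of-the-envelope check using the reported per-token rates (a minimal sketch; the numbers come from the figures quoted above and may change as pricing is updated):

```python
# Back-of-the-envelope comparison of reported output-token prices (USD per million tokens).
DEEPSEEK_R1_OUTPUT = 2.19   # DeepSeek R1 rate, as reported above
OPENAI_O1_OUTPUT = 60.00    # OpenAI o1 rate, as reported above

ratio = OPENAI_O1_OUTPUT / DEEPSEEK_R1_OUTPUT
print(f"o1 is ~{ratio:.1f}x more expensive per output token")  # ~27.4x
```

So the exact ratio is about 27x, which the article rounds to “1/30th.”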
When Was DeepSeek AI Founded?
DeepSeek was founded in December 2023 by Liang Wenfeng, and released its first AI large language model the following year.
DeepSeek currently has ~200 full-time employees, compared to 3,500+ at OpenAI.
(Source)

What Were the Training Costs for DeepSeek R1?
The estimated cost to train DeepSeek R1 is only $5.5 million. (Source)
That’s ~5% of what it took to train GPT-4 (~$100 million). (Source)

Is DeepSeek R1 AI free?
Yes, DeepSeek R1 AI is free.
This model is open-sourced under an MIT license, meaning you can download and modify it at no cost. However, DeepSeek AI charges for its model’s API usage, with rates starting at $0.55 per million input tokens.
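Here’s how the pay-per-token pricing works out in practice (a minimal sketch; the helper function is hypothetical, not part of any official SDK, and it uses the $0.55 input and $2.19 output rates quoted in this article):

```python
def estimate_cost_usd(input_tokens: int, output_tokens: int,
                      input_rate: float = 0.55, output_rate: float = 2.19) -> float:
    """Estimate an API bill from token counts and USD-per-million-token rates.

    Defaults use the DeepSeek R1 rates quoted above; adjust them if pricing changes.
    """
    return (input_tokens / 1_000_000) * input_rate + (output_tokens / 1_000_000) * output_rate

# e.g. a session with 200k input tokens and 50k output tokens:
print(f"${estimate_cost_usd(200_000, 50_000):.4f}")  # $0.2195
```

Even a fairly heavy session stays well under a dollar at these rates, which is the practical upshot of the “1/30th the price” headline.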
Is DeepSeek a Chinese Company?
Yes, DeepSeek is a Chinese company.
It was founded by the High-Flyer hedge fund in May 2023 in China. DeepSeek maintains its headquarters in the country and employs about 200 staff members.
Is DeepSeek AI Good?
DeepSeek’s latest model, DeepSeek-R1, reportedly beats leading competitors in math and reasoning benchmarks. With up to 671 billion parameters in its flagship releases, it stands on par with some of the most advanced LLMs worldwide.
Is DeepSeek safe?
DeepSeek hasn’t faced major security controversies, but concerns about censorship may arise given its Chinese ownership.
How Did DeepSeek AI Achieve Competitive Performance With Fewer GPUs Than Its Western Counterparts?
They adopted innovations like Multi-Head Latent Attention (MLA) and Mixture-of-Experts (MoE), which optimize how data is processed and limit the parameters used per query. As a result, DeepSeek gets more mileage out of its ~10,000 H100 chips.
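The sparse-activation idea behind MoE can be illustrated with a toy router (a hypothetical sketch for intuition, not DeepSeek’s actual implementation): a gating score ranks the experts for each token, and only the top-k of them run, so most of the layer’s parameters stay idle per query.

```python
import random

random.seed(0)

NUM_EXPERTS = 8   # total experts in the MoE layer
TOP_K = 2         # experts activated per token (sparse activation)

def route(gate_scores):
    """Return the indices of the top-k scoring experts for one token."""
    return sorted(range(len(gate_scores)), key=lambda i: gate_scores[i], reverse=True)[:TOP_K]

# One token's (random, illustrative) gating scores over all experts:
scores = [random.random() for _ in range(NUM_EXPERTS)]
active = route(scores)

# Only TOP_K out of NUM_EXPERTS expert sub-networks run for this token.
print(f"active experts: {active}, fraction of experts used: {TOP_K / NUM_EXPERTS:.0%}")
```

With 2 of 8 experts active, only a quarter of the expert parameters do work on any given token, which is the same principle that lets a 671B-parameter model run at a fraction of the dense-model compute cost.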
DeepSeek R1 and DeepSeek AI Statistics, Facts and Trends for 2025 (Conclusion)
My updated guide for 2025 lists the best and latest statistics, facts and trends about DeepSeek R1 and DeepSeek AI, and how this AI model and company are reshaping the AI world.
I hope you enjoyed it!
During my research, I consulted these resources below:
References:
- What is DeepSeek? The AI chatbot is topping app store charts- https://abcnews.go.com/Business/deepseek-ai-chatbot-topping-app-store-charts/story?id=118138606
- Martin Vechev of INSAIT: “DeepSeek $6M Cost Of Training Is Misleading”- https://therecursive.com/martin-vechev-of-insait-deepseek-6m-cost-of-training-is-misleading/
- Release of DeepSeek R1 shatters long-held assumptions about AI- https://cointelegraph.com/news/release-deep-seek-shatters-long-held-assumptions-ai
- Notes on Deepseek r1: Just how good it is compared to OpenAI o1- https://www.reddit.com/r/LocalLLaMA/comments/1i8rujw/notes_on_deepseek_r1_just_how_good_it_is_compared/
- DeepSeek R1: Features, o1 Comparison, Distilled Models & More- https://www.datacamp.com/blog/deepseek-r1
- DeepSeek AI Statistics and Facts (2025)- https://seo.ai/blog/deepseek-ai-statistics-and-facts
- Breaking down the DeepSeek-R1 training process—no PhD required- https://www.vellum.ai/blog/the-training-of-deepseek-r1-and-ways-to-use-it
- This week in AI research: Latest Insilico Medicine drug enters the clinic, a $0.55/M token model R1 rivals OpenAI’s $60 flagship, and more- https://www.rdworldonline.com/this-week-in-ai-research-a-0-55-m-token-model-rivals-openais-60-flagship/
- Calm down: DeepSeek-R1 is great, but ChatGPT’s product advantage is far from over- https://venturebeat.com/ai/calm-down-deepseek-r1-is-great-but-chatgpts-product-advantage-is-far-from-over/
- DeepSeek-R1 Paper Explained – A New RL LLMs Era in AI?- https://aipapersacademy.com/deepseek-r1/
- DeepSeek-R1 – The Chinese AI Powerhouse Outperforming OpenAI’s o1 — at 95% Less Cost- https://arbisoft.com/blogs/deep-seek-r1-the-chinese-ai-powerhouse-outperforming-open-ai-s-o1-at-95-less-cost
- DeepSeek Revenue and Usage Statistics (2025)- https://www.businessofapps.com/data/deepseek-statistics/

Nikola Roza
Nikola Roza is a blogger behind Nikola Roza- SEO for the Poor and Determined. He writes for bloggers who don't have a huge marketing budget but still want to succeed. Nikola is passionate about precious metals IRAs and how to invest in gold and silver for a safer financial future. Learn about Nikola here.