Chinese firms get their Nvidia fix in Southeast Asia

01 December 2025


AI chip ban is full of holes

Chinese tech behemoths are hauling their prized AI models overseas to gorge on Nvidia’s finest chips while pretending everything is above board.

Alibaba and ByteDance have been parking their latest large language models in Southeast Asian data centres, according to two people who know exactly what is going on.

They said offshore training has climbed steadily since April, when the US tried to choke off sales of Nvidia's H20 chips, the semiconductors designed specifically for the Chinese market.

One Singapore-based data centre operator said: “It’s an obvious choice to come here. You need the best chips to train the most cutting-edge models, and it’s all legally compliant.”

Alibaba’s Qwen and ByteDance’s Doubao models have muscled their way into the top ranks of global LLMs during the past year. Qwen is popping up outside China like mushrooms in spring because it is free and “open”, which is catnip for developers.

Singapore and Malaysia have seen data centre clusters explode amid surging Chinese demand. Many of these facilities sport high-end Nvidia gear similar to what US Big Tech uses to train its own monsters.

People familiar with the practice said Chinese companies usually lease space from foreign-owned data centres, which keeps everything compliant with US export rules. The Joe Biden-era "diffusion rule" that was meant to close this loophole was binned by President Donald Trump earlier this year.

DeepSeek is the anomaly. The company’s tidy range of low-cost, high-quality AI models is still being trained on home soil, people with knowledge of the matter said.

These people said DeepSeek stockpiled a decent stash of Nvidia chips before the US ban kicked in. They added that the outfit is now working closely with domestic chipmakers led by Huawei to tune the next batch of Chinese AI silicon.

Huawei engineers are stationed at DeepSeek's Hangzhou base. Huawei sees its cosy partnership with DeepSeek as a strategic move to advance its semiconductor and software systems so they can be used for AI training nationwide.

Training LLMs eats colossal computing power as they chew through vast datasets, which is why most Chinese groups still fancy Nvidia kit for the heavy lifting.

The companies are shifting more of the "inference" work, the part that handles user queries, to homegrown chips. Inference demands far less raw compute than training and now accounts for a growing share of AI workloads.

Chinese groups tap Southeast Asian data centres to serve their overseas customers as Alibaba and ByteDance angle for a bigger slice of the global cloud market. They are also dipping their toes into the Middle East.

One hitch is that Chinese firms are barred from moving private data out of the country. People in the industry said this forces any training that relies on sensitive client data to stay inside China.
