Alibaba open-sources three medium-sized Qwen 3.5 models that can be directly deployed on consumer-grade graphics cards
The Beijing News Shell Finance News (Reporter Luo Yidan) — On February 25th, following the open-source release of Qwen3.5-397B-A17B on New Year's Eve, Alibaba continued open-sourcing the Qwen 3.5 series. This release adds three medium-sized models: Qwen3.5-35B-A3B, Qwen3.5-122B-A10B, and Qwen3.5-27B. Thanks to architectural innovations and training breakthroughs, all three set new performance records for models of their size, surpassing the larger previous flagship models Qwen3-235B-A22B and Qwen3-VL, and outperforming GPT-5 mini on multiple leaderboards.
Notably, the new Qwen 3.5 models can even be deployed directly on consumer-grade graphics cards, making them highly developer-friendly. The hosted model Qwen3.5-Flash, based on Qwen3.5-35B-A3B, is now available on Alibaba Cloud Bailian, with input costs as low as 0.2 yuan per million tokens. The Qwen 3.5 models use a hybrid attention mechanism combined with an innovative high-sparsity MoE architecture, and are trained on a larger mixed corpus of text and visual tokens. As a result, the new models improve performance while using fewer total and active parameters.
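The gap between total and active parameters in names like "35B-A3B" comes from sparse Mixture-of-Experts routing: each token is sent to only a few of the many expert networks. The snippet below is a minimal illustrative sketch of top-k MoE routing in NumPy, not Alibaba's actual implementation; all dimensions, the router, and the single-matrix "experts" are simplified assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (assumed for illustration): only k of n_experts fire per token.
d_model, n_experts, k = 8, 16, 2

# A linear router plus one weight matrix per expert
# (a real MoE expert is a full MLP block).
router_w = rng.standard_normal((d_model, n_experts))
experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]

def moe_forward(x):
    """Route token x to its top-k experts and gate-mix their outputs."""
    logits = x @ router_w                      # (n_experts,) routing scores
    top = np.argsort(logits)[-k:]              # indices of the k best experts
    gates = np.exp(logits[top])
    gates /= gates.sum()                       # softmax over selected experts only
    # Only the k chosen expert matrices are multiplied; the rest stay idle.
    return sum(g * (x @ experts[i]) for g, i in zip(gates, top))

x = rng.standard_normal(d_model)
y = moe_forward(x)
```

Here only k/n_experts = 1/8 of the expert parameters are touched per token; the same principle, at far larger scale and higher sparsity, is why a 35B-parameter model can activate roughly 3B parameters per token and thus fit interactive inference on a consumer GPU.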