DeepSeek says new method can train AI more efficiently and cheaply
Chinese AI company DeepSeek has unveiled a new training method, Manifold-Constrained Hyper-Connections (mHC), which it says makes it possible to train large language models more efficiently and at lower cost, reports the South China Morning Post.
The method is a further development of Hyper-Connections, a technique originally developed by ByteDance in 2024. That technique, in turn, builds on the classic ResNet architecture from Microsoft Research Asia.
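To make the lineage concrete: ResNet's core idea is the residual connection, where a layer's input is added directly to its output, and Hyper-Connections generalize this by keeping several parallel copies of the hidden state and mixing them with learnable weights. The sketch below is a simplified illustration of that generalization, loosely based on ByteDance's published Hyper-Connections idea; it is not DeepSeek's mHC implementation, and all function names and shapes here are illustrative assumptions.

```python
import numpy as np

def block(x, W):
    # Stand-in for a transformer/ResNet sub-layer; any function of x works here.
    return np.tanh(x @ W)

def residual_step(x, W):
    # Classic ResNet residual connection: output = input + layer(input).
    return x + block(x, W)

def hyper_connection_step(xs, W, A, B):
    # Simplified hyper-connection: keep n parallel copies ("streams") of the
    # hidden state and mix them with learnable weights instead of a fixed
    # identity shortcut.
    #   xs : (n, d) stacked hidden-state streams
    #   A  : (n, n) learnable mixing of the streams on the shortcut path
    #   B  : (n,)   learnable weights writing the block output back per stream
    mixed = np.tensordot(A, xs, axes=(1, 0))   # (n, d): mix the streams
    h = block(mixed.sum(axis=0), W)            # one block call on the combined input
    return mixed + B[:, None] * h              # broadcast block output into streams

rng = np.random.default_rng(0)
d, n = 4, 2
W = rng.normal(size=(d, d))
x = rng.normal(size=d)

y = residual_step(x, W)
xs = np.stack([x, x])                          # expand one state into n streams
ys = hyper_connection_step(xs, W, np.eye(n), np.ones(n))
```

With a single stream, identity mixing, and unit write-back weights, the hyper-connection step collapses back to the plain residual connection, which is why the technique is described as a generalization of ResNet.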
DeepSeek says mHC provides more stable and scalable training without increasing computational costs, thanks to specific optimizations at the infrastructure level. The researchers have tested the technique on models with up to 27 billion parameters, with positive results.
