DeepSeek tests sparse attention to reduce AI costs

Reported by AI

Chinese AI firm DeepSeek is experimenting with sparse attention mechanisms to significantly lower the processing costs of large language models. The approach focuses computations on key parts of input data, potentially halving resource demands. This development could make advanced AI more accessible amid rising energy concerns.
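The details of DeepSeek's method have not been described here, but the general idea of sparse attention — each query attending only to a small subset of keys rather than all of them — can be sketched as follows. This is a minimal, illustrative top-k variant in NumPy; the function name, the top-k selection rule, and all dimensions are assumptions for the example, not DeepSeek's actual design.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def topk_sparse_attention(Q, K, V, k):
    """Toy sparse attention: each query attends only to its k
    highest-scoring keys; all other weights are forced to zero."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])             # (n_q, n_k)
    idx = np.argpartition(scores, -k, axis=-1)[:, -k:]  # top-k keys per query
    mask = np.full_like(scores, -np.inf)
    np.put_along_axis(mask, idx, 0.0, axis=-1)          # keep only top-k entries
    return softmax(scores + mask) @ V                   # (n_q, d)

rng = np.random.default_rng(0)
n, d = 8, 16
Q, K, V = rng.normal(size=(3, n, d))
out = topk_sparse_attention(Q, K, V, k=2)
print(out.shape)  # (8, 16)
```

In a full-scale model the savings come from skipping the masked score and value computations entirely, not just zeroing them afterward as this toy version does — which is how restricting attention to "key parts" of the input can cut compute and memory demands.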
