The Python Software Foundation has secured $1.5 million from Anthropic, the company behind Claude AI, for a two-year partnership focused on strengthening security across the Python ecosystem. The deal follows the foundation's rejection last year of similar US government funding over a clause tied to the government's anti-DEI policies. The investment aims to protect the Python Package Index from supply chain attacks and to support ongoing operations.
Python has become essential to modern AI development, powering frameworks like TensorFlow and PyTorch due to its accessibility and rich libraries. On January 15, 2026, the Python Software Foundation (PSF) announced a $1.5 million investment from Anthropic over the next two years.
Last year, the PSF rejected a comparable $1.5 million grant from the National Science Foundation (NSF). The decision stemmed from a clause that would have allowed the NSF to claw back the funds if the PSF ran afoul of the US government's anti-DEI policies. Loren Crary, the PSF's deputy executive director, explained the foundation's concerns in a statement at the time.
Anthropic's funding targets security improvements across the Python ecosystem, particularly the Python Package Index (PyPI). PyPI hosts hundreds of thousands of packages and serves millions of developers worldwide, but it remains a target for attackers who upload malicious packages. The partnership will develop automated review tooling for uploads, shifting PyPI from reactive measures toward proactive detection.
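As a rough illustration, the minimal sketch below shows the kind of static heuristic such an automated review pipeline might run against an uploaded source distribution. The flagged call names, the sdist filename, and the scanning logic are illustrative assumptions, not a description of PyPI's actual tooling.

```python
import ast
import tarfile

# Hypothetical heuristic: call names frequently abused in malicious packages.
SUSPICIOUS_CALLS = {"exec", "eval", "compile", "__import__"}

def find_suspicious_calls(source: str) -> list[str]:
    """Return descriptions of suspicious call sites in Python source."""
    try:
        tree = ast.parse(source)
    except SyntaxError:
        return []  # unparseable code would be routed to deeper review
    return [
        f"{node.func.id} at line {node.lineno}"
        for node in ast.walk(tree)
        if isinstance(node, ast.Call)
        and isinstance(node.func, ast.Name)
        and node.func.id in SUSPICIOUS_CALLS
    ]

def scan_sdist(path: str) -> dict[str, list[str]]:
    """Scan every .py file inside a source distribution tarball."""
    report = {}
    with tarfile.open(path) as archive:
        for member in archive.getmembers():
            if member.isfile() and member.name.endswith(".py"):
                source = archive.extractfile(member).read().decode("utf-8", "replace")
                if hits := find_suspicious_calls(source):
                    report[member.name] = hits
    return report

if __name__ == "__main__":
    # "example-1.0.tar.gz" is a hypothetical sdist filename.
    print(scan_sdist("example-1.0.tar.gz"))
```

A hand-written rule list like this is noisy on its own, since plenty of legitimate packages use exec or eval; that limitation is exactly what trained detection models aim to address.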
Key initiatives include creating a dataset of known malware to train detection tools that spot suspicious patterns. This approach could extend to other open-source repositories. Beyond security, the funds will sustain PyPI operations, the Developers in Residence program for CPython contributions, and community grants.
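To make the dataset idea described above concrete, here is a toy sketch of training a classifier on labeled code samples. The samples, the character n-gram features, and the model choice are all assumptions for illustration, not the partnership's actual design.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labeled corpus standing in for a curated malware dataset;
# real training samples would be full package sources, not one-liners.
samples = [
    "import requests\nrequests.get('http://example.com/beacon')",  # malicious-style
    "exec(__import__('base64').b64decode(payload))",               # malicious-style
    "def add(a, b):\n    return a + b",                            # benign
    "import json\nprint(json.dumps({'ok': True}))",                # benign
]
labels = [1, 1, 0, 0]  # 1 = suspicious, 0 = benign

# Character n-grams tolerate obfuscation better than word-level tokens.
model = make_pipeline(
    TfidfVectorizer(analyzer="char", ngram_range=(3, 5)),
    LogisticRegression(),
)
model.fit(samples, labels)

candidate = "exec(__import__('base64').b64decode(blob))"
print(model.predict_proba([candidate])[0][1])  # probability of 'suspicious'
```

Character n-grams are one plausible feature choice here because obfuscated malware often defeats word-level tokenization; a production system would combine such signals with metadata and human review.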
Anthropic's contribution underscores its own reliance on Python, blending self-interest with community support. As AI firms increasingly depend on open-source infrastructure, investments like this highlight the need for sustainable funding models amid long-running concerns about companies freeloading on open-source projects.