Faux ChatGPT, Claude API Packages Deliver JarkaStealer

Summary:
Two malicious Python packages, "gptplus" and "claudeai-eng," posed as tools for API integration with popular AI chatbots like OpenAI’s GPT-4 Turbo and Anthropic’s Claude, but instead delivered a newly documented infostealer called "JarkaStealer." Uploaded to the Python Package Index by a user with the alias "Xeroline," these packages exploited the popularity of generative AI platforms to target developers and organizations.

The attack capitalized on the growing demand for free or alternative access to paid AI services, a lure commonly used in malicious campaigns. Although the packages offered only limited demo functionality, the attacker simulated enough legitimate behavior to make them appear authentic, which likely convinced users the tools were genuine. Once installed, the packages deployed a Java Archive (JAR) file containing JarkaStealer, a lightweight infostealer sold for as little as $20 on Russian-language Dark Web forums, with optional add-ons priced between $3 and $10; its source code is also available on GitHub. JarkaStealer steals system and browser data, captures screenshots, and extracts session tokens from popular applications, though how reliably it performs these tasks remains debatable.

Security Officer Comments:
The packages survived on PyPI for nearly a year before being identified by Kaspersky researchers. By the time they were reported and taken down, they had been downloaded more than 1,700 times across over 30 countries, with the highest number of downloads in the United States. However, analytics from ClickPy indicate the download numbers may have been artificially inflated shortly after their release, likely to create the illusion of popularity and credibility.

Suggested Corrections:
Security professionals emphasize the importance of vetting software packages before downloading them, including checking user reviews and download statistics. George Apostolopoulos, a founding engineer at Endor Labs, pointed out that the attackers deliberately inflated download figures to exploit these common trust signals. "Many users don’t scrutinize the authenticity of packages, making them easy targets for such schemes," he noted.
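Because raw download counts can be gamed, release history is a more useful trust signal. PyPI exposes per-project metadata at its JSON endpoint (https://pypi.org/pypi/&lt;project&gt;/json). As a minimal sketch, the helper below summarizes basic signals (release count, first and latest upload dates) from an already-fetched response dict; the function name and the idea of treating a brand-new project with a sudden download spike as suspect are illustrative, not from the original report.

```python
from datetime import datetime

def release_signals(pypi_json: dict) -> dict:
    """Summarize basic trust signals from a parsed PyPI JSON API response.

    Operates on an already-parsed dict (shape: {"releases": {version:
    [{"upload_time": "..."}]}}), so no network access is needed here.
    A project with very few releases, all uploaded recently, deserves
    extra scrutiny regardless of its download numbers.
    """
    releases = pypi_json.get("releases", {})
    upload_times = [
        datetime.fromisoformat(f["upload_time"])
        for files in releases.values()
        for f in files
    ]
    return {
        "release_count": len(releases),
        "first_upload": min(upload_times, default=None),
        "latest_upload": max(upload_times, default=None),
    }
```

A young project whose entire release history spans a few days, yet shows thousands of downloads, matches the inflation pattern described above.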

For those who downloaded and used one of the malicious packages, the main recommendation is to immediately delete it. The malware doesn’t have persistence functionality, so it’s launched only when the package is used. However, all passwords and session tokens that were used on a victim’s machine could have been stolen by JarkaStealer, and so should be immediately changed or reissued.
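As a quick first step before deleting anything, an affected machine can be checked for the two reported package names using only the standard library. This is a minimal sketch; the function name and blocklist structure are illustrative, and the two entries come from the packages named in this report.

```python
from importlib import metadata

# Package names reported as malicious in the JarkaStealer campaign.
KNOWN_MALICIOUS = frozenset({"gptplus", "claudeai-eng"})

def find_malicious_installed(blocklist=KNOWN_MALICIOUS):
    """Return installed distribution names that appear in the blocklist."""
    found = set()
    for dist in metadata.distributions():
        name = (dist.metadata["Name"] or "").lower()
        if name in blocklist:
            found.add(name)
    return sorted(found)
```

Any hit should be removed (e.g., `pip uninstall gptplus`), followed by rotating every password and session token used on that machine, per the guidance above.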

It’s also recommended that developers be especially vigilant when working with open source software packages and inspect them thoroughly before integrating them into their projects. That includes a detailed analysis of each dependency and its supply chain, particularly around hyped topics such as the integration of AI technologies.
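One concrete inspection step is to download a package's source archive without installing it (e.g., `pip download --no-deps --no-binary :all: &lt;name&gt;`) and list its contents for file types that have no business in a typical pure-Python package, since JarkaStealer was delivered as a JAR file. The sketch below is illustrative: the suffix list and function name are assumptions, and a hit is a reason to read the code closely, not proof of malice.

```python
import tarfile

# File types unexpected inside a typical pure-Python source distribution.
SUSPICIOUS_SUFFIXES = (".jar", ".exe", ".dll", ".so")

def suspicious_members(sdist_path: str) -> list[str]:
    """List members of a .tar.gz source archive with unexpected file types."""
    with tarfile.open(sdist_path, "r:gz") as tar:
        return [
            m.name
            for m in tar.getmembers()
            if m.isfile() and m.name.lower().endswith(SUSPICIOUS_SUFFIXES)
        ]
```

Listing the archive is no substitute for reading the code, but it catches the crudest payload-smuggling cases cheaply.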

Link(s):
https://www.darkreading.com/application-security/faux-chatgpt-claude-api-packages-jarkastealer

https://www.kaspersky.com/blog/jarkastealer-in-pypi-packages/52640/