Recently, the author of a popular npm package injected code that wipes the files of users from Russia and Belarus. Even though the change was rolled back in a few hours, newer versions of the package still create a text file on the user's desktop when called. This sparked lively debates around the utility of so-called 'protestware', and whether such actions are in line with the spirit of open source.
Here, I want to address two things: whether such an action is allowed, and whether we should audit all third-party dependencies before using them.
On March 7, GitHub user RIAEvangelist released node-ipc versions 10.1.1 and 10.1.2, which contain code that overwrites files with a heart emoji if the machine's public IP address is from Russia or Belarus. This malware was obfuscated and not documented anywhere. It was removed in version 10.1.3 on the same day.
A few hours later, version 11.0.0 was published with less radical behavior: the package included the peacenotwar module, which, when called, prints a message to stdout and creates a text file on the user's desktop. This dependency is explicitly stated in the README.
(For more details on the whole incident, see this post by Snyk.)
Caught in a landslide
Presumably, the goal of these actions is to deter Russian citizens from contributing to the war, or simply to raise awareness of the fact that there is a war going on.
I will not argue whether such an approach is necessary or effective for those goals. Instead, I want to address a common argument whenever this kind of issue comes up: the license explicitly disclaims all warranties, so the author is free to do whatever they like.
Specifically, the MIT License, under which
node-ipc is licensed, states that
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED [...]
Hence, by using the software, the user accepts that it may not work well and may exhibit unwanted behaviors, such as crashing, hogging resources, or, well, deleting files.
No escape from reality
However, a license is not some magic cheat code that absolves the author of all responsibilities imposed by the law. Distributing malware is prohibited under the Computer Fraud and Abuse Act (CFAA) in the U.S., as well as in other countries with legal frameworks for cybersecurity, and in almost every case, deleting files without user consent is malicious behavior. Imagine that I receive an email containing a link that says "Download your new software," and I knowingly click the link only to get my drive wiped; does the fact that I clicked it voluntarily clear the sender of all liability for the harm?2
Even without such a legal framework, the problem is not that the author wrote malware, but that they distributed it through the npm Public Registry, whose terms forbid the deployment or delivery of such programs. People are free to write code that deletes their own hard drive, just not to spread it to other machines.
I’m just a poor boy
But wait, given this definition of malicious behavior, does that mean that if my code has a weird bug that could potentially wipe the user's files, I might go to jail? This brings us to a key point about how the law works that many programmers ignore: intent matters.
In the CFAA, every section specifying prohibited behaviors contains one of the phrases "knowingly", "intentionally", or "with intent to". Almost all laws distinguish between involuntary and deliberate violations, and in this case, the intent is quite evident. In fact, unintentional bugs are one of the main things covered by the "no warranty" clause in most open source licenses. It means that if emacs overheats the CPU when I press the spacebar, all I can do is make it interpret a temperature anomaly as "Control".
I need no sympathy
Most of the above concerns the original malware, which wipes users' files. What about the
peacenotwar module? Is it okay because it is explicitly stated in the project's README?
Here, the line is less clear-cut, and the behavior, though unexpected, is not malicious in intent. I do think that software authors should be able to express their political views in their own projects if they wish, since that is a basic freedom of expression. Software is not some Platonic ideal that resides outside the politics of our lives, and forcing it to be free of all politics is, well, political in and of itself.
(Whether these actions are effective or not is another question, which I won't discuss here.)
Nothing really matters
You can't trust code that you did not totally create yourself. [...] No amount of source-level verification or scrutiny will protect you from using untrusted code.
What then? If we cannot trust any code to be non-malicious, how do we make usable programs, let alone secure ones? Has all of open source been a futile endeavor?
Any way the wind blows
The solution is to approach the problem from a human perspective, not a technical one: we trust the author of the software to have acted in good faith.
There is nothing special about computers that allows us to put complete trust in any piece of code, and that is okay. Instead, we strive to build trust among developers so that we don't have to read every line of code at every level of abstraction to make sure our files won't be deleted. Being able to trust someone else without having to painstakingly go over their work gives us a foundation we can build upon, enabling us to collaborate with strangers and foster a healthy community of developers. In the words of Ken Thompson:
Perhaps it is more important to trust the people who wrote the software.
Of course, this is no panacea. Trust can be broken, sometimes in irreparable ways. However, throwing everything out the window and declaring that nobody should be trusted is not a viable option. That is why we have laws for computer crimes: to hold the malicious actors responsible, thus restoring trust among members of the community.
Someone will be malicious, and often they are not easy to find. Someone will make a mistake, which might break your code whether it is intentional or not. Someone will have to inspect the code with utmost care, for it is the only way to build trust in the first place. Even then, someone will slip through the cracks, eroding trust in the whole community.
But that someone should not be everyone.
As of now, the two affected versions, 10.1.1 and 10.1.2, have been removed from the npm registry, and the problem was assigned CVE-2022-23812.
This kind of incident is not new. In January, the maintainer of the popular npm libraries colors.js and faker.js sabotaged his own projects, making the former print gibberish text indefinitely and deleting the latter. In stark contrast with the current incident, npm quickly took over his packages while GitHub swiftly suspended his account. (For more details, see this article by BleepingComputer.)
In the end, RIAEvangelist reverted the change shortly afterwards, so the impact was not as serious, and they probably saw the shortsightedness of publishing the obfuscated malware. Some may no longer depend on their projects, some may continue to put faith in them, and some may not care at all.
One thing that hasn't changed is that we can still choose whom we trust.
The only way to make a man trustworthy is to trust him.
(Henry L. Stimson)
npm, by default, will update your dependencies to all future minor/patch versions. For example, a dependency declared as ^9.1.0 will use 9.2.2 when it is released, but not 10.1.1.
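The caret rule can be sketched as a small function. This is a simplified illustration of my own, not npm's actual implementation: it handles only plain x.y.z versions with a non-zero major component, ignoring prerelease tags, ^0.y.z pinning, and the rest of the semver grammar.

```javascript
// Simplified sketch of npm's caret (^) range semantics for plain
// x.y.z versions with a non-zero major. The real semver grammar
// (prerelease tags, ^0.y.z pinning, etc.) is much richer.
function caretSatisfies(range, version) {
  const [bMaj, bMin, bPat] = range.slice(1).split('.').map(Number); // strip '^'
  const [vMaj, vMin, vPat] = version.split('.').map(Number);
  if (vMaj !== bMaj) return false;        // ^ never crosses a major bump
  if (vMin !== bMin) return vMin > bMin;  // any later minor is allowed
  return vPat >= bPat;                    // same minor: need >= patch
}

// caretSatisfies('^9.1.0', '9.2.2')  → true  (minor bump, auto-installed)
// caretSatisfies('^9.1.0', '10.1.1') → false (major bump, never auto-installed)
```

This is why a breaking or malicious major release does not reach users automatically, while a minor or patch release does.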