
On the justification of the four freedoms

Ethics in the digital world


Louigi Verona
July 2016 - October 2017

The free software movement is a social movement with the goal of guaranteeing certain freedoms for software users. It was formally founded by Richard Stallman in 1983 and has since campaigned for the importance of these freedoms and against any actions taken by software companies, large and small, that attack these freedoms. The Free Software Foundation also maintains the pivotal free software license, the GNU General Public License, colloquially known as the GPL.[0]

Over the years there has been some criticism leveled against both the campaigns run by the Foundation and its GPL license. Very few people, however, seem to question the philosophical basis of the free software movement, namely, what it calls "essential user freedoms". Research reveals virtually no published analysis of the writings of Stallman, who laid out the theoretical framework of the movement. Its ideas remain unchallenged, and the validity of the "essential user freedoms" is usually taken for granted.

The goal of this treatise is to demonstrate the following:

It is, perhaps, also helpful to note what this treatise does not say:

Importantly, the purpose of this work is not to deliver final judgments, but instead to open the door to a conversation about the ethical and political challenges that the world of new technology has bestowed upon us: a conversation free of ideology and preconceptions, based on facts and reason.

The author is a proponent of free software and strongly believes that it has an important and necessary role in society, without being touted as the only ethical choice.




1. Theoretical framework of the Free Software movement

The basis of the theoretical framework is the Free Software Definition, which defines the terms "free" software and "proprietary" software, with "free" software being programs that respect users' freedoms.

The freedoms in question were formulated by Stallman over the years and are as follows:

The freedom to run the program as you wish, for any purpose (freedom 0).
The freedom to study how the program works, and change it so it does your computing as you wish (freedom 1). Access to the source code is a precondition for this.
The freedom to redistribute copies so you can help your neighbor (freedom 2).
The freedom to distribute copies of your modified versions to others (freedom 3). By doing this you can give the whole community a chance to benefit from your changes. Access to the source code is a precondition for this.[3]

If a software program denies these freedoms to the user then, according to Stallman, this "makes the program an instrument of unjust power", and such a program is called "proprietary".[3]

"We campaign for these freedoms because everyone deserves them." [3]


The importance of Stallman's pronouncements should not be underrated. This is one of the first formulations of ethics in the realm of software, and it stakes a claim to a very bold way of dividing people and their actions into "good" and "evil". Such a claim requires scrutiny.

A careful reader will recognize that Stallman is actually making two separate claims:

  1. Proprietary software is unjust
  2. The solution to this is to adopt certain freedoms as essential

The adoption of the proposed user freedoms then leads to making sure that people avoid proprietary software, and that resources are channeled into creating and adopting "free" software as the only ethical option.

What case does Stallman make to support these claims?

The whole theoretical framework of the Free Software movement is laid out in 12 essays at gnu.org in the "Philosophy" section, under the title "Introduction".[1] An additional link, "History of GNU", leads to a page with several more essays talking about the history of the project and an article in which Stallman recalls how the rise of proprietary (closed source) software has changed his hacker community and inspired him to take action against what he perceives as control of corporations over "what kind of society we are allowed to have".[2]

Among the 12 essays, 7 deal with the philosophy itself, 1 lists alleged problems with non-"free" software, 1 lists motivations for writing "free" software, 1 is a dystopian fiction story and 2 deal with government adoption of "free" software.

There are an additional 28 essays, as of the moment of writing, mostly commenting on recent events and largely repeating the message laid out in the "Introduction".

Stallman dedicates considerable time to arguing for claim 1. In fact, the bulk of his writing is dedicated to outlining the unfair character of proprietary software. However, there is hardly anything in his essays dedicated to demonstrating why "free" software is a solution to the problem and, if it is, why this solution should be preferred to other possible solutions.

Stallman seems to think that the problem at hand and his proposed solution are obvious and require no special justification, but of course this is not the case, and, as will be demonstrated in this treatise, his perception of the world of software lacks nuance.

In general, Stallman does not present his arguments in any organized manner; his handling of them is fairly unsystematic.

This is not a jab at the content of Stallman's writings, as his manner of organizing an argument has no bearing on whether it is correct or not. It does, however, remain relevant to anyone investigating his claims.

The reason this is mentioned at all is that it requires considerable effort to locate Stallman's arguments and then organize them in a systematic fashion. As a result, arguments in this treatise are rarely presented to the reader in the same form as they appear in Stallman's essays. I always link to Stallman's original writings when referencing his position and devote significant time to explaining why I believe a certain essay or line of reasoning can be summarized in the way I present it.

This state of affairs exposes investigators to the possible reproach that some argument, buried in the text, has been missed or, worse, misinterpreted. I have attempted to make my analysis of Stallman's writings as careful, thorough and charitable as possible. In a number of instances I believe my interpretation makes Stallman's arguments stronger.


In the following chapters we will examine both of Stallman's claims. In chapter 2 we will examine the case he makes for the injustice of proprietary software and look at the evidence he provides. In chapter 3 we will examine his second claim. Chapter 4 will outline the conclusions and discuss proposed solutions.



2. Is proprietary software unjust?

Stallman does not make his case for claim 1 in any organized manner. He often simply postulates that proprietary software is unjust, without giving much explanation as to why that is so. In one of his articles he writes:

Proprietary software, also called nonfree software, means software that doesn't respect users' freedom and community. This means that its developer or owner has power over its users. This power is itself an injustice.[4]

In another article he expands on the point:

If the users don't control the program, the program controls the users. With proprietary software, there is always some entity, the developer or “owner” of the program, that controls the program—and through it, exercises power over its users. A nonfree program is a yoke, an instrument of unjust power.[5]

But all Stallman does here is repeat his previous statement. There is no explanation as to why he considers this power unjust, nor, in fact, whether the whole framing of the developer/user relationship in terms of the power of one over the other is even legitimate.

Most of Stallman's justification seems to be found in his essay "Free Software Is Even More Important Now", under the subtitles "The Injustice of Proprietariness" and "Free Software: More Than “Advantages”".[5] His arguments seem to be these:

  1. if the user cannot change the source code, then he is in theory open to abuse from the author of the program
  2. abuse from proprietary software developers is standard practice
  3. proprietary software keeps people divided and reduces cooperation

In effect, all three points are but one argument and can be summarized as "dependence on a software vendor". Stallman typically uses exactly this kind of language, referring to "dependence on megacorporations".[134] The three statements listed are the components that appear to constitute Stallman's contention.

It is unclear whether Stallman's position is consequentialist in nature, i.e. whether what matters to him are possible negative consequences, or whether he despises the very idea of someone not giving another person the ability to change a program and believes that it is wrong on principle, even if no negative consequences follow and/or these negative consequences have a solution other than "free" software.

If it is the latter, then the position runs contrary to the secular approach to ethics, with its emphasis on consequences of actions, and should be considered unreasonable until rational reasons are put forward to justify the principle. I will, however, be assuming that Stallman's philosophy is pragmatic in nature and that what matters to him are consequences that he believes are real.

This assumption carries with it an obligation that if consequences are demonstrably much less grave or even absent in certain cases, this change of circumstances must influence the pragmatic philosophy.

It is also important to note that although Stallman posits other arguments, all of them depend on the idea that proprietary software is unjust. To give an example, in the same essay Stallman writes:

Schools (and this includes all educational activities) influence the future of society through what they teach. They should teach exclusively free software, so as to use their influence for the good. To teach a proprietary program is to implant dependence, which goes against the mission of education. By training in use of free software, schools will direct society's future towards freedom, and help talented programmers master the craft.[5]

However, this argument only works if we already believe that proprietary software is "bad", that it "implants dependence" and that it does not allow for cooperation and education. We will therefore treat such arguments as secondary and set them aside, as they are not important to Stallman's primary case.



2.1 Abuse from the author of software is possible

If the users don't control the program, the program controls the users. With proprietary software, there is always some entity, the developer or “owner” of the program, that controls the program—and through it, exercises power over its users.[5]

It is entirely true that a closed source package provides little transparency of its inner workings to the user. Our goal is to understand the implications of this circumstance and whether it shows proprietary software to be unjust.

We should start by noticing that such a situation exists in many trades. For example, when we go to a restaurant, we hope that the cook has not abused our trust and will not poison us. When we board an airplane, we have no access to its engines and would not be able to check for ourselves whether its design is adequate and poses no safety risks. We effectively put our trust in professionals whose job it is to make sure that food and airplane engines are safe.

This trust is pragmatic and rational. If we tried to be experts at everything, we would effectively be denying ourselves all the fruits of the division of labor, which enables people to specialize and advance their craft.

At the same time, the dependence of consumers on producers is not an issue that is ignored. There is a feedback loop that allows us to cease supporting businesses and establishments that repeatedly betray our expectations. In some cases these businesses can be severely punished; in others, holding large corporations responsible is objectively difficult to achieve.

The free market, effective as it is, is therefore not the only mechanism to manage the relationship between producers and consumers. Businesses are regulated and are often required to satisfy strict safety and quality requirements. Consumer rights protection is a constantly developing body of law, dramatically improving the quality and reliability of services in the private marketplace.

Even very reliable businesses will occasionally be guilty of misconduct. A cook might use spoiled milk or unwashed vegetables, either deliberately or by mistake. This problem is real: 351,000 people die of food poisoning globally every year.[6] Studies also indicate that in the US almost half of food money is spent on restaurant-prepared food.[7] Thus, restaurant food safety is an important concern.

This concern is widely recognized, and the food industry is heavily regulated. We do not, however, consider the restaurant to be an unethical business practice in and of itself. The theoretical possibility of misconduct is simply not enough to consider a practice unethical. All it does is call for a certain level of control, typically provided by a government institution and/or an independent third party.

It should be recognized that analogies with other trades are fully applicable in this context. Stallman's argument is not about software; it is about control. He argues that outsourcing an activity to a professional is problematic - so much so, in fact, that it should be rejected outright - because such a position of "power" can in theory be abused.

Let us reiterate: whenever we outsource a certain activity to professionals, we are giving up a portion of our control and have the professionals exercise "power" over us. As professionals, we ourselves exercise such "power" over our customers every day. This is the basis of the modern economy - the division of labor.

Because Stallman does not explain how software development differs in this regard from other areas of human expertise, his position, consistently applied, becomes a position against the division of labor.

Now, I am not saying that Stallman is against the division of labor. That would be quite an uncharitable reading of what he is saying. But I do believe that if we follow his line of thought, that's where we end up. Stallman seems to limit his reasoning to software alone and never addresses the wider implications of what he is saying. He does not explain why software should be taken out of this wider context and, honestly, I haven't been able to come up with any ideas either. In other words, I currently don't see how Stallman's argument does not become an argument against the division of labor as a concept.

And even if we throw out all of the comparisons with other industries, his argument definitely implies that the division of labor in the realm of software creates an imbalance of power - a claim that requires more than mere assertion. In later sections and chapters we will see that a lot of reasoning and evidence goes against this claim.

One might, perhaps, argue that professionalization does not mean making software proprietary. This will be addressed in great detail in Chapter 3. To give a brief response here: turning the development of a product which users are free to modify and redistribute into a profitable business is objectively difficult. In today's world and for most software products, profitable software development is guaranteed only when developers protect the fruits of their labor by keeping the source code to themselves. At the same time, taking advantage of access to source code is, in most cases, not very realistic. This latter point is going to be addressed in multiple sections of this book, the earliest of these instances coming up in just a few paragraphs.


When evaluating Stallman's argument, we must also address the language he uses. As we will see in many of the following sections, Stallman's writing style fails to contribute to a balanced discussion. He repeatedly employs a number of rhetorical devices that are best left to propaganda, not to a serious discourse on ethics.

For instance, he often uses what is known as "loaded language", in which the words chosen do not aim for an impartial representation but are used to influence the emotional state of the reader and guide them toward a particular conclusion through visceral reactions.

It is obvious that using the phrase "exercise of power" is likely to evoke a negative response, and thus make things appear more sinister than they really are. As mentioned above, we are subject to such "exercise of power" daily and from multiple sources - and do not consider it problematic to the point of rejecting professional services.

He also uses what is known as "persuasive definition", in which a word is given an uncommon definition in order to advance an agenda. Stallman calls non-"free" programs a "yoke, an instrument of unjust power" and calls proprietary software "malware".

And finally, in that same paragraph he uses equivocation to grossly misrepresent the actual state of affairs. He writes: "If the users don't control the program, the program controls the users".

By employing his own definition of "controlling a program" in the first part of the sentence, which in this case would mean "having access to a program's source code", and then using the word "control" in its ordinary meaning of "exercising command or power over someone", he is able to make it appear as if proprietary software developers have nefarious plans towards the user.

But, of course, not having access to source code does not automatically mean that the program "controls the user". It's not even clear what that actually means. We can, perhaps, intend it as a metaphor, that a program can perform actions that the user does not want it to, or that it cannot perform actions that the user wants it to.

But anyone who cannot program will find themselves in these situations daily, whether the source code is available or not. The inability to write code prevents the user from being "in control" of any software, regardless of its standing in relation to Stallman's freedoms.

The only segment of the population for which the problem of controlling software makes sense at all is programmers. But even among them there is a great deal of specialization: someone who has done web development all their life would require substantial effort to make sense of a desktop environment, and someone who has done graphics development would be lost in programs that deal with audio processing. The differences between programming languages and environments, and the amount of special knowledge required, are enough that any meaningful change is likely to be impossible unless substantial resources are spent on acquiring the skills necessary for the job.

In other words, the availability of source code in most cases is irrelevant and does not solve the problem of our metaphorical "program controlling the user". In fact, Stallman himself acknowledges this when he speaks about the non-"free" nature of JavaScript.[27] Although JavaScript code has to be delivered to the client in its entirety, Stallman still claims it is non-"free", since it is often distributed in minified form and is difficult to edit. In other words, source code should not simply be available, but also immediately and easily editable.

This is an interesting detail. Here we see Stallman trying to explain that his interest is not formal alignment with his proposed freedoms, but their actual availability to the user, which is a commendable approach. What is not commendable, however, is the lack of recognition that in the case of a user who cannot program, the source code is definitely not editable, as the user has no skills to do that. Arguing that a user is able to learn how to code would be invoking a double standard. A minified JavaScript library can definitely be edited in such a way as to make it readable, and, arguably, this will take less effort than learning how to code (in fact, there are tools that do a good job of deobfuscating JavaScript code). And thus, while formally a user has the freedom to edit the code, in actuality she cannot exercise that freedom.
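To make this concrete, here is a small, entirely hypothetical illustration (the function and field names are invented): the first snippet is the kind of minified JavaScript a site typically ships, the second is the same code after a pretty-printing pass and some manual renaming. Restoring readability this way is tedious but largely mechanical - which is the point being made above: it takes far less effort than learning to program from scratch.

    // Minified form, as typically delivered to the browser (hypothetical example):
    function f(a,b){return a.filter(function(c){return c.price>b}).map(function(c){return c.name})}

    // The same code after pretty-printing and manually renaming the variables:
    function namesOfItemsAbovePrice(items, minPrice) {
        return items
            .filter(function (item) { return item.price > minPrice; })
            .map(function (item) { return item.name; });
    }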

It can, perhaps, be argued that the availability of source code allows one to find help more easily, but we will discuss this in greater detail when analyzing Stallman's second claim, that "free" software is the solution. Suffice it to say, the advantages are not very obvious.

And this still does not save Stallman from a double standard: being dependent on a developer of a proprietary program is bad, but being dependent on a developer of a "free" program is somehow acceptable:

A person or company has the right to cease to work on a particular program; the wrong here is Microsoft does this after having made the users dependent on Microsoft, because they are not free to ask anyone else to work on the program for them.[129]

It can easily be argued that what Stallman does is simply advocate for users to stop being dependent on proprietary software developers and become dependent instead on the "free" software community, which they now have to ask to develop programs for them - as if the former dependency is "unjust" and the latter is "just".


So, let us reiterate both the initial argument and my response.

According to Stallman, proprietary software is unjust because users do not have complete control over the software, by which Stallman means having access to the source code. Under the assumption that Stallman's philosophy is consequentialist, the lack of complete control on the part of the users is problematic because they are then open to possible abuse from the software developer.

I, however, have argued that the mere theoretical possibility of misconduct is not enough to consider a practice unethical. All it does is call for a certain level of control, typically provided by a government institution and/or an independent third party.

A more rational approach would be to analyze not only whether abuse is possible, but whether it is also probable.

I have also argued that the cost of complete control of software by the user is very high, as most consumers of software are not software engineers, but have deferred expanding on this point in detail until Chapter 3.

And this is what leads us to Stallman's second argument.



2.2 Abuse from proprietary software developers is standard practice

The full phrase used by Stallman is "...proprietary programs are designed to spy on the users, restrict them, censor them, and abuse them." He then links to a separate page called "Proprietary Software Is Often Malware". Apart from text, which is partially quoted below, the page contains a table linking to examples of proprietary software abusing its users.

Here's what Stallman writes:

Power corrupts, so the proprietary program's developer is tempted to design the program to mistreat its users—that is, to make it malware. (Malware means software whose functioning mistreats the user.) Of course, the developer usually does not do this out of malice, but rather to put the users at a disadvantage. That does not make it any less nasty or more legitimate.

Yielding to that temptation has become ever more frequent; nowadays it is standard practice. Modern proprietary software is software for suckers!

Users of proprietary software are defenseless against these forms of mistreatment. The way to avoid them is by insisting on free (freedom-respecting) software. Since free software is controlled by its users, they have a pretty good defense against malicious software functionality. [4]

We again see signs of loaded language and persuasive definition, but let us return to this paragraph later, after we've looked through the examples of proprietary software abuses that Stallman offers. The only important thing to note here is that Stallman claims that users are being routinely mistreated, that this mistreatment is intentional, and that this is "standard practice".

The examples designed to prove this claim are divided into a number of sections, namely: backdoors, censorship, insecurity, sabotage, interference, surveillance, Digital Restrictions Management, jails, tyrants. And then specifically by company products: "Apple Malware", "Microsoft Malware", "Malware in mobile devices", "Malware in the Amazon Swindle", etc.

Note: the section headings listed above contain inaccuracies, inserted by Stallman as a reflection of his political position. "Digital Restrictions Management" is actually "Digital Rights Management". "Amazon Swindle" refers to the Amazon Kindle. The term malware, as we will see in the sections below, is name-calling on Stallman's part and does not refer to actual malware. For example, he considers Microsoft Windows as a whole to be malware.[130]

We will now go through each section and study the examples provided in order to see whether they indeed demonstrate the validity of Stallman's claim.


However, before we commence with our analysis, we should answer one very important question, namely: what is the use of bringing up these examples? Or, put another way: what are these examples intended to prove?

The purpose of the examples might be this: to show that abuse by proprietary software developers is not simply possible, but also probable. That the purpose of these examples is to show abuse being probable is, however, my own interpretation, and a very charitable one at that. Stallman does not say that this is what he is doing and seems to be committing the fallacy of circular reasoning, which is trivial to show.

The important word in the quote above is "mistreat". Stallman never defines this word directly, but it is actually a key term. He calls whole operating systems "malware"[130] and then loosely defines malware as software which "mistreats" the user.

In his examples he often seems to define "mistreatment" as software not respecting the principles of "free" software. But if his goal is to show that proprietary software is unjust and THUS we need "free" software, then saying that proprietary software is unjust because it is not "free" does not prove any injustice. Instead, this is classic circular reasoning: "free" software is just because proprietary software is unjust, and proprietary software is unjust because it is not "free" software.

Therefore, any attempt to understand Stallman's examples of "mistreatment of users" in terms of his proposed freedoms is invalid reasoning and should be rejected - unless Stallman's argument is simpler and amounts to this: proprietary software allows for misconduct, therefore it is unjust. End of argument. In that case the examples are literally just examples of the injustice of proprietary software, without claiming to be proof of any kind, and are useless to Stallman's case.

These examples can also be looked at as a sort of additional evidence that what Stallman theorized about is actually happening - and happening routinely, on an everyday basis, as standard practice of the proprietary software industry.

Whether this is what Stallman intended is difficult to infer from his writing. But as I believe that this is the better case for "free" software, I will look at his data from the point of view of this more sophisticated framework. One can indeed attempt to demonstrate that the software industry is especially prone to customer rights violations - with these violations defined in terms other than Stallman's software freedoms, to avoid invalid reasoning - and thus make the case for proprietary software being unjust.

Let us now proceed to evaluate evidence for this refined claim.

Important note: the reader is encouraged to go through these sections, as they are much more than just a list of examples. Section 2.2 supplies very relevant background on the kinds of problems that worry "free" software supporters, and the kinds of problems that should worry everyone. This section is also extremely important in evaluating the veracity of Stallman's general narrative that proprietary software is a source of routine abuse of its users.


2.2.1 Backdoors

A backdoor is a method, often secret, of bypassing normal authentication in a product, computer system, cryptosystem or algorithm. Backdoors are often used for obtaining unauthorized remote access to a computer. Stallman lists examples of demonstrated backdoors in proprietary software. As there are quite a few of them, we will take a look at only several, and then give an overview of the whole section.
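Before turning to the examples, a purely illustrative sketch may help fix the term. The following hypothetical login routine (all names and credentials are invented and not taken from any of Stallman's examples) shows what a deliberately planted backdoor looks like: alongside the normal authentication path there is an undocumented credential pair that bypasses it. Much of the disagreement below is about whether a given finding is something like this, or merely an unintentional security flaw.

    // Hypothetical sketch of a hardcoded-credential backdoor; all names are invented.
    function checkUserDatabase(user, password) {
        // Stand-in for the legitimate check against a real user database.
        return false;
    }

    function authenticate(user, password) {
        // Undocumented bypass: anyone who knows this pair is let in,
        // regardless of what the user database says.
        if (user === "support" && password === "letmein") {
            return true;
        }
        // Normal path: verify the supplied credentials.
        return checkUserDatabase(user, password);
    }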


• The universal back door in portable phones is employed to listen through their microphones.

Here Stallman links to an article by Bruce Schneier about the possibility of eavesdropping, either by mobile providers remotely installing a piece of software or by the FBI planting a hardware bug in the phone.[8] The eavesdropping mentioned in the article is in the context of criminal prosecution - essentially, eavesdropping on a mafia boss. The article is speculative, and Schneier clearly says that he does not have any evidence and that he is merely hypothesizing. One of his sources, a BBC news article, says:

Mobiles communicate with their base station on a frequency separate from the one used for talking. If you have details of the frequencies and encryption codes being used you can listen in to what is being said in the immediate vicinity of any phone in the network.

According to some reports, intelligence services do not even need to obtain permission from the networks to get their hands on the codes.

So provided it is switched on, a mobile sitting on the desk of a politician or businessman can act as a powerful, undetectable bug.[9]

This is a scary prospect, but the evidence is sketchy at best, with "some reports" being the only source of these stories. Even if true, it is not entirely clear if "free" software can help avoid such eavesdropping.

Another article Stallman links to is a piece by Thom Holwerda on how every mobile phone has an additional operating system that runs on the baseband processor.[10] Holwerda explains that the software is buggy and can be exploited by a third party, and then argues that "insecurity of baseband software is not by error; it's by design". He goes on to explain that this proprietary operating system is old and was written in the 90s, when the approach to security was different. In other words, he does not claim that Qualcomm wrote this operating system to deliberately abuse its users. He does argue that if the operating system were free, it would more likely have been updated to comply with modern security standards.

Conclusion: Stallman's claim is misleading. Existing evidence tells us about distinct cases where software or hardware bugs have been installed for investigative work. The sources Stallman links to specifically explain that this state of affairs is not deliberate. Finally, he does not mention that many modern processors already use an open implementation of such an operating system called OKL4. Using open source implementations (not necessarily "free" per Stallman's definition) could be prudent for a lot, if not most, standardized software, especially firmware.


• Dell computers, shipped with Windows, had a bogus root certificate that allowed anyone (not just Dell) to remotely authorize any software to run on the computer.

It was a bug in Dell's setup that was promptly fixed.[36] Such errors routinely happen (and get quickly fixed) everywhere, including in "free" software distributions. There is no reason to believe that this has anything to do with proprietary software, nor is there any evidence that Dell did this intentionally to mistreat its users or "put them at a disadvantage". In fact, even suggesting this is blatantly dishonest. Yet Stallman lists a security bug as an example of intentional exercise of power over the user.

Conclusion: This is a clear case of misrepresenting a quickly fixed security bug as an intentional backdoor.


• Baidu's proprietary Android library, Moplus, has a back door that can “upload files” as well as forcibly install apps. It is used by 14,000 Android applications.

This is a case of a serious security vulnerability in an SDK, but Stallman's wording gives an impression of malice on the part of Baidu, as if this were a backdoor intentionally introduced by Baidu as part of "exercising power over the user". Also, the sentence is written in the present tense, as if the vulnerability is still out there, although it was fixed within days of being reported.[37]

The functionality in Baidu's SDK allowed a third party to silently install apps, make calls and send fake SMS messages. Someone discovered this vulnerability and wrote a worm that was installing unwanted apps on rooted devices. It is through discovering this worm that the compromising SDK code was found. The security software company Trend Micro worked with Baidu to help resolve the problem.

There is, however, zero evidence that this was done by Baidu in order to mistreat users, silently install apps, send fake SMS messages or in any other way control users' devices, and there is zero evidence that such activity was ever performed by the company.

While it is not clear why this functionality was there - whether it was originally intended for test purposes, was an architecture error, or existed for some other reason - it seems improbable even in principle that a large company would intentionally build malicious code into its SDK in order to send fake SMS messages. Not only are such vulnerabilities scouted for and discovered by security companies relatively quickly; the risk of being discovered and the questionable effectiveness of such practices for a company with an established business also make it a wild suggestion. But most importantly, it is not supported by the existing evidence.

Finally, free software is not a guarantee against such situations. It can be argued that making SDKs open source would allow vulnerabilities in the code to be discovered much earlier. Whether there is any evidence for this claim will be discussed in 2.2.3 Proprietary Insecurity. But this is a separate discussion and not relevant to the argument Stallman tries to support, that the "developer is tempted to design the program to mistreat its users" and that mistreating the user is "standard practice".

Conclusion: This is another clear case of misrepresenting a quickly fixed security vulnerability as an intentional backdoor on the part of the developer.


• Microsoft has already backdoored its disk encryption

We are linked to an article by Micah Lee on how Microsoft has added a feature that automatically backs up the user's encryption key to the cloud.[11] The choice of the word "backdoor" here is questionable, as is the implication that Microsoft did this on purpose to get access to users' data. What the article really says is that this is a decision that can be considered poor from a security standpoint, but the author does not claim that Microsoft has created a backdoor to abuse users. In fact, the purpose of the feature is quite the opposite: to help users in case the recovery key is lost:

“When a device goes into recovery mode, and the user doesn’t have access to the recovery key, the data on the drive will become permanently inaccessible. Based on the possibility of this outcome and a broad survey of customer feedback we chose to automatically backup the user recovery key,” a Microsoft spokesperson told me. “The recovery key requires physical access to the user device and is not useful without it.”[11]

It also explains that users are able to log in using a domain other than a Microsoft account, in which case the key will not be sent to Microsoft at all:

If you login to Windows using your company’s or university’s Windows domain, then your recovery key will get sent to a server controlled by your company or university instead of Microsoft — but still, you can’t prevent device encryption from sending your recovery key.[11]

While all of this can be frustrating to more advanced and security-conscious users, and we can argue that an option to store the key locally is required, this is definitely not "user abuse by proprietary software", nor is it even a backdoor in any common sense. It is difficult to see any conscious plan to put users at a disadvantage, as many Microsoft products do not have such a feature and this is only true for the Home edition of Windows. For example, BitLocker, available in the Pro and Enterprise versions, does not have this problem and allows the user to store keys locally.

Additionally, if advanced users want to make sure that Microsoft does not have the key, the same article explains how to re-encrypt the disk without the encryption keys being sent to Microsoft's servers: go to OneDrive, remove the encryption key, and create a new recovery key which can be stored locally.

It is trivial to find other sources that support the same conclusion. Here is an example:

It may be true that Microsoft has the decryption keys to your encrypted hard disk if you bought a PC with Windows 10 or Windows 8.1 preinstalled, if it supports device encryption, and if you use a Microsoft account to log into Windows. But it isn't a security disaster that they do, and if you aren't happy that they do, it takes no more than a couple of minutes to delete the copy of the key they hold and then update your system to render their key useless. This can be done on any Windows version, even Home.[12]

Conclusion: Stallman misrepresents the real situation by using the term "backdoor" incorrectly and implying malice on the part of Microsoft, although the article he links to does not support such a view. One has to stretch the truth to call the feature in question a backdoor.


• Modern gratis game apps collect a wide range of data about their users and their users' friends and associates.

In this example Stallman supplies more text:

Modern gratis game cr…apps collect a wide range of data about their users and their users' friends and associates.

Even nastier, they do it through ad networks that merge the data collected by various cr…apps and sites made by different companies.

They use this data to manipulate people to buy things, and hunt for “whales” who can be led to spend a lot of money. They also use a back door to manipulate the game play for specific players.

While the article describes gratis games, games that cost money can use the same tactics.

Yet another example of how Stallman consistently uses rhetorical devices (name-calling: "cr...apps") that are more characteristic of propaganda than of careful analysis.

The article he links to is a piece written by "an Anonymous Free to Play Producer" and titled "'We Own You' - Confessions of an Anonymous Free to Play Producer". [13]

Unfortunately, the article is not an impartial narrative, but a sensationalist essay that purports to "confess secrets about free to play apps", and uses loaded language that is so typical of Stallman's own essays.

The article tells the story of a game producer who built a system to collect user data and used it to push advertisements that would allow him to increase sales. He makes sure to point out that he personally was against this, but that the CEO of the company insisted on collecting even more data. He then describes how advertising companies collect user data to display matching ads.

Your IP address says you are in America, but you buy virtual items featuring the flag of another country, we can start to figure out if you are on vacation, or immigrated. Perhaps English is not your first language. We use all of this to send you personalized Push Notifications, and show you store specials and items we think you will want.[13]

He then claims that such companies will go after individual users, track them on Facebook, friend them using fake accounts and then create custom items for them to buy. He also claims that game producers will adjust a game to make levels more difficult and force players to make in-app purchases.

And if you are a whale, we take Facebook stalking to a whole new level. You spend enough money, we will friend you. Not officially, but with a fake account. Maybe it’s a hot girl who shows too much cleavage? That’s us. We learned as much before friending you, but once you let us in, we have the keys to the kingdom. We will use everything to figure out how to sell to you. I remember we had a whale in one game that loved American Football despite living in Saudi Arabia. We built several custom virtual items in both his favorite team colors and their opponents, just to sell to this one guy. You better believe he bought them. And these are just vanity items. We will flat out adjust a game to make it behave just like it did last time the person bought IAP. Was a level too hard? Well now they are all that same difficulty.[13]

He then finishes the article by advising readers to purchase paid versions, so that data is no longer collected:

Every time you play a free to play game, you just build this giant online database of who you are, who your friends are and what you like and don’t like. This data is sold, bought and traded between large companies I have worked for. You want to put a stop to this? Stop playing free games. Buy a game for 4.99 or 9.99. We don’t want to be making games like this, and we don’t want another meeting about retention, cohorts or churn.[13]

All of this material is written in an emotional and bullying tone, to make readers feel humiliated and personally targeted.

However, a lot of what is written is half-truths at best. It is true that modern freemium apps have advertising SDKs built into them. In turn, these SDKs collect user data. This user data is very strictly regulated in many countries, which does not allow companies like Facebook and Google to sell information that can zero in on a particular person.[14][19][20][21][22] US information privacy law is actually one of the weakest, and several initiatives are currently under way to regulate data usage.[18]

Additionally, much has been written on how anonymized data can still be used to narrow down to a particular individual through big data analysis.[23] As of this writing, no scalable method is known. In some cases it is possible to point to a particular person with some probability, but no method is known to routinely figure out who you are based on anonymized data. In principle, however, this is a very real concern that has to be addressed, and security- and privacy-conscious organizations can help a lot in moving things forward.
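To illustrate the kind of analysis meant here, below is a toy sketch of a so-called linkage attack, with entirely made-up records and field names: a dataset that has been stripped of names but still carries quasi-identifiers such as ZIP code, birth date and gender can be joined against a public record to recover the person behind a row. Real-world re-identification is far harder than this toy case suggests, which is precisely why no scalable method is known; but the sketch shows why "anonymized" does not automatically mean "anonymous".

    // Toy sketch of a linkage attack; every record here is invented.
    // An "anonymized" dataset that keeps quasi-identifiers next to behavioral data:
    const anonymized = [
        { zip: "90210", birthDate: "1985-03-14", gender: "F", purchases: ["in-app gems", "season pass"] },
        { zip: "10001", birthDate: "1979-11-02", gender: "M", purchases: ["remove ads"] }
    ];

    // A public dataset (for example, a voter roll) carrying the same
    // quasi-identifiers together with names:
    const publicRecords = [
        { name: "Jane Roe", zip: "90210", birthDate: "1985-03-14", gender: "F" }
    ];

    // Joining the two on the quasi-identifiers re-identifies the person
    // behind an "anonymous" row whenever the match is unique.
    for (const row of anonymized) {
        const matches = publicRecords.filter(function (p) {
            return p.zip === row.zip && p.birthDate === row.birthDate && p.gender === row.gender;
        });
        if (matches.length === 1) {
            console.log(matches[0].name + " likely bought: " + row.purchases.join(", "));
        }
    }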

The claim about fake Facebook accounts requires a significant amount of supporting evidence to be believed. The narrative makes little sense and is unlikely to be a description of actual practice. In many countries this would actually be illegal. So even if there is evidence for such cases (which Stallman does not provide), these are doubtless singular cases for which companies should be severely punished.

The claim that app developers change level difficulties on the fly, depending on how the user is doing, comes with no evidence. It is a typical claim that people make about slot machines in casinos, where they perceive patterns in sequences of randomly generated outcomes. If such tweaking of difficulty were real, getting evidence of it would have been almost trivial, and yet we are presented with no such evidence, just assertions. Such tweaking is not applicable to a whole range of games anyway, not to mention its questionable effectiveness in getting more in-app purchases.

This, though, has little to do with the "free" software that Stallman argues for. A lot of data collection happens on the web, which most people browse through "free" or at least open-source software such as Firefox. There are many "free" software data-mining applications as well. It is not clear how this example is relevant to Stallman's argument.

Conclusion: This example has almost nothing to do with backdoors or with "free" software, and the reliability of its source is suspect. Data collection in the modern digital era is a concern that should be addressed and is already being addressed, but it is unlikely that progress will be achieved by writing sensationalist stories in loaded language that, in the end, misrepresent the actual state of affairs.

That Stallman is ready to cite such a biased article of suspect reliability is not a sign of careful research.


• ARRIS cable modem has a backdoor in the backdoor.

Such backdoors are written into modems so that ISPs can access the modem remotely, see technical logs and upgrade firmware. This access is very limited and does not expose any sensitive information, as stated by the security expert who uncovered the original backdoor.[15] Several Arris modems, however, seemed to have an additional backdoor implemented that, instead of giving only a limited technician shell, would give full shell access. The company was slow to react and considered this a low security threat, arguing that it had heard of no cases exploiting the vulnerability.

While this is a poor response from the company, it seems unlikely that this was built to exercise power over users, and there is no evidence that Arris or ISPs have been mistreating users. The more probable risk is that clever hackers would be able to gain access to someone's Internet connection.

It is understandable that Arris customers should be very unhappy about such a modem. And while it is true that the inability to update the modem firmware is problematic, it is not necessarily a "free" software issue. One can have open-source and even "free" firmware, but have no ability to update it on a particular piece of hardware.

Either way, it is difficult to see this as a deliberate attempt to mistreat users, and the article Stallman links to also emphasizes that this is a security flaw, not a backdoor put in by Arris for nefarious purposes. Therefore, I don't see how this example boosts Stallman's position.

Conclusion: This is yet another case of misrepresenting a security vulnerability as an intentional backdoor on the part of a developer.


• Mac OS X had an intentional local back door for 4 years.

The backdoor was "intentional" in that it was a mechanism to update the “System Preferences” app, but a security flaw allowed other processes to use the same functionality. As soon as this bug was reported, Apple fixed it.

Conclusion: As already usual with Stallman's examples, this shows absolutely zero evidence of the company being "tempted to put its users at a disadvantage". Yet another case of misrepresenting a security vulnerability as an intentional backdoor on the part of a developer.


• The iPhone has a back door that allows Apple to remotely delete apps which Apple considers “inappropriate”. Jobs said it's ok for Apple to have this power because of course we can trust Apple.

If we follow Stallman's link, we will swiftly see that Steve Jobs said no such thing. The quote from the linked article is below:

Steve Jobs, Apple's chief executive, has confirmed there is a 'kill switch' built into the iPhone that allows Apple to remotely delete malicious or inappropriate applications stored on the device.
...
However, Mr Jobs insisted that the so-called 'kill switch' was there as a precaution, rather than a function that was routinely used.

"Hopefully we never have to pull that lever, but we would be irresponsible not to have a lever like that to pull," said Mr Jobs. [16]

In another article, which Stallman links to in a later example, it is also claimed that so far Apple has not used such a kill switch even once, and that even if it were to delete apps from the App Store, it would not remove apps from users' devices.[17]

As we will see below with Android and Windows having the same "kill switch", in practice these backdoors are used to remove malicious software that has managed to infect a large number of devices.

Conclusion: There is no evidence that a backdoor that allows the app store vendor to remove malicious applications is an attempt to mistreat the user.


• In Android, Google has a back door to remotely delete apps. (It is in a program called GTalkService).

What Stallman does not mention is that this backdoor is used to remove malicious software when a massive infection has occurred, similar to the case described above with the Baidu SDK vulnerability. It is very difficult to see such a measure as mistreating users.

One can argue that such a backdoor can in theory be abused. But there is a difference between claiming a possible problem with a feature and outright saying that developers are corrupted by their "power over users" and are using that power to "mistreat users". Such wording is misleading. It just doesn't reflect the situation in the real world.

Stallman then writes:

Although Google's exercise of this power has not been malicious so far, the point is that nobody should have such power, which could also be used maliciously. You might well decide to let a security service remotely deactivate programs that it considers malicious. But there is no excuse for allowing it to delete the programs, and you should have the right to decide who (if anyone) to trust in this way.

This is a very curious passage, as Stallman seems to admit that in this example Google does not, in fact, mistreat its users. He again underlines that the mere possibility of malicious use is problematic, but he does not show how probable such malicious use actually is. It is then unclear how this or the previous example of Apple having a "kill switch" proves that "mistreating a user is standard practice", or how it shows proprietary software to be "malware". None of these examples support Stallman's bold claims.

Conclusion: There is no evidence that a backdoor that allows the app store vendor to remove malicious applications is an attempt to mistreat the user. What Stallman does is flag possible abuse opportunities. This in itself, however, cannot be used to argue that mistreating the user is standard practice.


• Windows 8 also has a back door for remotely deleting apps.

This is the exact same situation as with iPhone and Android devices. Here Stallman again concedes a seemingly more nuanced position and adds:

You might well decide to let a security service that you trust remotely deactivate programs that it considers malicious. But there is no excuse for deleting the programs, and you should have the right to decide who (if anyone) to trust in this way.

As these pages show, if you do want to clean your computer of malware, the first software to delete is Windows or iOS.

This is a position uncharacteristic of Stallman's usually uncompromising views, and one which can and, perhaps, should be argued for. We can see that Apple, for example, does not remove apps from devices at all and allows even apps removed from the App Store to stay on users' devices. An even better idea would be to merely deactivate apps, as Stallman suggests, and allow the user to retain control over which apps get deleted.

If such a position were argued by Stallman and the FSF, there would be a much higher probability of its being heard. A discussion could ensue on how to achieve this technologically. It is possible that the technological cost of this is higher than simply removing apps, and that organized action from users would be required to force the change.

Nonetheless, this small compromise is promptly nullified by Stallman, who immediately claims that Windows and iOS themselves are "malware" and should be removed, although, as we will see in the general conclusion to the backdoor examples, the evidence justifying his labeling of Windows and iOS as "malware" is lacking and is yet another example of loaded language.

Conclusion: There is no evidence that a backdoor that allows the app store vendor to remove malicious applications is an attempt to mistreat the user. And although Stallman does make a good case for the solution to this problem to be less intrusive, the example itself does not demonstrate malicious intent towards the user on the part of a proprietary developer.


• Conclusion for backdoors

We looked at 10 of the 26 backdoor examples listed by Stallman, which is over a third. Among these 10 examples, not a single one is evidence that the developer of the proprietary software in question was "exercising power over the user" in order to "mistreat the user".

If we take the whole set of the 26 examples and break them down, here is what it looks like:

Out of these 26, only 1 definitely constitutes "mistreating the user", and it is the controversial case of the Chinese smartphone company Coolpad, which released phones with a modified Android build containing a backdoor that pushed both ads and unwanted apps on users.[24] Notice, however, that this has nothing to do with "free" software. Android is based on the Linux kernel, which, under GPL v2, has to remain "free". However, when a vendor ships a device with the operating system, they may add any software to it, regardless of whether the operating system is "free" or not. This new addition could be released as "free" as well. In compiled form on a particular device, the "freeness" of the source code says nothing about what the compiled piece really does. Thus, it is more a question of regulation or, at the very least, "free" hardware, and should not be an example that has anything to do with "free" software. Even if we concede that this might somehow be made relevant to the question of "free" software, this is just one example out of the 26 that Stallman provides. It is not clear how, based on this one example, one is justified in calling user mistreatment "standard practice".

Government involvement, as pointed out above, is something that needs to be addressed in a democratic society. This is a very serious issue, but it is hardly just a software problem. A more authoritarian regime is more likely to abuse anything and anyone, while a more liberal and transparent one is much less likely to do so. "Free" software can indeed help here. An approach to solving these problems is outlined in chapter 4.

Recent events have shown once more that access to devices in the context of fighting crime and terrorism is not a black-and-white issue either, and requires much more discussion to be worked out.[25] Therefore, it is not clear whether we can categorize such government involvement as a clear example of "mistreating the user", unless Stallman considers any government intervention to be "mistreatment" - but that is a totally separate discussion and something he does not address in his essays. What can be said is that some of his examples of government intervention mention fighting crime, and a case of agents bugging a mafia boss's phone would probably not be considered "mistreatment" by the public.

In other words, we can certainly say that government involvement, while a serious issue, is not that clear-cut a case and, in a democratic society, is definitely not always done to mistreat users.

A lot of the writing and examples point to Windows, macOS and mobile operating systems forcing upgrades on the user or installing upgrades silently. While some of these are questionable tactics, this is very difficult to see as "mistreating the user". Windows 10 is the most prominent and attention-grabbing example, and Microsoft eventually released information on how to make the nagging upgrade screen disappear for good.[26] I agree that Windows 10 can probably be considered "user mistreatment" and discuss it in the Sabotage section.

But other examples are mostly innocent and point to technical upgrades that happen on free software systems as well. The freedom to inspect an upgrade is illusory for most users: they have no idea what is being installed and what it does, and they put their trust in the developers who have worked on the code. In commercial products it became customary to hide such technical processes from the user entirely. There is no evidence that, say, a Windows "backdoor" that regularly updates several files was put there to "abuse users", unless Stallman can somehow show that these upgrades intentionally delete user data or read private text files, send the data to Microsoft and later use it to blackmail users, etc. Unless such evidence exists, it is difficult to see this as "mistreating users" rather than as providing a smooth upgrade experience.

Additionally, any examples of Microsoft or Mac OS upgrades breaking computer systems are just as relevant for most users of "free" software. Even if the user is a developer, it is unlikely that one can realistically check every upgrade. A user will tend to install upgrades without spending hours trying to understand what exactly these upgrades are doing. And it is not unheard of for upgrades to seriously break systems. A good example is a recent, as of the moment of writing, upgrade to Xubuntu 14.04 that broke Network Manager and left users without Wi-Fi for several days. Unless users had access to a wired connection, there was nothing they could do. Less technically minded users would have had their work stalled for many days.

"Free" distros also have nagging upgrade windows. Ubuntu has a button that can and has been pressed by mistake, forcing users to upgrade the distribution to the next version. Questions from users on how to remove the button can be found online. Granted, it is much easier to remove these screens than it is for Windows 10, but then Windows 10 is an outstanding example.

The Ubuntu/Xubuntu/Kubuntu family of distributions has what is known as the LTS Enablement Stack. This update typically installs a new kernel and newer graphics drivers, so that older releases can continue to receive hardware support. The update is introduced to the user with these words:

New important security and hardware support update.

WARNING: Security updates for your current Hardware Enablement Stack ended on 2016-08-04:
http://wiki.ubuntu.com/1404_HWE_EOL

The Ubuntu Wiki briefly explains what it is and how to install the Stack manually through the terminal. There is no information on possible repercussions. Neither is there information on how to remove it.[84] Obviously, the user is manipulated into going along with this update, which is presented as being "important". Unfortunately, these kernel and driver updates are capable of rendering previously functioning systems either inoperable or prone to crashes,[110] [111] [112] [113] [114] [115] [116] in which case the user might well want to remove the update. This, however, is a serious challenge even for a technically minded user, and will require a lot of reading on Internet forums.

Moreover, once the user has removed the Enablement Stack, the nagging screens return. Even turning off all updates in the Synaptic UI does not help, and the user has to edit a specific config file.[117]

So, nagging screens and a push towards upgrades that users might find unnecessary and even harmful are hardly a proprietary software phenomenon. Rather, they seem to emerge in systems that have an established maintenance process. Maintainers are then motivated to move users along the maintenance process, so as to have fewer bugs to deal with and to focus on the newer OS versions.

Another weakness of Stallman's argument is his general focus on Microsoft and Apple. One, however, must remember that there are countless proprietary packages apart from Windows, the vast majority of which force no upgrades on users. A focus on Windows does not demonstrate that such practice is characteristic of all proprietary software, but rather that it is characteristic of Microsoft, and more specifically of their Windows 10 release.

That the situation even with Microsoft is more complicated can be gathered from an analysis by Galen Gruman, titled "Double standard: Why Apple can force upgrades but Microsoft can't".[28] The article discusses how Microsoft has actually been slow to force upgrades and how it was always consistent in providing backward compatibility and options to stick with older systems.

Examples that mention contractual obligations, such as an employee phone, or software controlling the engines of nonpaying customers, can hardly be examples of developers of proprietary software putting in a backdoor to "mistreat a user". A company phone's user is the company, not the employee, and the situation with a woman losing her personal data was not intentional, but an unfortunate side effect.[38] It is difficult to agree that this is a clear-cut example of some nefarious functionality.

And, finally, Stallman's insistence on counting mere security vulnerabilities as intentional mistreatment of users or "exercise of power" is an incredible example of manipulation and intellectual dishonesty.

To conclude, the backdoor examples provide no evidence that mistreating the user is anywhere near "standard practice"; only 1 example out of 26 is a clear case of such mistreatment, and even that case is not unquestionably contingent on the software being proprietary.


2.2.2 Censorship

According to Stallman, these are "examples of proprietary systems that impose censorship on what their users can access". He then adds: "Selling products designed as platforms for a company to impose censorship ought to be forbidden by law, but it isn't". There are only 6 examples, 4 of which are dedicated to Apple App Store cases, one to Google Play, and one to the Nintendo 3DS.

The immediate thing to notice here is that citing examples of "proprietary systems that impose censorship on what their users can access" says nothing about such "censorship" being a feature of proprietary software alone. For instance, gnu.org has a list of "completely free" operating systems that also impose "censorship" on what their users can access.[31]

"Censorship", yet another loaded term from Stallman, is not by default a "mistreatment" of users. If we use a more neutral term "content filtering", the tone of the claim changes considerably, as well as its veracity. Private actors choose to or are forced to impose content filtering for a variety of reasons, such as legal considerations, or to maintain their own quality standards, or even to get a competitive edge. There is little evidence that this is a feature of proprietary software alone, or that this is done as a deliberate act to mistreat the user.

Additionally, it is questionable that running a private app store and imposing private rules on it should be considered an offense punishable by law, let alone be framed as "censorship". If an entrepreneur opens a bookstore dedicated to books in English, she will definitely "censor" books in German, Italian and French. It does not mean that this is a "mistreatment of users" or "imposing power on the users". If she wants to run a bookstore that explicitly forbids any political content, then she will "censor" books based on these criteria, even if those books are considered groundbreaking. This is not censorship that anyone should be worried about.

Similarly, neither Apple nor any other business is obliged, or should be obliged, to publish someone's app. They make their guidelines clear, and the fact that they do not want to publish porn or political statements, or even certain political statements, may be distasteful to some, but the suggestion that this should be illegal runs counter to many private freedoms. This would only be worrisome if competition to the Apple App Store were forbidden by the government and Apple's guidelines were imposed on the whole population. However, this is not the case. Apple offers a certain ecosystem of applications, and that is their right. Android app stores typically have different offerings. The FSF can create its own app store if it wishes. This is normal market competition among different products, just as there are magazines that allow foul language and magazines that don't and won't. Describing this state of affairs as "censorship" would be a rhetorical ploy.

The examples Stallman cites here expose the sketchy nature of many of his views. By trying to label all these complicated cases as simply "censorship" by an abusive "proprietary developer", and by claiming that the solution is to make code available for everyone to run and modify, he falls into the trap of seeing the world in the black-and-white terms of his software philosophy, blind to the other issues involved. It is difficult to see how "free" software is related to many of these situations.


• Apple censors games, banning some games from the app store because of which political points they suggest. Some political points are apparently considered acceptable.

Stallman links to a story about Apple declining to categorize an app as a "Game" and instead suggesting it be categorized under "News" or "Reference", apparently because it contains a strong political message.[29] The story also says that Apple then reversed its decision and reinstated the app as a "Game".

It can be immediately seen that Stallman's claim is false, as this is not an example of Apple censoring content, but instead a dispute over how to categorize the app. Additionally, Stallman fails to add that the app was reinstated.

The article then mentions other examples, linking to cases in which Apple had rejected apps based on their political content. Some of these apps were later reinstated; some were not. The apps in question dealt with the ongoing Syrian war and seemed to go against Apple's policies.

Stallman insinuates that by doing this kind of content filtering, Apple expresses a political position. But this is unconvincing. A more plausible reading of the situation is that companies such as Apple actually try very hard to be as apolitical as they can, and shy away from any controversy. Their app guidelines are directed at providing as neutral content as possible, and any political content that they do approve is only approved inasmuch as it is non-controversial. Apps are usually reinstated after a public outcry, which signals to the company that it is safer to publish the app. The same goes for other large companies, be it Nintendo, Microsoft or Amazon. These corporations are subject to state regulations and open to lawsuits from all directions, and thus exercise extreme caution with the content they publish. It can be argued that they sometimes overdo it or make outright mistakes, but we do see evidence of them correcting these mistakes, as Stallman's own source shows.

We have already noted Stallman's manner of framing any situation purely in terms of software freedom, losing all other, arguably more important, aspects of the story. But another problem is the double standard. Even if we assume that Apple is not apolitical, but instead advances its own political agenda, for some reason Stallman thinks that a game company that publishes games with political content can have a political stance, but a publisher, like Apple, cannot, and should even be punishable by law if it does. Unless he demonstrates how such a disparity in the right to express political opinions can be justified, this position should be rejected.

Additionally, the sketchy nature of Stallman's views and the simplistic solutions he promotes can be shown by applying them to his own Free Software Foundation. If, according to him, a publisher of content should be prohibited from expressing any political views, this is wholly applicable to the FSF, which promotes and publishes software.[30] Stallman himself tirelessly points out that the FSF and the GNU Project are not mere engineering projects, but are instead parts of a political movement. By this logic, publishing only "free" software and "censoring" proprietary software should be punishable by law, as this is a political statement.

Finally, it is very difficult to see how Apple's guidelines and their implementation have anything to do with Stallman's philosophy. The best demonstration of this is the simple fact that no user freedoms proposed by Stallman are violated by Apple's refusal to publish the apps, nor can they be in principle. If a developer approaches a software store and asks it to sell her software, and it declines for whatever reason, the user freedoms proposed by Stallman do not and cannot apply to a platform that has not published the software and thus has not become part of the publishing chain.

One can argue that a publisher can indirectly violate these freedoms by specifically declining to publish apps that would grant such freedoms. But in the cases brought up in this story, the apps were not rejected for respecting user freedoms, as defined by Stallman, but rather for their controversial content. Thus, his claim that this somehow shows mistreatment is misplaced. Most importantly, if this example is used to support the case for user freedoms, one cannot cite violation of these proposed freedoms as evidence, since that would be circular reasoning, as noted at the beginning of 2.2.


• Apple banned a program from the App Store because its developers committed the enormity of disassembling some iThings.

This is a case of developers disassembling Apple hardware and breaking Apple's product Terms and Conditions. Their account was then suspended and their app became unavailable as a result.[32]

This has nothing to do with software at all, and it is also clear that Apple did not aim to "censor" their app. It is one more case of Stallman grossly distorting the situation and using loaded language to imply that breaking Apple's contract is unimportant, while Apple's reaction to the breach of contract is somehow unfair.

And even if there is a case to be made that an individual customer's rights should be more strongly defended due to the disparity in power between an individual and a corporation, this case has to be put forward, justified and argued for. Stallman does no such thing.


• Apple rejected an app that displayed the locations of US drone assassinations, giving various excuses. Each time the developers fixed one "problem", Apple complained about another. After the fifth rejection, Apple admitted it was censoring the app based on the subject matter.

This claim is generally true, although the app is now in the App Store, albeit under a different name. As pointed out above, this has nothing to do with "free" or proprietary software, and Stallman's argument that an app publisher should not have the freedom to choose what to publish in their private app store is unfounded and should be rejected.


• As of 2015, Apple systematically bans apps that endorse abortion rights or would help women find abortions.

That is mostly false. The story Stallman links to talks of one app called "Hinder", not of many apps.[33] Additionally, thanks to social media, Apple approved the app 9 hours after rejecting it. It also exists in the Google Play store. It is intellectually dishonest not to give a full overview of the situation.

What's interesting is that the article about Apple approving the app is from September 22nd, whereas the article Stallman links to was written on September 23rd. This shows that he either chose not to mention that the app was quickly reinstated, or did not care to do additional research and actually verify the accuracy of the story he linked to. Either way, this underlines yet again that Stallman's style of presenting data is not inclined towards impartial and balanced analysis.

Finally, as written above, this has nothing to do with "free" or proprietary software; these are rather complex questions about freedom of speech, its applicability to public and private spaces, and how we should handle situations in which private businesses become a significant element of the culture. Even if Apple is wrong to commit these acts of content filtering, this says nothing about the injustice of proprietary software.


• Google censored installation of Samsung's ad-blocker, saying that blocking ads is "interference" with the sites that advertise (and surveil users through ads). The ad-blocker is proprietary software, just like the program (Google Play) that Google used to deny access to install it. Using a nonfree program gives the owner power over you, and Google has exercised that power. Google's censorship, unlike that of Apple and Microsoft, is not total: Android allows users to install apps in other ways. You can install free programs from f-droid.org.

Stallman's loaded language and his use of phrases like "exercising power" have already been dealt with in 2.1. His statement that "proprietary software gives the owner power over you" is misleading, and should either be rejected, or recognized as being true in the vast majority of cases for "free" software as well.

As stated in 2.2.1, in most western countries user "surveillance" is strictly regulated and, while the situation must be carefully monitored by the public at all times, should not be reason for too grave a concern. Many websites specifically ask their visitors not to use ad blockers, as advertising is their only source of income.

But, as is unfortunately typical of Stallman's one-sided approach, he fails to mention that the story he links to says that Apple allows ad blocking in Safari. Yet he still calls Apple's "censorship" total.


• The Nintendo 3DS censors web browsing; it is possible to turn off the censorship, but that requires identifying oneself to pay, which is a form of surveillance.

This is the most vivid example of how Stallman tries to misrepresent a situation as being about a proprietary developer set on "putting users at a disadvantage" or "mistreating the user", when the case has nothing to do with the issue.

The story Stallman links to is about the Nintendo 3DS having a parental lock on unrestricted Internet browsing.[35] To remove the lock, one has to pay 30 cents through a credit card. The reasoning, as explained by the author, is this:

This is not necessarily intended as a money-making scheme; it's instead intended to be a parental control feature. By requiring a credit card purchase, it reduces the likelihood that a child will find their way onto unsavory websites on their 3DS (though it doesn't change the fact that they may live in a home with countless other Internet-connected devices).[35]

The author then proceeds to calculate the potential profit that the company can make and asks readers for their opinion.

This story is clearly not about "censorship" the way most people understand it, and the reasoning behind it is sound. Additionally, no vendor should be forced to give users unrestricted browsing or even any browsing at all. This is a non-issue.

Stallman's statement that identifying oneself is a form of surveillance is absurd and paranoid. While one is free to feel that way, Stallman provides no arguments as to why everyone should feel this way or what negative consequences exactly people will face by using a credit card. See further discussion of this in 2.2.3 Insecurity.

Finally, this has nothing to do with Nintendo "mistreating" the user and there is no evidence this was done to "put the user at a disadvantage" or to put users under surveillance. This is positioned as a safety feature, and, possibly, many parents are quite happy with the lock.


• Conclusion for "Censorship".

Among the 6 examples, only one was accurately summarized by Stallman, and none seem relevant in any direct way to questions of user freedoms. He also suggested an extreme form of regulation of private businesses without providing any justification. The use of the term "censorship" in most of these cases is questionable.


2.2.3 Insecurity

Insecurity examples are prefaced with these words from Stallman:

This page lists clearly established cases of insecurity in proprietary software that has grave consequences or is otherwise noteworthy.

It would be incorrect to compare proprietary software with a fictitious idea of free software as perfect. Every nontrivial program has bugs, and any system, free or proprietary, may have security holes. That in itself is not culpable. But proprietary software developers frequently disregard gaping holes, or even introduce them deliberately, and the users are helpless to fix them.

As Stallman himself notes, any software will have bugs. The difference, he says, is that 1. proprietary software developers often do not care about security holes in their programs, 2. they introduce such holes deliberately, and 3. the users are helpless to fix them.

These are statements about the real world that should be backed up by evidence.

Stallman provides no evidence for the first argument. He lists 26 examples of bugs, some of which are the same ones he lists in the "Backdoors" section, discussed at length in 2.2.1. He provides no statistical data to demonstrate that disregard for security is indeed a frequent phenomenon among proprietary developers, nor does he define "frequently". In fact, we have shown that Stallman consistently declines to mention that most security bugs in his list have been fixed within days or hours of being reported. Only in one of his backdoor examples did a company comment that the security flaw found in its modems was insignificant, clearly a rare case of negligence.[15]

Hence, the project of compiling such a list is a futile exercise. The same list can be assembled for "free" software. The "Heartbleed" bug, often referred to as the worst vulnerability ever found, happened in OpenSSL, a "free" software project, and was discovered by two proprietary software companies, Google and Codenomicon.[45]

What one should do is demonstrate the rate at which vulnerabilities are discovered, their number, and the speed of fixing in proprietary and "free" software, and show these to be vastly different; a sketch of such a comparison follows below. Only in that case could Stallman's pronouncements be justified. But Stallman fails to provide any comparison to "free" software, or any analysis, metrics or data whatsoever. His claim that "free" software developers show high regard for security is a mere declaration, not a fact.
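To make this concrete, here is a minimal sketch, in Python, of the kind of comparison that would be needed. Nothing in it comes from Stallman's pages; the record format and function names are my own assumptions, and no real vulnerability data is included.

# A sketch of the comparison called for above: given, for each vulnerability,
# its disclosure date and (optionally) its fix date, compute the number of
# vulnerabilities, how many remain unfixed, and the median time to fix.
# The records passed to summarize() are placeholders, not real data.
from datetime import date
from statistics import median
from typing import List, Optional, Tuple

Record = Tuple[date, Optional[date]]  # (disclosed, fixed); fixed is None if unfixed

def days_to_fix(records: List[Record]) -> List[int]:
    """Days between disclosure and fix, for the vulnerabilities that were fixed."""
    return [(fixed - disclosed).days for disclosed, fixed in records if fixed is not None]

def summarize(name: str, records: List[Record]) -> None:
    fixed = days_to_fix(records)
    unfixed = len(records) - len(fixed)
    med = median(fixed) if fixed else None
    print(f"{name}: {len(records)} vulnerabilities, {unfixed} unfixed, "
          f"median days to fix: {med}")

# Placeholder usage: real datasets for proprietary and "free" projects would
# have to be collected from vulnerability databases before drawing conclusions.
summarize("proprietary sample", [(date(2016, 1, 10), date(2016, 1, 12))])
summarize("free sample", [(date(2016, 2, 1), date(2016, 2, 3))])

Only with such data, gathered for comparable bodies of proprietary and "free" software, could one claim that one side handles vulnerabilities systematically better than the other.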

If we look at the actual data, there is enough evidence to say that "free" software projects are not any better in terms of security than their proprietary counterparts. All the advantages of code availability and free modification come with enough disadvantages to nullify any significant leverage that "free" software may have had over proprietary software. There is also indication that "free" software developers demonstrate enough security negligence for this to be a visible issue. The reader is encouraged to study the provided links to appreciate the complexity of the problem. The popular notion that "free" software is more secure than proprietary software is not supported by evidence. [40] [41] [42] [43] [44]

His second argument probably refers directly to his "Backdoors" section, which was thoroughly analyzed in 2.2.1.

Finally, his third argument refers to the theoretical possibility of users fixing their software themselves. We have already briefly touched upon this maxim when discussing the illusory nature of user freedoms for most users, and we will discuss this issue in greater detail in Chapter 3. Here it is, perhaps, useful to note that Stallman presents a false dichotomy: either problems with software are fixed by granting everyone full access to its source code, or they cannot be fixed at all. As will be demonstrated throughout these sections, Stallman's own examples thoroughly refute this notion.

Because it is clear that none of these "proprietary insecurity" examples help support Stallman's case, I will not give them an exhaustive treatment. But to comment briefly, just like in other cases, Stallman gives each case the most uncharitable interpretation possible, fails to give an honest overview of the situation, and defines informational security in terms that are very difficult to agree with.

For example, he writes that "Many proprietary payment apps transmit personal data in an insecure way. However, the worse aspect of these apps is that payment is not anonymous."

But nobody in the field of software security defines informational security as total anonymity.[46] [47] What informational security strives for, among other things, is confidentiality, which is different from complete anonymity. Complete anonymity is frequently undesirable, especially in most cases of online payment, where lack of authorization makes it more difficult to fight fraud.

He also writes about the insecurity of WhatsApp, despite the fact that in April 2016 the company rolled out complete end-to-end encryption using a "free" software protocol.[48] It was also given a score of 6 out of 7 on the EFF scorecard.[49] None of this, however, has prompted Stallman to retract his condemnation of WhatsApp as insecure.

And, finally, he remarks that "GNU/Linux does not need antivirus software", hinting at the fact that desktop Linux has very few viruses. While this is true, one should remember that the adoption of desktop Linux is very low, usually estimated at 1-2% (some question these numbers and consider them to be higher, but I argue that there is no evidence for higher numbers), and we should thus expect the number of viruses to be very low.[58] [61] Additionally, proprietary OS X is also commonly said not to need antivirus software. At the same time, the amount of malware has risen both for OS X and for Linux in recent years, as both systems see higher rates of adoption by the general public.[59] Apple and Red Hat have reacted to that by removing "virus free" claims from their homepages.[62] [60]

So, security or the lack thereof is not contingent on software being "free". And there is no plausible reason why it should be. Stallman treats all "free" software as a single entity. But in reality "free" software is a very diverse body of software, related only by the type of license its developers choose. There is no mechanism by which releasing a program under a "free" license automatically makes it more secure.

In conclusion, this set of examples does not demonstrate intentional "mistreatment of users", does not help advance Stallman's case, and lacks all the features of careful analysis, such as even basic research and follow-up of the claims. All we see are baseless, often demonstrably wrong proclamations and fearmongering. The evidence that does exist, and which was referenced throughout this section, actually points to his claims being largely wrong.


2.2.4 Sabotage

Stallman describes this section as "examples of proprietary software that has something worse than a back door." This is a very vague definition. Analyzing his examples, it seems that he means situations in which significant changes are introduced into a product after it has been bought, usually removing part of its functionality.

We have to agree with Stallman that such practices are generally wrong. Among the 24 examples of "sabotage", he lists genuinely problematic and very difficult cases, but also cases which bear no markings of intentional harm, or even any harm at all, such as promptly fixed accidental bugs, a user misunderstanding how to use Apple's cloud service, users breaking a product's Terms and Conditions, companies limiting the type of content they want to run on their systems, and companies forced by courts to give up their users' data. I will mention several examples that should not be included in the list of intentional mistreatment, along with my reasoning, and then examine incidents that do pose serious problems.


• The Apple Music client program scans the user's file system for music files, copies them to an Apple server, and deletes them.

A case that should not be considered "sabotage" is described in the article by jamespinkstone, "Apple Stole My Music. No, Seriously.", which quickly went viral.[50] Stallman describes the situation the same way the article does: "The Apple Music client program scans the user's file system for music files, copies them to an Apple server, and deletes them."

Even quick research, however, would reveal that this was an error on jamespinkstone's part and that his description of what the service is supposed to do was incorrect.[51] Refutations of his claims came in quickly, and just five days later jamespinkstone himself published an update to the story, revealing that Apple had contacted him and worked with him, and that the incident was eventually believed to be a potential bug affecting a small number of users. It was never reproduced, so user error cannot be excluded completely. Apple nonetheless released additional safeguards to prevent this theoretical situation from happening.[53] [54] [55]

Although this information is publicly available and had been available already 5 days after the initial publication by the same author (refutations from others had been available as early as the next day), Stallman has not updated his description of how Apple's service really works, and continues to list this accidental bug not only as an example of intentional "mistreatment of users", but as proof that the dystopian society he had predicted is actually coming.


• Adobe applications have time bombs: they stop working after a certain time, after which the user must pay to extend the time. Once there was a problem with the servers that these programs use to check who has paid, and the applications refused to work for anyone.

This is a manipulation on the part of Stallman. The article he links to is a news item on the Creative Cloud service from Adobe, which is a subscription-based service.[65] Describing a subscription-based service as a "time bomb" is neither a common nor a correct way to put it. This is loaded language again, implying that users of the software do not know that if they do not renew the subscription, Adobe applications will stop working. This, of course, is not the case, and Stallman gives no justification as to why we should consider subscription-based services wrong or unjust. By this logic, renting an apartment is also a "time bomb" and should be considered unjust.

He then goes on to mention an accidental bug, as though this somehow proves his case.

None of this shows any evidence of "intentional user mistreatment" and should not be included in examples of "sabotage" or other forms of injustice. Stallman has a separate section called "Subscriptions" where he links to Adobe Creative Cloud again and specifically asserts that subscription services are unjust because if you do not pay, the developer will revoke the software license. No case is made to justify this unyielding position, and until one is made, this assertion should be firmly rejected.


• Microsoft informs the NSA of bugs in Windows before fixing them.

The implication here is that by informing the NSA about bugs, Microsoft gives the NSA a chance to use this knowledge to spy on the population.

First of all, we have to say that any illegal surveillance, like the disclosed PRISM program, is a serious issue that should be actively discussed and eventually resolved to the public's satisfaction in a democratic society.

However, an integral part of a rational dialog is an accurate representation of the problem. Propaganda, exaggeration or misconception can be a severe stumbling block to understanding the real scope of the issue.

As is usual for Stallman's narrative, key facts are left out.

1. Microsoft gives early access to information on zero-day vulnerabilities not only to the NSA, but also to its larger private customers.[80] The fact that a company provides such early access to its major customers is a normal and often necessary practice. It is especially relevant for larger organizations, where updating can take quite some time.

2. Early access for major customers is practiced by "free" software projects as well. Recent notable examples include Samba, in the case of the Badlock vulnerability, as well as Xen, used by Amazon.[81] [82]

One more objection to the implication that Microsoft does this to allow the NSA to "mistreat users" is that earlier, in the "Backdoors" section, Stallman alleges all sorts of elaborate backdoors that the secret services supposedly have. If that were true, the need to exploit zero-day vulnerabilities would be redundant. In other words, if you already have secret privileged access to Windows, why would you need to set up an additional process of exploiting vulnerabilities?

There is some ground for speculation that such vulnerabilities are being widely used for cyber espionage. One such case is Stuxnet, although no incontrovertible evidence of US government involvement exists.[83] But it is not clear whether reports from Microsoft are required to obtain information about these vulnerabilities.

In conclusion, this point should be taken with significant reservations. The mere fact of providing early access to vulnerabilities presents no evidence of wrongdoing or injustice. An actor with significant and uncontrolled power might subvert this process and use it to its advantage. But then this is a question of politics. In a democratic society the public has to make sure that its government's power is held in check. These socio-political issues are clearly beyond the scope of this treatise.

For our analysis it is important to note that this question is not necessarily tied to software being proprietary. Whether having an open (and even "free") operating system will make exploitation of zero-day vulnerabilities by governments less probable is a question worthy of extensive scientific research. In my view, the answer to that question is far from obvious, as can be seen from the section dealing with claims of "free" software having superior security.


• Apple stops users from fixing the security bugs in Quicktime for Windows, while refusing to fix them itself.

This wording is deceptive. Saying that Apple "stops users from fixing security bugs" implies that they could do it before, and now they can't. But users could never fix bugs in Quicktime, and at no point in time did Apple promise that users would be able to do it. Additionally, Quicktime is not the only video player on Windows, so it would be very difficult to argue vendor lock-in in this situation.

Of course, one can argue that if code was available, then users would be able to fix bugs even when official maintenance is over. But a software developer is not obliged to provide code to the users. If such an obligation exists or should exist, Stallman is required to prove this crucial point, not just assert it. As far as I can see, users are neither promised nor entitled to be able to fix someone else's code. Therefore, no basis is provided to consider this situation unethical or unfair in any way.

One could point out that this actually is a good pragmatic argument for "free" software adoption. And I agree. But we have to remind ourselves that Stallman does not argue a practical approach to building and maintaining software, he argues ethics. gnu.org dedicates at least two essays, "Why Open Source misses the point of Free Software" by Stallman and "When Free Software Isn't (Practically) Superior" by Hill, to explain that the Free Software movement is not about getting better software, but about being more ethical, even at the expense of having worse software.[87] [88] At one point in his essay Stallman goes so far as to argue that better proprietary software is undesirable, as it lures users away from "free" software.


• Vendor lock-in and changing the product after it was bought

The section of "sabotage" is, nonetheless, one of the better cases that Stallman makes. The situations he looks at can be divided into two main types: vendor lock-in and changing the product after it was bought. Both of these are usually bad practices that should be dealt with and can be condemned as unjust.

Vendor lock-in is a serious and also complicated problem, not at all limited to the software industry. Any innovation has a tendency to create vendor lock-in, regardless of industry and even regardless of whether the technology is "free" or proprietary.[70] Chapter 4 will talk about solutions to vendor lock-in (which include using open and perhaps even "free" standards), but suffice it to say that apart from solutions that we would argue should be put into place, there already exists a whole set of competition laws ("antitrust law" in the US, "competition law" in Europe) which regulate market abuse.[63] In some of the cases below we will see that this regulation does work.

Cases in which a product is changed to remove or degrade one of its features are definitely unjust, especially if the feature can be argued to be a "selling point" of the product. This effectively creates a situation in which a person has paid for a product, but after a while the product stops functioning as sold, without this ever having been communicated to the customer as part of the deal, information that might have influenced her decision to invest resources, time and money, into the product. In other words, this is a case of cheating, albeit often unintentional.

But even these cases, which I believe to be seriously problematic, are described by Stallman very inaccurately. As with many of his other examples, he almost never gives a full overview of the situation and seems to paint these cases so as to give the impression that nothing is being done to fix them, and that the only solution, therefore, is to make all software "free".

Additionally, he does not differentiate among the cases mentioned. All cases are presented by him as equally wrong, and everywhere the only problem he sees is that the software is proprietary. But the cases are actually varied and complicated, can be considered wrong to very different degrees, and have varying degrees of connection to the software being proprietary.


• Phillips “smart” light bulbs had initially been designed to interact with other companies' smart light bulbs, but later the company updated the firmware to disallow interoperability.

Any company changing a product after it has been bought can be argued to break the contract with its customer. Such behavior is definitely wrong. Not only does this betray customer expectations; such bad practice is quite enough to take the company to court, as is the case with many other situations of this kind. One can safely consider this an example of "mistreatment of users".

However, Stallman does not give a full overview of the situation. After a public outcry, Philips swiftly reversed its decision, and third-party bulbs are again compatible.[64]

This case shows why discussions about ethics benefit from an unbiased overview of the situation. The only way we can arrive at the best solution is to make sure that we possess the most accurate representation of the problem and its scope. If the situation is portrayed as dire and hopeless, we might opt for desperate measures. And desperate measures tend to impose greater costs on both individuals and society.

Here it is important to see that companies can be influenced to disengage from such behavior and that, at least in some cases, it is possible to fix the problem without making software "free".

It also shows that examples of such behavior are not "standard practice" as Stallman claims in the preface to this section, but are instead best described as controversial, exceptional cases that elicit outcry from the public.


• iOS version 9 for iThings sabotages them irreparably if they were repaired by someone other than Apple. Apple eventually backed off from this policy under criticism from the users. However, it has not acknowledged that this was wrong.

If true, there are two problems with this practice: possible vendor lock-in and changing the product after it was bought, without this ever being mentioned to the customer as part of the package. Vendor lock-in in this case is hardware-based and concerns locking out competing repair firms:

Could Apple's move, which appears to be designed to squeeze out independent repairers, contravene competition rules? Car manufacturers, for example, are not allowed to insist that buyers only get their car serviced by them.[66]

But further research reveals that the situation is more complicated.

The Guardian article that Stallman links to and which is quoted above is markedly anti-Apple and paints Apple as a vendor desiring to lock out competition. When Apple gives a technical explanation of the feature, The Guardian reporter tells the reader to "get ready for a jargon overload", implying that these are just excuses. The explanation, cited in the article, is only partial and does not allow the reader to understand what the terms mean and why "error 53" (lockout error) occurs. Whether this was really the way the Apple employee explained it or whether some portions of her explanations were removed is difficult to tell.

Several articles at TechCrunch, on the other hand, give a good and balanced overview of the Touch ID feature that Apple introduced back in 2013 and the security issues that come with it.[71] [72] The reader is encouraged to read both these articles and the Apple materials that the articles link to, in order to understand how Touch ID and the "secure enclave" work, how they store user fingerprint information, and why allowing third-party shops to replace the home button is a huge security risk. It turns out that bricking devices was indeed a security measure, only a poorly implemented one. TechCrunch explains:

And this is where Apple has made some mistakes. The company treats security very seriously but should also take advantage of its own design. The secure enclave works independently from the main processor. If iOS 9 can't verify the authenticity of the Touch ID sensor, the OS should brick the secure enclave, or disable all Touch ID-related features, such as Apple Pay.

The company shouldn’t prevent you from accessing your precious photos, contacts and apps. Today’s implementation of the error 53 is a bad one, and I hope that Apple is going to fix it in the next iOS release.[71]

In another article they write:

Allowing a third-party Touch ID sensor to function properly without an official Apple repair center both verifying that it is legitimate and recalibrating the cable to work with your iPhone’s Secure Enclave is a huge security risk. A malicious repair shop or corrupted part could allow unauthorized access to your phone or its data. Apple is absolutely right to disable TouchID — it was also wrong for it to disable your entire iPhone for getting your home button replaced on the cheap.[67]

Apple has claimed that bricking devices was meant as a factory test and was not expected to affect customers. This, though, is a questionable explanation and is not in line with their previous documentation and statements.

Two weeks after "error 53" made headlines Apple fixed the situation by issuing an update that unbricks devices (but not the Touch ID) and by providing reimbursement to anyone, who had paid for an out-of-warranty replacement of their device based on this issue.[67] This was exactly the right thing to do and is in accordance with what Tech Crunch reporters have written about earlier.

Before Apple issued a fix, the company was faced with a class action lawsuit.[68] The lawsuit was later dismissed on the grounds that Apple had provided the fix and the reimbursement program, and that the plaintiffs had no evidence that bricking the devices was intentional on Apple's part.[73]

However, it can still be argued that Apple was wrong not to be perfectly clear about its security features and its phone software updates, and to be inconsistent in its statements to the press. In other cases Apple does warn about the impending bricking of devices.[74] That they did not do so here might hint that it could have been a bug after all, but it is difficult to tell with the information currently available.

We now have to notice how inaccurate and one-sided Stallman's account of this case is. Not only did he fail to conduct even basic research into the feature in question (or chose not to mention it), he also said nothing about the lawsuit, did not consider opinions other than the ones that fit his agenda, and was silent on the reimbursement program.

But all of these details are very important, as they show that a lot can be done to fix the situation, both by the public and by the company that erred, and that the case is far from clear and cannot be dubbed "injustice" or "intentional mistreatment of the user" without additional evidence.

Furthermore, when the full context is available, it is clear that, contrary to Stallman, Apple did apologize and tried to correct the situation as best it could. Insisting that the company must admit the situation was simply "wrong" does not provide any real insight, nor does it help the affected users in any way.

It also feels like a misdirection - yes, the company perhaps did not literally use the words "we were wrong". But it fixed the operating system behavior, unbricked devices and reimbursed affected customers. That seems like a very clear way of admitting it was wrong.


• Windows 10 update push.

This is a recurring theme in Stallman's examples. Out of 24 examples of proprietary "sabotage", 4 rehash the same problem: that Microsoft is pushing Windows 7 and Windows 8 users to upgrade to Windows 10. I have to agree that such practice can be argued to be a breach of contract with users of Windows 7 and 8. They paid for an operating system, believing they would be free to decide for themselves when to upgrade. Tricking users into upgrading is malpractice.

To be absolutely clear, let me stress this once more: the Windows 10 upgrade tactics employed by Microsoft are wrong. There is already a court case in which a user was able to get $10,000 in compensation from Microsoft. The disturbing part of the story is that, according to some news sources, Microsoft denied any wrongdoing and decided not to appeal only to avoid running up litigation costs.[85]

But while I strongly agree with Stallman that this is user mistreatment, I cannot agree that re-using the same example several times is honest. Yes, every case mentioned by Stallman has a slightly different spin to it, but it is all about one company and one product upgrade. This, like many other rhetorical tactics employed by Stallman, only shifts the discussion away from a balanced overview, creating a false sense of numerous cases of user mistreatment when we are really talking about the same case.


• Examples of clear user mistreatment.

Finally, there are several examples which can be considered clear user mistreatment, with almost no additional commentary necessary.

The case of Oracle's nonfree Java plug-in for browsers installing adware can be considered questionable practice, given the deceptive manner in which the adware is installed, requiring users to deselect the installation.[75] Bundling products is an issue that requires careful consideration, since it is also closely tied to questions of competition.

The case of LG changing the Terms and Conditions for their Smart TVs and locking out customers who do not agree to provide their data to third parties is a more or less clear breach of contract.[76] A customer complaint has been registered in the UK. There is actually a law in the UK against such seller behavior, but it seems that in this case nobody took the matter to court.[77] LG, though, received some serious attention from the UK government and had to tone down its practices.[78]

There are other cases in which smart TVs raise privacy concerns. One known case is Samsung Smart TV's voice recognition technology. The device reacts to voice commands, but in order to understand them it uses a third-party voice recognition service. Thus, the device actually records speech in the room and sends this data to third-party servers, where software analyzes it and attempts to react to the commands.[79] Samsung's product policy said: "If your spoken words include personal or other sensitive information, that information will be among the data captured and transmitted to a third party." An EFF activist called this clause of the policy "Orwellian", although such a characterization is unfair and based on ideology, not facts. There is a difference between sending data to an oppressive government with the intent of spying on people and putting them in jail for their opinions, and creating a service that analyzes speech in order to operate a Smart TV.

To be clear, such situations are clearly problematic, but they should not be elevated to the status of a catastrophe. It is important to understand the implications of such situations and develop measures to make sure that customers have control of their data. Proposed solutions will be discussed in Chapter 4.

The important thing to notice about all of these cases is that they are not very numerous and that each one of them is a controversy, not "standard practice". Even if we were to grant each and every case that Stallman lists the status of "user mistreatment" (which we cannot), this would still be a minuscule number of cases in an otherwise vast field of normal, uncontroversial situations.

The case of Nintendo changing its End User License Agreement (EULA) and forcing existing customers to either agree to it or lose part of the functionality is clear malpractice and a possible contract breach. The EFF has reported on the issue, and I firmly agree that this is a serious question of consumer rights. Organizations like the EFF can help us move forward in fighting for consumer rights and making sure that such cases are treated as unacceptable.[86]


• Conclusion for "Sabotage"

In general, this is the best argument Stallman makes, and among the problems he lists several are cases of genuine injustice, involving breach of contract on the part of the software developer. He, however, routinely misrepresents these problematic situations as hopeless, although most of them were quickly resolved. Additionally, this section illustrates better than others that as soon as a case can clearly be understood as problematic, the public notices and moves to act, and companies are forced to patch things up quickly.


While in this section we finally see clear-cut cases of users being mistreated, Stallman does a poor job of explaining how this is contingent on the software being proprietary. Most of these situations are impossible in the world of "free" software simply because there is no contract and there are no customers.

It might be true that in some cases "free" software could theoretically make such problems less likely to occur; in a number of cases "free" software seems unlikely to help, and in some cases its usefulness is at least debatable. A large portion of the examples has actually been resolved to the satisfaction of customers without resorting to a "free" software solution.

In general, proprietary "sabotage" outlines some of the problems that sometimes occur in the software world, but cannot be used as evidence that proprietary software as a concept is unfair. Neither did this section provide evidence that mistreating the user is standard practice.



2.2.5 Interference

Stallman writes: "This page describes how various proprietary programs mess up the user's system. They are like sabotage, but they are not grave enough to qualify for the word “sabotage”. Nonetheless, they are nasty and wrong."

The examples seem to describe situations in which an upgrade or a download initiated by proprietary software has the potential to interfere with the user's work on the computer.

This is one of the weaker sections. It contains 9 examples, 6 of which deal with Microsoft forcing Windows 10 upgrades, which we have already seen repeatedly in other sections. Linking to the same problem over and over again creates a false impression of problems being more numerous than they really are. Even if Stallman believes that Windows 10 upgrade tactics should be mentioned in this section, an honest approach would be to do so in one bullet point, not six, when they essentially address the same issue.

One of the other examples talks about the case of Apple downloading an upgrade in the background and, in some cases, saturating the Internet connection. This is an annoying issue, but if one simply reads the whole discussion Stallman links to, it becomes clear that the complaining user did not turn off the "automatic update download" option.[92] However, this does not completely solve the issue, since according to other sources, unchecking the option will stop the phone from using your mobile data plan, but the download will still go ahead when the phone is charging and on wi-fi.[93] That makes it unlikely to interfere with the user's actions, as in this case the device is less likely to be in active use. But, in my view, such behavior may still be a case of breaching consumer rights. Consumer rights organizations such as the EFF should definitely step in.

Another example deals with Adobe Photoshop Cloud freezing the system to perform a license check. Stallman writes: "Adobe nonfree software may halt all other work and freeze a computer to perform a license check, at a random time every 30 days." He then links to a blog post describing a situation in which someone was giving a live presentation and Photoshop froze the system to perform a license check. The author of the blog then urges everyone to use GIMP, which does not perform licensing checks.[89]

This blog post is an example of incredible intellectual dishonesty and "free" software propaganda. In reality, Adobe Cloud products work very differently. As can be seen from Adobe documentation, the online check happens every time you go online. Going online resets the check timer and gives the user 99 days to work offline.[90] [91] When the time to check the license approaches, the software reminds the user to do so at the first convenience. To quote from the Adobe FAQ on offline work:

After one calendar month, a dialog box requests online connection.
You can dismiss the dialog box; the next reminder shows up after 30 days.
After one calendar month + 69 days, the dialog box appears daily.
After one calendar month + 99 days, one final offline product launch is available.
[91]

The documentation makes it clear that the software gives the user enough reminders that a problem might occur (a simple model of this schedule is sketched below). Therefore, claims that the check happens "randomly", or that it is impossible to perform it ahead of time to prevent it from happening during an engagement, are false.
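For illustration, here is a minimal sketch, in Python, of the reminder schedule as described in the quoted Adobe FAQ. The day thresholds come from that quote (with "one calendar month" approximated as 30 days); the function itself and its wording are my own simplification, not Adobe's actual implementation.

# A toy model of the offline grace period quoted above. Day counts follow the
# Adobe FAQ; everything else is an illustrative assumption.
def license_check_status(days_offline: int) -> str:
    """What the user sees after a given number of days without going online."""
    if days_offline < 30:
        return "no reminder"
    if days_offline < 30 + 69:
        return "dismissible reminder dialog, repeated every 30 days"
    if days_offline < 30 + 99:
        return "dismissible reminder dialog, now appearing daily"
    return "only one final offline launch remains; go online to continue working"

# Usage: the presenter from the blog post would have seen escalating reminders
# for months before the check became blocking.
for days in (10, 45, 110, 135):
    print(days, "->", license_check_status(days))

Whatever the exact thresholds, the point stands: the checks are announced well in advance and do not fire "at a random time".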

This information has been verified by reaching out to Adobe tech support at adobe.com and by communicating with Photoshop Cloud users, who have confirmed that the story told in the Adobe documents is correct. Unfortunately, I found no convenient way to reference these firsthand accounts, but I believe that anyone can obtain similar accounts by finding users of Adobe Cloud products.

Stallman and the author of the blog have both demonstrated an unbelievable level of intellectual laziness: Stallman by linking to a blog post that provides little more than anecdotal evidence from a clearly biased source, and the author of the blog by drawing hasty conclusions about how Photoshop Cloud works without doing any research and while clearly knowing next to nothing about it. I cannot know whether his account of what happened is even accurate. A Photoshop license check is not supposed to freeze the system; in the worst case it will refuse to start Photoshop. I could find no accounts of Photoshop license checks freezing a system, and the bug reports on related license check issues that one can find through Google do not seem to describe what the blog author thinks he saw.

Taking into account the information provided above, what seems to have happened is that the presenter failed to pay attention to all the reminders and did not go online for more than three months to renew the license check. This is not Adobe's fault and definitely not an example of any mistreatment. The only user mistreatment is on the part of Stallman, for providing his readers with unchecked and ultimately false data.

And the final example on the "Proprietary Interference" page is this: "Oracle made a deal with Yahoo; Oracle's nonfree Java plug-in will change the user's initial web page, and default search engine, to Yahoo unless the user intervenes to stop it."

The irony of this example is that, if one carefully reads the article Stallman links to, the same tactic was employed a year earlier by Firefox.[94] In fact, judging by the Firefox blog post on the agreement with Yahoo, Firefox did not even give users a choice: it simply came bundled with Yahoo as the default search option.[95]

Firefox is "free" software, according to gnu.org.[96] That clearly shows that such behavior is not contingent on software being proprietary, but can occur in all kinds of software.

But the allegation is void anyway. Unlike the previous example of the Java installer installing adware,[75] asking whether the user wants to change the default search engine and requiring that option to be unchecked (the equivalent of having to say "No") sounds like a fine way to handle such installations. It can be argued that having the option deselected by default would be better and that having it pre-selected is manipulation on Oracle's part. I would not disagree. But I would disagree that this should be classified as interference worth discussing, let alone injustice or user mistreatment.

In conclusion, among the 9 examples we see just one problematic case, the Apple iOS download, plus rehashed examples of the Windows 10 upgrade. This page hardly advances Stallman's argument and instead shows once again the lack of even basic research behind his allegations, as well as a readiness to use any news item that seems to confirm his view of the proprietary software world, regardless of how reliable or even relevant the source material is.


2.2.6 Deception

Stallman: "This document reports instances where proprietary software is dishonest or conceals deception or trickery."


• Many proprietary programs secretly install other proprietary programs that the users don't want.

Stallman links to a Register article that refers to another article on the PPI (pay-per-install) blackmarket and Google's battle with such PPI installs.[97] [98] [99]

Interestingly enough, this often includes re-packaging "free" software, usually well-known packages such as the VLC player, with wrapper installers. Therefore, Stallman's claim is inaccurate: this is not only about proprietary programs, it is, in fact, also about non-proprietary software being used as a vehicle for crapware propagation.

The research paper that Stallman links to does not conclude that the problem is software being proprietary, nor does the Register article, nor does the Google blog. It is also not clear on what grounds Stallman reaches that conclusion.


• The proprietor of the Pokémon Go game invites restaurants and other businesses to pay to have the game lure people there.

The article Stallman links to says just that.[100] I see no signs of any injustice towards anyone here. This has nothing to do with any "user mistreatment".


• “Dark patterns” are user interfaces designed to mislead users, or make option settings hard to find.

Stallman then adds: "This allows a company such as Apple to say, “We allow users to turn this off” while ensuring that few will understand how to actually turn it off."

I agree that such deception should be unacceptable. I definitely disagree that designing difficult-to-use interfaces has anything to do with software being proprietary, although I would probably agree that "free" software products are less likely to have a need to hide certain settings.

Another important point here is to be careful about implying deception or intent where there may be none. While some alleged examples of "dark patterns" are more believable, many allegations tend to be rather unconvincing.


• A top-ranking proprietary Instagram client promising to tell users who's been watching their pictures was in reality stealing their credentials, advertising itself on their feed, and posting images without their consent.

This was malware. It was removed from the store immediately after its malicious activity was discovered. Malware exists on "free" software systems as well. This has nothing to do with questions of intentional user mistreatment, which Stallman obviously means in the context of non-malware proprietary software.

I'd like to stress this again - framing an example of actual malware as "proprietary software" is an unbelievably dishonest tactic.


• Volkswagen programmed its car engine computers to detect the Environmental Protection Agency's emission tests, and run dirty the rest of the time.

Stallman then adds: "Using free software would not have stopped Volkswagen from programming it this way, but would have made it harder to conceal."

He does not elaborate on why he believes the problem is the software being proprietary. I consider Stallman's assessment of the situation to be superficial. Yes, if the firmware were open source, it might have been technically easier to discover the deception - provided the code was cleanly written and well documented. But a "free" software approach would, in fact, create an even worse problem: if anyone can install any kind of software on their vehicle, then anyone can override emission limitations, and enforcing the law would become a logistical nightmare.

As will be discussed in Chapter 4, I do not disagree that in cases such as these government regulatory bodies should opt for at least some versions of "free" and/or open-source software that they can check in a centralized manner, along with a change in processes. But these solutions cannot and should not be the simplistic binary approach that Stallman advances.


• Conclusion.

This section lists 5 examples, only one of which is at least somewhat relevant to Stallman's case. There is no evidence provided that deception in proprietary programs is "standard practice". The examples are about malware propagated through both proprietary and "free" packages, and about the Volkswagen scandal, where the problem is hardly proprietary software.


2.2.7 DRM

This section is described as: "examples of proprietary programs and systems that implement digital restrictions management (DRM): functionalities designed intentionally to restrict what users can do. These functionalities are also called digital handcuffs."

These functionalities are called "digital handcuffs" only by Stallman. This is in no way an accepted term. Additionally, DRM is an abbreviation of "Digital Rights Management", not "Digital Restrictions Management". He also refers to copyright as "censorship law".

Interestingly enough, Stallman is actually a supporter of copyright, and has said on many occasions that copyright law is necessary, because "copyleft" would be impossible without it.

The 10 examples he lists are instances of DRM in various products. Stallman remarks that DRM should be illegal. Unfortunately, he stops there and does nothing to substantiate his point. There are no arguments, no evidence, just an assertion.

While DRM and copyright are complex questions, debated and grappled with in technology, law and discussions of consumer rights, their resolution would benefit from an evidence-based approach, not from ideological pronouncements. Such a discussion is taking place and should be encouraged even further. There are studies that look at the effectiveness and consequences of DRM, at alternative business models, and at whether DRM can be considered an anti-competitive practice. Additionally, the situation with DRM is always changing, and even large companies such as Apple are moving to remove it from at least some areas, as with iTunes going DRM-free in 2009. Many game companies also decide not to use DRM in their products.

Stallman, however, is silent about all these advancements. He is extremely slow to give credit to companies when they do the right thing, which creates an impression that the situation is hopeless, when in reality it is not. Indeed, Stallman has also launched an initiative called "Defective by Design", which calls for the abolition of DRM.[104] This is telling, as it shows that his discourse tends to be devoid of gradation. Instead of opting for reform, he prefers to advocate for the complete abolition of concepts he disagrees with, and for a complete embrace of his own philosophy.

DRM is definitely a very controversial technology and I hope that evidence-based discussion will eventually lead to a correct decision. Readers are encouraged to read the referenced scientific article and also the overview of DRM on Wikipedia.[102] [103]

Just to be perfectly clear, I share the concerns of groups opposed to DRM, just like many people and organizations do. I just believe that this cannot be labeled in a black-and-white fashion as "user mistreatment" and be used as an argument that all proprietary software is unjust.

But, most importantly, DRM is not a proprietary software problem. DRM is typically easily bypassed, sometimes mere hours after a DRM-protected product is released. It is legislation that in many countries makes bypassing DRM illegal.


2.2.8 Surveillance

This is a very large section that is prefaced by an introduction.

"This document attempts to track clearly established cases of proprietary software that spies on or tracks users."

For decades, the Free Software movement has been denouncing the abusive surveillance machine of proprietary software companies such as Microsoft and Apple. In the recent years, this tendency to watch people has spread across industries, not only in the software business, but also in the hardware. Moreover, it also spread dramatically away from the keyboard, in the mobile computing industry, in the office, at home, in transportation systems, and in the classroom.[101]

Psychologically this could be a very important section, as being told that you are being spied upon is a strong message. It might be a good selection of examples, or it could be an excellent opportunity for ideological manipulation.

As I have repeatedly said elsewhere in this text, questions of privacy and user control of data are extremely important, and any tracking capabilities companies have should always be kept in check. And they are being kept in check more and more.

For example, although Stallman considers Apple to be one of the most evil companies, iOS 10 goes one step further in empowering users to opt out of tracking, by allowing them to zero out the unique device identifier, which leaves all third parties blind as to who the user is.[118] What the real-life implications will be remains to be seen. Taking into account the general trend, there is good reason to be quite optimistic.

The situation with cookies, user analytics and advertising has changed considerably over the years. Legislation has been developed, and continues to be developed, to protect consumer rights, making sure that tracking is transparent and user data is anonymous.[14][19][20][21][22]

And it is precisely because I believe that questions of user privacy are important that I contend that fearmongering and propaganda of any sort should be unacceptable. Fringe groups that try to advance hysterical views and unrealistic solutions to current privacy issues are more likely to halt the momentum of organizations that advance reasonable agendas and pragmatic policies.

A group that argues for complete anonymity as the standard for all online conduct, and that considers using cellphones and credit cards an unacceptable invasion of a person's freedom, can very easily alienate the public, as opposed to organizations that, instead of anonymity, care about confidentiality (as discussed in the section on Insecurity), that want to make sure that data collection is clearly announced to the user, that users are not discriminated against if they choose not to be tracked, and that data collection is transparent and performed in such a way that it cannot be used to identify a user.

As is clear from previous sections, these nuanced views and policies are not the ones Stallman is interested in. His position is definitely on the maximalist side of the argument.


I am again forced to start by noting Stallman's uncompromising decision to use loaded language throughout his writings. The definition of "spying", for example, presupposes hostile intent. That is exactly why people react negatively to such a term, and why using this word is an excellent propaganda tactic - unless, of course, one provides evidence of hostile intent.

However, most of the materials Stallman links to provide little evidence of hostile intent. There is an incontestable difference between spying on users, getting their personal data and then using personal secrets to blackmail them or put them in jail for political opinions - and tracking anonymized user behavior in an application in order to make the UI more efficient, or even with the intent to show more relevant advertising.

The latter can still be a serious privacy concern, but calling all data collection "spying" is not only unhelpful, it makes it difficult to consider realistic solutions and instead shifts the focus to slogans.

By opting for terms that favor his worldview and promote fearmongering and conspiracy theories, Stallman denies his readers an impartial overview of the situation. Instead, he manipulates them into thinking that the situation is much worse than it really is.

The approach Stallman chooses in this section is as stringent and unworkable as in other sections.

For example, he condemns all analytics tools, by commenting on one of the tech pieces he links to: "The article takes for granted that the usual analytics tools are legitimate, but is that valid? Software developers have no right to analyze what users are doing or how. “Analytics” tools that snoop are just as wrong as any other snooping."

Such unorthodox pronouncements are left largely unsubstantiated. He almost universally fails to provide evidence or even any sort of reasoning for his assertions. His statement that "developers have no right to analyze what users are doing or how" requires at least some explanation. But Stallman provides none. Unless there is a good argument on why developers should be denied the opportunity to receive analytics from software, Stallman's statement should be rejected.

He also paints all user tracking with the same brush. But it is clear that recording someone's private conversations and anonymously tracking how users are using the UI are two very different kinds of situations, with data involved being of varying levels of importance and privacy.

We have to make it absolutely clear - the discussion on these very important topics should be based on reason and evidence. If Stallman believes it is enough to assert that someone should have certain freedoms or someone should not have certain rights, he is gravely mistaken. His opponents might just as well respond that developers absolutely have the right to analyze what users are doing. Unless there is a coherent argument, this is not an intellectual discussion, but ideological warfare.

Regrettably, the "Surveillance" section is especially littered with misrepresentation and dishonest tactics. Intentionally or not, Stallman uses all sorts of rhetorical devices to make his case seem more plausible.

For example, he describes Snapchat as a program the principal purpose of which "is to restrict the use of data on the user's computer". Readers who are not aware of how Snapchat works might think that it is an evil piece of software whose purpose is to mistreat the user, although this is not true: the selling point of the app is that users can set a timer on their image messages; when the timer runs out, the images are deleted. So, formally, one can say that the purpose of the app is to restrict the use of data. But this is a very misleading way to describe Snapchat.

Stallman then misrepresents an article on a security vulnerability in Snapchat, saying that Snapchat "tries to get the user's list of other people's phone numbers", as if Snapchat were doing this to somehow harm the users, although the article he links to is clear that malicious intent is feared from hackers, not the company. Such casual misrepresentation is especially upsetting, since a much better argument can be made: Snapchat is known to be very careless with its users' security and can be argued to mistreat users. Eventually, the company was even investigated by the Federal Trade Commission and forced to retract some claims about the security and data handling of its product.[107]

Another example of simply unbelievable dishonesty is Stallman's concern with "spyware in E-books". He writes: "E-books can contain Javascript code, and sometimes this code snoops on readers."

He then links to an article about volunteers using a special JavaScript library to take part in a scientific experiment that tracked their reading habits. The article makes it very clear that everyone participating knew that their E-book reader was sending data to the researchers.[108] There is no way a reasonable person can call voluntary participation in a science experiment "snooping". Such deception on Stallman's part is simply mind-boggling.

When talking about Windows, Stallman rehashes the same example he uses in the Backdoors section, that of Windows disk encryption. We have looked at this example and concluded that there is no sign of user mistreatment. And while it is not necessarily unfair to list the same example in several categories, in this case Stallman lists the same example twice within this one section! Several bullet points down he is again concerned with Windows 10 full-disk encryption sending a key to Microsoft. He does link to a different article that further looks at Microsoft's Privacy Policy, but also, again, at how its disk encryption sends a key to OneDrive.[11] [109]

There are other cases which Stallman rehashes several times. For example, the fact that Yosemite's Spotlight sends search terms not only to the search engines but also to Apple itself is mentioned in at least two separate places.


The low quality of Stallman's research is especially highlighted by this section.

In one instance Stallman links to resources that give clearly conflicting statistics. He links to a scientific study that finds that free mobile apps connect, on average, to 3 third-party URLs.[119] Several bullet points down he links to an article claiming that free Android apps connect to 100 domains on average.[120] This discrepancy apparently does not worry Stallman, and he references both links, thus creating the impression that the situation is dire.

While the first link actually leads to the research itself, the second link leads to a Guardian article that does not list its sources. The article does mention the name of the researcher, Luigi Vigneri. A quick Google search reveals his profile at Eurecom,[121] which in turn links to his publications. The paper in question is called "Taming the Android AppStore: Lightweight Characterization of Android Applications" and is publicly available at arxiv.org.[122]

A careful read of the paper reveals that the statement about an average of 100 domains is false. Section 5, titled "Application Destination Characterization", meticulously lists the study's statistics. The data is in agreement with the previous scientific study: the median number of domains is 4. It also shows that most apps do not communicate with trackers and that the vast majority of third-party URLs are not connected to malware. The relevant quotes, all taken from section 5 of the paper, are listed below:

Across the applications in our dataset, the median number of domains connected to is 4, while some apps connect to more than 100.

We observe that while the vast majority (73.2%) of apps do not involve any communication with trackers, a small number of apps do indeed communicate with them.

URLs that host executable content that is deemed malware-like, are deemed suspicious. ... By suspicion score for a URL, we denote the fraction of antivirus engines (VirusTotal uses 52 in all) that deem the URL suspicious (or malicious). Our result show 94.4% of the URLs have a (suspicion) score of 0. In the worst case, a URL was deemed suspicious by 3 (of 52) engines.[122]

This fact-checking process took me a little over 10 minutes. It is incredible that Stallman did not feel that he owed his readers at least this level of scrutiny.

This situation also demonstrates the importance of keeping sources updated. This same paper says:

While the Do Not Track policy has been proposed by consumer advocates and has gained some acceptance, the mechanism is restricted to web browsers, and does not extend to mobile apps in general.[122]

As mentioned in the beginning of this section, that policy has already been rolled out to mobile devices as well. The Do Not Track flag is now an official part of OpenRTB, a protocol that is used by advertisers and publishers to exchange information when serving ads (section 3.2.18 Object: Device).[123]
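
To make this concrete, below is a minimal, illustrative sketch (in Python) of the device portion of an OpenRTB bid request. The field names dnt ("Do Not Track"), lmt ("Limit Ad Tracking") and ifa (advertising identifier) come from the OpenRTB 2.5 Device object referenced above; the surrounding structure and all values are hypothetical and serve only to show how a recipient of such a request might honor these flags.

    # Minimal, illustrative fragment of an OpenRTB bid request (not a complete request).
    # Field names follow the OpenRTB 2.5 Device object; all values are made up.
    bid_request = {
        "id": "example-request-1",
        "device": {
            "ua": "Mozilla/5.0 (Linux; Android 7.0) ...",  # user agent string
            "dnt": 1,   # "Do Not Track": 1 means the user has opted out of tracking
            "lmt": 1,   # "Limit Ad Tracking" signal from the mobile OS
            "ifa": "",  # advertising ID; blank or zeroed out when the user opts out
        },
    }

    device = bid_request["device"]
    if device.get("dnt") == 1 or device.get("lmt") == 1:
        # A compliant exchange or demand-side platform would skip behavioral
        # targeting and serve only non-personalized ads for this request.
        print("Tracking opt-out detected; serving non-personalized ads.")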

In general, the articles that Stallman links to pertaining to app permissions are quickly becoming outdated as iOS and Android roll out more privacy features. Today even Facebook's apps alert users to the permissions they request.[124] [125] Many of the features that in Stallman's opinion invade user privacy can be turned off, and the articles he links to mostly demonstrate how to do that.

Another note of interest is that the first article on mobile apps' URL connections notes that strict filtering by the vendor is effective in removing the danger of excessive connections and of connections to malware sites. The paper also notes all the regulations that are now being put in place to fix the situation. This is important because it shows that the problem is being solved and is nowhere near "standard practice".

The section also lists some cases that are of significance for the modern privacy discussion. One case is about a pregnancy test application. Stallman writes: "A pregnancy test controller application not only can spy on many sorts of data in the phone, and in server accounts, it can alter them too." This, however, is manipulation. Such phrasing implies that all of this is kept secret from the user. However, this is not true: during installation the app clearly lists all the permissions it requires from the user.[126] Calling this "spying" is simply incorrect and an example of loaded language.

However, there are discussions to be had on what kind of regulation is required for a company that provides a service in exchange for data. As noted earlier in this treatise, and in fact at the beginning of this section, EU laws are already quite strict in regard to data collection and user targeting. The US will hopefully follow.

In general, it is important to understand that when Stallman writes "personal data", what is typically meant is an email address. The loaded language of the GNU philosophy texts does a good job of invoking associations with corporations secretly obtaining people's personal secrets and snooping inside their files. However, this is not what is generally happening.

There are cases of security breaches where more personal data was leaked. An example is the case of CloudPets.[127] In this specific case a security breach made it possible for hackers to gain access to users' emails and passwords, but also to profile pictures and 2 million audio recordings that children and their parents had made as messages for CloudPets. These messages had been stored in the cloud.

These cases are fundamental to understanding whether connected toys should be trusted and whether, in order to really trust such devices, we need mandatory security standards. The company's security standards were very low.

But one can notice that this is not how Stallman presents the case. He writes:

"“CloudPets” toys with microphones leak childrens' conversations to the manufacturer. Guess what? Crackers found a way to access the data collected by the manufacturer's snooping. That the manufacturer and the FBI could listen to these conversations was unacceptable by itself."

However, calling audio messages that had been stored on the manufacturer's server "snooping" is deceptive, even if you add "FBI" to the sentence. When a user registers on a forum she, too, must enter her email. Then she will leave her messages in the forum's database. This can now be spun to say that the owners of the forum "snoop" on users' personal data and conversations. After all, the FBI can potentially read them, can't it? So, does that mean that anything that connects data to a person and can potentially be read by the FBI is snooping?

The reader must remember that these numerous examples are listed by Stallman to demonstrate that proprietary developers systematically mistreat the user, secretly steal everyone's personal information and do this with malicious intent. But instead we are again and again presented with the vendor keeping some very specific user data, typically very openly and obviously, and with a clear purpose of providing a service, and then being hacked or having a security bug discovered - which is usually quickly patched. These examples might lead us to continue to push for better security standards and regulation, but this simply does not make Stallman's case that proprietary software is inherently evil or is more likely to collect user data or that this collection is even an issue in most cases.


2.2.9 Jails

Stallman explains the content of this section: "Here are examples of proprietary systems that are jails: they do not allow the user to freely install applications. These systems are platforms for censorship imposed by the company that owns the system. Selling products designed as platforms for a company to impose censorship ought to be forbidden by law, but it isn't."

As I discussed in the section on Censorship, there is no reason why an app publishing platform should be required to allow the installation of any and every app, let alone any grounds on which refusing to do so should be illegal. I have laid out the problems with Stallman's reasoning and the reader is referred to that section.

Unless my arguments against considering content filtering as "user mistreatment" are refuted, none of the examples in this section should be considered a demonstration of injustice towards the user.


2.2.10 Tyrants

Stallman: "A tyrant device is one that refuses to allow users to install a different operating system or a modified operating system. These devices have measures to block execution of anything other than the “approved” system versions."

This is not a black-and-white issue. In the case of some products such behavior can perhaps be argued to be deceptive and anti-competitive; in most it is totally acceptable. Stallman, however, simply posits this to be bad, without going into detail and without giving any justification for his opinion. This is unfortunate, because a more nuanced argument can be made.

For example, a product that is positioned as a general purpose computer should not favor certain operating systems and force other operating systems out. The injustice here would be the deception of the customer, who believes they are buying a computer on which any operating system can run. If you are creating a custom piece of hardware that is not positioned as a general purpose computer, then the customer is not deceived and there is no discernible injustice.

Perhaps a case could be made that artificially excluding certain operating systems is an anti-competitive practice in certain market situations, but this has to be investigated on a case-by-case basis.

Stallman brings up one interesting example, that of Intel possibly locking out certain operating systems.[105] This is definitely a worrying development, and such strategies by hardware corporations should be looked at carefully. Conspicuously, however, it has nothing to do with questions of "free" or proprietary software, so this example seems irrelevant to Stallman's case.

Another example he brings up is the PlayStation 3. He simply says that "The Playstation 3 is a tyrant", meaning it allows only a selection of operating systems to be installed on its hardware. I, however, do not agree this is "user mistreatment". Stallman needs to demonstrate exactly what is unjust here and in what way a gaming console blocking certain operating systems mistreats its users.

It should be noted that in the section on Sabotage Stallman brings up the case of Sony removing the option to install Linux from the PlayStation 3, a case that was brought to court.[106] But this is different. Here a clear problem can be spotted: a breach of contract with the customers who had bought the product expecting it to have the "Other OS" option. The case was won by the customers and Sony had to agree to a settlement.

But, notably, the settlement did not say that excluding operating systems in general is unjust; it instead concerned the removal of the feature after the product had been purchased. Below is the relevant quote from the settlement:

To get the $55, a gamer "must attest under oath to their purchase of the product and installation of Linux, provide proof of their purchase or serial number and PlayStation Network Sign-in ID, and submit some proof of their use of the Other OS functionality." To get the $9, PS3 owners must submit a claim that, at the time they bought their console, they "knew about the Other OS, relied upon the Other OS functionality, and intended to use the Other OS functionality."[106]

So, we should not mix the two issues, and we should firmly reject the notion that a gaming console is in principle prohibited from choosing which operating systems to support. If this is somehow unjust, Stallman needs to make a coherent argument as to why - which he does not do.

The other 4 examples describe practices that I currently find legitimate. I might change my opinion if presented with an argument explaining why I should consider them cases of user mistreatment.

In general, the examples in this section do not demonstrate that proprietary software developers are "intentionally mistreating the user".


2.2.11 Subscriptions

Stallman writes:

It sounds simple to say that a certain program “requires a subscription.” What that means concretely is that it contains a time bomb, so that it will refuse to operate after that date. Or else it is tethered to a server, and that server checks the date. Either one is a malicious functionality.

This is an extension of what Stallman wrote about "time bombs", see 2.2.4 of this treatise. It references Adobe cloud services as an example.

Stallman gives only one reason why subscription-based services are wrong - "nonfree software is controlled by its developers, which puts them in a position of power over the users; that is the basic injustice."

But as I discussed in 2.1, the phrase "developers exercise power over the users" is misleading, unhelpful and is not contingent on software being proprietary. "Free" software developers exercise "power" over the users in much the same way.

This "basic injustice", thus, cannot explain what is wrong with subscription-based services. It would've definitely been unjust if the developer would pull the plug suddenly, without any prior agreement. I have discussed this at length in 2.2.4. But subscription-based model openly states that the software will be available to the user as long as she pays for the service.

It is not clear why Stallman believes that this is unjust. By the same logic, any subscription-based service should be unjust in his eyes: a monthly bus ticket, a fitness club membership, a contract to rent an apartment. In fact, many "free" software vendors make their money by offering tech support subscriptions. Someone could then say that Red Hat's tech support subscription model is unjust because it is a time bomb, and after a certain date Red Hat specialists will refuse to help people in need.

Which is why I repeatedly stress the importance of a nuanced approach. Stallman's black-and-white tactics limit his own opportunities to start a dialog and promote meaningful change. For example, Adobe has been known to make the termination of its subscription difficult for its users.[128] This is definitely something that one can fight against, and something that can be argued to be rather malicious.

I must, however, decisively reject Stallman's assertion that subscription-based models in themselves are somehow unjust, unless he makes a coherent argument as to why this is so.


2.2.12 By company or type of product

This section seems to be just an alternative way to organize the material and mostly rehashes examples from other sections.


2.2.13 Conclusion

I started this section by quoting Richard Stallman's own introduction to the examples of proprietary abuse:

Power corrupts, so the proprietary program's developer is tempted to design the program to mistreat its users—that is, to make it malware. (Malware means software whose functioning mistreats the user.) Of course, the developer usually does not do this out of malice, but rather to put the users at a disadvantage. That does not make it any less nasty or more legitimate.

Yielding to that temptation has become ever more frequent; nowadays it is standard practice. Modern proprietary software is software for suckers!

Users of proprietary software are defenseless against these forms of mistreatment. The way to avoid them is by insisting on free (freedom-respecting) software. Since free software is controlled by its users, they have a pretty good defense against malicious software functionality. [4]

To begin with, it is clear that Stallman's approach to proving his claim that consumer abuse is standard practice is fundamentally flawed. The only way to demonstrate that something is standard practice is to provide statistical data. Compiling an arbitrary set of isolated incidents is not and cannot be evidence to support the claim. Even if we take all the cases Stallman presents at face value, any area of human activity will have numerous cases of misconduct, which can easily overwhelm the reader if presented in one place as a large list. In such a manner, one could review a list of all aviation accidents in history and be tempted to conclude that flying is dangerous. In reality, all one needs to do is place this list against the total number of flights and recognize how small the list of incidents really is.

Software development is an enormous and extremely broad business that touches upon countless products, from personal computers to toys, appliances, house lighting and car software. Putting together a list of malpractice cases by scanning news items across all these products and decades of software development is not informative. Doing comparative analysis, providing proper statistics and establishing the actual scale of consumer rights violations is. Stallman does not even attempt to perform such research.

Additionally, most of the news items Stallman submits typically describe a court case or a scandal, which demonstrates that these situations are nowhere near "standard practice", and are instead outliers treated as problems to be dealt with.

But what about the cases themselves? Are they indeed accurately represented examples of consumer rights violations?

Mostly, not.

Stallman routinely misrepresents data, gives incomplete accounts, links to unreliable, biased sources, many of which on close inspection turn out to be wrong, and denies the reader a balanced, well-researched discussion of ethics, instead committing to the use of loaded language and other rhetorical tactics.

He constantly talks about developers of proprietary software abusing the user, but the bulk of the cases he presents are examples of a third-party taking advantage of security vulnerabilities and abusing the user. This is a bait-and-switch argument. It does not show any intent to harm the user on the part of proprietary software developers.

Stallman also claims, without providing any evidence, that proprietary software developers show no concern for security. Available research, however, reveals that "free" software projects are not any better in terms of security than their proprietary counterparts.

Little care is shown to correct or update allegations. Old news stories about security bugs in proprietary products continue to be listed, and the commentary is worded so as to give the impression that the problem exists today, although most problems had been fixed within days and sometimes hours of being reported.

Some examples describe a problem with a particular program in general, as in the case of WhatsApp not being secure. Although this allegation is no longer true, it is still listed. The number of such "dead issues" in Stallman's compilation is very significant.

His commentary on the provided examples is littered with mistakes, misconceptions and attempts to manipulate the reader. Even basic research shows that Stallman is extremely careless with his sources, and typically does not do any fact checking. Sensationalist articles, thoroughly debunked and sometimes even corrected by their original authors, continue to provide "evidence" of "proprietary abuse". Stallman readily links to personal blog posts, taking any claims about problems with proprietary software at face value, even when the claims are ridiculous and quick research reveals them to be blatantly false.

Such a lack of rigor leaves the impression of Stallman trying to influence the reader with sheer numbers. When corrected for accuracy and relevance to Stallman's claims, the number of links that do indeed demonstrate customer rights violations becomes quite small. And even those are not without caveats.

Stallman does bring up issues that are serious and do indeed pose an ethical and political challenge, including such exceptionally complex topics as freedom of political expression, privacy in the wake of new technologies, the extent of state power, and fair competition in the free market.

However, his stance on these issues is radical, ideologically charged, borders on a simplistic binary worldview, and doesn't appear to be backed by much argument or evidence. Instead of addressing the complicated nature of the problems that arise in our society as a consequence of new technology, the only problem Stallman seems to care about is that software is proprietary, even in cases when a problem has little or nothing to do with the availability of source code.

In some cases the only way to see the current state of affairs as constant user mistreatment is to adopt radical definitions of what does not constitute abuse. For example, Stallman's position on security is that only total anonymity is acceptable, a position that is clearly not widely shared and is asserted by Stallman without any justification.

Most importantly, Stallman's misleading language steers the reader in the direction of despair, as if nothing is being done to fix the situation. In the above quote he says: "Users of proprietary software are defenseless against these forms of mistreatment." Throughout his writings there is a tacit assumption that the software world exists in a vacuum, and that there are no ways to solve the problem other than to adopt Stallman's philosophy. He thus frequently offers a false dichotomy: that problems with software can either be fixed by granting everyone full access to its source code - or they cannot be fixed at all.

Yet, almost every source he quotes explains how the issue in question was solved by a regulation or a court case or proactively by the industry itself. Scientific research that he links to also demonstrates that regulation works.

And finally, the overarching narrative seems to be that all of these "abuses" were exactly what Stallman had been predicting from the start, and now it is finally happening. When talking about Windows 10 updates, he writes: "This demonstrates what we've said for years: using proprietary software means letting someone have power over you, and you're going to get screwed sooner or later."[129]

Not only is this tantamount to saying that letting someone cook for you means that sooner or later you are going to get a badly prepared meal - which is not saying anything substantial - but the dystopia that Stallman anticipates also does not seem to be coming. His poorly researched examples of people purportedly being "screwed" by proprietary software do nothing to demonstrate the dystopian trend that he so passionately tries to detect in a random stream of news items, blog posts and political scandals, many of which have to be painstakingly spun in order to appear relevant to his cause.

Thus, based on my analysis and the evidence provided, I must conclude that these examples fail to uphold Stallman's argument that abuse from proprietary software developers is standard practice. Software development is just another area of human expertise, with its occasional misfortunes and mistakes, and with a clear trend of becoming safer and more reliable, as experience and regulation shape it to comply with law and human rights - a path that any new trade has gone through.



2.3 Proprietary software keeps people divided and reduces cooperation

This is the final argument that Stallman puts forward as proof that proprietary software is unjust. It may also be the most psychologically decisive one. A person with a cursory familiarity with the subject will read that "free" software is about sharing, cooperation and education, and will associate it with moral superiority.

In fact, this is what frequently happens. Even people who are actively engaged in "free" software projects tend to be largely unaware of the details of Stallman's philosophy, yet remain convinced that "free" software is ethical while proprietary software is not.

So what is this argument? Stallman writes:

When you use proprietary programs or SaaSS, first of all you do wrong to yourself, because it gives some entity unjust power over you. For your own sake, you should escape. It also wrongs others if you make a promise not to share. It is evil to keep such a promise, and a lesser evil to break it; to be truly upright, you should not make the promise at all.
...
Freedom includes the freedom to cooperate with others. Denying people that freedom means keeping them divided, which is the start of a scheme to oppress them. In the free software community, we are very much aware of the importance of the freedom to cooperate because our work consists of organized cooperation. If your friend comes to visit and sees you use a program, she might ask for a copy. A program which stops you from redistributing it, or says you're “not supposed to”, is antisocial.
...
Schools (and this includes all educational activities) influence the future of society through what they teach. They should teach exclusively free software, so as to use their influence for the good. To teach a proprietary program is to implant dependence, which goes against the mission of education. By training in use of free software, schools will direct society's future towards freedom, and help talented programmers master the craft.

They will also teach students the habit of cooperating, helping other people. Each class should have this rule: “Students, this class is a place where we share our knowledge. If you bring software to class, you may not keep it for yourself. Rather, you must share copies with the rest of the class—including the program's source code, in case someone else wants to learn. Therefore, bringing proprietary software to class is not permitted except to reverse engineer it.”
... [5]

Let us break this text down into individual points:

a. Using proprietary programs or SaaSS wrongs yourself, because it gives some entity unjust power over you.
b. Agreeing not to share wrongs others: proprietary licenses deny people the freedom to cooperate and keep them divided.
c. Schools should teach exclusively free software, because teaching a proprietary program implants dependence.

Point a I have dealt with in section 2.1 and it is mentioned here only for completeness. Point c is secondary to point b, but is of some importance as an application of Stallman's reasoning to education, to which he dedicates noticeable space in his writings.

What must be noted is that point b focuses not on the availability of the source code, but instead on the terms of the software license. Point c mostly focuses on the availability of source code.


2.3.1 The non-scarcity dilemma

Stallman argues that proprietary software is unjust because it denies people the opportunity to cooperate. Although this is worded in overly generic terms, Stallman's concern is clear: because proprietary software licenses often prohibit distributing a program to anyone, including friends and family, he believes that this will somehow undermine the fabric of society and create a dystopian world where everyone is dependent on corporations, no will is left to help people in need, and those who cannot afford to buy software are left out.[134] [145]

It is of paramount importance to stress that this argument is not specifically about software. At its root it is a solution to a moral dilemma about non-scarce products. Essentially, Stallman argues that agreeing to obtain something that can be easily duplicated, under the obligation not to duplicate it for others, is morally wrong.

Such a dilemma seems to be at the heart of modern debate about information technology ethics that surrounds not only software development, but also distribution of media and questions of copyright in general. Let us formulate this dilemma by generalizing Stallman's phrasing.

You have obtained a product that is very useful and is sold at a substantial cost, but which can be duplicated at negligible cost. Your friend comes over and asks for a copy. What is the right thing to do?

Stallman's solution is outlined in the quote above: you should give a copy to your friend. He then provides further opinions on the situation by saying that two wrongs were committed: the first by the software developer who imposed a prohibition on copying a non-scarce product, the second by you, for having agreed to such a prohibition at the expense of "cooperation".

Before I analyze Stallman's response to the dilemma, let us look at the dilemma itself and appreciate its non-triviality.

The first difficulty is the definition of a non-scarce product. Typically, the term "non-scarce" is used for a product that can be copied at negligible cost. Thus, it is non-scarce in the sense that, once it is produced, the number of copies that can be created at no cost is virtually unlimited. However, the design, production and maintenance (where applicable) of such a "non-scarce" product involve quite scarce resources: skilled labor, time, financial investment. Therefore, this non-scarcity is not universal and applies only to one aspect of the product - its distribution. In all other aspects the product is quite scarce.

The second difficulty is the one-sided aspect of the dilemma, at least in the way Stallman formulates it: "if your friend comes to visit and sees you use a program, she might ask for a copy." It puts the user of a program under scrutiny, perhaps even the developer of a program, but it leaves out the actions of the friend who asks for a copy, as if asking for a copy carries no consequences and is not something worthy of analysis.

So, how does Stallman approach this dilemma?

Stallman's response rests on several key statements, which I will examine in turn.

First of all, let us take note of the language Stallman is using. He routinely equates "distributing programs" with "cooperation" and "sharing", whereas the words "cooperation" and "sharing" typically denote much broader notions. Cooperation is not limited to distributing software.

And even if we do limit ourselves to software and imagine that it is the sole arena of human cooperation, we are still offered a false dichotomy, as if there are only two options - distribute a program (and cooperate) or refuse to distribute (and not cooperate) - whereas there are many other ways to help our friend without undermining our friendship. For instance, we can buy her a license.

Nor is distributing programs always cooperation. It is certainly not cooperation with the people who developed the program. And once we recognize that software development is an area of quite scarce resources, only one aspect of which - copying - has so far enjoyed the blessings of relative non-scarcity, our help in the short term can very well become a hindrance in the long term, for everyone.

Therefore, another way to resolve the supposed conflict is to explain to one's friend that people who write software need resources in order to continue doing so, and that as adults we can and should recognize this basic fact and do our best to uphold the current order of things, at least until developing software becomes as non-scarce as copying its end product.


This leads us to a very important feature of Stallman's argument, one that is not obvious but plays a large role in the underlying logic. The reason for its obscurity lies in the "free" software narrative itself.

The "free" software movement focuses on ethics. In this specific argument Stallman also consistently aims his attention at ethical categories of "sharing" and "cooperation". The danger of such focus is failure to analyze differing perspectives. Thinking that narrows its throughput to only a handful of principles frequently carries the assumption that the rest of the world sees the situation in similar terms.

This explains why Stallman frequently equates words like "cooperation" and "sharing" with "distributing software". If one sees the world through the lens of a certain cause, everything becomes entangled in it. This is evident in Stallman's examples of what he calls "proprietary abuse", and especially apparent in cases which, to a neutral investigator, might seem to have very little if anything to do with the topic at hand (see the many examples in section 2.2).

Stallman's argumentation in this case suggests that proprietary software licenses prohibit cooperation and sharing in his general ethical sense of those words. He must assume so, otherwise his concerns make little sense.

But this is not what such licenses actually prohibit. What they prohibit is sharing in a very narrow sense - the distribution of the software itself. And they do so for one reason and one reason alone: because a license costs money. When money is not an issue, not only individual developers but also large corporations routinely forgo prohibitive licenses and opt for some form of freeware or even open source distribution. There are no known software licenses that try to prohibit people from cooperating or sharing in general.

Similarly, the prevailing reason why someone would ask a friend for a copy is that they do not have the money to obtain a software license, or are unwilling to spend the money they do have on one.

Stallman's ultimate solution to what he sees as money interests destroying cooperation in society is a world where the "essential freedoms" ensure that it is nearly impossible for software development to be a directly monetized activity. Of course, various forms of indirect economic models are often suggested, but none of these models seem to work reliably. And Stallman's freedoms definitely remove pure software development from the area of commercial enterprise.

There is, of course, a case to be made that failing to cooperate with a friend even in this one narrow sense somewhat hurts the spirit of cooperation in other areas. But this is a difficult case to make, and one that Stallman provides neither evidence nor argument for. Nor does he specifically formulate such a case.

And the problem with this argument, were he to make it, would be similar to his insistence that software developers exercise "power" over users: it is applicable to countless situations that are unlikely to be considered problematic by "free" software advocates. Any situation where a good costs money will by definition exclude those who cannot afford it. Unless "free" software advocates want to push for a utopian socialist society where everyone is entitled to receive scarce goods at no cost, it would be difficult to argue that proprietary software poses a special problem of "reducing the habit of cooperation". From such a radical standpoint, buying groceries has basically the same "anti-social" effect.

The only reason why a prohibition on copying software would at all seem like an issue worth discussing is that the process of copying costs nothing, which makes one superficially conclude that it is a non-scarce good to which a license has been somewhat artificially attached. One has to make an additional and perhaps not always obvious mental step to realize that all the other factors that make software possible are scarce, and thus the product overall is not as non-scarce as it seems. With all the automation and seamless services, a non-developer can be excused for not appreciating the colossal amount of work that goes into writing software, but that does not change the eventual conclusion: software is generally a scarce good.


If I were to add some additional comments, I would point out that not all proprietary software licenses forbid copying. A whole area of proprietary software has non-prohibitive licenses and should be left out of Stallman's argument. It is thus incorrect for him to refer to proprietary software in general; he fails to make such a distinction.

While a lot of software may still depend on the user being prohibited from sharing, much notable modern software no longer works that way either: people need accounts with the developer, tied to payment for a license. In many cases this is true even for completely free-of-charge (freeware) software.

This is important, because it removes at least some of the ethical concern on the part of the user - there is no longer a reason for a friend to ask for a copy. One might object that this scarcity is artificial and, therefore, still unethical, but I have dealt with that concern in the paragraphs above: software is generally a scarce good.

Frequently, it is much more beneficial to obtain your own license rather than ask someone to share theirs, due to the advantages it gives: free updates, access to official forums and tech support. A number of proprietary software packages also allow paying customers to secure access to the developer team and lobby for the features they most require.


Stallman also says: "Freedom includes the freedom to cooperate with others. Denying people that freedom means keeping them divided, which is the start of a scheme to oppress them".

Another frequent sound bite from Stallman is that software is "free as in freedom", with the implication that his proposed software "freedoms" are akin to basic rights like freedom of speech.

I should clarify that there is a difference between public and private spaces that Stallman does not highlight. All freedoms granted to citizens of western democratic countries by their constitutions are applicable only in the context of public spaces.

For example, freedom of speech means that anyone can articulate their opinion without fear of government retaliation, censorship, or societal sanction.[132] However, such freedom is not applicable to private spaces. The law defending the right to express one's opinion does not force gnu.org to allow other authors to publish articles praising proprietary software. In fact, gnu.org has every right to completely exclude authors who hold alternative opinions from being represented on its website. But if someone were to take Stallman's approach, one would be able to say that the Free Software Foundation denies people the freedom of expression, and that denying the freedom of expression is the first step to oppression. Obviously, this is twisting the meaning of freedom of speech by applying it to an inappropriate situation.

Therefore, whenever Stallman insinuates that a proprietary software vendor is "denying the freedom to cooperate", an allegation of this sort can only apply to the overall public space. Unless the user were somehow coerced by the state into agreeing to particular licenses, this cannot be argued to be the case. It can probably also be said that forcing people to use any type of software, be it proprietary or "free", is inherently unjust. Instead, one should defend people's option to have a choice.

But the option to have a choice is exactly the state of affairs today. The claim that proprietary software developers deny others the freedom to cooperate is demonstrably false. Software developers and companies do not have the power to deny basic freedoms to anyone. Nobody is prevented from using, writing and promoting open source and/or "free" software, or even public domain software.

To say that there is currently no alternative to a given piece of proprietary software, and to conclude from this that the developer of said software denies people the freedom to cooperate, is fallacious reasoning.

And, as mentioned several times above, cooperation is not limited to distributing or not distributing copies of software; therefore, it is implausible that the consequences of prohibitive software licenses are as vast as Stallman implies.

Additionally, freedom to cooperate is not an obligation to cooperate, just as freedom of speech is not an obligation to speak one's mind.

Thus, Stallman's claim that proprietary software "denies people the freedom to cooperate" is largely immaterial. It does raise concerns in spaces where government sets the rules, like in public schools. This will be discussed in greater detail below and in Chapter 3. But for the majority of cases the allegation that proprietary software developers deny people the freedom to cooperate is simply wrong.


Another curious counter-argument to our contention is the belief that useful software has to be written only once, similar to the establishment of a mathematical theorem. Once the software is "mostly complete", the argument goes, there is no need to impose additional costs on anybody; the morally right thing to do is to make it available to everyone at that point, and the community will be able to add the finishing touches.

A concept of a "complete program", however, is a mirage that promptly disappears once closely inspected.

We should recognize that software is, at its core, a model of certain use cases. It is always a tool that serves a purpose - frequently several purposes. And any model is by definition incomplete. Therefore, software will always fail to address a given use case completely. There will always be a gap between requirements and their implementation.

Not only that, but this gap is destined to always expand. Even if a perfect tool is built, once we have a set of requirements satisfied, we will naturally proceed to outline a set of further ones. Because a program makes a given job easier for us, we are now able to shift our attention to other goals, and build additional capabilities on top of what we have. This creates an endless technological cat-and-mouse game, with the gap between implementations and new requirements having to be continuously sealed by further development.

A real-life example is a text editor. Seemingly, this is a very basic type of software and, 30 years into personal computing, we should have written a text editor that is "complete", with perhaps only small fixes and a few features required. However, this does not seem to be the case. Text editors are constantly evolving, becoming more complex, adopting more and more use cases. To this day there are many commercial text editors which are actually being sold, i.e. people are ready to pay for someone's implementation of a text editor. The world of open source software is no different, with dozens upon dozens of packages in active development.

Even in principle it is difficult to define at which point the program should be considered complete. This difficulty stems from the plethora of use cases that any general purpose software will have. If we take our text editor example, one person might consider a given text editor complete, while another would require it to have line numbering as part of basic functionality. Someone else might need it to be full screen, while others would want it to also be able to highlight and compile code.

Professional, non-general-purpose software is even worse in this regard. Professional fields develop at a faster pace, and widely used modern software tools are more complex than anything humanity has ever designed. Entire professions collapse into a one-person act because of constantly evolving software.

At times a concern is voiced that new software versions frequently do not significantly differ from their predecessors, and that upgrades are artificial and are done only to boost sales. And while there are probably cases when this does happen, one would be hard-pressed to support with statistics the claim that such occasions are frequent. A bird's eye view of any professional software will reveal clear progress. A modern video editor, for instance, can do markedly more things than one from a decade ago. A music workstation is capable of performing tasks that were unheard of just several years prior. And more often than not, professional software development is directly driven by the needs of its customers.

Incidentally, we have good empirical evidence of formerly proprietary tools becoming open source and even "free" in Stallman's sense, with their development continued by the community. None of these tools seem able to compete with modern software unless very significant and well-organized development continues. And software that was abandoned quickly falls out of favor with end users, as even maintaining it and guaranteeing operability on modern systems is significant work.

So, unless all human progress in software is frozen, we will never have complete programs or complete tools of any sort.


Finally, one can question the whole premise of non-scarcity in Stallman's dilemma and maintain that this is not what he is saying.

Indeed, it would be correct to say that nowhere does Stallman directly mention non-scarcity. However, I believe that it is strongly implied in his argument. The whole problem that he poses is predicated on software being copyable with no great effort. Otherwise, there is no moral dilemma. He also obviously singles out software (and digital media in general), and does not seem to be making a similar case for objects that have less efficient methods of duplication.[1]


2.3.2 Education

Stallman maintains that proprietary software is detrimental to the mission of education and "implants dependence". He then gives a number of arguments in support of his contention.

The importance of this claim is that while the above section deals with cooperation in relation to distributing software, this section deals with the availability of source code. The notion that the lack of source code amounts to a refusal to cooperate is one Stallman highlights frequently, for instance in his essay on the history of the GNU project.[149]

I will proceed to analyze his arguments, but interestingly enough he does not make the argument that I have outlined in 2.3.1 - that in public places, where government does set the rules on which software to use, citizens might be denied the choice of licenses and forced into a contract that they are unwilling to abide by.

The problem is broader than education: it applies to a whole range of government institutions funded by taxpayer money, but it definitely does apply to public schools. In some western countries the state might define, usually on a municipal level, the exact software to be used in class, and a lot of that software is distributed under proprietary licenses.

While the case for "free" software as a solution is not as simple as it looks on the surface, at the very least it is a real problem, and a problem that requires a solution. More on this in Chapter 3.

Stallman's own reasons to insist on "free" software in education are listed in the essay "Why Educational Institutions Should Use and Teach Free Software",[135] and are further supported by three more essays on the subject.[5][133][134] The reasons are these:

  i. "free" software supports education by allowing the sharing of knowledge
  ii. "free" software supports education by allowing the sharing of tools
  iii. teaching "free" software will prepare students to live in a free digital society
  iv. using "free" software in schools provides independence from software vendors
  v. "free" software costs no money
  vi. "free" software is of high quality

In this case we have the same situation as with the general "free" software contention. There are essentially two things being said: that proprietary software is detrimental to the mission of education, and that "free" software is the solution. While Chapter 3 will specifically focus on the latter point, I will have to touch upon it in this section as well.


i. "Free" software supports education by allowing the sharing of knowledge

Stallman expands on this point in the essays mentioned above:

Free software permits students to learn how software works. Some students, natural-born programmers, on reaching their teens yearn to learn everything there is to know about their computer and its software. They are intensely curious to read the source code of the programs that they use every day.

Proprietary software rejects their thirst for knowledge: it says, “The knowledge you want is a secret — learning is forbidden!” Proprietary software is the enemy of the spirit of education, so it should not be tolerated in a school, except as an object for reverse engineering.

Free software encourages everyone to learn. The free software community rejects the “priesthood of technology”, which keeps the general public in ignorance of how technology works; we encourage students of any age and situation to read the source code and learn as much as they want to know.

Schools that use free software will enable gifted programming students to advance. How do natural-born programmers learn to be good programmers? They need to read and understand real programs that people really use. You learn to write good, clear code by reading lots of code and writing lots of code. Only free software permits this.

How do you learn to write code for large programs? You do that by writing lots of changes in existing large programs. Free Software lets you do this; proprietary software forbids this. Any school can offer its students the chance to master the craft of programming, but only if it is a free software school.

...

When deciding where they will study, more and more students are considering whether a university teaches computer science and software development using Free Software. Free software means that students are free to study how the programs work and to learn how to adapt them for their own needs. Learning about Free Software also helps in studying software development ethics and professional practice.
[134],[135]

We can infer several arguments from these paragraphs, the first being that "free" software allows people to see the code and, therefore, encourages them to learn programming, whereas proprietary software does not.

To start, we must recognize that most people interfacing with software tools in class will be learning how to use the tools, not how they are built. Therefore, this argument can only apply to a narrow selection of programming classes and to people who are curious about programming outside of class. In the majority of cases it is simply wrong to assert that "proprietary software is the enemy of the spirit of education".

Second, Stallman seems to insist that the only way to learn programming is to read the source code of complete programs. In fact, he highlights this in several places, by first saying that kids are "curious to read the source code of the programs that they use every day", then stressing that in order to become good programmers people "need to read and understand real programs that people really use", and that learning how to write big programs requires one to "write lots of changes in existing large programs".

It is clear why he does that - it helps him set up a confrontation between proprietary software and education. The problem with this narrative is that it rests on an unproven premise that proprietary software has any relation to education, or that it should have a relation to education.

Proprietary software is not a set of educational materials, and generally does not claim to be. It is not clear why anyone should feel that any given tool students are exposed to must also provide means to learn how it is built. Saying that proprietary software does not allow people to learn how to program is tantamount to saying that a microscope in a science class does not allow people to learn how to build microscopes. Proprietary software is not created to teach people how to program.

But not only proprietary software - most software is created for purposes other than educating people how to code. And Stallman's claim that reading the source code of complete programs is the only way to become a good programmer is a puzzling one. It certainly does not match the reality of how people actually learn to code. There are vast numbers of highly skilled programmers who have never been exposed to "free" and/or open source software, and yet have managed to master the craft.

But even in and of itself, the idea of learning how to program by looking at the source code of complete programs is generally very poor advice, and an outdated one. Many of our everyday programs are extremely complex, and even if each and every one of them provided its source code to the public, a novice programmer would find it very confusing, unhelpful and incredibly discouraging.

Typically, people learn how to code by inspecting code snippets, by reading textbooks and going through programming courses, by working on small projects and reading documentation - in other words, by using educational material, not by employing the source code of actual functioning software, which is guaranteed to be neither novice-level, nor well organized, nor even well written.[136] [137] [138]

And, finally, one of the proven ways to learn is on the job. Many people have picked up programming by starting with smaller projects, and then honing their skills to expert level by doing programming as their profession.

I cannot accept Stallman's statement that "free software encourages everyone to learn" either. The fact that source code is formally available does not by itself encourage students to learn. Putting together materials that actually facilitate learning is a very difficult task, and authors and teachers all over the world continuously work on textbooks, courses and exercises to help people learn programming. If a given "free" software project wants to actually encourage learning, a lot of additional work has to be done - code formatting and commenting, ensuring best practices, writing detailed documentation, etc. This kind of work is very rarely done for any software, let alone "free" software, which in many cases is written by hobbyists in their spare time.

Stallman then says that proprietary software developers "keep the general public in ignorance of how technology works". This argument again assumes that the only way to teach technology is to give people access to complete source code of software.

More importantly, he confuses technology with source code. Stallman commits this error frequently, for example saying "the easy choice was to join the proprietary software world, signing nondisclosure agreements and promising not to help my fellow hacker."[149] In other words, to Stallman the lack of source code is equivalent to the lack of help.

There is, however, an inherent difference between principles, techniques, algorithms - and source code. Source code is no longer the only or even the most common vehicle for sharing technological knowledge. In many cases what's relevant is a description of an algorithm, a code snippet or a function. It is very rare that someone needs to inspect the code of a whole program. Even then, what's really helpful is an overview of the program's architecture, not the code itself.

Developers can and do share their findings by publishing white papers, by disseminating their knowledge in books, courses, on Internet forums, in blog posts, at Q&A resources like Stack Overflow, etc. Simply supplying complete source code is not likely to encourage people to learn, because they cannot even be sure that there is anything interesting to learn there until they spend valuable time and effort studying it.

The public is not kept in ignorance of how technology works, either. Such information is generally open and accessible. For instance, one might not have access to the source code of Skype, but it is fairly trivial to get access to information about technologies and techniques that make Skype possible, and anyone can write a program with identical functionality based on publicly available knowledge - provided they gain the expertise and spend a significant amount of time and effort.

Source code alone does not teach programming in general, nor does it teach how programs should be written. The only thing Stallman can claim is that "free" software gives people the opportunity to learn how a given program was, in fact, written.

Finally, Stallman claims that "more and more students are considering whether a university teaches computer science and software development using Free Software", because only "free" software will allow people to learn how programs work and how they can adapt them to their needs.

However, teaching computer science is not equivalent to teaching how particular programs work, nor is computer science about teaching how to adapt existing software to one's needs. It is, instead, a vast body of knowledge about how computers work, how software is built, and the fundamental principles of computing and algorithm design.[139] Stallman tries to redefine computer science and narrow it down to looking at the source code of existing programs, because only then can he construct a conflict between proprietary software and education, a conflict that seems to be largely non-existent.


ii. "free" software supports education by allowing the sharing of tools

Teachers can hand out to students copies of the programs they use in the classroom so that they can use them at home. With Free Software, copying is not only authorized, it is encouraged.[135]

This is a potent argument that makes sense in certain scenarios. Not being able to work with a program at home can be a serious impediment.

It is a separate question whether "free" software should be the solution here. There are many cases when a "free" software alternative does not exist, or the quality of existing alternatives is too low. If a viable "free" alternative exists, schools in many cases should opt for this alternative.

So, I definitely agree with Stallman here, but only to an extent. In most cases, we do not need a program that provides source code and permission to change it and redistribute it further. All we need to solve this issue is a license that allows users to obtain copies free of charge. In other words, any freeware program will do, not only a "free" program per Stallman's definition.


iii. teaching "free" software will prepare students to live in a free digital society

This argument rests on two premises: that proprietary software is unjust and that "free" software is just. As I consider both premises to be unproven - and even questionable - I cannot accept that not teaching proprietary software and teaching "free" software is a moral duty of anyone.


iv. using "free" software in schools provides independence from software vendors

Stallman expands on the point:

Schools have an ethical responsibility to teach strength, not dependency on a single product or a specific powerful company. Furthermore, by choosing to use Free Software, the school itself gains independence from any commercial interests and it avoids vendor lock-in.

- Proprietary software companies use schools and universities as a springboard to reach users and thus impose their software on society as a whole. They offer discounts, or even gratis copies of their proprietary programs to educational institutions, so that students will learn to use them and become dependent on them. After these students graduate, neither they nor their future employers will be offered discounted copies. Essentially, what these companies are doing is they are recruiting schools and universities into agents to lead people to permanent lifelong dependency.

- Free software licenses do not expire, which means that once Free Software is adopted, institutions remain independent from the vendor. Moreover, Free Software licenses grant users the rights not only to use the software as they wish, to copy it and distribute it, but also to modify it in order to meet their own needs. Therefore, if institutions eventually wish to implement a particular function in a piece of software, they can engage the services of any developer to accomplish the task, independently from the original vendor. [135]

This is a very complicated argument. I believe Stallman's take on it to be mostly incorrect. As a lot of it is about "free" software as a solution, I will discuss these arguments in greater detail in Chapter 3. Here I will only briefly outline my analysis.

Stallman's reasoning rests on the following premises:

  1. dependence on a software vendor (vendor lock-in) is a problem
  2. people can become dependent on software by mere introduction to it

I do not entirely disagree with the first premise. Vendor lock-in can indeed be a problem. I do disagree that this is the state of affairs today. With the exception of Microsoft, whose grip on the operating system market can be described as a near monopoly, most other software cannot be described in this way. Even titles such as Photoshop, although currently dominating the market, are rivaled by other products.[141] [142] [143]

The point about dependence in general has been shown (see section 2.1) to be immaterial for non-developers, and even for most developers most of the time, as we cannot be experts in everything and are all dependent on professionals to varying, and often significant, degrees. Software is not even the most critical area in this regard; fields like medicine and food are far more so.

It is true that dependence on a "free" software vendor, whoever it might be, carries different consequences. Stallman believes these consequences to be more favorable to the user, and completely disregards any other solutions. This is discussed in great detail in Chapter 3, in the section ""Free" software removes dependency on a software vendor". Suffice it to say, Stallman does not provide any evidence or reasoning as to why dependence on a "free" software vendor is better; he simply asserts it.


The second premise which Stallman voices frequently is that people can become dependent on software by mere introduction to it. He even employs an analogy with cigarettes:

Why, after all, do some proprietary software developers offer gratis copies of their nonfree programs to schools? Because they want to use the schools to implant dependence on their products, like tobacco companies distributing gratis cigarettes to school children. They will not give gratis copies to these students once they've graduated, nor to the companies that they go to work for. Once you're dependent, you're expected to pay, and future upgrades may be expensive.[134]

The analogy is invalid. Nicotine is a drug that is capable of producing actual chemical dependency in the brain, and is by itself a harmful substance that leads to an increased risk of disease. Software is nothing like that; it is a tool. And most of the time, mere introduction to a tool is not enough to implant any dependency.

Software (or any other technological product) typically becomes widely used thanks to its usefulness and competitiveness, not because someone taught it to children at school. It is absurd to imply that because a teacher decided to show students image editing using Microsoft Paint, once those students decide to do image editing, they will go for Microsoft Paint simply because this is what they were taught in school. Microsoft Paint objectively lacks features that are required for involved image editing. Therefore, Microsoft Paint is unlikely to be chosen, regardless of its familiarity or sentimental value.

In other words, software has a tendency to be chosen mostly due to its objective capabilities, not because of memories of previous experiences in school. This is especially true for business situations, where people are looking for results.

This is not to say that early exposure does not make a difference - it might. It is likely to be a decisive factor when there are several very capable alternatives, but not in a situation when most alternatives are simply not up to the task. If a tool dominates the market, for whatever reason, schools do not have a choice: teaching students tools that are irrelevant to the professional market means not doing a proper job as educators.

Stallman objects that schools must opt for ethics first and treat functionality as a secondary concern, but this again rests on a premise that proprietary software is somehow evil, a premise we believe he does not make a good case for.

Even if teaching an inferior but capable enough product results in a spike in its usage, there is ample evidence that this effect is short-term. A good example of this is the situation with Internet Explorer 6. It was documented to be used mostly at workplaces where default Windows tools were being enforced, but at home people were more likely to switch to an alternative.[144] Apparently, exposure to software during long work hours was not capable of making people dependent on Internet Explorer in other settings.

Another problem with Stallman's argument is his use of equivocation. Dependency on a software product is fundamentally different from drug dependency. By putting "drug dependency" and "dependency on proprietary software" side by side, he creates the impression that the latter situation is analogous to the former. But being dependent on a tool, in the sense that it is the only tool capable of producing the desired result, is a normal part of reality, and is usually neither problematic nor a permanent state of things. Stallman alludes to such dependencies being potentially lifelong, but in the quickly moving software world this is actually unlikely. Internet Explorer is once more a good example. Despite an aggressive push by Microsoft, the browser was not able to secure a dominant market position after more capable browsers entered the scene.


As for Stallman's suggestion that with "free" software schools will be able to employ "any developer to accomplish the task, independently from the original vendor", it looks good only on paper. In practice, this puts a school in the position of a software maintainer, a job which is incredibly time-consuming, logistically complicated and requires significant expertise. It is precisely for that reason that software development exists as a business. It can perhaps be argued that "free" software adoption will create a more competitive environment, where companies compete to provide services to schools, but this argument is difficult to pose without backing it up with laborious research, as there are very many variables to consider, not the least of which are the price, quality and scalability of such services. Such an argument requires a fair amount of evidence, not just assertion. I look at this argument in greater detail in Chapter 3.


v. "free" software costs no money

It is true that "free" software costs no money to obtain. It is not necessarily true that maintenance costs no money. As seen from the previous argument, Stallman suggests that schools become software maintainers and hire people to develop software for them. The cost of setting up such an enterprise is not nil, and can in the end be higher than what schools would ever pay to professional vendors.


vi. "free" software is of high quality

Stallman writes:

Stable, secure and easily installed Free Software solutions are available for education already. In any case, excellence of performance is a secondary benefit; the ultimate goal is freedom for computer users.[135]

Stallman elects to talk about proprietary and "free" software in very general terms. He frequently makes claims about all of proprietary software, or about all of "free" software, routinely failing to choose a more nuanced wording or to provide any evidence for his claims.

In this case he claims that "free" software in general is stable, secure and easy to install. However, as noted elsewhere, "free" software is actually a very diverse body of software, whose constituent programs are related to each other only by the type of license their developers choose. Unless Stallman is able to provide data to back up his claim that "free" software tends to exhibit the characteristics he ascribes to it, his claim must be labeled as unproven. Furthermore, it can be argued to be implausible too, as there is no mechanism by which releasing a program under a "free" license automatically makes it more stable, secure and easy to install.

"Excellence of performance" is also not limited to stability, security and ease of installation. Software programs are complex products that require a much more detailed characterization. A program can be stable, secure and easy to install, but if it can barely cover basic functionality, it's quality as a tool can be described as low. Therefore, I find Stallman's list of quality parameters to be insufficient. Even if his claim would be demonstrably true, which it isn't, it would be an utterly incomplete set of parameters upon which one can judge the quality of software.

Therefore, I cannot accept this as a proven benefit. Similar assertions about the quality of "free" software will be discussed further in Chapter 3.


2.3.3 Conclusion

In several of his essays Stallman passionately tries to make the case for proprietary software being a detriment to a cooperative society. I have carefully inspected his arguments and have found them to be unconvincing, with little attention to detail, and frequently employing rhetorical devices, such as equivocation and redefinition of words, to advance the "free" software agenda. Most of his arguments work only if the statement that proprietary software is unjust is taken as a premise. Quite a few of his contentions are mere assertions. I have mostly disagreed even with his better arguments and have presented my objections.



2.4 Chapter conclusion

At the beginning of this chapter I summarized Stallman's arguments for proprietary software being unjust in this manner:

  1. if the user cannot change the source code, then he is in theory open to abuse from the author of the program
  2. abuse from proprietary software developers is standard practice
  3. proprietary software keeps people divided and reduces cooperation

I have meticulously reviewed each of those points and have even enhanced a number of Stallman's arguments. Nevertheless, I have concluded that Stallman generally failed to make his case.

His contention that proprietary software exhibits inherent injustice has not been demonstrated to my satisfaction, and I have found Stallman's arguments in this regard to be superficial and unconvincing.

The examples of "proprietary abuse" are poorly researched, frequently presented with demonstrable and often consequential inaccuracies, and do not lend support to the position that abuse from proprietary software developers is standard practice. Stallman also falls into the trap of circular reasoning on more than one occasion, taking what he is trying to prove as a premise of the argument.

The real issues that he brings up are described solely in terms of Stallman's proprietary vs. "free" framework, even though some of them are exceedingly complex, cannot be contained or even accurately described by such a framework, and can often be argued to have very little pertinence to it at all.

Arguments designed to show that proprietary software keeps people divided and reduces cooperation mostly rest on unproven and at times implausible premises, commit logical fallacies, lack nuance and detail, and in their present form raise very significant objections.

I, therefore, must conclude that Stallman's writings do not demonstrate proprietary software to be unjust.



3. "Free" software as a solution

When arguing for a solution, one must essentially do two things: first, propose a solution, that is, specific measures intended to fix the problem; second, establish that these measures fix the problem reliably and effectively.

I have been largely unconvinced by Stallman's contention that proprietary software is inherently unjust and is, thus, a problem that needs to be fixed. Nevertheless, it is possible to analyze his arguments in defense of "free" software as a solution by granting him the assumption that proprietary software is a problem.

Let us now analyze in greater detail Stallman's claim that "free" software is the only solution to the problems he outlines.



3.1 "Free" software as panacea

Stallman brings up many problems with proprietary software, problems that span across vast topics such as political expression, consumer rights, privacy, security and a range of others. Even if we concede that all of these problems are real, the overall difficulty with Stallman's philosophy is that it basically tries to offer a single solution to all of these complex issues.

There is good reason to doubt the efficacy of any umbrella proposal. The topics at hand are so intricate and complicated that it is unrealistic to expect a single solution to exist for all of them. The only way to argue for a panacea is to treat the initial problems as simplistic, black-and-white issues. This is exactly the approach employed by Stallman throughout his writings. He focuses solely on the alleged conflict between proprietary and "free" software, refusing to take into account any other issues involved.

When, however, we elect for a nuanced approach, it quickly becomes clear that even in relatively simple cases no single solution is possible, as situations in the real world tend to be complicated and multifaceted.

One could regard "free" software not as a proposal per se, but rather as a choice of direction. This, however, is certainly not the strategy Stallman employs. He clearly advocates a very particular solution, with very little latitude for compromise.

As will be demonstrated in the following sections, his stringent position is easily weakened by the introduction of even a very elementary degree of rigor, precisely because a panacea is an intellectual fantasy.



3.2 "Free" software removes dependency on a software vendor

In Chapter 2 I argued that, in a practical sense, the dependence of non-professionals on professionals is not a situation that is generally considered problematic. To put it another way, the division of labor is largely considered a trade-off. We economize on the limited resource of time by outsourcing an activity to someone else, thereby surrendering part of our control over the end product. Additionally, the division of labor allows one to specialize in a craft and thus become more proficient at it.

That said, the dependence of consumers on producers is not an issue that is ignored. A large body of law and activism deals with consumer protection.[146] Existing solutions to these issues are not a complete removal of any possibility of misconduct on the part of producers, but rather a minimization of harm and of the possibility of harm.

The power of Stallman's narrative is that a "free" software approach seems to remove consumer rights violations almost completely. Definitely to a degree that is unparalleled in most other cases.

What Stallman fails to mention or understand is that this approach comes at a very high cost. Let us outline this cost.


It is safe to assume that the vast majority of people in the world are not software developers. Instead, they are software operators, software users. Even if a person is a software developer, she typically develops only a few products, while being an operator of the majority of software she uses every day. If one is developing browsers, she typically does not develop operating systems, video editors, databases, games, etc. Indeed, the increasing complexity of software guarantees that today's engineers are highly specialized.

Control over a software product requires a significant degree of expertise. While the idea of having complete control over one's computing might have seemed quite realistic in the 1970s, when whole operating systems had mere thousands of lines of code, in today's world a web browser we use every day has millions of lines of code, spanning several programming languages, protocols and database technologies, with complicated, multilayered functionality. To add more data points, Microsoft Office 2013 has over 40 million lines of code. Facebook has over 62 million lines of code. The software in an average modern high-end car runs to around 100 million.[147]

Having control over these everyday products effectively means owning a well-functioning software development company capable of maintenance and development across dozens of products and technologies. Any suggestion that a person can open the source code of a "free" software package, say LibreOffice Writer, and quickly fix something or add a feature is a manifestation of extreme naivete and ignorance on the subject.

But then what does Stallman mean?

As noted in Chapter 1, Stallman focuses his writings on the alleged problems with proprietary software and the assertion that "free" software is the only solution to all of these problems. He says very little of substance about why "free" software is the solution and in what way it is the solution.

Even when he discusses the history and goals of the GNU Project,[149] he says nothing about how to reach those goals. The only thing he mentions is work done by the Free Software Foundation, which recruits several people to write "free" software. He also talks about "High Priority Projects", a list of programs that need to be developed.

The tacit assumption, underlying Stallman's writings, is that "free" software is to be developed and maintained by a community of developers. This community is left undefined. Purportedly, these are all the people who for one reason or another would choose to dedicate their time and efforts to writing and maintaining "free" software.

For a user the dependence on the software vendor is thus converted into a dependence on this loosely defined community.

Stallman believes that the latter state of affairs is just, and the former is unjust. The fact that he does not substantiate this claim severely wounds his philosophical stance. By failing to recognize the complexity of modern day software and that most users are not developers, he ends up with a very sketchy claim. All we can do at this point is to try to guess his general line of reasoning.


It is possible that Stallman believes a dependence on the community to be more ethical because it gives more possibilities to the users - to either maintain the software themselves, fork it or ask others for help. This can be inferred from passages, analyzed in section 2.2.4, where Stallman speaks about users fixing bugs. In another place he says: "The wrong here is Microsoft does this after having made the users dependent on Microsoft, because they are not free to ask anyone else to work on the program for them." [129]

It needs to be noted that software can be divided into two main segments: software written for private purposes, like a company internal tool, and software written for the general public. It is the proprietary nature of the latter that Stallman is concerned with, as software for private use is typically owned by its users, who have a complete set of rights to it. We will thus focus on software written for the general public.

Ideas on how a community of developers can help users with software should also be divided into two main scenarios: lone developers being hired to develop additional features and fix bugs, or reliance on a centralized development community running the project.


3.2.1 Lone developer model

The first scenario refers to the frequently voiced idea that with "free" software all one has to do is call up a "developer friend" and ask her to help, even if for a fee. Unfortunately, this is mostly a fantasy. The practicality of this approach in real life is negligible, and this is not difficult to show.

The easiest thing to point out is that a "developer friend" would have only very limited expertise. As noted above, modern software is very specialized and complex. Therefore, mere probability suggests that in most cases this "developer friend" would not be qualified to help. Finding skilled enough volunteers is difficult even among people who are experts in a given area: the job could be uninteresting to them, they may have other priorities, or they might ask too high a price.

And as a lot of the software one tends to use every day is quite complex, even if a developer agrees to help, a lot of time and effort will be spent understanding the architecture of the program, the intricacies of its implementation, etc. Adding a small feature, unless the person has first-hand experience with the code of the program in question, is going to be a time-consuming project, likely costly for the customer.

But this is not even the most consequential problem. Since larger and more useful software is typically developed in a centralized manner, with a main development branch, asking someone to write an additional feature would effectively fork the program and make it incompatible with the main package. Patches written by contractors would have to be reapplied to each new update from the main developer team, making this a very costly and unreliable procedure.

A solution to this is to submit all patches to the main branch, but there is no guarantee that these patches would be accepted quickly, or accepted at all. Additionally, if everyone began submitting countless patches because of the practice of hiring lone developers, this would quickly turn into a logistical nightmare, forcing the core development team to introduce a process that would filter the stream of submissions. It is easy to imagine that going through such a process would not be trivial. The Linux kernel has a very involved process for getting patches into the system. Think of a local school, a company or even one user hiring a friend to add something to the kernel - and then trying to have kernel developers add this code to the main branch of the kernel. The amount of effort is difficult to overestimate.[150]
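To make the maintenance cost concrete, here is a minimal sketch, in Python, of the procedure an organization would have to repeat for every upstream release just to keep a single commissioned patch alive outside the main branch. It assumes the project is hosted in a git repository; the repository URL, branch names and patch file are hypothetical placeholders, not taken from any real project.

    # Hypothetical illustration of maintaining one out-of-tree patch against an
    # upstream project hosted in git. All names below are made-up placeholders.
    import subprocess

    UPSTREAM = "https://example.org/editor.git"              # assumed upstream repository
    WORKDIR = "editor"                                       # local clone directory
    LOCAL_PATCH = "0001-add-school-specific-feature.patch"   # patch written by a hired developer

    def git(*args, cwd=WORKDIR):
        """Run a git command, raising an error if it fails."""
        subprocess.run(("git",) + args, cwd=cwd, check=True)

    # One-time setup: clone upstream and apply the commissioned patch on a local branch.
    subprocess.run(("git", "clone", UPSTREAM, WORKDIR), check=True)
    git("checkout", "-b", "local-changes")
    git("am", "../" + LOCAL_PATCH)

    # Everything below has to be repeated for EVERY upstream release.
    git("fetch", "origin")
    try:
        # Replay the local patch on top of the new upstream code.
        git("rebase", "origin/master")
    except subprocess.CalledProcessError:
        # If upstream changed the code the patch touches, the rebase stops with
        # conflicts that only a developer can resolve - the recurring cost
        # described in the text.
        print("Patch no longer applies cleanly; paid developer time is needed again.")

Even in this toy setup, the final step fails whenever upstream touches the same code, sending the organization back to a developer; multiply that by dozens of packages and regular upstream releases, and the logistical burden described above becomes apparent.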

It is also clear that quite a number of patches will never be accepted simply because they transform a program into something too specific to a particular use case and irrelevant to the product as it stands. Open source projects that accept almost anything that comes their way quickly turn into concoctions that are difficult to both maintain and use. Even in today's "free" software world there are enough examples where a design conflict among groups of developers is resolved by adding both features, making the software more complex. Whenever several options are added to the package just to indulge the users who want them, developers now have to maintain each one of those options and are likely to have more bugs to fix and more things to test.

Another version of a lone developer solution is to indeed just fork the project and maintain it oneself. A team of developers can be regarded as a "lone developer" in relation to the main branch. The obvious difficulty here is that this is just way too costly for someone to routinely afford. Taking into account the complexity of modern software, this casts such self-organized users in the role of software maintainers, who now have to hire developers, provide a development roadmap, etc.

This can be done as a business, and it is not inconceivable that companies would emerge providing exactly such services. Since anything they do falls into the commons, it is probable that these companies would be local and not very big. It might also make sense to make the local version of the software incompatible with all other versions, so as to defend the business and make sure the users cannot shop around for other versions. Incompatibility can also arise naturally, without anyone specifically aiming to introduce it: a local version of the software that diverges too much will become mostly incompatible with the main branch.

Excessive forking will create a fragmented and chaotic market landscape, with dozens upon dozens of versions of similar but incompatible packages, all developed in different directions, at different speeds and with different quality. The advantages of having one centralized package are so numerous that a "lone developer" scenario is very unlikely to be the method of choice in most cases.


3.2.2 Communal development model

Reliance on a community of developers, on the other hand, is the situation that exists today. Most "free" software projects, especially those that are bigger, will have a core development team that works on its functionality.

This is definitely a more viable approach that can result in stable, more cleanly designed products. However, because these are not businesses but groups of hobbyists, teams developing open source software at no cost have their own very specific problems. It is not easy to support all of these pronouncements with concrete references, as a lot of this comes from the general experience of being part of these communities, but most of these statements are easily verifiable.

First of all, big open source products are often effective monopolies. Because a product is so big and has been developed for such a long time, typically no team will opt to create a replacement. It can even be considered a poor move that undermines the work of the community and wastes time and effort. Also, once a project takes off, it creates a snowball effect and attracts many developers and users, thus drawing potential resources away from other projects. The need to market the project to the community requires very serious dedication from the initial team, and it may take many years until the project takes off.

As a result, "free" operating systems usually have only one main program in a given area. For instance, there is but a single serious raster graphics package - GIMP. If GIMP does not satisfy the user, there are virtually no alternatives that can boast comparable stability and initial feature set. If one sees several programs being developed for a similar use case, it is a good bet that none of them are reliably good. Such is the situation with video editors on Linux as of the time of writing.

At the same time, developers of the main package might be under very little pressure to make their product competitive, by virtue of there being no competition. This allows them to work at their own pace, prioritize new features over stability and over polishing existing functionality, spend time on experiments that remain incomplete for years, and often hold bizarre views about software development in general.

The whole relationship between "free" software developers and users is completely different. Users are not customers, therefore developers owe them nothing. Although in theory developers can be penalized for unproductive behavior by the program being forked, this rarely happens. The cost of finding a functional team is very high, and software being "free" does nothing to help this. In fact, since a "free" software license all but ensures that development remains a mostly non-commercial activity, finding capable developers for a big project that went wrong is not a trivial task. There are many examples where a very sophisticated, useful and powerful program is abandoned because no developer wants to pick it up, for one reason or another. There are also examples of projects being picked up by less skilled developers, who are able to perform small changes and minor bug fixes, but are not capable of really advancing the software.

There is no long-term guarantee that a "free" software program will continue to be developed even when a very active team is present. While this is less likely for very large projects, middle-sized projects are at a higher risk of the main developer(s) leaving and essentially bringing the project to a halt. Since developers are typically not paid for their work, there are no obligations or commitment beyond personal interest in the product. Sometimes a hobbyist developer will maintain interest in their product for years; sometimes they leave at a seemingly unexpected moment, never to return. And then the project enters a phase of being mostly abandoned. Less skilled developers who stay with the project now need to promote it and try to attract skilled developers.

One of the ways to grow the development team is to employ beginners for simple and possibly interesting tasks, with the goal of training them to write more serious features. This approach, while it creates commotion around the program, rarely succeeds. Designing a GUI skin might pique a person's interest in software development, but there is no sure path from working on simple features to becoming an expert, especially when the required expertise goes beyond programming. Someone working on an audio program might need years of digital sound processing experience to be able to contribute. Most of the time, skilled developers neither join the project nor grow out of the community around it. During this time the project might fall into obscurity, even if it is powerful and the only one of its kind. Sometimes distribution maintainers will put in additional effort to patch older programs so that they compile on newer systems.

So, the fact that "free" software projects are not dependent on a software vendor could be both a blessing and a curse. Yes, there is no vendor that owns and fully controls a "free" program, but then there is no one who is truly committed to it either.

Commercial software developers have a strong incentive to grow their user base. "Free" software developers normally have no such incentive. It is not uncommon for them to be moderately opposed to growing the number of active users, since more users only means more requests and more bug reports. It also forces the development team to re-organize and become involved with boring tasks: planning, building processes, fixing bugs and writing documentation.

Another very notable feature of "free" software is that it is rarely systematically tested. Testing is crowdsourced to the community of its users. This creates a curious state of affairs in which stability is rarely a focus of development. A lot of programs are released as they are: the project website will promote the program as providing a long list of features, but nobody will be overly concerned that it might be very far from production quality. Since the idea is that users need to test and file bugs, stability is not treated as the primary concern of developers. This leads to many packages giving an impeccable first impression, but being absolutely unusable in practice.

Since testing is outsourced to users, new versions of operating systems and programs frequently become stable only as they age. Linux distributions can be very stable and functional on older hardware, for which the community has already figured out most of the issues, but be unpredictable when installed on newer hardware. Interestingly enough, this does not depend solely on vendors putting out only proprietary drivers. There are enough situations where vendors provide specs for their hardware, but if that hardware is not widely used in a particular community, the drivers might never be written and its problems might never be addressed.[152]

In general, communal testing is of a different qualitative nature. Instead of the systematic, well-designed testing processes often employed by commercial enterprises, it is more along the lines of a stream of random bug reports. Such a methodology does not make it easier to debug programs.

It is possible that in a world with only "free" software, hardware vendors might do a better job of focusing on the available operating systems, as opposed to focusing on Windows as they do now, but this in itself is not a proprietary vs. "free" problem in the first place. Windows just happens to have a dominant user share, so hardware vendors spend resources developing drivers for the core audience. But even when the drivers are there, the challenges are rooted in the "free" nature of the operating system, which is developed on communal principles. Even with existing drivers, systems are often broken, since a rollout of new functionality is rarely tested on a wide range of hardware, something that commercial companies do routinely for their products.

Another recurring feature of "free" software is that it tends to look very outdated. This is not a coincidence. The availability of source files creates a temptation to take the existing codebase and fork it, rather than write a product from scratch that follows a different paradigm. This explains why a lot of older "free" software programs look more like the result of evolution than of planned product design. Linus Torvalds is known to have said exactly that about the Linux kernel.[151]

In fact, quite a number of "free" programs packaged today can be traced back to Windows programs of the 90s that obviously served as inspiration. A developer would attempt to create a "free" version of the package, perhaps with some variations, but with the same basic concept. As the proprietary software world marches on, "free" and open source programs tend to evolve rather than give rise to newer specimens. Hence the tendency to put new features into older, already existing software.

One of the things that the "free" ecosystem does very well is provide a world of shared libraries. This allows a developer to create software very quickly by basing it on existing libraries. However, "free" libraries are not free from the peculiarities of "free" software development described above; therefore, they are not guaranteed to be stable, well tested, functional, or new. Instead, many of them are old, written a long time ago and simply reused again and again.

To use the example of video editors again, a fair number of video editors on Linux use a library called MLT. This library is notoriously unstable, but for many years it has also been the best available library for video editing. A user installing seemingly different video editors will in reality be choosing among different UIs built on top of the same backend engine. And as the library is unstable, many video editors on Linux might look very nice and even be relatively modern in their appearance, but be equally prone to crashes. This situation is the direct result of access to "free" libraries.



3.3 "Free" software as a public good

Let us remind ourselves and list once more the freedoms Stallman proposes:

The freedom to run the program as you wish, for any purpose (freedom 0).
The freedom to study how the program works, and change it so it does your computing as you wish (freedom 1). Access to the source code is a precondition for this.
The freedom to redistribute copies so you can help your neighbor (freedom 2).
The freedom to distribute copies of your modified versions to others (freedom 3). By doing this you can give the whole community a chance to benefit from your changes. Access to the source code is a precondition for this.[3]

What do these freedoms accomplish?

What can immediately be said is that these proposals effectively transform all software into a "public good".

In economics, a public good is a good that is both non-excludable and non-rivalrous in that individuals cannot be effectively excluded from use and where use by one individual does not reduce availability to others.[140]

Proprietary software can be considered a "club good", which means a public good with an exclusion mechanism added. In the realm of informational goods, typical exclusion mechanisms are restricted access to the source, copyright and patent laws.[153]

Although a public good is defined as a good that is not only non-excludable but also non-rivalrous, the quick conclusion that such a good should therefore be made available to everyone at no cost is in most cases wrong. It is widely recognized that such public goods can be "subject to excessive use resulting in negative externalities affecting all users" and "are often closely related to the "free-rider" problem, in which people not paying for the good may continue to access it. Thus, the good may be under-produced, overused or degraded."[140]

Stallman tries to underline that commercial software is conceptually different from "free". In the real world, however, "free" software is very rarely a commercial enterprise. There are small businesses which use a combination of crowdsourcing and selling binaries, thus saving users the effort of compilation, but this works only for niche software, where people mostly pay to consciously support the product, not because there is no other way to obtain it. If the business becomes exposed to a larger audience, someone would just compile from source and provide binaries at no cost.

In other words, software being "free" implies it being available without charge. The only meaningful difference between "free software" and freeware is that "free" software guarantees access to source code at no cost as well. Any argument that "free" software is not about money is deceptive. It is precisely about money or, to be more accurate, about removing all mechanisms of exclusion; money is just a common example of such a mechanism. Access to source code through a copyleft license like the GPL guarantees that a given piece of software stays free of monetary cost forever.

By transforming all club goods into public goods, Stallman effectively excludes pure software development from the realm of business enterprises and forces bolder solutions to the "free-rider" problem, which typically take the form of government intervention.[154][155]


In fact, the free-rider problem becomes dominant in this context, and may shape the hypothetical world of "free" software quite fundamentally. By labeling any form of dependence on software vendors as unjust, and categorically rejecting any such dependence, we create a world where another form of injustice becomes dominant: a situation in which a limited group of people invests resources into an activity, the product of which is then available to everyone at no cost.

Therefore, people will find ways to prevent this injustice and restrict access to software even in the world of "free" software, while formally honoring the "free" license. And it can be argued that this will considerably stifle innovation and degrade public access to capable software.

How can that happen?

First, it is very possible that as soon as software becomes a public good, governments in many countries would become responsible for software development. In fact, it is unlikely that this would not happen.[154][155] Taxpayers' money would then be used to finance basic software needs. This would create a situation in which taxpayers of one country or even region effectively use their limited resources to produce a good that ends up in the commons. At least some countries might choose not to invest any resources in software development and instead wait for products whose development was paid for by the citizens of another country.

In order to prevent that while still honoring a "free" license, countries might choose to fragment standards and create procedures tied to a specific country, so that software made in the US is mostly irrelevant to people living in France or India, and vice versa. A software arms race might ensue that would increase the burden on taxpayers. At the same time this situation would stifle innovation, limiting government-funded development to a bare minimum and leaving additional non-basic programs to the private sector.

The private sector would be quite far from the rosy picture painted by "free" software advocates. Contrary to the narrative of a community of hobbyists building most of the software and making it available to anyone at no cost, reality is likely to be quite different. By the end of the 1970s hardware and software had become decoupled. With software becoming a public good, it is only logical that software would once again be coupled with hardware.

In fact, software as a separate product might disappear completely. Instead, we would have software that is tightly integrated into very particular hardware, effectively excluding the possibility of programs falling into the commons. Specialized professional products might carry prohibitive prices. The situation would degrade into the computing of the 1980s, when such applications were available only to professionals. In other words, any attempt to police commercial software into the realm of the commons would undoubtedly create a backfire effect, in which professional-grade software simply becomes unavailable to the general public. This might also create another programming arms race, with the community of amateurs trying to modify professional software to become compatible with cheaper devices (or to have no device dependence at all), so as to obtain software at no cost. This arms race would affect prices, as businesses would have to take into account the cost of defending access to their products. As a result, computers in general could become more expensive.

Of course, as we saw in the example of GPLv3, it is possible to come up with some newer license, say a GPLv4, that would somehow prohibit tying software to particular hardware. But in the end there will always be a way around such limitations, and one cannot keep creating endless versions of licenses to close every single loophole. Not to mention that convincing people that yet another practice is "unethical" would be difficult. Even GPLv3 is not universally accepted, and the Linux kernel remains firmly released under GPLv2.

It is difficult to estimate how this state of affairs would affect the hobbyist software community. Would it then become better organized? Or would it actually become even less effective at delivering replacements for commercial software, as expertise and technological focus shift to hardware-based products?

When governments adopt "free" software in today's realities, the outcomes are mixed. In 2016 a number of governments in Europe and the government of Brazil reverted to proprietary software, citing quality issues.[162][164]

Note: the claim about the Brazilian government opting for proprietary solutions is less clear, as a Brazilian paper seemed to imply in 2016 that this was not the case, and yet the editor's note supplied in the footnote asserts that it is indeed the case.[163]

The analysis above is just one way in which things might develop. Predicting the direction of such an enormous area of human activity is not trivial. However, it perhaps allows one thing to be stated with confidence: turning all software into a public good is unlikely to create a personal computing paradise.



3.4 The cost and types of sharing source code

Stallman's writings devote little time, if any, to discussing the specifics of sharing source code. Yet this is an important topic to keep in mind.

First, sharing source code does not take zero effort. Releasing source code is logistically harder than releasing a precompiled binary. Unless we are speaking about dumping code as it is, preparing source code in such a way that others may find it useful is time-consuming. It might involve writing altogether cleaner code and preparing documentation, such as comments, development notes and a list of necessary dependencies. Badly prepared source code, real-life examples of which are not hard to find, can be of little practical use: almost nobody is able to extend it without completely reworking it, and some may struggle even to compile it into a working program.

Second, the usefulness of source code varies. A library that is specifically designed to be used in other projects is generally more useful than a complete program, only parts of which might be of interest for other developers to reuse.



3.5 Extent of the benefits

Given all the costs and unique challenges that "free" software presents, to what extent does it really solve the issues Stallman finds problematic?

A number of notable cases are solved by this approach. If a developer abandons a project then, unlike a proprietary program, it remains theoretically available for anyone else to pick up. As noted earlier, this is generally unlikely to happen, but at least critical bug fixes can be applied and basic support for newer systems can be added.

A problem outlined in 2.3.1 and 2.3.2 (but not mentioned by Stallman) - that people in government establishments might be forced into a proprietary license, and that students might have to use software in class that they cannot afford to work with at home - is effectively solved by the introduction of "free" software. This solution is not strictly necessary, as licenses that allow distribution at no cost may also be utilized, but it is preferable, since it guarantees that the program will stay available in the future. It is also important to note that a "free" license is a necessary, but not a sufficient, condition. The program must also be of high enough quality to be useful. Choosing a program which is "free", but is not capable of getting work done, cannot be considered a solution, although in some cases a sub-standard program can serve an educational purpose.

But in most cases "free" software removes the alleged problems not by directly solving them, but by making the initial situation impossible.

For example, DRM, which is a mechanism of exclusion of non-paying users, becomes unnecessary if software development is no longer a business venture, and all programs become available to the public at no cost. The problem with DRM per se is not solved by this, rather DRM becomes irrelevant.

One can claim that "free" software does solve the issue of DRM directly, as the open source nature of the product allows anyone to re-compile the program without DRM. While this is certainly true, DRM is a very broad set of technologies, including centralized servers and hardware lockouts, which are not bypassed by "free" software. In all fairness, Stallman does not claim "free" software to be a solution for DRM; instead he demands that DRM be made illegal.[157]

What Stallman calls "sabotage" becomes conceptually meaningless because there are no customers and, thus, no customer expectations: "free" software developers owe nothing to their users. The phenomenon itself would still exist - it is quite thinkable in the "free" software world - it simply stops being an actionable issue, because users formally own the software to the same degree as the developers do.

Quite a number of issues are not really solved. Logic dictates that this concerns all problems marked in Chapter 2 as not contingent on software being proprietary. Interference (situations in which an upgrade or a download initiated by proprietary software has the potential to interfere with users' work on their computers) stays largely intact, since such situations have little to do with software being proprietary. Removing the word "proprietary" from the definition of "interference" does not invalidate it. I have also provided real-life examples of such interference in "free" software.

Staying secure would likely become a bigger challenge, as much of the security that many "free" desktop systems enjoy today is due to the low number of their users. This was outlined in greater detail in 2.2.3.

"Deception" is hardly solved by "free" software. In many cases it merely transfers the power to deceive from a software vendor to every single user, which admittedly makes it virtually impossible to track. For instance, Stallman speaks of a Volkswagen scandal as an example where proprietary software is a problem. But "free" software in people's car would make the situation potentially much worse, as every driver would then be capable of overriding limitations. Unless draconian systems of constant checks are in place, "free" software would ensure that trickery is rampant and is an everyday worry. One would be hard-pressed to not foresee that eventually "free" software will begin to be regulated and become centralized, evolving into a much less "free" version of itself in the process.

When it comes to what Stallman calls "Surveillance", what changes with "free" software is actually not that clear. There are "free" software packages that collect data anonymously, just like proprietary packages do. Having access to source code might not change the situation with data collection on the Internet either, since even today millions of people use open source and "free" browsers, and yet websites collect data all the same. At least in Europe, laws ensure that users are able to opt out of data collection, and user data is heavily regulated. Stallman's views on this are quite radical; it is quite likely that most people would not have a problem with at least some of the data collection. The need to provide one's verified identity for certain online purchases is rarely considered a problem and is often a necessary security precaution. In many cases this has little to do with software at all, but rather with the requirements of the law.

The issue behind Stallman's bizarre claim that subscription-based services are somehow wrong is also unlikely to be solved by "free" software. If anything, "free" software developers are more likely to come up with even more subscription-based services, as direct monetization becomes impossible. Stallman's main problem with subscriptions seems to be the fact that software might become unavailable if one does not pay for it, while he believes that everyone is entitled to software available at no cost - forever.

He also believes that online applications which substitute for computing tasks normally done on the desktop are ethically wrong.[156] This, however, is a gray area in terms of software freedoms, and it is not clear how it would play out in the hypothetical world of "free" software. Suffice it to say, Stallman's contention that subscription-based and online services are wrong is far too questionable and unjustified, and it is entirely possible that it would not be universally accepted even in our hypothetical scenario.



3.6 Efficiency

At the beginning of this chapter I stated that a necessary ingredient in the argument for a particular solution is its effectiveness.

This might include not only comparisons to competing approaches, but also an analysis of the solution itself - its applicability, side effects and relation to the larger picture. To supply an example, if someone visits a doctor and complains of knee pain, one way to resolve the issue is to amputate the leg. In fact, if we narrow our focus to knee pain alone and ignore all other issues, such a solution seems preferable: unlike a painkiller or a surgical operation, it is capable of removing knee pain - and even the possibility of knee pain - permanently. And yet no sane person would opt for a leg amputation as a solution to knee pain. The consequences of losing a limb are far too negative, and the benefit of never again having knee pain does not outweigh them.

Stallman's solution definitely seems to remove a lot of the problems he outlines. But the costs are high: the eradication of software development as a business, a potentially increased burden on taxpayers, a significant slowdown in technological innovation, and state-of-the-art computing made less accessible to the general public.

This happens because in his analysis Stallman focuses on one issue and one issue alone - the proprietary nature of software. By choosing to focus only on this feature and to consider it the primary problem in every situation, Stallman makes his solution blind to nuances and neighboring issues. Such a methodology is rarely useful, especially in multifaceted areas of society and politics.

Stallman also routinely fails to compare his solution to other solutions. In fact, he never even acknowledges that other solutions exist, although news items he cites in his examples of "proprietary abuse" again and again speak about measures taken to fix these issues.



3.7 Chapter conclusion

The point of this chapter is not to show that the "free" software solution is completely wrong. It is to show that the consequences of this approach are very complicated, and require much thought and research.

Stallman does not offer a solution that obviously and unequivocally neutralizes all of the problems he sees with the proprietary world. He offers a solution that neutralizes the alleged problems only very superficially, while making dramatic changes to the current state of affairs, with very complicated consequences, and with the introduction of new non-trivial problems. That Stallman's writings do virtually nothing to explore these issues demonstrates the superficial nature of his philosophy.

It is vital to understand that Stallman's solution does not produce the world of software we have today, only with full access to source code at no cost - a utopia that many people probably envision when talking about "free" software. Instead, Stallman's solution transforms the world of personal computing as we know it, with its benefits and challenges, into an entirely different world with its own set of benefits and challenges. Understanding which world is better is not trivial.

His essays are narrowly focused on proprietary software, ignoring all other issues, values and goals. Stallman does not assess the cost and consequences of his proposed "free" software approach, and never acknowledges less radical, more nuanced solutions. At the very least, this makes his argument incomplete and, as a result, unconvincing.



4. Overall conclusion and proposals for a stronger case for "free" software

Fundamentally, Stallman is making two claims:

  1. Proprietary software is unjust
  2. The solution to this is to adopt certain freedoms as essential

The logic of the case is sequential: adopting a solution is only reasonable if the problem being discussed is real. I have, nevertheless, analyzed Stallman's second claim independently, as if the first claim had been made successfully.

I believe that my analysis has shown both claims to be unconvincing and lacking in depth, as well as in supporting evidence. The case for proprietary software being unjust is the weaker of the two, but then the statement itself is more consequential and requires stronger evidence than the proposal of the solution does. The latter is poorly presented, insufficiently thought out and takes the form of a panacea that somehow miraculously cures all injustices.

Stallman's writings take the form of ideological storytelling, with loaded language, biased assessment of evidence and philosophical shallowness. Complex, multi-faceted issues are given simplistic black-and-white treatment, mixed in between calls for bush-league political action.

I will dedicate this chapter to suggesting how to make the case for "free" software stronger, and to pointing out areas that might benefit from additional scientific research.



4.1 "Free" software as a moderate philosophy

The primary weakness of Stallman's argument is its radical nature. His unyielding idealism makes his position difficult to argue for while staying true to the facts and nearly impossible to implement in real life; it borders dangerously on conspiratorial thinking and essentially constructs an "us and them" narrative, which is rarely a basis for constructive political action.

"Free" philosophy as a choice, however, is a case that is much stronger, more consistent and ultimately easier to defend.

There is an inherent difference between saying "I personally want complete control of my computing" and "everyone should have complete control of their computing"; an inherent difference between saying "we want to build an ecosystem for those who don't want to use proprietary software" and "proprietary software is unethical and should not be used by anyone".

Had Stallman argued the former, this treatise would not exist. But this is not what he argues. He instead tries to build a universal case for proprietary software being unethical, with "free" software as the only ethical alternative. And this is a claim that is incredibly hard to defend, primarily because the existing evidence, including evidence Stallman himself presents, does not seem to support his idea. In other words, this radical claim appears to be false.

This is not to say that a more moderate philosophy is automatically consistent and correct. A lot of what has been said of Stallman's original writings might apply to the more moderate version as well, like the impossibility of truly controlling one's computing in our age of technological diversity. But it is definitely a much more workable line of reasoning, and may result in actionable political proposals, as opposed to Stallman's quixotic projects to completely abolish proprietary software, DRM, data collection of any sort, electronic identification, content filtering, etc.

A moderate philosophy might be based on the idea that it is prudent to build an ecosystem of publicly available software that offers no restrictions on modification and redistribution and that is defended by a copyleft license. Such a cause would not at all be antithetical to the existence of proprietary software, and would be focused on the positive case for "free" software as an asset to society, rather than on the negative case of proprietary software being a problem.

The benefit of this approach is that it removes many of the difficulties of relying solely on community-based software development, and instead makes it possible to reap the benefits of freely available binaries and source code whenever possible and reasonable.

It is also clear that a moderate philosophy of this kind would not in any way undermine the "free" software efforts that exist today. It is quite safe to assume that most people passionate about "free" software are either unfamiliar with Stallman's writings, or their familiarity is cursory and not that consequential. Nothing that is done today in the name of "free" software depends on it being as radical as Stallman presents it.

The author of this treatise is a strong supporter of "free" software, believes it to be an exceptionally good and even necessary idea, and yet, as is clear from this treatise, does not share Stallman's stringent views.



4.2 Proposals for realistic political action

Equipped with a positive case for "free" software, devoid of revolutionary demands, a group of activists might be better able to advance their cause and actually solve some of the genuine problems that exist in the world of software. The list of suggestions is by no means exhaustive.


4.2.1 Open standards and open formats

While it would be nearly impossible to argue for all companies to have to release source code, it is much more realistic and perhaps even more beneficial to argue for open standards and open formats.

The idea of open standards is already entrenched in the modern conversation about technology. Although attempts to insert proprietary standards into common protocols are still frequent, generally openness and net neutrality seem to be winning in the long run.

The idea of open formats, however, becomes more complicated once one is confronted with the need to prepare a concrete recommendation, and there is definitely much work to be done in understanding what a push for open formats should look like. The difficulty here is that new formats are frequently an integral part of original software development. The cost of providing an open format is not nil, and, as software tends to constantly change and grow, the process of keeping the publicly available specification in sync is bound to be laborious. Sometimes a proprietary format is used as an exclusion mechanism and allows the company to make money. At what point should the public get access to a proprietary format, under which conditions, which cases must be included and which excluded - none of these questions is trivial.

Nevertheless, such an approach could prove much more realistic than arguing for the availability of complete source code of programs, and much easier to justify.


4.2.2 "Free" software in the public sector

A lot of modern "free" software activism quite successfully works towards saving taxpayers' money by introducing "free" software in publicly funded institutions.[158][159][160][161] These successes are not guaranteed to be permanent: when results have to be delivered, software quality proves to be far more important than the theoretical opportunity to modify and share source code. In recent years a number of governments that had supported "free" software have reverted to proprietary solutions, citing poor quality and a lack of interest from local developers in advancing the "free" programs chosen by the public sector.[162]

However, it is clear that this direction is promising, realistic and capable of adding real value.


4.2.3 Security and privacy standards

Industries such as the food and medical industries are heavily regulated, because the consequences of mishaps can be disastrous. With our society highly dependent on software, there is good reason to demand a serious approach to software security. Many options might be explored here, from common software security standards to making public and private institutions that depend on software in critical areas responsible and accountable for the IT security of their systems.

The same applies to privacy. Organizations like the EFF (Electronic Frontier Foundation) are fighting for civil liberties in the digital world, and quite a lot of their agenda addresses Stallman's concerns about surveillance. However, the EFF's work is a good example of a rational approach, based on upholding civil liberties and constitutional rights.[165] Compare that to Stallman's demonstrative rejection of credit cards because one needs to identify oneself in order to use them.

The current state of privacy laws is not at all unsatisfactory. Every year companies are given less and less leeway, and the ways in which they may use user data have shrunk significantly since the dawn of the Internet. This space does have to be continuously monitored and kept in check, as technologies evolve and our ability to crunch large amounts of data grows.



4.3 Necessary scientific research

Activism that is not based on facts and scientifically verified data quickly turns into a form of ideological warfare. No matter how noble the cause, an inability to interpret the situation correctly will cause activists to systematically overreact to or underestimate unfolding events. As a result they become marginalized and ineffective, and may even cause long-term harm if the change they try to advance is actually important but now becomes associated with hysterical and unreasonable views.

Rational activism, however, can do a great deal of service to society, alert governments and experts to potential dangers, and defend human rights in the wake of technologies that dramatically reshape our world. Activists well grounded in reality have a better chance of being heard and of implementing measures that produce real change.

A number of studies could be initiated to understand the scale of the problems, as well as the efficiency of "free" software as a solution. Even research based solely on publicly available data would be of great value.

A statistics-based approach to customer rights violations could show whether the scale of the problem in the proprietary software world is indeed unprecedented. While existing data already makes this hypothesis quite implausible, it is possible that in some areas malpractice is indeed more rampant than in others. Such data would help focus activists' efforts to argue for more checks and balances in those particular areas.

Longevity of "free" software teams, the amount of "free" projects that are released and then become widely re-used, the rate of innovation in the "free" software community, variability of software, speed of bug fixing, satisfaction of users, actual maintenance cost, comparison of these attributes to various models of proprietary software development, including smaller projects - all these topics could prove to be an interesting and fruitful subject of scientific research. Data, gathered as a result, can ultimately help "free" software communities run better and achieve more.




Sources:

0. Free software movement

1. Philosophy of the GNU Project

2. The GNU Project

3. What is free software?

4. Proprietary Software Is Often Malware

5. Free Software Is Even More Important Now

6. 351,000 People Die of Food Poisoning Globally Every Year

7. Eating in Restaurants: A Risk Factor for Foodborne Disease?

8. Remotely Eavesdropping on Cell Phone Microphones

9. This goes no further...

10. The second operating system hiding in every mobile phone

11. Recently Bought a Windows Computer? Microsoft Probably Has Your Encryption Key

12. Microsoft may have your encryption key; here’s how to take it back

13. "We Own You" - Confessions of an Anonymous Free to Play Producer

14. Google Privacy Policy

15. Arris password of the day generator

16. Apple's Jobs confirms iPhone 'kill switch'

17. Microsoft: We can remotely delete Windows 8 apps

18. Information privacy law

19. Federal Data Protection Act

20. CNiL

21. Federal Council's Message to Parliament

22. Data protection reform

23. Privacy concerns and ethics of data mining

24. Coolpad at Wikipedia

25. FBI–Apple encryption dispute

26. How to manage Windows 10 notification and upgrade options

27. The JavaScript Trap

28. Double standard: Why Apple can force upgrades but Microsoft can't

29. Apple says game about Palestinian child isn’t a game

30. GNU Software

31. Free GNU/Linux distributions

32. iFixit App Pulled from Apple's Store

33. Apple, your anti-choice tendencies are showing in your app store

34. Apple has Approved the Hinder App

35. Nintendo's New 3DS Charges 30 Cents to Remove an Internet Browser Filter

36. Dell, Comcast, Intel & Who Knows Who Else Are Out to Get You

37. Setting the Record Straight on Moplus SDK and the Wormhole Vulnerability

38. Wipeout: When Your Company Kills Your iPhone

39. Wikipedia: Measuring desktop adoption

40. Risks in using open source software

41. Is open source software insecure? An introduction to the issues

42. Is open source security a myth?

43. Open-source software security

44. Security of open-source software again being scrutinized

45. The Heartbleed Bug

46. 44 U.S. Code 3542 - Definitions

47. Wikipedia: Definitions of Informational security

48. WhatsApp completes end-to-end encryption rollout

49. Secure Messaging Scorecard

50. Apple Stole My Music. No, Seriously.

51. No, Apple Music is not deleting tracks off your hard drive — unless you tell it to

52. Uninstall QuickTime for Windows: Apple will not patch its security bugs

53. Apple and James' Excellent Adventure

54. Apple Sent Two Men to My House. No, They Weren’t Assassins.

55. iTunes may play a role in reports that Apple Music was replacing user libraries with DRM-encumbered files.

56. More than half of all IE users face patch axe in 10 months

57. Windows XP: End of an Era, End of an Error

58. Linux adoption

59. Mac Virus & Malware Threats

60. Linux Malware On The Rise

61. Bad arguments for software freedom

62. Don't believe these four myths about Linux security

63. Competition law

64. Philips Hue will no longer block third-party light bulbs (Update)

65. Adobe Goes All-In With Subscription-Based Creative Cloud, Will Still Sell CS6 For Now But Will Stop Developing It

66. "Error 53" fury mounts as Apple software update threatens to kill your iPhone 6

67. Apple Apologizes And Updates iOS To Restore iPhones Disabled By Error 53

68. Apple Facing Class Action Lawsuit Over "Error 53" iPhone 6 Bricking

69. Apple's Operating Systems Are Malware

70. The good, the bad and the ugly of vendor lock in

71. Repairing Your iPhone Home Button From An Unofficial Repair Shop Can Brick Your Phone

72. Apple Details Touch ID And The A7’s Secure Enclave In Updated iOS Security Document

73. Judge Rules in Apple's Favor, Dismisses "Error 53" Lawsuit

74. Update: Apple plays hardball: Upgrade 'bricks' unlocked iPhones

75. Oracle will continue to bundle 'crapware' with Java

76. LG Will Take The 'Smart' Out Of Your Smart TV If You Don't Agree To Share Your Viewing And Search Data With Third Parties

77. INDICATIVE AND NON-EXHAUSTIVE LIST OF TERMS WHICH MAY BE REGARDED AS UNFAIR

78. Facing Backlash And A UK Govt Inquiry, LG Now Claims To Be 'Looking Into' Its Smart TVs' Data-Slurping Habits

79. Not in front of the telly: Warning over 'listening' TV

80. NSA gets early access to zero-day data from Microsoft, others

81. How Badlock Was Discovered and Fixed

82. Twice in the past six months AWS has had to reboot some of its cloud servers because of a Xen vulnerability

83. Stuxnet

84. LTS Enablement Stack

85. Microsoft sued for $10,000 after unwanted Windows 10 upgrade

86. Nintendo Updates Take Wii U Hostage Until You "Agree" to New Legal Terms

87. Why Open Source misses the point of Free Software

88. When Free Software Isn't (Practically) Superior

89. Non-free software can mean unexpected surprises

90. Creative Cloud activation and sign-in troubleshooting

91. Internet connectivity, offline grace period, and reminders

92. ios 9 automatic download making me furious

93. How do you stop an iPhone from automatically downloading an iOS update when on wifi and charging?

94. Yahoo Signs Deal With Oracle To Attract New Users Via Java Installs

95. New Search Strategy for Firefox: Promoting Choice & Innovation

96. Mozilla Public License (MPL) version 2.0

97. Three times as bad as malware: Google shines light on pay-per-install

98. Investigating Commercial Pay-Per-Install and the Distribution of Unwanted Software

99. New research: Zeroing in on deceptive software installations

100. Pokémon Go: Restaurants and bars cash in on Pokéstop locations

101. Proprietary Surveillance

102. Music Downloads and the Flip Side of Digital Rights Management

103. Digital rights management

104. Defective by Design

105. Intel Processors to Become OS Locked

106. Sony agrees to pay millions to gamers to settle PS3 Linux debacle

107. Snapchat agrees to settle FTC charges that it deceived users

108. Men make up their minds about books faster than women, study finds

109. Microsoft’s new small print – how your personal data is (ab)used

110. Issue with 14.04.5 new hardware enablement stack on intel hd graphics

111. Hardware Enablement Stack update crashed computer

112. Can't install nvidia-current after installing hardware enablement stack

113. Black screen at start of 14.04 after update hardware enablement

114. Yesterday's 12.04 "Hardware Update" Destroyed My System

115. Boot problem after enablement stack update

116. Cannot boot after updating newest hardware-enablement stack

117. How to disable HWE messages (14.04)

118. ASIdentifierManager

119. Who Knows What About Me? A Survey of Behind the Scenes Personal Data Sharing to Third Parties by Mobile Apps

120. Free Android apps connect to thousands of tracking and ad URLs, research shows

121. Luigi VIGNERI

122. Taming the Android AppStore: Lightweight Characterization of Android Applications

123. OpenRTB 2.5

124. Permissions Reference - Facebook Login

125. How do I control my permissions when I sign up for an app or game?

126. First Response's Bluetooth pregnancy test is intriguing — and a privacy nightmare

127. CloudPets stuffed toys leak details of half a million users

128. Adobe will try anything to stop a Creative Cloud cancellation

129. Proprietary Sabotage

130. Proprietary Back Doors

131. Why Schools Should Exclusively Use Free Software

132. Freedom of speech

133. Free Software and Education

134. Why Schools Should Exclusively Use Free Software

135. Why Educational Institutions Should Use and Teach Free Software

136. What's the Best Way for a Programmer to Learn a New Language?

137. What are some of the best ways to learn programming?

138. An example of a Programming Methodology course at Stanford

139. Computer science

140. Public good

141. The 25 best alternatives to Photoshop

142. 10 Photoshop alternatives that offer powerful editing and photo management controls

143. The 10 Best Photoshop Alternatives You Need To Know

144. British businesses battle with bad browsers – IE6 usage up 1,200% during office hours

145. The Right to Read

146. Consumer protection

147. Million Lines of Code

148. Selling Free Software

149. A more detailed history of GNU

150. A guide to the Kernel Development Process

151. Re: Announce: Linux-next (Or Andrew's dream :-))

152. Linux Audio: E-MU 0404 USB

153. Club good

154. Explaining Collective Action

155. The Tragedy of the Commons

156. Who does that server really serve?

157. Proprietary DRM

158. Open Source Adoption

159. Use of Open Source Software by the Brazilian Government

160. Brazil: Free Software's Biggest and Best Friend

161. Adoption of free and open-source software by public institutions

162. Brazil Is Ditching Open Source For Microsoft

163. Em nota oficial, Planejamento nega que esteja abrindo mercado governamental para Microsoft (in Portuguese: "In an official note, the Planning Ministry denies that it is opening the government market to Microsoft")

164. Region to switch to proprietary cloud office alternative

165. About EFF