RFC: Adding a SECURITY.md document to the Binutils

Siddhesh Poyarekar siddhesh@gotplt.org
Thu Apr 13 17:29:13 GMT 2023


On 2023-04-13 13:05, Paul Koning wrote:
> 
> 
>> On Apr 13, 2023, at 1:00 PM, Siddhesh Poyarekar <siddhesh@gotplt.org> wrote:
>>
>> On 2023-04-13 12:49, Paul Koning wrote:
>>> If someone sends me an executable file, and I execute it and suffer a virus, shame on me.  If someone sends me a C source file and I compile and link that BUT DO NOT EXECUTE the resulting executable, and I suffer a virus, shame on the tool.
>>
>> If someone sends me a C source file and I compile and link it without inspecting it first, then definitely shame on me again.  Compilers and linkers assume *trusted* input.
> 
> That's news to me.
> 
> It is true that not all text is valid C, and some text has "undefined" behavior.  But "undefined" is a property of the resulting executable program, NOT of the act of compiling it.  I have never before seen anyone suggest that submitting a bad program to a compiler could reasonably be expected to result in that compiler attacking the security of your system, or that if it did so it wouldn't be a security bug in that compiler.

I haven't seen anyone seriously suggest (and have seen many balk at) 
the idea of crashes or buffer overruns in compilers being treated as 
security issues.  Only lately (i.e. in the last few years) has there 
been a spate of fuzzer-generated reports for which security researchers 
file CVEs to pad their "kill" count.  When we have the energy to do so, 
we dispute them and have them rejected, but many just go through, 
wasting days and weeks of time across the industry to respin builds and 
release updates.

The only generally accepted security implication for a compiler is when 
it takes in valid code and spews out a program that invokes undefined 
behaviour.

>>> I don't expect the act of compiling or linking or objdumping to compromise my system's security, any more than I expect the act of editing a text file to do so.  The key point is expectation.  I'm reminded of a legal rule seen, for example, in "expectation of privacy": I should assume I can be seen when walking around town, but it is valid for me to assume I'm not seen when at home in my bathroom.  Similarly, I should assume my system can get attacked when I execute a program, but it is reasonable for me to assume no attack is possible when I run gcc or objdump (or hexdump or cat).
>>
>> It's valid for you to assume that you're not seen when you're at home in your bathroom.  However, if you take a random device someone gives you with you in your bathroom without actually checking what it does...
>>
>> Anyway like I said to Richard, it's all well and good to say that binutils *should* be able to handle untrusted inputs.  The reality is that it is not in a position to make that claim and the only reasonable security position the project can take is to strongly recommend either validating inputs (to make them trusted) or running the tools in a sandbox.
> 
> So what you're saying is that, at least in your view, the quality of binutils is so low that it is unlikely to meet the reasonable expectation I voiced.  And furthermore, that it is in such bad shape that it's unreasonable to consider fixing it so it does meet those reasonable expectations.
> 
> I rather doubt either assumption is true.  But if it is, we should say so (or, arguably, discontinue the project).

This is not the first time we've had this conversation[1], and there is 
a lot more nuance to it than "so you think binutils sux and should 
die".  There are practical issues and design concerns that cannot be 
immediately addressed.  Instead of giving a false impression of 
security, it makes more sense to state clearly what the state of the 
project is and how users can safely consume it.
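As a minimal illustration of the "run the tools in a sandbox" 
recommendation (this is my own sketch, not a project-endorsed wrapper; 
the function name, rlimit values, and timeout are assumptions, and it 
relies on POSIX-only preexec_fn):

```python
# Sketch: confine a tool that parses untrusted input with modest CPU
# and address-space limits plus a wall-clock timeout, so a crash or
# runaway parse is contained rather than consuming the host.
import resource
import subprocess

def run_confined(argv, timeout=10):
    """Run argv with CPU and memory rlimits applied in the child."""
    def limit():
        # Cap CPU seconds and address space (1 GiB) before exec.
        resource.setrlimit(resource.RLIMIT_CPU, (timeout, timeout))
        resource.setrlimit(resource.RLIMIT_AS, (1 << 30, 1 << 30))
    return subprocess.run(argv, preexec_fn=limit, timeout=timeout,
                          capture_output=True)

# Illustrative use; something like ["objdump", "-d", "untrusted.o"]
# would go here in practice.
result = run_confined(["echo", "ok"])
print(result.stdout.decode().strip())
```

A real deployment would more likely use a namespace/seccomp sandbox 
(e.g. a container or bubblewrap) rather than rlimits alone; the point 
is only that confinement lives outside the tool, not inside it.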

Sid

[1] https://inbox.sourceware.org/binutils/6f99c92f-1986-b8f0-0854-868598421dda@gotplt.org/t/
