Look, this issue is extremely simple: the OS doesn't know whether access is permitted by the owner of the data, and therefore must either allow access knowing that doing so may be against the owner's wishes (unethical), or deny access to ensure that the owner's wishes have not been disregarded (ethical).
Unless you are able to show that doing something against the owner's wishes is ethical, your objection is an irrelevant distraction. Ethics is not about convenience; it's about doing what is right (even when it's inconvenient).
You're completely ignoring my point, which is that your idea can lock the owners themselves out of their own data, simply because they weren't using whatever your specific OS instance wants everywhere (which is guaranteed to happen the moment they deal with multiple systems at any point in their lives).
I'm not ignoring your point; I'm saying that your point is irrelevant (it has nothing to do with ethics), that the "problems" you point out are insignificant, that there are better ways to avoid these "non-problems", and that it's not an OS's responsibility to shield incompetent users of poorly designed OSs from the consequences they deserve.
Oh, and on top of that, I could just take your data to another system, modify it there carefully, and change any permissions without your OS ever suspecting anything (making sure any traces look correct, possibly generating fake logs if needed). So your method isn't even foolproof.
My method is foolproof. If you use one OS to bypass the security/permissions of another OS, but neither of those OSs is mine, then it's impossible to accuse me of providing software that behaves unethically, impossible to blame me or my OS for allowing security to be bypassed, and impossible to hold me responsible for any data that is taken without permission.
Your idea doesn't bring any real good whatsoever, and instead has the potential to cause a lot of harm, and not to strangers but to the very people who are meant to own those files in the first place (let alone anybody they want to share those files with). Please explain to me how that is any more ethical than all those arbitrary restrictions.
Even with a poorly designed OS on a system with cheap hardware and an incompetent administrator, the chance of the owner losing their data simply because one OS won't violate the security of another OS is almost zero. The only likely consequence is that the owner of the data (or someone working on their behalf) would need to use a "no permissions file system" like FAT when copying files to a different OS.
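To make the policy being argued for concrete, here's a minimal sketch (all names hypothetical, not from any real OS) of the access decision: the OS touches its own volumes and volumes whose file systems carry no permission metadata at all (like FAT or ISO9660), and refuses foreign volumes whose permissions it cannot verify against the owner's wishes.

```python
# Hypothetical access policy sketch. The file-system lists and the
# function name are illustrative assumptions, not a real OS's API.

# File systems that store no ownership/permission metadata at all.
NO_PERMISSION_FILESYSTEMS = {"fat12", "fat16", "fat32", "exfat", "iso9660"}

# File systems native to this OS, whose permissions it created and enforces.
NATIVE_FILESYSTEMS = {"myfs"}

def may_access(fs_type: str) -> bool:
    """True if accessing the volume cannot disregard anyone's permissions."""
    fs_type = fs_type.lower()
    if fs_type in NATIVE_FILESYSTEMS:
        return True   # our own permissions apply and are enforced
    if fs_type in NO_PERMISSION_FILESYSTEMS:
        return True   # there are no permissions to disregard
    return False      # another OS's permissions we cannot verify: deny

print(may_access("FAT32"))   # True  - no permissions to violate
print(may_access("ext4"))    # False - foreign permissions, deny
```

Under this sketch the only cost to the owner is the one described above: copying through a "no permissions" file system when moving data between OSs.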
If I can't trust your OS to uphold the security policy of another OS, how can I trust your OS to uphold its own security policy? Would you mind if I claimed that my OS is superior to yours because mine has a much stricter security policy (one that even includes upholding the security of OSs that don't exist yet)?
If there are 10 different OSs that are not your responsibility and each of those OSs has 2 different types of file system that are not your responsibility; how much time are you going to spend writing, testing and maintaining code to ensure that users that aren't your responsibility are able to use your OS to access files from all these file systems just in case of extremely unlikely problems that aren't your responsibility?
Would you mind if I claimed your OS "has only a limited commercially significant purpose or use, other than to circumvent technological measures" and had your OS banned in multiple countries (due to the WIPO Copyright Treaty, which "prohibits circumvention of technological measures for the protection of works")?
How would you restore data from a backup if you refuse to access any external file system?
Send the data to a trusted system that holds the back-up, and when you need it back you request the data from that system (which explicitly grants you the permissions you need). You aren't touching its file system directly at all; you're transferring files between two systems.
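The request/grant exchange described above can be sketched roughly as follows (a toy model, not any real backup protocol; the class and method names are made up for illustration): the backup host alone decides what to hand out, so the requesting OS never inspects or bypasses foreign permissions.

```python
# Toy sketch of a "request data back from a trusted backup host" exchange.
# Everything here is hypothetical; a real protocol would authenticate the
# requester cryptographically rather than trust a plain name.

class BackupHost:
    """Trusted system holding backups; it alone decides what to hand out."""
    def __init__(self):
        self._store = {}                  # path -> (owner, data)

    def store(self, owner: str, path: str, data: bytes) -> None:
        self._store[path] = (owner, data)

    def request(self, requester: str, path: str) -> bytes:
        owner, data = self._store[path]
        if requester != owner:
            raise PermissionError("backup host refuses: not the owner")
        return data   # explicit grant: permission travels with the reply

host = BackupHost()
host.store("alice", "/home/alice/notes.txt", b"draft")
print(host.request("alice", "/home/alice/notes.txt"))  # b'draft'
```

The point of the sketch is the shape of the interaction: the grant is an explicit act of the trusted host, not something the restoring OS infers from on-disk metadata.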
The only thing this does is push the problem somewhere else. How do you back up the "trusted system that holds the back-up"?
This does mean that removable media is probably out of the question, simply because it normally won't tell you whether you should be allowed to touch the files on it. But then ask yourself whether allowing that would be ethical.
The majority of removable media uses file systems like ISO9660 and FAT and isn't a problem.
What about external file systems that are intended for transferring data between systems (e.g. how would you copy files stored on a digital camera's CompactFlash card onto your computer)?
How can you be sure that the original owner of said data has given you permission to access that data?
Either the original owner willingly removed the permissions (by copying/storing the files on a file system that has no permissions), or an OS that is not mine stripped the permissions against the owner's wishes. Either way, my OS is not guilty of violating the owner's permissions.
Case in point: what if I take a camera (or its memory card) from somebody else without their permission and then load it into my computer? How is that any better than me trying to access data from another OS's partition on one of my own drives (where I presumably would own most or all of the data)?
Why aren't you able to see that the answer is obvious? If there are no permissions for an OS to disregard, the OS can't behave unethically by disregarding permissions. The reason why there are no permissions is irrelevant, because it's beyond the OS's control.
If someone has the correct key (even though no human was ever supposed to see or obtain it), then you can assume security has been compromised and should erase all the data as soon as possible, in an attempt to protect any confidentiality that hasn't already been lost.
How can you be sure I wasn't the source of the key (e.g. by entering a password that feeds the encryption algorithm, or by generating a key file instead of a password)? The original key must have come from somewhere, after all.
If the OS generates the key (like it does in almost all modern disk encryption schemes) then it can be sure that you were not the source of the key.
The assumption you're making is that once the data is encrypted it should never be decryptable again, since whoever created the key can't be a valid source of it. That completely defeats the point of encryption; you may as well just delete the data directly instead of encrypting it.
No; the assumption that I am making is that the OS is secure (and therefore securely generates the key itself, securely stores the key, and securely uses the key for both encryption and decryption).
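That claim can be sketched as follows. Note this is only an illustration of the *structure* (the OS generates and holds the key; it never leaves the object), not real disk encryption: the HMAC-SHA256 keystream below is a stand-in for a proper cipher such as AES-XTS, and all names are invented.

```python
# Sketch: the OS generates the disk key itself, so no human can have been
# its source. The keystream "cipher" is a toy stand-in, NOT real crypto.
import hashlib
import hmac
import secrets

class OsDiskEncryption:
    def __init__(self):
        self._key = secrets.token_bytes(32)   # OS-generated; never exported

    def _keystream(self, nonce: bytes, length: int) -> bytes:
        out = b""
        counter = 0
        while len(out) < length:
            out += hmac.new(self._key, nonce + counter.to_bytes(8, "big"),
                            hashlib.sha256).digest()
            counter += 1
        return out[:length]

    def encrypt(self, nonce: bytes, plaintext: bytes) -> bytes:
        ks = self._keystream(nonce, len(plaintext))
        return bytes(a ^ b for a, b in zip(plaintext, ks))

    decrypt = encrypt   # XOR with the same keystream is self-inverse

disk = OsDiskEncryption()
ct = disk.encrypt(b"sector-0", b"secret data")
assert disk.decrypt(b"sector-0", ct) == b"secret data"
```

Because `_key` exists only inside the OS's object and there is no export path, anyone who turns up holding the raw key can only have obtained it by compromising the OS, which is exactly the premise of the argument above.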
By assuming a human should have any kind of access to the key, you create this scenario:
Honestly, I think what ticks me off the most is how data that the user is more likely to be allowed to touch (what's on their own systems) gets a lot more restrictions than data that has absolutely no indication of whether anybody is allowed to touch it (removable media without file permissions). That's completely backwards. Either restrict everything or restrict nothing; or at least provide an override for when the OS's assumption turns out to be wrong.
What ticks you off most is that you've falsely assumed that a typical user would actually notice if an OS didn't allow the security of another OS to be violated.
For example, on my network almost all files come from the Internet without any permissions, and files moving between computers/OSs use networked file systems that uphold the security. I wouldn't know if any of the OSs I use do or don't allow the security of another OS to be violated, except for files that come from things like Steam or Origin (which use an additional DRM system on top of the OS to prevent executables that were "copied against the owner's permission" from executing correctly).