Re: how to prevent files and directories from being deleted?




Mark Haney <mark.haney@xxxxxxxxxxx> writes:

> On 10/04/2017 08:22 AM, Gary Stainburn wrote:
>> On Wednesday 04 October 2017 12:54:44 Mark Haney wrote:
>>> Sorry, but if you have to use packages that don't originate from CentOS
>>> and they do that, then I wouldn't use them. Period.  I'd compile from
>>> source before I used something configured that way.
>> This perspective is, to some extent, cutting off your nose to spite your
>> face.  Before packages were introduced, everyone compiled from source.  That
>> was a pain, and a long process, especially when you had dependencies that you
>> also had to compile.  Packages eased this process but kept the dependency
>> issue.
> If you think using non-standard packages that put /persistent/ items
> in non-persistent locations like /var/run in production environments
> is far more acceptable than compiling from source because of package
> management 'benefits' then (to me anyway) you're lazy and dangerous
> with critical data.  My statement still stands.  Let me be clear:

Please explain how compiling packages yourself turns such packages into
standard packages.

> THIS. IS. NOT. ACCEPTABLE.

It is not acceptable to create the system in such a way that files
indiscriminately disappear, without any check whatsoever as to whether
that's ok.

> The fact you'd rather bandaid a problem (in production no less) than
> follow proper standards or compile from source to avoid said bandaid
> would be a fire-able offense in any IT shop I've ever worked at.

Did they require you to verify all the sources of packages you compiled
yourself to make sure that they behave exactly like the packages that
come with the distribution?

How is compiling packages yourself not another bandaid?

>> Package managers got round (mostly) both the dependency problem and updating
>> too.  The problem with package maintainers not keeping up to date shows that
>> this still isn't perfect.
>>
>> However, if you go back to compiling from source then you lose all of these
>> benefits.
>>
>> Thankfully I do not earn my keep by watering lawns.  I do not believe that
>> this is acceptable, but by the same token I have to earn my keep and that
>> involves having working production servers and services.
>>
>> I have managed to get round this problem in the past by manually performing
>> the same function as systemd-tmpfiles.  It is a small price to pay to have a
>> working, (relatively) up-to-date server.
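
(For reference, "manually performing the same function as systemd-tmpfiles"
can simply be a tmpfiles.d drop-in, so the directory is recreated on every
boot.  A minimal sketch, assuming a hypothetical package "foo" that expects
/var/run/foo to exist; the path, mode, user and group are made up, and on
CentOS 7 /var/run is a symlink to /run:

    # /etc/tmpfiles.d/foo.conf -- recreate the runtime directory at boot
    # type  path      mode  user  group  age
    d       /run/foo  0755  foo   foo    -

    # apply it immediately instead of waiting for the next boot
    systemd-tmpfiles --create /etc/tmpfiles.d/foo.conf

Of course that only recreates the directory; any files the package expected
to keep in /var/run are still gone after a reboot, which is exactly the
problem being discussed.)
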
> The fact you find this acceptable means you're either the only
> 'qualified' (and even that is subject to doubt) person there, or your
> management is too ignorant to understand the danger.

How is compiling your own packages not a danger?

> I'm sorry, but in no way is this acceptable for production level
> servers. I'm sure, if you asked 100 IT people you'd get 100 to agree
> with me.  Being flippant with production servers is never acceptable.

Then you have to agree that defaulting to using a ramdisk for /var/run
--- or anything else --- is an utterly stupid idea.

> Of course, most people refuse to listen to logic and reason because
> they are convinced they are right despite evidence (and best practices
> over 40+ years of Unix) to the contrary.

I don't see how compiling your own packages or randomly disappearing
files falls under best practices.

> I'll end this by saying, I hope the production servers you have don't
> provide critical services that could jeopardize the lives of people. 
> I'd ask who you work for, to make sure I avoid them at all costs, but
> I'm not sure I'd be told.
>
> Again, denying 40+ years of Unix design and best practices because
> you're too lazy to manage compiling from source to avoid denying those
> practices is truly one of the most astonishing things I've ever seen
> in the 25 years I've been in IT.
>
> Then again, maybe I'm old-fashioned when I expect to do something and
> do it right rather than half-ass it.

People always make mistakes.  Best practices can't be best practices
unless they take this into account, and that involves not carelessly
deleting, truncating or disappearing files that have been placed somewhere,
mistakenly or otherwise.  That particularly applies to unknown files, like
files in /var/run which have been placed there and have not been specified
for removal, perhaps due to a mistake by the package manager.
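
(If the concern is only systemd-tmpfiles' periodic cleaning, such unknown
files can at least be excluded from it.  A sketch, again with a hypothetical
path; note that this only affects tmpfiles' own clean-up passes and cannot
preserve anything across a reboot while /run is a tmpfs:

    # /etc/tmpfiles.d/protect-foo.conf -- never clean up this path or its contents
    x /run/foo

It does nothing, however, about files being wiped because the whole
filesystem is thrown away at shutdown.)
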

Best practices also involve generally not deleting files unless you can
be sure that they can safely be deleted.  That is probably what the FHS
intended by specifying that files in /var/run must be deleted/truncated
at boot time: the assumption was that the programs which created them
would do this themselves (and then create them anew if needed), which is
reasonably safe since it implies that unknown files are left alone.
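
That reading, where each program cleans up only its own stale files at
startup, is the classic pidfile handling you see in init scripts.  A minimal
sketch, with a made-up daemon name and path:

    # sketch of the self-cleanup the FHS wording seems to assume
    PIDFILE=/var/run/mydaemon.pid      # hypothetical path
    if [ -f "$PIDFILE" ] && ! kill -0 "$(cat "$PIDFILE")" 2>/dev/null; then
        rm -f "$PIDFILE"               # remove only the one stale file we own
    fi
    echo "$$" > "$PIDFILE"             # the real daemon would write its own PID here

The point being that nothing else in /var/run gets touched.
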

For all I know, someone's life could depend on a file that was placed
somewhere mistakenly.


-- 
"Didn't work" is an error.
_______________________________________________
CentOS mailing list
CentOS@xxxxxxxxxx
https://lists.centos.org/mailman/listinfo/centos



