does the bear poop in the woods?

Story: Can Large Commercial Web Sites Be Run on Free Linux?
Total Replies: 12
tuxchick

Aug 16, 2007
10:31 AM EDT
Are the oceans deep and the mountains high? Does the wind blow? Are articles like this beyond silly?

Some of it is downright scary:

"Q: How often do you expect to patch your CentOS servers? A: Until now we've been patching Red Hat 7.3 and RHEL3 on an irregular schedule, usually only when high priority patches came through. But as part of our new policy we plan to patch both the Red Hat and CentOS servers on a monthly basis."

Might as well change the server banner to "pwn me, please!"
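
(For reference, the kind of monthly patch run the article describes could be automated with a simple cron entry. This is only an illustrative sketch; the schedule, file name, and log path below are assumptions, not anything taken from the article or the interview.)

    # /etc/cron.d/monthly-yum-update -- hypothetical entry for a CentOS box
    # Apply all available updates at 03:00 on the first day of each month
    0 3 1 * * root yum -y update >> /var/log/monthly-yum-update.log 2>&1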
techiem2

Aug 16, 2007
11:14 AM EDT
haha. Yeah, nothing like announcing your update schedule. Talk about asking for pwn attempts.

It was overall kinda silly. I mean, what's really the big difference between commercial and free distros for most uses? Paid support and maybe a more set release schedule (depending on the distro)?
rijelkentaurus

Aug 16, 2007
11:53 AM EDT
Quoting: Paid support and maybe a more set release schedule (depending on the distro)?


I think it's less the set release schedule than the set OS support lifespan. Red Hat, for instance, is good for 7 years from the date of release...but then again, so is CentOS.

There's another article similar to this, something about open source databases being good enough to run production apps...

So for the record, TC, I think the bear really poops on both of these questions, not just in the woods. :)
tuxchick

Aug 16, 2007
1:13 PM EDT
A seven-year lifecycle on any software seems absurd, especially Linux. All software ages to the point where you lose binary compatibility, and then you have to do all sorts of library and compiler juggling to make source builds work. Unless you freeze your entire system at a single point in time, which doesn't work either: then you lose hardware support, and you lose security, because applications often undergo extensive re-writes to address core problems that can't be fixed by patching.

Bear pooping big now!
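
(To illustrate the binary-compatibility point above: running ldd against an old binary on a newer system is a quick way to see the breakage. The binary name and missing libraries below are made up for the example.)

    $ ldd ./legacy-app
        libstdc++.so.5 => not found
        libssl.so.4 => not found

Each "not found" means hunting down or rebuilding an old library before the program will even start.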
devhen

Aug 16, 2007
1:29 PM EDT
@tuxchick:

quote: >> A seven-year lifecycle on any software seems absurd, especially Linux.

Hardly absurd. Sure, most people running Linux servers are hosting web sites or systems that are constantly evolving and staying somewhat on the cutting edge in order to stay relevant and competitive, but there are many, many .... many! ... uses for the Linux operating system that don't require the latest, greatest versions of the software. In these cases the ability to install once and receive regular security updates for up to 7 years is ideal. In fact, this has always been a very strong, effective aspect of the Linux ecosystem.
devhen

Aug 16, 2007
1:38 PM EDT
In regard to this and the similar article 'Are Open-Source Databases Ready for Production Applications?':

For those of us familiar with Linux in the enterprise this has to make you seriously question eWeek's credibility. Look, if you don't understand that Open Source databases are running production applications all over the world (and have been for some time), or that many successful, large commercial web sites are run on free Linux, either you are stupid, have no real-world experience in the matter, or are being paid by the Microsoft FUD machine. Come on eWeek. Your readers are much smarter than this and you're only making yourselves look bad.
tuxchick

Aug 16, 2007
1:58 PM EDT
Devhen, I don't think so. Older BIND versions, Postfix 1.x, older releases of OpenSSH and OpenSSL are just a few examples of unmaintainable older applications that are suicidal to keep using. They all come with the same warning: upgrade to version $foo because older versions present unacceptable security risks and do not support important modern features. Hanging on to moldy old software is just as bad a practice as rushing new, untested releases into production.
devhen

Aug 16, 2007
2:29 PM EDT
@tuxchick:

I still disagree. Strongly ;). All that is required is an understanding of backporting. See here:

http://www.redhat.com/advice/speaks_backport.html

Red Hat is in the business of backporting security fixes for the full 7 years and does a good job of it. This is the nature of their business. For 7 years from the date of release they support secure, production-ready software. If security were an issue for older releases they would shorten the support life. This is also true for CentOS, because Red Hat provides source code for everything they release, according to the GPL and similar licenses that the software is governed by.
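(A quick way to see backporting in action on a RHEL or CentOS box is to read the package changelog: the version number stays put, but the security fixes show up in the changelog entries. The package chosen below is just an example, not tied to any specific advisory discussed here.)

    # Installed version stays at the release's original upstream version...
    rpm -q openssh-server
    # ...but backported security fixes are recorded in the package changelog
    rpm -q --changelog openssh-server | grep -i cve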

tuxchick

Aug 16, 2007
2:40 PM EDT
No way is backporting going to fix fundamental flaws, devhen. Sorry but I am not persuaded. So you would run antique OpenSSH that only supports SSH1, which has been deprecated as unsafe for years? BIND 4 or 8, which are unfixable? Backporting does not address those kinds of problems, plus it mucks up versioning. I could dig up a lot more examples but I don't want to make this my life's work. :) You can have your ancient bits; me, I'll stick to modern ones that aren't full of holes and lacking features!
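
(For what it's worth, the SSH1 point is easy to see in practice: OpenSSH releases of this era let you refuse the old protocol outright in sshd_config. A minimal excerpt, assuming a stock /etc/ssh/sshd_config:)

    # /etc/ssh/sshd_config (excerpt)
    # Refuse the deprecated, known-weak SSH1 protocol entirely
    Protocol 2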
rijelkentaurus

Aug 16, 2007
2:43 PM EDT
Quoting: Hanging on to moldy old software is just as bad a practice as rushing new, untested releases into production.


I am rather inclined to agree with this, generally speaking, but the problem with many Linux distros is the fast release schedule not allowing for adequate testing of applications and capabilities, particularly with customized apps. While there is not really a reason to maintain the same OS for 7 years*, it's good to not have to feel like you have to stumble through a testing period every 6 months with your software. There are those that sort of meet you halfway and release something every 3 or so years (hello, Debian!), and Ubuntu LTS server releases are maintained for 5 years.

As a bit of an aside, this is a definite reason to use a full stack of Free Software, IMO, for the major software packages are often released with backwards compatibility in mind, so your code written in PHP today will work in the new version released tomorrow, perhaps with a few minor code edits. That makes upgrading far less frightening.

Proprietary software is often released with vast differences between versions, requiring you to fork over $$$ to get the new version to work with the old data. The formats are usually proprietary also, so you're captive to the format and have no choice but to upgrade. I am a bit of a conspiracy theorist, and I am inclined to think that MS rewards those companies who force upgrades (OS upgrades and server upgrades) on their customers by offering them better operability on the platform so they can stand out as a "better" product. Yes, it's paranoid, but how many servers from 2000 can run Server 2003 adequately? It's a relatively small number compared to the whole, and almost none of the servers in production today are going to play well with Server 2008 when it hits the streets (um, sort of a "Vista Server Edition", perhaps). Hardware upgrades are good for MS license sales, but plenty (or all, even) of those servers from 2000 can be put to good use with a nice Linux distro on them, even if only for serving up network services on the local LAN.

Wow, rambling again. I gotta stop that.

*Red Hat is 3 years full support, the rest is support limited to bugs and security fixes.
devhen

Aug 16, 2007
3:01 PM EDT
@tuxchick:

From the backporting link above:

>>On every security issue that affects software shipped by Red Hat we analyse the changes made to see if we can update to a newer version, or if we will backport a security fix.

So in the case of OpenSSH only supporting SSH1 Red Hat would update to a newer version or backport SSH2 compatibility into the existing version.

Having said that, we can agree on one thing. I too like to stay up to date and will probably never stick with a particular OS version for more than 3 years or so, worst case scenario. While security fixes are available for 7 years on Red Hat systems, new features are not added, which goes back to my original comments. You're always going to want the latest, greatest software you can get without compromising on security when you're building systems that need to stay competitive in dynamic markets.
Steven_Rosenber

Aug 17, 2007
3:56 PM EDT
Funny, when MS ends support for a Windows OS before 10 years, everybody's unhappy. How long will they support XP?

For a Linux server environment, I don't know all the particulars, but for me it would all be about security AND hardware/software compatibility. If you're talking about a serious Web or file server, you want to really lock it down, security wise, keep it up to date and make sure everything is the best it can be. Same for a mail server.

Many, though, don't want to change out hardware or software unless absolutely necessary. And they never, ever want to make major changes or do just about anything until they really, really have to -- that's the reality for much of the business world. You only change hardware or software when an app comes along that you can't run, there are serious performance issues, or the box just plain dies.

That said, I think Red Hat offers seven years of support for two reasons: their customers want it ... and it keeps the money rolling in for Red Hat.

I agree that the Debian and Ubuntu LTS intervals of three to five years are more realistic. Following the six-month schedule of regular Ubuntu releases seems like too much even for the home desktop user.

So even if you do want to upgrade a server on a yearly basis, it's nice to have the option NOT to do it. But doing patches once a month? How about once a week? Sheesh.
Bob_Robertson

Aug 17, 2007
7:23 PM EDT
My take on the "patching" thing:

First, after a year or so you should already know this: if your application isn't changing, then _leave_the_system_alone_.

Second, take a little bit of time and lock the machine down. A security hole in an internal piece of software is only a danger if the cracker can get to that internal piece of software. A server should have sufficient up-front defenses to prevent unauthorized access _at_all_. Linux is very good about this.
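
(A minimal sketch of that kind of up-front lockdown with iptables, assuming the box only offers SSH and HTTP; the port choices are illustrative, not from the post:)

    # Allow loopback, established traffic, and the two services actually offered
    iptables -A INPUT -i lo -j ACCEPT
    iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
    iptables -A INPUT -p tcp --dport 22 -j ACCEPT    # SSH for administration
    iptables -A INPUT -p tcp --dport 80 -j ACCEPT    # the public web service
    # Only after the accept rules are in place, drop everything else inbound
    iptables -P INPUT DROP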

Third, if security support is so important, then plan your migration accordingly. It's not like a Debian Stable (or insert other distribution choice here) release happens without warning.

Fourth, these are not desktop machines. Interactively used desktop machines _must_ be continually checked, due to their constant threat of attack by inadvertent user actions.

Therefore, my primary recommendation would be to roll out any _new_ system with the "latest and greatest", and allow the deployed application product life-cycle to determine when older machines are taken out of service. Don't just update the application unless you trust the underlying OS. It is my experience that the application will be obsolete long before the underlying OS shows its age.

And then, unlike well known proprietary OSs, if the hardware is still good the "old" machine can turn around and become one of the next "latest and greatest" systems by updating the OS at the same time. The resource-frugality of the underlying Linux OS is a great asset. (My 9 year old 350MHz 128MB spare laptop, today, runs bleeding-edge Debian Unstable, just like my 2.8GHz primary system that I am using right now. I just don't expect it to transcode video quickly, which it never did anyway.)

Unlike proprietary systems, the underlying Linux OS does not change the application environment so much with each new version. Linux 2.6.22 will most likely not break all your applications when being updated from 2.6.21, unlike such nightmares as XP-SP2.
