LXer Day Desk: The newest “Get the Facts” report
Note: You can find the full report of the study here (“Reliability: Analysing solution uptime as business needs change”). Though I did my best to represent the facts and remain unbiased in this article, I strongly advise and encourage you to read the report carefully for yourself before judging it. Please note that this article does not cover the whole report.
The title of the press release is, as almost always with the Get the Facts campaign, wrong. The study only examined Suse, and Suse is not the same as Linux.
The press release two days ago described Security Innovation (from now on abbreviated as SI) as an independent provider of application security services. How independent is independent in Microsoft's (from now on abbreviated as MS) definition?
At the SI site, we read: “Security Innovation is a certified Microsoft partner for security services. We have both the Microsoft SWI and ACE certifications as an authorized professional services provider for Microsoft technologies.”
And further on: “Dozens of leading organizations, including IBM, ING, Microsoft, SAP, Symantec, Google, VeriSign and a number of government agencies, rely on Security Innovation’s expertise in application security testing and training to develop, evaluate and deploy more secure applications.”
This means MS is a client of SI, and SI is a certified MS partner. MS is even the first partner on the list. Novell, on the other hand, isn't mentioned as a partner or client. For that reason, bashing Suse will get SI into less trouble than bashing Windows.
On page 6, in the acknowledgements, we find:
“This research and our analysis were funded under a research contract from Microsoft. As part of the agreement, we have complete editorial control over all research and analysis presented in this report.”
No matter whether MS influenced the research or not, if the research has an outcome that favours Suse over Windows, Microsoft won't pay SI for any new research. So one may question the independence of SI in producing this report.
The research simulated a growing e-commerce application. Over the course of a year, three administrators used Windows and three others used Suse to manage this application while the requirements grew. Now, how relevant is it to judge the stability of an e-commerce solution based on the experiences of only six sysadmins? The report itself helps us out (p. 35, conclusions, 2nd paragraph, end of 5th line):
“While the sample size of administrators was too small to provide conclusive statistic comparison, the results highlight some of the fundamental differences in the Windows and Linux models.”
For this e-commerce solution on Suse and Windows, third party software was used. The most important thing when doing research is reproducibility, which means specifying the methods and materials used as precisely as possible. To quote Dr. Thompson from the press release: "Security Innovation designed this study to be repeatable”.
Nonetheless, Appendix 5 states (3rd paragraph, 4th line):
“The specific 3rd party vendors are not disclosed, because the focus of the study is the methodology and not a specific component.”
But if I take two sponges, put soap on one and mud on the other, and use them both according to the same methodology to clean my window, everyone understands the results may differ because of the components used. The report states that the 3rd party components were chosen based on their market leadership in their respective fields, but nobody can verify this.
Though appendices 3 and 4 list the IT certifications the participating sysadmins might have, the study doesn't mention anywhere which certifications the participants actually hold. Their competences and history in the IT business are also left out; the study doesn't say how long these people have worked on the Linux or Windows platform, or what their actual jobs were or are. This makes the study even harder to reproduce. Moreover, the study specifies a minimum experience with the software but no maximum. The minimum for “e-commerce experience” in this study was 2 years on at least 20 servers, but this could mean the Linux sysadmin has 2 years of experience with 20 servers, while the Windows sysadmin has 10 years with 200 servers.
The study shows that Suse needed more patching than Windows. However, the software included in the 'initial state (S1) setup' wasn't listed, which could mean more applications were included on the Suse platform than on the Windows platform. The study also doesn't tell us whether either platform was optimized. In the past, a 'Get the Facts' report compared an optimized Windows platform against a non-optimized Linux platform, so this missing information is important.
Missing information about the sysadmins
Also, the names and contact information of the sysadmins aren't included. SI will probably claim this is for privacy reasons, and the participating sysadmins aren't paid to answer questions anyone might have about this report. But it is a missed opportunity to prevent conspiracy theories: we can't verify that these participants even exist, which means we can't verify that there were real participants in this study at all. This could fuel theories that the study was never performed and the report was simply made up by its author.
If we compare the Windows Administrator Requirements (appendix 3) with the Linux Administrator Requirements (appendix 4), there are some strange differences in the years of experience required. For example, SI looked for at least 4-5 years of experience on Windows (page 41, General IT background, second requirement), while they looked for at least 2 years on Linux (page 43, General IT background, second requirement).
The technical part of the study focuses on dependency problems the Linux sysadmins encountered, problems the Windows administrators didn't face. The study fails to mention that Windows 2000 may have dependency problems too. Anyway, the Linux sysadmins ran into trouble because an 'updated enhanced search solution they needed' required a newer version of glibc, which broke RPM and finally their whole platform. The study states they couldn't update MySQL using YaST, which sounds strange, but I won't go into detail given my lack of knowledge about this matter.
But let's focus on something else. One of the Windows sysadmins had an issue with third party software and contacted the vendor, after which the issue was solved (p. 33, 3rd line). The Linux sysadmins, on the other hand, facing more problems, contacted neither the vendor of the search application nor Suse about the issue. This would have been a nice moment to upgrade to SLES 9, but they didn't, and broke their RPM tool instead. Two of the platforms were 'unrecoverable'. This is a strange idea, since any Linux sysadmin would make a backup before updating glibc, and could afterwards use a LiveCD to recover the server and put all files back the way they were before the update. Since updating glibc is a 'large' operation in Linux, it is better compared to installing a service pack for Windows. As anyone probably knows, that causes issues and solution downtime too, so the comparison made in this study isn't that fair.
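The backup-then-recover routine described above can be sketched in shell. This is a minimal illustration under assumed paths: a temporary directory stands in for the root filesystem, since the study names no actual file locations; on a real SuSE box one would archive things like /var/lib/rpm and /lib/libc* to external media before touching glibc.

```shell
#!/bin/sh
# Illustrative sketch (not from the study): snapshot critical files before
# a risky update such as glibc, so a rescue/LiveCD session can restore them.
set -e

WORK=$(mktemp -d)                       # stand-in for the root filesystem
mkdir -p "$WORK/lib"
printf 'glibc 2.2 (working)\n' > "$WORK/lib/libc.so"

# 1. Snapshot before the update
tar -C "$WORK" -czf "$WORK/pre-update.tar.gz" lib

# 2. Simulate the botched glibc update the study describes
printf 'broken\n' > "$WORK/lib/libc.so"

# 3. Restore from the snapshot, as one would after booting a LiveCD
#    and mounting the damaged root filesystem
tar -C "$WORK" -xzf "$WORK/pre-update.tar.gz"

cat "$WORK/lib/libc.so"                 # shows the pre-update contents again
```

Nothing here is specific to glibc; the same snapshot-and-restore pattern applies to any update that can leave the system unbootable.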
The facts of this study were distorted in the MS press release about it. The independence of Security Innovation, the company conducting the study, is at least questionable. The sample size of administrators was too small to provide a conclusive statistical comparison. The report provided far too few details to reproduce the study. The system administrator requirements for Linux and Windows differed, and no information about the sysadmins is given. Sysadmin contact information wasn't included, so the results can't be verified. The Linux sysadmins reacted rather strangely to the problems they encountered, and the comparison wasn't really fair, since it compared two updates of a different order of magnitude.
Therefore, this report is not really useful to base decisions upon, and if it were, only in very limited situations for very limited goals. One should read it very critically before judging the reliability of Windows compared to Suse. I also suggest taking a look at the LXer migration list (sorted by distro, with Suse starting about halfway and continuing on the following pages). The list provides links to articles about, among other things, Novell (Suse) success stories, in which companies explain why, for them, Suse works better than Windows.