No Brainer
montezuma Jun 09, 2012 4:43 PM EDT
Linux is mature, has fast compilers, and is cheap. That's really all a scientist needs. The proof of the pudding: of the Top 500 supercomputers in November 2011, 91.4% used Linux. Drumroll........ and the percentage using Windows: 0.2%, i.e., just one machine out of 500. Talk about a token effort. http://i.top500.org/stats
nikkels Jun 09, 2012 11:33 PM EDT
Another link where you can see the same thing presented a different way: http://www.bbc.co.uk/news/10187248
montezuma Jun 10, 2012 10:31 AM EDT
Another interesting aspect of this is the democratization of computing. Most supercomputers today rely on being massively parallel; for example, the No. 1 machine on that list has 224,000 cores. Even with a decent scheduler, a single user of such a machine would find it difficult to get all those cores at once, so the ordinary user typically makes do with effectively far fewer. But you can buy a cluster of your own with that many effective cores for a rather reasonable price.
As an example, I purchased a 32-core cluster (4 nodes, each with 8 CPUs) for around $8K three years ago, and I am its sole user. I run it like I run my Linux box at home: I write ssh scripts that fire jobs off to all the nodes (something like the sketch below), let them run for a few days, and it is rather surprising what you can achieve. I used to require access to a supercomputer, but no more. Of course, if you run a gigantic numerical model such as a cosmology or global climate simulation, you need all those cores for a day or so, and a supercomputer is vital. But if you run small models, a personal cluster (running Linux) is a very convenient way to go.
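For anyone curious, here is a minimal sketch of the kind of script I mean. The hostnames, the run_model program, and the ~/runs directory are placeholders; substitute whatever your own cluster and model use.

```sh
#!/bin/sh
# Sketch: fan a batch of jobs out to the cluster nodes over ssh.
# NODES and run_model are placeholders for your own hostnames and program.
NODES="node1 node2 node3 node4"

i=0
for node in $NODES; do
    i=$((i + 1))
    # Start the job on the remote node with nohup so it survives the
    # ssh session ending; each node writes its own log file locally.
    # The trailing & backgrounds the ssh call so the loop moves
    # straight on to the next node.
    ssh "$node" "cd ~/runs && nohup ./run_model --case $i > case_$i.log 2>&1 &" &
done

wait   # return once every launch has been dispatched
echo "Jobs dispatched; check case_*.log in ~/runs on each node."
```

Then you just come back in a few days and collect the case_*.log output from each node.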