You know, sometimes I understand...

Story: Advocacy groups decry Freiburg's stealth return to proprietary office
Total Replies: 3
Author Content

Sep 23, 2012
10:05 AM EDT
Neither LibreOffice nor OpenOffice is able to handle LARGE amounts of data. See my previous posts on Access vs. FOSS.

This Friday I was working with a large MS-XML 2003 sheet (50M, ~45,000 rows). Both suites exited with a strange "General I/O error" thingy. I tried .FODS. Same thing.

Then I fired up my (work) Windows laptop, opened MS-Excel and tried again (MS-XML 2003). Sure, it took a while, but it worked flawlessly. Job done.

The only thing I can conclude is that, in that respect, there has been very little progress. Another bug: my LaTeX2RTF-generated RTF files lose their references when imported. MS-Word? No problem!

Sure, if you want to call me a M$-shill, FINE! And yes, I did report those things. Very little feedback though.

Sep 23, 2012
4:29 PM EDT
The question might be: is that the reason they are moving? Do they all routinely need to open 45,000-row spreadsheets?

I would say, anyone doing this is nuts and any sysadmin supporting it or condoning it is even more nuts. If you are dealing with this scale of data, do it in the proper kind of tool, and don't let amateurs loose on it (which is what you get if you do it in spreadsheets).
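To illustrate the "proper kind of tool" point: 45,000 rows is trivial for even an embedded database. A minimal sketch using Python's built-in sqlite3 module, with made-up sample data (the table and column names here are hypothetical, purely for illustration):

```python
import random
import sqlite3

# An in-memory database stands in for "the proper kind of tool".
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")

# 45,000 synthetic rows -- roughly the size that broke the spreadsheet.
rows = [(random.choice("NSEW"), random.uniform(1, 1000))
        for _ in range(45_000)]
conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)

# The kind of summary people build huge spreadsheets for, in one query.
for region, total, count in conn.execute(
        "SELECT region, SUM(amount), COUNT(*) FROM sales GROUP BY region"):
    print(region, round(total, 2), count)
```

The load and the grouped aggregate both finish in well under a second; at this scale the database never even notices.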

Maybe they are importing LaTeX as well. We should all move to Freiburg, a really interesting place! If they do everything equally originally, hold on to your hats!

Sep 23, 2012
8:05 PM EDT
Quoting: WEDNESDAY, FEBRUARY 3, 2010 ... A data cruncher bites the dust ...

I am not sure how a couple of articles written over two years ago, about older versions, are supposed to be relevant to this story.

Let's see: two-year-old releases of OOo and Kexi failed at a couple of odd jobs.

Is that supposed to be a valid reason to replace their latest releases, which are perfectly good for normal office work? Is the cost of purchasing proprietary licenses for the rest of the users truly justified?

The guy is a consultant and must have a personal reason. I wouldn't hire such consultants for a penny.


Sep 24, 2012
11:27 AM EDT
I find spreadsheets unreliable things, especially when large amounts of data are involved. This is from experience almost entirely with Excel, because that's what we used at work. Excel spreadsheets sometimes suddenly become un-updateable for no apparent reason. The fix usually involves going back to a previous version and re-entering the most recent data. Of any really widely used program I'm familiar with, I think I'd have to say that Excel is the quirkiest.

I have to wonder how Gnumeric would do on large spreadsheets, since back when I needed to work on one from a Linux box, it worked better than OpenOffice Calc (this was quite a while ago).

I agree that stupendously large spreadsheets probably shouldn't be spreadsheets in the first place. As an IT support person, spreadsheets that large make me sweat just thinking about them. It's only daily backups that make the idea at all palatable. All the large spreadsheets where I work import their data from a database and perform calculations on it. If the spreadsheet breaks, the data is unaffected. The spreadsheets are not relied upon to store any data; that's not something spreadsheets are good at, and this is Microsoft Office experience talking.
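The "database stays authoritative, spreadsheet is disposable" workflow described above can be sketched in a few lines of Python. This is an illustrative assumption of how such an export might look, not anyone's actual setup; the table, column, and file names are made up:

```python
import csv
import sqlite3

# The authoritative data lives in the database, never only in the sheet.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (day TEXT, value REAL)")
conn.executemany("INSERT INTO readings VALUES (?, ?)",
                 [("Mon", 1.5), ("Tue", 2.0), ("Wed", 2.5)])

# Export a snapshot for the spreadsheet users to run calculations on.
# If the resulting sheet ever breaks, nothing is lost: re-run the export.
with open("readings.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["day", "value"])
    writer.writerows(conn.execute("SELECT day, value FROM readings"))
```

The spreadsheet becomes a throwaway view over the data rather than the place where the data lives, which is exactly why a broken sheet costs nothing but the time to regenerate it.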
