Hard Go with Large Spreadsheets

Story: Review: Hands on LibreOffice 3.3 (Total Replies: 5)
robT

Feb 08, 2011
1:11 PM EDT
I used LibreOffice spreadsheets to manipulate some large data sets (approx. 8000 rows by 29 columns). What I did was use a bunch of VLOOKUPs to match dates between two data sets. LibreOffice crashed several times, pegged one core of my dual-core CPU at 100%, and consumed 40-50% of 2GB of RAM. The saved spreadsheet was around 10MB. When I broke the problem up between two spreadsheets, CPU use dropped, but memory use went up to 90-100% of RAM (obviously swap was being used and performance dropped accordingly).
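
For what it's worth, the same date matching can be done outside the spreadsheet with a hash keyed on the date column, which is roughly what the Perl and R suggestions later in this thread amount to. The sketch below is only illustrative: the file names and column positions are assumptions, since the actual layout of the two data sets isn't described here.

# Rough sketch of the VLOOKUP-style date match done as a script.
# Assumed: both data sets exported to CSV, date in the first column,
# no header rows. None of these names come from the original post.
import csv

# Build a lookup table keyed on the date column of the second data set;
# this is the table each VLOOKUP would re-scan inside the spreadsheet.
lookup = {}
with open("dataset_b.csv", newline="") as f:
    for row in csv.reader(f):
        lookup[row[0]] = row          # keep the whole matching row

# Stream the first data set and append the matching row (if any) by date.
with open("dataset_a.csv", newline="") as src, \
     open("matched.csv", "w", newline="") as dst:
    writer = csv.writer(dst)
    for row in csv.reader(src):
        match = lookup.get(row[0], [])    # empty if no matching date
        writer.writerow(row + match)
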
herzeleid

Feb 08, 2011
1:19 PM EDT
Aside from the observation that this job seems to be perfectly suited to perl, what different spreadsheets have you tried with this?

BTW, this discussion brings to mind an office suite called "Applix" that I ran on linux in the 90s - it easily handled large data sets that would crash ms office every time, but unfortunately they're no longer on the market.
jacog

Feb 09, 2011
4:41 AM EDT
Hey, I remember Applixware. Quite a decent suite, as I recall. And it's still around:

http://www.vistasource.com/en/products.php

But not too sure how it is these days. :)
herzeleid

Feb 09, 2011
1:55 PM EDT
@jacog - thanks for the pointer, who knew? I actually did my first white paper with the current employer back in the 90s, on a solaris-2.5/sparc desktop, running applixware from a remote linux box. Good times...

Edit: Checked out applixware at the link you gave - unfortunately it seems to be stagnant, stuck in a weird time warp when solaris/sparc and windoze peecee were the popular platforms...

Steven_Rosenber

Feb 09, 2011
3:39 PM EDT
I'll be working on a 40,000-line spreadsheet soon, and I will report back on how that goes.
bigg

Feb 09, 2011
4:29 PM EDT
Use R, not a spreadsheet, if you have 40,000 lines.
