[LINK] Bizarre lameness in Microsoft Excel 2002
Craig Sanders
cas at taz.net.au
Thu Nov 6 10:14:49 AEDT 2008
On Thu, Nov 06, 2008 at 09:09:59AM +1100, Marghanita da Cruz wrote:
>> ps: is a spreadsheet really the most appropriate tool for working with
>> large amounts of tabular data like that? wouldn't a database be
>> better?
>
> A database isn't much use for analysis.
>
> My personal preference, 20 years ago, was Fortran for aggregating data
> and Lotus 1-2-3 for graphs, rather than MS Excel.
if i needed random access to the data, i'd store it in a database.
otherwise, a flat text file.
for processing it, i'd almost certainly use perl, and gnuplot if i
needed any graphs. occasionally, i'll import the summary output AFTER
processing by perl into gnumeric or OpenOffice (i.e. use the spreadsheet
just as a convenient graphing tool).
but, for people who don't like to program, a good sql implementation
with a good frontend (GUI or text) is a handy tool.
(of course, the dirty trick is that constructing complex sql queries
*IS* programming, and people can do it without even realising that
they're "programming" :-)
> Now I rely on AWSTATS and sometimes use Open Office to analyse my site
> logs.
for web logs, i use webalizer. i used awstats years ago, but found
webalizer far easier to automate custom configs for when doing mass
virtual hosting. i also like its output more.
if i just want to extract some specific detail from apache logs (like
"how many requests were there for foo.pdf on tuesday"), i'll use a
combination of sh and related tools like sed and awk, or a
quick-and-dirty perl script.
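for the "foo.pdf on tuesday" example above, the sh-and-related-tools
approach might look like this sketch (the log name and dates are made
up; 04/Nov/2008 happened to be a tuesday):

```shell
# hypothetical example: count tuesday requests for foo.pdf in a
# combined-format apache log.
cat > tuesday.log <<'EOF'
1.2.3.4 - - [04/Nov/2008:10:00:00 +1100] "GET /foo.pdf HTTP/1.1" 200 99
1.2.3.4 - - [04/Nov/2008:11:00:00 +1100] "GET /index.html HTTP/1.1" 200 5
5.6.7.8 - - [05/Nov/2008:09:00:00 +1100] "GET /foo.pdf HTTP/1.1" 200 99
EOF

# grep the date, then count the matching requests for the file
grep '\[04/Nov/2008' tuesday.log | grep -c 'GET /foo.pdf '
```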
> I have also used Webalizer's standard configuration, but never got
> round to customising the configuration - which is supposed to be
> possible. http://www.webalizer.com/
webalizer is quite customisable, and the configuration is fairly easy
to work with.
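a minimal sketch of what a per-vhost webalizer config might look like
(the paths and hostname are made up; for mass virtual hosting, a script
can generate one of these per vhost):

```
# hypothetical webalizer.conf fragment for one virtual host
LogFile        /var/log/apache2/example.com-access.log
OutputDir      /var/www/stats/example.com
HostName       example.com
Incremental    yes
PageType       htm*
PageType       php*
HideReferrer   example.com
```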
craig
--
craig sanders <cas at taz.net.au>