Re: 2GIG-file

From: Anton Ivanov (aivanov@eu.level3.net)
Date: Tue Aug 01 2000 - 05:32:20 EST


> 1) The file isn't fragmented on the NTFS.

1. I would suggest you use some sort of netpipe mechanism for copying it,
e.g. cat junk | ssh otherhost "cat > junk", as most network filesystems are
not terribly 2G+ compliant. (A sketch follows below.)

2. Frankly, I have no idea how you will manage to do this off an NTFS
partition. Splitting/zipping or other archiving while running under NT comes
to mind; reassembling the pieces on the Linux side is also sketched below.
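
For what it's worth, a minimal sketch of both, with a checksum to verify the
transfer; the hostname and filenames (otherhost, junk, junk.part*) are only
placeholders:

    # 1. straight netpipe copy, checksummed on both ends
    md5sum junk                            # note the checksum locally
    cat junk | ssh otherhost "cat > junk"
    ssh otherhost "md5sum junk"            # should match the local one

    # 2. if the file arrives as split pieces (split/zip under NT),
    #    reassemble on the Linux box and verify the same way
    cat junk.part* > junk                  # the glob expands in sorted order
    md5sum junk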

>
> 2) I really need to find a solution to this DB ported to Linux today.

Unless I am mistaken:

1. Use Alpha, as it is the only really stable 64-bit system. It ships with
Linux in the US in under a week.

2. Or use Postgres, as it does a per-1GB (tunable to arbitrary size) file
spill.
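
As far as I remember, Postgres stores each large table as a base file plus
.1, .2, ... segment files, so no single file on disk ever reaches the 2G
limit. Something like this shows the segments (the data directory path and
database name are only examples):

    # each table is stored as <table>, <table>.1, <table>.2, ...
    ls -l /usr/local/pgsql/data/base/yourdb/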

>
> Currently I am compiling 2.3.99pre9, any comments for running this as a
> production kernel, I know its in dev, but what other choices do I have ?

See above.
	But in general, unless I am mistaken, you are trying to retrieve data from a
"very experimental" filesystem. I would worry about that in the first place.

Unless you really know what you are doing, I suggest you resort to more
conventional methods (either software or hardware).
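
If you do mount the NT partition directly, at least keep it read-only so the
experimental NTFS code never writes to it; something like this (the device
name is only an example):

    mount -t ntfs -o ro /dev/hda1 /mnt/nt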

[snip]

#include "stdisclaimer.h"
#include "my2cents.h"

-- 
Anton R. Ivanov ARI2-RIPE
mailto:aivanov@sigsegv.cx
The excuse for delaying today's deliverables is:
Interference from lunar radiation




