The file "metafile-xxx.txt" contains alphanumeric data: 25 to 30 million lines, each 85 to 95 characters long and terminated with CR + LF. It is unsorted raw data.
My attempt is to read the file into memory, sort it, and write it back to disk. At about 14 million lines read, the program stops (crashes). The attached snippet crashes (the fb program stops working) at about 14.8 million lines read -- that's about 1.5 billion bytes -- yet fre() says that I have about 2.1 billion bytes available.
My system is Win 7, 32-bit.
The fb help file on arrays hints that I could have an array with millions of elements, but with a Maximum Size (in bytes) of +2147483647.
28 million lines * 100 chars/line = 2.8 gigabytes, which exceeds the spec'd limit.
Thus, I knew it was going to crash -- but not this early.
It stopped at 1.4 gigabytes (or about 14 million lines of input).
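For what it's worth, the per-line cost is probably more than just the characters. If fb's 32-bit string descriptor is 12 bytes (data pointer + length + allocated size) and each heap allocation carries some allocator overhead -- figures I'm assuming here, not measured -- then a rough accounting looks like this:

```freebasic
'' Rough per-line memory accounting on 32-bit fb (assumed figures, not measured):
''   string descriptor in the array : 12 bytes (pointer + len + size)
''   line payload on the heap       : ~90 bytes
''   heap allocator overhead        : ~8-16 bytes per allocation (a guess)
dim as ulongint linesRead = 14800000ull
dim as ulongint perLine   = 12 + 90 + 16        '' ~118 bytes per line
print "approx. total: "; (linesRead * perLine) \ 1000000; " MB"
'' On top of that, redim Pattern(30000001) reserves the descriptor array
'' itself up front: 30000001 * 12 bytes = ~360 MB. Together that can
'' exhaust a 32-bit process's 2 GB user address space well before
'' 2.1 GB of raw line data has been stored.
```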
IF the data is not stored (comment out 'Pattern(count) = textline'), it reads to the end of the file; the line count for this one is 27,784,060.
I expected it to crash somewhere close to 2.147 gigabytes, not at 1.4 GB.
Compiling with the -exx parameter, it says "aborting due to run time error 12, segmentation violation ...".
That almost implies a limit around 1.4 gigabytes, not 2.147.
Can anyone explain why it died so early?
Code:
'' zMaxArrayTest.bas
dim as ulongint maxi = 30001001ull        '' approx. 30 million elements
dim shared as string Pattern()
redim Pattern(maxi)

dim as string FileName = "MetaFile-001.txt"
dim as integer opnerr
dim as integer fh1 = freefile
opnerr = open( FileName for input as #fh1 )
if opnerr then stop

dim as integer Count
dim as string textline
do
    Count += 1
    input #fh1, textline
    Pattern(Count) = textline             '' crashes near 14.8 million lines
loop until eof(fh1)
close #fh1

print "TOTAL LINES IN FILE: "; Count
getkey
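To watch where the memory actually runs out, here is a variant of the same loop that prints fre() every million lines -- an untested sketch, assuming the same file name; note it uses line input so a whole line is read regardless of any commas in the data:

```freebasic
'' zMaxArrayTest2.bas -- same loop, but reports fre() every million lines
dim shared as string Pattern()
redim Pattern(30000001)

dim as integer fh1 = freefile
if open( "MetaFile-001.txt" for input as #fh1 ) then stop

dim as integer Count
dim as string textline
do
    Count += 1
    line input #fh1, textline             '' read the full line, commas and all
    Pattern(Count) = textline
    if (Count mod 1000000) = 0 then
        print Count; " lines read, fre() = "; fre()
    end if
loop until eof(fh1)
close #fh1
```

If fre() falls toward zero well before 2.1 billion bytes of lines are stored, that would point at address-space exhaustion (descriptors plus heap overhead) rather than the array byte limit itself.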