What would be the fastest way to open and then search through a very large file?

Right now I just allocate enough memory to load the whole file into a buffer, but that won't work when the file size is 700 MB. The obvious solution would be to search the file in large chunks, but what if the search pattern straddles the boundary between two chunks?

I'm also considering memory-mapped files, but I've never used them before, so I don't really know what advantages they offer. If I decide to use memory-mapped files, can I write directly to the file while it's mapped?
Posted on 2003-08-07 04:23:55 by Delight
Posted on 2003-08-07 05:00:33 by RobotBob
Thanks, I'll take a look at it :)
Posted on 2003-08-07 06:28:58 by Delight