FUDforum
Fast Uncompromising Discussions. FUDforum will get your users talking.

Re: I Need to search over 100 largeish text documents efficiently. What's the best approach? [message #184743 is a reply to message #184742] Mon, 27 January 2014 09:58
Arno Welzel
Messages: 317
Registered: October 2011
Senior Member
On 27.01.2014 02:43, Denis McMahon wrote:

> On Sun, 26 Jan 2014 05:34:21 -0800, rob.bradford2805 wrote:
>
>> What is the best/fastest approach to scan 100+ largish text files for
>> word strings
>
> A quick googling finds:
>
> http://sourceforge.net/projects/php-grep/
> http://net-wrench.com/download-tools/php-grep.php
>
> Claims to be able to search 1000 files in under 10 secs

Under ideal conditions, maybe. But if each file is more than 1 MB, it is
barely possible even to read that amount of data in 10 seconds: at a
sustained read speed of around 80 MB/s, 1000 MB of data takes roughly
12.5 seconds just for the I/O, before any searching happens.

Even a simple word index (each word mapped to the name of the file(s)
and the position(s) where it occurs) would be a better solution, since
the index is built once and then queried without rescanning the files.
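As a rough illustration of such a word index, here is a minimal sketch in PHP. It assumes the files fit in memory one at a time; the function names buildIndex() and lookup() are made up for this example, not anything from the thread or a library.

```php
<?php
// Build an inverted index: lowercased word => file name => list of
// byte offsets where the word starts. Built once, queried many times.
function buildIndex(array $files): array
{
    $index = [];
    foreach ($files as $file) {
        $text = file_get_contents($file);
        // Capture every word together with its byte offset in the file.
        preg_match_all('/\w+/', $text, $matches, PREG_OFFSET_CAPTURE);
        foreach ($matches[0] as [$word, $offset]) {
            $index[strtolower($word)][$file][] = $offset;
        }
    }
    return $index;
}

// Look up a word: returns file => positions, or an empty array.
function lookup(array $index, string $word): array
{
    return $index[strtolower($word)] ?? [];
}
```

A query then touches only the index, not the 1000 MB of text, which is where the speedup over repeated grep-style scans comes from.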


--
Arno Welzel
http://arnowelzel.de
http://de-rec-fahrrad.de