Re: I Need to search over 100 largeish text documents efficiently. What's the best approach? [message #184740 is a reply to message #184735]
Sun, 26 January 2014 20:55
Ben Bacarisse
Messages: 82  Registered: November 2013
rob(dot)bradford2805(at)gmail(dot)com writes:
> As part of my hosting provider's re-platforming cycle, my site has moved
> server. On the new server (and all new servers) PHP's exec() and
> equivalents are blocked, which has broken my fast document search
> that used exec() to call grep and then awk. I now need to do the grep
> part as efficiently as possible in PHP, as I can no longer access the
> shell from my scripts. The awk part is easily sorted.
>
> What is the best/fastest approach to scan 100+ largish text files for
> word strings? I really don't wish to index each file into a database,
> as the documents change quite frequently. My grep-awk scan took around
> one second to begin rendering the results page; I know I can't match
> that, but I can't afford too much of a delay.
I second Richard Damon's suggestion. If the awk bit is easily sorted,
and all you are missing is grep, it should be a matter of minutes to
recreate something like the functionality you had before with a handful
of lines of PHP[1]. Of course, it won't be an external command, so the
way it integrates with the rest of the code might make this not quite
the trivial task it first appears to be.
[1] fgets, preg_match and glob (if you need it).
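For what it's worth, a minimal sketch of that grep replacement using
those three functions might look like the following. The function name
and the idea of returning file/line-number/line triples are my own
choices, not anything from the original setup; the point is just that
fgets() reads one line at a time, so even largish files are never
loaded into memory whole:

```php
<?php
// Minimal grep-like scan in pure PHP: search every file matched by
// $globPattern for lines matching the PCRE $pattern. Reads line by
// line with fgets() so large files are streamed, not slurped.
function php_grep(string $pattern, string $globPattern): array
{
    $matches = [];
    foreach (glob($globPattern) as $file) {
        $handle = fopen($file, 'r');
        if ($handle === false) {
            continue; // skip unreadable files rather than aborting
        }
        $lineNo = 0;
        while (($line = fgets($handle)) !== false) {
            $lineNo++;
            if (preg_match($pattern, $line)) {
                // record file, line number, and the line itself
                $matches[] = [$file, $lineNo, rtrim($line, "\n")];
            }
        }
        fclose($handle);
    }
    return $matches;
}
```

The awk-style post-processing can then work over the returned array
instead of parsing command output.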
--
Ben.