Re: I Need to search over 100 largeish text documents efficiently. What's the best approach? [message #184737 is a reply to message #184735]
Sun, 26 January 2014 15:29
Jerry Stuckle
On 1/26/2014 8:34 AM, rob(dot)bradford2805(at)gmail(dot)com wrote:
> As part of my hosting provider's re-platforming cycle my site has moved server. On the new server (and all new servers) PHP's exec() and equivalents are blocked, which has taken out my fast document search that used exec() to call grep and then awk. I now need to do the grep part as efficiently as possible in PHP, as I can no longer access the shell from my scripts. The awk part is easily sorted.
>
> What is the best/fastest approach to scanning 100+ largish text files for word strings? I really don't want to index each file into a database, as the documents change quite frequently. My grep/awk scan took around one second to begin rendering the results page; I know I can't match that, but I can't afford too much of a delay.
>
> Any ideas appreciated while I look for a new hosting provider. I feel that any hosting setup that makes such a change without notification really has no respect for its clients.
>
> Rob
>
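For reference, the exec('grep ...') step Rob describes can be approximated in pure PHP with a line-by-line scan. This is only a sketch: the directory path, the *.txt glob, and the function name are assumptions for illustration, and stripos() gives a case-insensitive substring match rather than full grep regex support.

```php
<?php
// Hypothetical replacement for exec('grep -i ...'): scan every *.txt
// file under $dir and return matching lines, keyed by filename.
function grep_files(string $dir, string $needle): array
{
    $hits = [];
    foreach (glob($dir . '/*.txt') as $path) {
        $fh = fopen($path, 'r');
        if ($fh === false) {
            continue; // unreadable file: skip it
        }
        while (($line = fgets($fh)) !== false) {
            // stripos() is a case-insensitive substring test,
            // roughly equivalent to grep -i with a fixed string.
            if (stripos($line, $needle) !== false) {
                $hits[basename($path)][] = rtrim($line, "\r\n");
            }
        }
        fclose($fh);
    }
    return $hits;
}
```

Reading with fgets() keeps memory flat regardless of file size, at the cost of a full pass over every file per query, which is exactly the trade-off discussed below.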
Whether the files change frequently or not, your best bet is going to be
putting the documents in a database. You won't be able to do anything
nearly as fast in PHP as the database does. And it isn't that hard to
insert the documents into the database when they are uploaded.
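A minimal sketch of that approach: a full-text index refreshed whenever a document is uploaded. It is shown here with SQLite's FTS5 extension via PDO so the example is self-contained; on typical shared hosting the same shape works with MySQL's FULLTEXT index and MATCH ... AGAINST. The table, DSN, and function names are illustrative, not a real API.

```php
<?php
// Open (or create) a full-text index table.
function open_index(string $dsn = 'sqlite::memory:'): PDO
{
    $db = new PDO($dsn);
    $db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    $db->exec('CREATE VIRTUAL TABLE IF NOT EXISTS docs USING fts5(name, body)');
    return $db;
}

// Re-index a document on upload: drop any old copy, insert the new text.
function store_doc(PDO $db, string $name, string $body): void
{
    $del = $db->prepare('DELETE FROM docs WHERE name = ?');
    $del->execute([$name]);
    $ins = $db->prepare('INSERT INTO docs (name, body) VALUES (?, ?)');
    $ins->execute([$name, $body]);
}

// Ranked full-text search over every indexed document.
function search_docs(PDO $db, string $terms): array
{
    $stmt = $db->prepare(
        'SELECT name FROM docs WHERE docs MATCH ? ORDER BY rank'
    );
    $stmt->execute([$terms]);
    return $stmt->fetchAll(PDO::FETCH_COLUMN);
}
```

Because the index is updated at upload time, each search touches the index rather than re-reading 100+ files, which is how the database beats a per-request scan.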
And kudos to your hosting provider for closing a huge security exposure.
--
==================
Remove the "x" from my email address
Jerry Stuckle
jstucklex(at)attglobal(dot)net
==================