Re: Efficiency of a lot of variables [message #170860 is a reply to message #170858]
Sat, 04 December 2010 02:36
jwcarlton
Messages: 76 Registered: December 2010
On Dec 3, 9:07 pm, Magno <marbar...@gmail.com> wrote:
> On 12/03/2010 09:52 PM, jwcarlton wrote:
>
>> I have 1,000 variables, written like this:
>
>> // Block 1
>> $hash['arr1']['var1'] = "whatever1";
>> $hash['arr1']['var2'] = "whatever2";
>> :
>> $hash['arr1']['var10'] = "whatever10";
>
>> // Block 2
>> $hash['arr2']['var1'] = "whateverelse1";
>> $hash['arr2']['var2'] = "whateverelse2";
>> :
>> $hash['arr2']['var10'] = "whateverelse10";
>
>> There are 100 blocks, and 10 variables in each block. The block used
>> is determined based on the domain used to access the site (there are
>> 99 domains parked on top of the primary account). In the sample above,
>> 'arr1' and 'arr2' represent the domains, and these are loaded on every
>> page of the site.
>
>> I've been researching this for a while, and most seem to agree that
>> it's faster / more efficient to store all 1,000 variables into arrays,
>> instead of 100 blocks of if-else or switch-case. I had also considered
>> having 100 text files, and then just loading the appropriate text file
>> based on the domain (like below), but most believed that the I/O speed
>> would be worse than just loading all 1,000 at once:
>
>> $domain = "whatever";
>> list($var1, $var2, ..., $var10) = FILE("/path/to/variables-$domain.txt");
>
>> So, I guess my first question is, do you guys agree that storing them
>> all at once is faster / more efficient than the 3 alternatives? Or can
>> you suggest another option that I haven't considered?
>
>> If this is the best method, my next question is in regards to the
>> construct of the arrays. In the sample above, I have a single
>> multidimensional array; is that better than, for example, 100 separate
>> arrays with 10 keys each:
>
>> $arr1['var1'] = "whatever1";
>> $arr1['var2'] = "whatever2";
>> :
>> $arr1['var10'] = "whatever10";
>
>> Or vice versa, 10 arrays with 100 keys each:
>
>> $var1['arr1'] = "whatever1";
>> $var2['arr1'] = "whatever2";
>> :
>> $var10['arr1'] = "whatever10";
>
>> I know that I'm probably worrying too much about microseconds, but I
>> currently have an average of 600,000 pageviews a day, and expect it to
>> increase to at least 12,000,000 within the next year, so I'm trying to
>> make sure everything is as fast as possible before the problems set
>> in :-)
>
>> TIA,
>
>> Jason
>
> What I do in cases like this is benchmarking.
> Make thousands of calls for each case and compare their response times.
That's my next step, but I was concerned that the current server load
would give me false results. I don't have an unused server to test
with.
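
Something rough like this is what I had in mind, timing each approach back-to-back with microtime(). The $domain value, iteration count, and file path are just placeholders (the real array would have all 100 blocks), and I'd repeat the runs a few times so the current server load averages out:

$domain     = "arr1";
$iterations = 10000;

// Approach 1: everything defined in one multidimensional array on each
// "page load", then pull out the current domain's block.
$start = microtime(true);
for ($i = 0; $i < $iterations; $i++) {
    $hash = array(
        'arr1' => array('var1' => 'whatever1',     'var2' => 'whatever2'),
        'arr2' => array('var1' => 'whateverelse1', 'var2' => 'whateverelse2'),
        // ... remaining blocks ...
    );
    $config = $hash[$domain];
}
$arrayTime = microtime(true) - $start;

// Approach 2: read only the current domain's text file, one value per line.
$start = microtime(true);
for ($i = 0; $i < $iterations; $i++) {
    $lines = file("/path/to/variables-$domain.txt", FILE_IGNORE_NEW_LINES);
}
$fileTime = microtime(true) - $start;

printf("array: %.4fs   file: %.4fs\n", $arrayTime, $fileTime);
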
I also just stumbled across "session handling". That's a totally new
concept for me. Would it be practical to load the variables on the
first page view of a session, then keep them in $_SESSION so later
requests skip the lookup? Like this:
session_start();

if (!isset($_SESSION['var1'])) {
    switch ($domain) {
        case "arr1":
        default:
            $_SESSION['var1'] = "whatever1";
            $_SESSION['var2'] = "whatever2";
            // ...
            $_SESSION['var10'] = "whatever10";
            break;

        case "arr2":
            $_SESSION['var1'] = "whateverelse1";
            $_SESSION['var2'] = "whateverelse2";
            // ...
            $_SESSION['var10'] = "whateverelse10";
            break;
    }
}
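
Or, if I keep everything in the single multidimensional array from my first post, I guess the 100-case switch disappears and I'd just copy the current domain's block into the session once; roughly like this (falling back to 'arr1' is just a guess at a sensible default):

session_start();

if (!isset($_SESSION['config'])) {
    // $hash is the single multidimensional array from the first post,
    // keyed by domain; only the current domain's 10 values get stored.
    $_SESSION['config'] = isset($hash[$domain]) ? $hash[$domain] : $hash['arr1'];
}

// Then on any later page in the same session:
echo $_SESSION['config']['var1'];
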