by NcongruNt
Thu Oct 18, 2007 4:51 pm
Forum: Site Announcements, Questions & Suggestions
Topic: 500 Sever Error
Replies: 20
Views: 3436

Kalrog wrote:
Charles L. Cotton wrote:
Kalrog wrote:
Charles L. Cotton wrote:
  • 65 MB of memory per script
  • 180 seconds per script execution
  • 20 simultaneous processes
I know what it means... but I am having a problem understanding why this site would consume more than that. Or do you have multiple sites all being run from the same hosting platform (I would bet that you do).
This one, http://www.TexasShooting.com and one other for a friend. TexasShooting.com is not active and the other site sees little traffic. Does "20 simultaneous processes" mean only 20 "searches" can be active at any one time?

Chas.
So let's assume for a minute that you are hitting the 20-process limit. A process is basically a worker. You can have one huge worker that does everything, or you can have millions of workers that each do a little bit. There are arguments about which is better, but as with most things, the best answer is usually somewhere in the middle. So let's list the processes that we know are running.

1) A web server process (Apache or IIS depending on your OS, but since this forum runs PHP I will assume it is a Linux host running Apache).
2) The forum software itself is at least one process.
3) The database is at least one process.

Beyond that it is tough for me to know exactly how many are running. Take the example of running a search, or even just requesting a page. This forum is DB-driven, so there is almost certainly a process that handles the communication between the web server (Apache) and the database (probably MySQL).

Here is where it gets tricky. There is something called connection pooling, which lets the website check whether there is already an open connection to the database; if there is, it reuses that one, and if not, it creates one. The alternative is for each page view or search to open its own connection to the database and close it when it is done. I could try (and probably fail) to explain exactly when one is better than the other, but let's just say that each approach has its benefits and there is no one right answer. People get paid lots of money to set these things up for a reason.
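The pooling idea can be sketched in a few lines. This is a minimal, hypothetical illustration in Python (the forum itself runs PHP against MySQL, but the pattern is the same): reuse an idle connection if one exists, otherwise open a fresh one.

```python
import queue

class ConnectionPool:
    """Hand out existing idle connections before opening new ones."""

    def __init__(self, connect, max_size=5):
        self._connect = connect             # factory that opens a new connection
        self._idle = queue.Queue(max_size)  # idle connections waiting for reuse

    def acquire(self):
        try:
            return self._idle.get_nowait()  # reuse an open connection if one exists
        except queue.Empty:
            return self._connect()          # otherwise open a fresh one

    def release(self, conn):
        try:
            self._idle.put_nowait(conn)     # keep it around for the next request
        except queue.Full:
            conn.close()                    # pool already full; close the extra

# Example with a dummy connection standing in for a real MySQL handle:
class DummyConn:
    def __init__(self):
        self.closed = False
    def close(self):
        self.closed = True

pool = ConnectionPool(DummyConn, max_size=1)
c1 = pool.acquire()   # pool empty -> a new connection is opened
pool.release(c1)      # returned to the pool instead of being closed
c2 = pool.acquire()   # the same connection is reused
print(c1 is c2)       # -> True
```

The per-request alternative is simply `connect()` at the top of the script and `close()` at the bottom; pooling trades a little memory (idle connections) for not paying the connection-setup cost on every page view.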

With that said, it sounds like you need to head down the path of reducing the number of processes running, because I haven't seen anything to make me even suspect that you could be hitting the other two limitations you mentioned. If you want to PM or email me some software specifics (or I can just go dig and probably find them), I can do some of that checking for you. Are there any other limitations that you might be hitting?
My guess is that the process limit they quote refers to Apache child processes. They probably have the config defined to limit the processes to 20. Within that constraint, Apache allocates sessions across those processes. The more sessions a process takes on, the more memory it uses. The config also defines how many sessions a process can handle.

My guess is that either one of two things is happening:

1) The processes have a session limit low enough that the growing number of users is hitting that limit across all 20 processes.

OR

2) The processes are hitting the memory limit.

Looking at the Apache error logs would tell you what is happening either way.

If the issue is the session limit, Apache can be reconfigured to allow more sessions per process.
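For example, with Apache's worker MPM the relevant directives might look like the following. These values are illustrative only; the exact directives depend on which Apache version and MPM the host is running:

```apache
# httpd.conf (worker MPM) -- illustrative values, not the host's actual config
ServerLimit       20    # at most 20 child processes (the host's stated limit)
ThreadsPerChild   25    # how many requests each process can serve at once
MaxClients       500    # 20 processes x 25 threads = 500 concurrent requests
```

Note that `MaxClients` cannot exceed `ServerLimit` times `ThreadsPerChild`, so raising capacity without adding processes means raising `ThreadsPerChild` (at the cost of more memory per process).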

If it is a memory usage issue, then the terms of the hosting plan will need to be renegotiated to address the problem.


Charles: In answer to your question, a process is simply a single worker program that answers requests from users. Within a single process, Apache has a defined number of sessions allowed. Generally, a single session is created for every person connecting to the website and has a specific timeout limit so that the session doesn't stay open indefinitely.
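In Apache's own configuration the closest equivalents to that timeout behavior are the connection-timeout directives, which keep an idle connection from staying open forever. Illustrative values:

```apache
# Illustrative connection-timeout settings, not the host's actual config
Timeout              300   # drop a connection after 300s with no activity
KeepAlive            On    # let a client reuse its connection for several requests
KeepAliveTimeout     15    # close an idle kept-alive connection after 15s
MaxKeepAliveRequests 100   # limit how many requests one connection may make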

Effectively, your concurrent user limit is the number of processes (20) multiplied by the sessions per process limit.
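With the host's 20-process limit, and assuming a hypothetical 25 sessions per process (the real figure is whatever the Apache config says), that works out to:

```python
processes = 20             # the host's stated simultaneous-process limit
sessions_per_process = 25  # hypothetical; set by the Apache config
print(processes * sessions_per_process)  # -> 500 concurrent users
```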

(edited for clarity)
