Getting random errors 500

  • Getting random errors 500

    Hello,
    Three months ago I purchased an entry-level hosting package from a provider running CloudLinux, to test a PHP-driven community software. Everything ran flawlessly until June, when the server started dropping one or more requests completely at random with error 500.
    In the log I got lines like: Cannot allocate memory: couldn't create child process: /opt/suphp/sbin/suphp for /home/ongolito/http://newblue.ongolito.net/index.php, referer: https://newblue.ongolito.net/
    As you can see from my screenshots, one time a JavaScript file fails to load, another time two images are missing.
    I've been told by support that the community software is reaching the limits of the hosting package, which doesn't really make sense to me: the login page uses 20M and the home page 4M, which is far below my package limit of 2000M.

    Having read several articles here, I've learned that CloudLinux would issue an error 503 in a case like that, not an error 500.
    So my question: Is this problem related to CloudLinux at all?

    Yesterday my hosting package was moved to another server, and everything is fine again, as in the beginning. But of course, with no real answer as to what was causing those faults, I'm a little afraid of running into the same problem again.
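
    For reference, a minimal sketch of how those per-page figures (20M / 4M) could be confirmed, assuming the package allows an auto_prepend_file directive in php.ini or .user.ini; the file name memlog.php is only an example, not something from this thread:

        <?php
        // memlog.php - logs the peak memory of every PHP request.
        // Enable via: auto_prepend_file = /home/ongolito/memlog.php
        register_shutdown_function(function () {
            $peakMb = memory_get_peak_usage(true) / 1048576; // bytes -> MB
            error_log(sprintf("%s %s peak=%.1fM\n",
                date('c'),
                $_SERVER['REQUEST_URI'] ?? 'cli',
                $peakMb
            ), 3, __DIR__ . '/memlog.txt');
        });

    Note that this only reports what the PHP script itself allocates; the CloudLinux limits also count the other processes running inside the account's LVE, such as the suPHP handler.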

  • #2
    You are hitting the virtual memory limit (as there are faults), and this is causing the issues. Feel free to point your host to this thread. Virtual memory limits are deprecated and are not needed when physical memory limits are used (as in this case). They serve no purpose and can cause exactly such issues. It is highly recommended that virtual memory limits be disabled (set to 0).
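
    On a CloudLinux server this is typically done with lvectl or through the LVE Manager interface; as an illustrative example only (1000 is a placeholder LVE id, and the exact options vary between CloudLinux versions):

        lvectl set 1000 --vmem=0

    Setting the limit to 0 leaves virtual memory unlimited while the physical memory (pmem) limit does the actual enforcement.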



    • #3
      Thank you Igor.

      But wouldn't that mean the usage info is incorrect? Neither the virtual memory used nor the physical memory is anywhere close to my limit of 2048M. So why am I running into faults sometimes, and only sometimes? Calling the same page with just one user can't result in a fault one time and not the next. (I had closed my site to the public to avoid any interference while making that test.)

      And (I repeat myself) there are no more problems with the same package on the new server. The difference (if I can trust viewdns.info): the old server hosts 391 clients, the new one only 142.

      So to me it looks like the CloudLinux limits my hoster hands out (2048M each for the entry package) add up to far more than the resources the machine can actually handle. Then it also starts to make sense that I'm getting faults already at 20M, because the machine is stuffed to its limit trying to handle the requests of 390 other clients simultaneously. And perhaps that is why I'm getting no CloudLinux 503 error but a server error 500: the machine is simply running out of memory.
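
      To put a number on that over-commitment (purely illustrative, since the actual RAM of the old server is unknown): if all 391 accounts get the same 2048M limit, the limits add up to 391 x 2048 MB, roughly 782 GB, far more than a typical shared-hosting machine physically has.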

      Kind regards.
      michael



      • #4
        The stats are correct, everything is correct. Averages are just that -- averages. If you used 2GB of vmem for 0.1 seconds out of 1 minute, and 0 for the rest, that would show up as only 2GB/600 (about 3.4MB) usage on average.

        If you want to see whether you are hitting the limits, check the fault counters.
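
        Fault counts per limit (pmem, vmem, EP) are usually visible to the customer in the hosting control panel's resource-usage page provided by CloudLinux's LVE Manager plugin; the host can also pull the history server-side with lveinfo, for example (the id and flags are illustrative and depend on the lve-stats version):

            lveinfo --period=1d --id=1000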

