On Windows Server 2012 with 2 CPUs, I run Apache 2.4 with PHP 5.6, and when I generate a PDF document with DOMPDF, a cumulative 50% of total CPU power is used. No matter what I do, I cannot push the total over 50%. I tried opening a bunch of windows and generating several series of PDF docs at the same time.
Each individual CPU stays below 50%, and when one spikes up the other spikes down at the same time. It seems like Windows is limiting the Apache service to 50% of the CPU. Is there somewhere to change this?
Edit: my application is already utilizing both CPUs, just not to their full capacity, and after 60 seconds of load the utilization moves to 100%. I don't think it has anything to do with threading... maybe an environment setting?
It's not a Windows limitation but the program's own design. I think it is related to CPU cores (for example, the machine has 4 cores and the program uses just 2, which is literally 50%).
As far as I know you cannot do anything about this; the work cannot be spread across more cores unless the program is designed for it.
Related
I am using Laravel 8 and wkhtmltopdf to convert dynamically generated HTML to PDF files. The problem is that the process is too slow. Each job is split into at least 20 chunks, and each chunk is a job in a batch that then gets queued according to these instructions. Using supervisord, I am spawning 20 PHP processes. When I run the document generator, all 20 PHP processes get to work, and they launch 20 wkhtmltopdf processes. Everything is happening in the same docker container.
Increasing resources from 1 CPU and 1GB RAM to 2 CPUs and 4GB of RAM nearly doubled the speed. Sadly, going from that to 4 CPUs and 8GB of RAM has not brought any measurable speed gain.
Current speed is 4 files per second. I have tens of thousands to process. Where can I get more performance? What can I change without adding more hardware resources?
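For reference, a supervisord program section matching the setup described above might look like this (the file name, paths, and program name are assumptions; `numprocs=20` mirrors the 20 workers):

```ini
; laravel-worker.conf (sketch; adjust paths to your container)
[program:laravel-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/html/artisan queue:work --sleep=1 --tries=3
autostart=true
autorestart=true
numprocs=20
redirect_stderr=true
stdout_logfile=/var/www/html/storage/logs/worker.log
```

Since wkhtmltopdf is CPU-bound, running far more workers than CPU cores tends to add contention rather than throughput; matching `numprocs` to roughly the core count is a common starting point, and if throughput stops scaling with extra cores the bottleneck may be elsewhere (disk I/O, memory, or the queue backend).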
I recently looked at the stats on my VPS dashboard, and I wonder why RAM usage is so high when the number of visitors is not booming.
The product details are as follows:
OS : CentOS 7 64bit
CPU Cores count : 3
Total CPU(s) speed : 4800 MHz
Memory : 2 GB
The number of online visitors averages around 30 people, and this takes up about 50% of the available memory. So I estimate that if the number of online visitors reaches 60, RAM will be overloaded.
Is this a reasonable level, or do I need a strategy to prevent the site from going down?
Additional information: the site is NOT built with WordPress or anything similar. All suggestions and opinions are greatly appreciated, thank you.
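The estimate in the question can be sketched with some quick shell arithmetic (figures taken from the question; integer division, so it is rough):

```shell
# Rough per-visitor memory estimate from the numbers above
total_mb=2048        # 2 GB RAM
visitors_now=30      # average online visitors
used_pct=50          # roughly half the memory in use
per_visitor_mb=$(( total_mb * used_pct / 100 / visitors_now ))
projected_mb=$(( per_visitor_mb * 60 ))   # projected usage at 60 visitors
echo "per-visitor: ${per_visitor_mb} MB, projected at 60 visitors: ${projected_mb} MB"
# prints: per-visitor: 34 MB, projected at 60 visitors: 2040 MB
```

So roughly 34 MB per visitor, and 60 visitors would indeed land near the 2 GB ceiling, which supports the concern in the question; the real per-visitor footprint depends on what each request actually does.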
The subject is too broad.
Manage your cache
Make use of virtual memory
Use pointers and references when programming: even if you are not using C++, you can still pass large objects and files by reference.
Kill unused processes
Uninstall unnecessary applications, including startup applications if you know what you are doing.
Do not open many windows at once
Prefer text and command-line utilities over graphical interfaces.
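On a Linux VPS like the one described above, the "kill unused processes" and "uninstall unnecessary applications" tips usually start with finding what is actually using the memory; a quick sketch (GNU procps assumed):

```shell
# Show the top 5 memory consumers; anything unneeded here is a
# candidate for stopping or uninstalling
ps aux --sort=-%mem | head -n 6
# Overall memory picture in megabytes (used, free, buffers/cache)
free -m
```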
I've been working on an enterprise application with a user base of 1000. There are about 20-30 concurrent users at any given time, and around 100 at peak times.
Current configuration:
LAMP Stack
MySQL 5.5.29, PHP 5.3.20 - CodeIgniter Framework, Front-end Action script with Flex Framework
Intel Xeon, 32 cores @ 2.9 GHz
161 GB of RAM
2 TB Hard drive
Linux Fedora 16 OS
I have just recently received a requirement that we will be granting access to different departments which will add about 5000 more users.
I've been mostly a front-end developer, with the last two years doing back-end development as well. However, I do not have much experience with load balancing, what hardware configuration can support what, etc.
Currently, I am rebuilding the application from the ground up, trying to optimize areas with bottlenecks and to add certain features so we don't experience issues with the new additions.
First, are there any phpMyAdmin-type dashboards that can be installed on Linux that will give me a nice GUI showing CPU load, memory usage, hard drive reads/writes, etc.?
Second, are there any suggestions from people with experience in this area on what I can do to optimize the server?
Third, is my server configuration enough to support that many users? The application workload will be about 75% SELECT statements with joins (to display reports) and 25% writes/updates to the database.
Thanks in advance for your help.
I'm a newbie to memcache. My host (HostGator) says they recommend memcache on VPS 5 and up. I'm on VPS 4 (CentOS, 2 GHz, 1.3 GB RAM, 60 GB HDD), and VPS 5 is CentOS, 2.7 GHz, 1.8 GB RAM, 80 GB HDD.
Does memcache require that many resources? I thought VPS 4 would do, but my host thinks otherwise. Can anyone suggest why memcached's resource needs are so high?
thanks
Anita
That depends on how much data you want to store in memcache. All data is kept in RAM, so if you are going to keep a lot of data there, you will also need a lot of RAM. If your VPS 4's RAM is already fully used by the sites/apps you run, it is reasonable to move to VPS 5 (or even to a system with a large amount of RAM). Memcache will use some additional CPU, but its CPU needs should not be very high. If what you host is simple enough, though, it looks like they just want to sell you VPS 5.
Determine how much data you are going to need to cache. You just need an idea of how large the data objects will be and how many of them you think there will be.
You really just need to look at the memcached stats: shoot for a hit rate of 90%+ (depending on application needs), an eviction rate of 0, and perhaps around 50% memory usage when the cache is fully populated, to provide for growth, spikes, etc.
Once you know how much that memory allocation is, you can size your server's memory properly.
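The hit rate mentioned above comes from the `get_hits` and `get_misses` counters in the output of memcached's `stats` command; a sketch with hypothetical numbers:

```shell
# Hypothetical counters, e.g. read from: printf 'stats\r\nquit\r\n' | nc localhost 11211
get_hits=90000
get_misses=10000
hit_rate=$(( 100 * get_hits / (get_hits + get_misses) ))
echo "hit rate: ${hit_rate}%"
# prints: hit rate: 90%
```

Memory usage can be judged the same way from the `bytes` and `limit_maxbytes` stats.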
We have a website that previously worked with a memory limit of 12 MB (12 MB in php.ini, and 16 MB in settings.php).
After moving to a new server, it started giving memory limit errors and displaying a half-blank screen.
We increased the limit in both files (php.ini and settings.php) and now it works, but I don't understand how it can now need a considerably larger amount of memory (it used to work with 12 MB; now it won't work with less than 20 MB).
I assume you did not change the OS in the process. Moving from Windows to Linux or vice versa is quite likely to change resource usage.
And this is a long shot, but perhaps you moved from a 32-bit system to a 64-bit one? This would slightly increase memory usage as addresses (pointers) are twice as large on 64-bit architectures, and code with lots of small objects uses plenty of pointers.
On the whole though, we can't tell you much without seeing what changed about the system.
12 MB is too low unless you run a bare Drupal install with nothing added. A higher limit is recommended: the more modules you install, the more memory you need. Usually 96 MB is enough, even with image processing.
12 MB is really very low. I would tend not to worry about the difference and just move on.
Ideas what could have changed, though:
The old server could have had modules installed that reduced memory usage, e.g. memcache
The new server may have to rely on the GD library for image processing, while the old server may have had ImageMagick (which is an external tool, so its usage doesn't count towards the PHP memory limit)
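For reference, raising the limit in both places the question mentions would look roughly like this (96M matches the Drupal-oriented suggestion above; pick a value that fits your modules). In php.ini:

```ini
; php.ini - PHP's per-request memory ceiling
memory_limit = 96M
```

In Drupal's settings.php the equivalent is `ini_set('memory_limit', '96M');`, and the effective php.ini value can be checked with `php -i | grep memory_limit`.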