I am working on an IoT project in which several sensors will send data to my WAMP server over the internet, and I will log that data in my database.
My question is: can Apache + MySQL handle data of this volume?
Nearly 800 data points arrive from the sensors, each at a different URL on my server.
That data needs to be inserted into different tables in the database.
These 800 data points arrive about every 5 seconds, 24/7, so on average I will need to fire 800-900 queries every 5 seconds.
Would WAMP and MySQL be sufficient to handle this density of data? If not, what other server should I use? Or would I need to install a server OS instead of Windows?
My PC specs: Intel Core i7, 16 GB RAM, 2 GB NVIDIA 1080 graphics.
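For scale, 800-900 inserts every 5 seconds works out to roughly 160-180 inserts per second. A minimal sketch of one way to batch them into multi-row INSERTs (the table name, columns, and credentials here are placeholders, not a real schema):

```php
<?php
// Rough sketch only: batch many sensor readings into one multi-row INSERT
// instead of firing one query (and one connection) per reading.
// DSN, credentials, table and column names are placeholders.
$pdo = new PDO('mysql:host=127.0.0.1;dbname=iot', 'user', 'pass', array(
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
));

// $readings is the batch collected during the last interval:
// array(array(sensor_id, value, recorded_at), ...)
function insertBatch(PDO $pdo, array $readings)
{
    if (empty($readings)) {
        return;
    }
    // One "(?, ?, ?)" group per reading.
    $placeholders = implode(', ', array_fill(0, count($readings), '(?, ?, ?)'));
    $sql = "INSERT INTO sensor_log (sensor_id, value, recorded_at) VALUES $placeholders";

    $params = array();
    foreach ($readings as $row) {
        $params[] = $row[0];
        $params[] = $row[1];
        $params[] = $row[2];
    }
    $pdo->prepare($sql)->execute($params);
}
```

The point of the multi-row INSERT is that one round trip carries a whole batch, so the database sees far fewer than 800 separate statements per interval.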
NO!
But that is not WAMPServer's fault.
Windows 10 itself can only handle about 20 remote connections, as Windows itself uses some of the maximum 30 that are allowed.
And before you ask: no, this is not configurable, otherwise no one would buy Windows Server.
However, if you want to use WAMPServer for testing (because that's what it and XAMPP et al. were designed for), then you should not have too many problems. It will cope with the volume of traffic, just not hundreds of connections at the same time.
I have a test instance of Prestashop in AWS on EC2 t3.small (2 vCPUs, 2 GB memory) running Ubuntu with Apache2 + PHP 7.4 + MySQL 5.7.
I have cloned exactly the same setup to Azure App Service with PHP 7.4 (B1, 1 vCPU, 1.74 GB memory) and MySQL Flexible Server 5.7 (Burstable, 1 vCore, 2 GB memory, 640 IOPS). MySQL is accessible via public network. Both the App Service and MySQL are in the same region.
Both setups have the same configuration, yet the AWS-hosted Prestashop takes on average 2-3 seconds to load any page, while the one in Azure takes around 1 minute and 30 seconds to load any page.
Metrics show none of the resources (CPU, memory, IOPS) reaching 100% usage.
I upgraded the Azure App Service to Premium (P1V2) and MySQL to the Business Critical tier (Standard_E2ds, 2 vCores, 16 GiB, 5000 IOPS) and the results are the same.
Prestashop's debug mode shows a huge amount of time spent on querying.
I also connected to both the AWS and Azure MySQL instances directly and executed the same query.
On average the AWS one is 3 times faster than the Azure one (100 ms vs 310 ms).
One approach I haven't tried is to put MySQL on a VNet, but would that improve the performance at all?
Maybe there is something I'm missing in the setup, or maybe MySQL performance in Azure is questionable. I have seen other posts stating that running MySQL in an Azure VM gives better performance than using the managed one, which would be crazy.
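For reference, a minimal sketch of the kind of timing comparison described above (hostnames, credentials and the query are placeholders, not the actual Prestashop query):

```php
<?php
// Rough latency comparison between two MySQL endpoints.
// DSNs, credentials and the test query are placeholders.
function timeQuery($dsn, $user, $pass, $sql, $runs = 20)
{
    $pdo = new PDO($dsn, $user, $pass, array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION));
    $total = 0.0;
    for ($i = 0; $i < $runs; $i++) {
        $start = microtime(true);
        $pdo->query($sql)->fetchAll();
        $total += microtime(true) - $start;
    }
    return ($total / $runs) * 1000; // average in milliseconds
}

$sql = 'SELECT * FROM ps_product LIMIT 100'; // placeholder query

printf("AWS:   %.1f ms\n", timeQuery('mysql:host=aws-host;dbname=shop', 'user', 'pass', $sql));
printf("Azure: %.1f ms\n", timeQuery('mysql:host=azure-host;dbname=shop', 'user', 'pass', $sql));
```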
I have a production application which uses barcodes, built with PHP/MySQL on a XAMPP server. Once 5 specific tables hold more than 50,000 rows, performance slows down on both inserting and fetching data. I have already indexed the columns. There are more than 50 users, 15 of whom enter data. It is installed on VMware with 14 GB RAM and a Xeon processor (see the PC config and XAMPP config screenshots). Now I want to increase innodb_buffer_pool_size. It is currently 2G, but if I increase it to 3G or above, MySQL stops unexpectedly. Can anyone give me a solution for how to increase it?
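A minimal sketch of a check worth running before resizing (credentials are placeholders): whether the bundled MySQL/MariaDB build is 64-bit, since a 32-bit build cannot address a buffer pool much beyond 2 GB and will typically crash if asked to, and how much of the current 2G pool is actually in use:

```php
<?php
// Rough diagnostic sketch; host and credentials are placeholders.
$pdo = new PDO('mysql:host=127.0.0.1', 'root', '', array(
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
));

// version_compile_machine shows e.g. "x86_64" for a 64-bit build.
foreach ($pdo->query("SHOW VARIABLES LIKE 'version_compile_machine'") as $row) {
    echo $row['Variable_name'], ' = ', $row['Value'], PHP_EOL;
}

// Buffer pool usage: free vs total pages (16 KB each by default).
foreach ($pdo->query("SHOW GLOBAL STATUS LIKE 'Innodb_buffer_pool_pages%'") as $row) {
    echo $row['Variable_name'], ' = ', $row['Value'], PHP_EOL;
}
```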
Here are the website specs for the configuration below.
Social networking site, roughly 70% dynamic content
Linux CentOS 6.6
Apache web server
PHP
Server specs × 2 (main server and SQL server):
4 x Intel® Xeon® E5-4640 v2 2.20GHz, 20M Cache, 8.0GT/s QPI, 10 Core
48 x 16GB (768 GB) RDIMM, 1600MT/s, Low Volt RAM
4 x 300GB 15K RPM SAS 6Gbps
Other storage = Dell Direct-Attached Storage (DAS)
Network = 10 gigabit/sec
Assume that memcache, a load balancer, and other extra servers are present and not included in this.
(I just need a rough calculation.)
My questions are:
How many concurrent users (users clicking at the same time) can this platform handle, assuming the average user connection is 512 kilobit/sec?
Which factors does the number of concurrent users depend on? (RAM > CPU > HDD, is this right?)
I am not an expert; this question is for educational purposes only.
This question is very vague. The load that you can support will depend on the complexity of your PHP code and database design... planning (and even testing) load is a complicated topic.
You could also configure your hardware in a variety of ways which will have an impact on performance. Which RAID system you use will depend on whether your application is read or write heavy, as will your database design.
You will also need to consider whether to use virtualisation for backup/redundancy, which adds a layer of performance overhead...
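That said, a rough back-of-envelope illustration is possible for the bandwidth part alone (the request-rate figures below are assumptions, not measurements):

```php
<?php
// Back-of-envelope only: a pure bandwidth ceiling, ignoring CPU, RAM,
// database time and everything else warned about above.
$linkBitsPerSec = 10 * 1000 * 1000 * 1000; // 10 gigabit/sec uplink
$userBitsPerSec = 512 * 1000;              // 512 kilobit/sec per user

$bandwidthCeiling = $linkBitsPerSec / $userBitsPerSec;
echo 'Bandwidth-bound ceiling: ', number_format($bandwidthCeiling), " simultaneous streams\n";

// A more realistic bound: concurrency ~ requests/sec x seconds per request.
$requestsPerSec = 2000;    // assumed capacity of the PHP/MySQL stack
$secondsPerRequest = 0.2;  // assumed average page generation time
echo 'Request-bound estimate: ', $requestsPerSec * $secondsPerRequest, " in-flight requests\n";
```

The bandwidth ceiling is the easy part; the request-bound figure is what actually varies with the code and schema, which is why only load testing gives a trustworthy number.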
Bear with me here.
I'm seeing some php.ini processes (?) running, or processes touching php.ini, that are using 80% or more of the CPU, and I have no idea what would cause this. All database processing is offloaded to a separate VPS, and the whole service is behind a CDN. I've provided a screenshot of "top -cM".
Setup:
MediaTemple DV level 2 application server (the server we are looking at in the images), 8 cores, 2GB RAM
MediaTemple VE level 1 database server
Cloudflare CDN
CentOS 6.5
NginX
MySQL 5.4, etc.
EDIT
I'm seeing about 120K pageviews a day here, with a substantial number of concurrent connections
Where do I start looking to find what is causing this?
Thanks in advance
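One low-effort way to narrow it down (a rough sketch, assuming the heavy processes are PHP workers serving web requests): load a tiny profiler on every request via PHP's auto_prepend_file setting and log only the slow ones, so the offending scripts show up in the error log:

```php
<?php
// profile_prepend.php -- rough sketch, not production code.
// Loaded on every request so that slow/heavy requests get logged,
// which should point at the scripts behind the CPU spikes.
$GLOBALS['__req_start'] = microtime(true);

register_shutdown_function(function () {
    $elapsed = microtime(true) - $GLOBALS['__req_start'];
    if ($elapsed < 1.0) {
        return; // only log requests slower than one second
    }
    $uri = isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : 'cli';
    error_log(sprintf(
        'SLOW %.3fs peak=%dKB uri=%s',
        $elapsed,
        memory_get_peak_usage(true) / 1024,
        $uri
    ));
});
```

It would be enabled with something like auto_prepend_file = /path/to/profile_prepend.php in php.ini (the path is a placeholder) and removed once the culprit is found.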
I am having a bit of an issue with a PHP script. When I open my site (hosted locally), it pauses for 1-2 seconds and then loads the page.
The database I am reading data from is very small and has indexes. The queries are quick.
My PHP code is somewhat optimized, and my databases are indexed.
PHP 5.3.19 is installed on Windows Server 2008 R2 (Intel Xeon CPU E5-2400 0 @ 2.20 GHz, 2 processors, 16 GB of RAM), and MySQL Server is installed on a different server. Both servers are on the same network, so all connections should be internal.
I also use PDO to connect to my databases.
How can I determine what is causing the extra delay?
What things can I check for to expedite the page load?
Thanks
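A minimal timing sketch along these lines (the DSN, credentials and query are placeholders) would show whether the 1-2 seconds go into connecting, querying, or rendering:

```php
<?php
// Rough timing sketch; DSN, credentials and query are placeholders.
$t0 = microtime(true);

$pdo = new PDO(
    'mysql:host=db-server;dbname=app',
    'user',
    'pass',
    array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION)
);
$t1 = microtime(true);

$rows = $pdo->query('SELECT * FROM some_table LIMIT 10')->fetchAll();
$t2 = microtime(true);

// ... render the rest of the page here ...
$t3 = microtime(true);

error_log(sprintf(
    'connect=%.3fs query=%.3fs render=%.3fs total=%.3fs',
    $t1 - $t0, $t2 - $t1, $t3 - $t2, $t3 - $t0
));
```

If most of the time lands in the connect step, name resolution between the two servers is the first thing worth checking.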
In my experience, there might be JavaScript or other script files that you have referenced in your code.
If the browser finds a script file missing (due to a wrong path or whatever other reason), it keeps waiting for it until the request times out, which can take around 2 to 5 seconds, depending on the browser's settings.
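If missing assets are indeed the cause, a small guard like the following (the file names are placeholders) avoids emitting tags for files that do not exist:

```php
<?php
// Only emit <script> tags for files that actually exist on disk,
// so a wrong path does not leave the browser waiting on a 404 or timeout.
// The paths below are placeholders.
$scripts = array('js/app.js', 'js/vendor/jquery.min.js');
foreach ($scripts as $src) {
    if (file_exists(__DIR__ . '/' . $src)) {
        echo '<script src="' . htmlspecialchars($src) . '"></script>' . PHP_EOL;
    } else {
        error_log("Missing script asset: $src");
    }
}
```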