PHP application takes a long time to respond [closed] - php

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 8 years ago.
I have developed an application in PHP with a complete user management system, dynamic form creation, data import/export, and more. I am using MySQL as the database. When I was testing the application it worked perfectly fine. I have now deployed it at the customer's site, where roughly 50-60 users run it. It has been two months, and they are now reporting problems: sometimes the application responds very late, and sometimes it takes a very long time to respond at all. For example, to use the application a user needs to log in; sometimes login works perfectly and the user gets in easily, and sometimes it takes a long time. I looked into this personally and hit the same problem. Now I am confused about where the actual problem is:
My application
Network speed
Server
Large data in SQL
How can I get any clue as to where the exact problem lies?

You're going to need to provide a LOT more information to get a decent answer. And in providing that data, you will almost certainly solve the problem...
In most database-driven applications, the database is the first place where performance issues arise, especially as the data scales. A system that works fine with just a few records in the database can grind to a halt when scaling up...
So the first thing I'd do is look at the database processes while people are using the system, and look for long-running queries. Optimize those queries, and rinse & repeat.
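A hedged starting point for that inspection, assuming MySQL (the statements are standard MySQL; the 1-second threshold and log path are illustrative, and the sample query assumes a "users" table that may not match the asker's schema):

```sql
-- See what is running right now, while users report slowness:
SHOW FULL PROCESSLIST;

-- Record every statement slower than 1 second for later analysis:
SET GLOBAL slow_query_log = 'ON';
SET GLOBAL long_query_time = 1;
SET GLOBAL slow_query_log_file = '/var/log/mysql/slow.log';  -- example path

-- For any suspect query, check whether it can use an index:
EXPLAIN SELECT * FROM users WHERE username = 'alice';
```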
It may also be worth writing debug log statements around your database access logic so you can look at historical performance stats - this is useful if the client tells you that the system slowed down yesterday, but is running fine today.
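A minimal sketch of such logging in plain PHP 7.4+ (the helper name `timed` and the 0.5-second threshold are illustrative, not from the question):

```php
<?php
// Time any database call and log the slow ones for historical analysis.
function timed(callable $fn, string $label, float $thresholdSec = 0.5)
{
    $start = microtime(true);
    $result = $fn();                     // run the actual query
    $elapsed = microtime(true) - $start;
    if ($elapsed > $thresholdSec) {
        // Goes to the web server's error log by default.
        error_log(sprintf('[SLOW %.3fs] %s', $elapsed, $label));
    }
    return $result;
}

// Usage, assuming $db is a mysqli connection:
// $rows = timed(fn () => $db->query('SELECT ...'), 'login lookup');
```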
If it isn't the database, you need to look at the PHP performance. You'll need a profiler (try XDebug) and look for bottlenecks in your PHP code.
If it's neither the database nor your PHP code, you may have a configuration problem with your web server software (Apache or whatever you're using). This is hard to debug - you'll need to trawl through the configuration files and look for limits (e.g. Apache's "MaxClients"/"MaxRequestWorkers").
If it's not those things, the network may be the problem. This is pretty rare - if the network can't support a web application, file sharing, email and video conferencing will all suffer too. Ask your client if this is the case. To prove or disprove, put a decent size file on your webserver (a 20MB MP3 file should do it) and test how long it takes to download it while the application is running slowly.

If the problem is with your application, try optimizing the code. This point can't be made concrete, as the code is not provided.
Try pinging the server and check the response time. If it is normal, the network is probably not the issue.
Check the server's hardware configuration (for both the application and MySQL servers); if the hardware is underpowered for the application, upgrade it.
THIS IS MOST LIKELY TO BE THE SOLUTION: If your MySQL database holds a large amount of data, try adding indexes. For instance, you had a problem with logging in, so try indexing the username column in your "users" table.
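For the login example, a sketch of that index (the table and column names are assumptions about the asker's schema):

```sql
-- Speed up WHERE username = ? lookups during login:
ALTER TABLE users ADD INDEX idx_username (username);

-- Confirm the optimizer now uses it (look for idx_username in the key column):
EXPLAIN SELECT id FROM users WHERE username = 'alice';
```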
To be frank, the data provided in the question is insufficient to come up with a concrete solution.

If the speed of your application was fine when you deployed it initially, I suspect that the problem is the database.
Is your database normalized? Do you have the correct indexes? You can try indexing the columns that you use in WHERE clauses.
But, as Abhay Pai said, the data provided is insufficient to solve this problem.

Related

Our CentOS server is infected with some malware. It is making calls to random IPs/domains. How do we stop the server from making external requests? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 4 years ago.
We have a CentOS [release 6.8 (Final)] server with Drupal 7.51, PHP 5.3.3 and Apache.
When we go to certain pages on the server, the server makes calls to random IP addresses. We traced the issue with the tcpdump command. Here's the output:
In the image, the IP 45.250.47.93 is from our network, while cpe-24-194-158-202.nycap.res.rr.com & 52.128.135.13 do not belong to us. These IP addresses keep changing with every request.
How should we secure our server from making these requests to random IP addresses?
Some more background:
Last evening, some of our website pages started getting redirected to ad servers automatically. On investigating we realised that some php files were created on our server and a crontab was added. We removed all the php files that were not created by us and also disabled the cron. Since then the redirecting to ad servers has stopped, but some pages are sending out requests to random IP addresses.
You should definitely isolate that server ASAP before it causes more damage. Trying to fix it online is not an option IMHO, as you will never be sure you have cleaned it completely unless you fully understand how they got in (difficulty varies) and what was done (the most difficult part, even for experts).
The proper course would be to stop/isolate that server, identify the origin of the hack (e.g. how they got in), fix it, and reinstall a clean and fixed (or at least mitigated) instance on a new server (the hacked one should at least be formatted before re-use, or even have its BIOS flashed in case the hack was sophisticated).
The investigation can be long and should really be performed offline. If it is a VM and you can create a snapshot, do it, and use it offline for forensic analysis. Otherwise, you may be able to access the FS in rescue mode and copy all of it.
It may not seem like an option to stop your service, but believe me, downtime is preferable to being the origin of further spreading of the malware.
Since Drupal and PHP are involved, they are likely to have been the entry point. Do a full diff against trusted sources; you may find new or altered files and understand the leak.
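A hedged sketch of such a diff, demonstrated on two throwaway directories so the commands are runnable as-is; in practice the "pristine" side would be the matching Drupal 7.51 release unpacked fresh from drupal.org:

```shell
# Demo trees standing in for the deployed site and an unpacked pristine release:
mkdir -p /tmp/pristine /tmp/deployed
echo '<?php // original' > /tmp/pristine/index.php
echo '<?php // original' > /tmp/deployed/index.php
echo '<?php eval($_POST["x"]); // planted backdoor' > /tmp/deployed/shell.php

# -r recurse, -q report only which files differ or exist on one side;
# diff exits non-zero when the trees differ, hence "|| true" in scripts.
diff -rq /tmp/pristine /tmp/deployed || true
# Files present only in the deployed tree (like shell.php) are prime suspects.
```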
I agree with everything that fab2s said.
The random pages sending requests sound to me like a piece of code on those pages was modified to send the requests. It could be anything from a shell of sorts to some kind of user-data extraction call.
A diff of your application against the production copy in your version control may show some leads.
Checking any logging that your application may have could turn up something.
But the creation of the crontab implies that the attacker had some kind of access to the system, most likely through a vulnerability in your web application.
Checking these CentOS logs may help you find the time the machine was logged into by the attacker:
All Users access log: /var/log/wtmp
Root access log: /var/log/secure
https://www.centos.org/forums/viewtopic.php?t=15117

using phpfastcache file cache system

I have a question regarding phpfastcache that I hope someone can answer and explain to me. phpfastcache claims that it can serve 10,000+ visitors, but what is the limit it can serve, if there is any? Let me give you an example so you can answer based on it.
Let's say my website or app has 12 million online users (just an example) and I have to send the same query to the database on every page/app load. Do you think it will be able to handle this number of users? We are using a NoSQL database and our website is linked to a CDN. We know that memory is always faster than the file system, but we are using the file system when we use phpfastcache for caching. I hope there is someone who can answer my question and explain things to me for future use.
It doesn't claim that it can serve 10,000 requests! It suggests that it's a great fit if you have 10,000 identical requests to the database. To get real numbers you have to profile your server.
As it's probably an Apache-based one, it will depend on the number of concurrent connections Apache can handle.
What the people behind phpfastcache mean is that if you have a page that's constantly hit and performs the same query over and over again, their software is a great fit for that problem.
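To illustrate why that helps, here is a minimal sketch of the cache-aside pattern that phpfastcache's file driver implements, in plain PHP with no library (the helper name, key, and TTL are illustrative):

```php
<?php
// Serve the cached copy while it is fresh; otherwise run the query once
// and store the result for everyone else.
function cached(string $key, int $ttlSec, callable $compute)
{
    $file = sys_get_temp_dir() . '/cache_' . md5($key);
    if (is_file($file) && (time() - filemtime($file)) < $ttlSec) {
        return unserialize(file_get_contents($file)); // hit: no DB round-trip
    }
    $value = $compute();                              // miss: query once
    file_put_contents($file, serialize($value));
    return $value;
}

// With this, millions of page loads of the same query collapse to one DB
// query per TTL window; everyone else reads the file:
// $users = cached('online_users', 60, fn () => run_the_expensive_query());
```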

What are possible ways to write a 24/7 remote working game engine? [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 7 years ago.
I am a beginner programmer that wants to write simple turn-based 2-player game.
The problem begins when the applications of two paired players want to communicate. I can write a simple server in C++ using sockets, but then people could only play while my server is online.
I can't run my computer 24/7, so:
Is it possible to use free online hosting (empty website + PHP + PHP sockets + MySQL) as a game server? I searched some sites, but they only allow web hosting; they won't let me use it as a game-physics computing centre.
What are other ways to write a server application that would run 24/7 (preferably for free)?
If I switched the whole project to peer-to-peer, how should it work? (The communication during a match is extremely easy, but how would players search for others? How would one player know that another player is online and challenge him to a match?)
Thanks for the current answers; now I am considering a free SQL database + P2P solution, or anything else that would work 24/7.
The best solution I could come up with works as follows:
Have a MySQL database to register users. Stuff like a name and their IP address.
When the client starts, it queries the server for the entire user database and saves/updates it locally.
Next, the client asks the server to update the player's own IP address, so that other clients receive its latest address.
From this point on, the client uses its own ports to ping the IP addresses saved locally, so it knows who's online and who isn't. Or something like that; I'm not too familiar with P2P protocols.
Lastly, have the client query the server for user updates only once per hour or so, so it receives IP-address changes that happened while it was playing.
Doing it like this will save you a LOT of data traffic while still keeping your multiplayer game viable to play.
In order to run a server application, you probably need some computing power with internet connectivity. In the following few lines I try to list a few free options (presuming this is what you are looking for).
As you did not specify details about your situation, here are some cases:
General: you can try AWS Free Tier, where
After creating your AWS account you can use any of the 21 products and services, listed below, for free within certain usage limits.
To make this answer sound less like a cheap ad, you could also try the Microsoft Azure Free Trial, which is much like AWS but lasts only 1 month.
You haven't stated it anywhere, but in case you are an official student, the GitHub Student Developer Pack offers you $100 in platform credit for DigitalOcean.

Synchronising data between MySQL and SQL Server over an unreliable network connection [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Closed 10 years ago.
This question relates to implementing redundancy with two databases synchronising between 2 different servers. First of all, I'll explain the setup so that you can understand the background of the issue.
I have two servers, running different operating systems with different DB types, at 2 different locations.
Server 1 (local server):
Windows 2003 Small Business Server OS
MSSQL DB Server
Server-Side Language - C# ASP.NET
Server 2 (website server):
Linux CentOS 6
MySQL DB
Server-Side Language - PHP
Server 1 runs the internal management side of the software, and server 2 is the external website interface (server 2 has some management aspects too). Unfortunately, we have seen frequent loss of broadband at our office, which means that communication between the two servers may not always be possible (we have 3G broadband redundancy, but it takes time for Dynamic DNS to update the DNS record with our new IP address when we lose the primary broadband). This is what created the need for each server to have its own DB, as we can't allow either of these two sides to go offline entirely.
As such, I've built both servers so that each has its own local database but applies any changes to the other server as well. If the other server cannot be reached, the queries are saved locally so that they can be applied when the other server becomes available again. This leaves two issues:
A general point about syncing between databases like this: how can I best avoid conflicting queries being applied when the servers reconnect? See example 1 below for a description of what I mean. This problem is minimised because the connection should only be lost for a couple of minutes, until the DNS record is updated.
When applying the back-logged queries, I would like the script to apply them automatically (rather than having to manually launch a script to resolve them). However, I'm unsure what effect this would have if the user navigated away from the page while it was still applying the back-logged queries in the background.
Example 1:
If a user on the website side (server 2) changes their email address or phone number etc., and a user on the management side (server 1) changes that same user's details, then these two queries will carry conflicting information. When the servers reconnect, the two queries will each be applied to the other server - and the entries will still be out of sync.
P.S. Sorry the question is so long.
Don't re-invent database replication. Use the one provided by your database software instead.
However, this means you should switch to the same RDBMS on both ends. MySQL replication is fairly reliable on WAN connections and will work reliably between platforms and versions.
I agree with gertvdijk, the best solution would be to use the same RDBMS on both servers (though I'd go with MySQL over MSSQL since it will run just fine on both host OSes) and then use the replication software for that RDBMS.
However, if you're unwilling or unable to do that, the next best alternative I can think of in your case would be to create a "transaction log" on both servers. At its simplest, something that captures a UTC date/time and an SQL command to be executed:
2012-12-14 13:42:13 UPDATE users SET email = 'me2@mydomain.com' WHERE id = 13542
2012-12-14 13:43:55 UPDATE ....
...
Obviously you would want to use parameterized queries instead of what I've shown - it's just easier for me to explain my idea without getting mired down in implementation details.
So each server is keeping a log of its transactions, and suddenly you lose your office connection. A few minutes later, when the connection comes back up, both servers exchange their transaction logs and merge them with their own, sorting all records by the UTC date/time stamp. Now you can execute all the queries that happened on both servers and not worry about them being out of sync, even if changes were made to the exact same column while the connection between the servers was lost. If you're worried about the exact same column being changed at the exact same time, you can include microseconds in the date/time stamps.
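The exchange-and-merge step could be sketched like this in plain PHP (the function name and sample statements are illustrative):

```php
<?php
// Each server keeps [utc_timestamp, sql] pairs; on reconnect, both logs are
// combined and replayed in timestamp order.
function merge_logs(array $logA, array $logB): array
{
    $all = array_merge($logA, $logB);
    // "YYYY-MM-DD HH:MM:SS.uuuuuu" UTC strings sort correctly as plain text.
    usort($all, function ($x, $y) { return strcmp($x[0], $y[0]); });
    return $all;
}

$serverA = [['2012-12-14 13:42:13.120000', 'UPDATE users SET email = ? WHERE id = 13542']];
$serverB = [['2012-12-14 13:41:55.003000', 'UPDATE users SET phone = ? WHERE id = 13542']];
// Replaying merge_logs($serverA, $serverB) in order applies both changes;
// when two writes touch the same column, the later timestamp wins.
```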

PHP server for Android turn based multiplayer game? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 1 year ago.
I developed a turn based game for Android, and now I want to add multiplayer gaming. I don't want to use providers like "skiller", I would like to develop my own server.
I don't have a dedicated server, but I have PHP hosting with "1 and 1". Would it be a good idea to use this hosting as a game server? My idea is that my Android game polls the server every X seconds, waiting for the opponent's move.
What do you think about it?
I have looked into using them for hosting and I have heard great things about their support. BUT unless you're using a dedicated server that they offer (instead of the base-level shared hosting service), you probably won't have the resources available to support a game server.
If you can handle doing the server config yourself or are good with detailed instructions on how to set one up then I highly recommend using the $20 a month linode.com base plan. I am using it for site hosting and it makes a huge difference in terms of performance and flexibility. Also I have seen some performance benchmark comparisons done between it, slicehost, rackspace, and Amazon S3 and it blew all of them (especially Amazon) out of the water. The benchmark is 2 or 3 years old, but it is still rather telling.
Linode will let you do ANYTHING you want within the bounds of the law with the server. So if you want to host an adult site they won't have a problem with it. They will probably have a problem with setting up a spam server or some shady things like that. But they are cool with everything else it seems. Plus they are probably the most affordable option out there.
I would add that you may want to look into the technology behind APE servers (AJAX Push Engine). It's a high-efficiency push/chat system that works with pretty much any server-side language and front end.
http://www.ape-project.org/
Here is how I would do it. Since your game is polling the server, let's skip one step: querying the MySQL server.
From what it sounds like, it's an HTML5 game, meaning you could set cookies on the user's phone. I would have a cookie, or even a JavaScript variable, store a JSON or array string. From this you could push a 1 or 0 to the PHP script, which would then send it to the user or store it in a .txt file. It is known that fetching data from MySQL is slower, and if you have 400 users all polling at 5-second intervals you will bring down most shared servers; they are just not strong enough to do what you want.
If your budget is small, may I suggest you look at this option. I can't really think of a way to do this without storing the data somewhere.
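A minimal sketch of that file-backed approach in plain PHP (the file layout, field names, and helper names are illustrative):

```php
<?php
// The server stores the latest move in a small JSON file instead of hitting
// MySQL on every poll.
function record_move(string $stateFile, int $turn, array $move): void
{
    file_put_contents($stateFile, json_encode(['turn' => $turn, 'move' => $move]));
}

// The polling endpoint: "is there a move newer than the one I already have?"
function latest_move(string $stateFile, int $afterTurn): ?array
{
    if (!is_file($stateFile)) {
        return null;                                       // game not started yet
    }
    $state = json_decode(file_get_contents($stateFile), true);
    return ($state['turn'] > $afterTurn) ? $state : null;  // null = nothing new
}
```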
You should not listen to programming language specific arguments, most of the time they are personal preferences. What you want to do is of course possible with PHP. You can do good and bad programming in every language. For a turn based game PHP is totally sufficient, if you know how to use it. Use the programming language you are most comfortable with, and you will be just fine.
What is more important: you want your game to be successful, and 1und1 is not capable of handling "success" because of its localized, not-scalable-on-demand nature. If you want to earn money or spread, you really should not fear investing a few bucks and going for Amazon's infrastructure. You will have to learn a little, but it is definitely worth it.
Most IDEs (Zend Studio, PHP Storm, ...) even have good integration already. A shared hoster or a localized storage hoster is not what you want, because if your app gets famous they simply will not be able to handle global demand. And you will have security problems. If you really expect your game to be successful, even a VPS will reach its limit almost immediately.
You can try to grow your game on 1und1 first and move to a better solution on demand. But quite a few games have died simply because demand was much higher than expected. The hard part is that you have to think less euphorically and more realistically.
To sum it up: use the language you are most comfortable with; if you believe in your idea, do not fear investing a few bucks to meet the global market; create your own protocol with as little data transfer as possible; and please poll more often than every "x seconds", because a second waiting for the opponent feels like three days. Then you are on your way.
Good progress! And post a link when you're done:)
You can use the Firebase messaging service to implement this. Additionally, you must have a web server (e.g. running PHP).
Use your Android code to send a request to a PHP file on the web server.
From the web server, send a broadcast message via Firebase Cloud Messaging, which will be delivered to all devices subscribed to your topic.
All Android devices listening for this FCM message can respond (depending on the parameters set by your code).
