To explain what actually happens:
An external service sends an HTTP POST request with XML data to a PHP script. The script checks whether the data already exists in the MySQL database; if not, it inserts a new record.
A second service was later added for failover. It sends the exact same XML data to the PHP script.
The Problem:
The script already checks whether the record exists, but the two requests arrive at nearly the same time, so the script runs in parallel and both requests end up inserting the same data.
I thought about using a queue, but I can't see a simple way to do that; the whole process is actually very simple.
What's the easiest way to make sure the data isn't inserted twice?
Generate a hash for your transaction and make it unique at the DB level. MySQL will throw an error if you try to add the same data twice.
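For illustration, a minimal sketch of that idea, assuming a records table with a UNIQUE index on a hash of the payload (table, column, and connection details are made up):

```php
<?php
// Minimal sketch: deduplicate via a UNIQUE index on a hash of the payload.
// Table, columns, and credentials are assumptions for illustration:
//   CREATE TABLE records (
//       id INT AUTO_INCREMENT PRIMARY KEY,
//       payload_hash CHAR(64) NOT NULL,
//       payload TEXT NOT NULL,
//       UNIQUE KEY uniq_payload_hash (payload_hash)
//   );
$pdo = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

$xml  = file_get_contents('php://input'); // raw XML body of the POST
$hash = hash('sha256', $xml);             // identical data => identical hash

// INSERT IGNORE silently skips the row if the hash already exists,
// so two parallel requests cannot both insert it.
$stmt = $pdo->prepare('INSERT IGNORE INTO records (payload_hash, payload) VALUES (?, ?)');
$stmt->execute([$hash, $xml]);

if ($stmt->rowCount() === 0) {
    // Duplicate: the other request won the race, nothing left to do.
}
```

INSERT IGNORE is one way to swallow the duplicate-key error; catching the exception from a plain INSERT works just as well.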
I'm trying to import Excel data into a MySQL database with PHP and Ajax. When the user uploads an Excel file, jQuery fetches each row in a loop and sends it to PHP via Ajax, but there are about 500+ rows. Because of that, PHP runs the queries simultaneously, causing the database error "already has more than 'max_user_connections' active connections". Some of the queries work but some don't.
jQuery fetches each row in a loop and sends it to PHP via Ajax
...this is a design flaw. If you generate 500 AJAX requests in a short space of time it's inevitable, due to their asynchronous nature, that a lot of them will overlap and overload the server and database...but I think you've realised that already, from your description.
So you didn't really ask a question, but are you just looking for alternative implementation options?
It would make more sense to either
just upload the whole file as-is and let the server-side code process it.
Or
If you must read it on the client side, you should at least send all the rows in one AJAX request (e.g. as a JSON array), as sketched below.
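For that second option, a rough sketch of the server side, assuming the client posts all rows as one JSON array to a hypothetical import.php (table and column names are made up):

```php
<?php
// import.php -- sketch: receive all rows in one request and insert them
// over a single DB connection instead of one connection per row.
$rows = json_decode(file_get_contents('php://input'), true);
if (!is_array($rows)) {
    http_response_code(400);
    exit('Expected a JSON array of rows');
}

$pdo = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

$pdo->beginTransaction();
$stmt = $pdo->prepare('INSERT INTO imported_rows (col_a, col_b) VALUES (?, ?)');
foreach ($rows as $row) {
    // Column keys are placeholders; map them to your real spreadsheet columns.
    $stmt->execute([$row['col_a'], $row['col_b']]);
}
$pdo->commit();

echo json_encode(['inserted' => count($rows)]);
```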
I have a contact form, which sends non-sensitive data (name, message, a few checkboxes and address fields) to an external CRM system through a CURL request.
The problem is, sometimes the receiving system is down for maintenance, and I have to store the incoming requests somewhere temporarily. Right now I'm doing this manually whenever I'm notified of upcoming maintenance, but that's not an option in the long run.
My question is: what's the best way to automate the storing and re-sending depending on the server status? I know it should depend on the cURL response code: if it returns 200, the script should check for any temporarily stored requests and send them along with the current one. I'm just not sure of the best way to implement this; for example, I wonder whether serialized requests in a database table are better than building a JSON array and storing it in a file that gets deleted later.
How would you solve this? Any advice or tips you can give me are welcome.
Thanks.
I would use the following procedure to make sure the HTTP request is successfully sent to the target server:
Make sure that the server is up and running; you may want to use get_headers() to perform this check.
Depending on the server's response from the previous step, perform one of two actions:
1) If the server responded OK, go ahead and fire your request.
2) If the server response was NOT OK, store the HTTP request in some serialized form in the database.
Run a cron job that reads all pending requests from the DB and fires them; if a request goes through successfully, delete its record from the DB table.
When the cron job runs, I would use HttpRequestPool to fire all the requests in parallel.
I would suggest using DB storage instead of a JSON file because it will be easier to maintain, especially when some of the HTTP requests are processed successfully by the server and some are not; in that case you want to remove the successful ones and keep the others, which is much easier with a database.
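A rough sketch of that check-then-queue flow, assuming a pending_requests table that the cron job later replays (the endpoint URL, fields, and table layout are placeholders):

```php
<?php
// Sketch: check whether the CRM endpoint is reachable, then either send
// the request or queue it for the cron job. Names/URLs are assumptions.
$endpoint = 'https://crm.example.com/api/contact';
$payload  = ['name' => $_POST['name'] ?? '', 'message' => $_POST['message'] ?? ''];

$headers = @get_headers($endpoint);
$isUp    = $headers && strpos($headers[0], '200') !== false;

$ok = false;
if ($isUp) {
    $ch = curl_init($endpoint);
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => http_build_query($payload),
        CURLOPT_RETURNTRANSFER => true,
    ]);
    curl_exec($ch);
    $ok = curl_getinfo($ch, CURLINFO_HTTP_CODE) === 200;
    curl_close($ch);
}

if (!$ok) {
    // Queue the request; the cron job replays rows from this table later.
    $pdo  = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', 'user', 'pass');
    $stmt = $pdo->prepare('INSERT INTO pending_requests (payload) VALUES (?)');
    $stmt->execute([serialize($payload)]);
}
```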
I have a complex PHP cron job that retrieves data from an external web page and joins all the information into one variable that is encoded as JSON. The whole process is very slow and takes a lot of time.
The point is that I need to retrieve the JSON object from my index page, but I don't want to run the whole script on page load because it would take too long to execute. What I've been doing is having the cron job write the JSON object to a new file, and then reading the information from that file.
I would like to know if there is a more efficient/simple way to transfer this information without having to create a new file or run the script 'manually'. I've heard that you can send information using cURL, but I've never used this technique before, so I don't know whether it would be useful in this situation.
This is a pretty common issue. Long-running tasks shouldn't be executed on page load because it hurts the user experience. Having your time-intensive PHP script run as a cron job is a great solution.
Perhaps using a DB would be easier still. You can use SQLite or a "full-fledged" RDBMS (like MySQL or Postgres) to store your data. It could work something like this:
The time-intensive PHP script runs as a cron job every x minutes and saves its data to your DB instead of a file.
When the user requests the index page, it sends an Ajax request to another PHP script. That script looks for the data in your DB and returns it to the user if it exists.
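A minimal sketch of that second script, assuming the cron job stores its JSON in a hypothetical cached_results table (table and column names are made up):

```php
<?php
// data.php -- sketch of the endpoint the index page calls via Ajax.
// It only reads the latest row the cron job wrote; it never does the
// slow scraping work itself. Table and column names are assumptions.
$pdo = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', 'user', 'pass');

$row = $pdo->query(
    'SELECT json_data FROM cached_results ORDER BY id DESC LIMIT 1'
)->fetch(PDO::FETCH_ASSOC);

header('Content-Type: application/json');
echo $row ? $row['json_data'] : json_encode(['error' => 'no data yet']);
```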
I'm using PHP/MySQL, although I think this question is essentially language/DB agnostic. I have a PHP script that connects to one API, gets the response data, parses it, and then sends it to a different API for storage in its database. Sometimes this process fails because of an error with one of the APIs. I would therefore like an easy way to track its success/failure.
I should clarify that "success" in this case is defined as the script getting the data it needs from the first API and successfully having it processed by the second API. Therefore, "failure" could result from 3 possible things:
First API throws an error
Second API throws an error
My script times out.
This script will run once a day. I'd like to store the success or failure result in a database so that I can easily visit a webpage and see the result. I'm currently thinking of doing the following:
Store the current time in a variable at the start of the script.
Insert that timestamp into the database right away.
Once the script has finished, insert that same timestamp into the database again.
If the script fails, log the reason for failure in the DB.
I'd then gauge success or failure based on whether a single timestamp has two entries in the database, as opposed to just one.
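For illustration, a rough sketch of that plan, assuming a hypothetical script_runs table (names and columns are made up):

```php
<?php
// Sketch of the plan above. Table/column names are assumptions:
//   CREATE TABLE script_runs (run_ts DATETIME, stage VARCHAR(20), detail TEXT);
$pdo   = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', 'user', 'pass');
$runTs = date('Y-m-d H:i:s');

$log = $pdo->prepare('INSERT INTO script_runs (run_ts, stage, detail) VALUES (?, ?, ?)');
$log->execute([$runTs, 'start', null]);

try {
    // ... fetch from the first API, process, send to the second API ...
    $log->execute([$runTs, 'finish', null]);   // two rows for this run_ts = success
} catch (Exception $e) {
    $log->execute([$runTs, 'error', $e->getMessage()]);
}
```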
Is this the best way to do it, or would something else work better? I don't see any reason why this wouldn't work, but I feel like some recognized standard way of accomplishing this must exist.
A user-declared shutdown function might be an alternative: using register_shutdown_function() you can declare a callback to be executed when the script terminates, whether it finished successfully, was aborted by the user, or timed out.
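A minimal sketch of that, again assuming a hypothetical script_runs table; unlike a plain try/catch, the callback also runs after a fatal error or timeout:

```php
<?php
// Sketch: log the outcome even if the script times out or is aborted.
// The table and column names are placeholders.
$runTs    = date('Y-m-d H:i:s');
$finished = false;

register_shutdown_function(function () use (&$finished, $runTs) {
    // Runs on normal exit, fatal error, user abort, or timeout.
    $pdo  = new PDO('mysql:host=localhost;dbname=app;charset=utf8mb4', 'user', 'pass');
    $stmt = $pdo->prepare('INSERT INTO script_runs (run_ts, stage, detail) VALUES (?, ?, ?)');
    $stmt->execute([$runTs, $finished ? 'finish' : 'aborted', null]);
});

// ... the actual API-to-API work goes here ...

$finished = true;
```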
You could use a lock file:
at the very beginning of your script, you create a lock file somewhere on the filesystem
at the very end of your script, if everything worked fine, you delete it from the filesystem
Then you just have to monitor the directory where you place these files. With a lock file's creation date you can find out which day didn't work.
You can combine this with a monitoring script that sends alerts if lock files are present and have a creation date older than a given interval (say, 1 hour, if your script usually runs in a few minutes).
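A minimal sketch of the lock-file idea; the directory path is a placeholder:

```php
<?php
// Sketch of the lock-file approach; /var/run/myscript is a placeholder path.
$lockFile = '/var/run/myscript/' . date('Y-m-d') . '.lock';

file_put_contents($lockFile, (string) getmypid()); // very beginning of the script

try {
    // ... do the actual work ...
    unlink($lockFile);                             // everything worked, remove the lock
} catch (Exception $e) {
    // Leave the lock file in place so the monitoring script can alert on it.
}
```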
I have a script that takes a while to run; it has to take data from the DB and transfer it to other servers.
At the moment it runs immediately after the form is submitted, so the page takes as long as the data transfer takes before it can say the data has been sent.
Is there any way to make it not do this processing in front of the client?
I don't want a cron job, as the data needs to be sent at the same time, just without the client's page waiting on it.
A couple of options:
Exec the PHP script that does the DB work from your webpage but do not wait for the output of the exec. Be VERY careful with this, don't blindly accept any input parameters from the user without sanitising them. I only mention this as an option, I would never do it myself.
Have your DB-updating script running all the time in the background, polling for something that triggers its update. For example, it could check whether /tmp/run.txt exists and start the DB update if it does. You can then create run.txt from your webpage and return without waiting for a response.
Create your DB update script as a daemon.
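Rough sketches of the first two options (the script path, trigger file, and worker are placeholders; see the caveat above about sanitising anything that reaches exec):

```php
<?php
// Rough sketches only; paths and script names are placeholders.

// Option 1: start the worker in the background and return immediately.
// Pass only fixed, server-side values here -- never raw user input.
exec('php /var/www/jobs/transfer.php > /dev/null 2>&1 &');

// Option 2: just drop a trigger file; a long-running worker polls for
// /tmp/run.txt and starts the DB transfer when it appears.
file_put_contents('/tmp/run.txt', date('c'));
```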
Here are some things you can take a look at:
How much data are you transferring, and by "transfer" do you mean just copying the data, or inserting it into the destination server and then deleting it from your source?
You can try analyzing your SQL to see if there's any room for optimization.
Then you can check your PHP code as well to see if there's anything, even something small, that might help perform the necessary tasks faster.
Where are the source and destination database servers located (in terms of network, and geographically if you happen to know), and how fast can they communicate over the network?