I have a contact form that sends non-sensitive data (name, message, a few checkboxes and address fields) to an external CRM system through a cURL request.
The problem is that the receiving system is sometimes down for maintenance, and I have to store the incoming requests somewhere temporarily. Right now I do this manually whenever I'm notified of upcoming maintenance, but that is not an option in the long run.
My question is: what's the best way to automate the storing and sending depending on the server status? I know it should depend on the cURL response code, and if it returns 200 the script should check for any temporarily stored requests to send alongside the current one, but I'm not sure what the best way to implement this is. For example, I wonder whether serialized requests in a database table are better than building a JSON array and storing it in a file that is later deleted.
How would you solve this? Any advice and tips you can give me are welcome.
Thanks.
I would use the following procedure to make sure the HTTP request is successfully sent to the target server:
Make sure the server is up and running; you may want to use get_headers() to perform this check.
Depending on the server's response from the previous step, you will perform one of two actions:
1) If the server's response was OK, go ahead and fire your request.
2) If the server's response was NOT OK, store the HTTP request in some serialized form in a database.
Run a cron job that reads all stored requests from the DB and fires them; for each request that goes through successfully, delete its record from the DB table.
When the cron job runs, I would use HttpRequestPool to fire all requests in parallel.
I would suggest using DB storage instead of a JSON file because it is easier to maintain, especially when some of the HTTP requests are processed successfully by the server and some are not; in that case you want to remove the successful ones and keep the rest, which is much easier with a database.
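Putting that together, here is a rough sketch of the send-or-store logic and the cron replay, assuming a PDO connection in $pdo, a hypothetical pending_requests table (id, payload) and the CRM endpoint in $crmUrl (all of these names are placeholders, not from your setup):

function crmIsUp($crmUrl) {
    // availability check; get_headers() returns false when the host is unreachable
    $headers = @get_headers($crmUrl);
    return $headers && strpos($headers[0], '200') !== false;
}

function sendToCrm($crmUrl, array $data) {
    $ch = curl_init($crmUrl);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($data));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return $code === 200;
}

// form handler: send now if possible, otherwise queue the request
if (crmIsUp($crmUrl) && sendToCrm($crmUrl, $_POST)) {
    // sent straight away; the cron job below takes care of anything queued earlier
} else {
    $stmt = $pdo->prepare('INSERT INTO pending_requests (payload) VALUES (?)');
    $stmt->execute([serialize($_POST)]);
}

// cron.php: replay stored requests and delete only the ones that succeeded
foreach ($pdo->query('SELECT id, payload FROM pending_requests') as $row) {
    if (sendToCrm($crmUrl, unserialize($row['payload']))) {
        $pdo->prepare('DELETE FROM pending_requests WHERE id = ?')->execute([$row['id']]);
    }
}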
To explain what actually happens:
An external service sends an HTTP POST request with XML data to a PHP script. This script checks whether the data already exists in the MySQL DB. If not, it inserts a new record.
Now a second service has been added for failure safety. It sends the exact same XML data to the PHP script.
The Problem:
The script already checks whether the record exists, but the requests arrive at nearly the same time and the script gets called in parallel, so the identical data from both requests ends up being inserted twice.
I thought about using a queue but I can't imagine a simple way to do this. This whole process is actually very simple.
What's the easiest way to ensure to not insert data twice?
Generate a hash of your transaction data and make it unique at the DB level. MySQL will then throw a duplicate-key error if you try to add the same data twice.
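For example, a minimal sketch of that idea, assuming a PDO connection in $pdo and a hypothetical records table with a UNIQUE index on a data_hash column:

// same XML => same hash, and the UNIQUE index makes the second insert a no-op
$hash = md5($xmlData);

// INSERT IGNORE silently skips the row when the unique key already exists,
// so the near-simultaneous duplicate request simply does nothing
$stmt = $pdo->prepare('INSERT IGNORE INTO records (data_hash, payload) VALUES (?, ?)');
$stmt->execute([$hash, $xmlData]);

// rowCount() is 1 when the row was inserted and 0 when it was a duplicate
$wasInserted = $stmt->rowCount() === 1;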
I'm trying to send certain data from iOS to an online MySQL database. PHP is used on the server to handle receiving and inserting the data.
The thing is that I have several data packages, and the key is to send them one by one, which means I need a mechanism to make the second data package in the queue wait until the iOS app has received feedback from the server confirming that the first set of data has been stored in the database.
I initially tried creating a serial dispatch queue, aiming to have the iOS app execute the upload work in sequence. The iOS side did carry out the work in order, but each task simply "finished" as soon as it had sent out its data package, without waiting to see whether the data had actually been inserted into the database. There will always be some time lapse between sending out the data and the data being fully saved to MySQL on the server, due to things like the network connection.
So the result is that the data may not be saved in the desired sequence, with some later data being saved earlier than previous data.
I guess what's missing is a "feedback" mechanism from the server side to the iOS side.
Can anybody suggest a way to realize this feedback mechanism, so I can control the serial sequence of the upload tasks?
Thank you very much!
Regards,
Paul
If you are sending data to a server, most of the available frameworks offer callbacks. With AFNetworking (or its Swift successor, Alamofire) it would look like this:
[[ConnectionManager instance] GET:@"link" parameters:nil
    success:^(AFHTTPRequestOperation *operation, id responseObject)
    {
        // the server confirmed this package was stored; kick off the next upload here
    }
    failure:^(AFHTTPRequestOperation *operation, NSError *error)
    {
        // the package was not stored; retry or report the error before moving on
    }];
So you can put your code in the given handlers and chain your requests one after another.
You may also want to create NSOperations, set proper dependencies on them, and put them on an NSOperationQueue, but that is surely more time consuming.
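On the server side, the feedback mechanism is simply that the PHP script does not respond until the INSERT has actually finished, so the success/failure blocks above fire only once the data is really in MySQL. A rough sketch (the table, columns and credentials are placeholders):

// receive.php: respond only after the row is really in the database
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

header('Content-Type: application/json');
try {
    $stmt = $pdo->prepare('INSERT INTO packages (payload) VALUES (?)');
    $stmt->execute([file_get_contents('php://input')]);
    // this response is what triggers the success block on the iOS side
    echo json_encode(['status' => 'ok', 'id' => $pdo->lastInsertId()]);
} catch (Exception $e) {
    http_response_code(500);   // lands in the failure block, so the next package is not sent
    echo json_encode(['status' => 'error']);
}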
I'm developing an application that will use a server to sync data; the server will use PHP and a MySQL database.
For the sync process I'm thinking of a three-way communication:
1 - The client sends the data to the server; the server handles the data and replies to the client with OK or ERROR. At this point the server should begin the transaction.
2 - If the client receives OK, it just updates its internal info (updates the date and deletes some rows from its database).
3 - The client sends another request to the server (OK or CANCEL); when the server receives this new request it commits or rolls back the transaction.
Is this possible? Start the transaction in one request and commit the transaction in another?
In case of YES, how? Sessions?
Or should I do this in another way?
In PHP you shouldn't store objects in the session (citation needed). So what I would do is store the data in the session, and when you receive the confirmation from the client, retrieve the data and assemble the query (mysqli or PDO, whichever you like).
This will work unless you need to use some data from the database (e.g. last_insert_id). If that's the case I don't know how to do it, and I'm tempted to say it's impossible (I don't remember exactly, but I think PHP closes the DB connection when the script ends, so an open transaction can't survive between requests).
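A minimal sketch of that idea, assuming a PDO connection in $pdo (the endpoint and field names are made up):

// request 1: receive the data, don't touch the database yet
session_start();
$_SESSION['pending_sync'] = $_POST;   // plain arrays and strings serialize fine in the session
echo 'OK';

// request 2 (a separate call): the client confirmed, so run everything in one transaction now
session_start();
if (isset($_SESSION['pending_sync']) && $_POST['action'] === 'OK') {
    $pdo->beginTransaction();
    try {
        $stmt = $pdo->prepare('INSERT INTO sync_data (payload) VALUES (?)');
        $stmt->execute([json_encode($_SESSION['pending_sync'])]);
        $pdo->commit();
    } catch (Exception $e) {
        $pdo->rollBack();
    }
    unset($_SESSION['pending_sync']);
}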
I have a script that takes a while to process; it has to take data from the DB and transfer it to other servers.
At the moment it runs immediately after the form is submitted, so the page takes as long as the transfer does before it says the data has been sent.
I was wondering, is there any way to make it so the process doesn't run in front of the client?
I don't want a cron job, as the data needs to be sent at the same time, just without the client waiting on the page load.
A couple of options:
Exec the PHP script that does the DB work from your webpage but do not wait for the output of the exec. Be VERY careful with this, don't blindly accept any input parameters from the user without sanitising them. I only mention this as an option, I would never do it myself.
Have your DB-updating script running all the time in the background, polling for something that triggers its update. For example, it could check whether /tmp/run.txt exists and start the DB update if it does. You can then create run.txt from your webpage and return without waiting for a response (see the sketch after this list).
Create your DB update script as a daemon.
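A rough sketch of the trigger-file variant from the second option (the paths and the transfer function are just placeholders):

// form handler: flag the work and return to the client immediately
file_put_contents('/tmp/run.txt', '1');
echo 'Your data is being sent.';

// worker.php: started once from the command line, keeps polling for the flag
while (true) {
    if (file_exists('/tmp/run.txt')) {
        unlink('/tmp/run.txt');            // remove the flag before starting the work
        transfer_data_to_other_servers();  // hypothetical function doing the slow DB transfer
    }
    sleep(1);
}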
Here are some things you can take a look at:
How much data are you transferring, and is the transfer just a copy of the data, or are you inserting the data from your DB into the destination server and then deleting it from your source?
You can try analyzing your SQL to see if there's any room for optimization.
Then you can check your PHP code as well to see if there's anything, even the slightest thing, that might help perform the necessary tasks faster.
Where are the source and destination database servers located (in terms of network, and geographically if you happen to know), and how fast can they communicate over the network?
I have a page that I am performing an AJAX request on. The purpose of the page is to return the headers of an e-mail, which I have working fine. The problem is that this is called for each e-mail in a mailbox, so it runs once per message in the box. The reason this is a problem is that imap_open takes about a second to execute, and it is called every time. Is there a way to make an AJAX call that returns the information as it becomes available and keeps executing, to avoid multiple calls to a function with a slow execution time?
Cheers,
Gazler.
There are technologies out there that allow you to configure your server and Javascript to allow for essentially "reverse AJAX" (look on Google/Wikipedia for "comet" or "reverse AJAX"). However, it's not incredibly simple and for what you're doing, it's probably not worth all of the work that goes into setting that up.
It sounds like you have a very common problem: you're firing off a number of AJAX requests, and each of them repeats a bit of work that realistically only needs to be done once.
I don't work in PHP, but if it's possible to persist the return value of imap_open, or whatever its side effects are, across requests, then you should try to do that and just reuse that saved resource.
Some pseudocode:
if (!persisted_resource) {
persisted_resource = imap_open()
}
persisted_resource.use()....
where persisted_resource should be some variable stored in session scope, application scope or whatever PHP has available that's longer lived than a request.
Then you can either have each request check this variable so only one request will have to call imap_open or you could initialize it while you're loading the page. Hopefully that's helpful.
Batch your results. Instead of loading all emails at once or a single email at a time, you could batch the email headers and send them back in chunks. Tweak the batch size until you find a good fit between responsiveness and content.
The PHP script would receive a range request in this case such as
emailHeaders.php?start=25&end=50
JavaScript will maintain state and request data in chunks until all data is loaded. Or you could do some fancy stuff, such as creating client-side policies on when to request data and what data to request.
The browser is another bottleneck, as most browsers only allow 2 outgoing connections to the same host at any given time.
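For example, a minimal sketch of such a batch endpoint, where the mailbox credentials and the returned fields are only placeholders:

// emailHeaders.php?start=25&end=50: one imap_open for the whole batch
$start = max(1, (int) $_GET['start']);
$end   = (int) $_GET['end'];

$inbox = imap_open('{imap.example.com:993/imap/ssl}INBOX', 'user', 'password');

$headers = [];
for ($i = $start; $i <= $end; $i++) {
    $h = imap_headerinfo($inbox, $i);   // header only, the body is not fetched
    if ($h) {
        $headers[] = ['subject' => $h->subject, 'from' => $h->fromaddress, 'date' => $h->date];
    }
}
imap_close($inbox);

header('Content-Type: application/json');
echo json_encode($headers);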
It sounds as though you need to process as many e-mails as have been received with each call. At that point, you can return data for all of them together and parse it out on the client side. However, that process cannot go on forever, and the server cannot initiate the return of additional data after the http request has been responded to, so you will have to make subsequent calls to process more e-mails later.
The server-side PHP script can be configured to send the output as soon as it's generated. You basically need to disable all functionality that can cause buffering, such as output_buffering, output_handler, HTTP compression, and intermediate proxies.
The difficult part is that your JavaScript library needs to be able to handle partial input. That is to say, you need access to the downloaded data as soon as it's received. I believe it's technically possible, but some popular libraries like jQuery only let you read the data once the transfer is complete.
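On the PHP side, the unbuffered part might look roughly like this; $messageNumbers and fetch_header() are stand-ins for however you enumerate and read the mailbox:

// stream results as they are produced instead of buffering the whole response
header('Content-Type: text/plain');
while (ob_get_level() > 0) {
    ob_end_flush();              // drop any output buffering layers that are active
}
ob_implicit_flush(true);         // flush automatically after every piece of output

foreach ($messageNumbers as $num) {
    echo json_encode(fetch_header($num)) . "\n";   // fetch_header() is hypothetical
    flush();                     // push this chunk to the client right away
}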