I'm trying to send data from an iOS app to an online MySQL database. PHP is used on the server to receive the data and insert it.
The thing is that I have several data packages, and they must be sent one by one. That means I need a mechanism that makes the second package in the queue wait until the iOS app receives feedback from the server confirming that the first package has actually been stored in the database.
I initially tried creating a serial dispatch queue so the iOS app would execute the upload tasks in sequence. The tasks did run in order, but each one "finished" as soon as it had sent out its data package, without waiting to see whether the data had been inserted into the database. And there will always be some lag between sending out the data and the data being fully saved to MySQL on the server, due to issues like the network connection.
So the result is that the data may not be saved in the desired sequence; later packages can end up saved before earlier ones.
I guess what's missing is a "feedback" mechanism from the server side to the iOS side.
Can anybody suggest a way to implement this feedback mechanism, so that I can control the order of the upload tasks?
Thank you very much!
Regards,
Paul
If you are sending data to a server, most of the available frameworks offer callbacks. With AFNetworking (whose Swift successor is Alamofire) it would look like this:
[[ConnectionManager instance] GET:@"link" parameters:nil
    success:^(AFHTTPRequestOperation *operation, id responseObject) {
        // The server confirmed this package was saved;
        // start the next upload from here.
    }
    failure:^(AFHTTPRequestOperation *operation, NSError *error) {
        // Handle the failure (retry this package, stop the queue, etc.).
    }];
So you can put your code in those handlers and chain the requests one after another.
You may also want to create NSOperation subclasses, set proper dependencies between them, and put them on an NSOperationQueue, but that is certainly more time-consuming.
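On the PHP side, the "feedback" is simply the HTTP response: don't send it until the INSERT has actually committed, so the client's success callback really means "saved". Here is a minimal sketch of such an endpoint; the table and column names are invented for illustration:

<?php
// receive.php: insert one package, reply only after the commit.
$package = json_decode(file_get_contents('php://input'), true);

$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

try {
    $pdo->beginTransaction();
    $stmt = $pdo->prepare('INSERT INTO packages (seq, payload) VALUES (?, ?)');
    $stmt->execute([$package['seq'], json_encode($package['data'])]);
    $pdo->commit();
    echo json_encode(['status' => 'ok']);    // fires the success block
} catch (Exception $e) {
    $pdo->rollBack();
    http_response_code(500);                 // fires the failure block
    echo json_encode(['status' => 'error']);
}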
I'm trying to import Excel data into a MySQL database with PHP and AJAX. When the user uploads an Excel file, jQuery fetches each row in a loop and sends it to PHP via AJAX, but there are about 500+ rows. Because of that, PHP runs the queries simultaneously, causing the database error "already has more than 'max_user_connections' active connections". Some of the queries work, but some do not.
jQuery fetches each row in a loop and sends it to PHP via AJAX
...this is a design flaw. If you try to generate 500 AJAX requests in a short space of time, it is inevitable, given their asynchronous nature, that many of them will overlap and overload the server and database... but I think you've already realised that, from your description.
So you didn't really ask a question, but are you just looking for alternative implementation options?
It would make more sense to either
just upload the whole file as-is and let the server-side code process it.
Or
If you must read it on the client side, you should at least send all the rows in one AJAX request (e.g. as a JSON array); a sketch of the receiving side of that approach follows.
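For that second option, a minimal sketch of the receiving PHP script could look like this (the DSN, table, and column names are placeholders for illustration):

<?php
// import.php: receive all rows in a single request body (a JSON array of arrays).
$rows = json_decode(file_get_contents('php://input'), true);

// One connection and one prepared statement for all 500+ rows,
// instead of one connection per row.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$stmt = $pdo->prepare('INSERT INTO imported_rows (col_a, col_b) VALUES (?, ?)');

$pdo->beginTransaction();
foreach ($rows as $row) {
    $stmt->execute([$row[0], $row[1]]);
}
$pdo->commit(); // all rows saved atomically, over a single connection

echo json_encode(['inserted' => count($rows)]);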
I have a contact form which sends non-sensitive data (name, message, a few checkboxes and address fields) to an external CRM system via a cURL request.
The problem is that the receiving system is sometimes down for maintenance, and I have to store the incoming requests somewhere temporarily. Right now I do this manually whenever I receive notice of upcoming maintenance, but that is not an option in the long run.
My question is: what is the best way to automate the storing and re-sending depending on the server status? I know it should depend on the cURL response code: if it returns 200, the script should check for any temporarily stored requests and send them along with the current one. But I'm not sure what the best way to implement this is. For example, is a serialized request in a database table better than building a JSON array and storing it in a file that is deleted later?
How would you solve this? Any advice and tips you can give me are welcome.
Thanks.
I would use the following procedure to make sure the HTTP request is successfully delivered to the target server:
Make sure that the server is up and running; you can use get_headers() for this check.
Depending on the server's response from the previous step, perform one of two actions:
1) If the server responded OK, go ahead and fire your request.
2) If the server did NOT respond OK, store the HTTP request in some serialized form in the database.
Run a cron job that reads all stored requests from the database and fires them; whenever the server's response indicates a request went through successfully, delete its record from the table.
When the cron job runs, I would use HttpRequestPool to fire all the requests in parallel.
I would suggest DB storage rather than a JSON file because it is easier to maintain, especially when some of the stored requests are processed successfully by the server and some are not: in that case you want to remove the successful ones and keep the others, which is much easier with a database. A sketch of the check-and-store part follows.
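Here is a minimal sketch of the check-and-store logic, assuming an invented pending_requests table and a placeholder endpoint URL:

<?php
// Returns true if the CRM endpoint currently answers with HTTP 200.
function crm_is_up($url) {
    $headers = @get_headers($url);
    return $headers !== false && strpos($headers[0], '200') !== false;
}

$url  = 'https://crm.example.com/endpoint'; // placeholder
$data = ['name' => $_POST['name'], 'message' => $_POST['message']];

if (crm_is_up($url)) {
    // Server is up: send the request right away.
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($data));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch);
    curl_close($ch);
} else {
    // Server is down: queue the request for the cron job to retry.
    $pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
    $stmt = $pdo->prepare('INSERT INTO pending_requests (payload, created_at) VALUES (?, NOW())');
    $stmt->execute([serialize($data)]);
}

The cron job would then SELECT the stored payloads, unserialize() them, re-send them (in parallel via HttpRequestPool, if available), and DELETE the rows that succeed.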
I need to send a small amount of data (no more than 5000 characters) from one website to another at the browser level. How can I do it?
I can put any PHP code on the first server, but the data processing should not happen at the server level (no SOAP, cURL, etc.) for performance and stability reasons (pages need to load fast, and the transfer should happen only after the page has loaded).
So, on the first server I have data that I need to send "on the fly"; on the second I have a PHP script that catches it. Catching the server's response is not necessary.
As far as I know, AJAX only works within the same domain.
One method I know of is to create a tag whose src points to a file on the second server, like www.test.com/myimage.png?param="testtsttest". But GET has length limitations.
You can use JSONP, which allows you to transfer JSON data between two domains; a sketch follows.
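A minimal sketch of the receiving side, assuming an invented catch.php endpoint and parameter names. Because JSONP piggybacks on script tags, it is not blocked by the same-origin policy, but the payload still travels in the URL, so GET length limits apply:

<?php
// catch.php on the second server: store the incoming data,
// then answer as JSONP so the first site could confirm receipt if it wanted to.
$data = isset($_GET['data']) ? $_GET['data'] : '';
// ... save $data to the database here ...

$callback = isset($_GET['callback'])
    ? preg_replace('/[^a-zA-Z0-9_]/', '', $_GET['callback']) // sanitize
    : 'callback';
header('Content-Type: application/javascript');
echo $callback . '(' . json_encode(['status' => 'ok']) . ');';

On the first site, after the page has loaded, you would append a script tag whose src is something like catch.php?callback=done&data=<URL-encoded payload>.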
I'm developing an application that will use a server to sync data; the server will use PHP and a MySQL database.
For the sync process I'm thinking of a 3-way communication:
1 - The client sends the data to the server; the server handles the data and replies to the client with OK or ERROR. At this point the server should begin the transaction.
2 - If the client receives OK, it just updates its internal info (updates a date and deletes some rows from its database).
3 - The client sends another request to the server (OK or CANCEL); when the server receives this new request, it commits or rolls back the transaction.
Is this possible? Can I start the transaction in one request and commit it in another?
If YES, how? Sessions?
Or should I do this another way?
In PHP you can't keep a database connection, and therefore an open transaction, alive between requests (the connection is closed when the script ends). So what I would do is store the data itself in the session, and when you receive the confirmation from the client, retrieve the data and assemble and run the query then (mysqli or PDO, whichever you like).
This will work unless you need some data back from the database in the first step (e.g. last_insert_id); in that case I don't know how to do it, and I'm tempted to say it's impossible, for the same reason: PHP closes the DB connection when the script ends.
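A minimal sketch of the session-based approach, with invented endpoint and table names:

<?php
// sync_receive.php: first request - stash the data, don't touch the DB yet.
session_start();
$_SESSION['pending_sync'] = json_decode(file_get_contents('php://input'), true);
echo 'OK';

<?php
// sync_confirm.php: second request - run the whole transaction now, or discard.
session_start();

if (isset($_POST['action'], $_SESSION['pending_sync']) && $_POST['action'] === 'OK') {
    $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $pdo->beginTransaction();
    $stmt = $pdo->prepare('INSERT INTO synced (payload) VALUES (?)');
    foreach ($_SESSION['pending_sync'] as $row) {
        $stmt->execute([json_encode($row)]);
    }
    $pdo->commit(); // the entire transaction lives inside this one request
}
unset($_SESSION['pending_sync']);
echo 'DONE';

This sidesteps the cross-request transaction entirely: nothing is written until the client confirms.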
My currently in-development website is written in PHP. As users are using the site, they'll be performing actions and I'd like to be able to push notifications of these actions to other users that they're connected to.
Now while I'm sure that using EventSource and a PHP document to serve up the appropriate data: lines would work, I've got absolutely no idea how I should notify that PHP document when a new message actually needs to be sent.
What I essentially mean is that when an action takes place, an entry with the message information (such as the action that was taken) will be inserted into the PostgreSQL database. However, it's not efficient to have each instance of the "messaging" PHP document (the one that EventSource is connected to) continuously poll PostgreSQL for new messages. With 50 users active at once, that would be 50 instances polling PostgreSQL, which, as you can probably see, is not a very efficient use of resources.
So I'm wondering whether anyone has any suggestions as to software that might assist with this problem. Ideally I'd like to be able to call a function that indicates an action has been taken, which is then sent to all the other instances of the "messaging" PHP document so that they can interpret the message, see whether it's relevant, and push it back to the client.
Essentially I need a way to notify running PHP instances (that were started via Apache) of a new message being created, by calling a function in another PHP instance with the message information. I don't need assistance with getting the messages to the client; I can do that with EventSource.
Does anyone have any suggestions as to how this task could be undertaken?
Conventional ways of solving this problem are using a Java applet (which can open a socket back to the originating server) or using long polling (e.g. Comet).
I've succeeded in doing this by using memcache with a messages-count key and message-$i keys, where $i is an incrementing number. A PHP document is connected to via long polling, and it continuously checks whether message-$(messages-count) exists and is newer than the index the client last saw, in which case it returns it.
There's a bit more to it, since it will return multiple messages if several are created at once, and the initial check index ($i) can be passed in as a $_GET parameter, but this is essentially how it works. It's near-instant, and new messages can easily be added to memcache via PHP (each time you create a new message, you increment messages-count).
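A minimal sketch of that pattern, using the key names from the description above (timings and endpoint names are arbitrary):

<?php
// publish.php: called whenever an action creates a message.
$m = new Memcache();
$m->connect('localhost', 11211);

$count = (int)$m->get('messages-count');
$m->set('message-' . ($count + 1), json_encode(['action' => 'something-happened']));
$m->set('messages-count', $count + 1);

<?php
// poll.php: long-polled by each client; answers as soon as newer messages exist.
$m = new Memcache();
$m->connect('localhost', 11211);

$lastSeen = isset($_GET['i']) ? (int)$_GET['i'] : 0;
$deadline = time() + 25; // give up before typical 30-second timeouts

while (time() < $deadline) {
    $count = (int)$m->get('messages-count');
    if ($count > $lastSeen) {
        // Return every message the client hasn't seen yet.
        $out = [];
        for ($i = $lastSeen + 1; $i <= $count; $i++) {
            $out[] = json_decode($m->get('message-' . $i), true);
        }
        echo json_encode(['i' => $count, 'messages' => $out]);
        exit;
    }
    usleep(200000); // check memcache every 0.2 seconds
}
echo json_encode(['i' => $lastSeen, 'messages' => []]); // timed out, no news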
Take a look at PHP memory sharing.