Send data from one website to another at the browser level - PHP

I need to send data (no more than 5000 characters) from one website to another at the browser level. How can I do it?
I can put any PHP code on the first server, but the transfer should not happen at the server level (no SOAP, cURL, etc.) for performance and stability reasons: pages need to load fast, and the transfer should start only after the page has loaded.
So, on the first server I have the data, which I need to send 'on the fly'; on the second I have a PHP script that catches it. It is not necessary to catch the response from the server.
As far as I know, AJAX can only talk to the same domain (the same-origin policy).
One method I know is to create an image tag whose src points to a file on the second server, like www.test.com/myimage.png?param="testtsttest". But GET has length limitations.

You can use JSONP, which allows you to transfer JSON data between two servers from the browser: a dynamically injected <script> tag is not subject to the same-origin policy that blocks plain AJAX.
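A minimal sketch of the receiving side, assuming a hypothetical jsonp.php endpoint on the second server; the first site would just inject <script src="http://second-server/jsonp.php?callback=cb&data=..."></script> after the page loads. Note the payload still travels in the URL, so the GET length limits mentioned above apply here too.

<?php
// jsonp.php on the second server (sketch; endpoint and parameter names are examples).
header('Content-Type: application/javascript');
// Whitelist callback characters so the response stays valid JavaScript.
$callback = isset($_GET['callback']) ? preg_replace('/[^A-Za-z0-9_]/', '', $_GET['callback']) : 'cb';
$data = isset($_GET['data']) ? $_GET['data'] : '';
// ...store or process $data here...
echo $callback . '(' . json_encode(array('received' => strlen($data))) . ');';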

How to manage uploading operations one by one

I'm trying to send certain data from iOS to an online MySQL database. PHP is used on the server to handle receiving and inserting the data.
The thing is that I have several data packages, and the key is to send them one by one, which means I need a mechanism to make the second data package in the queue wait until iOS has received feedback from the server confirming that the first set of data has been stored in the database.
I initially tried creating a serial dispatch queue, aiming to have the iOS app execute the uploading work in sequence. The iOS side did carry out the work in sequence, but each task simply "finished" upon sending out its data package, without waiting to see whether the data had been inserted into the database. The problem is that there will always be some time lapse between sending the data and the data being fully saved to MySQL on the server, due to issues like network latency.
So the data may not be saved in the desired sequence; some later data may be saved earlier than previous data.
I guess what is missing is a "feedback" mechanism from the server side to the iOS side.
Can anybody suggest a way to implement this feedback mechanism, so that I can control the serial sequence of the upload tasks?
Thank you very much!
Regards,
Paul
If you are sending data to a server, then most of the available frameworks offer callbacks. With AFNetworking (or its Swift successor, Alamofire) it would look like this:
[[ConnectionManager instance] GET:@"link" parameters:nil
    success:^(AFHTTPRequestOperation *operation, id responseObject) {
        // The server has responded: safe to start the next upload here.
    }
    failure:^(AFHTTPRequestOperation *operation, NSError *error) {
        // Handle the error, e.g. retry this package before moving on.
    }];
So you can put your code in the given handlers and chain the requests, kicking off each upload from the success block of the previous one.
You may also want to create concurrent NSOperations and put them on an NSOperationQueue with the proper dependencies set, but that is surely more time-consuming.
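On the PHP side, the "feedback" is simply the HTTP response body, which is only sent once the INSERT has finished. A minimal sketch of an acknowledging endpoint (the table, column, and credential names are made up):

<?php
// receive.php - stores one data package and confirms it (sketch).
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO packages (payload) VALUES (?)');
$ok = $stmt->execute(array($_POST['payload']));
// Because this response is written after execute() returns, a success
// callback on the iOS side implies the row is already in MySQL.
header('Content-Type: application/json');
echo json_encode(array('stored' => (bool)$ok));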

PHP - Could you use the POST method to stream data to a server over a long period of time?

Caveat: I know this has the potential to be a ridiculously stupid question, but I had the thought and want to know the answer.
Aim: run an interactive session between browser and server with a single request, without AJAX, WebSockets, etc.
Scenario: a PHP file on the server receives data sent by POST from a user. The Content-Length header says 8 MB, so the server keeps the connection open until it has received the full 8 MB. But on the user side we deliver this data very, very slowly (simulating a terrible connection, for example), so the server receives it a little at a time. [Can the data be passed to the PHP file a piece at a time, or only once all of it has been received?] The script then does whatever it wants with those pieces and delivers output to the browser in an echo loop. At certain intervals, the user injects new data into the 'stream', surrounded by a continuous stream of padding data.
Is any of that possible? Even with CGI? I expect it not to be possible, really, but what stops the process from timing out if someone genuinely does have a terrible connection and the POST data is huge?
As far as I know you could do this, but the PHP script you are calling with the POST data will only be invoked by the web server once all the data has been received. Otherwise, say you were sending an image with POST and your PHP script moved that image from the temp-files directory to another directory before all the data had arrived: you would have a corrupt image, nothing more, nothing less.
As long as the ini configuration has been adjusted accordingly, I would think so. But it would be a great test to try out!
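For reference, the ini directives that govern a large, slow POST body would be along these lines (the values are illustrative, not recommendations):

; php.ini - limits relevant to an 8 MB body trickling in slowly
post_max_size = 16M        ; must exceed the request body size
upload_max_filesize = 16M  ; only matters for file uploads
max_input_time = 600       ; seconds PHP may spend reading the request
max_execution_time = 600   ; seconds the script itself may then run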

Getting a big amount of data from a very slow external data source

I need to receive a big amount of data from an external source. The problem is that the external source sends data very slowly. The workflow is like this:
The user initiates some process from the app interface (commonly, fetching data from a local XML file). This is a quite fast process.
After that we need to load information connected with the fetched data from the external source (basically, external statistics for the data from the XML). And this is very slow. But the user needs this additional information to continue working; for example, he may perform filtering according to the external data.
So we need to do it asynchronously. The main idea is to show the external data as it becomes available. The question is how we could organise this async process. Maybe some queues or something else? We're using PHP + MySQL as the backend and jQuery on the front end.
Thanks a lot!
Your two possible strategies are:
Do the streaming on the backend, using a PHP script that cURLs the large external resource into a database or memcache and responds to periodic requests for new data by flushing those DB rows or cache entries into the response (a sketch follows below).
Do the streaming on the frontend, using the cross-browser JavaScript technique explained in this answer. In Gecko and WebKit, the XmlHttpRequest onreadystatechange event fires every time new data is received, making it possible to stream data slowly into the JavaScript runtime. In IE, you need an iframe workaround, also explained in the Ajax Patterns article linked in the above SO post.
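The backend half of the first strategy can be sketched as a tiny polling endpoint the frontend hits every few seconds; all names below are hypothetical:

<?php
// poll.php - return rows added since the last id the client saw (sketch).
$since = isset($_GET['since']) ? (int)$_GET['since'] : 0;
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('SELECT id, payload FROM external_stats WHERE id > ? ORDER BY id');
$stmt->execute(array($since));
header('Content-Type: application/json');
// The jQuery side remembers the highest id it has seen and passes it back.
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));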
One possible solution would be to make the cURL call using system(), with the output redirected to a file. That way PHP would not hang until the call finishes. From the PHP manual for system():
If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.
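In practice that might look like the following (the URL and file path are illustrative):

<?php
// Kick off the slow download in the background; PHP returns immediately
// because stdout/stderr are redirected and the command is backgrounded.
system("curl -s 'http://stats.example.com/export' > /tmp/external_stats.json 2>&1 &");
echo 'Fetch started.';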
This would split the data gathering from the user interface. You could then work with the locally stored data by several means, for example:
employ an iframe in the GUI that refreshes itself at some interval and fetches data from the locally stored file (and possibly stores it in the database or wherever),
use jQuery to make AJAX calls to get the data and manipulate it,
use some CGI script that runs in the background, handles the database writes too, and displays the data from the DB directly using one of the above,
and dozens more I can't think of right now...

PHP - Manage data from URL variables

Using a PHP script, I need to manage data sent to the script as URL variables.
The URL sent is something like: http://hawkserv.co.uk/heartbeat.php?port=25565&max=32&name=My%20Server&public=True&version=7&salt=wo6kVAHjxoJcInKx&players=&worlds=guest&motd=testtet&lvlcount=1&servversion=67.5.0.1&hash=randomhash&users=0
(clicking the link returns a formatted table of the results)
What is the best method of storing this information for it to be used in a formatted HTML page?
Multiple URLs will be sent to the script with different values. The script needs to store each response to be used later, and also "time out" responses that haven't been updated in a while.
Example scenario:
Three servers exist: Server 1, Server 2, and Server 3. Each of them sends the above URL every 45 seconds, with a few values changed per server. A formatted table displays the information when the page is requested, and on refresh it reflects any new information the servers have sent.
Server 1 goes offline and stops sending requests. The script accounts for this lack of requests and removes Server 1's information from the list, declaring it offline.
Although code would be greatly appreciated, I think I can work from just the best way of doing it. Is it storing each URL as an array in a file and reading the file when needed, or is there some other way?
I would store the variables plus the time the request was received in a database. The database can be an SQLite one if you don't want to go through the hassle of setting up a full-blown system. The advantage of using SQLite over dumping arrays to a file is that you can run flexible queries without coding up parsing routines and the like.
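A rough sketch of that approach, keeping only a few of the URL's fields (the table layout and file path are made up):

<?php
// heartbeat.php - record the latest report from each server (sketch).
$db = new PDO('sqlite:/var/data/heartbeats.sqlite');
$db->exec('CREATE TABLE IF NOT EXISTS heartbeats (
    name TEXT PRIMARY KEY, port INTEGER, players TEXT, seen INTEGER)');
$stmt = $db->prepare('INSERT OR REPLACE INTO heartbeats VALUES (?, ?, ?, ?)');
$stmt->execute(array($_GET['name'], (int)$_GET['port'], $_GET['players'], time()));

// When rendering the table, treat anything silent for more than 90 s
// (two missed 45-second beats) as offline:
$rows = $db->query('SELECT * FROM heartbeats WHERE seen > ' . (time() - 90));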

ajax multi-threaded

Is it possible to achieve true multi-threading with AJAX? If so, how? Please point me to some related information, websites, or books.
It depends on what you mean by "multithreaded".
JavaScript code is distinctly single-threaded. No JavaScript code will interrupt any other JavaScript code currently executing on the same page. An AJAX (XHR) request will trigger the browser to do something and (typically) call a callback when it completes.
On the server, each AJAX request is a separate HTTP request. Each of these executes on its own thread and, depending on the web server config, may not even execute on the same machine. Each PHP script instance is entirely separate, even when running the same script; there is no shared state per se.
Browsers typically cap the number of simultaneous AJAX requests a page can make on a per-host basis, and this number is typically 2. I believe you can change it, but since the majority of people will have the default value, you have to assume it will be 2. Further requests queue until an existing request completes, which can lead to annoying workarounds like creating multiple host names: req1.example.com, req2.example.com, etc.
The one exception is sessions, but they aren't multithreaded: starting a session blocks every other script attempting to start the exact same session (based on the cookie). This is one reason to minimize the amount of time a session stays open. Arguably you could use a database or something like memcache to kludge inter-script communication, but that's not really what PHP is about.
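As an aside, the session lock can be released as soon as you are done writing to it, which lets parallel AJAX requests from the same user proceed; a small illustration:

<?php
session_start();
$userId = $_SESSION['user_id'];   // read whatever the script needs
session_write_close();            // release the per-session lock now
// Long-running work below no longer blocks other requests
// arriving with the same session cookie.
do_expensive_work($userId);       // hypothetical slow operation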
PHP is best used for simple request processing. A request is received. It is processed and a response is returned. That response could be HTML, XML, text, JSON or whatever. The request could be an HTTP request from the browser or an AJAX request.
Each of these request-response cycles should, where possible, be treated as separate entities.
Another technique used is long-polling. An HTTP request is sent to the server and may not return for a long time. This is used for Web-based chat and other "server push" type scenarios. Sometimes partial responses will be flushed without ending the request.
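A bare-bones long-poll endpoint might look like this; check_for_new_message() is a hypothetical stand-in for whatever data source you poll:

<?php
// longpoll.php - hold the connection open until there is something to say.
set_time_limit(65);                      // generous safety limit
for ($i = 0; $i < 60; $i++) {
    $message = check_for_new_message();  // hypothetical data source
    if ($message !== null) {
        echo json_encode($message);      // respond the moment data arrives
        exit;
    }
    sleep(1);                            // re-check once per second
}
echo json_encode(null);                  // timed out: the client simply re-polls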
The last option (on Unix/Linux at least) is that PHP can spawn processes, but that doesn't seem to be what you're referring to.
So what is it exactly you're trying to do?
You can't actually multi-thread, but what a lot of larger websites do is flush the output for the page and then use AJAX to load additional components on the fly, so that the user sees content even while the browser is still requesting new information. It's a good technique to know, but, like everything else, you need to be careful how you use it.
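The flushing half of that technique takes only a few lines of PHP; run_slow_query() is a stand-in for the expensive part, and this assumes output buffering is off (otherwise call ob_flush() as well):

<?php
echo '<html><body><div id="content">Loading...</div>';
flush();                       // push the page shell to the browser now
$data = run_slow_query();      // hypothetical slow operation
// The user has been looking at the shell the whole time; now fill it in
// (an AJAX call back to another script would work just as well).
echo '<script>document.getElementById("content").textContent = '
    . json_encode($data) . ';</script></body></html>';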
