send xml to external site in background - php

I have a form allowing a user to sign up for a newsletter. It submits back to the page it sits in, which validates the input and adds the content to the db. However, I also need to send an XML file, built from the information collected in the form, to a third party so they can add the user to a mailing list. The third party appears to require the data to be sent using the POST method.
How can I achieve this?
I tried AJAX, but realised after a bit that AJAX isn't able to send info to external domains, so I abandoned that.
Essentially the site needs to reload the page, validate the info sent to it, either return errors or add the info to the db, and fire off the XML in the background, so having it send a separate form after reload isn't ideal either. Also, when the XML is sent through the main form, the third-party page loads its own page, which is far from pretty and takes the user away from our site, not good at all.

You will have to validate in PHP and then send the XML from the server, for example with cURL:
<?php
$hCurl = curl_init();
curl_setopt($hCurl, CURLOPT_PUT, true);
curl_setopt($hCurl, CURLOPT_HEADER, true);
curl_setopt($hCurl, CURLOPT_RETURNTRANSFER, true);
curl_setopt($hCurl, CURLOPT_CONNECTTIMEOUT, 60);
curl_setopt($hCurl, CURLOPT_URL, $URL_TO_UPLOAD);
curl_setopt($hCurl, CURLOPT_HTTPHEADER, $aCurlHeaders); // e.g. array('Content-Type: text/xml')
$fp = fopen($XML_FILE, 'r');
if ($fp === false) {
    die('Could not open ' . $XML_FILE);
}
curl_setopt($hCurl, CURLOPT_INFILE, $fp);
curl_setopt($hCurl, CURLOPT_INFILESIZE, filesize($XML_FILE));
$sResp = curl_exec($hCurl);
curl_close($hCurl);
fclose($fp);
?>
Just replace $URL_TO_UPLOAD with the URL you want to send to, $XML_FILE with the file you want to send, and $aCurlHeaders with whatever request headers the third party requires, and we are done! Note that this uploads the file with a PUT request; if the third party strictly requires POST, use CURLOPT_POST with CURLOPT_POSTFIELDS instead.

I would recommend getting your server to submit the data to the third party once it has added the information to the database. It can even queue up this process and deal with it at a later date if needed.
There are lots of ways of doing this in PHP, such as cURL.
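For example, a minimal sketch of POSTing an XML payload with cURL (the URL, the XML structure and the header are placeholders, not the third party's actual API):

<?php
// Minimal sketch: POST an XML payload to a third party with cURL.
// $thirdPartyUrl and the XML below are placeholders for illustration only.
$thirdPartyUrl = 'https://example.com/mailing-list';
$xml = '<subscriber><email>user@example.com</email></subscriber>';

$ch = curl_init($thirdPartyUrl);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $xml);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: text/xml'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
$response = curl_exec($ch);
curl_close($ch);
?>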

How about having the XML generated and sent by your server rather than by your user's browser? You could still use AJAX against your own page, and you'd have no headaches about users leaving your site.
Something along the lines of
Browser -> Server
Server -> write into own DB
Server -> generate an XML file and send it to the foreign server
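A hedged sketch of the "generate an XML file" step, building the document from the posted form fields (the element names are assumptions); the result can then be POSTed with cURL exactly as in the snippet further up:

<?php
// Hedged sketch: build the XML server-side from the submitted form fields.
// Element names are assumptions; send $payload with cURL as shown earlier.
$xml = new SimpleXMLElement('<subscriber/>');
$xml->addChild('name', isset($_POST['name']) ? $_POST['name'] : '');
$xml->addChild('email', isset($_POST['email']) ? $_POST['email'] : '');
$payload = $xml->asXML();
?>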

PHP : understand the CURL timeout

From a PHP page, I have to do a GET request to another PHP file.
I don't care about waiting for the response of the GET or knowing whether it succeeded or not.
The called script could also take 5-6 seconds to finish, so I don't know how to handle the GET timeout given the above.
The code is this
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://mywebsite/myfile.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, false);
curl_setopt($ch, CURLOPT_TIMEOUT, 1);
$content = trim(curl_exec($ch));
curl_close($ch);
For the first task (where you don't need to wait for the response) you can start a new background process, and below that write the code that redirects the user to another page.
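A minimal fire-and-forget sketch of that idea (the host and path are taken from the question's URL; plain HTTP on port 80 is assumed): open a socket, write the GET request, and close without ever reading the reply.

<?php
// Fire-and-forget sketch: send the GET and return immediately, never reading the response.
// Host and path come from the question's URL; adjust as needed.
$fp = fsockopen('mywebsite', 80, $errno, $errstr, 1); // 1-second connect timeout
if ($fp) {
    $out  = "GET /myfile.php HTTP/1.1\r\n";
    $out .= "Host: mywebsite\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    fclose($fp); // close right away; we don't care about the reply
}
// ...now redirect or carry on with the rest of the page
?>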
Yeah, you definitely shouldn't be creating a file on the server in response to a GET request. Even as a side-effect, it's less than ideal; as the main purpose of the request, it just doesn't make sense.
If you were doing this as a POST, you'd still have the same issue to work with, however. In that case, if the action can't be guaranteed to happen quickly enough to be acceptable in the context of HTTP, you'll need to hive it off somewhere else. E.g. make your HTTP request send a message to some other system which then works in parallel whilst the HTTP response is free to be sent back immediately.
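One hedged sketch of that hand-off, assuming a simple MySQL-backed job queue (the table and column names are made up) with a cron-run worker doing the slow part:

<?php
// Hedged sketch: the web request only records a job; a separate worker does the slow work.
// The 'jobs' table and its columns are assumptions for illustration.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// In the HTTP request handler: enqueue and respond immediately.
$stmt = $pdo->prepare("INSERT INTO jobs (payload, status) VALUES (?, 'pending')");
$stmt->execute(array(json_encode($_POST)));

// In a worker script run by cron: pick up pending jobs and process them.
foreach ($pdo->query("SELECT id, payload FROM jobs WHERE status = 'pending'") as $job) {
    // ...do the slow work here (e.g. the cURL call), then mark the job as done
    $pdo->prepare("UPDATE jobs SET status = 'done' WHERE id = ?")->execute(array($job['id']));
}
?>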

Page waiting for new POST VARIABLE

I want to redirect my page when I get a POST variable from another, external domain. My page is:
http://goo.gl/kpm2GT
When you push the red button "Realizar Pago", a new window automatically opens to the bank payment platform. Well, when you finish all the bank payment steps, this external site sends some POST variables with important data back to my page.
This is what I want: when someone clicks "Realizar Pago", the page should wait for the new $_POST variables (from the payment platform), so that when the POST variables have been sent to my page, I can redirect to a "payment successful" page.
Thanks for help guys, and sorry for my english.
This is not possible in the way you think about it.
PHP executes each request separately. When your server handles the request from the external service, it doesn't know anything about the other request from your user.
The $_POST array is unique to each request and cannot be shared across requests.
Okay, it sounds like you want to connect to an outside web service from your page and then display the results to your users. In PHP, you'd probably want to create a form processor that takes the user's data and then uses cURL to pass it along to the banking end. Once the bank receives the request, they will send back a response which you can then display to the user, or redirect them to a page that says it was a success.
cURL will wait for a while (you can specify how long it waits) for the response from the banking folks. In this example, I have told the program to wait for 30 seconds. If it finishes before the 30 seconds, it will go ahead and close the connection.
<?php
$bank_url = 'http://www.bank.com';
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $bank_url);
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 30);
$response = curl_exec($ch);
curl_close($ch);
print $response;
?>
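Note the block above only fetches $bank_url without sending any form data. A hedged sketch of the "form processor" idea from the prose, with placeholder field names and a hypothetical success page (neither comes from the bank's real API):

<?php
// Hedged sketch: forward the user's submitted fields to the bank as a POST body.
// Field names and the success page are placeholders, not the bank's real API.
$ch = curl_init('http://www.bank.com');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
    'amount'    => isset($_POST['amount']) ? $_POST['amount'] : '',
    'reference' => isset($_POST['reference']) ? $_POST['reference'] : '',
)));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 30);
$response = curl_exec($ch);
curl_close($ch);

// If the bank replied at all, send the user to a hypothetical success page.
if ($response !== false) {
    header('Location: /pago-correcto.php');
    exit;
}
?>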

what would be the proper way to automate an xml import

I've written a script that imports data from an XML file into the MySQL database by selecting it from the source disk and uploading it via a submit button. But what if a 3rd-party application were to be used to automate this import? Would it be proper to check if a GET parameter containing an XML path exists, grab its content, and import it the same way I did before? Or is there a better method?
By GET parameter I mean something like this:
http://domain.com/import.php?path=externaldomain.com/xml/page.xml
It depends on what kind of data you are importing. If you import data from an RSS feed, this method is fine. But if you are going to import personal data, this might not really be a good method.
I would suggest something more secure if you are working with critical data that others are not supposed to see. You can start thinking about importing the XML files through FTP, or downloading them from behind a server-secured folder. Ask the 3rd-party application to upload the XML files to a secure location of your choosing. Anything that goes on behind some kind of security is better than the suggested method for personal data.
Firstly, I'd advise using cURL. It doesn't matter how huge your XML will be, you'll have fewer problems with memory.
$fp = fopen('/var/www/vhosts/my.com/xml/feed.xml', 'w'); // opening file handle to write the feed into
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://domain.com/xml/page.xml'); // setting URL to take XML from
curl_setopt($ch, CURLOPT_ENCODING, 'gzip'); // in case the result is gzipped
curl_setopt($ch, CURLOPT_SSLVERSION, 3); // OpenSSL issue
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, 0); // wildcard certificate
curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, 2);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 0); // disabling buffered output: we want the XML written straight to the file, not returned into a variable
curl_setopt($ch, CURLOPT_FILE, $fp); // hand the opened (writable!) file handle over to cURL
$result = curl_exec($ch); // executing download
$response_code = (int)curl_getinfo($ch, CURLINFO_HTTP_CODE); // HTTP return code for our request: was it successful or not?
curl_close($ch);
fclose($fp);
Thus, you can download/save your XML feed directly to a file even if it is behind SSL and gzipped.
Using curl_getinfo() you can get diverse information about your request. If the procedure is supposed to be automated, it would be wise to decide what to do if the request fails.
Then, if the file is not large (I mean really large files, above 200-300 MB), you can just use the SimpleXML library (available since PHP 5) and parse your data. If you are on PHP 4 (still possible today), try libxml, which is very useful too.
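A minimal SimpleXML sketch for the feed saved above (the element names and the table layout are assumptions):

<?php
// Minimal sketch: parse the downloaded feed and insert rows into MySQL.
// Element names (<item>, <title>, <link>) and the table layout are assumptions.
$xml = simplexml_load_file('/var/www/vhosts/my.com/xml/feed.xml');
if ($xml === false) {
    die('Could not parse the feed');
}
$pdo = new PDO('mysql:host=localhost;dbname=import', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO items (title, url) VALUES (?, ?)');
foreach ($xml->item as $item) {
    $stmt->execute(array((string)$item->title, (string)$item->link));
}
?>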
If the file you retrieved is rather huge :) a MySQL database with FILE permissions is your friend.

How do I pass Variables from my site to another site without leaving my site?

I would like to be able to send variables to another website without actually going to that website, using PHP.
I am building an ecommerce website where the shipping warehouse is being outsourced. After the customer checks out with their products, I would like to send some variables over to the shipper's website as $_GET['vars']. This is a completely different URL. The problem is, I don't want the customer actually going to the shipper's webpage; I just want to ping that info over there.
Is it possible to send variables via URL to another site without leaving yours?
Yes, you can. The simplest way:
$contents = file_get_contents("http://example.com/some/page.php?var=abcd");
For more advanced features, see cURL.
You should be storing all the relevant order info in your database, then using cron to trigger a script that processes the unprocessed orders. That way systematic checks can be made on orders before any request to your outsourced site. Don't rely on the user's browser hitting a certain point in the order process to trigger the API call, or trust them not to triple-click or inject values before submitting.
And I advise using cURL to do your actual request, as it's faster. Something as simple as:
<?php
function curl_do_api($url) {
    if (!function_exists('curl_init')) {
        die('Sorry cURL is not installed!');
    }
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    $output = curl_exec($ch);
    curl_close($ch);
    return $output;
}
?>
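Called from the cron-triggered script, usage might look like this (the shipper URL and query parameters are placeholders):

<?php
// Hypothetical usage: the URL and query-string parameters are placeholders.
$response = curl_do_api('https://shipper.example.com/api?order_id=123&postcode=AB1+2CD');
if ($response === false) {
    // request failed: leave the order marked as unprocessed so cron retries it later
}
?>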
Actually, there is a simpler way of solving this... using AJAX.
Include the jQuery library first.
Refer here:
http://api.jquery.com/jQuery.ajax/
Both are right, and you should make your choice based on the security you're looking for.
cURL will be more secure, and you should use it if you do not want to pass arguments in the query string. When you pass data using file_get_contents("URL?data=value") you will have a limit of roughly 2 KB on the data being passed.
On the other side, cURL used over HTTPS is much more secure, and with cURL you will also be able to post files and emulate a form post.
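If you do stay with file_get_contents(), a stream context lets it POST instead of putting everything in the query string; a minimal sketch (the URL and field are placeholders):

<?php
// Hedged sketch: POST with file_get_contents() via a stream context,
// avoiding the query-string length limit. URL and field are placeholders.
$context = stream_context_create(array(
    'http' => array(
        'method'  => 'POST',
        'header'  => 'Content-Type: application/x-www-form-urlencoded',
        'content' => http_build_query(array('data' => 'value')),
        'timeout' => 10,
    ),
));
$response = file_get_contents('https://example.com/some/page.php', false, $context);
?>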

How to process XML returned after submitting a form?

I have just started a project that involves me sending data using POST in HTML forms to another company's server. This returns XML. I need to process this XML to display certain information on a web page.
I am using PHP and have no idea where to start with how to access the XML. Once I know how to get at it, I know how to access it using XPath.
Any tips on how to get started, or links to sites with information on this, would be very useful.
You should check out the DOMDocument() class, it comes as part of the standard PHP installation on most systems.
http://us3.php.net/manual/en/class.domdocument.php
Ohhh, I see. You should set up a PHP script that the user's form posts to. If you want to process the XML response, that script should then pass those fields on to the remote server using cURL.
http://us.php.net/manual/en/book.curl.php
A simple example would be something like:
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://the_remote_server");
curl_setopt($ch, CURLOPT_POST, TRUE);
curl_setopt($ch, CURLOPT_POSTFIELDS, $_POST);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE); // so the XML comes back as a string instead of being echoed
$YourXMLResponse = curl_exec($ch);
curl_close($ch);
?>
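From there, the XPath approach the asker already knows might look something like this (the expression and element names are placeholders for whatever the real XML contains):

<?php
// Hedged sketch: load the returned XML into DOMDocument and query it with XPath.
// '//item/title' is a placeholder expression; adjust it to the real XML structure.
$doc = new DOMDocument();
$doc->loadXML($YourXMLResponse);
$xpath = new DOMXPath($doc);
foreach ($xpath->query('//item/title') as $node) {
    echo htmlspecialchars($node->nodeValue), "<br>\n";
}
?>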
