How to get XML content that is generated by PHP

I have an XML URL from a supplier that generates XML content dynamically with PHP, for example:
http://www.example.com/outputxml/index.php?xml_service_id=161
The URL is only open to a single static IP, so I gave the supplier my website's hosting IP. Is there a way to open that URL in a browser, or scrape its data, given that my own internet connection has no static IP?
Thank you.
I have tried the code below:
$url = 'http://www.example.com/outputxml/index.php?xml_service_id=161';
$ch = curl_init();
curl_setopt( $ch, CURLOPT_URL, $url );
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
$result = curl_exec( $ch );   // plain GET; the service takes its parameters in the query string
curl_close( $ch );
echo $result;
But it returned HTML instead of the XML.

Save the content on your server with something like wget and then serve it from there. Note that you will probably be infringing the XML author's policy (I don't know the policy or its consequences, but you should be careful), so consider at least adding .htaccess authentication to the page on your server, just so the XML is not public.
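For example, here is a minimal sketch of that approach: a small script on your hosting server (whose static IP the supplier has whitelisted) fetches the supplier's XML and re-serves it to you. The file name proxy-xml.php and the error handling are just examples, not part of the original setup.

<?php
// proxy-xml.php -- runs on the hosting server whose static IP the supplier whitelisted.
// Protect this file (e.g. with .htaccess authentication) so the XML is not public.
$supplierUrl = 'http://www.example.com/outputxml/index.php?xml_service_id=161';

$ch = curl_init();
curl_setopt( $ch, CURLOPT_URL, $supplierUrl );
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
curl_setopt( $ch, CURLOPT_FOLLOWLOCATION, true );
curl_setopt( $ch, CURLOPT_TIMEOUT, 30 );
$xml = curl_exec( $ch );

if ( $xml === false ) {
    http_response_code( 502 );
    exit( 'Could not fetch supplier XML: ' . curl_error( $ch ) );
}
curl_close( $ch );

// Serve the XML to the caller (your browser, or another script of yours).
header( 'Content-Type: text/xml; charset=utf-8' );
echo $xml;

You could also write the fetched XML to a file and serve that cached copy instead of hitting the supplier on every request.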

Related

Unable to receive inbound XML sent via cURL on remote API server

I am tasked with building an API that receives inbound XML data. On my client, I have this code:
$url = "http://stackoverflow.com";
$xml = '<?xml version="1.0" encoding="UTF-8"?><Request PartnerID="asasdsadsa" Type="TrackSearch"> <TrackSearch> <Title>love</Title> <Tags> <MainGenre>Blues</MainGenre> </Tags> <Page Number="1" Size="20"/> </TrackSearch> </Request>';
$ch = curl_init();
curl_setopt( $ch, CURLOPT_URL, $url );
curl_setopt( $ch, CURLOPT_POST, true );
curl_setopt( $ch, CURLOPT_HTTPHEADER, array( 'Content-Type: text/xml' ) );
curl_setopt( $ch, CURLOPT_POSTFIELDS, "xml=" . $xml );
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
$request = curl_exec( $ch );
curl_close( $ch );
On my remote server, I have this code:
function TransmitRx()
{
    // php://input returns the raw request body exactly as it was sent;
    // it is not parsed into $_POST.
    $xml = trim( file_get_contents( 'php://input' ) );
    file_put_contents( "newStandard/" . rand( 100, 500 ) . "received.xml", $xml );
}
// Listen for inbound data
TransmitRx();
If I open the server endpoint URL directly, an empty file is saved, and I don't know why. But when I run the client-side script, I get nothing. No errors. Nothing.
I have looked at several of the pages here, and every one of them uses a similar cURL statement to send data.
Why am I not receiving any POST data at the API endpoint?
I have also been unsuccessful at finding any useful information on the web.
UPDATE
Final Code that works:
function get_url( $request_url, $payload )
{
    $headers = [
        "Access-Control-Allow-Origin: *",
        "Content-type: text/xml",
        "Content-length: " . strlen( $payload ),
        "Connection: close",
    ];
    $data = [ 'xml' => $payload ];

    $ch = curl_init();
    curl_setopt( $ch, CURLOPT_URL, $request_url );
    curl_setopt( $ch, CURLOPT_CONNECTTIMEOUT, 10 );
    curl_setopt( $ch, CURLOPT_HTTPHEADER, $headers );
    curl_setopt( $ch, CURLOPT_POST, true );
    curl_setopt( $ch, CURLOPT_POSTFIELDS, $data );
    curl_setopt( $ch, CURLOPT_SSL_VERIFYPEER, false );
    curl_setopt( $ch, CURLOPT_SSL_VERIFYHOST, false );

    $response = curl_exec( $ch );
    if ( curl_errno( $ch ) ) {
        print curl_error( $ch );
    }
    curl_close( $ch );

    return $response;
}

$request_url = 'https://stackoverflow.com/posts';
$response = get_url( $request_url, $payload );
$request_url = 'https://stackoverflow.com/posts';
$response = get_url($request_url, $payload);
I wish I knew for sure what caused it to start working. This is the page I was reading last:
https://www.php.net/manual/en/curlfile.construct.php
Okay, so if adding the lines I suggested to disable certificate/peer validation enabled the process to work, then that just means the remote server is using an SSL certificate that is not trusted by cURL.
Those lines are NOT the final fix. You never want to disable SSL validation in a real environment. I only suggested you temporarily disable them to see if that was truly the problem. If you leave them disabled, then you are leaving your code vulnerable to man-in-the-middle (MITM) attacks.
The correct fix is usually to point cURL at an updated CA bundle. The official cURL website graciously provides one here:
https://curl.haxx.se/docs/caextract.html
The process is to download the CA bundle file, put it somewhere your script can find it, and then add this cURL option:
curl_setopt ($ch, CURLOPT_CAINFO, "full path to the CA bundle file.pem");
If the remote server's certificate came from any of the major CA vendors out there, this should be all you need to do.
If the remote server's certificate is self-signed or similar, then you might need to download the specific CA certificate and any supporting intermediate CA certificates and tell cURL where to find them.
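To make that concrete, here is a minimal sketch of the proper fix, assuming the downloaded bundle was saved as cacert.pem next to the script (the file location and the target URL, reused from the code above, are just examples):

$ch = curl_init();
curl_setopt( $ch, CURLOPT_URL, 'https://stackoverflow.com/posts' );
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );

// Point cURL at the downloaded CA bundle and keep verification ON.
curl_setopt( $ch, CURLOPT_CAINFO, __DIR__ . '/cacert.pem' );
curl_setopt( $ch, CURLOPT_SSL_VERIFYPEER, true );
curl_setopt( $ch, CURLOPT_SSL_VERIFYHOST, 2 );   // 2 = verify that the certificate matches the host name

$response = curl_exec( $ch );
if ( $response === false ) {
    echo 'cURL error: ' . curl_error( $ch );
}
curl_close( $ch );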

Shell scripting using cURL on Windows for a beginner

I have a long list of URLs saved in Excel and a shell script to use with cURL to find all URLs with errors, redirects, and/or connection timeouts. I have never used cURL or a shell, and I need to do this on Windows. So far I only know how to get to "C:\MyCurl>curl". I have my shell script saved in Notepad. Can someone please tell me, in specific detail, what to do, including what the script and the URL list should be saved as? It would be very helpful, as I do not want to check each URL manually, and it could be useful many times in the future. Thank you very much for your time.
I don't have enough details to answer your question about using cURL with a shell script on your local machine, but you may want to consider avoiding a shell altogether. PHP has a robust cURL library. If you add the function below to your PHP script and call it with the URL as the parameter, like
$result = file_get_contents_curl( $url );
you will have code that's far more portable and scalable:
function file_get_contents_curl( $url ) {
    $ch = curl_init();
    curl_setopt( $ch, CURLOPT_AUTOREFERER, TRUE );
    curl_setopt( $ch, CURLOPT_HEADER, 0 );
    curl_setopt( $ch, CURLOPT_RETURNTRANSFER, 1 );
    curl_setopt( $ch, CURLOPT_URL, $url );
    curl_setopt( $ch, CURLOPT_FOLLOWLOCATION, TRUE );
    $data = curl_exec( $ch );
    if ( curl_errno( $ch ) ) {
        curl_close( $ch );
        return FALSE;
    }
    curl_close( $ch );
    return $data;
}
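For instance, a rough sketch of how you might drive that function over a plain-text list of URLs, one per line (the urls.txt file name is just an example; export your Excel column to such a file), reporting the ones that fail:

// Hypothetical input: urls.txt, one URL per line.
$urls = file( 'urls.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES );

foreach ( $urls as $url ) {
    $result = file_get_contents_curl( $url );
    if ( $result === FALSE ) {
        echo "FAILED: $url\n";   // cURL reported an error (unreachable, timed out, etc.)
    } else {
        echo "OK:     $url\n";
    }
}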

Another way to check whether an image URL is valid

I am trying to import lots of images, listed in a .csv file, into my database, but before importing I am trying to check whether each image URL is valid by using PHP's getimagesize() function.
When I use getimagesize() to validate each URL, the import runs very slowly.
Is there any other way to check the image URLs so that the import runs faster? Please suggest one. Thanks in advance.
The fastest way to test a remote URL is using cURL with the NOBODY flag enabled (a HEAD request, so no body is downloaded), if the server supports it:
function remoteFileExists( $url )
{
    $ch = curl_init();
    curl_setopt( $ch, CURLOPT_URL, $url );
    curl_setopt( $ch, CURLOPT_NOBODY, 1 );
    curl_setopt( $ch, CURLOPT_FAILONERROR, 1 );
    curl_setopt( $ch, CURLOPT_RETURNTRANSFER, 1 );
    $output = ( curl_exec( $ch ) !== false );
    curl_close( $ch );
    return $output;
}
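As a rough sketch of how this fits into the import, assuming the image URL sits in the first column of your CSV (the images.csv file name and the column index are assumptions; adjust them to your data):

$handle = fopen( 'images.csv', 'r' );            // hypothetical CSV file
while ( ( $row = fgetcsv( $handle ) ) !== false ) {
    $imageUrl = $row[0];                         // assumes the URL is in the first column

    if ( ! remoteFileExists( $imageUrl ) ) {
        // Skip (or log) rows whose image URL does not respond successfully.
        continue;
    }

    // ... insert the row into the database here ...
}
fclose( $handle );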

Clearing the Facebook URL cache from PHP

I have an issue with the Facebook linter. I'm trying to clear the Facebook cache of one of my URLs with a PHP cURL query. As you can see here (Updating Objects), Facebook has an API for re-scraping all the content of a URL and updating its Facebook meta data. It's the same API used by the Facebook Debugger tool.
If I use the debugger manually, there is no problem: my URL is correctly scraped and all the correct meta tags are retrieved. But that's not the case with a PHP cURL query.
Here is my current code:
$furl = 'https://graph.facebook.com';
$ch = curl_init();
curl_setopt( $ch, CURLOPT_URL, $furl );
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
curl_setopt( $ch, CURLOPT_POST, true );
$params = array(
'id' => "http://www.gametrader-app.com/aTPVev",
'scrape' => true,
);
$data = http_build_query( $params );
curl_setopt( $ch, CURLOPT_POSTFIELDS, $data );
curl_setopt( $ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt( $ch, CURLOPT_FOLLOWLOCATION, 1);
$resultLinter = curl_exec( $ch );
The API result shows the incorrect/old Facebook meta data.
Moreover, I have tried calling the Debugger URL directly, without success:
curl_setopt( $ch , CURLOPT_URL, "http://developers.facebook.com/tools/debug/og/object?q=http://www.example.com");
curl_setopt( $ch , CURLOPT_HEADER , 0 );
curl_setopt( $ch , CURLOPT_RETURNTRANSFER , true );
curl_setopt( $ch , CURLOPT_USERAGENT , $useragent );
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$str_response = curl_exec( $ch );
I need to do this for one reason. Here is the full process:
Publish a new ad in the iPhone app "GameTrader".
Post an Open Graph action on Facebook for that ad.
That's all ;)
If I don't clear the URL cache on the Facebook side before posting the Open Graph action, I get this error:
Error during open graph : OAuthException: (#3502) Object at URL
http://www.gametrader-app.com/aTPVev has og:type of 'website'. The property 'ad' requires an object of og:type 'gametrader-app:ad'
Can you help me understand what I'm doing wrong?
Cheers!
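For what it's worth, here is the same scrape request wrapped in a small helper; the function name is hypothetical, and the access_token parameter is an assumption on my part (the Graph API generally expects an app access token for this call, so check the current Facebook documentation):

// Sketch only: re-scrape a URL via the Graph API. The access token requirement
// and its value are assumptions -- verify against the current Facebook docs.
function facebookRescrape( $objectUrl, $accessToken )
{
    $ch = curl_init();
    curl_setopt( $ch, CURLOPT_URL, 'https://graph.facebook.com' );
    curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
    curl_setopt( $ch, CURLOPT_POST, true );
    curl_setopt( $ch, CURLOPT_POSTFIELDS, http_build_query( array(
        'id'           => $objectUrl,
        'scrape'       => true,
        'access_token' => $accessToken,
    ) ) );

    $result = curl_exec( $ch );
    curl_close( $ch );

    return json_decode( $result, true );   // the linter replies with JSON describing the scraped object
}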

Login through cURL and browser

I'm working with Drupal 7 and the REST API.
I'd like to log in to Drupal with cURL and then, when I visit the website with a browser, already be logged in.
I'm using this code:
<?php
$user_data = array(
'username' => 'usertest',
'password' => 'pass',
);
$ch = curl_init();
$tmpfname = dirname(__FILE__).'/'.$_COOKIE['PHPSESSID'].'.txt';
curl_setopt($ch, CURLOPT_URL, "http://example.com/networklogin/user/login");
curl_setopt($ch, CURLOPT_POST, TRUE);
curl_setopt($ch, CURLOPT_POSTFIELDS, $user_data);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
curl_setopt( $ch, CURLOPT_COOKIESESSION, true );
curl_setopt( $ch, CURLOPT_COOKIEJAR, $tmpfname );
curl_setopt( $ch, CURLOPT_COOKIEFILE, $tmpfname );
$response = curl_exec($ch);
$http_code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);
?>
And it works pretty well: it logs the user in, but when I visit "example.com" I'm not logged in as "usertest".
Is there any way to log me into Drupal and let me access it with a browser?
The goal is to be logged in to several websites with the same user when I log in to one of them, as I wrote here (Same webserver, same drupal, same db, single sign on?).
OK, I found a solution here:
Login to website through CURL and client browser
Using cURL's cookies for the user's browser is pretty much impossible.
