Suppose I have an API at api.example.com; this code currently gets the contents of api.example.com/Browser/API/ping.php (a simple ping script). Now I want to be able to do something like api.example.com/ping/site_to_ping (keep in mind, the folder "ping" doesn't exist, nor do I have a folder for every existing site). This would then execute the following: file_get_contents("api.example.com/Browser/ping?*site_to_ping*");
Is this possible?
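A minimal sketch of one common way to do this, assuming an Apache rewrite rule that sends the pretty URLs to a small front controller (the file name route.php and the rewrite rule are assumptions, not part of your existing setup):
// .htaccess (assumption):
//   RewriteEngine On
//   RewriteRule ^ping/(.+)$ /Browser/route.php?site=$1 [L,QSA]

// Browser/route.php (hypothetical front controller):
$site_to_ping = isset($_GET['site']) ? $_GET['site'] : '';
if ($site_to_ping !== '') {
    // Forward to the real ping script and return its output.
    echo file_get_contents(
        'http://api.example.com/Browser/API/ping.php?site_to_ping=' . urlencode($site_to_ping)
    );
}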
Sending an HTTP POST request using file_get_contents is not that hard, actually: as you guessed, you have to use the $context parameter.
There's an example given in the PHP manual, on the HTTP context options page (quoting):
// Build the POST body from an array of parameters.
$postdata = http_build_query(
    array(
        'var1' => 'some content',
        'var2' => 'doh'
    )
);
// Describe the request: method, headers, and body.
$opts = array('http' =>
    array(
        'method'  => 'POST',
        'header'  => 'Content-type: application/x-www-form-urlencoded',
        'content' => $postdata
    )
);
// Create the stream context and pass it as the third argument.
$context = stream_context_create($opts);
$result = file_get_contents('http://api.example.com/Browser/ping', false, $context);
Basically, you have to create a stream context with the right options (there is a full list on that page), and use it as the third parameter to file_get_contents -- nothing more ;-)
As a sidenote: generally speaking, to send HTTP POST requests, we tend to use curl, which provides a lot of options and all -- but streams are one of the nice things of PHP that nobody knows about... too bad...
Related
I have been given a URL that I need PHP to post data to, anonymously, without the end user knowing about it.
The exact structure is:
https://example.com/api/rest/example/createSubscription?email=1#1.com&subscriberNumber=12345JD&subscriberGroup=shop&firstName=Joe&lastName=Bloggs&offerCode=ex1&licenseParameters="STARTDATE%3D2014-08-11%26ENDDATE%3D2014-09-11"
Obviously this is a dynamic URL and I have set it up to be. I am not sure about the best way to approach this issue. Would it be a PUT http_request? I have tried that using the following but it returns a 400 error.
$url = 'https://example.com/api/rest/example/createSubscription?email=1#1.com&subscriberNumber=12345JD&subscriberGroup=shop&firstName=Joe&lastName=Bloggs&offerCode=ex1&licenseParameters="STARTDATE%3D2014-08-11%26ENDDATE%3D2014-09-11"';
$options = array(
    'method'  => 'PUT',
    'timeout' => 15,
    'header'  => "Content-type: html/txt",
);
$response = http_request($url, $options);
As for your last comment: if the subscription is created simply by opening the URL in the browser, then it is a GET request.
You can perform a GET request using file_get_contents.
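For example, a minimal sketch (the parameter values are simply the ones from your URL, and http_build_query() takes care of the encoding):
// Hedged sketch: perform the call as a plain GET with file_get_contents().
$params = http_build_query(array(
    'email'             => '1#1.com',
    'subscriberNumber'  => '12345JD',
    'subscriberGroup'   => 'shop',
    'firstName'         => 'Joe',
    'lastName'          => 'Bloggs',
    'offerCode'         => 'ex1',
    'licenseParameters' => '"STARTDATE=2014-08-11&ENDDATE=2014-09-11"'
));
$response = file_get_contents('https://example.com/api/rest/example/createSubscription?' . $params);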
It's really strange to use the PUT method with GET parameters.
After checking the PHP manual, you are not using this method correctly; that's why the server can't understand your request.
You can look at this function to do a PUT request.
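If the API really does expect PUT, a rough cURL sketch (the URL and body below are placeholders, not taken from the question) would be:
// Hedged sketch: sending a PUT request with cURL.
$ch = curl_init('https://example.com/api/rest/example/createSubscription');
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'PUT');
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array('email' => '1#1.com')));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);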
I need to crawl a web site with simple_html_dom->load_file(), and I need to include a user agent. Below is my code, but I don't know whether it is right or whether there is a better way to achieve this. Thanks in advance.
$option = array(
    'http' => array(
        'method' => 'GET',
        'header' => 'User-Agent: Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)',
    )
);
$context = stream_context_create($option);
$simple_html_dom = new simple_html_dom();
$simple_html_dom->load_file(CRAWLER_URL, false, $context);
I have tested your method / code and I can confirm it works as intended: the user-agent in the HTTP request header is correctly changed to the one you provide in the code. :-)
As for your uncertainty: I usually use the curl functions to obtain the HTML string (http://php.net/manual/en/ref.curl.php). This way I have more control over the HTTP request, and then (when everything works fine) I use the simple_html_dom str_get_html() function on the HTML string I get with curl. So I am more flexible in error handling and dealing with redirects, and I have implemented some caching...
To verify it yourself, simply fetch a URL like http://www.whatsmyuseragent.com/ and look in the result for the user-agent string used in the request, to check whether it worked as intended...
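A rough sketch of the curl + str_get_html() approach described above (CRAWLER_URL is the constant from your code; error handling is kept minimal):
// Hedged sketch: fetch the page with cURL, then parse it with simple_html_dom.
include_once 'simple_html_dom.php';

$ch = curl_init(CRAWLER_URL);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_USERAGENT,
    'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)');
$html = curl_exec($ch);
curl_close($ch);

if ($html !== false) {
    $dom = str_get_html($html);   // same parser, fed from the cURL result
    // ... work with $dom here ...
}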
I looked through Stack Overflow for similar questions but found only pieces of information. So my problem is this:
I want to grab the content of a page, let's say needpage.php (using file_get_contents() + stream_context_create(), or using cURL()), but the page I need redirects me to a login page (loginpage.php - <form action=*processlogin.php*> with user and pass).
Do I need to cURL() or file_get_contents() the processlogin.php page first to POST the username and password fields, then grab the session ID, and then send another request to the needpage.php I need, passing:
$opts = array(
    'http' => array(
        'method' => 'GET',
        'header' => 'Cookie: PHPSESSID=0123456789abcdef0123456789abcdef'
    )
);
What do you think is the right flow? Is it possible for cURL or file_get_contents to store the cookie and then use that cookie for another page?
curl_setopt() lists all kinds of useful flags. Maybe CURLOPT_COOKIESESSION would help in your case? The documentation seems to claim so, unless I misread it.
If it doesn’t work, there is CURLOPT_COOKIEJAR, which can be used to save cookie data to a file, after curl_close() has been called.
Then it can be loaded using CURLOPT_COOKIEFILE.
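A hedged sketch of that flow with a cookie jar (all URLs, form field names, and the jar path are assumptions based on your description):
// Step 1: POST the login form; cURL saves the session cookie to the jar file on curl_close().
$jar = tempnam(sys_get_temp_dir(), 'cookies');

$ch = curl_init('http://example.com/processlogin.php');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
    'user' => 'myuser',
    'pass' => 'mypass'
)));
curl_setopt($ch, CURLOPT_COOKIEJAR, $jar);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_exec($ch);
curl_close($ch);

// Step 2: request the protected page, re-sending the cookies from the jar.
$ch = curl_init('http://example.com/needpage.php');
curl_setopt($ch, CURLOPT_COOKIEFILE, $jar);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$page = curl_exec($ch);
curl_close($ch);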
Any ideas? A page loaded from a different domain in an iframe contains a table. How can I get that table, using any language: HTML, JavaScript, jQuery, PHP, etc.?
Edit:
You can get data after posting the form with PHP too. Try this:
$post = http_build_query(
    array(
        'var1' => 'some content',
        'var2' => 'doh'
    )
);
$opts = array('http' =>
    array(
        'method'  => 'POST',
        'header'  => 'Content-type: application/x-www-form-urlencoded',
        'content' => $post
    )
);
$context = stream_context_create($opts);
$file = file_get_contents('http://example.com/submit.php', false, $context);
$file now contains the response to the posted data. Now parse it with simplehtmldom, as I described below.
Obviously, it won't work due to the cross-origin policy restrictions.
If you know the src of the iframe, you can get the source by either CURL or file_get_contents and then read through the DOM structure to get the desired table data.
A sample code:
$file = file_get_contents( 'IFRAME_URL_HERE' );
Once you have got the source, you can parse it very easily by using a library called SimpleHTMLDom http://simplehtmldom.sourceforge.net/
It can get you the desired table data in like a line or two of code. (Very similar syntax to jQuery, just written in PHP).
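For example, something along these lines (the selectors are assumptions; adjust them to the actual table):
// Hedged sketch: parse the fetched HTML and walk the first table's rows and cells.
include_once 'simple_html_dom.php';

$dom   = str_get_html($file);        // $file from the file_get_contents() call above
$table = $dom->find('table', 0);     // first table on the page; adjust as needed
if ($table) {
    foreach ($table->find('tr') as $row) {
        foreach ($row->find('td') as $cell) {
            echo $cell->plaintext, "\t";
        }
        echo "\n";
    }
}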
If you can give the iframe src and say which table data you want, I can give you a working sample.
Hope it helps.
You could get it using cURL in PHP - just curl the URL that the iframe is loading. Here's a simple guide to getting started with this kind of thing (scraping with PHP).
http://www.phpbuilder.com/columns/marc_plotz011410.php3
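For instance, a minimal sketch (the iframe URL is a placeholder):
// Hedged sketch: fetch the iframe's source document with cURL.
$ch = curl_init('http://other-domain.example/iframe-page.html');  // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$html = curl_exec($ch);
curl_close($ch);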
I need to send HTTP POST data to a webpage. My host is missing some extensions (I'm not sure which ones). I tried cURL and fopen; neither of them works.
What are other ways to send data?
Edit: By the way, I can send $_GET data as well. So as long as I can open a URL (e.g. with file_get_contents), it works.
Check out the very powerful PHP stream functions.
However, if the file/stream and cURL functions are disabled, then make the requests on the frontend using AJAX. jQuery is good at this, as long as the data isn't sensitive.
I built an entire blog system using just jQuery JSONP requests on the frontend since I wanted to move the load to the user instead of my server.
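If the stream functions are available, a raw-socket POST looks roughly like this (host, path, and fields are placeholders):
// Hedged sketch: send a POST over a plain stream socket, using only stream functions.
$body = http_build_query(array('var1' => 'some content', 'var2' => 'doh'));

$fp = stream_socket_client('tcp://example.com:80', $errno, $errstr, 10);
if ($fp) {
    $request  = "POST /submit.php HTTP/1.1\r\n";
    $request .= "Host: example.com\r\n";
    $request .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $request .= "Content-Length: " . strlen($body) . "\r\n";
    $request .= "Connection: close\r\n\r\n";
    $request .= $body;

    fwrite($fp, $request);
    $response = stream_get_contents($fp);   // raw HTTP response, headers included
    fclose($fp);
}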
This may work. The context is not really needed, but it allows you to set a custom timeout and user-agent.
/* Set up the options array for the context used by file_get_contents(). */
$opts = array(
    'http' => array(
        'method'  => 'GET',
        'timeout' => 4,
        'header'  => "Accept-language: en\r\n" .
                     "User-Agent: Some UA\r\n"
    )
);

/* Create the context. */
$context = stream_context_create($opts);

/* Make the request (@ suppresses the warning on failure). */
$response = @file_get_contents('http://example.com/?foo=bar', false, $context);

if ($response === false) {
    /* Could not make the request. */
}
You can use http_build_query() to build your query string from an array.
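For example (a tiny sketch; the keys and the base URL are placeholders):
// http_build_query() turns an array into a URL-encoded query string.
$query = http_build_query(array('foo' => 'bar', 'baz' => 'boom bang'));
// $query is now "foo=bar&baz=boom+bang"
$response = @file_get_contents('http://example.com/?' . $query, false, $context);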