I have a weird scenario here. I have 3 servers:
1.) http://my-server1/test
--> This URL simply returns a JSON object, "test".
2.) http://my-server2/get_request
--> This URL sends a request via PHP cURL.
3.) http://mylocal-machine-server/get_request
--> The same as server 2, except that it runs on my local machine via XAMPP.
The get_request method on both the second server and my local machine contains the following simple code to test cURL:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://www.google.com');
curl_exec($ch);
Both servers executed the request successfully and displayed the content of google.com. I then changed the URL in get_request from google.com to my server 1 URL, on both server 2 and my local server, so it now looks like this:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://my-server1/test');
curl_exec($ch);
I ran the get_request method on both the second server and my local server. On my local server, get_request was able to retrieve the "test" JSON object. On my second server, however, it took a while to load, and when it finished loading it displayed nothing.
I found the culprit: the IP address of my second server was not whitelisted in the firewall of my first server (the one I get the data from). The second server timed out because it did not yet have access to the first server. I only realized this when I used curl_error() as suggested by greeflas and found that the error was a connection timeout.
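For reference, the kind of check that exposed the timeout can be sketched like this (the server-1 URL is from the question; the timeout values are arbitrary):

```php
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://my-server1/test');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5); // fail fast instead of hanging
curl_setopt($ch, CURLOPT_TIMEOUT, 10);

$result = curl_exec($ch);
$errno  = curl_errno($ch);
$error  = curl_error($ch);
curl_close($ch);

if ($result === false) {
    // A firewall block typically surfaces here as a connection timeout.
    echo "cURL error ($errno): $error\n";
} else {
    echo $result;
}
```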
Related
I am making cURL calls from PHP scripts on one domain (mac2cash.com) to another (thebookyard.com), both hosted on the same Apache server and the same IP address. This has been working fine, but I needed to add some new functionality to the site, so I created a new PHP script at the root level of the same target domain as the working cURL call. When I call this new script using the same code I used for the working script, it returns the message "Found: the document has moved here".
The target scripts for the working and failing cURL calls are at the root level of the same domain, and I have checked that they have the same Unix permissions. But if I simply change the PHP file name in the working script to the name of the target script from the failing call, it now fails too, with the same 302 redirect message.
I even duplicated the working target script (byasd_api.php) on the target domain to a new file (byasd_api_copy.php), and I get the 302 message if I make a cURL call to that from the calling script that was working, even though the code is exactly the same!
I cannot see any difference between the two files. Is there some kind of caching going on where newly created files are not treated the same?
For reference, here is the calling code:
$header = array("Host: thebookyard.com");
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, HTTP_SERVER_IP . "/byasd_api.php");
curl_setopt($ch, CURLOPT_HEADER, false);
curl_setopt($ch, CURLOPT_HTTPHEADER, $header);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_REFERER, 'http://www.mac2cash.com');
curl_setopt($ch, CURLOPT_POST, true); // CURLOPT_POST expects a boolean
curl_setopt($ch, CURLOPT_POSTFIELDS, $post_data);
$output = curl_exec($ch);
curl_close($ch);
The 'byasd_api.php' script name is the only thing I am changing.
I've spent some hours googling for a solution so would appreciate any suggestions.
Your Apache is configured to request favicon.ico on each call; the 302 occurs because it cannot find the .ico file:
GET http://theboo....com/favicon.ico [HTTP/1.1 302 Found 151ms]
Change the configuration or add a favicon.ico file. The configuration may be looking for the .ico file only in the root.
It turns out the reason for the difference in behaviour was that the working script's name was included as a rewrite condition in the .htaccess file, which was redirecting HTTP to HTTPS. Changing the cURL URL to "https://".HTTP_SERVER_IP."/byasd_api.php" stopped the "Found: the document has moved here" error, but the call then failed because cURL was trying to validate the SSL certificate against the IP address rather than the domain.
The solution to that was to add the following:
curl_setopt($ch, CURLOPT_RESOLVE, array("www.thebookyard.com:443:".HTTP_SERVER_IP,));
This still allows the call to be to the IP address (which is much faster than via the domain name) but CURL validates the SSL certificate against the domain name.
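For CURLOPT_RESOLVE to take effect, the request URL has to name the domain; the RESOLVE entry then pins that name to the IP, so the connection still goes straight to the IP address while the certificate is validated against the domain. A sketch of that pattern (the IP constant is a hypothetical stand-in for the real server address):

```php
<?php
// Hypothetical stand-in for the server's real IP address.
define('HTTP_SERVER_IP', '203.0.113.10');

$ch = curl_init();
// Request over HTTPS so the .htaccess HTTP->HTTPS rewrite no longer fires;
// the URL names the domain so the certificate can be checked against it.
curl_setopt($ch, CURLOPT_URL, "https://www.thebookyard.com/byasd_api.php");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
// Pin the domain to the IP: cURL connects directly to HTTP_SERVER_IP
// (no DNS lookup) while still validating the certificate for the domain.
$ok = curl_setopt($ch, CURLOPT_RESOLVE, array(
    "www.thebookyard.com:443:" . HTTP_SERVER_IP,
));
$output = curl_exec($ch);
curl_close($ch);
```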
How can I print the request I have sent, both header and body? The following:
curl_setopt($ch, CURLINFO_HEADER_OUT, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
only gives me $r['request_headers']?
I made a server and website specifically for this purpose: http://dumpinput.ratma.net for the website, and https://github.com/divinity76/dumpinput.ratma.net for the server. Note that it does not support HTTPS, so if the input is sensitive you should probably run your own instance of the server (it's free and open source). Also note that I run the server on a dev VM from cloudatcost.com, which has a bad reputation for uptime, so don't depend on the website being up.
You could also set up a netcat server as the target of the cURL request (though that won't work with Expect: 100-continue POST requests; the dumpinput server still will).
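To answer the header part directly: with CURLINFO_HEADER_OUT set (as in the question), the raw request headers can be read back after curl_exec() via curl_getinfo(); the request body is simply whatever you handed to CURLOPT_POSTFIELDS. A minimal sketch against a placeholder URL:

```php
<?php
$body = "foo=bar&baz=qux"; // the body is whatever you pass to cURL yourself

$ch = curl_init("http://example.com/"); // placeholder target
curl_setopt($ch, CURLINFO_HEADER_OUT, true);    // record outgoing headers
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $body);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
curl_exec($ch);

// Raw request headers exactly as sent (false if no request went out).
$sent = curl_getinfo($ch, CURLINFO_HEADER_OUT);
curl_close($ch);

echo ($sent === false ? "(no request sent)" : $sent) . "\n" . $body . "\n";
```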
My user logs into my app, the app sends an ID token to my server, my server sends a request to Google, Google sends the user data to my server, and my server puts it in my database. This worked fine for five months.
Now:
If I type into my web browser
https://www.googleapis.com/oauth2/v3/tokeninfo?id_token= (+ the ID token of my Google account, extracted from my app)
I get the JSON with my name and so on in milliseconds, as it should be.
But for the last six hours, when my PHP file runs
$url = "https://www.googleapis.com/oauth2/v3/tokeninfo?id_token=".$idtoken;
$str = file_get_contents($url);
the page loads and loads and finally returns no value.
I tried this cURL version to fetch the JSON:
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, 0);
$str = curl_exec($ch);
curl_close($ch);
That didn't fix it either.
I cannot install Composer because I have no root access.
I contacted customer support; they told me everything is fine on the server side.
If, after checking on the server's console, you find that you can't curl to it over SSL but can otherwise, that would suggest the port is blocked. Ask the hosting provider to open it.
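A quick way to surface the failure from PHP itself, instead of waiting on the default timeout, is to cap the wait and print the last error. A sketch, with a dummy token standing in for the real one:

```php
<?php
// Dummy token standing in for the real ID token from the app.
$idtoken = "dummy-token";
$url = "https://www.googleapis.com/oauth2/v3/tokeninfo?id_token=" . urlencode($idtoken);

// Cap the wait instead of hanging on PHP's default socket timeout,
// then surface the actual error instead of silently returning nothing.
$context = stream_context_create(array('http' => array('timeout' => 5)));
$str = @file_get_contents($url, false, $context);

if ($str === false) {
    $err = error_get_last();
    echo "request failed: " . ($err !== null ? $err['message'] : 'unknown') . "\n";
} else {
    echo $str;
}
```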
Is there any way to create an "iframe-like" on the server side? The thing is, I need to access certain pages of my company's intranet from the administration area of our website.
I already have an SQL link to the database that works fine, but here I would like to access the pages without duplicating the source code on the web server.
My infrastructure is the following:
The web server is in a DMZ and has the local IP 192.168.63.10.
Our intranet server is NOT in the DMZ and has the IP 192.168.1.20.
Our firewall has several rules, and I have just added the following:
DMZ->LAN allow HTTP/HTTPS traffic, and LAN->DMZ allow HTTP/HTTPS (just as we did for the SQL redirection).
I've tried the following PHP function:
$ch = curl_init();
// set URL and other appropriate options (also tried with IP adress instead of domain)
curl_setopt($ch, CURLOPT_URL, "http://intranet.socname.ch/");
curl_setopt($ch, CURLOPT_HEADER, 0);
// grab URL and pass it to the browser
curl_exec($ch);
// close cURL resource, and free up system resources
curl_close($ch);
I've also tried:
$page = file_get_contents('http://192.168.1.20/');
echo $page;
Or:
header('Location:http://192.168.1.20');
But in all those cases, it works fine locally but not from the internet. From the internet, the page doesn't load and, after a while, reports that the server isn't responding.
Thanks for your help!
Your first and second solutions could work. Can your web server access 192.168.1.20? (Try ping 192.168.1.20 on your web server.) Can it resolve the hostname intranet.socname.ch? (Try nslookup intranet.socname.ch.)
What you're looking for is called a "proxy". Here is a simple PHP project that I found:
https://github.com/Alexxz/Simple-php-proxy-script
Download the repo, copy example.simple-php-proxy_config.php to simple-php-proxy_config.php, and change $dest_host = "intranet.socname.ch";
It should do the trick! (You may also need to change $proxy_base_url.)
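If you'd rather not pull in a project, the core of such a proxy is just fetching the intranet page server-side and echoing it back. A minimal sketch, assuming the intranet host from the question, and ignoring the link/asset rewriting a real proxy must handle:

```php
<?php
// Fetch a page server-side; returns array($body, $errno, $error).
function fetch_page($url)
{
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return instead of printing
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow intranet redirects
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);    // fail fast if unreachable
    $body  = curl_exec($ch);
    $errno = curl_errno($ch);
    $error = curl_error($ch);
    curl_close($ch);
    return array($body, $errno, $error);
}

// intranet.socname.ch is the host from the question.
list($body, $errno, $error) = fetch_page('http://intranet.socname.ch/');
if ($errno !== 0) {
    echo "proxy error ($errno): $error";
} else {
    echo $body; // acts like a server-side iframe
}
```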
How can I send a custom HTTP Request to a server whose URL is "http://[IP]:[Port]/"?
What I mean is that, instead of the first line being a normal GET or POST like so:
GET /index.html HTTP/1.1
Host: www.example.com
How can this be replaced with something just like:
CUSTOM
Host: [IP]
I don't mind having to use any additional libraries if necessary such as cURL.
UPDATE:
I've tried using the following cURL code:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://[IP]:[Port]/");
curl_setopt($ch, CURLOPT_PORT, [Port]);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
//curl_setopt($ch, CURLOPT_CUSTOMREQUEST, "CUSTOM");
$output = curl_exec($ch);
curl_close($ch);
print($output);
But it just keeps loading for about two minutes until it reports an internal error (both with and without CURLOPT_CUSTOMREQUEST). If I use a standard website such as http://google.com, however, it works fine.
Also, I forgot to mention that the port my server uses is 7899; is that OK? It opens fine in my web browser, but cURL doesn't seem to be able to reach it.
It looks like there's nothing wrong with your code. If you're using a shared hosting provider, ask them to open up outbound TCP port 7899.
You can register a custom HTTP request method using the http_request_method_register() function (part of the pecl_http extension):
http://www.php.net/manual/en/function.http-request-method-register.php
Bear in mind that if you send a non-standard HTTP request method to an arbitrary server, depending on the server it may ignore it or may return an error code.
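With plain cURL, the commented-out line in the question is in fact the standard way to do this: CURLOPT_CUSTOMREQUEST replaces the method token in the request line while leaving everything else intact. A sketch against a placeholder address:

```php
<?php
// Placeholder target; substitute your own http://[IP]:[Port]/ endpoint.
$ch = curl_init("http://192.0.2.10:7899/");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5); // fail fast if unreachable
// Replace the method token: the request line becomes "CUSTOM / HTTP/1.1".
$ok = curl_setopt($ch, CURLOPT_CUSTOMREQUEST, "CUSTOM");
$output = curl_exec($ch);
$errno  = curl_errno($ch);
curl_close($ch);
```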