I am working on a PHP SoapClient. The SOAP server is a .NET server.
Everything works fine when the client calls an http address and the answer comes from the same server the client called.
The issue I have is that the client must call an https address, and the server uses a load balancing system, so the answer may come from a different server (the client calls serverX and gets an answer sometimes from serverY, sometimes from serverZ, etc.).
Here is the PHP code that I use (it works fine with no https and no load balancing, but fails with https and load balancing):
$client = new SoapClient('http://www.serverA.com/serviceB.svc?wsdl');
$immat = 'yadiyadiyada';
$params = array('immat' => $immat);
$result = $client->__soapCall('setImmat', array($params));
$id_found = $result->setImmatResult;
Any idea of what I should do? Any tips would be greatly appreciated!
Thanks
I at last found a workaround.
Instead of instantiating the PHP SoapClient with the WSDL file provided by the server, I made a copy of it on the client side and modified it a little. I only changed the "schemaLocation" attributes: on the server side, the value was something like "https://www.serverY.com/serviceB.svc?xsd=xsd0", and I replaced it with "https://www.serverX.com/serviceB.svc?xsd=xsd0".
Now I instantiate the php SoapClient with this new file:
$client = new SoapClient('/local_path/wsdl.xml');
and it works!
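For reference, the workaround above can be automated so the local copy does not have to be edited by hand. This is a minimal sketch: the serverX/serverY host names are the placeholders from the description, and `rewriteWsdlHost` is a hypothetical helper, not part of the original code.

```php
<?php
// Replace every reference to the host the load balancer answered from
// (serverY) with the host the client actually calls (serverX).
function rewriteWsdlHost(string $wsdl, string $badHost, string $goodHost): string
{
    return str_replace($badHost, $goodHost, $wsdl);
}

// Usage sketch (requires network access, so shown commented out):
// $wsdl  = file_get_contents('https://www.serverX.com/serviceB.svc?wsdl');
// $fixed = rewriteWsdlHost($wsdl, '://www.serverY.com/', '://www.serverX.com/');
// file_put_contents('/local_path/wsdl.xml', $fixed);
// $client = new SoapClient('/local_path/wsdl.xml');
```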
The constructor of the SOAP client for my WSDL web service is awfully slow: I get the response only after the fastcgi_read_timeout value in my Nginx config is reached, as if the remote server were not closing the connection. I also need to set it to a minimum of 15 seconds, otherwise I get no response at all.
I have already read similar posts here on SO, especially this one,
PHP SoapClient constructor very slow, and its linked threads, but I still cannot find the actual cause of the problem.
This is the part which takes 15+ seconds:
$client = new SoapClient("https://apps.correios.com.br/SigepMasterJPA/AtendeClienteService/AtendeCliente?wsdl");
It seems as it is only slow when called from my php script, because the file opens instantly when accessed from one of the following locations:
wget from my server which is running the script
SoapUI or Postman (But I don't know if they cached it before)
opening the URL in a browser
Ports 80 and 443 are open in the firewall. Following the suggestion from another thread, I found two workarounds:
Loading the wsdl from a local file => fast
Enabling the wsdl cache and using the remote URL => fast
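Both workarounds in code, for reference (a sketch; the local path is a placeholder, and the ini settings mirror the `soap.wsdl_cache_*` directives from php.ini):

```php
<?php
// Workaround 1: load the WSDL from a local copy (fetched once, e.g. with wget).
// $client = new SoapClient('/local/path/AtendeCliente.wsdl');

// Workaround 2: keep the remote URL but let PHP cache the parsed WSDL,
// so the slow fetch only happens on the first call.
ini_set('soap.wsdl_cache_enabled', '1');
ini_set('soap.wsdl_cache_ttl', '86400'); // keep the cached WSDL for one day
// $client = new SoapClient('https://apps.correios.com.br/SigepMasterJPA/AtendeClienteService/AtendeCliente?wsdl');
```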
But still I'd like to know why it doesn't work with the original URL.
It seems as if the web service does not close the connection; in other words, I get the response only after reaching the timeout set in my server config. I tried setting keepalive_timeout 15; in my Nginx config, but it did not help.
Is there any SOAP/PHP parameter which forces the server to close the connection?
I was able to reproduce the problem and found a solution (it works, though it may not be the best one) in the accepted answer of a question linked from the question you referenced:
PHP: SoapClient constructor is very slow (takes 3 minutes)
As per the answer, you can adjust the HTTP headers using the stream_context option.
$client = new SoapClient("https://apps.correios.com.br/SigepMasterJPA/AtendeClienteService/AtendeCliente?wsdl", array(
    'stream_context' => stream_context_create(
        array(
            'http' => array(
                'protocol_version' => '1.0',
                'header'           => 'Connection: Close'
            )
        )
    )
));
More information on the stream_context option is documented at http://php.net/manual/en/soapclient.soapclient.php
I tested this using PHP 5.6.11-1ubuntu3.1 (cli)
I have a PHP script that sends a SOAP request to an ASMX API that is owned by a different division of the company that I work for. I developed this script in the WAMP environment on my personal company PC and it works fine there, successfully receiving and parsing the response. However, when I upload this script to the LAMP-based development server (which was just set up) and attempt to run it through a web browser, I get a 500 Internal Server Error. The web engineer who set up the server says he installed and enabled the PHP SOAP extension. Does anybody know of anything else used in the code below that needs to be activated on the server and whose absence would cause a 500 error?
Here is the code that sends the request, minus the actual XML body, which is being processed correctly by the API. The code following this snippet simply pulls various values out of the returned arrays and echoes them.
// Common SOAP client options that can be set.
$clientOptions = array(
    'exceptions' => true,
    'trace'      => 1,
    'cache_wsdl' => WSDL_CACHE_NONE,
    'location'   => $endpointURL
);

$requestXML = '<GetRateQuote xmlns="<URL>">
CORRECTLY FORMATTED XML IS HERE
</GetRateQuote>';
$requestObj = new SoapVar($requestXML, XSD_ANYXML);

// The creation of the client object. We pass it a reference to the WSDL.
$client = new SoapClient($endpointURL . "?wsdl", $clientOptions);

// Retrieve the response.
$result = $client->GetRateQuote($requestObj);

// Put the data into a variable to be parsed.
$data = $result->GetRateQuoteResult;....
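One quick way to narrow down a silent 500 is a minimal standalone script that surfaces fatal errors and reports whether the SOAP extension is actually available. This is a diagnostic sketch, not part of the original code:

```php
<?php
// A 500 with no output is often a fatal "Class 'SoapClient' not found"
// error that never reaches the browser. Show errors and check the extension.
ini_set('display_errors', '1');
error_reporting(E_ALL);

if (!extension_loaded('soap')) {
    echo "The PHP SOAP extension is NOT loaded on this server.\n";
} else {
    echo "SOAP extension is loaded; SoapClient is available.\n";
}
```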
$output = file_get_contents("http://www.canadapost.ca/cpc2/addrm/hh/current/indexa/caONu-e.asp");
var_dump($output);
HTTP 505 Status means the webserver does not support the HTTP version used by the client (in this case, your PHP program).
What version of PHP are you running, and what HTTP/Web package(s) are you using in your PHP program?
[edit...]
Some servers deliberately block some browsers -- your code may "look like" a browser that the server is configured to ignore. I would particularly check the user agent string that your code is passing along to the server.
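If you suspect the user agent, a minimal sketch of overriding it with file_get_contents looks like this (the agent string is just an example):

```php
<?php
// Send a browser-like User-Agent, since some servers ignore requests
// carrying PHP's default (or empty) agent string.
$context = stream_context_create([
    'http' => [
        'header' => "User-Agent: Mozilla/5.0 (compatible; MyScript/1.0)\r\n",
    ],
]);
// $output = file_get_contents('http://www.canadapost.ca/cpc2/addrm/hh/current/indexa/caONu-e.asp', false, $context);
```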
Check in your PHP installation (php.ini file) whether allow_url_fopen is enabled.
If it is not, any call to file_get_contents with a URL will fail.
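You can check this from the script itself rather than hunting through php.ini; a minimal sketch:

```php
<?php
// ini_get returns a string like "1" or ""; filter_var normalises it to a bool.
$enabled = filter_var(ini_get('allow_url_fopen'), FILTER_VALIDATE_BOOLEAN);
echo $enabled
    ? "allow_url_fopen is enabled\n"
    : "allow_url_fopen is disabled; file_get_contents() on URLs will fail\n";
```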
It works fine for me.
That site could be blocking the server that you're using to access it.
When you open the URL in your browser, the request comes from your own connection, and the page is displayed in your browser. But when you run it from PHP, the request comes from your web host's IP address, and the content is then passed back to you.
Maybe you can do this to check what kind of headers it's returning for you:
$headers=get_headers("http://www.canadapost.ca/cpc2/addrm/hh/current/indexa/caONu-e.asp");
print_r($headers);
I tried using cURL to post to a local file and it fails. Can it be done? My two management systems are on the same server, and it seems unnecessary to traverse the entire internet just to reach a file on the same hard drive.
Using localhost didn't do the trick.
I also tried posting to $_SERVER['DOCUMENT_ROOT'].'/dir/to/file.php'. It's for an API that is encrypted, so I'm not sure exactly how it works. It's for a billing system I have, and I just realized that it sends data back (an API response).
It's simply POST data and an XML response. I could write an HTML form with input fields and get the same result, but there isn't really anything else to know.
The main question is: Does curl have the ability to post to a local file or not?
it is POST data. it's for an API that is encrypted so I'm not sure exactly how it works
Without further details, nobody can tell you what you should do.
But if it's indeed a POST receival script on the local server, then you can send a POST request to it using the URL:
$url = "https://$_SERVER[SERVER_NAME]/path/to/api.php";
And then receive its output from the cURL call.
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, array("billing" => 1234345));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$data = curl_exec($ch);
curl_close($ch);
So you get an XML or JSON or whatever response.
If they're on the same drive, then use file operations instead:
file_put_contents('/path/to/the/file', $contents);
Use cURL only if you absolutely NEED the HTTP layer to get involved for some reason, or if you're dealing with a remote server. Using HTTP would also mean the 'target' script has to handle a file upload plus whatever other data you need to send, and that script would end up doing file operations anyway; in effect you've gone on a round-the-world flight just to move from your living room to the kitchen.
file://localfilespec.ext worked for me. I had two files in the same folder on a Linux box, in a folder that is not served by my web server, and I used the file:// wrapper to post to file://test.php and it worked great. It's not pretty, but it'll work for dev until I move it to its final resting place.
Does curl have the ability to post to a local file or not?
To curl a local file, you need to set up an HTTP server, since file:// won't work here, so:
npm install http-server -g
Then run the HTTP server in the folder where the file is:
$ http-server
See: Using node.js as a simple web server.
Then test the curl request from the command-line to localhost like:
curl http://127.0.0.1:8081/file.html
Then you can do the same in PHP.
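The PHP equivalent of that command-line request is a minimal cURL call like this (a sketch, assuming http-server is still running on port 8081):

```php
<?php
// Fetch the file through the local HTTP server instead of the filesystem.
$ch = curl_init('http://127.0.0.1:8081/file.html');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$body = curl_exec($ch);
if ($body === false) {
    echo 'cURL error: ' . curl_error($ch) . "\n";
} else {
    echo $body;
}
curl_close($ch);
```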
I'm using a PHP script to detect whether a referrer URL is a proxy. This is a very simplified version, but it works great.
The problem is that I'm trying to use the same script on my other web server, but for various reasons I am not copying the script over. What I'm doing instead is using file_get_contents.
My problem is that when I use file_get_contents, the request is detected as a proxy. Is there any way around this, possibly by changing the port?
<?php $stop = file_get_contents("http://mysite.com/file.php"); echo $stop; ?>
Any help would be great, Thanks!
file_get_contents with a remote URL is very different from a local path: you are actually running the script on mysite.com and simply fetching its output from your server. This sends another HTTP request to mysite.com, so the referrer for that request is different from the referrer of your visitor's original request.
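One way around this is to forward the visitor's original Referer header on the sub-request, so file.php on mysite.com sees the real referrer instead of none. A sketch, assuming file.php reads the Referer header:

```php
<?php
// Pass the visitor's original referrer along with the sub-request.
$referrer = $_SERVER['HTTP_REFERER'] ?? '';
$context  = stream_context_create([
    'http' => [
        'header' => 'Referer: ' . $referrer . "\r\n",
    ],
]);
// $stop = file_get_contents('http://mysite.com/file.php', false, $context);
```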