NuSOAP - HTTP Error: no data present after HTTP headers - php

I'm getting an error from a PHP API -
HTTP Error: no data present after HTTP headers
The API uses NuSOAP 0.9.5.
Before the HTTP Error, I had the error
Fatal error: Maximum execution time of 30 seconds exceeded
so I set the max_execution_time to 300s.
There's no problem if I limit the number of rows in the SQL query (rownum < 2100), but if I remove the limit I get the HTTP Error.
Could this be a memory problem or a limitation of NuSOAP?
Specs:
WAMP (Apache 2.2.21, PHP 5.3.10)
Oracle 12c
Note: I searched for this question and got two results, but neither of them specified using NuSOAP. One was unanswered; the other had an answer stating that the HTTP response body contained no data, which doesn't answer my problem.

It was a memory issue.
I increased the value of memory_limit in php.ini and it works fine.
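For reference, a minimal sketch of the php.ini change; the values are illustrative only and should be sized to your largest result set:

```ini
; php.ini -- illustrative values, not a general recommendation
memory_limit = 256M        ; raise until the largest SOAP response fits in memory
max_execution_time = 300   ; the 300s limit already set in the question
```

Restart Apache after editing php.ini so the new values take effect.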

Related

How to overcome maxreceivedmessagesize using php soapclient on linux

Hi, I am interrogating a Windows service on a remote IIS server from my Linux server using PHP's SoapClient.
Everything's working fine except when I make a request that returns a large chunk of data, I get the following message:
Fatal error: Uncaught SoapFault exception: [a:InternalServiceFault] The maximum message size quota for incoming messages (65536) has been exceeded. To increase the quota, use the MaxReceivedMessageSize property on the appropriate binding element.
Is there something I can do with SoapClient to increase this value? I assume that it is at my end that the problem is arising...? Everything I read says to 'increase the MaxReceivedMessageSize in the appropriate web.config or app.config'. I don't have anything like that here on the client-side, just a couple of lines of PHP that make the call using the PHP soap class.
Any help welcome!

Php - Azure 500 - The request timed out

I am fetching data from ClearDB MySQL. It takes around 10 minutes to return the result.
But after 230 seconds Azure returns the error: "500 - The request timed out.
The web server failed to respond within the specified time."
I have tried setting max_execution_time to infinite and changed other config variables in .user.ini.
I also tried setting it manually in the first line of the PHP script with set_time_limit(0); and ini_set('max_execution_time', 6000000);.
But no luck.
I don't want to use webjobs.
Is there any way to resolve the Azure "500 - The request timed out" issue?
That won't work. You'll hit the in-flight request timeout long before the 10-minute wait is over.
Here's a better approach: make the call asynchronous. Call a stored procedure that produces the result, then make a second call 10 minutes later to retrieve the data:
call the stored procedure from your code
return a Location: header in the response
follow that URL to grab the results: 200 OK means you have them, 417 Expectation Failed means not yet
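The steps above can be sketched as follows. The endpoint name, the build_report procedure, and the job-table convention are all hypothetical; this assumes the stored procedure queues its work and returns immediately, writing results keyed by a job id:

```php
<?php
// Server side: kick off the (assumed fire-and-forget) stored procedure
// and hand the client a polling URL via the Location: header.
function startReport(PDO $db): string {
    $id = uniqid('job');
    // Hypothetical procedure: queues the long query and returns at once.
    $stmt = $db->prepare('CALL build_report(?)');
    $stmt->execute([$id]);
    header('HTTP/1.1 202 Accepted');
    header("Location: /reports/$id");   // hypothetical endpoint
    return $id;
}

// Client side: interpret the poll response.
// 200 means the rows are ready; 417 means "not yet, try again later".
function resultsReady(int $httpStatus): bool {
    return $httpStatus === 200;
}

// Follow the Location URL; returns the body when ready, null otherwise.
function pollForResult(string $url): ?string {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $body = curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return resultsReady($status) ? $body : null;
}
```

Each poll is a short request, so it stays well under Azure's 230-second in-flight limit.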

Google Cloud messaging php Curl timeout

Does anyone know what might be causing this type of error when I try to execute my PHP file?
Curl error: Operation timed out after 0 milliseconds with 0 out of 0 bytes received
I get this error very often when I try to load my PHP file with the GCM code, but sometimes I do not get it.

SOAP Client Request in PHP failing at 4MB or greater

I've spent almost two full days trying to resolve this issue, but to no avail. Any help is greatly appreciated.
I am trying to make a SOAP request in PHP using PHP's built-in SoapClient. I have verified that sending a request where the envelope is smaller than 4MB works great; the communication between the called server and my client has no issues in that case. As soon as I push the size of the envelope just over 4MB, my PHP instance takes somewhere between 1-2 minutes to throw a SoapFault with the message "Error fetching HTTP headers". I have post_max_size and memory_limit set to 150M in my php.ini, and my IIS request limit is set to 500MB.
I have verified that if I am not using php to make the SOAP request, I can complete my request and response chain with bodies upwards of 4MB in no time at all, so I feel that I've narrowed this down to a php/SoapClient issue.
If anybody has any ideas, I would greatly appreciate the help. I am not sure what else to try at this point.
PHP Warning: SoapClient::__doRequest(): SSL: An existing connection was forcibly closed by the remote host.
in C:\myvu\services\vendor\vu\file_storage_client\FileStorageClient\File\VuStore\VuStoreFileManager.php on line 54
[07-May-2015 08:31:48 America/Chicago] Error Fetching http headers
Thank you!
Phil

Gateway Time-out:The gateway did not receive a timely response from the upstream server

I am sending 300 newsletters at a time via a URL; after 2 minutes it refreshes itself to send the next 300, and so on.
But I am getting this error:
Gateway Time-out
The gateway did not receive a timely response from the upstream server
or application.
Additionally, a 404 Not Found error was encountered while trying to
use an ErrorDocument to handle the request.
I have set the max execution time to 3600:
ini_set('max_execution_time', 3600);
But I keep getting the same error. Please help me find a solution.
I encountered the same problem and I used ini_set('default_socket_timeout', 6000); to fix it.
http://php.net/manual/en/filesystem.configuration.php#ini.default-socket-timeout
I encountered the same problem. I fixed it by changing these values in my php.ini file:
default_socket_timeout = 240
max_execution_time = 240
"Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request."
This would indicate something is not configured properly on the server.
I can't follow why you think this is a CloudFlare issue (judging from the tag). Are you getting a CloudFlare error message at all?
If the problem comes from the SQL statement (the server spending a long time processing the query), try optimizing the SQL statement.
I have 18,600,000 rows in my table. The timeout error went away when I set TimeOut to 6000 in httpd.conf, after the ServerRoot directive.
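For reference, the directive looks like this in Apache's httpd.conf; 6000 seconds is the value the poster used, not a general recommendation:

```apache
# httpd.conf (placed after the ServerRoot directive)
TimeOut 6000
```

Restart Apache after the change for it to take effect.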