Wolfram Alpha PHP implementation issue

I am trying to implement the Wolfram Alpha API in my application. I downloaded their language library and loaded their sample page in my browser.
However, I got the following errors:
Warning: simplexml_load_file(http://api.wolframalpha.com/v1/query.jsp?appid=myid-8RJR4ELL82&input=pi): failed to open stream: A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond. in C:\xampp\htdocs\wolfram\PHP_Binding_0_1\wa_wrapper\WolframAlphaEngine.php on line 47
Warning: simplexml_load_file(): I/O warning : failed to load external entity "http://api.wolframalpha.com/v1/query.jsp?appid=myid-8RJR4ELL82&input=pi" in C:\xampp\htdocs\wolfram\PHP_Binding_0_1\wa_wrapper\WolframAlphaEngine.php on line 47
Fatal error: Maximum execution time of 30 seconds exceeded in C:\xampp\htdocs\wolfram\PHP_Binding_0_1\wa_wrapper\WolframAlphaEngine.php on line 47
I use their premade PHP library and load
http://localhost/wolfram/PHP_Binding_0_1/samples/simpleRequest.php?q=pi
in my address bar.
Can anyone help me with this issue? Thanks a lot!

Maximum execution time of 30 seconds exceeded
Set the request timeout to a value greater than 30 seconds, or get a better internet connection.
You can do this by the following:
set_time_limit
set_time_limit(120); // two minutes
or if you are using a CURL request:
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 120);
Alternatively, call up your internet provider and ask to upgrade your data plan to a higher number of Mb/s.
It would also help to post your code. The fixes mentioned here are based on the error message alone, which could result from a separate problem.
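For example, a cURL version of the request with explicit timeouts might look like this (a sketch only; the appid and input values are the ones from the question):

```php
<?php
// Fetch the Wolfram|Alpha result with cURL instead of simplexml_load_file(),
// so the connect and transfer timeouts can be set explicitly.
$url = 'http://api.wolframalpha.com/v1/query.jsp?'
     . http_build_query(['appid' => 'myid-8RJR4ELL82', 'input' => 'pi']);

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body as a string
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);   // seconds allowed for the connect
curl_setopt($ch, CURLOPT_TIMEOUT, 120);         // seconds allowed for the whole transfer

$body = curl_exec($ch);
if ($body === false) {
    echo 'cURL error: ' . curl_error($ch) . "\n";
} else {
    $xml = simplexml_load_string($body); // parse only after a successful fetch
}
curl_close($ch);
```

This keeps the XML parsing separate from the network fetch, so a slow or failed connection produces a readable cURL error instead of the stream warnings above.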

Related

file_get_contents error on high-traffic site

My PHP script runs okay when my traffic is 5000 visitors per day - no errors.
When my traffic suddenly increases to 3000 visitors per hour, I get many of these errors:
[26-May-2017 07:30:03 Asia/Jakarta] PHP Warning: file_get_contents(http://mydomain1.com/api/ip2country_v6/?ip=76.119.xxx.xxx): failed to open stream: HTTP request failed! in /home/h32xxx/mydomain2.com/landings/script.php on line 47
This PHP script is on mydomain2.com.
file_get_contents is requesting from mydomain1.com.
Both mydomain2.com and mydomain1.com are on one server account. I can't use 'http://localhost/~h32xxx/api/ip2country_v6/?ip=76.119.xxx.xxx' because my cPanel settings block it (mod_userdir is disabled).
What is the problem and how do I fix it?

A PHP program accessing a website times out

My PHP program accesses a website that is very slow to open, so I get a warning message:
Warning: file_get_contents(http://www.example.com): failed to open stream: A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond. in C:\xampp\htdocs\mezi.php on line 155.
Line 155 code: $html = file_get_contents('http://www.example.com');
My question is: how do I increase the time allowed to wait for a slow website? I have already increased the allowed execution time by adding set_time_limit(100); to my code, but this does not help.
This is actually a socket session being created behind the scenes, so the correct value to raise in php.ini is
default_socket_timeout
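For example (a minimal sketch; www.example.com stands in for the slow site from the question):

```php
<?php
// Raise the socket timeout for all subsequent stream calls in this script
// (equivalent to default_socket_timeout = 120 in php.ini).
ini_set('default_socket_timeout', 120);

// Or set a timeout for one request only, via a stream context; this
// overrides default_socket_timeout for that single call.
$context = stream_context_create([
    'http' => ['timeout' => 120.0], // seconds
]);
$html = @file_get_contents('http://www.example.com', false, $context);
```

set_time_limit() only raises the script's execution limit; it does not change how long PHP's stream wrappers wait on the socket, which is why it did not help here.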

Warning: file_get_contents failed to open stream: Connection timed out in includes/simple_html_dom.php on line 75

All of a sudden, my cron job, which grabs content through file_get_contents, has stopped working properly and started giving the following warning and fatal error. Does anyone know why this is happening?
Warning: file_get_contents(http://seriesgate.me/search/indv_episodes/The+Social+Network): failed to open stream: Connection timed out in includes/simple_html_dom.php on line 75
Fatal error: Call to a member function find() on a non-object in includes/seriesgate.class.php on line 25
It's working:
echo file_get_contents("http://seriesgate.me/search/indv_episodes/The+Social+Network");
It may be an execution-time problem.
Try this
ini_set('max_execution_time', 300); //300 seconds = 5 minutes
echo file_get_contents("http://seriesgate.me/search/indv_episodes/The+Social+Network");
The Connection timed out error is definitely a network problem.
First, it may be your own network connection taking too long to get a response from the server. Check your local/server/hosting internet connection, and check the domain name, IP, and port for typos.
Second, it may be a problem at the destination. For example, seriesgate.me could be down for everyone, not only for you. I know this question is 3 years old and the site is down now, but it is best practice to check the destination whenever you see something like Connection timed out.
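Whatever the cause, it is worth guarding the fetch so a timeout doesn't cascade into the fatal error above. A minimal sketch (the 30-second timeout is an arbitrary choice):

```php
<?php
// Check the return value of file_get_contents() so a timed-out fetch
// doesn't cascade into a fatal "Call to a member function find() on a
// non-object" error further down.
$url = 'http://seriesgate.me/search/indv_episodes/The+Social+Network';
$ctx = stream_context_create(['http' => ['timeout' => 30]]);

$html = @file_get_contents($url, false, $ctx);
if ($html === false) {
    // Network problem or destination down: log it and stop, rather than
    // handing false to the HTML parser.
    error_log("fetch failed for $url");
} else {
    // safe to pass $html to simple_html_dom's str_get_html() here
}
```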

SSL Broken Pipe with PHP

My error output looks like this:
PHP Warning: fwrite(): SSL: Broken pipe in /home/whitelot/public_html/webservies/mylastwishnew/apnstest.php on line 89
and then every subsequent write to that resource gets the error:
PHP Warning: fwrite(): SSL: Broken pipe in /home/whitelot/public_html/webservies/mylastwishnew/apnstest.php on line 89
It works for a while, for maybe a few hundred messages/payloads, then all of a sudden the pipe breaks and water goes all over the floor.
Anyone have any ideas if there is a good fix for this problem?
I found that using the
'keep_alive' => false
option for
new SoapClient($url, $options);
solved the issue in my case. It seems that without keep_alive => false, the SOAP connection tries to reuse a previous connection, which gets rejected by SSL. See https://bugs.php.net/bug.php?id=60329
Also, make sure this is not a question of max_execution_time or another limit that might prevent the SSL connection from finishing successfully.
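Put together, the construction might look like this (a sketch only; $wsdl is a placeholder for your service's WSDL URL, and the timeout value is an arbitrary choice):

```php
<?php
// Build the SoapClient with keep_alive disabled so each call opens a fresh
// connection instead of reusing one whose SSL session may already have died.
$options = [
    'keep_alive'         => false, // don't reuse the previous connection
    'exceptions'         => true,  // surface transport errors as SoapFault
    'connection_timeout' => 30,    // seconds to wait for the connect
];
// $client = new SoapClient($wsdl, $options);
```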

PHP file_get_contents throws "failed to open stream: HTTP request failed!" within 5-10 seconds when accessing large xml file

I'm trying to load a file ("http://feeds.artistdata.com/xml.shows/artist/AR-YX458DZO75EQACZ3/xml") with file_get_contents but I'm getting an error within 5-10 seconds. The xml file loads fine in a browser, and if I use a smaller version of it (adding "/future" on the end of the above url) it loads fine with file_get_contents.
The script is:
$file = "http://feeds.artistdata.com/xml.shows/artist/AR-YX458DZO75EQACZ3/xml";
$data = file_get_contents($file);
One of two errors show up, either:
Warning: file_get_contents("http://feeds.artistdata.com/xml.shows/artist/AR-YX458DZO75EQACZ3/xml") [function.file-get-contents]: failed to open stream: HTTP request failed! in MY_PHP_SCRIPT.php on line 2
Or:
Warning: file_get_contents("http://feeds.artistdata.com/xml.shows/artist/AR-YX458DZO75EQACZ3/xml") [function.file-get-contents]: failed to open stream: Connection timed out in MY_PHP_SCRIPT.php on line 2
Any ideas? I've tried using cURL instead, but when I do that I just get "Error on line 1". The error shows up in less than 10 seconds, so I can't imagine it's a timeout issue, since those defaults are generally 15 or 30 seconds depending on the app.
// use this for your XML feed; it works fine:
$load = file_get_contents('http://artistdata.sonicbids.com/ari-herstand/shows/xml/');
print_r($load);
file_get_contents() returns the file in a string, starting at the specified offset, up to maxlen bytes.
An E_WARNING level error is generated if maxlen is less than zero, or if offset exceeds the length of the stream.
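As an illustration of the offset and maxlen parameters mentioned above (using a local temporary file so it runs without network access):

```php
<?php
// file_get_contents($filename, $use_include_path, $context, $offset, $maxlen)
$path = tempnam(sys_get_temp_dir(), 'fgc');
file_put_contents($path, 'Hello World');

$slice = file_get_contents($path, false, null, 6, 5); // skip 6 bytes, read 5
echo $slice . "\n"; // prints "World"

unlink($path);
```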
