I have to make a thousand requests to the IGDB API and I am having trouble making this work. Every time I run my script, it loads for some time and then my web host tells me "Error: there is a problem...It seems that something went wrong." (not very helpful, I know).
Since I believe the issue comes from the number of requests, I have tried reducing it, but I am down to 60 requests with a pause of 4 seconds between each and still have no success.
My latest try:
$splice = array_splice($array, 0, 60);
foreach ($splice as $key => $value) {
    $request = wp_remote_get(
        'https://igdbcom-internet-game-database-v1.p.mashape.com/games/?fields=*&search=' . $value['Name'],
        array('headers' => array(
            'Accept'        => 'application/json',
            'X-Mashape-Key' => 'Key',
        ))
    );
    $body     = wp_remote_retrieve_body($request);
    $data_api = json_decode($body, true);
    sleep(4);
}
Would anyone know what I am doing wrong? I am running out of ideas...
That's very likely to be nothing more than the timeout response from either PHP or the server.
Although there are ways to work around these safeguards, they exist for a reason.
You should use the CLI to run long jobs like this, not CGI. CGI access is for regular users, whatever their role/privileges. As a developer, you have access to the code and to the server (or at least your sysadmin does if you are in a team). You SHOULD use the command line for these queries. It will take less time, will be less likely to fail, and you'll have the error logs printed right away unless you redirect them to a file.
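For example, here is a minimal sketch of how such a batch script could be set up for CLI execution. The script name is a placeholder, and if your loop relies on WordPress helpers such as wp_remote_get() you would also need to bootstrap WordPress (for instance via WP-CLI) before calling them:
<?php
// batch-requests.php (placeholder name) - run with: php batch-requests.php
// Sketch only: raise the limits that normally kill long web requests.

if (php_sapi_name() !== 'cli') {
    die("Run this script from the command line.\n");
}

set_time_limit(0);               // no max_execution_time for long batches
ini_set('memory_limit', '512M'); // adjust to the size of your dataset
error_reporting(E_ALL);
ini_set('display_errors', '1');  // errors print straight to the terminal

// ... your request loop goes here ...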
Related
I want to run 2500+ calls at the same time, so I have created batches of 100 (2500/100 = 25 batches in total).
// REQUEST_BATCH_LIMIT = 100
$insert_chunks = array_chunk(['array', 'i want', 'to', 'insert'], REQUEST_BATCH_LIMIT);

$mh = $running = $ch = [];

foreach ($insert_chunks as $chunk_key => $insert_chunk) {
    $mh[$chunk_key] = curl_multi_init();
    $ch[$chunk_key] = [];

    foreach ($insert_chunk as $ch_key => $_POST) {
        $ch[$chunk_key][$ch_key] = curl_init('[Dynamic path of API]');
        curl_setopt($ch[$chunk_key][$ch_key], CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh[$chunk_key], $ch[$chunk_key][$ch_key]);
    }

    // run all handles in this batch in parallel
    do {
        curl_multi_exec($mh[$chunk_key], $running[$chunk_key]);
        curl_multi_select($mh[$chunk_key]);
    } while ($running[$chunk_key] > 0);

    // collect the results of the batch
    foreach (array_keys($ch[$chunk_key]) as $ch_key) {
        $response      = curl_getinfo($ch[$chunk_key][$ch_key]);
        $returned_data = curl_multi_getcontent($ch[$chunk_key][$ch_key]);
        curl_multi_remove_handle($mh[$chunk_key], $ch[$chunk_key][$ch_key]);
    }

    curl_multi_close($mh[$chunk_key]);
}
When I run this locally, the system hangs completely.
But batch limits like 100 or 500 are not the same on different machines and servers, so what determines this limit? And what changes should I make to increase it?
If I add 1000 records in batches of 50, each batch should insert 50 records, but it inserts a varying number per batch, like 40, 42, 48, etc. So why are calls being skipped? (If I send one record at a time with a simple cURL call in a loop, it works fine.)
P.S. I am using this code for the BigCommerce API.
The BigCommerce API definitely throttles requests. The limits are different depending on which plan you are on.
https://support.bigcommerce.com/s/article/Platform-Limits
The "Standard Plan" is 20,000 per hour. I'm not sure how that is really implemented, though, because in my own experience, I've been throttled before hitting 20,000 requests in an hour.
As Nico Haase suggests, the key is for you to log every response you get from the BigCommerce API. While not a perfect system, they do usually provide a response that is helpful to understand the failure.
I run a process that makes thousands of API requests every day. I do sometimes have requests that fail as if the BigCommerce API simply dropped the connection.
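A rough sketch of the logging-plus-retry idea is below. The endpoint URL, log file path and backoff values are placeholders, not anything BigCommerce-specific:
<?php
// Sketch: log every response and back off when the API throttles (HTTP 429).
function call_api_with_retry($url, $maxAttempts = 5)
{
    for ($attempt = 1; $attempt <= $maxAttempts; $attempt++) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $body = curl_exec($ch);
        $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
        curl_close($ch);

        // Log every attempt so failures can be diagnosed later.
        file_put_contents(
            __DIR__ . '/api.log',
            date('c') . " attempt=$attempt code=$code url=$url\n",
            FILE_APPEND
        );

        if ($code === 429) {          // throttled: wait and try again
            sleep(pow(2, $attempt));  // simple exponential backoff
            continue;
        }
        return [$code, $body];
    }
    return [null, false];             // gave up after $maxAttempts
}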
I have a website scraping project. Look at this code:
<?php
include('db.php');
$r = mysql_query("SELECT * FROM urltable");
$rows = mysql_num_rows($r);

for ($j = 0; $j < $rows; ++$j) {
    $row  = mysql_fetch_row($r);
    $html = file_get_contents(mysql_result($r, $j, 'url'));
    $file = fopen($j . ".txt", "w");
    fwrite($file, $html);
    fclose($file);
}
?>
I have a list of URLs. This code creates a text file from the contents (HTML) of each URL.
When running this code, I can make only one file per second [each file is ~20 KB]. My internet connection provides 3 Mbps download speed, but I can't utilize that speed with this code.
How do I speed up file_get_contents()? Or how do I speed up this code using threading, configuring php.ini, or any other method?
As this was not one of the suggestions on the duplicate page, I will add it here.
Take a close look at the curl_multi page in the PHP manual.
It's not totally straightforward, but once you get it running it's very fast.
Basically you issue multiple curl requests and then collect the data as and when it returns. It returns in any order, so a bit of control is required.
I have used this on a data collection process to reduce 3-4 hours of processing to 30 minutes.
The only issue could be that you swamp a site with multiple requests and the owner considers that an issue and bans your access. But with a bit of sensible sleep()'ing added to your process you should be able to reduce that possibility to a minimum.
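Something along these lines could work for the URL-to-text-file case above. It is an untested sketch: the $urls array is a placeholder for the URLs you pull from your table, and the batch size should be kept polite:
<?php
// Sketch: fetch a batch of URLs in parallel with curl_multi
// and write each response to its own text file.
$urls = ['http://example.com/a', 'http://example.com/b']; // placeholder list

$mh = curl_multi_init();
$handles = [];
foreach ($urls as $i => $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_multi_add_handle($mh, $ch);
    $handles[$i] = $ch;
}

// Run all transfers; curl_multi_select() avoids busy-waiting.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh);
    }
} while ($running && $status === CURLM_OK);

// Collect the results in whatever order they finished.
foreach ($handles as $i => $ch) {
    file_put_contents($i . '.txt', curl_multi_getcontent($ch));
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);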
You can add a few controls with streams, but cURL should be much better, if available.
$stream_options = array(
    'http' => array(
        'method'        => 'GET',
        'header'        => 'Accept-language: en',
        'timeout'       => 30,
        'ignore_errors' => true,
    ),
);
$stream_context = stream_context_create($stream_options);
$fc = file_get_contents($url, false, $stream_context);
I'm using a web service to send SMS in PHP. The code is like below:
$options = array(
    'login'    => 'yourusername',
    'password' => 'yourpassword'
);

$client = new SoapClient('http://sms.hostiran.net/webservice/?WSDL', $options);

try
{
    $messageId = $client->send($destinationMobileNumber, 'test sms'); // placeholder for the real number
    sleep(3);
    print ($client->deliveryStatus($messageId));
    var_dump($client->accountInfo());
}
catch (SoapFault $sf)
{
    print $sf->faultcode."\n";
    print $sf->faultstring."\n";
}
The problem is that when I run this code on a WAMP server, it runs quickly. But when I run it on an Ubuntu server, it is very slow.
Is there any configuration in php.ini to solve this problem?
Thanks!
First, you need to remove sleep(3). That makes it take an extra 3 seconds.
Second, it looks like the SMS provider is in Iran, so it'd be best for you to get a web server in Iran.
As far as I know, there is no reason why an Ubuntu server would be slower at SOAP than a Windows server.
If you want to speed up the web page, then instead of running the SOAP request on page load, save the request to a database and have a cron job that runs every few minutes, pulls the requests out of the database, and makes the calls.
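A rough sketch of that queue-plus-cron approach is below. The table name, columns and PDO credentials are hypothetical placeholders, and $options is the same array as in the question:
<?php
// On page load: just queue the SMS instead of calling the SOAP service.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder DSN
$stmt = $pdo->prepare(
    "INSERT INTO sms_queue (destination, message, status) VALUES (?, ?, 'pending')"
);
$stmt->execute([$destination, 'test sms']); // $destination is a placeholder

// Cron script (e.g. every 5 minutes): send the pending messages.
$client = new SoapClient('http://sms.hostiran.net/webservice/?WSDL', $options);
foreach ($pdo->query("SELECT * FROM sms_queue WHERE status = 'pending'") as $row) {
    try {
        $client->send($row['destination'], $row['message']);
        $pdo->prepare("UPDATE sms_queue SET status = 'sent' WHERE id = ?")
            ->execute([$row['id']]);
    } catch (SoapFault $sf) {
        // leave the row pending; it will be retried on the next cron run
    }
}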
We're currently looking into doing some performance tweaking on a website which relies heavily on a SOAP web service. But... our servers are located in Belgium and the web service we connect to is located in San Francisco, so it's a long-distance connection to say the least.
Our website is PHP powered, using PHP's built in SoapClient class.
On average a call to the web service takes 0.7 seconds and we are doing about 3-5 requests per page. All possible request/response caching is already implemented, so we are now looking at other ways to improve the connection speed.
This is the code which instantiates the SoapClient. What I'm looking for now is other ways/methods to improve the speed of single requests. Anyone have ideas or suggestions?
private function _createClient()
{
    try {
        $wsdl = sprintf($this->config->wsUrl.'?wsdl', $this->wsdl);
        $client = new SoapClient($wsdl, array(
            'soap_version'       => SOAP_1_1,
            'encoding'           => 'utf-8',
            'connection_timeout' => 5,
            'cache_wsdl'         => 1,
            'trace'              => 1,
            'features'           => SOAP_SINGLE_ELEMENT_ARRAYS
        ));

        $header_tags = array(
            'username' => new SOAPVar($this->config->wsUsername, XSD_STRING, null, null, null, $this->ns),
            'password' => new SOAPVar(md5($this->config->wsPassword), XSD_STRING, null, null, null, $this->ns)
        );
        $header_body = new SOAPVar($header_tags, SOAP_ENC_OBJECT);
        $header      = new SOAPHeader($this->ns, 'AuthHeaderElement', $header_body);
        $client->__setSoapHeaders($header);
    } catch (SoapFault $e) {
        controller('Error')->error($id.': Webservice connection error '.$e->getCode());
        exit;
    }

    $this->client = $client;
    return $this->client;
}
So, the root problem is the number of requests you have to make. What about creating grouped services?
If you are in charge of the web services, you could create specialized web services which perform multiple operations in a single call, so your main app only has to do one request per page.
If not, you can relocate your app server near SF.
If relocating the whole server is not possible and you cannot create new specialized web services, you could add a bridge located near the web service's server. This bridge would provide the specialized web services and be in charge of calling the atomic web services. Instead of 0.7s * 5 you'd have 0.7s + 5 * 0.1, for example.
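For illustration, a bridge script of that kind could look roughly like this. The WSDL URL and the JSON request format are hypothetical; the point is only that one transatlantic round trip carries all the operations, and each atomic SOAP call becomes a short local hop:
<?php
// bridge.php - deployed near the remote web service (sketch only).
$wsdl   = 'https://webservice.example.com/service?wsdl'; // placeholder
$client = new SoapClient($wsdl);

// One HTTP round trip from Belgium delivers all the operations at once.
$operations = json_decode(file_get_contents('php://input'), true);

$results = [];
foreach ($operations as $op) {
    // Each atomic call is now local to the web service's datacenter.
    $results[] = $client->__soapCall($op['method'], $op['params']);
}

header('Content-Type: application/json');
echo json_encode($results);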
PHP.INI
output_buffering = On
output_handler = ob_gzhandler
zlib.output_compression = Off
Do you know for sure that it is network latency slowing down each request? 0.7s seems a long round-trip time, as Benoit says. I'd look at doing some benchmarking - you can do this with curl, although I'm not sure how this would work with your SOAP client.
Something like:
$ch = curl_init('http://path/to/sanfrancisco/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$output = curl_exec($ch);
$info = curl_getinfo($ch);
$info will return an array including elements for total_time, namelookup_time, connect_time, pretransfer_time, starttransfer_time and redirect_time. From these you should be able to work out whether it's the DNS, the request, the actual SOAP server or the response that's taking up the time.
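For instance, once $info is populated you could break the 0.7s down into its components with a quick diagnostic snippet like this (values are in seconds):
// Break the total time down into its components.
printf(
    "dns: %.3f  connect: %.3f  ttfb: %.3f  total: %.3f\n",
    $info['namelookup_time'],
    $info['connect_time'],
    $info['starttransfer_time'],
    $info['total_time']
);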
One obvious thing that's just occurred to me: are you requesting the SOAP server via a domain or an IP? If you're using a domain, your DNS might be slowing things down significantly (although it will be cached at several stages). Check your local DNS resolution time (in your SOAP client or php.ini - not sure) and the TTL of your domain (in your DNS zone). Set up a static IP for your SanFran server and reference it that way if you haven't already.
Optimize the server's (not the client's!) HTTP response by using caching and HTTP compression. Check out the tips at Yahoo: http://developer.yahoo.com/performance/rules.html
1 You can check that your SOAP server uses gzip compression for HTTP content, just as your site output does. A 0.7s round trip to SF seems a bit long; either the web service is slow to answer, or there is significant network latency.
If you can, try other hosting companies for your Belgian server; in France, some have far better connectivity to the US than others.
I once moved a website from one host to another and the network latency between Paris and New York almost doubled! That's huge, and my client with a lot of US visitors was unhappy with it.
Relocating the web server to SF can be an option; you'll get far better connectivity between the servers, but be careful about latency if your visitors are mainly located in Europe.
2 You can use an opcode cache mechanism, such as XCache or APC. It will not change the SOAP latency, but it will improve PHP execution time.
3 Depending on whether the SOAP requests are repetitive, and on how long a content update can be deferred, you can get a real improvement by caching the SOAP results. I suggest using an in-memory caching system (like XCache/memcached or similar) because they are much faster than file or DB cache systems.
In your class, the _createClient method isn't the best example of functionality to cache, but for any read operation it's simply the best way to improve performance:
private function _createClient()
{
    $xcache_key = 'clientcache';
    if (!xcache_isset($xcache_key)) {
        $ttl = 3600; // one hour cache lifetime
        $client = $this->_getClient(); // private method embedding your soap request
        xcache_set($xcache_key, $client, $ttl);
        return $client;
    }
    // return result from mem cache
    return xcache_get($xcache_key);
}
The example is for the XCache extension, but you can use other systems in a very similar manner.
4 To go further, you can use a similar mechanism to cache your PHP processing results (like template rendering output and other resource-consuming operations). The key to success with this technique is knowing exactly which parts are cached and for how long they stay cached without refreshing.
Any chance of using an AJAX interface? If the requests can happen in the background, you will not be left waiting for the response.
I have a PHP site, www.test.com.
On the index page of this site, I am updating another site's (www.pgh.com) database with the following PHP code.
$url = "https://pgh.com/test.php?userID=" . $userName . "&password=" . $password;
$response = file_get_contents($url);
But now the site www.pgh.com is down, so it is also affecting my site www.test.com.
How can I add some exception handling or something else to this code, so that my site keeps working if the other site is down?
$response = file_get_contents($url);
if ($response === false) // file_get_contents() returns false on failure
{
    // Return error
}
From the PHP manual
Adding a timeout:
$ctx = stream_context_create(array(
    'http' => array(
        'timeout' => 1
    )
));
file_get_contents("http://example.com/", false, $ctx);
file_get_contents() returns false on failure.
You have two options:
Add a timeout to the file_get_contents call using stream_context_create() (the manual has good examples; the docs for the timeout option are here). This is not perfect, as even a one-second timeout will cause a notable pause when loading the page.
More complex but better: use a caching mechanism. Do the file_get_contents request in a separate script that gets called frequently (e.g. every 15 minutes) using a cron job (if you have access to one). Write the result to a local file, which your actual script will read.
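A minimal sketch of that second option, assuming a cache file under /tmp (the path and cron frequency are placeholders):
<?php
// fetch-remote.php - run from cron, e.g. every 15 minutes.
// $url is built exactly as in the question.
$ctx = stream_context_create(array('http' => array('timeout' => 5)));
$response = @file_get_contents($url, false, $ctx);
if ($response !== false) {
    file_put_contents('/tmp/pgh-response.cache', $response); // placeholder path
}

// index.php - never blocks on pgh.com, only reads the local cache.
$cached = @file_get_contents('/tmp/pgh-response.cache');
if ($cached === false) {
    // handle "no data yet" however the site needs
}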