Every time I try, I am unable to import the WordPress test data from here:
http://codex.wordpress.org/Theme_Unit_Test
I am using XAMPP, and the import sometimes fails.
Thanks
Find class-wp-http-curl.php in that directory (C:\xampp\htdocs\wordpress\wp-includes) and, before line 240, add set_time_limit( 5000 ); so it looks like this:
set_time_limit (5000);
curl_exec( $handle );
$theHeaders = WP_Http::processHeaders( $this->headers, $url );
$theBody = $this->body;
$bytes_written_total = $this->bytes_written_total;
Your answer is in the error you provided: Fatal error: Maximum execution time of 60 seconds exceeded.
You should edit your server's php.ini to extend this 60-second limit, for example to 240 seconds:
max_execution_time = 240
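If you cannot edit php.ini (for example on shared hosting), a runtime override may also work; this is only a sketch, assuming your host allows these directives to be changed at runtime:
// e.g. near the top of wp-config.php (placement is an assumption, not a WordPress requirement)
@ini_set( 'max_execution_time', '240' ); // seconds per request
@set_time_limit( 240 );                  // resets the timer for the current request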
Find class-http.php in that directory (C:\xampp\htdocs\wordpress\wp-includes)
and, before line 1511 or 1510, add set_time_limit( 5000 );.
The 5000 is in seconds, so the larger the number, the longer the maximum execution time. Around line 1509 it should look like this:
set_time_limit (5000);
curl_exec( $handle );
$theHeaders = WP_Http::processHeaders( $this->headers, $url );
$theBody = $this->body;
$bytes_written_total = $this->bytes_written_total;
Max Execution Time exceeded error while importing WordPress Theme Unit Testing Data
If nothing else works, try this:
wp-includes > deprecated.php, around line 3636
WordPress version: 4.8.2
function wp_get_http( $url, $file_path = false, $red = 1 ) {
    _deprecated_function( __FUNCTION__, '4.4.0', 'WP_Http' );

    @set_time_limit( 60 );

    if ( $red > 5 )
        return false;
    // ...
Change that line to @set_time_limit( 1260 );
It works for me.
Related
I'm importing data from a CRM server via JSON into WordPress.
I know the load may take several minutes, so the script runs outside WordPress and I execute it with "php load_data.php".
But when the script reaches the part where it uploads the images, it throws an error:
php: time limit exceeded `Success' # fatal/cache.c/GetImagePixelCache/2042.
and it stops.
This is my code for uploading an image to the media library:
<?php
function upload_image_to_media( $postid, $image_url, $set_featured = 0 ) {
    $tmp = download_url( $image_url );

    // Fix the filename for query strings.
    preg_match( '/[^\?]+\.(jpg|jpe|jpeg|gif|png)/i', $image_url, $matches );
    $before_name = $postid == 0 ? 'upload' : $postid;

    $file_array = array(
        'name'     => $before_name . '_' . basename( $matches[0] ),
        'tmp_name' => $tmp,
    );

    // Check for download errors; nothing was saved, so just bail out.
    if ( is_wp_error( $tmp ) ) {
        return false;
    }

    $media_id = media_handle_sideload( $file_array, $postid );

    // Check for sideload errors and clean up the temporary file.
    if ( is_wp_error( $media_id ) ) {
        @unlink( $file_array['tmp_name'] );
        return false;
    }

    if ( $postid != 0 && $set_featured == 1 ) {
        set_post_thumbnail( $postid, $media_id );
    }

    return $media_id;
}
?>
There are about 50 posts and each one has 10 large images.
Regards
The default execution time is 30 seconds, so it looks like you are exceeding that. We have a similar script that downloads up to a couple of thousand photos per run. Adding set_time_limit(60) to reset the timer on each loop iteration fixed our timeout issues. In your case you can probably just add it at the beginning of the function. Just be very careful that you don't create any infinite loops, as they will run forever (or until the next reboot).
To make sure it works, you can add the line below as the first line inside your upload function:
set_time_limit(0);
This will allow the script to run until it is finished, but watch it, as it will also let it run forever, which WILL hurt your server's available memory. To see whether the script works, put that in there, then adjust it to a proper time limit if need be.
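For example (a minimal sketch based on the upload function from the question; the added call is the only change):
function upload_image_to_media( $postid, $image_url, $set_featured = 0 ) {
    // Reset the execution timer for every image; use 0 only while testing,
    // then switch to a finite value such as 60.
    set_time_limit( 0 );

    $tmp = download_url( $image_url );
    // ... rest of the function unchanged ...
}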
If you get another error, or the same one, it will at least verify it's not a time issue (error messages are not always factual).
The other possibility is that you are on a shared server and are exceeding the host's time allotment for your server (continuous processor use for more than 30 seconds, for example).
I've battled with Elegant Themes and WPEngine over this issue now for the past 2 weeks, and I'm getting absolutely nowhere with either of them.
Here is just one of the errors from the log:
[Wed Apr 26 20:37:29.797099 2017] [:error] [pid 9802] [client 192.104.203.89:55604] PHP Fatal error: Allowed memory size of 1073741824 bytes exhausted (tried to allocate 218817933 bytes) in /nas/content/live/amberrein/wp-content/plugins/monarch/monarch.php on line 4279, referer: http://www.amberreinhardt.com/category/weddings/
I've already upped the memory limit via wp-config.php:
define( 'WP_MEMORY_LIMIT', '1024M' );
define( 'WP_MAX_MEMORY_LIMIT', '1024M' );
It's very hit or miss with the memory leak. Some blog posts are fine, while others are not. Then I go back to visit a working post and it no longer loads because of the leak, so for whatever reason it changes; I don't know. The memory amount in the error is always different, but never higher than what I have the limit set to. It also seems that the higher the limit, the longer it takes for a post to load (if it loads at all), which is strange.
Here is the entire function that I believe is causing the error; line 4279 starts with $content:
function display_media( $content ) {
    $monarch_options = $this->monarch_options;

    if ( $this->check_applicability( $monarch_options[ 'sharing_media_post_types' ], 'media' ) ) {
        preg_match_all( '/<img [^>]*>/s', $content, $images_array );

        foreach ( $images_array[0] as $image ) {
            if ( false !== strpos( $image, 'class="ngg_' ) ) {
                continue;
            }

            preg_match( '#src="([^"]+)"#', $image, $image_src );

            $icons = $this->generate_media_icons( $image_src[1] );

            $replacement = '<div class="et_social_media_wrapper">' . $image . $icons . '</div>';

            $content = str_replace( $image, $replacement, $content );
        }
    }

    return $content;
}
What can I do to resolve this issue, since Elegant Themes and WPEngine are no help? I don't want to have to disable the plugin as the client wants to keep it.
So I'm trying to cache an array in a file and use it somewhere else.
import.php
// The code above reads each line of the CSV and puts it into an array
// (1 line is 1 sub-array of the multidimensional array $csv)
$export = var_export($csv, true);
$content = "<?php \$data=" . $export . ";?>";
$target_path1 = "/var/www/html/Samples/test/";
file_put_contents($target_path1 . "recordset.php", $content);
somewhere.php
ini_set('memory_limit','-1');
include_once("/var/www/html/Samples/test/recordset.php");
print_r($data);
Now, I've included recordset.php in somewhere.php to use the array stored in it. It works fine when the uploaded CSV file has 5,000 lines, but if I try to upload a CSV with 50,000 lines, for example, I get a fatal error:
Fatal error: Allowed memory size of 67108864 bytes exhausted (tried to allocate 79691776 bytes)
How can I fix this, or is there a more convenient way to achieve what I want? Speaking of performance, should I consider the CPU of the server? I've already overridden the memory limit and set it to -1 in somewhere.php.
There are 2 ways to fix this:
1. You need to increase the memory (RAM) on the server, as memory_limit can only use memory that is actually available on the server, and it seems you have very little RAM available for PHP.
To check the total RAM on a Linux server:
<?php
$fh  = fopen( '/proc/meminfo', 'r' );
$mem = 0;
while ( $line = fgets( $fh ) ) {
    $pieces = array();
    if ( preg_match( '/^MemTotal:\s+(\d+)\skB$/', $line, $pieces ) ) {
        $mem = $pieces[1];
        break;
    }
}
fclose( $fh );
echo "$mem kB RAM found";
?>
Source: get server ram with php
2. You should parse your CSV file in chunks, releasing the occupied memory each time with unset(), as in the sketch below.
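A minimal sketch of that idea (the CSV path, the chunk size and the process_chunk() callback are placeholders, not part of the original code):
<?php
// Read the CSV in chunks instead of materialising everything in one array.
$handle    = fopen( '/var/www/html/Samples/test/data.csv', 'r' ); // assumed path
$chunk     = array();
$chunkSize = 1000; // tune to your memory limit

while ( ( $row = fgetcsv( $handle ) ) !== false ) {
    $chunk[] = $row;
    if ( count( $chunk ) >= $chunkSize ) {
        process_chunk( $chunk ); // hypothetical: persist/handle this batch
        unset( $chunk );         // release the memory held by the batch
        $chunk = array();
    }
}

// Handle whatever is left over.
if ( ! empty( $chunk ) ) {
    process_chunk( $chunk );
    unset( $chunk );
}

fclose( $handle );
?>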
I have a script that is running multiple times because the validation takes too long and allows multiple instances of the script. It is supposed to run about once a day, but yesterday script_start() ran 18 times, all right around the same time.
add_action( 'init', 'time_validator' );

function time_validator() {
    $last     = get_option( 'last_update' );
    $interval = get_option( 'interval' );
    $slop     = get_option( 'interval_slop' );

    if ( ( time() - $last ) > ( $interval + rand( 0, $slop ) ) ) {
        update_option( 'last_update', time() );
        script_start();
    }
}
It sounds messy that you've detected 18 instances of your script running when you don't want that. You should fix the code that launches those script instances.
However, you can also implement the check in the script itself. To make sure that the script runs only once, you should use flock(). I'll give an example.
Add this at the top of the code that should run only once at a time:
// open the lock file
$fd = fopen('lock.file', 'w+');
// try to obtain an exclusive lock. If another instance is currently
// obtaining the lock we'll just exit. (LOCK_NB makes flock not blocking)
if (!flock($fd, LOCK_EX | LOCK_NB)) {
    die('process is already running');
}
... and this at the end of the critical code:
// release the lock
flock($fd, LOCK_UN);
// close the file
fclose($fd);
The method described is safe against race conditions; it really makes sure that the critical section runs only once at a time.
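Applied to the cron check from the question, the combined result might look like this (only a sketch; the lock-file path is an assumption and error handling is kept minimal):
function time_validator() {
    // Hypothetical lock-file location; use somewhere writable and persistent.
    $fd = fopen( '/tmp/time_validator.lock', 'w+' );

    if ( ! flock( $fd, LOCK_EX | LOCK_NB ) ) {
        fclose( $fd );
        return; // another instance is already running
    }

    $last     = get_option( 'last_update' );
    $interval = get_option( 'interval' );
    $slop     = get_option( 'interval_slop' );

    if ( ( time() - $last ) > ( $interval + rand( 0, $slop ) ) ) {
        update_option( 'last_update', time() );
        script_start();
    }

    flock( $fd, LOCK_UN ); // release the lock
    fclose( $fd );
}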
For one of my projects I need to import a very large text file (~950 MB). I'm using Symfony2 & Doctrine 2 for this project.
My problem is that I get errors like:
Fatal error: Allowed memory size of 33554432 bytes exhausted (tried to allocate 24 bytes)
The error even occurs if I increase the memory limit to 1GB.
I tried to analyze the problem using Xdebug and KCachegrind (as part of PHPEdit), but I don't really understand the values :(
I'm looking for a tool or a method (quick & simple, as I don't have much time) to find out why the memory is allocated and not freed again.
Edit
To clear some things up, here is my code:
$handle = fopen($geonameBasePath . 'allCountries.txt', 'r');

$i = 0;
$batchSize = 100;

if ($handle) {
    while (($buffer = fgets($handle, 16384)) !== false) {
        if ($buffer[0] == '#') // skip comments
            continue;

        // split parts
        $parts = explode("\t", $buffer);

        if ($parts[6] != 'P')
            continue;

        if ($i % $batchSize == 0) {
            echo 'Flush & Clear' . PHP_EOL;
            $em->flush();
            $em->clear();
        }

        $entity = $em->getRepository('MyApplicationBundle:City')->findOneByGeonameId($parts[0]);
        if ($entity !== null) {
            $i++;
            continue;
        }

        // create city object
        $city = new City();
        $city->setGeonameId($parts[0]);
        $city->setName($parts[1]);
        $city->setInternationalName($parts[2]);
        $city->setLatitude($parts[4]);
        $city->setLongitude($parts[5]);
        $city->setCountry($em->getRepository('MyApplicationBundle:Country')->findOneByIsoCode($parts[8]));

        $em->persist($city);

        unset($city);
        unset($entity);
        unset($parts);
        unset($buffer);

        echo $i . PHP_EOL;
        $i++;
    }
}

fclose($handle);
Things I have tried, but nothing helped:
Adding second parameter to fgets
Increasing memory_limit
Unsetting vars
Increasing the memory limit is not going to be enough. When importing files like that, you should buffer the reading:
$f = fopen('yourfile', 'r');
while (($data = fread($f, 4096)) !== false && $data !== '') {
    // Do your stuff using the read $data
}
fclose($f);
Update:
When working with an ORM, you have to understand that nothing is actually inserted into the database until the flush() call. All those objects are stored by the ORM, tagged as "to be inserted"; only when flush() is called does the ORM go through the collection and start inserting.
Solution 1: Flush often. And clear.
Solution 2: Don't use the ORM. Go for plain SQL commands; they take up far less memory than the object + ORM approach. A sketch follows below.
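For example, with a PDO prepared statement instead of entities (a sketch only; the connection details and the table/column names are assumptions based on the City entity above):
// Assumed DSN and schema; adjust to your setup, or reuse Doctrine's underlying connection.
$pdo  = new PDO('mysql:host=localhost;dbname=myapp', 'user', 'pass');
$stmt = $pdo->prepare(
    'INSERT INTO city (geoname_id, name, international_name, latitude, longitude)
     VALUES (?, ?, ?, ?, ?)'
);

// Inside the read loop, bind the parsed parts directly instead of building entities.
$stmt->execute(array($parts[0], $parts[1], $parts[2], $parts[4], $parts[5]));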
33554432 bytes is 32 MB.
Change the memory limit in php.ini, for example to 75 MB:
memory_limit = 75M
and restart the server.
Instead of reading the whole file at once, you should read it line by line and process the data each time you read a line. Do NOT try to fit EVERYTHING in memory; you will fail. The reason is that while you can fit the text file in RAM, you will not be able to also hold the data as PHP objects/variables at the same time, since PHP itself needs a much larger amount of memory for each of them.
What I suggest instead is (see the sketch after this list):
a) read a new line,
b) parse the data in the line,
c) create the new object to store in the database,
d) go to step a, unset()ting the old object first or reusing its memory.
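A minimal sketch of that loop, reusing the import code from the question (the batch size and the flush/clear cadence are just illustrative):
$handle = fopen($geonameBasePath . 'allCountries.txt', 'r');  // a) read line by line
$batchSize = 100;
$i = 0;

while (($line = fgets($handle)) !== false) {
    $parts = explode("\t", $line);                            // b) parse the line

    $city = new City();                                       // c) build the object
    $city->setGeonameId($parts[0]);
    $city->setName($parts[1]);
    $em->persist($city);

    if (++$i % $batchSize === 0) {
        $em->flush();   // push the batch to the database
        $em->clear();   // detach managed entities so memory can be reclaimed
    }

    unset($city, $parts, $line);                              // d) free before the next line
}

$em->flush();   // persist the final partial batch
$em->clear();
fclose($handle);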