I'm importing data from a CRM server into WordPress via JSON. I know the load may take several minutes, so the script runs outside WordPress and I execute it with "php load_data.php".
But when the script reaches the part where it uploads the images, it throws an error:
php: time limit exceeded `Success' # fatal/cache.c/GetImagePixelCache/2042.
and it stops.
This is my code to upload an image to the media library:
<?php
function upload_image_to_media( $postid, $image_url, $set_featured = 0 ) {
    $tmp = download_url( $image_url );

    // Fix filename for query strings.
    preg_match( '/[^\?]+\.(jpg|jpe|jpeg|gif|png)/i', $image_url, $matches );
    $before_name = $postid == 0 ? 'upload' : $postid;

    $file_array = array(
        'name'     => $before_name . '_' . basename( $matches[0] ),
        'tmp_name' => $tmp,
    );

    // Check for download errors.
    if ( is_wp_error( $tmp ) ) {
        #unlink( $file_array['tmp_name'] );
        return false;
    }

    $media_id = media_handle_sideload( $file_array, $postid );

    // Check for sideload errors.
    if ( is_wp_error( $media_id ) ) {
        #unlink( $file_array['tmp_name'] );
        return false;
    }

    if ( $postid != 0 && $set_featured == 1 ) {
        set_post_thumbnail( $postid, $media_id );
    }

    return $media_id;
}
?>
There are about 50 posts, and each one has 10 large images.
Regards
The default execution time is 30 seconds, so it looks like you are exceeding that. We have a similar script that downloads up to a couple of thousand photos per run. Adding set_time_limit(60) to reset the timer on each loop iteration fixed our timeout issues. In your case you can probably just add it at the beginning of the function. Just be very careful you don't get any infinite loops, as they will run forever (or until the next reboot).
To make sure it works, you can add the line below as the first line inside your upload function:
set_time_limit(0)
This will allow it to run until it's finished, but watch it, as letting it run forever WILL hurt your server's available memory. To see whether the script works, put that in there, then adjust it to a proper time limit if need be.
If you get another (or the same) error, it will at least verify that it's not a time issue (error messages are not always factual).
The other possibility is that you are on a shared server and are exceeding their time allotment for your server (continuous processor use for more than 30 seconds, for example).
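As a rough sketch of the "reset the timer each loop" idea, assuming an import loop shaped roughly like the one below (the loop and its variable names are made up, since the question only shows the upload function):
// Hypothetical outer import loop; only the set_time_limit() call is the suggestion.
foreach ( $posts as $postid => $image_urls ) {
    foreach ( $image_urls as $image_url ) {
        set_time_limit( 60 ); // restart the 60-second budget before each image
        upload_image_to_media( $postid, $image_url );
    }
}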
I have a problem with my WordPress query.
What I'm trying to do:
I have a CSV file with product data (name, price, stock, SKU, etc.).
I want to import this file, but when I try to get the product ID by SKU, the query load is too heavy for my server. I'm doing something stupid: inside a foreach I try to look up every product_id.
Is it possible to split my WP query without killing my server?
I tried sleep() but it didn't help...
My code is here:
public function new_import_stock_prices() {
    global $wpdb;
    global $post;

    if ( ! function_exists( 'wc_get_product_id_by_sku' ) ) {
        require_once '/includes/wc-product-functions.php';
    }

    // Headings are in Polish: "Import of stock levels and prices from a CSV file"
    // and "The file is fetched from netis/products.csv".
    echo '<h1>Import stanów magazynowych i cen z pliku CSV </h1>';
    echo '<h4>Plik pobierany jest z netis/products.csv</h4>';

    $fn = 'https://e-xxxxx.pl/xxx/products.csv';
    $file_array = file($fn);

    echo '<table>';
    echo '<tr>';
    echo '<td>LP</td>';              // No.
    echo '<td>Nazwa</td>';           // Name
    echo '<td>SKU</td>';
    echo '<td>Stan magazynowy</td>'; // Stock level
    echo '<td>Cena</td>';            // Price
    echo '<td>Product ID</td>';
    $i = 1;

    if ( in_array( 'woocommerce/woocommerce.php', apply_filters( 'active_plugins', get_option( 'active_plugins' ) ) ) ) {
        foreach ( $file_array as $line_number => &$line ) {
            if ( $line_number > 0 && $line_number % 10 == 0 ) {
                $row2 = explode( '|', $line );
                $sku  = $row2[1];

                // Get the product ID from the SKU.
                $product_id = $wpdb->get_var( $wpdb->prepare( "SELECT post_id FROM $wpdb->postmeta WHERE meta_key='_sku' AND meta_value='%s' LIMIT 1", $sku ) );

                // Get an instance of the WC_Product object.
                $product = new WC_Product( $product_id );

                // Get product stock quantity and stock status.
                $stock_quantity = $product->get_stock_quantity();
                $stock_status   = $product->get_stock_status();

                echo '<tr>';
                echo '<td>' . $i . '</td>';
                echo '<td>' . $row2[0] . '</td>';
                echo '<td>' . $row2[1] . '</td>';
                echo '<td>' . $row2[5] . '</td>';
                echo '<td>' . $row2[2] . '</td>';
                echo '<td>' . $product_id . '</td>';
                echo '</tr>';

                $i = $i + 1;
                sleep(10);
            }
        }
    }
    echo '</table>';
}
BTW, my wp_postmeta table has ~900,000+ records :O
And I want to import this file
I don't see any code for importing; I see code for displaying. Assuming that by import you mean display:
What's probably happening is one of a few things:
You're running out of memory (you should get an error for this). In that case, don't use file($fn); use file functions that open the file and read it line by line, such as fgetcsv.
You're running out of time. Not much you can do about this, except send less data.
You're overwhelming the browser buffer by sending too much output. Again, not much you can do about this but send less data.
The only real solution (assuming that by import you mean display) is to page the data.
Now, even with a plain file you can page the data, but I would suggest using SplFileObject instead of the procedural file functions. That said, you can page using the procedural style, but then it's by byte offset, not page number.
While I can't code an entire paging system, I can give you some tips.
For example:
// Hard to tell how many lines are in the file.
$fn = 'https://e-xxxxx.pl/xxx/products.csv';
$f  = fopen($fn, 'r');

fseek($f, $_GET['offset']); // seek to a byte offset

$i = 0;
while (!feof($f) && ($row = fgetcsv($f)) && null !== $row[0]) {
    // ... output $row here ...
    if ($i == 10) {
        $offset = ftell($f); // byte offset after 10 rows; pass it on as the next page's start
        break;
    }
    ++$i;
}
ftell and fseek allow you to get or move the file pointer (in bytes), so you can start reading from a predefined offset that you can pass around in the URL, etc.
You can do the same thing with SplFileObject, but a bit more cleanly:
try {
    $fn  = 'https://e-xxxxx.pl/xxx/products.csv';
    $csv = new SplFileObject($fn, 'r');
} catch (RuntimeException $e) {
    printf("Error opening csv: %s\n", $e->getMessage());
}

$csv->seek($_GET['line']); // seek to a predefined line

while (!$csv->eof() && ($row = $csv->fgetcsv()) && null !== $row[0]) {
    // ... output $row here ...
    if (($csv->key() - $_GET['line']) == 10) {
        $line = $csv->key(); // line offset to start the next page from
        break;
    }
}
The main advantage of SPL is you can use the row number, which is much easier to work with.
You can also get the total number of lines in a file like this
$csv->seek(PHP_INT_MAX);
$total = $csv->key();
$csv->rewind(); //or $csv->seek($_GET['line'])
Basically this seeks to the largest possible integer PHP can handle, and because the file has a finite length, that puts the pointer at the end of the file; then, using key(), we can get the line number. Finally we simply rewind (or seek) to where we want to read from.
I mention the total number of rows because in paging it's nice to be able to show that.
Another option (to display)
Besides paging, you can output the page without buffering.
// Turn off output buffering (output_buffering can only be set in php.ini or
// .htaccess, so this ini_set() is mostly symbolic; the ob_end_flush() loop
// below is what actually clears any active buffers at runtime)
ini_set('output_buffering', 'off');
// Turn off PHP output compression
ini_set('zlib.output_compression', false);
//Flush (send) the output buffer and turn off output buffering
//ob_end_flush();
while (ob_get_level()) ob_end_flush();
// Implicitly flush the buffer(s)
ini_set('implicit_flush', true);
ob_implicit_flush(true);
Combine this with one of the methods I showed above to read the file one line at a time, and you may be able to eventually read all that data out.
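For example, here is a rough sketch of the combination, assuming allow_url_fopen is enabled for the remote CSV (as in the snippets above); each row is flushed to the browser as soon as it is read:
// Disable buffering, then stream the CSV out one row at a time.
while (ob_get_level()) {
    ob_end_flush();
}
ob_implicit_flush(true);

$csv = new SplFileObject('https://e-xxxxx.pl/xxx/products.csv', 'r');

echo '<table>';
while (!$csv->eof() && ($row = $csv->fgetcsv()) && null !== $row[0]) {
    echo '<tr><td>' . htmlspecialchars(implode(' | ', $row)) . '</td></tr>';
    flush(); // push this row to the browser immediately
}
echo '</table>';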
Saving
For saving the data, you're probably going to need to break it into batches; the same paging idea works here too (using an offset or line number), so that you only import a couple of thousand rows at a time. I would also recommend not outputting the data, because you can give the browser more output than its buffer can handle and lock it up. However, if you page the data, you can break it into small enough chunks that the browser can handle it.
You can even automate this using successive AJAX calls. Basically, you would call the code on the backend to save a certain number of rows (x). The server would respond, and then you would make another call for (x) more rows; save and repeat.
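A very rough sketch of what that backend call could look like in WordPress (the AJAX action name, batch size, and response shape are made up for illustration; the SKU lookup is the one from the question):
// Hypothetical AJAX handler: saves one batch of CSV rows, then reports where to resume.
add_action('wp_ajax_import_stock_batch', function () {
    global $wpdb;

    $start      = isset($_POST['line']) ? (int) $_POST['line'] : 0;
    $batch_size = 500; // rows per request; tune to what the server tolerates

    $csv = new SplFileObject('https://e-xxxxx.pl/xxx/products.csv', 'r');
    $csv->seek($start);

    $processed = 0;
    while (!$csv->eof() && $processed < $batch_size) {
        $row = explode('|', (string) $csv->current());
        $csv->next();
        $processed++;

        if (count($row) < 3) {
            continue; // skip blank or malformed lines
        }

        $sku        = $row[1];
        $product_id = $wpdb->get_var($wpdb->prepare(
            "SELECT post_id FROM $wpdb->postmeta WHERE meta_key = '_sku' AND meta_value = %s LIMIT 1",
            $sku
        ));
        // ... update the stock/price for $product_id here ...
    }

    // Tell the browser where the next batch starts, or that the import is finished.
    wp_send_json(array('next' => $csv->eof() ? null : $start + $processed));
});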
I want to display all product IDs to check that they're correct. The next step is changing stock and price and saving the products.
It would be easier to do this kind of work in something like Excel, just from a data-entry standpoint; no one wants to edit thousands of rows on a web page and then have their session time out or something like that.
Hope that helps.
I'm sorry if this has been asked before; I couldn't find the right keywords to search on Google, so I decided to ask here on SO.
Right now I have an upload function built in Laravel that can upload up to 20 images at a time, but the client would like it to be 50 images at a time. I can't edit php.ini every time the client wants to increase the maximum number of simultaneous uploads, because it might break the server.
Is it possible in PHP to upload in a queue, say 10 images this second, then another 10 the next second, and so on until the upload is done, so the server won't break?
foreach ($request->file('image') as $key => $file)
{
    $filename  = md5(time() . uniqid()) . '.' . $file->getClientOriginalExtension();
    $imagesize = getimagesize($file);

    if (($imagesize[0] == 533 && $imagesize[1] == 800) == false &&
        ($imagesize[0] == 800 && $imagesize[1] == 533) == false
    ) {
        $error++;
        continue;
    }

    $file->move('uploads', $filename);
    $data['image'] = url('uploads/' . $filename);

    $order_id = 1;
    $count = PropertyImages::where('property_id', $id)->count();

    if ($count > 23)
    {
        return response()->json(['success' => false, 'msg' => 'Images must not exceed 24']);
    }

    $image = PropertyImages::where('property_id', $id)->orderBy('order_id', 'DESC')->first();

    if ($image)
    {
        $order_id = $image->order_id + 1;
    }

    $item = PropertyImages::create([
        'property_id' => $id,
        'filename'    => $filename,
        'order_id'    => $order_id
    ]);

    $items[$key]['id']       = $item->id;
    $items[$key]['filename'] = url('uploads/' . $item->filename);
}
I am not sure what setup you are using for uploading images, so I assume that you have a plain simple form with 20 file input fields.
php.ini limits the size of a request (e.g. post_max_size=20M limits the request body to 20 MB, max_input_vars = 20 limits the number of input variables per request to 20, and max_file_uploads caps the number of uploaded files per request), so it all depends on the limits in your php.ini.
So if your limit is 20 images per request and you are able to send 20 images in a single POST request, then sending 50 images to the server will need 3 requests (a maximum of 20 images per request, and 10 in the last one).
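For reference, these are the php.ini directives that usually cap uploads; the values below are only illustrative, not recommendations:
; php.ini (illustrative values)
max_file_uploads    = 50     ; maximum number of files accepted in a single request
post_max_size       = 64M    ; total size of the POST body, files included
upload_max_filesize = 8M     ; maximum size of each individual file
max_input_vars      = 1000   ; maximum number of input variables per request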
Now, if you are using a plain form, it can't do the job, because clicking the submit button submits the form only once. Instead, you will have to submit your form using JavaScript (or jQuery); with JS you can split the images across multiple AJAX requests.
Here is a jQuery plugin that can do the job very efficiently
https://github.com/zimt28/laravel-jquery-file-upload
A little help for using it with Laravel:
https://laracasts.com/discuss/channels/general-discussion/jquery-file-upload-with-laravel
Happy Coding!!
Every time I try, I am unable to import the WordPress test data from here:
http://codex.wordpress.org/Theme_Unit_Test
I am using XAMPP, and sometimes the import fails.
Thanks
Find class-wp-http-curl.php in that directory (C:\xampp\htdocs\wordpress\wp-includes) and, before line 240, add set_time_limit(5000);
set_time_limit (5000);
curl_exec( $handle );
$theHeaders = WP_Http::processHeaders( $this->headers, $url );
$theBody = $this->body;
$bytes_written_total = $this->bytes_written_total;
Your answer is in the error you provided: Fatal error: Maximum execution time of 60 seconds exceeded.
You should edit your server's php.ini to extend this 60-second limit, for example to 240 seconds:
max_execution_time = 240
Find class-http.php in that directory (C:\xampp\htdocs\wordpress\wp-includes) and, before line 1511 or 1510, add set_time_limit(5000);
The 5000 is in seconds, so the larger the number, the longer the maximum execution time will be. At line 1509 it should look like this...
set_time_limit (5000);
curl_exec( $handle );
$theHeaders = WP_Http::processHeaders( $this->headers, $url );
$theBody = $this->body;
$bytes_written_total = $this->bytes_written_total;
Max Execution Time exceeded error while importing WordPress Theme Unit Testing Data
If nothing works, try this:
wp-includes > deprecated.php, line number 3636
(WordPress v4.8.2)
function wp_get_http( $url, $file_path = false, $red = 1 ) {
    _deprecated_function( __FUNCTION__, '4.4.0', 'WP_Http' );
    @set_time_limit( 60 );
    if ( $red > 5 )
        return false;
    // …
Change the @set_time_limit( 60 ); line to @set_time_limit( 1260 );
It works for me.
I have a script that is running multiple times because the validation takes long enough to allow multiple instances of the script to start. It is supposed to run about once a day, but yesterday script_start() ran 18 times, all right around the same time.
add_action('init', 'time_validator');

function time_validator() {
    $last     = get_option( 'last_update' );
    $interval = get_option( 'interval' );
    $slop     = get_option( 'interval_slop' );

    if ( ( time() - $last ) > ( $interval + rand( 0, $slop ) ) ) {
        update_option( 'last_update', time() );
        script_start();
    }
}
It sounds messy that you've detected 18 instances of your script running although you don't want that; you should fix the code that spawns those script instances.
However, you can also implement this check in the script itself. To make sure the script runs only once at a time, you should use flock(). I'll give an example.
Add this to the top of the code that should run only once at a time:
// open the lock file
$fd = fopen('lock.file', 'w+');

// try to obtain an exclusive lock. If another instance currently holds
// the lock we'll just exit. (LOCK_NB makes flock non-blocking.)
if (!flock($fd, LOCK_EX | LOCK_NB)) {
    die('process is already running');
}
... and this at the end of the critical code:
// release the lock
flock($fd, LOCK_UN);
// close the file
fclose($fd);
The method described is safe against race conditions, it really makes sure that a critical section runs only once.
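For the questioner's case, a minimal sketch of how this could be wired into the hook above (the lock-file path is just an example; get_option/update_option and script_start() are from the question):
add_action('init', 'time_validator');

function time_validator() {
    // Take an exclusive, non-blocking lock so concurrent requests bail out
    // instead of all passing the time check at the same moment.
    $fd = fopen(__DIR__ . '/time_validator.lock', 'w+');
    if (!$fd || !flock($fd, LOCK_EX | LOCK_NB)) {
        return; // another request is already doing the check
    }

    $last     = get_option( 'last_update' );
    $interval = get_option( 'interval' );
    $slop     = get_option( 'interval_slop' );

    if ( ( time() - $last ) > ( $interval + rand( 0, $slop ) ) ) {
        update_option( 'last_update', time() );
        script_start();
    }

    flock($fd, LOCK_UN);
    fclose($fd);
}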
I have a WordPress plugin with a backup script that executes on a schedule. The catch is, if someone hits the page multiple times in quick succession, the backup script can execute multiple times. Any thoughts on how to prevent multiple executions?
global $bwpsoptions;

if ( get_transient( 'bit51_bwps_backup' ) === false ) {

    set_transient( 'bit51_bwps_backup', '1', 300 );

    if ( $bwpsoptions['backup_enabled'] == 1 ) {

        $nextbackup = $bwpsoptions['backup_next']; // get next schedule
        $lastbackup = $bwpsoptions['backup_last']; // get last backup

        switch ( $bwpsoptions['backup_interval'] ) { // schedule backup at the appropriate time
            case '0':
                $next = 60 * 60 * $bwpsoptions['backup_time'];
                break;
            case '1':
                $next = 60 * 60 * 24 * $bwpsoptions['backup_time'];
                break;
            case '2':
                $next = 60 * 60 * 24 * 7 * $bwpsoptions['backup_time'];
                break;
        }

        if ( ( $lastbackup == '' || $nextbackup < time() ) && get_transient( 'bit51_bwps_backup' ) === false ) {

            $bwpsoptions['backup_last'] = time();

            if ( $lastbackup == '' ) {
                $bwpsoptions['backup_next'] = ( time() + $next );
            } else {
                $bwpsoptions['backup_next'] = ( $lastbackup + $next );
            }

            update_option( $this->primarysettings, $bwpsoptions );

            $this->execute_backup(); // execute backup
        }
    }
}
Create a file at the start of the code.
When the code finishes running, delete the file.
At the beginning of the code, make sure the file doesn't exist before running.
Sort of like the apt-get lock in Linux.
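A minimal sketch of that idea, assuming it lives inside whatever method triggers the backup (the lock-file name is arbitrary; execute_backup() is from the question):
$lock = __DIR__ . '/backup.lock';

// If the lock file exists, another backup is already running
// (or died without cleaning up, in which case remove the stale lock by hand).
if ( file_exists( $lock ) ) {
    return;
}

// Create the lock file before starting.
file_put_contents( $lock, (string) time() );

try {
    $this->execute_backup(); // the backup routine from the question
} finally {
    // Always remove the lock when done, even if the backup throws.
    unlink( $lock );
}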
If your site is very busy and basic locking mechanisms aren't working (I personally can't imagine that, but oh well!), you can try the approach PHP's session garbage collector uses.
Just randomly choose a number between 0 and 10, and if the number is 0, do the backup. If 10 users now call your backup script at nearly the same time, statistically only one will actually execute the backup.
define("BACKUP_PROBABILITY", 10);
if (mt_rand(0, BACKUP_PROBABILITY) == 0)
doBackup();
You can increase the maximum (the 10) if your site is very heavily trafficked.
If none of those 10 visits drew the 0, the next 10 visitors will get their chance.
You will of course still need some kind of locking mechanism, and it is still possible (though implausible) that you will end up with more than one, or even 10, backups.
I found this question about mutexes (locks) in PHP. Might be helpful: PHP mutual exclusion (mutex)
Store the last backup date/time in some external file on the server or in the database, and check against that value!
I assume that this backup thing makes a backup somewhere.
So check the metadata on the latest backup, and if its creation time is not far enough in the past, don't do the backup.
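For example, a rough sketch assuming the backups land as .zip files in a known directory (the path and the 24-hour window are made-up values):
$backup_dir = '/path/to/backups'; // hypothetical backup directory
$latest     = 0;

// Find the newest backup by file modification time.
foreach ( glob( $backup_dir . '/*.zip' ) as $file ) {
    $latest = max( $latest, filemtime( $file ) );
}

// Only run a new backup if the newest one is older than 24 hours.
if ( time() - $latest > 24 * 60 * 60 ) {
    $this->execute_backup();
}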
I assume there's a good reason why this isn't a cron job?