PHP unlink doesn't work

I have a multi-delete feature in one of my CMS solutions, and I have the following code:
public function actionDelete() {
    if (Yii::app()->request->isPostRequest) {
        if (isset($_POST['submit'])) {
            if (isset($_POST['delete']) && (sizeof($_POST['delete']))) {
                foreach ($_POST['delete'] as $i => $items) {
                    $model[$i] = Pages::model()->findByPk((int)$items);
                    if (!is_null($model[$i])) {
                        $image[$i] = $model[$i]->image;
                        if ($model[$i]->delete()) {
                            if (!unlink($image[$i])) {
                                die('Unable to delete Page Image.');
                            }
                        }
                    }
                }
            }
        }
    }
    $this->redirect('/admin/pages');
}
This is the action of a Yii controller, and on EVERY page there is a value filled in the "image" field/column.
After I invoke this action with POST data, it actually deletes the records from the database table, but it does not remove the images from the file system, and the script never reaches this point: die('Unable to delete Page Image.');
Is it possible that PHP strips and ignores the unlink function, mostly on production / live servers?

Is it possible that PHP strips and ignores the unlink function, mostly on production / live servers?
No, absolutely not (unless they've disabled that function via disable_functions, in which case calling it produces a warning or error rather than failing silently). It's either a logic error, a permissions error, or a path error.

1: My first suggestion would be to test the $model[$i]->image value to make sure it holds what you expect, and then output $image[$i] right before you try to unlink it. Check that a file actually exists at that exact path before trying to delete it; I suspect this is the most likely case. Perhaps $model[$i]->image stores the image path as a web path rather than as a filesystem path, so unlink is looking in the wrong place?
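Before changing anything else, it can help to dump the stored value and test both interpretations of the path. A rough debugging sketch along those lines (the webroot mapping is only an assumption about where the files actually live on disk):
foreach ($_POST['delete'] as $i => $items) {
    $model[$i] = Pages::model()->findByPk((int)$items);
    if (is_null($model[$i])) {
        continue;
    }

    $image[$i] = $model[$i]->image;
    var_dump($image[$i]);                // what is actually stored in the column?
    var_dump(file_exists($image[$i]));   // does that exact path exist on disk?

    // If the column stores a web path like "/uploads/foo.png", map it to a
    // filesystem path first (Yii 1.x webroot alias):
    $fsPath = Yii::getPathOfAlias('webroot') . $image[$i];
    var_dump(file_exists($fsPath));
}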
2: I would highly recommend reversing the order of your deletions.
Right now, the database row is deleted first, then the image. If the delete process fails halfway through, then you've lost your link to the image that needs deleting.
If you do this instead:
if (unlink($image[$i])) {
    $model[$i]->delete();
} else {
    die('Unable to delete Page Image.');
}
This allows you to only delete the database entry if the image is successfully deleted, which should help prevent unattached images from floating around your file system. If the delete process fails, you still have all the info needed to try deleting it again.

Related

PHP connection_aborted() and/or register_shutdown_function work intermittently

I have a PHP script that outputs a file to the user (as a download) and also records what the user is downloading.
The basic structure is this:
set_time_limit(0);
ignore_user_abort(true);
register_shutdown_function('shutdown_fn'); // as a fail-safe (I think)

// some other code here
// do some mysql queries

while (!feof($fh) && !connection_aborted()) {
    echo fread(....);
    ob_flush();
    ob_end_flush();
    sleep(1);
}
fclose($fh);

// do some more mysql queries here and set a boolean to track if it was done successfully

function shutdown_fn() {
    // check boolean to see if queries failed; if so, do them here
}
The above code seems to work just fine 99% of the time. However, there are some instances (the other 1%) when the second set of queries doesn't execute at all. I have no idea why. The files being sent to the user range from very small to very large, and in both cases they work just fine, so I can't see how a large file (or a small file) would be breaking the code.
Any thoughts? I hope I have explained myself well enough.
I would need to see some more code, like the opening/reading of the file, to help you further. But if you really want to be sure and not depend on the shutdown_fn() function alone, why not also call it yourself at the end of the script? Reset the boolean in shutdown_fn() so that when the actual shutdown is triggered, your SQL queries are not run twice.
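A minimal sketch of that idea, assuming a flag such as $queries_done (the name is made up) guards the bookkeeping queries:
$queries_done = false;

function shutdown_fn() {
    global $queries_done;
    if ($queries_done) {
        return;            // already handled at the normal end of the script
    }
    $queries_done = true;  // make sure the queries never run twice
    // ... run the second set of mysql queries here ...
}

register_shutdown_function('shutdown_fn');

// ... stream the file to the user as before ...

// normal end of the script: do the bookkeeping explicitly
shutdown_fn();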

File is uploaded twice when the user clicks upload and processing takes a long time

I am trying to upload an Excel sheet, process each row, and store it in my DB. Unfortunately this is not working when I upload a larger data set: the same file is uploaded twice. Here is the code snippet:
ignore_user_abort(true);

$excelSheetReader = new Spreadsheet_Excel_Reader();
$excelSheetReader->read($_FILES['bulk_data']['tmp_name']);
$sheets = $excelSheetReader->sheets;
if (count($sheets) > 0) {
    $sheets = $sheets[0];
}
if ($sheets != NULL) {
    for ($x = 1; $x <= $sheets['numRows']; $x++) {
        set_time_limit(0);
        // process each row
    }
}
The first possible cause to check is whether the users actually are uploading the file twice. :) Make sure they aren't accidentally submitting it multiple times, for example by clicking "submit" twice.
If user error isn't the problem, then the next thing to look at is the request type. What kind of request are you using to upload the file: PUT or POST? A PUT is idempotent, which means the user's browser can repeat the request automatically if it hits an error, like a timeout. A POST, however, will never be retried without asking the user first.
If you're using a PUT, I'd suggest that you try changing the request type to POST and see if that helps.
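If you want a server-side guard against accidental double submissions as well, one possibility (a sketch only; the session key and the ten-minute window are arbitrary) is to fingerprint the uploaded file and skip repeats:
session_start();

// Fingerprint the uploaded file and refuse to process the same one twice in a row.
$fingerprint = md5_file($_FILES['bulk_data']['tmp_name']);

if (isset($_SESSION['last_upload'])
    && $_SESSION['last_upload']['hash'] === $fingerprint
    && (time() - $_SESSION['last_upload']['time']) < 600) {
    exit('This file was already processed a few minutes ago.');
}

$_SESSION['last_upload'] = array('hash' => $fingerprint, 'time' => time());

// ... read the spreadsheet and process the rows as before ...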

How do I get a PHP function to continue after return?

I have a problem where I call a PHP function on page load - the function checks to see if a file exists; if it does, it returns the filename, and if it doesn't, it runs a script which is fairly resource-intensive and takes time: converting a waveform image from an audio file. The problem is the audio files are large, so creating the image can take some time, and if the audio file doesn't have this image file associated with it, the page load takes as long as the process does.
What I'm after is for this function to return a placeholder image if one doesn't exist, but carry on with the process after the page is loaded - or in the background. So in theory when the page is reloaded at a later date the correct image will be there.
I can get the return of the placeholder image currently but then the process stops and the image doesn't get generated. Here's what I have so far:
function example($file_path, $file_name) {
    if ($file_path) {
        if (file_exists("/path/to/folder/{$file_name}.png")) {
            return "/path/to/folder/{$file_name}.png";
        }
        if (!file_exists("/path/to/folder/{$file_name}.png")) {
            return "/path/to/folder/processing.png";
            // some stuff in here (generate the waveform image)
            return $new_image;
        }
    }
    return FALSE;
}
As you can see, this just stops when the file doesn't exist, but I want the stuff in there to continue in the background. Is it possible, or do I need a different approach, like a cron job or something? Any help appreciated.
You might try a queuing system like resque https://github.com/chrisboulton/php-resque
You can then generate a job that processes the information, and return quite quickly with the "processing" image.
With this approach you won't know when it is finished though.
In my experience this is still easier than arguing with the operations guys to compile php with multi threading support.
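For reference, a minimal php-resque sketch could look roughly like this; the queue name, job class and argument keys are invented, so check the project's README for the exact API:
require 'vendor/autoload.php';

// Enqueue the slow conversion instead of doing it during the page request.
Resque::setBackend('localhost:6379');
Resque::enqueue('waveforms', 'Waveform_Job', array(
    'audio_file' => $file_path,
    'image_name' => $file_name,
));

// A worker process would then pick the job up via a class like this:
class Waveform_Job
{
    public function perform()
    {
        // generate the waveform PNG from $this->args['audio_file'] and save it
        // as "/path/to/folder/{$this->args['image_name']}.png"
    }
}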
I'd do it with AJAX. If the image is found, just put it there.
Otherwise, put the placeholder, and add a JS flag with data to load the waveform image.
No conversion happens in the PHP code that generates the HTML document. Instead, you have another request handler for the requests coming from JS, which performs the conversion with the supplied data.
The data created during HTML document generation is passed to JS, which uses it to send a request for the conversion. While JS waits for the response you show the loading state, and when the response comes you swap it into the placeholder.
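As a rough illustration of that split (file names and parameters are invented), the page only ever emits the placeholder, and a separate endpoint such as convert.php does the slow work when JS calls it:
// convert.php - hypothetical endpoint called from JS after the page has loaded
$file_path = isset($_POST['file_path']) ? $_POST['file_path'] : '';
$file_name = isset($_POST['file_name']) ? basename($_POST['file_name']) : '';

$target = "/path/to/folder/{$file_name}.png";

if ($file_path && !file_exists($target)) {
    // ... run the slow waveform generation into $target here ...
}

header('Content-Type: application/json');
echo json_encode(array('image' => $target));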
If you're running on FastCGI / FPM you could consider doing the following:
1. You put a regular <img> tag with the src attribute pointing to your script.
2. If your script needs to regenerate the image, you make the browser redirect to a processing image.
3. If the image is ready, you redirect to the created image (you could do an AJAX poll on the page as well).
How to do step 2?
Normally, the browser has to wait for your script to end before performing a render or redirect; but FastCGI (PHP-FPM) has a special function for this: fastcgi_finish_request. It's largely undocumented, but its use is simple:
if ($need_to_process) {
    header('Location: /path/to/processing.png');
    fastcgi_finish_request();
    // do processing here
} else {
    header('Location: /path/to/final_image.png');
}
Alternative
You can apply it to your existing process as well if you have a template that you can immediately render just before doing fastcgi_finish_request().
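Applied to the waveform example, that might look something like this (render_page() and generate_waveform() are stand-ins for whatever template and conversion code you already have):
$image = "/path/to/folder/{$file_name}.png";

if (file_exists($image)) {
    render_page($image);                           // image already there, nothing else to do
} else {
    render_page("/path/to/folder/processing.png"); // show the placeholder immediately
    fastcgi_finish_request();                      // the browser gets the response now
    generate_waveform($file_path, $image);         // slow part continues server-side
}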
Yet another alternative
Use a task scheduler like Gearman.
You can use try and finally (available since PHP 5.5):
try {
    return "hello world";
} finally {
    // do something
}
I am not able to comment because my reputation is below 50, but I wanted to note something on mohammadhasan's answer. It seems to work, but avoid having a return statement in both the try and the finally block:
try {
    return "hello world";
} finally {
    // do not put return here
}
Example:
function runner() {
    try {
        return "I am the trial runner";
    } finally {
        return "I am the default runner";
    }
}
echo runner();
will only show "I am the default runner", because the return in the finally block overrides the one in the try block.

How can I run a php script exactly once - No sessions

I have the following question: how can I run a php script only once? Before people start to reply that this is indeed a similar or duplicate question, please continue reading...
The situation is as follows, I'm currently writing my own MVC Framework and I've come up with a module based system so I can easily add new functionality to my framework. In order to do so, I created a /ROOT/modules directory in which one could add the new modules.
So as you can imagine, the script needs to read the directory, read all the PHP files, and parse them so it can execute the new functionality, but it has to do this for every browser request. That makes the task roughly O(nAmountOfRequests * nAmountOfModules), which is rather big on websites with a large number of requests every second.
Then I figured, what if I introduce a session variable like $_SESSION['modulesLoaded'] and simply check whether it's set? This would reduce the load to O(nUniqueAmountOfRequests * nAmountOfModules), but it is still a large Big O if the only thing I want to do is read the directory once.
What I have now is the following:
/** Load the modules */
require_once(ROOT . DIRECTORY_SEPARATOR . 'modules' . DIRECTORY_SEPARATOR . 'module_bootloader.php');
Which consists of the following code:
<?php
// TODO: Make sure that the foreach only executes once for all requests instead of for every request.
if (!array_key_exists('modulesLoaded', $_SESSION)) {
    foreach (glob('*.php') as $module) {
        require_once($module);
    }
    $_SESSION['modulesLoaded'] = '1';
}
So now the question: is there a solution, like a superglobal variable, that I can access and that exists across all requests, so that instead of the previous Big Os I end up with just O(nAmountOfModules)? Or is there another way of easily reading the module files only once?
Something like:
if (isFirstRequest) {
    foreach (glob('*.php') as $module) {
        require_once($module);
    }
}
In its most basic form, if you want to run it once, and only once (per installation, not per user), have your intensive script change something in the server state (add a file, change a file, change a record in a database), then check against that every time a request to run it is issued.
If you find a match, it would mean the script was already run, and you can continue with the process without having to run it again.
When the script is called, lock on a file; at the end of the script, delete the file. That way it only runs once, and the file, no longer needed, vanishes into nirvana.
This naturally works the other way round, too:
<?php
$checkfile = __DIR__ . '/.checkfile';

clearstatcache(false, $checkfile);

if (is_file($checkfile)) {
    return; // script did run already
}

touch($checkfile);

// run the rest of your script.
Just cache the array to a file and, when you upload new modules, delete the file. It will be recreated and then you're all set again.
// If the $cache file does not exist or unserialize fails, rebuild it and save it
if (!is_file($cache) or (($cached = unserialize(file_get_contents($cache))) === false)) {
    // rebuild your array here into $cached
    $cached = call_user_func(function () {
        // rebuild your array here and return it
    });
    // store the $cached data into the $cache file (serialized, so it can be unserialized above)
    file_put_contents($cache, serialize($cached), LOCK_EX);
}
// Now you have a $cache file that holds your cached data
// Keep using the $cached variable now, as it should hold your data
This should do it.
PS: I'm currently rewriting my own framework and do the same thing to store such data. You could also use a SQLite DB to store all such data your framework needs but make sure to test performance and see if it fits your needs. With proper indexes, SQLite is fast.
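If you try the SQLite route, a bare-bones PDO sketch (requires pdo_sqlite; the table name and cache location are arbitrary) could look like this:
// Cache the module list in a small SQLite database instead of a serialized file.
$db = new PDO('sqlite:' . ROOT . DIRECTORY_SEPARATOR . 'cache' . DIRECTORY_SEPARATOR . 'framework.sqlite');
$db->exec('CREATE TABLE IF NOT EXISTS module_cache (name TEXT PRIMARY KEY)');

$modules = $db->query('SELECT name FROM module_cache')->fetchAll(PDO::FETCH_COLUMN);

if (empty($modules)) {
    // Cache is empty: scan the directory once and remember the result.
    $modules = glob(ROOT . DIRECTORY_SEPARATOR . 'modules' . DIRECTORY_SEPARATOR . '*.php');
    $insert  = $db->prepare('INSERT INTO module_cache (name) VALUES (?)');
    foreach ($modules as $module) {
        $insert->execute(array($module));
    }
}

foreach ($modules as $module) {
    require_once($module);
}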

Delete file from second server via first server via php

I have two servers. I want to delete a file on the second server via the first server.
For example:
first-server.com
second-server.com
I have made two PHP files - one on the first server and one on the second server.
The file on the first server contains:
files.php
while ($file = mysql_fetch_array($files)) {
    echo $file['file_name'];
    echo 'Delete';
}
The file on the second server contains:
delete.php
if ($_GET['file']) {
    if (file_exists($_GET['file'])) {
        unlink($_GET['file']);
        // file deleted!
    }
}
Now this works, but I want to do this job without redirecting me or the visitor to the second server.
For example: AJAX or cURL or something like that. What is the best way to do that?
Edit.
The code above is just a test; these are not my real files. Please help me find a way to process the delete request without redirecting to the second server's PHP file.
I think a simple file_get_contents is enough:
File on first server:
$result = file_get_contents('http://second-server.com/delete.php?file=text.txt&some_security_token=asd');
// From $result you will know what the result was on the other server
File on the second server (delete.php):
if ($_GET['some_security_token'] == "asd") {
    if (file_exists($_GET['file'])) {
        if (unlink($_GET['file'])) {
            // File deleted, we are cool
            echo 1;
        } else {
            // File deletion failed
            echo 0;
        }
    } else {
        // File doesn't exist
        echo -1;
    }
} else {
    // bad token
    echo -2;
}
This way the first server calls the second server at the script level, so you can check the parameters before making the request. The second server sends back error/success codes, so you can handle them on the first server:
1 - success
0 - failed deletion
-1 - file doesn't exist
-2 - bad security token
I have not included a way to create a token that both servers know. You can hash the file name with some secret value for a start, but you have to make it expensive to guess. I just want to point out that you need this kind of security to make it safer. You also have to protect the file system from deletion of files that are important to the second server - for example, only allow deletion of files in a specific folder.
You could use cURL the same way for this. But always try to return info to first-server.com about the outcome of the process on second-server.com.
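A rough sketch of those two points - a shared-secret token plus confining deletes to one directory. The $secret value and the uploads path are placeholders, and hash_equals() needs PHP 5.6+:
// First server: build the signed request.
$secret = 'change-me';
$file   = 'text.txt';
$token  = hash_hmac('sha256', $file, $secret);
$result = file_get_contents(
    'http://second-server.com/delete.php?file=' . urlencode($file) . '&token=' . $token
);

// Second server (delete.php): verify the token and only touch one directory.
$secret  = 'change-me';
$file    = basename($_GET['file']);               // strip any path components
$allowed = hash_hmac('sha256', $file, $secret);
if (!hash_equals($allowed, (string)$_GET['token'])) {
    exit('-2');                                    // bad token
}
$path = '/var/www/uploads/' . $file;               // placeholder directory
if (is_file($path)) {
    echo unlink($path) ? '1' : '0';
} else {
    echo '-1';
}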
unset unsets a variable, it doesn't have anything to do with files.
You're looking for unlink.
BTW, you should do some serious validation on what you're going to unlink. Just blindly accepting anything in the URL can have serious consequences.
http://second_server.com/delete.php?file=delete.php
Delete file
<?php
if ($foo = $_GET['file']) {
    echo "<img src=\"http://second_server.com/delete.php?file=$foo\" style=\"display:none;\">"
       . "<script>alert('deleted');</script>";
}
?>
First of all, you want a security code or token or the like, so that unauthorised people do not delete files from your server.
while ($file = mysql_fetch_array($files)) {
    echo $file['file_name'];
    echo 'Delete';
}
and in first_server.com/delete.php, put this:
file_get_contents('http://second-server.com/delete.php?file=' . urlencode($_GET['file']) . '&securitycode=thisisasecuritycode');
