PHP script that deletes itself after completion

There's a problem that I'm currently investigating: after a coworker left, some files that he created one night, basically all his work on a completed project that the boss hasn't paid him for, got deleted. From what I know, all access credentials have been changed.
Is it possible to do this by setting up a file to do the deletion task and then delete the file in question? Or something similar that would change the code after the task has been done? Is this untraceable? (I'm thinking he could have cleverly disguised the request as a normal request; I have skimmed through the code base and through the raw access logs and found nothing.)

It's impossible to tell whether this is what actually happened or not, but setting up a mechanism that deletes files is trivial.
This works for me:
<?php // index.php
unlink(__FILE__); // the script deletes its own file
It would be a piece of cake to set up a script that, if given a certain GET variable for example, would delete itself and a number of other files.
Except for the server access logs, I'm not aware of a way to trace this - however, depending on your OS and file system, an undelete utility may be able to recover the files.
It has already been said in the comments how to prevent this - using centralized source control, and backups. (And of course paying your developers - although this kind of stuff can happen to anyone.)

Is is possible to do this by setting up a file to do the deletion task
and then delete the file in question?
Yes, it is. He could have left an innocuous-looking PHP file on the server which, when accessed over the web later, would give him shell access. Getting this file to self-delete when he is done is possible.
Create a php file with the following in it:
<?php
if (isset($_GET['vanish']) && $_GET['vanish'] == 'y') {
    echo "You wouldn't find me the next time you look!";
    unlink(__FILE__); // delete this script itself
} else {
    echo "I can self destruct ... generally";
}
?>
Put it on your server and navigate to it. Then navigate to it again with a "vanish=y" query-string argument and see what happens.

Related

How do I delete a subdirectory created by rtmp?

My rtmp-module settings are as follows:
hls_nested on;
hls_continuous on; // once set, this seems to disable hls_cleanup
I just want to delete this subdirectory after the live stream is over, but the problem is that the subdirectory's permissions are too restrictive for PHP to do anything about it. What can I do?
I did some searching and did not find a solution. If I set hls_continuous off, then hls_cleanup will get the job done, but of course that is not what I want.
If you cannot use the hls_cleanup option, then I think the only way is to use exec_publish_done. It will automatically execute any command or script when the stream ends gracefully:
exec_publish_done /path/to/cleanup-script;
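For reference, a sketch of what this could look like in nginx.conf; the application name, the HLS path, and the cleanup script location are assumptions, not taken from the question:

```nginx
rtmp {
    server {
        listen 1935;
        application live {
            live on;
            hls on;
            hls_path /tmp/hls;
            hls_nested on;
            hls_continuous on;
            # When the publisher stops gracefully, run a cleanup script;
            # nginx-rtmp substitutes $name with the stream name
            exec_publish_done /usr/local/bin/hls_cleanup.sh $name;
        }
    }
}
```

The hypothetical hls_cleanup.sh could be as simple as `rm -rf "/tmp/hls/$1"`. Because the script runs as the nginx worker user, which created the HLS subdirectory in the first place, it sidesteps the PHP permission problem described above.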
The wiki explains it too:
https://github.com/dreamsxin/nginx-rtmp-wiki/blob/master/Directives.md#exec_play_done

What is the PHP manual talking about with clearstatcache()?

You should also note that PHP doesn't cache information about non-existent files. So, if you call file_exists() on a file that doesn't exist, it will return false until you create the file. If you create the file, it will return true even if you then delete the file. However unlink() clears the cache automatically.
Source: https://www.php.net/manual/en/function.clearstatcache.php
I've read this numerous times but simply cannot make any sense of it. What is the actual information it's trying to convey?
To me, it sounds as if it's contradicting itself. First it says that PHP doesn't cache information about non-existent files. Then it goes on to state that it will return true even if you delete the file. But also that unlink() clears the cache automatically.
Is it referring to the file being deleted outside of PHP? Isn't that the only thing it can mean? But the way it's put is so incredibly confusing, ambiguous and weird. Why even mention that file_exists() will return false until you create the file? It's like saying that water will remain wet even if you clap your hands.
Is it actually saying, in a very round-about way, that I have to always run clearstatcache() before file_exists() unless I want a potentially lying response because a non-PHP script/program has deleted the file in question after the script was launched?
I swear I've spent half my life just re-reading cryptic paragraphs like this because they just don't seem to be written by a human being. I've many, many times had to ask questions like this about small parts of various manuals, and even then, who knows if your interpretations are correct?
I'd like to first address your last paragraph:
I swear I've spent half my life just re-reading cryptic paragraphs like this because they just don't seem to be written by a human being.
Quite the opposite: like all human beings, the people who contribute to the PHP manual are not perfect, and make mistakes. It's worth stressing that in this case these people are not professional writers being paid to write the text, they are volunteers who have spent their free time working on it, and yet the result is better than many manuals I've seen for paid software. If there are parts you think could be improved, I encourage you to join that effort.
Now, onto the actual question. Before going onto the part you quote, let's look at the first sentence on that page:
When you use stat(), lstat(), or any of the other functions listed in the affected functions list (below), PHP caches the information those functions return in order to provide faster performance.
What this is saying is that when PHP asks the system about the status of a file (permissions, modification times, etc), it stores the answer in a cache. Next time you ask about the same file, it looks in that cache rather than asking the system again.
Now, onto the part you quoted:
You should also note that PHP doesn't cache information about non-existent files.
Straight-forward enough: if PHP asks the system about the status of a file, and the answer is "it doesn't exist", PHP does not store that answer in its cache.
So, if you call file_exists() on a file that doesn't exist, it will return false until you create the file.
The first time you call file_exists() for a file, PHP will ask the system; if the system says it doesn't exist, and you call file_exists() again, PHP will ask the system again. As soon as the file starts existing, a call to file_exists() will return true.
Put another way, file_exists() is guaranteed not to return false if the file exists at the time you call it.
If you create the file, it will return true even if you then delete the file.
This is the point of the paragraph: as soon as the system says "yes, the file exists", PHP will store the information about it in its cache. If you then call file_exists() again, PHP will not ask the system; it will assume that it still exists, and return true.
In other words, file_exists() is not guaranteed to return true if the file doesn't exist, because it might have previously existed, and had information filed in the cache.
However unlink() clears the cache automatically.
As you guessed, all of the above is about you monitoring if something else has created or deleted the file. This is just confirming that if you delete it from within PHP itself, PHP knows that any information it had cached about that file is now irrelevant, and discards it.
Perhaps a different way to word this would be to give a scenario: Imagine you have a piece of software that creates a temporary file while it's running; you want to monitor when it is created, and when it is deleted. If you write a loop which repeatedly calls file_exists(), it will start returning true as soon as the software creates the file, without any delay or false negatives; however, it will then carry on returning true, even after the software deletes the file. In order to see when it is deleted, you need to additionally call clearstatcache() on each iteration of the loop, so that PHP asks the system every time.
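To make that scenario concrete, here is a minimal sketch; the helper name file_exists_fresh is my own, not from the manual:

```php
<?php
// A cache-bypassing existence check: forget any cached stat data for
// the path, then ask the filesystem again.
function file_exists_fresh(string $path): bool
{
    clearstatcache(true, $path); // drop cached stat info for this path
    return file_exists($path);   // PHP now queries the system afresh
}

// Demonstration of the monitoring scenario described above:
$tmp = tempnam(sys_get_temp_dir(), 'demo'); // tempnam() creates the file
var_dump(file_exists_fresh($tmp));          // bool(true)
unlink($tmp);                               // unlink() clears the cache itself
var_dump(file_exists_fresh($tmp));          // bool(false)
```

Without the clearstatcache() call, the second check could still return true if any earlier stat-family call had populated the cache while the file existed.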

How can I edit a live webpage without causing chaos for the page's visitors during my editing session?

Pardon me if these very basic questions have been asked and answered many times, but several hours of searching turned up nothing pertinent.
(1) After "going live" with a webpage, a developer wants to make some changes to the page, but does not want those changes to "go live" until he has completed a cycle of PROTOTYPING and TESTING the changes (you know, basic SDLC). How does he do this if the website is live?
(2) An even more basic rephrasing of the question: While I am spending, say, 30 minutes making updates to an existing, live webpage, it appears that any visitor to that webpage during that time will observe every incremental change (including inadvertent blunders, typos, etc.) that I am making IN REAL TIME. I must be missing something really obvious here, so forgive me! How do I make changes to a currently-live webpage without causing such chaos during my edit sessions?
I am seeing so many bad comments and one bad answer. You never work on the live pages. What if you break it? The fact that you are asking this question tells me that you are not good enough not to make a mistake.
There is really only one good way to do this in my opinion.
Create a sub domain dev.whatever.com and put your current live site there. Work on your local project to start, when you are happy with that result, then move it to your dev site to make sure there are no issues, do it all there first, then when you are happy, you move to live.
Why not just save a copy of the page with a different file name. Upload it and work off that until it works how you want. No one will know the page exists except you so no one will see any issues. Once you are happy with it just use that pages code.
I think working in a local repository and learning version control would be best. This way, in case you do happen to completely break your code, you can revert to a previous working version without any worries. You can also use a local server; that way only you can see the current updates to your code. Once you are happy with everything and all the code works, you can push the changes to your remote server.
version control: https://git-scm.com/
Further to my comment, here is a simple page-caching script that uses a define to switch the page between cached mode and dynamic mode. Toggle CACHESITE to false to return it to live rendering of the page. This would need to be expanded to bypass the cache if you are an admin (or some other trigger that you decide), so you can see the changes being made while the user sees the cached page:
<?php
// If true, serve this page from cache
define("CACHESITE", true);

function CachePage($content = '', $saveto = 'index.php')
{
    if (CACHESITE) {
        // If you have a define for the site root, use it instead
        $dir = $_SERVER['DOCUMENT_ROOT'].'/temp/';
        $cached = $dir.$saveto;
        // A cached copy exists: output it and skip the fresh content
        if (is_file($cached)) {
            include($cached);
            return '';
        }
        if (!is_dir($dir)) {
            mkdir($dir, 0755, true);
        }
        // Save the freshly generated content for later requests
        file_put_contents($cached, $content);
    }
    return $content;
}

// Whatever your content is, it needs to be in a string
// (use output buffering if you have include pages)
$test = 'test is the best'.rand();
echo CachePage($test, 'thispage.php');
?>
At any rate, there are lots of ways to do this sort of thing, but caching works relatively well.

Single download file upload mechanism

Sorry for the relatively vague title, I couldn't think of anything else.
So in short what I'm asking for is what would be an optimal (least resource intensive) way of creating a simple file upload service that would delete the file after the first download. Can be PHP or anything else (as long as it's relatively easy to implement). It's basically for streaming screenshots for a single user.
The first thing that comes to mind is simply doing a regular upload and then doing a readfile() followed by an unlink(). sendfile is obviously out of the question since then I don't have a way of executing code after the file has been transferred. But readfile() doesn't seem like such a good idea.
I wouldn't mind installing a separate daemon or something along those lines.
Pseudo-code:
1. Get the temporary path to the file from $_FILES['tmp_name'].
2. Move it to a non-guessable server location (as in uploads/file{random_numbers}.extension).
3. Store the information in a DB.
4. Upon visiting yoursite.tld/view.php?id={unique id that's <> file{random_numbers}}, run:
SELECT path FROM table WHERE token = 'UNIQUE ID ABOVE' AND downloaded = 0
4.1 IF there is a row in the DB, get the path and then set downloaded = 1 in the DB.
4.2 ELSE do nothing further.
5. INCLUDE the file in the page with a download header so that it gets downloaded.
6. Run a cron job every x minutes to clear out files that aren't needed anymore; cron won't be able to delete a file that's currently being transmitted to the user (as far as I know, as it would still be "in use").
Hopefully you'll be able to follow my logic and implement it as planned.
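A minimal PHP sketch of the SELECT/UPDATE/deliver flow above; the table and column names (downloads, token, path, downloaded) and the SQLite storage are assumptions for illustration:

```php
<?php
// Claim a one-time download: return the file path for an unused token,
// or null if the token is unknown or already used.
function claim_download(PDO $pdo, string $token): ?string
{
    $stmt = $pdo->prepare(
        'SELECT path FROM downloads WHERE token = ? AND downloaded = 0'
    );
    $stmt->execute([$token]);
    $row = $stmt->fetch(PDO::FETCH_ASSOC);
    if ($row === false) {
        return null; // unknown token, or downloaded once already
    }
    // Mark it used before streaming, so a second request fails fast
    $pdo->prepare('UPDATE downloads SET downloaded = 1 WHERE token = ?')
        ->execute([$token]);
    return $row['path'];
}

// Hypothetical usage in view.php: stream the file once, then remove it.
// $pdo = new PDO('sqlite:downloads.db');
// if (($path = claim_download($pdo, $_GET['id'] ?? '')) !== null) {
//     header('Content-Type: application/octet-stream');
//     header('Content-Disposition: attachment; filename="'.basename($path).'"');
//     readfile($path);
//     unlink($path); // delete after the first transfer
// } else {
//     http_response_code(404);
// }
```

Marking the row as downloaded before streaming keeps a second request from racing the transfer; the cron job then only has to sweep up files whose rows are already marked.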
If you wouldn't mind installing a separate daemon you can install cron. You might set it up so that it would remove outdated files every n minutes.

Tearing eyeballs out, PHPThumb class on shared hosting is failing

UPDATE & SOLUTION: Everyone, for whoever has this problem in the future, I figured out how to solve it. If you use the PHPThumb class, you MUST import the settings from the config file, as it will not do this otherwise. I discovered this by opening the testing file that Clint gave. To do so, insert this code after you define the object:
if (include_once($PATH . 'phpThumb.config.php')) {
    foreach ($PHPTHUMB_CONFIG as $key => $value) {
        $keyname = 'config_'.$key;
        $phpThumb->setParameter($keyname, $value);
    }
}
Thanks to the people who attempted to help, and thanks Clint for at least giving me somewhere to look.
Original Question:
Before I can no longer read this message due to my newfound blindness, I need help with something. And before you go further, I must warn you that I have to link to other sites to show you the problems, so get ready to have a few tabs open.
So I am using phpThumb to generate images for my gallery website. It was going along really well until I uploaded the script so that my business partner could start showing clients an alpha stage of it (I know it says beta).
http://speakwire.net/excitebeta2/?m=2
The problem becomes very obvious on the two gallery pages, which I happened to link to. The images are not being created at all. If you go into the admin panel, they seem to work, but that is merely a cache generated from my desktop. I have meticulously stopped at every step and even tried to manipulate the class code. I looked for other scripts but they did not help me, because they did not have what I needed. Because the code is proprietary though, I cannot share it. I bet you are thinking "Oh my farce god", but here is something you can look at - because I am able to replicate the same problem with the code I got before.
http://speakwire.net/phpthumbtest/
The second website has the EXACT same structure and code as:
http://mrphp.com.au/code/image-cache-using-phpthumb-and-modrewrite
The few exceptions are allowing the 100x100 parameters, but those are supposed to be changed, and I know that is not what is causing the error, because its very existence is optional and removing it only allows people to do naughty things. The second exception is something I added only after the error persisted: chmod(dirname($path), 0777); because for some weird reason mkdir won't give the folder 777 permissions.
The old image: http://speakwire.net/phpthumbtest/images/flowerupright.JPG
The new image: http://speakwire.net/phpthumbtest/thumbs/100x100/images/flowerupright.JPG
As seen in the new image, it is unable to write the file. This happens to be the fault of phpThumb, whether that be a lack of given parameters or the hosting not permitting it.
Which brings me to the point: the script works superbly on my desktop WAMP install but fails on the GoDaddy hosting. My business partner is going to open an account on the hosting we plan to move people to soon, but the problem still exists, and if it is happening here it can certainly happen there too, even though it won't be on GoDaddy's servers later.
The specific place where it is failing I will insert here, but for the rest you need to open up the mrphp.com.au site, as it is way too long to post here.
require('../phpthumb/phpthumb.class.php');
$phpThumb = new phpThumb();
$phpThumb->setSourceFilename($image);
$phpThumb->setParameter('w', $width);
$phpThumb->setParameter('h', $height);
$phpThumb->setParameter('f', substr($thumb, -3, 3)); // set the output format
//$phpThumb->setParameter('far', 'C'); // scale outside
//$phpThumb->setParameter('bg', 'FFFFFF'); // scale outside
if (!$phpThumb->GenerateThumbnail()) { // RETURNS FALSE FOR SOME REASON
    error('cannot generate thumbnail'); // And is called due to the failure.
}
I would love you long time whoever helps me with this, because I have spent essentially all my free time for the last few days, including time meant to be sleeping, trying to figure this out.
EDIT: http://speakwire.net/phpthumbtest/index2.php
I added this at Clint's suggestion; it seems ImageMagick isn't working. Could that really be the problem, and how would I fix it?
Sounds like a permissions issue. Make sure whatever folder you are writing to, apache has write access.
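One quick way to test that theory is a tiny diagnostic script run on the shared host; the helper name and the thumbs path are my own assumptions modeled on the question's thumbs/100x100/ structure:

```php
<?php
// Can the PHP (Apache) process actually write to the target directory?
function dir_writable(string $dir): bool
{
    return is_dir($dir) && is_writable($dir);
}

// Hypothetical usage on the shared host:
// var_dump(dir_writable($_SERVER['DOCUMENT_ROOT'].'/phpthumbtest/thumbs'));
// echo get_current_user(); // which user is PHP running as?
```

If this reports false on GoDaddy but true on the WAMP setup, the fix is adjusting ownership or permissions on the thumbs directory rather than anything in phpThumb itself.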
