Delete a file from the second server via the first server with PHP

I have two servers, and I want to delete a file from the second server via the first server.
For example:
first-server.com
second-server.com
I have made two PHP files, one on each server.
The file on the first server contains
files.php
while($file = mysql_fetch_array($files)){
    echo $file['file_name'];
    echo 'Delete';
}
The file on the second server contains
delete.php
if($_GET['file']){
    if(file_exists($_GET['file'])){
        unlink($_GET['file']);
        //file deleted!
    }
}
Now this works, but I want to do this job without redirecting me or the visitor to the second server.
For example with AJAX, cURL or something like that. What is the best way to do that?
Edit:
The code above is just a test, not my real files. Please help me find a way to process the delete request without redirecting to the second server's PHP file.

I think a simple file_get_contents is enough:
File on first server:
$result = file_get_contents('http://second-server.com/delete.php?file=text.txt&some_security_token=asd');
//From $result you will know what was the result on the other server
File on second server (delete.php):
if($_GET['some_security_token'] == "asd"){
    if(file_exists($_GET['file'])){
        if(unlink($_GET['file'])){
            //File deleted, we are cool
            echo 1;
        } else {
            //File deletion failed
            echo 0;
        }
    } else {
        //File doesn't exist
        echo -1;
    }
} else {
    //Bad token
    echo -2;
}
This way the first server contacts the second server at script level, so you can check the parameters before making the call, and the second server sends back error/success codes that you can handle on the first server:
1 - success
0 - failed deletion
-1 - file doesn't exist
-2 - bad security token
I do not include a way to create a token that both servers know. You could hash the file name with some key value for a start, but you have to make it expensive to guess. I just want to point out that you need this kind of security too, to make it safer. You also have to find a way to protect the file system from deleting files that are important to the second server, for example by only allowing deletion of files inside a specific folder.
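For illustration, here is a minimal sketch of what the second server's delete.php could look like with an HMAC-style token and a restricted upload folder; the key, folder name and parameter names are my own assumptions, not part of the answer:
<?php
// Sketch only: the key, folder and parameter names below are assumptions.
$shared_key = 'replace-with-a-long-random-secret';           // known to both servers
$file       = basename($_GET['file']);                       // strip any path components
$token      = isset($_GET['token']) ? $_GET['token'] : '';

// first-server.com would build the token the same way before calling this script:
// $token = hash_hmac('sha256', $file, $shared_key);
if (!hash_equals(hash_hmac('sha256', $file, $shared_key), $token)) {
    echo -2;                                                  // bad token
    exit;
}

$path = __DIR__ . '/uploads/' . $file;                        // only delete inside this folder
if (!file_exists($path)) {
    echo -1;
} elseif (unlink($path)) {
    echo 1;
} else {
    echo 0;
}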
You could use cURL the same way for this, but always try to return information to first-server.com about what happened on second-server.com.
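A cURL version of the same request could look roughly like this (the URL and parameters are the example ones from above):
$ch = curl_init('http://second-server.com/delete.php?file=text.txt&some_security_token=asd');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the response instead of printing it
curl_setopt($ch, CURLOPT_TIMEOUT, 10);           // don't let the first server hang forever
$result = curl_exec($ch);
curl_close($ch);
// $result now holds the 1 / 0 / -1 / -2 code echoed by the second server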

unset unsets a variable, it doesn't have anything to do with files.
You're looking for unlink.
BTW, you should do some serious validation on what you're going to unlink. Just blindly accepting anything in the URL can have serious consequences.
http://second_server.com/delete.php?file=delete.php
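For illustration, one way to guard against that (the folder name is only an example, not from the answer) is to resolve the requested path and refuse anything outside a dedicated directory:
$base = realpath(__DIR__ . '/uploads');                      // the only folder we allow deletes in
$path = realpath(__DIR__ . '/uploads/' . $_GET['file']);

// delete only if the resolved path really lives inside the uploads folder
if ($path !== false && strpos($path, $base . DIRECTORY_SEPARATOR) === 0 && is_file($path)) {
    unlink($path);
}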

Delete file
<?php
if ($foo = $_GET['file']) {
    echo "<img src=\"http://second_server.com/delete.php?file=$foo\" style=\"display:none;\"><script>alert('deleted');</script>";
}
?>

First of all, you want a security code or token or the like, so that unauthorised people do not delete files from your server.
while($file = mysql_fetch_array($files)){
    echo $file['file_name'];
    echo 'Delete';
}
and in first_server.com/delete.php, put this:
file_get_contents('http://second-server.com/delete.php?file=' . urlencode($_GET['file']) . '&securitycode=thisisasecuritycode');


PHP script to check whether a file exists on a remote server

I am having problems locating a PHP script that lets me obtain the contents of a txt file on a remote server and output it to a variable. Outputting something to a variable is not the hard part. It's the picking up and reading the contents of the file that's the hard part. Anyone have any ideas?
I have trawled the forum and can only locate a method that works locally. Not ideal as the target is remote.
The objective really is, how do I find out if a file exists on the remote server and output a status in html.
Ideas?
Assuming your remote server is accessible by HTTP or FTP, you can use file_exists():
if (file_exists("http://www.example.com/somefile.txt")) {
echo "Found it!;
}
or
if (file_exists("ftp:user:password#www.example.com/somefile.txt")) {
echo "Found it!;
}
Use this:
$url = 'http://php.net';
$file_headers = @get_headers($url);
if($file_headers[0] == 'HTTP/1.1 404 Not Found') {
    echo "URL does not exist";
} else {
    echo "URL exists";
}
Source: http://www.php.net/manual/en/function.file-exists.php#75064
You can try to use this code:
if (file_exists($path)) {
    echo "it exists";
} else {
    echo "it does not exist";
}
As you can see, $path is the path of your file. Of course you can write anything else instead of those echo statements.
Accessing files on other servers can be quite tricky! If you have access to the file via ftp, you can use ftp to fetch the file, for example with ftp_fget().
If you do not have access to the file-system via ssh, you only can check the response the server gives when requesting the file. If the server responds with an error 404, the file is either not existent or it is not accessible via http due to the server configuration.
You can check this through cURL; see this tutorial for a detailed explanation of obtaining the response code through cURL.
I know this is an old thread, but as Lars Ebert points out, checking for the existence of a file on a remote server can be tricky, so checking the server response, using cURL, is how I have been able to do it on our big travel site. Using file_exists() threw an error every time, but checking for a "200 OK" has proved quite successful. Here is the code we are using to check for images for our hotel listings page:
$media_url = curl_init("http://pathto/remote_file.png");
curl_setopt($media_url, CURLOPT_RETURNTRANSFER, true);   // return the body instead of printing it
$media_img = curl_exec($media_url);
$server_response = curl_getinfo($media_url, CURLINFO_HTTP_CODE);

if($server_response != 200){
    echo "pathto/graphics/backup_image.png";
}else{
    echo "http://pathto/remote_file.png";
}
Where "http://pathto/remote_file.png" is the remote image we seek, but we need to know whether it is really there. And "pathto/graphics/backup_image.png" is what we display if the remote image does not exist.
I know it's awfully verbose, compared to file_exists(), but it's also more accurate, at least so far.
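If downloading the whole image just to check it is a concern, a possible refinement (my assumption, not part of the original answer) is to send a HEAD request with CURLOPT_NOBODY so only the status code comes back:
$ch = curl_init("http://pathto/remote_file.png");
curl_setopt($ch, CURLOPT_NOBODY, true);           // HEAD request: headers only, no body
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
$code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

echo ($code == 200) ? "http://pathto/remote_file.png" : "pathto/graphics/backup_image.png";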

File is uploaded twice when the user clicks upload and processing takes a long time

I am trying to upload an Excel sheet, process each row and store it in my DB. Unfortunately this is not working when I upload a larger data set: the same file is uploaded twice. Here is the code snippet.
ignore_user_abort(true);
$excelSheetReader = new Spreadsheet_Excel_Reader();
$excelSheetReader->read($_FILES['bulk_data']['tmp_name']);
$sheets = $excelSheetReader->sheets;
if(count($sheets)>0){
    $sheets = $sheets[0];
}
if($sheets != NULL) {
    for ($x = 1; $x <= $sheets['numRows']; $x++) {
        set_time_limit(0);
        //process each row
    }
}
The first possible cause to check is if the users actually are uploading the files twice. :) You should make sure users aren't accidentally uploading the file multiple times, for example by clicking "submit" twice.
If user error isn't the problem, then the next thing to look at is request type. What kind of request are you using to upload the file: PUT or a POST? A PUT is idempotent, which means the user's browser can repeat the request automatically if it hits an error, like timeout. POST, however, will never be retried without asking the user first.
If you're using a PUT, I'd suggest that you try changing the request type to POST and see if that helps.
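Independently of PUT vs. POST, one common way to rule out the accidental double-submit case is a one-time token stored in the session; this is only a sketch, and the field and variable names are illustrative, not from the question:
<?php
session_start();

if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    // reject the request if the token is missing, wrong, or already used
    if (!isset($_POST['upload_token'], $_SESSION['upload_token'])
        || $_POST['upload_token'] !== $_SESSION['upload_token']) {
        die('Duplicate or expired upload.');
    }
    unset($_SESSION['upload_token']);                 // the token is single-use
    // ... process $_FILES['bulk_data'] here ...
} else {
    $_SESSION['upload_token'] = bin2hex(openssl_random_pseudo_bytes(16));
    echo '<form method="post" enctype="multipart/form-data">'
       . '<input type="hidden" name="upload_token" value="' . $_SESSION['upload_token'] . '">'
       . '<input type="file" name="bulk_data"> <input type="submit" value="Upload">'
       . '</form>';
}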

PHP unlink doesn't work

I have a multi-delete feature in one of my CMS solutions, and I have following code:
public function actionDelete() {
    if(Yii::app()->request->isPostRequest) {
        if(isset($_POST['submit'])) {
            if(isset($_POST['delete']) && (sizeof($_POST['delete']))) {
                foreach($_POST['delete'] as $i => $items) {
                    $model[$i] = Pages::model()->findByPk((int)$items);
                    if(!is_null($model[$i])) {
                        $image[$i] = $model[$i]->image;
                        if($model[$i]->delete()) {
                            if(!unlink($image[$i])) {
                                die('Unable to delete Page Image.');
                            }
                        }
                    }
                }
            }
        }
    }
    $this->redirect('/admin/pages');
}
This is the action of a Yii controller, and on EVERY page there is a value filled in the "image" field/column.
After I invoke this action with POST data, it actually deletes the records from the database table, but it does not remove the images from the file system, and the script never reaches this point: die('Unable to delete Page Image.');
Is it possible that PHP strips and ignores the unlink function, mostly on production / live servers?
Is it possible that PHP strips and ignores the unlink function, mostly on production / live servers?
No, absolutely not (unless they've disabled that function but that should throw an error). It's either a logic error, permissions error, or pathing error.
1: My first suggestion would be to test the $model[$i]->image value, to make sure it contains what you expect. Then output $image[$i] right before you try to unlink it, and make sure you're not trying to delete a file which does not exist (unlink() on a missing path simply fails). Test the existence of the file before trying to delete it; I suspect this is a very likely cause. Perhaps $model[$i]->image stores the image path as a web path, and not as a file-system path?
2: I would highly recommend reversing the order of your deletions.
Right now, the database row is deleted first, then the image. If the delete process fails halfway through, then you've lost your link to the image that needs deleting.
If you do this instead:
if(unlink($image[$i])) {
    $model[$i]->delete();
} else {
    die('Unable to delete Page Image.');
}
This allows you to only delete the database entry if the image is successfully deleted, which should help prevent unattached images from floating around your file system. If the delete process fails, you still have all the info needed to try deleting it again.

Tracking changes in text file with PHP

I have a PHP script that has to reload a page on the client (server push) when something specific happens on the server. So I have to listen for changes. My idea is to have a text file that contains the number of page loads for the current page. So I would like to monitor the file and as soon as it is modified, to use server push in order to update the content on the client. The question is how to track the file for changes in PHP?
You could do something like:
<?php
while(true){
    clearstatcache();                  // stat() results are cached between calls
    $file = stat('/file');
    if($file['mtime'] == time()){
        // ... do something here ...
    }
    sleep(1);
}
This will continuously look for a change in the modified time of a file every second. If you don't constrain it you could kill your disk IO and may need to adjust your ulimit.
This will check your file for a change:
<?php
$current_contents = "";
function checkForChange($filepath) {
    global $current_contents;
    $new_contents = file_get_contents($filepath);
    if (strcmp($new_contents, $current_contents)) {
        $current_contents = $new_contents;
        return true;
    }
    return false;
}
But that will not solve your problem. The PHP file that serves the client finishes executing before the rendered HTML is sent to the client. That client will need to call back to some PHP file to check for a change, and since that is also an HTTP request, the file will finish executing and forget anything in memory.
In order to properly solve this, you'll probably have to back off the idea of checking a file. Either the server needs to know when and how to contact currently connected clients, or those clients need to poll a lightweight service at a regular interval.
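As a sketch of that lightweight polling endpoint (the file name and JSON shape are my assumptions), the script the client polls at an interval only needs to expose the counter file's current state:
<?php
// poll.php: the client requests this repeatedly and reloads when the values change
header('Content-Type: application/json');
$counter_file = __DIR__ . '/pageloads.txt';        // the page-load counter file from the question
echo json_encode(array(
    'count' => (int) @file_get_contents($counter_file),
    'mtime' => (int) @filemtime($counter_file),
));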
This is sort of hacky but what about creating a cron job that sucks in the page, stores it in a scope or table, and then simply compares it every 30 seconds?

How to update variables while the page is in an infinite loop in PHP?

I have something like this :
<?php
$perform_check = 1; #Checks data with ID : 1
while(true)
{
#code
}
?>
But some data must be updated during this process. Is it possible to update this data from another script?
I tried something like this :
index.php
<?php
setcookie("data", 19, time()+3600);
?>
and
loop.php
<?php
while(true)
{
    if($perform_check != $_COOKIE['data']) $perform_check = $_COOKIE['data'];
    #rest of code
    flush();
    usleep(300000);   // sleep() only accepts whole seconds; 0.3 s needs usleep()
}
?>
But it doesn't work. I also tried $_SESSION but the page crashes on session_start().
Is it somehow possible?
Cookies are sent as a HTTP header when PHP is sending a response through the web server (for example Apache2).
All HTTP headers are sent before any output. If you output anything, headers are sent (including the set-cookie header) before the output.
After you flush() the first time you can no longer set cookies or other headers.
If you want a progress indicator or updates, you need to initiate the operation with JavaScript and poll at an interval. In the process running the loop, save the progress in shared memory, a file, or a database (in that order of preference), then read that data from the script that the JavaScript polls for progress/updates.
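A rough sketch of the file-based variant (the file name is an assumption): the looping process writes its progress after every step, and a separate script that JavaScript polls just echoes it back:
<?php
// inside the long-running loop
$total = 100;
for ($done = 1; $done <= $total; $done++) {
    // ... do one unit of work ...
    file_put_contents('/tmp/job_progress.txt', "$done/$total");
}

// progress.php, requested from JavaScript at an interval, only needs:
// echo file_get_contents('/tmp/job_progress.txt');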
You could use the existence of a file to flag the ending of the process. For example, create a lock file,
$lock_file = 'some_unique_name.lock';              // pick a unique name per job
fopen($lock_file, 'w') or die("can't open file");  // create the lock file
while (file_exists($lock_file)) {
    // ...
    doStuff();
    // ...
}
If the file is removed by some other process, it should terminate.
I think a while loop isn't doing any good here. You should look into a PHP WebSocket implementation, which lets you keep an open connection with your user. If you have that, you can manage things with listeners on both sides.
If you want the value to change while your code is looping, you need to check if the value has changed within the loop, not before.
A cookie or session will only work if the same user/browser is running both scripts. Writing to a file or database is the more usual approach.
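For example, a minimal file-based version of the question's setup could look like this (the file name is an assumption): index.php writes the value, and loop.php re-reads it on every pass instead of relying on a cookie:
<?php
// index.php (writes the new value)
file_put_contents('/tmp/perform_check.txt', '19');

<?php
// loop.php (re-reads the value on every pass)
$perform_check = 1;
while (true) {
    $new = (int) @file_get_contents('/tmp/perform_check.txt');
    if ($new !== 0 && $new !== $perform_check) {
        $perform_check = $new;
    }
    // ... rest of the loop ...
    sleep(1);
}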
I'm not quite sure what you really want to do here, but I guess you could do this:
<?php
$perform_check = 1; #Checks data with ID : 1
while(true) : ?>
    $.ajax({
        type: 'post',
        data: // your data to be passed to the ajax script
        url:  // the script wherein you want to run the query
        success: function(data) {
            if (data == test) {
                // if you have the data that you want, you could stop the loop
                <?php $perform_check = false; ?>
            }
        }
    });
<?php endwhile; ?>
Hope this helps.
If you need to check very often, you should use a cron job that calls the function you use for the check.
