How can I delete a file using a URL path?
I have
$file_with_path = "http://www.myweb.com/uploads/audio.mp3";
if (file_exists($file_with_path)) {
    unlink($file_with_path);
}
I don't use "/uploads/audio.mp3" or similar directory paths, for certain reasons.
Thanks in advance!
unlink tells the operating system to delete a given file. The OS identifies files by file system path - it does not interact with URLs in any way. URLs are translated to file system paths by the web server, which is an entirely different piece of software. While theoretically there is a way to tell a web server to delete a file (by sending an HTTP DELETE request), no web server is going to honor that by default - it would be way too insecure. It is relatively easy to control who can access the file system; it is very hard to control who can send requests to the web server.
In short, you will have to figure out what the file system path for the file is, and use unlink (and file_exists) with that path.
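For example, if the URL's path component maps directly onto your document root (an assumption; adjust the base directory if your setup differs), the translation could look like this:
<?php
// Sketch: translate the URL into a file system path, assuming the
// URL path maps straight onto the web server's document root.
$file_with_url = "http://www.myweb.com/uploads/audio.mp3";

// parse_url() extracts "/uploads/audio.mp3" from the full URL.
$url_path = parse_url($file_with_url, PHP_URL_PATH);

// Prepend the document root to get a path the OS understands.
$file_with_path = $_SERVER['DOCUMENT_ROOT'] . $url_path;

if (file_exists($file_with_path)) {
    unlink($file_with_path);
}
?>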
Related
I have a URL to a microservice, like api.service.com/generateRandomPdf, that returns a generated file.
I wish to use this URL in my PHP backend application in such a way that the file from the given URL is passed directly to the client, without storing the downloaded file in some /tmp directory, so that a call to mysite.com/download would be logically equivalent to api.service.com/generateRandomPdf. In other words - I don't want to download the file twice.
I think I could do it using .htaccess or rewrite rules, but I also need to set some auth tokens on that request, and the whole code is part of a Composer package, which kind of limits me, I suppose.
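Something along the lines of this sketch is what I'm after (the endpoint, the token header, and the filename are placeholders):
<?php
// Sketch: stream the remote file straight through to the client,
// without writing it to disk first.
$context = stream_context_create([
    'http' => [
        'method' => 'GET',
        'header' => "Authorization: Bearer PLACEHOLDER_TOKEN\r\n",
    ],
]);

$remote = fopen('https://api.service.com/generateRandomPdf', 'rb', false, $context);
if ($remote === false) {
    http_response_code(502);
    exit;
}

header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="generated.pdf"');

// fpassthru() copies the remote stream to the output, chunk by chunk.
fpassthru($remote);
fclose($remote);
?>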
Is it possible to send a file from a local URL?
I need to upload files in PHP from a local URL, e.g. I open the web page with:
www ... upload.php?url=c://...file.jpg,
From the url GET parameter, the script would get the file on the PC and upload it, without using an HTML form or anything else where I have to choose the file; just with the file URL.
Importantly, it can only work this way: it will not be possible to choose the file during the upload, it will only be a POST or GET with the local URL.
I researched and found nothing related; I'd appreciate any help.
I think it is impossible for the server to fetch a client-side file using its local file path.
But maybe you can use JavaScript and the FileSystemObject (FSO) to read the file, turn it into a stream, and POST it to the server.
Be aware that FSO needs high security permissions and may be disabled in users' browsers.
I am making a feature for my site so that users can upload files (any type).
In order to secure the upload form, I made a blacklist of non-accepted filetypes. But in order to protect my server (in case malicious scripts are uploaded in some way), I thought of tarring the uploaded files (using the Tar class) so that they are stored as .tar archives.
So if the user wants to download one, he will receive a .tar file.
My question is: is this secure enough? (Since the files cannot be executed then.)
[I have this reservation because I can see the Tar class uses fread() internally.]
Thanks!
Two points here:
Using a blacklist is a bad idea: you will never think of all possible evil filetypes.
Do not store the uploaded files in a public directory of your server:
Store those files in a directory that is not served by Apache, outside of your DocumentRoot.
And use a PHP script (even if Apache cannot serve the files through HTTP, PHP can read them) to send those files' contents to the user who wants to download them.
This will make sure that those uploaded files are never executed.
Of course, make sure the PHP script that sends the content of a file doesn't allow anyone to download just any file that's on the server...
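A minimal sketch of such a script (the storage directory is an assumption; basename() is what keeps requests from escaping it):
<?php
// Sketch: download script for files stored outside the DocumentRoot.
$storageDir = '/var/uploads'; // not served by Apache

// basename() strips directory components, so "../../etc/passwd"
// cannot escape the storage directory.
$name = isset($_GET['file']) ? basename($_GET['file']) : '';
$path = $storageDir . '/' . $name;

if ($name === '' || !is_file($path)) {
    http_response_code(404);
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));
header('Content-Disposition: attachment; filename="' . $name . '"');
readfile($path);
exit;
?>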
You can store the uploaded files in a non-web-accessible location (outside your web root) and then use a download script to serve them.
The best way of handling uploaded files, in my opinion, is to place them in a folder that's not reachable through HTTP. Then when a file is requested, use a PHP script to send the download headers, then use readfile() to send the file to the user. This way, files are never executed.
That might work, assuming the users who download the files can untar them (most non-UNIX systems just have zip; I'd give them the option to download either format).
Also, I think it's better to create a list of allowed file types rather than banned ones. It's easy to forget to ban a specific type, whereas you probably have a better idea of what users should be allowed to upload.
Don't block/allow files based on their extension. Make sure you are using the MIME type that the server identifies the file as; this way it's hard for them to fake it (see the sketch below).
Also, store the files in a non-web-accessible directory and serve them through a download script.
Even if it's a bad file, they won't be able to exploit it if they can't access it directly.
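A sketch of that MIME check with PHP's Fileinfo extension (the allowed list and the 'upload' field name are just examples):
<?php
// Sketch: detect the MIME type from the file contents, not the extension.
$allowed = array('image/png', 'image/jpeg', 'application/pdf');

$finfo = new finfo(FILEINFO_MIME_TYPE);
$mime  = $finfo->file($_FILES['upload']['tmp_name']);

if (!in_array($mime, $allowed, true)) {
    exit('File type not allowed.');
}
?>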
When saving the files, make sure you use these functions (a short sketch follows the links):
http://php.net/manual/en/function.is-uploaded-file.php
http://php.net/manual/en/function.move-uploaded-file.php
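Roughly how they fit together ($storageDir and the 'upload' field name are placeholders):
<?php
// Sketch: accept an upload only if it really arrived via HTTP POST.
$storageDir = '/var/uploads'; // non-web-accessible directory
$tmp = $_FILES['upload']['tmp_name'];

if (is_uploaded_file($tmp)) {
    // move_uploaded_file() refuses to move anything that was not uploaded.
    move_uploaded_file($tmp, $storageDir . '/' . basename($_FILES['upload']['name']));
}
?>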
Dan
I've got some code that generates a Word document, as follows:
$word->Documents[1]->SaveAs($localDir . $filename);
Now, I was kinda hoping that I could open the file once it's saved by doing the following:
$word->Documents->Open($remoteDir . $filename);
// $remoteDir = 'word/', so for example the above would be '/word/document1.doc'
But it seems to open it on the host machine, and not the user's! Is there any way to open it on the user's machine and not on the server?
edit: Just for clarity, it will be used exclusively on an intranet by a single user who will be on a Windows machine at all times, with Word etc. installed... just want to try to make her life a little easier!
Thanks
I think you are fundamentally mistaken about what runs where. PHP is a purely server-side language. You cannot use it to open a file on the client's PC so that the user has an open instance of Word in front of them.
You could maybe achieve that through client-side scripting, namely VBScript or some other Microsoft scripting flavour. Be prepared for massive obstacles and incompatibilities, though, because such things are blocked for security reasons by default in all browsers, and sometimes those blocks cannot be circumvented even with special settings ("Trusted sites") in the client browser.
You may be able to display the document in the user's browser as an embedded HTML object.
The simplest thing really may be to generate the file and offer it to the user as a download. The user can then save it and open it. Job done.
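A sketch of that last option, reusing $localDir and $filename from the question's SaveAs() call:
<?php
// Sketch: offer the freshly generated document as a download.
$path = $localDir . $filename;

header('Content-Type: application/msword');
header('Content-Disposition: attachment; filename="' . basename($filename) . '"');
header('Content-Length: ' . filesize($path));
readfile($path);
exit;
?>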
/word/document1.doc is the path for a file on the server, not on the client. On Windows, supposing that file sharing is enabled on the client PC, you can use a path such as \\IP\word\document1.doc, where IP is the IP of the client PC.
You can get the IP of the PC connecting to the server with $_SERVER['REMOTE_ADDR']; $_SERVER['REMOTE_HOST'] is the result of a reverse DNS lookup, which could return the same value as $_SERVER['REMOTE_ADDR'] in your case.
PHP will probably not open remote files if it has not been configured to do so (there is a directive for that).
If directly accessing the shared file from the COM object doesn't work, you can copy the file from the client PC to the server as a temporary file, and then give that file to the COM object. This way, if there are any errors while accessing the networked file, you should be able to catch them.
I find it strange, anyway, that passing a network file path you get a local file. Are you sure the COM object is not copying to the server the file it finds at the remote path you passed? Did you try with a different file? If it happens with different files too, then we are missing something; I would find it strange that for every network file you try to open, there is already a local file with the same name. Also try renaming the network files.
On the current website I'm working on, I've got a directory of files for users to download, and it would be really nice to have some security method other than obscurity ;)
I was wondering if there's any way to supply login information via PHP to htaccess as though a user were entering it.
Alternatively, if anyone knows a better way to secure user downloads using PHP, that's also acceptable. All of my googling turns up "just use htaccess", which isn't really helpful, as from the non-savvy user's point of view they have to log in twice every time they use the website.
My best guess at doing it exclusively with PHP is to store the files above the web root, then copy them to a web-accessible folder temporarily, but this seems highly inefficient and I couldn't think of any way to remove them after the download has finished.
Note: I don't own the server this is running on and don't have ssh access to it.
If the files are not too big (GB), you can always use readfile() for the download. That way you can check the user's auth first: if it's OK, output the file contents to the user; otherwise send them to the login page.
With this method you can put your files in a directory protected with .htaccess, so you can be sure that nobody who isn't authenticated can access them.
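A sketch of that flow (the session key and the file path stand in for your real login system and protected directory):
<?php
// Sketch: check auth first, then stream the file; otherwise redirect.
session_start();

if (empty($_SESSION['user_id'])) {   // placeholder for your auth check
    header('Location: /login.php');
    exit;
}

$path = '/protected/report.pdf';     // placeholder file in a protected dir
header('Content-Type: application/pdf');
header('Content-Length: ' . filesize($path));
readfile($path);
exit;
?>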
I think I would either store them in a folder outside of the web root, or in a folder protected by .htaccess, and then have a PHP script that checks whether the user is logged in and allowed to download the requested file. If so, just pass the file through to the user.
Example from linked page at php.net:
Example #1 Using fpassthru() with binary files
<?php
// open the file in a binary mode
$name = './img/ok.png';
$fp = fopen($name, 'rb');
// send the right headers
header("Content-Type: image/png");
header("Content-Length: " . filesize($name));
// dump the picture and stop the script
fpassthru($fp);
exit;
?>
Someone else made a comment about having to report the correct content-type, which is true. Often, in my own experience, I already know it, or can derive it from the file extension pretty easily. Otherwise you can always have a look at finfo_file. On that page there are also some comments about what you can do for images in particular.
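For instance, finfo_file() could replace the hard-coded Content-Type in the example above (a sketch, same file name assumed):
<?php
// Sketch: detect the Content-Type from the file instead of hard-coding it.
$name  = './img/ok.png';
$finfo = finfo_open(FILEINFO_MIME_TYPE);
$mime  = finfo_file($finfo, $name);
finfo_close($finfo);

header('Content-Type: ' . $mime);
header('Content-Length: ' . filesize($name));
fpassthru(fopen($name, 'rb'));
exit;
?>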
You should use a PHP script to control the access.
Create a directory, either outside the web root or inside it protected with a .htaccess, where you locate the download files.
Outside the web root is better.
If they are located inside, you have to make sure that no one can access those files directly.
Then take the HTTP_Download class from the PEAR class library.
Using this class has many advantages:
Ranges (partial downloads and resuming)
Basic caching capabilities
Basic throttling mechanism
On-the-fly gzip-compression
Delivery of on-the-fly generated archives through Archive_Tar and Archive_Zip
Sending of PgSQL LOBs without the need to read all data in prior to sending
You should not use readfile() or any plain file-pointer forwarding, because you have to set the headers yourself and they don't support HTTP "Range" requests.
For the access restrictions you can use your session manager, password, framework, forum, etc.
PEAR - HTTP_Download: http://pear.php.net/package/HTTP_Download
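A sketch of typical HTTP_Download usage, based on the PEAR documentation (the file path is a placeholder, and details may vary between package versions):
<?php
// Sketch: serve a file with HTTP_Download, which handles Range headers.
require_once 'HTTP/Download.php';

$dl = new HTTP_Download();
$dl->setFile('/var/uploads/archive.tar');
$dl->setContentType('application/x-tar');
$dl->setContentDisposition(HTTP_DOWNLOAD_ATTACHMENT, 'archive.tar');
$dl->send();
?>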
Why reinvent the wheel? Take a look at File Thingy, which is pretty easy to install and customise. If nothing else, you can study the source to learn how to perform the authentication step.
You could use MySQL to store the uploaded files rather than keeping them on the filesystem; better and more secure, in my opinion. Just Google "MySQL upload php", for example.
You could also create the .htaccess file with a PHP script, from your users table, each time a user accesses that folder, but that is very troublesome.
I think the first option is better.
Use X-SendFile! There are extensions for Apache, Lighttpd and Nginx, so there's a good chance there's one for your web server.
Once you have the extension installed, you can authenticate the user using your PHP script, and then add the header:
header('X-Sendfile: /path/to/file');
When your PHP script is done, it will trigger the web server to stream the file for you. This is especially efficient if you use PHP along with FastCGI, for example, because it frees up the PHP process for other work.
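Putting it together, a sketch assuming Apache with mod_xsendfile (Nginx uses the X-Accel-Redirect header instead; the session key and file path are placeholders):
<?php
// Sketch: authenticate in PHP, then let the web server send the file.
session_start();
if (empty($_SESSION['user_id'])) {   // placeholder auth check
    http_response_code(403);
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="report.pdf"');
// Hand the actual file transfer off to the web server.
header('X-Sendfile: /var/protected-files/report.pdf');
exit;
?>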
Evert