Protecting a zip through PHP + SQL

Without going into too much detail about the rest of the site: I have a pretty standard user setup with a users table (id, name, password, etc.). Some users are "free" users and some have paid through PayPal; this is recorded in the "user_premium" table as a 1 or 0.
What I want to do is only allow the download of a zip file if the user has premium. Obviously I can hide the link on my pages if they don't, but they can still access domain.com/myfile.zip directly.
I tried blocking direct access to the zip via .htaccess and used fpassthru() in a PHP script to serve the file, e.g. (off-the-cuff example code):
if ($user->can_download()) {
    $fp = fopen('myfile.zip', 'rb');
    header("Content-Type: application/zip");
    fpassthru($fp);
} else {
    redirect('domain.com/premium.html');
}
However, I got memory-exhausted errors each time (the file is 4 GB).
Is there another way around this?

You should look into using Apache's mod_xsendfile instead of fopen(). With X-Sendfile, you respond with an HTTP header containing the path to the file on your server; Apache picks that up and serves the file to the user's browser directly, allowing your PHP script to finish execution right away.
You can find a great guide for getting started with mod_xsendfile here: http://codeutopia.net/blog/2009/03/06/sending-files-better-apache-mod_xsendfile-and-php/
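For the asker's case, a minimal sketch might look like the following (assuming mod_xsendfile is installed and enabled for the vhost; the filesystem path is illustrative, and can_download()/redirect() are the helpers from the question):
if ($user->can_download()) {
    header('Content-Type: application/zip');
    header('Content-Disposition: attachment; filename="myfile.zip"');
    // Apache intercepts this header and streams the file itself,
    // so PHP never holds the 4 GB file in memory.
    header('X-Sendfile: /path/outside/webroot/myfile.zip');
    exit;
} else {
    redirect('domain.com/premium.html');
}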

From the comments in the PHP documentation:
While adding CFLAGS="-D_FILE_OFFSET_BITS=64" immediately before calling "./configure" on the PHP source will enable support for using fopen() on large files (greater than 2 GB), note that -- if such an installation of PHP is used in conjunction with Apache HTTPD [2.x], Apache will become completely unresponsive even when not serving output from a PHP application.
In order to gain large file support for non-web applications while maintaining the operability of Apache, consider making two distinct PHP installations: one with the above CFLAGS specified during configuration (for non-web uses), and the other without this flag (for use with Apache).

Related

Upload file with SFTP and PHP

I'm using the library phpseclib 0.3.7 to connect via SFTP to my web server. The users that exist on it have the following directory structure:
/dirhome/FirstUser/FirstUser/
/dirhome/SecondUser/SecondUser/
/dirhome/....../.....
Where:
dirhome:
    own -> root:root
    permission -> 700
FirstUser (outer directory; "FirstUser" is an example user name):
    own -> root:root
    permission -> 755
FirstUser (inner directory):
    own -> FirstUser:mygroup
    permission -> 700
The same structure applies for the second user, and so on. With this setup, users can view/edit/create content only within their own directory.
I can easily connect in PHP using the library and browse folders/files, but I don't know how to upload files from the user's local PC into their personal folder on the remote server.
The library provides the method:
->put($remotepath, $localpath, NET_SFTP_LOCAL_FILE);
The problem is:
How can I open a dialog box to let the user select a file on their PC and upload it into their personal directory?
If I put: $localpath = "C:\\Users\\****\\Desktop\\test.txt"
I get:
C:\\Users\\****\\Desktop\\test.txt is not a valid file
But if I use a file that already resides on the server, it works fine. That doesn't make sense for my case, though, because users cannot upload their own files that way. I have been trying for several days without success.
Downloading does not work either, essentially for the same reason.
PHP is a server-side scripting language. When your browser requests a PHP-generated page, PHP runs on the server and your browser only gets to see the result. For that reason, PHP has no way to access client-side files: it has already run before anything even reaches the client computer.
I'm assuming the server with the PHP files on it and the SFTP server are different servers, otherwise your whole question doesn't make too much sense.
So what you need here is a two-step approach: first, upload the files the regular way, via an HTTP POST request, to the server that runs the PHP scripts. The receiving PHP script can then use SFTP to move the files to the other server, as sketched below.
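A rough sketch of the receiving script, assuming phpseclib 0.3.x (Net_SFTP), an HTML form field named "userfile", and placeholder host/credentials:
<?php
include 'Net/SFTP.php';

if (isset($_FILES['userfile']) && $_FILES['userfile']['error'] === UPLOAD_ERR_OK) {
    $sftp = new Net_SFTP('sftp.example.com'); // hypothetical host
    if (!$sftp->login('FirstUser', 'password')) {
        exit('SFTP login failed');
    }
    // Step two: push the file PHP just received via HTTP POST to the SFTP server.
    $remote = '/dirhome/FirstUser/FirstUser/' . basename($_FILES['userfile']['name']);
    $sftp->put($remote, $_FILES['userfile']['tmp_name'], NET_SFTP_LOCAL_FILE);
}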
For downloads (as you asked in the comments) it works similarly: the browser requests a PHP script, that script fetches the file from the SFTP server, and then sends it to the browser as the HTTP response. The browser will then display a regular file-download dialog and the user can choose where to store it.
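The download direction, under the same assumptions (phpseclib's get() returns the file contents as a string when no local path is given):
<?php
include 'Net/SFTP.php';

$sftp = new Net_SFTP('sftp.example.com'); // hypothetical host
if (!$sftp->login('FirstUser', 'password')) {
    exit('SFTP login failed');
}
$data = $sftp->get('/dirhome/FirstUser/FirstUser/test.txt'); // contents as a string
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="test.txt"');
echo $data;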
For uploading, you should consider using some kind of cron job or a background job started via PHP's exec() instead, because you will most likely either run into max-execution timeouts or have to raise them far higher than you should, especially for large files. The alternative is a separate PHP configuration (depending on your PHP version you can use .htaccess files, .user.ini files, or a dedicated PHP-FPM pool for this) that increases the execution time for the upload script only.

Allow only logged-in users to download files in PHP

I have .mp3 files on my website and I want to set up my site so that users can download the files only after they have logged in. If users are not logged in, they should not be able to download the files. I also don't want anyone to be able to discover the path to the files.
I'd make the file impossible to access via an HTTP request alone, and with PHP, just print it out:
<?php
session_start();
if (isset($_SESSION['logged_in'])) {
    $file = '/this/is/the/path/file.mp3';
    header('Content-Type: audio/mpeg');
    header('Content-Length: ' . filesize($file));
    readfile($file);
}
?>
You can create a token based on something like the user's session ID and some random value. The logged-in users' URLs would then look like:
/download.php?token=4782ab847313bcd
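One possible shape for such a token (the HMAC construction and expiry scheme here are my own assumptions, not part of the original suggestion):
// When rendering the link for a logged-in user:
session_start();
$secret = 'some-long-random-server-side-secret'; // hypothetical
$expires = time() + 3600;
$token = hash_hmac('sha256', session_id() . '|' . $expires, $secret);
$url = '/download.php?expires=' . $expires . '&token=' . $token;

// In download.php, recompute and compare:
$expected = hash_hmac('sha256', session_id() . '|' . $_GET['expires'], $secret);
if ($_GET['expires'] > time() && hash_equals($expected, $_GET['token'])) {
    // serve the file
}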
Place the MP3 files above your docroot, or if that is impossible, deny access to them with .htaccess (if using Apache).
Verify a user is logged in.
Send the appropriate headers and readfile() on the MP3 when the user requests the file.
From wordpress.stackexchange.com/a/285018
Caution: Be wary of using this PHP-driven file download technique on larger files (e.g., over 20MB in size). Why? Two reasons:
PHP has an internal memory limit. If readfile() exceeds that limit when reading the file into memory and serving it out to a visitor, your script will fail.
In addition, PHP scripts also have a time limit. If a visitor on a very slow connection takes a long time to download a larger file, the script will timeout and the user will experience a failed download attempt, or receive a partial/corrupted file.
Caution: Also be aware that PHP-driven file downloads using the readfile() technique do not support resumable byte ranges. So pausing the download, or the download being interrupted in some way, doesn't leave the user with an option to resume. They will need to start the download all over again. It is possible to support Range requests (resume) in PHP, but that is tedious.
In the long-term, my suggestion is that you start looking at a much more effective way of serving protected files, referred to as X-Sendfile in Apache, and X-Accel-Redirect in Nginx.
X-Sendfile and X-Accel-Redirect both work on the same underlying concept. Instead of asking a scripting language like PHP to pull a file into memory, simply tell the web server to do an internal redirect and serve the contents of an otherwise protected file. In short, you can do away with much of the above, and reduce the solution down to just header('X-Accel-Redirect: ...').
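A minimal sketch of that reduction, assuming the file lives in a location the web server can reach internally but that is blocked from direct requests (Nginx needs an "internal" location for this; Apache needs mod_xsendfile):
session_start();
if (!isset($_SESSION['logged_in'])) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}
header('Content-Type: audio/mpeg');
header('Content-Disposition: attachment; filename="file.mp3"');
header('X-Accel-Redirect: /protected/file.mp3'); // Nginx
// header('X-Sendfile: /this/is/the/path/file.mp3'); // Apache equivalent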

Performance-oriented way to protect files on PHP level?

I am looking for some input on something I have been thinking about for a long time. It is a very general problem, maybe there are solutions out there I haven't thought of yet.
I have a PHP-based CMS.
For each page created in the CMS, the user can upload assets (Files to download, Images, etc.)
Those assets are stored in a directory, let's call it "/myproject/assets", on a per-page basis (1 subdirectory = 1 page, e.g. "/myproject/assets/page19283")
The user can "un-publish" (hide) pages in the CMS. When a page is hidden, and somebody tries to access it because they have memorized the URL or they come from Google or something, they get a "Not found" message.
However, the assets are still available. I want to protect those as well, so that when the user un-publishes a page, they can trust it is completely gone. (Very important in legal situations such as court orders to take content down; things like that can happen.)
The most obvious way is to store all assets in a secure directory (i.e., one not accessible by the web server) and use a PHP "front gate" that passes the files through after checking. When a project needs to be watertight, this is the way I currently go, but I don't like it because the PHP interpreter then runs for every tiny image, script, and stylesheet on the site. I would like to have a faster way.
.htaccess protection (Deny from all or similar) is not perfect because the CMS is supposed to be portable and able to run in a shared environment. I would like it to even run on IIS and other web servers.
The best way I can think of right now is moving the particular page's asset directory to a secure location when it is un-published, and move it back when it's published. However, the admin user needs to be able to see the page even when it's un-published, so I would have to work around the fact that I have to serve those assets from the secure directory.
Can anybody think of a way that allows direct Apache access to the files (=no passing through a PHP script) but still controlling access using PHP? I can't.
I would also consider a simple .htaccess solution that is likely to run on most shared environments.
Anything sensitive should be stored in a secure area like you suggested.
If your website is located at /var/www/public_html, you put the assets outside the web-accessible area, in /var/www/assets.
PHP can then trigger a download, or you can feed the files through PHP, depending on your needs.
If you kept the HTML in the CMS DB, that would leave only non-sensitive images & CSS.
If you absolutely have to be able to turn access to all materials on and off, I think your best bet might be symlinks. Keep everything in a non-web-accessible area and symlink each folder of assets into the web area, as sketched below. That way, if you need to lock people out completely, you just remove the symlink rather than removing all the files.
I don't like it, but it is the only thing I can think of that fits your criteria.
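A sketch of the publish/un-publish toggle, with illustrative paths:
$secure = '/var/www/assets/page19283';             // outside the web root
$public = '/var/www/public_html/assets/page19283'; // web-visible alias

symlink($secure, $public); // publish: expose the folder via a symlink

unlink($public); // un-publish: one unlink locks everyone out, files untouched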
I'd just prevent hotlinking of any non-HTML file, so all the "assets" stuff is accessible only from the HTML page. Removing (or protecting) the page then removes everything without having to mess with the whole file system.
Use X-Sendfile
The best and most efficient way is to use X-Sendfile. However, before using X-Sendfile you will need to install and configure it on your web server.
The method on how to do this will depend on the web server you are using, so look up instructions for your specific server. It should only be a few steps to implement. Once implemented don't forget to restart your web server.
Once X-Sendfile has been installed, your PHP script will simply need to check for a logged in user and then supply the file. A very simple example using Sessions can be seen below:
session_start();
if (empty($_SESSION['user_id'])) {
    exit;
}
$file = "/path/to/secret/file.zip";
$download_name = basename($file);
header("X-Sendfile: $file");
header("Content-Type: application/octet-stream");
header('Content-Disposition: attachment; filename="' . $download_name . '"');
Important note:
If you want to serve the file based on user input, such as an image src value, you will need to make sure you sanitize the filename. You do not want anyone to override your script and use ".." etc. to access arbitrary files on your system.
Therefore, if you have code that looks like this:
<img src="myscript.php?file=myfile.jpg">
Then you will want to do something like this:
session_start();
if (empty($_SESSION['user_id'])) {
    exit;
}
// Strip everything except letters, digits, "-", "_" and "." so that
// path separators (and with them "../" traversal) are removed.
$file = preg_replace('/[^-a-zA-Z0-9_\.]/', '', $_GET['file']);
$download_name = basename($file);
header("X-Sendfile: $file");
header("Content-Type: application/octet-stream");
header('Content-Disposition: attachment; filename="' . $download_name . '"');
EDIT: How about a hybrid for the administrative interface? In the ACP you could route all file requests through the PHP authentication script, while for the public side you use HTTP auth/.htaccess to determine the availability of the result. That gives you the performance on the public side and the protection on the ACP side.
OLD MESSAGE:
.htaccess is compatible with most Apache and IIS<7 environments (using various ISAPI modules) when using mod_rewrite-type operations. The only exception is IIS7 with the new Rewrite module, which uses the web.config file. However, I'd be willing to bet that you could efficiently generate/alter the web.config file for this case instead of using .htaccess.
Given that, you could set up redirects using the rewrite method and redirect to your custom 404 page (which hopefully sends the proper 404 header). It is not 100% appropriate, because the actual asset should be the one returning a 403 header, but it works.
This is the route I would go unless you want to properly create HTTP auth setups for every server platform. Plus, if you do it right, you could make your system extensible so that other types can be added in the future by you or your users (including a PHP-based option if they want it).
I'm assuming the 'page' is being generated by PHP and the 'assets' should not require PHP. (Let me know if I got that wrong.)
You can rename the assets folder. For example, rename '/myproject/assets/page19283' to '/myproject/assets/page19283-hidden'. This breaks all old, memorized links. When you generate the page for admin users who can see it, you just write the URLs using the new folder name; after all, you know whether the page is hidden or not. The assets can still be accessed directly by anyone who knows the 'secret' URL.
For additional security, rename the folder with a bunch of random text and store that in your page table (wherever you store the hidden flag): '/myproject/assets/page19283-78dbf76B&76daz1920bfisd6g&dsag'. This makes the hidden URL much harder to guess; see the sketch below.
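For example (folder layout and table columns are illustrative):
$suffix = bin2hex(random_bytes(16)); // PHP 7+; older PHP could use openssl_random_pseudo_bytes()
rename('/myproject/assets/page19283',
       '/myproject/assets/page19283-' . $suffix);
// then e.g.: UPDATE pages SET hidden = 1, asset_suffix = '$suffix' WHERE id = 19283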
Just prepend or append a GUID to the page name in the database and the resource directory in the filesystem. The admin will still be able to view it from the admin interface because the link will be updated but the GUID effectively makes the page undiscoverable by an outside user or search engine.
Store your files in a directory outside the web root (i.e., one directory above public_html or htdocs). Then use the readfile() function in a PHP script to proxy the files out when requested. readfile() essentially takes a single parameter, the path to a file, and prints the contents of that file.
This way, you can create a barrier where if a visitor requests information that's hidden behind the proxy, even if they "memorized" the URL, you can turn them down with a 404 or a 403.
I would go with implementing a gateway. Set a .htaccess file on the /assets/ URL pointing to a gateway.php script that denies access if the credentials are invalid or the particular file is not published, and serves the file otherwise.
I'm a little confused: do you also need to protect the stylesheet files and images? Perhaps moving the folder is the best alternative.

How to protect downloadable files in a remote directory from non-premium users (in PHP)?

I'm building a "premium" section of my site and I need to give download access to files in a remote directory (on a different server) to users with special privileges (accounts stored in a MySQL DB). My site is coded in PHP/MySQL, so a PHP solution would be great.
Direct all download links to a PHP file that does all the credential checking.
You can call the file download.php.
Pass along parameters via cookies, GET, POST, session, or whichever manner you use to verify privileges.
Once credentials are verified, you can send the appropriate header.
If it's an image, the header would be header("Content-Type: image/jpeg");
I'm assuming that you also own this remote server.
some useful links:
MIME types
PHP Header Function
As @pxl said, you need to check for authorization and then output the correct MIME type as an HTTP header (like he said: header("Content-Type: image/jpeg");).
Once you are done with that, you will need to output the actual contents of the file and its size in bytes, like so:
header("Content-Length: ".filesize("FILENAME")*1.001);
/* The *1.001 puts a nice buffer on the filesize, I read about it online.
Browsers will stop downloading exactly at the Content-Length, but if they go
over, it's not a big deal at all. */
readfile("FILENAME");
die();
Just make sure to store the file in a directory that is not accessible from the web.
I'm used to doing this in ASP.NET where it's built in, but this article seems to chronicle your exact situation.
Here's what I would do:
Build a PHP SOAP server on the remote server B that holds the files.
Whenever a user triggers a download on your main server A, connect to the SOAP server on B and reserve a ticket for the user, specifying an IP address and the id/path of the file to download.
Server B creates a ticketId for this download (which should only be valid for a limited time) and returns it to A.
Server A redirects the user to server B, supplying the ticketId as a GET parameter.
Server B checks whether the ticket was already used, has expired, or whether the user comes from the wrong IP. If none of these apply, it serves the file and marks the ticket as used; see the sketch after the note below.
Note: on server B, don't keep PHP running while serving the file; use the X-Sendfile header instead. Otherwise the download might stop after PHP's max execution time.
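A sketch of the ticket check on server B (the tickets table and its columns are assumptions; server A inserted the row when it reserved the ticket over SOAP):
$pdo = new PDO('mysql:host=localhost;dbname=downloads', 'user', 'pass');
$stmt = $pdo->prepare(
    'SELECT file_path FROM tickets
     WHERE id = ? AND ip = ? AND used = 0 AND expires_at > NOW()'
);
$stmt->execute(array($_GET['ticket'], $_SERVER['REMOTE_ADDR']));
$row = $stmt->fetch();
if (!$row) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}
$pdo->prepare('UPDATE tickets SET used = 1 WHERE id = ?')
    ->execute(array($_GET['ticket']));
header('Content-Type: application/octet-stream');
header('X-Sendfile: ' . $row['file_path']); // let the web server stream it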

htaccess Authentication with PHP

On the current website I'm working on, I've got a directory of files for users to download, and it would be really nice to have some security method other than obscurity ;)
I was wondering if there's any way to supply login information via PHP to htaccess as though a user were entering it.
Alternatively, if anyone knows a better way to secure user downloads using PHP, that's also acceptable. All of my googling turns up "just use htaccess", which isn't really helpful, because from the non-savvy user's point of view they then have to log in twice every time they use the website.
My best guess at doing it exclusively with PHP is to store the files above the web root and copy them to a web-accessible folder temporarily, but this seems highly inefficient, and I couldn't think of any way to remove them after the download has finished.
Note: I don't own the server this is running on and don't have ssh access to it.
If the files are not too big (gigabytes), you can always use readfile() for the download. That way you can check the user's auth first and, if it's OK, output the file contents to the user; otherwise send them to the login page.
With this method you can put your files in a directory protected with .htaccess, so you can be sure that nobody who isn't authenticated can access them.
I think I would either store them in a folder outside the web root, or in a folder protected by .htaccess, and then have a PHP script that checks whether the user is logged in and allowed to download the file asked for. If so, just pass the file through to the user.
Example from linked page at php.net:
Example #1 Using fpassthru() with binary files
<?php
// open the file in binary mode
$name = './img/ok.png';
$fp = fopen($name, 'rb');

// send the right headers
header("Content-Type: image/png");
header("Content-Length: " . filesize($name));

// dump the picture and stop the script
fpassthru($fp);
exit;
?>
Someone else commented that you have to report the correct Content-Type, which is true. Often, in my own experience, I already know it or can work it out from the file extension easily enough. Otherwise you can always have a look at finfo_file(); there are also some comments on that page about what you could do specifically for images. A sketch follows below.
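A short sketch of the finfo approach (reusing the example file from above):
$name = './img/ok.png';
$finfo = finfo_open(FILEINFO_MIME_TYPE);
$mime = finfo_file($finfo, $name); // e.g. "image/png"
finfo_close($finfo);
header('Content-Type: ' . $mime);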
You should use a PHP script to control the access.
Create a directory outside the web root, or inside the web root protected by a .htaccess file, where you locate the download files. Outside the web root is better; if they are located inside, you have to make sure that no one can access those files directly.
Then take the class HTTP_Download from the PEAR class library. Using this class has many advantages:
Ranges (partial downloads and resuming)
Basic caching capabilities
Basic throttling mechanism
On-the-fly gzip-compression
Delivery of on-the-fly generated archives through Archive_Tar and Archive_Zip
Sending of PgSQL LOBs without the need to read all data in prior to sending
You should not use readfile() or any manual file-pointer forwarding, because you would have to set the headers yourself, and those approaches don't support HTTP Range requests.
For the access restrictions you can use your session manager, passwords, a framework, forum software, etc.
PEAR HTTP_Download: http://pear.php.net/package/HTTP_Download
You may need to copy the URL, because SO encodes it as a URL-encoded string (which is correct), but the PEAR homepage doesn't like that.
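A hedged usage sketch, following the package's documented API (paths are illustrative):
require_once 'HTTP/Download.php';

$dl = new HTTP_Download();
$dl->setFile('/path/outside/webroot/file.zip');
$dl->setContentType('application/zip');
$dl->setContentDisposition(HTTP_DOWNLOAD_ATTACHMENT, 'file.zip');
$dl->send(); // handles Range requests, caching and throttling internally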
Why reinvent the wheel? Take a look at File Thingy, which is pretty easy to install and customise. If nothing else, you can study the source to learn how to perform the authentication step.
You could use MySQL to store the uploaded files rather than keeping them in the filesystem; better and more secure, in my opinion. Just Google "MySQL upload php" for examples.
Alternatively, you could create the .htaccess file with a PHP script, based on your users table, each time a user accesses that folder, but that is very troublesome.
I think the first option is better.
Use X-SendFile! There are extensions for Apache, Lighty, and Nginx, so there's a good chance there's one for your webserver.
Once you have the extension installed, you can authenticate the user using your PHP script, and then add the header:
header('X-SendFile: /path/to/file');
When your PHP script is done, it will trigger the webserver to stream the file for you. This is especially efficient if you use PHP with, for example, FastCGI, because it frees up the PHP process for other work.
Evert
