I have a PHP file that queries an external API, using a query parameter to retrieve an image from the database. This particular API doesn't return the image itself, but rather generates a new URL that can be used for a short period of time (there is an associated session cookie involved).
So my PHP file might be found at:
myserver.com/getFile.php?id=B9590963-145B-4E6A-8230-C80749D689WE
which performs the API call which generates a URL like this:
myserver.com/Streaming_SSL/MainDB/38799B1C4E38F1F9BCC99D5A4A2E0A514EEA26558C287C4C941FA8BA4FB7885B.png?RCType=SecuredRCFileProcessor&Redirect
which I store in a PHP variable - $fileURL. I then use:
header('Location: '. $fileURL);
to redirect the browser to the URL that shows the image. This is all working well, but it has created an issue with a service that I integrate with. This service caches the second (redirected) URL, which causes problems: it essentially only works the first time it is generated, due to the use of the session cookie. I need to come up with a solution that lets them cache a URL that continues to work and shows the image.
I'm wondering: is there a way I can download the image and show it somehow, without having to redirect the browser to a new location, thus allowing the original URL to continue working if it is cached?
To get the image or file, do this:
$getImg = file_get_contents('https://myserver.com/getFile.php?id=B9590963-145B-4E6A-8230-C80749D689WE'); // include the scheme (http:// or https://) so the URL wrapper is used
Name the file:
$fileName = 'name.png';
Save the image on your server (in a directory) so you can view it later or show it somewhere else by URL.
$saveFile = file_put_contents($fileName, $getImg);
Load the file like you wanted after you have saved it:
header('Location: ' . $fileName);
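If you'd rather not save a copy at all, you can also stream the image bytes straight back from getFile.php instead of redirecting, so the cached URL stays the original getFile.php one. A minimal sketch, assuming $fileURL holds the short-lived URL from the API and the image is a PNG ($apiCookie is a hypothetical cookie value, only needed if the generated URL requires the API's session cookie):
<?php
// Sketch only: proxy the image through getFile.php instead of redirecting.
$context = stream_context_create([
    'http' => ['header' => 'Cookie: ' . $apiCookie]  // $apiCookie is hypothetical; omit the context if no cookie is needed
]);
$imageData = file_get_contents($fileURL, false, $context);
if ($imageData === false) {
    header('HTTP/1.1 502 Bad Gateway');
    exit('Could not fetch the image');
}
header('Content-Type: image/png');                   // adjust if the API can return other types
header('Content-Length: ' . strlen($imageData));
echo $imageData;
exit;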
Hope it helps.
I tried to write a program that can automatically download files (from PHP links). However, I have two issues right now.
First, my target website requires registration the first time it is accessed. After that, every time I click the download link, it automatically downloads the file I want. It looks like it reads some cookies saved on my computer to determine who I am. How do I make my Python program deal with my local cookies? And what if there are multiple cookies?
Second, can anyone give me example code showing how to deal with a PHP download link? I want to save all these files to a specific location with a specific name. How should I do that in Python 3?
For getting the cookies:
Try:
import urllib.request
cookier = urllib.request.HTTPCookieProcessor()  # create the cookie handler
opener = urllib.request.build_opener(cookier)
urllib.request.install_opener(opener)
The HTTPCookieProcessor keeps an http.cookiejar.CookieJar object (its cookiejar attribute) which contains those cookies. You can loop through it to find the cookie you want.
for c in cookier.cookiejar:
    if c.domain == '.stackoverflow.com':
        # do something
        pass
To read the content at the link:
Try:
url = 'YOUR_URL'
req = urllib.request.Request(url, headers=_headers)  # _headers is a dict of request headers you can copy from your browser
f = urllib.request.urlopen(req)
contents = f.read().decode('utf-8')
# contents is the content of your file
# To save it under a specific name, write it out, for example:
# with open('/path/to/save/myfile.txt', 'w') as out:
#     out.write(contents)
# (for binary files such as images, skip .decode() and open the file in 'wb' mode)
I have a PHP file that is generating an image, and it is being included in an image tag like this:
<img src="generate_contact.php?memberid=3456">
Now if anyone tries to access this file directly, with a memberid query string, they can actually see the image file being generated.
How can I prevent direct access to "generate_contact.php"?
Note: if I define a CONSTANT in the file in which this img tag is inserted, will generate_contact.php have access to that CONSTANT? I ask because generate_contact.php is not being included; it is only referenced as the src of the image tag.
Regards
Instead of using an incremental member-id to access this image, why not use a unique hash?
In your members table, add a "hash" field, and put a random string in this field for each member.
I usually generate a 10-character hash this way:
$hash = substr(str_shuffle(base_convert(str_shuffle(sha1(str_shuffle(md5(rand() . microtime())))), 16, 36)), 0, 10);
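(On PHP 7 and later, a simpler and cryptographically secure alternative, shown here only as a sketch, is random_bytes():)
$hash = bin2hex(random_bytes(5)); // 10 hexadecimal characters, cryptographically random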
After that, use your hash to identify your member:
<img src = "generate_contact.php?memberhash=0qxv(...)"/>
That way, crawlers will not be able to increment the id and harvest the contact details of all your members.
The user's browser making a request for an image to be placed inside the page is a GET request, the same as if you typed the address directly. There is no way you can actually place an image on a page and keep it completely inaccessible, unless of course you use Flash.
OK, take it back one step.
It's all about URLs. If you put a URL into an image element's src attribute, the image must be available at that URL:
<img src="/img/profile/42.jpg">
The browser will download the HTML document with this image element, then make another HTTP request, just like the first one, to that URL to also download the image.
You can put that URL directly into your browser's address bar for the same effect. There's certain content available at some URL. It does not matter how that URL is accessed. It is not "tied to an HTML document" or "secret" or "hidden" or anything like that just because it is referenced in an HTML document.
URLs are always public and accessed "directly", otherwise nobody could see their content.
So, either your URL generate_contact.php?memberid=3456 spits out an image or it doesn't. What it does behind the scenes is irrelevant.
There's no way to prevent direct access to an image that you're wanting to display to the user. By having:
<img src = "generate_contact.php?memberid=3456">
You're already giving them direct access because the user's browser will actually make a GET request for the file once the page has loaded. Trying to prevent direct access to an image that you're wanting to display to the user goes against the fundamentals of the Internet, in which users request public documents.
I am using formmail by tactite to have the info submitted from my form emailed to me. After the user hits the submit button, it goes to a "Thank You" page that by default just has some text. I'm trying to change that so it loads a thank-you page that I created, and it doesn't work. What am I doing wrong?
Thanks!
Here's what doesn't work:
// MSG_THANKS_PAGE is the default page that's displayed if the
// submission is successful
// Parameters: none
$aMessages[MSG_THANKS_PAGE] = load('http://nimbledesigns.com/kelsie/thankyou.html');
This is what i had there before that DOES work:
$aMessages[MSG_THANKS_PAGE] = 'Thanks!<br /><br />'.
'Go Back'.
'';
There is no load() function built into PHP. Most likely what you're looking for is file_get_contents(), which will retrieve the contents of a file (local or otherwise) as a string.
If that URL points back to your own server, you may want to save yourself a full HTTP round-trip and simply use a local path ... = file_get_contents('/path/to/that/thank/you/file.html').
file_get_contents()
use
$aMessages[MSG_THANKS_PAGE] = file_get_contents('http://nimbledesigns.com/kelsie/thankyou.html');
instead.
Documentation
file_get_contents() - http://php.net/manual/en/function.file-get-contents.php
Alternatives
If that file is on your server, then you may only need to do this:
$aMessages[MSG_THANKS_PAGE] = file_get_contents('thankyou.html');
That will stop PHP from using the HTTP stream connector and use the file I/O connector instead, which is faster with less overhead (although the difference may only be noticeable when your server is running slowly).
Redirects
You could also redirect them to the page, by issuing this command before you send any data to the browser:
header('Location: thankyou.html');
exit();
This will redirect their browser to the file, again assuming it resides on your server. You could replace that with the full address if required: http://nimbledesigns.com/kelsie/thankyou.html
As stated earlier, file_get_contents is your best bet. There is no load() function.
But why not just redirect to the page?
It explains how here: http://www.tectite.com/fmhowto/redir.php
(I'm assuming that's the form mailer you're using, and "tactite" was a typo).
I haven't used PHP's load for a long time, but isn't it just for XML, and doesn't it return an object?
Is this it? http://php.net/manual/en/domdocument.load.php
I am writing an anti-leeching download script, and my plan is to create a temporary file named after the session ID; then, after the session expires, the file will be automatically deleted. Is that possible? And can you give me some tips on how to do that in PHP?
Thanks so much for any reply
PHP has a function for that named tmpfile(). It creates a temporary file and returns a resource. The resource can be used like any other file resource.
E.g. the example from the manual:
<?php
$temp = tmpfile();
fwrite($temp, "writing to tempfile");
fseek($temp, 0);
echo fread($temp, 1024);
fclose($temp); // this removes the file
?>
The file is automatically removed when closed (using fclose()) or when the script ends. You can use any file functions on the resource; you can find these here. Hope this helps.
Another solution would be to create the file in the regular way and use a cronjob to regularly check whether a session has expired. The expiration date and other session data could be stored in a database. Use the script to query that data and determine whether a session is expired. If so, remove the file physically from the disk. Make sure to run the script once an hour or so (depending on your timeout).
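A rough sketch of such a cleanup script, run from cron; the "downloads" table and its columns are invented here purely for illustration:
<?php
// cleanup.php - run from cron, e.g. once an hour
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$stmt = $pdo->query('SELECT id, file_path FROM downloads WHERE expires_at < NOW()');
foreach ($stmt as $row) {
    if (is_file($row['file_path'])) {
        unlink($row['file_path']);  // remove the expired temporary file
    }
    $pdo->prepare('DELETE FROM downloads WHERE id = ?')->execute([$row['id']]);
}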
So we have one or more files available for download. Creating a temporary file for each download request is not a good idea. Creating a symlink() for each file instead is a much better idea. This will save loads of disk space and keep down the server load.
Naming the symlink after the user's session is a decent idea. A better idea is to generate a random symlink name and associate it with the session, so the script can handle multiple downloads per session. You can use session_set_save_handler() (link) and register a custom garbage-collection callback that checks for expired sessions and removes their symlinks.
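A minimal sketch of creating such a symlink, assuming a web-accessible downloads/ directory and a protected source file outside the web root (all names are illustrative):
<?php
session_start();
$source   = '/var/files/report.pdf';                // protected original, outside the web root
$linkName = bin2hex(random_bytes(16)) . '.pdf';     // random, hard-to-guess name
$linkPath = __DIR__ . '/downloads/' . $linkName;
if (symlink($source, $linkPath)) {
    // remember the link so it can be removed when the session expires
    $_SESSION['download_links'][] = $linkPath;
    echo '<a href="downloads/' . $linkName . '">Download</a>';
}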
Could you explain your problem a bit more deeply? Because I don't see a reason not to use $_SESSION. The data in $_SESSION is stored server-side in a file (see http://php.net/session.save-path), BTW. At least by default. ;-)
Ok, so we have the following requirements so far:
Let the user download in his/her session only
No copying & pasting the link to somebody else
Users have to download from the site, i.e. no hotlinking
Control speed
Let's see. This is not working code, but it should work along these lines:
<?php // download.php
session_start(); // start or resume a session
// always sanitize user input
$fileId = filter_input(INPUT_GET, 'fileId', FILTER_SANITIZE_NUMBER_INT);
$token = filter_input(INPUT_GET, 'token', FILTER_UNSAFE_RAW);
$referer = filter_input(INPUT_SERVER, 'HTTP_REFERER', FILTER_SANITIZE_URL);
$script = filter_input(INPUT_SERVER, 'SCRIPT_NAME', FILTER_SANITIZE_URL);
// mush session_id and fileId into an access token
$secret = 'i can haz salt?';
$expectedToken = md5($secret . session_id() . $fileId);
// check if the request came from download.php and has the valid access token
// (HTTP_REFERER is a full URL, so compare only its path component with SCRIPT_NAME)
if(($expectedToken === $token) && (parse_url((string) $referer, PHP_URL_PATH) === $script)) {
$file = realpath('path/to/files/' . $fileId . '.zip');
if(is_readable($file)) {
session_destroy(); // optional
header(/* stuff */);
readfile($file); // fpassthru() expects a file handle; readfile() takes a path
exit;
}
}
// if no file was sent, send the page with the download link.
?>
<html ...
<?php printf('<a href="/download.php?fileId=%s&token=%s">Download</a>',
    $fileId, $expectedToken); ?>
...
</html>
And that's it. No database required. This should cover requirements 1-3. You cannot control speed with PHP, but if you don't destroy the session after sending a file, you could write a counter to the session and limit the number of files the user will be sent during a session.
I wholeheartedly agree that this could be solved much more elegantly than with this monkeyform hack, but as proof-of-concept, it should be sufficient.
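To illustrate that counter idea, a minimal sketch (the limit of 5 is arbitrary):
// inside download.php, before sending the file
$_SESSION['downloads'] = isset($_SESSION['downloads']) ? $_SESSION['downloads'] + 1 : 1;
if ($_SESSION['downloads'] > 5) {                  // arbitrary per-session limit
    header('HTTP/1.1 429 Too Many Requests');
    exit('Download limit reached for this session');
}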
I'd suggest you not copy the file in the first place. I'd do the following: when the user requests the file, generate a random unique string and give him the link this way: dl.php?k=hd8DcjCjdCkk123. Then put this string in a database, storing his IP address, maybe the session, and the time you generated the link. When the user then requests that file, make sure everything (hash, IP and so on) matches and the link has not expired (e.g. no more than N hours have passed since it was generated), and if everything is OK, use PHP to pipe the file. Set a cron job to look through the DB and remove the expired entries. What do you think?
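A rough sketch of the link-generation side of that idea; the download_tokens table and its columns are invented for illustration, and random_bytes() assumes PHP 7+:
<?php // generate_link.php - sketch only
session_start();
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$token = bin2hex(random_bytes(16));                // random unique string for the link
$stmt  = $pdo->prepare('INSERT INTO download_tokens (token, ip, session_id, created_at) VALUES (?, ?, ?, NOW())');
$stmt->execute([$token, $_SERVER['REMOTE_ADDR'], session_id()]);
echo '<a href="dl.php?k=' . $token . '">Download</a>';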
tmpfile
Creates a temporary file with a unique name in read-write (w+) mode and returns a file handle. The file is automatically removed when closed (using fclose()), or when the script ends.
Maybe it's too late to answer, but I'll share one feature worth googling:
if you use cPanel there is a short and quick way of blocking external requests to your hosted files, which is called HotLink protection.
You can enable HotLinks in your cPanel and be sure nobody can request your files from another host or use your files as a download reference.
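For comparison, a very rough application-level equivalent in PHP (this is not what cPanel does internally, just a sketch of blocking requests by Referer; note the Referer header can be spoofed or absent, and all names here are illustrative):
<?php // serve_image.php?name=photo.jpg
$allowedHost = 'www.example.com';                  // your own site
$refererHost = isset($_SERVER['HTTP_REFERER']) ? parse_url($_SERVER['HTTP_REFERER'], PHP_URL_HOST) : '';
if ($refererHost !== $allowedHost) {
    header('HTTP/1.1 403 Forbidden');
    exit('Hotlinking not allowed');
}
$file = '/var/www/images/' . basename($_GET['name']); // basename() blocks ../ path traversal
header('Content-Type: image/jpeg');
readfile($file);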
To achieve this, I would make one file and protect it using chmod, making it unavailable to the public. Or, alternatively, save the contents in a database table row and fetch it whenever required.
Then make it downloadable as a file. To do so, I would get the contents from the protected file (or, if it is stored in a database table, fetch it) and simply output it. Using PHP headers, I would give it a desired name and extension, specify its type, and finally force the browser to download the output as a single file.
This way, you only need to save the data in one place, either in a protected file or in the database, and you can force the client browser to download it as many times as the conditions allow, e.g. as long as the user is logged in and so on, without having to worry about disk space, making any temp files, cron jobs, or auto-deletion of the file.
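A minimal sketch of that forced-download idea; the file path, name and session flag are illustrative:
<?php
session_start();
if (empty($_SESSION['is_logged_in'])) {            // only serve logged-in users
    header('HTTP/1.1 403 Forbidden');
    exit;
}
$file = '/var/private/report.pdf';                 // protected file outside the web root
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="report.pdf"');
header('Content-Length: ' . filesize($file));
readfile($file);
exit;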
I just ran into something today that I am not sure how it is being done.
I know a few things:
The site is done in php
There is an image gallery, a url would be something like
http://www.example.com/girls/Cyn/sets/getImage/1170753147/7.jpg
I can see that URL as long as I am logged into the site. It does not appear to be referrer-based, as I took the URL, opened a new browser window, and was able to load it while still logged in. In doing so, I had no referrer.
The second I log out, I am redirected to a please register/login page.
This is a heavy hit site.
How is this done? I can tell they are running Apache on CentOS. When I log in, I am given a cookie containing a hash of something, which I am sure they are using to look up an id to make sure I am allowed to be logged in.
However, the above is a direct request for a resource that is just a jpg. There has to be some communication, then, between Apache and their database to check the state of that request. How does merely loading a URL send a cookie value to Apache that can then be passed off to a database?
I am about to embark on a paid membership site and will need to protect images in the same way. This was not HTTP auth; this was a form-based login, and I am at a loss as to how this was done. Any ideas?
All requests go through the web server. If a site sets a cookie, then all your requests to that site will include the cookie contents until that cookie expires or is removed. It doesn't matter what you're requesting; it only matters where you are requesting it from.
If you have Firebug, open up the 'Net' tab while you're on the site and check all the requests you have made. You'll see a 'Cookie' line in the request headers. It will be on every resource requested: the images, the stylesheets, everything.
If Apache is the web server, it could use mod_rewrite to redirect your request, or it could pass it to PHP or Perl or something else that can check the cookie and output the image if it is valid, or redirect if not.
Here is a php example (image output taken from php.net):
if(valid_auth($_COOKIE['auth'])) {
// open the file in a binary mode
$name = './img/ok.png';
$fp = fopen($name, 'rb');
// send the right headers
header("Content-Type: image/png");
header("Content-Length: " . filesize($name));
// dump the picture and stop the script
fpassthru($fp);
exit;
} else {
header('Location: /login');
exit;
}
It's probably a web application that uses a session cookie for authentication and redirects if the session has not been authenticated.
Pretty much any web framework has plugins for this sort of thing. There might even be apache modules to do it, but I haven't seen one.
You must create a "getter" for the images. The images must be stored in a folder outside of the publicly accessible directories.
/public_html
    /js
        jquery.js
    index.php
    getimage.php
/private_images/
    myimage.jpg
Note that the private_images directory is not accessible when you go to: http://www.mysite.com/private_images
Now, to create the "getter" script.
/* This is getimage.php */
session_start(); // needed before reading $_SESSION
if(!isset($_SESSION['is_logged_in'])) {
header('Location: /login');
exit;
}
/*
Get the image_name from the URL
You will be using something like: http://mysite.com/getimage.php?image_name=flowers.jpg
This is the way to get the image.
*/
$path = "/var/www/html/private_images";
$name = $path.'/'.basename($_GET['image_name']); // basename() prevents ../ path traversal
$fp = fopen($name, 'rb');
// send the right headers
header("Content-Type: image/jpeg");
header("Content-Length: " . filesize($name));
// dump the picture and stop the script
fpassthru($fp);
exit;
If you missed the comment above, you can retrieve the image like this:
http://mysite.com/getimage.php?image_name=flowers.jpg
However, the above is a direct request for a resource that is just a jpg. There has to be some communication, then, between Apache and their database to check the state of that request. How does merely loading a URL send a cookie value to Apache that can then be passed off to a database?
Every single HTTP request is sent to a web server. The web server then decides how to handle the request, based on a set of rules. By default, Apache has a simple handler that just sends the requested file back to the user. There is, however, no reason why you couldn't configure Apache to handle all requests with a PHP script. On a high-traffic site you would probably solve this differently, since it's a bit expensive to fire up PHP for each and every image to show, but in theory you could just make a mod_rewrite rule that pipes all requests matching a particular pattern (such as ^girls/Cyn/sets/getImage/.*) to a PHP script. This script would then read the actual file from somewhere outside the web root and print it out to the user.
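A rough sketch of what such a piped script could look like, assuming a rewrite rule routes the matching image requests to it; the paths and the session check are illustrative, not taken from the site in question:
<?php // image_gate.php - sketch only
session_start();
if (empty($_SESSION['member_id'])) {               // not authenticated -> send to the login/register page
    header('Location: /login');
    exit;
}
// e.g. the rewritten request was /girls/Cyn/sets/getImage/1170753147/7.jpg
$requestedPath = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
$file = '/var/private_images/' . basename($requestedPath); // map to a file outside the web root
if (!is_readable($file)) {
    header('HTTP/1.1 404 Not Found');
    exit;
}
header('Content-Type: image/jpeg');
header('Content-Length: ' . filesize($file));
readfile($file);
exit;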