MySQL images upload - PHP

I need a simple code snippet to upload images to MySQL using PHP. Short! Also, is it possible to upload an HTML or CSS file to MySQL? The reason is complicated, but all answers are appreciated. EDIT: Say I have 1000 users, and each has their own layout for their page. So each user's MySQL record would hold an HTML file, possibly a CSS file, and one or more images.

I am a big fan of using the filesystem for storing physical files; I've yet to see a solid reason why they are better off in a database.
To automate this process you could have a shell script called through exec():
exec("/home/some/path/my_filesystem_creator.sh ".escapeshellarg($args));
or PHP's native mkdir(), or anything really. If you went for a structure like:
/common/
/userdirs/1/
/userdirs/2/
Essentially, all I imagine you would need to do is create a user dir and copy the default versions of their site assets (images/CSS/HTML etc.) into it.
This should be easy enough to manage.
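A minimal sketch of the pure-PHP route, following the structure above (the base path and a default-assets folder under /common/ are assumptions):
$userId = 42;                                         // hypothetical new user id
$base = '/var/www/site';                              // assumed web root
$dir = "$base/userdirs/$userId";
mkdir($dir, 0755, true);                              // create the user's directory
foreach (glob("$base/common/*") as $asset) {
    copy($asset, $dir . '/' . basename($asset));      // seed default html/css/images
}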

Are you asking how to store a file in the database?
http://www.php-mysql-tutorial.com/wikis/mysql-tutorials/uploading-files-to-mysql-database.aspx
Or do you need to know how to upload a file to your web server in order to display it in a PHP/MySQL website?

Your page would be faster if you generate a directory on your filespace for each user and store their CSS/JS/image files there.
The reason for this is that when you want to output your images to the browser, you will need to establish a separate DB connection for each file (since each one is its own HTTP request to a PHP file that selects the image).

You might want to take a look at http://mysqldump.azundris.com/archives/36-Serving-Images-From-A-Database.html and http://hashmysql.org/index.php?title=Storing_files_in_the_database before doing that. Storing files in MySQL is generally considered a bad idea.

Just use different CSS rules for each user. Create the CSS dynamically through PHP based on user-specific variables. For example, if they have a div with an avatar or some other personal image, just create a class that uses variables for images; then you really only need one or two files at most to do the whole thing. I would use a heredoc, but you could just use quotation marks to interpolate the variables.
PHP creates your CSS, for example in a css.php served with a text/css header:
header('Content-type: text/css');
echo ".useravatar { background: url({$baseurl}{$useridpic}); }";
In the HTML, the div just needs the class 'useravatar' and never needs to be changed.


How to display images that are uploaded to my ftp folder

I am building a website where I upload images to my FTP folder through a PHP script. Now I want to display those images on my HTML pages. I was thinking of using PHP to get an array of all the images in the FTP folder and then display them in an image view.
Please tell me if I am doing this the wrong way and whether there are better alternatives. I was reading the PHP manual for ftp_nlist and ftp_rawlist but did not understand them.
Well, it may depend on how many images you have in there. Probably the most "correct" way to do it would be to store the filenames in a DB. You could scan the entire folder, but doing that on every single request is potentially a lot of overhead compared to just grabbing the names out of a DB.
Are you manually uploading the images? Give us more details on how that works and we can better serve you. If you're using a script to upload images (I've had lots of projects where that's the case), then you can just have the script insert those filepaths into the DB for you. If not (you're manually uploading them), or if indeed there are not a large number of files, then scanning the folder wouldn't necessarily be a bad thing. I've used that method on smaller projects myself.
Read up on the PHP readdir function in the docs (which actually works a lot like mysql_fetch_assoc, ironically); that will provide you with an excellent way to go without setting up a DB. For an approach where an upload script handles it, I recommend a DB. Without more info, it's hard to say.
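A minimal sketch of the readdir() approach (the folder name and extension filter are assumptions):
$dir = opendir('uploads');                            // assumed upload folder under the web root
$images = array();
while (($file = readdir($dir)) !== false) {
    if (preg_match('/\.(jpe?g|png|gif)$/i', $file)) { // keep image files only, skip dot entries
        $images[] = $file;
    }
}
closedir($dir);
foreach ($images as $img) {
    echo '<img src="uploads/' . htmlspecialchars($img) . '" alt="" />';
}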
Good luck!

Protecting CSV files used for plotting visualizations dynamically via PHP

Before I begin, I must warn you that I'm not much of a web programmer so my methods may seem somewhat roundabout and the terminology I use may be awkward.
Here's the situation. I'm developing a website for users to visualize data.
I have a public PHP page at /var/www/thepage/index.php (yes, a Linux server + Apache). This is the main page of the site and is also where users make selections in a form.
Upon form submission, a second PHP page is called, and this is where the form selections from the first page are passed to the JavaScript that creates the visualization. In order for that to happen, CSV files are first written into this directory by a PHP script that queries a MySQL database.
Thing is, I want users to be able to see the visualizations but not be able to download the CSV files (unless they are admin). The way I allow admins to download the files is to create a protected (.htaccess) subdirectory, /var/www/thepage/secure/, which has an index HTML that runs a CGI script once an admin logs in (prompted when a download link is clicked). This script copies the latest files (which have dynamic names) from /var/www/thepage/ into secure/ under static filenames. Download links pointing to these statically named files sit on the protected index.html. However, if a user looks at the source code of the second PHP page, they can also download the files, since they know the paths and those are not protected.
If I remove file permissions, the PHP script won't be able to read the files either, causing the visualization to fail (and I want normal users to be able to see the visualizations). It is also important to keep the files around because I have a CGI script (bash + awk) that runs a mathematical function on them, which also requires read permission.
Obscuring the filenames doesn't really work either, since the files are written on the fly and the source code of the HTML page will reveal the obscured CSV filenames being written.
How can I get around this problem? I would prefer not to have to create sessions and log-ins for normal users, etc...
As previously said, it's hard to hide anything on the net, especially if you need to send it to JavaScript. You could try hacking it a bit; it could cost you a bit of performance, but it would be a deterrent against people who aren't web savvy... though it could also be seen as a challenge by others :)
A rough example would be something like...
$rows = array();
$fh = fopen("/var/www/thepage/secure/file.csv", "r");     // read the protected CSV
while (($row = fgetcsv($fh)) !== false) { $rows[] = $row; }
fclose($fh);
echo "<script type='text/javascript'>";
echo "var data = " . json_encode($rows) . ";";            // hand the rows to the JS visualization
echo "</script>";
Bit rusty here, but javascript should interpret the json as an object, that you can use in your code. You could go a step further and break the php array into sections before sending it off, making it harder to know what's going on.
Like I said. It's rough, but it could be a solution.
I would have imagined the best way is to store the files in a secure directory that isn't accessible from a web browser (outside the web root). You could then show a list of the available files to authenticated users with a download link. When they click the link, you check that they are logged in and, if so, begin the file download.
PHP's readfile() may help.
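A minimal sketch of such a gated download.php (the session flag, storage path, and filenames are assumptions):
<?php
session_start();
if (empty($_SESSION['is_admin'])) {           // hypothetical auth flag set at login
    header('HTTP/1.1 403 Forbidden');
    exit;
}
$file = basename($_GET['file']);              // basename() strips any directory tricks
$path = '/var/data/csv/' . $file;             // storage outside the web root
if (!is_file($path)) {
    header('HTTP/1.1 404 Not Found');
    exit;
}
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="' . $file . '"');
readfile($path);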

Linking an image to a PHP file

Here's a bit of history first: I recently finished an application that allows me to upload images and store them in a directory; it also stores each file's information in a database. The database stores the location and name and gives each file an ID (auto_increment).
Okay, so what I'm doing now is allowing people to insert images into posts. I'm throwing a few ideas around on the best way to do this, because the application I designed allows people to move files around, and I don't want images in posts to break if a file is moved to a different directory (hence the storing of IDs).
What I'm thinking of doing is when linking to images, instead of linking to the file directly, I link it like so:
<img src="/path/to/functions.php?method=media&id=<IMG_ID_HERE>" alt="" />
So it takes the ID, searches the database, then from there determines the mime type and what not, then spits out the image.
So really, my question is: Is this the most efficient way?
Note that on a single page there could be from 3 to 30 images, all making a call to this function.
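For concreteness, a minimal sketch of what that serving script could look like (the media table and its path/mime columns are placeholder names):
<?php
$id = (int) $_GET['id'];                              // the ID from the query string
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('SELECT path, mime FROM media WHERE id = ?');
$stmt->execute(array($id));
if ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    header('Content-Type: ' . $row['mime']);          // mime type stored alongside the file record
    readfile($row['path']);                           // stream the file from wherever it currently lives
} else {
    header('HTTP/1.1 404 Not Found');
}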
Doing that should be fine as long as you are aware of the memory limits configured in both PHP and the web server. (Though you'll run into those problems merely by receiving the file in the first place.)
Otherwise, if you're strict about this being just for images, it could prove more efficient to go with Mike B's approach. Design a static area and just drop the images off in there, and record those locations in the records for their associated post. It's less work, and less to worry about... and I'm willing to bet your web server is better at serving files than most developer's custom application code will be.
Normally, I would recommend keeping the src of an image static (instead of pointing at a PHP script). But if you're allowing users to move files around the filesystem, you need a way to track them.
Some form of caching would help reduce the number of database calls required to fetch the filesystem location of each image. Should be pretty easy to put an indefinite TTL on the cache and invalidate upon the image being moved.
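A rough sketch of that caching layer using APCu (the key scheme, table name, and indefinite TTL are assumptions):
function image_path($id, PDO $pdo) {
    $key = 'imgpath_' . $id;
    $path = apcu_fetch($key, $hit);                   // in-memory cache lookup
    if (!$hit) {
        $stmt = $pdo->prepare('SELECT path FROM media WHERE id = ?');
        $stmt->execute(array($id));
        $path = $stmt->fetchColumn();
        apcu_store($key, $path);                      // no TTL: valid until explicitly invalidated
    }
    return $path;
}
// whenever a file is moved, invalidate its entry:
// apcu_delete('imgpath_' . $movedId);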
I don't think you should worry about that, what you have planned sounds fine.
But if you want to go out of your way to minimise requests or whatever, you could instead do the following: when someone embeds an image in a post, replace the anchor tag with some special character sequence, like [MYIMAGE=1234] or something. Then when a page with one or more posts is viewed, search through all the posts to find all the [MYIMAGE=] sequences, query the database to get all of the images' locations, and then output the posts with the [MYIMAGE=] sequences replaced with the appropriate anchor tags. You might or might not want to make sure users cannot directly add [MYIMAGE=] tags to their submitted content.
The way you have suggested will work, and it's arguably the nicest solution, but I should warn you that I've tried something similar before and it completely fell apart under load. The database seemed to be keeping up, but the script would start to time out and the image wouldn't arrive. That was probably down to some particular server configuration, but it's worth bearing in mind.
Depending on how much access you have to the server it's running on, you could just create a symlink whenever the user moves a file. It's a little messy but it'll be fast and reliable, and will also handle collisions if a user moves a file to where another one used to be.
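A quick sketch of the symlink idea (paths are placeholders):
$old = '/var/www/uploads/gallery/pic.jpg';            // original location, still referenced by posts
$new = '/var/www/uploads/archive/pic.jpg';            // where the user moved it
rename($old, $new);                                   // move the real file
symlink($new, $old);                                  // old path keeps resolving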
Use the format proposed by Hammerite, and use [MYIMAGE=1234] tags (or something similar).
You can then fetch the id-path mappings before display, and replace the [MYIMAGE] tags with proper tags which link to images directly. This will yield much better performance than outputting images using php.
You could even bypass the database completely, and simply use image paths like (for example) /images/hash(IMAGEID).jpg.
(If there are different file formats, use [MYIMAGE=1234.png], so you can append png/jpg/whatever without a database call)
If the need arises to change the image locations, output method, or anything else, you only need to change the method where [MYIMAGE] tags are converted to full file paths.
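A minimal sketch of that conversion step (the [MYIMAGE] format is as proposed; the media table and helper name are hypothetical):
function expand_image_tags($text, PDO $pdo) {
    // collect all referenced IDs first, so one query covers the whole page
    preg_match_all('/\[MYIMAGE=(\d+)\]/', $text, $m);
    if (empty($m[1])) {
        return $text;
    }
    $in = implode(',', array_map('intval', array_unique($m[1])));
    $map = array();
    foreach ($pdo->query("SELECT id, path FROM media WHERE id IN ($in)") as $row) {
        $map[$row['id']] = $row['path'];
    }
    // swap each tag for a real img element (or drop it if the ID is unknown)
    return preg_replace_callback('/\[MYIMAGE=(\d+)\]/', function ($m) use ($map) {
        return isset($map[$m[1]])
            ? '<img src="' . htmlspecialchars($map[$m[1]]) . '" alt="" />'
            : '';
    }, $text);
}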

Keeping track of links or references to image files and deleting unused ones (PHP/Database)

I need a way to remove "unused" images from my filesystem, i.e. images that are never accessed from any point on my website (it doesn't matter if I break external links; I might disable external hotlinking altogether). What's the best way to go about this? Regular users can add multiple attachments to topics/posts, and content contributors can bulk-upload large numbers of images for use in articles or image galleries.
The problem is that the images could be referenced in any of the following ways:
From user content (text/html, possibly Markdown or BBCode) stored in the database
Hardcoded into an HTML page
Hardcoded into a PHP file
Hardcoded into a CSS file
As an "attachment" field in a database table, usually containing only the filename itself with no path, because the application assumes that it would be in a certain folder.
And to top it off, the path of the image could be an absolute or relative HTTP or PHP path and may or may not be built with string concatenation in PHP.
So obviously find/replace or regexing the database or filesystem is out of the question. But luckily for you and me, this system isn't fully implemented yet and I don't need anything that deals with an existing hoard of images. I just need to set up some efficient structure that will allow this in the future.
Some ideas I've thought of:
Intercepting the HTTP request for the image with PHP and keeping track of the HTTP_REFERER. The problem with this is that just because no one has clicked on a link at the time of checking doesn't mean the link doesn't exist.
Using extreme database normalization - i.e. making a table for images and using foreign keys for anything that references them. However, this would result in a metric craptonne of many-to-many relationships (and their crosstables), in addition to being impractical for any regular user to work with.
Backing up all the images, deleting them, and then checking every single 404 request, running a script each time that attempts to find the image in the backup folder and put it in the "real" folder. The problem is that this cache would have to be purged every so often, and the server might be strained while rebuilding it.
Ideas/suggestions? Is this just something you have to ignore and live with, even if you're making a site with a ridiculous number of images? Even if it's not worth it, how would something like this work, just as a proof of concept? (I added the garbage-collection tag because this might be going into that area conceptually.)
I will admit that my experience with this was simpler than yours. I had no 'user generated content' so to speak, and my images were all in templates or in the database with full paths. But what I did was create a perl script that:
Analyzed my HTML templates, database table, and CSS, and generated a list of files (in the HTML it looked for <img> tags; in the CSS it looked for any .png, .jp*g, or .gif regex matches; the tables were easy because I had an Image table for the image data)
Sorted the resulting file list to remove duplicates
Iterated through the list and wrote a CSV like filename,(CSS filename|HTML filename|DBTABLE),(exists|notexists) for auditing
In another iteration, renamed all files not in the list by appending .del to the filename
After regression testing, when called with a -docleanup flag, went through and deleted all the .del-appended files
If for whatever reason an image was tagged as .del and shouldn't have been, I just manually renamed it back to its original form.
A couple of notes: I realize that I could have made this script 'smoother' and done multiple things in multiple steps, but its use grew over time and I wanted clearly delineated processing steps so it couldn't ever run amok. I used the CSV to go back and clean up the information where the image didn't exist.
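If it helps as a starting point, here is a rough PHP sketch of the same audit idea (every path, table name, and pattern below is an assumption for illustration):
<?php
$referenced = array();
// 1. Scan templates and stylesheets for image filenames
//    (note: GLOB_BRACE is not available on every platform)
foreach (glob('/var/www/templates/*.{html,php,css}', GLOB_BRACE) as $tpl) {
    if (preg_match_all('/[\w\-\/]+\.(?:png|jpe?g|gif)/i', file_get_contents($tpl), $m)) {
        foreach ($m[0] as $f) {
            $referenced[basename($f)] = true;
        }
    }
}
// 2. Collect filenames recorded in the database
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
foreach ($pdo->query('SELECT filename FROM images') as $row) {
    $referenced[$row['filename']] = true;
}
// 3. Flag files on disk that nothing references; audit before deleting
foreach (glob('/var/www/uploads/*') as $file) {
    if (!isset($referenced[basename($file)])) {
        rename($file, $file . '.del');
    }
}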

How to protect HTML navigation

I am writing software for an e-learning platform. Validation is performed via PHP and MySQL. All content is uploaded into a folder protected from direct access by .htaccess, and content is only served to users via a PHP routine that validates student credentials, then fopen()s the file and sends it to the browser.
This is OK for all regular types of content (Flash, GIF, PDF, etc.), but it cannot be used for content uploaded as regular HTML pages and graphics.
Does anyone have a good idea for protecting this type of content as much as possible? I thought of placing it in a randomly named directory and linking to the content within an iframe to hide the address as much as possible, but is there a better way of doing this?
Thanks!
I may be a bit late on this, but in case somebody else looks for this... Maybe you can give those files an extra extension, and then tell Apache (or whichever server) that PHP will actually be handling those files.
For example, if you have an 'example.html' file, rename it to 'example.html.answers'.
Then when you need to serve the 'example.html' file, you actually request 'example.html.answers'.
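For reference, the Apache side of that trick can be a one-line handler mapping in .htaccess (the '.answers' extension is the hypothetical one from above, and the exact directive depends on how PHP is installed; with mod_php something like this works):
AddType application/x-httpd-php .answers
Any request for a *.answers file is then handed to PHP rather than served raw; combined with something like PHP's auto_prepend_file setting, your credential check can run before any content goes out.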
One way of solving this problem is to have an intermediary script serve the binary content (based on user privileges) from a protected location.
So a typical file download URL, instead of /resources/foo.pdf, would be /download.php?foo.pdf
You could serve non-HTML (or non-text) content as cherouvim suggests, and store HTML (all text, really) in a database table. Your download.php script would then have to be smart enough to query the database for .html etc. files.
The problem then is that any images or other assets would have the wrong URL (e.g. <img src="image.jpg"/> instead of <img src="download.php?image.jpg"/>). You'd have to rewrite the links to these assets in the HTML saved in the database -- you could probably do this with a combination of an XML parser (to pull out the img tags and other tags that link to files) and regular expressions to determine which links need rewriting (e.g. rewrite src="image.jpg" but not src="http://google.com/image.jpg").
Not the most elegant solution ever, but I think it would work.
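If you go the regex route for that rewrite, a rough sketch (it only touches relative src values, leaving absolute URLs alone; download.php is the scheme proposed above):
$html = preg_replace(
    '/src="(?!https?:\/\/)([^"]+)"/i',   // skip src values that already start with http(s)://
    'src="download.php?$1"',
    $html
);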
