I have a page that allows users to submit photos to the server. On another server, I need to have a page that will have access to those photos on the first server and give the possibility to upload/delete photos. What choices do I have, considering that I have full access to both servers and I don't want to use PHP FTP?
Thanks
I need to have a page that will have access to those photos on the first server and give possibility to upload/delete photos.
You need to have a look at Same Origin Policy:
In computing, the same origin policy is an important security concept for a number of browser-side programming languages, such as JavaScript. The policy permits scripts running on pages originating from the same site to access each other's methods and properties with no specific restrictions, but prevents access to most methods and properties across pages on different sites.
For your scripts to be able to get the data directly in the browser, it has to come from the same protocol and host.
You need to implement JSONP to work around it.
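As a minimal sketch of what such a JSONP endpoint on the photo server could look like (the endpoint name and payload here are assumptions, not anything from the original setup):

```php
<?php
// Hypothetical JSONP endpoint on the photo server (sketch).
// It wraps a JSON payload in the callback name supplied by the client,
// so a <script> tag on another domain can consume the response.
function jsonp_response(array $data, string $callback): string {
    // Whitelist the callback name to avoid script injection.
    if (!preg_match('/^[a-zA-Z_][a-zA-Z0-9_]*$/', $callback)) {
        $callback = 'callback';
    }
    return $callback . '(' . json_encode($data) . ');';
}

// Usage (sketch): GET /photos.php?callback=handlePhotos
// header('Content-Type: application/javascript');
// echo jsonp_response(array('photos' => array('a.jpg', 'b.jpg')), $_GET['callback']);
```

The whitelist on the callback name matters: echoing an arbitrary callback string back is a classic injection hole.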
One thing you can do is set up a shared drive (NAS, network-attached storage) between these boxes that the files will be stored on; then you will have access to the files from both servers.
The NAS shares can be mounted via NFS (assuming you're using Linux).
HTH
I have a number of different domains where I would like users to be able to select a photo; however, I want the photos uploaded/stored on one central, separate domain.
Is there a more advisable way to do this?
I've considered using an iframe from the other domain (so they are interacting with the required domain) but haven't tried it yet.
I have also read that curl can potentially do this.
Any other ideas/problems/concerns...
All advice appreciated.
thx
There are a couple ways you can handle this. Here are the scenarios I see.
Scenario A:
All domains are on the same server. With this setup you can actually help your server security by storing the images and other files in a directory that is not accessible by Apache. Then you use PHP to serve the file to the end user through virtual links. Drupal does this with its private file system.
If the users are able to upload images from the other domains then just have all of the domains write the images to the same directory, but then always return the "central" domain's URL for retrieving them.
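A sketch of that "write to one shared directory, always return the central URL" idea. The directory path, domain, and serving-script name are assumptions for illustration:

```php
<?php
// Assumed layout: every domain can write to SHARED_DIR, and a script
// (image.php) on the central domain serves files out of it. Both names
// are illustrative, not part of the original setup.
define('CENTRAL_DOMAIN', 'http://central.example.com');
define('SHARED_DIR', '/var/private/uploads');   // not reachable by Apache

// Reduce a client-supplied filename to something safe to put on disk.
function sanitize_name(string $name): string {
    return preg_replace('/[^A-Za-z0-9._-]/', '_', basename($name));
}

// URL on the central domain that the serving script would answer for.
function central_url(string $safeName): string {
    return CENTRAL_DOMAIN . '/image.php?file=' . rawurlencode($safeName);
}

// Called from any domain's upload form handler.
function store_upload(string $tmpPath, string $clientName): string {
    $safe = sanitize_name($clientName);
    if (!move_uploaded_file($tmpPath, SHARED_DIR . '/' . $safe)) {
        throw new RuntimeException('could not store upload');
    }
    return central_url($safe);
}
```

Whichever domain handles the upload, the URL handed back to the page always points at the central domain.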
Scenario B:
The domains exist across two or more servers. With this setup what you really need to do is setup an API where the other domains communicate behind the scenes to send the photo to the core repository domain. This domain saves it and returns a URL for accessing it.
In either case you should really look into CDN technology. It's basically what you're trying to accomplish, plus a lot more. I would also recommend, in either scenario, that you always use the path on the central domain for all the images instead of returning them through the current domain.
No reason to get iframes or curl involved. Assuming that all these domains are on the same server (or same cluster of servers), there is nothing which requires that a file uploaded using a form on one domain must only be served from that domain. An upload form can do whatever it wants with the uploaded file, including storing it in a location where it'll be available from your central domain.
I'm looking for some quick info about best practices on storing user's uploaded files on different servers or sub domains...
For example, photos on Facebook of course aren't on facebook.com/files/users/453485 etc.,
but rather on photos.ak.fbcdn.net or whatever...
I'm wondering how, with PHP, I can upload to a different server whilst maintaining a MySQL connection to my original. Is it possible?
Facebook uses a content delivery network (CDN, hence fbcdn: Facebook content delivery network) and probably uses web services to pass binary data (photos) from server to server.
Rackspace Cloud offers a similar service. Here is an example application of their PHP library to access their webservice api: http://cloudfiles.rackspacecloud.com/index.php/Sample_PHP_Application
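To the "upload to a different server while keeping my local MySQL connection" part: one common pattern is to POST the file to the other server with cURL while the database work stays local. A hedged sketch; the endpoint URL and form-field name are assumptions:

```php
<?php
// Sketch: push an uploaded photo to a central server over HTTP with cURL.
// The MySQL connection on the local server is unaffected; this is just an
// outbound HTTP request. Endpoint and field name are made up for the example.
function push_to_central(string $localPath, string $endpoint): bool {
    $ch = curl_init($endpoint);
    curl_setopt_array($ch, array(
        CURLOPT_POST           => true,
        CURLOPT_RETURNTRANSFER => true,
        // CURLFile attaches a file to a multipart POST (PHP 5.5+).
        CURLOPT_POSTFIELDS     => array('photo' => new CURLFile($localPath)),
    ));
    $body = curl_exec($ch);
    $ok   = ($body !== false && curl_getinfo($ch, CURLINFO_HTTP_CODE) === 200);
    curl_close($ch);
    return $ok;
}
```

On the receiving server, a small script would accept the POST and save `$_FILES['photo']` with move_uploaded_file(), exactly as it would for a browser upload.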
I'm going to make the assumption that you have multiple webservers, and want to be able to access the same set of files on each one. In that case, some sort of shared storage that each machine can access might be a good place to start.
Here are a couple options I've used:
Shared NFS Volume [ http://en.wikipedia.org/wiki/Network_File_System_(protocol) ]
MogileFS [ http://www.danga.com/mogilefs/ ]
Amazon S3 [ http://aws.amazon.com/s3/ ]
If you don't have control over the hardware or aren't able to install extra software, I'd suggest Amazon S3. There is an API you can use to shuttle files back and forth. The only downside is that you don't get to use storage you may already have, and it will cost you some money.
If you do have access to the hardware and software, MogileFS is somewhat like S3 in that you have an API to access the files, but different in that you get to use your existing storage, at no additional cost.
NFS is a typical place to start because it's the simplest way to get going. The downside is that you'll have to configure the servers and set up an NFS volume for them to mount.
But if I were starting a high-volume photo hosting service, I'd use S3, and I'd put a CDN like Akamai in front of it.
I have a file that is being linked to from other sub websites.
The file: http://site.com/file.img
Website A linking to it: <img src="http://site.com/file.img" />
Website B linking to it: <img src="http://site.com/file.img" />
I need to reliably identify which of these websites has accessed the file, but I know that $_SERVER['HTTP_REFERER'] can be spoofed. What other ways do I have to reliably confirm the requesting site? By IP? Get them to register an IP? Set up an API key? What options are there?
If a website is only linking to a file, the "website" itself will never actually access your image. Instead, the client who's viewing the site will make a request for the image.
As such, you're depending on information sent by the client, which is completely out of your control and not reliable at all. If you have the opportunity to set some sort of unique cookie on the client, you may be able to use this in some fashion for extended identification, but even that won't be reliable.
There is no 100% reliable solution.
Getting the referrer is the best you can do without getting into complicated territory.
If you don't mind complicated, then read on: set up your Web server to serve file.img only to Website A and Website B, then require that Website A and Website B set up a proxy configuration on their end that will retrieve file.img on behalf of their visitors.
Example:
A visitor to Website A loads a page that contains an image tag like <img src="http://websiteA.com/file.img"/> (note reference to Website A rather than your site). Client requests file.img from WebsiteA.com accordingly. Website A is configured to proxy requests for the path /file.img to your server, http://site.com/file.img. Your site verifies that it is in fact Website A that is requesting the image and then serves it to Website A's proxy. Website A then serves it to the visitor.
Basically, that makes it a pain for Websites A and B, gives you a performance hit, and also requires further configuration on your part. But I imagine that would satisfy your requirement.
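The verification step on your server could be as simple as a pre-shared secret between you and each proxying site. A sketch, assuming the proxy adds a signature header; the header name and HMAC scheme are my own illustration, not part of the proposal above:

```php
<?php
// Sketch: verify that a request for file.img really comes from an
// authorized proxy (Website A or B). Each site holds a shared secret and
// signs the requested path; signing avoids sending the raw secret at all.
function expected_signature(string $path, string $secret): string {
    return hash_hmac('sha256', $path, $secret);
}

function is_authorized_proxy(string $path, string $givenSig, string $secret): bool {
    // hash_equals() compares in constant time, resisting timing attacks.
    return hash_equals(expected_signature($path, $secret), $givenSig);
}

// Serving side (sketch, header name is hypothetical):
// $sig = isset($_SERVER['HTTP_X_PROXY_SIGNATURE']) ? $_SERVER['HTTP_X_PROXY_SIGNATURE'] : '';
// if (is_authorized_proxy('/file.img', $sig, $sharedSecret)) { /* readfile(...) */ }
```

Combined with an IP allow-list per site, this gives you a much stronger signal than the Referer header ever could.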
Have a look at how OpenID relying parties are implemented; they allow one site to authenticate against another. The protocol specification will give a hint at the effort and overhead required to reliably implement such a scheme.
http://googlecode.blogspot.com/2010/11/googles-sample-openid-relying-party.html
I am planning to build a www portal in PHP, where many pictures will be stored. I decided to store pictures in directories on the server (not the DB) for performance reasons. Some pictures will be accessible to all users from the internet, and some (if the user sets them private) not, according to session ID. What is the best solution to this problem? Performance is important. Should I use some mod_rewrite, or move private files to a different directory than the public one?
John
Do not allow users to hot-link to those images. Ignoring this means you'll be allowing freeloaders to steal network resources from other legitimate users.
Make sure those images are not in a Web-accessible path.
Find a way to throttle traffic. You don't want one user hogging bandwidth that should have been allocated to other users.
Store the private files outside of your wwwroot, accessible only via a PHP script that checks access rights and either forwards to a 403 page or serves the image, via readfile() for example.
There are two options:
Store the pictures in some publicly accessible folder of your site root, name the files such that the filename cannot be guessed (randomly generated and non-sequential), and make sure directory indexing is turned off. Only serve links to images that a user is authorized to see. Provided they don't share the links to their private photos, it will be very hard for people to accidentally (or intentionally) stumble across private photos.
Store the pictures outside of your site root, and serve them via a script which authenticates against the session. There are examples of how to read an image into memory and output it to the browser on the imagejpeg page of PHP.net
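A minimal sketch of option 2: a gatekeeper script serving a private image after an authorization check. The directory, session key, and ownership rule below are assumptions; adapt them to your own schema:

```php
<?php
// Map a client-supplied name onto the private directory, refusing path
// traversal, and return null if the file does not exist.
function resolve_private_file(string $name, string $dir): ?string {
    $safe = basename($name);            // drops "../" and directory parts
    $path = $dir . '/' . $safe;
    return ($safe !== '' && is_file($path)) ? $path : null;
}

// Gatekeeper usage (sketch; session layout is hypothetical):
// session_start();
// $path  = resolve_private_file(isset($_GET['file']) ? $_GET['file'] : '',
//                               '/var/photos/private');
// $owned = $path !== null
//       && isset($_SESSION['owned_files'])
//       && in_array(basename($path), $_SESSION['owned_files'], true);
// if (!$owned) { header('HTTP/1.0 403 Forbidden'); exit; }
// header('Content-Type: image/jpeg');
// header('Content-Length: ' . filesize($path));
// readfile($path);   // streams the file without buffering it all in memory
```

readfile() is preferable to imagejpeg() here when no manipulation is needed, since it streams the bytes straight through instead of decoding and re-encoding the image.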
I am currently working on 2 web servers, One Coldfusion and the other PHP.
Right now, the Coldfusion server is my main server where users log in to access restricted data.
However, I have also begun using a PHP server and want to make it transparent for users to access a specific page on that server - that server requires log in information as well.
I do not want the users to log in twice.
Is there a way to accomplish this ?
Thx
UPDATE: Working in an Intranet environment, so I can't use any public solution.
UPDATE: Reason I am asking for this is because we are moving from a MSQL / Coldfusion environment (Initial server) to a PHP / ORACLE (new server). So I have 2 user tables as well (although they contain mostly the same information).
I am trying to phase out the use of our initial server in favor of our new server, transparently to the user, and thus I have to run both in parallel for the time being.
Most single-sign-on solutions work a bit like this...
Main system authenticates the user
User initiates a move to system 2
Main system authenticates the user with system 2 in the background
System 2 supplies a random, long and disposable token to Main system
Main system redirects the user, with the token, to system 2
System 2 checks the token (and other factors such as IP address) to validate the session
System 2 disposes of the token to ensure it can't be replayed
You would want to ensure that the transmission channels have some security on them, especially where the Main system and system 2 talk to each other in the background; that link should use a secure transport.
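The token steps above can be sketched in a few lines. The storage here is an in-memory array purely for illustration; a real deployment would keep tokens in a store both systems can reach, and all names are assumptions:

```php
<?php
// Sketch of the single-sign-on token hand-off described above.
// $tokens maps token => array('user' => ..., 'ip' => ..., 'expires' => ...).

// System 2: issue a random, long, short-lived, single-use token.
function issue_token(array &$tokens, string $user, string $ip, int $ttl = 60): string {
    $token = bin2hex(random_bytes(32));   // 64 hex chars, unguessable
    $tokens[$token] = array('user' => $user, 'ip' => $ip, 'expires' => time() + $ttl);
    return $token;
}

// System 2: validate once, then dispose of the token so it cannot be replayed.
function redeem_token(array &$tokens, string $token, string $ip): ?string {
    if (!isset($tokens[$token])) {
        return null;
    }
    $entry = $tokens[$token];
    unset($tokens[$token]);               // single use, disposed before any check
    if ($entry['expires'] < time() || $entry['ip'] !== $ip) {
        return null;
    }
    return $entry['user'];
}
```

Note that the token is deleted before the expiry/IP checks, so even a failed redemption consumes it, which is what prevents replay.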
Store sessions in a database, and share them between the two apps.
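One hedged sketch of such a shared handler: both apps register the same database-backed session handler against the database they share. The table name and schema are assumptions, and SQLite stands in below for whatever database the two applications actually point at:

```php
<?php
// A database-backed session handler; a session written by one server
// becomes readable on the other because both use the same table.
class DbSessionHandler implements SessionHandlerInterface {
    private $db;

    public function __construct(PDO $db) {
        $this->db = $db;
        $this->db->exec('CREATE TABLE IF NOT EXISTS sessions
                         (id VARCHAR(128) PRIMARY KEY, data TEXT, mtime INTEGER)');
    }
    public function open(string $path, string $name): bool { return true; }
    public function close(): bool { return true; }
    public function read(string $id): string {
        $st = $this->db->prepare('SELECT data FROM sessions WHERE id = ?');
        $st->execute(array($id));
        $data = $st->fetchColumn();
        return $data === false ? '' : $data;
    }
    public function write(string $id, string $data): bool {
        $st = $this->db->prepare('REPLACE INTO sessions (id, data, mtime) VALUES (?, ?, ?)');
        return $st->execute(array($id, $data, time()));
    }
    public function destroy(string $id): bool {
        return $this->db->prepare('DELETE FROM sessions WHERE id = ?')->execute(array($id));
    }
    public function gc(int $maxLifetime): int {
        $st = $this->db->prepare('DELETE FROM sessions WHERE mtime < ?');
        $st->execute(array(time() - $maxLifetime));
        return $st->rowCount();
    }
}

// Registration (the same two lines in both apps, pointed at the shared DB):
// session_set_save_handler(new DbSessionHandler($sharedPdo), true);
// session_start();
```

Both apps also need to see the same session cookie, so this works best when they share a parent domain (or pass the session ID explicitly during the hand-off).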
You could use XML-RPC to get user data and log the user into the other site when they have a login cookie for the first one, and vice versa.
PHP manual page for XML-RPC
Here is what I have done when running my own game server: I had users on SQL Server and on MySQL, and wanted to integrate them both.
I made sure that if a user was created on one system, it was also created on the other.
So you can modify the code in both applications to automatically create a user in the other system when one is created here.
Depending on whether both servers share a domain, you could do cross-domain sessions or cookies... but my best guess is to store and retrieve data...
Or..
as a person logs in/registers, record their current IP address on both servers, then check if this person was on the other server within 2-5 minutes; if so, use the IP address to identify them....
This system is tricky because timing is important, so you're not leaving a huge hole in your security.... But for the short term, going between servers, this is the simplest solution, in my opinion.
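The IP + time-window check described above can be sketched as a single query. The `logins` table and its columns are invented for illustration:

```php
<?php
// Returns true if the same IP address logged in on the other system within
// the last $window seconds. Assumed schema: logins(ip TEXT, at INTEGER),
// where `at` is a Unix timestamp recorded at login.
function seen_recently(PDO $db, string $ip, int $window = 300): bool {
    $st = $db->prepare('SELECT COUNT(*) FROM logins WHERE ip = ? AND at > ?');
    $st->execute(array($ip, time() - $window));
    return (int)$st->fetchColumn() > 0;
}
```

The short window is the only thing limiting the hole here: IPs are shared behind NAT and proxies, which is why this is a stop-gap rather than real authentication.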
Good Luck.
If you are on an intranet, you can actually sniff out the network username of the user from the PC they are logged into the network on using PHP. This assumes that:
You are using IIS to host your PHP application.
Your users are using Windows.
Check the section "2.2 Enabling Support for Detecting Usernames" here.
After that, all you need to do is investigate if the same is possible from Coldfusion, and you have the basis of an SSO solution based on the network usernames.
How about implementing an OpenID solution, much like the one apparent on StackOverflow?
You may benefit from dropping a shared object on the client machine via Flash or Flex. This object could then be read from ColdFusion/PHP/Python on servers that otherwise had no connection to each other or access to a common database.
Here is a simple example from the Adobe Docs
Maintain local persistence. This is the simplest way to use a shared object, and does not require Flash Media Server. For example, you can call SharedObject.getLocal() to create a shared object in an application, such as a calculator with memory. When the user closes the calculator, Flash Player saves the last value in a shared object on the user's computer. The next time the calculator is run, it contains the values it had previously. Alternatively, if you set the shared object's properties to null before the calculator application is closed, the next time the application runs, it opens without any values.
Another example of maintaining local persistence is tracking user preferences or other data for a complex website, such as a record of which articles a user read on a news site. Tracking this information allows you to display articles that have already been read differently from new, unread articles. Storing this information on the user's computer reduces server load.
Full Information: http://livedocs.adobe.com/flex/3/langref/flash/net/SharedObject.html