Hi,
I have a flash application (working demo here) which I'm using to let users take pictures of themselves on my website. The application takes the picture, then lets the user save it by invoking a PHP script located in the same directory as the Flash file.
Testing this on my local machine works fine: the picture saves as intended. Once on my server, however, saving no longer works. The Flash itself runs fine and pictures can be taken, but the save button does nothing. Why would the two environments differ in such a specific way, and what might be preventing the save function from working?
You might just need to make the directory you want to save the image into writable. The default setting with most hosting companies is read-only. You can usually change this from your hosting company's control panel.
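If you want to confirm that permissions really are the problem before changing anything, a quick server-side check can help. This is only a diagnostic sketch; the captures directory is a placeholder for wherever your save script writes the image.

<?php
// Quick diagnostic: is the directory the save script writes to actually writable?
// 'captures' is a placeholder - point it at the directory your save script uses.
$saveDir = __DIR__ . '/captures';

if (!is_dir($saveDir)) {
    echo "Directory does not exist: $saveDir";
} elseif (!is_writable($saveDir)) {
    echo "Directory exists but is not writable by the web server user: $saveDir";
} else {
    echo "Directory is writable - the problem is probably elsewhere.";
}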
I am working on a website. I do most of my development work on localhost. Whenever I have to update the website, I delete all the files on the server and upload everything from the local website folder. Previously there were only a few files so this wasn't a problem, but now the files are large and it takes quite a lot of time to upload them all. Is there any way/software that would help me upload only the updated files? My website is based on PHP and MySQL. Thanks - this is my first post.
You should use a version control system to manage cases like this.
There are many good tools available for version control, such as Git and SVN.
You can try Cobian Backup with an incremental backup to the web server. Just create a task and the program uploads everything that has changed since the last "backup".
I am designing a web-based file-management system that can be conceptualised as 3 different servers:
The server that hosts the system interface (built in PHP) where users 'upload' and manage files (no actual files are stored here, it's all meta).
A separate staging server where files are placed to be worked on.
A file-store where the files are stored when they are not being worked on.
All 3 servers will be *nix-based on the same internal network. Users, working on Windows machines, will use a web interface to create an initial entry for a file on Server 1. This file will be 'uploaded' to Server 3, either from the user's local drive (if the file doesn't currently exist anywhere on the network) or from another network drive on the internal network.
My question relates to the best programmatic approach to achieve what I want to do, namely:
When a user uploads a file (selecting the source via a web form) from the network, the file is transferred to Server 3 as an inter-network transfer, rather than passing through the user (which I believe is what would happen if it was sent as a standard HTTP form upload). I know I could set up FTP servers on each machine and attempt to FXP files between locations, but is this preferable to having PHP execute a command on Server 1 (which will have global network access) to perform the cross-network transfer?
The second problem is that these are very large files we're talking about, at least a gigabyte or two each, and so transfers will not be instant. I need some method of polling the status of the transfer, and returning this to the web interface so that the user knows what is going on.
Alternatively this upload could be left to run asynchronously to the user's current view, but I would still need a method to check the status of the transfer to ensure it completes.
So, if using an FXP solution, how could polling be achieved? If using a file move/copy command from the shell, is any form of polling possible? PHP/jQuery solutions would be very acceptable.
The final part of this question relates to Windows network drive mapping. A user may select a file from an arbitrarily mapped drive. Their G:\ may point to \\server4\some\location\therein, but presumably any drive path given to the server via a web form will only send the G:\ file path. Is there a way to determine the 'real path' of mapped network drives?
Any solution would be used to stage files from Server 3 to Server 2 when the files are being worked on - the emphasis being on these giant files not having to pass through the user's local machine first.
Please let me know if you have comments and I will try to make this question more coherent if it is unclear.
As far as I’m aware (and I could be wrong) there is no standard way to determine the UNC path of a mapped drive from a browser.
The only way to do this would be to have some kind of control within the web page. It could be ActiveX or maybe Flash. I've seen ActiveX doing this, but not Flash.
In the past, when designing web-based systems that need to know the UNC path of a user's mapped drive, I've had to keep a translation from drive letter to UNC path stored server side. I did have the luxury, though, of knowing which drive would map to which UNC path. If the user can set arbitrary paths then this obviously won't work.
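As a rough illustration of that server-side translation (reusing the G:\ example from the question; the mappings and the helper function are made up for the sketch):

<?php
// Server-side lookup table: drive letter as the client sees it -> UNC path.
// These mappings are invented examples and would need to match your environment.
$driveMap = [
    'G:' => '\\\\server4\\some\\location\\therein',
    'H:' => '\\\\server5\\projects',
];

// Turn a client-supplied path like "G:\footage\clip.mov" into a UNC path.
function resolveUncPath($clientPath, array $driveMap)
{
    $drive = strtoupper(substr($clientPath, 0, 2)); // e.g. "G:"
    if (!isset($driveMap[$drive])) {
        return null; // unknown drive letter - reject or ask the user
    }
    $rest = str_replace('/', '\\', substr($clientPath, 2));
    return rtrim($driveMap[$drive], '\\') . $rest;
}

As noted, this only works when the drive letters are predictable rather than arbitrary.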
Ok, as I’m procrastinating and avoiding real work I’ve given this some thought.
I’ll preface this by saying that I’m in no way a Linux expert and the system I’m about to describe has just been thought up off the top of my head and is not something you’d want to put into any kind of production. However, it might help you down the right path.
So, you have 3 servers, the Interface Server (LAMP stack I’m assuming?) your Staging Server and your File Store Server. You will also have Client Machines and Network Shares. For the purpose of this design your Network Shares are hosted on nix boxes that your File Store can scp from.
You’d create your frontend website that tracks and stores information about files etc. This will also hold the details about which files are being copied, which are in Staging and so on.
You’ll also need some kind of Service running on the File Store Server. I’ll call this the File Copy Service. This will be responsible for copying the files from the servers hosting the network shares.
Now, you’ve still got an issue with how you figure out what path the user’s file is actually on. If you can stop users from mapping their own drives and force them to use consistent drive letters, then you could keep a translation of drive letter to UNC path on the server. If you can’t, well, I’ll let you figure that out. If you’re in a Windows domain you can force the drive mappings using Group Policies.
Anyway, the process for the system would work something like this.
User goes to system and selects a file
The Interface Server takes the file path and calls the File Copy Service on the File Store Server
The File Copy Service connects to the server that hosts the file and initiates the copy. If they’re all *nix boxes you could easily use something like SCP. Now, I haven’t actually looked up how to do it, but I’d be very surprised if you can’t get a running total of percentage complete from SCP as it’s copying. With this running total the File Copy Service keeps updating the database on the Interface Server with how the copy is doing, so the user can see the progress from the Interface Server (a rough sketch of this follows the list).
The File Copy Service can also be used to move files from the File Store to the staging server.
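Here is a very rough sketch of what that File Copy Service loop might look like in PHP. It is only an illustration of the idea: I've used rsync over SSH rather than plain scp because its --progress output is easy to parse, and the database table, column names, host names and file paths are all invented for the example.

<?php
// File Copy Service sketch: copy a file from the server hosting the network share
// to the File Store, and keep the Interface Server's database updated with progress.
// Everything concrete here (DSN, table, paths, hosts) is a made-up placeholder.

$sourceSpec = 'copyuser@share-server:/exports/projects/bigfile.bin';
$destPath   = '/filestore/incoming/bigfile.bin';
$transferId = 42; // row created by the Interface Server when the user requested the copy

$pdo    = new PDO('mysql:host=interface-server;dbname=filemanager', 'copysvc', 'secret');
$update = $pdo->prepare('UPDATE file_transfers SET percent_done = ?, status = ? WHERE id = ?');

$cmd = sprintf(
    'rsync --progress -e ssh %s %s',
    escapeshellarg($sourceSpec),
    escapeshellarg($destPath)
);

$proc = proc_open($cmd, [1 => ['pipe', 'w'], 2 => ['pipe', 'w']], $pipes);
if (!is_resource($proc)) {
    exit("could not start rsync\n");
}

stream_set_blocking($pipes[1], false);
$output = '';
do {
    $status  = proc_get_status($proc);
    $output .= stream_get_contents($pipes[1]);
    // rsync's progress lines contain a percentage, e.g. "123,456,789  42%  1.2MB/s ..."
    if (preg_match_all('/(\d+)%/', $output, $m)) {
        $update->execute([(int) end($m[1]), 'copying', $transferId]);
    }
    if ($status['running']) {
        sleep(2); // poll every couple of seconds
    }
} while ($status['running']);

$ok = ($status['exitcode'] === 0);
$update->execute([$ok ? 100 : 0, $ok ? 'done' : 'failed', $transferId]);

fclose($pipes[1]);
fclose($pipes[2]);
proc_close($proc);

The web interface could then have a small PHP endpoint that reads percent_done from that table, with jQuery polling it to drive a progress bar.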
As I said, this is very roughly thought out. The above would work, but it all depends a lot on how your systems are set up.
Having said all that though, there must be software that would do this out there. Have you looked?
If I am right, this is the architecture:
[architecture diagram]
1.)
First, let's solve the issue of "inter-server transfer".
I would solve this by mounting the file systems of Servers 2 and 3 on Server 1 via NFS.
https://help.ubuntu.com/8.04/serverguide/network-file-system.html
That way PHP can store files directly on the file system and doesn't need to know which server a file really lives on.
/etc/exports on Servers 2 and 3:
/directory/with/files 192.168.IPofServer.1(rw,sync)
then run: exportfs -ra
/etc/fstab on Server 1:
192.168.IPofServer.2:/directory/with/files /mnt/server2 nfs rsize=8192,wsize=8192,timeo=14,intr 0 0
192.168.IPofServer.3:/directory/with/files /mnt/server3 nfs rsize=8192,wsize=8192,timeo=14,intr 0 0
then run: mount -a
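Once those mounts are in place, the upload handler on Server 1 just writes to what looks like a local path. A minimal sketch, assuming the /mnt/server3 mount point from the fstab above and a form field named "upload":

<?php
// Minimal upload handler on Server 1: the file lands on Server 3 via the NFS mount,
// but to PHP it is just a local directory. Field name and paths are assumptions.
$target = '/mnt/server3/' . basename($_FILES['upload']['name']);

if (move_uploaded_file($_FILES['upload']['tmp_name'], $target)) {
    echo 'Stored on the file store: ' . htmlspecialchars($target);
} else {
    echo 'Save failed - check that the mount is writable by the web server user.';
}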
2.)
Getting upload progress for really large files:
here are some possibilities for showing a progress bar for HTTP uploads.
But for a resume function you would have to use a Flash plugin.
http://fineuploader.com/#demo
https://github.com/valums/file-uploader
Or you can build it yourself using the APC extension:
http://www.amwsites.com/blog/2011/01/use-a-combination-of-jquery-php-apc-uploadprogress-to-show-progress-bar-during-an-upload/
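A small sketch of the APC route, for illustration only: it assumes apc.rfc1867 = 1 is set in php.ini and that the upload form contains a hidden field named APC_UPLOAD_PROGRESS (placed before the file input) whose value is a unique key. The endpoint below is what a jQuery poller would call every second or so.

<?php
// progress.php - returns the upload progress for the given key as JSON.
// Requires the APC extension with apc.rfc1867 = 1; key/field names are assumptions.
$key    = isset($_GET['key']) ? $_GET['key'] : '';
$status = $key !== '' ? apc_fetch('upload_' . $key) : false;

header('Content-Type: application/json');

if ($status === false || empty($status['total'])) {
    echo json_encode(['percent' => 0, 'done' => false]);
    exit;
}

echo json_encode([
    'percent' => (int) round($status['current'] / $status['total'] * 100),
    'done'    => !empty($status['done']),
]);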
3.)
Letting the server load files from a network drive.
I would try this with a Java applet that figures out the real network path and sends it to the server, so the server can fetch the file in the background.
But I have never done anything like this before and have no further information.
I have a website which has a lot of confidential data and code which I have custom made. I have hired a developer to do the designing and some simple PHP integration for me.
To prevent him from seeing all the files, I made a test environment in one of the subfolders, like mywebsite.com/testfolder.
Now I want him to access the db_test.php, function.php and parameter.php files which are located in the root folder, such that he can just include them while executing the scripts (for example mywebsite.com/testfolder/mainfile.php) but not download them (with a PHP script or by any other means). The idea is to prevent him from seeing the code and have him just use the files as they are.
This would also mean that his access to the root folder should also be completely restricted, except for the above-mentioned files.
I have created a test database and a separate user for him so the database bit is secured.
I have also created an FTP user which can only access the testfolder through FTP.
What I am concerned about is that he might run a PHP script that reveals everything in the root folder.
I have myself been able to list and download files by running a simple PHP script from testfolder.
Please suggest how to make this work, as I am planning to have a virtual team working on the website with restricted access to various resources.
RULE NUMBER ONE: never develop on a live project.
You may create a development environment (i.e. a separate web site) somewhere else, put some meaningless files and/or databases there, and allow your developers full access. Then, from time to time, you update your working copy from the repository (you have set up an hg/git repo, haven't you?), review and test the changes, and only then upload the files to your main web site.
I have a webapp whereby people are allowed to upload files. The webapp and upload form run on VPS1 (24GB); I have another server called VPS2 (1TB). I want users to use the webapp to upload files and for the files to be stored on VPS2. However, I'm not sure of the best way to do this: would I upload the file to VPS1 and then transfer it to VPS2 via FTP (or other methods)? Or should I upload it directly to VPS2 using a POST to a web server running on VPS2? This has to be scalable, as I will be adding more web servers in the future.
I had thought about putting all the storage VPS servers in a PHP array and randomly selecting which one to post files to. But I'm not sure; I'm really lost and would appreciate some advice.
1. You can post your files to a PHP script on VPS2 and store the files there. That's a good option, and for scalability you can choose which server to use depending on which is nearest to the client, or pick one randomly (see the sketch below). This is the best option I see here; the rest of the work is in your database.
2. You can also move a certain number of files over to VPS2 with a Linux script when the disk is full, over their local IPs, in case you have a local IP shared between the servers.
But still, the first option is better: you can have different subdomains for the different web servers, like vps1.domain.com/file01 and vps2.domain.com/file02 and so on, and obviously the scripts on the different servers will depend on your sessions, cookies and database.
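A tiny sketch of option 1, assuming each storage VPS exposes its own upload.php and that the form rendered on VPS1 simply posts straight to whichever one is chosen (the host names and endpoint are made up):

<?php
// On VPS1: pick a storage server and let the browser upload directly to it,
// so the file bytes never pass through VPS1. Hosts/endpoint are placeholders.
$storageServers = [
    'https://vps2.domain.com/upload.php',
    // add more storage VPSes here as you scale out
];
$action = $storageServers[array_rand($storageServers)];
?>
<form action="<?php echo htmlspecialchars($action); ?>" method="post" enctype="multipart/form-data">
    <input type="file" name="file">
    <button type="submit">Upload</button>
</form>

The upload script on the chosen storage server would then record where the file was stored back in the central database, which is the "rest of the work" mentioned above.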
I've been working in Dreamweaver CS4 on two computers, both by accessing my hosting server (Bluehost) via the FTP feature in DW. Whenever I need to edit a file, I double-click the file on my server in the Files Manager and it opens it with the submenu directly under the file name. Here's the example:
The issue I am having is when I edit includeA.php (which is included in PageA.php) on Computer1 and then go home to edit PageA.php on Computer2. I then save, which then saves over my changes made to includeA.php from Computer1.
Essentially, is there any way to automatically update included files when opening a file? Or even, when I start DW, to update all of my files in a certain directory? I believe I have tried refreshing my working directory before I edited includeA.php, but that did not help... I think... I have set up a test for this and will be testing it by my return here (Computer2) tomorrow.
Any ideas? Thanks all!
Maybe you want to use a revision control system that solves these problems. Git?