If I'm using Windows Azure and an ASP.NET web site with a PHP script to upload files, can I access those files from the server, or must I use the Data Storage facilities?
i.e. I'd like to reference the files directly from the HTML/server... etc. I think I probably should be able to.
Thank you.
How are you hosting your ASP.NET website? If it's hosted in a Web Role, then you won't have access to a persistent "standard" file system for that role (especially if you've scaled out over multiple instances).
Have a look at the following tutorial on using blob storage from PHP. http://www.windowsazure.com/en-us/develop/php/how-to-guides/blob-service/
You can quite easily use blob storage and access the files with standard HTTP links from your ASP.NET site, e.g.:
http://your-storage-account.blob.core.windows.net/your-container/file.txt
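For illustration, an upload with the Windows Azure SDK for PHP (the API the tutorial above uses) looks roughly like this; the account name, key, container, and file names are placeholders:

```php
<?php
require_once 'vendor/autoload.php';

use WindowsAzure\Common\ServicesBuilder;

// Placeholder credentials -- substitute your own storage account and key.
$connectionString = 'DefaultEndpointsProtocol=https;'
    . 'AccountName=your-storage-account;AccountKey=your-key';

$blobRestProxy = ServicesBuilder::getInstance()->createBlobService($connectionString);

// Store the uploaded file as a block blob in an existing container.
$content = fopen($_FILES['upload']['tmp_name'], 'r');
$blobRestProxy->createBlockBlob('your-container', 'file.txt', $content);

// If the container's access level is public, the blob is now reachable at:
// http://your-storage-account.blob.core.windows.net/your-container/file.txt
```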
Related
I have a project that needs to output hundreds of photos from one template file, compress them into a .zip file, and push the archive to the customer's browser. After that, the .zip file can be deleted.
Google App Engine (PHP) does not allow you to write files as you would on a standard web server.
How can this be accomplished with GAE flexible?
As you already know, App Engine flexible does not let you rely on writing files to the local file system, even though it runs on a VM. The reason is that your app runs in one or more Docker containers, so you have no guarantee that a later request will find a file written by an earlier one.
An alternative is to change your workflow a bit and use Cloud Storage as an intermediary. You can send the photos directly to Cloud Storage, and the users will be able to download them directly from Cloud Storage. Here you have a guide on how to achieve this from App Engine flex for PHP.
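A rough sketch of that workflow with the google/cloud-storage client ('my-bucket' and the file names are made up here): build the .zip in ephemeral scratch space during the request, upload it to Cloud Storage, and hand the customer a short-lived signed URL instead of serving the file from the instance:

```php
<?php
require_once 'vendor/autoload.php';

use Google\Cloud\Storage\StorageClient;

// Build the archive in scratch space; it only has to survive this request.
$zipPath = sys_get_temp_dir() . '/photos-' . uniqid() . '.zip';
$zip = new ZipArchive();
$zip->open($zipPath, ZipArchive::CREATE);
foreach ($photoPaths as $name => $path) { // $photoPaths: your generated photos
    $zip->addFile($path, $name);
}
$zip->close();

// Upload the archive to Cloud Storage ('my-bucket' is a placeholder).
$storage = new StorageClient();
$bucket  = $storage->bucket('my-bucket');
$object  = $bucket->upload(fopen($zipPath, 'r'), ['name' => 'photos.zip']);

// Give the customer a time-limited download link, then clean up locally.
$downloadUrl = $object->signedUrl(new \DateTime('+1 hour'));
unlink($zipPath);
```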
I'm creating a SPA developed in Angular2 with a backend server developed in Laravel.
The Angular2 app plays video files stored with different cloud providers (Google Drive, an SMB server, Dropbox, OneDrive, ...), but those providers are handled by the Laravel API.
Is there a way to serve a file stored with those providers without downloading the file to the Laravel server first?
Any help would be appreciated.
Thanks.
EDIT
The files are not publicly accessible. Authentication is set up and working in the Laravel backend.
I presume that by "download" you mean you want to avoid the delay while the entire file is fetched before the user sees the first video frames. The only approach I can think of is to chunk the downloads using standard HTTP chunking between Laravel and Drive, and then echo that in the HTTP session between your Angular app and Laravel. NB I haven't tested this, but I believe chunking is supported by Drive.
In the general case, some kind of proxy is unavoidable. You can pipe streams from the cloud to stdout, e.g. Stream FTP download to output or similar.
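A minimal sketch of such a proxy in a Laravel controller; openStream() is a hypothetical helper standing in for whatever the provider's SDK gives you, and the content type is assumed:

```php
<?php
// In a Laravel controller. openStream() is a hypothetical helper that
// returns a readable PHP stream from the cloud provider's SDK.
public function stream($fileId)
{
    $providerStream = $this->driveService->openStream($fileId);

    return response()->stream(function () use ($providerStream) {
        // Echo the remote stream to the client chunk by chunk as it
        // arrives, instead of buffering the whole video in Laravel.
        while (!feof($providerStream)) {
            echo fread($providerStream, 8192);
            flush();
        }
        fclose($providerStream);
    }, 200, [
        'Content-Type' => 'video/mp4', // assumed; use the provider's metadata
    ]);
}
```

This keeps memory use flat, but every byte still flows through Laravel, so it only avoids the up-front wait, not the proxying itself.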
If a cloud provider supports presigned URLs, e.g. S3 (http://docs.aws.amazon.com/AmazonS3/latest/dev/ShareObjectPreSignedURL.html), you can benefit from web server redirects, like X-Accel-Redirect in nginx (https://www.nginx.com/resources/wiki/start/topics/examples/xsendfile/).
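For the presigned-URL route, generating one with the AWS SDK for PHP looks like this (region, bucket, and key are placeholders); you redirect the authenticated client to the URL instead of proxying the bytes:

```php
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);

$cmd = $s3->getCommand('GetObject', [
    'Bucket' => 'my-bucket',
    'Key'    => 'videos/file.mp4',
]);

// The URL embeds temporary credentials and expires after 20 minutes,
// so handing it to the client does not expose the bucket itself.
$request = $s3->createPresignedRequest($cmd, '+20 minutes');

header('Location: ' . (string) $request->getUri());
```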
I want to develop a website that works like a file manager, where users register and get a fixed amount of disk space, let's say 20MB.
Users can then upload their PDF, DOC, TXT, JPEG, etc. files up to their disk limit.
I can develop up to this point using PHP.
Now here are my issues:
1) If users' files get corrupted, they should be able to roll back their folders to a state from 2-3 days earlier.
Files must be secure and safe from viruses, as users are uploading their important documents.
Is there any 3rd party storage server that provides such a facility?
2) Also, all files should be previewable in the browser.
I am using the Google Docs viewer. Is it a good and safe way to preview files in the browser?
But Google links are accessible to everyone; I need to add some restrictions so that a file can be viewed only by its owner.
I know it's a major task, but I just need some sort of logic. Please share your thoughts.
Thanks.
Any cloud storage service can be used for this; you'll get disk space. There is no storage server that provides a revision control system for this out of the box. You can use git or svn for that, though as the files are binary you won't get the full benefit of these tools.
How files are previewed is up to you. If you use PHP, you build the site and at the backend use the API to interact with the storage service. Google Docs is not an option for this if you use PHP. Also note that Google links can be made private.
I suggest this:
Find a cloud storage service and use its storage from your server. Any will do.
Create the UI using PHP and control the access using PHP too (see the sketch after this list).
Manipulate files on your server directly, or on the 3rd party storage server via its API.
Use a revision control system to track the changes, and use its API from the PHP end.
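As a sketch of the owner-only access control mentioned above (the session key and the storage/{user_id}/ layout are assumptions, not a fixed design):

```php
<?php
session_start();

// Assumed layout: each user's files live under storage/{user_id}/.
$userId = isset($_SESSION['user_id']) ? $_SESSION['user_id'] : null;
$file   = basename(isset($_GET['file']) ? $_GET['file'] : ''); // strip path tricks
$path   = __DIR__ . "/storage/$userId/$file";

// Only a logged-in user may fetch files, and only from their own folder.
if ($userId === null || $file === '' || !is_file($path)) {
    http_response_code(403);
    exit('Access denied');
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: inline; filename="' . $file . '"');
readfile($path);
```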
Some cloud storage services:
Amazon S3. It also supports Versioning (see the sketch after this list).
Google Cloud Storage
Microsoft Azure
You could also try Microsoft SkyDrive, Google Drive, or Dropbox.
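For example, with S3's Versioning the rollback in requirement 1) becomes listing and fetching an older version. A sketch with the AWS SDK for PHP (bucket and key names are placeholders):

```php
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);

// Turn versioning on once per bucket; S3 then keeps every overwrite.
$s3->putBucketVersioning([
    'Bucket' => 'user-files',
    'VersioningConfiguration' => ['Status' => 'Enabled'],
]);

// List the stored versions of one user's file, newest first...
$result = $s3->listObjectVersions([
    'Bucket' => 'user-files',
    'Prefix' => 'user42/report.doc',
]);

// ...and fetch an older version to roll back to it.
$old = $s3->getObject([
    'Bucket'    => 'user-files',
    'Key'       => 'user42/report.doc',
    'VersionId' => $result['Versions'][1]['VersionId'], // second-newest
]);
```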
My PHP website allows authorised users to share documents by uploading them to and downloading them from a document library.
I want to store the files encrypted on the server, and I am trying to use EFS (the Windows Encrypting File System) instead of explicitly encrypting/decrypting in PHP code on upload/download.
The server is Windows Server 2003/IIS.
I have created a new user on my server, logged on as this user, then encrypted the folder containing the documents using EFS.
I now want IIS to run my web application as this user so it can view the encrypted documents. However, I am having trouble with this, as the user doesn't have all the permissions it needs, for example to write session files.
The other approach I have tried is to keep my website running under the built-in IUSR_Machine_Name user and get PHP to present the new user's credentials when it wants to open a file for streaming; however, I'm stuck with this too.
Looking for advice on:
Is EFS the right approach?
What permissions would this user need for IIS to use it to run my web application?
Is there any way in PHP to explicitly present user credentials at the point where the PHP fopen function accesses my EFS-encrypted files?
I'm writing a web application in PHP which needs to store images and image meta data. In future, the application may need to work offline on the client. A user might need to download all the images and data to his laptop before going to a remote area without internet access. Whilst at the remote location the user could add new images to the system and be able to compare them with his local copy of the image database. When returning to an area with internet access, the user would run a sync operation which would copy his new images to the server and retrieve any new ones.
I've looked at the new Web Storage / localStorage options in HTML5 (Web SQL Database seems to have been dropped) and I think this is going to be too limited, as there is only 5MB of space and one or two images could easily exceed that.
Is what I want to do actually possible/practical with a browser-based web application? Or should I be looking at writing a desktop/tablet application with local file storage capabilities for users without net access? Initially it does need to be a web application; I'm just trying to think ahead. Will I give myself more options in future by using something like CouchDB for the backend from the start? As I understand it, this comes with good syncing functionality.
Thanks,
I decided to use Titanium Desktop.
http://www.appcelerator.com/products/titanium-desktop-application-development/