Saved images in Heroku app are not being displayed? - php

I am new to web development and I have just deployed my site to Heroku. I am using Laravel as the framework and Postgres as the database.
My site has a feature that stores images in the public/images folder. If I save the images and change the file permissions before deploying to Heroku, the images are displayed. However, if the images are uploaded directly on Heroku, they are not displayed.
I am guessing this is caused by file permissions. Maybe the images uploaded to the public/images folder on Heroku are not inheriting the permissions of the folder?

The file system on Heroku is not persistent. You can't save files to the web server's file system and expect them to be available for subsequent HTTP requests.
You need to use another persistence store, such as AWS S3 or your database. There are probably also other add-ons that would let you save files simply enough.
Ref: https://devcenter.heroku.com/articles/dynos#ephemeral-filesystem
Here's an example of saving and displaying images from a PostgreSQL database: Upload/Download file in POSTGRESQL database with PHP
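For example, with Laravel's filesystem abstraction the upload can go straight to S3 instead of public/images. This is only a rough sketch: it assumes an "s3" disk configured in config/filesystems.php, AWS credentials set as Heroku config vars, and a form field named "image", none of which come from the question.

    <?php
    use Illuminate\Http\Request;
    use Illuminate\Support\Facades\Route;
    use Illuminate\Support\Facades\Storage;

    // Sketch of a route that stores an upload on S3 rather than on the
    // dyno's local filesystem, which is wiped on every restart and deploy.
    Route::post('/images', function (Request $request) {
        // Put the uploaded file into the "images" folder of the s3 disk.
        $path = $request->file('image')->store('images', 's3');

        // Return a public URL; in a real app you would also save $path
        // in Postgres so the image can be looked up later.
        return Storage::disk('s3')->url($path);
    });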

Related

How to manipulate cloud server files using Laravel?

We have developed our application in Laravel and are now planning to move it to Amazon, where we have to separate our application logic from file storage. Basically, we want to move the whole application storage to a cloud store (Amazon S3) and the application logic to an Amazon EC2 server.
In our system we manipulate many storage files locally (resize images, merge images, make thumbnails from videos, etc.). We will not store any files on the application server once we migrate to Amazon. So our concern is: how can we manipulate cloud server files?
Previously all files were present on the application server, so file manipulation was easy, but after migrating the whole storage to the cloud, how can we manipulate files that live on the cloud server while the manipulation logic resides on the application server?
Any response will be helpful.
Thanks in advance...
To manipulate an S3 file, I think we first need to download it locally. Once we have the file locally, we can apply any operation to it and delete the local copy afterwards.
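As a rough illustration of that flow in Laravel (not part of the original answer): the sketch below assumes an "s3" disk configured in config/filesystems.php and the intervention/image package for the resize step; the object keys are placeholders.

    <?php
    use Illuminate\Support\Facades\Storage;
    use Intervention\Image\Facades\Image;

    $key = 'uploads/photo.jpg';   // hypothetical S3 object key

    // 1. Download the object to a local temporary file.
    $local = tempnam(sys_get_temp_dir(), 'img');
    file_put_contents($local, Storage::disk('s3')->get($key));

    // 2. Manipulate it locally, e.g. make a 200px-wide thumbnail.
    Image::make($local)
        ->resize(200, null, function ($constraint) { $constraint->aspectRatio(); })
        ->save($local);

    // 3. Upload the result back to S3 and remove the local copy.
    Storage::disk('s3')->put('thumbnails/photo.jpg', file_get_contents($local));
    unlink($local);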
Here are the documents on uploading from and downloading to a local file with Amazon S3:
https://aws.amazon.com/blogs/developer/transferring-files-to-and-from-amazon-s3/
https://docs.aws.amazon.com/aws-sdk-php/v3/guide/
Thanks

Laravel envoyer storage syncing

I've written a Laravel application where I upload images and PDF files.
Currently I'm uploading those into a folder within the public directory.
Now I've been trying Envoyer.io, where I can easily deploy my projects to the server(s).
The problem here is that each deployment has its own directory, so every time all those uploads disappear.
I've figured out that Envoyer uses a symlink for the storage directory in every deployed project.
I can upload the files to the storage directory, but when I return the URL of the files in the storage path I get a path like "/var/www/project/app/storage/file.ext", which is the base path. I don't want to return those links in my API for security reasons. Is there any way I can upload to the storage path and get those uploads back with a more friendly URL? Or does anyone have another solution?
If you use the Laravel Storage feature, you do not need to worry about absolute paths. You can even save the files in the Amazon cloud on a completely different machine.
Storage works with file contents and relative paths rather than the absolute paths used by PHP's file functions. However, you cannot easily mix Laravel Storage and plain PHP file logic.
Envoyer should keep the storage folder the same within one application. If you need to share it with other projects, upload to Amazon or write your own implementation of the Storage facade.
More info at http://laravel.com/docs/5.0/filesystem
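As a small sketch of that approach (assuming a recent Laravel version with the public disk and its storage:link symlink in place; the field and folder names are placeholders):

    <?php
    use Illuminate\Support\Facades\Storage;

    // Save the upload through the Storage facade; only a relative path is kept.
    $path = $request->file('document')->store('uploads', 'public');

    // Build a web-friendly URL instead of exposing /var/www/... in the API.
    $url = Storage::disk('public')->url($path);   // e.g. /storage/uploads/abc123.pdf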

Download multiple images from Amazon S3 in a zip file?

I have a web page on shared web hosting and the images are stored on Amazon S3. Using PHP, I want to be able to download multiple images from Amazon S3 through my web page as a zip file.
What are my options and which is the best?
As far as I know, it is not possible to compress files on S3. Can I use AWS Lambda?
Best solution I've come across:
The user selects on my website which images they want to download.
I get the file names from my database on my web host and download the images from S3 to a temporary directory on my web host.
A zip file is created in the temporary directory and a link is sent to the user. After a certain time, I clean up the temporary directory on my web host with a script.
But it would be great if there were a way to create and download the zip file without going through my hosting.
AWS S3 provides "basic building blocks", so it doesn't support a feature like zipping multiple objects together.
You've come up with a good method to do it, though you could stream the objects into a zip file rather than downloading them. EC2 instances can do this very quickly because they tend to have fast connections to S3.
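For instance, a server-side script along these lines could build the archive on the web host or an EC2 instance; this is only a sketch that assumes the aws/aws-sdk-php package, and the bucket name, region and object keys are placeholders:

    <?php
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $s3 = new S3Client([
        'version' => 'latest',
        'region'  => 'us-east-1',   // assumption: use your bucket's region
    ]);

    // Hypothetical object keys looked up from your database.
    $keys    = ['images/photo1.jpg', 'images/photo2.jpg'];
    $zipPath = sys_get_temp_dir() . '/images.zip';

    $zip = new ZipArchive();
    $zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE);

    foreach ($keys as $key) {
        // Fetch each object and add its contents to the archive without
        // writing a separate temporary file per image.
        $object = $s3->getObject(['Bucket' => 'my-bucket', 'Key' => $key]);
        $zip->addFromString(basename($key), (string) $object['Body']);
    }

    $zip->close();
    // $zipPath can now be sent to the user and cleaned up afterwards.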
Lambda doesn't work for this, as it is only triggered when an object is placed into an S3 bucket. You are doing the opposite.

WordPress installed with Elastic Beanstalk overwrites uploaded images

I have a WordPress instance on Amazon Elastic Beanstalk. When I deploy the instance with EB scripts, the whole page is replaced, including the uploaded images that are attached to posts. After such an automatic deploy, posts have missing pictures :)
I tried to solve this:
1) I logged into the Amazon machine with SFTP, but my user ec2-user has only read access to the files, so I was not able to overwrite just part of the application while retaining the uploaded files.
2) I read that I can use Amazon S3 as external storage for uploaded files. I have not tested this yet :). Do you know if this is a good approach?
3) Any other approach for this problem? How should it be organized on Amazon: should a machine backup be set up?
The Elastic Beanstalk environment is essentially stateless, meaning that all data persisted to disk will be lost when the application is updated, the server is rebuilt, or the environment scales.
The best way, in my opinion, is to use a plugin that writes all media files to AWS S3; something similar to the Amazon S3 and CloudFront plugin.
Your log files should also be shipped to a remote syslog server, which you can either build yourself or use a third-party service for.
Google: Loggly, Logstash, Graylog, Splunk

Access data on Apache server using XAMPP

I am writing an iPhone application, which basically uploads and downloads images to/from a server. In order to test my code I installed XAMPP, and everything works fine now. If I upload an image, the server creates a folder named with the UDID number of the device
(a PHP script is called via an HTTP POST).
But if I enter the directory of such a folder and the name of the image in the browser, I can see the image in the browser.
I am a newbie on this topic and have no idea if there are better possibilities. My questions are: should I use a database to save the images, or is it fine to create folders via a PHP script and save the images into those folders? Can I hide the data structure so that it cannot be accessed via a browser but only through the iPhone application? (The application should only be able to download pictures randomly.) Thanks.
If you do not want an image servable by Apache, you need to store it outside the webroot. So if your webroot is C:\xampp\htdocs, you could store the images in a folder structure under C:\xampp\images.
Your iPhone app would then have to do a little more work to pull a random one and send it to the user.
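As a minimal sketch of that extra step (the directory path and allowed extensions are assumptions), a small PHP endpoint can pick a random image from outside the webroot and stream it back, so Apache never exposes the folder directly:

    <?php
    // Pick a random image from a folder outside the webroot and stream it.
    $imageDir = 'C:\\xampp\\images';   // assumption: adjust to your layout
    $files    = glob($imageDir . DIRECTORY_SEPARATOR . '*.{jpg,jpeg,png}', GLOB_BRACE);

    if (empty($files)) {
        http_response_code(404);
        exit('No images available');
    }

    $file = $files[array_rand($files)];

    header('Content-Type: ' . mime_content_type($file));
    header('Content-Length: ' . filesize($file));
    readfile($file);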
