Laravel Public vs Storage directories for audio files - php

I am a little confused about which directory I should put my .mp3 files in with Laravel 5.
public/, my first choice, since my MP3s can be considered media (like the images that are already stored there)
storage/app, which according to the docs is more or less recommended for this purpose.
I don't really need a URL for these audio files, since I serve them as a stream (to somehow prevent downloads, lol).
Any advice? Thanks.

I guess it just depends on whether you want direct public access or not.
public/ is simpler: anything in there can be linked to directly, just like your JS, CSS, or image resources.
storage/app is more secure, since there is no way to access it directly. That's what I would use (actually, I'd probably use storage/app/audio to be specific) so I have more control over how the files are accessed.
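Since the files are served as a stream anyway, here is a minimal sketch of what that could look like in a Laravel 5 route; the storage/app/audio directory and the route itself are assumptions, not anything from the docs:

    <?php
    // Stream an MP3 from storage/app/audio instead of exposing a public URL.
    Route::get('audio/{file}', function ($file) {
        // basename() guards against directory traversal via the parameter
        $path = storage_path('app/audio/' . basename($file));

        if (!file_exists($path)) {
            abort(404);
        }

        return response()->stream(function () use ($path) {
            readfile($path);
        }, 200, [
            'Content-Type'   => 'audio/mpeg',
            'Content-Length' => filesize($path),
        ]);
    });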

Related

Json files to own domain

I don't know if this is the right place, but I'm using an API (Fortnite, to be more precise) and the JSON files contain the image URLs, for example www.apiwebsite.com/fortniteimage1.png. Is it possible to serve that image from my own URL automatically, like media.myurl.com/fortniteimage1.png?
You need to download the images to your local server. I recommend using curl for that; just look at the docs, there are a lot of examples there.
After downloading them, they must be in a directory that is served publicly. If you are using a framework, this is usually the "public" directory, where other assets (JS, CSS, images) are also located.
That way, the images will be in your domain and will be served from there, like:
https://my-crazy-domain.net/images/fortnite/person-avatar.jpg
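For illustration, the download step could look roughly like this; the target path is a placeholder and there is no error handling beyond the basics:

    <?php
    // Fetch a remote image with curl and save it under the public directory.
    $remote = 'https://www.apiwebsite.com/fortniteimage1.png';
    $local  = __DIR__ . '/public/images/fortnite/' . basename($remote);

    $ch = curl_init($remote);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return body instead of printing it
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
    $data = curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    if ($data !== false && $code === 200) {
        file_put_contents($local, $data);
    }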
I think the question would be a better fit here if it were phrased more closely as "How can I do this in PHP".
Anyway, I hope you achieve what you need.

Web file manager for s3 with DB folder structure in PhP

I'm looking for a solution to an application need. I need a web-based file manager/explorer that works with Amazon S3 buckets. The problem with most potential solutions I have found is that they rely on S3 itself to maintain the directory hierarchy, which is bad because it means additional latency when traversing folders (and listing their contents).
What I would like is a PHP app/class that maintains the directory structure (and filenames) in a database, so that listing/traversing files and directories is quick and S3 is only accessed when actually downloading or uploading a file.
Does anyone know of anything like this? I'm hoping there is something already in existence rather than taking the time to build from scratch.
Thanks!
I'd definitely recommend using Gaufrette.
It abstracts away the filesystem, so you're able to switch between local storage, FTP, SFTP, S3, etc. simply by switching the adapter.
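As a hedged sketch of the adapter pattern (the directory, key, and content are made up; for S3 you would construct the AwsS3 adapter with your client and bucket instead):

    <?php
    // The Filesystem API stays the same regardless of the backing adapter.
    use Gaufrette\Filesystem;
    use Gaufrette\Adapter\Local as LocalAdapter;
    // use Gaufrette\Adapter\AwsS3; // swap in for S3-backed storage

    $adapter    = new LocalAdapter('/var/files', true); // true = create dir if missing
    $filesystem = new Filesystem($adapter);

    $filesystem->write('docs/report.txt', 'some content', true); // true = overwrite
    echo $filesystem->read('docs/report.txt');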

php, own little file manager instead of FTP. Good idea?

I'm planning to add a file manager (a very basic one) because I have never used the FTP functions, and this looks easier (the FTP connection is lost when the script finishes). I would simply use POST requests (or what should I use?) instead of the FTP functions. Is this a good idea? Does anyone know of any restrictions?
As far as I can see, the FTP functions are only for sending and receiving files.
What you need to do is add a dynamic form where you can select multiple files and upload them to a specific directory of your choice.
You will need to get all available directories and the files in them, probably with some kind of recursive function. A more optimal way is to get only the directories/files of the current folder, and fetch a folder's contents when it is clicked.
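As a rough sketch only (see the warning below before using anything like this), a multi-file upload form and handler might look like the following; the uploads directory and field name are placeholders:

    <?php
    // Handle a multi-file upload into a fixed target directory.
    if ($_SERVER['REQUEST_METHOD'] === 'POST' && isset($_FILES['files'])) {
        $target = __DIR__ . '/uploads/';
        foreach ($_FILES['files']['tmp_name'] as $i => $tmp) {
            if ($_FILES['files']['error'][$i] === UPLOAD_ERR_OK) {
                $name = basename($_FILES['files']['name'][$i]); // strip path components
                move_uploaded_file($tmp, $target . $name);
            }
        }
    }
    ?>
    <form method="post" enctype="multipart/form-data">
        <input type="file" name="files[]" multiple>
        <button type="submit">Upload</button>
    </form>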
Can it be done? Sure. Is it a good idea? No. People would be able to upload malicious files, and we are not talking about images here: PHP scripts, shell scripts, executable viruses, and so on...
If you are doing this only for yourself, I suggest you use an FTP client for posting and receiving files.
I wouldn't recommend writing your own; it's probably best to use a 3rd-party tool instead:
PHP File Manager
PHPfileNavigator2
FileManager
...
Keep in mind that both PHP and your webserver can put certain restrictions on the size of files that you can transfer; it is of course possible to change these in the configuration files.
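The PHP side of those limits lives in php.ini; the values below are only illustrative:

    ; php.ini upload limits (illustrative values; the defaults are much lower)
    upload_max_filesize = 64M
    post_max_size = 64M
    max_file_uploads = 20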

Storage Performance

I want to enable users to upload images & videos to a website.
The question now is: shall I just drop all the files in one folder, or make a folder for, e.g., each user?
(Of course, separate folders would make files easier to find.)
Does it make a difference in performance?
Is there any difference in the access rate?
thanks
If you're dealing with many files, it is common practice to distribute them across multiple (sub)directories. If a single directory contains too many files and subdirectories, the file system has to do more work, so distributing the files helps.
But this always depends on the underlying file system you use as your storage backend. Check which one you use, then look at which features it supports and which limits apply.
On the application layer, you should model file access and handling so that you can change how your application stores the files later on without rewriting your whole application.
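One common scheme, sketched under the assumption of a two-level layout derived from a hash (the depth and base directory are design choices, not rules):

    <?php
    // Spread uploads across subdirectories derived from an md5 of the name.
    function storagePathFor($filename, $baseDir = '/var/uploads')
    {
        $hash = md5($filename);
        $dir  = $baseDir . '/' . substr($hash, 0, 2) . '/' . substr($hash, 2, 2);

        if (!is_dir($dir)) {
            mkdir($dir, 0755, true); // third argument creates nested dirs
        }

        return $dir . '/' . $filename;
    }

    echo storagePathFor('cat-video.mp4');
    // e.g. /var/uploads/3f/a2/cat-video.mp4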

PHP core code directory structure

I'm looking to centralize a lot of my web applications' code, so that multiple components have access to the same core functionality. This is how I have the website set up:
/var/www/website - domain.com
/var/www/subdomain1 - subdomain1.domain.com
/var/www/subdomain2 - subdomain2.domain.com
Naturally, I've had a lot of trouble with duplicated common functionality, as any change made to one area also needs to be applied to the other areas. My proposed solution is to create a new directory in /var/www which will contain all of the core scripts:
/var/www/code - core code
I would then add /var/www/code to the PHP include path, so scripts can include these files without having to specify the absolute path.
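In code, the plan would look roughly like this (Database.php is just a hypothetical shared class):

    <?php
    // Prepend the shared code dir so includes resolve without absolute paths.
    set_include_path('/var/www/code' . PATH_SEPARATOR . get_include_path());

    require_once 'Database.php'; // found via the include path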
Can you think of any more efficient ways of centralizing the code?
Many thanks!
Your approach is good enough for this purpose.
A little suggestion:
Store your front-end scripts in a directory like /var/www/website/www instead of /var/www/website; that is where the index file, AJAX processors, and similar scripts live. Your project-wide inclusions (as well as other miscellaneous stuff) would then be stored in a directory like /var/www/website/includes. Keeping them out of the document root is a simple yet efficient defense against attacks on your include files.
So your document roots will be /var/www/website/www (domain) and /var/www/website/subdomain/www/ (subdomain).
It seems that you are thinking correctly:
Share Code between multiple PHP sites
It's only a suggestion, but you should keep only the public content in /var/www/*, which may end up being publicly accessible (either because of your HTTP server or because of some misconfiguration), and create other directories for your shared code/libs, like /usr/local/lib/php/*.
For more security, you should fence it in with open_basedir, adding the private and public dirs, as well as the upload and session dirs.
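For instance, a hedged php.ini sketch; every path here is illustrative and depends on your actual layout:

    ; Restrict PHP file access to the public root, shared libs, and
    ; upload/session dirs (paths illustrative)
    open_basedir = /var/www/website/www:/usr/local/lib/php:/tmp/uploads:/var/lib/php/sessions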
And don't forget to version your libs, e.g.:
/usr/local/lib/php/myLib-1.0
/usr/local/lib/php/myLib-1.2
etc.
Thus, you'll be able to make changes without breaking everything.
