I am developing a website for a rock band. I want to upload the band's music to the site and let users listen to the songs, but I want to add some functionality or code so that users cannot download the MP3 files they are listening to. For example, if someone uses Internet Download Manager, he or she will be prompted to save the file as soon as it detects the music file. Is there any way to stop that?
Thanks.
There is always a way to bypass any restriction (either download the file or use an audio recorder).
The protection that iTunes and other music platforms use is to let you listen to a preview of the music (~30 seconds) so you cannot get the entire song.
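If you go the preview route, the clip can be cut ahead of time on the server. A minimal sketch, assuming ffmpeg is installed and callable from PHP (file paths are placeholders):

    <?php
    // make_preview.php - cut a ~30 second preview from the full track so only
    // the clip is ever exposed publicly. Paths are placeholders.
    $full    = '/var/media/private/full_song.mp3';
    $preview = '/var/www/public/previews/song_preview.mp3';

    // -t 30 keeps thirty seconds; re-encode at a modest bitrate for the web player
    shell_exec(sprintf(
        'ffmpeg -y -i %s -t 30 -codec:a libmp3lame -b:a 128k %s',
        escapeshellarg($full),
        escapeshellarg($preview)
    ));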
Well, there is really no way to let someone listen to music without them downloading it. Even if you stream the music, they could still record it.
http://www.codewalkers.com/c/a/Miscellaneous/Using-PHP-to-Stream-MP3-Files-and-Prevent-Illegal-Downloading/
This could be of help to you, though, because it will make it much more difficult to download the music.
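The general idea with that kind of script is to keep the MP3s outside the web root and push them through PHP only after a session check. A minimal sketch, assuming PHP sessions handle the login (the track list and paths are placeholders):

    <?php
    // stream.php - serve an MP3 through PHP so the real file path is never exposed.
    // Assumes the user was authenticated elsewhere and $_SESSION['user_id'] is set.
    session_start();

    if (empty($_SESSION['user_id'])) {
        header('HTTP/1.1 403 Forbidden');
        exit('Not allowed');
    }

    // Whitelist of streamable tracks; never build the path from raw user input.
    $tracks = array('intro' => '/var/media/private/intro.mp3');
    $key    = isset($_GET['track']) ? $_GET['track'] : '';

    if (!isset($tracks[$key])) {
        header('HTTP/1.1 404 Not Found');
        exit;
    }

    header('Content-Type: audio/mpeg');
    header('Content-Length: ' . filesize($tracks[$key]));
    header('Content-Disposition: inline; filename="listen.mp3"');
    readfile($tracks[$key]);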
As mentioned by others, there is no foolproof way to keep users from downloading the MP3s. Having said that, you could try streaming the content, which will make it much harder for most people.
You can use a streaming server like Red5 for this.
I've been working on the same thing, and what I did was create two (or more) versions of each file. One is the listening file, which in my case is a low-quality MP3 encoded at 112 kbps. The quality is good enough for people to listen to online, but not good enough if they want decent quality on the go with a portable player or the like.
Then I'd also have high-quality MP3 versions (320 kbps) and WAV files, which people could download only if they were logged in.
I can't say that I've found a solution which guarantees that people can't get hold of the music if that's what they are after. You could certainly use Red5 or some other streaming solution, but that requires you to spend more time and resources on configuring and maintaining that solution. Furthermore, I believe Red5/Wowza or any of the other streaming servers requires that the client use Flash to play music (because the communication is done over RTMP). That rules out iPad/iPhone/iPod users as Flash can't be used on those devices.
My conclusion was thus that, with the (limited) resources I had, the way to go was to offer playback over HTTP (aka "progressive HTTP") using low-quality files.
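For what it's worth, producing the low-quality streaming copy (and the high-quality download copy mentioned above) can be scripted on upload. A rough sketch, assuming ffmpeg with the LAME encoder is available on the server; the paths are placeholders and the bitrates mirror the numbers above:

    <?php
    // make_versions.php - create a 112 kbps copy for in-browser listening and a
    // 320 kbps copy that only logged-in users may download. Paths are placeholders.
    $master   = '/var/media/masters/song.wav';
    $lowCopy  = '/var/media/stream/song_112.mp3';
    $highCopy = '/var/media/download/song_320.mp3';

    shell_exec(sprintf('ffmpeg -y -i %s -codec:a libmp3lame -b:a 112k %s',
        escapeshellarg($master), escapeshellarg($lowCopy)));

    shell_exec(sprintf('ffmpeg -y -i %s -codec:a libmp3lame -b:a 320k %s',
        escapeshellarg($master), escapeshellarg($highCopy)));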
I have a project for a small social network where users can upload their videos. Videos should be no longer than 1 or 2 minutes, but they need to be private and linked to the user who uploaded them.
Since uploaded videos may be in many different formats, I have two options:
Upload the videos and use ffmpeg to process them; can I find any PHP script that does that? PROS: I have my files on my own server and I'm not dependent on a third party. CONS: I guess it's a really intensive task and I may need quite a good hosting plan from the very beginning of this project.
Use a video hosting service that provides an API to upload and process videos; I actually found vzaar.com, which seems to do what I need. PROS: scalable. CONS: I rely on a third party for my content.
Since the project is a small social network, opusphp.com may be interesting too, but I've never used it or read about it.
Could Vimeo PRO suit my needs? Any other suggestions?
In the future it may be necessary to add a basic video editing function to trim uploaded videos by setting start and end points.
I would suggest you use ffmpeg and call it from PHP using shell_exec() for example.
If you use x264 for encoding you can tune it to use moderate/lower encoding settings so that your server can deal with the load (up to a certain point of course in a size-controlled environment).
ffmpeg will also allow you to trim videos (and much more) as you see fit. Video hosting services may not have that option, or it could incur an additional cost.
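A rough sketch of what such a call could look like, assuming ffmpeg with libx264 is installed; the paths, preset, and trim points are only illustrative:

    <?php
    // transcode.php - normalise an uploaded clip to H.264/AAC MP4 and trim it.
    // Paths and settings are illustrative, not a production recommendation.
    $input  = '/var/uploads/raw/clip_1234.avi';
    $output = '/var/media/mp4/clip_1234.mp4';

    $cmd = sprintf(
        'ffmpeg -y -i %s -ss 00:00:05 -to 00:01:05 '   // keep only 0:05-1:05 of the clip
      . '-c:v libx264 -preset veryfast -crf 26 '       // lighter preset keeps CPU load moderate
      . '-c:a aac -b:a 96k -movflags +faststart '      // faststart lets playback begin before the whole file arrives
      . '%s 2>&1',                                     // ffmpeg writes its log to stderr, so redirect it
        escapeshellarg($input),
        escapeshellarg($output)
    );

    $log = shell_exec($cmd);   // keep the output for debugging or a job log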
Beyond that, recommending online video services for your project does not fit Stack Overflow's guidelines for asking questions.
My application requires downloading many images from the server (each image is about 10 KB). I'm simply downloading each of them with an independent AsyncTask, without any optimization.
Now I'm wondering what the common practice is for transferring these images. For example, I'm thinking about keeping zipped images on the server, then sending the zipped files for the user's device to unzip. In this case, is it better to combine the zip files into one big zip file for the user to download?
Or is there a better solution? Thanks in advance!
EDIT:
It seems combining zip files is a good idea, but I feel it may take too long for the user to wait for all the images to download and unzip. So I may put ten or twenty images in each zip file, so the user can see some downloaded images while waiting for more to come. Firing multiple AsyncTasks together can be faster, right? But they won't finish at the same time, even given the same file size and the same download address?
Since latency is often the largest problem with mobile connections, reducing the number of connections you have to open is a great way to optimize the loading times. Sending a zip file with all the images sounds like a very good idea, and is probably worth the time implementing.
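If the back end happens to be PHP, building the bundle is straightforward with the ZipArchive extension; a minimal sketch (directory and file names are made up):

    <?php
    // bundle.php - pack one batch of images into a single zip so the phone opens
    // one connection instead of one per image. Paths are placeholders.
    $images  = glob('/var/www/images/batch_01/*.jpg');
    $zipPath = sys_get_temp_dir() . '/batch_01.zip';

    $zip = new ZipArchive();
    if ($zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
        exit('Could not create archive');
    }
    foreach ($images as $img) {
        $zip->addFile($img, basename($img));   // JPEGs are already compressed, so the win is fewer requests, not size
    }
    $zip->close();

    header('Content-Type: application/zip');
    header('Content-Length: ' . filesize($zipPath));
    header('Content-Disposition: attachment; filename="batch_01.zip"');
    readfile($zipPath);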
Images are probably already compressed (GIF, JPG, PNG), so you will not reduce the file size, but you will reduce the number of connections, which is a good idea for mobile. If it is always the same set of images, you can use the sprite technique (sending one bigger image file containing all the images; in HTML you can use the background with an x/y offset to show the right image).
I was looking at the sidebar and saw this topic, but from the comments it looks like you're really asking about patching.
The best approach is to make sure the user knows what to do with it: you want the user to download file X and get output Y for a different purpose. On the other hand, it appears to be common practice to download resources in chunks when they are not native to the Android app and cannot fit in the APK.
A comparable example is the JDIC apps, which use the popular Japanese dictionary resource that is used in tandem for English translations. JDIC apps like WWWJDIC use online downloads for the extremely large reference files that would otherwise have bad latency (as mentioned before) on Google's servers. It also looks bad to have >200 MB in a Google Play app unless it is 3D, which is justifiable. If your images cannot be compressed without extremely long loading times in the app itself, you may need to consider this option. The only downside is requiring an online connection (also mentioned before).
Also, you could use 7zip and program Android to self-extract it to a location. http://www.wikihow.com/Use-7Zip-to-Create-Self-Extracting-excutables
On another note, it would be optimal to have the app perform routine checks while doing a one-time download on initial startup. You can then optionally use an AsyncTask so that your files are downloaded to the app and used after a restart, or however you want it, so you really only need one AsyncTask. The benefit is that the user syncs with the app and may only need to check once. The downside is that the user may not always be able to update and may need to use 4G or LTE, but that is a minor concern if they can use Wi-Fi whenever they want.
I'm trying to work out the best way to have my site dynamically transcode and stream video files to users, who are mostly on mobile devices. The site is PHP/MySQL based and running on a Windows 2003 server to which I have full access. Any ideas how best to do this? I'd rather not have to transcode videos on upload, if possible.
For your services consider something with some oomph: Inlet, Digital Rapids, or Rhozet. Some of these players offer some form of live-stream encoding but you'll generally have limitations on hardware. They all have suitable APIs for interacting with the hardware and profiles.
You can also consider using public transcoding services but keep the assets private. It's not quite as elegant as a roll-your-own but it does solve the problem.
Transcode-/encode-on-upload will probably serve your needs better if the volume of content or traffic increases. Real-time transcoding has many hurdles, including race conditions and bandwidth.
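If you do end up encoding on upload, a small job queue keeps the load predictable. A rough sketch, assuming a MySQL jobs table and ffmpeg on the box; the table, columns, and paths are made up for illustration:

    <?php
    // worker.php - run from a scheduled task; processes one pending upload at a time.
    // Table/column names and paths are hypothetical.
    $db  = new PDO('mysql:host=localhost;dbname=videos', 'user', 'pass');
    $job = $db->query("SELECT id, source_path FROM transcode_jobs
                       WHERE status = 'pending' ORDER BY id LIMIT 1")
              ->fetch(PDO::FETCH_ASSOC);
    if (!$job) {
        exit; // nothing queued
    }

    $db->prepare("UPDATE transcode_jobs SET status = 'working' WHERE id = ?")
       ->execute(array($job['id']));

    $target = 'C:/media/mp4/' . (int)$job['id'] . '.mp4';
    shell_exec(sprintf(
        'ffmpeg -y -i %s -c:v libx264 -preset fast -crf 28 -vf scale=-2:480 -c:a aac -b:a 96k %s',
        escapeshellarg($job['source_path']),
        escapeshellarg($target)
    ));

    $db->prepare("UPDATE transcode_jobs SET status = 'done', output_path = ? WHERE id = ?")
       ->execute(array($target, $job['id']));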
I had a similar situation in the past. At that time I bookmarked the link below, which has some very interesting stuff:
php video transcoder
I'm sorry if this doesn't help you.
I have my system backed up on a server (Docs, Videos, Music, etc.)
I have only FTP access (can't really do anything fancy)
However I can run PHP and MySQL programs (pretty basic).
I was wondering about a good, complete system that would let me upload, control, and manage the files I have there (including security features), so that I would be able to edit some docs on the fly, listen to streaming music, and upload more files if I wanted to.
Thanks.
I have used this in the past and it's open source so you can extend as needed.
http://sourceforge.net/projects/phpwebftp/
You can try:
http://extplorer.sourceforge.net/
or
http://pfn.sourceforge.net/index.php?opc=1&lg=ing
or
http://navphp.sourceforge.net/
I used the first one and it's pretty good. For media preview I know only http://www.filerun.com/ but it's not open-source.
I'm developing a PHP application which will charge users for the videos they watch. The business model is "everyone pays for how much she watches". For this purpose, I need to:
Implement secure video (FLV) access. (Authorized sessions will gain access)
Calculate how much video (FLV) data is sent from the server.
A trivial solution is to read the FLV with PHP ("fread") and send it to the client chunk by chunk (just "echo"). However, I have real performance concerns about this method, because the application server has 1.7 GB of RAM and just a single core.
In the short run we're expecting a large number of impressions; however, we would like to upgrade the hardware as late as possible. That's why I want to implement the requirement with minimum overhead, in the most effective way.
I'm not tied to a particular web server. I prefer Apache 2.2; however, lighttpd can also be deployed if it offers a useful feature for the implementation.
Any idea is appreciated.
Thanks!
The PHP fread solution looks like the way to go, but with the server restriction I think you will need to tweak the Flash player. The player could send the server messages based on how much of the video has been played. This might be something to think about. Take a look at the JW FLV Media Player; its customisation and JavaScript integration will allow you to send XMLHttpRequests to the server.
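On the server side those messages could land in a small endpoint like the sketch below; the player-side JavaScript is not shown, and the table and parameter names are hypothetical (it also assumes a unique key on user_id/video_id):

    <?php
    // progress.php - the player pings this every few seconds with the current position.
    // Table and parameter names are hypothetical; assumes a unique key on (user_id, video_id).
    session_start();
    if (empty($_SESSION['user_id'])) {
        header('HTTP/1.1 403 Forbidden');
        exit;
    }

    $videoId = isset($_POST['video_id']) ? (int)$_POST['video_id'] : 0;
    $seconds = isset($_POST['position']) ? (int)$_POST['position'] : 0;

    $db   = new PDO('mysql:host=localhost;dbname=billing', 'user', 'pass');
    // Keep the highest position reported so far; billing can be derived from it later.
    $stmt = $db->prepare(
        'INSERT INTO watch_progress (user_id, video_id, seconds_watched)
         VALUES (?, ?, ?)
         ON DUPLICATE KEY UPDATE seconds_watched = GREATEST(seconds_watched, VALUES(seconds_watched))'
    );
    $stmt->execute(array($_SESSION['user_id'], $videoId, $seconds));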
Why not use a video streaming server like Red5? I'm sure it has triggers that could write some statistics to a DB or something similar.
Another advantage would be that users could skip forward in the video.
So, to sum up and for future reference: I decided to go with the PHP fread method, since no satisfactory alternative was suggested.
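For reference, a minimal sketch of that fread approach, where log_bytes() is a hypothetical stand-in for whatever accounting ends up being used:

    <?php
    // serve_flv.php - push an FLV through PHP in chunks and record how much was sent.
    // Assumes the session was authorised earlier; log_bytes() is a hypothetical hook.
    session_start();
    if (empty($_SESSION['user_id'])) {
        header('HTTP/1.1 403 Forbidden');
        exit;
    }

    ignore_user_abort(true);                 // keep running after a disconnect so the count is still logged
    $file = '/var/media/flv/video_42.flv';   // placeholder path
    $sent = 0;

    header('Content-Type: video/x-flv');
    header('Content-Length: ' . filesize($file));

    $fp = fopen($file, 'rb');
    while (!feof($fp) && !connection_aborted()) {
        $chunk = fread($fp, 8192);
        echo $chunk;
        flush();                             // send the chunk now instead of buffering the whole file
        $sent += strlen($chunk);
    }
    fclose($fp);

    log_bytes($_SESSION['user_id'], $sent);  // hypothetical: charge the user for $sent bytes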
Thanks to all contributors.