I am making a PHP/MySQL web app. My idea is to install a web server in the customer's home so they can connect with whatever device they want; most of the time they will probably be using an iPad to connect to the app.
Sometimes the client needs to take the app out with them on the iPad, so after discarding other options (like PhoneGap, because I need to maintain a MySQL DB for some functions) I realized that Application Cache may be a good solution: they can use the application with the web server (using DB functions like randomizing the content and generating statistics), and when they are offline they can access a local copy of the content, with limited but working functionality.
The problems I have are that the site has images, video, and audio, so there is at least 20 MB to cache, and I read that Application Cache can only store 5 MB. The other problem is that my content is dynamic, so I can't add all the files I need to the cache manifest. I want something like doing a wget of the site (saving a static HTML copy) and using the dynamic content when online. I don't know if I can make something like that.
Thanks
The cache.manifest on the iPad can store more than 5 MB.
The current iOS limit is 50 MB.
If you cache more than the initial quota, the iPad automatically asks whether you want to increase the storage to 50 MB.
Take a look at this; it explains how to create and implement the cache.manifest. It's a great tutorial.
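Since your content is dynamic, one trick is to generate the manifest with a PHP script so it always lists whatever media files currently exist. A rough, untested sketch (the directory names are made up; you'd reference it from your pages as <html manifest="cache.php">):

    <?php
    // cache.php: sketch of a dynamically generated cache manifest.
    // The media directories below are placeholders.
    header('Content-Type: text/cache-manifest');

    echo "CACHE MANIFEST\n";

    // List every media file currently on disk.
    foreach (array('images', 'video', 'audio') as $dir) {
        foreach (glob($dir . '/*.*') ?: array() as $file) {
            echo $file . "\n";
        }
    }

    // Anything not listed above is fetched from the network when online.
    echo "NETWORK:\n*\n";

    // Because the file list itself changes when content changes, the
    // manifest bytes change too, which is what makes the iPad re-download
    // the cached copy.

Because the listing is built on every request, newly added media shows up in the manifest without you having to edit it by hand.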
Hope this helps.
I'm currently working on a project which allows my employees to upload files to a private work management site that I designed; the site then informs a publicly accessible site that new files have been uploaded and which client those files belong to.
I'm trying to figure out the "best" way to go about doing this. Obviously, giving my clients access to the work management site would be a terrible idea, but the files still need to be saved on the work management site as well.
When I started working on this feature, I figured I'd just write a cron job on the public site (the site clients access) to download the new files every 24 hours, but it looks like there may be hundreds of files (hundreds of megabytes) every 24 hours, so I'm starting to doubt that design. I'm also a little skeptical of using FTP/SFTP/SCP, as that's a possible security issue. Are there other methods I'm overlooking?
Note: I'm using CodeIgniter on the work management site, and Laravel on the public site.
Edit: I should note that both sites will be on the same server and domain, and run under the same user. Are there any issues with writing a "wrapper" which basically forwards the file data through a PHP script to hide the actual download location?
sftp or scp will be just about as secure as anything. Why are you skeptical of using them? You could build a VPN between the two sites, but that's likely more work and resources than using sftp or scp.
Edit: Responding to edited question -- if both sites are on the same server, does that mean they essentially share the same file system (disks)? If so, then it would make sense to simply access the same files from both sites, and write code / configure the client site to display only the files the clients should see and give them only the limited access they should have. It is possible to code this in such a way that the only access they have to the files is through the site's code, yes. For example, this is a standard on/off configuration option in Drupal, if I remember correctly.
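Regarding the "wrapper" idea from the question's edit: yes, a small PHP script can stream a file from a directory outside the web root after an access check, so clients never see the real location. A rough sketch (the storage path and the access check are hypothetical placeholders):

    <?php
    // download.php: sketch of a download wrapper. STORAGE_DIR and
    // is_client_allowed() are hypothetical names for this example.

    define('STORAGE_DIR', '/var/www/shared-uploads'); // outside the web root

    function is_client_allowed(array $session, $file)
    {
        // Placeholder check: in a real app, look the file up in the DB
        // and compare its owner against the logged-in client.
        return isset($session['client_id']);
    }

    session_start();

    $file = isset($_GET['file']) ? basename($_GET['file']) : ''; // strips ../ tricks
    $path = STORAGE_DIR . '/' . $file;

    if ($file === '' || !is_file($path) || !is_client_allowed($_SESSION, $file)) {
        http_response_code(403);
        exit('Not allowed');
    }

    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . filesize($path));
    header('Content-Disposition: attachment; filename="' . $file . '"');
    readfile($path);

Since both sites share the same disks, the work management site can write into that directory and the public site can serve from it through this script, with no copying at all.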
I'm trying to create a simple script which takes a URL from a form, downloads the file, and delivers it to the user. Something like a proxy server, only for downloading files. The only problem is that the server has a limited execution time of 10 seconds, which will fail for most large files. I can't change the execution time (using set_time_limit) because that's blocked too. Is there ANY way I can get past this?
You can use a cloud storage service to store the file for you:
Google Drive API
dropbox-php
To "deliver" the file to the user, you share the link from your cloud storage service.
PS: Sorry for the previous wrong answer; ftp_get only works if you are trying to get the file from an FTP server.
Get a new web host. A cheap one that I can recommend is DreamHost, which is pretty darn cheap, and they have a lot of php.ini settings you can override (but not all). Or, if you're just playing around and looking for something temporary, I recommend AWS EC2: the micro instance is as cheap as $0.02/hour depending on the region you select, you get one month for free, and most importantly, you get FULL root access.
Edit:
Forgot to mention where to find the info on overriding PHP settings: wiki.dreamhost.com/index.php/PHP.ini
(Sorry I can't make it a link; I'm a n00b on Stack Overflow and am limited to 2 links.)
I have a website that currently uses two servers, an application server and a database server, but the load on the application server is increasing, so we are going to add a second application server.
The problem I have is that the website lets users upload files to the server. How do I get the uploaded files onto both servers?
I do not want to store images directly in a database, as our application is already database-intensive.
Is there a way to sync the files across the servers, or is there something else I can do?
Any help would be appreciated.
Thanks
EDIT: I am adding the following links, which helped me understand this question better:
Synchronize Files on Multiple Servers
and
Keep Uploaded Files in Sync Across Multiple Servers - LAMP
For everyone reading this post: NFS seems to be the better of the two.
NFS will keep files in sync. You could also use FTP to upload the files to all the servers, but NFS looks like the way to go.
This is a question for Server Fault.
Anyway, I think you should definitely consider getting into the "cloud".
Syncing uploads from one server to another is simply unreliable: you have no idea what kind of errors you can get and why. The syncing process will also put load on both servers. For me, the proper solution is going to the cloud.
Should you choose the syncing method, you have a couple of options:
Use rsync to sync the files you need between the servers.
Use crontab to sync the files every X minutes/hours/days.
Copy the files upon some event (user login etc.); there is a rough sketch of this after the list.
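For the "copy on event" option, something along these lines could run right after an upload is saved (the host, user, and paths are made up for the example; key-based SSH between the servers is assumed):

    <?php
    // Sketch: mirror a freshly uploaded file to the second app server
    // with rsync over SSH. app2.example.com and the paths are placeholders.

    function mirror_upload($localPath)
    {
        $remote = 'deploy@app2.example.com:/var/www/uploads/';

        $cmd = sprintf(
            'rsync -az %s %s 2>&1',
            escapeshellarg($localPath),
            escapeshellarg($remote)
        );
        exec($cmd, $output, $status);

        if ($status !== 0) {
            // Don't fail the upload; log it and let a cron job retry later.
            error_log('rsync failed: ' . implode("\n", $output));
        }

        return $status === 0;
    }

The same rsync command works from crontab as well, which covers the "every X minutes" option with the retry behaviour built in.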
I got this answer from server fault:
The most appropriate course of action in a situation like this is to break the file share into a separate service of its own. Don't duplicate files if you have a network that can let the files be "everywhere (almost) at once." You can do this through NFS/CIFS or through a proper storage protocol like iSCSI. Mount as local storage in the appropriate directory. Depending on the performance of your network and your storage needs, this could add a couple of undetectable milliseconds to page load time.
So using NFS to share the files between the servers would work, OR
as stated by @kgb, you could designate one single server to hold all uploaded files and have the other servers pull from it (just make sure you run a cron job or something to back the files up).
Most sites solve this problem by using a 3rd party designated file server like Amazon S3 for the user uploads.
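If you go the S3 route, the upload handler can push each file straight to the bucket instead of the local disk, so every app server sees the same files. A rough sketch with the AWS SDK for PHP (v3 assumed; the bucket name, region, and form field are placeholders, and credentials are expected to come from the environment):

    <?php
    // Sketch only: send an uploaded file to S3 instead of local storage.
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $s3 = new S3Client(array(
        'version' => 'latest',
        'region'  => 'us-east-1',            // placeholder region
    ));

    $result = $s3->putObject(array(
        'Bucket'     => 'my-user-uploads',   // placeholder bucket
        'Key'        => 'uploads/' . basename($_FILES['upload']['name']),
        'SourceFile' => $_FILES['upload']['tmp_name'],
    ));

    // Store this URL (or a signed URL for private files) in your DB
    // instead of a local path.
    echo $result['ObjectURL'];

That keeps the app servers stateless, so adding a third one later needs no extra sync work.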
Another option is a piece of software called BTSync. It is very easy to install and use and lets you easily keep files in sync across as many servers as you need. It takes only three terminal commands to install and is very efficient.
Take a look here and here.
You can use the DB server for storage too... not in the DB, I mean; run a web server there as well. It is not going to increase the CPU load much, but it is going to require a better channel.
You could do it with rsync. People have suggested using NFS, but that way you create a single point of failure: if the NFS server goes down, both your servers are screwed. Correct me if I'm wrong.
I have a simple CRM system that allows sales to put in customer info and upload appropriate files to create a project.
The system is already hosted in the cloud, but the office internet upload speed is horrendous. One file may take 15 minutes or more to finish, causing a bottleneck in the sales process.
Upgrading our office internet is not an option; what other good solutions are out there?
I propose splitting the project submission form into two parts. The project info fields are posted directly to our cloud server web app and stored in the appropriate DB table, while the file itself is submitted to a LAN server with a simple DB and API that the cloud-hosted web app can communicate with to retrieve the file via a download link if it is ever needed again. Details still need to be worked out for this set-up, but this is what I want to do in general.
Is this a good approach to solving this slow upload problem? I've never done this before, so are there any obstacles to this implementation (cross-domain restrictions come to mind, but I believe that can be fixed by using an iframe)?
If bandwidth is the bottleneck, then you need a solution that doesn't chew up all your bandwidth. You mentioned that you can't upgrade your bandwidth - what about putting in a second connection?
If not, the files need to stay on the LAN a little longer. It sounds like your plan is to keep the files on the LAN forever, but you could store them locally at first and then push them to the cloud later.
When you do copy the files out to the cloud, be sure to compress them and also set up rate limiting (so they take up maybe 10% of your available bandwidth during business hours).
Also put some monitoring in place to make sure the files are being sent in a timely manner.
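A rough sketch of that "store locally, push later" job, meant to be run from cron outside business hours (the paths, host, and bandwidth cap are placeholders; rsync's --bwlimit value is in KB/s):

    <?php
    // push_to_cloud.php: sketch of a scheduled push of locally stored
    // uploads to the cloud server. Everything below is a placeholder.

    $pending = glob('/var/crm/pending-uploads/*') ?: array();

    foreach ($pending as $file) {
        // Compress before sending to save bandwidth (fine for modest files;
        // very large files would be better streamed).
        $gz = $file . '.gz';
        file_put_contents($gz, gzencode(file_get_contents($file), 9));

        // --bwlimit caps the transfer rate so a daytime push can't
        // saturate the office connection.
        $cmd = sprintf(
            'rsync -a --bwlimit=200 %s crm@cloud.example.com:/srv/crm/uploads/ 2>&1',
            escapeshellarg($gz)
        );
        exec($cmd, $out, $status);

        if ($status === 0) {
            unlink($gz);
            unlink($file);   // only remove the original after a clean push
        } else {
            error_log('Push failed for ' . $file . ': ' . implode("\n", $out));
        }
    }

Logging the failures (and alerting on them) covers the monitoring point above.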
I hope nobody needs to download those files! :(
I have a dynamic PHP site (based on the Yii framework). Users have to log in to do anything on the site. I am trying to understand how caching and CDNs work, and I am a bit confused.
Caching (memcache):
My site has a good amount of CSS, JS, and images. I've been given to understand that enabling caching ("memcache"?) will GREATLY speed up my site. But this has me confused: how does caching help? I mean, how can you cache something that comes out of the DB differently for each user? For instance, user 1 logs in and sees his control panel; user 2 logs in and sees theirs.
How do I determine what to cache? Also, how do I enable caching (memcache)?
CDN:
I have been told to use a content delivery network like CloudFlare. It is supposed to automatically cache my site. So, when user 1 logs in, what will it cache? Will it cache only the homepage CSS, JS, and homepage images, since everything else requires login? What happens when a user logs out? I mean, do "sessions" interfere with the working of a CDN?
Does serving up images via a CDN significantly reduce the load on my server? I don't have much cash for a clustered-server configuration, so I just want my (shared) server to be able to devote all its resources to processing PHP code. How much load can I save by using "caching" (something like memcache) and/or a "CDN" (something like CloudFlare)?
Finally,
What would be the general strategy for caching, a CDN, and basic performance optimization in this scenario? Do I need to make any changes to my PHP code to enable a CDN like CloudFlare and to enable/implement/configure caching? What can I do that would take the least amount of developer/coding time and make my site run much, much faster?
Oh wait, some of my pages, like the "about us" page, are going to be static HTML too. But they won't get as many hits, except maybe the iframe page that will be used for my Facebook Page.
I actually work for CloudFlare and thought I would hop in to address some of the concerns.
"do I need to make any changes to my php-code to enable CDN like
CloudFlare and to enable/implement/configure caching? What can I do
that would take least amount of developer/coding time and will make my
site run much much faster?"
No, there is nothing like a need to rewrite URLs, etc. We automatically cache static content by file extension. It does require changing your DNS to point to us, however.
"Does serving up images via a CDN significantly reduce the load on my server?"
Yes, and it should also help most visitors access the site faster and save you a fair amount on bandwidth.
"Oh wait, some of my pages like "about us" page etc. are going to be
static html too."
CloudFlare doesn't cache HTML by default. You use PageRules to setup more advanced caching options for things like static HTML.
Caching helps because, instead of performing disk I/O for every user request, the data is stored in memory (i.e. in memcached). This provides a SIGNIFICANT increase in performance.
Memcache is generally used for caching data, e.g. query results.
http://pureform.wordpress.com/2008/05/21/using-memcache-with-mysql-and-php/
There are lots of tutorials.
I have only ever used Amazon S3, which is not quite a CDN; it is more of a storage platform, but it still helps take the load off my own servers when serving media.
I would put all of your static resources on a CDN so your own server does not have to serve them. This would not require any modification to your PHP code. Static resources include your JS and CSS.
For your static pages (your about page), I'd make sure that PHP isn't processing them, since there is no reason for it; your web server should serve them directly.
Caching will require changes to your code. The normal flow for caching is:
1) The user makes a request.
2) Check whether the data is in the cache.
3) If it is not in the cache, do the DB query and put the result in the cache.
4) If it is in the cache, retrieve it from there.
5) Return the data.
You can cache anything that requires disk I/O, and you should see a speedup.
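A minimal sketch of that flow using PHP's Memcached extension (the server address, key name, and query are made up; a running memcached server and a PDO connection are assumed):

    <?php
    // Cache-aside sketch: check the cache first, fall back to MySQL,
    // then store the result for the next request.
    // Assumes memcached on 127.0.0.1:11211 and a PDO connection, e.g.
    // $db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

    $cache = new Memcached();
    $cache->addServer('127.0.0.1', 11211);

    function get_user_profile(Memcached $cache, PDO $db, $userId)
    {
        $key = 'user_profile_' . $userId;

        // 2) check whether the data is already in the cache
        $profile = $cache->get($key);
        if ($profile !== false) {
            return $profile;                          // 4) + 5) cache hit
        }

        // 3) not cached: run the DB query and store the result
        $stmt = $db->prepare('SELECT id, name, email FROM users WHERE id = ?');
        $stmt->execute(array($userId));
        $profile = $stmt->fetch(PDO::FETCH_ASSOC);

        $cache->set($key, $profile, 300);             // keep it for 5 minutes

        return $profile;                              // 5) return the data
    }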
Memcached works by storing data (usually query results pulled from a database server, whether remote or on the same machine) in memory on the web server as simple key-value pairs. Reading that data back from memory is much, much, muuuuuch faster than running the query again each time. This is typically useful when you have data that can safely be stored for certain periods of time because it is not subject to constant change.
The way this works: say you want to cache a user's account information to speed up the pages where that user is logged in. You load the information once and cache it; on any subsequent requests for that data, it loads in a fraction of the time it would normally take to fetch it from the database. Obviously you need to make sure that you update/re-cache that information if the user changes it while logged in, but you will greatly reduce the time it takes to serve up pages if you implement a caching system that minimizes the time spent waiting on the database.
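And a small sketch of the "update/re-cache on change" part, reusing the same hypothetical cache key from the earlier example (the table and column names are made up):

    <?php
    // Sketch: after writing a change to the database, drop the stale
    // cached copy so the next read re-caches fresh data.
    // $cache is a connected Memcached instance, $db a PDO connection.

    function update_user_email(Memcached $cache, PDO $db, $userId, $newEmail)
    {
        // 1) write the change to the database first
        $stmt = $db->prepare('UPDATE users SET email = ? WHERE id = ?');
        $stmt->execute(array($newEmail, $userId));

        // 2) invalidate the cached profile; it will be rebuilt on next read
        $cache->delete('user_profile_' . $userId);
    }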
I'm personally not familiar with CloudFlare, so I can't offer any advice on that, but in terms of implementing caching in your application, you should check out:
http://code.google.com/p/memcached/wiki/NewOverview
and read the rest of the wiki entries there, which cover installation, implementation, etc. That should get you started on the right track.