I have a library file larger than 50 MB, so I can't deploy it to my instances with Git. I include this file in some of my PHP scripts, so what should I do to keep my instances replicable while still including this file in my scripts?
I could store it in an S3 bucket, but I'm not sure whether including external files like that is good practice.
Files that are needed but not practical to keep in a repository are perfect for S3. I typically create a [companyname]-ops or [companyname]-assets S3 bucket with narrow access, such as a read-only IAM role for standard machines.
Part of the deployment process is to push (or pull) your code, and to pull assets from S3.
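For example, a deployment bootstrap step could pull the asset with the AWS SDK for PHP. This is just a sketch; the bucket, key, region and paths are placeholders, and credentials are assumed to come from the instance's read-only IAM role:

<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// Pull the large library file from the ops/assets bucket during deployment.
// Nothing is hard-coded here credential-wise: the SDK picks up the
// instance's IAM role automatically.
$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

$s3->getObject([
    'Bucket' => 'companyname-assets',          // placeholder bucket
    'Key'    => 'vendor/big-library.phar',     // placeholder key
    'SaveAs' => '/var/www/lib/big-library.phar',
]);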
Obviously this can be done a million ways. I tend to think of code repos, databases, and S3 as tools that each have their own uses in deployments.
Storing the files on S3 is one option that I have seen quite often.
Alternatively you can bake your own AMI with the file already included, so you don't require extra bootstrapping. This should also speed up the whole replication process.
Related
I'm building a website with billions of images, and I'm a little confused about storing them all in a single directory. How many images can be stored in a single directory, and will it slow down the server?
Have you considered object storage such as AWS S3? http://aws.amazon.com/s3/
As for performance, I think it depends on the file system you intend to use. Some file systems index directory contents in a linear manner, while others use more efficient algorithms. It also depends on whether any of your system services will need to scan the directory regularly.
I found this: http://events.linuxfoundation.org/slides/2010/linuxcon2010_wheeler.pdf
In this question: https://serverfault.com/questions/43133/filesystem-large-number-of-files-in-a-single-directory
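As a side note, if you do keep the images on a local file system, a common workaround for huge flat directories (just a sketch, not taken from the slides above) is to shard files into nested subdirectories based on a hash of the filename:

<?php
// Hypothetical helper: map a filename to a sharded path such as
// /var/www/uploads/ab/cd/photo.jpg so no single directory grows too large.
function shardedPath($baseDir, $filename)
{
    $hash = md5($filename);                  // spreads files evenly across shards
    $dir  = $baseDir . '/' . substr($hash, 0, 2) . '/' . substr($hash, 2, 2);

    if (!is_dir($dir)) {
        mkdir($dir, 0755, true);             // create the shard on demand
    }

    return $dir . '/' . $filename;
}

// Usage (names are placeholders):
// move_uploaded_file($_FILES['img']['tmp_name'], shardedPath('/var/www/uploads', $safeName));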
I'm looking for a solution to an application need. I need a web-based file manager/explorer that works with Amazon S3 buckets. The problem with most potential solutions I have found is that they rely on S3 itself to maintain the directory hierarchy. This is bad because it means additional latency when traversing folders (and listing their contents).
What I would like is a PHP app/class that maintains the directory structure (and filenames) in a database, so that listing/traversing files and directories is quick and S3 is only accessed when actually downloading or uploading a file.
Does anyone know of anything like this? I'm hoping there is something already in existence rather than taking the time to build from scratch.
Thanks!
I'd definitely recommend using Gaufrette.
It abstracts away the filesystem, so you're able to switch between local storage, FTP, SFTP, S3, etc. simply by switching the adapter.
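A rough sketch of what that switch looks like with Gaufrette (the bucket name, directory and keys are placeholders, and the S3 adapter assumes the AWS SDK for PHP is installed):

<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Gaufrette\Adapter\AwsS3 as AwsS3Adapter;
use Gaufrette\Adapter\Local as LocalAdapter;
use Gaufrette\Filesystem;

// Local adapter for development...
$adapter = new LocalAdapter('/var/files');

// ...or swap in the S3 adapter in production without touching the rest of the code:
// $s3      = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);
// $adapter = new AwsS3Adapter($s3, 'my-bucket-name');

$filesystem = new Filesystem($adapter);

$content = 'hello world';
$filesystem->write('docs/report.txt', $content, true); // true = overwrite if it exists
$data = $filesystem->read('docs/report.txt');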
I have an idea that will make all Linux servers running PHP more secure.
I want to configure php.ini File Uploads to scan any new file upload with ClamAV's clamscan.
Is it possible? How should I configure it?
While there are libraries for interacting with ClamAV from within PHP scripts, there is no built-in way to have PHP automatically scan every uploaded file.
This is mainly for flexibility. For example, if you ran a security website that tracks various viruses and stores a copy of each, ClamAV would catch them during upload, rendering your website useless. The better approach is to do this in your own application code, using one of the ClamAV libraries mentioned above to perform the scanning.
Having said that, I don't believe this is impossible to achieve. You could write a PHP extension that hooks into file uploads to automatically run ClamAV on uploaded files.
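For the "do it in your own code" approach, a minimal sketch is to shell out to clamscan against the uploaded temp file before accepting it. This assumes ClamAV is installed on the server; clamscan exits with 0 for clean and 1 for infected:

<?php
// Hypothetical upload handler: scan the temporary file before moving it.
$tmpFile = $_FILES['upload']['tmp_name'];

exec('clamscan --no-summary ' . escapeshellarg($tmpFile), $output, $exitCode);

if ($exitCode === 0) {
    // Clean: accept the upload.
    move_uploaded_file($tmpFile, '/var/www/uploads/' . basename($_FILES['upload']['name']));
} else {
    // 1 = virus found, 2 = scan error; reject the file either way.
    unlink($tmpFile);
    http_response_code(400);
    echo 'Upload rejected.';
}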
I have a headless home server for dev and testing. For small projects my workflow is just to drag a few files through SFTP to a temp folder on my work machine and then upload them to the "production" server. But now I'm facing a big project where most of the release files don't change between releases. The production server is shared hosting with Apache. I don't want to waste time or bandwidth. It seems that revision control software could suit the purpose, but I can't install anything on the shared host beyond Apache. What could the workflow be? Is there any "Subversion", "Git", (...) for web deployment? Any other solution?
Thanks in advance
You need SSH (or direct) access to do that (an SVN or Git deploy), but you can use Phing for deployment and Composer for the dependencies.
https://github.com/composer/composer
http://www.phing.info/trac/
You can use rsync the same way you use SFTP (obviously, the server has to support it), and it is nearly instantaneous for something like 65,000 files (when only maybe 2 have changed).
Something like Git or SVN can be much better and has more features, but if you want something simple and you are a single dev, you can use a backup system + rsync + a diff tool like Meld or WinMerge.
Possible workflow:
You develop on workstation A, in folder "dev/".
You review the changes and transfer them with diff/WinMerge to folder "rc/".
You upload "rc/" to "public_html/" on the public server with rsync (see the sketch after this list).
When you copy changes from "dev/" to "rc/" you re-read them and check that they make sense, transferring only the changes that make sense and don't make "rc/" unstable. If all the changes are safe, you can do it with a single click.
This is an inferior system to something based on Git/SVN.
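A minimal sketch of the rsync step wrapped in PHP, assuming the shared host supports rsync over SSH (the host and paths are placeholders):

<?php
// Preview exactly which of the ~65,000 files would be transferred,
// then run the real sync once the preview looks right.
$source = '/home/me/projects/big-project/rc/';
$target = 'user@sharedhost.example.com:public_html/';

// -n is dry-run: show what would change without touching the server.
passthru('rsync -avzn ' . escapeshellarg($source) . ' ' . escapeshellarg($target));

// Real sync (uncomment when happy with the preview):
// passthru('rsync -avz ' . escapeshellarg($source) . ' ' . escapeshellarg($target));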
Version control systems and deployment systems are really different classes of tools for different jobs. The question you should be asking is "How do I deploy a web application?", I think.
Phing was already recommended; on the other hand, you can (using the tools of your SCM, if you use one) export the changed files after each changeset and transfer them to the shared hosting using any available transport. I can't see a problem here.
"For small projects my workflow is just to drag a few files through SFTP to a temp folder on my work machine and then upload them to the 'production' server"
is perfectly applicable to big projects also.
I am developing (as a solo web developer) a rather large web-based system which needs to run at various different locations. Unfortunately, because some clients have dial-up, we have had to do it this way rather than have a central server for them all. Each client is part of our VPN, and those on dial-up/ISDN get dialled on demand from our Cisco router. All clients are accessible within a matter of seconds.
I was wondering what the best way to release an update to all these clients at once would be. Automation would be great, as there are 23+ locations to deploy the system to, each of which is used on a very regular basis. Because of this, when deploying, I need to display an 'updating' page so that the clients don't try to access the system while the update is only partially complete.
Any thoughts on what would be the best solution?
EDIT: Found FileSyncTask which allows me to rsync with Phing. Going to use that.
There's also a case here for maintaining a "master" code repository (in SVN, CVS or maybe Git). This isn't your standard "keep revisions of your code in the repo and allow rollbacks"... this repo holds your current production code (only). Once an update is ready, you check the working updated code into the master repo. All of your servers check the repo on a scheduled basis to see if it has changed, downloading new code if a change is found. That check process could even include turning on the maintenance.php file (that symcbean suggested) before starting the repo download and removing the file once the download is complete.
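A rough cron-driven sketch of that scheduled check, using Git here (the paths, branch name and maintenance-file location are all assumptions):

<?php
// Hypothetical scheduled check (run from cron): pull from the master repo
// and serve the maintenance page while the working copy is being updated.
$docroot = '/var/www/html';
$lock    = $docroot . '/maintenance.php';

// Has anything new been pushed to the master repo?
exec('cd ' . escapeshellarg($docroot) . ' && git fetch && git rev-list HEAD..origin/master --count', $out);
$behind = isset($out[0]) ? (int) $out[0] : 0;

if ($behind > 0) {
    copy('/usr/local/share/maintenance.php', $lock);    // turn the maintenance page on
    exec('cd ' . escapeshellarg($docroot) . ' && git merge --ff-only origin/master');
    unlink($lock);                                       // and off again once the code is in place
}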
At the company I work for, we work with huge web-based systems built in both Java and PHP. For all systems we have our development environments and production environments.
This company has over 200 developers, so I guess you can imagine the size of the products we develop.
What we have done is use Ant and RPM to build archives for creating deployment packages. This is done quite easily. I haven't done this myself, but it might be worth it for you to look into.
Because we use Linux systems we can easily deploy RPM packages; the setup scripts within an RPM package make sure everything gets to the correct place. You also get proper version handling and a release process.
Hope this helped you.
Br,
Paul
There are two parts to this; let's deal with the simple one first:
I need to display an 'updating' page
If you need to disable the entire site while maintaining transactional integrity and publishing a message to the users from the server being updated, then the only practical way to do this is via an auto-prepend script. This needs to be configured in advance (note: I believe this can be done in a .htaccess file, without restarting the webserver for a new PHP config):
<?php
if (file_exists($_SERVER['DOCUMENT_ROOT'] . '/maintenance.php')) {
    include_once($_SERVER['DOCUMENT_ROOT'] . '/maintenance.php');
    exit;
}
Then just drop maintenance.php into your webroot and that file will be displayed instead of the expected file. Note that it should probably include a session_start() and an auto-refresh to ensure the session does not expire. You might want to extend the above to allow a grace period where POSTs will still be processed, e.g. by adding a second PHP file.
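A minimal maintenance.php along those lines might look like the sketch below (the auto-prepend line shown in the comment assumes mod_php, and the file names are placeholders):

<?php
// The check shown above lives in a prepend script, enabled via .htaccess
// under mod_php, e.g.:
//   php_value auto_prepend_file "/var/www/prepend.php"
session_start();                 // keep the user's session alive during the update
header('Refresh: 30');           // retry automatically every 30 seconds
?>
<html>
  <head><title>Maintenance</title></head>
  <body>
    <p>The system is being updated. This page will refresh automatically.</p>
  </body>
</html>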
In terms of deploying to the remote sites, I'd recommend using rsync over SSH for copying content files, invoked via a controlling script (see the sketch after this list) which:
Applies the lock file(s) as shown above
Runs rsync to replicate files
Runs any database deployment script
Removes the lock file(s)
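A rough sketch of such a controlling script in PHP, shelling out to scp, rsync and ssh (the host list, paths and database script name are placeholders, and key-based SSH access is assumed):

<?php
// Hypothetical deploy controller for the remote sites.
$sites = array('site1.example.com', 'site2.example.com');

foreach ($sites as $host) {
    // 1. Drop the lock file so the maintenance page is served.
    exec("scp maintenance.php deploy@$host:/var/www/html/maintenance.php");

    // 2. Replicate the content files (keep the lock file in place while copying).
    exec("rsync -az --exclude maintenance.php ./build/ deploy@$host:/var/www/html/");

    // 3. Run any database deployment script on the remote host.
    exec("ssh deploy@$host 'php /var/www/html/deploy/db-migrate.php'");

    // 4. Remove the lock file(s) again.
    exec("ssh deploy@$host 'rm /var/www/html/maintenance.php'");
}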
If each site has a different setup then I'd recommend either managing the site-specific stuff via a hierarchy of include paths, or even maintaining a complete image of each site locally.
C.