WindowsAzure: Develop on deployed PHP Files - php

I work a lot with the WindowsAzure4E(clipse) IDE, and it's always a pain to wait for the local test deployment.
Isn't there a way to develop on the deployed PHP files, which must be stored somewhere under inetpub or elsewhere?
Thanks for your ideas.

Yes! In fact, I just got this working myself yesterday.
After installing PHP 5.3 with CGI support for IIS (making the necessary php.ini modifications of course), I simply created a new site in IIS that mapped to a role in the workspace for my Eclipse project.
Keep in mind that there's one hiccup to this: the php_azure.dll file, which is used to access the service configuration and mount Azure drives, was built to run in the Azure fabric (either development or hosted). In my case I don't NEED these features, so I removed references to things like getconfig and, poof, the project loads in IIS just fine. I only need to make sure I start Azure Storage prior to launching the application.
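For what it's worth, rather than deleting those calls outright you can guard them, so the same code runs both inside and outside the fabric. A minimal sketch, assuming the extension exposes the call as azure_getconfig() (I'm guessing the exact name from the "getconfig" mention above; check your SDK version):

<?php
// Read a setting from the Azure service configuration when php_azure.dll is
// loaded; otherwise fall back to an environment variable or a default value.
function get_setting($name, $default = null)
{
    if (function_exists('azure_getconfig')) {   // running in the Azure fabric
        return azure_getconfig($name);
    }
    $value = getenv($name);                     // running in plain IIS / dev
    return ($value !== false) ? $value : $default;
}

$account = get_setting('StorageAccountName', 'devstoreaccount1');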
I've been told that some folks are able to update their systems path environment variable with the location of the azure diagnostics dll (diagnostics.dll) and have it work without this modification. But this route didn't work for me. :(
I'll actually be blogging on this more this weekend as it took me a week of evenings to get things sorted out.

I found out that after the deployment the project files are copied to the ServiceDefinition.csx folder.
If you edit the source code in that location, you can see the changes directly, without another deployment.

Related

Multiple Instances of XAMPP on an External Portable Harddisk

Running Windows 11 (64-bit), I need to install multiple instances of XAMPP on my external hard disk. The drive will always be assigned the letter Z.
The purpose is to learn XAMPP, including what the stack is made of, i.e. Apache, MariaDB/MySQL, and PHP. The learning route is through various tutorials and online video courses such as:
WordPress Full Course in ONE VIDEO | ZERO to HERO | STEP BY STEP
How To Make A Digital Agency Website From Scratch In 2022 (WordPress And Elementor For Beginners)
These tutorials/courses ask you to build projects; the first is centered around "mywebsite" and the second around "jimakes". To do so, I was thinking of having one installation of XAMPP at the following locations (I am using Lab as the main folder on drive Z):
Z:\Lab\xampp\htdocs\mywebsite
Z:\Lab\xampp\htdocs\jimakes
Now I will have one XAMPP installation and multiple WordPress installations (the WordPress installations will be in each project's respective folder so that tinkering with one WordPress installation does not affect the other).
The problem: while following the second course/tutorial and installing the WordPress theme (jimakes), Astra, I ran into problems that required me to change some settings in the .htaccess and php.ini files, which I messed up. This broke "mywebsite" as well, and I had to uninstall everything and do a clean reinstall of XAMPP and WordPress, which also meant removing the htdocs folder. (There is an option to keep that folder during the uninstall process, but I don't want to use it, just to be sure I have a clean reinstall with default settings.)
Now I am thinking to have multiple XAMPP installations and a single WordPress installation such as
Z:\Lab\mywebsite\xampp\htdocs\PUBLIC_HTML
Z:\Lab\jimakes\xampp\htdocs\PUBLIC_HTML
The purpose is twofold. First, I can happily chip away at messing everything up in one project and it won't affect the other. Second, I am trying to replicate a real live hosting server from one of the providers such as GoDaddy or HostGator as closely as possible, so that migrating from the local development environment to the production one is as painless as possible when I move on to real projects.
Now, to cut a long story short (and trying to catch the river in a cup), I would ask:
How do I create an exact replica of the hosting provider's environment in a local development environment, with the aim of first learning technologies such as PHP, MariaDB/MySQL, and WordPress, and later exporting real-life projects from local development to the hosting environment with the confidence that all I need to do is move the files (FTP via FileZilla?)? This cannot be fully achieved since, if I am not wrong, one can never install cPanel locally. Secondly, who has the time to account for so many variables; for example, you cannot create a new database or user using phpMyAdmin in a hosting account, but you can in a local environment.
WHM & Cpanel cannot create database
What needs to be changed once and only once? You will notice that I have changed htdocs to PUBLIC_HTML as the document root folder (is it the same as the server folder?). The aim is to actually learn PHP, WordPress, and MySQL without being bothered about why it's not working. I know that figuring out why it's not working is itself an important part of learning, but it just muddies the waters when your objective is to learn the technologies and not why it's not working (the "why it's not working" comes later; don't ask me why, I am too dumb to learn two things at a time).
I tried installing Apache, MySQL/MariaDB, and PHP manually without using XAMPP. This resulted in learning more about installations, errors, and how to fix them, and so on, rather than actually learning PHP, MariaDB/MySQL, and WordPress. Exactly the thing that I was trying to avoid. Some would argue you cannot learn one without the other, but I already have HTML and CSS under my belt (I know, I know, JavaScript, I will get there), and again the purpose is to move to the backend and save time by avoiding errors and the worry of fixing them (it took me two hours to figure out why my manual PHP installation was not working; it turns out one needs to set PATH in the system environment variables in Windows 11).
PHP Not Executable in Command Prompt Windows, Environment Variable is set
Why can't I just use a hosting provider's production environment? Then why does XAMPP exist?
You have a lot of questions here; you should split them into individual questions. I will help you with the one in the title.
It is a common situation to have many projects in a dev environment, but you can run them all with a single XAMPP instance and a single port, such as 80, on the dev machine.
You can set up multiple virtual hosts, so instead of using the default http://localhost you can create a nicer hostname for each project, e.g.:
http://project1.local
http://project2.local
and proceed with development with no pain. Here is a tutorial teaching how you can create this environment (a rough sketch of the resulting config follows the link):
https://www.wdb24.com/how-to-setup-multiple-virtual-hosts-on-xampp/
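To give a rough idea of what that tutorial walks you through (hostnames are the ones from the answer above, and the document root reuses the mywebsite folder from the question; adjust both to your setup), first map the names in C:\Windows\System32\drivers\etc\hosts:

127.0.0.1  project1.local
127.0.0.1  project2.local

then add one block per project to xampp\apache\conf\extra\httpd-vhosts.conf and restart Apache (Apache 2.4 syntax):

<VirtualHost *:80>
    ServerName project1.local
    DocumentRoot "Z:/Lab/xampp/htdocs/mywebsite"
    <Directory "Z:/Lab/xampp/htdocs/mywebsite">
        Require all granted
    </Directory>
</VirtualHost>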

Setting up a Laravel 4 app on a VPS

I am trying to deploy my first Laravel App. So I hope I am providing all the necessary info. I have walked down several paths trying to deploy this app. I tried a shared hosting account, but found too many errors to continue deploying my Laravel app. In the meantime, someone has said to me I need a VPS, so I may go with that.
So with a new VPS, I am now trying to install the following: phpMyAdmin, node.js, Composer, and Laravel 4. These are the technologies I am using on my local server with MAMP. After being overwhelmed with the information on installing each on a VPS, I have found myself extremely confused. Some places say I need to install Ubuntu. Some say I need to install Apache first. Some talk about using CentOS. I honestly have no idea what I need to install, and in what order. All I really need is to figure out how to set up a PHP environment on my VPS with phpMyAdmin, Node.js, and Composer. After that I am pretty sure it's all straightforward, as far as installing my app.
I also saw someone talking about committing my app to Git and then cloning it to the VPS. If I did this, I would still need to set up the environment, correct? Once again, I hope I have provided the necessary information. If my question is not clear, could you please refer me to a resource that I can study?
You don't need to install Laravel separately from the app it is part of - these days a PHP app just contains everything it needs in its vendor folder. How to deploy depends on how you have arranged your dependencies locally, but the simplest way is to copy everything in your local project to your remote server (FTP or rsync). I don't think Laravel demands a VPS, but if you are using Node as well, then yes, you will need one.
So, the short answer is: if it works locally, copy it up to the remote host, and it should work there. Make sure you've set up your config system in your app so that it can cope with the different settings you need in local/remote environments, such as database connection settings.
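As a very small sketch of what that kind of environment-aware config can look like in plain PHP (the hostname check, file name, and keys are all just illustrative; Laravel 4 has its own mechanism for this via the app/config/{environment} folders and the environment detection in bootstrap/start.php):

<?php
// config.php - pick a settings block based on where the code is running.
$env = (gethostname() === 'my-production-vps') ? 'production' : 'local';

$settings = array(
    'local' => array(
        'db_host' => 'localhost',
        'db_name' => 'myapp_dev',
        'db_user' => 'root',
        'db_pass' => '',
    ),
    'production' => array(
        'db_host' => 'localhost',
        'db_name' => 'myapp',
        'db_user' => 'myapp',
        'db_pass' => 'change-me',
    ),
);

return $settings[$env];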
My feeling is that a shared host would be easier for you as a beginner - is the Node.js component of your app critical? Running your own VPS is not difficult, but there is quite a bit to learn. Your distro (such as Ubuntu) would come pre-installed, and on top of that you would use the package system (something like apt-get) to install Apache, PHP, PHP modules, phpMyAdmin, git, and whatever else you need.
Yes, you can certainly deploy using Git. One way to do this is to create a bare repository on your server in a private location, set it up as a remote on your local dev machine, and push to it as your off-site copy. Then, from your dev or production web folders, pull and update submodules. This is not trivial and requires at least a working knowledge of Git, so for now I wouldn't recommend this route.
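For reference, the bare-repository workflow described above boils down to a handful of commands; a sketch with placeholder paths and names:

# on the VPS: create a private bare repository
git init --bare /home/deploy/repos/myapp.git

# on your dev machine: add it as a remote and push to it
git remote add vps ssh://user@your-vps.example.com/home/deploy/repos/myapp.git
git push vps master

# in the web folder on the VPS: clone once, then pull to update
git clone /home/deploy/repos/myapp.git /var/www/myapp
cd /var/www/myapp && git pull && git submodule update --init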

2 cloud servers, one dev, one prod; what's a good deployment process?

Currently using LAMP stack for my web app. My dev and prod are in the same cloud instance. Now I am getting a new instance and would like to move the dev/test environment to the new instance, separating it from the prod environment.
It used to be a simple Phing script that would do a SVN export into the prod directory (pointed to by my vhost.conf). How do I make a good build process now with the environments separated?
Thinking of transferring the SVN repository to the dev server and then doing an ssh+svn push (is this possible with Phing?)
What's the best/common practice for this type of setup?
More Info:
I'm currently using CodeIgniter for MVC framework, Phing for automated builds for localhost deployment. The web app is also supported by a few CRON scripts written in Java.
Update:
Ended up using Phing + Jenkins. Working well so far!
We use Phing for doing deployments similar to what you have described. We also use the Symfony framework for our projects (which is not that important here, but Symfony supports the concept of different environments, so it's a plus).
However, we still need to produce different configuration files for the database, front controllers, etc.
So we ended up having a folder of build.properties files that define the configuration for different environments (and, in our case, also for the different clients we ship our product to). This folder is linked into the file structure using svn:externals (again, not necessary).
The Phing build.xml file then accepts a property file as a parameter on the command line, takes the values from it, and produces all the necessary configuration files, controllers, and other environment-specific files.
We store the configuration in template files and then use copy/filter feature in Phing to replace the placeholders in the templates with the specific values.
The whole task of configuring the given environment can then be as simple as something like this:
phing configure-environment -DpropertyFile=./build_properties/build.properties.prod
In your build file you check if the propertyFile property that specifies the properties file is defined and load the file using <property file="./build_properties/build.properties.prod" override="true" />. Then you just do any magic with the values as you need.
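A minimal sketch of such a target (the template file name and property names are made up for illustration; Phing's <copy> task with an <expandproperties/> filter does the placeholder replacement described above):

<target name="configure-environment">
    <!-- require -DpropertyFile=... on the command line -->
    <fail unless="propertyFile" message="Pass -DpropertyFile=path/to/build.properties"/>
    <property file="${propertyFile}" override="true"/>

    <!-- fill the ${...} placeholders in the template with the loaded values -->
    <copy file="config/databases.yml.tpl" tofile="config/databases.yml" overwrite="true">
        <filterchain>
            <expandproperties/>
        </filterchain>
    </copy>
</target>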
You can still use your svn checkout/update and put all the resulting configuration files into svn ignore (you will have them generated by phing). We actually use additional steps in Phing. Those steps in the end produce a Linux shell installation self-deploy package. This is produced automatically in Jenkins. We then send the package to our clients or the support team can grab the package from Jenkins and they can do the whole deployment just by executing it (we still prefer manual deployments to production servers) or Jenkins can deploy it automatically (for example to test servers).
I'll be happy to write more info if needed.
I recommend using Capistrano (looks like they haven't updated the docs since they moved the site) and railsless-deploy for doing deployment. Eventually, you are probably going to need to add more app boxes and run other tasks as part of your deployment, so choosing a framework that will support this can save you a lot of time in the future. I have used Capistrano for two PHP deployments (one small and one large) and although it's not perfect, it works well. It also handles all of the code checkout / update, moving symlinks into place, and rolling back if something goes wrong.
Once you have capistrano configured, all you have to do is something like:
cap dev deploy
cap prod deploy
Another option that I have explored for doing this is fabric. Although I haven't used it, if I had to deploy a complex app again, I would consider it. The interface is simple and straightforward.
A third option you might take a look at, though it's still in the early stages of development, is gantry (pardon the self-promotion). This is something I have been working on out of frustration with using Capistrano to deploy a PHP application in an environment with a lot of moving pieces. Capistrano is great and works well for non-PHP application deployments, but you still have to do some poking around in the code to understand what is happening and tweak it to suit your needs. This is also why I suggest giving fabric a good look.
I use a similar config now: LAMP + SVN + CodeIgniter + prd and dev servers.
I run the SVN repos on dev. I check out the repos into the root folder of the dev domain, then use a post-commit hook to update the root folder every time any developer commits.
When we are happy and have fully tested the code, I SSH into the prd server and rsync the dev root to the prd root.
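That sync step is essentially a one-liner; something along these lines (paths are examples, and the --exclude keeps the .svn metadata off the production box):

rsync -av --delete --exclude='.svn' /var/www/dev/htdocs/ user@prd-server:/var/www/prod/htdocs/

Drop --delete if you don't want files removed in dev to also be removed on prd.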
Here's my solution for the different configs. Outside the root folder I have a config.ini file. I parse the file in my CodeIgniter constants.php script. This means that the prd and dev servers can have separate settings without them ever being in the repos.
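The parsing side of that can be as small as this (a sketch; the path and key names are whatever you put in your config.ini, and in CodeIgniter the natural place for it is application/config/constants.php):

<?php
// constants.php - load machine-specific settings kept outside the web root
// (and outside the repository), so prd and dev can differ safely.
$config = parse_ini_file('/var/www/config.ini');

define('DB_HOST', $config['db_host']);
define('DB_NAME', $config['db_name']);
define('DB_USER', $config['db_user']);
define('DB_PASS', $config['db_pass']);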
If you want help with post-commit, rsync and ini code let me know.

Is there a way to use SVN for web development in a Mac shop that uses coda?

So we are pushing to create good processes in our office. I work in a web shop that has been doing web sites for over a decade. And we don't use version control. I know! It's bad, not my fault. I'm the guy with a SoftE background pushing for this at a minimum.
The tech lead has been looking into it. We all use Mac workstations and mostly use Coda for editing since it is a great IDE. It has SVN support built in but expects it to work on local files. We're trying to explore mounting the web directory as a local network drive with an SFTP tool.
We are a LAMP shop, BTW.
I am wondering what the model is here. I think we would typically check out the whole site to our local machine, where we have Apache running, and then test it there? This isn't how we work yet; we do everything on the server. We've looked at checking things in and out, but some files are owned by apache and the ownership changes when I check them in, because I'm not apache.
I just want to know a way to do this that works given my circumstances. Would be nice to not have to run apache locally.
You might want to check out the Coda mailing list and ask there. Lots of Coda enthusiasts there with specific experience.
If you don't want to have to run locally, you could make Apache on your server run a copy of the site for every developer, on a different port per person, and then mount those web roots to the local Macs and make that the working directory. If you're a small shop, that's not hard to manage. I find it pretty easy to set up, and it saves a lot of resources on the local machines. The one-site-per-person approach helps avoid conflicts when multiple people work on the same files at the same time.
What I'd additionally recommend is to have a script that gets the latest changes from SVN and deploys the entire site to the production server when you're ready. You could have that script change permissions on appropriate files/folders as needed to be owned by Apache. The idea once you're using source control is to never manually edit the production files -- you should have something that deploys it from SVN for you.
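Such a deploy script doesn't have to be fancy; a bare-bones sketch (the repository URL, paths, and the www-data user are placeholders for whatever your server uses):

#!/bin/sh
# export a clean copy (no .svn folders) and hand the writable folders to Apache
svn export --force http://svn.example.com/repos/site/trunk /var/www/site
chown -R www-data:www-data /var/www/site/uploads /var/www/site/cache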
A few notes:
Take a look at MacFuse / MacFusion (the latter is the application, the former is the library behind it) to mount remote directories via SSH / FTP as local ones.
Allow your developers to check out into their local environment (with their own LAMP stack if they're savvy), or look into a shared dev environment with individual jails. This way your developers can run their own LAMP stack (which you could deploy for them on the machine) without interfering with others.
The idea being, let them use a workflow that works best for them, to minimize the pain in adapting to this change (if change management might be an issue!)
Just as an example, we have a shared dev server where jails are created with a single command for new developers. They have a full LAMP stack ready to go, and we can upgrade and re-deploy jails easily to keep software up to date. Developers have individual control to add custom settings / extensions if they need them for work, while the sys admins have the ability to reset everything when someone accidentally breaks their environment :)
Those who prefer not to use jails, and are able to, manage their own local environments (typically through Macports or MAMP).

Development environment - VCS from development to staging server to production

I've read a number of topics in the same sort of ballpark as this one, but in all honesty I'm still not exactly sure on the best approach (as a starting point). I am a solo developer in a small office and I have around 30 websites which are hosted on a Linux VPS. I want to start using version control (probably SVN) and also set up a staging server. At the moment, I do development either locally on my machine before using FTP to upload to the live server, or occasionally, for small changes, I edit the remote files directly, which is not an ideal approach.
I'm looking for some guidance on how to improve my development environment. I imagine I should be installing SVN on the web server, which would then allow me to check out versions to my local machine (which would also require SVN i think). Also, if I want to set up a staging server, should I just set up subdomains for each of the live websites, then use these subdomains for showing clients changes to the site before making them live?
Hope this makes sense!
This is what we do at work:
We have a staging server running Apache and a Subversion server. We have a post-commit hook that updates a working copy in the htdocs directory; that way, when a developer commits something it automatically gets updated on the staging server, so everyone can see the latest code.
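The post-commit hook itself is tiny; something like this in the repository's hooks/post-commit file (the working-copy path is an example, and the hook has to run as a user that can write to it):

#!/bin/sh
# bring the staging working copy up to date after every commit
/usr/bin/svn update /var/www/staging/htdocs --non-interactive --quiet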
On the client's production servers (the ones we can control) we have the Subversion client installed and the website is a working copy. When we need to update the live site we log in to a shell and run svn up. If you do something like this, make sure to limit access to the .svn directories, either with .htaccess files or from the main Apache config.
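Blocking the .svn directories takes only a couple of lines in the main Apache config (Apache 2.4 syntax shown; on 2.2 use Order allow,deny plus Deny from all instead):

<DirectoryMatch "/\.svn">
    Require all denied
</DirectoryMatch>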
We have a custom app that manages the projects, but that is only because we're lazy and don't want to setup each project by hand, the app creates the necessary directories and working copies. You could write a quick script to do this.
We never, ever, edit files via FTP on the live site. All in all we have been using this setup for almost 2 years and aside from the occasional conflict on the staging server, we never have had any problems.
You can actually install the SVN server on your local machine, which I would recommend in lieu of installing it on the web server (assuming you make backups). The easiest thing to do, since it’s only you using it, would be to use the file:// protocol, but using svnserve is a little more robust, and the preferred method if you want to take the time to do it.
#Michael, I disagree - I would say it's better to install it on the Linux VPS, especially if you are already paying for the hosting service. I find it very helpful to be able to browse and download stuff from my SVN repo wherever I am, from whatever computer I'm on.
#nicky, I started with svn (and version control) several years ago and I took baby steps which made it easier to tackle.
If I had to do it over again, I'd read the svn book to start with. The book is very well laid out and didn't take more than 1-2 days to plow through.
While you're reading, install SVN on your Linux VPS with an Apache front end.
Once you have that up, pick one of your websites and import it into svn. This is how I structure my svn repo. For example, say my repo is hosted at http://mysvn.mydomain.com/svn/:
mywebsite1
- trunk
- tags
- branches
mywebsite2
- trunk
- tags
- branches
Don't worry about creating the perfect structure. It's pretty easy to re-organize especially when you're starting out. After you import a few projects into svn, you'll start to get a feel for which projects should have their own "trunk/tags/branches" dir structure and which can be combined.
For creating test environments, I do exactly what you describe. I use build scripts to check out from svn and download files into dirs that are mapped to subdomains like "test.clientsite.com" (I work primarily in Java and use Ant and Maven, but I think you can use whatever scripting language you're familiar with).
Once you get used to version control, you'll never go back, good luck!
