PHP project, going from local to online

I'm currently working on a project which involves:
1. Downloading a zip file from an API (Bing Ads, to be specific).
2. Unzipping the file (to get a CSV), editing it and making database queries.
3. Re-zipping the file and uploading it back to the API using a service similar to the one used to download the zip file in the first place.
I'm restricted by the API's client libraries to writing the project in PHP. I've already written the project to run locally, so it currently stores the files on my hard disk.
However, I want the whole process running online. I've tried Google App Engine, but the zip archive class doesn't seem to work there (although it does work locally).
I don't have much experience with putting apps online, and I was wondering if anyone can point me in the right direction.
Thanks!
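A minimal sketch of that round trip with PHP's built-in ZipArchive, kept entirely inside the system temp directory so it doesn't depend on a fixed local path; downloadReport() and uploadReport() are hypothetical stand-ins for the Bing Ads client calls:

<?php
// Work under the system temp dir instead of a hard-coded local path.
$workDir = sys_get_temp_dir() . '/bingads_' . uniqid();
mkdir($workDir);

$zipPath = $workDir . '/report.zip';
file_put_contents($zipPath, downloadReport());      // zip fetched from the API

$zip = new ZipArchive();
if ($zip->open($zipPath) !== true) {
    throw new RuntimeException("Cannot open $zipPath");
}
$zip->extractTo($workDir);                          // unzip to get the CSV
$csvName = $zip->getNameIndex(0);                   // assumes a single CSV entry
$zip->close();

// ... edit $workDir/$csvName and run the database queries here ...

$zip = new ZipArchive();
$zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE);
$zip->addFile($workDir . '/' . $csvName, $csvName); // re-zip the edited CSV
$zip->close();

uploadReport(file_get_contents($zipPath));          // send it back to the API

On classic App Engine this still fails, because the sandbox forbids writing to the local filesystem (likely the reason the zip archive class doesn't work there); any host that allows temp-file writes, such as a plain VPS or ordinary shared PHP hosting, will run it as-is.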

Related

Deploy Twig Standalone project to Live Server

I decided to use Twig as a standalone template engine to make a static website, with the idea of growing it into a database-driven MVC project later. At the moment I just have a static website.
I've been working offline with a local PHP server, and I haven't had trouble running and updating the project in my browser locally. The thing is, when I upload the project to the server, nothing works; I get a 500 error.
I want to know how I can deploy my website to the live server, and what the best approach is to make it work without uploading the whole pile of vendor files, keeping just the functional ones. I used to work with Smarty, which was very easy to deploy, and I expect Twig to be just as easy.
This is my file tree:
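For reference, a minimal standalone Twig bootstrap looks something like the sketch below, assuming dependencies are installed with Composer (composer require twig/twig). A 500 error at this stage usually means vendor/autoload.php wasn't uploaded, a path differs from the local setup, or the cache directory isn't writable:

<?php
require __DIR__ . '/vendor/autoload.php';

$loader = new \Twig\Loader\FilesystemLoader(__DIR__ . '/templates');
$twig = new \Twig\Environment($loader, array(
    'cache' => __DIR__ . '/cache',  // must exist and be writable on the server
));

echo $twig->render('index.html', array('title' => 'Home'));

As for trimming the vendor files: uploading the vendor/ directory produced by composer install --no-dev keeps the upload small while guaranteeing the autoloader still resolves everything, which is safer than hand-picking files.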

Intercept calls to folder on ubuntu and emulate folder behaviour

I have a big CRM system in PHP which works actively with user files and stores them in a folder in the root of the project. Now I need to change the system to save files on Amazon S3, but because of the bad architecture of the system (it's an old open-source system) I cannot just rewrite the code. So I got a slightly crazy idea: intercept all of the system's calls to one folder ("/var/www/%my_project%/uploads") and process them in a special way. PHP should believe it's working with a normal folder, and file_put_contents and file_get_contents should work as usual, but in fact they should be served files from S3. Is that possible (and how?), or is the idea too crazy?
Thanks :)
The Amazon S3 stream wrapper helped me. You just need to create a client object and call its registerStreamWrapper method. That's all; now you can work with the folder "s3://yourbucketname" as with a normal folder.
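A minimal sketch with the AWS SDK for PHP; the bucket name and region are placeholders, and credentials are assumed to come from the environment:

<?php
require __DIR__ . '/vendor/autoload.php';

use Aws\S3\S3Client;

$client = new S3Client(array(
    'region'  => 'us-east-1',
    'version' => 'latest',
));
$client->registerStreamWrapper();   // makes s3:// paths usable

// Legacy file functions now transparently hit S3.
file_put_contents('s3://yourbucketname/uploads/test.txt', 'hello');
echo file_get_contents('s3://yourbucketname/uploads/test.txt');

One caveat: the wrapper only intercepts s3:// paths, so code with the local folder hard-coded everywhere still needs that path funnelled through a single configurable constant before the switch.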

How to use a mobile application build / zip file

I am a total beginner with mobile frameworks. I do not understand how to use the application I built using Intel XDK.
They offer builds for different mobile platforms. As a result I have dmg (for) and zip (WebApp) files.
I need to upload them to a remote server.
How do I make a functioning website from said files?
Actually, I also want to add a Symfony2 back-end.
Do I have to add the back-end for each build separately?
Note that *.dmg files are macOS disk images (desktop installers), not something you deploy to a web server.
For the website build, you just have to upload the zipped build, unzip it in a sub-directory on your server, and the website will work.
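If there's no shell access on the server, a throwaway PHP script uploaded next to the archive can do the unzipping; webapp.zip and the target folder name are placeholders, and the script should be deleted afterwards:

<?php
$zip = new ZipArchive();
if ($zip->open(__DIR__ . '/webapp.zip') === true) {
    $zip->extractTo(__DIR__ . '/webapp');  // becomes yoursite.com/webapp
    $zip->close();
    echo 'Extracted';
} else {
    echo 'Could not open webapp.zip';
}

A single Symfony2 back-end exposed over HTTP can serve every build (web and native alike) through the same API, so it does not have to be added to each build separately.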

Deploy updates for CodeIgniter web application

I have a web application written in CodeIgniter that is almost ready for release.
I'm looking into ways to do "automatic" updates for the clients.
Now there are going to be versions of the application, and the users will choose whether and when to update.
I'm curious how to update the files on the user's side.
Before using a framework, what I used to do was make a zip file of the new/edited files and store it on an FTP server; when a user updated, I just unzipped the file on their side and replaced the old ones.
For the database schema and other updates, I intend to keep a file with the required queries and run it during the update.
Should I go that way, or is there something else I can implement with CodeIgniter? A more straightforward road?
Also, I'm still figuring out the part where a user has to update from, say, version 1.0.0 to 1.0.3 (two or three versions apart), and what that requires on the files side as well as the database side.
Thank you
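On the version-skipping part: the usual approach is to chain sequential update steps, which is also how CodeIgniter's built-in Migration library handles the schema side. A rough sketch, where extractZip() and runSqlFile() are hypothetical helpers and the updates/ layout is made up:

<?php
$installed = '1.0.0';                          // read from the client's install
$releases  = array('1.0.1', '1.0.2', '1.0.3'); // ordered list of releases

foreach ($releases as $version) {
    if (version_compare($installed, $version, '>=')) {
        continue;                              // this step is already applied
    }
    extractZip("updates/$version/files.zip");  // replace changed files
    runSqlFile("updates/$version/schema.sql"); // apply schema changes
    $installed = $version;                     // persist progress after each step
}

Applying the steps in order means a 1.0.0 install updating to 1.0.3 simply runs the 1.0.1, 1.0.2 and 1.0.3 packages one after another, so each release only ever ships its own diff.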
If you are willing to put it under version control with Git, you can use PHPloy to push only the latest changes to the server through FTP. Check it out on GitHub: https://github.com/banago/PHPloy
Disclaimer: I've written PHPloy

Specific files for using Google Drive Service (on SDK)

Good day, I am currently a noob with the Google Drive SDK and I am planning to integrate it into my web app. I checked out google-api-php-client from their SVN and noticed it's a bit large to include in full in my project (5.6 MB). Excluding everything but the src folder, it's still large for me (3.9 MB). I want to include only the files needed for the Google Drive integration to work, but I do not know which files the two from Google Developers' sample (Google_Client.php and Google_DriveService.php) depend on.
Can you guys pinpoint which files (not related to Drive) I can safely delete? I really want the total file size to be as small as possible, because I believe it can affect the project's loading time.
Thanks (and pardon my English)! :)
Only check out the library source code and remove all services other than Drive and OAuth2:
svn checkout http://google-api-php-client.googlecode.com/svn/trunk/src
Remove all files under src/contrib but preserve the following:
src/contrib/Google_DriveService.php
src/contrib/Google_Oauth2Service.php
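After trimming, the two remaining files are wired up like this; a sketch assuming the old pre-1.0 library layout from the question, with the client ID, secret and redirect URI as placeholders:

<?php
require_once 'src/Google_Client.php';
require_once 'src/contrib/Google_DriveService.php';

$client = new Google_Client();
$client->setClientId('YOUR_CLIENT_ID');
$client->setClientSecret('YOUR_CLIENT_SECRET');
$client->setRedirectUri('https://example.com/oauth2callback');
$client->setScopes(array('https://www.googleapis.com/auth/drive'));

$service = new Google_DriveService($client);  // Drive API entry point

Everything under src/ outside contrib/ is shared plumbing (auth, IO, caching) that Google_Client itself requires, so the safe rule is: trim contrib/, keep the rest of src/.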
