Include MediaWiki in a PHP application

Is there a way to include a single file from the MediaWiki installation in my PHP application and have everything work?
I first tried using an iframe, but clicking some of the links inside the iframe doesn't work.

It seems that the only possible way to use the MediaWiki engine from your own application is to install a separate instance of it on your server (source + database). Then you'll be able to use it through its API.
Pros:
Easy to update;
Can be used with any application built on any platform;
Cons:
It's not too easy to install and maintain MediaWiki;
You have to go through its external interface even from your own script. For example, you have to make cURL requests from your PHP application to your wiki even if both are hosted on the same machine (see the sketch below).
Large overhead, because you can't use just the wiki parser on its own.
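A minimal sketch of such an API call, assuming a separate MediaWiki instance at a placeholder URL and its standard api.php endpoint (check the action=parse documentation for the exact response shape):

$endpoint = 'https://wiki.example.com/w/api.php'; // placeholder URL of the separate wiki instance
$params = http_build_query([
    'action'       => 'parse',
    'text'         => "''Hello'' from the main application",
    'contentmodel' => 'wikitext',
    'format'       => 'json',
]);

// plain cURL request from the PHP application to the wiki's HTTP API
$ch = curl_init($endpoint . '?' . $params);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);

$data = json_decode($response, true);
// with the default response format, the rendered HTML is under parse.text['*']
echo $data['parse']['text']['*'] ?? '';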

Related

Use Vuejs with PHP MySQL in local environment (XAMPP for example)

So, I've been taking a course on Vue.js.
Before, I've made small projects where I used jQuery to make AJAX calls to local PHP files, which then connect to a MySQL database managed through phpMyAdmin. Just a simple CRUD application.
Using Vue, I'm struggling to figure out how to connect with PHP and MySQL in a similar way. The course has covered the Vue CLI and using webpack to set up a project, then importing components and so on. I understand that "run dev" starts a Node server and serves the app. XAMPP, which I've used before, just loads a blank page.
Is there a way I can harness the full extent of Vue but used with a locally hosted PHP / MySQL backend? If not, what technology do I need to use / learn for a full stack app?
I suggest you step a bit out of your comfort zone for this one.
Doing it the basic way via XAMPP is fine, but why not use something like Laravel together with Vue? The two frameworks even collaborate officially, and there are lots of examples of combined usage.
Laravel also sets you up with a local development environment without much fuss and can use PHP's own built-in development server, which makes XAMPP unnecessary. For starters, check out this tutorial.
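If you would rather stay close to your current setup, a rough sketch of the backend side is just a plain PHP script that returns JSON and is served by PHP's built-in development server (the file name, credentials and table below are placeholder assumptions):

// api.php: minimal JSON endpoint for the Vue app to call with fetch or axios
// run it with:  php -S localhost:8000
header('Content-Type: application/json');
header('Access-Control-Allow-Origin: *'); // the webpack dev server runs on a different port

$pdo = new PDO('mysql:host=127.0.0.1;dbname=myapp', 'root', '', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

$rows = $pdo->query('SELECT id, title FROM items')->fetchAll(PDO::FETCH_ASSOC);
echo json_encode($rows);

From a Vue component you would then call that endpoint in a lifecycle hook (or set up webpack's devServer proxy so the request stays relative), much like the jQuery AJAX calls you are used to.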
Good luck and have fun!

React.js and PHP together?

I'm looking to rewrite an old website using React.js for my frontend framework and PHP for the little backend stuff I may need.
The server I'm using is IIS with Apache, but I have little to no SSH access or ability to install software on the machine (i.e. npm, Node, etc.), so I'm restricted to using PHP as a backend (meaning I can't build/compile my React JSX server-side). However, couldn't I simply upload my built/compiled React files to the server?
I have seen this, but I'm not entirely sure that I need it?
Basically, my question is, is using these two together practical? Would I need any other tools to accomplish this? (The website is likely to be pretty small.)
I've used PHP and React together briefly (and then moved on to isomorphic React). Running V8 with PHP is only useful if you need server-side rendering (for fast initial loading and easy SEO), but I've not tried it so I'm not sure whether it's stable/reliable enough. If you do not care about server-side rendering, then you can just build the React app and include it in your PHP view.
Basically, the PHP view would serve as a layout, with a React container element defined so that your React app can bootstrap with it.
You can also pre-fetch the initial data for the React app with PHP and somehow attach it to your PHP view. The simplest way would be to use a script block to assign the JSON-serialized data to a global variable. Another way would be to define element(s) and attach your JSON-serialized data as element attributes, to avoid globals. Either way, you'd have your Flux stores bootstrap with those initial data to avoid having to hit APIs before the app can load.
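As a rough sketch of that idea (the file name, container id, global variable name and $initialData are illustrative assumptions rather than a fixed convention):

<?php /* view.php: PHP layout that bootstraps a pre-built React bundle */ ?>
<!DOCTYPE html>
<html>
<head>
    <title>My app</title>
</head>
<body>
    <!-- the React app mounts into this container -->
    <div id="app"></div>

    <!-- pre-fetched initial data, exposed as a global for the Flux stores -->
    <script>
        window.__INITIAL_DATA__ = <?= json_encode($initialData, JSON_HEX_TAG) ?>;
    </script>

    <!-- the JSX is compiled on your own machine; only the built bundle is uploaded -->
    <script src="/assets/app.bundle.js"></script>
</body>
</html>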

Using App Engine Python application as proxy to PHP application

During the past two years I have built an App Engine application in Python. Soon it will be possible to use PHP on App Engine. I would like to use off-the-shelf PHP applications such as Wordpress, Mediawiki and phpBB together with my Python application. To the user it should be transparent which of the two applications (Python or PHP) she is using for a particular page. I consider the Python application to be the main application where I will do most of the programming. This is because I have more experience with Python and also because I already have written a lot of reusable code for App Engine.
Currently my approach is to build a proxy in Python that maps HTTP requests like this:
http://www.yellow.com/blog/* to http://phpapp.appspot.com/wordpress/client1/*
http://www.yellow.com/community/* to http://phpapp.appspot.com/phpbb/client1/*
yellow.com is a domain mapped to my Python application.
http://www.blue.com/wiki/* to http://phpapp.appspot.com/mediawiki/client2/*
http://www.blue.com/* to http://phpapp.appspot.com/wordpress/client2/*
blue.com is a domain mapped to my Python application.
Besides the blog, community or wiki, there are a lot of URLs that don't require PHP. These URLs are handled by the Python application, for example http://www.yellow.com/admin/*.
I'm still struggling with the proxy to get the passing of cookies between the two applications right, but I think it's possible to do this.
It would be awesome if I could get it to work this way. However, it seems to me this is not the most elegant way to handle this. I know I could use subdomains to serve the PHP applications, but I would rather just use URL patterns. Also, with the proxy approach, I can tweak the returned HTML by the PHP application before serving it to the user. Another advantage of this approach is the ability to cache the pages from the PHP applications in memcache.
I would like to hear what you think of my approach to use Google App Engine (custom) Python and (off-the-shelf) PHP applications together. Will I run into problems with the proxy (Javascript, cookies, ...)? Would it be better to build everything in Wordpress, for example, with custom plugins written in PHP (the plugins could fetch data from the Python application)? Other suggestions?
Your use case is a good example of what App Engine's Modules are intended for. Also take a look at the dispatch mechanism.
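For reference, a dispatch configuration along those lines could look roughly like this; the PHP module name is a placeholder, and newer App Engine releases spell the key service instead of module:

# dispatch.yaml (sketch): send these path prefixes to the PHP module,
# everything else falls through to the default Python module
dispatch:
  - url: "www.yellow.com/blog/*"
    module: phpapp
  - url: "www.yellow.com/community/*"
    module: phpapp
  - url: "www.blue.com/wiki/*"
    module: phpapp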

How to make a Joomla "build"?

In Drupal, it's possible to create a "build", also known as an "install profile" or "distribution", that basically combines several modules and your settings for them. So the next time you set up the exact same site, you don't have to re-configure all the modules.
Does Joomla have a similar concept, and what is it called? Please reference documentation as well if possible.
The concept is very simple - you just need to get a clean installation, install all the extensions you want and configure them the way you need.
Then it is enough to copy the files and the database to a new location and change the settings in the configuration file (configuration.php). That is all.
It is a very simple process and can easily be automated with a simple PHP script. I once wrote an ASP.NET app that deployed new Joomla installations within seconds.
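A minimal sketch of such a script, assuming the pre-configured site lives in a template directory and that only the database settings in configuration.php need to change (paths, credentials and the rewritten properties are placeholders):

// copy the pre-configured Joomla site to its new location
function copyDirectory($src, $dst) {
    mkdir($dst, 0755, true);
    $items = new RecursiveIteratorIterator(
        new RecursiveDirectoryIterator($src, FilesystemIterator::SKIP_DOTS),
        RecursiveIteratorIterator::SELF_FIRST
    );
    foreach ($items as $item) {
        $target = $dst . DIRECTORY_SEPARATOR . $items->getSubPathname();
        $item->isDir() ? mkdir($target, 0755, true) : copy($item->getPathname(), $target);
    }
}

copyDirectory('/var/www/joomla-template', '/var/www/new-site');

// point configuration.php at the new database (the settings are public properties of the JConfig class)
$configFile = '/var/www/new-site/configuration.php';
$config = file_get_contents($configFile);
$config = preg_replace("/public \\\$db = '[^']*';/", "public \$db = 'new_site_db';", $config);
$config = preg_replace("/public \\\$user = '[^']*';/", "public \$user = 'new_site_user';", $config);
file_put_contents($configFile, $config);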
You could try something like http://www.akeebabackup.com/
This allows you to take a snapshot of a site and export it anywhere.

What is the Best Practices to share PHP scripts among different PHP applications?

I have a folder of PHP scripts; they are mostly utility scripts. How can I share those scripts among different PHP applications so that reuse and deployment are easy?
I would have to package my app into an installer and let the user install it.
I could put the lib somewhere and hardcode the include path, but that means I have to change the PHP code every time I deploy the web application to a new customer. This is not desirable.
Another route I've considered is to copy the lib into the other apps, but since the lib is constantly being updated, I would need to constantly re-copy it, which would introduce a lot of problems. I want an automated way to do this.
Edit: Some of the applications are Symfony, some are not.
You could create a PEAR package.
See Easy PEAR Package Creation for more information on how to do this.
This assumes that when you say anyone, you mean outside your immediate organisation.
Updated: You do not need to upload to a website to install the PEAR package. Just extract your archive into the PEAR folder to use it in a PHP application.
Added: Why not create a new SVN repository for your library? Let's say you create a library called FOO. Inside the repository you could use the folder hierarchy trunk\lib\foo. Your modules could then go into trunk\lib\foo\modules, and you'd have a file called trunk\lib\foo\libfoo.php. Now libfoo.php can include_once or require_once all the modules as required.
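For example, libfoo.php could be nothing more than a loader for those modules (the module file names here are made up):

// trunk/lib/foo/libfoo.php: single entry point that pulls in the library's modules
require_once __DIR__ . '/modules/database.php';
require_once __DIR__ . '/modules/strings.php';
require_once __DIR__ . '/modules/http.php';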
PHP now supports Phar archives. There's full documentation on php.net.
There's a complete tutorial on IBM website as well.
One neat thing you can do with Phar archives is package an entire application and distribute it that way.
http://php.net/phar
http://www.ibm.com/developerworks/opensource/library/os-php-5.3new4/index.html
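As a rough illustration, building such an archive with the Phar class could look like this; the file names are placeholders, and phar.readonly must be disabled for the build script:

// build.php: run with  php -d phar.readonly=0 build.php
$phar = new Phar('libfoo.phar');
$phar->buildFromDirectory(__DIR__ . '/lib/foo');          // add every file under lib/foo
$phar->setStub(Phar::createDefaultStub('libfoo.php'));    // entry point used when the archive is included directly

// consumers can then load it like a regular file:
// require 'phar://libfoo.phar/libfoo.php';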
Ahh, libraries...
There are two conflicting purposes here:
Sanity when updating scripts (i.e. not breaking 10 other apps).
Keeping things in one organized logical place for developer efficiency.
I suggest you take a close look at Git and git submodules.
We use git submodules extensively for this very purpose. It allows the best of both worlds because shared scripts can be upgraded at will in any project, and then that change can be moved to the other projects (deliberately) when you have time to do so and test correctly.
Of course, you need to be using git to take advantage of submodules, but if you are not using git, and you start, you'll eventually wonder how you ever lived without it.
Edit: Since the original poster is using svn, consider using SVN Externals.
UPDATED:
You just have to put the lib somewhere reachable by your apps (a place you can reach via HTTP, HTTPS, FTP or something else) and include it.
If you have to update it often, you can package your library in a single Phar file and provide your clients with a function that pulls the library from a remote path and updates a parameter in their local configuration accordingly, like:
function updateLocalLibrary($remoteLibraryRepository, $localLibraryPath, $libraryPharFile) {
    // read the remote library into a variable
    $file = file_get_contents($remoteLibraryRepository . $libraryPharFile);
    // give it a unique name
    $newLibraryName = $libraryPharFile . "_" . date('YmdHis');
    // store the library in a local file
    file_put_contents($localLibraryPath . $newLibraryName, $file);
    // update the configuration, letting your app point to the new library
    updateLatestLibraryPathInConfig($newLibraryName);
    // possibly delete the old lib
}
In your include you then don't necessarily have to hardcode a path; you can include a file based on your config, like:
include(getLatestLibraryPathFromConfig());
(You are responsible for securing the retrieval so that only your clients can see the library.)
Your config can live in a database, so that when you call updateLatestLibraryPathInConfig() you can perform it as an atomic operation and be sure no client reads dirty data.
The clients can then update their library as needed. They may even schedule regular updates.
There are a lot of options:
tar + ftp/scp
PEAR (see above #Wayne)
SVN
rsync
NFS
I recommend using continuous integration software (Atlassian Bamboo, CruiseControl): check out your repository, build a package, and then deploy it with rsync. Automatically.
You should also look into using namespaces in order to avoid conflicts with other libraries you might use. PEAR is probably a good choice for the delivery method; however, you can also just place the library in the standard path /usr/share/php/, or any other place that is set as the include path in your PHP settings file.
Good question, and probably one that doesn't have a definite answer. You can basically pick between two different strategies for distributing your code: Either you put commonly used code in one place and let individual applications load from the same shared place, or you use a source-control-system to synchronise between local copies. They aren't mutually exclusive, so you'll often see both patterns in use at the same time.
Using the file system to share code
You can layer the include_path to create varying scopes of inclusion. The most obvious application of this pattern is a globally maintained PEAR repository plus a local application. If your IT system consists of multiple applications that share a common set of libraries, you can add a layer in between (a framework layer). If you structure the include_path so that the local paths come before the global paths, you can use it to override files locally. This is a rather crude way to extend code, since it works per file, but it can be useful in some cases.
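A sketch of that layering with made-up paths (the local application layer first, then a shared framework layer, then the global PEAR tree):

// bootstrap file of the local application
set_include_path(implode(PATH_SEPARATOR, array(
    '/var/www/myapp/lib',        // application layer: checked first, so it can override files
    '/usr/share/framework/lib',  // shared framework layer
    '/usr/share/php',            // global PEAR repository
    get_include_path(),
)));

// resolved against the layered include_path, so a local Foo/Bar.php wins over the shared one
require_once 'Foo/Bar.php';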
Use source-control
Another strategy is to make many local checkouts of a single shared repository. One benefit over the layered-include pattern is that you can make more fine-grained local changes. It can be a bit of a challenge to manage the separation between application layers (infrastructure, framework, application). svn:externals can work, but it has some limitations. It's also slightly more complicated to propagate global changes to all applications; an automated deployment process can help with that.
