Integrate PHP plugin from Joomla in an Android Application - php

I have a website running with Joomla!, and I'm using PhocaGallery, which is a component used for managing a photo gallery.
In my articles, I simply put a tag, for example:
{phocagallery view=category|categoryid=2|limitcount=2}
This tag simply displays 2 images from the category with ID 2.
I'm developing my own application for this website, and when I load an article in it, only the raw tag is displayed, without images, which is expected.
I have the long PHP code for this plugin.
I would like to know how to:
1. Detect the tag when the article is loaded in the application
2. Call the PHP script from the application to fetch the right images
3. Display the images
The problem, I think, is that the PHP code may reference folders on the website, which the application can't access...
Do you think it's possible?

Detect the tag: use a regular expression to parse out the parameters you need (a short sketch follows below).
Invoking the plugin is pretty straightforward, just make sure you're making all the necessary resources available, i.e. most likely you will want to load the whole Joomla framework. Since you're just querying a single table, you might be better off doing it on your own with MySQL; it will save you a lot of time both in development and at runtime.
Then just output the resulting images (or their URLs) for the application to display.
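For example, a minimal sketch of a server-side endpoint the app could call; the table prefix, table name and column names (jos_phocagallery, catid, filename, published) and the image path are assumptions based on Phoca Gallery's usual setup, so verify them against your install:

<?php
// Hypothetical endpoint (e.g. phoca-images.php) the Android app could request
// instead of rendering the tag itself. Table/column names are assumptions.

// 1. Detect the tag and pull out its parameters with a regular expression.
$article = '... {phocagallery view=category|categoryid=2|limitcount=2} ...';

if (preg_match('/\{phocagallery\s+([^}]+)\}/i', $article, $m)) {
    // Turn "view=category|categoryid=2|limitcount=2" into an associative array.
    parse_str(str_replace('|', '&', $m[1]), $params);

    $categoryId = (int) ($params['categoryid'] ?? 0);
    $limit      = (int) ($params['limitcount'] ?? 10);

    // 2. Query the gallery table directly instead of loading the whole framework.
    $db   = new mysqli('localhost', 'user', 'pass', 'joomla_db');
    $stmt = $db->prepare(
        'SELECT filename FROM jos_phocagallery WHERE catid = ? AND published = 1 LIMIT ?'
    );
    $stmt->bind_param('ii', $categoryId, $limit);
    $stmt->execute();
    $result = $stmt->get_result();

    // 3. Return absolute image URLs as JSON so the app can download and display them.
    $urls = [];
    while ($row = $result->fetch_assoc()) {
        $urls[] = 'https://www.example.com/images/phocagallery/' . $row['filename'];
    }
    header('Content-Type: application/json');
    echo json_encode($urls);
}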

Related

Scrape web site generated by Javascript

I think this is a real challenging one!
I write the website for my local football league, www.rdyfl.co.uk, and include JavaScript code snippets from the F.A.'s Full-Time system (where we generate our fixtures) to link in tables, fixtures, recent results, etc.
For another feature I want to add to the site I need to scrape the 'Upcoming Fixtures' for each age group and division, but when I examine the source I have two problems.
The fixtures content is generated by JavaScript, so I need to see the generated source, not just the static source.
When I view the generated source using Firefox, the team names are actually further JavaScript links and not the names themselves.
I basically want to download the fixtures on a regular basis and write them to a MySQL database.
I have asked the F.A. and they have no other options available for accessing the data.
Having never coded a scraper before, can anyone point me to a simple solution, or does anyone fancy the challenge?
This question was asked a long time ago, but I noticed it was active today 🤷.
You should be able to scrape the website using a headless browser such as Puppeteer. Using Puppeteer you are able to access a URL and execute JavaScript or interact with the website as you would with an ordinary browser. Parsing the output DOM and storing it should then be relatively straightforward.
There are plenty of articles on this topic using Puppeteer.
The latest version of OutWit Hub does a pretty good job on dynamic content. The source OutWit scrapes to extract links, images, documents, tables and text is the updated DOM, so you can certainly build a job to grab what you need with these.
Custom scrapers are still applied to the static source in version 1.0.3, but version 1.1.x (still in beta) will offer the choice between the static source and the dynamically modified DOM.
Scraping content produced by JavaScript is challenging. AFAIK you will need to do this with AJAX. Hopefully the content has some CSS hooks you can grab with jQuery, or at least some IDs. Do you have IDs or classes that you can grab?

First Larger-Scale Web Dev Project - Advice for Content Management?

I'm familiar with HTML, CSS, and some PHP and Javascript. I've made several fairly complicated websites for which I've acted as webmaster, manually adding all content in HTML.
I'm about to put in a proposal for my first outside client at a larger business. They have an IT person that would be responsible for updating the website that I create for them.
My question is what to do about content management. I've looked into things like Drupal, but they seem overly complex for this kind of situation, with a single person adding updates of things like text, images, and PDFs.
What would you recommend as the next step above the simple way of manually uploading files and editing HTML like I'm used to? Something like a MySQL database and PHP calls? Would I then store all the images in the database as well?
I guess I'm just trying to figure out what's most common at a medium-sized business. I appreciate any guidance you can offer!
Nathaniel
My company has built both large-scale and medium-scale projects. What we like to do is set up an outer page with navigation and an inner page that the client controls through a control panel with FCKeditor or TinyMCE.
So essentially we have a wrapper page (in our case a MasterPage, but in PHP you would use an include or an index.php with a query string to pull the content, as sketched after this answer) and then we drop in HTML content from the database.
That database is populated by the client in their control panel. FCKeditor allows them to upload images, manage links, etc.
For our bigger clients we get very specific in our control panel allowing them to add videos, PDF attachments, blog entries, FAQ content, etc.
Some examples we have are http://pspwllc.com and http://needsontime.com and http://nwacasa.org
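In PHP terms, the wrapper-page idea might look roughly like this; a minimal sketch assuming a pages table with slug, title and body_html columns (all hypothetical names):

<?php
// index.php -- hypothetical wrapper page. Header, navigation and footer stay
// fixed; the middle of the page is HTML the client edited through FCKeditor
// or TinyMCE and saved to a "pages" table (table/column names are assumptions).

$pdo = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

// e.g. index.php?page=about -- default to the home page.
$slug = $_GET['page'] ?? 'home';

$stmt = $pdo->prepare('SELECT title, body_html FROM pages WHERE slug = ? LIMIT 1');
$stmt->execute([$slug]);
$page = $stmt->fetch(PDO::FETCH_ASSOC)
    ?: ['title' => 'Not found', 'body_html' => '<p>Page not found.</p>'];

include 'header.php';   // fixed chrome: logo, navigation, etc.
echo '<h1>' . htmlspecialchars($page['title']) . '</h1>';
echo $page['body_html']; // trusted HTML produced by the WYSIWYG editor
include 'footer.php';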
Drupal can be a bit complex at first, but if you stick with the basic modules it is great for website content management. You can also write your own mini content management system: store text and images (as MySQL BLOBs) in MySQL. It would be a couple of PHP admin pages plus a good render() function responsible for page rendering.
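If you do go the roll-your-own route and store images as BLOBs, you will also need a small script that streams them back with the right Content-Type. A minimal sketch, assuming a hypothetical images table with id, mime_type and data columns:

<?php
// image.php?id=42 -- hypothetical script that streams an image stored as a
// MySQL BLOB. Table and column names (images, mime_type, data) are assumptions.

$pdo = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

$id = (int) ($_GET['id'] ?? 0);

$stmt = $pdo->prepare('SELECT mime_type, data FROM images WHERE id = ?');
$stmt->execute([$id]);
$image = $stmt->fetch(PDO::FETCH_ASSOC);

if (!$image) {
    http_response_code(404);
    exit('Image not found');
}

header('Content-Type: ' . $image['mime_type']);
header('Content-Length: ' . strlen($image['data']));
echo $image['data'];

(Many setups keep only a file path in the database and leave the image itself on disk, which avoids the BLOB round-trip.)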
Also have a look at WordPress; it is much easier than Drupal. It is less powerful, but it may serve your needs. You will not need to configure editor modules like FCKeditor with it, because they come built in, so anybody will be able to edit the content easily. Do note that WordPress is not just for blogs; you can create different kinds of websites with it. Another choice is Joomla, which is also simpler than Drupal. But WordPress is the simplest.

How do I properly use code in joomla articles?

I am very new to web development and CMSs. I want to make a Joomla site that features articles with a lot of graphs at the top of the page and written content below them. The charts will probably be done with FusionCharts, with some controls directly below them to dynamically influence the data displayed in the charts, preferably without reloading the page.
My question is: what is the most appropriate way to do this in Joomla? Can I get the Sourcerer add-in and simply create articles using inline JavaScript calls to place the charts and controls directly in the article? Is this how people usually embed non-text content in Joomla? Is it possible to access the database with code embedded directly in the article to generate the chart?
I don't really want to learn too much of the Joomla API right now; I'm more interested in using the CMS features to create the pages and then just coding everything else in JavaScript/PHP directly in the page, but I'm not sure whether that is appropriate or whether it would introduce security concerns to my site.
Why not try the FusionCharts extension for Joomla?
This will be much easier than coding it yourself; the work has already been done.
I believe the best thing to do is just use a good WYSIWYG editor and then use its source code feature.
TinyMCE does the job just fine.
Are you looking for plugins or components to add to do this, or do you just want to log into the administrator and start doing it right away?
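On the database part of the question: running PHP inside an article generally requires an extension such as Sourcerer (the add-in mentioned in the question). A minimal sketch of what an embedded snippet might look like, assuming Sourcerer's {source} tags and a hypothetical #__chart_data table:

{source}
<?php
// Runs inside the article via the Sourcerer extension (assumed {source} syntax).
// The #__chart_data table and its columns are hypothetical.
$db    = JFactory::getDbo();
$query = $db->getQuery(true)
    ->select($db->quoteName(array('label', 'value')))
    ->from($db->quoteName('#__chart_data'))
    ->order($db->quoteName('label') . ' ASC');
$db->setQuery($query);
$rows = $db->loadObjectList();

// Expose the data to the chart code further down in the article.
echo '<script>var chartData = ' . json_encode($rows) . ';</script>';
?>
{/source}

Embedding code in articles does carry the security concern the question raises: anyone who can edit that article can run PHP, so restrict editing to trusted users.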

How to change and add new web pages and content in an existing PHP web application and get rid of design and view problems?

Currently, I work on an existing PHP web application with a Sybase database connection.
The web site is built using HTML, CSS, JavaScript, Photoshop, Flash, PHP, IIS, and Sybase.
I would like to add some web pages to this web application, so I take a copy of an existing page, AboutUs.php for example, and pick a certain area inside it where I change the content, either static (text and HTML tags) or dynamic (connecting to the database), and its appearance using CSS.
The web application uses a template with a *.dwt.php extension that applies to all site pages; this master page contains fixed regions such as the header, footer, right side, left side and center, and these regions include images and Flash objects with predefined sizes and types.
How can I disable or modify the template to get rid of the design and layout problems?
Alternatively, can I replace this template with a new customized template that I develop myself?
The problem I face now is that when I add my content to a page, the layout becomes strange and some elements overlap, especially in the header, the footer and the sides of the page.
This makes the page look bad: when data is fetched from the database or anything changes, the design shifts; in particular, the images and their borders change, overlap and intersect.
I do not know Photoshop very well, so editing the images or revising all the image sizes and properties across the application to find the cause would be difficult for me.
Is there a fast and proper way to solve this problem, rather than rebuilding the web application or testing and verifying every object and element, which would take a long time?
.dwt is a Dreamweaver template format, so it sounds as if the site has been built in Dreamweaver (a WYSIWYG programme made by Adobe). If you don't know much about web development, or aren't comfortable editing the template then the easiest option may be to use Dreamweaver to edit the site, since this is the format it is currently in.
Well, I think you need to invest some time and/or money in web application development. It is hard to explain how to do it if you have no experience in web application development.
So no, there is no fast way to solve your problem.

PHP: I want to create a page that extracts images from a forum thread, doable? codeigniter?

Say you have a forum (vBulletin) with a bunch of images: how easy would it be to have a page that visits a thread, steps through each page and forwards the images to the user (via AJAX or whatever)? I'm not asking about filtering (that's easy, of course).
Doable in a day? :)
I have a site that uses CodeIgniter as well - would it be even simpler using that?
Assuming this is to be carried out on the server, cURL + regular expressions are your friends... and yes, doable in a day.
There are also some open-source HTML parsers that might make this cleaner.
It depends on where your scraping script runs.
If it runs on the same server as the forum software, you might want to access the database directly and check for image links there. I'm not familiar with vBulletin, but it probably offers a plugin API that allows for high-level database access. That would simplify querying all posts in a thread.
If, however, your script runs on a different machine (or, in other words, is unrelated to the forum software), it would have to act as an HTTP client. It could fetch all pages of a thread (either automatically, by searching for a NEXT link in a page, or manually, by having all pages specified as parameters) and search the HTML source code for image tags (<img .../>).
Then a regular expression could be used to extract the image urls. Finally, the script could use these image urls to construct another page displaying all these images, or it could download them and create a package.
In the second case the script actually acts as a "spider", so it should respect things like robots.txt or meta tags.
When doing this, make sure to rate-limit your fetching. You don't want to overload the forum server by requesting many pages per second. Simplest way to do this is probably just to sleep for X seconds between each fetch.
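A rough sketch of that client approach: fetch each page of the thread with cURL, pull the image URLs out with a regular expression, and sleep between requests. The thread URL pattern and page count are placeholders:

<?php
// Hypothetical client-side scraper: fetches each page of a thread and collects
// image URLs. The URL pattern and page count are placeholders for your forum.
$threadUrl = 'https://forum.example.com/showthread.php?t=12345&page=%d';
$pageCount = 5;
$imageUrls = [];

$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_USERAGENT, 'ImageCollector/1.0');

for ($page = 1; $page <= $pageCount; $page++) {
    curl_setopt($ch, CURLOPT_URL, sprintf($threadUrl, $page));
    $html = curl_exec($ch);
    if ($html === false) {
        continue; // skip pages that fail to load
    }

    // Grab the src attribute of every <img> tag on the page.
    if (preg_match_all('/<img[^>]+src=["\']([^"\']+)["\']/i', $html, $matches)) {
        $imageUrls = array_merge($imageUrls, $matches[1]);
    }

    sleep(2); // rate-limit so we don't hammer the forum server
}
curl_close($ch);

$imageUrls = array_unique($imageUrls);
print_r($imageUrls);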
Yes, doable in a day.
Since you already have a working CI setup, I would use it.
I would use the following approach:
1) Make a model in CI capable of:
logging in to vBulletin (images are often added as attachments, and you need to be logged in before you can download them). Use something like Snoopy.
collecting the URL of the 'last page' button using preg_match(), parsing it with parse_url() and parse_str(), and generating links from page 1 to the last page (sketched after this list)
collecting the HTML from all generated links, still using Snoopy.
finding all images in the HTML using preg_match_all()
downloading all images, still using Snoopy.
moving each downloaded image from a tmp directory into another directory, renaming it imagename_01, imagename_02, etc. if the same image name already exists.
saving the image name and exact byte size in a DB table, so you can avoid downloading the same image more than once.
2) Make a method in a controller that collects all images
3) Set up a cronjob that collects images at regular intervals; wget -o /tmp/useless.html http://localhost/imageminer/collect should do nicely
4) Write the code that outputs pretty HTML for the end user, using the DB table to get the images.
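The pagination step is the fiddliest part of step 1, so here is a hedged sketch of just that piece; it assumes vBulletin's usual showthread.php?t=...&page=N URL format, which may differ on your board:

<?php
// Sketch of the pagination step only: given the href of the "last page" button
// (scraped with preg_match()), generate the URLs of every page in the thread.
// Assumes vBulletin-style showthread.php?t=...&page=N URLs.

function generatePageUrls(string $lastPageHref): array
{
    $parts = parse_url($lastPageHref);         // scheme, host, path, query
    parse_str($parts['query'] ?? '', $query);  // e.g. ['t' => '12345', 'page' => '17']

    $lastPage = (int) ($query['page'] ?? 1);
    $urls = [];

    for ($page = 1; $page <= $lastPage; $page++) {
        $query['page'] = $page;
        $urls[] = $parts['scheme'] . '://' . $parts['host'] . $parts['path']
                . '?' . http_build_query($query);
    }
    return $urls;
}

// Example usage with a hypothetical "last page" href:
print_r(generatePageUrls('https://forum.example.com/showthread.php?t=12345&page=17'));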
