I am trying to create multiple landing pages populated dynamically with data from a feed.
My initial thought was to create a generic PHP page as a template that can be used to create other pages dynamically and populate them with data from a feed. For instance, the generic page could be called landing.php; that page (and others created on the fly) would then be populated with feed data depending on an id, keyword or certain string in the URL. e.g. http://www.example.com/landing.php?page=cars or http://www.example.com/landing.php?page=bikes would show content that is only about cars or bikes, as the case may be.
My question is: how feasible is this approach, and is there a better way to create multiple dynamic pages populated with data from a feed depending on the URL query string or some sort of id?
Many thanks for your help in advance.
I use this quite extensively. For example, where I work, we often have education-oriented landing pages, but target each landing page at different types of visitors. A good example may be arts-oriented schools looking for a diverse array of potential students who may be interested in a variety of programs for any number of reasons.
Well, who likes 3D modelling? Creative types (Generic lander => ?type=generic) from all sorts of social circles. Also, probably gamers (Gamer-centric lander => ?type=gamer). And so on.
I apply that variable to the body's class, which can be used to completely reorganize the layout. Then, I select different images for key parts of the layout based on that variable as well. The entire site changes. Different fonts can be loaded, different layout, different content.
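A minimal sketch of that idea, assuming a whitelist of known types (the theme names and include paths are hypothetical):

<?php
// Whitelist the ?type= value so arbitrary input can't pick an include path.
$allowed = ['generic', 'gamer', 'artist'];
$type = (isset($_GET['type']) && in_array($_GET['type'], $allowed, true))
      ? $_GET['type'] : 'generic';
?>
<body class="lander lander-<?php echo $type; ?>">
<?php include "themes/" . $type . "/content.php"; // one include per theme ?>
</body>

Because the class lands on <body>, each theme's stylesheet can restyle the whole layout without touching the markup.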
I keep this organized via extensive includes. This sounds ugly, but it's not if you stick to a convention. You have to know the limitations of your foundation HTML, and you can't make too many exceptions. Sure, you could output extra crap based on whether the type was gamer or generic, but then you're going down the road to a product that should probably be contained in its own landing page if it needs to be that different.
I have a few landing pages which can be switched between several contents and styles (5 or 6 'themes'), but the primary purpose of keeping them grouped under the same URL is to maintain focus: that's where a certain type of traffic goes in order to convert for this specific thing. Overlapping the purposes of these landing pages is a terrible idea.
Anyway, dream up a great template, outline a rigid convention for development, keep each theme very separate, and go to town on it. I find doing it right saves a load of time, but be careful - Doing it wrong can cost a lot of time too.
Have a look at htaccess URL rewriting. Then your users (and Google) can use a URL like domain.com/landing/cars, but on your server the script will be executed as if someone had entered domain.com/landing.php?page=cars.
If you use feed content to populate the pages, you should use some kind of caching to ensure that you do NOT re-fetch every feed on each request/page reload.
Checking the feeds every 1 to 5 minutes should be enough, and the very structure of feeds lets you identify new items easily.
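A minimal caching sketch (the cache file name and feed URL are assumptions):

<?php
// Refetch the feed at most every 5 minutes; otherwise serve the cached copy.
$cacheFile = 'cache/feed-cars.xml';
$maxAge = 300; // seconds
if (!file_exists($cacheFile) || time() - filemtime($cacheFile) > $maxAge) {
    file_put_contents($cacheFile, file_get_contents('http://feeds.example.com/cars.xml'));
}
$feed = simplexml_load_file($cacheFile);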
About URL rewrite: http://www.workingwith.me.uk/articles/scripting/mod_rewrite
A nice template engine for generating pages from feeds is PHPTAL (http://phptal.org).
You can load the feed as XML and use it directly in your template.
test.xml:
<foo><bar>baz!!!</bar></foo>
template.html:
<html><head /><body> ${xml/foo/bar}</body></html>
sample.php:
<?php
require_once 'PHPTAL.php'; // PHPTAL's class loader
$xml = simplexml_load_file('test.xml');
$tal = new PHPTAL('template.html');
$tal->xml = $xml;
echo $tal->execute();
And it does support loops and conditional elements.
If you don't need real-time data, you can do this in two parts:
A script, run by something like cron, that pulls data from your RSS feeds and stores the data somewhere (an SQL database?). It could also tag the entries into categories. (A sketch follows after this list.)
A PHP template that takes the URL arguments, fetches the requested data and displays it for the user. This is quite easy to do with PHP, and probably a good learning project if you are that way inclined.
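A hedged sketch of the fetch-and-store script (the table and column names are assumptions, and the guid column is expected to be UNIQUE so reruns don't duplicate items):

<?php
// Run from cron, e.g. hourly: pull feed items into a database table.
$pdo = new PDO('mysql:host=localhost;dbname=landing', 'user', 'pass');
$stmt = $pdo->prepare(
    'INSERT IGNORE INTO items (guid, title, link, category) VALUES (?, ?, ?, ?)'
);
$feed = simplexml_load_file('http://feeds.example.com/cars.xml');
foreach ($feed->channel->item as $item) {
    $stmt->execute([
        (string) $item->guid,
        (string) $item->title,
        (string) $item->link,
        (string) $item->category, // crude "tagging" straight from the feed
    ]);
}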
I have a series of HTML pages which I am converting into Smarty syntax, since that's what I've learnt. The site is a fairly old one in design terms - no includes, etc. - even though our .htaccess allows us to run files with the .htm extension as PHP.
I've saved a few as .tpl pages, but what's the best way to go about converting the site to full-scale templating?
I've been slowly, but tediously, splitting pages into .tpl files, although not sure if that's the right way to do it... the site is a jointly-created one, dating back to 2006 originally as pure HTML.
I'm using a templating engine because that's what the original site-owner wanted, and we're both competent at using a templating engine.
The Smarty manual was useful, but I'm wondering how to write a Smarty pagination script for database results (I'm moving some formerly static data, enclosed in <li> tags, into a new database) where the data is paginated like this, rather than 1-10, 11-20, etc.:
http://i44.tinypic.com/5uh7nk.jpg
If there's another solution (for now we don't quite need CodeIgniter etc.) I'd appreciate the help! ;)
The SmartyPaginate plugin may help. This was written by one of the original Smarty developers. I'm not sure if it is compatible with Smarty 3 though, if that is what you're using.
The other possibility would be to create a template that outputs the pagination, and accepts some parameters to set the options.
e.g.
{include file="paginator.tpl" curpage="3" perpage="50" numitems="3428" link="showitems.php" param="p"}
Where...
curpage = the current page being viewed; this would be set by PHP.
perpage = how many items to show per page; probably a static value set by PHP, or from user preferences.
numitems = how many total items there are; calculated by PHP/SQL and passed to the template.
link = the page to link each page number to.
param = the parameter to use for the page number; PHP uses it to get the page to be viewed, i.e. items.php?p=1.
On the PHP side, you would check whether a page number was set in the URL, and then check whether that number is valid and within the acceptable range.
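A sketch of that PHP side, reusing the parameter names from the include above (the template file name is assumed, and the item count would really come from SQL):

<?php
require 'Smarty.class.php';
$smarty   = new Smarty();
$perpage  = 50;
$numitems = 3428; // in practice: a SELECT COUNT(*) result
$maxpage  = max(1, (int) ceil($numitems / $perpage));
$curpage  = isset($_GET['p']) ? (int) $_GET['p'] : 1;
$curpage  = min(max($curpage, 1), $maxpage); // clamp into the valid range
$smarty->assign('curpage',  $curpage);
$smarty->assign('perpage',  $perpage);
$smarty->assign('numitems', $numitems);
$smarty->display('showitems.tpl');
// The items query would then use: LIMIT (($curpage - 1) * $perpage), $perpage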
Your actual page content would just loop over an array of items - the results from the database. Depending on which set of results was fetched, it will show the items for the given page.
If the SmartyPaginate plugin won't work, you could probably look at it for help with the process of pagination and adapt it to your needs.
I am creating a PHP app that will display some classifieds/listings based on user location. For example:
Our classifieds from Chicago:
Classified 1
Classified 2
Classified 3
Now, I also want to display "classifieds" from some other classified sites on my own page, like this:
More Classifieds from Chicago (courtesy of XYZ.com)
Classified 1
Classified 2
Classified 3
Classified 4
More Classifieds from Chicago (courtesy of ABC.com)
Classified 1
Classified 2
Classified 3
This way, the user can see classifieds hosted on my server as well as classifieds from other common classified sites.
Is this possible? Note that 1) there are no "RSS" feeds available for importing these classifieds; and 2) if possible, I'd like to show these lists in widget format. That is, display an iframe/widget box (not sure what the technical term is) and display all external classifieds in that box.
See a rough mockup here: http://i.imgur.com/O19MR.jpg
I was thinking I could load the other classified sites into iframes, but then I'd get the whole site (including their header/footer, logo, etc.). I just want the relevant classifieds section from their site.
You want to look into doing some screen scraping with a spider-and-parser setup. You can use cURL or file_get_contents to bring in the web page, then use regular expressions and string operators to filter out the data you want, then build a page to display it. This is an overly simplified version of the full answer, but if I gave you the hundreds of lines of code to complete this, that would be cheating!
Given the lack of an API or feed, the only thing I can think of is to pull the relevant URLs and scrape the data from them. It should be pretty simple with a mix of file_get_contents and DOMDocument to parse the data, as long as the markup is tidy.
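A hedged sketch of that approach (the URL and the 'classified' class name are assumptions about the target site's markup):

<?php
// Fetch a listings page and pull out the classified entries by class name.
$html = file_get_contents('http://www.example-classifieds.com/chicago');
$doc = new DOMDocument();
@$doc->loadHTML($html); // suppress warnings - real-world markup is rarely tidy
$xpath = new DOMXPath($doc);
foreach ($xpath->query("//div[contains(@class, 'classified')]") as $node) {
    echo trim($node->textContent), "\n";
}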
The best option I can think of is to set up an asynchronous web crawler that fetches the data from those sites.
You could set it up to crawl every day at 00:00 and store the content in your database, in a table something like:
external_classified
    id
    site_source
    city_id
    extra_data
After that you could get it from your PHP app with no problems.
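Reading it back out is then a plain query (names taken from the sketch above):

<?php
$pdo = new PDO('mysql:host=localhost;dbname=classifieds', 'user', 'pass');
$cityId = 1; // e.g. Chicago's id
$stmt = $pdo->prepare('SELECT * FROM external_classified WHERE city_id = ?');
$stmt->execute([$cityId]);
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);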
EDIT: Note that the solution I'm thinking of is asynchronous! The other answers use a synchronous action to get the data. I think it's a waste of time to fetch the same classifieds over and over again. Although, to be fair, those solutions are simpler to implement.
I have a MySQL database of roughly 100 teachers, their names, and their phone numbers, stored in tables based upon their department at the school. I'm creating an iPhone app, and I've found that UITableView and all the work that comes with it is just too time-consuming and too confusing. Instead, I've been trying to create a web page on my server that loads all the data from MySQL and displays it using HTML, PHP, jQuery, and jQTouch for formatting.
My concept is that the separators will be by department, and the staff will be sorted alphabetically under each department. On the main page, each person's name will be clickable so they can go to ANOTHER page listing their name, email address, and telephone number, all linked so that the user can tap on the email or number and immediately email or call that person, respectively.
HOWEVER, I am completely at a loss for how I should start. Can anyone point me in the right direction for displaying the data? Am I going about it wrong in using PHP? Should I opt for something COMPLETELY different?
PHP to manage the database interaction and generate HTML is fine. There are heaps of tutorials on how to do that (e.g. http://www.w3schools.com/PHP/php_mysql_intro.asp). How to make it look nice is beyond the scope of this answer; I'd recommend searching for table/CSS examples to get an idea of what looks good and how it's implemented. If you need interactivity such as expanding rows or changing colors, then jQuery would be an appropriate next step, though you certainly don't need more than HTML + CSS for a nice-looking table representation.
What I don't know about is the auto email/call functionality you're after, and whether you can get that "for free" from whatever is rendering the HTML. That's iPhone specific, not PHP/jQuery/etc... And I'd second Alex's advice that if UITableView is the right tool for the job then you will definitely be better off in the long run just buckling down and learning it. (And going through that will probably make pickup up other parts of the API much easier to boot.)
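For what it's worth, mailto: and tel: links are what Mobile Safari turns into tap-to-email and tap-to-call. A minimal sketch of the grouped listing (a single teachers table is assumed here; the per-department tables from the question would just change the query):

<?php
$db = new mysqli('localhost', 'user', 'pass', 'school');
$res = $db->query('SELECT department, name, email, phone
                   FROM teachers ORDER BY department, name');
$current = null;
while ($row = $res->fetch_assoc()) {
    if ($row['department'] !== $current) { // new department => new separator
        $current = $row['department'];
        echo '<h2>', htmlspecialchars($current), "</h2>\n";
    }
    echo '<a href="mailto:', htmlspecialchars($row['email']), '">',
         htmlspecialchars($row['name']), '</a> ',
         '<a href="tel:', htmlspecialchars($row['phone']), '">call</a><br>', "\n";
}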
Instead of loading my PHP in my <body>, I created a function that retrieved the data via mysql_fetch_assoc(), built each individual div of data, and also injected a <script> that uses $.append() to add the list-item content for each retrieved row. Thanks for the responses anyway!
I have a list of keywords, about 25,000 of them. I would like people who add a certain <script> tag on their web page to have these keywords transformed into links. What would be the best way to achieve this?
I have tried the simple javascript approach (an array with lots of elements and regexping/replacing each) and it obviously slows down the browser.
I could always process the content server-side if there was a way, from the client, to send the page's content to a cross-domain server script (I'm partial to PHP but it could be anything) but I don't know of any way to do this.
Any other working solution is also welcome.
I would have the remote site add a JavaScript file that uses Ajax to ask your site for a list of only specific terms. Which terms?
Categories: Now, if this is for advertising (where this concept has been done a lot), let them specify what category their site falls into and group your terms into those categories. Then only send those groups of terms. It would be in their best interest to choose the right categories, because the more links they have, the more income they can generate.
Indexing: If that wouldn't work, you could index a copy of the page on your server the first time someone loads it, match all the words on the page against your terms, and on subsequent loads send only the terms that page actually contains. Ideally a background process would then re-index their pages every day or every few days to catch updates; the script could also send a hash of the page contents so you only re-index when the page has changed.
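For the category option, the server side could be as small as this hypothetical endpoint (all names here are assumptions):

<?php
// terms.php?category=gaming - return only that category's terms as JSON.
header('Content-Type: application/json');
header('Access-Control-Allow-Origin: *'); // the calling script is embedded cross-domain
$category = isset($_GET['category']) ? $_GET['category'] : 'general';
$pdo = new PDO('sqlite:keywords.db');
$stmt = $pdo->prepare('SELECT term, url FROM keywords WHERE category = ? LIMIT 500');
$stmt->execute([$category]);
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));

Keeping the list to a few hundred terms per category is what makes the client-side replacement affordable again.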
I'm sure there are other methods, which is best is really just preference. Try looking at a few other advertising-link sites/scripts and see how they do it.
I would like to design a web app that allows me to sort, browse, and display various attributes (e.g. title, tag, description) for a collection of man pages.
Specifically, these are R documentation files within an R package that houses a collection of data sets, maintained by several people in an SVN repository. The format of these files is .Rd, which is LaTeX-like, but different.
R has functions for converting these man pages to html or pdf, but I'd like to be able to have a web interface that allows users to click on a particular keyword, and bring up a list (and brief excerpts) for those man pages that have that keyword within the \keyword{} tag.
Also, the generated html is somewhat ugly and I'd like to be able to provide my own CSS.
One obvious option is to load all the metadata I desire into a database like MySQL and design my site to run queries and fetch the appropriate data.
I'd like to avoid that to minimize upkeep for future maintainers. The number of files is small (<500) and the amount of data is small (only a couple of hundred lines per file).
My current leaning is to have a script that pulls the desired metadata from each file into a summary JSON file, then load that summary.json in PHP, decode it, and loop through the array looking for items whose attributes match the current query (e.g. all docs with keyword1 AND keyword2).
I was starting in that direction with the following...
$contents = file_get_contents("summary.json");
$c = json_decode($contents, true);
foreach ($c as $ind => $val) {
    // ... test $val's attributes against the current query here
}
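The matching step itself can stay small. A sketch, assuming each entry carries a 'keywords' array:

// Keep only the docs whose keyword list contains every queried keyword.
$query = ['keyword1', 'keyword2'];
$hits = array_filter($c, function ($doc) use ($query) {
    return !array_diff($query, isset($doc['keywords']) ? $doc['keywords'] : []);
});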
Another idea was to write a script that would convert these .Rd files to XML. In that case, are there any lightweight frameworks that make it easy to sort and search a small collection of XML files?
I'm not sure if XQuery is overkill or if I have time to dig into it...
I think I'm suffering from too-many-options-syndrome with all the AJAX temptations. Any help is greatly appreciated.
I'm looking for a super simple solution. How might some of you out there approach this?
My approach would be to parse the keywords from the files (from your description, I assume they have a special notation to distinguish them from normal words/text) and store this data as a search index somewhere. It does not have to be MySQL; SQLite would surely be enough for your project.
A search would then be very simple.
Parsing the files could be automated as a post-commit hook in your Subversion repository.
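A hedged indexing sketch (the parser's output format and all names are assumptions):

<?php
// One row per (file, keyword) pair; searching is then a simple SELECT.
$db = new PDO('sqlite:index.db');
$db->exec('CREATE TABLE IF NOT EXISTS kw (file TEXT, keyword TEXT)');
$ins = $db->prepare('INSERT INTO kw (file, keyword) VALUES (?, ?)');
$parsed = ['foo.Rd' => ['regression', 'datasets']]; // stand-in for your .Rd parser's output
foreach ($parsed as $file => $keywords) {
    foreach ($keywords as $k) {
        $ins->execute([$file, $k]);
    }
}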
Why don't you create a SUMMARIES table with a column for each of the summary's fields?
Then you could index it with a full-text index, assigning a different weight to each field.
You don't need MySQL: you can use SQLite, which has Google's full-text indexing (FTS3) built in.
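An FTS3 sketch (the field names are assumptions; matching is done with MATCH rather than LIKE):

<?php
$db = new PDO('sqlite:summaries.db');
$db->exec('CREATE VIRTUAL TABLE IF NOT EXISTS summaries
           USING fts3(title, description, keywords)');
$stmt = $db->prepare('SELECT title FROM summaries WHERE summaries MATCH ?');
$stmt->execute(['keyword1 keyword2']); // multiple terms are implicitly ANDed
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);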