I'm trying to write a script that will show an RSS-style summary of a single URL (title, author, image, source, etc.). This should behave much the way Facebook does when you copy and paste a link to share and it generates this information automatically. I'm trying to do this with a PHP script, but I'd also be open to open source programs that can do this.
Also, if anyone knows of any Joomla/Drupal plugins that can do this, that would be great. This may eventually end up on a site running one of those frameworks.
thanks!!
I'm not really sure what you're asking for, but here is a list of Joomla extensions that handle RSS syndication:
link text
This seems like a pretty specific application to get started with a big framework. Anyway, some Drupal modules worth checking out:
http://drupal.org/project/facebook_link
http://drupal.org/project/views
http://drupal.org/project/views_rss
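If you'd rather roll your own in plain PHP, a rough sketch is to fetch the page and read its Open Graph meta tags, which is essentially what Facebook's link previews are built from. This assumes cURL and the DOM extension are available; the function name and fallback logic here are just illustrative.

```php
<?php
// Rough sketch: fetch a page and pull out the Open Graph (or fallback) metadata
// that link previews are typically built from. Error handling kept minimal.
function fetchLinkPreview($url)
{
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_FOLLOWLOCATION => true,
        CURLOPT_TIMEOUT        => 10,
        CURLOPT_USERAGENT      => 'LinkPreviewBot/0.1',
    ]);
    $html = curl_exec($ch);
    curl_close($ch);
    if ($html === false) {
        return null;
    }

    $doc = new DOMDocument();
    @$doc->loadHTML($html); // suppress warnings from sloppy markup

    $preview = ['title' => null, 'description' => null, 'image' => null, 'site_name' => null];

    foreach ($doc->getElementsByTagName('meta') as $meta) {
        $property = strtolower($meta->getAttribute('property'));
        $content  = $meta->getAttribute('content');
        if ($property === 'og:title')       $preview['title'] = $content;
        if ($property === 'og:description') $preview['description'] = $content;
        if ($property === 'og:image')       $preview['image'] = $content;
        if ($property === 'og:site_name')   $preview['site_name'] = $content;
    }

    // Fall back to the <title> tag if there is no og:title.
    if ($preview['title'] === null) {
        $titles = $doc->getElementsByTagName('title');
        if ($titles->length > 0) {
            $preview['title'] = trim($titles->item(0)->textContent);
        }
    }

    return $preview;
}

print_r(fetchLinkPreview('http://example.com/some-article'));
```

Pages that don't expose Open Graph tags will only yield the title fallback, so you may want extra fallbacks (meta description, first image on the page, etc.).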
I'm a PHP web developer, although I'm new to it. I want to create a blog site for a writer, but I don't know how to make the site's text editable. So I'm asking: how should I set up my PHP site to manage text content, and probably other media content as well?
Thanks
The best solution is to use WordPress. The user can create, edit and delete content, and you will be able to style that information. Check this link
Until now we have used WordPress for our website, and the XML sitemap was very easy to create for the whole site with a few plugins.
Now we are switching to a PHP website built from scratch, and after some Google searching I haven't found anything that helps me understand how to create a sitemap for my website.
Can someone please help, with any kind of software or web script?
Thank you.
It really depends on how your pages are served (from a database, include files, what have you), but if you don't have very many pages you could simply create a file called sitemap.xml by hand and place it in your root directory. There are plenty of examples you can emulate from a quick Google search.
To create a sitemap.xml, read up on sitemap.xml format, then create an XML file that conforms to that format.
How to do this? Well, that depends on the structure of your site. Maybe you could write it by hand, maybe you could generate it based on stuff in the database, maybe you could crawl your site, maybe you could...? This question doesn't really have a specific answer--it all depends on how your site is organized.
If you have a list of pages in a database, you can use that. Almost all websites have either a directory of static pages, a database with pages, or a combination of both. You should be able to generate a list of all your pages. If you can put them in an array, you can put them in XML as well. Use the SimpleXML extension from PHP. It is, well, Simple. :)
If you cannot generate an export this way, you could use some kind of crawler to generate a list of URLs found when crawling your main page and all successive pages in your domain.
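To make the SimpleXML suggestion above concrete, here is a minimal sketch. It assumes you can already build a plain $urls array from your database or page directory; the URLs and output path are placeholders.

```php
<?php
// Sketch: turn a list of page URLs into a minimal sitemap.xml with SimpleXML.
$ns   = 'http://www.sitemaps.org/schemas/sitemap/0.9';
$urls = [
    'http://www.example.com/',
    'http://www.example.com/about',
    'http://www.example.com/contact',
];

$sitemap = new SimpleXMLElement('<?xml version="1.0" encoding="UTF-8"?><urlset xmlns="' . $ns . '"/>');

foreach ($urls as $url) {
    $entry = $sitemap->addChild('url', null, $ns);
    $entry->addChild('loc', htmlspecialchars($url), $ns);
    $entry->addChild('lastmod', date('Y-m-d'), $ns);
    $entry->addChild('changefreq', 'weekly', $ns);
}

// Drop the finished file in the web root next to index.php.
$sitemap->asXML($_SERVER['DOCUMENT_ROOT'] . '/sitemap.xml');
```

You could run this from a cron job, or regenerate the file whenever a page is added or removed.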
I'm at the beginning of developing three different web applications with the classic PHP/MySQL technologies. These applications will all have photo galleries (with different sizing requirements).
I think the best choice is an open source solution rather than developing from scratch. However, even though I'm an experienced PHP programmer, I have no experience with open source CMSs or photo galleries.
So, questions:
What are the best choices for an open source PHP photo gallery, considering that I will surely have to mess with their code and extend it (I've seen Plogger and ZenPhoto, not impressed)?
Is it wise to choose an open source PHP photo gallery, or to go with a CMS (e.g. WordPress, Joomla, TYPO3, etc.)?
If anyone has experience using and extending open source PHP apps, please share some knowledge.
If your site is mainly focused on the photo gallery, it's better to choose one of the available open source photo gallery packages rather than a CMS like Joomla.
www.plogger.org
You can try:
Gallery
Coppermine
Both are well known open source PHP photo gallery packages (GPL license). Both are also under active development and have a big community of users, so you can probably get help when you need it.
Both also offer some kind of integration with the best-known open source CMSs.
I recommend searching the WordPress plugin directory for a gallery plugin that suits your requirements: http://wordpress.org/extend/plugins/
I actually use Gallery, but not in the sense that most people do. I install it in a directory that only the admin will see. I link it up through my CMS with an icon, and tie the authentication system to match my CMS's. It exists only for the admin to get photos in.
From there, I do queries on the front-end with php into Gallery's database to get the photos that I want. It's certainly not rocket science once you figure out which tables and which directories are required to get what you need.
Could I use something canned? Sure. But my clients demand more than that. Because I'm just too busy to get something completely custom finished that includes resizing, javascript cropping, folder traversing, etc in addition to the crazy front end transitions and presentation I already write, this has been a great solution. And it works, every time. I can focus my time on making the front-end really unique and perfect for that particular application.
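For anyone curious what that looks like in practice, here is a rough sketch of the front-end query. The table and column names (gallery_items, file_path, album_id) are purely illustrative placeholders, not Gallery's actual schema, and the PDO credentials are dummies; you would map them to whatever your gallery software really stores.

```php
<?php
// Illustrative only: the table and column names below are placeholders, not
// Gallery's real schema. Map them to your gallery's actual items table.
$pdo = new PDO('mysql:host=localhost;dbname=gallery_db', 'user', 'pass');

$stmt = $pdo->prepare(
    'SELECT title, file_path
       FROM gallery_items
      WHERE album_id = :album
   ORDER BY created_at DESC
      LIMIT 12'
);
$stmt->execute([':album' => 3]);

foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $photo) {
    printf(
        '<img src="/admin-gallery/%s" alt="%s" />',
        htmlspecialchars($photo['file_path']),
        htmlspecialchars($photo['title'])
    );
}
```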
I've been using Xoogallery, which is a responsive PHP photo gallery. It's open source.
They offer a developer license so you can use it on your clients' websites.
Try it http://xooscripts.com/product/html5-php-photo-gallery.html
I think this is a really challenging one!
I write the website for my local football league, www.rdyfl.co.uk, and include JavaScript code snippets from the F.A.'s Full-Time system, where we generate our fixtures, linking in league tables, fixtures, recent results, etc.
For another feature I want to add to the site I need to scrape the 'Upcoming Fixtures' for each age group and division, but when I examine the source I have two problems:
The fixtures content is generated by JavaScript, so I need to work from the generated source and not just the raw source.
When I view the generated source using Firefox, the team names are actually further JavaScript links rather than the names themselves.
I basically want to download the fixtures on a regular basis and write them to a MySQL database.
I have asked the F.A. and they have no other options available for accessing the data.
Having never coded a scraper before, can anyone point me to a simple solution, or does anyone fancy the challenge?
This question was asked a long time ago, but I noticed it was active today 🤷.
You should be able to scrape the website using a headless browser such as Puppeteer. Using Puppeteer you are able to access a URL and execute JavaScript or interact with the website as you would with an ordinary browser. Parsing the output DOM and storing it should then be relatively straightforward.
There are plenty of articles on this topic using Puppeteer.
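Puppeteer itself is a Node.js library. If you would rather stay in PHP for this site, one comparable approach is to drive a real browser through Selenium with the php-webdriver package. The sketch below assumes a local Selenium/ChromeDriver instance is running, and the URL and CSS selector are placeholders you would replace after inspecting the fixtures page.

```php
<?php
// Sketch using php-webdriver against a local Selenium/ChromeDriver instance.
// Requires: composer require php-webdriver/webdriver, and Selenium listening
// on http://localhost:4444. The URL and the '.fixture-row' selector are
// hypothetical placeholders.
require 'vendor/autoload.php';

use Facebook\WebDriver\Remote\RemoteWebDriver;
use Facebook\WebDriver\Remote\DesiredCapabilities;
use Facebook\WebDriver\WebDriverBy;

$driver = RemoteWebDriver::create('http://localhost:4444/wd/hub', DesiredCapabilities::chrome());
$driver->get('http://example.com/fixtures-page');

// Give the page's JavaScript time to render the fixtures.
sleep(5);

$fixtures = [];
foreach ($driver->findElements(WebDriverBy::cssSelector('.fixture-row')) as $row) {
    $fixtures[] = $row->getText(); // raw text; parse into teams/date as needed
}

$driver->quit();

// From here, insert $fixtures into MySQL with PDO on a cron schedule.
print_r($fixtures);
```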
The latest version of OutWit Hub does a pretty good job on dynamic content. The source OutWit scrapes to extract links, images, documents, tables and text is the updated DOM, so you can certainly build a job to grab what you need using these.
Custom scrapers are still applied to the static source in version 1.0.3, but version 1.1.x (still in beta) will offer the choice between the static source and the dynamically modified DOM.
Scraping content produced by JavaScript is challenging. AFAIK you will need to do this with AJAX. Hopefully the content has some CSS classes you can grab with jQuery, or at least some IDs. Do you have IDs or classes that you can grab?
I'm familiar with HTML, CSS, and some PHP and JavaScript. I've made several fairly complicated websites for which I've acted as webmaster, manually adding all content in HTML.
I'm about to put in a proposal for my first outside client at a larger business. They have an IT person that would be responsible for updating the website that I create for them.
My question is what to do about content management. I've looked into things like Drupal, but they seem overly complex for this kind of situation, with a single person adding updates of things like text, images, and PDFs.
What would you recommend as the next step above the simple way of manually uploading files and editing HTML like I'm used to? Something like a MySQL database and PHP calls? Would I then store all the images in the database as well?
I guess I'm just trying to figure out what's most common at a medium-sized business. I appreciate any guidance you can offer!
Nathaniel
My company has built both large-scale and medium-scale projects. What we like to do is set up an outer page with navigation and an inside page that the client controls through a control panel with FCK Editor or TinyMCE.
So essentially we have a wrapper page (in our case a MasterPage, but in PHP you would use an include or an index.php with a query string to pull the content) and then we drop in the HTML content from the database.
That database is populated by the client in their control panel. FCK Editor allows them to upload images and manage links, etc.
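A rough PHP sketch of that wrapper pattern might look like the following; the pages table and its columns (slug, title, body_html) are assumed names for illustration, as are the include files and credentials.

```php
<?php
// index.php -- wrapper page that pulls editor-produced HTML from the database.
// Table/column names (pages, slug, title, body_html) are illustrative.
$pdo = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');

$slug = isset($_GET['page']) ? $_GET['page'] : 'home';

$stmt = $pdo->prepare('SELECT title, body_html FROM pages WHERE slug = :slug');
$stmt->execute([':slug' => $slug]);
$page = $stmt->fetch(PDO::FETCH_ASSOC);

if ($page === false) {
    http_response_code(404);
    $page = ['title' => 'Not found', 'body_html' => '<p>Page not found.</p>'];
}

include 'header.php';  // shared navigation / wrapper markup
echo '<h1>' . htmlspecialchars($page['title']) . '</h1>';
echo $page['body_html'];  // HTML saved by the WYSIWYG editor in the control panel
include 'footer.php';
```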
For our bigger clients we get very specific in our control panel allowing them to add videos, PDF attachments, blog entries, FAQ content, etc.
Some examples we have are http://pspwllc.com and http://needsontime.com and http://nwacasa.org
Drupal can be a bit complex at first, but if you stick with the basic modules it is great for website content management. You could also write your own mini content management system: store text and images (as MySQL BLOBs) in MySQL. It would be a couple of PHP admin pages plus a good render() function responsible for page rendering.
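If you go the BLOB route for images, the script that serves them back out could be as small as the sketch below; the images table and its columns (id, mime_type, data) are assumed names for illustration.

```php
<?php
// image.php?id=123 -- stream an image stored as a MySQL BLOB back to the browser.
// The images table and its columns (id, mime_type, data) are assumed names.
$pdo = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');

$id = isset($_GET['id']) ? (int) $_GET['id'] : 0;

$stmt = $pdo->prepare('SELECT mime_type, data FROM images WHERE id = :id');
$stmt->execute([':id' => $id]);
$image = $stmt->fetch(PDO::FETCH_ASSOC);

if ($image === false) {
    http_response_code(404);
    exit('Image not found');
}

header('Content-Type: ' . $image['mime_type']);
header('Content-Length: ' . strlen($image['data']));
echo $image['data'];
```

In the admin pages you would do the reverse: read the uploaded file with file_get_contents() and insert it into the data column, along with its MIME type.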
Also have a look at WordPress; it is much easier than Drupal. It is less powerful, but it may serve your needs. You will NOT need to configure modules like FCKeditor with it, because they come built in, so anybody will be able to edit the content easily. Do note that WordPress is not just for blogs; you can create different kinds of websites with it. Another choice is Joomla, which is also simpler than Drupal. But WordPress is the simplest.