I am working on a website with a large amount of information (news articles). I have a list of news articles, each with a header and an intro. When you click on one it brings up the full article. I did this with AJAX so it happens dynamically, but that left me with the problem that the full view of the article cannot be bookmarked.
I couldn't find a good tutorial on how to do this. I thought it would be good to do it the way I have seen on other sites, where the URL is something like index.php?id=512435345, like auto-generated sub-pages. I don't even know what that solution is called, so I couldn't find anything I could use.
If anyone can point me in the right direction, that would help me out a lot.
When you create an article, it is stored in the database with a unique ID. The developer then creates a single page, which is pretty much a template, that shows the title, description, date, likes, shares, etc. The page uses the unique ID of the post stored in the database to fetch its likes, shares, and everything else.
And yes, those are the auto-generated sub-page IDs, but they are generated by the database itself.
I would recommend creating a single page such as article.php and passing it the ID of the article through the superglobal ($_GET) if you are using PHP. Then run a query (or several) to fetch the data from the database's table and display it to the end user in a nice, clean manner.
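For illustration, here is a minimal sketch of what such an article.php could look like. The articles table and its id, title, and body columns are assumed names, not from the original question:

    <?php
    // article.php -- a minimal sketch; the `articles` table and its
    // `id`, `title` and `body` columns are assumed names for illustration.
    $pdo = new PDO('mysql:host=localhost;dbname=mydb;charset=utf8mb4', 'user', 'password');

    // Read the article ID from the URL, e.g. article.php?id=512435345
    $id = isset($_GET['id']) ? (int) $_GET['id'] : 0;

    // A prepared statement keeps the user-supplied ID from being abused for SQL injection.
    $stmt = $pdo->prepare('SELECT title, body FROM articles WHERE id = ?');
    $stmt->execute([$id]);
    $article = $stmt->fetch(PDO::FETCH_ASSOC);

    if (!$article) {
        http_response_code(404);
        exit('Article not found');
    }

    echo '<h1>' . htmlspecialchars($article['title']) . '</h1>';
    echo '<p>' . htmlspecialchars($article['body']) . '</p>';

Each article then has a stable, bookmarkable URL like article.php?id=512435345, which is exactly the pattern described in the question.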
I have to add a custom post ID for the custom post type. Can anyone help me?
Check the screenshot for more details: http://prntscr.com/mo3eyu
This is a workaround that works.
Background info:
We maintain a few almost identical web pages with different branding, and the code repo is shared across them. In the past, the developers hard-coded page IDs into the PHP. I was asked to add new pages, and after adding them on each website I noticed the IDs were all randomly generated, which broke the PHP code's logic. So I needed to edit these IDs, just like the OP.
The workaround:
In the WP admin, there is an export/import functionality under the Tools menu. Create the new pages on one site, then export/import them onto the other websites. This way the page IDs are not generated randomly; the import copies the IDs from the exported pages, so you can have the same IDs on all sites.
PS:
I'm not sure why the past developers decided to hard-code page IDs in the code; it shouldn't have been done that way to start with. But when you don't have much time to refactor everything, Stack Overflow always comes to help.
IDs determine where post records are stored in the database, so changing them directly is highly prone to breaking things. For example, extensions often store IDs as a way to refer to a specific post and do something with it.
The simple way to change a post to some other ID is to just create a new post and copy the data over (either through the admin or with code).
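As a rough sketch of the "new post + copy" approach using standard WordPress functions (meta handling is shown here; comments and term relationships would need the same treatment):

    // Sketch: let WordPress assign a fresh ID by inserting a copy of the old post.
    // $old_id is the post whose ID you want to replace.
    $old = get_post($old_id);

    $new_id = wp_insert_post(array(
        'post_title'   => $old->post_title,
        'post_content' => $old->post_content,
        'post_status'  => $old->post_status,
        'post_type'    => $old->post_type,
    ));

    // Copy the post meta across as well.
    foreach (get_post_meta($old_id) as $key => $values) {
        foreach ($values as $value) {
            add_post_meta($new_id, $key, maybe_unserialize($value));
        }
    }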
A post's ID is also referenced in tables other than the posts table, such as the comments table. Changing a post ID will break the relationship between the posts table and those other tables, so it is not advisable.
What I am trying to do is make something similar to what I see all the time on almost any website: the button that says "Share to Facebook". The goal for me is to let my guests share the item they are viewing in my store (running on PrestaShop) on the blog I run (running on Oxwall).
The goal is for the button not only to link to the blog-post submission page, but to arrive with the subject line already filled in with the name of the item being shared, and with the post body displaying the item's information. I would like to do all of this in PHP. I am not sure how to go about it, but I am sure that I could pass the value. Please note that I can modify BOTH the blog site and the shop, as I run both and want to connect them.
As an extra bonus, I am also running a forum on phpBB3; if I could do the same thing there as well, I would be very grateful. I am trying to interlink everything into one big network. I know it's not an easy task, but I am sure there is an easy way to pass data to the other site so that this can be done.
Facebook has two ways to get item information from a page: it parses the page looking for the most common tags, and it uses Open Graph.
You can also provide product information in the head of your page (between the head tags); then, on the blog side, you retrieve just that content and parse it as XML.
I advise you to cache this data to avoid useless connections between the websites and awful overhead while parsing.
You can use your own specification, Open Graph, or another standard, but I advise using a standard.
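For example, the shop page could expose Open Graph tags in its head, and the blog side could fetch and parse them. This is only a sketch; the URLs, tag values, and product page address are placeholders:

    <!-- Shop side (in the <head> of the product page) -->
    <meta property="og:title"       content="Example product name" />
    <meta property="og:description" content="Short product description" />
    <meta property="og:url"         content="http://shop.example.com/product.php?id=42" />

    <?php
    // Blog side: fetch the product page and read its Open Graph tags.
    // Cache the result, as advised above, instead of fetching on every request.
    $html = file_get_contents('http://shop.example.com/product.php?id=42');

    $doc = new DOMDocument();
    @$doc->loadHTML($html); // suppress warnings from real-world markup

    $og = array();
    foreach ($doc->getElementsByTagName('meta') as $meta) {
        $property = $meta->getAttribute('property');
        if (strpos($property, 'og:') === 0) {
            $og[substr($property, 3)] = $meta->getAttribute('content');
        }
    }
    // $og['title'] can now pre-fill the blog post's subject line,
    // and $og['description'] the post body.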
This is really a "point me in the right direction" question. What path should I take if I want to display the number of page views each gallery page receives?
Retrieve Google Analytics data via PHP, or
Capture the page views directly on my pages with my own PHP and MySQL setup?
It seems like number 1 would be the better choice; I just don't know how difficult that option will be. Any insights on this?
Option 2 is definitely simpler.
If you do figure out how to get the page results out of Google Analytics, they will not be up to date. It takes Google at least several hours before the page views show up.
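To show how little option 2 takes, here is a minimal sketch; the page_views table is an assumed name:

    <?php
    // Assumes: CREATE TABLE page_views (page_id INT PRIMARY KEY, views INT NOT NULL DEFAULT 0);
    $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'password');

    $pageId = isset($_GET['id']) ? (int) $_GET['id'] : 0;

    // Insert the row on first view, bump the counter on every later view.
    $stmt = $pdo->prepare(
        'INSERT INTO page_views (page_id, views) VALUES (?, 1)
         ON DUPLICATE KEY UPDATE views = views + 1'
    );
    $stmt->execute([$pageId]);

    // Read the counter back for display on the gallery page.
    $stmt = $pdo->prepare('SELECT views FROM page_views WHERE page_id = ?');
    $stmt->execute([$pageId]);
    echo 'Views: ' . (int) $stmt->fetchColumn();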
That also depends on whether you want info for more detailed statistics. I often use a custom setup to track such things, because I can later use the data for statistics about user actions.
For example, say you want to give users (the owners of photos) info about which visitors (male or female, 30+ or below 30, and so on) are viewing them: how many guests, how many registered users. There are tons of data you can retrieve by building a custom system.
It all depends on what you want to have at the end.
I suspect that if you only want raw data about the number of views, you could parse it out of Google Analytics as well.
And as noted in the other answer: Google takes time to update; a custom setup doesn't.
I have created a widget for my web application. Users get the code and just paste it into their website, and my widget then works on their site, something like Twitter, Digg, and other social widgets.
My widget works per post: for a single post (say post ID 234) I provide a single widget, so anyone can embed that widget on their website.
Now I want to know where my widget has been posted, and for which post. For that, I record the URL of the page when my widget starts (onload), but a problem arises when someone places the widget in the common sidebar of their blog or website. Since I record the URL each time, a widget in a sidebar records a URL for every post, which creates duplicates.
Can anyone help with this? How should I proceed so that I have only one record for a widget on a site?
I think doing something like this is a bit tricky. Here are some ideas that come to mind.
You could, for example, ask the user to input their site's URL when they get the widget, or the widget could track only the domain or subdomain, which would produce fewer URLs.
Just tracking the domain would obviously be problematic if the actual site is domain.com/sitename/, since there could be more than one site under the same domain. In that case, you could attempt to detect the highest common directory. Something like this:
You have multiple URLs like domain.com/site/page1, domain.com/site/page2, and so on. Here the highest common directory would be domain.com/site.
I don't think that will always work correctly or provide completely accurate results. For accuracy, I think the best option is simply to ask the user for the URL when they download the code for the widget.
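For what it's worth, here is a rough sketch of the common-directory detection, with the caveat above that it won't always be accurate:

    <?php
    // One way to find the "highest common directory" of a set of URLs:
    // split each path into segments and keep the longest shared prefix.
    function highestCommonDirectory(array $urls) {
        $paths = array_map(function ($url) {
            $parts = parse_url($url);
            return $parts['host'] . (isset($parts['path']) ? $parts['path'] : '/');
        }, $urls);

        $segments = array_map(function ($p) {
            return explode('/', trim($p, '/'));
        }, $paths);

        $common = array_shift($segments);
        foreach ($segments as $seg) {
            $i = 0;
            while ($i < count($common) && $i < count($seg) && $common[$i] === $seg[$i]) {
                $i++;
            }
            $common = array_slice($common, 0, $i);
        }
        return implode('/', $common);
    }

    // highestCommonDirectory(['http://domain.com/site/page1', 'http://domain.com/site/page2'])
    // returns 'domain.com/site'.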
Edit: a new idea. Just generate a unique ID for each user. This could be accomplished by simply taking the current timestamp or something similar, and hiding it in the code snippet the user is supposed to copy. This way you can track the ID itself, and any URLs and domains it appears on can be grouped under it.
If an ID doesn't get a hit in, say, a week, you could remove it from your database and thereby avoid filling it up with unused IDs.
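A minimal sketch of the unique-ID idea; the table, script URL, and snippet details are illustrative assumptions:

    <?php
    // When a user requests the widget code, generate an ID once and bake it
    // into the snippet they copy; every later hit reports this ID.
    $postId   = 234;                 // the post this widget belongs to
    $widgetId = uniqid('w', true);   // timestamp-based, as suggested above

    $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'password');
    $stmt = $pdo->prepare('INSERT INTO widgets (widget_id, post_id, created_at) VALUES (?, ?, NOW())');
    $stmt->execute([$widgetId, $postId]);

    // The snippet the user pastes into their site:
    echo '<script src="http://example.com/widget.js?wid=' . $widgetId . '&post=' . $postId . '"></script>';

All hits can then be grouped by wid, regardless of which URL the sidebar happens to be rendered on.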
I agree with Jani regarding a unique ID. When you dish out the script, you'll then always be able to relate back to that ID. You will still have duplicates if the user uses the same ID over and over, but at least you'll have a way of differentiating one user from another. Another useful advantage is that, as Jani said, you can now group by the ID and get a cumulative number for all of the instances where that user used the script and ID.
I am creating a classifieds website.
I'm storing all ads in a MySQL database, in different tables.
Is it possible for these ads to somehow be found through Google's search engine?
Is it possible to create meta information about each ad so that Google finds them?
How do major companies do this?
I have thought about auto-generating an HTML page for each ad inserted, but 500,000 auto-generated HTML pages doesn't really sound like a good solution!
Any thoughts and ideas?
UPDATE:
Here is my basic website so far:
(ALL PHP BASED)
I have a search engine which searches the database for records.
After finding and displaying the search results, you can click on a result (an ad) and then PHP fetches the info from the database and displays it. Simple!
In the 'put ad' section of my site, you can insert your own ad into the MySQL database.
I need to know how to make Google find the ads on my website as well, as I don't think the Google crawler can search my database just because users can.
Please explain your answers thoroughly so that I fully understand how this works!
Thank you
Google doesn't find database records; Google finds web pages. If you want your classifieds to be found, they'll need to be on a web page of some kind. You can help this process by giving Google a site map/index of all your classifieds.
I suggest you take a look at Google Basics and Creating and submitting Sitemaps. Basically, the idea is to spoon-feed Google every URL you want it to find. So if you reference your classifieds this way:
http://www.mysite.com/classified?id=1234
then you create a list of every URL required to reach every classified, and yes, this might be hundreds of thousands or even millions.
The above assumes a single classified per page. You can of course put 5, 10, 50, or 100 on a single page and then create a smaller set of URLs for Google to crawl.
Whatever you do, however, remember this: your sitemap should reflect how your site is used. Every URL Google finds (or is given) will appear in the index. So don't give Google a URL that a user couldn't reach by using the site normally, or that you don't want a user to use.
So while 50 classifieds per page might mean fewer requests from Google, if that's not how you want users to use your site (or a view you want to provide), then you'll have to do it some other way.
Just remember: Google indexes web pages, not data.
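A sketch of such a spoon-feeding script, generating sitemap XML straight from the database. The classifieds table and the URL pattern are assumptions based on the example above:

    <?php
    // sitemap.php -- spoon-feeds Google one URL per classified.
    // For hundreds of thousands of ads you'd split this into multiple
    // sitemap files (the protocol caps each file at 50,000 URLs).
    header('Content-Type: application/xml; charset=utf-8');

    $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'password');

    echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
    echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";

    foreach ($pdo->query('SELECT id FROM classifieds') as $row) {
        echo '  <url><loc>http://www.mysite.com/classified?id=' . $row['id'] . "</loc></url>\n";
    }

    echo '</urlset>';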
How would you normally access these classifieds? You're not just keeping them locked up in the database, are you?
Google sees your website just like any other visitor sees it. If you have a normal database-driven site, there is some unique URL for each classified where it is displayed. If there is a link to it somewhere, Google will find it.
If you want Google to index your site, you need to put all your pages on the web and link between them.
You do not have to auto-generate a static HTML page for everything; all pages can be created dynamically (JSP, ASP, PHP, what have you), but they need to be accessible to a web crawler.
Google can find you no matter where you try to hide. Even if you can somehow fit yourself into a mysql table. Because they're Google. :-D
Seriously, though: they use a bot to periodically spider your site, so you mostly just need to make the data in your database available as web pages on your site, and make your site bot-friendly (use an appropriate robots.txt file, provide a search-engine-friendly site map, etc.). You also need to make sure they can find your site, so make sure it's linked to from other sites, preferably sites with lots of traffic.
If your site only displays specific results in response to search terms, you'll have a harder time. You may want to make full lists of the records available for people browsing without search terms (paged appropriately if you have lots of data).
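As an example of the bot-friendly pieces mentioned above, a robots.txt file can both set crawl rules and point the bot at your site map. The paths here are placeholders:

    User-agent: *
    Disallow: /admin/
    Sitemap: http://www.mysite.com/sitemap.php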
First, create a PHP file that pulls the index plus a human-readable reference for all records.
That is your main page, broken out into categories (as in the case of Craigslist.com: by country and state).
Each category link then feeds the selected value back to the PHP script, however many levels deep, until it finally reaches the ad itself.
So, if a selected category contains more categories (as states contain cities), display the next list of categories; otherwise, display the list of ads for that city.
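A sketch of that drill-down. Every level is a plain link with query-string parameters, so a crawler can follow it; the categories and ads tables (and a parent_id of 0 for top-level categories) are assumptions:

    <?php
    // Assumes: categories (id, parent_id, name) and ads (id, category_id, title).
    $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'password');

    $parent = isset($_GET['cat']) ? (int) $_GET['cat'] : 0;

    // Does the selected category contain more categories?
    $stmt = $pdo->prepare('SELECT id, name FROM categories WHERE parent_id = ?');
    $stmt->execute([$parent]);
    $subcats = $stmt->fetchAll(PDO::FETCH_ASSOC);

    if ($subcats) {
        // Still levels below: list the next set of category links.
        foreach ($subcats as $cat) {
            echo '<a href="?cat=' . $cat['id'] . '">' . htmlspecialchars($cat['name']) . '</a><br>';
        }
    } else {
        // Bottom level reached: list the ads themselves.
        $stmt = $pdo->prepare('SELECT id, title FROM ads WHERE category_id = ?');
        $stmt->execute([$parent]);
        foreach ($stmt as $ad) {
            echo '<a href="ad.php?id=' . $ad['id'] . '">' . htmlspecialchars($ad['title']) . '</a><br>';
        }
    }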
This gives Google a way to index a site (i.e. a MySQL DB) dynamically, without creating static content for the millions (billions or trillions) of records involved.
This is just an idea of how to get Google to index a database.