On one of our websites, we've had a normal FB Like button and gathered some likes. After a while we realised we would like to be able to post to the people liking our site. I created an app and added the fb:app_id meta tag, and the app started gathering likes right away. However, the likes we had from before are not counted in the app, although they are still shown in the number next to our Like button.
Have a look at https://graph.facebook.com/?ids=http://leveransrapport.se compared to https://graph.facebook.com/?ids=http://www.leveransrapport.se. The first one shows 105 shares whereas the second one shows 10 likes.
What am I missing?
My intention is to get all the likes to show up in the app page for the admins.
I think the issue goes beyond Facebook. This is known as the canonical URL problem, and it also causes search engines to treat the two URLs as different pages and split your search ranking between them!
Ideally you want all URLs to map to one of the two (www. or non-www.) to prevent this kind of problem.
You could put something like this in your .htaccess file (google shows you more examples)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
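Or the other way round, if you prefer the bare domain as the canonical form (equally untested; domain.com is a placeholder for your own host):

```apache
RewriteEngine On
# redirect www.domain.com/anything to domain.com/anything with a permanent redirect
RewriteCond %{HTTP_HOST} ^www\.domain\.com$ [NC]
RewriteRule ^(.*)$ http://domain.com/$1 [R=301,L]
```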
Many browsers prepend http:// to URLs entered without a protocol, so that url.com becomes http://url.com and www.url.com becomes http://www.url.com. Beyond that it depends on how the app treats URLs. Since I have not programmed with the FB API, I cannot add much more than general information.
Edit:
According to this link and some other sources on the internet, the two make no difference and it is a matter of preference. I think you should also check whether the links are supposed to be https rather than http!
I'm trying to redirect thousands of ids on a single dynamic PHP and MySQL page. I work for a news website with around 7,000 articles published, and at the end of last year my boss decided to change a 10-year-old URL structure to a new one. Crazy, right?
I have put in redirects from the old site to the new one for the standard static pages, but it is the dynamic pages holding thousands of article ids that have got me in a pickle. I have tirelessly looked for an answer, but to no avail.
For example, the redirect below handles just one id and will redirect that specific article. Is there a way to redirect all the ids in my MySQL database without hand-coding each of them like this, as that would be impractical and an impossible mission? If not, what would be best practice in my situation after a massive website URL change?
RewriteRule ^article\.php$ - [L]
RewriteCond %{QUERY_STRING} ^article_id=224509$
RewriteRule ^securitieslendingnews/article.php/?$ https://www.securitiesfinancetimes.com/securitieslendingnews/article.php?article_id=224509 [L,NE,R=301]
Thank you for any help in advance.
Not tested, but something like this should catch every id in one rule. Note that RewriteRule never sees the query string, so the id has to be captured in a RewriteCond and referenced as %1:
RewriteCond %{QUERY_STRING} ^article_id=([0-9]+)$
RewriteRule ^securitieslendingnews/article\.php/?$ https://www.securitiesfinancetimes.com/securitieslendingnews/article.php?article_id=%1 [L,NE,R=301]
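If the old and new ids don't correspond one-to-one, a RewriteMap lets you keep the whole old-to-new mapping in a single text file instead of thousands of rules. Note that RewriteMap only works in the server or vhost config, not in .htaccess, and the map file path below is just a placeholder:

```apache
# In httpd.conf or the vhost (RewriteMap is not allowed in .htaccess)
RewriteEngine On
# each line of the map file is "oldid newid", one pair per article
RewriteMap articles txt:/etc/apache2/article-ids.map
RewriteCond %{QUERY_STRING} ^article_id=([0-9]+)$
# look the old id up in the map; fall back to the same id if it is not listed
RewriteRule ^securitieslendingnews/article\.php$ https://www.securitiesfinancetimes.com/securitieslendingnews/article.php?article_id=${articles:%1|%1} [L,NE,R=301]
```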
I'm trying to write a redirect rule with mod_rewrite, but somehow it never works, so I must be doing something wrong :(
Below is the rewrite rule as I've defined it in an .htaccess file. It should be noted that there's another block above this one that comes from WordPress; I tried combining that one and the one below into a single rewrite rule to see if that was maybe the issue, but that didn't fix things either. Anyway, this is what I have:
<IfModule mod_rewrite.c>
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} =facebookexternalhit\/[0-9]+(\.[0-9]+)*
RewriteRule ^/hash/([A-Za-z0-9-]+)/?$ \#$1 [R]
</IfModule>
The RewriteCond part checks whether the request comes via Facebook (I've kept it out of the equation so far while testing the RewriteRule). I'm not sure if it actually works, but I came across it in multiple StackOverflow questions, so I'm assuming it's fine (honestly it doesn't really matter, and I may omit that condition altogether; I just posted it here in case it turns out to be invalid).
The RewriteRule part should redirect all URLs containing /hash/ to URLs containing #, so http://www.example.com/hash/home should redirect the browser to http://www.example.com#home.
I've tried a load of different iterations of the above, but whatever I try, it simply doesn't appear to work, hence this question. Does anybody see anything wrong with the code? Could it be an issue that there are multiple blocks defined in one .htaccess file?
Finally, is this even a valid solution? The actual underlying problem is this:
I have a site that uses hash-tags to navigate to certain pages, and the site has some sharing buttons that share the current URL a user is looking at. Twitter, LinkedIn and Google+ all handle shared URLs containing a hash-tag without issue, but Facebook strips the hash-tag from the URL in the sharer (if I send it along as a # hash-tag; if I try to send it URL-encoded as %23, Facebook reports a 404 on the URL). So I need a workaround for Facebook where I can share a URL without a hash-tag, and then redirect those URLs to something actually understood by the website when a visitor or the Facebook sharing API 'comes knocking'.
Any help highly appreciated!
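Two things worth checking here (untested, using example.com as in the question): in .htaccess the pattern is matched against the path without the leading slash, and mod_rewrite escapes a # in the substitution to %23 unless the NE (no-escape) flag is set. A closer-to-working version of the rule might be:

```apache
<IfModule mod_rewrite.c>
RewriteEngine on
# no leading slash in the pattern inside .htaccess;
# NE keeps the # literal instead of escaping it to %23
RewriteRule ^hash/([A-Za-z0-9-]+)/?$ http://www.example.com/#$1 [R,NE,L]
</IfModule>
```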
I'm building a simple site that will only have a homepage and a contact page, and I wondered if I could use .htaccess to rewrite the URLs for different companies.
So, for example, if I go to website.com/companyname-contact/ it will display that URL in the browser bar but actually load a generic contact.php page; I can then use PHP to pull in the correct contact details for that specific company name.
This would need to work for different company names (e.g. website.com/anothercompany-contact/) but only for an array of approved company names.
I realise this may not be possible, but I thought I'd ask because I spent about four hours this morning Googling it with no real progress.
Thanks
Unless you want to manually list the approved company names in your .htaccess file (which looks UGLY) I'd suggest this:
RewriteEngine On
RewriteRule ^([^/]+)-contact/?$ /contact.php?company_name=$1 [L,QSA,NC]
and then in your contact.php
determine if it is a valid company name - check the db or whatever method you are using (make sure to escape the input).
if it is not valid you have a couple of options:
redirect to your default 404 page
issue an intelligent warning page (i.e. include suggestions for alternate spellings that are in the db) and set a 404 header (better IMO)
if a similar company name is in the db, possibly redirect to that with a note at the top of the page
Yes you can. You need to enable the rewrite engine, and then you will be able to use regular expressions to accomplish what you're trying to do.
This is an example of what your htaccess could look like:
RewriteEngine On
RewriteRule ^([A-Za-z0-9-]+)-contact/?$ contact.php?company=$1 [NC,L]
I have a dynamic PHP-based site, and I've recently noticed it's generating a lot of weird pages like this:
http://www.festivalsnap.com/festival/3151748-16th+Annual+Magnolia+Fest+/hotels/3151748-16th+Annual+Magnolia+Fest+/ticket/hotels
The site architecture should be like this: www.mysite.com/festival/ and then there are 4 possible child pages for each event: /lineup /tickets /hotels /news
As you can see from the URL, it just keeps creating more and more unwanted child pages. When I run a sitemap generator, it will just keep going forever, creating more of these pointless pages.
It shouldn't go any deeper than the /hotels page, but for some reason it's just adding more and more child pages using any combination of the above pages.
I'm no good with PHP and my developer isn't being very helpful. Anyone know what could be causing this?
Edit:
The main event page comes from a file called festival.php, and there are 4 child pages under that - lineup.php, tickets.php, hotel.php and news.php - that get variables from the event page (event title, dates, location, etc.) and use them to search for tickets, hotels, etc.
I have noticed that I can tack basically anything onto the URL and it will be added in as part of the page title/event title. It looks like there is something weird going on with .htaccess.
Here is the .htaccess code:
RewriteEngine on
RewriteCond %{HTTP_HOST} !^www.festivalsnap.com$ [NC]
RewriteRule ^(.*)$ http://www.festivalsnap.com/$1 [R=301,L]
RewriteRule festival/(.*)-(.*)/lineup$ lineup.php?eveid=$1&festival=$2
RewriteRule festival/(.*)-(.*)/news$ news.php?eveid=$1&festival=$2
RewriteRule festival/(.*)-(.*)/tickets$ ticket.php?eveid=$1&festival=$2
RewriteRule festival/(.*)-(.*)/hotels$ hotel.php?eveid=$1&festival=$2
RewriteRule festival/(.*)-(.*)/hotels/(.*)$ hotel.php?eveid=$1&festival=$2&hsort=$3
RewriteRule festival/(.*)-(.*)$ event_page.php?eveid=$1&festival=$2
RewriteRule artists/(.*)-(.*)$ artists.php?artid=$1&artname=$2
This is partly to do with your generator, and partly to do with your .htaccess. The .* pattern is extremely greedy, so your .htaccess file says pretty much anything containing festival/ with a hyphen somewhere later in the URL is a valid URL.
But that doesn't explain why your generator is "finding" all of those pages; there must be some bad links being created somewhere, either internally in the generator or in links on pages on your site.
Can you post some code?
EDIT: The .htaccess code should be much narrower - try replacing each of the occurrences of (.*) with ([^/]*).
As for the PHP, it's impossible to say exactly what is going on, but it sounds like the generator is finding those links on your site somewhere and following them, in which case the sitemap generator is working correctly, but your content has problems. Check your logs, find one of the incorrect URLs, and see what page referred the user there. That will tell you where to look for the bad code.
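For example (untested), here are the festival rules from the question with each (.*) narrowed to ([^/]+) (using + rather than * so empty segments don't match), plus ^ anchors and [L] flags so each request matches at most one rule:

```apache
RewriteEngine on
RewriteCond %{HTTP_HOST} !^www\.festivalsnap\.com$ [NC]
RewriteRule ^(.*)$ http://www.festivalsnap.com/$1 [R=301,L]
RewriteRule ^festival/([^/]+)-([^/]+)/lineup$ lineup.php?eveid=$1&festival=$2 [L]
RewriteRule ^festival/([^/]+)-([^/]+)/news$ news.php?eveid=$1&festival=$2 [L]
RewriteRule ^festival/([^/]+)-([^/]+)/tickets$ ticket.php?eveid=$1&festival=$2 [L]
RewriteRule ^festival/([^/]+)-([^/]+)/hotels$ hotel.php?eveid=$1&festival=$2 [L]
RewriteRule ^festival/([^/]+)-([^/]+)/hotels/([^/]+)$ hotel.php?eveid=$1&festival=$2&hsort=$3 [L]
RewriteRule ^festival/([^/]+)-([^/]+)$ event_page.php?eveid=$1&festival=$2 [L]
RewriteRule ^artists/([^/]+)-([^/]+)$ artists.php?artid=$1&artname=$2 [L]
```

With these patterns, a URL like /festival/3151748-foo/hotels/3151748-bar/ticket/hotels no longer matches anything, so the sitemap generator can't keep descending.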
On my site, I have used mod_rewrite to make search-engine-friendly and user-friendly URLs.
Only 3 rules:
RewriteRule ^articles/([a-z]+)/([0-9]+)/?$ /index.php?page=articles&cat=$1&id=$2 [L]
RewriteRule ^articles/([a-z]+)/?$ /index.php?page=articles&cat=$1 [L]
RewriteRule ^([a-z]+)/?$ /index.php?page=$1 [L]
But index.php is still accessible to anyone and will work even if a friendly URL is not used (that is, when the parameters are passed directly).
So, does this get my site down-ranked by search engines? Do I have to block direct access to files with the .php extension?
If you have 2 URLs that load the same page, where one is search-engine friendly and the other is not, this is not really detrimental to your site AFAIK. Basically you just want to expose as much as you can to search engines. So if you provide a parallel track, for example an anchor tag whose href works fine without Javascript (which is ideal for a bot) but is typically handled by Javascript for clients that have it (most standard web browsers), then you're golden.
EDIT:
Per the OP's question in a comment about parallel paths: say I have a link, an anchor tag.
<a id="moxune_services" href="http://moxune.com/services" target="_self">Moxune Services</a>
You can see that this is a valid link (and I will be getting SEO points for it from StackOverflow ;P). But anyway, say this is part of a heavy JS-driven site, and rather than refreshing the whole page when this link is clicked, I just want a subsection of the page, say a div with id="content", to be replaced by the fresh content after I load it via AJAX. The JS would be something like this (without testing, this is just off the top of my head; a jQuery solution as well):
// very crude jQuery example!
$('#moxune_services').click(function(e) {
    e.preventDefault(); // stop the browser's full-page navigation
    $.get($(this).attr('href'), function(sNewHtml) {
        // .html() rather than .replaceWith(), so #content survives for the next click
        $('#content').html(sNewHtml);
    });
});
Now you see, Googlebot can reach the page through the HTML <a> tag, no problem, but your customers looking for a Web 2.0 (TM) website will be able to enjoy the lack of full-page refreshes as long as they have JS enabled (and hopefully aren't using IE 6 :O).
One term for this is 'graceful degradation'.
quickshiftin is right here. There's no point hiding the index.php
If you must, however do this:
RewriteCond %{THE_REQUEST} \s/index\.php [NC]
RewriteRule ^index\.php$ / [L,R=301]
I didn't test this, so it might not work, but the general idea is to redirect direct requests for index.php to /. The RewriteCond checks the original request line, so the internal DirectoryIndex rewrite of / back to index.php won't trigger the rule again and loop.