I'm trying to redirect thousands of IDs on a single dynamic PHP and MySQL page. I work for a news website with around 7,000 published articles, and my boss decided to change a 10-year-old URL to a new one at the end of last year. Crazy, right?
I have put in redirects from the old site to the new one for the standard static pages, but it is the dynamic pages, holding thousands of article IDs, that have got me in a pickle. I have tirelessly looked for an answer, but to no avail.
For example, the redirect below handles just one ID. Is there a way to redirect every ID in my MySQL database without hand-coding a rule for each one like this, which would be impractical and a near-impossible mission? If not, what would be best practice in my situation after a massive website URL change?
RewriteRule ^article\.php$ - [L]
RewriteCond %{QUERY_STRING} ^article_id=224509$
RewriteRule ^securitieslendingnews/article.php/?$ https://www.securitiesfinancetimes.com/securitieslendingnews/article.php?article_id=224509 [L,NE,R=301]
Thank you for any help in advance.
Not tested, but something like this should do it. A RewriteRule pattern never matches the query string, so the article_id has to be captured with a RewriteCond instead:
RewriteCond %{QUERY_STRING} ^article_id=([0-9]+)$
RewriteRule ^securitieslendingnews/article\.php$ https://www.securitiesfinancetimes.com/securitieslendingnews/article.php?article_id=%1 [L,NE,R=301]
I have a dynamic PHP based site and I've recently noticed it's generating a lot of weird pages like this:
http://www.festivalsnap.com/festival/3151748-16th+Annual+Magnolia+Fest+/hotels/3151748-16th+Annual+Magnolia+Fest+/ticket/hotels
The site architecture should be like this: www.mysite.com/festival/ and then there are four possible child pages for each event: /lineup, /tickets, /hotels and /news.
As you can see from the URL, it just keeps creating more and more unwanted child pages. When I run a sitemap generator it will just keep going forever, creating more of these pointless pages.
It shouldn't go any deeper than the /hotels page, but for some reason it's just adding more and more child pages using any combination of the pages above.
I'm no good with PHP and my developer isn't being very helpful. Anyone know what could be causing this?
Edit:
The main event page comes from a file called festival.php, and then there are four child pages under that - lineup.php, tickets.php, hotel.php and news.php - that get variables from the event page (event title, dates, location, etc.) and use them to search for tickets, hotels, etc.
I have noticed that I can tack basically anything onto the URL and it will add it in as part of the page title/event title. It looks like there is something weird going on with .htaccess.
Here is the .htaccess code:
RewriteEngine on
RewriteCond %{HTTP_HOST} !^www.festivalsnap.com$ [NC]
RewriteRule ^(.*)$ http://www.festivalsnap.com/$1 [R=301,L]
RewriteRule festival/(.*)-(.*)/lineup$ lineup.php?eveid=$1&festival=$2
RewriteRule festival/(.*)-(.*)/news$ news.php?eveid=$1&festival=$2
RewriteRule festival/(.*)-(.*)/tickets$ ticket.php?eveid=$1&festival=$2
RewriteRule festival/(.*)-(.*)/hotels$ hotel.php?eveid=$1&festival=$2
RewriteRule festival/(.*)-(.*)/hotels/(.*)$ hotel.php?eveid=$1&festival=$2&hsort=$3
RewriteRule festival/(.*)-(.*)$ event_page.php?eveid=$1&festival=$2
RewriteRule artists/(.*)-(.*)$ artists.php?artid=$1&artname=$2
This is partly to do with your generator, and partly to do with your .htaccess. The (.*) pattern is greedy and matches anything, including further path segments, so your .htaccess file says that pretty much anything containing festival/ with a hyphen somewhere later in the URL is a valid URL.
But that doesn't explain why your generator is "finding" all of those pages; there must be some bad links being created somewhere, either internally in the generator or in links on pages on your site.
Can you post some code?
EDIT: The .htaccess patterns should be much narrower - try replacing each occurrence of (.*) with ([^/]*), so that a capture can never span more than one path segment.
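For example (untested, and based only on the rules you posted), the two hotel rules would become:
RewriteRule festival/([^/]*)-([^/]*)/hotels$ hotel.php?eveid=$1&festival=$2
RewriteRule festival/([^/]*)-([^/]*)/hotels/([^/]*)$ hotel.php?eveid=$1&festival=$2&hsort=$3
With that change, URLs with extra segments tacked onto the end, like the one in your example, no longer match any rule, so they fall through to a plain 404 instead of rendering yet another page.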
As for the PHP, it's impossible to say exactly what is going on, but it sounds like the generator is finding those links on your site somewhere and following them, in which case the sitemap generator is working correctly, but your content has problems. Check your logs, find one of the incorrect URLs, and see what page referred the user there. That will tell you where to look for the bad code.
I think my question is quite simple but I've been banging my head against the wall for the past few hours.
I have my website using RewriteRule so that messy path names with PHP variables are now nice and tidy (I've removed [http://www] from my examples because the system thinks I am posting links and won't let me).
So when somebody comes to my site at mysite.co.uk/my-product-P1.html, the website knows to serve mysite.co.uk/product.php?id=1 behind the scenes.
But I also want to tidy it up the other way around. If an old customer or an old link uses the pathname mysite.co.uk/product.php?id=1, then I want it to return mysite.co.uk/my-product-P1.html instead, even though the old pathname will actually still work. I don't want customers accessing the same page from different pathnames.
How do I do this and will it create a loop? On another website I have it working using:
RewriteCond %{QUERY_STRING} ^id=1$ [NC]
RewriteRule ^product\.php$ product-P1.html? [R=301,L]
But on that site there are only around 10 products, so I'm able to write these lines for each product. On my other site I have hundreds of products, so this isn't practical and I need to do it automatically.
Hopefully this makes sense. I have read through other posts and can't find my solution so apologies if this is clearly explained somewhere else.
How do I do this and will it create a loop?
The rules that you have (on the site with 10 products) need to match against the actual request as opposed to the URI:
RewriteCond %{THE_REQUEST} \ /product\.php\?id=([^\ &]+)
RewriteRule ^ /product-P%1.html? [R=301,L]
But you're better off doing this in your php script rather than enumerating all the products in the htaccess file:
On my other site I have hundreds of products so this isn't practical and I need to do it automatically
You can't do that using only mod_rewrite. You'll need to script it in your product.php. The product.php script will need to check the $_SERVER['REQUEST_URI'] PHP variable and see if it starts with /product.php.
If it does, then you know someone accessed the PHP script directly, and you'll need to fetch the product name using the id passed in $_GET['id'], then redirect to the product name plus "-P" . $_GET['id'] . ".html".
The htaccess file and mod_rewrite won't know the mapping between product IDs and product names, so you need to do this in your php script (which does have access to this mapping).
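A minimal sketch of what that could look like near the top of product.php - the get_product_slug() helper is a stand-in for however you map an id to a product name in your database:
<?php
// Near the top of product.php: if the script was requested directly with
// ?id=..., send a 301 to the friendly URL instead of serving the page here.
if (strpos($_SERVER['REQUEST_URI'], '/product.php') === 0 && isset($_GET['id'])) {
    $id   = (int) $_GET['id'];
    $slug = get_product_slug($id); // hypothetical helper: look the product name up in MySQL
    if ($slug !== null) {
        header('Location: /' . $slug . '-P' . $id . '.html', true, 301);
        exit;
    }
}
// ...normal product page rendering continues below...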
You need this additional rule:
RewriteCond %{THE_REQUEST} ^[A-Z]{3,}\s/+product\.php\?id=([^\s&]+) [NC]
RewriteRule ^ /product-P%1.html? [R=302,L]
I have a website developed in PHP. I have recently done the URL rewriting, which works fine. However, I just found out that my pages with parameters are also still accessible. For example:
I converted this URL
domainname.com/index.php?page=product&pid=5&proTitle=Samsung Galaxy
After rewrite it looks like this
domainname.com/products/5/Samsung-Galaxy.html
Everything works just fine. However, my site is still accessible using the old parameters. If someone types in the old URL, I want them to be automatically redirected - ideally to the new page, and if not then to the index page. Google and MSN shouldn't access these pages with parameters. Any help will be highly appreciated.
Thanks for your input. Here is more detail to my question.
The code looks like this.
RewriteRule ^products/(.*)/(.*).html$ index.php?page=product&pid=$1&proTitle=$2 [nc]
The rewrite works fine. However, if I try to access the old URL, i.e. domainname.com/index.php?page=product&pid=5&proTitle=Samsung Galaxy, the page is still accessible and, on top of that, is being crawled by Google and other search engines. If someone tries to access this URL, I want it to direct them to Page Not Found, and it should not be sniffed by any crawlers either.
Thanks a lot again for your time and I hope I can get your valuable reply soon.
You need another rule which redirects the browser to the nicer looking URL if the request is made for the ugly looking one. For example:
# need to replace spaces with "-"
RewriteRule ^(.*)\ (.*)$ /$1-$2 [L,R=301]
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.php\?page=([^&\ ]+)&pid=([^&\ ]+)&proTitle=([^&\ ]+)
RewriteRule ^ /products/%2/%3.html? [L,R=301]
I have a website and I am using MySQL to store and fetch data. There is a bunch of data for different destinations (yes, this is a travel agent website), and I am wondering how I can set up the .htaccess file to display SEO-friendly URLs.
For example: http://www.mywebsite.com/flights-result.php?id=10 is the details page for a flight to Entebbe in Africa, and I would like the URL for it to look like http://www.mywebsite.com/Africa/Entebbe.htm
And so on for the rest. One more thing: do I need to add this for every page? The data is updated on a daily basis, so is there an easy way to write the URLs automatically?
Any help highly appreciated.
I don't really think what you are trying to accomplish has much to do with MySQL. What you are looking for is called URL rewriting. There are countless articles out there that could show you the direction to follow. I am not very sure which web server you are using right now; I presume it is Apache. Here is the Apache mod_rewrite guide.
Given the original URL, there isn't enough information in it (only the numeric id, not the continent or city) for mod_rewrite to do this completely on its own.
So what you could do is send all web requests to a controller file, and from there parse the request URI and load the correct page.
So in htaccess, something like...
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^.*$ controller.php [L]
Then in controller.php you parse the url and load the correct page.
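A minimal sketch of what controller.php could do, assuming final URLs like /Africa/Entebbe.htm - the find_flight_id() helper is a stand-in for a lookup against your destinations data in MySQL:
<?php
// controller.php: map /Continent/City.htm back onto flights-result.php
$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

if (preg_match('#^/([A-Za-z-]+)/([A-Za-z-]+)\.htm$#', $path, $m)) {
    $continent = $m[1];
    $city      = $m[2];

    // Hypothetical lookup: find the flight id for this continent/city pair.
    $id = find_flight_id($continent, $city);

    if ($id !== null) {
        $_GET['id'] = $id;          // hand the id to the existing page
        require 'flights-result.php';
        exit;
    }
}

// Nothing matched: plain 404.
http_response_code(404);
echo 'Page not found';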
A different option you may prefer (if you're flexible on the specific final URL) is to have URLs that end up looking like this:
http://www.mywebsite.com/flights/10/Africa/Entebbe.htm
This would likely be simpler to do instead of implementing a controller (although I prefer the controller for routing requests).
So in htaccess...
RewriteRule ^flights/([0-9]{1,10})/([a-zA-Z]+)/([a-zA-Z]+)\.htm$ flights-result.php?id=$1&country=$2&place=$3 [L]
Then near the start of the flights-result.php file you should load the data for the id, then check that the provided "country" and "place" are correct (to stop people just entering anything there), and return a 404 if they're not.
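Roughly, that check could look like this at the top of flights-result.php - load_flight() and the column names are assumptions standing in for your own data access code:
<?php
// flights-result.php: validate the URL segments before rendering anything.
$id      = isset($_GET['id']) ? (int) $_GET['id'] : 0;
$country = isset($_GET['country']) ? $_GET['country'] : '';
$place   = isset($_GET['place']) ? $_GET['place'] : '';

$flight = load_flight($id); // hypothetical helper: fetch this id's row from MySQL

// Unknown id, or country/place in the URL don't match the record: return a 404.
if ($flight === null || $flight['country'] !== $country || $flight['place'] !== $place) {
    http_response_code(404);
    exit('Page not found');
}

// ...render the flight details as before...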
Remember to change all the links your app outputs to the new style as well.
You could also, as you mentioned, hard code all these URLs into a htaccess, but that's not ideal :)
At a website of ours, we've had a normal FB Like button and gathered some likes. After a while we realised we would like to be able to post to the people who have liked our site. I created an app and added the fb:app_id meta tag, and the app started gathering likes right away. However, the likes we had from before are not counted in the app, although they are still shown in the number by our Like button.
Have a look at https://graph.facebook.com/?ids=http://leveransrapport.se compared to https://graph.facebook.com/?ids=http://www.leveransrapport.se. The first one shows 105 shares whereas the second one shows 10 likes.
What am I missing?
My intention is to get all the likes to show up in the app page for the admins.
I think the issue goes beyond Facebook. This is known as the canonical URL problem, and it causes search engines to treat the URLs as different pages and split your search ranking!
Ideally you want all URLs to map to one of the two (www. or non-www.) to prevent this kind of problem.
You could put something like this in your .htaccess file (Google will show you more examples):
RewriteEngine On
RewriteCond %{HTTP_HOST} ^domain.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
Many browsers append http:// to a URL given without a protocol, so that url.com becomes http://url.com and www.url.com becomes http://www.url.com. It depends on how the app treats URLs. Since I have not programmed with the FB API, I cannot add much more than general information.
Edit:
According to this link, and some others on the internet, the two make no difference and it is a matter of preference. I think you also have to check whether the links are supposed to be https rather than http!