Fragment URLs causing problems in SEO - PHP

First of all, I'm not sure whether my question is phrased correctly, but I have searched Google and SO for an answer and didn't find anything.
Let me try to explain my problem.
I am building a website in PHP (a server-side scripting language) along with HTML, CSS, jQuery, and a few other libraries.
It contains many pages, and a few of them contain a form which I submit using AJAX.
Now, when the person doing SEO for the website analyzes it with a tool (A1 Sitemap Generator), it finds some unwanted URLs like:
https://www.example.com/graphic-design?_escaped_fragment_=
and
https://www.example.com/graphic-design#!
This problem shows up only on the pages which contain the AJAX form.
Can anyone explain why I am seeing this and how to solve it?

I think the problem is with your AJAX request. I had this problem too; maybe you are sending data through GET in your AJAX form. Change the AJAX method from GET to POST.
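As a minimal sketch of that change (the #contact-form selector and /submit.php endpoint are placeholders for your own markup), a jQuery submission switched from GET to POST might look like this:

// Hypothetical form handler; the selector and URL are placeholders.
$('#contact-form').on('submit', function (e) {
    e.preventDefault(); // stop the browser from navigating on submit
    $.ajax({
        url: '/submit.php',
        type: 'POST', // was 'GET'
        data: $(this).serialize(),
        success: function (response) {
            console.log('Form submitted:', response);
        }
    });
});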

Related

Web Scraping in php? need a specific piece of data

I'm trying to pull a piece of data from the website www.coinmarketcap.com; specifically, the market cap number at the top.
I've been trying to figure this out for the past hour or so and have read MANY different ways people use these web scrapers, but have not been successful at all. Could someone shed some light?
There are multiple ways, but the easiest is to just take the URL of the JSON feed the page itself loads:
https://files.coinmarketcap.com/generated/stats/global.json
Please note: they might not like this. Maybe they don't want external parties to use their feeds, so also build a check for whether the file still exists and doesn't give a 403 back.
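A minimal PHP sketch of that approach; the field names inside the JSON are not guaranteed, so inspect the decoded array yourself to find the market-cap value:

<?php
// Fetch the JSON feed and guard against it disappearing (403/404).
$url = 'https://files.coinmarketcap.com/generated/stats/global.json';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$body = curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

if ($body === false || $status !== 200) {
    die("Feed unavailable (HTTP $status); they may have moved or blocked it.");
}

$stats = json_decode($body, true);
if ($stats === null) {
    die('Response was not valid JSON.');
}

print_r($stats); // locate the market-cap field in here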
How did I find this:
When the page loads, the header with the information fills in after document ready, so it cannot have been rendered by the server and has to be loaded via AJAX.
Now that we know it is AJAX, we want to know which file. You do this by opening your browser's developer tools. All browsers have a Network tab showing every resource being loaded. When you filter by XHR you see all AJAX requests; then you look for the right one.

Cookies in iframe page loaded with cURL

On http://aegon.nl you'll get an opt-in which only displays when you haven't got the right cookie set for the website.
Now I'm trying to load the website in an iframe with an injected file (to do some fancy stuff with JavaScript). To be able to do that, I need to cURL the source code and change the base href to "aegon.nl" in order to keep the same layout (the URLs to the stylesheets and such).
Now, been there and done that. But I've got a problem with the cookies: I can click the button in the opt-in as many times as I want, but it keeps coming back. I only have this problem after cURLing the content and changing the base href.
After searching the sourcecode I think I found the problem:
// If the hostname contains a dot, the opt-in cookie is hard-coded to the
// domain 'aegon.nl', which a copy served from another host cannot set.
var dom = (window.location.hostname.indexOf(".") > 0) ? 'aegon.nl' : window.location.hostname;
var AEGONCookieSettings = new AEGON.CookieOptIn(dom);
The problem can probably be solved by changing this part of the source code, but that doesn't solve it for other websites using the same kind of opt-in.
Does anyone have a solution for this?
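One hedged sketch of the change hinted at above: after cURLing the page, rewrite the hard-coded cookie domain to the host that actually serves the copy (the string replacement is illustrative; every opt-in script hard-codes its domain differently):

<?php
// Sketch: fetch the page, inject the base href, and rewrite the hard-coded
// cookie domain so document.cookie writes are no longer rejected.
$ch = curl_init('http://www.aegon.nl/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$html = curl_exec($ch);
curl_close($ch);

// Keep stylesheets and scripts pointing at the original site.
$html = preg_replace('~<head>~i', '<head><base href="http://www.aegon.nl/">', $html, 1);

// Replace the hard-coded cookie domain with the host serving this copy.
$html = str_replace("'aegon.nl'", "'" . $_SERVER['HTTP_HOST'] . "'", $html);

echo $html;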

CodeIgniter - Is it possible to change the URL when an AJAX action is performed?

I have been working with AJAX for quite a while, and I know about the hash and hashbang techniques. Recently I started using the CodeIgniter framework. Let's say I am on a page, http://domain.com/media, and there is a link called 'audio'.
When I click on this audio link, I send an AJAX request, fetch the contents, and display them dynamically. But in that case the URL still remains the same, i.e. http://domain.com/media.
Now, is it possible to change the URL to http://domain.com/media/audio when I click on the audio link, while still keeping the AJAX functionality? I don't want to do it the usual CodeIgniter way (/controller/method).
Any help would be appreciated. Thanks in advance.
If it is not possible in CodeIgniter, is it possible in any other framework?
You can use history.pushState to achieve this.
Here's an example from my own site, which uses it for browsing gallery images; no reloading occurs. I've also given each gallery image its own loadable URL, so that you can share the URL with anyone without problems. The fallback is that if pushState is not supported, you simply browse as you always would.
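A minimal sketch of the pattern (the selectors and URLs are placeholders): intercept the click, load the content via AJAX, then push the new URL into the history.

// '#audio-link', '#content' and '/media/audio' are placeholders.
$('#audio-link').on('click', function (e) {
    e.preventDefault();
    $('#content').load('/media/audio', function () {
        if (window.history && history.pushState) {
            history.pushState({ page: 'audio' }, '', '/media/audio');
        }
        // If pushState is unsupported, the URL simply stays /media.
    });
});

// Restore the content when the user presses Back/Forward.
window.onpopstate = function (event) {
    if (event.state && event.state.page === 'audio') {
        $('#content').load('/media/audio');
    }
};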

How to make sure a page appears once the iFrame on the page has loaded first

I'm pretty new to the world of web development and web design in PHP, but I was hoping that someone could help me with my question.
I am loading a particular page on my PHP-powered website which embeds an iframe from another page in its center. I was wondering if anyone knows how to make the whole page appear only once the iframe has loaded. It's only a matter of a further one or two seconds before the iframe loads, but this needs to look as professional as possible.
I'd appreciate any help. I'd also like to assure you that although this is my first question, I know how Stack Overflow works, as one of my best friends is a contributor here and he's always telling me how annoying it is when people don't respond or give points to good answers; I assure you that I will do that.
Thanks again.
Steve.
Use jQuery. You won't be able to do this in PHP, as it's a client-side issue.
// Append an iframe, then fire the callback once it has finished loading.
// Note: attach the load handler before setting src, or a fast-loading
// frame could fire the event before the handler is in place.
function callIframe(url, callback) {
    var $iframe = $('<iframe id="myId"></iframe>').appendTo(document.body);
    $iframe.on('load', function () {
        callback(this);
    });
    $iframe.attr('src', url);
}
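A hedged usage sketch: hide the main content until the callback fires ('#page' and the URL are placeholders for your own markup).

// Keep the page hidden until the iframe is ready, then reveal it.
$('#page').hide();
callIframe('http://example.com/embedded-page', function () {
    $('#page').show();
});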
See:
jQuery .ready in a dynamically inserted iframe

Using YQL in javascript/php to scrape article html?

I'm new to YQL, and just trying to learn how to do some fairly simple tasks.
Let's say I have a list of URLs and I want to get their HTML source as a string in JavaScript (so I can later insert it into a database via AJAX). How would I go about getting this info back in JavaScript? Or would I have to do it in PHP? I'm fine with either, really; whatever works.
Here's an example query I'd run on their console:
select * from html where url="http://en.wikipedia.org/wiki/Baroque_music"
And the goal is essentially to save the HTML, or maybe just the text, as a string.
How would I go about doing this? I somewhat understand how the querying works, but not really how to integrate it with JavaScript and/or PHP (say I have a list of URLs and I want to loop through them, getting the HTML at each one and saving it somewhere).
Thanks.
You can't read other pages with JavaScript, due to a built-in security feature in web browsers called the same-origin policy.
The usual method is to scrape the content of these sites from the server using PHP.
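A hedged PHP sketch of that loop, fetching each page's HTML through YQL's public REST endpoint as it existed at the time (the URL list and the storage step are placeholders):

<?php
// Loop over the URLs server-side and fetch each one's HTML via YQL.
$urls = array(
    'http://en.wikipedia.org/wiki/Baroque_music',
    // ...more URLs...
);

foreach ($urls as $url) {
    $yql = 'select * from html where url="' . $url . '"';
    $endpoint = 'https://query.yahooapis.com/v1/public/yql?q=' . urlencode($yql) . '&format=json';

    $response = file_get_contents($endpoint);
    if ($response === false) {
        continue; // endpoint unreachable, skip this URL
    }

    $data = json_decode($response, true);
    // The scraped markup sits under query->results; save it to your database here.
    var_dump($data['query']['results']);
}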
There is another option with JavaScript, called a bookmarklet.
You can add the bookmarklet to your bookmarks bar, and each time you want the content of a site, click the bookmark.
A script will be loaded into the host page; it can read the content and post it back to your server.
Oddly enough, the same-origin policy does not prevent you from POSTing data from this host page to your domain. You need to POST a form to an iframe whose source is hosted on your domain.
You won't be able to read the response you get back from the POST.
But you can poll with setInterval, making a JSONP call to your domain, to find out whether the POST was successful.
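A rough sketch of that form-to-iframe trick in plain JavaScript (every URL, id, and callback name below is hypothetical):

// POST the host page's HTML to your own domain through a hidden iframe,
// then poll with JSONP to learn whether the save succeeded.
(function () {
    var iframe = document.createElement('iframe');
    iframe.name = 'saveTarget';
    iframe.style.display = 'none';
    document.body.appendChild(iframe);

    var form = document.createElement('form');
    form.method = 'POST';
    form.action = 'https://your-domain.example/save.php';
    form.target = 'saveTarget'; // response lands in the iframe; we cannot read it

    var field = document.createElement('input');
    field.type = 'hidden';
    field.name = 'html';
    field.value = document.documentElement.outerHTML;
    form.appendChild(field);

    document.body.appendChild(form);
    form.submit();

    // Poll your domain via JSONP; the server is assumed to wrap its answer
    // in a call to the 'saved' callback below.
    var timer = setInterval(function () {
        var s = document.createElement('script');
        s.src = 'https://your-domain.example/status.php?callback=saved';
        document.body.appendChild(s);
    }, 2000);

    window.saved = function (ok) {
        if (ok) {
            clearInterval(timer);
            alert('Page saved!');
        }
    };
})();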
