I have looked all over Google and the rest of the internet for how to perform in-page A/B testing.
What I am trying to do is perform A/B testing on a single page, but that page's content varies based on the referring URL, implemented through a PHP include.
Say, if you come from Google, it displays 'Hey, are you new here?!', while if you come from another page on our website it displays 'Let's get you started'. The goal is then to see which variation leads to a longer visit duration.
Does anyone know how to do this through Google Analytics, Optimizely, or any other analytics tool?
Ryan,
I believe this shouldn't be too difficult to do... it just depends on the tools you use :-).
Personally, I can recommend Visual Website Optimizer, which allows you to target a running test to a specific segment based on various conditions; referring URL is one of them.
However, you can then use only one variation of the page, so if you have more segments that you would like to test, you would need to:
Duplicate the test itself,
Change the copy in the variation,
Set up the segment rules according to your needs,
Repeat this procedure for every segment.
I have done this, but can't say it had much impact. It was too much work, and I personally prefer segmenting based on customer data (new vs. existing customers, etc.), where you can see much more impact; it is also then "easier" to report on, since the differences are quite noticeable.
Hope this helps!
You should be able to set this up in any modern A/B testing tool on the market. Here’s how to do it in Optimizely:
Create a new experiment and go to Options and Targeting. Select the page that the experiment should run on and select the referrers it should run for in Visitor Conditions:
Make sure that the messaging isn't displayed for the Original / Control / A in the experiment.
For the Variation / B, use the visual editor to add an element with the messaging, or select an existing element on the page to change its text. You can also write JavaScript code to insert the element (via 'Edit Code').
If you want to display different messages for different referrers, click on the 'Edit Code' ribbon in Optimizely and wrap the JavaScript in if/else clauses for each referrer (and create a backup message), like so:
if (document.referrer.match(/^https?:\/\/([^\/]+\.)?reddit\.com(\/|$)/i)) {
  $('#welcome-message').text('Hi redditor!');
} else if (document.referrer.match(/^https?:\/\/([^\/]+\.)?huffingtonpost\.com(\/|$)/i)) {
  $('#welcome-message').text('Hi Huffington Post reader!');
} else {
  // backup message for any other referrer
  $('#welcome-message').text('Hi! I’m a backup message, just in case!');
}
Select Options and Analytics Integration. Enable Google Analytics.
Start the experiment. Within a few minutes you should see the first results in Optimizely. In a few hours, results will be available in Google Analytics, where you can drill down to see how things like visit duration, pageviews per visit, and bounce rate changed, based on custom segments (that's how the variations are displayed in Google Analytics).
I have a webpage I'm trying to promote via ad banners. I want to associate a UTM code with those links, so when a visitor lands on my website, I'll be able to track where they came from (mysite.com?utm_campaign=adXYZ).
Normally these ad banners lead to a single webpage with a single point of conversion, where I'm able to capture the utm_campaign ID and gauge how effective my marketing is. However, I'm now leading users to a full website with many pages and many points of conversion. I'm hoping to keep that utm_campaign ID across multiple pages using some crafty JS or PHP.
For example:
user clicks ad banner to mysite.com?utm_campaign=adXYZ
user lands on mysite.com but wants to go to mysite.com/features
user goes from mysite.com/features to mysite.com/pricing
By the time the user reaches mysite.com/pricing, I want ?utm_campaign=adXYZ to still be there in the URL.
I know there are ways via analytics and the like to track a session/conversion, but I specifically need to capture the referral UTM code in an HTML form down the road. Can anyone point me in the right direction? Thanks a bunch!
Edit: An important point to note. The site should still be accessible organically via search, bookmarks, links, etc. without the trailing campaign ID in the URL. Only when a user arrives from an ad banner should the campaign ID be there, on the landing page and all subsequent pages.
I would set a cookie containing the relevant information the first time the user enters your website.
Otherwise, you have to pass the information along again with every request (GET/POST). That solution works even if the user doesn't allow cookies. Murat Cem YALIN wrote about how this works in detail. But if you want to use the JS method, be aware that the user must have JavaScript enabled!
The third option might be using PHP sessions.
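A minimal sketch of the cookie approach in PHP (the cookie lifetime and field names here are just examples):
<?php
// First hit: if the landing URL carries a campaign ID, remember it for 30 days
if (isset($_GET['utm_campaign'])) {
    setcookie('utm_campaign', $_GET['utm_campaign'], time() + 30 * 24 * 3600, '/');
    $_COOKIE['utm_campaign'] = $_GET['utm_campaign']; // also available on this request
}

// Later, in the conversion form: echo the stored ID into a hidden field
$campaign = isset($_COOKIE['utm_campaign']) ? $_COOKIE['utm_campaign'] : '';
echo '<input type="hidden" name="utm_campaign" value="'
   . htmlspecialchars($campaign) . '">';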
You can do it with both PHP and JS.
In PHP, use Simple HTML DOM (http://simplehtmldom.sourceforge.net/) to access all links in the HTML output and append ?utm_campaign=adXYZ to each of them just before outputting the rendered HTML.
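A rough sketch of that idea, assuming $html holds the rendered page as a string:
<?php
require_once 'simple_html_dom.php';

$dom = str_get_html($html);
foreach ($dom->find('a') as $link) {
    // use '&' when the link already carries a query string
    $sep = (strpos($link->href, '?') === false) ? '?' : '&';
    $link->href .= $sep . 'utm_campaign=adXYZ';
}
echo $dom; // output the modified HTML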
In JS, you use jQuery to do the same once the document has loaded, e.g.:
$("a").each(function () {
    var href = $(this).attr("href") || "";
    // use '&' when the URL already has a query string
    var sep = href.indexOf("?") === -1 ? "?" : "&";
    $(this).attr("href", href + sep + "utm_campaign=adXYZ");
});
I am about to create an online shopping site for one of my clients. I have to make this site SEO-friendly, and therefore I need to understand a few things before I proceed to build a custom CMS-based website.
As I said, I am going to make a custom CMS-based website so that my client will be able to add new content through the CMS, but there are a few things I don't understand.
For example: I have an index.php page which has many links to different products, and all of these links are created from the database using PHP. A site link looks like:
http://www.def.com/shoes/Men-Shoes
My Questions:
1) I want to know whether, when Googlebot crawls my site, it will also follow my dynamically created links and index them. Will Googlebot also index the content behind my dynamic links?
2) Do I have to create separate pages for all of the products on the site and store them on my server, or is a single page enough, which serves content dynamically according to the user's query for every product?
I read this:
"It functions much like your web browser, by sending a request to a web server for a web page, downloading the entire page, then handing it off to Google’s indexer."
Is that right?
My query URL above actually looked like this, and I used an .htaccess file to make it pretty:
http://www.def.com/shoes.php?type=Men-Shoes
So is that correct, and will Google crawl and index it?
SEO is a complex science in itself, and Google is always moving the goal posts and modifying its algorithm.
While you don't need to create separate pages for each product, creating friendly URLs using the .htaccess file can make them look better and easier to navigate. Also, creating a sitemap and submitting it to Google via their Webmaster Tools will help them know which pages to index.
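For example, a hypothetical mod_rewrite rule in .htaccess that maps the pretty URL from the question onto the dynamic script (adjust the pattern to your actual URLs):
RewriteEngine On
# Map /shoes/Men-Shoes to shoes.php?type=Men-Shoes
RewriteRule ^shoes/([^/]+)/?$ shoes.php?type=$1 [L,QSA]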
Googlebot will follow the links in your site, including dynamically created ones, but it is important not to try to game the system using blackhat methods if long-term success is your aim.
Also, use social media (Twitter, Facebook, Google+) to help promote your brand, and make sure you follow Google's guidelines with regards to SEO and in-page optimisation.
There is a huge amount of information on the internet on this subject, but be careful what advice you follow.
Google and other search engines index dynamic links too. One way to avoid duplicate content is to use the "Crawl" -> "URL Parameters" tool in Google Webmaster Tools. You can read more about how it works here: https://support.google.com/webmasters/answer/6080548?rd=1. Set the "Crawl" field to "No URLs". This way you can hide dynamic links from search, but you need a list of all the dynamic links on your website/CMS so that you don't accidentally hide important content. The "URL Parameters" feature is also available in Bing Webmaster Tools: http://searchenginewatch.com/sew/how-to/2195777/bing-webmaster-tools-an-overview#.
I update my website content daily, around 15 to 20 new pages.
From my webmaster account, I "Fetch as Google" for each page, which is a long process.
Is there a way to automate this with PHP?
Can PHP "auto-submit" my new pages for me (the new links are in a MySQL database) to "Fetch as Google"?
Please help.
Thank you.
Doing that "by hand" is the wrong approach.
I will cite another answer:
I would say it is not a preferred way to alert Google when you have a new page, and it is pretty limited. What is better, and frankly more effective, is to do things like:
add the page to your XML sitemap (make sure the sitemap is submitted to Google)
add the page to your RSS feeds (make sure your RSS is submitted to Google)
add a link to the page on your home page or another "important" page on your site
tweet about your new page
post a status update on FB about your new page
Google Plus your new page
feature your new page in your email newsletter
Obviously, depending on the page, you may not be able to do all of these, but normally Google will pick up new pages in your sitemap. I find that G hits my sitemaps almost daily (your mileage may vary).
I only use fetch if I am trying to diagnose a problem on a specific page, and even then I may just fetch but not submit. I have only submitted when there was some major issue with a page that I could not wait for Google to update as part of its regular crawl of my site. As an example, we had a release go out with a new section, and that section was blocked by our robots.txt. I went ahead and submitted the robots.txt to encourage Google to update the page sooner so that our new section would be "live" to Google sooner, as G does not hit our robots.txt as often. Otherwise, for 99.5% of the other pages on my sites, the options above work well.
The other thing is that you get very few fetches a month, so you are still very limited in what you can do. Your sitemaps can include thousands of pages each. Google Fetch is limited, so that is another reason I reserve it for my time-sensitive emergencies.
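Since your new links are already in MySQL, the sitemap route above is easy to automate; here is a hedged PHP sketch (the table and column names, pages/url/updated_at, and the sitemap URL are assumptions):
<?php
$pdo = new PDO('mysql:host=localhost;dbname=mysite', 'user', 'pass');

$xml  = '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
$xml .= '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
foreach ($pdo->query('SELECT url, updated_at FROM pages') as $row) {
    $xml .= "  <url>\n";
    $xml .= '    <loc>' . htmlspecialchars($row['url']) . "</loc>\n";
    $xml .= '    <lastmod>' . date('Y-m-d', strtotime($row['updated_at'])) . "</lastmod>\n";
    $xml .= "  </url>\n";
}
$xml .= '</urlset>';
file_put_contents('sitemap.xml', $xml);

// Optionally ping Google so it re-reads the sitemap sooner
file_get_contents('http://www.google.com/ping?sitemap=' . urlencode('http://example.com/sitemap.xml'));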
There is a site that does not have a standalone search or API product available at this time, and I need a PHP page which can get the information another way. The site has a manual search function for pulling data. Is there a way for my page to go to the site, fill in the search parameters that I need, and pull the results into my page? Maybe there's a better way?
Example:
On realestate.com.au, I want a user to enter the same search fields in my PHP page, which then goes to realestate.com.au and plugs those search parameters into that site. As the results of the search come through (e.g., 3 bedrooms, 2 bathrooms in the suburb of Marylands), my page gets the price of all the results, calculates the median price, and outputs that figure to the user on my screen.
Now, I have no code for this, as I don't even know how to start. I can do most of it if I can find a way to start by having PHP go to realestate.com.au (not redirecting the user there, just hooking into the website) and doing the search.
I hope I make sense.
Regards
This is somewhat complex... more than I can fully cover here, but here are the general steps involved:
1) Build your URL
From your target site, find out what the URL looks like on the search results page. That should tell you how to build your own URL later on.
2) Use cURL to retrieve your custom results page
Using the format that you saw above, send a request for that page using cURL, substituting your visitor's search terms into the URL.
3) Scrape the resulting page
cURL will fetch the HTML from the search results page. You will then have to iterate through that result and pull out the needed data using div IDs and classes.
4) Serve the final results to your visitor
Drop the scraped data into your own page and send it to the browser.
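A hedged sketch of steps 2 and 3 (the URL and the listing-price class are placeholders, not the real realestate.com.au markup):
<?php
$url = 'http://www.example.com/search?bedrooms=3&bathrooms=2&suburb=Marylands';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$html = curl_exec($ch);
curl_close($ch);

// Parse the HTML and collect prices from a hypothetical .listing-price element
$doc = new DOMDocument();
@$doc->loadHTML($html);
$xpath = new DOMXPath($doc);
$prices = array();
foreach ($xpath->query('//*[contains(@class, "listing-price")]') as $node) {
    $prices[] = (float) preg_replace('/[^0-9.]/', '', $node->nodeValue);
}

// Median of the collected prices
sort($prices);
$n = count($prices);
$median = $n ? ($n % 2 ? $prices[($n - 1) / 2]
                       : ($prices[$n / 2 - 1] + $prices[$n / 2]) / 2) : 0;
echo 'Median price: ' . $median;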
See also:
http://voices.yahoo.com/how-scrape-web-page-using-php-curl-5442987.html
NOTE: I would make sure you have permission to scrape the site.
An application I am involved with is in dire need of a restructuring of its reports section... I am open to suggestions. All the development is currently in PHP (an nginx/PHP/Linux/MySQL/Redis environment), although other suggestions which fit into the environment are welcome.
There is already ongoing logging in the current system, which feeds into MySQL tables. All tables have basically the same structure, with different things logged under different log types.
There are a couple of different metrics/actions we'd like to report on, and we want users to be able to drill down, by date or other filters.
Example metrics:
User searches a topic. I log their user ID, the search keyword, and each result ID.
User accesses an item on the system (either from a search result above, or from my main page - I have separate logs for both). I log (currently) the "ID" of the page, the unique ID of the user (all users have an ID), the time, and the Category of the page.
User submits a request for an item. I log the ID of the request (a new ID), the unique ID of the user, and the Category of the report.
List all users who clicked on item X.
etc
Can someone give me some opinions on whether I would be able to leverage the existing functionality in Piwik (www.piwik.org) or Open Web Analytics (http://demo.openwebanalytics.com) to build an easy-to-use dashboard and reporting tool? The idea is that we already have most, if not all, of the queries to insert the data and to select it for the metrics above. What we need is a uniform way of displaying the data, where the user can view different reports in a consistent format, etc...
Filtering by category, where we have a category ID, would also be necessary. Category is a hierarchical tree, and picking a parent node means we basically list all child nodes and build an IN (x,x,x) clause with all of the child IDs, as in the sketch below (we are investigating changing to linear tree traversal, but that's for another discussion...).
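A rough sketch of that child-ID collection in PHP (the categories table with id/parent_id columns is an assumption):
<?php
// Recursively collect a category ID and all of its descendants
function descendantIds(PDO $pdo, $parentId) {
    $ids = array($parentId);
    $stmt = $pdo->prepare('SELECT id FROM categories WHERE parent_id = ?');
    $stmt->execute(array($parentId));
    foreach ($stmt->fetchAll(PDO::FETCH_COLUMN) as $childId) {
        $ids = array_merge($ids, descendantIds($pdo, $childId));
    }
    return $ids;
}

$ids = array_map('intval', descendantIds($pdo, $selectedCategoryId));
$sql = 'SELECT * FROM logs WHERE category_id IN (' . implode(',', $ids) . ')';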
Basically, once again, sorry if this has become confusing: for those who have experience with Piwik/OWA/other web analytics frameworks, have you used them to deliver custom metrics from custom applications, not related directly to webpage viewing?
If so, could you share examples?
Also, are there any reasons to favor Piwik or OWA? OWA seems to have some nice features which we could maybe add in the future, like heatmaps and recordings, but the main focus right now is the custom metrics, so the web metrics side would be disabled at first...
Thanks for the help...
Using Piwik with a mix of Custom Variables and Segmentation should cover your requirements; for example:
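A hedged sketch using Piwik's PHP tracking client (PiwikTracker.php ships with Piwik; the site ID, URLs, and the custom-variable slot are assumptions):
<?php
require_once 'PiwikTracker.php';

$tracker = new PiwikTracker($idSite = 1, 'http://piwik.example.com/');
$tracker->setUserId($userId);                        // your internal user ID
$tracker->setCustomVariable(1, 'Category', $categoryId, 'page');

// Log a search, with the keyword and the number of results
$tracker->doTrackSiteSearch($keyword, 'topics', $resultCount);

// Log an item access or request as a page view
$tracker->setUrl('http://mysite.example/item/' . $itemId);
$tracker->doTrackPageView('Item ' . $itemId);
Piwik's own reports (Visitors, Site Search, segments over custom variables) then give you the drill-down UI, so the remaining work is mapping each logged action onto a tracked page view or search.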