I have an ecommerce system and a blog, each running independently on its own server. The ecommerce server reverse proxies the blog under /blog.
This is all working fine. My question is: what is the best way to share the header element from one server to the other, bearing in mind that the header contains dynamic elements, namely the basket count and the username when logged in?
The only solution I came up with is to grab the header via cURL or the like and cache it periodically, then enhance it with the session data that is shared across the servers, i.e. the basket count and username.
Thanks in advance, hope this makes sense
Andrew
Since you're using a single server as a reverse proxy, you can perform AJAX requests against that single site without worrying about any of the cross-site policy shenanigans.
Let's say you have the following:
/blog -> Your blog server
/notblog -> Your ecommerce server.
From a page served under /blog you can fetch a URL like /notblog/basket.php with an AJAX request and update the relevant parts of your header with the returned information.
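For illustration, a minimal sketch of what /notblog/basket.php might look like, assuming the session is shared across both servers; the session key and JSON field names are made up for the sketch:
<?php
// basket.php: returns the dynamic header data as JSON for the AJAX call.
// Assumes a shared session; 'basket_count' and 'username' are
// hypothetical key names.
session_start();
header('Content-Type: application/json');
echo json_encode(array(
    'basketCount' => isset($_SESSION['basket_count']) ? $_SESSION['basket_count'] : 0,
    'username'    => isset($_SESSION['username']) ? $_SESSION['username'] : null,
));
?>
The blog page would request this URL after load and fill the header placeholders with the returned values.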
Alternatively (a far cruder solution), you could include /notblog/basket.php as a script file and dynamically generate JavaScript that simply prints your header contents (remember to add headers to prevent it from being cached!).
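A rough sketch of that cruder approach; the file name header-js.php and the session keys are hypothetical, and the page would pull it in with a plain script tag pointing at /notblog/header-js.php:
<?php
// header-js.php: emits JavaScript that prints the dynamic header parts.
session_start();
header('Content-Type: application/javascript');
// Prevent caching so the basket count and username stay current.
header('Cache-Control: no-store, no-cache, must-revalidate');
header('Pragma: no-cache');
$count = isset($_SESSION['basket_count']) ? (int) $_SESSION['basket_count'] : 0;
$user  = isset($_SESSION['username']) ? $_SESSION['username'] : 'guest';
// json_encode turns the markup into a valid JS string literal.
echo 'document.write(' . json_encode("<span>$user | basket: $count</span>") . ');';
?>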
Both of these require an extra request to your site, but so would your proposed solution of using cURL.
I am using an API that is pretty expensive: each call costs about 1 cent. I noticed that visits from spiders and crawlers generate thousands of calls to that API, and I am being charged for them. Is there a way to block the section of the webpage that shows content generated by the API, so that only actual visitors see it and no API calls are generated when the page is being crawled?
You could make the API call on the front end instead of doing it server-side. For example, during page load, issue an AJAX request to your server, which will call the API and return the data.
Presumably the spiders and crawlers just parse the source code and do not execute JavaScript, so they will never trigger the AJAX request and you will not be charged. However, if some of your visitors have JavaScript disabled, you should provide them with a way to get the results anyway.
Apart from this, if you want to reduce your costs you could implement a caching layer so that you do not make the same API call multiple times in a row. Set the caching time according to how fresh the data needs to be.
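As a rough sketch combining both ideas, here is a hypothetical PHP endpoint the AJAX request could hit; it proxies the paid API and caches the response on disk so repeated requests inside the cache window cost nothing. The API URL, cache path and TTL are all placeholders:
<?php
// api-proxy.php: called via AJAX from the page; proxies the paid API
// and caches the response to avoid repeated charges.
// Assumes the cache/ directory exists and is writable.
$cacheFile = __DIR__ . '/cache/api-response.json';
$cacheTtl  = 600; // acceptable staleness in seconds; tune to your data

if (file_exists($cacheFile) && time() - filemtime($cacheFile) < $cacheTtl) {
    $response = file_get_contents($cacheFile);
} else {
    $response = file_get_contents('https://api.example.com/data'); // placeholder URL
    if ($response !== false) {
        file_put_contents($cacheFile, $response);
    }
}

header('Content-Type: application/json');
echo $response;
?>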
There are many methods to prevent crawlers from crawling your site or specific pages. The problem is that you need to define which kinds of crawlers you want to block, as there are many types of them. As a starting point, Google and Bing do not respect the robots.txt crawl-delay setting (you can change their crawl rate manually in their webmaster dashboards).
Since you mentioned you are working with PHP: if you are using Apache, you can look at the Apache access log, which records every request Apache receives. Analyze the log files and you can see which crawlers are generating the traffic you are talking about. Once you know which crawlers produce the heavy traffic, you can block them with an .htaccess file, redirecting requests from specific IP addresses or user agents to a 403 error or any other output you like.
I figured this out, but I am still looking for better ideas:
<?php
// Serve the page without the API call to known crawlers, keyed off the
// User-Agent header (which may be absent, so default to an empty string).
$userAgent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
if (preg_match('/slurp|inktomisearch|[Gg]rub|[Bb]ot|archiver|[Ss]qworm/', $userAgent)) {
    include("no-api-call.php");
} else {
    include("yes-api-call.php");
}
?>
I'm building a Shopify website and trying to create a simple "Contact Us" page that allows the user to POST their comments (e.g. name, email, comments). The Shopify website is hosted (as all Shopify accounts are), but I can set up a CNAME from my domain pointing to the Shopify-hosted pages (that way I have a vanity URL).
I'm wondering, will this enable me to POST directly from the Shopify hosted pages to a script on my server?
Example:
The Shopify pages are at: http://myawesomestore.shopify.com
On my contact-us page: http://myawesomestore.shopify.com/pages/contact-us/
I want to POST to a script on my domain (where I can store in a database): www.my-domain.com/contact-us.php
If I cannot do this, what is the best solution for posting from a hosted solution to an owned solution? (I.e. I cannot set up a proxy on their domain to POST to mine.)
I hope this makes sense, I'm still very much a novice and there are just too many fundamentals here to comprehend before I could logically build this solution myself.
Thank you all so much in advance!
Cheers,
Rob
POSTing a form to any URL on the internet is never a problem. It doesn't matter whether it's on the same domain or a different one.
You can set up a contact form on your Shopify store and have it send a POST request to any URL where your custom application is listening.
So the short answer is: This is absolutely no issue, just go ahead and add the form to your shop!
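For example, the Shopify page's form would point its action at your script, and the receiving side might look roughly like this (the database schema and connection details are made up for the sketch):
<?php
// contact-us.php on www.my-domain.com: receives the cross-domain POST
// from the Shopify-hosted form, e.g.
//   <form action="http://www.my-domain.com/contact-us.php" method="post">
$name    = isset($_POST['name']) ? $_POST['name'] : '';
$email   = isset($_POST['email']) ? $_POST['email'] : '';
$comment = isset($_POST['comments']) ? $_POST['comments'] : '';

// Store via PDO; credentials and table layout are placeholders.
$db = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');
$stmt = $db->prepare('INSERT INTO contacts (name, email, comments) VALUES (?, ?, ?)');
$stmt->execute(array($name, $email, $comment));

// Send the visitor back to the shop afterwards.
header('Location: http://myawesomestore.shopify.com/pages/contact-us/?sent=1');
?>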
Some confusion on this topic comes from "cross-domain AJAX". If you send your POST requests via JavaScript, then yes, it is only possible when the target URL is on the same domain as the page sending the request. See also http://snook.ca/archives/javascript/cross_domain_aj
Hope this clears things up!
No, you can't just use a CNAME in most cases. A CNAME only aliases the DNS name; the request still reaches the target server with your domain in the Host header, and if that server uses name-based virtual hosting, it uses the host name to decide how to handle the request. If they have it set up so the server answers only one site, without a virtual host, this might work, but you don't really want your script to depend on their server setup.
If you need it to look like your URL, I'd just put it inside a frame.
A form can POST to any destination on the internet that the user's browser has access to.
You don't need a CNAME; if you can set up an HTML form, you can POST it to any URL on the net.
...
POST is a standard HTTP method; you can't really have a web server that doesn't accept POST. It's up to the site being posted to to check the source of the POST and decide what to do with the data.
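For instance, the receiving script might sanity-check the Referer header before accepting the data; a rough sketch (note the Referer can be spoofed or missing, so treat this as a filter, not real security):
<?php
// Only accept POSTs that appear to come from the hosted store.
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
if ($_SERVER['REQUEST_METHOD'] !== 'POST'
        || strpos($referer, 'http://myawesomestore.shopify.com/') !== 0) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}
// ...otherwise handle $_POST as usual.
?>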
I have a secure site with private information that uses HTTPS. We have a partnership with another site that provides functionality for our users. We want the header and footer to be the same, but the body functionality to come from their site. I thought I'd create a template file that they can request from our server, which would let me keep creative control whenever our site changes.
However, the header has account information, so it needs to access the session information for the current user. So number one, is this possible? If a user clicks from my site to theirs and they request the template from our servers, how can it be sure to connect to the correct session? And number two, is this safe? How can I be sure this connection is secure?
Edit: It appears this option is not worth pursuing. I'm going to work on some other ways for the other server to access the information. Thanks.
What you are trying to do that way is a complete mess.
You should avoid outputting a page built from pieces of different websites glued together.
That would be:
hard to maintain;
prone to security holes.
If you want the file on your site to be a template, it should be only that. Have the other site add the information to the header after fetching it.
I think for what you are doing, the only option you have is to route every request through your server, acting as a proxy, to maintain session variables without opening too many security holes.
Two years ago I had to design a system to share authentication data across multiple domains, all of which shared the same server/DB. I was able to pull this off with a complex system of cookie sharing which, to date, still works.
I'm now in the process of redesigning the system and I was wondering if there are better ways to achieve this without having to write cross domain cookies.
Basically, the system MUST do this:
Once logged in on one site, the user must be logged in on all of the other sites seamlessly, not only when following a link but even when typing the domain name directly into the address bar.
To my knowledge the only way to achieve this is cross-domain cookies; if there are alternatives, please tell me.
Thank you very much
My idea would be to include a login JavaScript file from a third domain, included on all sites. This JavaScript sets and reads the session cookie and passes the result to the current domain's server via AJAX. (No validation should be done in the JS; it simply sets and reads the cookie.)
If cross-domain AJAX does not work, you can still call the third domain's server, which acts as a proxy and calls the current domain's server.
The StackOverflow sites have implemented something similar to this. Check out the details at the following links.
Here is a post giving an outline of how they did it.
And here is even more detail.
For this you do have to use cookies, but you can vary what you store in the cookie. The cookie doesn't have to contain user credentials but can instead contain something more like a token that you use to "centralize" your sessions.
The easiest way would be to let all hosts share a single memcached server and use the content of the user's cookie as your key.
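A minimal sketch of that setup, assuming each domain stores the same opaque token in its own cookie and the token (never the credentials) is the memcached key; the host, cookie and key names are hypothetical:
<?php
// Shared-session lookup against a common memcached server.
$mc = new Memcached();
$mc->addServer('memcached.internal.example', 11211); // shared server (placeholder host)

$token   = isset($_COOKIE['auth_token']) ? $_COOKIE['auth_token'] : null;
$session = $token ? $mc->get('session:' . $token) : false;

if ($session === false) {
    // No shared session: treat the visitor as logged out on this domain.
} else {
    // $session holds the shared data (user id, name, ...) for any domain.
}
?>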
The company I work for has four domains, and I'm trying to set up cookies so one cookie can be generated and tracked across all of them. From reading various posts on here, I thought this was possible.
I've set up a subdomain on one site to serve a cookie and a 1x1 pixel image to all four sites.
But I can't get this working on the other sites.
If anyone can clarify:
Is it possible?
Am I missing something obvious? A link to a good example would help too.
I'm trying to do this server side with PHP.
Thanks
Are you having issues due to Internet Explorer and its privacy handling?
Session variables are lost if you use FRAMESET in Internet Explorer 6
Back in my former internet days, when IE6 first came out, we had to implement this because it broke some of our tracking. It's amazing that all you have to do is fake it, and everything works fine.
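The "fake it" part usually comes down to sending a compact P3P policy header along with the cookie so IE accepts it in a third-party context; a minimal sketch, where the policy tokens are an arbitrary example rather than a reviewed privacy policy:
<?php
// Send a compact P3P policy so IE6 accepts the cookie set by the
// third-party tracking pixel, then set the tracking cookie itself.
header('P3P: CP="CAO PSA OUR"');
setcookie('tracker', uniqid('', true), time() + 86400 * 365, '/');
// ...then output the 1x1 transparent GIF.
?>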
You're on the right track. We had a domain that hosted the tracking CGI, which served the 1x1 transparent pixel and tracked what page a user was visiting. We then had a custom parser that would combine that data with the Apache logs and dynamically create a graph of users' traffic patterns through our website. This was done using dot from the Graphviz package.
This kind of thing is pretty easy if you are just trying to do stats, but if you're actually trying to persist user data across domains you'll have to do something more complicated.
The best way to set a cross-domain cookie is to make sure all your sites are subdomains of one master domain, say initech.com. Then one of your sites, site1.initech.com, sets the cookie with a domain of ".initech.com" and it works fine.
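In PHP that is just a matter of widening the cookie's domain parameter, e.g.:
<?php
// Set on site1.initech.com, but sent to every *.initech.com host.
$token = 'some-opaque-session-token'; // placeholder value
setcookie('shared_session', $token, time() + 3600, '/', '.initech.com');
?>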
It could be a problem if your sites are on totally different domains though.
Rather than trying to set one cookie that every site can access, you'll have to make sure each site gets its own exact duplicate of the original cookie. So have your site, site1.com, set the cookie for itself and output three 1x1 GIFs (or AJAX calls, or whatever) to site2.com, site3.com and site4.com, setting the same cookie to the same value.
This will be difficult to do securely and reliably ;)
To make sure somebody can't set arbitrary cookies on your domain, you'll have to pass along a hash of the cookie value with the image tag. If the cookie to be set is "mycookieval", also pass along md5("mycookieval"."somesecretstring".$_SERVER['REMOTE_ADDR']). This is still imperfect, because it might allow an attacker to replay the same cookie for the same IP address, or possibly brute-force the hash generation.
You could compensate for this by inserting a record into a backend database whenever you set the cookie, and having the other three sites check against it for validity.
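A rough sketch of that signing scheme as described above; the secret and cookie names are placeholders, and the verification could be hardened further with the backend-database check just mentioned:
<?php
// Shared secret known to all four sites (placeholder value).
define('SECRET', 'somesecretstring');

// On site1.com: build the pixel URL that sets the cookie on site2.com.
$value  = 'mycookieval';
$sig    = md5($value . SECRET . $_SERVER['REMOTE_ADDR']);
$imgUrl = 'http://site2.com/setcookie.php?v=' . urlencode($value) . '&sig=' . $sig;

// On site2.com, in setcookie.php: verify the hash before setting the cookie.
$v   = isset($_GET['v']) ? $_GET['v'] : '';
$sig = isset($_GET['sig']) ? $_GET['sig'] : '';
if ($sig === md5($v . SECRET . $_SERVER['REMOTE_ADDR'])) {
    setcookie('shared', $v, time() + 3600, '/');
}
?>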
This question's pretty cold, but in case anyone else stumbles on it, or the OP still has the need: I've created an NPM module which allows you to share locally-stored data across domains. It looks like it would address exactly the OP's need here, and it doesn't require all sites to share a base domain.
https://www.npmjs.com/package/cookie-toss
By using an iframe hosted on Domain A, you can store all of your user data on Domain A, and reference that data by posting requests to the Domain A iframe.
Thus, Domains B, C, etc. can inject the iframe and post requests to it to store and access the desired data. Domain A becomes the hub for all shared data.
With a domain whitelist inside of Domain A, you can ensure only your dependent sites can access the data on Domain A.
The trick is to have the code inside of the iframe on Domain A which is able to recognize which data is being requested. The README in the above NPM module goes more in depth into the procedure.
Hope this helps!