I'm customising a WordPress site that uses a static front page and a responsive/dynamic theme.
What I want to do is have the featured image on the front page change based on the time of day. For example, if someone visits my website at 6pm, I want the featured image to show a picture of the night sky; but if someone visits before 6pm (and as early as, say, 5am), the featured image should show a picture of the day sky.
I've looked everywhere for a plugin that does this, but one does not seem to exist. I've also googled this endlessly and could not find specific solutions, but I have been able to find sources to get me started.
Adding a featured image from an external source via PHP and SQL:
http://www.wpexplorer.com/wordpress-featured-image-url/
(it would be easier to just use an UPDATE statement and change the image URL in the database)
Getting time in given timezone:
Get current date, given a timezone in PHP?
I could have used a lot of the code from those links to achieve what I want, but I ran into a problem: I cannot find the URL of the featured image in the WordPress database. I've looked in both _postmeta and _posts with no luck. Instead I only see the URLs of the images that came with the theme originally, even though they have since been changed. This could possibly have something to do with the fact that I've been using the Jetpack plugin (Photon) to load my images from wordpress.com servers, but I have turned that feature off and still can't find the image URL in the database.
As I am not a pro in PHP and SQL, it would be great to be pointed in the right direction, or even better if someone could come up with a solution. I know it's not an easy implementation, but it's definitely worthwhile, as solutions can't be found elsewhere.
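From what I can gather, the featured image isn't stored as a URL against the page at all; _postmeta only keeps the attachment ID under the _thumbnail_id key, which might explain why I can't find it. If that's right, something like this (untested by me) should resolve the current featured image URL from inside the theme rather than from the database:

<?php
// Untested sketch: resolve the front page's featured image URL at runtime
// instead of hunting for a hard-coded URL in the database.
$front_page_id = get_option( 'page_on_front' );           // ID of the static front page
$thumbnail_id  = get_post_thumbnail_id( $front_page_id ); // stored in _postmeta as _thumbnail_id
$featured_url  = wp_get_attachment_url( $thumbnail_id );  // URL of the attached image
echo esc_url( $featured_url );
?>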
This can easily be done with JavaScript.
Here is a quick example I did:
https://jsfiddle.net/p2snuf1a/
// Get the visitor's current hour (0-23) in their local time.
var date = new Date();
var hours = date.getHours();

// Image URLs for day and night (these can be locally stored images too).
var sunset_img = "http://at-cdn-s01.audiotool.com/2014/06/03/documents/85NwUJbwC0Gx0rbiKePSWWcKUqhmdP/0/cover256x256-46428e19514b49058125b21b8107c2eb.jpg";
var night_img = "https://pbs.twimg.com/profile_images/2973095145/47f74748e0c6240db5e0b3034af6da16.jpeg";

// 18 is 6pm on the 24-hour clock; from 6pm onwards show the night image.
if (hours >= 18) {
    document.getElementById('feature_img').setAttribute('src', night_img);
} else {
    document.getElementById('feature_img').setAttribute('src', sunset_img);
}
Basically, it gets the current hour and sets some variables to the image links (these can be locally stored images too). Then it checks whether the hour has reached a certain value; here 18 is 6pm.
You can add more 'else if()' options for other time/image combinations.
If your images are set in your CSS, you can use the element's '.style' property instead of .setAttribute.
Well, this is a pretty simple task, but I can see one issue with it. PHP will use its own server's time to calculate day and night, so if your server is in New York and someone is browsing your website from India, they will see the image for the time in New York. To use the visitor's local time you will either have to use JavaScript or use a geolocation API that returns the exact time for a given location. Here is an example:
http://api.geonames.org/timezone?lat=47.01&lng=10.2&username=demo
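Something along these lines could read the timezone from that endpoint and hand it to PHP's date functions (untested sketch; the lat/lng and the demo username are just the sample values from the URL above, and in practice you would get the coordinates from a geo-IP lookup):

<?php
// Untested sketch: look up the timezone via the GeoNames timezone endpoint,
// then make PHP's date() calls use it.
$xml = simplexml_load_file( 'http://api.geonames.org/timezone?lat=47.01&lng=10.2&username=demo' );
if ( $xml !== false && isset( $xml->timezone->timezoneId ) ) {
    date_default_timezone_set( (string) $xml->timezone->timezoneId );
}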
For now, let's just suppose you want to change the image depending on the server time. As you only have two images (day and night), hardcoding them should not be an issue. (If you want custom fields, look into the "Advanced Custom Fields" plugin; it is a brilliant plugin for adding custom fields to your posts.)
I haven't tested this code, but it should work.
<?php
// Use your site's timezone, then check whether it is currently am or pm.
date_default_timezone_set('America/Chicago');
$am_pm = date('a'); // 'am' or 'pm'
if ($am_pm == 'am') { ?>
    <img src="<?php echo get_template_directory_uri(); ?>/images/am.png" alt="Good Morning">
<?php } elseif ($am_pm == 'pm') { ?>
    <img src="<?php echo get_template_directory_uri(); ?>/images/pm.png" alt="Good Evening">
<?php } ?>
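If you would rather keep using the real featured image instead of hardcoded theme files, another option is to filter the thumbnail markup from functions.php. This is an untested sketch: it assumes your theme prints the image via the_post_thumbnail(), and 123 is a placeholder attachment ID for a night image you have uploaded to the Media Library.

<?php
// Untested sketch: swap the front page's featured image for a "night"
// attachment from 6pm onwards. 123 is a placeholder attachment ID.
add_filter( 'post_thumbnail_html', function ( $html, $post_id ) {
    if ( ! is_front_page() ) {
        return $html;
    }
    date_default_timezone_set( 'America/Chicago' ); // use your own timezone
    if ( (int) date( 'G' ) >= 18 ) {                // 18 = 6pm on the 24-hour clock
        $html = wp_get_attachment_image( 123, 'full' );
    }
    return $html;
}, 10, 2 );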
I want to show all Google reviews on my website. I am using the PhantomJsCloud API.
With the link below it is showing only 10 reviews; I need all of them.
https://phantomjscloud.com/api/browser/v2/REDACTED-API-KEY/?request={url:%22https%3A%2F%2Fsearch.google.com%2Flocal%2Freviews%3Fplaceid%3DChIJF47RCZLBeUgRSpiTvcpPEfA%22,renderType:%22html%22}
I do support for PhantomJsCloud. You know, you can always email support#PhantomJsCloud.com to get help. I dare say it's better than a place like SO for a niche SaaS product. Only Google Alerts saved you this time.
Anyway, to answer your question: you can look at the docs page at https://phantomjscloud.com/docs/http-api/, where there is an example showing exactly this:
Scripts: LazyLoad then force render
Scrolls 50px every 50ms, and renders when done scrolling to the bottom. This eldiariomontanes.es site lazy loads images only when it's shown in the browser, so if you take a screenshot without the provided script, the bottom part will be missing images. Important: This page takes aprox 25 seconds to load, and weighs aprox 2mb in size. (Costs about 0.0047 credits)
HOWEVER, you just pasted your private API key into a public message board! Sorry, but I am going to have to delete your account. You can go ahead and sign up again, though. Please read up on "OpSec" to avoid getting something important stolen in the future.
I used a JavaScript snippet I found on the Facebook developers site. It showed my page's newsfeed on my website and had a Like button so people could join the page. It worked great up until around the time Facebook changed the Pages layout (about a month ago), which unfortunately broke it.
The script I am using is http://static.ak.facebook.com/js/api_lib/v0.4/FeatureLoader.js.php/en_US
you can see it in action:
http://fullmount.co.nz/index.php?main_page=news
I am wondering if there is something simple that could be changed in the script to get it working again, or if there is another script I could use instead that works with the newer Pages format?
I am a PHP wizard but a JS amateur, so I haven't even really looked at the code; I just knew it used to work.
I'm wondering if there's a more efficient way to serve a large number of link redirects. For background: my site receives tens of thousands of users a day, and we "deep link" to a large number of individual product pages on affiliate websites.
To "cloak" the affiliate links and keep them all in one place, I'm currently serving all our affiliate links from a single PHP file, e.g. a user clicks on mysite.com/go.php?id=1 and is taken to the page on the merchant's site, appended with our affiliate cookie, where you can buy the product. The code I'm using is as follows:
<?php
// Map each link ID to its affiliate URL.
$path = array(
    '1' => 'http://affiliatelink1.com',
    '2' => 'http://affiliatelink2.com',
    '3' => 'http://affiliatelink3.com',
);

// Redirect if the requested ID exists.
if (isset($_GET['id']) && array_key_exists($_GET['id'], $path)) {
    header('Location: ' . $path[$_GET['id']]);
    exit;
}
?>
The problem I'm having is that we link to lots of unique products every day and the php file now contains 11K+ links and is growing daily. I've already noticed it takes ages to simply download and edit the file via FTP, as it is nearly 2MB in size, and the links don't work on our site while the file is being uploaded. I also don't know if it's good for the server to serve that many links through a single php file - I haven't noticed any slowdowns yet, but can certainly see that happening.
So I'm looking for another option. I was thinking of simply starting a new .php file (e.g. go2.php) to house more links, since go.php is so large, but that seems inefficient. Should I be using a database for this instead? I'm running WordPress too, so I'm wary of leaning on MySQL too much; simply doing it in PHP seems faster, but again, I'm not sure.
My other option is to find a way to dynamically create these affiliate links, i.e. create another PHP file that will take a product's URL and append our affiliate code to it, eliminating the need for me to manually update a php file with all these links, however I'm not sure about the impact on the server if we're serving nearly 100K clicks a day through something like this.
Any thoughts? Is the method I'm using spelling certain death for our server, or should I keep things as is for performance? Would doing this with a database or dynamically put more load on the server than the simple php file I'm using now? Any help/advice would be greatly appreciated!
What I would do is the following:
Change the URL format to include the product name for SEO purposes, something like "my_new_product/1"
Then use mod_rewrite to map that URL to a page with a query string, e.g. in .htaccess:
RewriteEngine On
RewriteRule ^([a-zA-Z0-9_-]*)/([0-9]*)$ index.php?id=$2 [L]
Then create a database table containing the following fields:
id (autonumber, unique id)
url (the url to redirect to)
description (the text to make the url on your site)
Then, you can build a simple CRUD thing to keep those up to date easily and let your pages serve up the list of links from the DB.
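As a rough sketch of that last step (the table and column names affiliate_links/id/url and the credentials are assumptions, not anything you already have), go.php could then look something like this:

<?php
// Sketch: go.php?id=123 looks the affiliate URL up in the database and redirects.
$pdo = new PDO( 'mysql:host=localhost;dbname=mysite;charset=utf8', 'db_user', 'db_pass' );
$pdo->setAttribute( PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION );

$id   = isset( $_GET['id'] ) ? (int) $_GET['id'] : 0;
$stmt = $pdo->prepare( 'SELECT url FROM affiliate_links WHERE id = ? LIMIT 1' );
$stmt->execute( array( $id ) );
$url = $stmt->fetchColumn();

header( 'Location: ' . ( $url ? $url : '/' ) ); // unknown IDs fall back to the homepage
exit;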
We're using the Google CSE (Custom Search Engine) paid service to index content on our website. The site is built mostly of PHP pages that are assembled with include files, but there are some dynamic pages that pull info from a database into a single page template (new releases, for example). The issue we have is that I can set an expiry date on content in the database, so, say, "id=2" will bring up a "This content is expired" notice. However, if ID 2 had an uploaded PDF attached to it, the PDF file remains in the search index.
I know I could write a cleanup script and have cron run it, one that looks at the DB, finds expired content, checks whether any uploaded files were attached, and either renames or removes them, but there has to be a better solution (I hope).
Please let me know if you have encountered this in the past, and what you suggest.
Thanks,
D.
There's unfortunately no way to give you a straight answer at this time: we have no knowledge of how your PDFs are "attached" to your pages or how your DB is structured.
The best solution would be to create a robots.txt file that blocks the URLs for the particular PDF files that you want to remove. Google will drop them from the index on its next pass (usually in about an hour).
http://www.robotstxt.org/
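For example, if the expired PDFs sit in a known folder (the paths below are made up for illustration), the robots.txt entries might look like:

User-agent: *
Disallow: /uploads/expired/
Disallow: /uploads/2012-catalog.pdf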
What we ended up doing was tying a check script to the upload script, so that once the current upload completed, old files were "unlinked" and the DB records were deleted.
For us, this works because it's kind of an "add one/remove one" situation where we want a set number of items to appear in a rolling order.
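A simplified sketch of that cleanup step (the table and column names downloads/filepath/expires_at are illustrative only, not our real schema):

<?php
// Sketch: after a successful upload, delete expired files and their DB rows.
$pdo = new PDO( 'mysql:host=localhost;dbname=mysite;charset=utf8', 'db_user', 'db_pass' );
$pdo->setAttribute( PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION );

$rows = $pdo->query( 'SELECT id, filepath FROM downloads WHERE expires_at < NOW()' );
foreach ( $rows as $row ) {
    if ( is_file( $row['filepath'] ) ) {
        unlink( $row['filepath'] );              // remove the expired PDF
    }
    $del = $pdo->prepare( 'DELETE FROM downloads WHERE id = ?' );
    $del->execute( array( $row['id'] ) );        // drop the DB record
}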
I have created a widget for my web application. Users get a code snippet and simply paste it into their website, and my widget then works on their site, much like Twitter, Digg and other social widgets.
My widget is per post: for a single post (say post ID 234) I provide a single widget, so anyone can embed that widget on their website.
Now I want to know where my widget has been embedded, and for which post. To do that, I record the URL of the page when my widget loads (onload), but the problem arises when someone places the widget in a common sidebar of their blog or website. Because I record the URL on every load, a widget sitting in a blog's sidebar produces a recorded URL for every post, which creates duplicates.
Can anyone help with this? How should I go about it so that I end up with only one record per widget per site?
I think doing something like this is a bit tricky. Here are some ideas that come to mind:
You could, for example, ask the user to input their site's URL when they get the widget, or the widget could track just the domain or subdomain, which would produce fewer URLs.
Just tracking the domain would obviously be problematic if the actual site is domain.com/sitename/, and there could be more than one site under the domain. In that case, you could attempt to detect the highest common directory. Something like this:
You have multiple URLs like this: domain.com/site/page1, domain.com/site/page2, and so on. Here the highest common directory would be domain.com/site.
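A rough sketch of that idea in plain PHP (illustrative only, and as noted just after this, it will not be accurate in every case):

<?php
// Sketch: find the highest common directory of a set of recorded URLs.
function highest_common_directory( array $urls ) {
    if ( empty( $urls ) ) {
        return '';
    }
    // Split each URL into path segments: "domain.com/site/page1" -> ["domain.com", "site", "page1"]
    $split = array_map( function ( $url ) {
        return explode( '/', rtrim( $url, '/' ) );
    }, $urls );

    $common = array_shift( $split );
    foreach ( $split as $segments ) {
        // Keep only the leading segments shared with every URL seen so far.
        $keep = array();
        foreach ( $segments as $i => $segment ) {
            if ( isset( $common[ $i ] ) && $common[ $i ] === $segment ) {
                $keep[] = $segment;
            } else {
                break;
            }
        }
        $common = $keep;
    }
    return implode( '/', $common );
}

echo highest_common_directory( array( 'domain.com/site/page1', 'domain.com/site/page2' ) );
// prints "domain.com/site"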
I don't think that will always work correctly or provide completely accurate results. For accuracy, I think the best is to just ask the user for the URL when they download the code for the widget.
Edit: new idea - just generate a unique ID for each user. This could be accomplished by simply taking the current timestamp or something, and hiding it in the code snippet the user is supposed to copy. This way you can track the ID itself, and any URLs and domains it appears on can be grouped under it.
If an ID doesn't get a hit in, say, a week or so, you could remove it from your database and that way avoid filling it up with unused IDs.
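A rough sketch of the unique-ID idea (the function name, the widget.js URL and the data attributes are made up for illustration, not an existing API):

<?php
// Sketch: generate a unique widget ID and bake it into the embed snippet the
// user copies. Store the ID together with the post ID on your side, and have
// the widget report the ID on load instead of (or along with) the page URL.
function widget_embed_code( $post_id ) {
    $widget_id = uniqid( 'w', true ); // unique per generated snippet
    return '<script src="http://example.com/widget.js" '
         . 'data-widget-id="' . htmlspecialchars( $widget_id ) . '" '
         . 'data-post-id="' . (int) $post_id . '"></script>';
}

echo widget_embed_code( 234 ); // snippet the user pastes into their sidebar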
I agree with Jani regarding a unique id. When you dish out the script you'll then be able to always relate back to that id. You are still going to have duplicates if the user uses the same id over and over, but at least you'll have a way of differentiating one user from another. Another useful advantage is that you are now able to, as Jani said, group by the ID and get a cumulative number for all of the instances where that user used the script & id.