I have a WordPress site which I manage (I am not a developer).
I ran a PageSpeed test via https://developers.google.com/speed/pagespeed/
I got some issues like caching problems and so on, so I used several plugins to take care of them.
However, I'm now stuck with "Optimize CSS Delivery" problems.
So I thought I'd try to fix it myself and move the problematic URLs to the end of the page; however, I can't figure out where these URLs are coming from, or which page is requesting them.
I'd appreciate any help with this.
What I do in similar situations: put some marker text inside the HTML elements of the relevant PHP files. E.g. in page.php put "page-top" somewhere near the beginning of the file and "page-end" at the end; do the same in footer.php etc. That way you quickly narrow down the corner the URLs are coming from.
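A minimal sketch of that idea, assuming a standard WordPress theme layout (the helper name and marker format are made up for illustration):

```php
<?php
// Hypothetical helper: returns an HTML comment you can echo at the
// top and bottom of each template file (header.php, footer.php, ...).
function template_marker(string $file, string $pos): string {
    return "<!-- marker: {$file} {$pos} -->";
}

// e.g. at the top of header.php:  echo template_marker('header.php', 'start');
// and at the end of footer.php:   echo template_marker('footer.php', 'end');
```

Then view the page source in the browser and see which two markers the problematic stylesheet URLs fall between.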
I built my website in 2007 using HTML and CSS, which I learnt from a book. I'm a jeweller and it's used as a portfolio of my work, so it has lots of photos on it and a separate page for each photo. I have continued to add photos and pages without making many changes to the overall structure of the website, just copying and pasting the code into each new page and changing the bits I needed to.
But now I want to change the headers and footers of all these pages, and there are hundreds of pages!
After some reading it seems I can use PHP (which I'm just finding out about) to insert headers and footers. But that seems to mean I'd need to edit every page anyway, and change all the links in the code to .php, which would be the same amount of work (or more!) as just changing the code on every page directly, although it would make things easier next time I want to change something... so I wondered if there was another way of doing this?
First time asking anything on a web developer forum! As I'm sure you can tell I'm no expert so keep things simple please! My website is www.islayspalding.co.uk. Many thanks :)
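One possible shortcut, depending on the host: some Apache setups let an .htaccess file run .html pages through PHP, so existing links and file names can stay as they are. The exact handler line varies by server and PHP version, so treat this as a sketch to confirm with the hosting provider, not a guaranteed recipe:

```apache
# .htaccess — ask Apache to run .html files through the PHP handler,
# so pages can use <?php include 'header.php'; ?> without being renamed.
# The handler name below is setup-dependent; confirm with your host.
AddHandler application/x-httpd-php .html
```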
I'm trying to make a really simple website with multiple pages. I've been coding in HTML for a while now, but just started to experiment with PHP.
My page setup goes like this:
Rootfolder/settings/language -- In there, I just put a index.php
Rootfolder/settings/Privacy -- In there, I also just put a index.php
In this particular case, I could indeed rename the files to language.php/Privacy.php to reduce clutter. But that is still not what I'm expecting from a lightweight website.
To make things easier for myself, I use include_once to pull in the header, the meta data (I know it's not smart, for SEO reasons), the footer and other things that are very general and have to be the same across the whole website. But here is the thing: I really think this method is way too complicated. Every time I need a new page, I have to duplicate a file that contains the include_once calls and then change its content. Same goes for the titles.
So, what I'm looking for is a page setup like this (1) :
Rootfolder/Pages/index.php -- Inside that index.php, the content has to change dynamically. Visitors still go to https://domain.tld/settings/language, but the server has to map that to the setup from (1) and serve the content that originally lived in Rootfolder/settings/language/index.php.
In the past, I've downloaded some PHP scripts, which all included title setups like: <title> $title - $SiteName </title> (or something like that) - So it must be possible to change content dynamically.
The problem is... I really don't know where to start. Does anyone have a good, lightweight solution for this?
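What you're describing is usually called a front controller: one index.php receives every request and decides which page file to render. A minimal sketch, assuming Apache with mod_rewrite sending all non-file requests to index.php (the route table, file names, and site name here are all assumptions):

```php
<?php
// index.php — minimal front controller (a sketch, not a full framework).
// Assumes an .htaccess rewrite sending all non-file requests here, e.g.:
//   RewriteEngine On
//   RewriteCond %{REQUEST_FILENAME} !-f
//   RewriteRule ^ index.php [L]

// Map clean URLs to page files and titles (hypothetical names).
const ROUTES = [
    '/'                  => ['file' => 'pages/home.php',     'title' => 'Home'],
    '/settings/language' => ['file' => 'pages/language.php', 'title' => 'Language'],
    '/settings/privacy'  => ['file' => 'pages/privacy.php',  'title' => 'Privacy'],
];

function resolve_route(string $uri): array {
    $path = parse_url($uri, PHP_URL_PATH);
    $key  = rtrim($path, '/') ?: '/';        // treat /x/ and /x the same
    return ROUTES[$key] ?? ['file' => 'pages/404.php', 'title' => 'Not found'];
}

// In real use, index.php would then do something like:
//   $page  = resolve_route($_SERVER['REQUEST_URI']);
//   $title = $page['title'] . ' - MySite';  // shared header prints the <title>
//   include 'header.php';
//   include $page['file'];
//   include 'footer.php';
```

With this layout there is one file per page under pages/, one shared header/footer, and titles come from the route table, which matches the `<title> $title - $SiteName </title>` pattern you saw in downloaded scripts.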
OK, so on some of the pages on my site, I've included a foot.php file at the end so that when I make changes to it, it affects all pages on the site. On most pages this works perfectly, but on some pages it just cuts off, and no includes after it take effect. The weird thing is that only a portion of the file is included on the pages where this happens.
I thought it might be because of the number of includes I used, but I have pages with more includes that work just fine. Take this one, for instance:
http://www.kelvinshadewing.net/codeSquirrel5.php
Here, you can see the bottom gets cut off, and if you view the source code, the rest of what goes in that div is gone, yet the div itself is closed off properly. But then go here:
http://www.kelvinshadewing.net/sprTartii.php
You'll see that the full code is there, and the Disqus app is present as well. This issue has been going on since before I added Disqus, and it also happened when I was using includes in a different way to generate global content, so it's something about those pages in particular. It happens only with my Squirrel tutorials, nothing else. I'm totally stumped and have no idea what's causing it. I've gone over my code a dozen times and verified that every page uses the same PHP scripts.
As for the scripts themselves, it's just this:
<?php include "foot.php";
include "disqus.php"; ?>
The problem just disappeared, so I'm ruling it as a server glitch. If anyone else is reading this, I suggest checking out Andrew's comment, because that code was nice to know.
Until now, I've been using the <iframe> tag to load things like headers/footers/navbars into my webpages. These cause so much hassle, though, and as I'm about to start building a new site I thought I'd get it sorted now.
I was thinking of having all the HTML in a PHP file and just loading it in dynamically. Ideally I'd like the code to become part of the page, so it appears inline. But I also want to be able to edit one single file when I need to change something, rather than editing the same thing 100 times.
<iframe>s did this well until recently, and I don't want to use workarounds. Could someone please post some code I could adapt, or a link to something that explains how to do this? Cheers
You can use PHP's include() function to include elements like headers and footers in your pages.
So:
include('header.php');
. . . will look for a file called header.php in the same directory and include it in your page. Then you just need to write that at the top of each of your pages.
That said, this isn't really a very good way to go about designing your site. How about looking at a content management system that lets you keep the design and content of your site separate?
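To make that concrete, here is one common shape for the pattern: header.php and footer.php hold the shared markup, and each page sets a $title variable before pulling them in. The file contents below are written out inline only so the example is self-contained; the names and markup are illustrative, not a fixed convention:

```php
<?php
// Write minimal shared templates (in practice these are files you edit
// by hand; they are generated here only to keep the example runnable).
file_put_contents('header.php',
    '<!DOCTYPE html><html><head><title>' .
    '<?php echo htmlspecialchars($title); ?>' .
    '</title></head><body>');
file_put_contents('footer.php', '</body></html>');

// A page such as contact.php would then be just:
$title = 'Contact';
include 'header.php';
echo '<p>Contact details here...</p>';
include 'footer.php';
```

Changing the shared navigation or footer then means editing one file, and every page picks it up on the next request.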
Are PHP includes what you're looking for? http://php.net/manual/en/function.include.php
I've been doing some scraping with PHP and getting some strange results on a particular domain. For example, when I download this page:
http://pitchfork.com/reviews/tracks/
It works fine. However if I try to download this page:
http://pitchfork.com/reviews/tracks/1/
It returns an incomplete page, even though the content is exactly the same. All subsequent pages (tracks/2/, tracks/3/, etc) also return incomplete data.
It seems to be a problem with the way the URLs are formed during pagination. Most other sections on the site exhibit the same behaviour (the landing page works, but not subsequent pages). One exception is this section:
http://pitchfork.com/forkcast/
Where forkcast/2/ etc. work fine. This may be because it is only one directory deep, whereas most other sections are several directories deep.
I seem to have a grasp on WHAT is causing the problem, but not WHY or HOW it can be fixed.
Any ideas?
I have tried using file_get_contents() and cURL and both give the same result.
Interestingly, on all the pages that do not work, the incomplete page is roughly 16,000 chars long. Is this a clue?
I have created a test page where you can see the difference:
http://fingerfy.com/test.php?url=http://pitchfork.com/reviews/tracks/
http://fingerfy.com/test.php?url=http://pitchfork.com/reviews/tracks/1/
It prints the strlen() and content of the downloaded page (plus it makes relative urls into absolute, so that CSS is correct).
Any hints would be great!
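For what it's worth, a fetch that explicitly handles compression and follows redirects rules out a couple of client-side causes, and a simple heuristic can flag truncated bodies. This is a diagnostic sketch (the user-agent string and helper names are made up; the URL is the one from the question):

```php
<?php
// Diagnostic fetch: decode gzip, follow redirects, set a user agent.
function fetch(string $url): string {
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_FOLLOWLOCATION => true,
        CURLOPT_ENCODING       => '',   // accept gzip/deflate and decode it
        CURLOPT_USERAGENT      => 'Mozilla/5.0 (compatible; test-scraper)',
        CURLOPT_TIMEOUT        => 30,
    ]);
    $body = curl_exec($ch);
    curl_close($ch);
    return $body === false ? '' : $body;
}

// Heuristic: a complete HTML page should contain a closing </html> tag.
function looks_truncated(string $body): bool {
    return stripos($body, '</html>') === false;
}

// Usage (network call, so commented out here):
//   $page = fetch('http://pitchfork.com/reviews/tracks/1/');
//   printf("%d bytes, truncated: %s\n", strlen($page),
//          looks_truncated($page) ? 'yes' : 'no');
```

If the body still stops near 16,000 bytes with compression handled, the truncation is almost certainly happening on the server side.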
UPDATE: Mowser, which optimizes pages for mobile, has no trouble with these pages (http://mowser.com/web/pitchfork.com/reviews/tracks/2/), so there must be a way to do this without it failing...
It looks like Pitchfork is running a CMS with "human" URLs. That would mean /reviews/tracks brings up a "homepage" with multiple postings listed, while /reviews/tracks/1 brings up only "review #1". It's possible they've configured the CMS to output only a fixed-length excerpt, or have a misconfigured output filter that chops off the individual post pages early.
I've tried fetching /tracks/1 through /tracks/6 with wget, and they all have different content which terminates at exactly 16,097 bytes, usually in the middle of a tag. So it's not likely this is anything you can fix on your end; it's the site itself sending bad data.
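The wget check described above can be reproduced from PHP once the bodies are in hand. This is a small helper written for this answer (not part of any library); it reports whether all bodies stop at the same byte count and whether each one ends inside an unclosed tag:

```php
<?php
// Given several fetched page bodies, report the truncation pattern:
// identical lengths ending mid-tag point at a server-side cutoff.
function truncation_report(array $bodies): array {
    $lengths = array_map('strlen', $bodies);
    return [
        'lengths'     => $lengths,
        'all_equal'   => count(array_unique($lengths)) === 1,
        // true when the last '<' in the body has no matching '>' after it
        'ends_midtag' => array_map(
            fn($b) => strrpos($b, '<') > (int) strrpos($b, '>'),
            $bodies
        ),
    ];
}
```

If every body in the report has the same length and ends mid-tag, that matches the pattern seen with wget and confirms the problem is on the site's side.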