An interviewer for an SEO analyst position asked me, “How many pages does a normal website have?” What is the exact answer to this question?
Right now we can create GitHub Pages for a project repo using a URL like so:
http://username.github.io/repository
The problem I am having is that I want to be able to version the docs for each version of the repo, for example for each git tag.
So I am looking to publish docs like this:
http://username.github.io/repository/1.1
http://username.github.io/repository/1.2
http://username.github.io/repository/2.0
http://username.github.io/repository/2.1
What is the best way to store documentation by version? I can’t think of a good way to do this. I can manually create a site with a side panel that has links to each version, but what URLs will those links point to?
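One approach that fits this URL scheme is to treat the gh-pages branch (or whatever the Pages publishing source is) as an archive with one folder per released version, copying the freshly built docs into a new folder whenever a tag is cut. A minimal sketch in PowerShell, assuming the built HTML sits (untracked) in .\build\docs and that gh-pages is the publishing branch:

```powershell
# Sketch: publish the docs built for tag 1.2 under /repository/1.2 on GitHub Pages.
git checkout gh-pages
New-Item -ItemType Directory -Force -Path .\1.2 | Out-Null
Copy-Item -Recurse -Force .\build\docs\* .\1.2\
git add 1.2
git commit -m "Publish docs for 1.2"
git push origin gh-pages
```

Each tagged version then lives at http://username.github.io/repository/1.2 and so on, which is exactly what the hand-written side panel can link to.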
Is there any way to then replace all URLs referring to an old site
http://servername/sites/contoso with a new URL
http://servername/sites/newContoso using PowerShell?
I mean replacing links and shortcuts in SharePoint site content (scanning the content of all sites in the tenant).
My goal is to rename multiple sites (change their URLs), and that means all links in SharePoint that referred to the old site URL will be broken, so I would like to replace all the old links with the new URL.
The tenant has too many sites to do it manually, so a PowerShell script would really make it easy to automate.
Is it possible?
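It should be; a rough sketch of the idea using the PnP.PowerShell module is below. The list name (“Site Pages”) and the CanvasContent1 field are assumptions (classic wiki pages store their content in WikiField instead), and the script only handles a single site, so treat it as a starting point to wrap in a loop over the sites being renamed rather than a finished solution:

```powershell
# Sketch: rewrite references to the old site URL inside the page content of one site.
# Assumes the PnP.PowerShell module; list and field names are illustrative.
$oldUrl = "http://servername/sites/contoso"
$newUrl = "http://servername/sites/newContoso"

Connect-PnPOnline -Url "http://servername/sites/someSite" -Interactive

foreach ($item in Get-PnPListItem -List "Site Pages" -PageSize 500) {
    $content = $item["CanvasContent1"]      # modern pages; classic pages use "WikiField"
    if ($content -and $content.Contains($oldUrl)) {
        Set-PnPListItem -List "Site Pages" -Identity $item.Id -Values @{
            "CanvasContent1" = $content.Replace($oldUrl, $newUrl)
        } | Out-Null
    }
}
```

Shortcuts, navigation nodes, and links stored in other lists would each need a similar pass, so expect to extend this per content type.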
Does anyone know if there is a way to use WHMCS Admin Login to secure other pages on your website?
It’s something that I believe many of us are wondering about when we are doing backlinking.
It’s really worth the read and I will just put here one of the main conclusions:
"its possible to rank on the first page of Google without links from pages with traffic even if other pages in the SERP have such links.""
We have 2 subdomains with hundreds of pages, of which only 50 important pages need to get indexed. Unfortunately, the CMS behind these subdomains is very old and does not support deploying a "noindex" tag at the page level. So we are planning to block the entire sites in robots.txt and allow only the 50 pages we need. But we are not sure if this is the right approach, as Google has been suggesting relying mostly on "noindex" rather than robots.txt. Please suggest whether we can proceed with the robots.txt file.
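For what it's worth, that pattern can be expressed directly in robots.txt, because for Googlebot the most specific matching rule wins and Allow beats Disallow on ties; the paths below are placeholders:

```
User-agent: *
Disallow: /
Allow: /important-page-1.html
Allow: /important-page-2.html
# ...one Allow line per page that should stay crawlable
```

The usual caveat applies: robots.txt only controls crawling, so a blocked URL can still be indexed (without content) if something links to it, which is why Google points people to "noindex" when de-indexing is the actual goal.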
Is it possible to split the WordPress admin overview of pages (/wp-admin/edit.php?post_type=page) into sections based on post meta values?
So for example I have 3 pages with the meta value “Planten” and also 3 pages with the meta value “Bomen”. Is it possible to group those pages into sections?
I have a local business schema on the homepage of our company website. We also have multiple locations, and have pages set up for each of these locations. On these location pages I have a schema to show the reviews for each location. I am wondering if adding a standard localbusiness schema on the main homepage will screw anything up with the review schemas or the other location pages?
I do not have localbusiness schemas set up for the other locations, but I may just do that and throw it all on the homepage.
Let me know what you think,
Thanks and I look forward to your feedback 🙂
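For reference, a minimal JSON-LD sketch of the kind of standard LocalBusiness markup being discussed for the homepage; the company name, URL, and address are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Company",
  "url": "https://www.example.com/",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Example City",
    "addressRegion": "EX",
    "postalCode": "00000"
  }
}
</script>
```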
I have several 404 errors for URLs such as myWebsite/tag/foo/page/4/. The page no longer exists because a few posts tagged “foo” were deleted. Which is better: a 301 redirect to myWebsite/tag/foo/page/1/, or just returning 410 Gone?
I’m thinking that if I do a 301 redirect and later add more posts tagged “foo,” a 4th page will exist again but requests for it will be redirected back to page 1. Or if I serve a 410 for page 4 and it exists again in the future, it won’t be found. I’m confused.
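Either response is a one-line rule; assuming the site runs on Apache and the rule goes in .htaccess, a sketch of the two options (only one would be active at a time):

```apache
# Option 1: tell crawlers the paginated tag archive is gone (410)
RedirectMatch 410 ^/tag/foo/page/4/$

# Option 2: consolidate instead, 301-ing the old page to the main tag archive
# Redirect 301 /tag/foo/page/4/ /tag/foo/
```

Either way the rule is static, so if enough “foo” posts come back for a page 4 to exist again, the rule would have to be removed by hand at that point.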
This question already has an answer here:
- rel=“next” in anchor tag is not working
In Search Console there is a warning about two pages having the same title. In such cases we usually use rel="next"/rel="prev"; we did that, but Google still shows the warning. How should we correct it?
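The linked duplicate suggests the usual pitfall: rel="next"/rel="prev" is meant to go on <link> elements in the page <head>, not on anchor tags in the body. A sketch for page 2 of a paginated series, with placeholder URLs:

```html
<!-- In the <head> of page 2 of the series (URLs are placeholders) -->
<link rel="prev" href="https://www.example.com/category/page/1/">
<link rel="next" href="https://www.example.com/category/page/3/">
```

Even with that in place, the Search Console warning is specifically about duplicate titles, so it may keep appearing until the paginated pages get distinct <title> elements (for example by appending the page number).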