Create GitHub Pages for each tag of a GitHub repo

Right now we can create GitHub Pages for a project repo using a URL like so:

http://username.github.io/repository 

The problem I am having is that I want to version the docs for each version of the repo, for example each Git tag.

So I am looking to publish docs like this:

http://username.github.io/repository/1.1
http://username.github.io/repository/1.2
http://username.github.io/repository/2.0
http://username.github.io/repository/2.1

What is the best way to store documentation by version? I can't think of a good way to do this. I could manually create a site with a side panel that links to each version, but what URL would those links point to?
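For illustration, the rough publishing flow I have in mind looks something like this (a sketch only: Build-Docs is a placeholder for whatever actually generates the HTML, and the local paths are made up):

```powershell
# Sketch: copy the built docs of every tag into a versioned subfolder of a
# local clone of the gh-pages branch. "Build-Docs" is a placeholder for the
# real build step (Sphinx, MkDocs, Doxygen, ...).
$ghPages = "C:\work\repository-gh-pages"   # local checkout of the gh-pages branch

foreach ($tag in (git tag)) {
    git checkout $tag
    Build-Docs -Output "_site"             # placeholder: build the docs into _site
    $target = Join-Path $ghPages $tag      # e.g. ...\repository-gh-pages\1.1
    New-Item -ItemType Directory -Force -Path $target | Out-Null
    Copy-Item "_site\*" -Destination $target -Recurse -Force
}

# Then commit and push the gh-pages clone, so each version is served at
# http://username.github.io/repository/<tag>/
```

The side panel could then simply link to /repository/1.1/, /repository/1.2/, and so on, but I am not sure this is the cleanest way to manage it.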

Replace all links (strings) within pages on a tenant using PowerShell

Is there any way to replace all URLs referring to an old site, http://servername/sites/contoso, with a new URL, http://servername/sites/newContoso, using PowerShell?

I mean replacing links and shortcuts in the content of SharePoint sites (scanning the content of all sites in the tenant).

My goal is to rename multiple sites (change their URLs), which means all links in SharePoint that referred to the old site URLs will be broken, so I would like to replace all the old links with the new URLs.

The tenant has too many sites to do it manually, so a PowerShell script would really make it easy to automate.

Is it possible?
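For illustration, this is roughly the kind of bulk replacement I am picturing, sketched here with PnP PowerShell for a single site (the connection URL, list name, and field name are assumptions on my part; a real script would loop over every site collection in the tenant, e.g. from Get-PnPTenantSite):

```powershell
# Sketch only: replace the old site URL inside the body of every item in the
# "Site Pages" library of one site.
$oldUrl = "http://servername/sites/contoso"
$newUrl = "http://servername/sites/newContoso"

Connect-PnPOnline -Url "https://tenant.sharepoint.com/sites/somesite" -Interactive

foreach ($page in (Get-PnPListItem -List "Site Pages" -PageSize 500)) {
    # "WikiField" holds classic wiki page content; modern pages keep their
    # content in "CanvasContent1" instead, so the field to touch depends on
    # the page type.
    $body = $page["WikiField"]
    if ($body -and $body.Contains($oldUrl)) {
        Set-PnPListItem -List "Site Pages" -Identity $page.Id `
            -Values @{ WikiField = $body.Replace($oldUrl, $newUrl) } | Out-Null
    }
}
```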

Do Links From Pages With Traffic Help You Rank Higher?

That's a very interesting study by Ahrefs.
It's something that I believe many of us wonder about when we are doing backlinking.
https://ahrefs.com/blog/links-with-traffic-study/

It's really worth the read, and I will just quote one of the main conclusions here:
"it’s possible to rank on the first page of Google without links from pages with traffic even if other pages in the SERP have such links.""

Need only tens of pages indexed out of hundreds: is robots.txt okay for Google?

Hi all,

We have 2 subdomains with hundreds of pages, and we only need the 50 important pages indexed. Unfortunately, the CMS behind these subdomains is very old and does not support deploying a "noindex" tag at the page level. So we are planning to block the entire sites in robots.txt and allow only the 50 pages we need. But we are not sure this is the right approach, since Google has been suggesting relying on "noindex" rather than robots.txt. Please suggest whether we can proceed with the robots.txt file.
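For illustration, the robots.txt we are planning would look roughly like this (the paths are placeholders); as far as I understand, Google resolves Allow/Disallow conflicts by the most specific matching rule, so the Allow lines should override the blanket Disallow:

```
User-agent: *
Disallow: /
Allow: /important-page-1
Allow: /important-page-2
# ...one Allow line for each of the ~50 pages that should stay crawlable
```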

Thanks

Will my local business schema screw up my other location pages?

Hello,

I have a LocalBusiness schema on the homepage of our company website. We also have multiple locations and have pages set up for each of them. On these location pages I have a schema to show the reviews for each location. I am wondering whether adding a standard LocalBusiness schema on the main homepage will screw anything up with the review schemas or the other location pages.

I do not have LocalBusiness schemas set up for the other locations, but I may just do that and put it all on the homepage.
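For reference, the homepage markup is just a standard LocalBusiness block along these lines (the name, URL, and address here are placeholders, not our real data):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Company",
  "url": "https://www.example.com/",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701",
    "addressCountry": "US"
  }
}
</script>
```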

Let me know what you think,

Thanks and I look forward to your feedback 🙂

Redirecting Pages That No Longer Exist — But Could in the Future

I have several 404 errors for URLs such as myWebsite/tag/foo/page/4/. The page no longer exists because a few posts tagged "foo" were deleted. Which is better: a 301 redirect to myWebsite/tag/foo/page/1/, or just returning 410 (Gone) for them?

I'm thinking that if I do a 301 redirect and later add more posts tagged "foo", a 4th page will exist again, but requests for it will still be redirected back to page 1. Or, if I return 410 for page 4 and it exists again in the future, it won't be found. I'm confused.
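For context, the two options would look roughly like this in an Apache .htaccess (assuming the site runs on Apache with mod_alias; the paths come from the example above):

```apache
# Option A: permanently redirect the removed archive page back to page 1 of the tag
Redirect 301 /tag/foo/page/4/ /tag/foo/page/1/

# Option B: tell crawlers the page is permanently gone
Redirect gone /tag/foo/page/4/
```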

Google shows a warning for duplicate titles of two pages with rel=next/rel=prev [duplicate]

This question already has an answer here:

  • rel="next" in anchor tag is not working (2 answers)

In Search Console there is a warning for two pages with the same title. In such cases we mostly use rel="next"/rel="prev", and we did that, but Google still shows this warning. How should we correct it?
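For reference, the pagination markup we added looks roughly like this (the URLs and title are placeholders), yet the duplicate-title warning remains:

```html
<!-- head of page 2 of the paginated series; the <title> is currently
     identical on every page, which is what Search Console flags -->
<head>
  <title>Product list</title>
  <link rel="prev" href="https://www.example.com/products?page=1">
  <link rel="next" href="https://www.example.com/products?page=3">
</head>
```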