Web 2.0 – SER engines

I am not successful with SER Engines
– even with fresh emails, low thread counts (3 or 5), 2Captcha and good content, I can only get 11 of 39 platforms (mainly: Evernote, Hatenablog, Kinja, Pen.io, Postbit, Skyrock, Unblog.fr, Webgarden Blog, Webgarden.at, Webgarden.cz, Wikidot).
No Tumblr or WordPress at all!
My questions:
1. Do you have similar experience with SER Engines?
2. What can I do to improve GSA SER posting on Tumblr and WordPress? Are private/dedicated PBN blogs an option?
3. What are the Web 2.0 results with RankerX?
4. And what are the results on RankerX when using private/dedicated Tumblr and WordPress PBN blogs?
Thx Harry

Why is OpenSearch support sometimes not added to “Other search engines”?

I find OpenSearch support for sites extremely useful, but I cannot figure out why it is sometimes not automatically added to my Other search engines in Chrome after visiting that site and performing a search.

The issue arises when I clear my cached browser data: the OpenSearch entries for those sites also disappear. For example, when I visit amazon.com and do a search, Amazon.com is automatically added back to Preferences/Settings > Search Engines > Manage search engines > Other search engines, but when I visit youtube.com, it is not automatically added, even though it previously had been.

Can I control OpenSearch support in a way that always automatically adds a site’s query string without manually adding it to Other search engines?
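For context, Chrome discovers these engines through a site's OpenSearch description document, referenced from the page's HTML head; if a site does not serve one (or changes how it is referenced), Chrome has nothing to auto-add after you clear your browser data. A minimal example of what such a document looks like (example.com and the file path are placeholders):

```xml
<!-- Referenced from the site's HTML head like this:
     <link rel="search" type="application/opensearchdescription+xml"
           href="/opensearch.xml" title="Example Search"> -->
<OpenSearchDescription xmlns="http://a9.com/opensearch/1.1/">
  <ShortName>Example</ShortName>
  <Description>Search example.com</Description>
  <Url type="text/html"
       template="https://example.com/search?q={searchTerms}"/>
</OpenSearchDescription>
```

For sites you don't control, there is no reliable way to force the auto-add from the user side; manually adding the site's query-string template under Other search engines is the dependable fallback.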

Make JavaScript-generated content appear in search engines when the content differs on a single page

I have a PWA hosted on Firebase. On the main URL, like example.com, there’s a lot of content that is generated by JavaScript.

At the same time, whenever someone accesses example.com/?key=value, where the key and the value are predefined, the JavaScript on the page reads the key and the value and accordingly removes the content generally shown on example.com and shows different content.

The content differs with each value of the key.

With this, the website effectively acts like a multi-page website, although it essentially has only a single web page.

The problem is that the content displayed on these sub-pages is not visible on Google Search (or any other search engine). Is there any workaround for this?


The catch is that I cannot generate a separate webpage (an HTML file) for each of these sub-pages, because the website uses only front-end technologies available with Firebase.
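One common workaround (a sketch, not a guaranteed fix) is dynamic rendering: route requests through a Cloud Function that returns pre-rendered HTML for each ?key=value variant, while normal visitors still get the client-side PWA. Firebase Hosting supports this via rewrites in firebase.json; the function name `prerender` below is hypothetical:

```json
{
  "hosting": {
    "public": "public",
    "rewrites": [
      { "source": "**", "function": "prerender" }
    ]
  }
}
```

Also note that crawlers can only discover the ?key=value variants if they are linked somewhere, e.g. plain `<a href="/?key=value">` links on the page or a sitemap entry per variant; content reachable only through script-driven navigation may never be queued for rendering at all.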

Using GSA Search Engine Ranker and SER Engines for ranking, but SERPs keep dancing, what should I do?

I have been using GSA Search Engine Ranker and SER Engines; several sites' rankings improved, but they keep dancing. What should I do?
I have been using SER Engines and GSA Search Engine Ranker to build links for several of my sites. At the initial stage, rankings improved greatly; some keywords even reached the first page after a few days. But they dropped after ranking for 1-2 months: some sites' rankings dropped and then recovered with continued link building, while some sites never recovered.
My operating process is like this.
First I build a new site; for a wholesale website:
1.1. I put "wholesale *keyword* manufacturer, supplier" in the homepage SEO title and extend this title in the SEO description (meaning the description contains all keywords mentioned in the SEO title).
1.2. On the sub-pages, I put "*keyword* for sale" as the SEO title and extend it in the SEO description (usually about 160-300 characters long). I then copy this SEO description as the page description on the sub-page (which end users can see) and try to extend it to 400-600 characters so that I can fit more LSI keywords into the page; of course, this description is readable for users.
1.3. For product pages and blog post pages, I try to put LSI keywords in the SEO title and extend the SEO title in the SEO description meta.
After the website has been set up for 7 days, I move on to the link building stage:
2.1 I set up a general GSA Search Engine Ranker project with 2 tiers: tier 1 points to the homepage (sometimes to sub-pages too), and the tier 2 links point to tier 1.
2.1.1 Usually I check "article, directory, forum, social bookmark, social network, web 2.0, wiki" (only around 50% of each category) for tier 1, and "article, blog, forum, guestbook, image comment, microblog, social bookmark, social network, web 2.0, wiki" for tier 2.
2.1.2 For tier 1 anchors, I usually keep "generic anchor text" and "domain as anchor text" at 50%-60% in total, with "anchor" and "LSI anchor text" taking the remaining 40%-50% (sometimes I use 5%-10% "branding anchor text"). I put 10-20 keywords in the "anchor" area in spun form (for example, if my main keyword is A, I put "A wholesale", "A manufacturer", "A design", "A solution", "A company", "A price", "cheap A"), and 30-150 keywords in the "LSI anchor text" area. Tier 2 is the same, just with even broader keywords.
2.1.3 I usually set the link building limit for tier 1 to 30-50 verifications per day; for tier 2, I don't set any verification limit.
2.1.4 This project runs on my own computer.
2.2 I set up a SER Engines project with 2 tiers: tier 1 points to the homepage only, and the tier 2 links point to tier 1.
2.2.1 I don't check social bookmark or social profile at all.
2.2.2 Usually the articles are unique; sometimes they are not, just articles scraped by GSA Content Generator. Either way, I spin them with Spin Rewriter and import the spun version into the articles section in GSA.
2.2.3 For the tier 1 anchors, I usually keep "generic anchor text" and "domain as anchor text" at around 30%-35% in total, 10%-15% for "branding anchor text", and the remaining 40%-50% for "anchor" (usually around 10-20 keywords) and "LSI anchor text" (usually 40-80 keywords).
2.2.4 After the tier 1 links are built, I send them to an indexer service automatically. After around 3-5 days, the tier 2 project starts running.
2.2.5 By the way, I use Yahoo email accounts and the 2Captcha captcha service for SER Engines tier 1 link building.
Sometimes I also buy social signals from www.blackhatlinks.com, covering Facebook, Twitter, Google+, LinkedIn, Mix, Pinterest and Reddit, with 3-10 signals per platform at random.
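To make the anchor-ratio setup above concrete, here is a rough Python sketch of picking an anchor category by weighted ratio, roughly the way the tier-1 mix described in 2.2.3 would play out over many links. The category names and numbers are illustrative, not GSA's actual settings:

```python
import random

# Hypothetical tier-1 anchor mix, mirroring the ratios described above.
ANCHOR_MIX = {
    "generic": 0.20,   # "click here", "read more", ...
    "domain": 0.15,    # bare domain / URL anchors
    "branding": 0.10,  # brand-name anchors
    "exact": 0.25,     # main keyword variants ("A wholesale", "cheap A", ...)
    "lsi": 0.30,       # broader LSI keyword pool
}

def pick_anchor_type(mix, rng=random):
    """Pick an anchor category with probability proportional to its ratio."""
    types = list(mix)
    weights = [mix[t] for t in types]
    return rng.choices(types, weights=weights, k=1)[0]

# Over a large sample, the realized shares track the configured ratios.
sample = [pick_anchor_type(ANCHOR_MIX) for _ in range(10_000)]
share = sample.count("lsi") / len(sample)
print(f"lsi share ~ {share:.2f}")  # should hover near 0.30
```

The point of the sketch: individual links are random, but the overall profile converges to whatever ratios you configure, which is why small keyword lists in the "anchor" area concentrate exact-match anchors quickly.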
That’s what I have done for link building. Below are Ahrefs ranking screenshots for some of my projects:
[Ahrefs ranking screenshots: Project 1 through Project 13]

Two friends told me that:
1. I should improve on-site SEO; for example, if I find a keyword "A" ranking for page "URL:X", I should add more content about "A+C", "A+D" (LSI keywords) to page "URL:X", and if it is a sub-page, I should add more internal links to it.
2. I should add more anchor texts such as "A", "A+C", "A+B" etc. from external links.
Frankly speaking, I manage more than the 13 websites above, around 50+ in total, so I am unable to improve the on-site content manually one by one.
So what should I do at this stage? Create more SER Engines Web 2.0 links to the sub-pages? Or should I manually build Web 2.0 blogs and publish articles on them, so those blogs are more stable, and then blast links to these manually built Web 2.0 blogs or PBN posts?

Should I use fewer keywords in the "anchor" area in SER Engines, so the keywords are more focused?

Besides, should I blast links to sub-pages and detailed product/post pages with GSA Search Engine Ranker (not SER Engines)?

PS: I am sending links to my homepage directly from GSA Search Engine Ranker because I want to hide/diversify my link profile, even though those links carry only weak link juice.

I am waiting for suggestions from experienced GSA Search Engine Ranker operators.

Thanks in advance !

Submit your website to 775 search engines and directories – Google, Bing and many more for $7

Dears, when you submit your website to many search engines, it will increase your website traffic. Our service covers all the major search engines, including Google and Bing, plus hundreds of relevant free directories from around the world where you can get your website listed. Our proprietary search engine submission tool automatically requests inclusion of your website, saving you hours of manually submitting to each directory one by one. All I need is your website URL; leave the submitting to me, and you will receive a full list of the search engines and directories your website was submitted to. Regards

by: aaymann
Created: —
Category: Directory Submission
Viewed: 185


Detailed analysis of your site to generate search engines (full report) for $5

1 – I will give you a detailed analysis of your site, covering more than 18 key factors that search engines consider when ranking websites, showing your site’s strengths and weaknesses. 2 – I will also give you tips on how to correct the SEO errors in your site. 3 – You will also get an accurate plan for getting your site to the first results in search. 4 – All this and more for only $5.

by: bombo92
Created: —
Category: Onsite SEO & Research
Viewed: 91


Do search engines like Google or Bing search through all web pages for a single query every time?

Here is what I imagine would happen when I type in a search query in Google or Bing:

  1. The search query is converted into a vector through some pre-trained machine learning models. The vector captures semantic features, etc.

  2. The search engine goes through all webpages it has ever crawled and computes a similarity score for each webpage against my search query, based on the vectors of the query and the webpage. (Assume the search engine has already pre-computed the semantic feature vector for every webpage it crawled.)

  3. The search engine ranks the similarity scores of all the webpages and returns the ranked list to me.

To me, it seems that searching through all webpages would be too expensive, and I find it hard to believe the searching and ranking are all done in a few milliseconds.

Can someone comment?
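For what it's worth, the usual answer is that engines do not score every page per query: they precompute an inverted index mapping each term to the documents containing it, so a query only touches the postings lists for its terms, and expensive vector/semantic scoring is typically applied only as a re-ranking step over that much smaller candidate set. A toy sketch of the indexing step (the documents are made up):

```python
from collections import defaultdict

# Toy corpus: doc id -> text.
docs = {
    1: "cheap wholesale widgets supplier",
    2: "widget design blog",
    3: "unrelated cooking recipes",
}

# Build the inverted index once, at crawl/index time: term -> {doc ids}.
index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.split():
        index[term].add(doc_id)

def candidates(query):
    """Union of the postings lists for the query terms."""
    result = set()
    for term in query.split():
        result |= index.get(term, set())
    return result

# Only docs containing a query term are ever considered.
print(sorted(candidates("wholesale widgets")))  # -> [1]
print(sorted(candidates("widgets blog")))       # -> [1, 2]
```

Real engines add many layers on top (sharding across machines, compressed postings lists, tiered ranking stages), but the core reason the query is fast is this precomputed lookup, not a per-query scan of the whole web.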