How can I force WordPress + React Router to soft 404 without modifying core? (So that it is crawlable by Google)

So I have been working on migrating one of my clients to a WordPress site. For the most part, the requirements of the site are pretty simple, but the site did need to have a fully custom faceted search built that interfaces with their custom inventory API.

I decided to architect the faceted search as a stand-alone React SPA that gets pulled in by a custom page template. The catch is that the inventory detail routes are all dynamic: WordPress does not know what inventory pages should exist. The route looks something like this:
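For context, the custom page template is just a thin shell that enqueues the compiled SPA bundle and renders a mount node. A minimal sketch, assuming a child theme; the template name, script handle, bundle path, and root element id are all placeholders:

```php
<?php
/* Template Name: Inventory Faceted Search (sketch; all names are placeholders) */

// Enqueue the compiled React bundle. Calling wp_enqueue_script() directly
// here, before get_header(), is early enough for it to be printed by
// wp_footer() below.
wp_enqueue_script(
    'inventory-spa',                                          // handle (placeholder)
    get_stylesheet_directory_uri() . '/spa/build/bundle.js',  // path (placeholder)
    array(),
    null,
    true // print in the footer, after the mount node below exists
);

get_header();
?>
<div id="inventory-root"><!-- React Router mounts the faceted search here --></div>
<?php
get_footer();
```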

/inventory/detail/{stockNo}, e.g. /inventory/detail/1NF8FT

The index pages of the React faceted-search SPA link out to all of these pages; React Router then uses the route parameter to make the appropriate AJAX call and populate the page.

To allow the WordPress and React routing systems to work together, I hijacked the 404.php template in my child theme and rerouted it to the index page like so:

<?php
header('HTTP/1.1 200 OK');
status_header(200);
include get_query_template('index');
?>

The React SPA is included on the index.php template, so when a URL is hit, it first goes through the WordPress routing system and then falls back to index.php, where it is assessed by React Router. If the route does not match any of the React routes, the SPA throws a soft 404.

For the most part, this works pretty great. The two routing systems work together seamlessly, but I am having trouble getting Google to index these dynamic routes. When I test a URL with Google Search Console or Screaming Frog, the inventory detail pages come back as a 404 (even though I am explicitly telling the 404 template to send a 200).

I have tried adding the header status override code seen above to functions.php, index.php, 404.php, and header.php without success. The ONLY way I have been able to fix the problem and make the pages crawlable was by modifying WordPress core: specifically, updating wp-includes/class-wp-query.php so that the set_404() method never flags the query as a 404.
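For concreteness, the functions.php variant of the override looked roughly like this — a sketch reconstructed from the snippet above; the hook choice and the /inventory/detail/ path guard are illustrative:

```php
<?php
// In the child theme's functions.php. template_redirect fires late enough
// that WordPress has already decided the request is a 404, but before any
// template output, so the status header can still be replaced.
add_action( 'template_redirect', function () {
    if ( is_404() && 0 === strpos( $_SERVER['REQUEST_URI'], '/inventory/detail/' ) ) {
        global $wp_query;
        $wp_query->is_404 = false; // un-flag the 404 on the main query
        status_header( 200 );      // replace the 404 status with a 200
    }
}, 0 );
```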

public function set_404() {
    $is_feed = $this->is_feed;

    $this->init_query_flags();
    $this->is_404 = false; // core patch: `true` in the original
    $this->is_feed = $is_feed;
}

This has the desired effect (making the dynamic pages indexable) and is a very small and simple patch, but I know the pains and dangers of modifying WordPress core.

Has anyone attempted a routing mixture like this? I would like to figure out a way I can solve this problem without modifying the core.
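One direction I am considering (an untested sketch, not something I have verified against Googlebot) is WordPress's pre_handle_404 filter, added in 4.5 specifically so plugins and themes can short-circuit the core 404 decision. The /inventory/detail/ guard is illustrative:

```php
<?php
// In the child theme's functions.php. Returning true tells WP::handle_404()
// to skip its 404 handling entirely, so the query is never flagged as a 404
// and no 404 status header is ever sent; the request then falls through the
// template hierarchy to index.php, where React Router takes over.
add_filter( 'pre_handle_404', function ( $preempt, $wp_query ) {
    // Illustrative guard: only preempt for the SPA's dynamic routes.
    if ( 0 === strpos( $_SERVER['REQUEST_URI'], '/inventory/detail/' ) ) {
        return true; // treat the URL as handled
    }
    return $preempt;
}, 10, 2 );
```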

How to make a # URL crawlable

Hi,

I have a URL: abc.com/#/emi-calculator

Currently this URL is not being crawled and indexed by Google, since it contains a "#" in it.

After some research I found that if we use "#!" then the URL will be crawlable.
But is this method recommended for SEO? Or should I just go ahead and create a new page and host my EMI calculator there?

Request experts to share your views.

Thanks