Separate prerendered static page for Open Graph crawlers on Netlify (can't redirect by detecting bots) [closed]

I want to build a heavily browser-cached shell app: essentially one cached file, index.html, that works like an SPA and uses progressive enhancement to fill in content. The problem is the Open Graph meta tags for Facebook, LinkedIn, and Twitter, and of course SEO as well (yes, I know that these days Google can parse JavaScript-driven applications, but other crawlers can't). We also can't redirect server-side via bot detection because we are using static file hosting like Netlify. So the basic idea is this:
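For context, the shell I have in mind looks roughly like this; the `/api/content` endpoint is a placeholder for however the content actually gets loaded:

```html
<!doctype html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>App shell</title>
</head>
<body>
  <div id="app">Loading…</div>
  <script>
    // Progressive enhancement: the cached shell fetches the real
    // page content client-side (/api/content is hypothetical).
    fetch('/api/content' + location.pathname)
      .then(function (res) { return res.text(); })
      .then(function (html) {
        document.getElementById('app').innerHTML = html;
      });
  </script>
</body>
</html>
```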

Put a canonical URL on every page that points crawlers to /seo/$page, and under the /seo subfolder serve pre-rendered static pages that contain all the Open Graph tags and none of the CSS and JavaScript, just content and HTML tags (this part is not strictly necessary; the idea is to save unnecessary bandwidth).
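To make the idea concrete, a prerendered page at /seo/about might look like this (example.com and all tag values are placeholders):

```html
<!doctype html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>About us</title>
  <!-- Open Graph tags read by Facebook and LinkedIn; twitter:card for Twitter -->
  <meta property="og:title" content="About us">
  <meta property="og:description" content="Who we are and what we do.">
  <meta property="og:image" content="https://example.com/img/about.jpg">
  <meta property="og:url" content="https://example.com/about">
  <meta name="twitter:card" content="summary_large_image">
  <!-- No CSS or JavaScript: only the markup a crawler needs -->
</head>
<body>
  <h1>About us</h1>
  <p>Plain HTML content, the same content the SPA shell renders.</p>
</body>
</html>
```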

Would this solution be considered good practice? Would it be considered cloaking? What are its downsides? Is it considered bad practice to serve "stripped down" pages to bots (same content, but not all functionality)? Do you have any other suggestion for handling static pre-rendered pages that carry the SEO and Open Graph tags and are served only to bots?

And the most important question: I assume the links shown in Google search results would then carry the extra /seo/ path segment, which is not good. Is there any way to force Google to use the original links, or to use the /og/ URLs only for serving Open Graph tags to Facebook and the other social networks, given that Google can actually parse JavaScript today?

Would sitemap.xml or robots.txt be of any help in redirecting just the Facebook crawler?