Does Google penalise overwriting of content at render time with JS?

I learned from Google’s JavaScript SEO video that Google indexes webpages in two waves, and the second wave — the one that covers content generated with JavaScript — is slower and less frequent.

I have an idea to tackle this issue:

  1. I will call/include the API (deriving its parameters from a pretty URL) in the current page’s backend and print the returned data as plain text inside a div. This way Google’s crawler gets its hands on the content in the first wave itself.
  2. Then, once the JS is loaded, I will call that same API again and this time replace the previously created div, rendering all the data received from the API once more.
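To make the idea concrete, here is a minimal sketch of the two steps. The endpoint, data shape, and function names are hypothetical — only the pattern (server-side print, then client-side replace) is what I’m describing:

```javascript
// Hypothetical API payload shape: { items: [{ title: "..." }, ...] }

// Step 1 (backend): fetch the API data server-side and print it
// inside a plain <div>, so crawlers see it in the initial HTML.
function renderInitialHtml(apiData) {
  const body = apiData.items.map((i) => `<p>${i.title}</p>`).join("");
  return `<div id="content">${body}</div>`;
}

// Step 2 (client): once JS loads, call the same API again and
// replace the server-rendered div's contents with the fresh data.
function hydrate(doc, apiData) {
  const div = doc.getElementById("content");
  div.innerHTML = apiData.items.map((i) => `<p>${i.title}</p>`).join("");
}
```

Since both steps render the same data from the same API, the server-printed HTML and the JS-rendered HTML should be identical, which is the whole point: the crawler and the user see the same content.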

My questions:

A. Is this SEO friendly, or will Google penalise the practice?
B. Will it solve the late-indexing issue, or am I being an idiot?
C. Are there more pros than cons, or vice versa?