Why is Google Cache showing a 404 for my React-based web pages?

My web page is client-side rendered with React. When I open the cached copy, it shows a 404 instead of the rendered page.

Example screenshot (image not included).

When I inspect the page URL in Search Console, it parses and renders my page properly in the Google Index as well as in the Live Test.

Example screenshots (images not included).

Why is the cached page showing a 404? Is client-side React responsible? If not, what is the issue? Or will it resolve itself in a few days?

Note: it's a new site that went live last week.

Cache Miss and Processor Speed

Today in class my professor mentioned that:

Cache misses become more expensive as the speed of the processor increases.

But he didn't explain why. I searched for this statement online and found no answer.

My take is that the statement is true because, as processor speed increases, the processor completes more instructions in a given amount of time, while a miss still takes roughly the same absolute time to service. A miss therefore stalls the execution of more instructions, so cache misses become more expensive as processor speed increases. Is my thinking correct, or am I getting it all wrong?
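A back-of-envelope way to see this: DRAM latency is roughly fixed in nanoseconds, so as the clock speeds up, the same miss costs more cycles, and each of those cycles could have executed instructions. A sketch (the 100 ns latency and the clock rates are illustrative assumptions, not measurements):

```python
# A fixed ~100 ns memory latency costs more cycles (and more forgone
# instructions) as the clock rate rises. Numbers are illustrative.

MEM_LATENCY_NS = 100  # assumed DRAM access time

def miss_penalty_cycles(clock_ghz: float) -> float:
    """Cycles stalled per miss = latency (ns) * clock rate (GHz)."""
    return MEM_LATENCY_NS * clock_ghz

for ghz in (1.0, 2.0, 4.0):
    print(f"{ghz} GHz -> {miss_penalty_cycles(ghz):.0f} cycles per miss")
```

The same 100 ns miss that cost 100 cycles at 1 GHz costs 400 cycles at 4 GHz, which is one common way to formalize the professor's statement.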

How to make an HTML file available offline with AppCache

We have an online HTML file that is accessible through the browser. We want it to be available offline, and after looking into it, the best option seemed to be AppCache with a manifest file.

I believe I have done all the steps correctly, but it still doesn't preload all the files for the HTML page. Here is what I have done:

1. Updated the html tag to read

<html manifest="location of manifest file">

2. Created a manifest file listing the individual files

CACHE MANIFEST

1360_VT_04data60_VT_03.js

1360_VT_04data60_VT_03.swf

1360_VT_04data60_VT_03.xml

1360_VT_04data60_VT_03_core.xml

etc….

3. Edited the .htaccess file with

AddType text/cache-manifest .manifest
ExpiresByType application/x-web-app-manifest+json "access plus 1 year"
ExpiresByType text/cache-manifest "access plus 1 year"

Any ideas why this is not working?
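For reference, a minimal working page looks like the sketch below (`offline.appcache` is an assumed filename; substitute your own path). One common cause of the symptom described: if any single file listed in the manifest fails to download (e.g. returns a 404), the browser discards the entire cache update, so nothing gets preloaded.

```html
<!-- index.html: the manifest path is resolved relative to this page.
     "offline.appcache" is an assumed filename. -->
<!DOCTYPE html>
<html manifest="offline.appcache">
<head><title>Offline demo</title></head>
<body>Works offline once the manifest has been downloaded.</body>
</html>
```

The manifest's first line must be exactly `CACHE MANIFEST`, it must be served with the `text/cache-manifest` MIME type, and the referencing page itself is cached implicitly. Note also that AppCache is deprecated in favor of service workers, so this approach only works in older browsers.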

A cache has 64K lines where each line can store 8 blocks of the memory at a time

A cache has 64K lines, where each line can store 8 memory blocks at a time. If the size of the memory is 2 GB and the block size is 4 bytes, which cache line would hold the memory block numbered 256, assuming the cache uses direct mapping? Also compute the total number of blocks this memory has and the total number of blocks the cache can store at a time.
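One way to work the numbers, assuming "64K" means 2^16 lines and that the 8 blocks in a line are 8 consecutive memory blocks (i.e. an effective 32-byte line):

```python
# Worked numbers for the stated problem, under the assumptions above.

MEM_SIZE = 2 * 2**30      # 2 GiB
BLOCK_SIZE = 4            # bytes
LINES = 2**16             # 64K lines
BLOCKS_PER_LINE = 8

total_mem_blocks = MEM_SIZE // BLOCK_SIZE   # 2**29 = 536,870,912 blocks
cache_blocks = LINES * BLOCKS_PER_LINE      # 2**19 = 524,288 blocks

# Direct mapping over 32-byte lines: group block 256 into its
# line-sized chunk, then take that chunk number modulo the line count.
block_no = 256
line = (block_no // BLOCKS_PER_LINE) % LINES  # (256 // 8) % 65536 = 32

print(total_mem_blocks, cache_blocks, line)
```

If instead the mapping is done per 4-byte block (each of the 8 slots treated independently), the answer would be 256 mod 65536 = 256; which interpretation is intended depends on how the course defines the mapping.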

Algorithm to figure out dependencies in a graph, resolving them from a cache (dictionary) of already-walked paths

I have a graph like this that starts from one top node and has cycles:

(graph image not included)

I need to write an algorithm to determine whether node1 depends on node2. The most primitive algorithm I've written simply starts at node1 and recursively follows all available edges looking for node2. It's very inefficient because I traverse the graph over and over again. I'm wondering if there's an algorithm that caches the nodes and dependencies it has already walked through, so that I don't revisit paths I've walked once and can answer dependency queries from the cache?

For example, if I'm given node D and asked whether it depends on node F, I'll walk D->E->F; the next time I'm asked whether node E depends on F, I'll answer from the cache without walking the graph.

Thanks for any ideas and suggestions!
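One way to get the caching behavior described is to memoize per target node: the first query against F walks the reverse graph once and records every node that can reach F, and later queries against F become dictionary lookups. Cycles are handled by the visited set. A sketch in Python (the example graph is an assumption, loosely following the D->E->F chain above):

```python
from collections import defaultdict

def build_reverse(graph):
    """Invert edge direction: rev[d] = set of nodes that point at d."""
    rev = defaultdict(set)
    for node, deps in graph.items():
        for d in deps:
            rev[d].add(node)
    return rev

class DependencyOracle:
    def __init__(self, graph):
        self.rev = build_reverse(graph)
        self.ancestors = {}  # dst -> set of all nodes that depend on dst

    def depends(self, src, dst):
        if dst not in self.ancestors:
            # One BFS over the reversed edges answers *every* future
            # "does X depend on dst?" query for this dst.
            seen, stack = set(), [dst]
            while stack:
                node = stack.pop()
                for parent in self.rev[node]:
                    if parent not in seen:
                        seen.add(parent)
                        stack.append(parent)
            self.ancestors[dst] = seen
        return src in self.ancestors[dst]

graph = {"D": ["E"], "E": ["F"], "F": ["D"]}  # cycle D -> E -> F -> D
oracle = DependencyOracle(graph)
print(oracle.depends("D", "F"))  # graph walk happens here
print(oracle.depends("E", "F"))  # answered from the cache
```

This costs one O(V + E) traversal per distinct target node, after which every query against that target is O(1); memoizing per (src, dst) pair during a single forward DFS is trickier to get right in the presence of cycles.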

How does the cache / memory know where to return results of read requests to?

The pipeline of a modern processor has many stages that may issue read requests to main memory, e.g. when fetching the next instruction or loading a memory location into a register. How is the result of a read request returned to the right pipeline stage, given that there is more than one possible recipient? Since most CPUs access main memory via a cache hierarchy, the question becomes: how does the L1 cache know which part of the pipeline to return a result to?

I imagine that access to the L1 cache is queued, but each access presumably needs a ‘return address’. How is this typically handled?
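The usual answer is essentially what the question guesses: every outstanding request carries a tag identifying its destination (a reorder-buffer entry, a load-queue slot, the fetch unit, ...), and the cache's miss-handling structures (miss status holding registers, MSHRs) keep that tag until the data comes back. A toy Python model of the idea (all names are illustrative, not a real microarchitecture):

```python
# Toy model of the "return address" idea: each read request carries a
# destination tag; the cache records it with the outstanding miss (as an
# MSHR does) and uses it to route the reply when memory answers.

class Cache:
    def __init__(self):
        self.pending = {}       # address -> list of destination tags
        self.next_reply = []    # (tag, data) pairs routed back this cycle

    def read(self, addr, dest_tag):
        # Multiple misses to the same line coalesce into one entry.
        self.pending.setdefault(addr, []).append(dest_tag)

    def memory_reply(self, addr, data):
        # Memory answers by address; the cache fans the data out to
        # every destination tag recorded for that address.
        for tag in self.pending.pop(addr, []):
            self.next_reply.append((tag, data))

c = Cache()
c.read(0x1000, "fetch-unit")
c.read(0x1000, "load-queue-slot-3")
c.memory_reply(0x1000, b"\xde\xad")
print(c.next_reply)
```

Real designs also use these entries to merge duplicate requests for the same line, which is why the tag list (rather than a single tag) is the natural structure.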

Cache Blocks Direct Mapping

The main memory address has 18 bits (7 for tag, 7 for line and 4 for word) and each word is 8 bits. I found that the main memory capacity is 256 KB, the total number of cache lines is 128, and the total number of cache words is 128 * 16 (16 words per block/line) = 2048 words. Then what will be the size of the cache in words? I am very confused by this. I can't find a definition of "cache words". Can anyone tell me what cache words are? Thank you!
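If it helps, here is the arithmetic spelled out, reading "cache words" as the number of data words the cache can hold (a minimal sketch; the field widths are taken from the question):

```python
# Address: 7 tag bits + 7 line bits + 4 word bits = 18 bits, 8-bit words.

WORD_BITS = 8
TAG, LINE, WORD = 7, 7, 4

mem_words = 2 ** (TAG + LINE + WORD)     # 2**18 = 262,144 addressable words
mem_bytes = mem_words * WORD_BITS // 8   # 262,144 bytes = 256 KB
lines = 2 ** LINE                        # 128 cache lines
words_per_line = 2 ** WORD               # 16 words per line
cache_words = lines * words_per_line     # 2,048 words of data storage

print(mem_bytes, lines, cache_words)
```

Under that reading, the 2048 words you computed *is* the cache size in words (2 KB of data, not counting the tag bits stored alongside each line).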

direct-mapped cache

The problem is as follows:

For a direct-mapped cache design with a 64-bit address, the following bits of the address are used to access the cache (1 word = 64 bits): Tag: 63-10, Index: 9-5, Offset: 4-0.

I figured out that each block has a size of 4 words and the cache has 32 lines. Here's the additional part of the problem:

For each reference, complete the following table, where "Replace" indicates which bytes were replaced, if any. (The reference table is not reproduced here.)

And I got a solution, but I can't understand why 0x1e is a miss. As I understand it, to determine hit or miss, I compare the index, then the tag; if a block with that index and tag is already resident, it's a hit. But the only references before 0x1e whose tag and index are both 0x0 are 0x00, 0x04 and 0x10. Why is 0x1e a miss?
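It may help to decompose each reference mechanically, using the field widths from the problem (5 offset bits, 5 index bits, the rest tag):

```python
# Split a byte address into (tag, index, offset) per the given layout:
# offset = bits 4-0, index = bits 9-5, tag = bits 63-10.

def split(addr):
    offset = addr & 0x1F          # 5 offset bits -> 32-byte blocks
    index = (addr >> 5) & 0x1F    # 5 index bits  -> 32 lines
    tag = addr >> 10
    return tag, index, offset

for a in (0x00, 0x04, 0x10, 0x1E):
    print(hex(a), split(a))
```

All four of 0x00, 0x04, 0x10 and 0x1e decompose to tag 0, index 0, so if those were the only accesses touching index 0, 0x1e would indeed be a hit. A miss at 0x1e usually means some reference between them in the full list had index 0 with a different tag and evicted the line; without the complete reference table it's hard to say more.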