This is a question from a Computer Architecture exam and I don’t understand how to get to the correct answer.

Here is the question:

This question deals with main and cache memory only.

Address size: 32 bits

Block size: 128 items

Item size: 8 bits

Cache Layout: 6 way set associative

Cache Size: 192 KB (data only)

Write policy: Write Back

What is the tag size in bits and what is the total number of cache bits?

To get the number of tag bits, I find that 7 bits of the address are used for the byte offset (0-127) and 8 bits are used for the set index (0-249, since 250 = 192000/128/6), leaving 17 bits of the address for the tag.

The part I don’t understand is the second part. To find the total number of bits in the cache, I would take

(`valid bit` + `tag size` + `bits per block`) * `number of blocks per set` * `number of sets` = (1 + 17 + 1024) * 250 * 6 = **1,563,000**

This is not the correct answer though.
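To make sure I'm not making an arithmetic slip, here is my calculation written out as a small script (the numbers are exactly the ones from my reasoning above, including my assumption that 192 KB means 192,000 bytes):

```python
item_bits = 8
block_items = 128
bits_per_block = block_items * item_bits        # 1024 data bits per block
block_bytes = bits_per_block // 8               # 128 bytes per block
ways = 6
sets = 192_000 // block_bytes // ways           # 250, assuming 192 KB = 192,000 bytes
valid_bits = 1
tag_bits = 17

total = (valid_bits + tag_bits + bits_per_block) * ways * sets
print(total)                                    # 1563000
```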

The correct answer is 17 tag bits and **1,602,048** total bits in the cache. After trying to reverse-engineer the answer, I found that 1,602,048 = 1043 * 256 * 6, but I don’t know whether that is relevant to the solution, because I don’t know why those numbers would be used.
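Checking that factorization numerically (this just confirms the product I found by reverse engineering; I'm not claiming to know where the factors come from):

```python
# Verify the reverse-engineered factorization of the given answer
total = 1043 * 256 * 6
print(total)        # 1602048, matching the stated correct answer
print(256 == 2**8)  # True; 256 would fit an 8-bit index, but my set count was 250
```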

I’d appreciate it if someone could explain what I did wrong in my calculation that led to a different answer.

Thanks