I am hoping someone here can help me with this question. Is this posted in the correct place?
My receiver uses a TM4M16SD70 SDRAM, which I believe is 8 MB.
According to the STI5518 Register Manual, the memory refresh interval is set by writing to the VID_CFG_MCF register. It gives the following example for determining the value to use:
If 2048 rows must be refreshed every 32 ms with an SDRAM clock of 100 MHz, the following value must be stored:
32 * ( .001/2048 ) * (100000000/24) = 65
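To check my understanding of the manual's example, here is the same arithmetic as a short Python sketch (the variable names are my own guesses at what the terms mean; the divide-by-24 appears to be an internal clock divider, per the manual's example):

```python
# Refresh-interval example from the STI5518 Register Manual:
# 2048 rows refreshed every 32 ms, SDRAM clock of 100 MHz,
# and the clock apparently divided by 24 for the refresh counter.
refresh_period_s = 32e-3       # 32 ms total refresh period
rows = 2048                    # rows to refresh in that period
sdram_clock_hz = 100_000_000   # 100 MHz SDRAM clock
clock_divider = 24             # divider from the manual's example

# Time per row, times refresh-counter ticks per second:
interval = (refresh_period_s / rows) * (sdram_clock_hz / clock_divider)
print(int(interval))  # 65
```

This reproduces the 65 from the manual, so at least that part I follow.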
In my case:
8 MB of rows must be refreshed every X ms with an SDRAM clock of 121.5 MHz, where X = 32 ms (I think; is this correct for this chip?),
and it uses the following formula:
interval = ((121500 * 500  1280) >> 12) and 127
(all values are decimal; ">>" means shift the result 12 bits to the right)
I am trying to understand what the different values in this formula represent.
I know that 121500 is the clock frequency in kHz, from:
frequency = clock frequency / 1000, where clock frequency = 121,500,000 Hz
Can anybody explain what the other values represent?
500 = ?
1280 = ?
How does this formula relate to the other formula?
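For comparison, if I simply plug my chip's numbers into the manual's formula, assuming the same 32 ms period and 2048 rows (both assumptions on my part), I get:

```python
# The manual's formula applied at my receiver's clock rate,
# ASSUMING the same 32 ms refresh period and 2048 rows as the
# manual's example (I don't know if that holds for my chip).
refresh_period_s = 32e-3        # assumed 32 ms
rows = 2048                     # assumed 2048 rows
sdram_clock_hz = 121_500_000    # 121.5 MHz SDRAM clock
clock_divider = 24              # divider from the manual's example

interval = (refresh_period_s / rows) * (sdram_clock_hz / clock_divider)
print(int(interval))  # 79
```

So if my assumptions are right, the register value should come out around 79; I would expect the 500 and 1280 constants in the receiver's formula to be some rearrangement that produces a similar result, but I can't see how.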
