1) The reference design RAMTEST works when localparam C1_SKIP_IN_TERM_CAL = 1; but does not work when localparam C1_SKIP_IN_TERM_CAL = 0; in addition, the current consumption rises by 500 mA (2.5 W!) when this parameter is set to 0. In the UCF file, IN_TERM is set to NONE. Is there a connection here?
Enabling calibration of the internal termination would seem to make sense, yet the design stops working. Why is that? And since the soft calibration module is enabled in any case, why does this particular calibration step increase the power consumption so dramatically?
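For reference, the UCF setting I am referring to is of this form (the net name here is illustrative; the actual names come from the MIG-generated UCF for my controller instance):

```
# Illustrative net name -- disables the on-chip input termination on the DQ pins.
NET "mcb1_dram_dq[*]" IN_TERM = NONE;
```

My question is whether this NONE setting conflicts with running the termination calibration step enabled by C1_SKIP_IN_TERM_CAL = 0.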
- The arb_time_slot settings in the reference design (i.e., RAMTEST) differ from those in example_top (generated by MIG), mainly in the priority settings. Since only one of the six available ports is used, why is this change needed?
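To be concrete about which settings I mean, the MIG output encodes each arbitration time slot as an octal value whose 3-bit digits name the ports in priority order. The values below are purely illustrative, not the actual ones from RAMTEST or example_top:

```verilog
// Illustrative only -- not the actual RAMTEST/example_top values.
// Each 3-bit octal digit names a port; one slot defines the arbitration
// order for one round. With 6 ports there are 12 such slots.
localparam C1_ARB_NUM_TIME_SLOTS = 12;
localparam C1_ARB_TIME_SLOT_0    = 18'o012345;
localparam C1_ARB_TIME_SLOT_1    = 18'o123450;
```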
- May I also ask you to explain the following settings in a bit more detail:
localparam C1_INCLK_PERIOD = 20833; // OK: system clock period, 20833/1000 = 20.833 ns -> 48 MHz
localparam C1_CLKOUT0_DIVIDE = 1; // 624 MHz system clock // Is the PLL configured to generate 624 MHz from the 48 MHz input?
localparam C1_CLKOUT1_DIVIDE = 1; // Is this output used at all?
localparam C1_CLKOUT2_DIVIDE = 4; // 156 MHz test bench clock // Are we using it?
localparam C1_CLKOUT3_DIVIDE = 8; // 78 MHz calibration clock // Is this the clock used by the soft calibration module?
localparam C1_CLKFBOUT_MULT = 26; // What is this used for?
localparam C1_DIVCLK_DIVIDE = 2; // What is this used for?
Also, the line localparam C1_INCLK_PERIOD = ((C1_MEMCLK_PERIOD * C1_CLKFBOUT_MULT) / (C1_DIVCLK_DIVIDE * C1_CLKOUT0_DIVIDE * 2)); is not used.
The above values have also been changed from the defaults; I assume that since MIG does not directly accept the input system clock (48 MHz in this case), the PLL has to be configured accordingly. May I ask how these values are derived?
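For what it's worth, here is my own reading of these parameters, assuming the standard PLL relationships (Fvco = Fin * CLKFBOUT_MULT / DIVCLK_DIVIDE, and Fclkout_n = Fvco / CLKOUTn_DIVIDE); the 312 MHz memory-clock figure is my assumption (half of 624 MHz), not something stated in the design:

```verilog
// Assumed PLL relationships, applied to the values above:
//   Fvco    = Fin * C1_CLKFBOUT_MULT / C1_DIVCLK_DIVIDE
//           = 48 MHz * 26 / 2              = 624 MHz
//   CLKOUT0 = Fvco / C1_CLKOUT0_DIVIDE = 624 / 1 = 624 MHz (2x memory clock?)
//   CLKOUT2 = Fvco / C1_CLKOUT2_DIVIDE = 624 / 4 = 156 MHz
//   CLKOUT3 = Fvco / C1_CLKOUT3_DIVIDE = 624 / 8 =  78 MHz
// Working the unused C1_INCLK_PERIOD formula backwards with an assumed
// memory-clock period of 3205 ps (312 MHz):
//   (3205 * 26) / (2 * 1 * 2) = 20832.5 ps ~= 20833 ps -> matches the 48 MHz input.
```

If this reading is correct, I would still like to confirm which outputs the soft calibration module and the test bench actually consume.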
Thanks a lot,