I also had problems getting the Ramtester working. When I regenerated the MIG core and tested it in my own application, it worked just fine though.
The high power consumption, or rather heat generation, of the SDRAM interface is definitely an issue. My application uses about 40% of the FPGA resources, but the device does not run particularly hot unless I use the SDRAM, in which case it runs VERY hot. I only occasionally need to grab a burst of data into the SDRAM for subsequent download to the host; I don't need to keep data in the SDRAM for longer periods of time. Ideally I would like to disable the SDRAM completely to save power and only enable it during the short periods when I'm grabbing data. However, I have not yet found a way to do this. Does anyone out there have any ideas on this?
I have not tried the Xilinx power estimation tools. Instead, to find ways to reduce power, I measured the overall power consumption of the board, which increased significantly when the MIG interface was added to the design. The first thing I tried was disabling the clock to the SDRAM interface, but it made very little difference. The big power consumers seem to be the DCI terminations. When a MIG interface is generated according to the specifications in the XEM5010 manual, the single-ended pins use the SSTL18_II_DCI I/O standard and the differential signals use DIFF_SSTL18_II_DCI. However, class II SSTL18_II_DCI includes a split-resistor termination at both the driving and receiving ends (see figure 6-80 in the Virtex-5 User Guide), which increases power consumption. I first tried changing all the unidirectional address and control signals to the SDRAM to class I SSTL18_I_DCI (figure 6-77), which has no termination at the driving end, i.e. in the FPGA. This worked fine and reduced power somewhat.
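For reference, the change amounts to editing the IOSTANDARD constraints in the UCF that MIG generates. The net names below are just an illustration of a typical MIG DDR2 pinout and will likely differ in your project, so treat this as a sketch:

```
# Hypothetical UCF excerpt; net names may differ in your MIG output.
# Address and control are unidirectional (FPGA -> SDRAM), so the
# class I standard without source-end termination is sufficient.
NET "ddr2_a[*]"   IOSTANDARD = SSTL18_I_DCI;
NET "ddr2_ba[*]"  IOSTANDARD = SSTL18_I_DCI;
NET "ddr2_ras_n"  IOSTANDARD = SSTL18_I_DCI;
NET "ddr2_cas_n"  IOSTANDARD = SSTL18_I_DCI;
NET "ddr2_we_n"   IOSTANDARD = SSTL18_I_DCI;
NET "ddr2_cs_n"   IOSTANDARD = SSTL18_I_DCI;
NET "ddr2_cke"    IOSTANDARD = SSTL18_I_DCI;
NET "ddr2_odt"    IOSTANDARD = SSTL18_I_DCI;
```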
SSTL18_I_DCI can't be used for the bidirectional dq signals, but there is a newer I/O standard, SSTL18_II_T_DCI, which behaves like a bidirectional SSTL18_I_DCI: the termination is invoked only when the port is 3-stated, i.e. in receiving mode. This looks promising, but to save FPGA power I need to figure out how to force the dq signals of the MIG interface into output mode when the SDRAM is not being accessed. Maybe they do so by default, or after ending with a write operation.
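The corresponding UCF change for the bidirectional data pins would be a one-liner; again, the net name is an assumption and may differ in your generated constraints:

```
# Hypothetical UCF excerpt; net name may differ in your MIG output.
# dq is bidirectional: SSTL18_II_T_DCI enables the FPGA-side split
# termination only while the output buffer is 3-stated (receiving).
NET "ddr2_dq[*]"  IOSTANDARD = SSTL18_II_T_DCI;
```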
Unfortunately, Xilinx has in their infinite wisdom decided that, starting from ISE version 10.1, it is no longer allowed to place SSTL18_II_T_DCI and DIFF_SSTL18_II_DCI in the same bank. In previous versions this worked just fine. Since the dqs signals are both bidirectional and differential, this makes it an impossible equation: there is not much one can do about the pinout and bank assignment of the signals on the XEM5010.
I then made an ugly, desperate attempt to run the dqs signals single-ended to and from the FPGA by setting their I/O standard to SSTL18_II_T_DCI. I simply assumed that if the negative dqs input on the SDRAM has a similar split-resistor termination, its voltage would stay somewhere in the middle if left open. I also had to change the IOBUFDS to an IOBUF directly in the ddr2_phy_dqs_iob.v file generated by MIG, though I assume there is a less ugly way to do this by setting appropriate parameters. To my surprise it actually worked, with quite a significant reduction in power, since the source-end terminations were now completely removed from the dq and dqs signals as well.
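For anyone wanting to reproduce this, the edit in ddr2_phy_dqs_iob.v is roughly the following. The port and signal names here are illustrative (written from memory, not copied from the generated file), so match them against your own MIG output:

```verilog
// Original (differential): IOBUFDS drives both dqs pins.
// IOBUFDS u_iobuf_dqs (
//   .I   (dqs_out),    // data from the PHY toward the SDRAM
//   .T   (dqs_oe_n),   // 3-state control
//   .IO  (ddr_dqs),    // positive strobe pin
//   .IOB (ddr_dqs_n),  // negative strobe pin
//   .O   (dqs_in)      // data from the SDRAM toward the PHY
// );

// Single-ended replacement: only the positive pin is used;
// the negative pin is left unconnected at the FPGA.
IOBUF u_iobuf_dqs (
  .I  (dqs_out),
  .T  (dqs_oe_n),
  .IO (ddr_dqs),
  .O  (dqs_in)
);
```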
Ugly as this fix may seem, studying the data sheet for the Micron MT47H64M16HR-3 SDRAM part used in the XEM5010 reveals that there actually is a mode register bit that defines whether the dqs signals are single-ended or differential. Setting it would make my solution less ugly, but I haven't figured out how to change this bit yet.
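If I read the JEDEC DDR2 definition correctly, this is bit A10 of the Extended Mode Register EMR(1): 0 enables the differential strobe, 1 disables DQS# and puts the device in single-ended DQS mode. The MIG controller writes EMR(1) during initialization, so in principle it should just be a matter of OR-ing that bit into whatever EMR(1) value the controller loads. A hypothetical Verilog sketch (bit position taken from my reading of the JEDEC DDR2 spec; verify it against the Micron data sheet, and note that base_emr1 stands in for wherever MIG builds its EMR(1) value):

```verilog
// Hypothetical sketch: set A10 of EMR(1) to disable DQS#,
// i.e. select single-ended DQS mode on the DDR2 device.
localparam [12:0] DQS_N_DISABLE = 13'b0_0100_0000_0000; // A10 = 1
wire [12:0] emr1 = base_emr1 | DQS_N_DISABLE; // base_emr1: MIG's EMR(1) value
```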
This is as far as I've come today. If somebody wants to continue the experiments, please do so and report the results. I'm especially interested to hear if somebody finds a way to programmatically disable the SDRAM completely and only wake it up when needed.