DDR2 SDRAM interface doesn't work?

Hi sir,

I’m using the XEM5010.

I tested the board with RAMTester-xem5010.bit (renamed to ramtest.bit).

But it doesn’t work at all!!

I have two XEM5010 boards, and the test results are the same on both.

I have a screen capture of the results, but I can’t upload it.

All:

So I have an application that I built on the pre-built MIG that comes with the XEM5010 Samples/RAMTester directory.

I cannot successfully rebuild the ramtest.bit file using this MIG example and the .ucf provided.

The ramtest.bit file, along with the associated C function, executes; however, I get memory errors.

I need a working solution. Do I need to rebuild the MIG by hand (I have moved to ISE 12.3)? Is the 266 MHz speed proven, or does this interface need to be run slower?

Suggestions please…

Peter

Peter–

We recommend building the MIG yourself.

Which version of FrontPanel do you have installed? What architecture? What’s the failure? You said you’re not able to build from the sample files. Why not?

We have successfully run the Virtex-5 at 266 MHz (and beyond).

Peter–

We just rebuilt the sample bitfile from “scratch”. That is, we built a brand new project from the files installed in the Samples directory. It worked just fine on ISE 12.2.

Quick question: does anyone know why this DDR2 interface heats up the FPGA so much? Is it because the clock is always running?

Have you done a power estimate using the Xilinx tools? That should tell you where the power consumption is coming from.

I also had problems getting the RAMTester working. After regenerating the MIG and testing it in my own application, though, it worked just fine.

The high power consumption, or rather heat generation, of the SDRAM interface is definitely an issue. In my application I use about 40% of the FPGA resources, but the device does not run particularly hot unless I use the SDRAM, in which case it runs VERY hot. I only occasionally need to grab a burst of data into the SDRAM for subsequent download to the host, and I don’t need to keep data in the SDRAM for any longer period of time. Ideally I would like to be able to completely disable the SDRAM to save power and only enable it during the short periods when grabbing data. However, I have not yet found a way to do this. Anyone out there have any ideas on this?

I have not tried the Xilinx power estimation tools, but in order to reduce power I measured the overall power consumption of the board, which increased significantly when adding the MIG interface to the design. The first thing I tried was disabling the clock to the SDRAM interface, but it made very little difference. It seems the big power consumers are the DCI terminations. When generating a MIG interface according to the specifications in the XEM5010 manual, the single-ended pins use the SSTL18_II_DCI I/O standard and the differential signals use DIFF_SSTL18_II_DCI. However, class II SSTL18_II_DCI includes a split-resistor termination at both the driving and receiving ends (see figure 6-80 in the Virtex-5 User Guide), which increases power consumption. I first tried changing all the unidirectional address and control signals to the SDRAM to class I SSTL18_I_DCI (figure 6-77), which omits the termination at the driving end, i.e. in the FPGA. This worked fine and reduced power somewhat.
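Concretely, that change goes in the .ucf. A minimal sketch of what I mean; note that the net names below are assumptions for illustration, not copied from the MIG-generated UCF, so match them (and the bracket style) to what your generated constraints actually use:

    # Sketch only -- net names are assumed, not taken from the real MIG UCF.
    # Address and control are unidirectional (FPGA -> SDRAM), so class I,
    # which omits the source-end split termination, is sufficient:
    NET "ddr2_a[*]"    IOSTANDARD = SSTL18_I_DCI;
    NET "ddr2_ba[*]"   IOSTANDARD = SSTL18_I_DCI;
    NET "ddr2_ras_n"   IOSTANDARD = SSTL18_I_DCI;
    NET "ddr2_cas_n"   IOSTANDARD = SSTL18_I_DCI;
    NET "ddr2_we_n"    IOSTANDARD = SSTL18_I_DCI;
    NET "ddr2_cs_n"    IOSTANDARD = SSTL18_I_DCI;
    NET "ddr2_odt"     IOSTANDARD = SSTL18_I_DCI;
    NET "ddr2_cke"     IOSTANDARD = SSTL18_I_DCI;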

SSTL18_I_DCI can’t be used for the bidirectional dq signals, but there is a newer I/O standard, SSTL18_II_T_DCI, which behaves like a bidirectional SSTL18_I_DCI in which the termination is invoked only when the port is 3-stated, i.e. in receiving mode. This looks promising, but in order to save FPGA power I need to figure out how to force the dq signals of the MIG interface into output mode when the SDRAM is not being accessed. Maybe they do so by default, or when ending with a write operation.
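In the .ucf that would look something like the following (again, the net name is an assumption to be matched against the generated constraints):

    # Sketch only -- assumed net name. SSTL18_II_T_DCI terminates only while
    # the buffer is 3-stated, so the termination disappears when driving:
    NET "ddr2_dq[*]"   IOSTANDARD = SSTL18_II_T_DCI;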

Unfortunately, Xilinx has in their infinite wisdom decided that, starting with ISE version 10.1, it is no longer allowed to place SSTL18_II_T_DCI and DIFF_SSTL18_II_DCI in the same bank. In previous versions it worked just fine. Since the dqs signals are both bidirectional and differential, this makes for an impossible equation, and there is not much to do about the pinout and bank assignment of the signals on the XEM5010.

I then made an ugly, desperate attempt to run the dqs signals single-ended to and from the FPGA by setting their I/O standard to SSTL18_II_T_DCI. I simply assumed that if the negative dqs input on the SDRAM had a similar split-resistor termination, its voltage would stay somewhere in the middle if left open. I also had to change the IOBUFDS to an IOBUF directly in the ddr2_phy_dqs_iob.v file generated by MIG, though I assume there is a less ugly way to do this by setting appropriate parameters. To my surprise it actually worked, with quite a significant reduction in power, since the source-end terminations were now completely removed on the dq and dqs signals as well.
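For anyone trying to reproduce this, the edit amounts to swapping the buffer primitive. A sketch of the change; the instance, port, and signal names here are illustrative assumptions, so check them against the actual IOBUFDS instance in the generated ddr2_phy_dqs_iob.v:

    // Original differential buffer (roughly -- names are assumed):
    //
    //   IOBUFDS u_iobuf_dqs (
    //     .I   (dqs_out),    // data driven onto the pad pair
    //     .T   (dqs_oe_n),   // 3-state control, 1 = receive
    //     .IO  (ddr_dqs),    // positive pad
    //     .IOB (ddr_dqs_n),  // negative pad
    //     .O   (dqs_in)      // data received from the pad
    //   );
    //
    // Single-ended replacement; ddr_dqs_n is no longer driven or sensed:
    IOBUF u_iobuf_dqs (
      .I  (dqs_out),
      .T  (dqs_oe_n),
      .IO (ddr_dqs),
      .O  (dqs_in)
    );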

Ugly as this fix may seem, studying the datasheet for the Micron MT47H64M16HR-3 SDRAM part used in the XEM5010 reveals that there actually is a mode register bit that defines whether the dqs signals are single-ended or differential. Setting it would make my solution less ugly, but I haven’t figured out how to change this bit yet.
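For what it’s worth, that bit lives in extended mode register 1 (EMR1), bit A10: setting it to 1 disables the differential DQS# input. The MIG PHY composes this word during initialization (presumably in ddr2_phy_init.v, which would be the place to patch it). A sketch of the encoding, following the JEDEC DDR2 bit assignments in the Micron datasheet; the Rtt, additive-latency, and drive-strength fields below are placeholders and must match whatever your MIG configuration already programs:

    // Sketch of the DDR2 EMR1 word with single-ended DQS selected.
    localparam [12:0] EMR1 = {
      1'b0,    // A12: Qoff = 0 (outputs enabled)
      1'b0,    // A11: RDQS disabled
      1'b1,    // A10: DQS# disabled -> single-ended DQS
      3'b000,  // A9-A7: OCD calibration exit
      1'b0,    // A6:  Rtt bit 1 (placeholder ODT setting)
      3'b000,  // A5-A3: additive latency = 0 (placeholder)
      1'b1,    // A2:  Rtt bit 0 ({A6,A2} = 01 -> 75 ohm, placeholder)
      1'b0,    // A1:  full drive strength (placeholder)
      1'b0     // A0:  DLL enabled
    };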

This is as far as I’ve come today. If somebody wants to continue the experiments please do so and report the results. I’m especially interested if somebody finds a way to programmatically disable the SDRAM completely and only wake it up when needed.

Erik

I would like to pick up this two-year-old thread again.

I’m using the XEM5010 in a camera system inside a closed box, with limited possibilities to dissipate heat.

Normally it works fine, but if I enable the SDRAM interface the heat emission from the FPGA increases significantly, to an unacceptably high level. However, I don’t use the SDRAM during normal operation; I only occasionally need to grab a burst of data at high speed for subsequent transfer to the host. I only need to keep data in the SDRAM for a few seconds, and I would like to find a way to disable the SDRAM completely to save power, waking it up only for a few seconds at a time when needed. Anyone have any ideas on how to achieve this?


Erik–

Please contact Xilinx on this issue. They may be able to guide you.

The heat dissipation is most significantly due to the on-FPGA signal termination. I’m not sure whether this can be disabled at runtime, but the folks at Xilinx would know.

You can certainly shut down the SDRAM, disable refreshes, and so on. But I think that would have only a fairly small effect on power consumption and heat dissipation relative to the on-chip termination.
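If you do want to try parking the device, self-refresh is the usual mechanism: take CKE low while issuing a REFRESH command, and the DRAM then retains its contents internally while ignoring the external clock. The MIG controller doesn’t expose a self-refresh request, so this would mean muxing control of CKE and the command pins away from the PHY while parked. A rough sketch only, with all signal names assumed and the timing details (tXSNR on exit, quiescing the controller first) left out:

    // Rough sketch of DDR2 self-refresh entry/exit, assuming the command
    // and CKE pins (declared as regs here) are muxed away from the MIG PHY.
    always @(posedge clk) begin
      if (park_req) begin
        // Entry: REFRESH command (CS#=0, RAS#=0, CAS#=0, WE#=1)
        // registered on the same edge that CKE goes low.
        ddr2_cke   <= 1'b0;
        ddr2_cs_n  <= 1'b0;
        ddr2_ras_n <= 1'b0;
        ddr2_cas_n <= 1'b0;
        ddr2_we_n  <= 1'b1;
      end else begin
        // Exit: raise CKE with the device deselected (NOP), then wait
        // tXSNR (per the Micron datasheet) before any normal access.
        ddr2_cke   <= 1'b1;
        ddr2_cs_n  <= 1'b1;
      end
    end

Given the termination numbers above, though, I’d expect this to recover the DRAM’s own standby power but not the FPGA-side DCI dissipation, which seems to be the dominant term.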