BorisMeister
Programmer
I have a signal of type std_logic_vector(47 downto 0). In my test bench I write hex AAAAAAAAAAAA to a shift/latch register. The data asserts correctly in the register, and I have traced the line up to an area in the core of the chip.
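For context, the stimulus side of the bench boils down to something like this. This is a simplified sketch, assuming a serial shift interface; tb_shift_latch, serial_in, shift_clk, and latch_en are placeholder names, and the DUT instantiation is omitted:

library ieee;
use ieee.std_logic_1164.all;

entity tb_shift_latch is
end entity tb_shift_latch;

architecture sim of tb_shift_latch is
  constant C_PATTERN : std_logic_vector(47 downto 0) := X"AAAAAAAAAAAA";
  signal shift_clk : std_logic := '0';
  signal serial_in : std_logic := '0';
  signal latch_en  : std_logic := '0';
begin
  shift_clk <= not shift_clk after 10 ns;  -- free-running clock

  stim : process
  begin
    -- shift the pattern in MSB first (range is 47 downto 0)
    for i in C_PATTERN'range loop
      serial_in <= C_PATTERN(i);
      wait until rising_edge(shift_clk);
    end loop;
    -- pulse latch_en to transfer the shift register into the latch
    latch_en <= '1';
    wait until rising_edge(shift_clk);
    latch_en <= '0';
    wait;
  end process stim;
end architecture sim;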
The basic problem is this: in QuestaSim I have verified that the signal w_data_out = X"AAAAAAAAAAAA", i.e. 10101010....
The following line of code:

DATA_OUT <= w_data_out;

changes DATA_OUT (declared as out std_logic_vector(47 downto 0)) to:

101010101010101010101010101010101010U01010101010
Note the 'U' at bit 11. I have also tried mapping this bit to a single std_logic signal for observation, but it stays uninitialized there as well. Why would a single bit in the middle of a vector get corrupted?
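If it helps, the piece of the design where the corruption appears boils down to this. Again a trimmed-down sketch: the entity name core_block is illustrative, and the shift/latch logic that actually drives w_data_out is omitted:

library ieee;
use ieee.std_logic_1164.all;

entity core_block is
  port (
    DATA_OUT : out std_logic_vector(47 downto 0)
  );
end entity core_block;

architecture rtl of core_block is
  -- internal signal; the wave window shows it as X"AAAAAAAAAAAA"
  signal w_data_out : std_logic_vector(47 downto 0);
begin
  -- after this assignment, DATA_OUT(11) simulates as 'U'
  DATA_OUT <= w_data_out;
end architecture rtl;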