florindaniel
Programmer
Hello,
I've searched for this topic, but the approach I've found is about splitting a table into many small tables BY ROW;
my question is about the necessity of splitting a large table BY COLUMN.
My problem is that I have a 500-column table with, let's say, 300,000 records.
Should I use it as is, or is it better to split it into 5 tables of 101 columns each (100 parameter columns plus 1 extra
column for the primary key), each having 300,000 records? This has nothing to do with normalisation; it's simply about
500 different parameters for each record. Note that the storage space actually increases by dividing the table (due to
the extra primary key field needed in each table).
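To make the two layouts concrete, here is a minimal sketch using Python's sqlite3 (all table and column names are illustrative; 6 stand-in parameter columns replace the real 500, and 2 partitions replace the 5):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Option A: one wide table (6 parameter columns stand in for the real 500)
cur.execute("CREATE TABLE wide (id INTEGER PRIMARY KEY, "
            + ", ".join(f"p{i} REAL" for i in range(1, 7)) + ")")

# Option B: vertical partitions. Each partition repeats the primary key,
# which is the extra storage cost mentioned above (2 tables of 3 columns
# here, standing in for 5 tables of 100).
for t in (1, 2):
    cols = ", ".join(f"p{(t - 1) * 3 + i} REAL" for i in range(1, 4))
    cur.execute(f"CREATE TABLE part{t} (id INTEGER PRIMARY KEY, {cols})")

# One sample record stored in both layouts
cur.execute("INSERT INTO wide VALUES (1, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6)")
cur.execute("INSERT INTO part1 VALUES (1, 0.1, 0.2, 0.3)")
cur.execute("INSERT INTO part2 VALUES (1, 0.4, 0.5, 0.6)")

# Reconstructing a full record from the partitions needs a join on the key
joined = cur.execute(
    "SELECT a.id, a.p1, a.p2, a.p3, b.p4, b.p5, b.p6 "
    "FROM part1 a JOIN part2 b USING (id)").fetchone()
wide_row = cur.execute("SELECT * FROM wide").fetchone()
print(joined == wide_row)  # the two layouts yield the same record
```

The sketch shows the trade-off in the question: the partitioned layout stores the key once per partition and needs a join to read a whole record, while the wide layout reads a record in one pass.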
Thank you,
Daniel