Tek-Tips Forums

Error: "Column count doesn't match value count" - help needed


wecc

Programmer
Aug 3, 2009
9
US
I want to upload data into a MySQL table as fast as possible, so I build one string that inserts multiple rows at once. However, when I run it I get this error: DBD::mysql::st execute failed: Column count doesn't match value count at row 1

Now, I know that could mean I have too few or too many columns, but I have checked that a thousand times and I have the right number. The script is below. Your help is greatly appreciated, thank you very much!

#!/usr/local/bin/perl
#opening narr files

use warnings;
use strict;
use Math::Trig;
use DBI;

my ($onestring, $bigstring, $variance_speed, $variance_angle, $stnd_dev);
my (@x, @y, $x, $y, $date, $hour, $dbh, $sth);
my ($lat, $longi, $theta, $rho, $R, $pi, $rads, $degs);
my ($phizero, $phi1, $phi2, $lambda0, $N, $F, $rhozero, $i);
my (@files1, $file1, $filename, @uwind, $uwind, @vwind, $vwind);
my ($resultant, $angle, $uw, $vw);

$R=6370.997;
$pi = (atan2 1,1) * 4;
$rads = $pi / 180;
$degs = 180 / $pi;
$phizero = 39 * $rads;
$phi1 = 33 * $rads;
$phi2 = 45 * $rads;
$lambda0 = -96;
$x = -5632.668;
$y = -4612.566;


$N = log(cos($phi1) / cos($phi2)) / log(tan($pi / 4 + (($phi2) / 2)) / tan($pi / 4 + (($phi1)/ 2)));
$F = cos($phi1) * (tan($pi/4 + $phi1/2))**($N) / $N;
$rhozero = ($R * $F) / ((tan($pi/4 + $phizero/2))**$N);

###Open Directory to where the NARR text files are located
opendir (DIR, '1980') or die "$!";

###Grab all the file names that contain "30m_u" and end in ".txt" and place them into @files1

$dbh=DBI->connect('dbi:mysql:narr_project','root','indent0') || die "Error opening database: $DBI::errstr\n";

@files1 = grep {/30m_u\.txt$/} readdir DIR;
foreach my $filename (@files1) {
$i = 0;
print "Uploading " . $filename . " into mysql 30m table\n";
print "---------------------------------------------------\n";

#Skip the first 11 characters of the filename, capture the next
#eight digits (the date) into $1, skip the underscore, capture the
#next four digits (the hour) into $2, and ignore the rest

if ($filename =~ /.{11}(\d{8})_(\d{4})/) {
$date = $1;
$hour = $2;
}

#$filename now equals Directory to open one of text files (This
#will change to get the other text files one at a time as the foreach
#loop (above) cycles through. All below is inside that loop and
#repeats as the foreach loop cycles through each time

$filename = "1980\\" . $filename;

#Opens the file with the data. First will be the U wind data

open(FILE, $filename) or die "$!";
@uwind = <FILE>;

$filename =~ s/u.txt/v.txt/;

open(FILE, $filename) or die "$!";
@vwind = <FILE>;
$y=-4612.566;
#for ($y=-4612.566; $y<=4300; $y=$y+32.463) {
for ($x=-5632.668; $x<=-5600; $x=$x+32.463) {

$i = $i + 1;


if ($uwind[$i] < 100 && $vwind[$i] < 100) {

$rho = sqrt($x*$x + ($rhozero - $y)*($rhozero - $y));
$theta = atan2($x,($rhozero - $y));
$lat = (2*atan(($R*$F/$rho)**(1/$N)) - $pi/2) * $degs;
$longi = (($theta/$N) * $degs) + $lambda0;

$resultant = sqrt($uwind[$i]**2 + $vwind[$i]**2);
$angle = (atan2($vwind[$i],$uwind[$i])) * $degs;
if ($angle<0) {
$angle=$angle + 360;
}

$variance_speed = 0;
$variance_angle = 0;

$onestring = 1000000 + $i . ",'" . $date . "','" . $hour . "'," . int($lat*100)/100 . ',' . int($longi*100)/100 . ',' . int($x*100)/100;
$onestring = $onestring . ',' . int($y*100)/100 . ',' . int($uwind[$i]*100)/100 . ',' . int($vwind[$i]*100)/100 . ',' . int($resultant*100)/100 . ',' . $variance_speed . ',' . int($angle*100)/100 . ',' . $variance_angle;
print $onestring . "\n";
if ($bigstring) {
$bigstring = $bigstring . ',(' . $onestring . ')';
}
else {
$bigstring = '(' . $onestring . ')';
}

print $bigstring . "\n";
} # if
} # x
$sth=$dbh->prepare
("INSERT INTO 30m (ID, DATE, TIME, LATITUDE, LONGITUDE, X, Y, U, V, RESULTANT, VARIANCE_RESULTANT, ANGLE, VARIANCE_ANGLE)
VALUES ($bigstring)");
$sth->execute () || die "Couldn't insert record : $DBI::errstr";
print "Data loaded into database\n";
#} # y
}
close(FILE);
close DIR;
 
Ok, a few thoughts here.

1) If you want fast then why not bulk load rather than using perl at all?
2) If you want to use perl then why not "prepare" an insert statement with placeholders for the values and repeatedly "execute" it with consecutive rows? (I'm guessing you're gonna say "speed" to this one).
3) Is it possible that any of your values are "undef"?

Personally, I don't like to bypass placeholders: they manage data types and help (to some extent) protect against SQL injection, so I always feel uncomfortable seeing code throw a raw text string at the DB.
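For reference, a minimal sketch of the placeholder approach (option 2), assuming the same 13-column `30m` table and connection details from the original script; `@rows` is a stand-in for the row data built in the main loop:

```perl
#!/usr/local/bin/perl
use strict;
use warnings;
use DBI;

# Connect as in the original script.
my $dbh = DBI->connect('dbi:mysql:narr_project', 'root', 'indent0',
                       { RaiseError => 1 }) or die $DBI::errstr;

# Prepare once: one "?" placeholder per column, 13 in total, so a
# count mismatch is caught per row instead of in one giant string.
my $sth = $dbh->prepare(q{
    INSERT INTO 30m (ID, DATE, TIME, LATITUDE, LONGITUDE, X, Y, U, V,
                     RESULTANT, VARIANCE_RESULTANT, ANGLE, VARIANCE_ANGLE)
    VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
});

my @rows = ();   # fill with 13-element arrayrefs from the main loop

# Execute repeatedly, one row per call; DBI quotes each value for you.
foreach my $row (@rows) {
    $sth->execute(@$row);
}

$dbh->disconnect;
```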

Just my thoughts and personal preferences.



Trojan.
 
Suggest that, as you have 30M rows, you do all the calculations in Perl to create a flat file for loading (possibly CSV format, or whatever the MySQL bulk loader supports), and then just slurp it into the database with a bulk load from there. It will be way faster.
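A sketch of that approach, assuming a comma-separated dump file and MySQL's LOAD DATA LOCAL INFILE (the filename narr_30m.csv is illustrative, and `@rows` again stands in for the computed rows):

```perl
use strict;
use warnings;
use DBI;

my @rows = ();   # fill with 13-element arrayrefs from the main loop

# Write each computed row out as CSV instead of building an INSERT string.
open my $out, '>', 'narr_30m.csv' or die $!;
foreach my $row (@rows) {
    print $out join(',', @$row), "\n";
}
close $out;

# Then slurp the whole file in with one bulk-load statement.
# (mysql_local_infile=1 enables LOCAL INFILE in DBD::mysql.)
my $dbh = DBI->connect('dbi:mysql:narr_project;mysql_local_infile=1',
                       'root', 'indent0', { RaiseError => 1 });
$dbh->do(q{
    LOAD DATA LOCAL INFILE 'narr_30m.csv'
    INTO TABLE 30m
    FIELDS TERMINATED BY ','
    LINES TERMINATED BY '\n'
    (ID, DATE, TIME, LATITUDE, LONGITUDE, X, Y, U, V,
     RESULTANT, VARIANCE_RESULTANT, ANGLE, VARIANCE_ANGLE)
});
```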

Steve

[small]"Every program can be reduced by one instruction, and every program has at least one bug. Therefore, any program can be reduced to one instruction which doesn't work." (Object::perlDesignPatterns)[/small]
 
Ok, I have decided to bulk load. I will have multiple files to load, but I want the data to be averaged. Is it possible to bulk load one file into MySQL, then take that data and average it with the next file that is about to be bulk loaded? How would I do this, if so? Thank you for all the help, too! It is much appreciated!
 
Why would bulk loading affect your averaging calcs?

Surely you just run your code "as is" but alter it first to write the data out to flat files (as CSV or similar) and then manually run a bulk loader on the result.

I don't see how bulk loading the data affects how you average your data.

Maybe I'm missing something obvious ... [ponder]
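One way to keep the loading and the averaging separate, as above: bulk load every file into a staging table and let MySQL do the averaging afterwards. This is only a sketch; `narr_staging` is a hypothetical table with the same columns as `30m`, and grouping by grid point (X, Y) is an assumption about how the files should be averaged together:

```perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:mysql:narr_project', 'root', 'indent0',
                       { RaiseError => 1 });

# After bulk-loading every file into narr_staging, average the wind
# values per grid point into the final table in one SQL statement.
$dbh->do(q{
    INSERT INTO 30m (ID, DATE, TIME, LATITUDE, LONGITUDE, X, Y, U, V,
                     RESULTANT, VARIANCE_RESULTANT, ANGLE, VARIANCE_ANGLE)
    SELECT MIN(ID), MIN(DATE), MIN(TIME), LATITUDE, LONGITUDE, X, Y,
           AVG(U), AVG(V), AVG(RESULTANT), AVG(VARIANCE_RESULTANT),
           AVG(ANGLE), AVG(VARIANCE_ANGLE)
    FROM narr_staging
    GROUP BY LATITUDE, LONGITUDE, X, Y
});
```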


Trojan.
 
Status
Not open for further replies.