We have several clients who send us comma-separated .csv flat files. These files may have text qualifiers (") around the text fields. The problem we are having is that parsing these rows doesn't always come out correctly. We have gotten to the point of using Excel to parse the fields, but this has introduced a new problem: when a value such as "10-23" is entered into an Excel sheet, Excel converts it to a date, which then becomes too large to insert into the target table in SQL Server. We have commas and double quotes embedded throughout our data, so parsing it is difficult. The SSIS package uses bulk load to insert the records, and it creates the table based on our data dictionary, which may have fewer columns than the parsed file once the data is split on the delimiter.

Does anyone know if we can force Excel to treat everything as text so it doesn't reformat any of the data?
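For illustration, here is a minimal sketch (the sample row is made up) of the kind of data we receive and how a parser that honors the text qualifier is supposed to split it: embedded commas stay inside quoted fields, and embedded double quotes are escaped by doubling them:

```python
import csv
import io

# A made-up sample row of the kind we receive: quoted fields containing
# embedded commas and doubled double-quotes, plus a "10-23" value that
# Excel would reinterpret as a date.
sample = '123,"Smith, John","He said ""hello""",10-23\n'

# csv.reader with quotechar='"' keeps embedded commas/quotes intact.
reader = csv.reader(io.StringIO(sample), delimiter=',', quotechar='"')
row = next(reader)
print(row)  # ['123', 'Smith, John', 'He said "hello"', '10-23']
```

The point is that "10-23" survives as a plain string here, whereas Excel rewrites it on entry.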