
Search results for query: *

  1. ehenry

    Transaction log issues

Changed autogrowth from 1 to 5000, unrestricted. The .mdf is 35.5 GB. There are only 2 tables in the database; one contains the metadata. They are cleared before the monthly insert, but the size of this file will slowly increase over time. I will work on Instant File Initialization and...
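For reference, an autogrowth change like the one described above can also be made in T-SQL; a minimal sketch, assuming hypothetical database and logical file names MyDB and MyDB_log:

```sql
-- Grow the log in 5000 MB increments with no upper limit.
-- MyDB and MyDB_log are placeholders; find the logical file name with:
--   SELECT name FROM sys.database_files;
ALTER DATABASE MyDB
MODIFY FILE (NAME = MyDB_log, FILEGROWTH = 5000MB, MAXSIZE = UNLIMITED);
```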
  2. ehenry

    Transaction log issues

~223 million rows took 25 min to load into the raw table, and another 25 to move into the actual table. It used to take well over an hour for each. Huge performance increase :)
  3. ehenry

    Transaction log issues

Worked great, thanks for the help. The bulk insert with BATCHSIZE, ROWS_PER_BATCH, and TABLOCK did not add anything to the transaction log... Inserting from the raw table into the actual table with no index and using TABLOCK kept the log growth minimal. Much appreciated.
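A sketch of the kind of statements being described, with a hypothetical file path, batch sizes, and target table name (only raw_pden_prod appears in the thread): BATCHSIZE commits in chunks, ROWS_PER_BATCH is an optimizer hint, and TABLOCK is one of the prerequisites for minimally logged bulk loads into a heap.

```sql
-- Load the flat file into the unindexed raw table
-- (path and batch sizes are placeholders).
BULK INSERT raw_pden_prod
FROM 'D:\data\pden_prod.txt'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    BATCHSIZE       = 1000000,   -- commit every 1M rows
    ROWS_PER_BATCH  = 223000000, -- hint: approximate total row count
    TABLOCK                      -- needed for minimal logging on a heap
);

-- Move rows into the real table (pden_prod is a placeholder name);
-- INSERT...SELECT with TABLOCK into a heap with no indexes can also be
-- minimally logged under the simple recovery model.
INSERT INTO pden_prod WITH (TABLOCK)
SELECT * FROM raw_pden_prod;
```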
  4. ehenry

    Transaction log issues

    I will experiment with this. Thanks for all the input.
  5. ehenry

    Transaction log issues

Here is the procedure, assuming no indexing on the tables:

    IF EXISTS (SELECT id FROM sysobjects
               WHERE name LIKE 'raw_pden_prod' AND xtype = 'u')
        DROP TABLE raw_pden_prod
    GO
    CREATE TABLE raw_pden_prod (
        ENTITY_ID varchar(max),
        PROD_DATE varchar(max),
        LIQ       varchar(max),
        GAS       varchar(max)...
  6. ehenry

    Transaction log issues

Thanks for the response... Yes, I am using a BULK INSERT. So if I specify ROWS_PER_BATCH while in simple recovery mode, will the log be truncated after each batch?
  7. ehenry

    Transaction log issues

The database is in Simple recovery mode... sorry for the confusion. I wasn't saying I did not want to put it in simple mode; I was just wondering if there is a way to prevent writing to the log at all, or to minimize what SQL Server writes to the log. When I run the bulk insert command the transaction...
  8. ehenry

    Transaction log issues

Thanks for the response... I figured as much, since SQL Server must use the log to operate. The log file is on its own drive. I am going to split the file, insert smaller batches, and truncate the log after each batch as you suggested. Thank you.
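One way to sketch the per-batch log maintenance: under the simple recovery model the inactive portion of the log is truncated at each checkpoint, so a checkpoint after each batch (and, if needed, a shrink) keeps the file small. The logical file name and target size below are placeholders:

```sql
-- After committing each batch, force a checkpoint so the inactive
-- portion of the log can be reused (simple recovery model).
CHECKPOINT;

-- Optionally shrink the physical log file back down
-- (MyDB_log is a placeholder logical name; target size is in MB).
DBCC SHRINKFILE (MyDB_log, 1024);
```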
  9. ehenry

    Transaction log issues

I have a table that I update in my database once a month. The data comes from a 7+ GB flat file, and I usually end up with well over 100 million rows in a table with 5 columns. My problem is that during this insert the transaction log blows up to over 100 GB. After the insert I truncate the log. I...
  10. ehenry

    Bulk Import Issues with large files

Yeah, I added both the field and row terminators and it seems to be working. My mind slipped; I was thinking ',' was the default for the field terminator. Thanks for the help, guys.
  11. ehenry

    Bulk Import Issues with large files

Added the row terminator '\n' to the bulk insert and it worked with the smaller file. It seems to be working with the larger file as well. I thought the default would have worked.
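A minimal sketch of the statement with explicit terminators (the table name and path are placeholders). Note that files with bare Unix-style LF line endings may need ROWTERMINATOR = '0x0a' rather than '\n':

```sql
BULK INSERT tblName
FROM 'C:\data\bigfile.csv'   -- placeholder path on the server
WITH (
    FIELDTERMINATOR = ',',   -- comma-separated fields
    ROWTERMINATOR   = '\n'   -- use '0x0a' if the file has bare LF endings
);
```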
  12. ehenry

    Bulk Import Issues with large files

It's just BULK INSERT tblName FROM 'filename'. The file is on the server, so it's a regular directory path.
  13. ehenry

    Bulk Import Issues with large files

    Using the import wizard works fine on the small file and imports all rows.
  14. ehenry

    Bulk Import Issues with large files

I wrote a quick program to read through the first lines of the file and write them to a new file, which I posted above. When I try to bulk import that file I don't get any errors, but it reports 0 rows imported.
  15. ehenry

    Bulk Import Issues with large files

I have a table with 6 columns, all varchar(max). Using a straight bulk insert with default parameters.
  16. ehenry

    Bulk Import Issues with large files

221997,1973-12-01,0,2298,0,1
    221997,1974-01-01,0,881,0,1
    221997,1974-02-01,0,322,0,1
    221997,1974-03-01,0,1036,0,1
    221997,1974-04-01,0,591,0,1
    221997,1974-05-01,0,500,0,1
    221997,1974-06-01,0,527,0,1
    221997,1974-07-01,0,508,0,1
    221997,1974-08-01,0,448,0,1
    221997,1974-09-01,0,467,0,1
    ...
  17. ehenry

    Bulk Import Issues with large files

There are 5 columns: ID, Date, and 3 columns with a decimal value. No large fields. I think in this instance each column is treated as a LOB, meaning the column object would have a max size of 2 GB??
  18. ehenry

    Bulk Import Issues with large files

I am attempting to bulk insert from a CSV file into a table. The CSV is a 6 GB file with 5 columns; I am unsure of how many rows are in the data, but it is over 10 million. Any bulk insert runs into the error "Attempting to grow LOB beyond maximum allowed size". Is there any other way to import...
  19. ehenry

    Select max update time for each ID

SELECT ROW_NUMBER() OVER (PARTITION BY ID ORDER BY UPDATETIME DESC), *
    FROM TBL

    If I select all rows with row number = 1, that should be the most recent entry for each ID. Then perform the update from there.
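The two steps described above can be combined into a single statement; a sketch assuming hypothetical table and column names (staging, final, and a VALUE column to carry over):

```sql
;WITH latest AS (
    SELECT ID, UPDATETIME, VALUE,  -- VALUE stands in for the columns to copy
           ROW_NUMBER() OVER (PARTITION BY ID
                              ORDER BY UPDATETIME DESC) AS rn
    FROM staging
)
UPDATE f
SET    f.VALUE      = l.VALUE,
       f.UPDATETIME = l.UPDATETIME
FROM   final  AS f
JOIN   latest AS l ON l.ID = f.ID
WHERE  l.rn = 1;   -- rn = 1 is the most recent row per ID
```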
  20. ehenry

    Select max update time for each ID

I have a staging table with an update time column and an ID column. In the final table the ID is unique, but in the staging table there are multiple rows per ID with different update times. I need to select the max update time for each ID in the staging table to perform an update on the final table. What is...
