Monday, March 12, 2012

How to import hundreds of CSV files into SQL Server?

Hi,
I have about 500 CSV files, mostly under the same structure. I need to import them all into SQL Server. I create a new table for each one based on the first line as the column names. I can do them one by one, but it is very time-consuming and boring. Is there a faster way of doing this?

If someone gives me the information, I promise to share all of MY CSV files. They are the zone charts from UPS, and I have edited all of them.

Thanks.|||Well ... I am working on the same myself ... will let you know as soon as I figure it out.|||Really? Someone here was complaining that it's nearly impossible to get your hands on updated info for this stuff!

Anyway, you'll have to be logged on with the sysadmin fixed server role to do the following:

-- build a list of the files to import
declare @cmd varchar(8000)
create table #files (fname varchar(128) null)
insert #files exec master.dbo.xp_cmdshell 'dir <your_directory> /b'
-- xp_cmdshell returns a trailing null row; drop it
delete #files where fname is null

After this code runs you have a list of all the files that you want to import. If needed, you can incorporate a filter into your DIR command to exclude anything you don't want.
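For reference, a minimal sketch of the loop mentioned next. Hedged: it assumes each target table already exists and matches its file's layout, and the target table (dbo.ZoneChart) and folder path (C:\csv\) are hypothetical.

declare @fname varchar(128), @sql varchar(8000)

declare file_cursor cursor for
    select fname from #files

open file_cursor
fetch next from file_cursor into @fname

while @@fetch_status = 0
begin
    -- hypothetical target table and folder; FIRSTROW = 2 skips the header line
    set @sql = 'bulk insert dbo.ZoneChart from ''C:\csv\' + @fname + ''''
             + ' with (fieldterminator = '','', rowterminator = ''\n'', firstrow = 2)'
    exec (@sql)
    fetch next from file_cursor into @fname
end

close file_cursor
deallocate file_cursor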

Now you're ready to build your loop to BULK INSERT each file.|||"Now you're ready to build your loop to BULK INSERT each file."

Well .. BULK INSERT requires the table to be present, and neither of us has that luxury ... I have posted on this before and did not get a satisfactory answer ... so I need to think of something else

http://www.dbforums.com/showthread.php?threadid=980175|||I'll send you a VB app to do this...|||I was working on a DTS package using a Dynamic Properties task to pull file names from a table and update the path of a Text Input file, but getting it to iterate is the kicker...|||The main problem for me is not getting it to iterate but getting the size of the columns and the column names.

Excel uploads are better coz you can make a linked server to them.|||With a Text Input file in a DTS package, you can set it up to pull the first row as column names. Sizes are, I believe, automatic, but it probably defaults to NVARCHAR(255).|||So, how would you like to determine the field size?|||Well, I BULK INSERT into the same table, into either a TEXT or an IMAGE field. Actually, I have more than one table, and I am dealing with archiving EDI transactions coming from/going to the state. The IMAGE datatype I use to store all their PDFs and DOCs with their daily amendments, and the TEXT field for the actual transaction files. I also implemented archiving on the tables with the TEXT field, so that the database doesn't have to be terabyte-sized for two years' worth of transactions to be available.|||Dynamically, based on the data ...

In the case of an Excel file .. I am using an OPENDATASOURCE query and then issuing a SELECT * INTO ... but I am not able to do that with CSV and flat files|||but if it's a CSV, then the length of a field can change from line to line. I see that working only if it is a fixed-length-field file.|||That's exactly the problem ... and DTS is not an option for me|||Working on it...|||Is it okay if the fields default to some larger size (VARCHAR(255)), so you can then shrink them manually?|||How can you load a table that does not exist?
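(For reference, a hedged sketch of the OPENDATASOURCE approach mentioned above, using the old Jet provider; the file path, sheet name, and target table name are hypothetical:)

-- ad hoc query against an Excel file; SELECT * INTO creates the table on the fly
select *
into dbo.ZoneChartFromXls
from opendatasource('Microsoft.Jet.OLEDB.4.0',
    'Data Source=C:\csv\zones.xls;Extended Properties=Excel 8.0')...[Sheet1$]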

How are you going to know the layout to build it?

I still say that for flat files you either generate a format file and bcp it in, or load it all into a table with one varchar(8000) column and parse the data into the destination table...
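A hedged sketch of that second approach (the file path is hypothetical, and the parse shown pulls only the first field as an illustration):

-- stage each line of the file whole, then parse it out
create table #staging (line varchar(8000))

-- assumes the file has no tab characters (the default field terminator),
-- so each whole line lands in the single column
bulk insert #staging from 'C:\csv\zones.csv'
    with (rowterminator = '\n')

-- example parse: everything up to the first comma
select left(line, charindex(',', line) - 1) as first_field
from #staging
where charindex(',', line) > 0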

CSV is easy if the table's already built...

why aren't the tables built?

How are you going to derive the structure of the file?

on the fly?|||Try this.

Not too flashy, certainly not rock solid, but it should work.

Customize the connection string first, replacing <servername> and <databasename>.

Browse to the folder holding the text files (it will iterate through all files, so be sure you only have CSV files there).

It creates a table named after the text file, adds fields using the first row (all varchar(255)), and runs through the file inserting data.|||I only had two small CSV files to test with. Let me know if there are any major bugs.
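(For anyone who can't grab the attachment, a hedged T-SQL sketch of the same idea, building the CREATE TABLE from the header row; the file name, path, and table name are hypothetical:)

-- read the file via xp_cmdshell and keep the first (header) line
declare @header varchar(8000), @sql varchar(8000)

create table #lines (id int identity(1,1), line varchar(8000))
insert #lines (line) exec master.dbo.xp_cmdshell 'type "C:\csv\zone_chart.csv"'

select @header = line from #lines where id = 1

-- turn "col1,col2,col3" into a CREATE TABLE with all varchar(255) columns
set @sql = 'create table dbo.zone_chart (['
         + replace(@header, ',', '] varchar(255), [')
         + '] varchar(255))'
exec (@sql)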

If you really want, I can go through the file contents first to find the MAX size of each field before creating the table, but I want to go home tonight...|||I was working on the UPS calculator for a while and forgot everything else. I will test the zip file that bpdwork created and let you know what happens.

I have the calculator done. The database is kind of big, and I don't know how to let everyone share it. Send me a mail if you need to take a look.
