Thread: How to open a text file in a stored Procedure

1. Registered User
Join Date
Jul 2004
Location
Mars
Posts
137

How to open a text file in a stored Procedure

Hi,

I have data in text files (not in CSV format, but in a proprietary format).
My requirement is to read the text files, parse them into the corresponding data, and then store that data in the MS SQL Server database.

Is there any way to do this using a stored procedure or an extended stored procedure in SQL Server? Or is there any other way to do it in SQL Server? My database is on SQL Server 2000.

Any help will be appreciated.

With regards,
Sudar

2. Registered User
Join Date
Apr 2004
Location
Kansas City, MO
Posts
734
Well, of course you can do it in an extended stored procedure. If you have any kind of ODBC driver for the file, you can use OPENQUERY or a linked server to read it. You might want to look at using DTS for this, though.
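For the ODBC route, something like the following works on SQL Server 2000 when the Microsoft Text Driver is installed (the directory and file name here are placeholders, not from the original post). Note that this driver only understands delimited or fixed-width files, so it may not cope with a truly proprietary layout:

```sql
-- Ad hoc read of a text file through the ODBC text driver.
-- DefaultDir and the file name are illustrative placeholders.
SELECT *
FROM OPENROWSET(
    'MSDASQL',
    'Driver={Microsoft Text Driver (*.txt; *.csv)};DefaultDir=C:\data;',
    'SELECT * FROM "myfile.txt"')
```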

3. Resident Curmudgeon
Join Date
Feb 2004
Location
In front of the computer
Posts
14,905
I'll second the recommendation for DTS; that is exactly the kind of task it was designed for!

SQL Server has stayed with a cleaner model for Transact-SQL than some of the other database vendors... Instead of trying to turn Transact-SQL (the database management language) into a full-blown programming language, the designers have kept it for data declaration and management. This inhibits users from doing naughty things like user interaction within the scripting language, which I see as a good thing.

To work around this limitation, Microsoft has provided DTS. DTS gives the user a substantial subset of Visual Basic, with easy access to both the local server and other data sources.

-PatP

4. Registered User
Join Date
Jul 2004
Location
Mars
Posts
137

I cannot use OPENQUERY. Please help.

Hi,

I found out that I cannot use OPENQUERY for this. My requirement is something like the fopen function in C:
I want to read the data from the file line by line, apply my business rules to extract the values, and then store only the extracted data in the database.

Any help will be appreciated.

With regards,
Sudar

5. Registered User
Join Date
Feb 2004
Posts
6
Here is a way that may work (note the EXEC is required to insert the output of xp_cmdshell):

create table #out (output varchar(255) null)
insert #out exec master..xp_cmdshell 'type c:\text.txt'
select * from #out
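If something closer to fopen-style, line-by-line reading is needed, SQL Server 2000's OLE Automation procedures can drive the Scripting.FileSystemObject. A sketch (the path is a placeholder, the sp_OA* procedures must be allowed, the SQL Server service account needs read access to the file, and error handling is omitted):

```sql
-- fopen-like line-by-line access via OLE Automation.
DECLARE @fso int, @file int, @hr int, @line varchar(8000), @eof int

EXEC @hr = sp_OACreate 'Scripting.FileSystemObject', @fso OUT
EXEC @hr = sp_OAMethod @fso, 'OpenTextFile', @file OUT, 'c:\text.txt'

EXEC sp_OAGetProperty @file, 'AtEndOfStream', @eof OUT
WHILE @eof = 0
BEGIN
    EXEC sp_OAMethod @file, 'ReadLine', @line OUT
    PRINT @line    -- parse and INSERT here instead of printing
    EXEC sp_OAGetProperty @file, 'AtEndOfStream', @eof OUT
END

EXEC sp_OAMethod @file, 'Close'
EXEC sp_OADestroy @file
EXEC sp_OADestroy @fso
```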

6. Window Washer
Join Date
Nov 2002
Location
Jersey
Posts
10,322
I think I'd create the table with an identity column and use bcp.

Then "move" the data with an INSERT to the final table...

Why do you need to do it row by row?
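A sketch of that approach, with placeholder names (MyDb, MyServer, FinalTable) and a trusted connection assumed:

```sql
-- Staging table: the identity column records the original line order,
-- since bcp does not otherwise guarantee it.
CREATE TABLE Staging (
    LineNo int IDENTITY(1,1) NOT NULL,
    Line   varchar(8000)     NULL
)
GO

-- The data file has one field but the table has two columns, so load
-- through a view that exposes only Line (a format file also works).
CREATE VIEW StagingLine AS SELECT Line FROM Staging
GO

EXEC master..xp_cmdshell
    'bcp MyDb.dbo.StagingLine in c:\text.txt -c -T -SMyServer'
GO

-- Then "move" the data with an INSERT to the final table,
-- parsing each line with SUBSTRING/CHARINDEX as needed.
INSERT INTO FinalTable (RawLine)
SELECT Line
FROM Staging
ORDER BY LineNo
```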

7. Resident Curmudgeon
Join Date
Feb 2004
Location
In front of the computer
Posts
14,905
Originally Posted by Sudar
I found out that I cannot use OPENQUERY for this. My requirement is something like the fopen function in C:
I want to read the data from the file line by line, apply my business rules to extract the values, and then store only the extracted data in the database.
This would be a perfect application for DTS. That is exactly what DTS is supposed to do.

-PatP

8. Registered User
Join Date
Feb 2004
Posts
490
As an alternative, you could also try the BULK INSERT statement and specify a format (.fmt) file if necessary. I'd try the DTS option first, though.
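For a file with no usable column delimiters, a minimal sketch is to BULK INSERT every line into a one-column table (path and table name are placeholders); a .fmt file only becomes necessary once there are multiple columns to map:

```sql
-- Load the whole file, one line per row. With a single column the
-- field terminator is irrelevant; each row ends at the newline.
CREATE TABLE RawLines (Line varchar(8000) NULL)

BULK INSERT RawLines
FROM 'c:\text.txt'
WITH (ROWTERMINATOR = '\n')
```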

9. Window Washer
Join Date
Nov 2002
Location
Jersey
Posts
10,322
Explain to me again how you easily release DTS to a production environment from dev with no alterations?

10. Resident Curmudgeon
Join Date
Feb 2004
Location
In front of the computer
Posts
14,905
Originally Posted by Brett Kaiser
Explain to me again how you easily release DTS to a production environment from dev with no alterations?
There is a "Save As" option in the DTS designer. It's a piece of cake!

It doesn't sound to me like this is a scheduled production job, it sounds a lot more like a one-time data load to me. If it is a one-time shot, the appropriate values can be coded directly into the DTS package. If it is a scheduled job, the package needs to read the appropriate data from a data source like a table, then process it according to the business rules.

I don't see this as rocket science. Am I missing something?

-PatP

11. Window Washer
Join Date
Nov 2002
Location
Jersey
Posts
10,322
Well....we isolate our prod boxes....we have 1 group that supports all of them...

And whilst I have sa access to my dev boxes, I don't have sa to prod....

keeps the developers from building apps using sa...so save as is not an option

I guess dts can be saved as a different owner though...see there's my ignorance showing again...

I still don't like DTS...

And "it's only a one time thing"...

Man, if that ain't the biggest lie in the business....

I'd still go with a sproc...it might be harder at first....but it's way more flexible...

Only time I'd use DTS would be if I had to read excel or access...

And even in those cases, I'd make the data deliverer save it as tab delimited...before I had to deal with it...

12. Registered User
Join Date
Jul 2004
Location
Mars
Posts
137

Opening the text file

Hi,

Thanks for your replies...

I think my last post was not clear, so here is a clearer (and longer) post.

I have a text file with data values inside it, as shown:

===================================== Extract From the File ===================================
Cache Copy Read Hits %()(B) 0.947033882141 1 0.92694246769 1 0.892471790314 1 0.820934414864 1
3(F)Data Map Hits %(*)(B) 0.987548351288 1 0.993828952312 1 0.98747831583 1 0.982723712921 1
2(B)Disk Read Time(2)C:(B) 0 18.7727622986 1 10.976811409 1 39.6991119385 1 35.7257003784 1
=================================================================================================

Here each line contains the data values for a particular parameter, one value per time frame. The parameter name comes first, followed by a '0', and then the values for the time frames, separated by '1'. This is just an extract from the file; the actual file contains more than 1000 parameters, with more than 100 values for each parameter. I have to read the text file, parse the data, and store it in different tables as I have shown below.

Cache Copy Read Hits
--------------------
Time Frame Value
--------- -----
1 0.947033882141
2 0.92694246769
3 0.892471790314
4 0.820934414864

Data Map Hits %
---------------
Time Frame Value
---------- ------

1 0.987548351288
2 0.993828952312
3 0.98747831583
4 0.982723712921

Disk Read Time C:
-----------------
Time Frame Value
---------- -----

1 18.7727622986
2 10.976811409
3 39.6991119385
4 35.7257003784

I definitely cannot use DTS for this because, first, the file is not a standard data source, and second, it is not in CSV (Comma Separated Value) format.
I have to read each line from the file one at a time, parse it, extract the values, and store them in the table. I have already written the code to parse a line and store it in the table, provided the text of the file is hard-coded in the stored procedure as a varchar variable. Now what I need is to automatically read the text file and retrieve each line separately, as we do in C using the 'fopen' function. I heard that I can do this using an extended stored procedure, but I couldn't find out how.

Any help will be appreciated.

with regards
Sudar
Last edited by Sudar; 08-24-04 at 02:02.

13. Registered User
Join Date
Feb 2004
Posts
490
I'd go for a stored procedure in which the input file is either BULK INSERTed or bcp-ed into a table with, say, a single column (if no column separation is possible). From that point on, you can process the table, reading each line, parsing it, etc.

14. Resident Curmudgeon
Join Date
Feb 2004
Location
In front of the computer
Posts
14,905
DTS doesn't care if the file is a data source. DTS was designed to handle exactly this kind of problem.

You could write a program in 6502 assembler to run on a Commodore 64 to read the data via a serial port, process it, and return the results back via the serial port. This would be a lot more work, but it certainly is possible.

Let's go with an idea that is easier than the 6502 assembler so we'll get done in your lifetime, but enough harder than DTS so that you'll feel good about having done "enough" work instead of doing it the easy way.

Create a table using CREATE TABLE with a single varchar column that is wider than the lines of your text file. Import the data into that table using BULK INSERT. Once you've got the text file loaded into the table, parse it using Transact-SQL as you see fit.

You should be done in only three to five times as long as it would have taken using DTS!

-PatP

15. Window Washer
Join Date
Nov 2002
Location
Jersey
Posts
10,322
Here....I'll do the first one...

Code:
USE Northwind
GO

CREATE TABLE myTable99(Col1 varchar(8000))
GO

INSERT INTO myTable99(Col1)
SELECT '===================================== Extract From the File ===================================' UNION ALL
SELECT '1234567890123456789012345678901234567890123456789012345678901234567890123456789012345678901234567890' UNION ALL
SELECT 'Cache Copy Read Hits %()(B) 0.947033882141 1 0.92694246769 1 0.892471790314 1 0.820934414864 1' UNION ALL
SELECT '3(F)Data Map Hits %(*)(B) 0.987548351288 1 0.993828952312 1 0.98747831583 1 0.982723712921 1' UNION ALL
SELECT '2(B)Disk Read Time(2)C:(B) 0 18.7727622986 1 10.976811409 1 39.6991119385 1 35.7257003784 1' UNION ALL
SELECT '================================================== ==============================================='
GO

EXEC master..xp_cmdshell 'bcp Northwind.dbo.myTable99 out d:\myTable99.dat -SNJROS1D151\NJROS1D151DEV -T -c'
GO

SELECT * FROM myTable99
TRUNCATE TABLE myTable99
SELECT * FROM myTable99
GO

EXEC master..xp_cmdshell 'bcp Northwind.dbo.myTable99 in d:\myTable99.dat -SNJROS1D151\NJROS1D151DEV -T -c'
SELECT * FROM myTable99
GO

CREATE TABLE Cache(Col1 int IDENTITY(1,1), Col2 varchar(25))
GO

SET NOCOUNT ON
DECLARE @start int, @end int, @value varchar(25)

SELECT @start = CHARINDEX('(B)', Col1) + 4
     , @end   = CHARINDEX(' 1', Col1, CHARINDEX('(B)', Col1) + 4)
FROM myTable99
WHERE SUBSTRING(Col1, 1, 5) = 'Cache'

WHILE @end <> 0
BEGIN
    INSERT INTO Cache (Col2)
    SELECT SUBSTRING(Col1, @start, @end - @start)
    FROM myTable99
    WHERE SUBSTRING(Col1, 1, 5) = 'Cache'

    SELECT @start = @end + 3
         , @end   = CHARINDEX(' 1', Col1, @start)
    FROM myTable99
    WHERE SUBSTRING(Col1, 1, 5) = 'Cache'

    SELECT @start, @end  -- debug: show the next parse window
END

SELECT * FROM Cache
SET NOCOUNT OFF
GO

DROP TABLE Cache
DROP TABLE myTable99
GO
