SQLite Extender (Alpha? Beta?)

Started by JTaylor, May 14, 2020, 08:03:50 PM

Previous topic - Next topic

JTaylor

Wow.  I am impressed it performed that well.

Yes.  Depending on many factors, the cell limit for an array in WinBatch is around 10 million cells, but you have a lot of wide columns, so 8 million (roughly 415,000 rows x 20 columns) is not unexpected.

Glad to hear my Extender wasn't the limiting factor; I am very pleased.  Performance was pretty reasonable as well, I thought, at least up to 250,000 rows.  Eighty seconds seemed a bit long, but there is probably nothing I can do about that.

Thanks for the info.

Jim


Quote from: KeithW on May 20, 2020, 02:36:29 PM
Jim,

Here are some stats on dbGetRows() on my system, an i7 notebook with 16GB RAM and a 1TB Intel SSD (split into C/D partitions, both really full, my bad).  The system is anything but lean: many programs and a boatload of background stuff make it a busy system, but that is how I run, so I wanted to test in a realistic mode for me.

The table I was running against is just over 3.5MM rows, 20 columns, with an average row width just shy of 400 characters.  I was using a LIMIT on the SELECT to see what happened... see code...

ROWS      TIME/SECS
100       0
500       0
1000      0
1500      0
2500      0
5000      0
10000     1
25000     0
50000     2
100000    3
250000    9
500000    ?    process stopped at 415,832 lines (last line incomplete); array.txt file size 59,848,153 bytes
415000    80   process completed, 415,002 lines; array.txt file size 59,728,611 bytes

Then I got the following error trying to display the ARRAY
3096 Memory Allocation Failure, Out Of memory for strings.

Using: WinBatch 32 2020A
WIL Version 6.20ata


so I am guessing that is the max my system can chew at a time.


i = 500000
SQLText = "SELECT * FROM summ_by_Customer LIMIT %i%;"

ScriptStart = TimeYmdHms()
stmta = dbGetRows(db,SQLText,@TRUE,i + 3)
ScriptEnd = TimeYmdHms()

ArrayFilePutCsv("array.txt", stmta)  ; stmta not an Array?  Out-of-range error
Elapsed = TimeDiffSecs(ScriptEnd,ScriptStart)
Message("dbGetRows()","Processing Complete !!" : @CRLF : "Executed in: %Elapsed% Seconds" )


Keith

KeithW

Jim,

Keep in mind that the file I/O of creating array.txt is included in that time, I believe, and my full filesystem will be impacting that time as well.
I watched array.txt swell during those 80 seconds.

Keith

KeithW

Jim,

I would guess that v4 may well become a workable piece of software.  I am in the process of automating my project.  It was in a different, database-based RAD tool and I wanted to migrate it to a SQL-based tool; I have all of the support scripts for the entire process saved as separate .sql files.  The goal was to move the project to a SQL-based environment and then automate all but the two or three steps that cannot be automated, making it easier to manage on a monthly basis.  I think dbSQLite is almost ready to take on the challenge, and I certainly appreciate your attention to getting this put together, as I think it will make my life a bit easier going forward.

Regards,
Keith

stanl

Jim;


I modified my original script to create an SQLite db and table populated from a tab-delimited CSV file. The attached wakecounty.csv is tab-delimited, and I included a one-line schema.ini (I'm working on an interface to auto-create the .ini) so the ACE provider wouldn't error. The process of inserting rows from the DataTable into SQLite is a little slow and clumsy to look at. Since the DataTable and the SQLite table are identical in structure, maybe there is a way to do an INSERT INTO....FROM, or some kind of bulk copy.  Anyway, it would be a nice project to replace System.Data.SQLite with your extender. There was one bug: the original CSV had a column named Time Zone which, although it could be created in the SQLite table, produced an error during the INSERT command. I changed it to TimeZone, figuring there may be a need in SQLite to enclose multi-word column names in brackets.

Code (WINBATCH)


;Winbatch 2020A - Parsing csv for SQLite Structure
;Stan Littlefield, May 4, 2020
;Updated to Create SQLite db and table May 18
;///////////////////////////////////////////////////////////////////////////////////////////////////////////////
gosub udfs
IntControl(73,1,0,0,0)
folder = dirscript()
;file is tab-delimited and simple schema.ini included
;future plan is to create schema.ini on the fly
file=folder:"WakeCounty.csv"
If ! FileExist(file) Then Terminate(@TRUE,"Cannot Continue",file:" is missing")


BoxOpen("Please Wait","Transferring Text File To SQLite Table")


;cConn='Provider=Microsoft.ACE.OLEDB.12.0;Data Source=%folder%':';Extended Properties="TEXT;HDR=YES;FMT=Delimited"'
cConn='Provider=Microsoft.ACE.OLEDB.12.0;Data Source=%folder%':';Extended Properties="TEXT;HDR=YES"'
;Message("Connection",cConn)


db = DirScript():"WakeCounty.sqlite"
ObjectClrOption("Appbase", DirScript())
ObjectClrOption('use','System.Data.SQLite')


ObjectClrOption("useany","System.Data")
oConn = ObjectClrNew( 'System.Data.OleDb.OleDbConnection',cConn)
oConn.Open()
oCmd = ObjectClrNew('System.Data.OleDb.OleDbCommand')
cSQL = "SELECT * FROM ":file


oCmd.Connection = oConn
oCmd.CommandText = cSQL


Boxtext("Creating .NET DataTable")


oAdapter = ObjectClrNew( 'System.Data.OleDb.OleDbDataAdapter')
oTable = ObjectClrNew( 'System.Data.DataTable')   
oTable.TableName = 'TEST'             
oAdapter.SelectCommand = oCmd
rowcount = oAdapter.Fill(oTable)
colcount = oTable.Columns.Count
BoxText("Columns ":colcount:@LF:"Rows ":rowcount)
oCols = oTable.Columns
cCreate = "Create Table Accounts (":@LF
ForEach col in oCols
   name = col.ColumnName
   colType = col.DataType.ToString()
   objRow = oTable.Rows.Item(0)
   val = objRow.Item(col)
   ;cCreate = cCreate:name:" ":cvt(colType):",":"[VALUE: %val% .net type %colType%]":@LF
   cCreate = cCreate:name:" ":cvt(colType):",":@LF
Next
cCreate = StrSub(cCreate,1,Strlen(cCreate)-2):");"


;This block still uses System.Data.SQLite - a candidate to be replaced by the WB SQLite Extender
oSQLite = ObjectClrNew("System.Data.SQLite.SQLiteConnection","Data Source=%db%;Version=3;Pooling=true;Legacy Format=True")
oSQLite.Open()
oCmd1 = ObjectClrNew("System.Data.SQLite.SQLiteCommand",oSQLite)
oCmd1.CommandText = "DROP TABLE IF EXISTS Accounts";
oCmd1.ExecuteNonQuery();


SQLText = cCreate
oCmd1.CommandText = SQLText
oCmd1.ExecuteNonQuery();


oRows = oTable.Select()
For i =0 To rowcount -1
  oRow=oRows[i]
  SQLText = "INSERT INTO Accounts ("
  ForEach col in oCols
     SQLText = SQLText:col.ColumnName:","
  Next
  SQLText = StrSub(SQLText,1,Strlen(SQLText)-1):") VALUES ("
  For col = 0 To ColCount-1
     oCol = oTable.Columns.Item(col)
     SQLText = SQLText:"'":oRow.Item(oCol):"'":","
  Next
  SQLText = StrSub(SQLText,1,Strlen(SQLText)-1):");"
  oCmd1.CommandText = SQLText
  oCmd1.ExecuteNonQuery();
Next


oAdapter.Dispose()
oTable=0
oCmd=0
oConn=0
ObjectClose(oCmd1)
ObjectClose(oSQLite)
BoxShut()
Message("Complete","Check for SQLite db ":db:@LF:"and Table Accounts")


Exit


:WBERRORHANDLER
geterror()
Message("Error Encountered",errmsg)
Exit


;/////////////////////////////////////////////////////////////////////////////////////


:udfs
#DefineSubRoutine geterror()
   wberroradditionalinfo = wberrorarray[6]
   lasterr = wberrorarray[0]
   handlerline = wberrorarray[1]
   textstring = wberrorarray[5]
   linenumber = wberrorarray[8]
   errmsg = "Error: ":lasterr:@LF:textstring:@LF:"Line (":linenumber:")":@LF:wberroradditionalinfo
   Return(errmsg)
#EndSubRoutine


#DefineSubRoutine cvt(colType)
retval = "BLOB"
If colType == "System.Boolean" Then  retval="INTEGER"
If colType == "System.Byte" Then  retval="INTEGER"
If colType == "System.Char" Then  retval="TEXT"
If colType == "System.DateTime" Then retval="TEXT"
If colType == "System.Decimal" Then retval="NUMERIC"
If colType == "System.Int32" Then retval="INTEGER"
If colType == "System.Int64" Then retval="REAL"
If colType == "System.Int16" Then retval="INTEGER"
If colType == "System.Object" Then retval="BLOB"
If colType == "System.Single" Then retval="REAL"
If colType == "System.String" Then retval="TEXT"
If colType == "System.Double" Then retval="REAL"
If colType == "System.Array" Then retval="BLOB"
If colType == "System.UInt16" Then retval="INTEGER"
If colType == "System.UInt32" Then retval="REAL"
If colType == "System.UInt64" Then retval="REAL"
If colType == "System.IntPtr" Then retval="TEXT"


Return(retval)
#EndSubRoutine


Return

JTaylor

Nice work, as always.  Agreed on the clunky insert (speaking of the SQLite insert, not your script).  SQLite doesn't have a bulk option, at least I don't think so, and one needs to use a transaction to make batches of INSERTs faster.  I am sure you and most others know this, but for thoroughness...

BEGIN TRANSACTION;
INSERT....;
INSERT....;
COMMIT;
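
A rough, untested sketch of driving that from WinBatch, assuming dbExecute() will accept a multi-statement string and simply pass it through to SQLite (the table and column here are made up, just for illustration):

SQLText = "BEGIN TRANSACTION;":@CRLF
For i = 1 To 1000                                                    ; hypothetical row loop
   SQLText = SQLText:"INSERT INTO Accounts (NAME) VALUES ('Row %i%');":@CRLF
Next
SQLText = SQLText:"COMMIT;"
resp = dbExecute(db, SQLText)                                        ; db = full path to the .sqlite file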

I could create a batch insert, but I'm not sure how long it would take me to sort that one out :)   Will add it to my list.  I had thought about some "file" options, for output especially, since I am struggling to get dbGetString() to allow for larger results.  I guess I could take the easy way out and have the batch be submitted as an array; that would avoid my having to parse delimited data and let WinBatch handle that load.  One extra step for the user and a whole lot of hours saved by me :)   The batch insert option would be fairly easy at that point (famous last words).

The next thing I want to get in place is UNICODE support.  Once that is done and everything is stable, I will consider it ready for an initial release and then continue adding new stuff.  While limited on some fronts, at that point it will allow one to do most anything they want with the database.  I need to test BLOBs, I guess; I haven't even checked on that yet.  Hopefully they work as is.

Jim

JTaylor

I wondered since that was a big jump.   I am pleasantly surprised and pleased with the performance.  Thanks again.

Jim

Quote from: KeithW on May 20, 2020, 04:03:11 PM
Jim,

Keep in mind that the file I/O of creating array.txt is included in that time, I believe, and my full filesystem will be impacting that time as well.
I watched array.txt swell during those 80 seconds.

Keith

JTaylor

I think I misunderstood what you were saying at first.  If I went with my idea of submitting an array for insert, I think that would work well with what you are doing.  Do your query in the one and I think you can easily turn that into an array.  I assume RS.GetRows() works; submit that to my Extender.  Am I getting what you want?

Jim

Quote from: stanl on May 20, 2020, 05:56:13 PM
Jim;


I modified my original script to create an SQLite db and table populated from a tab-delimited CSV file. The attached wakecounty.csv is tab-delimited, and I included a one-line schema.ini (I'm working on an interface to auto-create the .ini) so the ACE provider wouldn't error. The process of inserting rows from the DataTable into SQLite is a little slow and clumsy to look at. Since the DataTable and the SQLite table are identical in structure, maybe there is a way to do an INSERT INTO....FROM, or some kind of bulk copy.  Anyway, it would be a nice project to replace System.Data.SQLite with your extender. There was one bug: the original CSV had a column named Time Zone which, although it could be created in the SQLite table, produced an error during the INSERT command. I changed it to TimeZone, figuring there may be a need in SQLite to enclose multi-word column names in brackets.

stanl

Quote from: JTaylor on May 20, 2020, 08:17:26 PM
Do your query in the one and I think you can easily turn that into an array.


Actually, oTable.Select() returns an array.  I could have come up with a full schema.ini with the column types pre-defined; then there would be no need to work with a .NET DataTable for column types.  Basically, use the ACE provider to query the file as a Recordset, where GetRows() would work.  I'm more interested in parsing delimited text on the fly and only stamping a schema.ini with a delimiter.  I probably should have posted the script as a separate thread so as not to hijack the intent here.  I was just pleased to get the syntax of the CREATE and INSERT statements correct with minimal frustration. ;)

JTaylor

No problem.  Glad you got it sorted.   I could use some "minimal frustration" myself.   Plus such things generate ideas for improving what is being done here.

Not sure how much work it would be to parse delimited data in the Extender; I'm mainly thinking about embedded quotes and such.  Now, if one had a file and then did dbInsertBatch(db,table,ArrayFileGetCSV(....)), if such a function existed, that would give me a cleanly separated source to work with, or, as noted, one can usually get an array of the results from the various recordset tools.  I really can't think of a situation where it would be a hassle to provide an array.
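
Just to make that concrete, a hypothetical call might look like the sketch below (neither the function nor its argument order is settled yet, and the file name is made up):

arr  = ArrayFileGetCSV(DirScript():"accounts.csv", 0)   ; WinBatch parses the delimited file into an array
stmt = dbInsertBatch(db, "Accounts", arr)               ; hypothetical: the Extender would walk the array and INSERT each row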

In any event, good work on what you posted and I am off to get this UNICODE thing sorted.

Jim

JTaylor

Here is the latest.  UNICODE is still whipping my butt, so I didn't want to hold up the dbInsertBatch() option and the @dbVersion constant.

See the notes in the script on the new function; nothing will run by default.  Also, if your source is UNICODE, it will load correctly.  Now if only the output did the same.

Let me know how those 3.55 mil records do...hope they don't catch on fire from the blazing speed  ;)

http://www.jtdata.com/anonymous/wbsql44i_v4.zip

Jim

KeithW

Jim,

While I have not yet looked at v4, did you add a function that will take a flat file (no delimiters)?

If not, what about a function that will take column settings like 1-10, 11-10, etc.?  However, if you do that, do not assume there isn't unneeded whitespace, like so many programs do.  I would rather specify every column's start position and length, whether or not the fields butt up against each other.

If that functionality isn't there, my 3.5MM-row file is not of use yet with your SQLite Extender.

Keith

JTaylor

Added a Help file to the v4 file just now.   Don't forget you will probably have to "Unblock" it under File Properties.

Also, if you have a suggestion for the SQLTemplate field indicators, let me know.  I don't really like all the braces, but I was having trouble thinking of something else to use.

Jim

KeithW

Jim

What about a control string for SQLite: dbFlatFile(file_to_load_from, tablename, "FieldName!start-length,Field2!start-length"), with any unspecified SQLite fields set to NULL?  That way you only specify what you are loading.

Keith

KeithW

Jim,

Between file-to-load and tablename, add another argument: (file-to-load, add, tablename, ...)

where add = the number of lines to skip in file-to-load, to account for headers, etc.

Keith

JTaylor

Probably good to change the column-header option to a start row, as you suggest; that gives it a little more flexibility.  I'm thinking of dbInsertBatch() here.

Help me understand how the control string would be better than what I did.  I'm not disagreeing, just wanting to understand.  The current option allows you to specify which columns to load, and there is a solid parser already in place.  Or is this to allow for handling larger files which might not fit in an array?

What is "start-length"?

Jim

Quote from: KeithW on May 21, 2020, 06:07:48 PM
Jim

What about a Control string for SQLite   dbFlatFile (file_to_load_from,tablename,"FieldName!start-length,Field2!Start-length") with any undefined SQLite Fields set to NULL?  So you only specify what you are loading.

Keith

KeithW

Jim,

In file_to_load, "start" is the starting column and "length" is how many characters to grab for the field.
The field in file_to_load does not need to be named, since you are giving physical positions.
FieldName refers to the field in the SQLite table that receives the data.

Internally you would need to dbSchema() the table,
set the FieldName references you are given,
set the unnamed columns (in SQLite) to NULL,
and process file_to_load based on the above settings.
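
Purely as an illustration of the syntax (the function doesn't exist yet, and the file, table, field names, and positions below are made up), I am picturing something like:

ctrl = "CUST_NO!1-10,NAME!11-30,BALANCE!45-12"                    ; FieldName!start-length for each field to load
stmt = dbFlatFile(DirScript():"monthly_flat.txt", "Accounts", ctrl)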

Possible ???

Keith


JTaylor

It has been a long day and I will probably get it tomorrow, but if you get a chance, please mock up a function call for me using the Company table, loading ID_NO, NAME, and AGE.

Wouldn't you want to let SQLite handle the unaddressed fields with their default values?  I can't imagine setting them all to NULL by default would be the desired behavior for everyone.  I will keep that in mind, though.

I am off to bed.  Thanks again for all the help here.

Jim

KeithW

Jim,

Hopefully the attached file, data, example, etc. will make things clearer for you.
If not, do not hesitate to ask for more clarification.  If you add this it will
be a huge benny for me, as I come across this several times a year in the
work I do.

Regards,
Keith

JTaylor

Okay.  I wondered if that was what you meant, but I was so tired my brain couldn't get any traction to follow through on the thought.

I have a plan.  We'll see where it goes.  I'm also going to tackle delimited files, but only accept "clean" ones.  My brain almost explodes thinking about trying to parse something like:

1,""HELLO WORLD": A Programming Guide for C,C#,C++,etc.","Mr. Programmer,"aka Mr. P" Joseph"

Jim

JTaylor

Here is the latest.   The size limit on dbGetStrings()  and dbMapData() should be gone.


http://www.jtdata.com/anonymous/wbsql44i_v5.zip

Jim

KeithW

Jim,

I downloaded it and will look at it in a bit; I'm on a client project for the next several hours.

Keith

JTaylor

Added dbInsertFile().  This is the delimited text file option, not the flat-file one.  This was a warmup for the other :)

http://www.jtdata.com/anonymous/wbsql44i_v6.zip

Jim

KeithW

Jim,

Cool.  I finished up my project for now and am back to playing with this Lib...

Keith

JTaylor

Added dbInsertFlat().  I think you can do the "-" sign and NULL stuff via SQL without any problem.  See the examples in the test script and Help file.
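
A rough, untested sketch of the SQL side of that idea, with made-up table and column names and assuming the same {{{n}}} placeholder style dbInsertFile() uses (the shipped examples are in the test script and Help file):

; NULLIF('...','') turns an empty field into NULL; the CASE flips a trailing '-' into a negative number.
SQLText = "INSERT INTO MyTable (NOTE_TXT, AMT_DUE) VALUES ("
SQLText = SQLText : "NULLIF('{{{0}}}',''), "
SQLText = SQLText : "CASE WHEN substr('{{{1}}}',-1,1) = '-' "
SQLText = SQLText : "THEN -CAST(substr('{{{1}}}',1,length('{{{1}}}')-1) AS REAL) "
SQLText = SQLText : "ELSE CAST('{{{1}}}' AS REAL) END);"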

The configuration might seem a little hokey at first glance, but I think it will work well and be easier to maintain.  Let me know if you disagree.

         http://www.jtdata.com/anonymous/wbsql44i_v7.zip

Jim

KeithW

Jim,

Will explore when I get a chance, my client came right back and added to my workload.

Regards,
Keith

KeithW

Jim,

Not sure what I am doing wrong here.  I dropped the header line since it did not appear that the function would parse it correctly.
I had originally set it to start with line 2, but your function was trying to parse line 1 anyway.  Deleting the header was the simple fix for now.

The file (sample included) has 42 fields per row.  The first many rows have no values in the first two fields, but they start later in the file.
It is returning a "Syntax Error" on what I am guessing is the 10th field in the first row, BUT since there are multiple values
(in this case a date in yyyymmdd format) I cannot be sure which field it really choked on... nothing gets written to the SQL3
database.  Maybe you can either spot something I have wrong in the setup or get the Lib to give you a bit more information
about what it does not seem to like.

After a few hours of sleep, I will look at the flat, fixed-record file... basically almost identical to this example, just different data.

Regards,
Keith

JTaylor

Not sure.  It is returning useful errors for me, but I'm not sure what it doesn't like yet.  It is failing on the 10th column no matter which row I start on.  I thought at first it was because you have 43 table columns but only 42 data columns.  I also thought it might be choking on ID_NO because it is set to unique but most of the values in the first column were the same.  I fixed those issues but get the exact same error.  I have to leave for a family thing, so it will be late today before I can get it sorted.  Sorry for the delay.

Jim

KeithW

Jim,

Sorry about the 43 vs. 42; I was almost asleep and meant to change ID_NO to:  ID_NO  INTEGER  PRIMARY KEY  AUTOINCREMENT
so that would be 43 fields (ID plus my 42).  I forgot to go back up and make that change.

I too have to run some errands today but have some other things to try when I get back.

Keith

JTaylor

You are missing a quote on every placeholder with the number 8 in it: 8, 18, 28, 38.

That was masking the other issues.

Jim

KeithW

Dang, another trip to the eye doctor and another pair of glasses...

Thanx, Jim

Keith

KeithW

Jim,

With the AUTOINCREMENT up front, this SQLText definition successfully loaded the 140 lines...


SQLText = "INSERT INTO BFS_Data Values "
SQLText = SQLText : "(NULL,'{{{0}}}','{{{1}}}','{{{2}}}','{{{3}}}','{{{4}}}','{{{5}}}','{{{6}}}','{{{7}}}','{{{8}}}','{{{9}}}',"
SQLText = SQLText :   "'{{{10}}}','{{{11}}}','{{{12}}}','{{{13}}}','{{{14}}}','{{{15}}}','{{{16}}}','{{{17}}}','{{{18}}}','{{{19}}}',"
SQLText = SQLText :   "'{{{20}}}','{{{21}}}','{{{22}}}','{{{23}}}','{{{24}}}','{{{25}}}','{{{26}}}','{{{27}}}','{{{28}}}','{{{29}}}',"
SQLText = SQLText :   "'{{{30}}}','{{{31}}}','{{{32}}}','{{{33}}}','{{{34}}}','{{{35}}}','{{{36}}}','{{{37}}}','{{{38}}}','{{{39}}}',"
SQLText = SQLText :   "'{{{40}}}','{{{41}}}');"

Message("SQL Template",SQLText)   ; DEBUG CODE

stmt = dbInsertFile(db, SQLText, DirScript() : "AR_ATB_JUNE_BFS1_K140.csv", 1, ",")
Message("dbInsertFile()",stmt)



Is that what you would have expected for 42 loaded fields with an AUTO_ID up front?

Keith

JTaylor

Yes.  The other option is to provide the column list before "Values" and leave out that column name, but I am sure you know that.  Just to be clear, there is nothing special I do here with the templates; it is straight SQL.  You could do things like

       INSERT INTO BFS_Data Values (NULL, '[{0}]', '[{1}]', '[{2}]', Substr('[{3}]',1,6), '[{4}]', 'Peaches', '[{6}]'/'[{7}]', '[{8}]', CASE Substr('[{9}]',length('[{9}]'),1) WHEN '-' THEN '[{9}]'*-1 END);
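
The column-list form of the same idea, trimmed to the first two data columns just to show the shape, would be:

       INSERT INTO BFS_Data (RSTXTLG, x0PMNTTRMS) VALUES ('[{0}]', '[{1}]');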

Also, I have posted a new version which changes the place holders to

       [{n}]

I am going to stick with that unless someone tells me SOON that it isn't a good choice.  Seems a lot easier to read.

http://www.jtdata.com/anonymous/wbsql44i_v8.zip



Jim

KeithW

Jim

This is still v7 (I downloaded v8 but have not adjusted the {{{n}}} to [{n}] as of yet).
Thought you would like to have some real stats...

The CSV load file is 304,696 KB, containing 1,096,988 lines including the single header line.
The skip (starting line) DID NOT WORK; I ended up with the header line in my database!!
Also, the dbSchema() function chokes on this file; the process silently dies on the BFS_Data table.

Now here is the good news.  I am using the following extender for its xGetElapsed() and GetExactTime() timing functions:
AddExtender("wilx44i.dll",0,"wilx64i.dll")               ; xGetElapsed

It loaded all 1,096,988 lines on my full filesystem, yielding a 197,460 KB SQL3 DB, in
1 min, 10 secs and 54 hundredths.

NOT BAD !!


This is the WinBatch script used for the testing...

AddExtender("wbsql44i.dll") ; SQLite3 Functions()
AddExtender("wilx44i.dll",0,"wilx64i.dll") ; xGetElapsed


db  = DirScript():"BFS_CSV_Test.sqlite"      ;INCLUDE FULL PATH FOR DATABASE
If FileExist(db) Then FileDelete(db) ;start over fresh every running

Message("Version",@dbVersion) ;show Library version

; UNIQUE   NOT NULL
;#############################################################
SQLText = $"CREATE TABLE IF NOT EXISTS BFS_Data (
ID_NO INTEGER PRIMARY KEY AUTOINCREMENT,
   RSTXTLG Text(20),
x0PMNTTRMS Text(4),
x0COMP_CODE Text(4),
x0CRED_ACCNT Text(10),
x0CRED_GROUP Text(4),
x0REPR_GROUP Text(3),
x0DEBITOR Text(10),
x0AC_DOC_TYP Text(2),
x0ITEM_NUM Text(3),
x0NETDUEDATE Text(25),
x0DOC_DATE Text(25),
x0AC_DOC_NO Int(10),
x0PSTNG_DATE Text(25),
x0DOC_CURRCY Text(3),
x0LOC_CURRCY Text(3),
ZARSNAPD Text(10),
ZARCDUE REAL(13,2),
ZARDCDUE REAL(13,2),
ZARDFDUE REAL(13,2),
ZARDPDU10 REAL(13,2),
ZARDPDUE1 REAL(13,2),
ZARDPDUE2 REAL(13,2),
ZARDPDUE3 REAL(13,2),
ZARDPDUE4 REAL(13,2),
ZARDPDUE5 REAL(13,2),
ZARDPDUE6 REAL(13,2),
ZARDPDUE7 REAL(13,2),
ZARDPDUE8 REAL(13,2),
ZARDPDUE9 REAL(13,2),
ZARDTDUE REAL(13,2),
ZARFDUE REAL(13,2),
ZARPDUE1 REAL(13,2),
ZARPDUE10 REAL(13,2),
ZARPDUE2 REAL(13,2),
ZARPDUE3 REAL(13,2),
ZARPDUE4 REAL(13,2),
ZARPDUE5 REAL(13,2),
ZARPDUE6 REAL(13,2),
ZARPDUE7 REAL(13,2),
ZARPDUE8 REAL(13,2),
ZARPDUE9 REAL(13,2),
ZARTDUE REAL(13,2) );$"

resp = dbExecute(db,SQLText)
Message("Create",resp)
;#############################################################


;#############################################################
;dbTables(db)
;Returns a TAB Delimited List of Tables
;#############################################################
stmt = dbTables(db)
Message("dbTables",stmt)
;#############################################################



;#############################################################
;dbSchema(db,Table_Name)
;Returns Database Table Schema in a String Format
;2nd Argument(optional) is Table Name. Leave off if you want all tables.
;#############################################################
stmt = dbSchema(db,"BFS_Data")
;Message("dbSchema",stmt) ; This chokes on the configuration !!!!
;#############################################################


;#############################################################
;dbInsertFile()
; Batch Insert from a Delimited File (CLEANLY delimited, no embedded delimiters)
;  Arg 1 - STRING - Full Path for Database.
;  Arg 2   STRING - SQL Template
;  Arg 3 - STRING - Full File Path
;  Arg 4 - Int(optional)  -  Row to Start - Default 1
;  Arg 5 - Field Delimiter(optional) - Default @TAB
;
; Quote (") enclosed fields will have the quotes removed. Ex.  ("Hope") will turn into (Hope)
;
;#############################################################


SQLText = "INSERT INTO BFS_Data Values "
SQLText = SQLText : "(NULL,'{{{0}}}','{{{1}}}','{{{2}}}','{{{3}}}','{{{4}}}','{{{5}}}','{{{6}}}','{{{7}}}','{{{8}}}','{{{9}}}',"
SQLText = SQLText :   "'{{{10}}}','{{{11}}}','{{{12}}}','{{{13}}}','{{{14}}}','{{{15}}}','{{{16}}}','{{{17}}}','{{{18}}}','{{{19}}}',"
SQLText = SQLText :   "'{{{20}}}','{{{21}}}','{{{22}}}','{{{23}}}','{{{24}}}','{{{25}}}','{{{26}}}','{{{27}}}','{{{28}}}','{{{29}}}',"
SQLText = SQLText :   "'{{{30}}}','{{{31}}}','{{{32}}}','{{{33}}}','{{{34}}}','{{{35}}}','{{{36}}}','{{{37}}}','{{{38}}}','{{{39}}}',"
SQLText = SQLText :   "'{{{40}}}','{{{41}}}');"
Message("SQL Template",SQLText)
Timedelay(1)
time1 = GetExactTime()
;stmt = dbInsertFile(db, SQLText, DirScript() : "AR_ATB_JUNE_BFS1_K140.csv", 1, ",") ; <---Initial Script Testing, no headers
stmt = dbInsertFile(db, SQLText, DirScript() : "AR_ATB_JUNE_BFS1.csv", 2, ",") ; <---REAL McCOY   !!! With Headers !!
time2 = GetExactTime()
rc = xGetElapsed(time2, time1)
Message("dbInsertFile()",stmt)
;Message("Processing Complete","Lines Read:     %icount%%@CRLF%Lines Written: %ocount%%@CRLF%Elapsed: %rc%%@CRLF%----------  H: M: S. hh")
Message("Processing Complete","Elapsed: %rc%%@CRLF%----------  H: M: S. hh")


Regards,
Keith

KeithW

Jim,

dbTables() is starting to return garbage, which is causing the process to silently die...
See the screenshot... I DO NOT KNOW WHAT SUDDENLY CHANGED.  I have been running the process over and over
to see the variation in the load times I gave you.  They are within a second of each other until
this corruption; now the script will not even run through.  It started when I tried the dbSchema(), but that is
commented out and the database gets deleted for each run... any ideas?

Keith

JTaylor

Please download v8 again and give that a try.  I thought I had removed all the HEAP stuff, but those two functions were still using it.  I'm guessing it is related; I was deleting it, but maybe not correctly.

Jim