This is not a WinBatch problem, but there are members on this board who know SQL well, and I need some advice. For years, whenever I have needed a report or some data manipulation involving multiple data sources, temp tables, INSERTs, or UPDATEs, I have placed all of the SQL steps into a table [memo field] and used a generic WB UDF to step through the SQL and generate what I needed.
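For reference, here is roughly what the UDF does, stripped down to the bare pattern. The connection string, the lookup table name tblSqlSteps, and the columns StepOrder/SqlText are placeholders for illustration, not my real names:

#DefineFunction udfRunSqlSteps(connstr)
   ; One ADO connection for every step, with timeouts set to 0 (no limit)
   objConn = CreateObject("ADODB.Connection")
   objConn.ConnectionTimeout = 0
   objConn.CommandTimeout = 0
   objConn.Open(connstr)

   ; Read the SQL steps, in order, from the memo field of the lookup table
   rs = objConn.Execute("SELECT SqlText FROM tblSqlSteps ORDER BY StepOrder")
   While rs.EOF == @FALSE
      sql = rs.Fields.Item("SqlText").Value
      objConn.Execute(sql)            ; DROP, SELECT INTO, UPDATE, ...
      rs.MoveNext()
   EndWhile

   rs.Close()
   ObjectClose(rs)
   objConn.Close()
   ObjectClose(objConn)
   Return(1)
#EndFunction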
The other day, my UDF failed (in Access) executing SQL that was a basic SELECT ... INTO ... FROM [ODBC Source]. The exact WB error was "ODBC Call Failed", and when I tried to run the query directly from Access, it went into "Not responding...". Bottom line: the data the query was selecting had grown exponentially. To the rescue: redesign the SELECT portion as a pass-through query, then create an additional query that references the pass-through and executes as a Make Table. Worked perfectly. So I replaced the original SQL in the memo with that second query.

A little more on how the rows in the lookup table are arranged:

DROP
SELECT INTO   <---- was failing, now replaced with the pass-through + Make Table
UPDATE

Now here is the kicker. The table is dropped and the pass-through is executed, but instead of moving on to the UPDATE I get an SQL error saying the table already exists. It is almost as if the pass-through SQL creates the table and then tries to create it again in the same statement. The WB UDF uses a basic ADO Connection.Execute(SQL) with the connection and command timeouts set to 0, so I'm wondering if the pass-through (which has its own ODBC connection and timeout properties set to avoid a pop-up asking for a DSN) is somehow conflicting. In other words, should I simply never use a pass-through query in a framework driven by a table of SQL statements?
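In case it helps, here is the direction I'm planning to test next in the WB layer. Everything in this sketch is an assumption on my part: tblTarget and qryMakeFromPassThrough are made-up names, adExecuteNoRecords (128) is just the standard ADO option for action queries (I don't know that it cures the pass-through case), the three-argument form of Execute is how I'd pass that option, and "EXECUTE qryMakeFromPassThrough" assumes a Jet/ACE OLE DB connection; with the Access ODBC driver the syntax would likely differ.

; Guard the DROP so a missing table doesn't stop the run,
; run the make-table step as an action query, then verify the
; table exists before the UPDATE step fires.
adExecuteNoRecords = 128          ; ADO option: don't build a recordset for action queries
recs = 0                          ; placeholder for Execute's RecordsAffected argument

prevMode = ErrorMode(@OFF)
LastError()                       ; clear any earlier error
objConn.Execute("DROP TABLE tblTarget", recs, adExecuteNoRecords)
ErrorMode(prevMode)               ; a failed DROP (table not there) is fine

; The saved Access query that wraps the pass-through as a Make Table
objConn.Execute("EXECUTE qryMakeFromPassThrough", recs, adExecuteNoRecords)

; Sanity check: does the new table exist and have rows before the UPDATE runs?
rsChk = objConn.Execute("SELECT COUNT(*) AS RowCnt FROM tblTarget")
cnt = rsChk.Fields.Item("RowCnt").Value
Message("tblTarget row count", cnt)
rsChk.Close()
ObjectClose(rsChk)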