I think the initial code is fine:
SqlCommand param = new SqlCommand();
SqlGeometry point = SqlGeometry.Point(center_lat,center_lng,0);
SqlGeometry poly = SqlGeometry.STPolyFromText(new SqlChars(new SqlString(polygon)),0);
param.CommandText = "INSERT INTO Circle (Center_Point, Circle_Data) VALUES (@point, @poly);";
param.Parameters.Add(new SqlParameter("@point", SqlDbType.Udt));
param.Parameters.Add(new SqlParameter("@poly", SqlDbType.Udt));
param.Parameters["@point"].UdtTypeName = "geometry";
param.Parameters["@poly"].UdtTypeName = "geometry";
param.Parameters["@point"].Value = point;
param.Parameters["@poly"].Value = poly;
However, I realised there could be a problem when the polygon string is created.
In JavaScript I create it like so:
var Circle_Data = "POLYGON ((";
for (var x = 0; x < pointsToSql.length; x++) { // formatting = 0 0, 150 0, 150 50 etc
    if (x == pointsToSql.length - 1) { // last point closes the list (a valid WKT ring must also repeat its first point)
        Circle_Data += pointsToSql[x].lat.toString() + " " + pointsToSql[x].lng.toString() + "))";
    } else {
        Circle_Data += pointsToSql[x].lat.toString() + " " + pointsToSql[x].lng.toString() + ",";
    }
}
It is then passed to C#. So is this safe, even though the parameterization happens in the query?
With the parameter you will be safe from SQL injection. If some SQL is injected into the POLYGON string, it will error out at the SQL Server end.
So, for example, if you have:
POLYGON(12.33 12.55,13.55; DROP TABLE students;)
SQL server will try to construct a geometry type based on the passed string, and it will fail doing so.
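In fact, with the code from the question the malformed string would already fail client-side, when SqlGeometry.STPolyFromText parses it. A minimal sketch (assuming Microsoft.SqlServer.Types is referenced, as in the question's code; the exact exception type is an assumption):
using System;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Types;
class WktInjectionDemo
{
    static void Main()
    {
        // The injected "SQL" below is never executed; the WKT parser
        // rejects it because it is not a well-formed POLYGON.
        string injected = "POLYGON(12.33 12.55,13.55; DROP TABLE students;)";
        try
        {
            SqlGeometry poly = SqlGeometry.STPolyFromText(
                new SqlChars(new SqlString(injected)), 0);
        }
        catch (FormatException ex) // malformed WKT throws here (assumption: exact type may vary)
        {
            Console.WriteLine("Rejected as invalid WKT: " + ex.Message);
        }
    }
}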
I'm making a Telegram bot with C#.
I want to read a number of names (not more than 20, though there could be fewer) and present them to the user as a keyboard markup. With a one-dimensional array they all go on one line and it's unreadable. I wanted to make 4 lines of 5 names, so I tried a 4x5 array. But I get this error:
Error CS0029 Cannot implicitly convert type 'NetTelegramBotApi.Types.KeyboardButton[,]' to 'NetTelegramBotApi.Types.KeyboardButton[][]'
if (text == "/lista")
{
    // Read a text file line by line.
    listapz = System.IO.File.ReadAllLines("listapz.txt");
    string selezione = "Seleziona il paziente:";
    int i = 0;
    int x = 0;
    int y = 0;
    var arrays = new KeyboardButton[4, 5];
    for (i = 0; i < listapz.Length; i++)
    {
        y = i / 5;
        x = i - (y * 5);
        arrays[y, x] = new KeyboardButton("$" + (i + 1) + " " + listapz[i]);
        selezione += "\n$" + (i + 1) + " " + listapz[i] + "";
    }
    var keyb = new ReplyKeyboardMarkup()
    {
        Keyboard = arrays,
        OneTimeKeyboard = true,
        ResizeKeyboard = true
    };
    var reqAction = new SendMessage(update.Message.Chat.Id, selezione) { ReplyMarkup = keyb };
    bot.MakeRequestAsync(reqAction).Wait();
    Console.WriteLine(reqAction);
    continue;
}
Any solution?
You can create a new one-dimensional array for every line, then add each row to the keyboard:
keyb.row(first_array)  // pass just the values; in Python this would look like keyb.row(*array)
keyb.row(second_array) // and so on
and add all the rows into one reply_markup=keyb.
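In C# the same idea means building a jagged KeyboardButton[][] (the exact type the compiler error asks for) instead of the rectangular KeyboardButton[4,5]. A sketch, assuming listapz holds at most 20 names:
int total = listapz.Length;
int rowCount = (total + 4) / 5;              // ceiling of total / 5
var rows = new KeyboardButton[rowCount][];
for (int r = 0; r < rowCount; r++)
{
    int len = Math.Min(5, total - r * 5);    // the last row may have fewer than 5 buttons
    rows[r] = new KeyboardButton[len];
    for (int c = 0; c < len; c++)
    {
        int idx = r * 5 + c;                 // flat index into listapz
        rows[r][c] = new KeyboardButton("$" + (idx + 1) + " " + listapz[idx]);
    }
}
var keyb = new ReplyKeyboardMarkup()
{
    Keyboard = rows,                         // KeyboardButton[][] matches the expected type
    OneTimeKeyboard = true,
    ResizeKeyboard = true
};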
I've made a function that builds a query to insert into MySQL. The upload is blazing fast, but for longer values the building takes somewhat longer. Is there a way to speed up a function like this? I know that a loop inside a loop takes a lot of time for larger amounts of data.
foreach (string[] st in dataToUpload)
{
    buildQuery += " ('";
    for (int i = 0; i < st.Length; i++)
    {
        buildQuery += st[i];
        if (i < st.Length - 1)
            buildQuery += "','";
    }
    buildQuery += "')";
    if (st != dataToUpload[dataToUpload.Count - 1])
        buildQuery += ",";
}
This is the query I would like to build, for example:
string test = "INSERT INTO test (test, test1, test2, test3) VALUES";
test = test + " " + buildQuery;
so test will be
INSERT INTO test (test, test1, test2, test3)
values ("testvalue1", "testvalue2" , "testvalue3" , "testvalue4"),
("testvalue1", "testvalue2" , "testvalue3" , "testvalue4"),
I can work with InnoDB and MyISAM, and it's running on a CentOS server with a 6700K processor and 32 GB of RAM.
So the main question is: How can I make the building of the query faster?
I would recommend using a StringBuilder that is initialized to the right size from the beginning. This avoids reallocating memory on every string append.
Assuming that dataToUpload is a List&lt;string[]&gt;, you can try this:
// StringBuilder initialization.
// Size = length of all values, plus 3 chars for each "','" separator
// (one fewer than the number of values per array), plus 6 chars per array
// for the leading " ('" and the trailing "'),".
StringBuilder build = new StringBuilder(dataToUpload.Sum(data => data.Sum(s => s.Length) + (data.Length - 1) * 3 + 6));
foreach (string[] st in dataToUpload)
{
    build.Append(" ('" + string.Join<string>("','", st) + "'),");
}
buildQuery = build.ToString().TrimEnd(',');
It seems that your buildQuery is a string. Try a StringBuilder instead; it's probably the best way to do repeated string concatenation.
It feels like more information is needed here, but I suppose you are building an insert statement like this:
INSERT INTO MyTable ( Column1, Column2 ) VALUES
( Value1, Value2 ), ( Value1, Value2 )
So perhaps the best way to avoid a for inside the foreach is to replace the string[] in the foreach with a string that already contains the correct values. Something like this:
var count = 0;
foreach (string st in dataToUpload)
{
    if (count++ != 0)
        buildQuery += ",";
    buildQuery += " ('" + st + "')";
}
Maybe this:
var count = dataToUpload.Count;
var i = 0;
foreach (string[] st in dataToUpload)
{
    buildQuery += " ('" + string.Join("','", st) + "')"; // "','" keeps each value quoted
    if (i++ < count - 1)
        buildQuery += ",";
}
Instead of comparing st with the last element of dataToUpload, use an index variable to speed it up.
And string.Join is a good way to concatenate strings.
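Putting those suggestions together (a StringBuilder, string.Join, and an index instead of comparing elements), a self-contained sketch; the table and values are placeholders taken from the question, and the values are assumed to be already escaped:
using System;
using System.Collections.Generic;
using System.Text;
class QueryBuildDemo
{
    static void Main()
    {
        var dataToUpload = new List<string[]>
        {
            new[] { "testvalue1", "testvalue2", "testvalue3", "testvalue4" },
            new[] { "testvalue1", "testvalue2", "testvalue3", "testvalue4" }
        };
        var sb = new StringBuilder("INSERT INTO test (test, test1, test2, test3) VALUES");
        for (int row = 0; row < dataToUpload.Count; row++)
        {
            if (row > 0)
                sb.Append(",");              // comma between value groups, none trailing
            sb.Append(" ('")
              .Append(string.Join("','", dataToUpload[row]))
              .Append("')");
        }
        Console.WriteLine(sb.ToString());
    }
}
Note the values still have to be escaped before they reach this loop; a parameterized insert avoids that problem entirely.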
I have 3 arrays. Two are arrays of strings and one is of date/time values. I pulled all 3 from user input. Each array will always have exactly the same number of entries, so what I want to do is loop through all 3 at once to build a string.
I was trying:
List<string> results = new List<string>();
// select
foreach (string line in array1)
{
    foreach (string lines in array2)
    {
        foreach (DateTime date in datearray1)
        {
            results.Add("select * from table1 d, table2 c where d.specheader = c.specheader and c.true_false = true and d.number = " + lines.ToString() + " and d.date = '" + date.ToShortDateString() + "' and d.specnum like '%" + line.ToString() + "';");
        }
    }
}
results.ToArray();
foreach (string line in results)
{
    MessageBox.Show(line);
}
The user types information into 3 boxes and I'm just trying to concatenate SQL statements based on the input. However, when I tried doing it this way it looped through 6 times when I had only 2 entries. Is there a way to concatenate a string using all 3 arrays at the same time (entry 1 of array 1, entry 1 of array 2, entry 1 of array 3; then move on to the next string: entry 2 of array 1, entry 2 of array 2, entry 2 of array 3, and so on)?
Any input would be appreciated. Thank you!
As the first commenter (Yuck) said, don't concatenate strings into your SQL like that. You will want to set up an SQL command and then pass in parameters.
That is, however, beside the point, as you are asking about rolling data from multiple arrays together into one string.
Iterate through one of the arrays; if they all have the same count, you will neatly get the data in one pass:
for (int i = 0; i < array1.Length; i++)
{
    results.Add(string.Format("Hello you! {0}, {1}, {2}", array1[i], array2[i], datearray1[i]));
}
This will get your desired result, but your code is open to vulnerabilities as it stands. You need to change your approach.
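A parameterized version of the query from the question might look like this sketch (assumptions: SQL Server via System.Data.SqlClient, connectionString as a placeholder, and true_false stored as a bit column; adjust for your actual provider):
using (var conn = new SqlConnection(connectionString))   // connectionString is a placeholder
{
    conn.Open();
    for (int i = 0; i < array1.Length; i++)
    {
        using (var cmd = new SqlCommand(
            "select * from table1 d, table2 c" +
            " where d.specheader = c.specheader and c.true_false = 1" +  // assumes a bit column
            " and d.number = @number and d.date = @date and d.specnum like @spec", conn))
        {
            cmd.Parameters.AddWithValue("@number", array2[i]);
            cmd.Parameters.AddWithValue("@date", datearray1[i].Date);
            cmd.Parameters.AddWithValue("@spec", "%" + array1[i]);  // LIKE pattern passed as a value
            using (var reader = cmd.ExecuteReader())
            {
                // read the matching rows here instead of building SQL strings
            }
        }
    }
}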
Because your loops are nested, you're getting every value of array2 combined with every value of array1 (and similarly with datearray1). That's why you get too many results.
Your loops would work as intended like this (I've used similar local variables to avoid retyping the results.Add line, and to make clear how the code differs from yours):
for (int i = 0; i < array1.Length; i++)
{
    string line = array1[i];
    string lines = array2[i];
    DateTime date = datearray1[i];
    results.Add("select * from table1 d, table2 c where d.specheader = c.specheader and c.true_false = true and d.number = " + lines.ToString() + " and d.date = '" + date.ToShortDateString() + "' and d.specnum like '%" + line.ToString() + "';");
}
As a side-note: building a database query in this manner is inefficient and very insecure (try reading up on "SQL injection" to understand why). You would see better results if you used a stored procedure instead.
If the number of entries is going to be the same for all three arrays, you can simply do a for loop:
for (int i = 0; i < datearray1.Length; i++)
{
    results.Add("select * from table1 d, table2 c" +
        " where d.specheader = c.specheader and c.true_false = true" +
        " and d.number = " + array2[i].ToString() +
        " and d.date = '" + datearray1[i].ToShortDateString() + "'" +
        " and d.specnum like '%" + array1[i].ToString() + "';");
}
What is the best way to perform bulk inserts into an MS Access database from .NET? Using ADO.NET, it is taking way over an hour to write out a large dataset.
Note that my original post, before I "refactored" it, had both the question and answer in the question part. I took Igor Turman's suggestion and re-wrote it in two parts - the question above, followed by my answer.
I found that using DAO in a specific manner is roughly 30 times faster than using ADO.NET. I am sharing the code and results in this answer. As background, the test below writes out 100 000 records to a table with 20 columns.
A summary of the techniques and times, from best to worst:
02.8 seconds: Use DAO, use DAO.Field's to refer to the table columns
02.8 seconds: Write out to a text file, use Automation to import the text into Access
11.0 seconds: Use DAO, use the column index to refer to the table columns
17.0 seconds: Use DAO, refer to the column by name
79.0 seconds: Use ADO.NET, generate INSERT statements for each row
86.0 seconds: Use ADO.NET, use a DataTable with a DataAdapter for "batch" insert
As background, occasionally I need to perform analysis of reasonably large amounts of data, and I find that Access is the best platform. The analysis involves many queries, and often a lot of VBA code.
For various reasons, I wanted to use C# instead of VBA. The typical way is to use OleDB to connect to Access. I used an OleDbDataReader to grab millions of records, and it worked quite well. But when outputting results to a table, it took a long, long time. Over an hour.
First, let's discuss the two typical ways to write records to Access from C#. Both ways involve OleDB and ADO.NET. The first is to generate INSERT statements one at a time and execute them, taking 79 seconds for the 100 000 records. The code is:
public static double TestADONET_Insert_TransferToAccess()
{
    StringBuilder names = new StringBuilder();
    for (int k = 0; k < 20; k++)
    {
        string fieldName = "Field" + (k + 1).ToString();
        if (k > 0)
        {
            names.Append(",");
        }
        names.Append(fieldName);
    }
    DateTime start = DateTime.Now;
    using (OleDbConnection conn = new OleDbConnection(Properties.Settings.Default.AccessDB))
    {
        conn.Open();
        OleDbCommand cmd = new OleDbCommand();
        cmd.Connection = conn;
        cmd.CommandText = "DELETE FROM TEMP";
        int numRowsDeleted = cmd.ExecuteNonQuery();
        Console.WriteLine("Deleted {0} rows from TEMP", numRowsDeleted);
        for (int i = 0; i < 100000; i++)
        {
            StringBuilder insertSQL = new StringBuilder("INSERT INTO TEMP (")
                .Append(names)
                .Append(") VALUES (");
            for (int k = 0; k < 19; k++)
            {
                insertSQL.Append(i + k).Append(",");
            }
            insertSQL.Append(i + 19).Append(")");
            cmd.CommandText = insertSQL.ToString();
            cmd.ExecuteNonQuery();
        }
        cmd.Dispose();
    }
    double elapsedTimeInSeconds = DateTime.Now.Subtract(start).TotalSeconds;
    Console.WriteLine("Append took {0} seconds", elapsedTimeInSeconds);
    return elapsedTimeInSeconds;
}
Note that I found no method in Access that allows a bulk insert.
I had then thought that maybe using a data table with a data adapter would prove useful, especially since I thought I could do batch inserts using the UpdateBatchSize property of a data adapter. However, apparently only SQL Server and Oracle support that; Access does not. And it took the longest time of all, 86 seconds. The code I used was:
public static double TestADONET_DataTable_TransferToAccess()
{
    StringBuilder names = new StringBuilder();
    StringBuilder values = new StringBuilder();
    DataTable dt = new DataTable("TEMP");
    for (int k = 0; k < 20; k++)
    {
        string fieldName = "Field" + (k + 1).ToString();
        dt.Columns.Add(fieldName, typeof(int));
        if (k > 0)
        {
            names.Append(",");
            values.Append(",");
        }
        names.Append(fieldName);
        values.Append("@" + fieldName);
    }
    DateTime start = DateTime.Now;
    OleDbConnection conn = new OleDbConnection(Properties.Settings.Default.AccessDB);
    conn.Open();
    OleDbCommand cmd = new OleDbCommand();
    cmd.Connection = conn;
    cmd.CommandText = "DELETE FROM TEMP";
    int numRowsDeleted = cmd.ExecuteNonQuery();
    Console.WriteLine("Deleted {0} rows from TEMP", numRowsDeleted);
    OleDbDataAdapter da = new OleDbDataAdapter("SELECT * FROM TEMP", conn);
    da.InsertCommand = new OleDbCommand("INSERT INTO TEMP (" + names.ToString() + ") VALUES (" + values.ToString() + ")");
    for (int k = 0; k < 20; k++)
    {
        string fieldName = "Field" + (k + 1).ToString();
        da.InsertCommand.Parameters.Add("@" + fieldName, OleDbType.Integer, 4, fieldName);
    }
    da.InsertCommand.UpdatedRowSource = UpdateRowSource.None;
    da.InsertCommand.Connection = conn;
    //da.UpdateBatchSize = 0;
    for (int i = 0; i < 100000; i++)
    {
        DataRow dr = dt.NewRow();
        for (int k = 0; k < 20; k++)
        {
            dr["Field" + (k + 1).ToString()] = i + k;
        }
        dt.Rows.Add(dr);
    }
    da.Update(dt);
    conn.Close();
    double elapsedTimeInSeconds = DateTime.Now.Subtract(start).TotalSeconds;
    Console.WriteLine("Append took {0} seconds", elapsedTimeInSeconds);
    return elapsedTimeInSeconds;
}
Then I tried non-standard ways. First, I wrote out to a text file, and then used Automation to import that in. This was fast - 2.8 seconds - and tied for first place. But I consider this fragile for a number of reasons: outputting date fields is tricky. I had to format them specially (someDate.ToString("yyyy-MM-dd HH:mm")), and then set up a special "import specification" that codes in this format. The import specification also had to have the "quote" delimiter set right. In the example below, with only integer fields, there was no need for an import specification.
Text files are also fragile for "internationalization", where commas are used as decimal separators, date formats differ, and Unicode may be in use.
Notice that the first record contains the field names so that the column order isn't dependent on the table, and that we used Automation to do the actual import of the text file.
public static double TestTextTransferToAccess()
{
    StringBuilder names = new StringBuilder();
    for (int k = 0; k < 20; k++)
    {
        string fieldName = "Field" + (k + 1).ToString();
        if (k > 0)
        {
            names.Append(",");
        }
        names.Append(fieldName);
    }
    DateTime start = DateTime.Now;
    StreamWriter sw = new StreamWriter(Properties.Settings.Default.TEMPPathLocation);
    sw.WriteLine(names);
    for (int i = 0; i < 100000; i++)
    {
        for (int k = 0; k < 19; k++)
        {
            sw.Write(i + k);
            sw.Write(",");
        }
        sw.WriteLine(i + 19);
    }
    sw.Close();
    ACCESS.Application accApplication = new ACCESS.Application();
    string databaseName = Properties.Settings.Default.AccessDB
        .Split(new char[] { ';' }).First(s => s.StartsWith("Data Source=")).Substring(12);
    accApplication.OpenCurrentDatabase(databaseName, false, "");
    accApplication.DoCmd.RunSQL("DELETE FROM TEMP");
    accApplication.DoCmd.TransferText(TransferType: ACCESS.AcTextTransferType.acImportDelim,
        TableName: "TEMP",
        FileName: Properties.Settings.Default.TEMPPathLocation,
        HasFieldNames: true);
    accApplication.CloseCurrentDatabase();
    accApplication.Quit();
    accApplication = null;
    double elapsedTimeInSeconds = DateTime.Now.Subtract(start).TotalSeconds;
    Console.WriteLine("Append took {0} seconds", elapsedTimeInSeconds);
    return elapsedTimeInSeconds;
}
Finally, I tried DAO. Lots of sites out there give huge warnings about using DAO. However, it turns out that it is simply the best way to interact between Access and .NET, especially when you need to write out a large number of records. Also, it gives access to all the properties of a table. I read somewhere that it's easiest to program transactions using DAO instead of ADO.NET.
Notice that there are several lines of code that are commented. They will be explained soon.
public static double TestDAOTransferToAccess()
{
    string databaseName = Properties.Settings.Default.AccessDB
        .Split(new char[] { ';' }).First(s => s.StartsWith("Data Source=")).Substring(12);
    DateTime start = DateTime.Now;
    DAO.DBEngine dbEngine = new DAO.DBEngine();
    DAO.Database db = dbEngine.OpenDatabase(databaseName);
    db.Execute("DELETE FROM TEMP");
    DAO.Recordset rs = db.OpenRecordset("TEMP");
    DAO.Field[] myFields = new DAO.Field[20];
    for (int k = 0; k < 20; k++) myFields[k] = rs.Fields["Field" + (k + 1).ToString()];
    //dbEngine.BeginTrans();
    for (int i = 0; i < 100000; i++)
    {
        rs.AddNew();
        for (int k = 0; k < 20; k++)
        {
            //rs.Fields[k].Value = i + k;
            myFields[k].Value = i + k;
            //rs.Fields["Field" + (k + 1).ToString()].Value = i + k;
        }
        rs.Update();
        //if (0 == i % 5000)
        //{
        //    dbEngine.CommitTrans();
        //    dbEngine.BeginTrans();
        //}
    }
    //dbEngine.CommitTrans();
    rs.Close();
    db.Close();
    double elapsedTimeInSeconds = DateTime.Now.Subtract(start).TotalSeconds;
    Console.WriteLine("Append took {0} seconds", elapsedTimeInSeconds);
    return elapsedTimeInSeconds;
}
In this code, we created DAO.Field variables for each column (myFields[k]) and then used them. It took 2.8 seconds. Alternatively, one could directly access those fields as found in the commented line rs.Fields["Field" + (k + 1).ToString()].Value = i + k;, which increased the time to 17 seconds. Wrapping the code in a transaction (see the commented lines) dropped that to 14 seconds. Using an integer index rs.Fields[k].Value = i + k; dropped that to 11 seconds. Using the DAO.Field (myFields[k]) and a transaction actually took longer, increasing the time to 3.1 seconds.
Lastly, for completeness, all of this code was in a simple static class, and the using statements are:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using ACCESS = Microsoft.Office.Interop.Access; // USED ONLY FOR THE TEXT FILE METHOD
using DAO = Microsoft.Office.Interop.Access.Dao; // USED ONLY FOR THE DAO METHOD
using System.Data; // USED ONLY FOR THE ADO.NET/DataTable METHOD
using System.Data.OleDb; // USED FOR BOTH ADO.NET METHODS
using System.IO; // USED ONLY FOR THE TEXT FILE METHOD
Thanks Marc, in order to vote for you I created an account on Stack Overflow...
Below is the reusable method [tested in C# on 64-bit Win 7, Windows 2008 R2, Vista, and XP platforms].
Performance details:
Exports 120,000 rows in 4 seconds.
Copy the code below, pass in the parameters... and see the performance.
Just pass your DataTable with the same schema as the target Access DB table.
DBPath = full path of the Access DB
TableNm = name of the target Access DB table
The code:
public void BulkExportToAccess(DataTable dtOutData, String DBPath, String TableNm)
{
    DAO.DBEngine dbEngine = new DAO.DBEngine();
    Boolean checkFl = false;
    try
    {
        DAO.Database db = dbEngine.OpenDatabase(DBPath);
        DAO.Recordset accessRecordset = db.OpenRecordset(TableNm);
        DAO.Field[] accessFields = new DAO.Field[dtOutData.Columns.Count];
        // Loop over each row of dtOutData
        for (Int32 rowCounter = 0; rowCounter < dtOutData.Rows.Count; rowCounter++)
        {
            accessRecordset.AddNew();
            // Loop over each column
            for (Int32 colCounter = 0; colCounter < dtOutData.Columns.Count; colCounter++)
            {
                // On the first row only, look up and cache the field objects.
                if (!checkFl)
                    accessFields[colCounter] = accessRecordset.Fields[dtOutData.Columns[colCounter].ColumnName];
                accessFields[colCounter].Value = dtOutData.Rows[rowCounter][colCounter];
            }
            accessRecordset.Update();
            checkFl = true;
        }
        accessRecordset.Close();
        db.Close();
    }
    finally
    {
        System.Runtime.InteropServices.Marshal.ReleaseComObject(dbEngine);
        dbEngine = null;
    }
}
You can use KORM, an object relational mapper that allows bulk operations over MS Access.
database
    .Query<Movie>()
    .AsDbSet()
    .BulkInsert(_data);
or, if you have a source reader, you can directly use the MsAccessBulkInsert class:
using (var bulkInsert = new MsAccessBulkInsert("connection string"))
{
    bulkInsert.Insert(sourceReader);
}
KORM is available from NuGet (Kros.KORM.MsAccess) and it's open source on GitHub.
Thanks Marc for the examples.
On my system the performance of DAO is not as good as suggested here:
TestADONET_Insert_TransferToAccess(): 68 seconds
TestDAOTransferToAccess(): 29 seconds
Since the use of Office interop libraries is not an option on my system, I tried a new method involving writing a CSV file and then importing it via ADO:
public static double TestADONET_Insert_FromCsv()
{
    StringBuilder names = new StringBuilder();
    for (int k = 0; k < 20; k++)
    {
        string fieldName = "Field" + (k + 1).ToString();
        if (k > 0)
        {
            names.Append(",");
        }
        names.Append(fieldName);
    }
    DateTime start = DateTime.Now;
    StreamWriter sw = new StreamWriter("tmpdata.csv");
    sw.WriteLine(names);
    for (int i = 0; i < 100000; i++)
    {
        for (int k = 0; k < 19; k++)
        {
            sw.Write(i + k);
            sw.Write(",");
        }
        sw.WriteLine(i + 19);
    }
    sw.Close();
    using (OleDbConnection conn = new OleDbConnection(Properties.Settings.Default.AccessDB))
    {
        conn.Open();
        OleDbCommand cmd = new OleDbCommand();
        cmd.Connection = conn;
        cmd.CommandText = "DELETE FROM TEMP";
        int numRowsDeleted = cmd.ExecuteNonQuery();
        Console.WriteLine("Deleted {0} rows from TEMP", numRowsDeleted);
        StringBuilder insertSQL = new StringBuilder("INSERT INTO TEMP (")
            .Append(names)
            .Append(") SELECT ")
            .Append(names)
            .Append(@" FROM [Text;Database=.;HDR=yes].[tmpdata.csv]");
        cmd.CommandText = insertSQL.ToString();
        cmd.ExecuteNonQuery();
        cmd.Dispose();
    }
    double elapsedTimeInSeconds = DateTime.Now.Subtract(start).TotalSeconds;
    Console.WriteLine("Append took {0} seconds", elapsedTimeInSeconds);
    return elapsedTimeInSeconds;
}
Performance analysis of TestADONET_Insert_FromCsv(): 1.9 seconds
Similar to Marc's example TestTextTransferToAccess(), this method is also fragile for a number of reasons regarding the use of CSV files.
Hope this helps.
Lorenzo
First make sure that the Access table columns have the same column names and similar types. Then you can use this function, which I believe is very fast and elegant.
public void AccessBulkCopy(DataTable table)
{
    foreach (DataRow r in table.Rows)
        r.SetAdded();
    var myAdapter = new OleDbDataAdapter("SELECT * FROM " + table.TableName, _myAccessConn);
    var cbr = new OleDbCommandBuilder(myAdapter);
    cbr.QuotePrefix = "[";
    cbr.QuoteSuffix = "]";
    cbr.GetInsertCommand(true);
    myAdapter.Update(table);
}
Another method to consider involves linking tables via DAO or ADOX, then executing statements like this:
SELECT * INTO Table1 FROM _LINKED_Table1
Please see my full answer here:
MS Access Batch Update via ADO.Net and COM Interoperability
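A rough sketch of the linking step with DAO (the database paths are placeholders; TableDef, Connect, SourceTableName, and TableDefs.Append are standard DAO members):
DAO.DBEngine dbEngine = new DAO.DBEngine();
DAO.Database db = dbEngine.OpenDatabase(@"C:\target.accdb");   // placeholder path
// Link _LINKED_Table1 to the source table in another database.
DAO.TableDef td = db.CreateTableDef("_LINKED_Table1");
td.Connect = @";DATABASE=C:\source.accdb";                     // placeholder path
td.SourceTableName = "Table1";
db.TableDefs.Append(td);
// One set-based statement instead of row-by-row inserts.
db.Execute("SELECT * INTO Table1 FROM _LINKED_Table1");
db.Close();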
To add to Marc's answer:
Note that having the [STAThread] attribute above your Main method will make your program able to communicate with COM objects more easily, increasing the speed further. I know it's not for every application, but if you heavily depend on DAO, I would recommend it.
Furthermore, regarding the DAO insertion method: if you have a column that is not required and you want to insert null, don't even set its value. Setting the value costs time, even if it's null.
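A minimal sketch of the [STAThread] placement:
[STAThread]
static void Main(string[] args)
{
    // COM interop work (DAO, Access automation) runs on this
    // single-threaded-apartment thread.
}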
Note the position of the DAO component here; this helps explain the efficiency improvements.
I have to upgrade the following code to use prepared statements:
OdbcCommand cmd = sql.CreateCommand();
cmd.CommandText = "SELECT [EMail] from myTable WHERE "+,
for (int i = 0; i < 50; i++)
{
    if (i > 0)
    {
        cmd.CommandText += " OR ";
    }
    cmd.CommandText += "UNIQUE_ID = " + lUniqueIDS[i];
}
Forgive my stupid code above, it's just an example... I'm trying to fetch all the emails of users with IDs x, y, z, etc.
The question is - how can I rewrite it using prepared statements?
A blind naive guess would be
for (int i = 0; i < 50; i++)
{
    if (i > 0)
    {
        cmd.CommandText += " OR ";
    }
    cmd.CommandText += "UNIQUE_ID = ?";
    cmd.Parameters.Add("@UNIQUE_ID", OdbcType.BigInt).Value = lUniqueIDS[i];
}
Should it work? Can I append the same parameter (unique_id) more than once?
It looks like you're using positional parameters (i.e. ? in the query, rather than @UNIQUE_ID), which means the names of the parameters shouldn't matter as far as the SQL is concerned. However, I wouldn't be entirely surprised to see the provider complain... and it may make diagnostics harder too. I suggest you use the index as a suffix:
cmd.Parameters.Add("#UNIQUE_ID" + i, ObdcType.BigInt).Value = lUniqueIDs[i];