The following code inserts some values into my database. It gets 6 random values, puts them in an array, and then inserts them into the database.
public void LottoTest(object sender, EventArgs e)
{
Dictionary<int, int> numbers = new Dictionary<int, int>();
Random generator = new Random();
while (numbers.Count < 6)
{
numbers[generator.Next(1, 50)] = 1; // upper bound of Next is exclusive, so this draws 1-49
}
string[] lotto = numbers.Keys.OrderBy(n => n).Select(s => s.ToString()).ToArray();
foreach (String _str in lotto)
{
Response.Write(_str);
Response.Write(",");
}
var connectionstring = "Server=C;Database=lotto;User Id=lottoadmin;Password=password;";
using (var con = new SqlConnection(connectionstring)) // Create connection with automatic disposal
{
con.Open();
using (var tran = con.BeginTransaction()) // Open a transaction
{
// Create command with parameters (DO NOT PUT VALUES IN LINE!!!!!)
string sql =
"insert into CustomerSelections(val1,val2,val3,val4,val5,val6) values (#val1,#val2,#val3,#val4,#val5,#val6)";
var cmd = new SqlCommand(sql, con);
cmd.Parameters.AddWithValue("val1", lotto[0]);
cmd.Parameters.AddWithValue("val2", lotto[1]);
cmd.Parameters.AddWithValue("val3", lotto[2]);
cmd.Parameters.AddWithValue("val4", lotto[3]);
cmd.Parameters.AddWithValue("val5", lotto[4]);
cmd.Parameters.AddWithValue("val6", lotto[5]);
cmd.Transaction = tran;
cmd.ExecuteNonQuery(); // Insert Record
tran.Commit(); // commit transaction
Response.Write("<br />");
Response.Write("<br />");
Response.Write("Ticket has been registered!");
}
}
}
What is the best way to loop and insert MASS entries into the database, let's say 100,000 records, via C#? I want to be able to generate the random numbers with my own method and still use the insert I already have.
For truly large-scale inserts, SqlBulkCopy is your friend. The easy but inefficient way to do this is just to fill a DataTable with the data and throw that at SqlBulkCopy, but it can be done twice as fast (trust me, I've timed it) by spoofing an IDataReader. I recently moved this code into FastMember for convenience, so you can just do something like:
class YourDataType {
public int val1 {get;set;}
public string val2 {get;set;}
... etc
public DateTime val6 {get;set;}
}
then create an iterator block (i.e. a non-buffered forwards only reader):
public IEnumerable<YourDataType> InventSomeData(int count) {
for(int i = 0 ; i < count ; i++) {
var obj = new YourDataType {
// ... initialize your random per-row values here ...
};
yield return obj;
}
}
then:
var data = InventSomeData(1000000);
using(var bcp = new SqlBulkCopy(connection))
using(var reader = ObjectReader.Create(data))
{ // note that you can be more selective with the column map
bcp.DestinationTableName = "CustomerSelections";
bcp.WriteToServer(reader);
}
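For example, the asker's random lotto generation could be plugged straight into that pattern. A minimal sketch, assuming the FastMember NuGet package plus the usual using directives (System.Linq, System.Collections.Generic, System.Data.SqlClient, FastMember) and the CustomerSelections table and connectionstring from the question:
public class TicketRow
{
    public int val1 { get; set; }
    public int val2 { get; set; }
    public int val3 { get; set; }
    public int val4 { get; set; }
    public int val5 { get; set; }
    public int val6 { get; set; }
}

public static IEnumerable<TicketRow> GenerateTickets(int count)
{
    var generator = new Random();
    for (int i = 0; i < count; i++)
    {
        // draw six distinct numbers in 1-49 (upper bound of Next is exclusive)
        var numbers = new HashSet<int>();
        while (numbers.Count < 6)
            numbers.Add(generator.Next(1, 50));
        var sorted = numbers.OrderBy(n => n).ToArray();
        yield return new TicketRow
        {
            val1 = sorted[0], val2 = sorted[1], val3 = sorted[2],
            val4 = sorted[3], val5 = sorted[4], val6 = sorted[5]
        };
    }
}

// usage: stream 100,000 generated rows into the table without buffering them all
using (var connection = new SqlConnection(connectionstring))
using (var bcp = new SqlBulkCopy(connection))
using (var reader = ObjectReader.Create(GenerateTickets(100000),
    "val1", "val2", "val3", "val4", "val5", "val6"))
{
    connection.Open();
    bcp.DestinationTableName = "CustomerSelections";
    bcp.WriteToServer(reader);
}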
You need SQL bulk insert. There is a nice tutorial on MSDN: http://blogs.msdn.com/b/nikhilsi/archive/2008/06/11/bulk-insert-into-sql-from-c-app.aspx
MSDN: Table-Valued Parameters
Basically, you fill a DataTable with the data you want to put into SQL Server.
DataTable tvp = new DataTable("LottoNumbers");
foreach (var numberSet in numbers)
{
// add the data for this set as a row in the DataTable
}
Then you pass the data through ADO.NET using code similar to this...
command.Parameters.Add("@CustomerLottoNumbers", SqlDbType.Structured);
command.Parameters["@CustomerLottoNumbers"].Value = tvp;
Then you could use SQL similar to this...
INSERT CustomerSelections
SELECT * FROM @CustomerLottoNumbers
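For completeness, a table-valued parameter also needs a user-defined table type on the server and the parameter's TypeName set on the client. A rough sketch, where the type name dbo.LottoNumbersType and the column layout are assumptions rather than part of the original answer (assumes using System.Data and System.Data.SqlClient):
// The server needs a matching user-defined table type, created once, e.g.:
//   CREATE TYPE dbo.LottoNumbersType AS TABLE
//     (val1 int, val2 int, val3 int, val4 int, val5 int, val6 int);
DataTable tvp = new DataTable("LottoNumbers");
for (int i = 1; i <= 6; i++)
    tvp.Columns.Add("val" + i, typeof(int));

// one ticket (row) per set of six numbers
tvp.Rows.Add(3, 11, 17, 24, 38, 45);

using (var con = new SqlConnection(connectionstring))
using (var command = new SqlCommand(
    "INSERT INTO CustomerSelections (val1, val2, val3, val4, val5, val6) " +
    "SELECT val1, val2, val3, val4, val5, val6 FROM @CustomerLottoNumbers", con))
{
    var p = command.Parameters.Add("@CustomerLottoNumbers", SqlDbType.Structured);
    p.TypeName = "dbo.LottoNumbersType"; // must match the table type on the server
    p.Value = tvp;
    con.Open();
    command.ExecuteNonQuery();
}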
I'm trying to write some code to loop through a data range and add new documents to SAP based on a query input. I need the values to be added to the documents based on the supplier field, and when the supplier changes, a new document should be created. Currently I am only able to loop through adding items to the document; rather than moving on to the next supplier, it just loops over the same items again. I'm pretty new to C#, so looping is pretty new to me, but I'm hoping someone can help.
int recordCount = oRecordset.RecordCount;
string Supplier = oRecordset.Fields.Item(1).Value.ToString();
string Item = oRecordset.Fields.Item(0).Value.ToString();
Qty = Convert.ToInt32(oRecordset.Fields.Item(3).Value.ToString());
if (recordCount > 0)
{
application.MessageBox("Adding PQ");
System.Threading.Thread.Sleep(2);
for(int i = 0; i < recordCount; i++)
{
OPQT.CardCode = Supplier ;
OPQT.DocDate = DateTime.Now;
OPQT.DocDueDate = DateTime.Now;
OPQT.RequriedDate = DateTime.Now;
OPQT.Lines.ItemCode = Item;
OPQT.Lines.RequiredQuantity = Qty;
OPQT.Lines.Add();
oRecordset.MoveNext();
}
OPQT.Add();
application.MessageBox("PQ Added");
}
You'd be much better off starting to learn with SqlDataReader, IMO.
Adapting it to your business case, you'd get something like this. It is not working code (I don't have enough info for that), but it should help you progress in the right direction.
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
class Program
{
static void Main()
{
string connectionString = "Data Source=(local);Initial Catalog=...";
foreach (var opqt in ReadOrderData(connectionString))
{
// process each OPQT document here
}
}
private static IEnumerable<OPQT> ReadOrderData(string connectionString)
{
string queryString =
#"
SELECT
T0.[ItemCode],
T0.[CardCode],
'10' [Qty]
FROM
[OSCN] T0
JOIN
[OCRD] T1
ON T0.[CardCode] = T1.[CardCode]
WHERE
T1.[CardType] ='S'";
using (SqlConnection connection =
new SqlConnection(connectionString))
{
SqlCommand command =
new SqlCommand(queryString, connection);
connection.Open();
SqlDataReader reader = command.ExecuteReader();
try
{
// Call Read before accessing data.
while (reader.Read())
{
yield return ReadSingleRow((IDataRecord)reader);
}
}
finally
{
// Call Close when done reading.
reader.Close();
}
}
}
private static OPQT ReadSingleRow(IDataRecord dataRecord)
{
// The reader returns columns as object, so cast/convert to the expected types.
var opqt = new OPQT();
opqt.Lines.ItemCode = (string)dataRecord[0];
opqt.CardCode = (string)dataRecord[1];
opqt.Lines.RequiredQuantity = Convert.ToInt32(dataRecord[2]);
return opqt;
}
}
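To connect this back to the original problem (one document per supplier), the rows returned by ReadOrderData could be grouped by CardCode. This is only a sketch, assuming using System.Linq and that OPQT exposes CardCode, Lines and Add() as in the question:
foreach (var supplierRows in ReadOrderData(connectionString).GroupBy(r => r.CardCode))
{
    // one new document per supplier
    var doc = new OPQT();   // or however your DI API document object is obtained
    doc.CardCode = supplierRows.Key;
    doc.DocDate = DateTime.Now;
    doc.DocDueDate = DateTime.Now;

    foreach (var row in supplierRows)
    {
        doc.Lines.ItemCode = row.Lines.ItemCode;
        doc.Lines.RequiredQuantity = row.Lines.RequiredQuantity;
        doc.Lines.Add();    // move on to the next line of this document
    }

    doc.Add();              // commit the document before starting the next supplier
}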
I am really clueless here.
I import ~2 million rows into my Azure SQL database. I create a temp table where I put my values, then I use a MERGE to insert only the rows that are not already present. My code and scripts are shown below.
public async Task BulkImportWithoutDuplicates(DataTable reader)
{
var tableName = "##tempImport";
using (var connection = new SqlConnection(sqlCOnn.ConnectionString))
{
using (SqlCommand command = new SqlCommand("", connection))
{
try
{
connection.Open();
// Creating temp table on database
command.CommandText = Scripts.GetTempTableScript();
command.ExecuteNonQuery();
// Bulk insert into temp table
using (SqlBulkCopy b = new SqlBulkCopy(connection, SqlBulkCopyOptions.TableLock, null))
{
b.BulkCopyTimeout = 0;
b.BatchSize = reader.Rows.Count;
b.DestinationTableName = tableName;
//dataTable
await b.WriteToServerAsync(reader);
b.Close();
}
// Updating destination table, and dropping temp table
command.CommandText = Scripts.GetMergeScript();
var rows = command.ExecuteNonQuery();
}
catch (Exception ex)
{
// Handle exception properly
}
finally
{
connection.Close();
}
}
}
}
public static string GetTempTableScript()
{
return $#"
IF OBJECT_ID('tempdb.dbo.##tempImport', 'U') IS NOT NULL
BEGIN
DROP TABLE ##tempImport;
END
CREATE TABLE ##tempImport ( ... all the columns);";
}
public static string GetMergeScript()
{
return $#"MERGE INTO dbo.Data AS target
USING ##tempImport AS source
ON (source.TransactionId = target.TransactionId AND source.UserId = target.UserId)
WHEN NOT MATCHED THEN
INSERT (Start, Spend, UserId, Product, Shop, ClientId, UploadDataId, UniqueId, TransactionId, q, c1, c2)
VALUES (source.Start, source.Spend, source.UserId, source.Product, source.Shop,
source.ClientId, source.UploadDataId, source.UniqueId, source.TransactionId, source.q, source.c1, source.c2);
";
}
I really do not get why it takes ages to finish. I waited 24 minutes for the data to be added to the temporary table alone.
I was reading this article and it seems that it shouldn't take long: https://www.adathedev.co.uk/2011/01/sqlbulkcopy-to-sql-server-in-parallel.html?m=1
What am I doing wrong here? How can I improve the speed?
I tried using both IDataReader and DataTable, but neither works well for me...
We have a big list of around 100,000 records and want to insert it into a SQL table.
What we are doing is converting that list into a DataTable and passing the DataTable to the SqlBulkCopy method.
This conversion from list to DataTable is taking more time than anything else. We tried using Parallel, but since DataTable is not thread safe we avoided that.
Below is sample POC code which generates an integer list and inserts it into a temp table:
static void Main(string[] args)
{
List<int> valueList = GenerateList(100000);
Console.WriteLine("Starting with Bulk Insert ");
DateTime startTime = DateTime.Now;
int recordCount = BulkInsert(valueList);
TimeSpan ts = DateTime.Now.Subtract(startTime);
Console.WriteLine("Bulk insert for {0} records in {1} miliseconds.-> ", recordCount, ts.Milliseconds);
Console.WriteLine("Done.");
Console.ReadLine();
}
private static int BulkInsert(List<int> valueList)
{
var eventIdDataTable = CreateIdentityDataTable(valueList, "SqlTable", "Id");
return FillBulkPoundTable(eventIdDataTable, "#EventIds"); // must match the temp table created in FillBulkPoundTable
}
private static List<int> GenerateList(int size)
{
return Enumerable.Range(0, size).ToList();
}
private static DataTable CreateIdentityDataTable(List<int> ids, string dataTableName, string propertyName)
{
if (ids == null) return null;
// Note: do not wrap the DataTable in a using block; it is returned to the caller.
var dataTable = new DataTable(dataTableName);
dataTable.Locale = CultureInfo.CurrentCulture;
dataTable.Columns.Add(new DataColumn(propertyName, typeof(int)));
foreach (int id in ids)
{
DataRow row = dataTable.NewRow();
row[propertyName] = id;
dataTable.Rows.Add(row);
}
return dataTable;
}
private static int FillBulkPoundTable(DataTable dataTable, string destinationTableName)
{
int totalInsertedRecordCount = 0;
using (SqlConnection _connection = new SqlConnection(CongifUtil.sqlConnString))
{
string sql =
#"If object_Id('tempdb..#EventIds') is not null drop table #EventIds
CREATE TABLE #EventIds(EvId int) ";
_connection.Open();
using (var command = new SqlCommand(sql, _connection))
{
command.ExecuteNonQuery();
}
using (var sqlBulkCopy = new SqlBulkCopy(_connection))
{
sqlBulkCopy.BulkCopyTimeout = 0;
sqlBulkCopy.DestinationTableName = destinationTableName;
sqlBulkCopy.WriteToServer(dataTable);
}
using (var command = new SqlCommand(sql, _connection))
{
command.CommandText = "Select Count(1) as RecordCount from #EventIds";
SqlDataReader reader = command.ExecuteReader();
if (reader.HasRows)
{
while (reader.Read())
{
totalInsertedRecordCount = Convert.ToInt32(reader["RecordCount"]);
}
}
}
}
return totalInsertedRecordCount;
}
Currently it is taking around 8 seconds, but we need to make it faster. The reason is that our target is to insert 900,000 records, which will be divided into batches of 100,000 each.
Can you give us any hint how we can make this faster?
PS. We tried Dapper inserts too, but they were not faster than SqlBulkCopy.
First convert your list into XML, something like:
List<int> Branches = new List<int>();
Branches.Add(1);
Branches.Add(2);
Branches.Add(3);
XElement xmlElements = new XElement("Branches", Branches.Select(i => new XElement("branch", i)));
Then pass the XML to a stored procedure as a parameter and insert it directly into your table. Example:
DECLARE @XML XML
SET @XML = '<Branches>
<branch>1</branch>
<branch>2</branch>
<branch>3</branch>
</Branches>'
DECLARE @handle INT
DECLARE @PrepareXmlStatus INT
EXEC @PrepareXmlStatus = sp_xml_preparedocument @handle OUTPUT, @XML
SELECT * FROM OPENXML(@handle, '/Branches/branch', 2)
WITH (
branch varchar
)
EXEC sp_xml_removedocument @handle
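On the C# side, the XML built above can be passed to such a stored procedure as a parameter. A sketch only; the procedure name dbo.InsertBranches and its @XML parameter are assumptions, and the connection string is taken from the question's CongifUtil class:
using (var con = new SqlConnection(CongifUtil.sqlConnString))
using (var cmd = new SqlCommand("dbo.InsertBranches", con))
{
    cmd.CommandType = CommandType.StoredProcedure;
    // xmlElements is the XElement built from the list above
    cmd.Parameters.Add("@XML", SqlDbType.Xml).Value = xmlElements.ToString();
    con.Open();
    cmd.ExecuteNonQuery();
}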
Batch Size
From what I understand, you are trying to insert with a BatchSize of 100,000. Higher is not always better.
Try lowering this amount to 5,000 instead and check the performance difference.
You increase the number of database round-trips, but it may also go faster (too many factors, such as row size, are involved here).
TableLock
Using SqlBulkCopyOptions.TableLock will improve your insert performance.
using (var sqlBulkCopy = new SqlBulkCopy(_connection, SqlBulkCopyOptions.TableLock, null))
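Putting the two suggestions together against the question's FillBulkPoundTable helper, a sketch could look like this (the 5,000 batch size is just the value suggested above, not a universal optimum):
using (var sqlBulkCopy = new SqlBulkCopy(_connection, SqlBulkCopyOptions.TableLock, null))
{
    sqlBulkCopy.BatchSize = 5000;          // rows per round-trip; tune for your data
    sqlBulkCopy.BulkCopyTimeout = 0;       // no timeout for large loads
    sqlBulkCopy.DestinationTableName = destinationTableName;
    sqlBulkCopy.WriteToServer(dataTable);
}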
I searched the net for something but nothing really helped me. I want to update a database with a list of articles, but the way that I've found is really slow.
This is my code:
List<Article> costs = GetIdCosts(); //here there are 70.000 articles
conn = new OleDbConnection(string.Format(MDB_CONNECTION_STRING, PATH, PSW));
conn.Open();
transaction = conn.BeginTransaction();
using (var cmd = conn.CreateCommand())
{
cmd.Transaction = transaction;
cmd.CommandText = "UPDATE TABLE_RO SET TABLE_RO.COST = ? WHERE TABLE_RO.ID = ?;";
for (int i = 0; i < costs.Count; i++)
{
double cost = costs[i].Cost;
int id = costs[i].Id;
cmd.Parameters.AddWithValue("data", cost);
cmd.Parameters.AddWithValue("id", id);
if (cmd.ExecuteNonQuery() != 1) throw new Exception();
}
}
transaction.Commit();
But this way takes a lot of minutes, something like 10 minutes or more. Is there another way to speed up this update? Thanks.
Try modifying your code to this:
List<Article> costs = GetIdCosts(); //here there are 70.000 articles
// Setup and open the database connection
conn = new OleDbConnection(string.Format(MDB_CONNECTION_STRING, PATH, PSW));
conn.Open();
// Setup a command
OleDbCommand cmd = new OleDbCommand();
cmd.Connection = conn;
cmd.CommandText = "UPDATE TABLE_RO SET TABLE_RO.COST = ? WHERE TABLE_RO.ID = ?;";
// Setup the paramaters and prepare the command to be executed
cmd.Parameters.Add("?", OleDbType.Currency, 255);
cmd.Parameters.Add("?", OleDbType.Integer, 8); // Assuming you ID is never longer than 8 digits
cmd.Prepare();
OleDbTransaction transaction = conn.BeginTransaction();
cmd.Transaction = transaction;
// Start the loop
for (int i = 0; i < costs.Count; i++)
{
cmd.Parameters[0].Value = costs[i].Cost;
cmd.Parameters[1].Value = costs[i].Id;
try
{
cmd.ExecuteNonQuery();
}
catch (Exception ex)
{
// handle any exception here
}
}
transaction.Commit();
conn.Close();
The cmd.Prepare method will speed things up since it creates a compiled version of the command on the data source.
Small change option:
Using StringBuilder and string.Format, construct one big command text.
var sb = new StringBuilder();
for(....){
sb.AppendLine(string.Format("UPDATE TABLE_RO SET TABLE_RO.COST = '{0}' WHERE TABLE_RO.ID = '{1}';",cost, id));
}
Even faster option:
As in the first example, construct the SQL, but this time make the result look like this:
-- declaring a table variable
declare @data table (id int primary key, cost decimal(10,8))
-- insert the values into the table variable via union select
insert into @data
select 1121 as id, 10.23 as cost
union select 1122 as id, 58.43 as cost
union select ...
-- update TABLE_RO using update-join syntax, joining on id
-- and copying the value from the column in @data to the column in TABLE_RO
update dest
set dest.cost = source.cost
from TABLE_RO dest
inner join @data source on dest.id = source.id
This is the fastest you can get without using bulk inserts.
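A rough C# sketch of building such a batch with StringBuilder, assuming a SQL Server target as the T-SQL above does (the question's OleDb connection to an .mdb file will not accept this syntax), the costs list from the question, an open connection conn, and using System.Text / System.Globalization:
var sb = new StringBuilder();
sb.AppendLine("DECLARE @data TABLE (id INT PRIMARY KEY, cost DECIMAL(10, 8));");
foreach (var article in costs)
{
    // invariant culture so the decimal separator is always '.'
    sb.AppendLine(string.Format(CultureInfo.InvariantCulture,
        "INSERT INTO @data (id, cost) VALUES ({0}, {1});", article.Id, article.Cost));
}
sb.AppendLine("UPDATE dest SET dest.cost = source.cost");
sb.AppendLine("FROM TABLE_RO dest INNER JOIN @data source ON dest.id = source.id;");

using (var cmd = conn.CreateCommand())
{
    cmd.CommandText = sb.ToString();
    cmd.ExecuteNonQuery();
}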
Performing mass updates with ADO.NET and OleDb is painfully slow. If possible, you could consider performing the update via DAO. Just add a reference to the DAO library (COM object) and use something like the following code (caution: untested):
// Import Reference to "Microsoft DAO 3.6 Object Library" (COM)
string TargetDBPath = "insert Path to .mdb file here";
DAO.DBEngine dbEngine = new DAO.DBEngine();
DAO.Database daodb = dbEngine.OpenDatabase(TargetDBPath, false, false, "MS Access;pwd="+"insert your db password here (if you have any)");
DAO.Recordset rs = daodb.OpenRecordset("insert target Table name here", DAO.RecordsetTypeEnum.dbOpenDynaset);
if (rs.RecordCount > 0)
{
rs.MoveFirst();
while (!rs.EOF)
{
// Load id of row
int rowid = Convert.ToInt32(rs.Fields["Id"].Value);
// Iterate List to find entry with matching ID
for (int i = 0; i < costs.Count; i++)
{
double cost = costs[i].Cost;
int id = costs[i].Id;
if (rowid == id)
{
// Save changed values
rs.Edit();
rs.Fields["Id"].Value = cost;
rs.Update();
}
}
rs.MoveNext();
}
}
rs.Close();
Note that we are doing a full table scan here. But unless the total number of records in the table is many orders of magnitude bigger than the number of updated records, this should still significantly outperform the ADO.NET approach...
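One further tweak worth considering (not part of the original answer): the inner for loop rescans the whole costs list for every row. A Dictionary<int, double> built once makes each lookup O(1). A sketch, assuming using System.Linq and System.Collections.Generic:
// build the lookup once
var costById = costs.ToDictionary(a => a.Id, a => a.Cost);

rs.MoveFirst();
while (!rs.EOF)
{
    int rowid = Convert.ToInt32(rs.Fields["Id"].Value);
    double cost;
    if (costById.TryGetValue(rowid, out cost))
    {
        rs.Edit();
        rs.Fields["COST"].Value = cost;  // write the new cost for this row
        rs.Update();
    }
    rs.MoveNext();
}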
I have in my database a table called students that has the number, name, address....
I have a form where I load all the information for one student at a time, and I have a Next button and a Back button.
How can I iterate to the next row (or previous row) in MySQL, to be able to see the info of the next student?
I tried to use the primary key (auto increment) to iterate: when I want to see the next record I add 1 to the id, or subtract 1 to see the previous record.
But if a record has been deleted it will show an empty record.
Can you point me in the right direction?
I'm using WinForms.
Sorry about my English.
string config = "server=localhost; userid = root; database = databaseName";
MySqlConnection con = new MySqlConnection(config);
MySqlDataReader reader = null;
string query = "SELECT * FROM students WHERE id = " + id; //id is the primary Key (auto increment)
MySqlCommand command = new MySqlCommand(query, con);
con.Open();
reader = command.ExecuteReader();
while (reader.Read())
{
string studentName = (string)reader["studentName"];
string studentNum = (string)reader["studentNum"];
tbstudentName.Text = Convert.ToString(studentName);
tbstudentNum.Text = Convert.ToString(studentNum);
.....
}
con.Close();
You should not be calling the database each time you want to view the next record. Try reading all the data into a List.
I am not sure what you are using... WinForms? WPF?
If WinForms, you will need to do something like this:
//First create a class to hold your data in
public class Student
{
public string Name { get; set; }
public string Num { get; set; }
}
public class MyForm : Form
{
int Index = 0;
List<Student> FormData { get; set; }
void GetData()
{
//This will hold all your data in memory so you do not have to make a database call each and every "iteration"
List<Student> dbData = new List<Student>();
string config = "server=localhost; userid = root; database = databaseName";
MySqlConnection con = new MySqlConnection(config);
MySqlDataReader reader = null;
string query = "SELECT * FROM students";
MySqlCommand command = new MySqlCommand(query, con);
con.Open();
reader = command.ExecuteReader();
while (reader.Read())
{
Student newStudent = new Student();
newStudent.Name = (string)reader["studentName"];
newStudent.Num = (string)reader["studentNum"];
//Add data to the list you created
dbData.Add(newStudent);
.....
}
con.Close();
//set the Form's list equal to the one you just populated
this.FormData = dbData;
}
private void BindData()
{
//If winforms
tbstudentName.Text = FormData[Index].Name;
tbstudentNum.Text = FormData[Index].Num;
//If wpf you will have to use view models and bind your data in your XAML but I am assuming you are using
//winforms here.
}
private void NextRecord()
{ //If you reached the end of the records then this will prevent IndexOutOfRange Exception
if (Index < FormData.Count - 1)
{
Index++;
BindData();
}
}
private void PreviousRecord()
{
if (Index != 0)
{
Index--;
BindData();
}
}
}
Now the above scenario will get it working quickly; however, there are better ways of doing this that would help you when you need to alter that data. I would recommend WinForms data binding. You can check it out here: http://msdn.microsoft.com/en-us/library/c8aebh9k(v=vs.110).aspx
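For reference, a minimal sketch of that data-binding idea using a BindingSource (control and property names are the ones from the code above; this is only an illustration, not the only way to wire it up):
// bind the in-memory list once; moving the position updates the text boxes
var studentsBinding = new BindingSource { DataSource = FormData };
tbstudentName.DataBindings.Add("Text", studentsBinding, "Name");
tbstudentNum.DataBindings.Add("Text", studentsBinding, "Num");

// the Next / Back buttons then become one-liners
void NextButton_Click(object sender, EventArgs e) { studentsBinding.MoveNext(); }
void BackButton_Click(object sender, EventArgs e) { studentsBinding.MovePrevious(); }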
To get the next record you can write:
select * from students where id > @id
order by id asc
limit 1
And to get the previous one:
select * from students where id < @id
order by id desc
limit 1
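A sketch of using the first query from C# with a parameter instead of concatenating the id as in the question (currentId is assumed to be the id of the record currently shown; config and the text boxes come from the question's code):
string query = "SELECT * FROM students WHERE id > @id ORDER BY id ASC LIMIT 1";
using (var con = new MySqlConnection(config))
using (var command = new MySqlCommand(query, con))
{
    command.Parameters.AddWithValue("@id", currentId);
    con.Open();
    using (MySqlDataReader reader = command.ExecuteReader())
    {
        if (reader.Read())
        {
            tbstudentName.Text = (string)reader["studentName"];
            tbstudentNum.Text = (string)reader["studentNum"];
        }
    }
}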
DataReader is designed for a quick, one-time read.
If you want to hold the data, you need to fill in-memory structures; the DataTable implements this very well.
You will need to think a little differently.
Getting id+1 is very fragile: even with an identity column, the next id can be a different value and you will get an exception (or an empty record), which I suppose you don't want.
You will need to adjust your logic to return rows with TOP or, in MySQL, the LIMIT clause.
This is easy using lambdas with the .Take() and .Skip() methods...
You can also work through this sample, which uses the LIMIT clause; it should make things clear:
MySQL skip first 10 results
Hope it helps.
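For instance, once the rows are held in memory (say in the FormData list from the earlier WinForms answer, with Index as the current position), next and previous records fall out of Skip/Take directly. A sketch, assuming using System.Linq:
// move forward or backward over the in-memory list
Student next = FormData.Skip(Index + 1).Take(1).FirstOrDefault();
Student previous = Index > 0
    ? FormData.Skip(Index - 1).Take(1).FirstOrDefault()
    : null;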