I am trying to use C# with a SQL Server database and I have a problem.
I have an array like the one below (the actual array is 10001x1):
long[] lvl = { 0, 7200000, 15840000, 25920000, 37440000, 50400000, 64800000, 80640000 };
When I try to read the same long[] array from the database, I get an error.
string sorgu = "select * from paragon";
var komut = new SqlCommand(sorgu, baglanti);
var reader = komut.ExecuteReader();
IList<long> lvl = new List<long>();
while (reader.Read())
{
    lvl.Add((long)reader["Paragon"]);
}
reader.Close();
reader.Dispose();

long ns = Convert.ToInt64(textBox1.Text);
long sns = Convert.ToInt64(textBox2.Text);
long nsxp = lvl[ns];
long snsxp = lvl[sns];
long toplam = nsxp + snsxp;
for (int i = 0; i < lvl.Count; i++)
{
    if (toplam < lvl[i])
    {
        textBox3.Text = Convert.ToString(i - 1);
        break;
    }
}
[Image 1: screenshot of the error]
Your problem is a data type mismatch.
The reader gives you one record per reader.Read() call, so collect the values into a list:
IList<long> lvl = new List<long>();
while (reader.Read())
{
    lvl.Add(reader.GetInt64(0));
}
reader.Close();
reader.Dispose(); // Always close and dispose your reader when you are done.

long ns = Convert.ToInt64(textBox1.Text);
long sns = Convert.ToInt64(textBox2.Text);
long nsxp = lvl[(int)ns]; // a List<long> indexer takes an int, unlike a long[] array
long snsxp = lvl[(int)sns];
long toplam = nsxp + snsxp;
for (int i = 0; i < lvl.Count; i++)
{
    if (toplam < lvl[i])
    {
        textBox3.Text = Convert.ToString(i - 1);
        break;
    }
}
SqlDataReader's Read() reads a single record from the database. You are trying to use it to read all records at once. This code illustrates how to read each value sequentially:
while (reader.Read()) // Returns true while a record is available, false once all records have been read.
{
    var paragonValue = reader.GetInt64(0); // Reads the current record's Paragon value.
    // Do something with paragonValue.
}
See the Microsoft Docs on SqlDataReader.Read for more information.
Below is my class:
MsSql.cs:
public class MSSqlBLL
{
    public static long RowsCopied { get; set; }

    public long BulkCopy()
    {
        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(conn))
        {
            bulkCopy.DestinationTableName = "dbo.Table1";
            bulkCopy.BatchSize = 100;
            bulkCopy.SqlRowsCopied += new SqlRowsCopiedEventHandler(OnSqlRowsCopied);
            bulkCopy.NotifyAfter = 100;
            bulkCopy.WriteToServer(reader);
            return RowsCopied;
        }
    }

    private static void OnSqlRowsCopied(object sender, SqlRowsCopiedEventArgs e)
    {
        RowsCopied = RowsCopied + e.RowsCopied;
    }
}
I am calling the BulkCopy function from this class, and I want to get the number of currently processed records into my affected-records variable.
For example: for each iteration of the loop, I would like to get the affected records in my affectedRecords variable.
public class MySqlBLL
{
    public void GetTotalRows()
    {
        int totalRecords = 500;
        var table = "Table1";
        for (int i = 0; i < totalRecords / 100; i++)
        {
            var query = "SELECT * FROM " + table + " LIMIT " + 0 + "," + 100;
            var reader = Execute(conn, query);
            long affectedRecords = msSql.BulkCopy();
            reader.Close();
        }
    }
}
In the above method I am sending the data chunk by chunk to the BulkCopy method, and for each bulk copy I would like to get the number of records that were processed. The problem is that I am getting 0 in the affectedRecords variable.
I want access to the current number of rows processed by the SQL bulk copy.
The RowsCopied property is only updated after every 100 copied rows (as configured via NotifyAfter). If you place
Console.WriteLine("Copied {0} so far...", e.RowsCopied);
in the OnSqlRowsCopied event handler, you will see the ongoing progress (in a console app).
But in your case you can simply SELECT COUNT(*) from the source table to show the count.
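As a minimal sketch of that handler-plus-counter approach (reusing the conn and reader from the surrounding code), note that e.RowsCopied is already a running total, not a per-batch delta:

```csharp
long rowsCopied = 0;

using (var bulkCopy = new SqlBulkCopy(conn))
{
    bulkCopy.DestinationTableName = "dbo.Table1";
    bulkCopy.NotifyAfter = 100; // raise the event after every 100 rows
    bulkCopy.SqlRowsCopied += (sender, e) =>
    {
        rowsCopied = e.RowsCopied; // running total so far, so assign rather than add
        Console.WriteLine("Copied {0} so far...", e.RowsCopied);
    };
    bulkCopy.WriteToServer(reader);
}
// rowsCopied now holds the last reported total; rows copied past the final
// NotifyAfter boundary are not reported through the event.
```

This is also why summing e.RowsCopied into a static property, as in the class above, over-counts.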
I'm comparing the object materialization speed of different ORM libraries and ADO.NET. The test consists of querying 1, 10, 100 and 1000 results 11 times (I'll call that one set) and repeating the whole test 10 times, so there are 10 sets per query size.
The result of the 1000-row query with no-tracking entities is interesting: as expected, the very first query execution is very slow and the following ones (2-11) are faster, but the following sets (2-10) are faster than ADO.NET!
Code for the EF Core test. Keys is a two-dimensional array of 11 rows and 1000 columns.
public override async Task TestFindByPK(int[,] keys)
{
    // Force-load all assemblies before the real execution.
    using (var forceLoad = new AdventureWorks2012Context())
    {
        var ignore = await forceLoad.ErrorLog.ToListAsync();
    }
    for (var t = 0; t < 10; t++)
    {
        using (var context = new AdventureWorks2012Context())
        {
            context.ChangeTracker.QueryTrackingBehavior = QueryTrackingBehavior.NoTracking;
            var stopWatch = new Stopwatch();
            for (var r = 0; r < keys.GetLength(0); r++)
            {
                List<int> rows = GetRow(ref keys, r);
                stopWatch.Restart();
                var result = await context.SalesOrderHeader.Where(so => rows.Contains(so.SalesOrderId)).ToListAsync();
                stopWatch.Stop();
                await PrintTestFindByPKReport(stopWatch.ElapsedMilliseconds);
            }
        }
        Log.WriteLine($"Finish {t + 1} run{Environment.NewLine}");
        Thread.Sleep(TimeSpan.FromSeconds(2));
    }
    await Log.WriteLineAsync($"Complete on {DateTime.Now.ToShortTimeString()}{Environment.NewLine}");
}
I guessed that the query results must be cached, so that the following test sets using the same keys can retrieve results from the cache; otherwise EF Core would have to be slower than ADO.NET.
So, question 1: does EF Core cache query results when no-tracking is specified?
Just in case the ADO.NET test code isn't efficient enough, here is the ADO.NET test code.
public override async Task TestFindByPK(int[,] keys)
{
    await PreloadAssemblies();
    var sql = "SELECT * FROM Sales.SalesOrderHeader WHERE SalesOrderID IN " + CreateInSQLCommand(keys.GetLength(1));
    SqlConnection conn = null;
    var stopWatch = new Stopwatch();
    for (var t = 0; t < 10; t++)
    {
        try
        {
            conn = new SqlConnection(ConnectionString);
            conn.Open();
            var sqlCmd = new SqlCommand(sql, conn);
            for (var r = 0; r < keys.GetLength(0); r++)
            {
                var rows = GetRow(ref keys, r);
                stopWatch.Restart();
                sqlCmd.Parameters.Clear();
                sqlCmd.Parameters.AddRange(CreateParamForInClause(rows));
                var reader = await sqlCmd.ExecuteReaderAsync();
                SalesOrderHeaderSQLserver salesOrderHeader = null;
                while (await reader.ReadAsync())
                {
                    salesOrderHeader = new SalesOrderHeaderSQLserver();
                    salesOrderHeader.SalesOrderId = reader.GetInt32(0);
                    salesOrderHeader.RevisionNumber = reader.GetByte(1);
                    salesOrderHeader.OrderDate = reader.GetDateTime(2);
                    salesOrderHeader.DueDate = reader.GetDateTime(3);
                    salesOrderHeader.ShipDate = await ConvertTo<DateTime?>(reader, 4);
                    salesOrderHeader.Status = reader.GetByte(5);
                    salesOrderHeader.OnlineOrderFlag = reader.GetBoolean(6);
                    salesOrderHeader.SalesOrderNumber = reader.GetString(7);
                    salesOrderHeader.PurchaseOrderNumber = await ConvertTo<string>(reader, 8);
                    salesOrderHeader.AccountNumber = await ConvertTo<string>(reader, 9);
                    salesOrderHeader.CustomerID = reader.GetInt32(10);
                    salesOrderHeader.SalesPersonID = await ConvertTo<int?>(reader, 11);
                    salesOrderHeader.TerritoryID = await ConvertTo<int?>(reader, 12);
                    salesOrderHeader.BillToAddressID = reader.GetInt32(13);
                    salesOrderHeader.ShipToAddressID = reader.GetInt32(14);
                    salesOrderHeader.ShipMethodID = reader.GetInt32(15);
                    salesOrderHeader.CreditCardID = await ConvertTo<int?>(reader, 16);
                    salesOrderHeader.CreditCardApprovalCode = await ConvertTo<string>(reader, 17);
                    salesOrderHeader.CurrencyRateID = await ConvertTo<int?>(reader, 18);
                    salesOrderHeader.SubTotal = reader.GetDecimal(19);
                    salesOrderHeader.TaxAmt = reader.GetDecimal(20);
                    salesOrderHeader.Freight = reader.GetDecimal(21);
                    salesOrderHeader.TotalDue = reader.GetDecimal(22);
                    salesOrderHeader.Comment = await ConvertTo<string>(reader, 23);
                    salesOrderHeader.Rowguid = reader.GetGuid(24);
                    salesOrderHeader.ModifiedDate = reader.GetDateTime(25);
                }
                stopWatch.Stop();
                reader.Close();
                await PrintTestFindByPKReport(stopWatch.ElapsedMilliseconds);
            }
        }
        catch (SqlException ex)
        {
            Console.WriteLine($"Exception message : {ex.Message}");
        }
        finally
        {
            conn.Dispose();
        }
        Log.WriteLine($"Finish {t + 1} run{Environment.NewLine}");
        Thread.Sleep(TimeSpan.FromSeconds(2));
    }
    await Log.WriteLineAsync($"Complete on {DateTime.Now.ToShortTimeString()}" + Environment.NewLine);
}
Another weird thing is that the first set is always slower than the following sets. I know that the very first time a query is executed it must be processed and translated to the corresponding SQL command, and that this translation is cached, but why are runs 2-11 of the first set slower than those of the following test sets (much faster than the very first run, though)?
Is it because an IN clause is considered a different query when its set of values differs?
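To make the last point concrete, here is a hypothetical sketch of the CreateInSQLCommand helper referenced above (the real implementation is not shown in the question). It illustrates that the generated SQL text depends on the number of values, so an IN list of a different size is a different query string as far as the server's plan cache is concerned:

```csharp
// Hypothetical helper: builds "(@p0, @p1, ..., @pN-1)" for an IN clause.
// A different count produces different SQL text, hence a separate cached plan.
static string CreateInSQLCommand(int count)
{
    var names = Enumerable.Range(0, count).Select(i => "@p" + i);
    return "(" + string.Join(", ", names) + ")";
}

// CreateInSQLCommand(2) yields "(@p0, @p1)" while CreateInSQLCommand(3)
// yields "(@p0, @p1, @p2)" -- two distinct query texts.
```

With the same parameter count (as in this benchmark, always 1000), the query text is identical across runs, so the plan can be reused.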
I have 2 GB files (9 of them) which contain approximately 12M records of strings that I want to insert, each one as a document, into a local MongoDB (Windows).
Right now I'm reading line by line and inserting every second line (the first is an unnecessary header), like this:
bool readingFlag = false;
foreach (var line in File.ReadLines(file))
{
    if (readingFlag)
    {
        string document = "{'read':'" + line + "'}";
        var bsonDocument = MongoDB.Bson.Serialization.BsonSerializer.Deserialize<BsonDocument>(document);
        await collection.InsertOneAsync(bsonDocument);
        readingFlag = false;
    }
    else
    {
        readingFlag = true;
    }
}
This method works, but not as fast as I expected. I'm now in the middle of the file and I assume it will take about 4 hours for just one file (40 hours for all my data).
I think my bottleneck is the file reading, but since it is a very big file I can't load it into memory (OutOfMemoryException).
Is there any other way that I'm missing here?
I think we could utilize these things:
- Collect a batch of lines and insert them in one go with InsertMany
- Insert data on a separate thread, since we don't need to wait for it to finish
- Use a typed class TextData to push serialization to the other thread
You can play with the batch limit, as it depends on the amount of data read from the file:
public class TextData
{
    public ObjectId _id { get; set; }
    public string read { get; set; }
}

public class Processor
{
    public async Task ProcessData()
    {
        var client = new MongoClient("mongodb://localhost:27017");
        var database = client.GetDatabase("test");
        var collection = database.GetCollection<TextData>("Yogevnn");
        var readingFlag = false;
        var listOfDocument = new List<TextData>();
        var limitAtOnce = 100;
        var current = 0;
        foreach (var line in File.ReadLines(@"E:\file.txt"))
        {
            if (readingFlag)
            {
                var dataToInsert = new TextData { read = line };
                listOfDocument.Add(dataToInsert);
                readingFlag = false;
                Console.WriteLine($"Current position: {current}");
                if (++current == limitAtOnce)
                {
                    current = 0;
                    Console.WriteLine("Inserting data");
                    var listToInsert = listOfDocument;
                    var t = new Task(() =>
                    {
                        Console.WriteLine("Inserting data START");
                        collection.InsertManyAsync(listToInsert);
                        Console.WriteLine("Inserting data FINISH");
                    });
                    t.Start();
                    listOfDocument = new List<TextData>();
                }
            }
            else
            {
                readingFlag = true;
            }
        }
        // Insert the remainder.
        await collection.InsertManyAsync(listOfDocument);
    }
}
Any comments welcome!
In my experiments I found Parallel.ForEach(File.ReadLines("path")) to be the fastest.
The file size was about 42 GB. I also tried batching sets of 100 lines and saving each batch, but that was slower than Parallel.ForEach.
Another example: Read large txt file multithreaded?
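A minimal sketch of that Parallel.ForEach approach, reusing the TextData class and collection from the answer above (the file path is a placeholder; ConcurrentBag comes from System.Collections.Concurrent):

```csharp
// File.ReadLines streams lazily, so the whole file never sits in memory.
// The (line, state, index) overload exposes the element index, which lets us
// keep only every second line (records on odd indexes), as in the question.
var docs = new ConcurrentBag<TextData>();

Parallel.ForEach(File.ReadLines(@"E:\file.txt"), (line, state, index) =>
{
    if (index % 2 == 1)
    {
        docs.Add(new TextData { read = line });
    }
});

collection.InsertMany(docs); // one bulk insert at the end
```

Note that this trades ordering for throughput: documents will not be stored in file order.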
I have this code running, pulling the data from the core system:
public int MuatTurunMTS(hopesWcfRef.Manifest2 entParam, string destination, int capacity)
{
    try
    {
        EL.iSeriesWcf.IiSeriesClient iClient = new iSeriesWcf.IiSeriesClient();
        EL.iSeriesWcf.Manifest2 manifest2 = iClient.GetManifest2(entParam.NOKT, destination, capacity);
        DataTable dt = manifest2.Manifest2Table;
        List<hopesWcfRef.Manifest2> LstManifest2 = new List<hopesWcfRef.Manifest2>();
        for (int i = 0; i < dt.Rows.Count; i++)
        {
            hopesWcfRef.Manifest2 newDataRow = new hopesWcfRef.Manifest2();
            newDataRow.NOPASPOT = dt.Rows[i][1].ToString();
            newDataRow.NOKT = dt.Rows[i][0].ToString();
            newDataRow.KOD_PGKT = dt.Rows[i][2].ToString();
            newDataRow.KOD_KLGA = dt.Rows[i][3].ToString();
            newDataRow.NAMA = dt.Rows[i][4].ToString();
            newDataRow.TKHLAHIR = (dt.Rows[i][5].ToString() == "1/1/0001 12:00:00 AM" ? DateTime.Now : Convert.ToDateTime(dt.Rows[i][5]));
            newDataRow.JANTINA = dt.Rows[i][6].ToString();
            newDataRow.WARGANEGARA = dt.Rows[i][7].ToString();
            newDataRow.JNS_JEMAAH = dt.Rows[i][8].ToString();
            newDataRow.NO_SIRI = dt.Rows[i][9].ToString();
            newDataRow.NO_MS = Convert.ToInt16(dt.Rows[i][10]);
            newDataRow.BARKOD = dt.Rows[i][13].ToString();
            newDataRow.NO_DAFTAR = dt.Rows[i][14].ToString();
            // By default make these cells empty.
            newDataRow.STS_JEMAAH = "";
            newDataRow.SEAT_NO = "";
            newDataRow.SEAT_ZONE = "";
            newDataRow.BERAT = 0;
            newDataRow.JUM_BEG = 0;
            LstManifest2.Add(newDataRow);
        }
        int cntr = 0;
        if (LstManifest2.Count != 0)
        {
            foreach (hopesWcfRef.Manifest2 manifest in LstManifest2)
            {
                cntr++;
                SaveManifestTS(manifest);
            }
        }
        return LstManifest2.Count;
    }
    catch (Exception)
    {
        throw; // rethrow without resetting the stack trace
    }
}
And goes to :
public void SaveManifestTS(hopesWcfRef.Manifest2 manifest)
{
    try
    {
        ent.BeginSaveChanges(SaveChangesOptions.Batch, null, null);
        ent.AddObject("Manifest2", manifest);
        ent.SaveChanges(SaveChangesOptions.Batch);
    }
    catch (Exception)
    {
        //Uri u = new Uri(ent.BaseUri + "GetErrorMsg"
        //    , UriKind.RelativeOrAbsolute);
        //string datas = (ent.Execute<string>(u)).FirstOrDefault();
        //Exception ex = new Exception(datas);
        throw; // rethrow without resetting the stack trace
    }
}
When SaveChanges runs, if the data already exists it duplicates the entire row.
How can I avoid duplicating the data on insert (SaveChanges)?
Many thanks.
What about: do not insert.
In these cases I use a MERGE statement that updates existing data (based on the primary key) and inserts new data.
Also, all the code you posted is irrelevant to the question, which is purely a SQL-side question. You are literally quoting your car manual and then asking which direction to turn.
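As a sketch of that idea executed from C# (the key and column choices here are illustrative guesses based on the fields in the question, and connectionString is a placeholder):

```csharp
// Hypothetical upsert into Manifest2, keyed on NOKT + NOPASPOT.
const string mergeSql = @"
MERGE INTO Manifest2 AS target
USING (VALUES (@NOKT, @NOPASPOT, @NAMA)) AS source (NOKT, NOPASPOT, NAMA)
    ON target.NOKT = source.NOKT AND target.NOPASPOT = source.NOPASPOT
WHEN MATCHED THEN
    UPDATE SET NAMA = source.NAMA
WHEN NOT MATCHED THEN
    INSERT (NOKT, NOPASPOT, NAMA) VALUES (source.NOKT, source.NOPASPOT, source.NAMA);";

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(mergeSql, conn))
{
    cmd.Parameters.AddWithValue("@NOKT", manifest.NOKT);
    cmd.Parameters.AddWithValue("@NOPASPOT", manifest.NOPASPOT);
    cmd.Parameters.AddWithValue("@NAMA", manifest.NAMA);
    conn.Open();
    cmd.ExecuteNonQuery(); // an existing row is updated, a new row is inserted
}
```

The point is that the existence check happens inside the database in one statement, so no duplicate row can be created.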
I have some serious issues with this code. I have the method GetWeatherItemData that takes a parameter name and a period as input, gets the data from a SQL database, and it seems to work fine.
But I want a method that does the same work for multiple parameters, so I created a method named GetSelectedWeatherItemsData that takes an array of parameters as input and loops over my first method. For some reason it only returns data for one parameter, the first in the input array.
Here is the first method:
public CustomDataType GetWeatherItemData(string parameterName, string fromTime, string toTime)
{
    /* This method takes a parameter name, start time and end time as input. It returns
     * all the measurement values and their timestamps as arrays for the specific parameter.
     */
    CustomDataType getWeatherItemObj = new CustomDataType();
    List<double> valueList = new List<double>();
    List<string> timeStampList = new List<string>();
    List<int> parameterIdList = new List<int>();
    List<string> ParameterNameList = new List<string>();
    try
    {
        using (conn = new SqlConnection(connectionString)) // create and open a connection object
        {
            // 1. Create a command object identifying the stored procedure.
            cmd = new SqlCommand("GetWeatherItemData", conn);
            // 2. Let the command object know we will execute a stored procedure.
            cmd.CommandType = CommandType.StoredProcedure;
            // 3. Add the 3 parameters to the command, so they can be passed to the stored procedure.
            cmd.Parameters.Add("@WeatherParameterName", SqlDbType.VarChar).Value = parameterName;
            cmd.Parameters.Add("@FromTime", SqlDbType.VarChar).Value = fromTime;
            cmd.Parameters.Add("@ToTime", SqlDbType.VarChar).Value = toTime;
            // Open the connection.
            conn.Open();
            // Execute the command.
            reader = cmd.ExecuteReader();
            if (reader.HasRows)
            {
                while (reader.Read())
                {
                    valueList.Add((double)reader["MeasurementValue"]);
                    timeStampList.Add(reader["MeasurementDateTime"].ToString());
                    parameterIdList.Add((int)reader["WeatherParameterID"]);
                }
            }
            // Close the reader.
            reader.Close();
            // Changed to arrays to support web services.
            getWeatherItemObj.arrayOfValue = valueList.ToArray();
            getWeatherItemObj.arrayOfTimestamp = timeStampList.ToArray();
            getWeatherItemObj.arrayOfParameterID = parameterIdList.ToArray();
            for (counter = 0; counter < getWeatherItemObj.arrayOfValue.Length; counter++)
            {
                ParameterNameList.Add(GetParameterInfo(parameterName).ParameterName);
            }
            getWeatherItemObj.arrayOfParameterName = ParameterNameList.ToArray();
        }
    }
    catch (SqlException e)
    {
        Console.WriteLine("Connection failed");
        Console.WriteLine(e.Message);
        Thread.Sleep(5000);
    }
    return getWeatherItemObj;
}
Here is the code I have a problem with. It takes an array of parameter names and a period as input, but it only returns data for the first element of the input array, as if the for loop runs only once and jumps out. I tested the code inside the for loop by using a fixed index like parameterName[3] instead of parameterName[counter], and then I got data for that element instead of the first one. So for some reason the for loop does only one iteration.
public CustomDataType GetSelectedWeatherItemsData(string[] parameterName, string fromTime, string toTime)
{
    CustomDataType tempObj;
    List<double> valueList = new List<double>();
    List<string> timeStampList = new List<string>();
    List<int> paramIdStampList = new List<int>();
    List<string> ParameterNameList = new List<string>();
    for (counter = 0; counter < parameterName.Length; counter++)
    {
        tempObj = GetWeatherItemData(parameterName[counter], fromTime, toTime);
        valueList.AddRange(GetWeatherItemData(parameterName[counter], fromTime, toTime).arrayOfValue);
        timeStampList.AddRange(GetWeatherItemData(parameterName[counter], fromTime, toTime).arrayOfTimestamp);
        //paramIdStampList.AddRange(tempObj.arrayOfParameterID);
        ParameterNameList.AddRange(GetWeatherItemData(parameterName[counter], fromTime, toTime).arrayOfParameterName);
    }
    getSelectedItemsObj = new CustomDataType();
    getSelectedItemsObj.arrayOfValue = valueList.ToArray();
    getSelectedItemsObj.arrayOfTimestamp = timeStampList.ToArray();
    //getSelectedItemsObj.arrayOfParameterID = paramIdStampList.ToArray();
    getSelectedItemsObj.arrayOfParameterName = ParameterNameList.ToArray();
    return getSelectedItemsObj;
}
What seems definitely wrong is that you're calling the web service four times inside your loop; why not call it once and then reuse the result you get back?
Something like:
for (counter = 0; counter < parameterName.Length; counter++)
{
    tempObj = GetWeatherItemData(parameterName[counter], fromTime, toTime);
    valueList.AddRange(tempObj.arrayOfValue);
    timeStampList.AddRange(tempObj.arrayOfTimestamp);
    //paramIdStampList.AddRange(tempObj.arrayOfParameterID);
    ParameterNameList.AddRange(tempObj.arrayOfParameterName);
}