Is it possible to have dynamic object properties in .NET 4.0 - C#

I want to add new properties to an object based on loop iterations; is this possible in .NET? The reason I want to do this is that I am looping through rows in an Excel spreadsheet, and for every successfully read row I want to create a new dynamic object property. When the loop is complete I can simply pass the object to a method and log all the records.
See below for code so far:
protected void ReadData(string filePath, bool upload)
{
    StringBuilder sb = new StringBuilder();
    #region upload
    if (upload) // CSV file upload chosen
    {
        using (CsvReader csv = new CsvReader(new StreamReader(filePath), true)) // Cache CSV file to memory
        {
            int fieldCount = csv.FieldCount; // Total number of fields per row
            string[] headers = csv.GetFieldHeaders(); // Correct CSV headers stored in array
            SortedList<int, string> errorList = new SortedList<int, string>(); // This list will contain error values
            ORCData data = new ORCData();
            bool errorFlag = false;
            int errorCount = 0;

            // Check if headers are correct before reading data
            if (headers[0] != "first name" || headers[1] != "last name" || headers[2] != "job title" ||
                headers[3] != "email address" || headers[4] != "telephone number" || headers[5] != "company" ||
                headers[6] != "research manager" || headers[7] != "user card number")
            {
                sb.Append("Headers are incorrect");
            }
            else
            {
                while (csv.ReadNextRecord())
                {
                    try
                    {
                        // Check csv obj data for valid values
                        for (int i = 0; i < fieldCount; i++)
                        {
                            if (i == 0 || i == 1) // FirstName and LastName
                            {
                                if (!Regex.IsMatch(csv[i], "^[a-z]+$", RegexOptions.IgnoreCase)) // REGEX: letters only
                                {
                                    errorList.Add(errorCount, csv[i]);
                                    errorCount += 1;
                                    errorFlag = true;
                                }
                            }
                        }

                        if (errorFlag)
                        {
                            sb.Append("<b>Number of Errors: " + errorCount + "</b>");
                            sb.Append("<ul>");
                            foreach (KeyValuePair<int, string> key in errorList)
                            {
                                sb.Append("<li>" + key.Value + "</li>");
                            }
                            sb.Append("</ul>"); // Close the list opened above
                        }
                        else // All validation checks passed. Create user
                        {
                            string message = ORCLdap.CreateUserAccount(rootLDAPPath, svcUsername, svcPassword, csv[0], csv[1], csv[2], csv[3], csv[4], csv[5], csv[7]);
                            // TODO: Add to object here
                            sb.Append(message);
                            //sb.Append("<b>New user data uploaded successfully</b>");
                        }
                    }
                    catch (Exception ex)
                    {
                        sb.Append(ex.ToString());
                    }
                    finally
                    {
                        lblMessage.Text = sb.ToString();
                        sb.Remove(0, sb.Length);
                        hdnRdoSelection.Value = "1";
                    }
                }
            }
        }
    }
    #endregion
}
I have never tried to do this before, so I am not sure how I would approach it, but any help would be great. Thanks.

I want to add new properties to an object based on loop iterations; is this possible in .NET?
Sort of. You probably want to use ExpandoObject, treating it as an IDictionary<string, object> when you're adding the properties.
Having said that, if you're not going to try to use those properties as properties later, do you actually need them to be properties at all? Why not just use a Dictionary<string, object> to start with?

Yes, you could use ExpandoObject for this. Simply assign the properties you want, and the object will take them on at runtime.

dynamic settings = new ExpandoObject();
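Applied to the question's loop, a minimal sketch (the records list and the LogRecords method are hypothetical) would build one dynamic record per CSV row:

using System.Collections.Generic;
using System.Dynamic;

var records = new List<dynamic>(); // declared before the while loop

// Inside the loop, one ExpandoObject per successfully read row:
dynamic record = new ExpandoObject();
var fields = (IDictionary<string, object>)record;
for (int i = 0; i < fieldCount; i++)
{
    // Dictionary-style access tolerates header names with spaces
    // ("first name"), which dot syntax would not.
    fields[headers[i]] = csv[i];
}
records.Add(record);

// After the loop, hand everything to the logger in one call:
LogRecords(records); // hypothetical logging method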

Related

Accessing resources attached to list items in SharePoint

I'm accessing a list of calendar items on a SharePoint2013 site like so:
public ListItemCollection GetListByTitle(string title)
{
    using (ClientContext context = new ClientContext(_site))
    {
        List list = context.Web.Lists.GetByTitle(title);
        ListItemCollection listItems = list.GetItems(new CamlQuery()); // Empty CamlQuery to return all items in the list
        context.Load(listItems);
        context.ExecuteQuery();
        return listItems;
    }
}
Then I'm passing that ListItemCollection to another method which will map some of the item's properties to a custom model
public List<CustomModel> GetListOfCustomModel(ListItemCollection listItems)
{
    List<CustomModel> customModelList = new List<CustomModel>();
    foreach (ListItem i in listItems)
    {
        FieldUserValue contact = (FieldUserValue)i.FieldValues["Contact"];
        string s = contact.LookupValue;
        string t = (string)i.FieldValues["Title"];
        DateTime start = (DateTime)i.FieldValues["EventDate"];
        // etc.
    }
    return customModelList;
}
All of the "in-built" properties are easy to get, but I can't figure out how to access the resources the company has created and attached to these items.
E.g. each calendar item has a "Room" resource attached. I understand this is "metadata", but surely I should be able to access it somehow? It must be linked to the item; I just don't know where to look. When I create a SharePoint list view with every column in the list, I can see the "room" resource rendered as a link referencing the resource.
Or am I going to end up fetching my LISTALL page in a web request and parsing the room out using good old-fashioned string manipulation?!
I'd been looking at this for a couple of days, and I found a piece of code that translates a ListItemCollection to a DataTable.
This code handled Microsoft.SharePoint.Client.FieldLookupValue, Microsoft.SharePoint.Client.FieldUserValue and Microsoft.SharePoint.Client.FieldUserValue[], but when I was looking at my Excel output I saw a Microsoft.SharePoint.Client.FieldLookupValue[].
I debugged the code again and drilled down into this instance of a FieldLookupValue[] called Facilities which, lo and behold, has the room and all the other "Resources" in there.
SHORT ANSWER: Don't look for resources, look for FACILITIES.
Here's some code I lifted from another answer site that cycles through a ListItemCollection and transposes the info to a DataTable, amended to show the Id as well as the value for FieldUserValue arrays and, more importantly, to do the same for FieldLookupValue arrays:
public DataTable GetDataTableFromListItemCollection(ListItemCollection listItems)
{
DataTable dt = new DataTable();
foreach (var field in listItems[0].FieldValues.Keys)
{
dt.Columns.Add(field);
}
foreach (var item in listItems)
{
DataRow dr = dt.NewRow();
foreach (var obj in item.FieldValues)
{
if (obj.Value != null)
{
string key = obj.Key;
string type = obj.Value.GetType().FullName;
if (type == "Microsoft.SharePoint.Client.FieldLookupValue")
{
dr[obj.Key] = ((FieldLookupValue)obj.Value).LookupValue;
}
else if (type == "Microsoft.SharePoint.Client.FieldUserValue")
{
dr[obj.Key] = ((FieldUserValue)obj.Value).LookupValue;
}
else if (type == "Microsoft.SharePoint.Client.FieldUserValue[]")
{
FieldUserValue[] multValue = (FieldUserValue[])obj.Value;
foreach (FieldUserValue fieldUserValue in multValue)
{
dr[obj.Key] += "&" + fieldUserValue.LookupId + "=" + fieldUserValue.LookupValue;
}
}
else if (type == "Microsoft.SharePoint.Client.FieldLookupValue[]")
{
FieldLookupValue[] multValue = (FieldLookupValue[])obj.Value;
foreach (FieldLookupValue fieldLookupValue in multValue)
{
dr[obj.Key] += "&" + fieldLookupValue.LookupId + "=" + fieldLookupValue.LookupValue;
}
}
else if (type == "System.DateTime")
{
if (obj.Value.ToString().Length > 0)
{
var date = obj.Value.ToString().Split(' ');
if (date[0].Length > 0)
{
dr[obj.Key] = date[0];
}
}
}
else
{
dr[obj.Key] = obj.Value;
}
}
else
{
dr[obj.Key] = DBNull.Value; // DataRow cells reject null; DBNull.Value is required
}
}
dt.Rows.Add(dr);
}
return dt;
}
https://social.technet.microsoft.com/Forums/en-US/4bf89ee1-50a1-4c21-9ef9-51bd4d2ae155/convert-listitemcollection-to-datatable-without-looping-through-all-list-items-using-csom?forum=SP2016
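As a quick illustration, wiring the two methods together might look like this (the list title is a placeholder; the Title and Facilities columns are the ones from the answer above):

ListItemCollection items = GetListByTitle("Company Calendar"); // hypothetical list title
DataTable table = GetDataTableFromListItemCollection(items);

// Each row now exposes the "Facilities" resources as plain text.
foreach (DataRow row in table.Rows)
{
    Console.WriteLine("{0}: {1}", row["Title"], row["Facilities"]);
}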

How to add data from Firebase to DataGridView using FireSharp

I just want to retrieve data from Firebase into a DataGridView. The code I have retrieves data already; however, it puts everything in the same row instead of creating a new one. I'm a beginner in coding, so I really need help with that.
I read online that Firebase doesn't "count" data, so a counter has to be created and updated each time data is added or deleted. I did that and it's working. I created a method to load the data.
private async Task firebaseData()
{
int i = 0;
FirebaseResponse firebaseResponse = await client.GetAsync("Counter/node");
Counter_class counter = firebaseResponse.ResultAs<Counter_class>();
int foodCount = Convert.ToInt32(counter.food_count);
while (true)
{
if (i == foodCount)
{
break;
}
i++;
try
{
FirebaseResponse response2 = await client.GetAsync("Foods/0" + i);
Foods foods = response2.ResultAs<Foods>();
this.dtGProductsList.Rows[0].Cells[0].Value = foods.menuId;
this.dtGProductsList.Rows[0].Cells[1].Value = foods.name;
this.dtGProductsList.Rows[0].Cells[2].Value = foods.image;
this.dtGProductsList.Rows[0].Cells[3].Value = foods.price;
this.dtGProductsList.Rows[0].Cells[4].Value = foods.discount;
this.dtGProductsList.Rows[0].Cells[5].Value = foods.description;
}
catch
{
}
}
MessageBox.Show("Done");
}
Note: A DataTable (dataTable) already exists, and there's also a DataGridView with columns (ID, Name, Image, Price, Discount, Description) that match the number and order given to the .Cells[x]. When the form loads, dtGProductsList.DataSource = dataTable; is set. I tried replacing [0] with [i].
I expect the data being retrieved to be set to a new row each time, not the same one, and not to skip rows. I'm sorry if it's too simple, but I can't see a way out.
I faced the same problem, and here is my solution (no counter is needed if you fetch the whole node at once):
// Fetch the whole node in one request instead of one record per request.
FirebaseResponse response = await client.GetAsync("Foods");
string JsTxt = response.Body;
if (JsTxt == "null")
{
    return;
}
dynamic data = JsonConvert.DeserializeObject<dynamic>(JsTxt);
var list = new List<Foods>();
foreach (var itemDynamic in data)
{
    list.Add(JsonConvert.DeserializeObject<Foods>(
        ((JProperty)itemDynamic).Value.ToString()));
}
// Now you have a list you can loop through to put the data in any
// suitable visual control.
foreach (Foods food in list)
{
    Invoke((MethodInvoker)delegate
    {
        DataGridViewRow row = (DataGridViewRow)dg.Rows[0].Clone();
        row.Cells[0].Value = food.menuId;
        row.Cells[1].Value = food.name;
        row.Cells[2].Value = food.image;
        // ... remaining cells follow the same pattern ...
        dg.Rows.Insert(0, row);
    });
}
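Alternatively, since the question already binds the grid to a DataTable, a minimal sketch of the loop body from the question (keeping its Foods model and counter loop) would add one row per record instead of overwriting row 0:

FirebaseResponse response2 = await client.GetAsync("Foods/0" + i);
Foods foods = response2.ResultAs<Foods>();

// Adding to the bound DataTable creates a new grid row per record.
dataTable.Rows.Add(
    foods.menuId,
    foods.name,
    foods.image,
    foods.price,
    foods.discount,
    foods.description);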

Parsing multi-line string in C#

I have a string that looks like this:
TYPE Email Forwarding
SIGNATURE mysig.html
COMPANY Smith Incorp
CLIENT NAME James Henries
... heaps of others ....
I need to get the values of Type, Signature, Company and Client Name. There are others, but once I find a solution for these, I can do the rest. I have tried to split and trim the string, but then it breaks apart multi-word keys like CLIENT NAME and multi-word values like Email Forwarding.
I would put all of the "key" values into a collection, and then parse the string into another collection and then compare the values of the collections.
Here is a rough outline of how you could get the values:
static void Main(string[] args)
{
//Assuming that you know all of the keys before hand
List<string> keys = new List<string>() { "TYPE", "SIGNATURE", "COMPANY", "CLIENT NAME" };
//Not sure of the origin of your string to parse. You would have to change
//this to read a file or query the DB or whatever
string multilineString =
#"TYPE Email Forwarding
SIGNATURE mysig.html
COMPANY Smith Incorp
CLIENT NAME James Henries";
//Split the string by newlines.
var lines = multilineString.Split(new string[] { Environment.NewLine }, StringSplitOptions.RemoveEmptyEntries);
//Iterate over keys because you probably have less keys than data in the event of duplicates
foreach (var key in keys)
{
//Reduce list of lines to check based on ones that start with a given key
var filteredLines = lines.Where(l => l.Trim().StartsWith(key)).ToList();
foreach (var line in filteredLines)
{
Console.WriteLine(line.Trim().Remove(0, key.Length + 1));
}
}
Console.ReadLine();
}
That should do the job.
If there are multiple lines, you can loop through each line and call the KeyValue extension method given below:
public static class Program
{
public static void Main()
{
var value = "TYPE Email Forwarding".KeyValue();
var value1 = "CLIENT NAME James Henries".KeyValue();
}
public static KeyValuePair<string, string> KeyValue(this string rawData)
{
var splitValue = rawData.Split(new[] { ' ' }, System.StringSplitOptions.RemoveEmptyEntries);
KeyValuePair<string, string> returnValue;
var key = string.Empty;
var value = string.Empty;
foreach (var item in splitValue)
{
if (item.ToUpper() == item)
{
if (string.IsNullOrWhiteSpace(key))
{
key += item;
}
else
{
key += " " + item;
}
}
else
{
if (string.IsNullOrWhiteSpace(value))
{
value += item;
}
else
{
value += " " + item;
}
}
}
returnValue = new KeyValuePair<string, string>(key, value);
return returnValue;
}
}
Please note that this logic will only work when the keys are all upper case and the values are not. Otherwise, there is no way to tell which part is the key (without manually tracking the keys) and which is not.
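Applied to the original multi-line string (the multilineString variable from the first answer), the extension can feed a dictionary:

var lines = multilineString.Split(new[] { Environment.NewLine }, StringSplitOptions.RemoveEmptyEntries);
var parsed = new Dictionary<string, string>();

foreach (var line in lines)
{
    var pair = line.Trim().KeyValue();
    parsed[pair.Key] = pair.Value;
}

Console.WriteLine(parsed["CLIENT NAME"]); // James Henries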

How to read .mst(transform) Table along with .msi

I want to read .mst (transform) table information along with the .msi tables.
I created the following function to read MSI tables:
// This method returns all rows and columns of a Table specified by Name
public DataTable ReadMsiTableByName(string msiFile, string tableName)
{
DataTable msiTable = new DataTable(tableName);
Database database = null;
View view = null;
try
{
using (database = new Database(msiFile, DatabaseOpenMode.ReadOnly))
{
string sqlQuery = String.Format("SELECT * FROM {0}", tableName);
view = database.OpenView(sqlQuery);
view.Execute(null);
Record record = view.Fetch();
ColumnCollection columnCollection = view.Columns;
for (int i = 0; i < columnCollection.Count; i++)
{
string columnName = columnCollection[i].Name.ToString();
System.Type columnType = columnCollection[i].Type;
msiTable.Columns.Add(columnName, columnType.UnderlyingSystemType);
}
while (record != null)
{
DataRow row = msiTable.NewRow();
for (int i = 0; i < columnCollection.Count; i++)
{
string type = columnCollection[i].Type.ToString();
if (type == "System.String")
{
row[columnCollection[i].Name.ToString()] = record.GetString(columnCollection[i].Name.ToString());
}
else if (type == "System.Int16")
{
row[columnCollection[i].Name.ToString()] = record.GetInteger(columnCollection[i].Name.ToString());
}
else if (type == "System.Int32")
{
row[columnCollection[i].Name.ToString()] = record.GetInteger(columnCollection[i].Name.ToString());
}
else if (type == "System.IO.Stream")
{
System.IO.Stream stream;
stream = record.GetStream(columnCollection[i].Name.ToString());
row[columnCollection[i].Name.ToString()] = stream;
}
}
msiTable.Rows.Add(row);
record = view.Fetch();
}
}
}
catch (Exception ex)
{
CommonFn.CreateLog(ex.ToString());
}
finally
{
if (database != null)
database.Close();
if (view != null)
view.Close();
}
return msiTable;
}
However, I am unable to read an .mst with this function. I read that you need to apply the MSI transform for this, but I don't want to change the content of the .msi or .mst; I just need to read all the tables. Please point me in the right direction. Thanks in advance :)
An MST is by definition a transform. It only contains the deltas of a base MSI.
The Database class has a method called ViewTransform. If the MST is compatible with the MSI, it'll succeed and the changes will appear in the _TransformView table.
Alternatively, if you don't want to see what's changed but rather the final state, you could copy the MSI to a temporary MSI, use the ApplyTransform method to apply the transform, and then commit it. Then you could query it using the code you already have.
Works for me:
msiDatabase = new Database(#"Foo.msi", DatabaseOpenMode.ReadOnly);
msiDatabase.ApplyTransform(#"Foo.mst");
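A sketch of the copy-then-commit variant described above (using the same DTF Database class as the code in the question; the file names are placeholders), which can then reuse the existing ReadMsiTableByName:

// Copy the original MSI so neither the .msi nor the .mst is modified.
string tempMsi = Path.Combine(Path.GetTempPath(), "Foo_transformed.msi");
File.Copy(@"Foo.msi", tempMsi, true);

using (Database db = new Database(tempMsi, DatabaseOpenMode.Direct))
{
    db.ApplyTransform(@"Foo.mst");
    db.Commit(); // persist the transformed state into the temporary copy
}

// The existing reader now sees the combined MSI + MST state.
DataTable properties = ReadMsiTableByName(tempMsi, "Property");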

Bulk insert/Update with Petapoco

I'm using the Save() method to insert or update records, but I would like to make it perform a bulk insert and bulk update with only one database hit. How do I do this?
In my case, I took advantage of the database.Execute() method.
I created a SQL parameter that had the first part of my insert:
var sql = new Sql("insert into myTable(Name, Age, Gender) values");
for (int i = 0; i < pocos.Count ; ++i)
{
var p = pocos[i];
sql.Append("(#0, #1, #2)", p.Name, p.Age , p.Gender);
if(i != pocos.Count -1)
sql.Append(",");
}
Database.Execute(sql);
I tried two different methods for inserting a large quantity of rows faster than the default Insert (which is pretty slow when you have a lot of rows).
1) Making up a List<T> with the POCOs first and then inserting them within a loop (and in a transaction):
using (var tr = PetaPocoDb.GetTransaction())
{
foreach (var record in listOfRecords)
{
PetaPocoDb.Insert(record);
}
tr.Complete();
}
2) SqlBulkCopy a DataTable:
using (var bulkCopy = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.TableLock))
{
    bulkCopy.DestinationTableName = "SomeTable";
    bulkCopy.WriteToServer(dt);
}
To get my List<T> to a DataTable I used Marc Gravell's "Convert generic List/Enumerable to DataTable?" function, which worked out of the box for me (after I rearranged the POCO properties to be in the exact same order as the table fields in the db).
The SqlBulkCopy was fastest, 50% or so faster than the transactions method in the (quick) perf tests I did with ~1000 rows.
Hth
Insert in one SQL query is much faster.
Here is a customer method for PetaPoco.Database class that adds ability to do a bulk insert of any collection:
public void BulkInsertRecords<T>(IEnumerable<T> collection)
{
try
{
OpenSharedConnection();
using (var cmd = CreateCommand(_sharedConnection, ""))
{
var pd = Database.PocoData.ForType(typeof(T));
var tableName = EscapeTableName(pd.TableInfo.TableName);
string cols = string.Join(", ", (from c in pd.QueryColumns select tableName + "." + EscapeSqlIdentifier(c)).ToArray());
var pocoValues = new List<string>();
var index = 0;
foreach (var poco in collection)
{
var values = new List<string>();
foreach (var i in pd.Columns)
{
values.Add(string.Format("{0}{1}", _paramPrefix, index++));
AddParam(cmd, i.Value.GetValue(poco), _paramPrefix);
}
pocoValues.Add("(" + string.Join(",", values.ToArray()) + ")");
}
var sql = string.Format("INSERT INTO {0} ({1}) VALUES {2}", tableName, cols, string.Join(", ", pocoValues));
cmd.CommandText = sql;
cmd.ExecuteNonQuery();
}
}
finally
{
CloseSharedConnection();
}
}
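Hypothetical usage, assuming this method has been added to a class derived from PetaPoco.Database and people is a collection of mapped POCOs:

// One INSERT ... VALUES (...), (...), ... statement for the whole collection.
db.BulkInsertRecords(people);

Note that, as other answers here point out, SQL Server caps a single command at 2,100 parameters, so very large collections still need to be chunked.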
Here is an updated version of Steve Jansen's answer that splits the work into chunks of pocos that stay under the 2,100-parameter maximum.
I commented out the following code, as it produces duplicates in the database...
//using (var reader = cmd.ExecuteReader())
//{
// while (reader.Read())
// {
// inserted.Add(reader[0]);
// }
//}
Updated Code
/// <summary>
/// Performs an SQL Insert against a collection of pocos
/// </summary>
/// <param name="pocos">A collection of POCO objects that specifies the column values to be inserted. Assumes that every POCO is of the same type.</param>
/// <returns>An array of the auto allocated primary key of the new record, or null for non-auto-increment tables</returns>
public object BulkInsert(IEnumerable<object> pocos)
{
Sql sql;
IList<PocoColumn> columns = new List<PocoColumn>();
IList<object> parameters;
IList<object> inserted;
PocoData pd;
Type primaryKeyType;
object template;
string commandText;
string tableName;
string primaryKeyName;
bool autoIncrement;
int maxBulkInsert;
if (null == pocos)
{
return new object[] { };
}
template = pocos.First<object>();
if (null == template)
{
return null;
}
pd = PocoData.ForType(template.GetType());
tableName = pd.TableInfo.TableName;
primaryKeyName = pd.TableInfo.PrimaryKey;
autoIncrement = pd.TableInfo.AutoIncrement;
//Calculate the maximum chunk size
maxBulkInsert = 2100 / pd.Columns.Count;
IEnumerable<object> pocosToInsert = pocos.Take(maxBulkInsert);
IEnumerable<object> pocosRemaining = pocos.Skip(maxBulkInsert);
try
{
OpenSharedConnection();
try
{
var names = new List<string>();
var values = new List<string>();
var index = 0;
foreach (var i in pd.Columns)
{
// Don't insert result columns
if (i.Value.ResultColumn)
continue;
// Don't insert the primary key (except under oracle where we need bring in the next sequence value)
if (autoIncrement && primaryKeyName != null && string.Compare(i.Key, primaryKeyName, true) == 0)
{
primaryKeyType = i.Value.PropertyInfo.PropertyType;
// Setup auto increment expression
string autoIncExpression = _dbType.GetAutoIncrementExpression(pd.TableInfo);
if (autoIncExpression != null)
{
names.Add(i.Key);
values.Add(autoIncExpression);
}
continue;
}
names.Add(_dbType.EscapeSqlIdentifier(i.Key));
values.Add(string.Format("{0}{1}", _paramPrefix, index++));
columns.Add(i.Value);
}
string outputClause = String.Empty;
if (autoIncrement)
{
outputClause = _dbType.GetInsertOutputClause(primaryKeyName);
}
commandText = string.Format("INSERT INTO {0} ({1}){2} VALUES",
_dbType.EscapeTableName(tableName),
string.Join(",", names.ToArray()),
outputClause
);
sql = new Sql(commandText);
parameters = new List<object>();
string valuesText = string.Concat("(", string.Join(",", values.ToArray()), ")");
bool isFirstPoco = true;
var parameterCounter = 0;
foreach (object poco in pocosToInsert)
{
parameterCounter++;
parameters.Clear();
foreach (PocoColumn column in columns)
{
parameters.Add(column.GetValue(poco));
}
sql.Append(valuesText, parameters.ToArray<object>());
if (isFirstPoco && pocos.Count() > 1)
{
valuesText = "," + valuesText;
isFirstPoco = false;
}
}
inserted = new List<object>();
using (var cmd = CreateCommand(_sharedConnection, sql.SQL, sql.Arguments))
{
if (!autoIncrement)
{
DoPreExecute(cmd);
cmd.ExecuteNonQuery();
OnExecutedCommand(cmd);
PocoColumn pkColumn;
if (primaryKeyName != null && pd.Columns.TryGetValue(primaryKeyName, out pkColumn))
{
foreach (object poco in pocos)
{
inserted.Add(pkColumn.GetValue(poco));
}
}
return inserted.ToArray<object>();
}
object id = _dbType.ExecuteInsert(this, cmd, primaryKeyName);
if (pocosRemaining.Any())
{
return BulkInsert(pocosRemaining);
}
return id;
//using (var reader = cmd.ExecuteReader())
//{
// while (reader.Read())
// {
// inserted.Add(reader[0]);
// }
//}
//object[] primaryKeys = inserted.ToArray<object>();
//// Assign the ID back to the primary key property
//if (primaryKeyName != null)
//{
// PocoColumn pc;
// if (pd.Columns.TryGetValue(primaryKeyName, out pc))
// {
// index = 0;
// foreach (object poco in pocos)
// {
// pc.SetValue(poco, pc.ChangeType(primaryKeys[index]));
// index++;
// }
// }
//}
//return primaryKeys;
}
}
finally
{
CloseSharedConnection();
}
}
catch (Exception x)
{
if (OnException(x))
throw;
return null;
}
}
Below is a BulkInsert method for PetaPoco that expands on taylonr's very clever idea of using the SQL technique of inserting multiple rows via INSERT INTO tab(col1, col2) OUTPUT inserted.[ID] VALUES (@0, @1), (@2, @3), (@4, @5), ..., (@n-1, @n).
It also returns the auto-increment (identity) values of inserted records, which I don't believe happens in IvoTops' implementation.
NOTE: SQL Server 2012 (and below) has a limit of 2,100 parameters per query. (This is likely the source of the stack overflow exception referenced by Zelid's comment). You will need to manually split your batches up based on the number of columns that are not decorated as Ignore or Result. For example, a POCO with 21 columns should be sent in batch sizes of 99, or (2100 - 1) / 21. I may refactor this to dynamically split batches based on this limit for SQL Server; however, you will always see the best results by managing the batch size external to this method.
This method showed an approximate 50% gain in execution time over my previous technique of using a shared connection in a single transaction for all inserts.
This is one area where Massive really shines - Massive has a Save(params object[] things) that builds an array of IDbCommands, and executes each one on a shared connection. It works out of the box, and doesn't run into parameter limits.
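Before the method itself, here is a rough sketch of the external batch-size management recommended above (db is a hypothetical Database instance extended with this BulkInsert, and records a List of POCOs with 21 mapped columns):

// Stay under SQL Server's 2,100-parameter cap:
// 21 inserted columns allows (2100 - 1) / 21 = 99 rows per batch.
const int columnsPerRow = 21;
const int batchSize = (2100 - 1) / columnsPerRow;

for (int offset = 0; offset < records.Count; offset += batchSize)
{
    db.BulkInsert(records.Skip(offset).Take(batchSize));
}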
/// <summary>
/// Performs an SQL Insert against a collection of pocos
/// </summary>
/// <param name="pocos">A collection of POCO objects that specifies the column values to be inserted. Assumes that every POCO is of the same type.</param>
/// <returns>An array of the auto allocated primary key of the new record, or null for non-auto-increment tables</returns>
/// <remarks>
/// NOTE: As of SQL Server 2012, there is a limit of 2100 parameters per query. This limitation does not seem to apply on other platforms, so
/// this method will allow more than 2100 parameters. See http://msdn.microsoft.com/en-us/library/ms143432.aspx
/// The name of the table, its primary key and whether it's an auto-allocated primary key are retrieved from the attributes of the first POCO in the collection
/// </remarks>
public object[] BulkInsert(IEnumerable<object> pocos)
{
Sql sql;
IList<PocoColumn> columns = new List<PocoColumn>();
IList<object> parameters;
IList<object> inserted;
PocoData pd;
Type primaryKeyType;
object template;
string commandText;
string tableName;
string primaryKeyName;
bool autoIncrement;
if (null == pocos)
return new object[] {};
template = pocos.First<object>();
if (null == template)
return null;
pd = PocoData.ForType(template.GetType());
tableName = pd.TableInfo.TableName;
primaryKeyName = pd.TableInfo.PrimaryKey;
autoIncrement = pd.TableInfo.AutoIncrement;
try
{
OpenSharedConnection();
try
{
var names = new List<string>();
var values = new List<string>();
var index = 0;
foreach (var i in pd.Columns)
{
// Don't insert result columns
if (i.Value.ResultColumn)
continue;
// Don't insert the primary key (except under oracle where we need bring in the next sequence value)
if (autoIncrement && primaryKeyName != null && string.Compare(i.Key, primaryKeyName, true) == 0)
{
primaryKeyType = i.Value.PropertyInfo.PropertyType;
// Setup auto increment expression
string autoIncExpression = _dbType.GetAutoIncrementExpression(pd.TableInfo);
if (autoIncExpression != null)
{
names.Add(i.Key);
values.Add(autoIncExpression);
}
continue;
}
names.Add(_dbType.EscapeSqlIdentifier(i.Key));
values.Add(string.Format("{0}{1}", _paramPrefix, index++));
columns.Add(i.Value);
}
string outputClause = String.Empty;
if (autoIncrement)
{
outputClause = _dbType.GetInsertOutputClause(primaryKeyName);
}
commandText = string.Format("INSERT INTO {0} ({1}){2} VALUES",
_dbType.EscapeTableName(tableName),
string.Join(",", names.ToArray()),
outputClause
);
sql = new Sql(commandText);
parameters = new List<object>();
string valuesText = string.Concat("(", string.Join(",", values.ToArray()), ")");
bool isFirstPoco = true;
foreach (object poco in pocos)
{
parameters.Clear();
foreach (PocoColumn column in columns)
{
parameters.Add(column.GetValue(poco));
}
sql.Append(valuesText, parameters.ToArray<object>());
if (isFirstPoco)
{
valuesText = "," + valuesText;
isFirstPoco = false;
}
}
inserted = new List<object>();
using (var cmd = CreateCommand(_sharedConnection, sql.SQL, sql.Arguments))
{
if (!autoIncrement)
{
DoPreExecute(cmd);
cmd.ExecuteNonQuery();
OnExecutedCommand(cmd);
PocoColumn pkColumn;
if (primaryKeyName != null && pd.Columns.TryGetValue(primaryKeyName, out pkColumn))
{
foreach (object poco in pocos)
{
inserted.Add(pkColumn.GetValue(poco));
}
}
return inserted.ToArray<object>();
}
// BUG: the following line reportedly causes duplicate inserts; need to confirm
//object id = _dbType.ExecuteInsert(this, cmd, primaryKeyName);
using(var reader = cmd.ExecuteReader())
{
while (reader.Read())
{
inserted.Add(reader[0]);
}
}
object[] primaryKeys = inserted.ToArray<object>();
// Assign the ID back to the primary key property
if (primaryKeyName != null)
{
PocoColumn pc;
if (pd.Columns.TryGetValue(primaryKeyName, out pc))
{
index = 0;
foreach(object poco in pocos)
{
pc.SetValue(poco, pc.ChangeType(primaryKeys[index]));
index++;
}
}
}
return primaryKeys;
}
}
finally
{
CloseSharedConnection();
}
}
catch (Exception x)
{
if (OnException(x))
throw;
return null;
}
}
Here is the code for a BulkInsert that you can add to v5.01 PetaPoco.cs.
You can paste it somewhere close to the regular Insert, at line 1098.
You give it an IEnumerable of POCOs and it will send them to the database
in batches of x together. The code is 90% from the regular insert.
I do not have a performance comparison; let me know :)
/// <summary>
/// Bulk inserts multiple rows to SQL
/// </summary>
/// <param name="tableName">The name of the table to insert into</param>
/// <param name="primaryKeyName">The name of the primary key column of the table</param>
/// <param name="autoIncrement">True if the primary key is automatically allocated by the DB</param>
/// <param name="pocos">The POCO objects that specifies the column values to be inserted</param>
/// <param name="batchSize">The number of POCOS to be grouped together for each database rounddtrip</param>
public void BulkInsert(string tableName, string primaryKeyName, bool autoIncrement, IEnumerable<object> pocos, int batchSize = 25)
{
try
{
OpenSharedConnection();
try
{
using (var cmd = CreateCommand(_sharedConnection, ""))
{
var pd = PocoData.ForObject(pocos.First(), primaryKeyName);
// Create list of columnnames only once
var names = new List<string>();
foreach (var i in pd.Columns)
{
// Don't insert result columns
if (i.Value.ResultColumn)
continue;
// Don't insert the primary key (except under oracle where we need bring in the next sequence value)
if (autoIncrement && primaryKeyName != null && string.Compare(i.Key, primaryKeyName, true) == 0)
{
// Setup auto increment expression
string autoIncExpression = _dbType.GetAutoIncrementExpression(pd.TableInfo);
if (autoIncExpression != null)
{
names.Add(i.Key);
}
continue;
}
names.Add(_dbType.EscapeSqlIdentifier(i.Key));
}
var namesArray = names.ToArray();
var values = new List<string>();
int count = 0;
do
{
cmd.CommandText = "";
cmd.Parameters.Clear();
var index = 0;
foreach (var poco in pocos.Skip(count).Take(batchSize))
{
values.Clear();
foreach (var i in pd.Columns)
{
// Don't insert result columns
if (i.Value.ResultColumn) continue;
// Don't insert the primary key (except under oracle where we need bring in the next sequence value)
if (autoIncrement && primaryKeyName != null && string.Compare(i.Key, primaryKeyName, true) == 0)
{
// Setup auto increment expression
string autoIncExpression = _dbType.GetAutoIncrementExpression(pd.TableInfo);
if (autoIncExpression != null)
{
values.Add(autoIncExpression);
}
continue;
}
values.Add(string.Format("{0}{1}", _paramPrefix, index++));
AddParam(cmd, i.Value.GetValue(poco), i.Value.PropertyInfo);
}
string outputClause = String.Empty;
if (autoIncrement)
{
outputClause = _dbType.GetInsertOutputClause(primaryKeyName);
}
cmd.CommandText += string.Format("INSERT INTO {0} ({1}){2} VALUES ({3})", _dbType.EscapeTableName(tableName),
string.Join(",", namesArray), outputClause, string.Join(",", values.ToArray()));
}
// Are we done?
if (cmd.CommandText == "") break;
count += batchSize;
DoPreExecute(cmd);
cmd.ExecuteNonQuery();
OnExecutedCommand(cmd);
}
while (true);
}
}
finally
{
CloseSharedConnection();
}
}
catch (Exception x)
{
if (OnException(x))
throw;
}
}
/// <summary>
/// Performs a SQL Bulk Insert
/// </summary>
/// <param name="pocos">The POCO objects that specifies the column values to be inserted</param>
/// <param name="batchSize">The number of POCOS to be grouped together for each database rounddtrip</param>
public void BulkInsert(IEnumerable<object> pocos, int batchSize = 25)
{
if (!pocos.Any()) return;
var pd = PocoData.ForType(pocos.First().GetType());
BulkInsert(pd.TableInfo.TableName, pd.TableInfo.PrimaryKey, pd.TableInfo.AutoIncrement, pocos, batchSize);
}
And along the same lines, if you want a BulkUpdate:
public void BulkUpdate<T>(string tableName, string primaryKeyName, IEnumerable<T> pocos, int batchSize = 25)
{
try
{
object primaryKeyValue = null;
OpenSharedConnection();
try
{
using (var cmd = CreateCommand(_sharedConnection, ""))
{
var pd = PocoData.ForObject(pocos.First(), primaryKeyName);
int count = 0;
do
{
cmd.CommandText = "";
cmd.Parameters.Clear();
var index = 0;
var cmdText = new StringBuilder();
foreach (var poco in pocos.Skip(count).Take(batchSize))
{
var sb = new StringBuilder();
var colIdx = 0;
foreach (var i in pd.Columns)
{
// Don't update the primary key, but grab the value if we don't have it
if (string.Compare(i.Key, primaryKeyName, true) == 0)
{
primaryKeyValue = i.Value.GetValue(poco);
continue;
}
// Dont update result only columns
if (i.Value.ResultColumn)
continue;
// Build the sql
if (colIdx > 0)
sb.Append(", ");
sb.AppendFormat("{0} = {1}{2}", _dbType.EscapeSqlIdentifier(i.Key), _paramPrefix,
index++);
// Store the parameter in the command
AddParam(cmd, i.Value.GetValue(poco), i.Value.PropertyInfo);
colIdx++;
}
// Find the property info for the primary key
PropertyInfo pkpi = null;
if (primaryKeyName != null)
{
pkpi = pd.Columns[primaryKeyName].PropertyInfo;
}
cmdText.Append(string.Format("UPDATE {0} SET {1} WHERE {2} = {3}{4};\n",
_dbType.EscapeTableName(tableName), sb.ToString(),
_dbType.EscapeSqlIdentifier(primaryKeyName), _paramPrefix,
index++));
AddParam(cmd, primaryKeyValue, pkpi);
}
if (cmdText.Length == 0) break;
if (_providerName.IndexOf("oracle", StringComparison.OrdinalIgnoreCase) >= 0)
{
cmdText.Insert(0, "BEGIN\n");
cmdText.Append("\n END;");
}
DoPreExecute(cmd);
cmd.CommandText = cmdText.ToString();
count += batchSize;
cmd.ExecuteNonQuery();
OnExecutedCommand(cmd);
} while (true);
}
}
finally
{
CloseSharedConnection();
}
}
catch (Exception x)
{
if (OnException(x))
throw;
}
}
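Hypothetical usage of this BulkUpdate, assuming an articles table keyed by an id column and a matching POCO collection:

// Sends the updates in batches of 50 statements per database round trip.
db.BulkUpdate("articles", "id", articles, batchSize: 50);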
Here's a nice 2018 update using FastMember from NuGet:
private static async Task SqlBulkCopyPocoAsync<T>(PetaPoco.Database db, IEnumerable<T> data)
{
var pd = PocoData.ForType(typeof(T), db.DefaultMapper);
using (var bcp = new SqlBulkCopy(db.ConnectionString))
using (var reader = ObjectReader.Create(data))
{
// set up a mapping from the property names to the column names
var propNames = typeof(T).GetProperties().Where(p => Attribute.IsDefined(p, typeof(ResultColumnAttribute)) == false).Select(propertyInfo => propertyInfo.Name).ToArray();
foreach (var propName in propNames)
{
bcp.ColumnMappings.Add(propName, "[" + pd.GetColumnName(propName) + "]");
}
bcp.DestinationTableName = pd.TableInfo.TableName;
await bcp.WriteToServerAsync(reader).ConfigureAwait(false);
}
}
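Hypothetical usage (the connection string name and the Person POCO are placeholders; the PetaPoco constructor details vary by version):

var db = new PetaPoco.Database("MyConnectionString"); // placeholder connection string name
List<Person> people = LoadPeople();                   // placeholder data source
await SqlBulkCopyPocoAsync(db, people);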
You can just do a foreach over your records, but note that this issues one database hit per record rather than a single bulk statement.
foreach (var record in records) {
db.Save(record);
}
