Universal database retrieval - C#

I'm attempting to write a method that returns an ObservableCollection given an object type and a table name.
The method should use this information to retrieve all rows from the table and put them into an ObservableCollection of the desired object type.
public class DatabaseManager
{
    public async Task<ObservableCollection<object>> GetObjectsAsync(object object_name, string table_name)
    {
        Type type = object_name.GetType();
        IList<PropertyInfo> props = new List<PropertyInfo>(type.GetProperties());
        ObservableCollection<object> oc = new ObservableCollection<object>();
        using (SqlConnection conn = new SqlConnection(Constants.ConnectionString))
        {
            await conn.OpenAsync();
            using (SqlCommand cmd = conn.CreateCommand())
            {
                cmd.CommandText = "select * from " + table_name;
                using (SqlDataReader reader = await cmd.ExecuteReaderAsync())
                {
                    while (await reader.ReadAsync())
                    {
                        object x = new object();
                        foreach (PropertyInfo prop in props)
                        {
                            // I want to create a universal object that I can add to my observable collection here.
                            object propValue = prop.GetValue(object_name, null);
                            // Code Here
                        }
                        oc.Add(x);
                    }
                }
            }
        }
        return oc;
    }
}
The goal is to make a method for getting an ObservableCollection from a database table with a non-specific object type.
I would appreciate any help or recommendations. I might have the wrong mindset about how to do this, but if it is at all possible, please tell me.

C# is a statically typed language, so generally you cannot build an object without knowing its type. In other words, when you do this
object x = new object();
you literally get an object of type System.Object, with no additional properties.
There are two ways of working around this:
Use dynamic and ExpandoObject - this would let you add properties on the fly, but the resulting objects would need to be used as dynamic, or
Take the type of the object that will be produced by the query - your code can "link" table name to objects stored in the table. If you decide to go this route, consider using a lightweight ORM library, such as StackExchange's Dapper.
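For the first option, here is a minimal sketch (the class and method names are mine, not from the question) that turns any open data reader into an ObservableCollection of ExpandoObjects, one property per column:

```csharp
using System;
using System.Collections.Generic;
using System.Collections.ObjectModel;
using System.Data;
using System.Dynamic;

public static class DynamicReader
{
    // Reads every row of an open IDataReader into an ExpandoObject,
    // one property per column. This works for any table, but consumers
    // must access the results via dynamic (no compile-time checking).
    public static ObservableCollection<dynamic> ToDynamicCollection(IDataReader reader)
    {
        var collection = new ObservableCollection<dynamic>();
        while (reader.Read())
        {
            IDictionary<string, object> row = new ExpandoObject();
            for (int i = 0; i < reader.FieldCount; i++)
            {
                row[reader.GetName(i)] = reader.IsDBNull(i) ? null : reader.GetValue(i);
            }
            collection.Add(row);
        }
        return collection;
    }
}
```

Consumers would then use, e.g., `dynamic first = collection[0]; Console.WriteLine(first.SomeColumn);` - any typo in the column name only fails at runtime, which is the trade-off of this approach.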

Related

How to improve speed of generic loop while reading from database

I have a mapping function that reads the data loaded from the database.
It maps the data onto a class through generics.
The problem is that it takes 3 minutes to loop through 10,000 records.
I would like to improve its performance; what can I do?
public static List<T> GetList<T>(string query = null, string whereClause = null, Dictionary<string, string> Params = null)
{
    var results = new List<T>();
    var properties = GetReadableProperties<T>();
    query = QueryMaker.SelectQuery<T>(query, whereClause);
    using (SqlConnection Connection = new SqlConnection(ConnectionString))
    {
        Connection.Open();
        SqlCommand cmd = GetSqlCommandWithParams(query, Connection, Params);
        SqlDataReader reader = cmd.ExecuteReader();
        if (reader.HasRows)
        {
            // From here
            while (reader.Read())
            {
                var item = Activator.CreateInstance<T>();
                foreach (var property in properties)
                {
                    DBReader(item, property, reader);
                }
                results.Add(item);
            }
            // To here it takes 3 minutes. Reading 10000 records from the database into the reader isn't as slow as this code is.
        }
        Connection.Close();
        return results;
    }
}
This is the DBReader function:
private static void DBReader(Object _Object, PropertyInfo property, SqlDataReader reader)
{
    if (!reader.IsDBNull(reader.GetOrdinal(property.Name)))
    {
        Type convertTo = Nullable.GetUnderlyingType(property.PropertyType) ?? property.PropertyType;
        property.SetValue(_Object, Convert.ChangeType(reader[property.Name], convertTo), null);
    }
    else
    {
        property.SetValue(_Object, null);
    }
}
I can see a number of things you could theoretically improve. For example:
store the result of GetOrdinal to reuse when getting values.
use a generic new() instead of Activator.CreateInstance.
Rather than using reflection, you could precompile an expression into a Func<> and invoke it for each row.
Call the appropriate Get...() methods based on the desired type to avoid boxing and unboxing.
But all of these are micro-optimizations and even combining them all together is unlikely to make a significant difference. The amount of time you're describing (minutes for tens of thousands of rows) is unlikely to be caused by any of these kinds of issues. It's most likely coming from I/O operations somewhere, but it's impossible to be certain without profiling the code to see where the most time is being spent. Profile your code, and then use those results to inform what approach you take.
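For illustration only - and with the caveat above that this is a micro-optimization you should profile before adopting - the "precompile an expression" idea might be sketched like this (`SetterCache` and `SamplePerson` are hypothetical names):

```csharp
using System;
using System.Linq.Expressions;
using System.Reflection;

public static class SetterCache
{
    // Compiles a property setter into a delegate once; invoking the
    // compiled delegate per row is much cheaper than calling
    // PropertyInfo.SetValue through reflection on every row.
    public static Action<T, object> CompileSetter<T>(PropertyInfo property)
    {
        var instance = Expression.Parameter(typeof(T), "instance");
        var value = Expression.Parameter(typeof(object), "value");
        // instance.Property = (PropertyType)value
        var body = Expression.Assign(
            Expression.Property(instance, property),
            Expression.Convert(value, property.PropertyType));
        return Expression.Lambda<Action<T, object>>(body, instance, value).Compile();
    }
}

public class SamplePerson
{
    public int Age { get; set; }
}
```

You would build the setters once (e.g. into a `Dictionary<string, Action<T, object>>` keyed by property name) before the read loop, then invoke them per row instead of `property.SetValue`.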

Passing a generic class to a method that returns a generic ObservableCollection of the same class that is passed

I am working with pulling data from a DB in WPF across multiple different tables, which I have modeled as POCOs in C#. I want to be able to create a method that is flexible enough to handle any of these POCOs being passed in, and then return an observable collection of the same class to the caller.
I have something set up that I haven't tested yet, but I know this probably isn't the best way to implement this, so I wanted to get some advice on how best to do this before I even bothered troubleshooting or attempting to get it to work:
public static ObservableCollection<object> SQLAuthentication(ObservableCollection<object> myCollection, object myClass, String sql)
{
    var conn = new SqlConnection();
    var paramList = GenerateSQLParameters(myClass, null);
    var tempModel = Global.GenerateNewInstance(myClass);
    //get the type
    Type model = tempModel.GetType();
    var prop = model.GetProperties();
    PropertyInfo pi;
    using (getConnection(conn))
    {
        conn.Open();
        SqlCommand cmd;
        SqlDataReader reader;
        cmd = new SqlCommand(sql, conn);
        reader = cmd.ExecuteReader();
        while (reader.Read())
        {
            //set the values for each property in the model
            foreach (var p in prop)
            {
                pi = tempModel.GetType().GetProperty(p.Name);
                pi.SetValue(tempModel, reader[p.Name]);
            }
            myCollection.Add(tempModel);
        }
        reader.Close();
        cmd.Dispose();
    }
    return myCollection;
}
You can use generic types in the method signature itself, you don't need to use an object collection... and as I'm writing this, John B hit the nail on the head:
public static ObservableCollection<T> SQLAuthentication<T>(ObservableCollection<T> myCollection, T myClass, String sql) where T : class
@John, if you post it as an answer I'll remove this since you did technically shout it out first; just leave a comment if you do so I get the notification.
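Filled in, the generic approach might look something like this sketch (`CollectionMapper` and `SampleRow` are hypothetical names; the mapping is written against IDataReader so it is not tied to SqlClient). Note that it creates a new T per row - the original code reused a single tempModel instance, so every entry in the collection ended up referencing the same object:

```csharp
using System;
using System.Collections.ObjectModel;
using System.Data;

public static class CollectionMapper
{
    // Maps every row of an open IDataReader onto a new T, matching
    // column names to property names. A new instance is created per
    // row - reusing one instance would make every entry in the
    // collection point at the same object.
    public static ObservableCollection<T> MapToCollection<T>(IDataReader reader)
        where T : class, new()
    {
        var collection = new ObservableCollection<T>();
        var props = typeof(T).GetProperties();
        while (reader.Read())
        {
            var item = new T();
            foreach (var p in props)
            {
                object value = reader[p.Name];
                p.SetValue(item, value == DBNull.Value ? null : value, null);
            }
            collection.Add(item);
        }
        return collection;
    }
}

public class SampleRow
{
    public string Name { get; set; }
}
```

The caller would open the connection, execute the command, and pass the reader in - keeping data access and mapping separate also makes the mapping testable without a database.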

Dynamically creating JSON objects from reflected methods

Thanks to a bunch of help from the SO hivemind, I've been able to create a REST API (C#, MVC) that handles dynamic (at least that's what I want to call it) calls by catching the string param and finding the correct method through reflection.
var myType = typeof(JaberoDC.JaberoDC.JaberoDC);
var method = myType
    .GetMethods(BindingFlags.Public | BindingFlags.Instance | BindingFlags.DeclaredOnly)
    .Single(mi => mi.ReturnType == typeof(DataSet)
        && string.Equals(mi.Name, param, StringComparison.OrdinalIgnoreCase));
var subject = Activator.CreateInstance(myType);
var result = method.Invoke(subject, new Object[] { "", conStr, "", 0, 0, null });
DataSet ds = (DataSet)result;
What I need help with now is any sort of advice on how to handle the various results dynamically. Meaning that I need some help with understanding how to either create a class that handles 0=>N columns from the rows in the DataTable (after getting the DT from the DS), or how to serialize the data without creating instances of above mentioned class.
Again, I'm sorry if my terminology is off here.
I've basically been thrown into the deep-end at my current employer to create a REST API for a DataComponent that has a few hundred methods. As it is now the API only works when using below code:
List<Worksite> arr2 = new List<Worksite>();
foreach (DataTable table in ds.Tables)
{
    foreach (DataRow row in table.Rows)
    {
        string id = row["JobID"].ToString();
        string name = row["JobName"].ToString();
        string site = row["SiteName"].ToString();
        arr2.Add(new Worksite
        {
            ID = id,
            JobName = name,
            SiteName = site
        });
    }
}
Basically this only handles worksites (a specific DataSet in the DataComponent).
My current solution:
public string GetFromParam(String param)
{
    var myType = typeof(JaberoDC.JaberoDC.JaberoDC);
    var method = myType
        .GetMethods(BindingFlags.Public | BindingFlags.Instance | BindingFlags.DeclaredOnly)
        .Single(mi => mi.ReturnType == typeof(DataSet)
            && string.Equals(mi.Name, param, StringComparison.OrdinalIgnoreCase));
    var subject = Activator.CreateInstance(myType);
    var result = method.Invoke(subject, new Object[] { "", conStr, "", 0, 0, null });
    DataSet ds = (DataSet)result;
    List<GenerateModel> arr2 = new List<GenerateModel>();
    int count = 0;
    List<Dictionary<string, object>> rows = new List<Dictionary<string, object>>();
    Dictionary<string, object> dicRow;
    foreach (DataTable table in ds.Tables)
    {
        foreach (DataRow row in table.Rows)
        {
            dicRow = new Dictionary<string, object>();
            foreach (DataColumn col in table.Columns)
            {
                dicRow.Add(col.ColumnName, row[col]);
            }
            rows.Add(dicRow);
        }
    }
    string json = JsonConvert.SerializeObject(rows);
    return json;
}
I'm basically using the proposed solution in the marked answer, while handling the data key-value pairs dynamically by holding them in a dictionary.
I agree with Clint's comment about using Json.net by Newtonsoft. That has a lot of helper methods that will make your life easier.
Ideally, you will be creating one controller per type of data that you expect to receive. Then you can use the incoming data to create an instance of a class that is modeled after what you will be receiving. With RESTful APIs and MVC, I treat the incoming data as a FormDataCollection, which is like an incoming stream of keyvalue pairs that you must process sequentially. Here is a code sample to point you in the right direction:
// POST api/mycontrollername
public HttpResponseMessage Post(FormDataCollection fdc)
{
    try
    {
        if (fdc != null)
        {
            MyClass myInstance = new MyClass();
            IEnumerator<KeyValuePair<string, string>> pairs = fdc.GetEnumerator();
            while (pairs.MoveNext())
            {
                switch (pairs.Current.Key)
                {
                    case "PhoneNumber":
                        myInstance.PhoneNumber = pairs.Current.Value;
                        break;
                    ...
                    default:
                        // handle any additional columns
                        break;
                }
            }
            // do stuff with myInstance
            // respond with OK notification
        }
        else
        {
            // respond with bad request notification
        }
    }
    catch (Exception ex)
    {
        // respond with internal server error notification
    }
}
If you must handle multiple data types with the same controller, there may be a particular data field that would give you a clue as to what you are receiving so that you can handle it appropriately. If you genuinely have no idea what you are receiving at any time, you may be able to cast it as a dynamic type and use reflection, but it sounds like the data sender is setting you up for a tough integration in that case - it is definitely not how I would engineer an API...
EDIT: (To clarify based on your comments) It appears you want to accept requests for any number of data types using a single controller, and return a DataTable with the results, even though you don't know in advance which fields the DataTable will contain.
First, unless it is a specific requirement, I wouldn't use a DataTable because it is not a very platform-independent solution - if the requesting application is in a non-.NET language, it will have an easier time parsing a Json array compared to a DataTable. If you must use a DataTable, you can look into .Net's DataTable.ToXML() extension method, but I have run into issues with special characters not converting to XML well and some other gotchas with this approach.
If you want to go the recommended json route, the data you request will have to be sufficient to query the database (ie: RequestType = "jobSiteReport", MinDate = "7/1/2016", MaxDate = "7/12/2016"). Use this to query the database or generate the data (into a DataSet or DataTable if that is what you are comfortable with) depending on what the "RequestType" is. Use Linq or a loop or some other method to transform the DataTable into an object array (this could be an anonymous type if you want, but I prefer keeping things strongly typed when I have the option.) This is pseudocode, but should give you an idea of what I mean:
//using Newtonsoft.Json;
List<MyObject> objects = new List<MyObject>();
for (int i = 0; i < dt.Rows.Count; i++)
{
    MyObject obj = new MyObject();
    obj.Time = dt.Rows[i]["Time"];
    obj.Person = dt.Rows[i]["PersonID"];
    ...
    objects.Add(obj);
}
string json = JsonConvert.SerializeObject(objects.ToArray());
Now you have a string that contains json data with all of the results from your DataTable. Then you can return that. (You could also use a SqlDataReader to accomplish the same result without using a DataTable as an intermediate step - populate the list from the SqlDataReader, convert the list to an array, then serialize the array to json.)
If you are inventing the schema, you should create documentation about what the requester should expect to receive.
Json.Net also has methods for deserializing strongly typed objects from json, deserializing anonymous types, etc. You should read up on what is available - I'm sure that it has something that will be perfect for what you are trying to achieve.

Return Anonymous Type using SqlQuery RAW Query in Entity Framework

How can I make an Entity Framework SqlQuery return an anonymous type?
Right now I run a context.TheObject.SqlQuery() raw query. The query joins two tables and I want to return the results of the joined tables.
If I use it with a type, context.TheObject.SqlQuery(), I only get to see the results of the table of that same type.
I tried db.Database.SqlQuery<DbResults>("the sql query here"); with a pre-defined class that matches the result's objects, but all the fields are null.
Using Entity Framework 6 with MySQL.
I'm going out on a limb here, and will try to address your underlying problem instead of directly answering your question.
Your scenario with the pre-defined class should work. A likely pitfall is that the column names and the properties of your class did not match up.
Sample code (LinqPad)
var results = Database.SqlQuery<TestResult>("select r.Name, b.BankName from relation r inner join BankAccount b on b.RelationId = r.Id where r.Id = 2");
results.Dump();

public class TestResult
{
    public string Name { get; set; }
    public string BankName { get; set; }
}
I'd strongly advise you to revisit your problematic code using explicit types.
In direct response to your question: no, you can't return anonymous types from SqlQuery. The best you can do is build dynamic objects, but that unfortunately requires a fair bit of manual work using TypeBuilder. See http://www.codeproject.com/Articles/206416/Use-dynamic-type-in-Entity-Framework-SqlQuery for a sample.
Here's what I did.
Execute sp and get the results into a data reader
public static async Task<IEnumerable<object>> GetAnonymousResults(IUnitOfWork unitOfWork, string spName, SqlParameter[] outParameters, params SqlParameter[] parameters)
{
    // meh, you only need the context here. I happened to use the UnitOfWork pattern, hence this.
    var context = unitOfWork as DbContext;
    DbCommand command = new SqlCommand();
    command.CommandType = CommandType.StoredProcedure;
    command.CommandText = spName;
    command.Connection = context.Database.Connection;
    command.Parameters.AddRange(parameters);
    // Forget this if you don't have any out parameters
    command.Parameters.AddRange(outParameters);
    try
    {
        command.Connection.Open();
        var reader = await command.ExecuteReaderAsync();
        return reader.ToObjectList(); // A custom method implemented below
    }
    finally
    {
        command.Connection.Close();
    }
}
Read individual values from each row into an ExpandoObject and collect the ExpandoObjects in a list:
public static List<object> ToObjectList(this IDataReader dataReader, bool ignoreUnmappedColumns = true)
{
    var list = new List<object>();
    while (dataReader.Read())
    {
        IEnumerable<string> columnsName = dataReader.GetColumnNames(); // A custom method implemented below
        var obj = new ExpandoObject() as IDictionary<string, object>;
        foreach (var columnName in columnsName)
        {
            obj.Add(columnName, dataReader[columnName]);
        }
        var expando = (ExpandoObject)obj;
        list.Add(expando);
    }
    return list;
}
Get the list of columns by using the reader.GetSchemaTable() method
public static IEnumerable<string> GetColumnNames(this IDataReader reader)
{
    var schemaTable = reader.GetSchemaTable();
    return schemaTable == null
        ? Enumerable.Empty<string>()
        : schemaTable.Rows.OfType<DataRow>().Select(row => row["ColumnName"].ToString());
}
Usage
var results = await StandaloneFunctions.GetAnonymousResults(_unitOfWork, "spFind", outputParameters, parameters);
In my case, I happened to use SPs but this should work with queries. All you have to do is replace the command with the following ( and remove all the parameter passing)
command.CommandType = CommandType.Text;
command.CommandText = "select * from SomeTable";

C# function to return generic objects / entities

In my application I need to display a list of records returned by different stored procedures. Each stored procedure returns different types of records (i.e. the number of columns and the column types are different).
My original thought was to create a class for each type of records and create a function which would execute the corresponding stored procedure and return List< MyCustomClass>. Something like this:
public class MyCustomClass1
{
    public int Col1 { get; set; } // In reality the columns are NOT called Col1 and Col2 but have proper names
    public int Col2 { get; set; }
}
public static List<MyCustomClass1> GetDataForReport1(int Param1)
{
    List<MyCustomClass1> output = new List<MyCustomClass1>();
    using (SqlConnection cn = new SqlConnection("MyConnectionString"))
    using (SqlCommand cmd = new SqlCommand("MyProcNameForReport1", cn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.Add("@Param1", SqlDbType.Int).Value = Param1;
        SqlDataReader rdr = cmd.ExecuteReader();
        int Col1_Ordinal = rdr.GetOrdinal("Col1");
        int Col2_Ordinal = rdr.GetOrdinal("Col2");
        while (rdr.Read())
        {
            output.Add(new MyCustomClass1
            {
                Col1 = rdr.GetSqlInt32(Col1_Ordinal).Value,
                Col2 = rdr.GetSqlInt32(Col2_Ordinal).Value
            });
        }
        rdr.Close();
    }
    return output;
}
This works fine but as I don't need to manipulate those records in my client code (I just need to bind them to a graphical control in my application layer) it doesn't really make sense to do it this way as I would end up with plenty of custom classes that I wouldn't actually use. I found this which does the trick:
public static DataTable GetDataForReport1(int Param1)
{
    DataTable output = new DataTable();
    using (SqlConnection cn = new SqlConnection("MyConnectionString"))
    using (SqlCommand cmd = new SqlCommand("MyProcNameForReport1", cn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.Add("@Param1", SqlDbType.Int).Value = Param1;
        output.Load(cmd.ExecuteReader());
    }
    return output;
}
This returns a DataTable which I can bind to whatever control I use in my application layer. I'm wondering if using a DataTable is really needed.
Couldn't I return a list of objects created using anonymous classes:
public static List<object> GetDataForReport1(int Param1)
{
    List<object> output = new List<object>();
    using (SqlConnection cn = new SqlConnection("MyConnectionString"))
    using (SqlCommand cmd = new SqlCommand("MyProcNameForReport1", cn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.Add("@Param1", SqlDbType.Int).Value = Param1;
        SqlDataReader rdr = cmd.ExecuteReader();
        int Col1_Ordinal = rdr.GetOrdinal("Col1");
        int Col2_Ordinal = rdr.GetOrdinal("Col2");
        while (rdr.Read())
        {
            output.Add(new
            {
                Col1 = rdr.GetSqlInt32(Col1_Ordinal).Value,
                Col2 = rdr.GetSqlInt32(Col2_Ordinal).Value
            });
        }
        rdr.Close();
    }
    return output;
}
Any other ideas? Basically I just want the function to return 'something' which I can bind to a graphical control and I prefer not to have to create custom classes as they wouldn’t really be used. What would be the best approach?
The good news is: this isn't the first time the entities vs datasets issue has come up. It's a debate much older than my own programming experience. If you don't want to write DTO's or custom entities then your options are DataTables/DataSets or you could reinvent this wheel again. You wouldn't be the first and you won't be the last. The argument for DTO's/Entities is that you don't have the performance overheads that you get with DataSets. DataSets have to store a lot of extra information about the datatypes of the various columns, etc...
One thing, you might be happy to hear is that if you go the custom entity route and you are happy for your object property names to match the column names returned by your sprocs, you can skip writing all those GetDataForReport() mapping functions where you are using GetOrdinal to map columns to properties. Luckily, some smart monkeys have properly sussed that issue here.
EDIT: I was researching an entirely different problem today (binding datasets to silverlight datagrids) and came across this article by Vladimir Bodurov, which shows how to transform an IEnumerable of IDictionary to an IEnumerable of dynamically created objects (using IL). It occured to me that you could easily modify the extension method to accept a datareader rather than the IEnumerable of IDictionary to solve your issue of dynamic collections. It's pretty cool. I think it would accomplish exactly what you were after in that you no longer need either the dataset or the custom entities. In effect you end up with a custom entity collection but you lose the overhead of writing the actual classes.
If you are lazy, here's a method that turns a datareader into Vladimir's dictionary collection (it's less efficient than actually converting his extension method):
public static IEnumerable<IDictionary> ToEnumerableDictionary(this IDataReader dataReader)
{
    var list = new List<Dictionary<string, object>>();
    Dictionary<int, string> keys = null;
    while (dataReader.Read())
    {
        if (keys == null)
        {
            keys = new Dictionary<int, string>();
            for (var i = 0; i < dataReader.FieldCount; i++)
                keys.Add(i, dataReader.GetName(i));
        }
        var dictionary = keys.ToDictionary(ordinalKey => ordinalKey.Value, ordinalKey => dataReader[ordinalKey.Key]);
        list.Add(dictionary);
    }
    return list.ToArray();
}
If you're going to XmlSerialize them and want that to be efficient, they can't be anonymous. Are you going to send back the changes? Anonymous classes won't be much use for that.
If you're really not going to do anything with the data other than gridify it, a DataTable may well be a good answer for your context.
In general, if there's any sort of interesting business logic going on, you should be using Data Transfer Objects etc. rather than just ferrying around grids.
If you use anonymous classes, you are still defining each type. You just get to use Type Inference to reduce the amount of typing, and the class type will not clutter any namespaces.
The downside of using them here is that you want to return them out of the method. As you noted, the only way to do this is to cast them to objects, which makes all of their data inaccessible without reflection! Are you never going to do any processing or computation with this data? Display one of the columns as a picture instead of a value, or hide it based on some condition? Even if you are only ever going to have a single conditional statement, doing it with the anonymous class version will make you wish you hadn't.
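To make that last point concrete, here is a tiny sketch (hypothetical names): once an anonymous instance is stored behind object, its properties are unreachable at compile time and reflection is the only way back in.

```csharp
using System.Collections.Generic;

public static class AnonymousDemo
{
    public static List<object> Build()
    {
        // The anonymous type's properties are invisible behind 'object'.
        return new List<object> { new { Col1 = 1, Col2 = 2 } };
    }

    public static int ReadCol1(object row)
    {
        // row.Col1 would not compile - we have to go through reflection.
        return (int)row.GetType().GetProperty("Col1").GetValue(row, null);
    }
}
```

Every property access now costs a reflection lookup and a string-typed property name, which is exactly the kind of friction the answer above is warning about.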
