I am building a Web API in .NET Core that generates JSON objects.
The thing is, the data sets are generated by SQL stored procedures (using dynamic SQL), and I don't know the type of the objects that are returned, so I can't map them to a concrete model, since the output columns change depending on the parameters.
Does anyone know how to retrieve the data set from the DB in .NET Core 1.0, with or without using EF?
I've browsed a lot and can only find answers that use models.
Thanks in advance
You can add the following dependencies to your project in the project.json file:
System.Data.Common
System.Data.SqlClient
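For example, the dependencies section of project.json might look like the following (a sketch: the package versions shown are assumptions, so match them to your target framework):

{
  "dependencies": {
    "System.Data.Common": "4.1.0",
    "System.Data.SqlClient": "4.1.0"
  }
}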
Rebuild your project and you can code something like this:
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Dynamic;

namespace ConsoleApp1
{
    public class Program
    {
        public static IEnumerable<dynamic> GetData(String cmdText)
        {
            using (var connection = new SqlConnection("server=(local);database=Northwind;integrated security=yes;"))
            {
                connection.Open();
                using (var command = new SqlCommand(cmdText, connection))
                {
                    using (var dataReader = command.ExecuteReader())
                    {
                        // Read the column names once, up front.
                        var fields = new List<String>();
                        for (var i = 0; i < dataReader.FieldCount; i++)
                        {
                            fields.Add(dataReader.GetName(i));
                        }

                        // Project each row into an ExpandoObject keyed by column name.
                        while (dataReader.Read())
                        {
                            var item = new ExpandoObject() as IDictionary<String, Object>;
                            for (var i = 0; i < fields.Count; i++)
                            {
                                item.Add(fields[i], dataReader[fields[i]]);
                            }
                            yield return item;
                        }
                    }
                }
            }
        }

        public static void Main(String[] args)
        {
            foreach (dynamic row in GetData("select * from Shippers"))
            {
                Console.WriteLine("Company name: {0}", row.CompanyName);
                Console.WriteLine();
            }
            Console.ReadKey();
        }
    }
}
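Since the goal is a Web API that returns JSON, the same method can back a controller action; the default JSON serializer in ASP.NET Core (Json.NET at the time) renders each ExpandoObject as a plain JSON object. A minimal sketch, where the controller and route names are assumptions:

using Microsoft.AspNetCore.Mvc;

namespace ConsoleApp1
{
    [Route("api/[controller]")]
    public class DataController : Controller
    {
        // GET api/data: the response shape follows whatever columns
        // the stored procedure happened to return.
        [HttpGet]
        public IActionResult Get()
        {
            return Ok(Program.GetData("select * from Shippers"));
        }
    }
}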
Please let me know if this is useful.
I am reading a text file into a DataTable and then using SqlBulkCopy to copy it to a database. While the bulk copy is fast, inserting the 50,000+ lines into the DataTable is not (around 5 minutes). How do I make it efficient?
Can I insert data into the DataTable quickly?
If not, is there a way to save the inserted data permanently into the DataTable so I don't have to insert it every time I run the program?
for (; i < fares.Length; )
{
    k = i;
    Console.WriteLine("Inserting " + k + " out of " + fares.Length);
    for (; i <= (k + 3); i++)
    {
        if (i % 4 == 0)
        {
            for (int j = 0; j < fares.Length - 1; j++)
            {
                int space = fares[i].IndexOf(" ");
                startStation = fares[i].Substring(0, space);
                endStation = fares[i].Substring(space + 1, fares[i].Length - space - 1);
            }
        }
        else if (i % 4 == 1)
        {
            valueFare = fares[i];
        }
        else if (i % 4 == 2)
        {
            standardFare = fares[i];
        }
        else if (i % 4 == 3)
        {
            time = int.Parse(fares[i]);
        }
    }
    faresDT.Rows.Add(startStation, endStation, valueFare, standardFare, time);
}
If what you want is to optimize your load to the database, I suggest that you get rid of the DataTable completely. By making use of Marc Gravell's FastMember (and anyone who's using SqlBulkCopy should be using FastMember, IMHO) you can get a DataReader directly from any IEnumerable.
I would use some variation of the code below whenever writing from a file directly to a database. It streams the contents of the file straight into the SqlBulkCopy operation through the use of yield return and the lazy evaluation of IEnumerable.
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using System.IO;
using System.Text;
using FastMember;

namespace BulkCopyTest
{
    public class Program
    {
        public static void Main(string[] args)
        {
            const string filePath = "SOME FILE THAT YOU WANT TO LOAD TO A DB";

            WriteData(GetData<dynamic>(filePath));
        }

        private static void WriteData<T>(IEnumerable<T> data)
        {
            using (var connection = GetConnection())
            {
                // SqlBulkCopy expects an already-open connection when one
                // is passed in explicitly.
                connection.Open();

                using (var bcp = new SqlBulkCopy(connection, SqlBulkCopyOptions.TableLock, null))
                using (var reader = ObjectReader.Create(data))
                {
                    SetColumnMappings<T>(bcp.ColumnMappings);
                    bcp.BulkCopyTimeout = 300;
                    bcp.BatchSize = 150000;
                    bcp.DestinationTableName = ""; //TODO: Set correct TableName
                    bcp.WriteToServer(reader);
                }
            }
        }

        private static void SetColumnMappings<T>(SqlBulkCopyColumnMappingCollection mappings)
        {
            //Setup your column mappings
        }

        private static IEnumerable<T> GetData<T>(string filePath)
        {
            using (var fileStream = File.OpenRead(filePath))
            using (var reader = new StreamReader(fileStream, Encoding.UTF8))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    //TODO: Add actual parsing logic and whatever else is needed to create an instance of T
                    yield return Activator.CreateInstance<T>();
                }
            }
        }

        private static SqlConnection GetConnection()
        {
            return new SqlConnection(new SqlConnectionStringBuilder
            {
                //TODO: Set Connection information here
            }.ConnectionString);
        }
    }
}
In this case I think you should take advantage of the BeginLoadData, LoadDataRow and EndLoadData methods provided by the DataTable class. You could use them like this:
try
{
    faresDT.BeginLoadData();

    // Your for loop...
    {
        // Logic defining the value of startStation, endStation, valueFare, standardFare and time removed for brevity.
        faresDT.LoadDataRow(new object[] { startStation, endStation, valueFare, standardFare, time }, true);
    }
}
finally
{
    faresDT.EndLoadData();
}
What BeginLoadData() does is turn off some of the processing (notifications, index maintenance and constraint checking) that normally happens every time you add a row; that work is then done only once, when you finish loading the data by calling EndLoadData().
You can find more details about these APIs here:
https://learn.microsoft.com/en-us/dotnet/api/system.data.datatable.loaddatarow?view=netframework-4.7.2
I have a problem fetching data from a database, because I'm new to the C# language. I thought that ExecuteReader would retrieve all rows as complete objects, but reader.GetString(0) only gives me the value at a single column index, and I can't access a row as a full object.
Are there any C# facilities for accessing whole rows from the database?
I'm looking for an answer to solve this problem. I'd appreciate any help!
Here is the code I have so far:
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Npgsql;

namespace Sample.Controllers
{
    public class HomeController : Controller
    {
        public string Index()
        {
            var connString = "Host=localhost;Username=postgres;Password=123456;Database=dotnet_core_db";

            using (var conn = new NpgsqlConnection(connString))
            {
                conn.Open();

                // Retrieve all rows
                using (var cmd = new NpgsqlCommand("SELECT * FROM Persons", conn))
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        Console.WriteLine(reader.GetString(0));
                        Console.WriteLine(reader.GetString(1));
                        Console.WriteLine(reader.GetString(2));
                        Console.WriteLine(reader.GetString(3));
                        Console.WriteLine(reader.GetString(4));

                        ViewData["connString"] = connString;
                    }
                }
            }

            return connString;
        }

        public IActionResult Error()
        {
            ViewData["RequestId"] = Activity.Current?.Id ?? HttpContext.TraceIdentifier;
            return View();
        }
    }
}
Expected result:
[
    {
        "id": 92,
        "name": "foo",
        "last_name": "bar",
        "address": "address of foo",
        "city": "city of foo"
    },
    {
        "id": 872,
        "name": "foo1",
        "last_name": "bar 2",
        "address": "address of foo1",
        "city": "city of foo1"
    }
]
You can do it like the example below:
using (var cmd = new NpgsqlCommand("SELECT * FROM Persons", conn))
{
    using (var reader = cmd.ExecuteReader())
    {
        // Collect the column names once.
        var columns = new List<string>();
        for (int i = 0; i < reader.FieldCount; i++)
            columns.Add(reader.GetName(i));

        JsonArrayCollection jsonArray = new JsonArrayCollection();
        while (reader.Read())
        {
            // One JSON object per row, keyed by column name.
            JsonObjectCollection jsonObject = new JsonObjectCollection();
            foreach (string columnName in columns)
                jsonObject.Add(new JsonStringValue(columnName, reader[columnName].ToString()));
            jsonArray.Add(jsonObject);
        }
    }
}
This was written in Notepad, so please excuse any compilation errors.
Note: you can use any JSON library; only the syntax will change.
The data retrieval is okay. What you need is a JSON-writing package; consider a library like Json.NET to ease the job.
// Requires Json.NET (Newtonsoft.Json); dataReader is your open data reader
// and fileWriter is whatever TextWriter receives the output.
StringBuilder sb = new StringBuilder();
StringWriter sw = new StringWriter(sb);

using (JsonWriter writer = new JsonTextWriter(sw))
{
    writer.Formatting = Formatting.Indented;

    // Wrap the rows in an array so the result is a single valid JSON document.
    writer.WriteStartArray();
    while (dataReader.Read())
    {
        writer.WriteStartObject();
        for (int i = 0; i < dataReader.FieldCount; ++i)
        {
            writer.WritePropertyName(dataReader.GetName(i));
            writer.WriteValue(dataReader.GetValue(i).ToString());
        }
        writer.WriteEndObject();
    }
    writer.WriteEndArray();
}

fileWriter.Write(sb.ToString());
I have an application which stores some data in a SQL database in a binary field. These data are serialized from a .NET object in a traditional .NET Framework application with binary serialization. I'm writing a .NET Core application that needs to interface with the above by reading the binary data and making sense of it.
Ideally I want to be able to just deserialize these data with .NET Core as if I was running the full .NET Framework. The data themselves are not complex; they are just a Dictionary<string,string> produced by the old ASP.NET profile provider.
I need to both read and write the binary data from .NET Core code.
I understand that support for BinaryFormatter is coming to .NET Core. In the meantime, is there anything not very complicated I can do right now to serialize/deserialize data with .NET Core the way the full .NET Framework would?
Looks like we have to write our own BinaryReader/BinaryWriter methods...
SCHEMA
CREATE TABLE dbo.BinaryTest (
BinaryData varbinary(max) NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
CODE
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using System.IO;

namespace StackOverflow.ConsoleTester
{
    public class Program
    {
        private const String SqlConnectionString = @"Server={Server};Database={Database};User={User};Password={Password};";

        public static void Main(String[] args)
        {
            // Start from a clean table.
            using (var db = new SqlConnection(SqlConnectionString))
            {
                var sqlCommand = new SqlCommand("DELETE FROM dbo.BinaryTest;", db) { CommandType = CommandType.Text };
                db.Open();
                sqlCommand.ExecuteNonQuery();
            }

            // Serialize a dictionary to bytes and insert it.
            using (var db = new SqlConnection(SqlConnectionString))
            {
                var serializedData = new Dictionary<String, String> { { "key1", "value1" }, { "key2", "value2" } };
                var binaryData = WriteBinaryData(serializedData);

                var sqlCommand = new SqlCommand("INSERT INTO dbo.BinaryTest (BinaryData) VALUES (@BinaryData);", db) { CommandType = CommandType.Text };
                var parameter = new SqlParameter("BinaryData", SqlDbType.VarBinary) { Value = binaryData };
                sqlCommand.Parameters.Add(parameter);

                db.Open();
                sqlCommand.ExecuteNonQuery();
            }

            // Read the bytes back and rebuild the dictionary.
            Dictionary<String, String> deserializedData = null;
            using (var db = new SqlConnection(SqlConnectionString))
            {
                var sqlCommand = new SqlCommand("SELECT BinaryData FROM dbo.BinaryTest", db) { CommandType = CommandType.Text };
                db.Open();
                var reader = sqlCommand.ExecuteReader();
                while (reader.Read())
                {
                    deserializedData = ReadBinaryData(reader.GetSqlBinary(0));
                }
            }

            if (deserializedData != null)
            {
                foreach (var item in deserializedData)
                {
                    Console.WriteLine($"Key: {item.Key}; Value: {item.Value}");
                }
            }

            Console.ReadKey();
        }

        private static Byte[] WriteBinaryData(Dictionary<String, String> data)
        {
            var memoryStream = new MemoryStream();
            var binaryWriter = new BinaryWriter(memoryStream);

            // BinaryWriter.Write(String) length-prefixes each string, which is
            // what lets ReadBinaryData recover the key/value pairs below.
            foreach (var item in data)
            {
                binaryWriter.Write(item.Key);
                binaryWriter.Write(item.Value);
            }

            var binaryData = memoryStream.ToArray();
            return binaryData;
        }

        private static Dictionary<String, String> ReadBinaryData(SqlBinary data)
        {
            var model = new Dictionary<String, String>();
            var memoryStream = new MemoryStream(data.Value);
            var binaryReader = new BinaryReader(memoryStream);

            while (binaryReader.BaseStream.Position != binaryReader.BaseStream.Length)
            {
                model.Add(binaryReader.ReadString(), binaryReader.ReadString());
            }

            return model;
        }
    }
}
I am getting a SQL statement that the user inputs in the UI:
select * from [table] where [col1] is not null and col2 <> 0
I need to verify that all column names (col1 and col2) are in brackets (like [col1]), and show a popup message if any column name is not in brackets (in this case, col2).
Is there some way to do such SQL verification in C#?
Regarding parsing SQL, you can use the TSqlParser library that ships with Visual Studio. This parser accepts a visitor class that inherits from TSqlFragmentVisitor and overrides the visit method for ColumnReferenceExpression.
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using Microsoft.SqlServer.TransactSql.ScriptDom;

namespace ConsoleApplication8
{
    public class QueryParser
    {
        public IEnumerable<string> Parse(string sqlSelect)
        {
            TSql100Parser parser = new TSql100Parser(false);
            TextReader rd = new StringReader(sqlSelect);
            IList<ParseError> errors;

            // Walk the parsed fragment and collect every column reference.
            var fragments = parser.Parse(rd, out errors);
            var columnVisitor = new SQLVisitor();
            fragments.Accept(columnVisitor);

            return new List<string>(columnVisitor.Columns);
        }
    }

    internal class SQLVisitor : TSqlFragmentVisitor
    {
        private List<string> columns = new List<string>();

        // Reassemble the raw token text for a fragment, brackets included.
        private string GetNodeTokenText(TSqlFragment fragment)
        {
            StringBuilder tokenText = new StringBuilder();
            for (int counter = fragment.FirstTokenIndex; counter <= fragment.LastTokenIndex; counter++)
            {
                tokenText.Append(fragment.ScriptTokenStream[counter].Text);
            }
            return tokenText.ToString();
        }

        public override void ExplicitVisit(ColumnReferenceExpression node)
        {
            columns.Add(GetNodeTokenText(node));
        }

        public IEnumerable<string> Columns
        {
            get { return columns; }
        }
    }

    public class Program
    {
        private static void Main(string[] args)
        {
            QueryParser queryParser = new QueryParser();
            var columns = queryParser.Parse("SELECT A,[B],C,[D],E FROM T WHERE isnumeric(col3) = 1 Order by Id desc");

            foreach (var column in columns)
            {
                Console.WriteLine(column);
            }
        }
    }
}
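To address the bracket check itself: each Identifier in a parsed column reference carries a QuoteType, so a small variation of the visitor above can collect the offending names. A minimal sketch along those lines (BracketCheckVisitor is a name invented here):

internal class BracketCheckVisitor : TSqlFragmentVisitor
{
    // Column names that were written without [brackets].
    public List<string> UnbracketedColumns { get; } = new List<string>();

    public override void ExplicitVisit(ColumnReferenceExpression node)
    {
        var identifiers = node.MultiPartIdentifier?.Identifiers;
        if (identifiers == null || identifiers.Count == 0)
        {
            return;
        }

        // Only the last part is the column name itself; earlier parts
        // are table or alias qualifiers.
        Identifier column = identifiers[identifiers.Count - 1];
        if (column.QuoteType != QuoteType.SquareBracket)
        {
            UnbracketedColumns.Add(column.Value);
        }
    }
}

Run it the same way as SQLVisitor above; if UnbracketedColumns is non-empty after Accept, show your popup message.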
However, I am not sure that parsing SQL that a user inputs in the UI is a good idea. Even if you run the SQL statements in a sandboxed environment where the user only has read permissions (I do hope you are doing this), it is not very user friendly. As a user, I would prefer to select the columns I want using check boxes, or by dragging and dropping them, rather than having to write a SQL query.
I'm trying to use SqlBulkCopy to import a bunch of data to our website. In most of the other areas we're using the Entity model, which uses byte arrays to represent binary data in SQL. However, SqlBulkCopy seems to be confusing byte[] with string. Everything seems to work fine except for this one binary column, which throws an exception: "The given value of type String from the data source cannot be converted to type binary of the specified target column."
I've created a small test case to illustrate the problem:
using System.Data;
using System.Data.SqlClient;

namespace SqlBulkCopyTest
{
    class Program
    {
        static void Main(string[] args)
        {
            DataTable table = new DataTable("BinaryData");
            table.Columns.Add("Data");

            for (int i = 0; i < 10; i++)
            {
                var row = table.NewRow();
                row["Data"] = new byte[5] { 1, 2, 3, 4, 5 };
                table.Rows.Add(row);
            }

            using (var connection =
                new SqlConnection("Data Source=localhost\\sqlexpress;Initial Catalog=TestBulkCopy;Integrated Security=True"))
            {
                connection.Open();
                using (var copier = new SqlBulkCopy(connection))
                {
                    copier.DestinationTableName = table.TableName;
                    /* EXCEPTION HERE: */ copier.WriteToServer(table);
                }
            }
        }
    }
}
This uses a test database with a BinaryData table which has a single binary(5) column named Data.
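For reference, the table might be created with something like this (a sketch; the actual test schema is assumed):

CREATE TABLE dbo.BinaryData (
    Data binary(5) NULL
)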
Any help would be greatly appreciated
Instead of:
table.Columns.Add("Data");
Add the "Data" column with an explicit binary type:
table.Columns.Add("Data", typeof(Byte[]));
When no type is given, a DataColumn defaults to typeof(string), so the byte arrays are stored as strings, which is why SqlBulkCopy then fails to convert them to the binary target column.