I am getting a SQL statement that a user inputs in the UI:
select * from [table] where [col1] is not null and col2 <> 0
I need to verify that all column names (col1 and col2) are enclosed in brackets (like [col1]), and show some popup message if any column name is not in brackets (in this case, col2).
Is there some way to do such SQL verification in C#?
For parsing SQL, you can use the TSqlParser library that ships with Visual Studio (Microsoft.SqlServer.TransactSql.ScriptDom). The parser accepts a visitor class that inherits from TSqlFragmentVisitor and overrides the ExplicitVisit method for ColumnReferenceExpression.
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using Microsoft.SqlServer.TransactSql.ScriptDom;

namespace ConsoleApplication8
{
    public class QueryParser
    {
        public IEnumerable<string> Parse(string sqlSelect)
        {
            TSql100Parser parser = new TSql100Parser(false);
            TextReader rd = new StringReader(sqlSelect);
            IList<ParseError> errors;

            var fragments = parser.Parse(rd, out errors);
            var columnVisitor = new SQLVisitor();
            fragments.Accept(columnVisitor);

            return new List<string>(columnVisitor.Columns);
        }
    }

    internal class SQLVisitor : TSqlFragmentVisitor
    {
        private readonly List<string> columns = new List<string>();

        // Reassembles the raw token text of a fragment, so brackets are preserved.
        private string GetNodeTokenText(TSqlFragment fragment)
        {
            StringBuilder tokenText = new StringBuilder();
            for (int counter = fragment.FirstTokenIndex; counter <= fragment.LastTokenIndex; counter++)
            {
                tokenText.Append(fragment.ScriptTokenStream[counter].Text);
            }
            return tokenText.ToString();
        }

        public override void ExplicitVisit(ColumnReferenceExpression node)
        {
            columns.Add(GetNodeTokenText(node));
        }

        public IEnumerable<string> Columns
        {
            get { return columns; }
        }
    }

    public class Program
    {
        private static void Main(string[] args)
        {
            QueryParser queryParser = new QueryParser();
            var columns = queryParser.Parse("SELECT A,[B],C,[D],E FROM T WHERE isnumeric(col3) = 1 Order by Id desc");
            foreach (var column in columns)
            {
                Console.WriteLine(column);
            }
        }
    }
}
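Since the visitor returns the raw token text of each column reference, the bracket verification the question asks for can be done with a simple check over that list. A minimal sketch, assuming bracketed columns arrive as "[col1]" and unbracketed ones as "col2" (the helper name and the console output standing in for the popup are my own):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class BracketChecker
{
    // Returns every column reference that is not fully bracket-delimited.
    public static IEnumerable<string> FindUnbracketed(IEnumerable<string> columns)
    {
        return columns.Where(c =>
        {
            // Handle qualified references like "t.[col]" by checking the last part.
            var lastPart = c.Split('.').Last();
            return !(lastPart.StartsWith("[") && lastPart.EndsWith("]"));
        });
    }

    public static void Main()
    {
        var columns = new[] { "[col1]", "col2" };
        foreach (var offender in FindUnbracketed(columns))
            Console.WriteLine("Not bracketed: " + offender); // here you would show the popup instead
    }
}
```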
However, I am not sure that parsing SQL that a user inputs in the UI is a good idea. Even if you run the SQL statements in a sandboxed environment where the user only has read permissions (I do hope you are doing this), it is not very user friendly. As a user, I would prefer to select the columns I want using check boxes, or by dragging and dropping them, rather than having to write a SQL query.
The CsvHelper .NET library seems fantastic so far, but the documentation is a little lacking for a pseudo-beginner like myself.
I need to read a CSV file and write the results to our SQL Server database. For the table I'm writing to, I need to map to its columns from the CSV columns, including concatenating multiple fields into one.
This is what I have for reading the csv file:
public static void Main(string[] args)
{
    using (var reader = new StreamReader(@"C:\Users\me\Documents\file.csv"))
    using (var csv = new CsvReader(reader))
    {
        csv.Configuration.PrepareHeaderForMatch = (string header, int index) =>
            header.Replace(" ", "_").Replace("(", "").Replace(")", "").Replace(".", "");
        var records = csv.GetRecords<EntityCsv>().ToList();
    }
}
My EntityCsv class contains property names for all columns of the csv file.
Then, I also have a class called TaskEntity which contains the property names and types for the destination database table (although I'm unclear as to whether I need this).
Finally, per advice from a colleague, I have a method set up to make use of SqlBulkCopy, like this:
public void AddBulk(List<TaskEntity> entities)
{
    using (var con = GetConnection())
    using (var bulk = new SqlBulkCopy(con))
    {
        bulk.BatchSize = 2000;
        bulk.BulkCopyTimeout = 0;
        bulk.DestinationTableName = "dbo.CsvExports";
        bulk.WriteToServer(entities.AsDataTable());
    }
}
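Note that `AsDataTable()` is not part of the framework; it is presumably a custom extension from the colleague's code. A minimal reflection-based sketch of such an extension (an assumption on my part, not the original implementation) could look like:

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Reflection;

public static class DataTableExtensions
{
    // Hypothetical stand-in for the colleague's AsDataTable() extension:
    // builds a DataTable whose columns mirror the public properties of T.
    public static DataTable AsDataTable<T>(this List<T> items)
    {
        var table = new DataTable(typeof(T).Name);
        PropertyInfo[] props = typeof(T).GetProperties(BindingFlags.Public | BindingFlags.Instance);

        foreach (var prop in props)
        {
            // Unwrap Nullable<T> so the column gets the underlying type.
            var type = Nullable.GetUnderlyingType(prop.PropertyType) ?? prop.PropertyType;
            table.Columns.Add(prop.Name, type);
        }

        foreach (var item in items)
        {
            var values = new object[props.Length];
            for (int i = 0; i < props.Length; i++)
                values[i] = props[i].GetValue(item, null) ?? (object)DBNull.Value;
            table.Rows.Add(values);
        }
        return table;
    }
}

// Tiny sample type used only to demonstrate the extension.
public class Row
{
    public int Id { get; set; }
    public string Name { get; set; }
}
```

With no explicit ColumnMappings, SqlBulkCopy maps columns by ordinal, so either keep the property order aligned with the destination table or add a `bulk.ColumnMappings.Add(name, name)` call per property.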
I borrowed that code block from him and would theoretically run that method as the final step.
But I know I'm missing a step in between, and that is mapping the fields from the CSV to the SQL Server columns. I'm scratching my head at how to implement this step.
So let's say for simplicity's sake I have 3 columns in the csv file, and I want to map them to 2 columns of the SQL table as follows:
CsvColumn1 -> SQLtableColumn1
CsvColumn2 + CsvColumn3 -> SQLtableColumn2
How would I go about accomplishing this with CsvReader and C#? I have explored the Mapping section of the CsvHelper documentation, but everything I'm seeing there seems to refer to mapping column names from an input file to names in an output file. I don't see anything there (nor anywhere on Google) that speaks specifically to taking the input file and exporting its rows to a SQL database.
You can use a ClassMap to map the CSV columns to the SQL table columns and skip the EntityCsv class.
public static void Main(string[] args)
{
    using (var reader = new StreamReader(@"C:\Users\me\Documents\file.csv"))
    using (var csv = new CsvReader(reader))
    {
        csv.Configuration.PrepareHeaderForMatch = (string header, int index) =>
            header.Replace(" ", "_").Replace("(", "").Replace(")", "").Replace(".", "");
        csv.Configuration.RegisterClassMap<TaskEntityMap>();
        var records = csv.GetRecords<TaskEntity>().ToList();
    }
}
public class TaskEntity
{
    public int Id { get; set; }
    public string SqlTableColumn1 { get; set; }
    public string SqlTableColumn2 { get; set; }
}

public sealed class TaskEntityMap : ClassMap<TaskEntity>
{
    public TaskEntityMap()
    {
        Map(m => m.SqlTableColumn1).Name("CsvColumn1");
        Map(m => m.SqlTableColumn2).ConvertUsing(row =>
            row.GetField<string>("CsvColumn2") + " " + row.GetField<string>("CsvColumn3"));
    }
}
I have used SqlBulkCopy along with CsvHelper to dump data into SQL Server.
SqlBulkCopy is an awesome utility that writes data to SQL Server from nearly any data source that can be loaded into a DataTable instance.
var lines = File.ReadAllLines(file);
if (lines.Length == 0)
    return;

var tableName = GetTableName(file);
var columns = lines[0].Split(',').ToList();
var table = new DataTable();

sqlBulk.ColumnMappings.Clear();
foreach (var c in columns)
{
    table.Columns.Add(c);
    sqlBulk.ColumnMappings.Add(c, c);
}

// Start at 1 to skip the header row and include every remaining line
for (int i = 1; i < lines.Length; i++)
{
    // Explicitly mark empty values as null for the SQL import to work
    var row = lines[i].Split(',')
        .Select(a => string.IsNullOrEmpty(a) ? null : a).ToArray();
    table.Rows.Add(row);
}

sqlBulk.DestinationTableName = tableName;
sqlBulk.WriteToServer(table);
I can't find anything specific in the Firebird.NET documentation, but I need to extract multiple specific data rows out of a table from my Firebird database. I later want to host the server and web forms, where you can list all the employees with the specific data fields I want to choose manually.
The sql query would be:
SELECT employee_id, name, first_name, active_or_not
FROM t_personal
I can already make a query with a single return string inside the server class using Dapper, like this:
using System;
using System.Linq;
using FirebirdSql.Data.FirebirdClient;
using Dapper;
using System.Data;
// the references I use

public string SingleQuery(string commando)
{
    try
    {
        String query = _festeVerbindung.Query<String>(commando).Single();
        return query;
    }
    catch (Exception oof)
    {
        return oof.Message;
    }
}
Preferably I would like to extract them into a string array, because I think this would be the best option to put into an output web form later, but if you have any better ideas I would appreciate them.
I don't think your query will work as you expect. You are returning one row of type string, but your SQL query is returning multiple fields. If you really want to return a string, you can concatenate the fields in the SQL statement, but this would make dealing with the individual fields in the C# code harder. I would suggest something along the lines of the code below. I am illustrating three ways to do this: 1) with no WHERE clause in the SQL, 2) with a WHERE clause, and 3) converting the results to an array of strings. I wouldn't recommend the last snippet, it is sloppy, but it is there to show how you can transform the result set into an array of strings. If you really need to return an array of strings, I would suggest using a CSV or JSON text serializer. Those packages can handle the anomalies you'd find when dealing with strings (like NULL or embedded commas, etc.).
-HTH
public class EmployeeRec
{
    public int Employee_Id { get; set; }
    public string Name { get; set; }
    public string First_name { get; set; }
    public string active_or_not { get; set; }
}

public class QueryClass
{
    public EmployeeRec[] ExecuteQueryNoWhereClause()
    {
        var sql = @"SELECT employee_id, name, first_name, active_or_not FROM t_personal";
        try
        {
            using (IDbConnection _festeVerbindung = Data.Connection.GetConnection("My Connectionstring"))
            {
                return _festeVerbindung.Query<EmployeeRec>(sql).ToArray();
            }
        }
        catch (Exception)
        {
            throw;
        }
    }

    public EmployeeRec[] ExecuteQueryWithWhereClause(string activeFlag)
    {
        var sql = @"SELECT employee_id, name, first_name, active_or_not FROM t_personal WHERE active_or_not = @ACTIVE_FLAG";
        try
        {
            using (IDbConnection _festeVerbindung = Data.Connection.GetConnection("My Connectionstring"))
            {
                return _festeVerbindung.Query<EmployeeRec>(sql, new { ACTIVE_FLAG = activeFlag }).ToArray();
            }
        }
        catch (Exception)
        {
            throw;
        }
    }

    public string[] ExecuteQueryReturnStringArray(string activeFlag)
    {
        List<string> retArray = new List<string>();
        var sql = @"SELECT employee_id, name, first_name, active_or_not FROM t_personal WHERE active_or_not = @ACTIVE_FLAG";
        try
        {
            using (IDbConnection _festeVerbindung = Data.Connection.GetConnection("My Connectionstring"))
            {
                var results = _festeVerbindung.Query<EmployeeRec>(sql, new { ACTIVE_FLAG = activeFlag }).ToArray();

                // There are better ways to do this; if you need to return a string array,
                // you should use a JSON or CSV text serializer like ServiceStack.Text.
                retArray.Add("employee_id, name, first_name, active_or_not");
                foreach (var item in results)
                {
                    retArray.Add(item.Employee_Id + "," + item.Name + "," + item.First_name + "," + item.active_or_not);
                }
                return retArray.ToArray();
            }
        }
        catch (Exception)
        {
            throw;
        }
    }
}
Good day! I'm trying to parse an XML subchild using a DataSet. The thing is, it's not reading the "SiteCode" when it has multiple values.
for example:
string filePath = _clsPathIntervalSttngs.localPath + "/" + "hehe.xml";
DataSet dataSet = new DataSet();
dataSet.ReadXml(filePath, XmlReadMode.InferSchema);

// Then display the information to test
foreach (DataTable table in dataSet.Tables)
{
    Console.WriteLine(table);
    for (int i = 0; i < table.Columns.Count; ++i)
    {
        Console.Write("\t" + table.Columns[i].ColumnName.Substring(0, Math.Min(6, table.Columns[i].ColumnName.Length)));
        Console.WriteLine();
    }
    foreach (var row in table.AsEnumerable())
    {
        for (int i = 0; i < table.Columns.Count; ++i)
        {
            Console.Write("\t" + row[i]);
        }
        Console.WriteLine();
    }
}
This is what it is returning:
It's returning a 0 value and selecting the Product instead of SiteCode.
Where did I go wrong?
You might have to check the code, because I just took something similar I had lying around and changed it to look at your document hierarchy. I also didn't use a DataSet. Consider the following code:
var filePath = "<path to your file.xml>";
var xml = XDocument.Load(filePath);

var items = from item in xml.Descendants("Product").Elements()
            select item.Value;

Array.ForEach(items.ToArray(), Console.WriteLine);
That should show you the values of each element under product. If you want the whole element, remove the .Value in the select clause of the LINQ query.
Update
I'm now projecting to an anonymous type. You'll get one of these for each Product element in the file.
var items = from item in xml.Descendants("Product")
            select new
            {
                RefCode = item.Element("RefCode").Value,
                Codes = string.Join(", ", item.Elements("SiteCode").Select(x => x.Value)),
                Status = item.Element("Status").Value
            };

Array.ForEach(items.ToArray(), Console.WriteLine);
I have flattened the codes to a comma separated string but you can keep the IEnumerable or ToList it as you wish.
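If you want to try the projection without the original file, here is a self-contained sketch over an inline XML fragment; the element names (RefCode, SiteCode, Status) and sample values are assumed from the question:

```csharp
using System;
using System.Linq;
using System.Xml.Linq;

class SiteCodeDemo
{
    // Inline stand-in for hehe.xml; structure assumed from the question.
    const string Sample = @"
        <Products>
          <Product>
            <RefCode>R1</RefCode>
            <SiteCode>101</SiteCode>
            <SiteCode>102</SiteCode>
            <Status>Active</Status>
          </Product>
        </Products>";

    static void Main()
    {
        var xml = XDocument.Parse(Sample);

        var items = from item in xml.Descendants("Product")
                    select new
                    {
                        RefCode = item.Element("RefCode").Value,
                        // Multiple SiteCode elements are flattened to one string.
                        Codes = string.Join(", ", item.Elements("SiteCode").Select(x => x.Value)),
                        Status = item.Element("Status").Value
                    };

        foreach (var i in items)
            Console.WriteLine(i.RefCode + " | " + i.Codes + " | " + i.Status);
        // prints: R1 | 101, 102 | Active
    }
}
```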
Using XML LINQ:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Xml;
using System.Xml.Linq;

namespace ConsoleApplication51
{
    class Program
    {
        const string FILENAME = @"c:\temp\test.xml";

        static void Main(string[] args)
        {
            XElement doc = XElement.Load(FILENAME);
            List<Product> products = doc.Descendants("Product").Select(x => new Product()
            {
                refCode = (string)x.Element("RefCode"),
                siteCode = x.Elements("SiteCode").Select(y => (int)y).ToArray(),
                status = (string)x.Element("Status")
            }).ToList();
        }
    }

    public class Product
    {
        public string refCode { get; set; }
        public int[] siteCode { get; set; }
        public string status { get; set; }
    }
}
I am building a Web API to generate JSON objects in .NET Core.
The thing is, the data sets are generated in SQL stored procedures (using dynamic SQL), and I don't know the type of the objects that are returned, so I can't map them to a concrete model, since the output columns change depending on the parameters.
Does anyone know how to retrieve the data set from the DB in .NET Core 1.0, with or without using EF?
I browsed a lot and can only find answers that use models.
Thanks in advance.
You can add the following dependencies for your project in the project.json file:
System.Data.Common
System.Data.SqlClient
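A minimal project.json fragment for this might look like the following (the version numbers are illustrative for .NET Core 1.0, not prescriptive):

```json
{
  "dependencies": {
    "System.Data.Common": "4.1.0",
    "System.Data.SqlClient": "4.1.0"
  }
}
```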
Rebuild your project and you can code something like this:
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Dynamic;

namespace ConsoleApp1
{
    public class Program
    {
        public static IEnumerable<dynamic> GetData(String cmdText)
        {
            using (var connection = new SqlConnection("server=(local);database=Northwind;integrated security=yes;"))
            {
                connection.Open();
                using (var command = new SqlCommand(cmdText, connection))
                {
                    using (var dataReader = command.ExecuteReader())
                    {
                        // Capture the column names once, up front.
                        var fields = new List<String>();
                        for (var i = 0; i < dataReader.FieldCount; i++)
                        {
                            fields.Add(dataReader.GetName(i));
                        }
                        while (dataReader.Read())
                        {
                            // Build a dynamic object with one property per column.
                            var item = new ExpandoObject() as IDictionary<String, Object>;
                            for (var i = 0; i < fields.Count; i++)
                            {
                                item.Add(fields[i], dataReader[fields[i]]);
                            }
                            yield return item;
                        }
                    }
                }
            }
        }

        public static void Main(String[] args)
        {
            foreach (dynamic row in GetData("select * from Shippers"))
            {
                Console.WriteLine("Company name: {0}", row.CompanyName);
                Console.WriteLine();
            }
            Console.ReadKey();
        }
    }
}
Please let me know if this is useful.
Regarding this previous stackoverflow question:
DRY CLR table-valued functions
It seems it only runs in single-threaded mode. To test this I modified the code slightly to prepend the Name field with the current thread number. All of the returned results had the same thread number assigned. Is this by design? Is there any way to get it to multithread? Thanks.
// This class holds a row which we want to return.
private class ResultRow
{
    public SqlInt32 CustId;
    public SqlString Name;

    public ResultRow(SqlInt32 custId_, SqlString name_)
    {
        int mythread = Thread.CurrentThread.ManagedThreadId;
        CustId = custId_;
        Name = "[" + mythread.ToString() + "] " + name_;
    }
}
EDITED per Marc's question:
Here's the full piece of code. It returns 3470 records in 7 seconds.
using System;
using System.Data;
using System.Data.SqlClient;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;
using System.Runtime.InteropServices;
using System.Text;
using System.Collections.Generic;
using System.Collections;
using System.Threading;

namespace CS_CLR_TVF
{
    public partial class UserDefinedFunctions
    {
        // This class holds a row which we want to return.
        private class ResultRow
        {
            public SqlString fldProductName;

            public ResultRow(SqlString product_)
            {
                int mythread = Thread.CurrentThread.ManagedThreadId;
                fldProductName = "[" + mythread.ToString() + "] " + product_;
            }
        }

        [SqlFunction(DataAccess = DataAccessKind.Read, FillRowMethodName = "Test_FillRow", TableDefinition = "fldProductName nvarchar(1024)")]
        public static IEnumerable xudf_cs_tvf(SqlString strSearchClue)
        {
            ArrayList results = new ArrayList();
            using (SqlConnection connection = new SqlConnection("context connection=true"))
            {
                connection.Open();
                string s1;
                using (SqlCommand select = new SqlCommand("SELECT fldProductName FROM tblProducts", connection))
                {
                    using (SqlDataReader reader = select.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            s1 = reader.GetSqlString(0).ToString();
                            // do a substring compare; if it matches, grab the row
                            int idx = s1.IndexOf(strSearchClue.ToString());
                            if (idx > -1) results.Add(new ResultRow(reader.GetSqlString(0)));
                        }
                    }
                }
            }
            return results;
        }

        // This function takes a row and tells SQL Server what variables we want to
        // return from it and what types it contains.
        public static void Test_FillRow(object resultsObj, out SqlString fldProductName)
        {
            ResultRow selectResults = (ResultRow)resultsObj;
            fldProductName = selectResults.fldProductName;
        }
    }
}
Pretty straightforward internal SELECT statement:
SELECT fldProductName FROM tblProducts
Here's a version implemented as a scalar UDF, and it does use multiple threads. It returns 3470 records in <1 second.
[Microsoft.SqlServer.Server.SqlFunction]
public static long xudf_csfake(SqlString strSearchClue, SqlString strStringtoSearch)
{
    string s1 = strStringtoSearch.ToString();
    // do a substring compare; if it matches, flag the row
    int idx = s1.IndexOf(strSearchClue.ToString());
    if (idx > -1) return 1;
    return 0;
}
Here is its external SELECT statement:
SELECT fldProductName FROM tblProducts WHERE (dbo.xudf_csfake('METAL' ,fldProductName) = 1)
So I seem to be getting the opposite of what the article indicates.