Is it possible to get a list of DataTables from SQL Server? - c#

I am trying to get a list of tables from my SQL Server database. I would like the DataTable object not just the name.
So far I have got:
using (SqlConnection connection = new SqlConnection(Settings.Default.DatabaseString))
{
List<DataTable> tables = new List<DataTable>();
connection.Open();
}
I can get the names of the tables using:
DataTable schema = connection.GetSchema("Tables");
foreach (DataRow row in schema.Rows)
{
var name = row[2].ToString();
}
But I would like more information than just the name of the table; I would like to return primary and foreign keys, etc. Is this possible?

You can get information about the database, tables, columns and indexes with the Connection.GetSchema overloads:
using (var con = new SqlConnection(Settings.Default.DatabaseString))
{
con.Open();
DataTable tables = con.GetSchema("Tables");
foreach (DataRow tableRow in tables.Rows)
{
String database = tableRow.Field<String>("TABLE_CATALOG");
String schema = tableRow.Field<String>("TABLE_SCHEMA");
String tableName = tableRow.Field<String>("TABLE_NAME");
String tableType = tableRow.Field<String>("TABLE_TYPE");
DataTable columns = con.GetSchema("Columns", new[] { database, null, tableName });
foreach (DataRow col in columns.Rows)
{
Console.WriteLine(string.Join(",",col.ItemArray));
}
DataTable indexes = con.GetSchema("Indexes", new[] { database, null, tableName });
foreach (DataRow index in indexes.Rows)
{
Console.WriteLine(string.Join(",", index.ItemArray));
}
DataTable indexColumns = con.GetSchema("IndexColumns", new[] { database, null, tableName });
foreach (DataRow indexCol in indexColumns.Rows)
{
Console.WriteLine(string.Join(",", indexCol.ItemArray));
}
}
}
Here's a list of possible values for GetSchema:
https://msdn.microsoft.com/en-us/library/cc716722(v=vs.110).aspx
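The provider also exposes a "ForeignKeys" collection, so, inside the table loop above, constraint names can be pulled the same way. A minimal sketch (verify the collection and column names against your provider version):
DataTable foreignKeys = con.GetSchema("ForeignKeys", new[] { database, null, tableName });
foreach (DataRow fk in foreignKeys.Rows)
{
    // Each row describes one FOREIGN KEY constraint on the table
    Console.WriteLine($"{fk["CONSTRAINT_NAME"]} ({fk["CONSTRAINT_TYPE"]}) on {fk["TABLE_NAME"]}");
}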

SQL Server is very open about querying the catalog. You can easily create a query that returns the data you require.
Multiple system views exist to help you:
SELECT * FROM sys.tables
SELECT * FROM sys.columns
SELECT * FROM sys.indexes
SELECT * FROM sys.objects o WHERE o.type IN ('PK', 'F')
The name of any of these objects can be queried with the system function OBJECT_NAME.
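As a minimal sketch of wiring this up from C# (dbo.MyTable is a placeholder table name), you could list a table's primary and foreign key constraints like this:
const string sql = @"
    SELECT o.name, o.type_desc
    FROM sys.objects o
    WHERE o.type IN ('PK', 'F')
      AND o.parent_object_id = OBJECT_ID(@table);";
using (var cmd = new SqlCommand(sql, connection))
{
    // dbo.MyTable is hypothetical; substitute your own table
    cmd.Parameters.AddWithValue("@table", "dbo.MyTable");
    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
            Console.WriteLine($"{reader["name"]}: {reader["type_desc"]}");
    }
}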

Related

How to get newly inserted Ids from identity primary key column when using SqlBulkCopy

I'm using C# and .NET Core 6. I want to bulk insert about 100 rows at once into the database and get back their Ids from the bigint identity column.
I have tried a lot of different variants but still do not have a working solution. When I preview the table variable, the Id column has a DBNull value, not the newly inserted Id.
How to get Ids of newly inserted rows?
What I have:
SQL Server table:
CREATE TABLE dbo.ExamResult
(
Id bigint IDENTITY(1, 1) NOT NULL,
Caption nvarchar(1024) NULL,
SortOrder int NOT NULL,
CONSTRAINT PK_ExmRslt PRIMARY KEY CLUSTERED (Id)
) ON [PRIMARY]
GO
C# code:
private static void BulkCopy(IEnumerable<ExamResultDb> examResults, SqlConnection connection)
{
using (var bulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, null))
{
var table = CreateDataTable(examResults);
bulkCopy.DestinationTableName = "ExamResult";
bulkCopy.EnableStreaming = true;
foreach (var item in table.Columns.Cast<DataColumn>())
{
if (item.ColumnName != "Id")
{
bulkCopy.ColumnMappings.Add(item.ColumnName, item.ColumnName);
}
}
using (var reader = new DataTableReader(table))
{
bulkCopy.WriteToServer(reader);
}
}
}
private static DataTable CreateDataTable(IEnumerable<ExamResultDb> examResults)
{
var table = new DataTable("ExamResult");
table.Columns.AddRange(new[]
{
new DataColumn("Id", typeof(long)),
new DataColumn("Caption", typeof(string)),
new DataColumn("SortOrder", typeof(int))
});
////table.Columns[0].AutoIncrement = true;
////table.PrimaryKey = new[] { table.Columns[0] };
foreach (var examResult in examResults)
{
var row = table.NewRow();
row["Caption"] = examResult.Caption;
row["SortOrder"] = examResult.SortOrder;
table.Rows.Add(row);
}
return table;
}
You need an OUTPUT clause on your insert, but SqlBulkCopy does not allow this kind of customization.
While nowhere near as neat, you could do this by using a Table Valued Parameter and a custom INSERT statement. TVPs use the bulk copy mechanism, so this should still be pretty fast, although there will be two inserts: one to the TVP and one to the real table.
First create a table type
CREATE TYPE dbo.Type_ExamResult AS TABLE
(
Caption nvarchar(1024) NULL,
SortOrder int NOT NULL
);
This function will iterate the rows as SqlDataRecord instances:
private static IEnumerable<SqlDataRecord> AsExamResultTVP(this IEnumerable<ExamResultDb> examResults)
{
// fine to reuse object, see https://stackoverflow.com/a/47640131/14868997
var record = new SqlDataRecord(
new SqlMetaData("Caption", SqlDbType.NVarChar, 1024),
new SqlMetaData("SortOrder", SqlDbType.Int)
);
foreach (var examResult in examResults)
{
record.SetString(0, examResult.Caption);
record.SetInt32(1, examResult.SortOrder);
yield return record; // looks weird, see above link
}
}
Finally insert using OUTPUT
private static void BulkCopy(IEnumerable<ExamResultDb> examResults, SqlConnection connection)
{
const string query = @"
INSERT dbo.ExamResult (Caption, SortOrder)
OUTPUT inserted.Id, inserted.SortOrder
SELECT t.Caption, t.SortOrder
FROM @tvp t;
";
var dict = examResults.ToDictionary(er => er.SortOrder);
using (var comm = new SqlCommand(query, connection))
{
comm.Parameters.Add(new SqlParameter("@tvp", SqlDbType.Structured)
{
TypeName = "dbo.Type_ExamResult",
Value = examResults.AsExamResultTVP(),
});
using (var reader = comm.ExecuteReader())
{
while(reader.Read())
dict[(int)reader["SortOrder"]].Id = (long)reader["Id"]; // Id is bigint, so read it back as long
}
}
}
Note that the above code assumes that SortOrder is a natural key within the dataset to be inserted. If it is not, you will need to add one; and if you are not inserting that column, you need a rather more complex MERGE statement to be able to access it in OUTPUT, something like this:
MERGE dbo.ExamResult er
USING #tvp t
ON 1 = 0 -- never match
WHEN NOT MATCHED THEN
INSERT (Caption, SortOrder)
VALUES (t.Caption, t.SortOrder)
OUTPUT inserted.Id, t.NewIdColumn;

Query number of rows and columns in SQLite database

Currently, I have a database called "testDB.db" that has 100 rows and 3 columns. Is there some library in C# that allows me to easily check how many columns and rows there are in table "test001" in "testDB.db"?
namespace SQLiteExtractor
{
class Program
{
static void Main()
{
string liteString = @"Data Source = .\testDB.db";
connectToSQLite(liteString);
}
public static void connectToSQLite(string liteConString)
{
using SQLiteConnection liteCon = new SQLiteConnection(liteConString);
liteCon.Open();
string query = "SELECT * FROM test001";
int sizeOfDR = 0;
List<string> liteEntries = new List<string>();
using var cmd = new SQLiteCommand(query, liteCon);
using SQLiteDataReader SQLiteDR = cmd.ExecuteReader();
while (SQLiteDR.Read())
{
liteEntries.Add(SQLiteDR.GetString(1));
}
foreach (string entry in liteEntries)
Console.WriteLine(entry);
}
}
}
For rows, you can run a SELECT COUNT(*) query against the table.
For columns, you can inspect the table metadata via system table queries. Use PRAGMA table_info(table_name); to get the column info.
Specific to your table:
SELECT COUNT(*) FROM test001
and
PRAGMA table_info('test001')
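From C#, both calls can be run with SQLiteCommand. A minimal sketch, reusing the open liteCon connection from the question:
// Row count via a scalar query; System.Data.SQLite returns COUNT(*) as Int64
using (var countCmd = new SQLiteCommand("SELECT COUNT(*) FROM test001", liteCon))
{
    long rowCount = (long)countCmd.ExecuteScalar();
    Console.WriteLine($"Rows: {rowCount}");
}
// Column count via PRAGMA: the result has one row per column
using (var pragmaCmd = new SQLiteCommand("PRAGMA table_info('test001')", liteCon))
using (var reader = pragmaCmd.ExecuteReader())
{
    int columnCount = 0;
    while (reader.Read())
        columnCount++; // reader["name"] holds each column's name
    Console.WriteLine($"Columns: {columnCount}");
}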

Get Values from DataTable by row and column name

I'm typically used to traditional tables in SQL where I have multiple columns with rows populated. I execute a stored procedure, store all the data in a DataTable, and loop through the table to get the results I need. For example,
public static DataTable getInfo (string sessionID)
{
try
{
SqlConnection conn = new SqlConnection(System.Configuration.ConfigurationManager.ConnectionStrings["SandBox"].ConnectionString);
SqlCommand cmd = new SqlCommand("GetSessionInfo", conn);
cmd.Parameters.AddWithValue("SessionGUID", sessionID);
cmd.CommandType = CommandType.StoredProcedure;
DataTable dt = new DataTable();
SqlDataAdapter da = new SqlDataAdapter(cmd);
da.Fill(dt);
return dt;
}
catch (Exception)
{
throw;
}
}
I would load the DataTable:
DataTable infoTbl = getInfo(lbldatabasesessionID.Text);
And I would use a foreach loop to loop through the DataTable:
foreach (DataRow row in infoTbl.Rows)
{
    string x = row[0].ToString();
}
The issue I run into is that the database guy gave me a stored procedure that returns a different output than what I'm used to. It's row-based.
The only way I can access, for example, the first name is if I hardcode the position:
string firstName = infoTbl.Rows[16][2].ToString();
I don't feel comfortable doing this since the position could potentially change. How would I access ElementValue knowing only the ElementType and ElementName?
Any suggestions?
Using DataTable.Select:
string firstName = string.Empty;
DataRow row = table.Select("ElementType = 'Demographics' AND ElementName = 'FirstName'").FirstOrDefault();
if (row != null)
{
firstName = (string)row["ElementValue"];
}
Using LINQ:
string firstName = table.AsEnumerable()
.Where(f => f.Field<string>("ElementType") == "Demographics" &&
f.Field<string>("ElementName") == "FirstName")
.Select(f => f.Field<string>("ElementValue")).FirstOrDefault();
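If you need to read several values from the same row-based result, it may be cleaner to pivot it into a dictionary once and then look everything up by name. A sketch, assuming (ElementType, ElementName) pairs are unique in the table:
// Build a (ElementType, ElementName) -> ElementValue lookup in one pass
var values = table.AsEnumerable()
    .ToDictionary(
        r => (r.Field<string>("ElementType"), r.Field<string>("ElementName")),
        r => r.Field<string>("ElementValue"));
// Subsequent lookups are then one-liners
string firstName = values[("Demographics", "FirstName")];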

Bulk Update in C#

For inserting a huge amount of data into a database, I used to collect all the information to insert into a list and convert that list into a DataTable. I then insert that table into the database via SqlBulkCopy.
I send my generated list LiMyList, which contains all the bulk data I want to insert into the database, to my bulk insertion operation:
InsertData(LiMyList, "MyTable");
Where InsertData is
public static void InsertData<T>(List<T> list,string TableName)
{
DataTable dt = new DataTable("MyTable");
clsBulkOperation blk = new clsBulkOperation();
dt = ConvertToDataTable(list);
ConfigurationManager.OpenExeConfiguration(ConfigurationUserLevel.PerUserRoamingAndLocal);
using (SqlBulkCopy bulkcopy = new SqlBulkCopy(ConfigurationManager.ConnectionStrings["SchoolSoulDataEntitiesForReport"].ConnectionString))
{
bulkcopy.BulkCopyTimeout = 660;
bulkcopy.DestinationTableName = TableName;
bulkcopy.WriteToServer(dt);
}
}
public static DataTable ConvertToDataTable<T>(IList<T> data)
{
PropertyDescriptorCollection properties = TypeDescriptor.GetProperties(typeof(T));
DataTable table = new DataTable();
foreach (PropertyDescriptor prop in properties)
table.Columns.Add(prop.Name, Nullable.GetUnderlyingType(prop.PropertyType) ?? prop.PropertyType);
foreach (T item in data)
{
DataRow row = table.NewRow();
foreach (PropertyDescriptor prop in properties)
row[prop.Name] = prop.GetValue(item) ?? DBNull.Value;
table.Rows.Add(row);
}
return table;
}
Now I want to do an update operation. Is there any way, just as inserting is done with SqlBulkCopy, to bulk update data in the database from C#/.NET?
What I've done before is perform a bulk insert from the data into a temp table, and then use a command or stored procedure to update the data relating the temp table to the destination table. The temp table is an extra step, but you can get a performance gain with the bulk insert and massive update when the number of rows is large, compared to updating the data row by row.
Example:
public static void UpdateData<T>(List<T> list,string TableName)
{
DataTable dt = new DataTable("MyTable");
dt = ConvertToDataTable(list);
using (SqlConnection conn = new SqlConnection(ConfigurationManager.ConnectionStrings["SchoolSoulDataEntitiesForReport"].ConnectionString))
{
using (SqlCommand command = new SqlCommand("", conn))
{
try
{
conn.Open();
//Creating temp table on database
command.CommandText = "CREATE TABLE #TmpTable(...)";
command.ExecuteNonQuery();
//Bulk insert into temp table
using (SqlBulkCopy bulkcopy = new SqlBulkCopy(conn))
{
bulkcopy.BulkCopyTimeout = 660;
bulkcopy.DestinationTableName = "#TmpTable";
bulkcopy.WriteToServer(dt);
bulkcopy.Close();
}
// Updating destination table, and dropping temp table
command.CommandTimeout = 300;
command.CommandText = "UPDATE T SET ... FROM " + TableName + " T INNER JOIN #TmpTable Temp ON ...; DROP TABLE #TmpTable;";
command.ExecuteNonQuery();
}
catch (Exception ex)
{
// Handle exception properly
}
finally
{
conn.Close();
}
}
}
}
Notice that a single connection is used to perform the whole operation, in order to be able to use the temp table in each step, because the scope of the temp table is per connection.
In my personal experience, the best way to handle this situation is to use a stored procedure with a Table-Valued Parameter and a User-Defined Table Type. Just set up the type with the columns of the data table, and pass in said data table as a parameter in the SQL command.
Within the stored procedure, you can either join directly on some unique key (if all rows you are updating exist), or, if you might run into a situation where you have to do both updates and inserts, use the SQL MERGE command within the stored procedure to handle both as applicable.
Microsoft has both a syntax reference and an article with examples for MERGE.
For the .NET piece, it's a simple matter of setting the parameter type as SqlDbType.Structured and setting the value of said parameter to the DataTable that contains the records you want to update.
This method provides the benefit of both clarity and ease of maintenance. While there may be ways that offer performance improvements (such as dropping the data into a temporary table and then iterating over that table), I think they're outweighed by the simplicity of letting .NET and SQL handle transferring the table and updating the records itself. K.I.S.S.
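A minimal sketch of the .NET side, assuming a hypothetical table type dbo.MyTableType and a stored procedure dbo.usp_UpdateMyTable that MERGEs the TVP rows into the destination table:
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.usp_UpdateMyTable", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    // dt is the DataTable whose columns match dbo.MyTableType
    var tvp = cmd.Parameters.AddWithValue("@Rows", dt);
    tvp.SqlDbType = SqlDbType.Structured;
    tvp.TypeName = "dbo.MyTableType";
    conn.Open();
    cmd.ExecuteNonQuery();
}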
Bulk Update:
Step 1: put the data you want to update, together with the primary key, in a list.
Step 2: pass this list and the connection string to the BulkUpdate method as shown below.
Example:
//Method for Bulk Update the Data
public static void BulkUpdateData<T>(List<T> list, string connetionString)
{
DataTable dt = new DataTable("MyTable");
dt = ConvertToDataTable(list);
using (SqlConnection conn = new SqlConnection(connetionString))
{
using (SqlCommand command = new SqlCommand("CREATE TABLE #TmpTable([PrimaryKey], [ColumnToUpdate])", conn))
{
try
{
conn.Open();
command.ExecuteNonQuery();
using (SqlBulkCopy bulkcopy = new SqlBulkCopy(conn))
{
bulkcopy.BulkCopyTimeout = 6600;
bulkcopy.DestinationTableName = "#TmpTable";
bulkcopy.WriteToServer(dt);
bulkcopy.Close();
}
command.CommandTimeout = 3000;
command.CommandText = "UPDATE P SET P.[ColumnToUpdate]= T.[ColumnToUpdate] FROM [TableName Where you want to update ] AS P INNER JOIN #TmpTable AS T ON P.[PrimaryKey] = T.[PrimaryKey] ;DROP TABLE #TmpTable;";
command.ExecuteNonQuery();
}
catch (Exception ex)
{
// Handle exception properly
}
finally
{
conn.Close();
}
}
}
}
Step 3: add the ConvertToDataTable method as shown below.
Example:
public static DataTable ConvertToDataTable<T>(IList<T> data)
{
PropertyDescriptorCollection properties = TypeDescriptor.GetProperties(typeof(T));
DataTable table = new DataTable();
foreach (PropertyDescriptor prop in properties)
table.Columns.Add(prop.Name, Nullable.GetUnderlyingType(prop.PropertyType) ?? prop.PropertyType);
foreach (T item in data)
{
DataRow row = table.NewRow();
foreach (PropertyDescriptor prop in properties)
row[prop.Name] = prop.GetValue(item) ?? DBNull.Value;
table.Rows.Add(row);
}
return table;
}
Note: wherever square brackets [] appear, put your own values.
Try out SqlBulkTools, available on NuGet.
Disclaimer: I'm the author of this library.
var bulk = new BulkOperations();
var records = GetRecordsToUpdate();
using (TransactionScope trans = new TransactionScope())
{
using (SqlConnection conn = new SqlConnection(ConfigurationManager
.ConnectionStrings["SqlBulkToolsTest"].ConnectionString))
{
bulk.Setup<MyTable>()
.ForCollection(records)
.WithTable("MyTable")
.AddColumn(x => x.SomeColumn1)
.AddColumn(x => x.SomeColumn2)
.BulkUpdate()
.MatchTargetOn(x => x.Identifier)
.Commit(conn);
}
trans.Complete();
}
Only 'SomeColumn1' and 'SomeColumn2' will be updated. More examples can be found here
I would insert new values in a temporary table and then do a merge against the destination table, something like this:
MERGE [DestTable] AS D
USING #SourceTable S
ON D.ID = S.ID
WHEN MATCHED THEN
UPDATE SET ...
WHEN NOT MATCHED
THEN INSERT (...)
VALUES (...);
You could try to build a query that contains all the data, using a CASE expression. It could look like this:
update your_table
set some_column = case when id = 1 then 'value of 1'
when id = 5 then 'value of 5'
when id = 7 then 'value of 7'
when id = 9 then 'value of 9'
end
where id in (1,5,7,9)
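A sketch of building that statement from C# with parameters instead of concatenated values (updates, conn, and the table/column names are assumptions for illustration):
// Assumes conn is an open SqlConnection and updates maps id -> new value
var updates = new Dictionary<int, string>
{
    { 1, "value of 1" }, { 5, "value of 5" }, { 7, "value of 7" }, { 9, "value of 9" }
};
var sql = new StringBuilder("UPDATE your_table SET some_column = CASE");
using (var cmd = new SqlCommand { Connection = conn })
{
    int i = 0;
    foreach (var pair in updates)
    {
        sql.Append($" WHEN id = @id{i} THEN @val{i}");
        cmd.Parameters.AddWithValue($"@id{i}", pair.Key);
        cmd.Parameters.AddWithValue($"@val{i}", pair.Value);
        i++;
    }
    // The IN list is built from the integer keys, so no injection risk here
    sql.Append($" END WHERE id IN ({string.Join(",", updates.Keys)})");
    cmd.CommandText = sql.ToString();
    cmd.ExecuteNonQuery();
}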
I'd go for a temp table approach, because that way you aren't locking anything. But if your logic needs to stay in the front end and you need to use bulk copy, I'd try a delete/insert approach, wrapped in the same SqlTransaction to ensure integrity. It would be something like this:
// ...
dt = ConvertToDataTable(list);
using (SqlConnection cnx = new SqlConnection(myConnectionString))
{
using (SqlTransaction tran = cnx.BeginTransaction())
{
DeleteData(cnx, tran, list);
using (SqlBulkCopy bulkcopy = new SqlBulkCopy(cnx, SqlBulkCopyOptions.Default, tran))
{
bulkcopy.BulkCopyTimeout = 660;
bulkcopy.DestinationTableName = TableName;
bulkcopy.WriteToServer(dt);
}
tran.Commit();
}
}
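DeleteData is not shown above; a minimal sketch of what it might look like, assuming the items expose a numeric Id key (the names here are hypothetical):
// Deletes the rows that are about to be re-inserted, inside the same transaction
private static void DeleteData(SqlConnection cnx, SqlTransaction tran, List<MyItem> list)
{
    // Joining ids directly is safe only because they are numeric; otherwise use parameters
    string ids = string.Join(",", list.Select(x => x.Id));
    using (var cmd = new SqlCommand($"DELETE FROM MyTable WHERE Id IN ({ids})", cnx, tran))
    {
        cmd.ExecuteNonQuery();
    }
}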
Complete answer. Disclaimer: this is arrow code, built from my own research, and published in SqlRapper. It uses custom attributes over properties to determine whether a key is primary. Yes, super complicated. Yes, super reusable. Yes, it needs to be refactored. Yes, it is a NuGet package. No, the documentation on GitHub isn't great, but it exists. Will it work for everything? Probably not. Will it work for simple stuff? Oh yeah.
How easy is it to use after setup?
public class Log
{
[PrimaryKey]
public int? LogId { get; set; }
public int ApplicationId { get; set; }
[DefaultKey]
public DateTime? Date { get; set; }
public string Message { get; set; }
}
var logs = new List<Log>() { log1, log2 };
success = db.BulkUpdateData(logs);
Here's how it works:
public class PrimaryKeyAttribute : Attribute
{
}
private static bool IsPrimaryKey(object[] attributes)
{
bool skip = false;
foreach (var attr in attributes)
{
if (attr.GetType() == typeof(PrimaryKeyAttribute))
{
skip = true;
}
}
return skip;
}
private string GetSqlDataType(Type type, bool isPrimary = false)
{
var sqlType = new StringBuilder();
var isNullable = false;
if (Nullable.GetUnderlyingType(type) != null)
{
isNullable = true;
type = Nullable.GetUnderlyingType(type);
}
switch (Type.GetTypeCode(type))
{
case TypeCode.String:
isNullable = true;
sqlType.Append("nvarchar(MAX)");
break;
case TypeCode.Int32:
case TypeCode.Int64:
case TypeCode.Int16:
sqlType.Append("int");
break;
case TypeCode.Boolean:
sqlType.Append("bit");
break;
case TypeCode.DateTime:
sqlType.Append("datetime");
break;
case TypeCode.Decimal:
case TypeCode.Double:
sqlType.Append("decimal");
break;
}
if (!isNullable || isPrimary)
{
sqlType.Append(" NOT NULL");
}
return sqlType.ToString();
}
/// <summary>
/// SqlBulkCopy is allegedly protected from Sql Injection.
/// Updates a list of simple sql objects that mock tables.
/// </summary>
/// <typeparam name="T"></typeparam>
/// <param name="rows">A list of rows to insert</param>
/// <param name="tableName">A table name, if your class name isn't your table name minus the trailing 's'.</param>
/// <returns>bool success</returns>
public bool BulkUpdateData<T>(List<T> rows, string tableName = null)
{
var template = rows.FirstOrDefault();
string tn = tableName ?? template.GetType().Name + "s";
int updated = 0;
using (SqlConnection con = new SqlConnection(ConnectionString))
{
using (SqlCommand command = new SqlCommand("", con))
{
using (SqlBulkCopy sbc = new SqlBulkCopy(con))
{
var dt = new DataTable();
var columns = template.GetType().GetProperties();
var colNames = new List<string>();
string keyName = "";
var setStatement = new StringBuilder();
int rowNum = 0;
foreach (var row in rows)
{
dt.Rows.Add();
int colNum = 0;
foreach (var col in columns)
{
var attributes = row.GetType().GetProperty(col.Name).GetCustomAttributes(false);
bool isPrimary = IsPrimaryKey(attributes);
var value = row.GetType().GetProperty(col.Name).GetValue(row);
if (rowNum == 0)
{
colNames.Add($"{col.Name} {GetSqlDataType(col.PropertyType, isPrimary)}");
dt.Columns.Add(new DataColumn(col.Name, Nullable.GetUnderlyingType(col.PropertyType) ?? col.PropertyType));
if (!isPrimary)
{
setStatement.Append($" ME.{col.Name} = T.{col.Name},");
}
}
if (isPrimary)
{
keyName = col.Name;
if (value == null)
{
throw new Exception("Trying to update a row whose primary key is null; use insert instead.");
}
}
dt.Rows[rowNum][colNum] = value ?? DBNull.Value;
colNum++;
}
rowNum++;
}
setStatement.Length--;
try
{
con.Open();
command.CommandText = $"CREATE TABLE [dbo].[#TmpTable]({String.Join(",", colNames)})";
//command.CommandTimeout = CmdTimeOut;
command.ExecuteNonQuery();
sbc.DestinationTableName = "[dbo].[#TmpTable]";
sbc.BulkCopyTimeout = CmdTimeOut * 3;
sbc.WriteToServer(dt);
sbc.Close();
command.CommandTimeout = CmdTimeOut * 3;
command.CommandText = $"UPDATE ME SET {setStatement} FROM {tn} as ME INNER JOIN #TmpTable AS T on ME.{keyName} = T.{keyName}; DROP TABLE #TmpTable;";
updated = command.ExecuteNonQuery();
}
catch (Exception ex)
{
if (con.State != ConnectionState.Closed)
{
sbc.Close();
con.Close();
}
//well logging to sql might not work... we could try... but no.
//So Lets write to a local file.
_logger.Log($"Failed to Bulk Update to Sql: {rows.ToCSV()}", ex);
throw; // rethrow without resetting the stack trace
}
}
}
}
return updated > 0;
}

Is there easy method to read all tables from SQLite database to DataSet object?

Now I use a method in C# to read a table from a SQLite database into a DataTable, but I want to send all the tables into another object.
So I think I have to use a DataSet to combine all the DataTables and send it to the object as a parameter.
Is there a method to easily read all tables from a SQLite database into a DataSet?
Or do I have to read each table into its own DataTable and combine them into a DataSet by hand?
The sql for listing all the tables is:
SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY 1
you could then get all the tables as DataTables separately and then add them into a DataSet - an example here: http://www.dotnetperls.com/dataset
so I guess the code would be something like:
DataSet ds = new DataSet();
foreach (string tableName in GetTables())
{
    DataTable dt = GetDataTable("SELECT * FROM [" + tableName + "]");
    dt.TableName = tableName;
    ds.Tables.Add(dt);
}
code for GetTables and GetDataTable (I'll leave piecing them together to you):
public ArrayList GetTables()
{
ArrayList list = new ArrayList();
// executes query that select names of all tables in master table of the database
String query = "SELECT name FROM sqlite_master " +
    "WHERE type = 'table' " +
    "ORDER BY 1";
try
{
DataTable table = GetDataTable(query);
// Return all table names in the ArrayList
foreach (DataRow row in table.Rows)
{
list.Add(row.ItemArray[0].ToString());
}
}
catch (Exception e)
{
Console.WriteLine(e.Message);
}
return list;
}
public DataTable GetDataTable(string sql)
{
try
{
DataTable dt = new DataTable();
using (var c = new SQLiteConnection(dbConnection))
{
c.Open();
using (SQLiteCommand cmd = new SQLiteCommand(sql, c))
{
using (SQLiteDataReader rdr = cmd.ExecuteReader())
{
dt.Load(rdr);
return dt;
}
}
}
}
catch (Exception e)
{
Console.WriteLine(e.Message);
return null;
}
}
