I am aware this was asked here, but it doesn't answer my question. I have 10 tables in a database called "merged". I am taking a bunch of other databases with a structure identical to "merged", except that each table in "merged" has an extra column that is a combination of two columns in that table. I am trying to transfer all of this data into "merged", but I believe the extra column is preventing the transfer.
SqlCommand MergeDB = connDestination.CreateCommand();
string sqlMergeDB = "";
for (int a = 0; a < tablenames.Length; a++){
// Copy every row of the current table from the source database into the merged database
sqlMergeDB = "INSERT INTO sql_merged.dbo." + tablenames[a] + " SELECT * FROM sourceForMerged.dbo." + tablenames[a];
MergeDB.CommandText = sqlMergeDB;
using (SqlDataReader reader = MergeDB.ExecuteReader()) {
while(reader.Read())
{
MessageBox.Show("Transferred a table");
}
}
}
The error occurs at the SqlDataReader line of the code above, which I believe means there is something wrong with the SQL command. Please help. Thanks.
If you name all the columns in both parts of the INSERT . . . SELECT statement you can map which source column gets inserted into which destination column.
If you imagine TargetTable (Name, ProductType, Date) and SourceTable (Date, Type, Name) then using:
INSERT INTO TargetTable (Name, ProductType, Date)
SELECT Name, Type, Date FROM SourceTable
would move the three columns into the appropriate columns even though the order doesn't match.
If you have "extra" columns in one table or the other you can either leave them out or provide expressions to fill them in:
INSERT INTO TargetTable (Name, ProductType, Date, CombinedValues)
SELECT Name, Type, Date, (ValCol1 + ' ' + ValCol2) FROM SourceTable
has four columns receiving data from four expressions, one of which concatenates two columns worth of data. (In real life, you may find that the concatenation expression is more complicated, but this is the basic idea).
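Applied back to the question's loop, a minimal sketch might look like the following. The column names (Col1, Col2, ValCol1, ValCol2, CombinedValues) are placeholders I made up; you would substitute the real columns of each table, and ExecuteNonQuery is used because an INSERT returns no result set.
// Sketch only: Col1, Col2, ValCol1, ValCol2 and CombinedValues are hypothetical
// placeholders for whatever columns each real table contains.
for (int a = 0; a < tablenames.Length; a++)
{
    string sql =
        "INSERT INTO sql_merged.dbo." + tablenames[a] + " (Col1, Col2, CombinedValues) " +
        "SELECT Col1, Col2, (ValCol1 + ' ' + ValCol2) FROM sourceForMerged.dbo." + tablenames[a];
    MergeDB.CommandText = sql;
    MergeDB.ExecuteNonQuery();   // INSERT returns no rows, so there is nothing to read
}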
You cannot use a:
Insert Into [Table] Select * From [Table2]
unless the tables' schemas are identical. You would have to list out the columns for the INSERT statement.
If possible you could drop the column on the destination table and then add it back after the insert.
You could do something like this to build up your insert code if the table is very wide:
SELECT
'cmd.Parameters.Add("@' + column_name + '", SqlDbType.' + data_type + ');',
column_name 'Column Name',
data_type 'Data Type'
FROM information_schema.columns
WHERE table_name = 'TableName'
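In the same spirit, the sketch below generates the comma-separated column list itself, which you can paste into both halves of the INSERT ... SELECT. It assumes SQL Server 2017 or later for STRING_AGG and uses the same placeholder table name; on older versions the FOR XML PATH trick achieves the same thing.
-- Sketch: build "Col1, Col2, ..." for a given table (STRING_AGG needs SQL Server 2017+)
SELECT STRING_AGG(QUOTENAME(column_name), ', ')
       WITHIN GROUP (ORDER BY ordinal_position) AS column_list
FROM information_schema.columns
WHERE table_name = 'TableName';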
I have column definitions something like this in a view.
SELECT
CAST(CONCAT('Trp', f.Production_disposal_Number, p.production_waste_number, p.production_shiftchange_number, t.production_unit_number, pr.process_unit_number, p.production_run_number) AS VARCHAR(50)) AS ProductionKey
,(SELECT MAX(vtable.LastModifiedDate) FROM (VALUES (p.LastModifiedDate), (f.LastModifiedDate), (p.LastModifiedDate),
(d.LastModifiedDate), (pl.LastModifiedDate), (pr.LastModifiedDate), (t.LastModifiedDate)) AS vtable(LastModifiedDate))
AS LastModifiedDate
,CAST(p.Start_Date AS DATETIME2(0)) AS ProductionStartDate, CAST(p.End_Date AS DATETIME2(0)) AS ProductionEndDate
I want to retrieve these column names by passing the SQL query below from my C# program:
queryString =
@"DECLARE
@olddelim nvarchar(32) = CHAR(13) + CHAR(10),
@newdelim nchar(1) = NCHAR(9999);
SELECT * FROM STRING_SPLIT(REPLACE(OBJECT_DEFINITION(OBJECT_ID('sge.vwProduct')), @olddelim, @newdelim), @newdelim);";
using (SqlConnection connection = new SqlConnection(connectionString))
{
SqlCommand command = new SqlCommand(queryString, connection);
connection.Open();
SqlDataReader reader = command.ExecuteReader();
try
{
while (reader.Read())
list.Add(reader.GetString(0));
}
finally
{
reader.Close();
connection.Close();
}
}
This code runs fine only if each column definition is on a single line. However, if a single column definition is spread across multiple lines, it fails to read that column name. Likewise, if several column definitions are written on one line, it doesn't read those either.
For example, the column name LastModifiedDate is neither identified nor read by the program, because it cannot find the start of that column definition. In the same way, ProductionStartDate is not read.
I want all column names to be identified correctly and read by the program so that none are missed.
Change your query. You can easily query the column names of the view from the data dictionary (the catalog views). There's no need to "parse" the CREATE statement for that purpose.
SELECT c.name
FROM sys.columns c
INNER JOIN sys.views v
ON c.object_id = v.object_id
INNER JOIN sys.schemas s
ON s.schema_id = v.schema_id
WHERE s.name = 'sge'
AND v.name = 'vwProduct'
ORDER BY c.column_id;
db<>fiddle (with different object names, but it should suffice as a demonstration)
But be aware that SqlDataReader is disposable, so you should call Dispose() on it or wrap it in a using block, like you did with the connection.
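Putting both points together, a minimal sketch (assuming System.Data.SqlClient, the same connectionString and the List<string> named list from the question) could look like this:
// Sketch: read the view's column names from sys.columns,
// disposing the connection, the command and the reader.
string columnQuery = @"
SELECT c.name
FROM sys.columns c
INNER JOIN sys.views v ON c.object_id = v.object_id
INNER JOIN sys.schemas s ON s.schema_id = v.schema_id
WHERE s.name = 'sge' AND v.name = 'vwProduct'
ORDER BY c.column_id;";

using (SqlConnection connection = new SqlConnection(connectionString))
using (SqlCommand command = new SqlCommand(columnQuery, connection))
{
    connection.Open();
    using (SqlDataReader reader = command.ExecuteReader())
    {
        while (reader.Read())
            list.Add(reader.GetString(0));
    }
}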
We have a system that uses Electronic Forms but the standard searching functionality of the forms is very subpar, so I have been tasked with trying to create a better approach for the users.
Ok so a little background of the current database format :
Table1
------
FormID
FormName
Creator
CreateDate
Table2
------
FormID
FormControlName
ControlData
The relationship is a 1 to Many with there being many controls on one form.
My task is to create a search that will find the relevant forms by searching the form name (changes based on data) and each of the form Controls that belong to that form.
I have managed to do this in C#, but because there is a high number of records in the database, and because my current solution retrieves everything and iterates over it to find the relevant items, it is quite slow.
My Code :
private DataTable GetForms() {
string FormName = ddForm.SelectedValue;
string SearchText = tbSearch.Text;
List<string> FormIDs = GetMatchingForms(FormName, SearchText);
DataTable dtForms = new DataTable("Forms Table");
dtForms.Columns.Add("Form Name");
dtForms.Columns.Add("Initiator");
dtForms.Columns.Add("Start Date").DataType = typeof(DateTime);
dtForms.Columns.Add("FormLink");
foreach (string FormID in FormIDs) {
DataRow nRow = dtForms.NewRow();
nRow[0] = GetData.GetString("SELECT [FormName] FROM [Table1] Where [FormID] = '" + FormID + "'", conString);
string UserID = GetData.GetString("SELECT [Creator] FROM [Table1] Where [FormID] = '" + FormID + "'", conString);
string UserName = GetData.GetString("Select [UserName] From [User] Where [UserID] = '" + UserID + "'", conString);
nRow[1] = UserName;
nRow[2] = GetData.GetString("SELECT [CreateDate] FROM [Table1] Where [FormID] = '" + FormID + "'", conString);
nRow[3] = "~/Form.aspx?formid=" + FormID;
dtForms.Rows.Add(nRow);
}
return dtForms;
}
private List<string> GetMatchingForms(string FormName, string SearchText) {
//FormName can be = % to search all forms
DataTable dtForms = GetData.GetDT("SELECT * FROM [Table1] Where [FormName] LIKE '%" + FormName + "%'", conString);
List<string> FormList = new List<string>();
foreach (DataRow row in dtForms.Rows) {
string rowFormName = row["FormName"].ToString(); // local renamed so it does not hide the FormName parameter
string FormID = row["FormID"].ToString();
bool Relevant = false;
if (rowFormName.Contains(SearchText)) {
Relevant = true;
} else {
DataTable dtFormControls = GetData.GetDT("SELECT * FROM [Table2] Where [FormID] = '" + FormID + "'", conString);
foreach (DataRow cRow in dtFormControls.Rows) {
string ControlData = cRow["ControlData"].ToString();
if (ControlData.Contains(SearchText)) {
Relevant = true;
break;
}
}
}
if (Relevant) {
FormList.Add(FormID);
}
}
return FormList;
}
I was wondering if it would be possible to replicate the above code's functionality in a SQL query (or a small number of queries) to hopefully speed up the current solution. My knowledge of SQL queries is not the best and I can't even begin to think of where to start with this issue.
To clarify, we currently have 300,000 forms and a total of 10.4 million data records. The database has been re-indexed recently and that does seem to affect performance positively. We are planning on doing maintenance on this relatively soon, but we will still be keeping the bulk of the data that is currently stored.
Edit: The database in question is part of 3rd-party software and we are limited to read-only access, so database modifications are not possible for us.
As you can see from the number of records, the timing issue is quite a big one; it literally takes minutes to execute the current code.
Any help will be greatly appreciated thanks.
The performance problems are because you are running four queries (3 against Table1 and 1 against User) for each string in your list, which is a big overhead. I'd recommend one of the following (please bear in mind I don't have access to your database, so sorry for any coding errors).
1) Use LinqToSql
If you use LinqToSql then you can extract the data from the first part of your query with something similar to the following:
var myResults = (from t in context.Table1
join u in context.User on t.Creator equals u.UserId
where formIds.Contains (t.FormId)
select new { t.FormName, u.UserName, t.CreateDate }).ToList();
The Contains method allows you to effectively join your in memory data with data from the database, removing the need to loop for each item.
2) Use Database query
The equivalent SQL statement would be:
select t.FormName, u.UserName, t.CreateDate
from Table1 t
inner join [User] u on t.Creator = u.UserId
where t.FormId in (<list of formIDs here>)
You could build that string into a SQL command yourself, which is not recommended because of SQL injection concerns, or you could create a parameterised query or a stored procedure, which is far better from a security perspective.
All of the above only applies to the first part of your code, but can easily be replicated for the 2nd part.
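For the second part (matching the search text against the form name or any of the form's controls), one way to push the whole search into a single parameterised query is sketched below. The table and column names come from the question; the EXISTS shape and the parameter handling are my assumptions, not something tested against the real schema.
// Sketch: one round trip that returns every matching form, instead of a query per form.
// @FormName and @SearchText are parameters, which avoids the SQL injection risk.
string searchSql = @"
SELECT t.FormID, t.FormName, u.UserName, t.CreateDate
FROM [Table1] t
INNER JOIN [User] u ON u.UserID = t.Creator
WHERE t.FormName LIKE '%' + @FormName + '%'
  AND (t.FormName LIKE '%' + @SearchText + '%'
       OR EXISTS (SELECT 1 FROM [Table2] c
                  WHERE c.FormID = t.FormID
                    AND c.ControlData LIKE '%' + @SearchText + '%'));";

using (SqlConnection conn = new SqlConnection(conString))
using (SqlCommand cmd = new SqlCommand(searchSql, conn))
{
    cmd.Parameters.AddWithValue("@FormName", FormName);
    cmd.Parameters.AddWithValue("@SearchText", SearchText);
    conn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // Build the DataTable row here from reader["FormName"], reader["UserName"], etc.
        }
    }
}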
A 1 to N relationship can be easily created in TSQL. The sample program below adds primary keys for both the form id and control id with a foreign key relationship.
This will return the data you want in one result set instead of the multiple calls you are making now.
The next question is: what data type is the control data? That is where you might be having issues.
If it is defined as varchar(max) or text, then SQL Server has to perform a full table scan; you cannot use a regular index, since index keys are limited to 900 bytes.
An index (balanced-tree) seek is a LOG(N) operation at worst, versus a table scan, which is an N operation. This is analysis of algorithms, order of magnitude (http://en.wikipedia.org/wiki/Big_O_notation): O(LOG(N)) versus O(N).
With N = 11 M, a seek only has to examine about 24 rows at most, assuming a binary tree; SQL Server's B+tree indexes are even shallower. http://sqlity.net/en/563/index-misconceptions-tsql-tuesday-026-second-chances/
If the control data field is text, you want to apply full-text indexing. Look at my blog articles http://craftydba.com/?p=1421 and/or presentation on how to set one up. You will have to use the CONTAINS() or FREETEXT() functions for searching.
This index can be built right after a data load and gives you far better speed than a traditional LIKE clause. It pushes the search load (computation) onto the SQL Server instead of the client (web server).
I hope this helps you out,
If you have any more questions, just ask.
Sincerely
John
--
-- Sample table 1
--
create table tempdb.dbo.forms
(
FormID int identity(1,1) primary key clustered,
FormName varchar(32),
Creator varchar(32) DEFAULT (coalesce(suser_sname(),'?')),
CreateDate smalldatetime DEFAULT (getdate())
);
go
-- Add data
insert into tempdb.dbo.forms (FormName) values ('Main');
go
-- Show the data
select * from tempdb.dbo.forms;
go
--
-- Sample table 2
--
create table tempdb.dbo.controls
(
ControlId int identity(1,1) primary key clustered,
FormID int,
FormControlName varchar(32),
ControlData varchar(32)
);
go
-- Add foreign key to forms table
ALTER TABLE tempdb.dbo.controls WITH CHECK
ADD CONSTRAINT fk_tbl_forms FOREIGN KEY(FormId)
REFERENCES tempdb.dbo.forms (FormID)
go
-- Add data
insert into tempdb.dbo.controls (FormId, FormControlName, ControlData)
values
(1, 'Drop Down', 'My drop down data'),
(1, 'Text Box', 'My text box');
go
-- Show the data
select * from tempdb.dbo.controls;
go
--
-- Use a join (1 to N relationship) with a where clause
--
-- Show data from both
select
f.FormID,
f.FormName,
f.Creator,
f.CreateDate,
c.ControlId,
c.FormControlName,
c.ControlData
from
tempdb.dbo.forms as f inner join
tempdb.dbo.controls as c
on
f.FormID = c.FormID
where
f.FormName like '%Main%' and
c.ControlData like '%box%'
go
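To make the full-text suggestion concrete, here is a rough sketch in the same style as the sample above. It assumes the controls table lives in a regular user database (full-text catalogs cannot be created in tempdb) and that the Full-Text Search feature is installed; treat it as an illustration rather than something that can be run against the read-only third-party database in the question.
-- Sketch: full-text index on the control data, then search with CONTAINS()
CREATE FULLTEXT CATALOG ftControls AS DEFAULT;
go
-- The full-text index needs a named unique key index on the table
CREATE UNIQUE INDEX ux_controls_ControlId ON dbo.controls (ControlId);
go
CREATE FULLTEXT INDEX ON dbo.controls (ControlData)
KEY INDEX ux_controls_ControlId
ON ftControls;
go
-- Word-based search instead of LIKE '%box%'
-- (population is asynchronous, so results appear once the index has been built)
SELECT FormID, FormControlName, ControlData
FROM dbo.controls
WHERE CONTAINS(ControlData, 'box');
go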
When a user deletes a row in the database, I want to archive it into a separate table in the database rather than delete it or flag it in the current table. I figure I would need to do something like in this link:
How to copy a row from one SQL Server table to another
The thing is, the archive table has 1 extra column in it that does not match the original table (ArchiveTimeStamp). This ArchiveTimeStamp does not exist in the original table; instead I would populate it with something like
archiveComm.Parameters.AddWithValue("@ArchiveTimeStamp", DateTime.Now);
This is what I have so far:
SqlCommand archiveComm = new SqlCommand("INSERT INTO Archive_Table SELECT * FROM Table WHERE RowID = @rowID", conn);
Is there a way for me to modify the SqlCommand to add another param that doesn't exist in the original Table?
Why not just handle this on the back end? You can create a trigger on the original table to insert into another table after every delete.
Your trigger will look like this:
CREATE TRIGGER onOriginalTableDelete
ON originalTable
FOR DELETE
AS
INSERT INTO anotherTable
SELECT * FROM deleted;
When a record is deleted from the original table, it will insert the deleted record into the other table. You might want to read up on using the deleted table here.
Check this SQL Fiddle. Since you're inserting the timestamp into another column, you can just add it to the INSERT INTO ... SELECT statement:
INSERT INTO OtherTable
SELECT *, CURRENT_TIMESTAMP FROM MainTable;
This could be the query for your trigger:
CREATE TRIGGER onOriginalTableDelete
ON originalTable
FOR DELETE
AS
INSERT INTO anotherTable
SELECT *, CURRENT_TIMESTAMP FROM deleted;
Good question. I'd suggest (as Gian has also suggested) moving the logic you require to back up the deleted row into a trigger that fires on delete.
Triggers are events in a database associated with a table that fire when an action occurs, i.e. insert / update / delete.
So in your scenario, if you create an ON DELETE trigger on the source table, it will fire when a delete occurs. The SQL contained within the trigger can specify what to do with the deleted data, which in your scenario is: insert the deleted info into the archive table with a timestamp.
So if you have:
Source_Table:
Col_1
Col_2
Col_3
Archive_Table:
Col_1
Col_2
Col_3
Time_Stamp
You'll need to create a FOR DELETE trigger against Source_Table (something like this):
CREATE TRIGGER SourceDeletedTrigger
ON database.dbo.Source_Table
FOR DELETE
AS
INSERT INTO Archive_Table(Col_1, Col_2, Col_3, Time_Stamp)
SELECT
DELETED.Col_1,
DELETED.Col_2,
DELETED.Col_3,
GETDATE()
FROM DELETED
GO
The above is some rough SQL which may contain a couple of syntax errors but the guts of the idea is conveyed.
You will have to use the explicit column list and values form of the INSERT statement:
INSERT INTO Archive_Table (
Col1
,Col2
,Col3 )
SELECT
Col1
,Col2
,Col3
FROM
Table
WHERE
Row_ID = @Row_ID
See Insert into ... values ( SELECT ... FROM ... )
I think you have to specify the columns with something like this
INSERT INTO tab1
(col1, col2)
SELECT col1, col2
FROM tab2
WHERE RowID = @rowID
You need to specify the column names in that case:
string SQL = "INSERT INTO Archive_Table (Col1, Col2, ArchiveTimeStamp) " +
"SELECT Col1, Col2, @ArchiveTimeStamp FROM Table WHERE RowID = @rowID";
SqlCommand archiveComm = new SqlCommand(SQL, conn);
archiveComm.Parameters.AddWithValue("@ArchiveTimeStamp", DateTime.Now);
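To complete the sketch (assuming rowID is a variable holding the key of the row being archived):
// Supply the remaining parameter and run the archive insert
archiveComm.Parameters.AddWithValue("@rowID", rowID);
archiveComm.ExecuteNonQuery();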
Here is my suggestion. You are forced to supply the column names or it won't let you run the query; however, I understand you would prefer a generic solution that works for any table, so I suggest building the insert SQL dynamically, caching it in your application, and then executing it with your extra archive column. Here is a C# example:
public class ArchiveTableRow
{
private static readonly IDictionary<string, string> _cachedInsertStatements = new Dictionary<string, string>();
public void Archive(string tableName, string rowId)
{
if (_cachedInsertStatements.ContainsKey(tableName) == false)
{
BuildInsertStatement(tableName);
}
var insertQuery = _cachedInsertStatements[tableName];
// ...
SqlCommand archiveComm = new SqlCommand(insertQuery, conn);
// ...
archiveComm.Parameters.AddWithValue("@ArchiveTimeStamp", DateTime.Now);
// ...
}
private void BuildInsertStatement(string tableName)
{
// Get the columns names:
var getColumnNamesQuery = @"
SELECT Table_Schema, Column_Name
FROM Information_Schema.Columns
WHERE Table_Name = '" + tableName + @"'
ORDER BY Ordinal_Position";
// Execute the query
SqlCommand archiveComm = new SqlCommand(getColumnNamesQuery, conn);
// Loop and build query and add your archive in the end
// Add to dictionary
}
}
You would use it with something like:
var archiveRow = new ArchiveTableRow();
archiveRow.Archive("TableName", rowId);
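For completeness, one way the elided BuildInsertStatement body could be fleshed out is sketched below. The connectionString field, the <table>_Archive naming convention and the RowID key column are all my assumptions, purely for illustration.
// Sketch only: assumes the archive table is named <tableName>_Archive and has the
// same columns as the source table plus a trailing ArchiveTimeStamp column.
private void BuildInsertStatement(string tableName)
{
    var columns = new List<string>();
    var getColumnNamesQuery = @"
        SELECT Column_Name
        FROM Information_Schema.Columns
        WHERE Table_Name = @tableName
        ORDER BY Ordinal_Position";

    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(getColumnNamesQuery, conn))
    {
        cmd.Parameters.AddWithValue("@tableName", tableName);
        conn.Open();
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
                columns.Add(reader.GetString(0));
        }
    }

    string columnList = string.Join(", ", columns);
    string insertQuery =
        "INSERT INTO " + tableName + "_Archive (" + columnList + ", ArchiveTimeStamp) " +
        "SELECT " + columnList + ", @ArchiveTimeStamp FROM " + tableName +
        " WHERE RowID = @rowID";

    _cachedInsertStatements[tableName] = insertQuery;
}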
What would be the best technique to use to insert a row of data retrieved from multiple sources (including textbox, other SQL tables, etc) into another SQL table? Will I need to create an intermediate data structure or is there a way to do it using something like
INSERT INTO Proposal_listing
(company_name, project_name, status_proposal)
SELECT company_name, project_name, status_pipeline
FROM Project_Pipeline
WHERE status_pipeline = 'Proposal Phase';
EDIT: For example, if I wanted to take columns A1, B1, C1 from table 1 and values from textbox.text and textbox2.text, and insert them into columns A2, B2, C2, D2, E2 of table 2.
Thanks
How about integrating your textbox value into your SQL query? Something like this:
string commandText = @"INSERT INTO MyTable (Column1, Column2, Column3)
SELECT FirstName, LastName, @TBValue
FROM OtherTable WHERE id = @QSValue";
cmd.CommandText = commandText;
cmd.Parameters.AddWithValue("@TBValue", tb_MyBox.Text);
cmd.Parameters.AddWithValue("@QSValue", Request.QueryString["id"]);
This is all off the top of my head so you might need to play with the syntax, but that's the general idea.
I have inserted a row into my table, and I want to get its ID, add an int to it, and store the result back in that row.
But I don't know how to get its ID.
Here is the insert code:
objCommand.Connection = objConnection;
objCommand.CommandText = "INSERT INTO Moin " +
" (Title, TotalID, Code ) " +
"VALUES (#Title , #TotalID, #Code )";
objCommand.Connection = objConnection;
objCommand.CommandText = "INSERT INTO Moin " +
" (Title, TotalID, Code ) " +
"VALUES (#Title , #TotalID, #Code ) SELECT SCOPE_IDENTITY()";
object id = objCommand.ExecuteScalar();
Try using the OUTPUT clause of SQL Server in your query - it can return any of the just-inserted values (here I'm assuming your identity column is called ID - adapt as needed):
objCommand.Connection = objConnection;
objCommand.CommandText = "INSERT INTO Moin(Title, TotalID, Code ) " +
"OUTPUT Inserted.ID " +
"VALUES (#Title , #TotalID, #Code ); "
and then execute it like this:
int result = (int)objCommand.ExecuteScalar();
Since you're returning just one row and one column (just the INT), you can use .ExecuteScalar() to retrieve that value back from the INSERT statement.
With the OUTPUT clause, you can return any values just inserted - not just the identity column. So you could also return values that are filled by the database with default values, or whatever you need. If you return multiple values, you need to use a data reader to read them all - ExecuteScalar() only works for a single value.
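As a small illustration of that last point (my sketch, reusing the same hypothetical Moin table and assuming Code is a string column), returning more than one inserted value means switching from ExecuteScalar() to a reader:
// Sketch: OUTPUT can return several just-inserted columns at once
objCommand.CommandText = "INSERT INTO Moin (Title, TotalID, Code) " +
                         "OUTPUT Inserted.ID, Inserted.Code " +
                         "VALUES (@Title, @TotalID, @Code);";

using (SqlDataReader reader = objCommand.ExecuteReader())
{
    if (reader.Read())
    {
        int newId = reader.GetInt32(0);      // Inserted.ID
        string code = reader.GetString(1);   // Inserted.Code (assumed to be a string column)
    }
}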
But, as Anders correctly mentioned - using an ORM like Entity Framework would do all of this automatically for you and you wouldn't have to deal with those raw SQL commands anymore....
Building SQL commands in strings should be considered a legacy technique. If you use Entity Framework or LINQ to SQL, the retrieval of the ID is handled automatically for you.
With pure SQL, use the SCOPE_IDENTITY() function to retrieve the id of the inserted element.
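A minimal sketch of that approach, assuming the same Moin table from the question and an int identity column:
// Sketch: append SELECT SCOPE_IDENTITY() to the INSERT and read it back with ExecuteScalar()
objCommand.CommandText = "INSERT INTO Moin (Title, TotalID, Code) " +
                         "VALUES (@Title, @TotalID, @Code); " +
                         "SELECT CAST(SCOPE_IDENTITY() AS int);";

int newId = (int)objCommand.ExecuteScalar();

// The new ID can then be combined with the other value and written back, e.g.
// UPDATE Moin SET TotalID = @NewId + @SomeInt WHERE ID = @NewId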