Transaction question in SQL Server 2008 - C#

I am using SQL Server 2008 Enterprise, with ADO.NET + C# + .NET 3.5 + ASP.NET as the client to access the database. When I access SQL Server 2008 tables, I always invoke stored procedures from my C# + ADO.NET code.
My question is: if I do not have any transaction control (I mean explicit BEGIN/COMMIT TRANSACTION) in my client C# + ADO.NET code, and I also do not have any transaction control in the stored procedure code, will each single INSERT/DELETE/UPDATE/SELECT statement act as its own transaction? Is that correct? For example, in the following stored procedure, will the DELETE/INSERT/SELECT act as three separate transactions?
CREATE PROCEDURE [dbo].[FooProc]
(
    @Param1 int
    ,@Param2 int
    ,@Param3 int
)
AS
DELETE FooTable WHERE Param1 = @Param1

INSERT INTO FooTable
(
    Param1
    ,Param2
    ,Param3
)
VALUES
(
    @Param1
    ,@Param2
    ,@Param3
)

DECLARE @ID bigint
SET @ID = ISNULL(@@Identity, -1)

IF @ID > 0
BEGIN
    SELECT IdentityStr FROM FooTable WHERE ID = @ID
END

Then my question is, will each single Insert/Delete/Update/Select statement act as a single transaction?
Yes, without explicit transaction control each SQL statement is wrapped in its own autocommit transaction: a single statement is guaranteed to execute as a whole or fail as a whole.
Each statement runs under the current transaction isolation level, normally READ COMMITTED. So it won't read uncommitted changes from other sessions, but it may suffer from non-repeatable reads or phantom reads.

If you don't handle transactions then each statement will be independent and might interfere with other users running the stored procedure at the same time.
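If you do want the DELETE/INSERT/SELECT in FooProc to succeed or fail as one unit, you can open an explicit transaction on the ADO.NET side. A minimal sketch, assuming the FooProc above (the connection string, database name and parameter values are placeholders):

using System.Data;
using System.Data.SqlClient;

string connectionString = "Server=.;Database=FooDb;Integrated Security=true"; // placeholder
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (SqlTransaction tran = conn.BeginTransaction())
    using (var cmd = new SqlCommand("dbo.FooProc", conn, tran))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.AddWithValue("@Param1", 1);
        cmd.Parameters.AddWithValue("@Param2", 2);
        cmd.Parameters.AddWithValue("@Param3", 3);
        try
        {
            // All three statements inside FooProc now share this one transaction.
            object identityStr = cmd.ExecuteScalar();
            tran.Commit();
        }
        catch
        {
            tran.Rollback();
            throw;
        }
    }
}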

Related

CREATE DATABASE statement not allowed within multi-statement transaction. Dynamic database creation using stored procedure + EF

I am trying to create a database using a stored procedure as shown here:
CREATE PROCEDURE [dbo].[usp_CreateDatabase]
    @dbName nvarchar(50)
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @createDbQuery nvarchar(max) = 'CREATE DATABASE ' + QUOTENAME(@dbName);
    EXEC (@createDbQuery)
END
This stored procedure will be in the Master database.
When I execute this in Management Studio, it works as expected and the database is created.
C# code:
using (Data.MasterDB db = new Data.MasterDB())
{
    db.usp_CreateDatabase(databaseName);
}
dbContext has the connection string of the Master database.
But when I try to execute the same from my application using Entity Framework, it throws
CREATE DATABASE statement not allowed within multi-statement transaction.
Database 'TestDb' does not exist. Make sure that the name is entered correctly
Is your application creating a transaction before executing the procedure?
If so, that would cause this problem, because CREATE DATABASE cannot run inside a multi-statement transaction.
You will see the same behavior if you execute the procedure in Management Studio (or any IDE) with "Auto-Commit" turned off, or if you wrap it in a transaction like so:
BEGIN TRANSACTION
EXECUTE dbo.usp_CreateDatabase TmpData
COMMIT;
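If the ambient transaction comes from your own code (a TransactionScope or an explicit EF transaction), one workaround is to suppress it just for this call. A rough sketch, reusing the Data.MasterDB context and databaseName variable from the question; note that if Entity Framework itself starts a transaction around the procedure call, you would instead need to issue the command without EF's automatic transaction (for example through a plain SqlCommand):

using System.Transactions;

// Suppress any ambient transaction so CREATE DATABASE runs outside of it.
using (var suppress = new TransactionScope(TransactionScopeOption.Suppress))
using (Data.MasterDB db = new Data.MasterDB())
{
    db.usp_CreateDatabase(databaseName);
    suppress.Complete();
}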

T-SQL Equivalent of .NET TransactionScopeOption.Suppress

In my .NET code, inside a database transaction (using TransactionScope), I could include a nested block with TransactionScopeOption.Suppress, which ensures that the commands inside the nested block are committed even if the outer block rolls back.
Following is a code sample:
using (TransactionScope txnScope = new TransactionScope(TransactionScopeOption.Required))
{
    db.ExecuteNonQuery(CommandType.Text, "Insert Into Business(Value) Values('Some Value')");
    using (TransactionScope txnLogging = new TransactionScope(TransactionScopeOption.Suppress))
    {
        db.ExecuteNonQuery(CommandType.Text, "Insert Into Logging(LogMsg) Values('Log Message')");
        txnLogging.Complete();
    }
    // Something goes wrong here. Logging is still committed.
    txnScope.Complete();
}
I was trying to find out whether this can be done in T-SQL. A few people have recommended OPENROWSET, but it doesn't look very 'elegant' to use, and I think it is a bad idea to put connection information in T-SQL code.
I've used SQL Service Broker in the past, but its messaging is transactional, which means a message is not posted to the queue until the database transaction commits.
My requirement: our application's stored procedures are fired by a third-party application, within an implicit transaction initiated outside the stored procedure. I want to be able to catch and log any errors (in a database table in the same database) within my stored procedures, and then re-throw the exception so the third-party app rolls back the transaction and knows that the operation failed (and can do whatever is required in case of a failure).
You can set up a loopback linked server with the "remote proc transaction promotion" option set to false and then access it in T-SQL, or use a CLR procedure in SQL Server to create a new connection outside the transaction and do your work there.
Both methods are suggested in How to create an autonomous transaction in SQL Server 2008.
Both methods involve creating new connections. There is an open Connect item requesting that this functionality be provided natively.
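A rough sketch of the CLR approach: a CLR stored procedure that writes the log entry over a regular (non-context) connection, which does not enlist in the caller's transaction. The connection string, database name and dbo.ErrorLog table here are placeholders, and the assembly would need at least EXTERNAL_ACCESS permission:

using System.Data.SqlClient;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public partial class StoredProcedures
{
    [SqlProcedure]
    public static void clr_LogError(SqlString logMsg)
    {
        // A non-context connection (not "context connection=true") opens its own
        // session, and Enlist=false keeps it out of the caller's transaction,
        // so this INSERT commits even if the caller later rolls back.
        using (var conn = new SqlConnection(
            "Server=localhost;Database=MyAppDb;Integrated Security=true;Enlist=false"))
        using (var cmd = new SqlCommand(
            "INSERT INTO dbo.ErrorLog (LogMsg) VALUES (@msg)", conn))
        {
            cmd.Parameters.AddWithValue("@msg",
                logMsg.IsNull ? (object)System.DBNull.Value : logMsg.Value);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}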
Values in a table variable survive a ROLLBACK.
So in the following example, all the rows that were about to be deleted are captured via OUTPUT into a table variable, copied into a persisted table after the rollback, and can be queried later on.
-- First, create our table
CREATE TABLE [dbo].[DateTest] ([Date_Test_Id] INT IDENTITY(1, 1), [Test_Date] datetime2(3));

-- Populate it with 15,000,000 rows
-- from 1st Jan 1900 to 1st Jan 2017.
INSERT INTO [dbo].[DateTest] ([Test_Date])
SELECT TOP (15000000)
    DATEADD(DAY, 0, ABS(CHECKSUM(NEWID())) % 42734)
FROM [sys].[messages] AS [m1]
CROSS JOIN [sys].[messages] AS [m2];

BEGIN TRAN;
BEGIN TRY
    DECLARE @logger TABLE ([Date_Test_Id] INT, [Test_Date] DATETIME);

    -- Delete every 1000th row
    DELETE FROM [dbo].[DateTest]
    OUTPUT deleted.Date_Test_Id, deleted.Test_Date INTO @logger
    WHERE [Date_Test_Id] % 1000 = 0;

    -- Make it fail
    SELECT 1/0;

    -- So this will never happen
    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    ROLLBACK TRAN;
    SELECT * INTO dbo.logger FROM @logger;
END CATCH;

SELECT * FROM dbo.logger;
DROP TABLE dbo.logger;

Multiple SQL commands executed via "ExecuteReader" in a system where I can only manipulate SQL statements

I have to configure a system which provides me with an area to input SQL statements.
It is important to notice that we cannot modify the system we are configuring.
I believe the system was built in C# (.net for sure but C# is a guess).
Anyway, I'm trying to create a script that would:
create a temporary table
create a temporary procedure (which inserts into the table created)
call the temporary procedure 4 times
read the temp table as the response to the system's call
Something like:
CREATE Procedure #myTempProcedure(
    @param1 nvarchar(max)
) as
begin
    insert #tempTable (col1, col2) select aCol, bCol from table2 where col2 = @param1;
end;
CREATE TABLE #tempTable
(col1 nvarchar(512),
(col2 nvarchar(512));
EXEC #myTempProcedure N'val1';
EXEC #myTempProcedure N'val2';
EXEC #myTempProcedure N'val3';
EXEC #myTempProcedure N'val4';
select col1, col2 from #tempTable;
The system is very likely executing my script via the C# SqlCommand.ExecuteReader() method, as I can reproduce the problem in a simple C# application I created.
The problem is that when this script is executed, the system (or SQL Server) takes the procedure body to be the entire script and seems to disregard my ; in line 6 of the example above. My intention with this ; was to mark the end of the procedure creation.
Executing this script in Management Studio requires a GO to be placed at line 7 of the example above; otherwise the same problem reported by the system occurs in Management Studio.
Is there a GO equivalent I could use in this script to get it to work?
Or is there a better way to script this?
I have a background in Oracle, and I'm still learning the usual SQL Server tricks... The system accepts multiple commands apart from the CREATE PROCEDURE here, so I'm inclined to believe there is a SQL Server trick I could use.
Thank you in advance!
The problem is that syntactically there is no way to create a procedure and then do something after it in the same batch. The compiler doesn't know where it ends, and things like semi-colon don't fix it (because semi-colon only terminates a statement, not a batch).
Using dynamic SQL (and fixing one syntax error), this works:
EXEC('
CREATE Procedure ##myTempProcedure(
    @param1 nvarchar(max)
) as
begin
    insert #tempTable (col1, col2) select aCol, bCol from table2 where col2 = @param1;
end;
');
CREATE TABLE #tempTable
(
    col1 nvarchar(512),
    col2 nvarchar(512)
);
EXEC ##myTempProcedure N'val1';
EXEC ##myTempProcedure N'val2';
EXEC ##myTempProcedure N'val3';
EXEC ##myTempProcedure N'val4';
select col1, col2 from #tempTable;
EXEC('DROP PROC ##myTempProcedure;');
1) Please check the rights on the server; you may have issues if you cannot change or add anything to the system, e.g. with CREATE PROCEDURE statements.
2) You could do a small exercise:
open a connection object using SqlConnection()
keep the connection open until you have executed all your statements, i.e.:
a) create your #table
b) execute your insert statement
c) select * from your #table
This should get you back the data you intend to get from your temp table. Note I skipped the entire proc here.
Instead of creating a stored procedure, you can execute SQL statements delimited by semicolons; you can execute multiple statements this way. Also, if you want to create a temp table and load it with data, you can use the same connection with multiple SQL commands, as sketched below.
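A minimal C# sketch of that approach (the connection string is a placeholder, and table2/aCol/bCol come from the question's script). Because a local temp table only lives as long as the session that created it, every command has to share the same open connection:

using System.Data.SqlClient;

string connectionString = "Server=.;Database=MyDb;Integrated Security=true"; // placeholder
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();

    // The temp table lives for the lifetime of this connection.
    new SqlCommand("CREATE TABLE #tempTable (col1 nvarchar(512), col2 nvarchar(512));", conn)
        .ExecuteNonQuery();

    foreach (string val in new[] { "val1", "val2", "val3", "val4" })
    {
        using (var insert = new SqlCommand(
            "insert #tempTable (col1, col2) select aCol, bCol from table2 where col2 = @p;", conn))
        {
            insert.Parameters.AddWithValue("@p", val);
            insert.ExecuteNonQuery();
        }
    }

    using (var select = new SqlCommand("select col1, col2 from #tempTable;", conn))
    using (var reader = select.ExecuteReader())
    {
        while (reader.Read())
        {
            // Process reader["col1"], reader["col2"] here.
        }
    }
}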
Given that the proc definition doesn't change, and that there is no real harm in the proc existing beyond the end of this particular process, it could just as easily be a regular (i.e. non-temporary) stored procedure that simply happens to live in tempdb. The benefit of using a regular stored procedure created in tempdb, rather than a global temporary stored procedure, is that you do not need to worry about potential name collisions. The script simply needs to ensure that the stored procedure exists, and there is no need to remove it manually or have it cleaned up automatically.
The following code is adapted from @RBarryYoung's answer:
IF (OBJECT_ID(N'tempdb.dbo.myTempProcedure') IS NULL)
BEGIN
    USE [tempdb];
    EXEC('
    CREATE PROCEDURE dbo.myTempProcedure(
        @param1 NVARCHAR(MAX)
    ) AS
    BEGIN
        INSERT INTO #tempTable (col1, col2)
        SELECT aCol, bCol
        FROM table2
        WHERE col2 = @param1;
    END;
    ');
END;

CREATE TABLE #tempTable
(
    col1 NVARCHAR(512),
    col2 NVARCHAR(512)
);

EXEC tempdb.dbo.myTempProcedure N'val1';
EXEC tempdb.dbo.myTempProcedure N'val2';
EXEC tempdb.dbo.myTempProcedure N'val3';
EXEC tempdb.dbo.myTempProcedure N'val4';

SELECT col1, col2 FROM #tempTable;
The only difference here is that a non-temporary Stored Procedure does not execute in the context of the current database, but instead, like any other non-temporary Stored Procedure, runs within the context of the database where it exists, which in this case is tempdb. So the table that is being selected from (i.e. table2) needs to be fully-qualified. This means that if the proc needs to run in multiple databases and reference objects local to each of them, then this approach is probably not an option.

How to prevent SQL injection when inserting records via a stored procedure?

I am new to SQL Server. I am trying to insert records into a table using the stored procedure shown below.
I would like suggestions on the stored procedure below. Specifically:
can I prevent SQL injection?
is it the right way?
Correct me if I have missed anything in the procedure below that could lead to SQL injection.
CREATE PROCEDURE [dbo].[spInsertParamTable]
    @CmpyCode nvarchar(50),
    @Code nvarchar(50),
    @DisplayCode nvarchar(50),
    @TotalDigit int,
    @Nos bigint,
    @Identity int OUTPUT
AS
BEGIN
    INSERT tblParamTable (CmpyCode, Code, DisplayCode, TotalDigit, Nos)
    VALUES (@CmpyCode, @Code, @DisplayCode, @TotalDigit, @Nos)
END
SELECT @Identity = SCOPE_IDENTITY();
RETURN @Identity
SQL Injection specifically refers to injecting SQL code into an existing SQL query that's built up via string concatenation and executed dynamically. It is almost always of the form:
#dynamicSQL = "select * from sensitivetable where field = " + #injectableParameter
sp_executesql #dynamicSQL
For this particular stored procedure, the worst an attacker could do is insert unhelpful values into your tblParamTable.
However, if these values are then used in a dynamically-built query later on, then this merely becomes a second-order attack: insert values on page 1, see results of dynamic query on page 2. (I only mention this since your table is named tblParamTable, suggesting it might contain parameters for later re-use.)
Can I prevent SQL injection?
You already are - there is no way to "inject" code into your SQL statement since you're using parameters.
Is it the right way?
Well, there's not one "right" way - but I don't see anything seriously wrong with what you're doing. A few suggestions:
You don't need to RETURN your output parameter value. Setting it is enough.
You have the last SELECT outside of the BEGIN/END block, which isn't hurting anything but for consistency you should put everything inside BEGIN/END (or leave them out altogether).
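For completeness, a sketch of how the procedure would typically be called from C# (connection string and values are placeholders). Because every value travels as a parameter, nothing the user types can change the SQL text itself:

using System.Data;
using System.Data.SqlClient;

string connectionString = "Server=.;Database=MyDb;Integrated Security=true"; // placeholder
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.spInsertParamTable", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    cmd.Parameters.AddWithValue("@CmpyCode", "CMP001");
    cmd.Parameters.AddWithValue("@Code", "A01");
    cmd.Parameters.AddWithValue("@DisplayCode", "A-01");
    cmd.Parameters.AddWithValue("@TotalDigit", 6);
    cmd.Parameters.AddWithValue("@Nos", 1L);

    // Read the OUTPUT parameter rather than relying on RETURN.
    SqlParameter identity = cmd.Parameters.Add("@Identity", SqlDbType.Int);
    identity.Direction = ParameterDirection.Output;

    conn.Open();
    cmd.ExecuteNonQuery();
    int newId = (int)identity.Value;
}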

Insert multiple SQL rows via stored proc

I have looked at some related topics but my question isn't quite answered:
C# - Inserting multiple rows using a stored procedure
Insert Update stored proc on SQL Server
Efficient Multiple SQL insertion
I have the following kind of setup for running my stored procedure in the code-behind of my web application. The thing is, I am now faced with the possibility of inserting multiple products, and I would like to do it all in one ExecuteNonQuery rather than doing a foreach loop and running it n times.
I am not sure how to do this, or if it can be, with my current setup.
The code should be somewhat self explanatory but if clarification is needed let me know. Thanks.
SqlDatabase database = new SqlDatabase(transMangr.ConnectionString);
DbCommand commandWrapper = StoredProcedureProvider.GetCommandWrapper(database, "proc_name", useStoredProc);
database.AddInParameter(commandWrapper, "#ProductID", DbType.Int32, entity._productID);
database.AddInParameter(commandWrapper, "#ProductDesc", DbType.String, entity._desc);
...more parameters...
Utility.ExecuteNonQuery(transMangr, commandWrapper);
Proc
ALTER PROCEDURE [dbo].[Products_Insert]
    -- Add the parameters for the stored procedure here
    @ProductID int,
    @Link varchar(max),
    @ProductDesc varchar(max),
    @Date DateTime
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO [dbo].[Prodcuts]
    (
        [CategoryID],
        [Link],
        [Desc],
        [Date]
    )
    VALUES
    (
        @ProductID,
        @Link,
        @ProductDesc,
        @Date
    )
END
You should be fine running your stored procedure in a loop. Just make sure that you commit rarely, not after every insert.
For alternatives, you have already found the discussion about loading data.
Personally, I like a set-based SQL insert of the form insert into myTable select *, literalValue from someOtherTable;
But that will probably not do in your case.
You could pass all your data as a table-valued parameter - MSDN has a pretty good write-up about it here
Something along the lines of the following should work
CREATE TABLE dbo.tSegments
(
    SegmentID BIGINT NOT NULL CONSTRAINT pkSegment PRIMARY KEY CLUSTERED,
    SegCount BIGINT NOT NULL
);

CREATE TYPE dbo.SegmentTableType AS TABLE
(
    SegmentID BIGINT NOT NULL
);

CREATE PROCEDURE dbo.sp_addSegments
    @Segments dbo.SegmentTableType READONLY
AS
BEGIN
    MERGE INTO dbo.tSegments AS tSeg
    USING @Segments AS S
        ON tSeg.SegmentID = S.SegmentID
    WHEN MATCHED THEN
        UPDATE SET tSeg.SegCount = tSeg.SegCount + 1
    WHEN NOT MATCHED THEN
        INSERT (SegmentID, SegCount) VALUES (S.SegmentID, 1);
END
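On the client side, a table-valued parameter is passed as a structured parameter, typically from a DataTable. A short sketch, assuming the dbo.SegmentTableType and dbo.sp_addSegments defined above (connection string and values are placeholders):

using System.Data;
using System.Data.SqlClient;

string connectionString = "Server=.;Database=MyDb;Integrated Security=true"; // placeholder

// Build the rows to send as the table-valued parameter.
var segments = new DataTable();
segments.Columns.Add("SegmentID", typeof(long));
segments.Rows.Add(1L);
segments.Rows.Add(2L);

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.sp_addSegments", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;

    // Pass the DataTable as a structured parameter typed as dbo.SegmentTableType.
    SqlParameter p = cmd.Parameters.AddWithValue("@Segments", segments);
    p.SqlDbType = SqlDbType.Structured;
    p.TypeName = "dbo.SegmentTableType";

    conn.Open();
    cmd.ExecuteNonQuery();
}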
Define the commandWrapper and parameters for the command outside of the loop, and then within the loop just assign the parameter values and execute the proc.
SqlDatabase database = new SqlDatabase(transMangr.ConnectionString);
DbCommand commandWrapper = StoredProcedureProvider.GetCommandWrapper(database, "proc_name", useStoredProc);
database.AddInParameter(commandWrapper, "@ProductID", DbType.Int32);
database.AddInParameter(commandWrapper, "@ProductDesc", DbType.String);
...more parameters...

foreach (var entity in entities)
{
    database.SetParameterValue(commandWrapper, "@ProductID", entity._productID);
    database.SetParameterValue(commandWrapper, "@ProductDesc", entity._desc);
    //..more parameters...
    Utility.ExecuteNonQuery(transMangr, commandWrapper);
}
Not ideal from a purist point of view, but sometimes you are limited by frameworks and libraries: you are forced to call stored procedures in a certain way, to bind parameters in a certain way, and connections are managed by pools as part of your framework.
In such circumstances, a method we have found to work is simply to write your stored procedure with a lot of parameters, usually a name followed by a number, e.g. @ProductId1, @ProductDesc1, @ProductId2, @ProductDesc2, up to a number you decide on, say 32.
You can use some form of scripting language to produce the lines for this.
You can get the stored procedure to insert all the values first into a table parameter that allows nulls, then do bulk inserts/merges on this data in a way similar to Johnv2020's answer. You might remove the null rows first.
It will usually be more efficient than doing it one row at a time (partly because of the database operations themselves, and partly because of your framework's overhead in getting the connection to call the procedure, etc.).
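A rough sketch of the client-side binding for that idea, reusing the database, commandWrapper, entities and transMangr objects from the earlier answer (entities is assumed to be a list); the 32-pair limit and the @ProductIdN/@ProductDescN names are whatever you chose when writing the procedure:

// Bind up to 32 (ProductId, ProductDesc) pairs; unused slots are sent as NULL
// so the procedure can filter them out before its bulk insert/merge.
const int maxPairs = 32;
for (int i = 1; i <= maxPairs; i++)
{
    var entity = i <= entities.Count ? entities[i - 1] : null;
    database.AddInParameter(commandWrapper, "@ProductId" + i, DbType.Int32,
        entity != null ? (object)entity._productID : System.DBNull.Value);
    database.AddInParameter(commandWrapper, "@ProductDesc" + i, DbType.String,
        entity != null ? (object)entity._desc : System.DBNull.Value);
}
Utility.ExecuteNonQuery(transMangr, commandWrapper);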
