I have a database A which is connected to my website (ASP.NET MVC). Whenever there is a change/update in database A through the website, I want to run a console app to grab the updated data from database A and pull it down to database B.
Is it possible to implement this using SqlDependency or Service Broker, or is there a better way of doing it?
There are a number of ways you can do that. To name a few:
set up database mirroring
back up/restore the whole db (can easily be overkill)
use custom scripts to update one db from the other
use the Sync Framework from ADO.NET
use some custom code to update the second db
While the first three can be set up entirely at the database level, the last two (and the scripting option as well) involve some application code.
To invoke your code at the right time you can use either a push or a pull approach: set up a timer, or use SqlDependency to get a callback when an update happens.
At the database level you can set up a trigger or a recurring job.
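For the SqlDependency option, a minimal listener sketch could look like the one below. It assumes Service Broker/query notifications are available on database A, and the table name dbo.Orders, the connection string, and the console app path are placeholders:

using System;
using System.Data.SqlClient;
using System.Diagnostics;

class ChangeListener
{
    // Placeholder connection string for database A.
    const string ConnectionString = "Data Source=.;Initial Catalog=DatabaseA;Integrated Security=True";

    static void Main()
    {
        SqlDependency.Start(ConnectionString);   // start the notification listener
        RegisterDependency();
        Console.ReadLine();                      // keep the process alive
        SqlDependency.Stop(ConnectionString);
    }

    static void RegisterDependency()
    {
        using (var connection = new SqlConnection(ConnectionString))
        using (var command = new SqlCommand(
            // Query notification rules: explicit column list, two-part table name, no SELECT *
            "SELECT OrderId, Status FROM dbo.Orders", connection))
        {
            var dependency = new SqlDependency(command);
            dependency.OnChange += (sender, e) =>
            {
                RegisterDependency();                      // subscriptions fire once, so re-register
                Process.Start(@"C:\Sync\SyncConsole.exe"); // placeholder path to the sync console app
            };

            connection.Open();
            using (command.ExecuteReader()) { }  // executing the command activates the subscription
        }
    }
}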
You may implement SQL Server CLR integration in the following ways:
Enable CLR integration in SQL Server: https://msdn.microsoft.com/en-us/library/ms131048(SQL.100).aspx
Write a CLR trigger: https://msdn.microsoft.com/en-us/library/ms131093(v=sql.100).aspx
For more info: https://msdn.microsoft.com/en-us/library/ms254498%28v=vs.110%29.aspx
UPDATE:
You may create a stored procedure like the one below and call it from a trigger on that table (credit to the original source):
CREATE PROCEDURE dbo.usp_ExecCmdShellProcess
AS
BEGIN
    DECLARE @job NVARCHAR(100);
    SET @job = 'xp_cmdshell replacement - ' + CONVERT(NVARCHAR, GETDATE(), 121);

    EXEC msdb..sp_add_job @job_name = @job,
         @description = 'Automated job to execute command shell script',
         @owner_login_name = 'sa', @delete_level = 1;

    EXEC msdb..sp_add_jobstep @job_name = @job, @step_id = 1,
         @step_name = 'Command Shell Execution', @subsystem = 'CMDEXEC',
         @command = 'c:\Testconsole.exe', @on_success_action = 1;

    EXEC msdb..sp_add_jobserver @job_name = @job;
    EXEC msdb..sp_start_job @job_name = @job;
END;
GO
I can't say which is better, but I think you can go with a trigger and call a CLR function in SQL Server: https://msdn.microsoft.com/en-us/library/w2kae45k.aspx.
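If you prefer the CLR trigger route from the links above, a rough C# sketch could look like the following. The table name, trigger events, and executable path are assumptions; the assembly would need to be deployed with PERMISSION_SET = UNSAFE to start a process, and launching an executable inside a trigger blocks the transaction, so treat this purely as an illustration:

using System.Diagnostics;
using Microsoft.SqlServer.Server;

public class Triggers
{
    // Hypothetical trigger on dbo.Inventory that reacts to inserts and updates.
    [SqlTrigger(Name = "trg_Inventory_Changed", Target = "dbo.Inventory", Event = "FOR INSERT, UPDATE")]
    public static void OnInventoryChanged()
    {
        SqlTriggerContext context = SqlContext.TriggerContext;

        if (context.TriggerAction == TriggerAction.Insert ||
            context.TriggerAction == TriggerAction.Update)
        {
            // Kick off the console app that pulls the changed rows into database B.
            Process.Start(@"c:\Testconsole.exe");
        }
    }
}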
Related
I need to pass a parameter to an SSIS package, which is being executed from a SQL Agent job. The job is triggered from C# code. I could not find a definitive answer on the internet. Is it possible?
SQL Server version: 2012
From C#, I am using the following code to start the job:
exec msdb..sp_start_job @job_name = 'Upload_Job'
A SQL Agent job is static - the definition of the actions to take is specified at job/job step creation time.
If you need to pass parameters to a job, SQL Agent doesn't support this.
If you have a finite set of parameters to pass - Update HR, Load Finance, Delta Processing, Full Load - then create a job per scenario and be done with it.
If the set of parameters cannot be bounded, then create one-time jobs within SQL Agent. This allows you to specify the exact parameters you need for this run with no worries about concurrent access to a configuration table. The reason for taking this route and not just running the SSISDB procedures themselves usually involves the need to specify a proxy user/credential.
Create a parameters table and have your C# app insert into it. Then modify your SSIS package to grab the parameters from there as its first step, and add a final step to the package to clear down the parameter table.
@Nick.McDermaid raised a concern about parallelism. If there is a chance that this job may be set to run in quick succession, you could use a "parameter queue": the first step of the SSIS package would pop the top parameter off the queue. That way it can be run time after time in quick succession with no issue.
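A sketch of how the C# side might feed such a parameter table (or queue) before starting the job. The table name dbo.SsisParameterQueue and its columns are made up for illustration; the package's first step would read and remove the oldest row:

using System;
using System.Data;
using System.Data.SqlClient;

static class JobStarter
{
    public static void QueueParametersAndStartJob(string connectionString, string fileName, DateTime fileDate)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();

            // 1. Push this run's parameters onto the (hypothetical) queue table.
            using (var insert = new SqlCommand(
                "INSERT INTO dbo.SsisParameterQueue (FileName, FileDate, QueuedAt) " +
                "VALUES (@FileName, @FileDate, SYSUTCDATETIME());", connection))
            {
                insert.Parameters.Add("@FileName", SqlDbType.NVarChar, 400).Value = fileName;
                insert.Parameters.Add("@FileDate", SqlDbType.Date).Value = fileDate;
                insert.ExecuteNonQuery();
            }

            // 2. Start the (static) Agent job; the package reads its parameters from the queue.
            using (var startJob = new SqlCommand("msdb.dbo.sp_start_job", connection))
            {
                startJob.CommandType = CommandType.StoredProcedure;
                startJob.Parameters.AddWithValue("@job_name", "Upload_Job");
                startJob.ExecuteNonQuery();
            }
        }
    }
}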
Like some have said in the comments, the better answer is not to use the Agent but instead to call the SSIS package itself in SSISDB. This means you can run the package multiple times concurrently (impossible with the Agent), and you can pass parameters.
Depending on your setup, you might want to create a stored procedure that calls the relevant SPs in SSISDB for your specific task, rather than calling them all from the application. I've found this easier, as you have a little more control and only need to change one place if something in the package changes.
This is just an example, but it should help you get the idea:
USE SSISDB;
GO
--I'm going to create the SPs in a new schema in SSISDB; you can create this elsewhere if you want
--Create the new schema
CREATE SCHEMA app;
GO
--Create the proc; I've made up some parameters
CREATE PROC app.UploadTask @FileName sql_variant, @FileDate date, @RetryNum int AS

    DECLARE @execution_id bigint;

    --Create the execution
    EXEC [catalog].create_execution @package_name = N'UploadDocument.dtsx', --Made-up package name
                                    @execution_id = @execution_id OUTPUT,
                                    @folder_name = N'FTP Packages',  --Made-up folder name
                                    @project_name = N'FileTranfers', --Made-up project name
                                    @use32bitruntime = FALSE,
                                    @reference_id = NULL;

    --Add the parameters
    EXEC [catalog].set_execution_parameter_value @execution_id = @execution_id,
                                                 @object_type = 30,
                                                 @parameter_name = N'FileName',
                                                 @parameter_value = @FileName;
    EXEC [catalog].set_execution_parameter_value @execution_id = @execution_id,
                                                 @object_type = 30,
                                                 @parameter_name = N'SubmissionDate',
                                                 @parameter_value = @FileDate;
    EXEC [catalog].set_execution_parameter_value @execution_id = @execution_id,
                                                 @object_type = 30,
                                                 @parameter_name = N'Retries',
                                                 @parameter_value = @RetryNum;

    --Set the logging level
    EXEC [catalog].set_execution_parameter_value @execution_id = @execution_id,
                                                 @object_type = 50,
                                                 @parameter_name = N'LOGGING_LEVEL',
                                                 @parameter_value = 1;

    --This is optional, comment out or delete the following if you do not want it
    --Set the package to run synchronously
    EXEC [catalog].set_execution_parameter_value @execution_id = @execution_id,
                                                 @object_type = 50,
                                                 @parameter_name = N'SYNCHRONIZED',
                                                 @parameter_value = 1;

    --And now, run the package
    EXEC [catalog].start_execution @execution_id;
GO
--Now a sample call:
EXEC app.UploadTask @FileName = N'\\YourFileFile\SomeShare\SomeFolder\YourFile.txt', --It's important this is an nvarchar; varchar won't work!
                    @FileDate = '20180704',
                    @RetryNum = 3;
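If you call the wrapper procedure from the application rather than from SSMS, a minimal ADO.NET sketch might look like this; the connection string is a placeholder and the parameter values are copied from the sample call above:

using System;
using System.Data;
using System.Data.SqlClient;

static class UploadTaskRunner
{
    public static void Run(string ssisdbConnectionString)
    {
        using (var connection = new SqlConnection(ssisdbConnectionString))
        using (var command = new SqlCommand("app.UploadTask", connection))
        {
            command.CommandType = CommandType.StoredProcedure;

            // Pass the file name as NVarChar so it reaches the sql_variant parameter as Unicode text.
            command.Parameters.Add("@FileName", SqlDbType.NVarChar, 400).Value =
                @"\\YourFileFile\SomeShare\SomeFolder\YourFile.txt";
            command.Parameters.Add("@FileDate", SqlDbType.Date).Value = new DateTime(2018, 7, 4);
            command.Parameters.Add("@RetryNum", SqlDbType.Int).Value = 3;

            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}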
Any questions, please do ask.
Why use a SQL Server Agent job at all? It is not possible to pass parameters to a job.
There is another way to execute the package: the DTExec utility, which allows passing parameters to the package configuration:
dtexec /f "PathToMyPackage\Package.dtsx" /set \package.variables[myvariable].Value;myvalue
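If you take the DTExec route from C#, one option is simply to launch the utility as an external process. This is only a sketch; the package path and variable name are copied from the example command above, and dtexec must be on the PATH (or be referenced by its full path):

using System.Diagnostics;

static class PackageRunner
{
    public static bool RunPackage()
    {
        var startInfo = new ProcessStartInfo
        {
            FileName = "dtexec",
            Arguments = "/f \"PathToMyPackage\\Package.dtsx\" " +
                        "/set \\package.variables[myvariable].Value;myvalue",
            UseShellExecute = false,
            RedirectStandardOutput = true
        };

        using (var process = Process.Start(startInfo))
        {
            string log = process.StandardOutput.ReadToEnd(); // dtexec writes its progress here
            process.WaitForExit();
            return process.ExitCode == 0;                    // non-zero exit code means the package failed
        }
    }
}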
MS SQL Server 2017
ASP.NET Core 2
I call a stored procedure from C#:
res = db.Query<SearchPictureInfoOutputModel>("sp_SearchPictureInfo", new { Prc_ID = input.Prc_ID, Cust_ID = input.Cust_ID, AppCode = input.AppCode},
commandType: CommandType.StoredProcedure).FirstOrDefault();
I get this error:
System.Data.SqlClient.SqlException: 'Cannot resolve the collation conflict between "Cyrillic_General_CI_AS" and "SQL_Latin1_General_CP1_CI_AS" in the equal to operation.'
If I call the procedure from SQL Server Management Studio:
DECLARE @return_value int

EXEC @return_value = [dbo].[sp_SearchPictureInfo]
     @Prc_ID = 2663,
     @Cust_ID = 26429,
     @AppCode = N'19139'

SELECT 'Return Value' = @return_value
GO
No error, just return value.
How can I solve this collation conflict?
Most likely your databases have different collations.
You can check that by executing the following query in your SQL Management studio:
SELECT name, collation_name
FROM sys.databases
Also, on non-master databases you can use right-click -> Properties -> General -> Maintenance and you will see Collation as the last option.
Non-master ones can be changed using this query:
USE master;
GO
ALTER DATABASE <<YOUR_DATABASE_OTHER_THAN_MASTER>>
COLLATE Latin1_General_CI_AS ;
GO
If you have to change the collation of the master database (like I did), you will not be able to do it via SQL Server Management Studio. The easiest way for me was to stop it as well as all SQL services, then navigate to the Binn folder of the SQL Server installation; for 2016 and x64 it was this one:
C:\Program Files\Microsoft SQL Server\MSSQL13.MSSQLSERVER\MSSQL\Binn
Then if you use the default instance just run:
sqlservr.exe -m -q"Latin1_General_CI_AS"
or whatever collation you want to set. If you have several instances add also the following parameter:
-s"<<YOUR_SQL_INSTANCE>>" e.g. -> sqlservr.exe -m -s"SQLEXP2014" -q"Latin1_General_CI_AS"
Hopefully this will help you to resolve your problem :)
Sources:
https://learn.microsoft.com/en-us/sql/relational-databases/collations/view-collation-information?view=sql-server-2017
https://www.mssqltips.com/sqlservertip/3519/changing-sql-server-collation-after-installation/
I am very new to C# and VFP (Visual FoxPro). My first goal is to run and test a simple update query using the query builder in MS Visual Studio. After adding a VFP database successfully (I can now preview data and execute SELECT statements), I executed the following query and I get a SQL execution error with the message: Trigger failed in Inventory.
UPDATE inventory SET location = 'test' WHERE inventoryid = 221
I don't understand what is wrong. Is there a better way to learn and experiment with a VFP database?
Update
Thanks to everyone. This database came with a piece of packaged software. Since I was able to connect to the database and export some files after building an external app, I thought I could easily update a few pieces of information too. I now have VFP, but I don't have any source code for the software.
Is there any possibility that I will not be able to find or modify the trigger function and the stored procedure, since they are part of a packaged software product?
I just have the installed software and the VFP database inside it. The software talks to that VFP database.
Update 2
I found the stored procedure in an FPT file under a different folder, which is the location of another system database. This is the code:
PROCEDURE InventoryUpdateTrigger(tcAlias,tcSource,tlHis)
IF VARTYPE(glNoTrigger) = "L"
IF glNoTrigger
RETURN .t.
ENDIF
ENDIF
IF VARTYPE(goApp) <> "O"
RETURN .f.
ENDIF
LOCAL laFieldList(1),laDataValues(1,3), laDataFields(1)
LOCAL lcCriticalFields, lnFields, lcField, llRetVal,lnItemid,lnAlias,;
lcPtNo, llBBHasChanged, lnField, lcNewVal, lcOldVal, lcSource, lnInventoryid
lnAlias = SELECT(0)
llRetVal = .t.
lcCriticalFields = "ITEMID,ITEMNUMBER,MFGR,SERIALNUMBER,"+;
"DESCRIPTION,WAREHOUSE,MACHINETYPE,QTY,UNITCOST,STATUS,RECEIVESTATUS,"+;
"FREIGHTIN,CONDITIONCODE,DATERECEIVED,BREAKDOWNINVENTORYID,"+;
"DATEALLOCATED,ALLOCATEDBY,DATEUNBOOKED,UNBOOKEDBY,LOCATION"
llBBHasChanged = .f.
BEGIN TRANSACTION
TRY
lnInventoryid = EVALUATE(tcAlias+".inventoryid")
lnItemid = EVALUATE(tcAlias+".itemid")
SELECT (tcAlias)
lnDataFields = AFIELDS(laDataFields,tcAlias)
DIMENSION laDataValues(lnDataFields,3)
FOR lnField = 1 TO lnDataFields
lcField = ALLTRIM(laDataFields(lnField,1))
lcNewVal = ALLTRIM(TRANSFORM(EVALUATE(tcAlias+"."+lcField)))
lcOldVal = ALLTRIM(TRANSFORM(OLDVAL(lcField,tcAlias)))
laDataValues(lnField,1) = lcField
laDataValues(lnField,2) = lcOldVal
laDataValues(lnField,3) = lcNewVal
IF !llBBHasChanged
IF UPPER(lcField) $ UPPER(lcCriticalFields)
IF GETFLDSTATE(lcField,tcAlias) > 1
llBBHasChanged = .t.
ENDIF
ENDIF
ENDIF
ENDFOR
IF goapp.osystem.audit_inventory
llRetVal = UpdateDatabaseAudit(IIF(tlHis,"INVENTORYHIS","INVENTORY"), lnInventoryid,;
goapp.nUserid, AUDIT_UPDATE, tcSource, #laDataFields,#laDataValues)
ENDIF
IF llRetVal
LOCAL llUpdateBrokerBin
llUpdateBrokerBin = .f.
IF llBBHasChanged
IF !tlHis
** need to update brokerbin
IF GETFLDSTATE("STATUS",tcAlias) > 1 OR GETFLDSTATE("RECEIVESTATUS",tcAlias) > 1
lcOldReceive = OLDVAL("ReceiveStatus",tcAlias)
lcNewReceive = EVALUATE(tcAlias+".ReceiveStatus")
lcOldStatus = OLDVAL("Status",tcalias)
lcNewStatus = EVALUATE(tcAlias+".Status")
lnCase = 0
DO case
CASE lcOldReceive <> STATUS_RECEIVED ;
AND lcNewReceive = STATUS_RECEIVED ;
AND lcNewStatus = STATUS_OPEN
llUpdateBrokerBin = .t.
lnCase = 1
CASE lcOldStatus = STATUS_OPEN ;
AND lcNewStatus = STATUS_ALLOCATED ;
AND lcNewReceive = STATUS_RECEIVED
llUPdateBrokerBin = .t.
lnCase = 2
CASE lcOldReceive = STATUS_RECEIVED ;
AND lcNewReceive <> STATUS_RECEIVED ;
AND lcNewStatus = STATUS_OPEN
llUpdateBrokerBin = .t.
lnCase = 3
CASE lcOldStatus = STATUS_ALLOCATED ;
AND lcNewStatus = STATUS_OPEN ;
AND lcNewReceive = STATUS_RECEIVED
llUPdateBrokerBin = .t.
lnCase = 4
CASE lcOldStatus <> STATUS_BOOKED ;
AND lcNewStatus = STATUS_BOOKED
llUpdateBrokerBin = .t.
lnCase = 5
CASE lcOldStatus = STATUS_BOOKED ;
AND lcNewStatus <> STATUS_BOOKED
llUpdateBrokerBin = .t.
lnCase = 6
ENDCASE
ELSE
lcReceive = EVALUATE(tcAlias+".ReceiveStatus")
lcStatus = EVALUATE(tcAlias+".Status")
IF lcReceive = STATUS_RECEIVED AND lcStatus = STATUS_OPEN
llUpdateBrokerBin = .t.
ENDIF
ENDIF
IF llUpdateBrokerBin
llRetVal = setbrokerbintoprocess(lnItemid,lnInventoryid,0,tcAlias,"INVENTORY","UPDATE")
ENDIF
ENDIF
ENDIF
ENDIF
CATCH TO loException
llRetVal = .f.
RecordError(loException.ErrorNo, loException.LineNo, loException.Message,loException.Procedure)
FINALLY
IF llRetVal
END TRANSACTION
ELSE
ROLLBACK
ENDIF
ENDTRY
SELECT (lnAlias)
RETURN llRetVal
ENDPROC
It leads to many questions, which I think means I need to learn VFP first; this is not about simply running a SQL statement.
Do you even have VFP to work with?
If you do, run it. In the command window, type
CD "path to where the database is located" [enter]
OPEN DATABASE NameOfYourDatabase [enter]
MODIFY DATABASE [enter]
This will bring up the database and show all the tables and whatever relationships exist between them. From there, right-click and the popup menu will have "Stored Procedures". The code is embedded in the database container.
Also, for the specific inventory table, you can do a find within the opened database. Then right-click on it and click "Modify...". This will bring up the details about the table... Columns, sizes, indexes, and a "tab" page for the overall table showing what rules to apply and methods to call for insert, update and delete triggers.
This should get you started (provided you HAVE VFP to begin with).
FEEDBACK...
Since you have VFP9, once started, in the command window, type
CD ? [enter]
the "?" will ask you for a folder to change directory to. Pick the directory where your database is located.
Once there, then type
OPEN DATABASE ? [enter]
and it should ask which database to open... pick it.
then
MODIFY DATABASE [enter]
it will open a window showing all the tables and whatever relationships are in the database.
Right-click in the database in any open area and then click "Stored Procedures". This will bring up the code window for the stored procedures.
Now, with respect to tables that have triggers: back in the modify-database window, pick a table that you know has triggers (per your example), right-click on it and choose Modify. This will bring up a tabbed screen showing the fields, indexes, and the table itself. The "Table" tab shows the validation rules and the insert/update/delete triggers.
Hopefully this will help you find your issue.
@Robi,
If you don't have a need to work with existing VFP databases, then it is unlikely you would want to work with them. I suggest you use one of the many other databases that have ODBC and/or OLE DB drivers (MS SQL Server, PostgreSQL, MariaDB, MySQL, Oracle, SQLite, just to name a few). You can work with such databases using C# or VFP. With C# you not only have access to those with ODBC and/or OLE DB drivers but to almost any database (like the relatively newer NoSQL databases). This is just a suggestion. If you would use a VFP database from C# anyway, keep in mind that you need to own VFP to do some tasks at the database level, and not all commands are supported through the VFPOLEDB driver. Also note that the VFPOLEDB driver is 32-bit (in other words, don't forget to target x86 from C#). Also, the VFPOLEDB driver unfortunately doesn't behave the same way when you use it from C# as it does from VFP.
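If you do end up talking to the VFP database from C# anyway, a minimal OLE DB sketch for the failing update might look like this. The .dbc path is a placeholder, and remember the x86 requirement mentioned above; note that the update trigger in the database container still fires, so this will fail in the same way until the trigger's rule is satisfied:

using System.Data.OleDb;

static class VfpUpdate
{
    public static void UpdateInventoryLocation()
    {
        // Placeholder path: point Data Source at the .dbc file (or the folder containing it).
        const string connectionString =
            @"Provider=VFPOLEDB.1;Data Source=C:\Path\To\YourDatabase.dbc;";

        using (var connection = new OleDbConnection(connectionString))
        using (var command = new OleDbCommand(
            "UPDATE inventory SET location = ? WHERE inventoryid = ?", connection))
        {
            // VFPOLEDB uses positional (?) parameters, added in order.
            command.Parameters.AddWithValue("location", "test");
            command.Parameters.AddWithValue("inventoryid", 221);

            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}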
Having said that, assuming you will use the VFP database anyway and want to debug this case, it is easier to do it from within VFP. For a seasoned developer there would be other ways, but I think this one is the easiest for a newcomer. In the command window:
use ('inventory')[enter]
modify structure[enter]
(you may need to write the full path to your table if it is not in the search path; you don't need to set the default folder to that location)
This will bring up a tabbed dialog. Choose the "Table" tab and check the procedure name next to "Update trigger". Now you know which procedure is called for the update trigger. Copy the name from there and close the dialog.
Now type this:
Set Database to[spacebar]
as soon as you press spacebar VFP would show you a database list (likely with one element). Choose the one that this table belongs to and press [enter]. Then type:
Modify Procedure [enter]
Now you should see the code window with the "Stored Procedures" for that database. Locate the procedure you got from the update trigger (hint: in the standard toolbar there is an icon with a document and a magnifier - the "Document View Window". If you bring it up and then click back to the code window, you will see all the procedure names in that code file. Clicking a procedure name in the document view window takes you directly to that code location).
Now, assuming you are good up to this point and have found the procedure that the update trigger calls, you need to find out why it is failing. To debug, insert this line of code just after the "PROCEDURE ..." line (and also after the "LPARAMETERS ..." or "PARAMETERS ..." line if there is one):
set step on
Press Ctrl+W to save. Now run your update command from command line:
UPDATE inventory SET location = 'test' WHERE inventoryid = 221
It will hit your "set step on" line and the debugger will pop up. Now you can do all sorts of debugging (after all these years and the glory of VS2013, I still miss VFP's debugging capabilities compared to C# - most of which you only discover in advanced development).
I would bet "location" comes from a lookup table for which someone created referential integrity rules (a missing 'test' value in the lookup would cause the trigger to fail).
Is there a way to determine the current SQL Server session ID (@@SPID) for an opened DbContext, short of making a SQL query directly to the database?
If there is, is there any guarantee that the SQL Server session ID will remain the same until the DbContext is released and its connection is released back to the Entity Framework connection pool? Something similar to this:
using (MyEntities db = new MyEntities()) {
// the following 3 pieces of code are not existing properties and will result in compilation errors
// I'm just looking for something similar to the following 3 lines
db.CurrentSessionId; //error
db.Database.CurrentSessionId; //error
((IObjectContextAdapter)db).ObjectContext.Connection.CurrentSessionId; //error
// the following code will work, but will this session id be the same until the original DbContext is disposed?
// is there any chance that a db.Database.SqlQuery call will spin off its own connection from the pool?
short spid = db.Database.SqlQuery<short>("SELECT @@SPID").FirstOrDefault();
}
First of all, the DbContext alone will NOT open any SQL Server process on your database; the query does.
So in this case, when you run SELECT @@SPID, you will definitely open a new process with a new ID.
The good news is that Entity Framework will use the same process to run your subsequent queries, so within the same using block you should always get the same @@SPID value.
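A quick way to sanity-check this from the question's own code is to query @@SPID twice through the same DbContext. Explicitly opening the underlying connection pins one physical connection to the context for its lifetime; without that, EF opens and closes the connection per query and you are relying on the pool handing back the same connection (which it normally does, but it is not guaranteed):

using (MyEntities db = new MyEntities()) {
    db.Database.Connection.Open();   // pin one physical connection to this context

    short first  = db.Database.SqlQuery<short>("SELECT @@SPID").FirstOrDefault();
    short second = db.Database.SqlQuery<short>("SELECT @@SPID").FirstOrDefault();

    Console.WriteLine(first == second); // True while the connection stays open
}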
You can run this query
select *
from master.dbo.sysprocesses
where program_name = 'EntityFramework'
to observe the current processes on your database associated with Entity Framework.
You can then use the query below to get the SQL statement associated with a specific process. For more information, please take a look at the accepted answer here: List the queries running on SQL Server
declare
    @spid int
    , @stmt_start int
    , @stmt_end int
    , @sql_handle binary(20)

set @spid = XXX -- Fill this in

select top 1
    @sql_handle = sql_handle
    , @stmt_start = case stmt_start when 0 then 0 else stmt_start / 2 end
    , @stmt_end = case stmt_end when -1 then -1 else stmt_end / 2 end
from master.dbo.sysprocesses
where spid = @spid
order by ecid

SELECT
    SUBSTRING( text,
        COALESCE(NULLIF(@stmt_start, 0), 1),
        CASE @stmt_end
            WHEN -1
                THEN DATALENGTH(text)
            ELSE
                (@stmt_end - @stmt_start)
        END
    )
FROM ::fn_get_sql(@sql_handle)
I can run this command in SQL Server Management Studio to detach the db:
ALTER DATABASE mydb SET OFFLINE WITH ROLLBACK IMMEDIATE
GO
dbo.sp_detach_db @dbname = N'mydb', @keepfulltextindexfile = N'false'
When I run the same command over the same connection via ADO.NET, it fails with this error:
The database 'mydb' can not be opened because it is offline
(The error message is translated from German.)
The ADO.NET code is:
SqlCommand cmdOffline = new SqlCommand(@"ALTER DATABASE mydb SET OFFLINE WITH ROLLBACK IMMEDIATE");
cmdOffline.Connection = prepareMasterDBConnection;
cmdOffline.ExecuteNonQuery();

SqlCommand cmdDetach = new SqlCommand(@"dbo.sp_detach_db @dbname = N'mydb', @keepfulltextindexfile = N'false'");
cmdDetach.Connection = prepareMasterDBConnection;
cmdDetach.ExecuteNonQuery();
The connection is set to the master DB and is open. The first command executes successfully.
What is the difference here between calling the code from ADO.NET and from SQL Server Management Studio?
If your goal is to avoid conflicting connections while dropping the database, then rather than setting it offline before detaching, I would use ALTER DATABASE mydb SET SINGLE_USER WITH ROLLBACK IMMEDIATE (and reverse it with ALTER DATABASE mydb SET MULTI_USER).
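Translated to the ADO.NET code from the question, a minimal sketch of that approach (reusing the prepareMasterDBConnection variable) could look like this:

using (var cmd = prepareMasterDBConnection.CreateCommand())
{
    // Kick out other connections but keep the database online.
    cmd.CommandText = "ALTER DATABASE mydb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;";
    cmd.ExecuteNonQuery();

    // sp_detach_db runs UPDATE STATISTICS against the database, which is why
    // the database must still be online (not OFFLINE) at this point.
    cmd.CommandText = "EXEC dbo.sp_detach_db @dbname = N'mydb', @keepfulltextindexfile = N'false';";
    cmd.ExecuteNonQuery();
}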
Detach needs to do some stuff before it detaches, as the sp_detach_db documentation says (my bold):
@skipchecks = 'skipchecks'
Specifies whether to skip or run UPDATE STATISTICS. skipchecks is an nvarchar(10) value, with a default value of NULL. To skip UPDATE STATISTICS, specify true. To explicitly run UPDATE STATISTICS, specify false.
By default, UPDATE STATISTICS is performed to update information about the data in the tables and indexes in the SQL Server 2005 Database Engine. Performing UPDATE STATISTICS is useful for databases that are to be moved to read-only media.
When it's offline, you can't do that...