What is the best approach to auditing a database in order to:
Make a copy of deleted and updated records.
Save the date and the user ID of whoever performed the DML (the ID is stored in an ASP.NET session).
My manager told me to add three extra columns to each table: one for the ID of whoever updated the record, one for whoever deleted it, and a third holding a boolean flag (0 for deleted, 1 for active, i.e. not deleted yet). But I think this is a workaround, and I'm pretty sure there is a better way to do it.
I was thinking of creating history tables and writing an AFTER DELETE trigger that saves a copy of the affected rows. Is that the way to go, or is there a more straightforward approach?
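For concreteness, a minimal sketch of that trigger idea, with made-up table and column names (SESSION_CONTEXT requires SQL Server 2016+, and the application sets it per connection with sp_set_session_context):

```sql
-- Hypothetical history table for deleted rows.
CREATE TABLE dbo.Customer_History
(
    CustomerId INT           NOT NULL,
    Name       NVARCHAR(100) NULL,
    DeletedBy  NVARCHAR(50)  NULL,  -- user ID supplied by the application
    DeletedAt  DATETIME2     NOT NULL DEFAULT SYSUTCDATETIME()
);
GO
CREATE TRIGGER trg_Customer_Delete
ON dbo.Customer
AFTER DELETE
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.Customer_History (CustomerId, Name, DeletedBy)
    SELECT d.CustomerId, d.Name,
           -- the app runs: EXEC sp_set_session_context N'user_id', @id
           -- after opening the connection, so the trigger can see who acted
           CAST(SESSION_CONTEXT(N'user_id') AS NVARCHAR(50))
    FROM deleted d;
END;
```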
From SQL Server 2016 onwards, you can do this using temporal tables:
A system-versioned temporal table is a type of user table designed to
keep a full history of data changes, allowing easy point-in-time
analysis. This type of temporal table is referred to as a
system-versioned temporal table because the period of validity for
each row is managed by the system (that is, the database engine).
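A minimal sketch of what that looks like, with illustrative names (the ModifiedBy column is an assumption: system-versioning records when a row changed, not who changed it, so the application still has to supply the user ID):

```sql
-- System-versioned temporal table (SQL Server 2016+).
CREATE TABLE dbo.Customer
(
    CustomerId INT IDENTITY(1,1) PRIMARY KEY,
    Name       NVARCHAR(100) NOT NULL,
    ModifiedBy NVARCHAR(50)  NOT NULL,  -- e.g. the user ID from the ASP.NET session
    ValidFrom  DATETIME2 GENERATED ALWAYS AS ROW START NOT NULL,
    ValidTo    DATETIME2 GENERATED ALWAYS AS ROW END   NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.CustomerHistory));

-- Point-in-time query: the data as it looked 24 hours ago.
DECLARE @yesterday DATETIME2 = DATEADD(DAY, -1, SYSUTCDATETIME());
SELECT * FROM dbo.Customer FOR SYSTEM_TIME AS OF @yesterday;
```

Deletes and the old versions of updated rows land in dbo.CustomerHistory automatically, which covers the "copy of deleted and updated records" requirement.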
If what you are really trying to do is to record who changed what, a better approach is to use roles and groups to prevent users altering anything they shouldn't.
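A hedged sketch of that permissions approach (role, table, and user names are invented):

```sql
-- A role that may read the table but not modify it.
CREATE ROLE ReadOnlyBilling;
GRANT SELECT ON dbo.Customer TO ReadOnlyBilling;
DENY UPDATE, DELETE ON dbo.Customer TO ReadOnlyBilling;

-- Put the relevant users (or a Windows group) into the role.
ALTER ROLE ReadOnlyBilling ADD MEMBER SomeUser;
```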
Evening all,
Background: I have many xml data files which I need to import into an SQL database via my C# WPF-based Windows application. While some data files are a simple and straightforward INSERT, many require validation and verification checks, and require GUIDs from existing records (from within the db) before the INSERT takes place, in order to maintain certain relationships.
So I've broken the process into three stages:
1. Validate the records. Many checks exist, e.g. the 50,000 accounts in the xml file must each have a matching reference to an associated record already in the database. If there is no corresponding account, abandon the entire import; likewise, only the 'Current' record can be updated, so if the file points to a 'historic' record the import needs to crash and burn.
2. UPDATE the associated database records, e.g. set them from 'Current' to 'Historic', as each record is to be superseded by what comes next...
3. INSERT the records across multiple tables, e.g. the 50,000 accounts.
So, in summary, I need to validate the records with a few checks before any changes are made to the database, at which point I will update various tables for existing records before inserting the 'latest' records.
My first attempt was to load the xml file into an XmlDocument or XDocument instance, iterate over every account in the file, and perform an SQL command for each one: verify it exists, verify it's a current account, change the record, then insert. Rinse and repeat for thousands of records - not ideal, to say the least.
So my second attempt is to load the xml file into a DataTable, export the corresponding accounts from the database into another DataTable, and perform a nested-loop validation, e.g. does DT1.AccountID exist in DT2.AccountID, move DT2.GUID to DT1.GUID, etc. I appreciate this could also be a slow process. That said, I then have the luxury of performing both the UPDATE and INSERT stored procedures with a table-valued parameter (TVP), making use of the DataTable contents.
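For what it's worth, a minimal T-SQL sketch of that TVP route, with assumed names throughout (dbo.Account, AccountRef, Status; none of these come from the question):

```sql
-- Table type matching the shape of the imported rows.
CREATE TYPE dbo.AccountImport AS TABLE
(
    AccountRef NVARCHAR(50)  NOT NULL,
    Amount     DECIMAL(18,2) NOT NULL
);
GO
CREATE PROCEDURE dbo.ValidateAndImportAccounts
    @Accounts dbo.AccountImport READONLY
AS
BEGIN
    SET NOCOUNT ON;

    -- Validation: abandon the whole import if any incoming account
    -- has no matching 'Current' record already in the database.
    IF EXISTS (SELECT 1
               FROM @Accounts a
               LEFT JOIN dbo.Account db
                 ON db.AccountRef = a.AccountRef AND db.Status = 'Current'
               WHERE db.AccountRef IS NULL)
    BEGIN
        THROW 50001, 'One or more accounts have no matching Current record.', 1;
    END;

    -- Supersede the existing rows, then insert the new ones.
    UPDATE db SET db.Status = 'Historic'
    FROM dbo.Account db
    JOIN @Accounts a ON a.AccountRef = db.AccountRef;

    INSERT INTO dbo.Account (AccountRef, Amount, Status)
    SELECT AccountRef, Amount, 'Current' FROM @Accounts;
END;
```

From C#, the DataTable is passed as a SqlParameter with SqlDbType.Structured and TypeName = "dbo.AccountImport", so the whole file goes to the server in one round trip.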
I appreciate many will suggest letting SQL do all of the work, but I'm lacking in that skillset unfortunately (happy to learn if that's the general consensus); I would much rather do the work in C# code if at all possible.
Any views on this are greatly appreciated. Many of the questions I've found are about bulk INSERT, not so much about validating existing records, followed by updating records, followed by inserting records. I suppose my question is about the first part, the validation. Does extracting the data from the db into a DataTable to work on seem wrong, old fashioned, pointless?
I'm sure I've missed out some vital piece of information, so apologies if anything is unclear.
Cheers
I'm creating a WinForms app using EF6, so I've created a new table and a whole bunch of controls, mostly textboxes and comboboxes.
I'm very new to all this, but I know that I need to use a using statement for the "entities" context.
The Id field is the PK and auto-increment (IsIdentity = True), so if I don't write to the Id field, it should get an automatically assigned ID when the new entity is added to the context, right?
How would I validate this, or manually attempt to write my own ID?
Reason for the question?
I worked on something separate and simple, adding/removing records. Somehow the IDs have jumped up to 4000, and I've only entered and removed about 5-6 records, so how has it got to 4000 already? (No loop was used in my code either.) The setup was the same: IsIdentity, auto-increment 1.
Yet somehow records got ID 4001...
So this has led me to believe I need to validate and restrict that data further, but because I'm an amateur I've no clue where to start; any advice would be greatly appreciated.
C#/WinForms/EF6
This comes from SQL Server itself. When you create an identity column, SQL Server caches (pre-allocates) a block of identity values of the specified cache size. When SQL Server restarts, the previously reserved values are lost, the cache is rebuilt, and a fresh block of identities is reserved; this is what causes the gaps in your identity values. You can turn the cache off, but with a performance trade-off.
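A hedged sketch of both knobs (IDENTITY_CACHE requires SQL Server 2017+; the table name is illustrative):

```sql
-- Disable identity caching for the whole database
-- (fewer restart gaps, at some insert-performance cost).
ALTER DATABASE SCOPED CONFIGURATION SET IDENTITY_CACHE = OFF;

-- Inspect the current identity value without changing it...
DBCC CHECKIDENT ('dbo.MyTable', NORESEED);

-- ...or reseed it explicitly (use with care on live tables).
DBCC CHECKIDENT ('dbo.MyTable', RESEED, 10);
```

Note that gaps are generally harmless: an identity guarantees uniqueness, not contiguity.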
For further reading check this
There is an MSSQL table on an external customer's network. The aim is to create and maintain a copy of that table on our local server. The external table's data can, of course, change every hour, and somehow I have to check for changes and reflect them in the local table whenever rows are added, deleted or updated. Is there an efficient way to do this? Additionally, I know this table will have thousands of records. My first thought was a Windows service application, but I have no idea what approach to take; I don't think a DataTable/DataSet is suitable for that many records, as I remember hitting an out-of-memory exception in the past. Any ideas?
The way I would go about it is to create triggers on the existing tables that, upon insert, update and delete, insert into a new sync table (or one sync table per existing table) marking the change as pending synchronization. Your C# code would read from this table on a schedule, apply the changes to the local DB, and delete the rows from the 'pending' table.
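A minimal sketch of one such trigger, with invented names (dbo.Orders, OrderId):

```sql
-- Per-table queue of pending changes.
CREATE TABLE dbo.Orders_Sync
(
    SyncId    BIGINT IDENTITY(1,1) PRIMARY KEY,
    OrderId   INT       NOT NULL,
    Operation CHAR(1)   NOT NULL,  -- 'I', 'U' or 'D'
    ChangedAt DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
);
GO
CREATE TRIGGER trg_Orders_Sync
ON dbo.Orders
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- Rows in `inserted`: inserts (no match in `deleted`) or updates.
    INSERT INTO dbo.Orders_Sync (OrderId, Operation)
    SELECT i.OrderId,
           CASE WHEN d.OrderId IS NULL THEN 'I' ELSE 'U' END
    FROM inserted i
    LEFT JOIN deleted d ON d.OrderId = i.OrderId;

    -- Rows only in `deleted`: deletes.
    INSERT INTO dbo.Orders_Sync (OrderId, Operation)
    SELECT d.OrderId, 'D'
    FROM deleted d
    LEFT JOIN inserted i ON i.OrderId = d.OrderId
    WHERE i.OrderId IS NULL;
END;
```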
For example, this is how Azure SQL Data Sync works: it creates a tracking table per existing table in the source database and then checks all of those tables. Depending on how many tables you have, their structure, etc., I think you could instead write something like JSON into just one table, and it would be easier to check one table than many (obviously this depends on how many actual tables we're talking about).
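A hedged variant of that single-table idea, serializing the changed rows with FOR JSON (SQL Server 2016+; all names invented):

```sql
-- One queue shared by every audited table.
CREATE TABLE dbo.SyncQueue
(
    SyncId    BIGINT IDENTITY(1,1) PRIMARY KEY,
    TableName SYSNAME       NOT NULL,
    Operation CHAR(1)       NOT NULL,  -- 'I', 'U' or 'D'
    Payload   NVARCHAR(MAX) NULL,      -- affected rows as a JSON array
    ChangedAt DATETIME2     NOT NULL DEFAULT SYSUTCDATETIME()
);
GO
CREATE TRIGGER trg_Orders_SyncJson
ON dbo.Orders
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @op CHAR(1) =
        CASE WHEN EXISTS (SELECT 1 FROM inserted)
              AND EXISTS (SELECT 1 FROM deleted) THEN 'U'
             WHEN EXISTS (SELECT 1 FROM inserted) THEN 'I'
             ELSE 'D' END;

    INSERT INTO dbo.SyncQueue (TableName, Operation, Payload)
    SELECT N'dbo.Orders', @op,
           CASE WHEN @op = 'D'
                THEN (SELECT * FROM deleted FOR JSON PATH)
                ELSE (SELECT * FROM inserted FOR JSON PATH) END;
END;
```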
I currently capture audit history of billing-change table data in a SQL Server database, on row updates/inserts. This question/answer has been beaten to death (but the answers have also evolved, both in Microsoft SQL Server features and in user-defined solutions).
However, in my scenario I am finding frequent schema changes, so I am manually dropping and recreating the shadow audit tables, which is getting tedious across many tables.
Is there a turn-key feature or plugin for SQL Server that does both with ease, from a dev/non-DBA perspective, i.e. captures both the metadata changes and the full row updates/transactions in the same place, one-stop shopping?
There seem to be many ways, but I want a single solution. The way I do this now: as the schema grows I add a new column to the shadow table, and I look in the audit table for when the schema was changed. Options I have considered:
CDC (Change Data Capture), but it drops the tables on cleanup - see the sketch after this list
Audit triggers
Audit table feature
AutoAudit: love the options, but it's not maintained and performance seems slow
EF audit/change tracking
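For reference on the CDC option above, the documented system procedures look like this (the table name dbo.Billing is invented):

```sql
-- Enable CDC for the database, then for one table.
EXEC sys.sp_cdc_enable_db;
EXEC sys.sp_cdc_enable_table
     @source_schema = N'dbo',
     @source_name   = N'Billing',
     @role_name     = NULL;

-- Changes become readable through a generated function, e.g.:
-- SELECT * FROM cdc.fn_cdc_get_all_changes_dbo_Billing(@from_lsn, @to_lsn, N'all');

-- Disable when no longer needed (the default capture instance is schema_table).
EXEC sys.sp_cdc_disable_table
     @source_schema    = N'dbo',
     @source_name      = N'Billing',
     @capture_instance = N'dbo_Billing';
```

Note the caveat above: CDC's cleanup job prunes the change tables after the retention period, so it is not a permanent audit store on its own.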
We have a production database using the Change Data Capture (CDC) feature to maintain audit data for a few tables. But because of the performance impact, and the need to make database structure changes (like adding indexes) which we can't do due to CDC, we now want to disable CDC.
The requirement is to capture all the Insert, Delete and Update actions performed from the web application, for a set of tables, in a single audit table.
For example, say I have to maintain audit information for TableA in TableB for every Add, Update and Remove done to TableA through the web application [in C#].
I'm currently updating each stored procedure that can cause any of these three actions, for each table in the set, but that is too error-prone and time-consuming, as I have a huge list of tables.
Please suggest a better way to achieve this that is more efficient in terms of both time and performance.
Quick answer: just make sure that every action that changes the DATABASE is recorded!
In our project, we created a separate table with the following fields:
Date_And_Time, Actions, UserID, Part_of_Program
Whenever a user executes an insert, update or delete command, logs in or out of the system, or the system runs an automatic operation inside our database, the program automatically inserts a record here telling us what action was taken, who did it, and what part of the program was affected.
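A hedged sketch of that kind of action log, with one generic trigger per audited table (the schema follows the field list above; TableA stands in for any audited table, and the trigger uses the SQL login name, though an application-supplied user ID via SESSION_CONTEXT would also work):

```sql
CREATE TABLE dbo.AuditLog
(
    AuditId         BIGINT IDENTITY(1,1) PRIMARY KEY,
    Date_And_Time   DATETIME2     NOT NULL DEFAULT SYSUTCDATETIME(),
    Actions         NVARCHAR(20)  NOT NULL,  -- 'INSERT', 'UPDATE', 'DELETE', 'LOGIN', ...
    UserID          NVARCHAR(128) NOT NULL,
    Part_of_Program NVARCHAR(100) NOT NULL
);
GO
CREATE TRIGGER trg_TableA_Audit
ON dbo.TableA
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.AuditLog (Actions, UserID, Part_of_Program)
    SELECT CASE WHEN EXISTS (SELECT 1 FROM inserted)
                 AND EXISTS (SELECT 1 FROM deleted) THEN 'UPDATE'
                WHEN EXISTS (SELECT 1 FROM inserted) THEN 'INSERT'
                ELSE 'DELETE' END,
           SUSER_SNAME(),  -- the connected login; swap in SESSION_CONTEXT if the app sets it
           'TableA';
END;
```

Because the trigger is identical apart from the table name, it can be generated for the whole list of tables from sys.tables, which avoids touching every stored procedure.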
...Or you could write code that periodically checks the current size of your tables and, if the size changes, records it in the audit table: if the size increased, data was inserted; otherwise data was deleted.
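If you go that size-polling route, row counts can be read cheaply from the catalog views instead of loading the data; a sketch:

```sql
-- Approximate row counts per table, without scanning any data.
SELECT s.name      AS schema_name,
       t.name      AS table_name,
       SUM(p.rows) AS row_count
FROM sys.tables t
JOIN sys.schemas    s ON s.schema_id = t.schema_id
JOIN sys.partitions p ON p.object_id = t.object_id
WHERE p.index_id IN (0, 1)  -- heap or clustered index only
GROUP BY s.name, t.name
ORDER BY s.name, t.name;
```

Bear in mind the limitation of this approach: a pure size check cannot see updates, nor an insert and a delete that cancel out between two polls.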