We are currently using Azure Pipelines to run automated UI and API tests on different test environments. Our tests are written in C# using Visual Studio, SpecFlow and MSTest.
It's working well, but now I am looking to improve the test data setup before our tests are run.
We use SQL Server databases. Currently we use an 'Execute single or multiple SQL scripts' Azure Pipelines task to delete data in tables so we can reuse test data in our scripts. We currently only use DELETE FROM SQL commands. For example:
DELETE FROM [dbo].[table1]
DELETE FROM [dbo].[table2]
etc.
Obviously this is not a good solution, as it requires maintenance each time there are database changes or new tables and columns are added, and we run into foreign key constraint errors.
The application we are running tests on consists of 3 databases:
DB 1 = client db
DB 2 = Broker db
DB 3 = internal db
I was thinking a backup and restore of the test databases before we run tests would be a better solution. However, I am unsure what the recommended best-practice solution is.
I see Azure Pipelines has tasks for this, but I was thinking maybe a SQL script or stored procedure would be better?
Also, can anyone point me in the direction of what such a SQL script would look like?
I was thinking a backup and restore of the test database before we run tests would be a better solution
You can use the script below to back up an Azure SQL database in a PowerShell task.
$exportRequest = New-AzureRmSqlDatabaseExport -ResourceGroupName $ResourceGroupName -ServerName $ServerName `
    -DatabaseName $DatabaseName -StorageKeytype $StorageKeytype -StorageKey $StorageKey -StorageUri $BacpacUri `
    -AdministratorLogin $creds.UserName -AdministratorLoginPassword $creds.Password

# Poll the export operation until it completes
$exportStatus = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink
[Console]::Write("Exporting")
while ($exportStatus.Status -eq "InProgress")
{
    $exportStatus = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink
    [Console]::Write(".")
    Start-Sleep -s 10
}
[Console]::WriteLine("")
$exportStatus
For details, please refer to this ticket.
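As an aside: if the goal is only to clear the tables without maintaining a hand-written, FK-ordered list of DELETE statements, a dynamic script can sidestep the foreign key problem. This is just a sketch; note that sp_MSforeachtable is an undocumented procedure that exists on full SQL Server but is not available on Azure SQL Database:

```sql
-- Disable all foreign key constraints, empty every table, then
-- re-enable and re-validate the constraints.
EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL';
EXEC sp_MSforeachtable 'DELETE FROM ?';
EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL';
```

Because the table list is discovered at run time, this needs no maintenance when new tables or columns are added.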
I'm working with Dynamics365 CE in the cloud. I'm trying to run some rather involved queries that I've built up as SQL scripts (using the wonderful "SQL-4-CDS" plugin for the XrmToolBox).
Now I know I can connect to the Dataverse data store through the TDS endpoint (if enabled - it is in my case), and from SSMS, it works just fine:
Server Name = myorg.crm4.dynamics.com,5558
Authentication = Azure Active Directory - Password
User Name = my company e-mail
I can connect to Dataverse, and run my queries - all is great.
Now I'd like to do the same from C# code (running on .NET 6) that I'm writing, that should end up being an Azure Function in the end - so it's a "server-to-server", behind-the-scenes, no interactive login context kind of scenario.
I can connect to Dataverse via the TDS endpoint using this connection string - as long as I'm running the app interactively - as me, in my user context:
Server=myorg.crm4.dynamics.com,5558;Authentication=Active Directory Password;Database=my_dbname;User Id=my_email;Password=my_pwd;
However - this won't work with a server-to-server "daemon"-style setup.
Since I'm using .NET 6 (for the Azure Function), and since I want to run some custom SQL statements, I cannot use the "CRM XRM Client" tooling (with the IOrganizationService classes) - I need to use straight ADO.NET. Any idea how I could define an ADO.NET-compatible connection string that would use a Client ID and Client Secret (which I both have at my disposal)?
I've tried a great many values for the Authentication=...... setting - but none have worked so far. Any place I can find a complete list of the supported values for this connection string parameter?
Thanks for any help or pointers!
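One avenue worth exploring (I haven't verified it against the TDS endpoint specifically, so treat this as a sketch): instead of encoding credentials in the connection string, acquire an Azure AD token for your Client ID/Secret with MSAL and attach it via SqlConnection.AccessToken. The org URL, tenant ID and database name below are placeholders:

```csharp
using Microsoft.Data.SqlClient;   // NuGet: Microsoft.Data.SqlClient
using Microsoft.Identity.Client;  // NuGet: Microsoft.Identity.Client

var app = ConfidentialClientApplicationBuilder
    .Create("<client-id>")
    .WithClientSecret("<client-secret>")
    .WithAuthority("https://login.microsoftonline.com/<tenant-id>")
    .Build();

// Request a token for the Dataverse org itself (".default" scope).
var token = await app
    .AcquireTokenForClient(new[] { "https://myorg.crm4.dynamics.com/.default" })
    .ExecuteAsync();

// No Authentication= keyword in the connection string; the token carries it.
using var conn = new SqlConnection(
    "Server=myorg.crm4.dynamics.com,5558;Database=my_dbname;Encrypt=True;");
conn.AccessToken = token.AccessToken;
conn.Open();
```

Note that the app registration typically also has to be added as an application user in the Dataverse environment for the server-to-server scenario to be authorized.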
I'm writing a WPF application.
Trying to use the normal method of getting a connection returns an error similar to: "The 'Microsoft.ACE.OLEDB.12.0' provider is not registered on the local machine."
ACE.OLEDB has never been installed on this machine so this error makes sense.
I'm trying to create this application in a way that our users won't need to contact IT to have it installed. Getting IT involved is a no-go; if that's required, the project will be abandoned.
Another team has an Access database (accdb) from which I want my application to extract information (read only, no insert or update). I talked to the team and they won't convert this database back to an earlier version (mdb).
After my research I assume that installing ACE.OLEDB without admin privileges is impossible. Because of this, and my application's requirement of not needing admin privileges, I need to start looking for "mutant"/dirty solutions that don't involve ACE.OLEDB.
I tried using PowerShell but I ran into the same problem as with C# (it requires IT to install ACE.OLEDB).
I have two potential solutions. One: write a VBA script that opens the database and dumps a query result into a file; my C# application would call this VBA script and then parse the created file.
The second option is to start a new Access process using Process.Start(fullFilePath) and somehow pass it a query to execute and somehow get the results back to the calling application (either via a method return or first to a file).
How would you get the data out?
Is there a way for C# to duplicate the DB file and convert it from (accdb -> mdb)?
This is the second question I ask that is very similar.
C# Connecting to Access DB with no install
The difference between the two (to prevent this is a duplicate question) is that in the previous question I was looking for ways to install ACE.OLEDB without admin privileges while here I'm just looking for any other work around.
Found a workaround. It uses Microsoft.Office.Interop.Access, available on NuGet.
var accApp = new Microsoft.Office.Interop.Access.Application();
accApp.OpenCurrentDatabase(tests.DatabasePath); // full path to the .accdb file

Microsoft.Office.Interop.Access.Dao.Database cdb = accApp.CurrentDb();
Microsoft.Office.Interop.Access.Dao.Recordset rst =
    cdb.OpenRecordset(
        "SELECT * FROM Users",
        Microsoft.Office.Interop.Access.Dao.RecordsetTypeEnum.dbOpenSnapshot);

while (!rst.EOF)
{
    Console.WriteLine(rst.Fields["username"].Value);
    rst.MoveNext();
}
rst.Close();

accApp.CloseCurrentDatabase();
accApp.Quit();
All,
I am trying to solve a seemingly very common problem of updating our development database and our production database simultaneously, whenever a change is made to our development database.
Something like this:
PM> Add-Migration AddMyNewColumnToMyTable
PM> Update-Database    # against dev
PM> Update-Database    # against prod
I've seen solutions while researching this but nothing yet that is as simple and straight forward as running SQLCompare on the dev and production databases and then exporting and running a SQL script on the production database.
How are you all doing this?
Thanks
Well, as long as both of your databases follow the same migration stream (meaning both are created from the same migrations and updated in the same chronological order), you should not have any issues using the same migrations.
What you can do is create two connection strings in your web.config (or app.config) and when you are updating database use the following syntax:
update-database -connectionStringName YourProdDbContextConnectionStringName
So assume you have 5 migrations total: Migration1, Migration2, Migration3, Migration4 and Migration5.
If your production is updated up to Migration3, and you've made Migration4 and Migration5 on your dev database, issuing the update command with your production connection string will apply Migration4 and Migration5 in one go, without any problems.
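For reference, the two connection strings in web.config might look like this (the names and servers are illustrative; the value passed to -ConnectionStringName must match the name attribute):

```xml
<connectionStrings>
  <add name="DevDbContext"
       connectionString="Server=DEVSQL01;Database=MyAppDb;Integrated Security=True"
       providerName="System.Data.SqlClient" />
  <add name="ProdDbContext"
       connectionString="Server=PRODSQL01;Database=MyAppDb;Integrated Security=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```

The production update then becomes: update-database -ConnectionStringName ProdDbContext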
Well, my Windows service has to send out automated emails when the SQL database has been updated.
How exactly would I go about doing this? Any code or tutorials would really help.
Solution 1 - Using sp_send_dbmail
Here is an example of creating a trigger that sends an email when an INSERT/UPDATE/DELETE event occurs on a specific table:
USE AdventureWorks2008R2;
GO
IF OBJECT_ID ('Sales.reminder2','TR') IS NOT NULL
    DROP TRIGGER Sales.reminder2;
GO
CREATE TRIGGER reminder2
ON Sales.Customer
AFTER INSERT, UPDATE, DELETE
AS
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'AdventureWorks2008R2 Administrator',
    @recipients = 'danw@Adventure-Works.com',
    @body = 'Don''t forget to print a report for the sales force.',
    @subject = 'Reminder';
GO
Source: http://msdn.microsoft.com/en-us/library/ms189799.aspx
sp_send_dbmail was introduced in SQL Server 2005. More info: http://msdn.microsoft.com/en-us/library/ms190307.aspx
Note:
Before use, Database Mail must be enabled using the Database Mail
Configuration Wizard, the SQL Server Surface Area Configuration tool,
or sp_configure.
Solution 2 - Using xp_cmdshell
If you can't set up Database Mail, you have another option: xp_cmdshell.
With it you can run command-line commands from within SQL statements, e.g. a small email-sending tool.
This one is a small example how to send emails using System.Net.Mail in a C# application: http://weblogs.asp.net/scottgu/archive/2005/12/10/432854.aspx
How to use xp_cmdshell: http://msdn.microsoft.com/en-us/library/aa260689%28v=sql.80%29.aspx
So you create a small C# console app that sends the email, then you execute it with xp_cmdshell right from your SQL statement.
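As a sketch of that idea (the mailer path and arguments are hypothetical; also note xp_cmdshell is disabled by default and enabling it requires sysadmin rights and has security implications):

```sql
-- Enable xp_cmdshell (off by default for security reasons)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;

-- Call the console app from T-SQL, e.g. inside a trigger
EXEC xp_cmdshell 'C:\Tools\SendMail.exe /to someone@example.com /subject "Table updated"';
```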
Solution 3 - Using a Windows service (as the asker wants)
A Windows service can't determine by itself whether a row gets updated in an MSSQL database; you need to log the changes. To do so you can create a trigger on a specific table that records changes. By recording changes I mean inserting a new row into a log table within the trigger, like this:
USE MyDatabase;
GO
IF OBJECT_ID ('dbo.TRRowUpdated','TR') IS NOT NULL
    DROP TRIGGER dbo.TRRowUpdated;
GO
CREATE TRIGGER TRRowUpdated
ON dbo.Products
AFTER INSERT, UPDATE, DELETE
AS
INSERT INTO [log] ([Message], [Date])
VALUES ('Products table got modified', GETDATE());
GO
Creating a Windows Service Project is as easy as creating a Console Application using Visual Studio.
Your service then reads the 'log' table, say every minute, and sends out emails if there are any rows in it (and then deletes them, of course).
It is possible to determine what kind of change happened: INSERT, UPDATE or DELETE. See the comments on this site for more details: http://msdn.microsoft.com/en-us/library/ms189799.aspx
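The polling part of the service could be sketched roughly like this (server, table and email addresses are assumptions; in a real Windows service this loop would live in a timer callback rather than a bare while loop):

```csharp
using System;
using System.Data.SqlClient;
using System.Net.Mail;
using System.Threading;

// Poll the log table once a minute; mail and purge any new entries.
while (true)
{
    using (var conn = new SqlConnection(
        "Server=.;Database=MyDatabase;Integrated Security=True"))
    {
        conn.Open();
        var read = new SqlCommand("SELECT [Message], [Date] FROM [log]", conn);
        bool anyRows = false;
        using (var reader = read.ExecuteReader())
        {
            while (reader.Read())
            {
                anyRows = true;
                using (var smtp = new SmtpClient("smtp.example.com"))
                {
                    smtp.Send("service@example.com", "admin@example.com",
                              "Database updated",
                              $"{reader["Message"]} at {reader["Date"]}");
                }
            }
        }
        // Delete only after the rows have been reported.
        if (anyRows)
            new SqlCommand("DELETE FROM [log]", conn).ExecuteNonQuery();
    }
    Thread.Sleep(TimeSpan.FromMinutes(1));
}
```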
How do I synchronise a database on one server to the other? I need to do it via C#, not scripts. I was planning to use ADO.NET to retrieve the schema and dataset from one database, but how do I sync it in code?
Thanks
There are various options available to you:
1. SSIS to export/import data between System1 & System2
2. Mirroring to copy data between System1 & System2
3. Replication to keep System2 in sync with System1
4. Scripts for database backup/restore between servers, using C# as the IO glue that runs the backup on System1, copies the file to System2 and calls the restore script on System2
The easiest option to implement is #4, especially if changes occur to System1 that need to be replicated to System2 (tables, indexes, views, procedures, functions, triggers, etc.).
See Database Mirroring in SQL Server 2005 at SQL Server Performance or Database Mirroring in SQL Server 2005 at Microsoft for mirroring information.
If you need some more enlightenment on #4, just reply. Oh, it would help to specify what version of SQL Server you are using; this information assumes >= 2005 (Yukon, Katmai and possibly Kilimanjaro).
Update: I would steer clear of trying to implement your own runtime for this, as there are so many variations that just copying between 2 servers requires the ability to diff the objects. Even using the SMO .NET objects this would be an arduous task requiring a lengthy development schedule.
Update 1: The poster is interested in the SSIS version so we will use those instructions.
Start SQL Server Management Studio
Navigate to the chosen database
Right click and Tasks->Export Data
Click Next
Select required source
Click Next
Select destination settings
Click Next
Select either 'Copy data from one or more tables or views' or 'Write a query' (we will assume tables)
Click Next
Select the required tables
Click Edit Mappings
Ensure 'Enable identity insert' is selected if required
Ensure 'Delete/Append rows' is selected as required
Ensure 'Drop and re-create destination table' is selected as required
Click Ok
Click Next
Now save this to an SSIS package either into SQL Server or on the filesystem
You can now use DTSExec to execute this package via a scheduler or use the .NET wrapper and call from your own C# runtime
C# Code example from Microsoft Website
using System;
using Microsoft.SqlServer.Dts.Runtime;

namespace RunFromClientAppCS
{
    class Program
    {
        static void Main(string[] args)
        {
            string pkgLocation;
            Package pkg;
            Application app;
            DTSExecResult pkgResults;

            pkgLocation =
                @"C:\Program Files\Microsoft SQL Server\100\Samples\Integration Services" +
                @"\Package Samples\CalculatedColumns Sample\CalculatedColumns\CalculatedColumns.dtsx";

            app = new Application();
            pkg = app.LoadPackage(pkgLocation, null);
            pkgResults = pkg.Execute();

            Console.WriteLine(pkgResults.ToString());
            Console.ReadKey();
        }
    }
}
Resources
dtexec Books Online (BOL)
SQL Server Import Export Wizard BOL
Visual Walkthru at http://www.accelebrate.com/sql_training/ssis_2008_tutorial.htm
Youtube Videos
Loading & Running SSIS Packages Programmatically
Why would you build this yourself when there are DB tools that do it for you?
Look into SSIS or, as it was previously known, DTS.