I have a simple SSIS package that has only 2 steps. Step 1 is a script task that returns the creation date of the most recently created file in a specific folder. Step 2 saves the value from step 1 into the database. Here is the complete C# code for step 1.
// Read the target folder and database name from package variables.
string vDir = Dts.Variables["UnitDirectory"].Value.ToString();
string vDBName = Dts.Variables["DatabaseName"].Value.ToString();

// Most recently written file whose name starts with the database name.
var directory = new DirectoryInfo(vDir);
var myFile = directory.GetFiles(vDBName + "*").OrderByDescending(f => f.LastWriteTime).First();

// Take the creation time from the FileInfo itself rather than concatenating paths.
DateTime creation = myFile.CreationTime;
Dts.Variables["CreatedDate"].Value = creation;
Dts.TaskResult = (int)ScriptResults.Success;
In this case, I pass the UnitDirectory variable a network share path:
\\192.168.0.2\Database Backup\
The problem is that I can execute this package both from the Integration Services catalog in SQL Server Management Studio and from Visual Studio. But when I execute it as a SQL Agent job, it fails in step 1 with an unclear message:
Script Task:Error: Exception has been thrown by the target of an invocation.
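For what it's worth, that text is the generic wrapper SSIS reports whenever the script itself throws; the underlying exception (often an access-denied error on the share) is hidden inside it. A sketch of the same task body wrapped in try/catch so the real error reaches the job history, using the standard Dts.Events.FireError call:

try
{
    string vDir = Dts.Variables["UnitDirectory"].Value.ToString();
    string vDBName = Dts.Variables["DatabaseName"].Value.ToString();

    var directory = new DirectoryInfo(vDir);
    var myFile = directory.GetFiles(vDBName + "*")
                          .OrderByDescending(f => f.LastWriteTime)
                          .First();

    Dts.Variables["CreatedDate"].Value = myFile.CreationTime;
    Dts.TaskResult = (int)ScriptResults.Success;
}
catch (Exception ex)
{
    // Report the underlying exception (e.g. access denied on the share)
    // instead of the generic "target of an invocation" wrapper.
    Dts.Events.FireError(0, "Script Task", ex.ToString(), string.Empty, 0);
    Dts.TaskResult = (int)ScriptResults.Failure;
}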
I run the SQL Agent job with my domain administrator credentials, which have access to the network share. This problem started on a newly installed SQL Server 2014; the package previously ran normally on my old SQL Server.
Can you give me a solution for this, based on your experience? Thanks in advance.
Related
I have a SQL Agent Job with steps that resemble the following:
Exec SSIS package A
Exec SSIS package B
Exec SSIS package C
Exec stored proc
I have it set up so that if any of the SSIS packages fails, the job moves on to the next step. The package failure notification comes from SSIS. I have 2 script tasks. One sits in an event handler to capture and append error codes/messages to a variable:
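A minimal sketch of what such an OnError handler can look like, assuming a hypothetical String variable User::ErrorMessages alongside the standard System::ErrorCode and System::ErrorDescription system variables (the system variables go in the task's ReadOnlyVariables, the user variable in ReadWriteVariables):

public void Main()
{
    // Append this error's code and description to the running list.
    string existing = Dts.Variables["User::ErrorMessages"].Value.ToString();
    Dts.Variables["User::ErrorMessages"].Value = existing
        + string.Format("Error {0}: {1}{2}",
            Dts.Variables["System::ErrorCode"].Value,
            Dts.Variables["System::ErrorDescription"].Value,
            Environment.NewLine);

    Dts.TaskResult = (int)ScriptResults.Success;
}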
The second script task is in the control flow; it grabs the appended error message variables and the other email-forming variables and sends them to an Outlook email account:
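And a minimal sketch of the sending side, with the SMTP host, the addresses, and the variable names as hypothetical placeholders:

public void Main()
{
    // Pull the accumulated error text and subject from package variables.
    string body = Dts.Variables["User::ErrorMessages"].Value.ToString();
    string subject = Dts.Variables["User::EmailSubject"].Value.ToString();

    // Send through the mail server (placeholder host and addresses).
    var message = new System.Net.Mail.MailMessage(
        "ssis@example.com", "team@example.com", subject, body);
    var client = new System.Net.Mail.SmtpClient("smtp.example.com");
    client.Send(message);

    Dts.TaskResult = (int)ScriptResults.Success;
}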
This works when I run the individual package on its own with my credentials. When I try to run the package through the SQL Agent job with the default credentials (or even my own), I get the following errors:
The desired result is that the SQL Agent job runs to completion regardless of whether any of steps 1-3 fails, and that when any of steps 1-3 fails, SSIS sends the failure email.
EDIT: SQL Agent Job
We are using the legacy Package Deployment Model. The Package lives on the file system.
One of our jobs, which runs an SSIS package that deletes files with a script task (C# File.Delete), behaves in a strange way, as described below. Could anyone help us understand the reason for it?
Below are the basic conditions:
1. SQL Server Agent Service's Logon Account: DomainA\AAA
2. Owner of the Job: DomainA\AAA
3. DomainA\AAA is a member of local "Administrators" group
What we found strange is:
The job fails with the message "Access to the path E:\XXXX\pp.csv is denied" when Full Control is granted only to the "Administrators" Windows group, and it succeeds when Full Control is granted directly to the "DomainA\AAA" Windows user.
Before the error message above, the log says "The step was executed as: DomainA\AAA".
Version Info:
SQL Server 2008 SP2 (10.0.4000)
Windows 2003 R2 x64 SP2
Note:
1. The English messages above are my own translations from our language, so they may not exactly match the wording of the English version.
2. The SSIS package was simplified to contain only this one script task for testing.
Can you check whether you have granted all the permissions (read and write) on the folder to the Administrators group? If that doesn't work, please try a different folder.
Try this blog post for information about the permissions on the directory and file. Hopefully its code will help you find out which permission you need to assign for this user.
http://craigot.blogspot.com/2012/09/ssis-checking-filefolder-permissions.html
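In case the link goes stale, here is a rough sketch of the same idea: enumerating the access rules on a folder with the .NET Framework's System.Security.AccessControl API (the path is a placeholder taken from the error message):

using System;
using System.IO;
using System.Security.AccessControl;
using System.Security.Principal;

class ShowFolderPermissions
{
    static void Main()
    {
        // Inspect the ACL of the folder the job writes to.
        DirectorySecurity security = Directory.GetAccessControl(@"E:\XXXX");

        // Print every access rule: who it applies to, Allow/Deny, and the rights.
        foreach (FileSystemAccessRule rule in
                 security.GetAccessRules(true, true, typeof(NTAccount)))
        {
            Console.WriteLine("{0,-30} {1,-6} {2}",
                rule.IdentityReference.Value,
                rule.AccessControlType,
                rule.FileSystemRights);
        }
    }
}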
I'm developing a Windows desktop application (C# WPF) with a Microsoft LocalDB (.mdf) database, and I want users of my software to be able to carry their LocalDB (.mdf) file with them when they move to other computers. The LocalDB (.mdf) file is created on the first computer.
To test my application, I copied my LocalDB file from computer A to computer B and attached it successfully with the code below.
string attach_greendbnameQuery = string.Format(@"EXEC sp_attach_db @dbname = N'greendb_{0}', @filename1 = N'{1}\greendb_{0}.mdf'", textBoxGreenLogin.Text, Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData));
SqlCommand attach_greendbnamecomm = new SqlCommand(attach_greendbnameQuery, check_datadbinmasterConn);
attach_greendbnamecomm.ExecuteNonQuery();
And I could read and write data in the moved LocalDB file.
However, when I run a backup command on this database, an exception occurs:
'Unable to open the physical file. Operating system error 32: the process cannot access the file because it is being used by another process.'
If I look in SQL Server Management Studio under Security - KayLee-PC\KayLee - User Mapping, the mapped user for every other LocalDB database is 'dbo', but for the manually moved database it is KayLee-PC\KayLee, and only 'public' is checked; all the other database roles, including db_owner, are unchecked. (I always log in to Windows with the KayLee-PC\KayLee account.)
I tried checking all the database roles, and even the server roles, but that failed too.
I even tried to drop the user KayLee-PC\KayLee in code, but the exception message says,
'User 'KayLee-PC\KayLee' does not exist in the current database'
If I click the moved database in SQL Server Management Studio, the current database context does not change to it and the subtree nodes (Tables, Security, and so on) are not displayed; the message is 'cannot access the database'. If I click any other database, the current database context changes as expected.
I also tried to change the owner of the LocalDB database with the code below, but the execution seems to have failed, returning '-1', when run with 'sa'; and when I tried 'KayLee-PC\KayLee', the account I always use to log in to Windows, the error message said that 'KayLee-PC\KayLee' is already an owner of the database.
// Change the database owner to 'sa' using sp_changedbowner.
SqlConnection dbownerchange_Conn = new SqlConnection();
dbownerchange_Conn.ConnectionString = dbownerchange_ConnectionString;
dbownerchange_Conn.Open();

SqlCommand dbownerchange_comm = new SqlCommand();
dbownerchange_comm.Connection = dbownerchange_Conn;
dbownerchange_comm.CommandText = "EXEC sp_changedbowner 'sa'";

// Note: ExecuteNonQuery returns -1 for statements that do not affect rows,
// so a result of -1 here is not by itself proof of failure.
dbownerchange_comm.ExecuteNonQuery();
dbownerchange_Conn.Close();
Simply put: if we need to move (copy) a Microsoft LocalDB file to another computer, how can we do this successfully?
Must we detach the LocalDB file before moving it? If so, I'm worried that there will always be people who don't follow the guideline and never run the detach.
I've tried several scenarios to understand how SQL Server behaves here. The conclusion is that we need to detach first and then re-attach, on the same or a different computer. Doing that, I was able to move the database to another computer successfully.
However, a single backup-and-restore of the LocalDB database onto another machine doesn't work. Furthermore, once we detach the LocalDB database (.mdf file), SQL Server no longer recognizes it, so we cannot run a backup command against it.
Conclusively and simply: if you want to move a LocalDB .mdf file to another computer, just detach the database from the instance on computer A, copy the files, and re-attach it on the instance on computer B.
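A condensed sketch of that sequence, with the connection strings, paths, and database name as placeholders (the detach runs against computer A's LocalDB instance and the attach against computer B's):

using System.Data.SqlClient;
using System.IO;

string dbName = "greendb_user1";
string sourceMdf = @"C:\Users\A\AppData\Roaming\greendb_user1.mdf";
string targetMdf = @"C:\Users\B\AppData\Roaming\greendb_user1.mdf";

// 1. Detach on computer A so LocalDB releases its lock on the file.
using (var connA = new SqlConnection(@"Server=(localdb)\v11.0;Integrated Security=true"))
{
    connA.Open();
    using (var cmd = new SqlCommand(
        string.Format("EXEC sp_detach_db @dbname = N'{0}'", dbName), connA))
    {
        cmd.ExecuteNonQuery();
    }
}

// 2. Copy the .mdf (and the .ldf, if there is one) to the new location.
File.Copy(sourceMdf, targetMdf, true);

// 3. Re-attach on computer B.
using (var connB = new SqlConnection(@"Server=(localdb)\v11.0;Integrated Security=true"))
{
    connB.Open();
    using (var cmd = new SqlCommand(
        string.Format("CREATE DATABASE [{0}] ON (FILENAME = N'{1}') FOR ATTACH",
                      dbName, targetMdf), connB))
    {
        cmd.ExecuteNonQuery();
    }
}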
Hope this helps someone else..
So I've written a simple exe that extracts SQL files from a zipped folder and opens them in SQL Server Management Studio. It works great, except that sometimes there are multiple SQL files to open, which causes multiple SSMS instances to open. How can I make the files all open in one instance?
This is what I'm trying so far:
foreach (string sqlFile in files)
{
    // Reuse an existing SSMS process if one is running; otherwise start fresh.
    Process sqlServer;
    if (Process.GetProcessesByName("Ssms").Length > 0)
        sqlServer = Process.GetProcessesByName("Ssms")[0];
    else
        sqlServer = new Process();
    sqlServer.StartInfo.FileName = sqlFile;
    sqlServer.Start();
}
Sometimes a file will magically open in an existing SSMS window, but I haven't figured out why.
I couldn't find any way to reuse an existing SSMS process (sorry!), but fortunately the SSMS command line turned out to be very useful. It can be given the name of the server and/or instance to connect to with the -S switch, and it logs in with Windows Authentication with the -E switch. Take a look at its options via SSMS /? and choose what fits your problem best. Anyway, I tested the following code with and without a pre-existing SSMS instance, both connected and disconnected:
string serverName = "myservername";

// Quote each file path and join them into a single argument string.
var separatedFiles = string.Join(" ", files.Select(p => "\"" + p + "\""));

Process sqlServer = new Process();
sqlServer.StartInfo.FileName = "SSMS";
sqlServer.StartInfo.Arguments = string.Format("-S {0} {1}", serverName, separatedFiles);
sqlServer.Start();
Today, for each customer, we deploy the same SSRS reports folder and data source folder.
The only differences between these folders are the folder names and the connection string of the data source.
We are using Report Server 2008 R2.
Is it possible to maintain only one reports folder and one data source folder, and to change the data source's connection string programmatically on the server side before the report is rendered?
If not, is it something that can be achieved by changing some logic in the reports?
Today we use the "shared data source" option.
This is something we've done in our environment - we maintain one set of reports that can be deployed at any client with their own configuration.
You've got a couple of options here. Since you're using a shared data source, things are easier, as you won't need to define a data source for each report.
1. Use the rs.exe utility and a script file
rs.exe at Books Online
This program allows you to create script files (in VB.NET) that interact with the Report Server Web Service. You create a script file (e.g. DeployReports.rss) and call the rs.exe program with various parameters, including any custom ones you define:
rs.exe -i DeployReports.rss -s http://server/ReportServer -v DatabaseInstance="SQL" -v DatabaseName="ReportDB" -v ReportFolder="ClientReports"
So this would call the script DeployReports.rss, connect to http://server/ReportServer, and pass three user-defined parameters, which could be used to create the data source and the report folder.
In the script file you could have something like this:
Public Sub Main()
rs.Credentials = System.Net.CredentialCache.DefaultCredentials
CreateFolder(reportFolder, "Report folder")
CreateFolder(datasourceFolder, "Data source folder")
CreateDataSource()
End Sub
Which can then make Web Service calls like:
rs.CreateFolder(folderName, "/", Nothing)
'Define the data source definition.
Dim definition As New DataSourceDefinition()
definition.CredentialRetrieval = CredentialRetrievalEnum.Integrated
definition.ConnectString = "data source=" + DatabaseInstance + ";initial catalog=" + DatabaseName
definition.Enabled = True
definition.EnabledSpecified = True
definition.Extension = "SQL"
definition.ImpersonateUser = False
definition.ImpersonateUserSpecified = True
'Use the default prompt string.
definition.Prompt = Nothing
definition.WindowsCredentials = False
Try
rs.CreateDataSource(datasource, datasourcePath, False, definition, Nothing)
Console.WriteLine("Data source {0} created successfully", datasource)
Catch e As Exception
Console.WriteLine(e.Message)
End Try
Since you're using Reporting Services 2008 R2, note that there are multiple Web Service endpoints that can be used, depending on the SQL Server version. The 2005/2008 endpoint is deprecated in 2008 R2 and above but is still usable. Just something to bear in mind when writing your script.
2. Call the SSRS Web Service through an application
Report Server Web Service overview
The same calls that are made from the script above can be made in any other application, too. So you'd just need to add a reference to a Report Server Web Service through WSDL and you can connect to a remote service and call its methods to deploy reports, data sources, etc.
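As a rough illustration (not a drop-in implementation), the data source creation from the script above could look like this in C#, assuming a Web Reference named ReportService2005 generated from the deprecated-but-usable 2005/2008 endpoint mentioned above:

using System;
using System.Net;

class DeployDataSource
{
    static void Main()
    {
        // Proxy class generated by adding a Web Reference to
        // http://server/ReportServer/ReportService2005.asmx?wsdl (placeholder URL).
        var rs = new ReportService2005.ReportingService2005();
        rs.Credentials = CredentialCache.DefaultCredentials;

        var definition = new ReportService2005.DataSourceDefinition
        {
            CredentialRetrieval = ReportService2005.CredentialRetrievalEnum.Integrated,
            ConnectString = "data source=SQL;initial catalog=ReportDB",
            Enabled = true,
            EnabledSpecified = true,
            Extension = "SQL",
            ImpersonateUser = false,
            ImpersonateUserSpecified = true,
            Prompt = null,            // use the default prompt string
            WindowsCredentials = false
        };

        // Folder and data source names mirror the rs.exe example above.
        rs.CreateDataSource("ClientDataSource", "/Data Sources", false,
                            definition, null);
        Console.WriteLine("Data source created successfully");
    }
}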
So ultimately you're connecting to the Report Server Web Service, it's just the medium used that you need to think about.
Using a script is easier to get running as it's just running a program from the command line, but writing your own deployment application will certainly give greater flexibility. I would recommend getting the script going, so you understand the process, then migrate this to a bespoke application if required. Good luck!
You can use an expression-based connection string to select the correct database. You can base the expression on a parameter your application passes in, or on the UserID global variable (User!UserID). I do believe you need to configure the unattended execution account for this to work.
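For example, the data source's connection string could be set to an expression like the one below, where the parameter name is hypothetical. Note that, as far as I know, expression-based connection strings only work with embedded (report-level) data sources, not shared ones, so the report would need its data source embedded:

="data source=MyServer;initial catalog=" & Parameters!ClientDatabase.Value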
Note: be careful about the security implications. Realize that if you pass sensitive data (e.g. passwords) into a parameter, (a) it will go over the wire, and (b) it will be stored in the execution log tables for Reporting Services.