Runtime error in script task - c#

We have a fully running database server, say serverA, whose data are refreshed daily.
We want to duplicate this database on a different server, say serverB, so that we have a test environment. The databases have been restored to serverB.
Like serverA, we want serverB's data to be refreshed daily as well, so the tests we conduct on serverB can be considered fully accurate, since serverB will have the same data as serverA. We deployed the SSIS packages used on serverA to serverB and copied the SQL Server Agent jobs to serverB as well.
I am modifying these jobs and packages so that they run smoothly on serverB, changing directory paths, server names, etc.
Now, there is this job that always fails because of a package, zip.dtsx.
zip.dtsx retrieves files from directoryA, compresses them, saves the compressed file to directoryB, and then deletes the original file in directoryA. However, I cannot figure out why it's failing with a runtime error.
zip.dtsx has a script task named Zip files.
The script language is Microsoft Visual C# 2010.
The ReadOnlyVariables are User::DestinationPath, User::NamePart, User::SourcePath, $Package::filename.
The script is:
public void Main()
{
String sourcePath = Convert.ToString(Dts.Variables["SourcePath"].Value);
String namePart = Convert.ToString(Dts.Variables["NamePart"].Value);
String destinationPath = Convert.ToString(Dts.Variables["DestinationPath"].Value);
FileStream sourceFile = File.OpenRead(sourcePath + namePart);
FileStream destFile = File.Create(destinationPath + namePart);
GZipStream compStream = new GZipStream(destFile, CompressionMode.Compress);
try
{
int theByte = sourceFile.ReadByte();
while (theByte != -1)
{
compStream.WriteByte((byte)theByte);
theByte = sourceFile.ReadByte();
}
}
finally
{
compStream.Dispose();
sourceFile.Close();
destFile.Close();
File.Delete(sourcePath + namePart);
}
Dts.TaskResult = (int)ScriptResults.Success;
}
I get the error when I execute the task in Microsoft Visual Studio (right-click the Script Task object -> Execute Task).
I am not familiar with Microsoft Visual C# and I have also only just begun using SSIS packages, so I'm really at a loss.
UPDATE:
I tried commenting out different lines in the C# script. Finally, when I commented out File.Delete(sourcePath + namePart);, the job calling zip.dtsx succeeded. However, I am not sure why that line causes an error. I don't know whether it is a permissions problem or something else.
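One way to narrow this down (a minimal diagnostic sketch, not part of the original script): wrap the delete in its own try/catch and report the exact exception through Dts.Events.FireError, so the job history shows whether the failure is an UnauthorizedAccessException (a permissions problem for the account running the job, e.g. the SQL Server Agent service account) or an IOException (the file is still locked by another process).
try
{
    File.Delete(sourcePath + namePart);
}
catch (UnauthorizedAccessException ex)
{
    // The executing account lacks delete rights on the source directory.
    Dts.Events.FireError(0, "Zip files", "Delete failed (permissions?): " + ex.Message, String.Empty, 0);
    Dts.TaskResult = (int)ScriptResults.Failure;
    return;
}
catch (IOException ex)
{
    // The file is still open or locked, e.g. by another process or an unclosed stream.
    Dts.Events.FireError(0, "Zip files", "Delete failed (file in use?): " + ex.Message, String.Empty, 0);
    Dts.TaskResult = (int)ScriptResults.Failure;
    return;
}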

Related

Shell32 GetDetailsOf call in SSIS returning different results when run manually versus on server

In an SSIS package I have a script task that is trying to get the Author information for a file. When run from Visual Studio the code works as expected, returning the Authors information from the file metadata. When run on the server as a scheduled job against the same file no Authors information is found.
private static void GetDocAuthor(object param)
{
try
{
DocAuthor = "Unknown";
object[] args = (object[])param;
string folderPath = (string)args[0];
string fileName = (string)args[1];
fileName = fileName.Trim('\\');
// Filename: Column 0
// Authors: Column 20
var folder = new Shell32.Shell().NameSpace(folderPath);
foreach (Shell32.FolderItem item in folder.Items())
{
string retrievedFilename = folder.GetDetailsOf(item, 0);
if (retrievedFilename == fileName)
{
DocAuthor = folder.GetDetailsOf(item, 20);
}
}
}
catch (Exception ex)
{
using (EventLog eventLog = new EventLog("Application"))
{
eventLog.Source = "Application";
eventLog.WriteEntry(string.Format("Error getting Authors. Error message = {0}", ex.Message), EventLogEntryType.Information, 107);
}
}
}
The GetDocAuthor method is being executed as an STA thread since I read about the Shell COM object not working in SSIS due to it being run as an MTA thread. This is how it is being called:
object[] args = new object[] { folderPath, fileName };
Thread staThread = new Thread(new ParameterizedThreadStart(GetDocAuthor));
staThread.SetApartmentState(ApartmentState.STA);
staThread.Start(args);
staThread.Join();
folderPath is the folder where the file I want the information on resides. fileName is the name of the specific file I want the author of.
The folderPath is a UNC path, so I don't expect that to be the issue. Something must be different between running locally and running on the server. When running on the server, the script is able to find the file; it just isn't able to get the metadata.
Anyone have a direction to explore?
After doing some more testing between the local execution and server execution here is what I know.
The SSIS package is running in 32 bit mode both locally and on the server.
Locally it's running in VS 2017 on Windows 10. On the server it's running on SQL server 2014 on a Windows Server 2012 R2 box.
Using some application logging, I've determined that GetDetailsOf(item, 10) returns the same value on both computers (the NT User ID of the person who created the file), while GetDetailsOf(item, 20) contains the Author information on Windows 10 but is blank on Windows Server 2012 R2.
Unfortunately the NT User ID is not what I need, it's the Author information.
Sounds like the issue has to do with what's available on Windows Server 2012 R2 and not the file itself. Can anyone confirm that the Authors metadata isn't available on Server 2012?
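One thing worth ruling out (a minimal sketch using the same Shell32 interop as above; FindColumnIndex is a hypothetical helper, not code from the question): shell column numbering can differ between Windows versions, so resolving the index by header name instead of hardcoding 20 tells you whether the Authors column merely lives at a different index, or genuinely isn't exposed on Server 2012 R2.
// Hypothetical helper: find a shell column index by its header name.
// Passing null as the item to GetDetailsOf returns the column header text.
private static int FindColumnIndex(Shell32.Folder folder, string headerName)
{
    for (int i = 0; i < 300; i++)
    {
        string header = folder.GetDetailsOf(null, i);
        if (string.Equals(header, headerName, StringComparison.OrdinalIgnoreCase))
            return i;
    }
    return -1; // Header not found; the property may not exist on this OS.
}
If FindColumnIndex(folder, "Authors") returns -1 on the server, the property is not exposed there at all and no index will help.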

How to upload a file from my local machine to a vault of s3 glacier using c# in a console app?

Does anyone know how to do this? I have investigated it, but I found only answers that are wrong or don't work. I have tried a lot of solutions, such as using the Chilkat library and using ArchiveTransferManager, but they all seem to fail.
Chilkat.Rest rest = new Chilkat.Rest();
bool bTls = true;
int port = 443;
bool bAutoReconnect = true;
bool success = rest.Connect("glacier.eu-west-1.amazonaws.com", port, bTls, bAutoReconnect);
Chilkat.AuthAws authAws = new Chilkat.AuthAws();
authAws.AccessKey = "<access-key>";   // value redacted in the original post
authAws.SecretKey = "<secret-key>";   // value redacted in the original post
authAws.ServiceName = "glacier";
authAws.Region = "us-west-1";
success = rest.SetAuthAws(authAws);
rest.AddHeader("x-amz-glacier-version", "2012-06-01");
string filePath = "20190422.csv";
Chilkat.Crypt2 crypt = new Chilkat.Crypt2();
crypt.HashAlgorithm = "sha256-tree-hash";
crypt.EncodingMode = "hexlower";
string treeHashHex = crypt.HashFileENC(filePath);
rest.AddHeader("x-amz-sha256-tree-hash", treeHashHex);
crypt.HashAlgorithm = "sha256";
string linearHashHex = crypt.HashFileENC(filePath);
authAws.PrecomputedSha256 = linearHashHex;
rest.AddHeader("x-amz-archive-description", filePath);
Chilkat.Stream fileStream = new Chilkat.Stream();
fileStream.SourceFile = filePath;
string responseStr = rest.FullRequestStream("POST", "/682988997959/vaults/streamqueuesvault", fileStream);
if (rest.LastMethodSuccess != true)
{
Debug.WriteLine(rest.LastErrorText);
return;
}
int respStatusCode = rest.ResponseStatusCode;
if (respStatusCode >= 400)
{
Debug.WriteLine("Response Status Code = " + Convert.ToString(respStatusCode));
Debug.WriteLine("Response Header:");
Debug.WriteLine(rest.ResponseHeader);
Debug.WriteLine("Response Body:");
Debug.WriteLine(responseStr);
return;
}
Debug.WriteLine("response status code = " + Convert.ToString(respStatusCode));
string archiveId = rest.ResponseHdrByName("x-amz-archive-id");
Debug.WriteLine("x-amz-archive-id = " + archiveId);
string location = rest.ResponseHdrByName("Location");
Debug.WriteLine("Location = " + location);
Here is a step-by-step guide on how to upload a file from your local machine to an S3 Glacier vault using C# in a console app. First I would like to present some basic background information that will be used later in the solution. Feel free to skip ahead to the solution if you are already up to speed on S3 Glacier.
If you have the AWS SDK for .NET and Visual Studio already installed, you can download the repo from GitHub.
Quick Intro to S3-Glacier
Amazon S3 Glacier is Amazon's low-cost, long-term storage service.
In Glacier terminology, an object is referred to as an Archive, and the folders where you store archives are called Vaults. It's pretty simple. From the Glacier FAQ:
Q: How is data within Amazon S3 Glacier organized?
You store data in Amazon S3 Glacier as an archive. Each archive is assigned a unique archive ID that can later be used to retrieve the data. An archive can represent a single file or you may choose to combine several files to be uploaded as a single archive. You upload archives into vaults. Vaults are collections of archives that you use to organize your data.
When you upload objects to S3 Glacier, the objects don't immediately appear in your Glacier console; the console refreshes only about once a day.
Amazon recommends you use the AWS SDK for .NET when developing C# applications that interface AWS services.
Simple Solution
Before you code, go into your AWS Console and create an S3 Glacier vault named 'TestVault'.
At the time of this solution (April 2019), I suggest you use Visual Studio 2019. These steps are similar for earlier versions of Visual Studio.
The code I present was taken directly from the AWS SDK for .NET Documentation.
Once Visual Studio is ready, follow these steps:
Create a new project (use the template Console App (.NET Framework), not Console App (.NET Core)) and name it ConsoleApp9.
Add the AWS SDK to your project via NuGet package manager command.
From the Tools menu, select NuGet Package Manager, and click Package Manager Console.
Then type Install-Package AWSSDK.
On a Mac, use Project -> Add NuGet Packages, search for "AWSSDK.Glacier", and install it.
Below is the working code. Copy it into your Program.cs, replacing the default "Hello World" code. Your final Program.cs should look like this:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Amazon.Glacier;
using Amazon.Glacier.Transfer;
using Amazon.Runtime;
namespace ConsoleApp9
{
class Program
{
static string vaultName = "TestVault";
static string archiveToUpload = "C:\\Windows\\Temp\\TEST-ARCHIVE.txt";
static void Main(string[] args)
{
try
{
var manager = new ArchiveTransferManager(Amazon.RegionEndpoint.USEast1);
// Upload an archive.
string archiveId = manager.Upload(vaultName, "upload archive test", archiveToUpload).ArchiveId;
Console.WriteLine("Archive ID: (Copy and save this ID for use in other examples.) : {0}", archiveId);
Console.WriteLine("To continue, press Enter");
Console.ReadKey();
}
catch (AmazonGlacierException e) { Console.WriteLine(e.Message); }
catch (AmazonServiceException e) { Console.WriteLine(e.Message); }
catch (Exception e) { Console.WriteLine(e.Message); }
Console.WriteLine("To continue, press Enter");
Console.ReadKey();
}
}
}
Put the file that you want uploaded to Glacier at C:\Windows\Temp\TEST-ARCHIVE.txt. You can put the file anywhere you want; just update the archiveToUpload variable in your code to reflect the location.
If your region is not USEast1, change the AWS region on the line just after the try:
var manager = new ArchiveTransferManager(Amazon.RegionEndpoint.YOUR-REGION);
Run the program and it will upload the file. If you have installed the AWS SDK before, this will likely work just fine, and you will get output that shows your archive ID.
If you run into permissions or authorization errors, please follow these steps on setting up authorization for the AWS SDK. I recommend using a credentials file (the 2nd option from the top). Other problems could be a wrong vault name, or the file not being found on your machine.
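If you'd rather pass credentials explicitly instead of using a credentials file, the SDK also accepts them in code (a minimal sketch; the key values are placeholders, and hardcoding real keys is only advisable for throwaway tests):
using Amazon;
using Amazon.Glacier.Transfer;
using Amazon.Runtime;

// BasicAWSCredentials supplies the key pair directly, bypassing the
// SDK's normal credential search chain (profile file, environment, etc.).
var credentials = new BasicAWSCredentials("<access-key>", "<secret-key>");
var manager = new ArchiveTransferManager(credentials, RegionEndpoint.USEast1);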
When you go back to the Glacier console, you will not see any files uploaded. Glacier is low-cost and slow-moving compared to S3, so your vault contents are updated only once a day.
As long as you get an ID in step 6, your file was successfully stored in Glacier.
Hope this helps and you find success.
Make sure your region is consistent. In the following code, "eu-west-1" is used in the Connect call, but "us-west-1" is used for authAws.Region.
bool success = rest.Connect("glacier.eu-west-1.amazonaws.com", port, bTls, bAutoReconnect);
Chilkat.AuthAws authAws = new Chilkat.AuthAws();
authAws.AccessKey = "<access-key>";   // value redacted in the original post
authAws.SecretKey = "<secret-key>";   // value redacted in the original post
authAws.ServiceName = "glacier";
authAws.Region = "us-west-1";
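The fix is simply to make the endpoint host name and the signing region name the same region; for example, if the vault lives in eu-west-1:
// The endpoint host name and the signing region must agree.
bool success = rest.Connect("glacier.eu-west-1.amazonaws.com", port, bTls, bAutoReconnect);
Chilkat.AuthAws authAws = new Chilkat.AuthAws();
authAws.ServiceName = "glacier";
authAws.Region = "eu-west-1";   // was "us-west-1" in the question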

C# Not saving data in MS Access Database [duplicate]

I have the following C# code in a console application.
Whenever I debug the application and run query1 (which inserts a new value into the database) and then run query2 (which displays all the entries in the database), I can see the new entry I inserted clearly. However, when I close the application and check the table in the database (in Visual Studio), the new entry is gone. I have no idea why it is not saving.
using System;
using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.Data.SqlServerCe;
using System.Data;
namespace ConsoleApplication1
{
class Program
{
static void Main(string[] args)
{
try
{
string fileName = "FlowerShop.sdf";
string fileLocation = "|DataDirectory|\\";
DatabaseAccess dbAccess = new DatabaseAccess();
dbAccess.Connect(fileName, fileLocation);
Console.WriteLine("Connected to the following database:\n"+fileLocation + fileName+"\n");
string query = "Insert into Products(Name, UnitPrice, UnitsInStock) values('NewItem', 500, 90)";
string res = dbAccess.ExecuteQuery(query);
Console.WriteLine(res);
string query2 = "Select * from Products";
string res2 = dbAccess.QueryData(query2);
Console.WriteLine(res2);
Console.ReadLine();
}
catch (Exception e)
{
Console.WriteLine(e);
Console.ReadLine();
}
}
}
class DatabaseAccess
{
private SqlCeConnection _connection;
public void Connect(string fileName, string fileLocation)
{
Connect(#"Data Source=" + fileLocation + fileName);
}
public void Connect(string connectionString)
{
_connection = new SqlCeConnection(connectionString);
}
public string QueryData(string query)
{
_connection.Open();
using (SqlCeDataAdapter da = new SqlCeDataAdapter(query, _connection))
using (DataSet ds = new DataSet("Data Set"))
{
da.Fill(ds);
_connection.Close();
return ds.Tables[0].ToReadableString(); // an extension method I created
}
}
public string ExecuteQuery(string query)
{
_connection.Open();
using (SqlCeCommand c = new SqlCeCommand(query, _connection))
{
int r = c.ExecuteNonQuery();
_connection.Close();
return r.ToString();
}
}
}
}
EDIT: Forgot to mention that I am using SQL Server Compact Edition 4 and VS2012 Express.
It is quite a common problem. You use the |DataDirectory| substitution string. This means that, while debugging your app in the Visual Studio environment, the database used by your application is the copy located in the BIN\DEBUG subfolder (or x86 variant) of your project. And this works well: you don't get any errors connecting to the database or performing update operations.
But then, you exit the debug session and you look at your database through the Visual Studio Server Explorer (or any other suitable tool). This window has a different connection string (probably pointing to the copy of your database in the project folder). You search your tables and you don't see the changes.
Then the problem gets worse. You restart VS to go hunting for the bug in your app, but your database file is listed among your project files with the property Copy to Output Directory set to Copy Always. At this point Visual Studio obliges and copies the original database file from the project folder to the output folder (BIN\DEBUG), and thus your previous changes are lost.
Now your application inserts/updates the target table again, you again can't find any error in your code, and you restart the loop until you decide to post or search on Stack Overflow.
You can stop this problem by clicking on the database file listed in Solution Explorer and changing the property Copy to Output Directory to Copy If Newer or Never Copy. You could also update your connection string in Server Explorer to look at the working copy of your database, or create a second connection: the first one still points to the database in the project folder, while the second one points to the database in the BIN\DEBUG folder. This way you keep the original database ready for deployment purposes and schema changes, while with the second connection you can look at the effective results of your coding efforts.
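If you want to see exactly which file a debug session is writing to, you can resolve |DataDirectory| at runtime (a small sketch; for a console app the substitution falls back to the application's base directory, i.e. BIN\DEBUG, when no explicit value has been set):
// |DataDirectory| resolves to the AppDomain's "DataDirectory" data slot,
// or to AppDomain.CurrentDomain.BaseDirectory (bin\Debug in a debug build) when unset.
object dataDir = AppDomain.CurrentDomain.GetData("DataDirectory");
string resolved = dataDir as string ?? AppDomain.CurrentDomain.BaseDirectory;
Console.WriteLine("Database folder in use: " + resolved);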
EDIT: A special warning for MS Access database users. The simple act of looking at your table changes the modified date of your database, even if you don't write or change anything. So the flag Copy If Newer kicks in and the database file is copied to the output directory. With Access, it is better to use Copy Never.
Committing changes / saving changes across debug sessions is a familiar topic in SQL CE forums. It is something that trips up quite a few people. I'll post links to the source articles below, but I wanted to paste the answer that seems to get the best results for the most people:
You have several options to change this behavior. If your sdf file is part of the content of your project, this will affect how data is persisted. Remember that when you debug, all output of your project (including the sdf) is in the bin/debug folder.
You can decide not to include the sdf file as part of your project and manage the file location runtime.
If you are using "copy if newer", and project changes you make to the database will overwrite any runtime/debug changes.
If you are using "Do not copy", you will have to specify the location in code (as two levels above where your program is running).
If you have "Copy always", any changes made during runtime will always be overwritten
Answer Source
Here is a link to some further discussion and how-to documentation.

Reverse engineering SSIS package using C#

There is a requirement to extract the source, the destination, and the column names of the source and destination. I am trying to do this because I have thousands of packages; each package has on average 60 to 75 columns, so opening each one and listing all the required info takes a huge amount of time. It is not a one-time requirement either: this task is currently done manually every two months in my organization.
I'm looking for a way to reverse engineer the packages: keep them all in a single folder, then go through each package, pull out the info, and put it in a spreadsheet.
I thought of opening each package as XML and reading the nodes of interest into a spreadsheet, which is a little cumbersome. Please suggest which libraries are available to start with.
SQL Server provides assemblies to manipulate packages programmatically.
To reverse engineer (deserialize) a dtsx package, you have to loop over the packages and read them programmatically; just follow this detailed link:
Reading DTS and SSIS packages programmatically
There is another way (harder, and not recommended): reading the dtsx as a text file and parsing the XML content. Check my answer to the following question for an example:
Automate Version number Retrieval from .Dtsx files
Hint: just open the package in Visual Studio and go to the Package Explorer tab (near the Control Flow and Data Flow tabs). You will find a treeview that leads you to the component you need.
Update 1 - C# Script # 2019-07-08
If you are looking for a script that lists all package objects, you can use a script similar to this:
using System;
using System.Windows.Forms;   // for MessageBox, used below
using DtsRuntime = Microsoft.SqlServer.Dts.Runtime;
using DtsWrapper = Microsoft.SqlServer.Dts.Pipeline.Wrapper;
public void Main()
{
string pkgLocation;
DtsRuntime.Package pkg;
DtsRuntime.Application app;
DtsRuntime.DTSExecResult pkgResults;
pkgLocation = @"D:\Test\Package 1.dtsx";
app = new DtsRuntime.Application();
pkg = app.LoadPackage(pkgLocation, null);
//List Executables (Tasks)
foreach(DtsRuntime.Executable tsk in pkg.Executables)
{
DtsRuntime.TaskHost TH = (DtsRuntime.TaskHost)tsk;
MessageBox.Show(TH.Name + "\t" + TH.HostType.ToString());
//Data Flow Task components
if (TH.InnerObject.ToString() == "System.__ComObject")
{
try
{
DtsWrapper.MainPipe m = (DtsWrapper.MainPipe)TH.InnerObject;
DtsWrapper.IDTSComponentMetaDataCollection100 mdc = m.ComponentMetaDataCollection;
foreach (DtsWrapper.IDTSComponentMetaData100 md in mdc)
{
MessageBox.Show(TH.Name.ToString() + " - " + md.Name.ToString());
}
}
catch {
// If it is not a data flow task then continue foreach loop
}
}
}
//Event Handlers
foreach(DtsRuntime.DtsEventHandler eh in pkg.EventHandlers)
{
MessageBox.Show(eh.Name);
}
//Connection Manager
foreach(DtsRuntime.ConnectionManager CM in pkg.Connections)
{
MessageBox.Show(CM.Name + " - " + CM.HostType);
}
//Parameters
foreach (DtsRuntime.Parameter Param in pkg.Parameters)
{
MessageBox.Show(Param.Name + " - " + Param.DataType.ToString());
}
//Variables
foreach (DtsRuntime.Variable Var in pkg.Variables)
{
MessageBox.Show(Var.Name + " - " + Var.DataType.ToString());
}
//Precedence Constraints
foreach (DtsRuntime.PrecedenceConstraint PC in pkg.PrecedenceConstraints)
{
MessageBox.Show(PC.Name);
}
}
References
Loading and Running a Local Package Programmatically
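Since the original requirement is to sweep a whole folder of packages into a spreadsheet, the same APIs can be wrapped in a loop that writes one row per data flow component to a CSV file (a rough sketch building on the script above; the folder and output paths are illustrative):
using System;
using System.IO;
using DtsRuntime = Microsoft.SqlServer.Dts.Runtime;
using DtsWrapper = Microsoft.SqlServer.Dts.Pipeline.Wrapper;

string packageFolder = @"D:\Test";              // illustrative folder of .dtsx files
string outputCsv = @"D:\Test\components.csv";   // illustrative output path
var app = new DtsRuntime.Application();
using (var writer = new StreamWriter(outputCsv))
{
    writer.WriteLine("Package,Task,Component");
    foreach (string pkgPath in Directory.EnumerateFiles(packageFolder, "*.dtsx"))
    {
        DtsRuntime.Package pkg = app.LoadPackage(pkgPath, null);
        foreach (DtsRuntime.Executable tsk in pkg.Executables)
        {
            // Assumes top-level executables are TaskHosts, as in the script above.
            var th = (DtsRuntime.TaskHost)tsk;
            if (th.InnerObject.ToString() != "System.__ComObject")
                continue;
            try
            {
                var pipe = (DtsWrapper.MainPipe)th.InnerObject;
                foreach (DtsWrapper.IDTSComponentMetaData100 md in pipe.ComponentMetaDataCollection)
                    writer.WriteLine(string.Format("{0},{1},{2}", Path.GetFileName(pkgPath), th.Name, md.Name));
            }
            catch
            {
                // Not a data flow task; skip it.
            }
        }
    }
}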
Update 2 - SSISPackageExplorer Project # 2019-07-10
I started a small project called SSISPackageExplorer on GitHub which allows the user to browse the package objects in a TreeView. It is very basic right now, but I will try to improve it over time:
GitHub - SSISPackageExplorer
Note that some of the classes in the Microsoft.SqlServer.Dts.Pipeline namespace are not CLS-compliant. For example, the documentation for the ColumnInformation class (Namespace: Microsoft.SqlServer.Dts.Pipeline; Assembly: Microsoft.SqlServer.PipelineHost.dll) carries the note: "Important: This API is not CLS-compliant."
Otherwise, try this:
Just open your dtsx package in Notepad++. Find the table name, then do the same search on the property name in all packages (Find in Files). I think that even searching for a column in a dtsx opened in a text editor will give you everything. It's manual, but it can be automated with regex and C#. I never did it with regex; I only did it once with Notepad++ and one package.
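If you want to automate that text search, a few lines of C# can grep every package in a folder for a name (a quick sketch; the folder and search term are placeholders):
using System;
using System.IO;
using System.Text.RegularExpressions;

string folder = @"D:\Test";   // placeholder: folder containing the .dtsx files
var pattern = new Regex("MyTableName", RegexOptions.IgnoreCase);   // placeholder search term
foreach (string path in Directory.EnumerateFiles(folder, "*.dtsx"))
{
    // A .dtsx file is plain XML, so a simple text scan works.
    if (pattern.IsMatch(File.ReadAllText(path)))
        Console.WriteLine(path);
}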
