I am trying to create a simple data flow using the BIML <ScriptComponentProject> tag, but it does not work for me.
If I create the data flow manually, it works perfectly fine.
My data flow is: SRC > SCRIPT-COMPONENT > TGT
The only logic I have added is to calculate a row number for each incoming row (keeping the logic absolutely simple for now).
When the manual solution is executed, the sequential row number is stored in the TGT table (test_np_02). Great!
However, when I try to create exactly the same package using the attached BIML, I get the following errors:
SSIS Output log:
Error 0 The namespace '<global namespace>' already contains a definition for 'Input0Buffer'. ...\Designer\BufferWrapper.cs 14 14
Error 0 The namespace '<global namespace>' already contains a definition for 'UserComponent'. ...\Designer\ComponentWrapper.cs 12 14
Error 0 The namespace '<global namespace>' already contains a definition for 'Connections'. ...\Designer\ComponentWrapper.cs 49 14
Error 0 The namespace '<global namespace>' already contains a definition for 'Variables'. ...\Designer\ComponentWrapper.cs 60 14
EmitSsis. Internal Compiler Error: Workflow EmitSsis contains fatal errors. Phase execution halted.
I am unsure what is going wrong when I try to create the package using BIML:
These errors never popped up when I created the package manually (with the same code).
I copy-pasted the important C# files from the manual solution into the BIML.
Questions / Ideas:
Is it something to do with the GUID that has to be used for ProjectCoreName, AssemblyProduct and AssemblyTitle?
Is this automatically generated each time the BIML is expanded?
If so, how can I create a GUID in the BIML itself for these items, so that it works directly after expansion?
Is it something to do with the sequence in which I have to create the .cs files in the BIML, within the <Files> tag?
I currently assume that the actual sequence of the <File> items within the <Files> tag is not relevant to BIML.
Could it be that I have to generate "Resources.resx/Resources.Designer.cs" and "Settings.settings/Settings.Designer.cs" as well in the BIML?
Is it really important to have the "#region Namespaces" section in each .cs file?
I removed that from the manual solution and it still worked fine.
Please help.
Notes:
I am using SSDT 2015 and Varigence BimlExpress 2017 (Build 5.0.61915.0).
I am aware that this could very easily be done using ROW_NUMBER() within SQL Server in the SRC itself, but:
I am using an OleDbSource in the attached example just to keep it simple.
Finally, I will have to generate hundreds of packages with BIML, which will have a flat file as the SRC rather than OLE DB.
And obviously, I would love to have my generated Script Components work immediately after expansion, without any further manual intervention. :)
Thanks,
NP
BIML file: https://drive.google.com/file/d/10O3aSL5IO34ULS44wl7IX4LUmPH_pI6V/view?usp=sharing
The error will disappear if you wrap your classes in a custom namespace.
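// ComponentWrapper.cs: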
namespace SomeNamespace {
    public class UserComponent : ScriptComponent
    {
        ...
    }

    public class Connections
    {
        ...
    }

    public class Variables
    {
        ...
    }
}
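// BufferWrapper.cs: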
namespace SomeNamespace {
    public class Input0Buffer : ScriptBuffer
    {
        ...
    }
}
One more remark about (dynamically) creating multiple packages from this Biml file: you will have to assign a dynamic namespace name, like so:
namespace <#=variableContainingNamespaceName#> {
...
}
About your remarks:
The issue is not related to the ProjectCoreName, but I suspect you may run into problems if you reuse the same ProjectCoreName for multiple packages generated from the same Biml (I have not actually tested this).
No, the order of the <File> items within the <Files> tag does not matter.
Not necessary; you do not have to generate "Resources.resx/Resources.Designer.cs" or "Settings.settings/Settings.Designer.cs" in the Biml.
No, the "#region Namespaces" section is not strictly required.
Suppose I have two files in my current working directory:
// file1.cs
Console.WriteLine("file1");
// file2.cs
Console.WriteLine("file2");
In PowerShell, I run dotnet new and delete the automatically generated Program.cs file. Then I run dotnet build and get an error:
Only one compilation unit can have top level statements
I understand why this occurs, but I would like to have full control over which .cs file is being targeted, while the other ones get ignored.
Is there any way to achieve this without having to create a whole new project for every file?
Doing this with .NET doesn't seem to be possible as of now. An issue on the dotnet/sdk GitHub repository has requested this feature.
However, you can use the C# compiler (csc) to compile a Windows executable from a specific .cs file, e.g. csc file1.cs
file1.cs:
using System;
Console.WriteLine("File 1");
https://learn.microsoft.com/en-us/dotnet/csharp/fundamentals/program-structure/top-level-statements
These files both use top-level statements, which means each of them implicitly contains the Main method where program execution starts, and you can only have one entry point. Generally, C# code is contained within classes. Define a class in one (or both) files and put your methods within.
// Program.cs
public class Program
{
    static void Main()
    {
        Console.WriteLine("Program.cs");
    }
}
// Util.cs
public class Util
{
    public static void Display()
    {
        Console.WriteLine("Util.cs");
    }
}
Anyone know why the following code:
foreach (Word.XMLSchemaReference reference in Globals.ThisDocument.Application.ActiveDocument)
{
}
Gives me:
Error 1 foreach statement cannot operate on variables of type 'Microsoft.Office.Interop.Word.Document' because 'Microsoft.Office.Interop.Word.Document' does not contain a public definition for 'GetEnumerator' C:\Program Files\Microsoft Office\Templates\Projects\Project1\Project1\ActionsPaneControl1.cs 1054 13 Project1
I have that code in an ActionsPane control in a Word document-level project, created with VS2013 using C# and .NET 4.0 for Word 2010.
I am trying to run the following code within that loop:
if (reference.NamespaceURI.Contains("ActionsPane"))
{
reference.Delete();
}
Basically, the documents created with my add-in give the user a message when they reopen the created document:
One or more XML expansion packs are available for this file. Choose one from the list below.
No XML expansion pack
Microsoft Actions Pane 3
So it seems I need to find the reference and delete it before the user saves the document?
The message is very clear: Globals.ThisDocument.Application.ActiveDocument does not implement IEnumerable. I think you are looking for something in the active document that implements an IEnumerable of XMLSchemaReference. Check the properties of Globals.ThisDocument.Application.ActiveDocument.
You're trying to enumerate over ActiveDocument. Are you trying to enumerate over the XML schemas?
foreach (Word.XMLSchemaReference schema in ActiveDocument.XMLSchemaReferences)
{
    // do something with schema, e.g. schema.Delete();
}
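If that is the case, here is a minimal sketch that combines the enumeration above with the filter and Delete() call from the question (RemoveActionsPaneSchemaReferences is just an illustrative helper name, and the usual Word = Microsoft.Office.Interop.Word alias is assumed):

using System.Collections.Generic;
using Word = Microsoft.Office.Interop.Word;

// Removes every XML schema reference whose namespace contains "ActionsPane".
// Matches are collected first and deleted afterwards, so the collection is
// not modified while it is being enumerated.
private void RemoveActionsPaneSchemaReferences()
{
    Word.Document doc = Globals.ThisDocument.Application.ActiveDocument;
    var toDelete = new List<Word.XMLSchemaReference>();

    foreach (Word.XMLSchemaReference reference in doc.XMLSchemaReferences)
    {
        if (reference.NamespaceURI.Contains("ActionsPane"))
        {
            toDelete.Add(reference);
        }
    }

    foreach (Word.XMLSchemaReference reference in toDelete)
    {
        reference.Delete();
    }
}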
I have been using an old assembly that references a Dynamics CRM tenant. The assembly contains (among other things) the proxy classes for the tenant. I need to update the proxy classes to the most recent version.
I have used crmsvcutil.exe in the past to generate the proxy class file for a tenant, but in this assembly the classes are each in their own separate files. For example, there is a 'person' C# file which contains the variables of the person entity and the get/set methods.
The current files are in the following format:
using System;
using System.CodeDom.Compiler;
using System.ComponentModel;
using System.Diagnostics;
using System.Xml.Serialization;

namespace CRM2011.Proxy.Organisation
{
    [GeneratedCode("System.Xml", "4.0.30319.1"), DesignerCategory("code"), DebuggerStepThrough, XmlType(Namespace = "http://schemas.microsoft.com/crm/2007/WebServices")]
    [Serializable]
    public class thr_offer : BusinessEntity
    {
        private Lookup createdbyField;
        private CrmDateTime createdonField;
        // More variables here

        public Lookup createdby
        {
            get
            {
                return this.createdbyField;
            }
            set
            {
                this.createdbyField = value;
            }
        }

        public CrmDateTime createdon
        {
            get
            {
                return this.createdonField;
            }
            set
            {
                this.createdonField = value;
            }
        }

        // More get/set methods here
    }
}
These appear to have been generated by a tool, but I don't know of any tool that will do this. If I can generate files such as these instead of the monstrous file generated by crmsvcutil.exe it will make development much easier.
Any ideas?
The files were generated with the CRM 4.0 SDK (the CrmDateTime rang a bell).
With the CrmSvcUtil.exe from the CRM 4.0 SDK it is possible to generate separate files, as documented here:
http://msdn.microsoft.com/en-us/library/ff681563.aspx
/out parameter
Determines the name of the .cs or .xml output file and whether there is one file or one per entity. It may include a full path. If you specify a name that does not end with .cs or .xml, CrmSvcUtil will write an individual .cs file for every entity in the system to the folder you specify. For example, /out:MyClasses outputs a class file (.cs) for every entity to a folder called MyClasses. However, /out:MyClasses.cs outputs a class file (MyClasses.cs) that contains all entities.
This "feature" has been removed from CRM 2011 SDK (if you try to specify the name without the file extension it will still generate a single file)
You'd either have to create a post-processing tool that splits the generated single file into one file per entity, or create CrmSvcUtil extensions that perform this basic change of behavior for you.
Edit: The Early Bound Generator in the XrmToolBox does this via configuration.
The CrmSvcUtil contained in Microsoft.CrmSdk.CoreTools 9.1.0.108 supports splitting entities (and other types) into separate files using the command-line arguments /splitfiles and /outdirectory.
NOTE: The CrmSvcUtil help is not 100% clear about the /splitfiles argument, as the tool itself refers to it as /SplitToFiles. 🙄
I am having a problem using Visual Studio data-driven testing. I have tried to deconstruct this to the simplest example.
I am using Visual Studio 2012. I create a new unit test project.
I am referencing System.Data.
My code looks like this:
using System;
using System.Diagnostics;
using Microsoft.VisualStudio.TestTools.UnitTesting;

namespace UnitTestProject1
{
    [TestClass]
    public class UnitTest1
    {
        [DeploymentItem(@"OrderService.csv")]
        [DataSource("Microsoft.VisualStudio.TestTools.DataSource.CSV", "OrderService.csv", "OrderService#csv", DataAccessMethod.Sequential)]
        [TestMethod]
        public void TestMethod1()
        {
            try
            {
                Debug.WriteLine(TestContext.DataRow["ID"]);
            }
            catch (Exception ex)
            {
                Assert.Fail();
            }
        }

        public TestContext TestContext { get; set; }
    }
}
I have a very small CSV file, and I have set its build options to 'Content' and 'Copy Always'. I have added a .testsettings file to the solution, enabled deployment, and added the CSV file.
I have tried this with and without |DataDirectory|, and with/without a full path specified (the same path that I get with Environment.CurrentDirectory). I've tried variations of "../" and "../../" just in case. Right now the csv is at the project root level, same as the .cs test code file.
I have tried variations with xml as well as csv.
TestContext is not null, but DataRow always is.
I have not gotten this to work despite a lot of fiddling with it. I'm not sure what I'm doing wrong.
Does mstest create a log anywhere that would tell me if it is failing to find the csv file, or what specific error might be causing DataRow to fail to populate?
I have tried the following csv files:
ID
1
2
3
4
and
ID, Whatever
1,0
2,1
3,2
4,3
So far, no dice.
I am using ReSharper, could it be interfering in some way?
Updated
I have it mostly working now! I am able to use XML, but when I use CSV, my column, which is named ID, comes back with some garbage characters prepended to the name.
Not sure why. I've checked the actual file of course, and no weird characters are present.
For anyone having a similar problem: I turned off Just My Code and enabled .NET Framework source stepping, etc., so that I could get more detailed debug information. This allowed me to determine that ReSharper was causing me problems. I disabled ReSharper and modified my attributes like this:
[DeploymentItem("UnitTestProject1\\OrderService.csv")]
[DataSource("Microsoft.VisualStudio.TestTools.DataSource.CSV", "|DataDirectory|\\bin\\Debug\\OrderService.csv", "OrderService#csv", DataAccessMethod.Sequential)]
And it worked (except as noted). I am still suspicious of the "bin\debug" in my path, but I'm just happy my DataRow is no longer null.
Thanks!
Any ideas?
I was struggling with a similar problem today when trying to make data-driven tests work with a CSV input file. The name of the first column had some garbage at the beginning of it, i.e. ID instead of just ID.
It turned out to be an encoding issue. The CSV file was saved in UTF-8, which adds a byte order mark at the beginning, obviously confusing the parser. Once I saved the file in ANSI encoding, it worked as expected.
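If re-saving the file by hand is inconvenient, here is a minimal sketch for stripping the BOM programmatically (the file name is illustrative; it rewrites the source CSV in place):

using System.IO;
using System.Text;

// Re-save the CSV without a byte order mark: ReadAllText detects and drops the
// BOM while decoding, and UTF8Encoding(false) writes the text back without one.
string path = @"OrderService.csv"; // illustrative path to the source CSV
string text = File.ReadAllText(path);
File.WriteAllText(path, text, new UTF8Encoding(false));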
I know it's an old question, but this information might help someone else ending up on this page.
Have you tried adding it through the properties window?
Go to Test menu -> Windows -> Test View -> the tests will load up.
Click on the test to alter i.e. TestMethod1 and press F4 (properties).
Look for 'Data Source' and click the ellipses next to it
It will walk you through a wizard that sets up the attributes properly for the TestMethod
You have the deployment part set up properly, which is normally the big stumbling block.
You also don't have to set the build action to Copy Always as the deployment does this for you. This option is used if you include items like .xml files you use for configs, or icons/images as part of your project.
Update 1:
Also try this tutorial on MSDN.
Update 2:
Try this post, involving ProcMon
I see that you said you tried putting the CSV itself into the testsettings file, but have you tried just putting in the directory?
<Deployment>
    <DeploymentItem filename="Test\Data\" />
</Deployment>
Then your DataSource line will look something like this:
[DataSource("Microsoft.VisualStudio.TestTools.DataSource.CSV", "|DataDirectory|\\YOURCSV.csv", "YOURCSV#csv", DataAccessMethod.Sequential)]
If you do it this way, you don't need to specify the DeploymentItem line.
Our folder structure looks like this: Trunk\Test\Test\Data
We include: Test\Data in the deployment
We then access Test\Data via the |DataDirectory|\
All CSVs live within the \Data folder
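Putting the pieces together, here is a minimal sketch of the test class using that data source (it reuses the OrderService.csv name and TestMethod1 from the question and assumes the CSV is deployed as described above):

using System.Diagnostics;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class UnitTest1
{
    public TestContext TestContext { get; set; }

    [DataSource("Microsoft.VisualStudio.TestTools.DataSource.CSV",
        "|DataDirectory|\\OrderService.csv", "OrderService#csv",
        DataAccessMethod.Sequential)]
    [TestMethod]
    public void TestMethod1()
    {
        // Runs once per data row; the CSV header supplies the column names.
        Debug.WriteLine(TestContext.DataRow["ID"]);
    }
}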
I'm new to Code Contracts. I downloaded the latest build of the Code Contracts project (1.4.40314.1) and started to implement it in my project. When I enabled 'Runtime Checking' through the Code Contracts tab in VS2010, I got this error:
Error 1 The command ""C:\Program Files (x86)\Microsoft\Contracts\Bin\ccrewrite" "#Application1ccrewrite.rsp"" exited with code -1.
every time I build the project. Please help.
Now it's a major problem for me.
Every project using Code Contracts shows the same error in the VS2010 Error List, and the output window reports that 'Application1ccrewrite.rsp' was not found, but it is there.
I have tried everything. I installed both versions (Pro, Std), but the problem persists. Please help!
I had this problem as well. In my case the problem was that ccrewrite cannot work with files in a network folder but requires the project to be on your local hard disk.
I had this problem. The assembly name and default namespace of the class library causing the problem were the same as an existing DLL in the destination folder. I had been refactoring my code, and whilst the namespaces in the CS files had all been changed to namespace2, the default namespace in the project properties was still namespace1.
When I corrected this, the files all built successfully.
Sometimes you can get this when your solution path is too long, especially with many projects.
Try moving to c:\temp and building it, it might fix it (although of course, this might not be a solution if you need it in the folder it currently is).
I noticed this bug in earlier Code Contracts versions; it may be fixed by now.
I don't know if you had the same problem as me, but I also saw this error. In my case, I had a method with a switch statement, and depending on the branch taken, different requirements applied:
static ITransaction CreateTransaction(
    String transType,
    MyType1 parm1,
    /* Other params unimportant to this example */
    String parm5)
{
    switch (transType) {
        case Transaction.Type.SOME_TRANSFER:
            Contract.Requires<ArgumentNullException>(parm1.Account != null, "Account cannot be null.");
            Contract.Requires<ArgumentException>(!String.IsNullOrWhiteSpace(parm5), "parm5 cannot be null or empty.");
            // Create instance
            return someInst;
        case Transaction.Type.SOME_OTHER_TRANSFER:
            Contract.Requires<ArgumentException>(!String.IsNullOrWhiteSpace(parm1.Type), "Type cannot be null or empty.");
            Contract.Requires<ArgumentException>(!String.IsNullOrWhiteSpace(parm1.Number), "Number cannot be null or empty.");
            // Create instance
            return someInst;
        /* Other cases */
        default:
            throw new ApplicationException("Invalid or unknown transaction type provided.");
    }
}
This was giving me the error you noted in the Errors List when I tried to build. In the output window, I was getting this:
EXEC : Reference Assembly Generator warning : Something is wrong with contract number 1 in the method 'TerraCognita.LoanExpress.Domain.Loan.CreateLoanTransaction'
AsmMeta failed with uncaught exception: Operation is not valid due to the current state of the object.
I pushed each branch into a method of its own, making Contract.Requires the first lines of code in each method, and I no longer had a compilation problem. It appears that Contract.Requires must be the first lines of code in a method - which makes sense, since they are intended to be used to define pre-conditions.
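For reference, here is a minimal sketch of that refactoring (the factory method names are illustrative, not from the original project):

static ITransaction CreateTransaction(
    String transType,
    MyType1 parm1,
    /* Other params unimportant to this example */
    String parm5)
{
    switch (transType) {
        case Transaction.Type.SOME_TRANSFER:
            return CreateSomeTransfer(parm1, parm5);
        case Transaction.Type.SOME_OTHER_TRANSFER:
            return CreateSomeOtherTransfer(parm1);
        /* Other cases */
        default:
            throw new ApplicationException("Invalid or unknown transaction type provided.");
    }
}

static ITransaction CreateSomeTransfer(MyType1 parm1, String parm5)
{
    // The contracts are now the first statements executed in the method.
    Contract.Requires<ArgumentNullException>(parm1.Account != null, "Account cannot be null.");
    Contract.Requires<ArgumentException>(!String.IsNullOrWhiteSpace(parm5), "parm5 cannot be null or empty.");
    // Create instance
    return someInst;
}

static ITransaction CreateSomeOtherTransfer(MyType1 parm1)
{
    // The contracts are now the first statements executed in the method.
    Contract.Requires<ArgumentException>(!String.IsNullOrWhiteSpace(parm1.Type), "Type cannot be null or empty.");
    Contract.Requires<ArgumentException>(!String.IsNullOrWhiteSpace(parm1.Number), "Number cannot be null or empty.");
    // Create instance
    return someInst;
}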
Hope this helps.
The solution is to put the pre- and post-conditions in the first lines of the method. ccrewrite does not accept pre- and post-conditions that come after other statements.