Auto Generated Schema only works when done in Memory - c#

I generated a JSON Schema from an existing class:
JSchemaGenerator generator = new JSchemaGenerator();
JSchema schema = generator.Generate(typeof(Client));
The generated schema validates fine. However, I need to add dependencies (which you can't express on the class), so I copied the schema output to a file. The file validates fine on http://www.jsonschemavalidator.net/. However, when I try to load it using the following:
using (StreamReader file = File.OpenText("c:\\myJson.json"))
{
    file.BaseStream.Position = 0;
    using (JsonTextReader reader = new JsonTextReader(file))
    {
        JSchema schema2 = JSchema.Load(reader);
    }
}
I will always get errors on any internal references in the file:
"Contact": {"$ref": "#/definitions/Contact"},
An exception of type 'Newtonsoft.Json.Schema.JSchemaReaderException'
occurred in Newtonsoft.Json.Schema.dll but was not handled in user code
Additional information: Could not resolve schema reference
'#/definitions/Contact'. Path 'definitions.Contact', line 120, position 20.
Why would this be fine if done in memory, but if loaded from a file will fail?

So I found the problem. While you can generate your schema from your existing models, it may not always do so correctly.
We found that some of the referenced classes were generated with an extra "Item" layer in them (not sure why). Once we removed the "Item" reference at the end of the $ref path, it worked fine. The generator also produced a second "Item" layer inside the object definition itself, which I had to remove before validation would pass for that object.
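For reference, here is a minimal sketch of the shape that loads cleanly: the $ref path must match the actual location of the definition, with no extra layer in between. The property names below are illustrative, not from the original project.

```csharp
using System;
using Newtonsoft.Json.Schema;

class Program
{
    static void Main()
    {
        // A schema whose internal $ref matches the path of its definition exactly.
        string schemaJson = @"{
          ""type"": ""object"",
          ""properties"": {
            ""Contact"": { ""$ref"": ""#/definitions/Contact"" }
          },
          ""definitions"": {
            ""Contact"": {
              ""type"": ""object"",
              ""properties"": { ""Name"": { ""type"": ""string"" } }
            }
          }
        }";

        // JSchema.Parse resolves '#/definitions/Contact' against the 'definitions'
        // object above; if the generator emitted an extra layer (for example
        // '#/definitions/Contact/Item'), this is where JSchemaReaderException
        // would be thrown.
        JSchema schema = JSchema.Parse(schemaJson);
        Console.WriteLine(schema.Properties.Count);
    }
}
```

The same rule applies when loading from a file with JSchema.Load: the reference is resolved against the document being read, not against the in-memory type model.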


Dynamically compiled project losing resources

I need to compile the source code of a big project dynamically; the output type can be a Windows Application or a Class Library.
The code executes nicely and it's possible to produce .dll or .exe files, but the problem is that when I try to make an .exe file, it loses resources such as the project icon. The result file doesn't include the assembly information either.
Is there any way to solve this? (The expected result should be the same as manually running Build on the project file in Visual Studio 2015.)
Thank you!
var workspace = MSBuildWorkspace.Create();
// Locating the project file that is a WindowsApplication
var project = workspace.OpenProjectAsync(@"C:\RoslynTestProjectExe\RoslynTestProjectExe.csproj").Result;
var metadataReferences = project.MetadataReferences;
// Removing all references
foreach (var reference in metadataReferences)
{
    project = project.RemoveMetadataReference(reference);
}
// Getting the new path of the dlls' location and adding them to the project
var param = CreateParamString(); // my own function that returns a list of references
foreach (var par in param)
{
    project = project.AddMetadataReference(MetadataReference.CreateFromFile(par));
}
// Compiling
var projectCompilation = project.GetCompilationAsync().Result;
using (var stream = new MemoryStream())
{
    var result = projectCompilation.Emit(stream);
    if (result.Success)
    {
        // Getting the result: writing the exe file
        using (var file = File.Create(Path.Combine(_buildPath, fileName)))
        {
            stream.Seek(0, SeekOrigin.Begin);
            stream.CopyTo(file);
        }
    }
}
We never really designed the workspace API to include all the information you need to emit like this; in particular when you're calling Emit there's an EmitOptions you can pass that includes, amongst other things, resource information. But we don't expose that information since this scenario wasn't hugely considered. We've done some of the work in the past to enable this but ultimately never merged it. You might wish to consider filing a bug so we officially have the request somewhere.
So what can you do? I think there's a few options. You might consider not using Roslyn at all but rather modifying the project file and building that with the MSBuild APIs. Unfortunately I don't know what you're ultimately trying to achieve here (it would help if you mentioned it), but there's a lot more than just the compiler invocation that is involved in building a project. Changing references potentially changes other things too.
It'd also be possible, of course, to update MSBuildWorkspace yourself to pass this through. If you were to modify the Roslyn code, you'll see we implement a series of interfaces named "ICscHostObject#" (where # is a number) and we get passed the information from MSBuild to that. It looks like we already stash that in the command line arguments, so you might be able to pass that to our command line parser and get the data back you need that way.
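As a third, more manual option: Compilation.Emit has an overload that accepts Win32 resources and managed manifest resources, so if you can recover those from the project yourself you can pass them through. A sketch, continuing from the snippet in the question; the .res and .resources paths and the resource name are hypothetical:

```csharp
// Pass icon/version info (win32Resources) and embedded managed resources
// (manifestResources) through to Emit explicitly.
var projectCompilation = project.GetCompilationAsync().Result;

using (var win32Res = File.OpenRead(@"C:\RoslynTestProjectExe\app.res")) // compiled .res with icon + version info
using (var stream = new MemoryStream())
{
    var result = projectCompilation.Emit(
        peStream: stream,
        win32Resources: win32Res,
        manifestResources: new[]
        {
            new ResourceDescription(
                "RoslynTestProjectExe.Properties.Resources.resources",                   // hypothetical resource name
                () => File.OpenRead(@"C:\RoslynTestProjectExe\obj\Resources.resources"), // hypothetical path
                isPublic: true)
        });

    if (result.Success)
    {
        // write the stream out to disk as in the original code
    }
}
```

The catch, as noted above, is that MSBuildWorkspace does not hand you these inputs, so you have to locate the compiled .res and .resources files yourself (or produce them, e.g. with rc.exe/resgen.exe) before emitting.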

How to Deserialize and Serialize Build Process Parameters in TFS

I am trying to create a new build definition using the TFS 2013 API. The process template that I have to refer to contains several custom activities and parameters. While creating the build definition, some of the property values need to be updated dynamically, so I tried to deserialize the process parameters using the code below:
IDictionary<string, object> processParams = WorkflowHelpers.DeserializeProcessParameters(defaultTemplate.Parameters);
This code always throws the exception below:
An unhandled exception of type 'System.Xaml.XamlObjectWriterException' occurred in System.Xaml.dll
Additional information: No matching constructor found on type 'System.Activities.Activity'. You can use the Arguments or FactoryMethod directives to construct this type.
This is really frustrating and I can't get rid of this error yet.
I have also tried to deserialize the process parameters using below code:
using (StringReader stringReader = new StringReader(parameterValues))
{
    object obj = XamlServices.Load(
        ActivityXamlServices.CreateReader(
            new XamlXmlReader((TextReader)stringReader,
                new XamlXmlReaderSettings { LocalAssembly = System.Reflection.Assembly.GetExecutingAssembly() })));
}
This works, but when I serialize it again using the XamlServices.Save(XamlWriter writer, object instance) method, the parameters get changed, which makes them incompatible with the build workflow.
So, how can I update the build process parameters here? Can the WorkflowHelpers class be used in other ways? Or is there any other way to update the process parameters so that they get reflected in the build definition? Any help is highly appreciated.
This is working fine now. Below is the new code:
IDictionary<string, object> processParams = WorkflowHelpers.DeserializeProcessParameters(defaultTemplate.ProcessParameters);
Instead of defaultTemplate.Parameters, I needed to pass defaultTemplate.ProcessParameters.
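For completeness, the full round trip looks roughly like this. WorkflowHelpers also exposes a SerializeProcessParameters method for writing the dictionary back; the parameter key below is hypothetical, and the exact property you assign the serialized string to depends on your setup:

```csharp
// Deserialize the process parameters, update a value, then serialize the
// dictionary back onto the build definition and save it.
IDictionary<string, object> processParams =
    WorkflowHelpers.DeserializeProcessParameters(defaultTemplate.ProcessParameters);

processParams["MyCustomProperty"] = "new value"; // hypothetical parameter name

buildDefinition.ProcessParameters =
    WorkflowHelpers.SerializeProcessParameters(processParams);
buildDefinition.Save();
```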

SharpMap and Entity Framework: The invoked member is not supported in a dynamic assembly

I am using SharpMap inside a custom "map widget" component. To populate the map, I want to use Entity Framework, which is inside a separate DLL. This works fine if I create the map first and then get the data.
public void loadMap()
{
    var map = new MapWidget();                    // Create a new widget which internally uses SharpMap
    map.AddCountriesLayer();                      // Load the map background from a .shp file
    var data = new IPService().GetPointsForMap(); // Gets IP addresses from Entity Framework, inside "domain.dll"
    map.AddDots(data);                            // Add dots
}
However, if I get the data first, and then make the map, things break:
public void loadMap()
{
    var data = new IPService().GetPointsForMap(); // Accessing Entity Framework before SharpMap
    var map = new MapWidget();
    map.AddCountriesLayer();
    map.AddDots(data);
}
results in
System.NotSupportedException "The invoked member is not supported in a dynamic assembly."
at System.Reflection.Emit.InternalAssemblyBuilder.GetExportedTypes()
at GeoAPI.GeometryServiceProvider.ReflectInstance()
at GeoAPI.GeometryServiceProvider.get_Instance()
at SharpMap.Data.Providers.ShapeFile.set_SRID(Int32 value) in C:\dev\DLLs\SharpMap Source\Trunk\SharpMap\Data\Providers\ShapeFile.cs:line 859
at SharpMap.Data.Providers.ShapeFile.ParseProjection() in C:\dev\DLLs\SharpMap Source\Trunk\SharpMap\Data\Providers\ShapeFile.cs:line 978
at SharpMap.Data.Providers.ShapeFile..ctor(String filename, Boolean fileBasedIndex) in C:\dev\DLLs\SharpMap Source\Trunk\SharpMap\Data\Providers\ShapeFile.cs:line 302
at Dashboard.Widgets.MapWidget.AddCountriesLayer() in c:\dev\Dashboard\v1\Dashboard\Classes\Widgets\Generic\MapWidget.cs:line 86
What the heck is going on here? Why would using Entity Framework first break it?
To fix this issue, I added the following to Program.cs to force the widget to be loaded first:
static void Main()
{
    Application.EnableVisualStyles();
    Application.SetCompatibleTextRenderingDefault(false);
    // Hack to force SharpMap to register before Entity Framework
    var widget = new Widgets.MapWidget();
    widget.Update();
    Application.Run(new DashboardForm());
}
However, I don't like it - it seems pretty fragile and I don't like "coding by coincidence". Is there anything I can do to fix it?
Note:
I found this blog post: http://elegantcode.com/2010/01/28/the-entity-framework-and-the-the-invoked-member-is-not-supported-in-a-dynamic-assembly-exception/
I added the domain assembly to the connectionString
My project structure is this:
Dashboard.exe
    App.config contains the connection string
    References SharpMap
    References Domain.dll
    Contains MapWidget
Domain.dll
    Contains DomainModel and Services
    Uses the Entity Model for persistence
    App.config contains the connection string, the Entity Framework config section and the Entity Framework connection factory
So my questions are:
Why is it happening?
What can I do to stop it? (If nothing, is there a better place than Program.cs for the hacky code?)
Thanks for reading, please ask me to clarify if I haven't been clear.
I had a very similar problem, but I'm not using Entity Framework (I'm using NHibernate instead), so I've figured out that this may not be a proxy-object problem after all.
I also dislike "coding by coincidence", but I assume that by calling new MapWidget(), some initialization related to GeoAPI is performed, as GeoAPI is used internally by SharpMap.
In my case, I was not using the map directly. I was simply inserting some geo data into my database using NHibernate and I was getting the exact same stack trace, so I figured it might be the same problem.
As much as I hate it, I had something like this:
// my object to be persisted using NHibernate
var myObj = new MyObj();
// add polygon of type GeoAPI.Geometries.IGeometry
myObj.CoveredArea = myGeoFactory.CreatePolygonArea(/* ... */);
// use NHibernate to save my obj
sessionScope.Save(myObj); // <- throws NotSupportedException here
and it gave me the exact same exception you had. After changing it to
// Ignore this line: hack to initialize GeoAPI
new Map();
// my object to be persisted using NHibernate
var myObj = new MyObj();
// add polygon of type GeoAPI.Geometries.IGeometry
myObj.CoveredArea = myGeoFactory.CreatePolygonArea(/* ... */);
// use NHibernate to save my obj
sessionScope.Save(myObj);
it worked just fine. In my case I used new Map() instead of new MapWidget() because it is a WinForms application.
TL;DR: think of it as a hack that performs the necessary initialization.
This is most likely due to the dynamic proxies generated by EF. I'm not familiar with SharpMap, so I can't comment on what effect initializing it before EF has, but you should be able to avoid the exception by disabling proxy creation:
context.Configuration.ProxyCreationEnabled = false;
Note that this will disable lazy loading and change tracking, so first read this article carefully: http://msdn.microsoft.com/en-us/data/jj592886
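If you want proxy creation off for every context instance rather than at each call site, the usual place for that line is the context constructor. A sketch, where DomainContext is an illustrative name for the context in Domain.dll:

```csharp
using System.Data.Entity;

public class DomainContext : DbContext
{
    public DomainContext()
    {
        // Prevent EF from emitting dynamic proxy types at runtime; this avoids
        // the InternalAssemblyBuilder.GetExportedTypes() path that GeoAPI's
        // reflection scan trips over.
        this.Configuration.ProxyCreationEnabled = false;
    }
}
```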

How to mandate workflows to include local types assembly name?

I am using WF4 and have my types, activities and workflows in the same project. I am using ActivityXamlServices.Load(path) to load my workflow activity, and it throws the following exception:
The type 'InArgument(local:,,,,' of property 'InputArgs' could not be resolved
By looking at the solution to this problem at this post, I included assembly name manually in the workflow, and it all works.
Problem: Every time I make any change in the workflow, it rewrites the XAML and removes the assembly names I manually added.
Question: Is there a way to mandate including assembly names for local types as well?
The trick is to use a XamlXmlReaderSettings and specify what should be used as the local assembly reference.
var settings = new XamlXmlReaderSettings()
{
LocalAssembly = typeof(YourArgumentType).Assembly
};
var reader = new XamlXmlReader(path, settings);
Activity workflow = ActivityXamlServices.Load(reader);

Dynamic assembly generated on HttpWebRequest.GetResponse()

It appears our application has an assembly leak. I noticed that on any call where a web service call is invoked using the HttpWebRequest object, a dynamic assembly is loaded on the call to httpWebRequest.GetResponse().
I can see the assembly get loaded through the debugger ('w3wp.exe' (Managed): Loaded '7-6jav6v', No symbols loaded.), but I cannot figure out why this would occur.
Has anyone else experienced this before?
Edit:
To add clarifications to this question.
In C#, when you create an XmlSerializer, an assembly is generated to perform the serialization. This always occurs unless you use a tool to do it for you in advance. If you use the (Type type) or (Type type, string defaultNamespace) constructor, then only one assembly will be generated; if you use any other constructor, then a new assembly is generated for each serializer you construct.
This is not the case in the problem stated above.
There is a block of code in our codebase that manually makes a SOAP call and returns a string (the string is XML). Each time this block of code executes, a new assembly gets created. When examining one of these assemblies, this is referenced: "XmlSerializationWriter1.Microsoft.Xml.Serialization.GeneratedAssembly.XmlSerializationReader1.XmlSerializer1.ArrayOfObjectSerializer.ArrayOfObjectSerializer1.ArrayOfObjectSerializer2"
For a better understanding, the code block looks like this; when the last line executes, the assembly gets generated...multiple assemblies, one for each time this block runs.
HttpWebRequest oHttpWebRequest = (HttpWebRequest)WebRequest.Create("URL TO WEBSERVICE");
oHttpWebRequest.Timeout = ((1000 * 60) * 30);
oHttpWebRequest.Method = "POST";
oHttpWebRequest.ContentType = "text/xml";
oHttpWebRequest.Headers.Add("SOAPAction: http://www.tempuri.com/" + WebMethodName);
StreamWriter oStreamWriter = new StreamWriter(oHttpWebRequest.GetRequestStream());
string SoapRequest = @"<soap:Envelope xmlns:xsi=""http://www.w3.org/2001/XMLSchema-instance"" xmlns:xsd=""http://www.w3.org/2001/XMLSchema"" xmlns:soap=""http://schemas.xmlsoap.org/soap/envelope/""><soap:Body>";
SoapRequest = SoapRequest + HttpUtility.HtmlDecode(XmlHttpRequestData);
SoapRequest = SoapRequest + @"</soap:Body></soap:Envelope>";
oStreamWriter.Write(SoapRequest);
oStreamWriter.Close();
WebResponse oWebResponse = oHttpWebRequest.GetResponse();
According to your comment below Sky Sanders' answer, the generated assemblies are for XML serialization. Serialization assemblies are dynamically generated unless you pre-generate them using the XML Serializer Generator Tool (Sgen.exe). If you do that, the existing assemblies will be used and no new assembly will be generated.
Is the schema of the XML for the web services you call fixed, or dynamic? If you are calling arbitrary web services that each take arbitrary XML messages as input and return arbitrary XML messages as output, then the XmlSerializer is going to create a new assembly for each schema. Even if the messages essentially use the same schema but vary enough in structure, the XmlSerializer is only so capable: it's going to generate an assembly to handle each specific schema it identifies.
Like Thomas said, if your schema is fixed, use the XML Serializer Generator Tool to pre-generate your serialization assemblies.
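Independent of Sgen, the constructor behavior described above also suggests a workaround in code: cache the serializer instance yourself so the dynamic assembly is generated at most once. A sketch; the serialized type and root element name are illustrative:

```csharp
using System;
using System.Xml.Serialization;

public static class SerializerCache
{
    // Only the XmlSerializer(Type) and XmlSerializer(Type, string) constructors
    // cache their generated assembly internally; every other overload emits a
    // new dynamic assembly per construction. Holding one instance in a static
    // field sidesteps the leak for those other overloads as well.
    private static readonly XmlSerializer Cached =
        new XmlSerializer(typeof(string[]), new XmlRootAttribute("ArrayOfObject"));

    public static XmlSerializer Get() => Cached;
}
```

Every caller then reuses the same instance instead of constructing a fresh serializer (and a fresh assembly) per request.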
