Generating mapping views from code - EF6 - C#

https://msdn.microsoft.com/en-us/data/dn469601.aspx
I am trying to implement the strategy mentioned in the linked article in my huge codebase (over 500 entities) to improve performance, but I am stuck on the following issue.
System.Data.Entity.Core.EntityCommandCompilationException occurred
HResult=0x8013193B Message=An error occurred while preparing the
command definition. See the inner exception for details.
Source= StackTrace:
Inner Exception 1: MappingException: The current model no longer
matches the model used to pre-generate the mapping views, as indicated
by the
ViewsForBaseEntitySets3193163ce55837363333438629c877839ae9e7b7494500b6fd275844cda6d343.MappingHashValue
property. Pre-generated mapping views must be either regenerated using
the current model or removed if mapping views generated at runtime
should be used instead. See
http://go.microsoft.com/fwlink/?LinkId=318050 for more information on
Entity Framework mapping views.
Here's what I have tried. The original article leaves a few gaps about how this needs to be implemented, and one of those gaps may be where I went wrong.
Step 1: I created a class that extends DbMappingViewCache.
public class EFDbMappingViewCache : DbMappingViewCache
{
    protected static string _mappingHashValue = String.Empty;

    public override string MappingHashValue
    {
        get { return GetCachedHashValue(); }
    }

    public override DbMappingView GetView(EntitySetBase extent)
    {
        if (extent == null)
        {
            throw new ArgumentNullException("extent");
        }

        Dictionary<string, string> dict = GetMappedViewFromCache();
        if (dict.ContainsKey(extent.Name))
        {
            return new DbMappingView(dict[extent.Name]);
        }
        return null;
    }

    public static string GetCachedHashValue()
    {
        string cachedHash;
        string path = HttpContext.Current.Server.MapPath(@"~\EFCache\MappingHashValue.txt");
        if (!File.Exists(path))
        {
            File.Create(path).Dispose();
        }
        using (var streamReader = new StreamReader(path, Encoding.UTF8))
        {
            cachedHash = streamReader.ReadToEnd();
        }
        return cachedHash;
    }

    public static void UpdateHashInCache(string hashValue)
    {
        string path = HttpContext.Current.Server.MapPath(@"~\EFCache\MappingHashValue.txt");
        using (var streamWriter = new StreamWriter(path, false))
        {
            streamWriter.Write(hashValue);
        }
    }

    private static void UpdateMappedViewInCache(Dictionary<EntitySetBase, DbMappingView> dict)
    {
        string path = HttpContext.Current.Server.MapPath(@"~\EFCache\MappingView.json");
        Dictionary<string, string> stringDict = new Dictionary<string, string>();
        foreach (var entry in dict)
        {
            stringDict[entry.Key.Name] = entry.Value.EntitySql.ToString();
        }
        var json = new JavaScriptSerializer().Serialize(stringDict);
        using (var streamWriter = new StreamWriter(path, false))
        {
            streamWriter.Write(json);
        }
    }

    private static Dictionary<string, string> GetMappedViewFromCache()
    {
        string path = HttpContext.Current.Server.MapPath(@"~\EFCache\MappingView.json");
        var json = String.Empty;
        using (var streamReader = new StreamReader(path, Encoding.UTF8))
        {
            json = streamReader.ReadToEnd();
        }

        Dictionary<string, string> mappedViewDict = new Dictionary<string, string>();
        if (!String.IsNullOrEmpty(json))
        {
            var ser = new System.Web.Script.Serialization.JavaScriptSerializer();
            mappedViewDict = ser.Deserialize<Dictionary<string, string>>(json);
        }
        return mappedViewDict;
    }

    public static void CheckAndUpdateEFViewCache()
    {
        using (var ctx = new CascadeTranscationsDbContext(DBHelper.GetConnString()))
        {
            var objectContext = ((IObjectContextAdapter)ctx).ObjectContext;
            var mappingCollection = (StorageMappingItemCollection)objectContext.MetadataWorkspace
                .GetItemCollection(DataSpace.CSSpace);

            string computedHashValue = mappingCollection.ComputeMappingHashValue();
            string currentHashValue = GetCachedHashValue();
            if (computedHashValue != currentHashValue)
            {
                UpdateHashInCache(computedHashValue);
                IList<EdmSchemaError> errors = new List<EdmSchemaError>();
                Dictionary<EntitySetBase, DbMappingView> result = mappingCollection.GenerateViews(errors);
                UpdateMappedViewInCache(result);
            }
        }
    }
}
I store the hash value and the generated mapping views in files and read them back in the GetView() method.
I have exposed a public CheckAndUpdateEFViewCache() method which, when called, generates the mapping views and stores them in the file.
Step 2: Call CheckAndUpdateEFViewCache() from the Application_Start() method in Global.asax.
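Roughly, the hookup looks like this (a minimal sketch; a standard ASP.NET Global.asax code-behind is assumed):
using System;

public class Global : System.Web.HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        // Regenerate the cached mapping views and hash if the model has changed.
        EFDbMappingViewCache.CheckAndUpdateEFViewCache();
    }
}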
Step 3: Add the assembly-level attribute in the file where the context is first used.
[assembly: DbMappingViewCacheType(typeof(Models.Entities.MyDBContext), typeof(EFDbMappingViewCache))]
I am really not sure where this assembly-level attribute actually needs to go; there is no information about it in the linked article. There is a really good chance that Step 3 is where I have gone wrong.
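My current guess (the article does not say) is that the attribute belongs at file scope, after the using directives and outside any namespace, e.g. in Properties/AssemblyInfo.cs:
using System.Data.Entity.Infrastructure.MappingViews;

// Assumption: any compiled file should work, as long as the attribute sits outside a namespace.
[assembly: DbMappingViewCacheType(
    typeof(Models.Entities.MyDBContext),
    typeof(EFDbMappingViewCache))]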
Can someone help with the problem?

The issue I faced turned out to be that I already had a mapping-views file generated using EF Tools, and it was already registered. When the configuration I wrote attempted to register a view cache a second time, EF threw the error.
Further, I want to add that the cached DB model store improved performance severalfold, and I ended up using just that in my project.
Link to Cached DB model store usage
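For anyone looking for that approach: if I remember correctly, EF 6.2+ exposes it through DbConfiguration.SetModelStore with a DefaultDbModelStore. A minimal sketch, assuming EF 6.2 or later and a writable cache directory:
using System.Data.Entity;
using System.Data.Entity.Infrastructure;

public class CachedModelConfiguration : DbConfiguration
{
    public CachedModelConfiguration()
    {
        // Persists the built model (EDMX) to disk and reloads it on later starts,
        // skipping the expensive OnModelCreating/model-building step.
        // The directory is an assumption - any writable folder works.
        SetModelStore(new DefaultDbModelStore(@"C:\EFCache"));
    }
}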

Related

IWriterConfiguration.ReferenceHeaderPrefix equivalent in newer versions of CsvHelper

What is the equivalent of csvWriter.Configuration.ReferenceHeaderPrefix in the newer version of CsvHelper? Trying this
csvWriter.Configuration.ReferenceHeaderPrefix = (memberType, memberName) => $"{memberName}_";
but it does not let me, because ReferenceHeaderPrefix has only a getter since version 20.0.0.
The usual workflow is to construct an instance of the CsvConfiguration class and pass that into the constructor for the reader or writer. And CsvConfiguration.ReferenceHeaderPrefix does have a set method.
var config = new CsvConfiguration(CultureInfo.InvariantCulture)
{
    ReferenceHeaderPrefix = (args) => $"{args.MemberName}_",
};

using (var writer = new StreamWriter("path\\to\\file.csv"))
using (var csv = new CsvWriter(writer, config))
{
    // Write your CSV records here.
    csv.WriteRecords(records);
}
Note also that, in the current version (27.2.0), the ReferenceHeaderPrefix delegate takes a single ReferenceHeaderPrefixArgs argument, which contains MemberType and MemberName fields:
public readonly struct ReferenceHeaderPrefixArgs
{
    public readonly Type MemberType;
    public readonly string MemberName;

    public ReferenceHeaderPrefixArgs(Type memberType, string memberName)
    {
        MemberType = memberType;
        MemberName = memberName;
    }
}
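To illustrate what the prefix applies to, here is a hypothetical pair of classes with a reference member; with the configuration above, the nested headers should come out prefixed with the member name:
public class Address
{
    public string Street { get; set; }
    public string City { get; set; }
}

public class Customer
{
    public string Name { get; set; }
    // Reference member: with ReferenceHeaderPrefix = args => $"{args.MemberName}_",
    // its columns should be written as "HomeAddress_Street" and "HomeAddress_City".
    public Address HomeAddress { get; set; }
}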

Fast way to find key-value pair in C#

This sounds pretty trivial, but I'm really struggling:
How do I import the data and then match a value to a given key in an elegant, fast way?
It's just telephone area codes: finding the matching prefix for a given zone (no multi-user access). I have the data as CSV, but SQLite would be fine too. An SQLite connector? Or a Dictionary? I don't know ...
Thank you in advance!
Nico
Simple as that. It's a console application.
It's not pretty with the global creation and initialization, but I didn't manage to do it across multiple source files using only a public class (of type Dictionary, returning this).
Main .cs file:
static class GlobalVar
{
    public static Dictionary<string, string> areaCodesDict = new Dictionary<string, string>();
}
Second .cs file:
public class AreaCodes
{
    public static void ParseCsv()
    {
        var path = ConfigurationManager.AppSettings["areaCodesCsv"];
        using (var strReader = new StreamReader(path))
        {
            while (!strReader.EndOfStream)
            {
                var line = strReader.ReadLine();
                if (line == null) { continue; }
                var csv = line.Split(Convert.ToChar(ConfigurationManager.AppSettings["areaCodesCsvDelim"]));
                // areaCodesDict.Add(key, value)
                GlobalVar.areaCodesDict.Add(csv[0], csv[1]);
            }
        }
    }
}
Example usage in Main.cs file again:
if (regexMatch.Success)
{
    foreach (KeyValuePair<string, string> pair in GlobalVar.areaCodesDict)
    {
        if (destNumber.StartsWith(pair.Value))
        {
            destNumber = destNumber.Replace(pair.Value, pair.Key);
        }
    }
}
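One note on the lookup itself: the foreach above scans every dictionary entry for each number, which gives up the dictionary's O(1) lookup. If the dictionary were keyed by the prefix instead, a helper like the following (a hypothetical sketch, not part of the code above) could probe progressively shorter prefixes of the number:
using System;
using System.Collections.Generic;

public static class AreaCodeLookup
{
    // Hypothetical alternative: key the dictionary by prefix (area code) so each number
    // needs at most a handful of TryGetValue probes instead of a scan over all entries.
    // Assumes area codes are at most 5 digits long.
    public static string FindZoneByLongestPrefix(Dictionary<string, string> prefixToZone, string number)
    {
        int maxLen = Math.Min(5, number.Length);
        for (int len = maxLen; len > 0; len--)
        {
            string prefix = number.Substring(0, len);
            if (prefixToZone.TryGetValue(prefix, out string zone))
            {
                return zone; // longest matching prefix wins
            }
        }
        return null; // no prefix matched
    }
}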

Resource (.resx) data is not saved

I can't figure out what the problem is. Please check my code fragments. Each time I add resource data, it clears the previous data and writes only the new records to the .resx file.
For example, Applications.resx has a "MyApp1" key with the value "MyApp1Path". If I then add a "MyApp2" key with the value "MyApp2Path", I notice that {"MyApp1", "MyApp1Path"} no longer exists.
//Adding Application in Applications List
ResourceHelper.AddResource("Applications", _appName, _appPath);
Here is the ResourceHelper class:
public class ResourceHelper
{
    public static void AddResource(string resxFileName, string name, string value)
    {
        using (var resx = new ResXResourceWriter(String.Format(@".\Resources\{0}.resx", resxFileName)))
        {
            resx.AddResource(name, value);
        }
    }
}
Yes, this is expected: ResXResourceWriter just writes the nodes you add to it, it doesn't append to the existing file.
However, you could just read the existing nodes out and add them again:
public static void AddResource(string resxFileName, string name, object value)
{
    var fileName = $@".\Resources\{resxFileName}.resx";
    using (var writer = new ResXResourceWriter(fileName))
    {
        if (File.Exists(fileName))
        {
            using (var reader = new ResXResourceReader(fileName))
            {
                var node = reader.GetEnumerator();
                while (node.MoveNext())
                {
                    writer.AddResource(node.Key.ToString(), node.Value);
                }
            }
        }
        writer.AddResource(name, value);
    }
}
Disclaimer: untested and probably needs error checking.
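A quick way to sanity-check the round trip is to read the file back with ResXResourceSet (an untested sketch; ResXResourceSet is in the System.Resources namespace in System.Windows.Forms.dll, and the path mirrors the helper above):
ResourceHelper.AddResource("Applications", "MyApp2", "MyApp2Path");

using (var resxSet = new System.Resources.ResXResourceSet(@".\Resources\Applications.resx"))
{
    // Both the previously stored and the newly added entries should now be present.
    Console.WriteLine(resxSet.GetString("MyApp1")); // MyApp1Path
    Console.WriteLine(resxSet.GetString("MyApp2")); // MyApp2Path
}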

Writing compact xml with XmlDictionaryWriter.CreateBinaryWriter and a XmlDictionary

I want to write an XML document to disk in a compact format. To this end, I use the .NET Framework method XmlDictionaryWriter.CreateBinaryWriter(Stream stream, IXmlDictionary dictionary).
This method writes a custom compact binary xml representation, that can later be read by XmlDictionaryWriter.CreateBinaryReader. The method accepts an XmlDictionary that can contain common strings, so that those strings do not have to be printed in the output each time. Instead of the string, the dictionary index will be printed in the file. CreateBinaryReader can later use the same dictionary to reverse the process.
However the dictionary I pass is apparently not used. Consider this code:
using System.IO;
using System.Xml;
using System.Xml.Linq;

class Program
{
    public static void Main()
    {
        XmlDictionary dict = new XmlDictionary();
        dict.Add("myLongRoot");
        dict.Add("myLongAttribute");
        dict.Add("myLongValue");
        dict.Add("myLongChild");
        dict.Add("myLongText");

        XDocument xdoc = new XDocument();
        xdoc.Add(new XElement("myLongRoot",
            new XAttribute("myLongAttribute", "myLongValue"),
            new XElement("myLongChild", "myLongText"),
            new XElement("myLongChild", "myLongText"),
            new XElement("myLongChild", "myLongText")
        ));

        using (Stream stream = File.Create("binaryXml.txt"))
        using (var writer = XmlDictionaryWriter.CreateBinaryWriter(stream, dict))
        {
            xdoc.WriteTo(writer);
        }
    }
}
The produced output is this (binary control characters not shown)
#
myLongRootmyLongAttribute˜myLongValue#myLongChild™
myLongText#myLongChild™
myLongText#myLongChild™
myLongText
So apparently the XmlDictionary has not been used. All strings appear in their entirety in the output, even multiple times.
This is not a problem limited to XDocument. In the above minimal example I used a XDocument to demonstrate the problem, but originally I stumbled upon this while using XmlDictionaryWriter in conjunction with a DataContractSerializer, as it is commonly used. The results were the same:
[Serializable]
public class myLongChild
{
    public double myLongText = 0;
}
...
using (Stream stream = File.Create("binaryXml.txt"))
using (var writer = XmlDictionaryWriter.CreateBinaryWriter(stream, dict))
{
    var dcs = new DataContractSerializer(typeof(myLongChild));
    dcs.WriteObject(writer, new myLongChild());
}
The resulting output did not use my XmlDictionary.
How can I get XmlDictionaryWriter to use the supplied XmlDictionary?
Or have I misunderstood how this works?
With the DataContractSerializer approach, I tried debugging the .NET Framework code (Visual Studio > Options > Debugging > Enable .NET Framework source stepping). Apparently the writer does attempt to look up each of the above strings in the dictionary, as expected. However, the lookup fails in line 356 of XmlBinaryWriter.cs, for reasons that are not clear to me.
Alternatives I have considered:
There is an overload of XmlDictionaryWriter.CreateBinaryWriter that also accepts an XmlBinaryWriterSession. The writer then adds any new strings it encounters to the session dictionary. However, I want to use only a static dictionary, known beforehand, for both reading and writing.
I could wrap the whole thing into a GzipStream and let the compression take care of the multiple instances of strings. However, this would not compress the first instance of each string, and seems like a clumsy workaround overall.
Yes, there is a misunderstanding. XmlDictionaryWriter is primarily used for serialization of objects, and it is a child class of XmlWriter. XDocument.WriteTo(XmlWriter something) takes an XmlWriter as its argument. The call to XmlDictionaryWriter.CreateBinaryWriter will internally create an instance of System.Xml.XmlBinaryNodeWriter. This class has methods both for "regular" writing:
// override of XmlWriter
public override void WriteStartElement(string prefix, string localName)
{
    // plain old "xml" for me please
}
and for the dictionary-based approach:
// override of XmlDictionaryWriter
public override void WriteStartElement(string prefix, XmlDictionaryString localName)
{
    // I will use dictionary to hash element names to get shorter output
}
The latter is mostly used if you serialize an object via DataContractSerializer (notice that its WriteObject method has overloads taking both XmlDictionaryWriter and XmlWriter), while XDocument takes just an XmlWriter.
As for your problem - if I were you I'd make my own XmlWriter:
class CustomXmlWriter : XmlWriter
{
    private readonly XmlDictionaryWriter _writer;

    public CustomXmlWriter(XmlDictionaryWriter writer)
    {
        _writer = writer;
    }

    // override XmlWriter methods to use the dictionary-based approach instead
}
UPDATE (based on your comment)
If you indeed use DataContractSerializer, you have a few mistakes in your code.
1) POCO classes have to be decorated with the [DataContract] and [DataMember] attributes, and the serialized value should be a property, not a field; also set the namespace to an empty value, or you'll have to deal with namespaces in your dictionary as well. Like:
namespace XmlStuff
{
    [DataContract(Namespace = "")]
    public class myLongChild
    {
        [DataMember]
        public double myLongText { get; set; }
    }

    [DataContract(Namespace = "")]
    public class myLongRoot
    {
        [DataMember]
        public IList<myLongChild> Items { get; set; }
    }
}
2) Provide an instance of a session as well; for a null session the dictionary writer uses the default (XmlWriter-like) implementation:
// order matters - add new items only at the bottom
static readonly string[] s_Terms = new string[]
{
    "myLongRoot", "myLongChild", "myLongText",
    "http://www.w3.org/2001/XMLSchema-instance", "Items"
};

public class CustomXmlBinaryWriterSession : XmlBinaryWriterSession
{
    private bool m_Lock;

    public void Lock() { m_Lock = true; }

    public override bool TryAdd(XmlDictionaryString value, out int key)
    {
        if (m_Lock)
        {
            key = -1;
            return false;
        }
        return base.TryAdd(value, out key);
    }
}

static void InitializeWriter(out XmlDictionary dict, out XmlBinaryWriterSession session)
{
    dict = new XmlDictionary();
    var result = new CustomXmlBinaryWriterSession();
    var key = 0;
    foreach (var term in s_Terms)
    {
        result.TryAdd(dict.Add(term), out key);
    }
    result.Lock();
    session = result;
}

static void InitializeReader(out XmlDictionary dict, out XmlBinaryReaderSession session)
{
    dict = new XmlDictionary();
    var result = new XmlBinaryReaderSession();
    for (var i = 0; i < s_Terms.Length; i++)
    {
        result.Add(i, s_Terms[i]);
    }
    session = result;
}

static void Main(string[] args)
{
    XmlDictionary dict;
    XmlBinaryWriterSession session;
    InitializeWriter(out dict, out session);

    var root = new myLongRoot { Items = new List<myLongChild>() };
    root.Items.Add(new myLongChild { myLongText = 24 });
    root.Items.Add(new myLongChild { myLongText = 25 });
    root.Items.Add(new myLongChild { myLongText = 27 });

    byte[] buffer;
    using (var stream = new MemoryStream())
    {
        using (var writer = XmlDictionaryWriter.CreateBinaryWriter(stream, dict, session))
        {
            var dcs = new DataContractSerializer(typeof(myLongRoot));
            dcs.WriteObject(writer, root);
        }
        buffer = stream.ToArray();
    }

    XmlBinaryReaderSession readerSession;
    InitializeReader(out dict, out readerSession);
    using (var stream = new MemoryStream(buffer, false))
    {
        using (var reader = XmlDictionaryReader.CreateBinaryReader(stream, dict, new XmlDictionaryReaderQuotas(), readerSession))
        {
            var dcs = new DataContractSerializer(typeof(myLongRoot));
            var rootCopy = dcs.ReadObject(reader);
        }
    }
}

Parsing JSON page

I've been trying to figure out how to parse out "in_reply_to_status_id_str -> id_str" from the Twitter search page:
https://twitter.com/phoenix_search.phoenix?q=hello&headers%5BX-Twitter-Polling%5D=true&headers%5BX-PHX%5D=true&since_id=203194965877194752&include_entities=1&include_available_features=1&contributor_details=true&mode=relevance&query_source=unknown
Could anyone write a small example showing how it can be done?
Using Json.Net
dynamic jObj = JsonConvert.DeserializeObject(new WebClient().DownloadString("your url"));
foreach (var item in jObj.statuses)
{
    Console.WriteLine("{0} {1}", item.in_reply_to_status_id_str, item.id_str);
}
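If you would rather avoid dynamic, the same lookup can be done with JObject (a sketch using the same Json.Net package; "your url" is the placeholder from above):
using System;
using System.Net;
using Newtonsoft.Json.Linq;

class Program
{
    static void Main()
    {
        var json = new WebClient().DownloadString("your url");
        var jObj = JObject.Parse(json);

        // Each status carries both ids as strings; print them side by side.
        foreach (var item in jObj["statuses"])
        {
            Console.WriteLine("{0} {1}", (string)item["in_reply_to_status_id_str"], (string)item["id_str"]);
        }
    }
}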
So here is where I pull my JSON; this is where my list gets made, which you already have:
public JsonResult AllStatuses() // from the JSON called in the _client view
{
    var buildStatuses = new List<BuildStatus>();
    var projects = Client.AllProjects();
    foreach (var project in projects)
    {
        try
        {
            var buildConfigs = Client.BuildConfigsByProjectId(project.Id);
            foreach (var buildConfig in buildConfigs)
            {
                var b = new BuildStatus();
                var build = Client.LastBuildByBuildConfigId(buildConfig.Id);
                var status = build.Status; // Used to loop through BuildConfigIDs to find which is a FAILURE, SUCCESS, ERROR, or UNKNOWN
                var change = Client.LastChangeDetailByBuildConfigId(buildConfig.Id); // Provides the change ID
                var changeDetail = Client.ChangeDetailsByChangeId(change.Id); // Provides the username, this one populates the usernames
                if (changeDetail != null)
                    b.user = changeDetail.Username;
                b.id = buildConfig.Id.ToString();
                // If the date isn't null place the start date in long format
                if (build.StartDate != null)
                    b.date = build.StartDate.ToString();
                // If block; set the status based on the BuildConfigID from the var status
                if (status.Contains("FAILURE"))
                {
                    b.status = "FAILURE";
                }
                else if (status.Contains("SUCCESS"))
                {
                    b.status = "SUCCESS";
                }
                else if (status.Contains("ERROR"))
                {
                    b.status = "ERROR";
                }
                else
                {
                    b.status = "UNKNOWN";
                }
                buildStatuses.Add(b);
            }
        }
        catch { }
    }
    var query = buildStatuses.OrderBy(x => x.status); // Create a sorted list from Error - Unknown
    return Json(query, JsonRequestBehavior.AllowGet);
}
Then I copied the JsonConverter I linked you to.
On my website I finally pulled apart the list of JSON with:
public JsonResult AllStatuses() // from the JSON called in the _client view
{
    List<Client> clients = storeDB.Clients.Include("Projects").Include("Projects.Builds").ToList();
    var buildStatuses = new List<BuildStatus>();
    foreach (var client in clients)
    {
        // Network credentials
        // Used to get the Json Service request // URL here: client.ClientURL
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://localhost:81/Status/AllStatuses");
        var response = request.GetResponse();
        var reader = new StreamReader(response.GetResponseStream());
        var responseString = reader.ReadToEnd();

        var serializer = new JavaScriptSerializer();
        serializer.RegisterConverters((new[] { new DynamicJsonConverter() }));
        dynamic obj = serializer.Deserialize(responseString, typeof(object)) as dynamic;

        foreach (var objects in obj) // Pull apart the dynamic object
        {
            var id = objects.id;
            var status = objects.status;
            var date = objects.date;
            var user = objects.user;
            var bs = new BuildStatus();
            try
            {
                bs.status = status;
                bs.date = date;
                bs.id = id;
                bs.user = user;
            }
            catch { throw; }
            buildStatuses.Add(bs);
        }
    }
    return Json(buildStatuses, JsonRequestBehavior.AllowGet);
}
Go for a jQuery approach:
var obj = jQuery.parseJSON(jsonString);
alert(obj.in_reply_to_status_id_str.id_str);
You can use this JSON library to accomplish this.
You could also use the DataContractJsonSerializer class available in .NET once you add a reference to System.Runtime.Serialization.
All you need to do is create two DataContract classes. Something like:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.Serialization;
using System.Text;
namespace MyNamespace
{
    [DataContract]
    public class TwitterObject
    {
        [DataMember(Name = "statuses")]
        public TwitterStatus[] Statuses { get; set; }
    }

    [DataContract]
    public class TwitterStatus
    {
        [DataMember(Name = "in_reply_to_status_id_str")]
        public string InReplyToStatusIdStr { get; set; }

        [DataMember(Name = "id_str")]
        public string IdStr { get; set; }
    }
}
Then from any other method you wish, you just have to use the DataContractJsonSerializer to build your JSON into a .NET object:
DataContractJsonSerializer jsonSerializer = new DataContractJsonSerializer(typeof(TwitterObject));
// assume the twitterResponse is the JSON you receive
MemoryStream memoryStream = new MemoryStream(Encoding.ASCII.GetBytes(twitterResponse));
var twitterJson = jsonSerializer.ReadObject(memoryStream) as TwitterObject;
There may be some typos, but this should give you the idea. I'm currently working on an extensive synchronization between a server app and a website, and this is the method I currently use for JSON communication between the two. I've found that the combination of DataContracts and DataContractJsonSerializer is easier to use than third-party libraries.
