Trying to write a more robust TemplateEngine with replacements defined externally - C#

I am in the process of writing a TemplateEngine. The intention is that it will read in an HTML file containing placeholders, something along the lines of <p>Dear |*name|* you have won 1st prize in |*competition|* thanks for your entry on |*date|*</p>, where anything wrapped in |* ... |* needs to be replaced by data from a key/value dictionary.
I currently initialise a dictionary manually like this:
mergeData.Add("name", dataRow["UserName"].ToString());
mergeData.Add("competiton", dataRow["CompName"].ToString());
mergeData.Add("date", dataRow["EntryDate"].ToString());
templateEngine.Initialise(mergeData);
templateEngine.Run();
As you can see, it makes use of a few magic strings. What I would like to do is make it a bit more extensible, so that the placeholders and replacements can be defined externally. At the moment I am explicitly specifying in code which columns to use for the data. I think that maybe a DataTable would work for Initialise(), but I am not sure how to approach it. Any ideas / suggestions would be most welcome.
public class TemplateEngine
{
    private string _template;
    private bool _initialised = false;
    private Dictionary<string, string> _mergeData;

    private TemplateEngine(string templateString)
    {
        _template = templateString;
    }

    public void Initialise(Dictionary<string, string> mergeData)
    {
        if (mergeData == null)
        {
            _initialised = false;
            throw new ArgumentException("Must specify key value pairs to perform merge correctly");
        }
        _mergeData = mergeData;
        _initialised = true;
    }

    public string Run()
    {
        if (_initialised == false)
        {
            throw new Exception("Cannot run engine as mergeData dictionary is not initialised");
        }
        foreach (var kvp in _mergeData)
        {
            _template = _template.Replace("|*" + kvp.Key + "|*", kvp.Value);
        }
        return _template;
    }

    public static TemplateEngine FromFile(string filePath)
    {
        if (string.IsNullOrEmpty(filePath))
        {
            throw new ArgumentException("FilePath not specified, cannot instantiate TemplateEngine");
        }
        string html = System.IO.File.ReadAllText(filePath);
        var templateEngine = new TemplateEngine(html);
        return templateEngine;
    }
}
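One way to remove the magic strings would be to drive the merge data from the columns of the row itself, so the placeholder names are simply the column names. Below is a minimal sketch of that idea, not part of the original class: it assumes using System.Data; and that the |*placeholder|* names in the template match the DataTable column names exactly.

// Possible overload for TemplateEngine (illustrative sketch).
// Assumes: using System.Data; and that template placeholders match column names.
public void Initialise(DataRow dataRow)
{
    var mergeData = new Dictionary<string, string>();
    foreach (DataColumn column in dataRow.Table.Columns)
    {
        // Each column becomes a placeholder/value pair, so adding a new
        // placeholder only requires adding a matching column to the query.
        mergeData[column.ColumnName] = dataRow[column].ToString();
    }
    Initialise(mergeData);
}

With that in place the calling code reduces to templateEngine.Initialise(dataRow); templateEngine.Run();, and the placeholder/column names live only in the template and the query rather than in code.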

Related

Strategy pattern with dictionary lookup

I'm writing a validation engine.
I'm given an object payload containing a list of around 40 different properties, and every property undergoes different validation.
The validations include checking whether a field is a string and whether its length exceeds the permissible limit set by the database; there are conditions to check for null values and empty fields as well.
So, I thought of picking the strategy pattern.
Code:
interface IValidationEngine
{
    Error Validate(string propName, dynamic propValue);
}

public class StringLengthValidator : IValidationEngine
{
    static Dictionary<string, int> sLengths = new Dictionary<string, int>();

    public StringLengthValidator()
    {
        if (sLengths.Count == 0)
        {
            sLengths = new Dictionary<string, int>() { .... };
        }
    }

    public Error Validate(string name, dynamic value)
    {
        var err = default(Error);
        //logic here
        return err;
    }
}

public class MandatoryValidator : IValidationEngine
{
    public Error Validate(string name, dynamic value)
    {
        var err = default(Error);
        //logic here
        return err;
    }
}

public class MandatoryStringLengthValidator : IValidationEngine
{
    public Error Validate(string name, dynamic value)
    {
        var e = default(Error);
        if (value == null || (value.GetType() == typeof(string) && string.IsNullOrWhiteSpace(value)))
        {
            e = new Error()
            {
                //.. err info
            };
        }
        else
        {
            StringLengthValidator sl = new StringLengthValidator();
            e = sl.Validate(name, value);
        }
        return e;
    }
}
There is another class, which I named ValidationRouter. Its job is to route to the specific validator (think of this as the strategy selection, as per the pattern).
I wrote the initial router to hold a dictionary mapping keys to their respective validator objects, so that whatever comes in matches its key and the appropriate Validate() method is called on the dictionary value. But since every entry means creating a new object in the constructor, I think the class will take up a lot of memory.
So I added Lazy<T> around the object creation, and it ended up like this.
Here is the code for the router:
public class ValidationRouter
{
    public Dictionary<string, Lazy<IValidationEngine>> strategies = new Dictionary<string, Lazy<IValidationEngine>>();

    public ValidationRouter()
    {
        var mandatoryStringLength = new Lazy<IValidationEngine>(() => new MandatoryStringLengthValidator());
        var mandatory = new Lazy<IValidationEngine>(() => new MandatoryValidator());
        var stringLength = new Lazy<IValidationEngine>(() => new StringLengthValidator());

        strategies.Add("position", mandatoryStringLength);
        strategies.Add("pcode", stringLength);
        strategies.Add("username", mandatoryStringLength);
        strategies.Add("description", stringLength);
        strategies.Add("sourcename", stringLength);
        //OMG: 35 more fields to be added to route to specific routers
    }

    public Error Execute(string name, dynamic value)
    {
        //Find appropriate field and route to its strategy
        var lowered = name.ToLower();
        if (!strategies.ContainsKey(lowered)) return default(Error);
        return strategies[lowered].Value.Validate(name, value);
    }
}
As you can see, I need to add 35 more keys with their appropriate strategies in the constructor.
Is this approach correct, or is there a better way of routing to the specific validator algorithms?
I thought of having the object creation done by a factory pattern with Activator.CreateInstance, but I'm not sure how to achieve that, as each of my properties will have a different strategy.
Any ideas would be appreciated.
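One direction worth considering (a sketch only, not a drop-in replacement for the router above) is to keep a single shared Lazy instance per validator type and drive the field-to-validator mapping from data rather than code, so the constructor no longer grows with every new field. The validator names and constructor parameter below are illustrative assumptions:

public class ValidationRouter
{
    // One shared, lazily created validator instance per validator type.
    private static readonly Dictionary<string, Lazy<IValidationEngine>> validators =
        new Dictionary<string, Lazy<IValidationEngine>>(StringComparer.OrdinalIgnoreCase)
        {
            { "mandatoryStringLength", new Lazy<IValidationEngine>(() => new MandatoryStringLengthValidator()) },
            { "mandatory", new Lazy<IValidationEngine>(() => new MandatoryValidator()) },
            { "stringLength", new Lazy<IValidationEngine>(() => new StringLengthValidator()) }
        };

    // Property name -> validator name, supplied from outside (a config file,
    // a database table, attributes...) instead of being hard-coded here.
    private readonly Dictionary<string, string> fieldMap;

    public ValidationRouter(Dictionary<string, string> fieldMap)
    {
        this.fieldMap = new Dictionary<string, string>(fieldMap, StringComparer.OrdinalIgnoreCase);
    }

    public Error Execute(string name, dynamic value)
    {
        string validatorName;
        if (!fieldMap.TryGetValue(name, out validatorName)) return default(Error);
        return validators[validatorName].Value.Validate(name, value);
    }
}

The fieldMap could then be populated from JSON, XML, or a database table, so adding the remaining 35 fields becomes a data change rather than a code change.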

Dynamically change the type of a binary deserialized object

I have a question; I think it is basic knowledge, but I cannot find any source of information to fill my gap.
The situation looks like this: I am deserializing a couple of objects using a foreach. For example:
foreach (var property in _foundBackupProjectFiles)
{
    DeserializeDocumentSetFromBinary(property.Key);
}
Here _foundBackupProjectFiles is a Dictionary whose key is the name of the object and whose value is a list of serialized files (names + paths).
Each object goes into a different list, so for now I am using:
if (property == "Document")
{
var deserilized = (List<DocumentSet.Document>) bformatter.Deserialize(stream);
listdoc.AddRange(deserilized);
}
else if (property == "DocumentLookup")
{
var deserilized = (List<DocumentSet.DocumentLookup>) bformatter.Deserialize(stream);
listdoclookup.AddRange(deserilized);
}
else if (property == "DocumentText")
{
var deserilized = (List<DocumentSet.DocumentText>) bformatter.Deserialize(stream);
listdoctxt.AddRange(deserilized);
}
It works, but it looks pretty ugly and is hard to read. Is there an elegant way to delegate this part:
var deserialized = (List<DocumentSet.DocumentText>)bformatter.Deserialize(stream);
listdoctxt.AddRange(deserialized);
and add dynamic typing?
This is not a hell of a lot neater, but you could use a switch with an extension method to add the deserialized object to the appropriate list.
public class Class
{
    private List<Document> listdoc = new List<Document>();
    private List<DocumentLookup> listdoclookup = new List<DocumentLookup>();
    private List<DocumentText> listdoctxt = new List<DocumentText>();

    public void Deserialize(string property, Stream stream)
    {
        var bformatter = new BinaryFormatter();
        var des = bformatter.Deserialize(stream);
        switch (property)
        {
            case "Document":
                listdoc.AddToList(des);
                break;
            case "DocumentLookup":
                listdoclookup.AddToList(des);
                break;
            case "DocumentText":
                listdoctxt.AddToList(des);
                break;
            default:
                throw new ArgumentOutOfRangeException("property");
        }
    }
}

public static class ListDeserializeExts
{
    public static void AddToList<TValue>(this List<TValue> list, object des)
    {
        list.AddRange((IEnumerable<TValue>)des);
    }
}
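Another option along the same lines (a sketch using the same list fields and type names as above, not something from the original answer) is to replace the switch with a dictionary that maps each property name to a small "cast and append" action; adding a new document type then only means adding one dictionary entry. It assumes using System, System.Collections.Generic, System.IO and System.Runtime.Serialization.Formatters.Binary:

public class Class
{
    private List<Document> listdoc = new List<Document>();
    private List<DocumentLookup> listdoclookup = new List<DocumentLookup>();
    private List<DocumentText> listdoctxt = new List<DocumentText>();

    // Property name -> "cast and append" action, built once.
    private readonly Dictionary<string, Action<object>> adders;

    public Class()
    {
        adders = new Dictionary<string, Action<object>>
        {
            { "Document", des => listdoc.AddRange((IEnumerable<Document>)des) },
            { "DocumentLookup", des => listdoclookup.AddRange((IEnumerable<DocumentLookup>)des) },
            { "DocumentText", des => listdoctxt.AddRange((IEnumerable<DocumentText>)des) }
        };
    }

    public void Deserialize(string property, Stream stream)
    {
        var bformatter = new BinaryFormatter();
        var des = bformatter.Deserialize(stream);

        Action<object> add;
        if (!adders.TryGetValue(property, out add))
            throw new ArgumentOutOfRangeException("property");
        add(des);
    }
}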

C# Dictionary As Application Cache

I don't even know if I am on the right track, but I gave Dictionary a shot for storing big, static data (like multi-language data) and using it across the whole application. This example is for a WinForms app.
In this scenario, I have a table which holds various parameters, divided into groups. The table is made of the following fields: GROUP_CODE, PARAMETER_KEY, PARAMETER_VALUE... It simply holds lots of values, as in this example:
GROUP_CODE: 'MAILING_INFO'
PARAMETER_KEY: 'SendMailAfterProcessIsDone'
PARAMETER_VALUE: 'a#b.com'
There is no problem holding and getting the data from the database. My only problem is how exactly I am going to handle this data...
My ParameterValues class holds the following, the same fields I select from the table...
public class ParameterValues
{
    private string groupCode;
    private string parameterKey;
    private string parameterValue;

    public string GroupCode
    {
        get { return groupCode; }
        set { groupCode = value; }
    }

    public string ParameterKey
    {
        get { return parameterKey; }
        set { parameterKey = value; }
    }

    public string ParameterValue
    {
        get { return parameterValue; }
        set { parameterValue = value; }
    }
}
In another class, called CacheHelper, I am trying to put these objects into a Dictionary with the GROUP_CODE value as the key. (Hope that made sense.)
public class CacheHelper
{
    public Dictionary<string, ParameterValues> LoadParameterCache2()
    {
        //Dictionary<string, ParameterValues> objList = new Dictionary<string, ParameterValues>();
        Dictionary<string, List<ParameterValues>> objList = new Dictionary<string, List<ParameterValues>>();
        //call the values from the database blah blah
        while (rdr.Read())
        {
            ParameterValues cacheObj = new ParameterValues();
            cacheObj.ParameterKey = rdr.GetString("PARAMETER_KEY");
            cacheObj.ParameterValue = rdr.GetString("PARAMETER_VALUE");
            objList.Add(rdr.GetString("GROUP_CODE"), /*List ParameterValues goes here blah blah*/);
        }
        return objList; //and finally return it to the main application thread
    }
}
I have to group it by the GROUP_CODE values coming from the database, since a Dictionary's key has to be unique... I know....
In case that was not clear enough, I tried to demonstrate what I am actually trying to do in an image (not included here): essentially, each GROUP_CODE key should map to the list of parameters in that group.
I simply cannot get it in there grouped. Please, someone show me a way... And I'd also like to hear your ideas on whether this is a good way to do application caching for really large data (thousands of rows, I mean).
Many thanks!
I am not going to talk about caching solutions, as there are some out there and I am not particularly experienced with any of them.
But if you want to try the static Dictionary solution, you could try grouping your data using LINQ:
public Dictionary<string, List<ParameterValues>> LoadParameterCache2()
{
    List<KeyValuePair<string, ParameterValues>> dataValues = new List<KeyValuePair<string, ParameterValues>>();
    // Your code to produce the DataReader
    DataTableReader rdr = GetDataReader();
    while (rdr.Read())
    {
        ParameterValues cacheObj = new ParameterValues();
        cacheObj.ParameterKey = (string)rdr["PARAMETER_KEY"];
        cacheObj.ParameterValue = (string)rdr["PARAMETER_VALUE"];
        KeyValuePair<string, ParameterValues> dataValue = new KeyValuePair<string, ParameterValues>((string)rdr["GROUP_CODE"], cacheObj);
        dataValues.Add(dataValue);
    }
    Dictionary<string, List<ParameterValues>> objList = dataValues.GroupBy(d => d.Key).ToDictionary(k => k.Key, v => v.Select(i => i.Value).ToList());
    return objList;
}
I think I get where you are having a problem; please note the syntax may not be 100% right as I am typing from memory.
public class CacheHelper
{
    public Dictionary<string, List<ParameterValues>> LoadParameterCache2()
    {
        var objList = new Dictionary<string, List<ParameterValues>>();
        //call the values from the database blah blah
        while (rdr.Read())
        {
            var cacheObj = new ParameterValues();
            cacheObj.ParameterKey = rdr.GetString("PARAMETER_KEY");
            cacheObj.ParameterValue = rdr.GetString("PARAMETER_VALUE");
            //here is where you need to change things.
            var groupCode = rdr.GetString("GROUP_CODE");
            if (objList.ContainsKey(groupCode) == false)
            {
                objList.Add(groupCode, new List<ParameterValues> { cacheObj });
            }
            else
            {
                objList[groupCode].Add(cacheObj);
            }
        }
        return objList; //and finally return it to the main application thread
    }
}
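Whichever of the two approaches you use to build the grouped dictionary, reading a value back is then a two-step lookup: first the group, then the key within the group. A minimal usage sketch follows; the ParameterCache wrapper and GetParameter helper are illustrative, not part of the question, and it assumes using System.Collections.Generic and System.Linq:

public static class ParameterCache
{
    // Filled once at application startup, e.g. from CacheHelper.LoadParameterCache2().
    public static Dictionary<string, List<ParameterValues>> Groups;

    public static string GetParameter(string groupCode, string parameterKey)
    {
        List<ParameterValues> group;
        if (!Groups.TryGetValue(groupCode, out group)) return null;

        // Within the group, find the entry with the requested key.
        var match = group.FirstOrDefault(p => p.ParameterKey == parameterKey);
        return match == null ? null : match.ParameterValue;
    }
}

// Usage (values taken from the example rows above):
// string value = ParameterCache.GetParameter("MAILING_INFO", "SendMailAfterProcessIsDone");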

How to check CONTAINS with multiple values

I am trying to find all the zones that contain two or more zone members, where each search term is a string value. Here is the code I have. In the FindCommonZones method, when I try to cast the result of an Intersect to an ObservableCollection, I get a run-time exception on an invalid cast. The question is: is there a better way to do this? The string array that is the parameter for FindCommonZones() can contain any number of strings. StackOverflow had some other similar posts, but none really answered my question - they all seemed to pertain more to SQL.
Some code:
public class Zone
{
    public List<ZoneMember> MembersList = new List<ZoneMember>();

    private string _ZoneName;
    public string zoneName { get { return _ZoneName; } set { _ZoneName = value; } }

    public Zone ContainsMember(string member)
    {
        var contained = this.MembersList.FirstOrDefault(m => m.MemberWWPN.Contains(member) || m.MemberAlias.Contains(member));
        if (contained != null) { return this; }
        else { return null; }
    }
}

public class ZoneMember
// a zone member is a member of a zone
// zones have ports, WWPNs, aliases or all 3
{
    private string _Alias = string.Empty;
    public string MemberAlias { get { return _Alias; } set { _Alias = value; } }

    private FCPort _Port = null;
    public FCPort MemberPort { get { return _Port; } set { _Port = value; } }

    private string _WWPN = string.Empty;
    public string MemberWWPN { get { return _WWPN; } set { _WWPN = value; } }

    private bool _IsLoggedIn;
    public bool IsLoggedIn { get { return _IsLoggedIn; } set { _IsLoggedIn = value; } }

    private string _FCID;
    public string FCID { get { return _FCID; } set { _FCID = value; } }
}
private ObservableCollection<ZoneResult> FindCommonZones(string[] searchterms)
{
    ObservableCollection<ZoneResult> tempcollection = new ObservableCollection<ZoneResult>();
    //find the zones for the first search term
    tempcollection = this.FindZones(searchterms[0]);
    //now search for the rest of the search terms and compare
    //them to the existing result
    for (int i = 1; i < searchterms.Count(); i++)
    {
        // this line gives an exception trying to cast
        tempcollection = (ObservableCollection<ZoneResult>)tempcollection.Intersect(this.FindZones(searchterms[i]));
    }
    return tempcollection;
}
private ObservableCollection<ZoneResult> FindZones(string searchterm)
// we need to track the vsan where the zone member is found
// so use a foreach to keep track
{
    ObservableCollection<ZoneResult> zonecollection = new ObservableCollection<ZoneResult>();
    foreach (KeyValuePair<int, Dictionary<int, CiscoVSAN>> fabricpair in this.FabricDictionary)
    {
        foreach (KeyValuePair<int, CiscoVSAN> vsanpair in fabricpair.Value)
        {
            var selection = vsanpair.Value.ActiveZoneset.ZoneList
                .Select(z => z.ContainsMember(searchterm))
                .Where(m => m != null)
                .OrderBy(z => z.zoneName);
            if (selection.Count() > 0)
            {
                foreach (Zone zone in selection)
                {
                    foreach (ZoneMember zm in zone.MembersList)
                    {
                        ZoneResult zr = new ZoneResult(zone.zoneName, zm.MemberWWPN, zm.MemberAlias, vsanpair.Key.ToString());
                        zonecollection.Add(zr);
                    }
                }
            }
        }
    }
    return zonecollection;
}
Intersect is actually Enumerable.Intersect and returns an IEnumerable<ZoneResult>. This is not castable to an ObservableCollection because it isn't one - it is the enumeration of the elements present in both collections.
You can, however, create a new ObservableCollection from the enumeration:
tempcollection = new ObservableCollection<ZoneResult>(tempcollection
    .Intersect(this.FindZones(searchterms[i])));
Depending on how many elements you have, how ZoneResult.Equals is implemented, and how many search terms you expect, this implementation may or may not be feasible (FindZones does seem a little over-complicated, at O(n^4) at first glance). If it turns out to be a resource hog or a bottleneck, it's time to optimize; otherwise I would just leave it alone if it works.
One suggested optimization could be the following (incorporating @Keith's suggestion to change ContainsMember to return a bool) - although it is untested and I may have my SelectManys wrong, and it largely amounts to the same thing, you hopefully get the idea:
private ObservableCollection<ZoneResult> FindCommonZones(string[] searchterms)
{
    var query = this.FabricDictionary.SelectMany(fabricpair =>
            fabricpair.Value.SelectMany(vsanpair =>
                vsanpair.Value.ActiveZoneset.ZoneList
                    .Where(z => searchterms.Any(term => z.ContainsMember(term)))
                    .SelectMany(zone =>
                        zone.MembersList.Select(zm => new ZoneResult(zone.zoneName, zm.MemberWWPN, zm.MemberAlias, vsanpair.Key.ToString()))
                    )
            )
        )
        .Distinct()
        .OrderBy(zr => zr.zoneName);
    return new ObservableCollection<ZoneResult>(query);
}
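One detail worth checking (my observation, not part of the original answer): the Intersect version keeps only zones that match every search term, while the Any in the query above keeps zones that match at least one term. If the all-terms behaviour is the one you want, the Where clause can be tightened accordingly, still assuming ContainsMember has been changed to return bool:

// Drop-in replacement for the Where line in the query above:
// a zone qualifies only if every search term matches one of its members.
.Where(z => searchterms.All(term => z.ContainsMember(term)))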

Quickest Array Initialization?

In an application of mine, I need a large constant (actually static readonly) array of objects. The array is initialized in the type's static constructor.
The array contains more than a thousand items, and when the type is first used, my program experiences a serious slowdown. I would like to know if there is a way to initialise a large array quickly in C#.
public static class XSampa {
    public class XSampaPair : IComparable<XSampaPair> {
        public XSampaPair GetReverse() {
            return new XSampaPair(Target, Key);
        }

        public string Key { get; private set; }
        public string Target { get; private set; }

        internal XSampaPair(string key, string target) {
            Key = key;
            Target = target;
        }

        public int CompareTo(XSampaPair other) {
            if (other == null)
                throw new ArgumentNullException("other",
                    "Cannot compare with Null.");
            if (Key == null)
                throw new NullReferenceException("Key is null!");
            if (other.Key == null)
                throw new NullReferenceException("Key is null!");
            if (Key.Length == other.Key.Length)
                return string.Compare(Key, other.Key,
                    StringComparison.InvariantCulture);
            return other.Key.Length - Key.Length;
        }
    }

    private static readonly XSampaPair[] pairs, reversedPairs;

    public static string ParseXSampaToIpa(this string xsampa) {
        // Parsing code here...
    }

    public static string ParseIpaToXSampa(this string ipa) {
        // reverse code here...
    }

    static XSampa() {
        pairs = new[] {
            new XSampaPair("a", "\u0061"),
            new XSampaPair("b", "\u0062"),
            new XSampaPair("b_<", "\u0253"),
            new XSampaPair("c", "\u0063"),
            // And many more pairs initialized here...
        };
        var temp = pairs.Select(x => x.GetReverse());
        reversedPairs = temp.ToArray();
        Array.Sort(pairs);
        Array.Sort(reversedPairs);
    }
}
PS: I use the array to convert X-SAMPA phonetic transcription to a Unicode string with the corresponding IPA characters.
You can serialize a completely initialized object into a binary file, add that file as a resource, and load it into your array on startup. If your constructors are CPU-intensive, you might get an improvement. Since your code appears to perform some sort of parsing, the chances of getting a decent improvement are fairly high.
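A rough sketch of that idea, assuming XSampaPair is marked [Serializable], that a one-off build step writes the file, and that it is then embedded as a resource (the resource name and helper class below are made up for illustration):

using System.IO;
using System.Reflection;
using System.Runtime.Serialization.Formatters.Binary;

static class XSampaTable
{
    // One-off build step: construct the pairs once and persist them.
    internal static void Save(XSampaPair[] pairs, string path)
    {
        using (var stream = File.Create(path))
        {
            new BinaryFormatter().Serialize(stream, pairs);
        }
    }

    // At startup: read the prebuilt, already-sorted array back from the embedded resource.
    internal static XSampaPair[] Load()
    {
        using (var stream = Assembly.GetExecutingAssembly()
            .GetManifestResourceStream("MyApp.Resources.xsampa-pairs.bin"))
        {
            return (XSampaPair[])new BinaryFormatter().Deserialize(stream);
        }
    }
}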
You could use an IEnumerable<yourobj>, which would let you lazily yield return the elements only as they are needed.
The problem with this is that you won't be able to index into it like you can with the array.
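Another angle (a sketch of an alternative, not taken from either answer above) is to keep the array but wrap it in Lazy<T>, so the construction and sorting move out of the static constructor and run once, on first access. Note this only helps if the type is touched well before any parsing actually happens; otherwise the cost simply moves to the first parse call. The sketch reuses the XSampaPair class from the question:

public static class XSampaLazy
{
    // Built on first access instead of in the static constructor.
    private static readonly Lazy<XSampaPair[]> pairs = new Lazy<XSampaPair[]>(() =>
    {
        var p = new[]
        {
            new XSampaPair("a", "\u0061"),
            new XSampaPair("b", "\u0062"),
            // ... many more pairs ...
        };
        Array.Sort(p);
        return p;
    });

    public static string ParseXSampaToIpa(this string xsampa)
    {
        var table = pairs.Value; // the array is created here, once, on the first call
        // Parsing code here...
        return xsampa;           // placeholder so the sketch stands alone
    }
}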
