I have a solution that contains an MVC project and a Windows console application. Both projects share the same backend project, which loads and saves data.
I need to exchange data between these two projects, so I decided to use isolated storage:
private string LoadInstallationFile()
{
    IsolatedStorageFile isoStore = IsolatedStorageFile.GetMachineStoreForDomain();
    if (!isoStore.FileExists(ClientSettingsFile)) return null;
    /* do stuff */
}

private void SaveInstallationFile()
{
    IsolatedStorageFile isoStore = IsolatedStorageFile.GetMachineStoreForDomain();
    using (IsolatedStorageFileStream isoStream = new IsolatedStorageFileStream(ClientSettingsFile, FileMode.CreateNew, isoStore))
    {
        using (StreamWriter writer = new StreamWriter(isoStream))
        {
            writer.WriteLine(data);
        }
    }
}
Inside the MVC project I save the data using SaveInstallationFile(), and I can access that file from within the MVC project.
But when I try to access the data from the console project, the file does not exist.
How can I exchange data between these two?
(There is a good chance that both run under different user credentials, so GetUserStore...() would, in my opinion, not work.)
Both the MVC application and the console application run on the same server.
If you really need to use the isolated storage API, you can use GetMachineStoreForAssembly (provided this code lives in an assembly shared by both projects); currently you are using different stores for the two applications. To be honest, though, I would prefer a database or a configurable shared disk path, as that is a more flexible solution.
GetMachineStoreForAssembly is the less restrictive version of GetMachineStoreForDomain (both require the code to be in the same assembly, but GetMachineStoreForDomain additionally requires it to run in the same application). You can check the MSDN documentation:
The first method (GetMachineStoreForAssembly) is equivalent to GetStore(IsolatedStorageScope.Assembly | IsolatedStorageScope.Machine, null, null).
The second method (GetMachineStoreForDomain) is equivalent to GetStore(IsolatedStorageScope.Assembly | IsolatedStorageScope.Domain | IsolatedStorageScope.Machine, null, null), so it includes one additional flag, which is why it is more restrictive.
Also, both of them check the calling assembly, not the root executing assembly.
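As a rough sketch of that suggestion (assuming the helper below lives in the shared backend assembly, and that "installation.txt" stands in for your real ClientSettingsFile name), both applications can go through one shared class so they resolve the same assembly- and machine-scoped store:
using System.IO;
using System.IO.IsolatedStorage;

// Lives in the shared backend assembly, so both the MVC app and the
// console app resolve the same machine/assembly-scoped store.
public static class SharedSettingsStore
{
    private const string ClientSettingsFile = "installation.txt"; // assumed name

    public static void Save(string data)
    {
        using (IsolatedStorageFile store = IsolatedStorageFile.GetMachineStoreForAssembly())
        using (IsolatedStorageFileStream stream = store.OpenFile(ClientSettingsFile, FileMode.Create))
        using (StreamWriter writer = new StreamWriter(stream))
        {
            writer.WriteLine(data);
        }
    }

    public static string Load()
    {
        using (IsolatedStorageFile store = IsolatedStorageFile.GetMachineStoreForAssembly())
        {
            if (!store.FileExists(ClientSettingsFile)) return null;
            using (IsolatedStorageFileStream stream = store.OpenFile(ClientSettingsFile, FileMode.Open))
            using (StreamReader reader = new StreamReader(stream))
            {
                return reader.ReadToEnd();
            }
        }
    }
}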
I'm fairly certain I'm either doing something wrong or misunderstanding something. It's hard to give a piece of code that shows my problem, so I'm going to try explaining my scenario and the outcome.
I'm starting up several instances of a DLL in the same console application, each in its own app domain. I then generate a Guid.NewGuid() that I assign to a class in the instance, and set the application's folder to a new folder. This works great so far: everything works, and my instances are separated. However... when I started changing my app's folder to the same name as the unique GUID generated for that class, I started picking up anomalies.
It works fine when I instantiate the new instances slowly, but when I hammer new ones in, the application starts picking up data in its folder when it starts up. After some investigation, I found that this is because the folder already exists, since that GUID has already been used. On further investigation, I can see that the machine takes a bit of a pause and then continues to generate the new instances, all with the same GUID.
I understand that the GUID-generating algorithm uses the MAC address as part of it, but I was under the impression that even if the same machine generates two GUIDs at the exact same moment, they would still be unique.
Am I correct in that statement? Where am I wrong?
Code:
Guid guid = Guid.NewGuid();
string myFolder = Path.Combine(baseFolder, guid.ToString());

AppDomain ad = AppDomain.CurrentDomain;
Console.WriteLine($"{ad.Id} - {guid.ToString()}");

string newHiveDll = Path.Combine(myFolder, "HiveDriveLibrary.dll");

if (!Directory.Exists(myFolder))
{
    Directory.CreateDirectory(myFolder);
}

if (!File.Exists(newHiveDll))
{
    File.Copy(hiveDll, newHiveDll);
}

Directory.SetCurrentDirectory(myFolder);

var client = ServiceHelper.CreateServiceClient(serviceURL);

ElementConfig config = new ElementConfig();
ElementConfig fromFile = ElementConfigManager.GetElementConfig();

if (fromFile == null)
{
    config.ElementGUID = guid;
    config.LocalServiceURL = serviceURL;
    config.RegisterURL = registerServiceURL;
}
else
{
    config = fromFile;
}
Directory.SetCurrentDirectory is a thin wrapper atop the Kernel32 function SetCurrentDirectory.
Unfortunately, the .NET documentation writers didn't choose to copy the warning from the native function:
Multithreaded applications and shared library code should not use the SetCurrentDirectory function and should avoid using relative path names. The current directory state written by the SetCurrentDirectory function is stored as a global variable in each process, therefore multithreaded applications cannot reliably use this value without possible data corruption from other threads that may also be reading or setting this value
It's your reliance on this function that's creating the appearance that multiple threads have magically selected exactly the same GUID value.
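In other words, each app domain (or thread) overwrites the single process-wide current directory, so a configuration file read via a relative path can end up pointing at a sibling instance's folder. A minimal sketch of the usual workaround, passing the full folder path around instead of calling Directory.SetCurrentDirectory (the extra path parameter on GetElementConfig is an assumption, not the real signature):
// Build absolute paths from the per-instance folder instead of relying on the
// process-wide current directory, which all app domains and threads share.
Guid guid = Guid.NewGuid();
string myFolder = Path.Combine(baseFolder, guid.ToString());
Directory.CreateDirectory(myFolder);

string configPath = Path.Combine(myFolder, "element.config"); // hypothetical file name

// Hypothetical overload that takes an explicit path rather than assuming the
// current directory:
ElementConfig fromFile = ElementConfigManager.GetElementConfig(configPath);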
I used the following code to create a torrent with the MonoTorrent solution, but a reference is missing for RawTrackerTier. I am able to create the torrent when I comment out that line, but then I am unable to download a file from the created torrent. Is this because of the missing reference, or is there some other problem in the solution? If it is the reference, can anyone tell me which one is missing?
public void CreateTorrent(string path, string savePath)
{
    // The class used for creating the torrent
    TorrentCreator c = new TorrentCreator();

    // Add one tier which contains two trackers
    RawTrackerTier tier = new RawTrackerTier(); // MISSING REFERENCE HERE
    tier.Add("http://localhost/announce");
    c.Announces.Add(tier);

    c.Comment = "This is the comment";
    c.CreatedBy = "Doug using " + VersionInfo.ClientVersion;
    c.Publisher = "www.aaronsen.com";

    // Set the torrent as private so it will not use DHT or peer exchange
    // Generally you will not want to set this.
    c.Private = true;

    // Every time a piece has been hashed, this event will fire. It is an
    // asynchronous event, so you have to handle threading yourself.
    c.Hashed += delegate(object o, TorrentCreatorEventArgs e)
    {
        Console.WriteLine("Current File is {0}% hashed", e.FileCompletion);
        Console.WriteLine("Overall {0}% hashed", e.OverallCompletion);
        Console.WriteLine("Total data to hash: {0}", e.OverallSize);
    };

    // ITorrentFileSource can be implemented to provide the TorrentCreator
    // with a list of files which will be added to the torrent metadata.
    // The default implementation takes a path to a single file or a path
    // to a directory. If the path is a directory, all files will be
    // recursively added
    ITorrentFileSource fileSource = new TorrentFileSource(path);

    // Create the torrent file and save it directly to the specified path.
    // Different overloads of 'Create' can be used to save the data to a Stream
    // or just return it as a BEncodedDictionary (its native format) so it can be
    // processed in memory
    c.Create(fileSource, savePath);
}
The version of MonoTorrent on NuGet (version 0.9.0) is quite old and appears to have only single-tier announce support. If you're using that version, this code might work:
// The class used for creating the torrent
TorrentCreator c = new TorrentCreator();
// Add tracker(s)
c.Announces.Add("http://localhost/announce");
The latest version of MonoTorrent is available (code only) on GitHub. You could try downloading the library source and building it yourself.
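If the torrent is created but won't download, it may also be worth reloading the generated file to confirm it parses and contains what you expect. A quick sanity check (a sketch; Torrent.Load is part of the MonoTorrent API, and savePath comes from the code above) might look like:
using System;
using MonoTorrent;                // may be MonoTorrent.Common in older builds

// Quick sanity check: reload the .torrent we just wrote to confirm it parses.
Torrent torrent = Torrent.Load(savePath);
Console.WriteLine("Created torrent: {0}", torrent.Name);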
I use TextWriterTraceListener (System.Diagnostics) in my application to trace several things, such as exceptions.
The application runs on a terminal server, and when many users use it simultaneously the listener starts to create many trace files with random GUIDs in the file name.
Are there possibilities or workarounds to avoid this behaviour?
I've just taken a look at the documentation for TextWriterTraceListener and there's a note about 1/3 of the way down the page
If an attempt is made to write to a file that is in use or unavailable, the file name is automatically prefixed by a GUID
So, this would appear to be by design. If the file is indeed unavailable, then there's nothing that can be done about it with the current implementation. What you could try is writing a custom implementation of TextWriterTraceListener that overrides the relevant Write/WriteLine methods so that the output goes to a per-user file with a name that better suits your needs.
If what you want is for ALL logging from ALL users on the terminal server to go to a single file, then you'll almost certainly need some kind of "third-party" process running that owns the file and synchronises writes to it, such as a Windows service that is then called by your custom TextWriterTraceListener.
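For example, a per-user listener could be as simple as deriving from TextWriterTraceListener and baking the user name into the file name (a minimal sketch; the class name and log directory below are assumptions):
using System;
using System.Diagnostics;
using System.IO;

// One trace file per user (and per day), so concurrent terminal-server
// sessions never compete for the same file.
public class PerUserTraceListener : TextWriterTraceListener
{
    public PerUserTraceListener(string directory)
        : base(Path.Combine(directory,
              $"trace_{Environment.UserName}_{DateTime.Now:yyyy-MM-dd}.log"))
    {
    }
}

// Usage (hypothetical):
// Trace.Listeners.Add(new PerUserTraceListener(@"C:\Logs"));
// Trace.AutoFlush = true;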
Could it be that you are accidentally calling Trace.Listeners.Add(...) multiple times?
If you have added multiple listeners, Trace.WriteLine() writes to all of them.
Also, local IIS might still have the file in use after you shut down the application.
I am currently testing the addition of System.Diagnostics.Trace.Listeners.Clear() in my output method...
// Upon a new day, re-create the TextWriterTraceListener to update our file name...
if (_date?.Day != DateTime.Now.Day) { _listener = null; }

if (_listener == null)
{
    System.Diagnostics.Trace.Listeners.Clear();

    _fileName = $"{DateTime.Now.ToString("yyyy-MM-dd")}_Trace.json";

    // Add a writer that appends to the trace.log file:
    _listener = new System.Diagnostics.TextWriterTraceListener(_fileName);
    _listener.IndentSize = 4;
    _listener.TraceOutputOptions = System.Diagnostics.TraceOptions.None; // TraceOptions.DateTime | TraceOptions.ThreadId;

    System.Diagnostics.Trace.AutoFlush = true;
    System.Diagnostics.Trace.Listeners.Add(_listener);

    // Obtain the Console's output stream, then add that as a listener...
    System.Diagnostics.Trace.Listeners.Add(new System.Diagnostics.TextWriterTraceListener(Console.Out));
}
I need to dynamically instantiate a web application from a console application. By this I mean that my console application hosts a web application that is not bound to IIS/XSP.
Currently, I create the web application in a temporary directory and copy some generated files into it: a special Global.asax that maps to my own HttpApplication implementation (I need to do some initialization at application start), and special .asmx files that map to my own skeleton classes and dynamic plugins:
foreach (IPlugin plugin in _target.Plugins)
{
    WsdlSkeletonDefinition[] defs = plugin.GetWsdlSkeletons();
    if (defs != null)
        foreach (WsdlSkeletonDefinition def in defs)
        {
            if (def.SkeletonType == null)
                throw new LogbusException(string.Format("Plugin {0} declares empty skeleton type",
                                                        plugin.Name));
            if (def.SkeletonType.IsAssignableFrom(typeof(System.Web.Services.WebService)))
                throw new LogbusException(
                    string.Format("Plugin {0} does not declare a valid WSDL skeleton type", plugin.Name));

            string fname = def.UrlFileName;
            if (fname.EndsWith(".asmx", false, CultureInfo.InvariantCulture))
                fname = fname.Substring(0, fname.Length - 5);

            if (!Regex.IsMatch(fname, @"^[a-zA-Z0-9_\.\-%]+$", RegexOptions.CultureInvariant))
                throw new LogbusException(string.Format(
                    "Plugin {0} declares invalid WSDL endpoint: {1}",
                    plugin.Name, def.UrlFileName));

            string wsDeclaration = string.Format(ASMX_TEMPLATE, def.SkeletonType.AssemblyQualifiedName);

            using (
                StreamWriter sw = new StreamWriter(File.Create(Path.Combine(_physicalPath, fname + ".asmx")),
                                                   Encoding.Default))
                sw.Write(wsDeclaration);

            // Copy the skeleton assembly if needed
            CopyAssemblyTo(def.SkeletonType.Assembly, bindir);
            foreach (AssemblyName dependency in def.SkeletonType.Assembly.GetReferencedAssemblies())
            {
                try
                {
                    CopyAssemblyTo(Assembly.Load(dependency), bindir);
                }
                // Possible broken dependency
                catch { }
            }
        }
}
My approach works, but I'm not satisfied with it because I have to write lots of garbage to the file system, even if I eventually delete it all.
I know I can control HTTP handlers via Web.config, but I don't want to generate a Web.config just for that. I would like to create a mapping such that I can remove the .asmx extension from the web services' URLs and still reach them.
For example, one of the default scripts is "LogbusManagement.asmx", which must be hard-coded into client APIs, and the .asmx extension hurts portability to other platforms such as PHP. I want to make "LogbusManagement.asmx" equivalent to "LogbusManagement" with any extension. For this, I might use an HttpHandlerFactory.
My question, like one asked here by somebody else, is: is there a way to programmatically set IHttpHandlers or IHttpHandlerFactories for a web application, possibly from Global.asax?
Thank you
This question was already answered on Stack Overflow:
Any way to add HttpHandler programatically in .NET?
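For reference, one common pattern (a sketch only; the route URL and handler wiring below are assumptions rather than the linked answer verbatim) is to register a route from Application_Start in Global.asax and let a custom IRouteHandler forward the request to the existing .asmx handler, which yields extension-less URLs without touching the Web.config handler mappings:
using System;
using System.Web;
using System.Web.Routing;
using System.Web.Services.Protocols;

// Maps an extension-less URL onto an existing .asmx endpoint.
public class AsmxRouteHandler : IRouteHandler
{
    private readonly string _virtualPath; // e.g. "~/LogbusManagement.asmx"

    public AsmxRouteHandler(string virtualPath)
    {
        _virtualPath = virtualPath;
    }

    public IHttpHandler GetHttpHandler(RequestContext requestContext)
    {
        // Let the standard web-service handler factory build the real handler.
        return new WebServiceHandlerFactory().GetHandler(
            HttpContext.Current,
            requestContext.HttpContext.Request.HttpMethod,
            _virtualPath,
            requestContext.HttpContext.Server.MapPath(_virtualPath));
    }
}

public class Global : HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        // "LogbusManagement" (no extension) now resolves to LogbusManagement.asmx.
        RouteTable.Routes.Add(
            new Route("LogbusManagement", new AsmxRouteHandler("~/LogbusManagement.asmx")));
    }
}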
What approach do you recommend for persisting user settings in a WPF desktop application? The idea is that the user can change settings at run time and then close the application; when the application is started later, it uses those saved settings. Effectively, it will appear as if the application settings never change.
Q1 - Database or another approach? I do have an SQLite database that I will be using anyway, so would a table in that database be as good as any other approach?
Q2 - If a database: what table design? One table with columns for the different data types one might have (e.g. string, long, DateTime, etc.), or just a table with a string value column, which you then have to serialize and deserialize the values against? I'm thinking the first would be easier, and if there aren't many settings the overhead isn't much?
Q3 - Could Application Settings be used for this? If so, are there any special tasks required to enable persistence? Also, what happens with the "default" value in the Application Settings designer in this case? Would the default override any settings saved between runs of the application (or would you need to NOT use the default value)?
You can use Application Settings for this; a database is not the best option considering the time it takes to read and write the settings (especially if you use web services).
Here are a few links which explain how to achieve this and use settings in WPF (a short snippet follows the links):
User Settings in WPF
Quick WPF Tip: How to bind to WPF application resources and settings?
A Configurable Window for WPF
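As a quick illustration of the Application Settings route (a minimal sketch, assuming a user-scoped setting named WindowTitle has been added in the project's Settings designer):
// Read the persisted value at startup (falls back to the designer default
// the first time the application runs).
string title = Properties.Settings.Default.WindowTitle;

// Change it at run time and persist it for the next run.
Properties.Settings.Default.WindowTitle = "My new title";
Properties.Settings.Default.Save();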
Update: Nowadays I would use JSON.
I also prefer to go with serialization to a file. XML files fit almost all requirements. You can use the built-in ApplicationSettings, but they have some restrictions and a defined but (to me) very strange behavior regarding where they are stored. I have used them a lot and they work. But if you want full control over how and where your settings are stored, I use another approach:
Make a class somewhere with all your settings. I named it MySettings.
Implement Save and Read for persistence.
Use it in your application code.
Advantages:
Very simple approach.
One class for settings: load, save.
All your settings are type-safe.
You can simplify or extend the logic to your needs (versioning, multiple profiles per user, etc.).
It works well in any scenario (database, WinForms, WPF, service, etc.).
You can define where to store the XML files.
You can find and manipulate them either in code or manually.
It works for any deployment method I can imagine.
Disadvantages:
You have to think about where to store your settings files (but you can just use your installation folder).
Here is a simple example (not tested):
public class MySettings
{
    public string Setting1 { get; set; }
    public List<string> Setting2 { get; set; }

    public void Save(string filename)
    {
        using (StreamWriter sw = new StreamWriter(filename))
        {
            XmlSerializer xmls = new XmlSerializer(typeof(MySettings));
            xmls.Serialize(sw, this);
        }
    }

    // Static so it can be called before an instance exists (see usage below).
    public static MySettings Read(string filename)
    {
        using (StreamReader sr = new StreamReader(filename))
        {
            XmlSerializer xmls = new XmlSerializer(typeof(MySettings));
            return xmls.Deserialize(sr) as MySettings;
        }
    }
}
And here is how to use it. It's possible to load default values or override them with the user's settings by just checking if user settings exist:
public class MyApplicationLogic
{
    public const string UserSettingsFilename = "settings.xml";

    // Location returns the path of the .exe itself, so take its directory first.
    public string _DefaultSettingspath = Path.Combine(
        Path.GetDirectoryName(Assembly.GetEntryAssembly().Location),
        "Settings", UserSettingsFilename);

    public string _UserSettingsPath = Path.Combine(
        Path.GetDirectoryName(Assembly.GetEntryAssembly().Location),
        "Settings", "UserSettings", UserSettingsFilename);

    public MyApplicationLogic()
    {
        // Prefer the user's own settings; fall back to the defaults.
        if (File.Exists(_UserSettingsPath))
            this.Settings = MySettings.Read(_UserSettingsPath);
        else
            this.Settings = MySettings.Read(_DefaultSettingspath);
    }

    public MySettings Settings { get; private set; }

    public void SaveUserSettings()
    {
        Settings.Save(_UserSettingsPath);
    }
}
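A hypothetical usage of the classes above (the property and value are just for illustration):
// Load existing settings (or the defaults), change something, and persist it
// so it is picked up on the next start.
var app = new MyApplicationLogic();
app.Settings.Setting1 = "dark theme";
app.SaveUserSettings();   // persists to Settings\UserSettings\settings.xml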
Maybe someone gets inspired by this approach. This is how I've done it for many years now, and I'm quite happy with it.
You can store your settings as XML strings in Settings.Default. Create some classes to hold your configuration data and make sure they are [Serializable]. Then, with the following helpers, you can serialize instances of these objects (or List<T>, arrays T[], etc.) to String, and store each of the resulting strings in its own Settings.Default slot in your WPF application's settings.
To recover the objects the next time the app starts, read the Settings string of interest and Deserialize to the expected type T (which this time must be explicitly specified as a type argument to Deserialize<T>).
public static String Serialize<T>(T t)
{
    using (StringWriter sw = new StringWriter())
    using (XmlWriter xw = XmlWriter.Create(sw))
    {
        new XmlSerializer(typeof(T)).Serialize(xw, t);
        return sw.GetStringBuilder().ToString();
    }
}

public static T Deserialize<T>(String s_xml)
{
    using (XmlReader xw = XmlReader.Create(new StringReader(s_xml)))
        return (T)new XmlSerializer(typeof(T)).Deserialize(xw);
}
The long-running, most typical answer to this question is: Isolated Storage.
Serialize your control state to XML or some other format (which is especially easy if you're saving dependency properties in WPF), then save the file to the user's isolated storage.
If you do want to go the application settings route, I tried something similar myself at one point, though the approach below could easily be adapted to use Isolated Storage:
class SettingsManager
{
    public static void LoadSettings(FrameworkElement sender, Dictionary<FrameworkElement, DependencyProperty> savedElements)
    {
        EnsureProperties(sender, savedElements);

        foreach (FrameworkElement element in savedElements.Keys)
        {
            try
            {
                element.SetValue(savedElements[element], Properties.Settings.Default[sender.Name + "." + element.Name]);
            }
            catch (Exception ex) { }
        }
    }

    public static void SaveSettings(FrameworkElement sender, Dictionary<FrameworkElement, DependencyProperty> savedElements)
    {
        EnsureProperties(sender, savedElements);

        foreach (FrameworkElement element in savedElements.Keys)
        {
            Properties.Settings.Default[sender.Name + "." + element.Name] = element.GetValue(savedElements[element]);
        }

        Properties.Settings.Default.Save();
    }

    public static void EnsureProperties(FrameworkElement sender, Dictionary<FrameworkElement, DependencyProperty> savedElements)
    {
        foreach (FrameworkElement element in savedElements.Keys)
        {
            bool hasProperty =
                Properties.Settings.Default.Properties[sender.Name + "." + element.Name] != null;

            if (!hasProperty)
            {
                SettingsAttributeDictionary attributes = new SettingsAttributeDictionary();
                UserScopedSettingAttribute attribute = new UserScopedSettingAttribute();
                attributes.Add(attribute.GetType(), attribute);

                SettingsProperty property = new SettingsProperty(sender.Name + "." + element.Name,
                    savedElements[element].DefaultMetadata.DefaultValue.GetType(),
                    Properties.Settings.Default.Providers["LocalFileSettingsProvider"],
                    false, null, SettingsSerializeAs.String, attributes, true, true);

                Properties.Settings.Default.Properties.Add(property);
            }
        }

        Properties.Settings.Default.Reload();
    }
}
.....and....
Dictionary<FrameworkElement, DependencyProperty> savedElements = new Dictionary<FrameworkElement, DependencyProperty>();

public void Window_Load(object sender, EventArgs e)
{
    savedElements.Add(firstNameText, TextBox.TextProperty);
    savedElements.Add(lastNameText, TextBox.TextProperty);

    SettingsManager.LoadSettings(this, savedElements);
}

private void Window_Closing(object sender, System.ComponentModel.CancelEventArgs e)
{
    SettingsManager.SaveSettings(this, savedElements);
}
Apart from a database, you also have the following options for saving user-related settings (see the sketch after this list for the first option):
the registry, under HKEY_CURRENT_USER
a file in the AppData folder
the Settings file in WPF, with its scope set to User
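A minimal sketch of the registry option (the subkey and value names are made up for illustration):
using Microsoft.Win32;

// Write a per-user setting under HKEY_CURRENT_USER.
using (RegistryKey key = Registry.CurrentUser.CreateSubKey(@"Software\MyWpfApp"))
{
    key.SetValue("Theme", "Dark");
}

// Read it back later, with a default if it has never been written.
using (RegistryKey key = Registry.CurrentUser.OpenSubKey(@"Software\MyWpfApp"))
{
    string theme = (string)(key?.GetValue("Theme") ?? "Light");
}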
In my experience, storing all the settings in a database table is the best solution. Don't even worry about performance; today's databases are fast and can easily store thousands of columns in a table. I learned this the hard way: before, I was serializing/deserializing, and it was a nightmare. Storing settings in a local file or the registry has one big problem: if you have to support your app while the computer is off or the user is not in front of it, there is nothing you can do. If the settings are in the database, you can change them, and voilà, not to mention that you can compare users' settings.
You can use SQLite: a small, fast, self-contained, full-featured SQL database engine. I personally recommend it after trying the settings-file and XML-file approaches.
Install the NuGet package System.Data.SQLite, which is an ADO.NET provider for SQLite. The package includes support for LINQ and Entity Framework.
Overall, these supporting features give you a lot of flexibility for your settings window.
1. Install SQLite
2. Create your database file
3. Create tables to save your settings
4. Access the database file in your application to read and edit settings (see the sketch below)
I found this approach very helpful for application settings, since I can make adjustments to the database and also take advantage of ADO.NET and LINQ features.
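A rough sketch of step 4 with System.Data.SQLite (the database file name, table, and column names here are illustrative assumptions):
using System.Data.SQLite;

// Open (or create) the settings database and make sure the table exists.
using (var conn = new SQLiteConnection("Data Source=settings.db"))
{
    conn.Open();

    using (var cmd = new SQLiteCommand(
        "CREATE TABLE IF NOT EXISTS Settings (Key TEXT PRIMARY KEY, Value TEXT)", conn))
    {
        cmd.ExecuteNonQuery();
    }

    // Insert or update one setting.
    using (var cmd = new SQLiteCommand(
        "INSERT OR REPLACE INTO Settings (Key, Value) VALUES (@k, @v)", conn))
    {
        cmd.Parameters.AddWithValue("@k", "Theme");
        cmd.Parameters.AddWithValue("@v", "Dark");
        cmd.ExecuteNonQuery();
    }

    // Read it back.
    using (var cmd = new SQLiteCommand("SELECT Value FROM Settings WHERE Key = @k", conn))
    {
        cmd.Parameters.AddWithValue("@k", "Theme");
        string theme = (string)cmd.ExecuteScalar();
    }
}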
I typically do this sort of thing by defining a custom [Serializable] settings class and simply serializing it to disk. In your case you could just as easily store it as a string blob in your SQLite database.
In all the places I've worked, a database has been mandatory because of application support. As Adam said, the user might not be at their desk, or the machine might be off, or you might want to quickly change someone's configuration or assign a new joiner a default (or a team member's) config.
If the settings are likely to grow as new versions of the application are released, you might want to store the data as blobs which can then be deserialized by the application. This is especially useful if you use something like Prism, which discovers modules, as you can't know in advance what settings a module will return.
The blobs could be keyed by a username/machine composite key. That way you can have different settings for every machine.
I've not used the built-in Settings class much, so I'll abstain from commenting. :)
I wanted to use an XML control file based on a class for my VB.NET desktop WPF application. The code above, which does all of this in one class, is excellent and set me in the right direction. In case anyone is searching for a VB.NET solution, here is the class I built:
Imports System.IO
Imports System.Xml.Serialization

Public Class XControl

    Private _person_ID As Integer
    Private _person_UID As Guid

    'Load from file
    Public Function XCRead(filename As String) As XControl
        Using sr As StreamReader = New StreamReader(filename)
            Dim xmls As New XmlSerializer(GetType(XControl))
            Return CType(xmls.Deserialize(sr), XControl)
        End Using
    End Function

    'Save to file
    Public Sub XCSave(filename As String)
        Using sw As StreamWriter = New StreamWriter(filename)
            Dim xmls As New XmlSerializer(GetType(XControl))
            xmls.Serialize(sw, Me)
        End Using
    End Sub

    'All the Get/Set properties are below here
    Public Property Person_ID() As Integer
        Get
            Return _person_ID
        End Get
        Set(value As Integer)
            _person_ID = value
        End Set
    End Property

    Public Property Person_UID As Guid
        Get
            Return _person_UID
        End Get
        Set(value As Guid)
            _person_UID = value
        End Set
    End Property

End Class