I have an application that receives IoT data. I would like to change the layout (that displays the data) depending on the configuration set by the user.
E.g.: the user decides that the first 3 bytes are the device_id, the 4th byte multiplied by a value gives the temperature, etc. How can I create such a user config file and save it for later use?
After saving the data, how can I display it based on these config files? I am thinking of using labels to just match the data. Is there a better way to do this?
So I have done as #Nobody suggested.
I created a class with details like number of bytes, device id, etc., and took the data from user input via a form. I then used basic serialization to save the data, and deserialization to read it back the next time I open the application, as per this link.
Code :
[Serializable()]
public class Config
{
    public string DeviceId { get; set; }
    public string Name { get; set; }
    public int Length { get; set; }
}
// Serialization
using (Stream testFileStream = File.Create(pathString))
{
    // Note: BinaryFormatter is deprecated in newer .NET versions for security
    // reasons; XmlSerializer or a JSON serializer is a safer choice today.
    BinaryFormatter serializer = new BinaryFormatter();
    serializer.Serialize(testFileStream, config);
}

// Deserialization
using (Stream testFileStream = File.OpenRead(pathString))
{
    BinaryFormatter deserializer = new BinaryFormatter();
    config = (Config)deserializer.Deserialize(testFileStream);
    // the using block disposes the stream; no explicit Close() is needed
}
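Once loaded back, the config can drive the parsing of each incoming frame. A minimal sketch, assuming (hypothetically) that Length is the number of device-id bytes and that a Multiplier field is added for the "4th byte times a value gives temperature" rule:

```csharp
using System;
using System.Linq;

// Sketch of the saved Config, extended with a hypothetical Multiplier
// for the temperature scaling rule described in the question.
public class Config
{
    public string Name { get; set; }
    public int Length { get; set; }        // bytes reserved for the device id
    public double Multiplier { get; set; } // hypothetical scaling factor
}

public static class FrameParser
{
    // Interprets one raw frame according to the saved configuration.
    public static (string DeviceId, double Temperature) Parse(byte[] frame, Config cfg)
    {
        // First cfg.Length bytes -> device id (rendered as hex here)
        string deviceId = string.Concat(frame.Take(cfg.Length).Select(b => b.ToString("X2")));
        // The next byte, scaled by the user-supplied multiplier -> temperature
        double temperature = frame[cfg.Length] * cfg.Multiplier;
        return (deviceId, temperature);
    }
}
```

For example, Parse(new byte[] { 0x01, 0x02, 0x03, 50 }, new Config { Length = 3, Multiplier = 0.5 }) yields device id "010203" and temperature 25. For display, binding such parsed values to a data-bound grid is likely simpler than matching labels by hand.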
I have an XML file which has all the information about a voyage and all the details inside.
I want to read all the records in the XML file and, after combining them, write them to a SQL database.
So far I have managed to get the header, company and voyage data into arrays, but I have failed to get the details for all records into arrays.
Here are the tasks I need to handle:
Select and read any XML Data to RAM by FileDialog (completed)
Create Arrays and Read XML data to Arrays (partly completed)
Write XML data to DataView (partly completed)
Create T-SQL INSERT Command (partly completed)
Write Data to Database (Waiting to finish early steps)
While reading from the XML into a DataView I can get the data into memory, but I could not separate the multi-level data as required.
The exact problem is handling the different levels of data in every XML file I receive.
foreach (var child in childElem.Elements("ManifestData"))
{
    foreach (var x in child.Elements())
    {
        var checkName = x.Name.ToString();
        switch (checkName)
        {
            case "Company":
                Globals.Companys.Clear();
                foreach (var y in x.Elements())
                {
                    Globals.Companys.Add(y.Name.ToString(), y.Value.ToString());
                }
                break;
            case "Voyage":
                Globals.Voyages.Clear();
                foreach (var y in x.Elements())
                {
                    Globals.Voyages.Add(y.Name.ToString(), y.Value.ToString());
                }
                break;
            case "BLs":
                int recs = 0;
                Globals.BL.Clear();
                textBox2.Clear();
                foreach (var y in x.Elements())
                {
                    foreach (var z in x.Elements("units"))
                    {
                        Globals.Units.Add(y.Element("number").Value.ToString(), z.Value.ToString());
                    }
                    Globals.BL.Add(y.Element("number").Value.ToString(), y.Value.ToString());
                    recs = recs + 1;
                    textBox2.AppendText("\n" + y.ToString());
                    string output = string.Join("\n", Globals.BL);
                    MessageBox.Show(output);
                }
                break;
            default:
                break;
        }
    }
}
In my example XML you can see that there are 3 BLs, and each BL's data has different levels. There can be hundreds of BLs with different levels of Goods & Dangerous Goods.
I am having trouble handling the multi-level XML data here.
I'll be glad if you can help me solve this basic problem. I am hoping to learn from it and to leave it here for others to understand how to build a desktop XML reader application for their own DBs.
Here is the XML Data example
You can find all sources here : Project Reading XMLbyC#
The XML processing part can be made simple by deserializing your XML into C# classes, which you can then use to do whatever you want.
[XmlRoot(ElementName = "ManifestMessage")]
public class ManifestMessage
{
    [XmlElement(ElementName = "Header")]
    public Header Header { get; set; }

    [XmlElement(ElementName = "ManifestData")]
    public ManifestData ManifestData { get; set; }
}

[XmlRoot(ElementName = "Header")]
public class Header
{
    [XmlElement(ElementName = "sender")]
    public string Sender { get; set; }

    [XmlElement(ElementName = "reciever")]
    public string Reciever { get; set; }

    [XmlElement(ElementName = "timeOfDocument")]
    public string TimeOfDocument { get; set; }

    [XmlElement(ElementName = "typeOfMessage")]
    public string TypeOfMessage { get; set; }
}
// Then when you want to get the xml deserialized into your class hierarchy
var xmlSerializer = new XmlSerializer(typeof(ManifestMessage));
var manifestMessage = xmlSerializer.Deserialize(data) as ManifestMessage;
// now you can use this object to drill down the whole hierarchy
Console.WriteLine(manifestMessage.Header.Sender);
Console.WriteLine(manifestMessage.ManifestData.Company.ComanyName);
Console.WriteLine(manifestMessage.ManifestData.Voyage.CrewNumber);
foreach (var bl in manifestMessage.ManifestData.BLs.BL)
{
Console.WriteLine(bl.Collect);
Console.WriteLine(bl.Consegnee.Name);
Console.WriteLine(bl.Customer.Name);
}
You can use the https://xmltocsharp.azurewebsites.net/ site to generate the whole C# class hierarchy from your XML.
The Console.WriteLine calls are just for demo purposes; you can adapt them to your needs.
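As a quick sanity check of the approach, here is a self-contained sketch that deserializes a small inline Header fragment (element names spelled as in the source XML, including "reciever"):

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

[XmlRoot(ElementName = "Header")]
public class Header
{
    [XmlElement(ElementName = "sender")]
    public string Sender { get; set; }

    [XmlElement(ElementName = "reciever")] // spelled as in the source XML
    public string Reciever { get; set; }
}

public static class Demo
{
    // Deserializes a Header from an XML string instead of a file,
    // purely to show the round trip in isolation.
    public static Header Load(string xml)
    {
        var serializer = new XmlSerializer(typeof(Header));
        using (var reader = new StringReader(xml))
        {
            return (Header)serializer.Deserialize(reader);
        }
    }
}
```

In your application you would pass a FileStream for the selected XML file to Deserialize instead of a StringReader.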
I have a database with a photo column, and I am trying to display it on a website.
I am coding in Visual Studio 2019. It's an MVC project in C#.
The two photos show what I was given. The link from the second picture doesn't go anywhere.
There are 2 possibilities:
1st option:
Store the path of the image in your database and dynamically set the src of an <img>
2nd option:
Save the image data in your database and reconstruct it on load; your <img> src has to be a controller action that loads, constructs and returns the image.
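A minimal sketch of option 1, assuming a hypothetical PicturePath column that stores a web-relative path to the image file:

```cshtml
@* The src points directly at the stored path; no controller action is needed. *@
<img src="@Url.Content(Model.PicturePath)" alt="Photo" />
```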
Edit
Quick example for option 2:
In your cshtml define your image like this:
<img src="@Url.Action("GetFromDB", "Image", new { id = 1234 })" />
Note, you can set the id of your picture dynamically, depending on your scenario. Let's say you have a user class which has a profile picture assigned; you just need to use that picture's id.
On the backend you need an action that handles this request, in this example in the ImageController:
public ActionResult GetFromDB(int id)
{
    var image = _dbContext.Pictures.Find(id); // Find on the DbSet that holds the images
    return File(image.PictureData, image.ContentType);
}
This assumes you have a simple database model for images like this:
class Image
{
    [Key]
    public int ID { get; set; }
    public byte[] PictureData { get; set; }
    public string ContentType { get; set; }
}
To save your image to the database, you just need to get its bytes and content type, for example like this:
using (var ms = new MemoryStream())
{
    using (var uploadedImage = Image.FromStream(formFile.OpenReadStream(), true, true))
    {
        uploadedImage.Save(ms, ImageFormat.Jpeg); // you can choose the format yourself, depending on your scenario
    }
    var image = new Model.Image()
    {
        PictureData = ms.ToArray(),
        ContentType = "image/jpeg" // needs to match the format chosen above
    };
    _dbContext.Pictures.Add(image);
    _dbContext.SaveChanges();
}
I'm designing a file upload API that needs to work with large files. I want to stay away from passing around byte arrays. The endpoint storage for the file will be a third party such as Azure or Rackspace file storage.
I have the following project structure, which is following DDD:
Web API (Netcore - accepts the uploaded file)
Business Service (calls to Azure to save the file and saves a record to the database)
Domain (Domain models for EF)
Persistence (Repositories EFCore - saves the database changes)
I would like to have methods in each layer that can start passing the uploaded file stream through as soon as the upload starts, but I'm unsure whether this is possible.
Previously we've used byte[] to pass the files through the layers, but for large files this requires a lot of memory and has caused us issues.
Is it possible to optimize the upload of files through an n-tier application so you don't have to copy around large byte arrays, and if so, how can it be done?
In order to clarify, the code structure would be something like the following. Repository stuff has been excluded:
namespace Project.Controllers
{
    [Produces("application/json")]
    [Route("api/{versionMNumber}/")]
    public class DocumentController : Controller
    {
        private readonly IAddDocumentCommand addDocumentCommand;

        public DocumentController(IAddDocumentCommand addDocumentCommand)
        {
            this.addDocumentCommand = addDocumentCommand;
        }

        [Microsoft.AspNetCore.Mvc.HttpPost("application/{applicationId}/documents", Name = "PostDocument")]
        public IActionResult UploadDocument([FromRoute] string applicationId)
        {
            var addDocumentRequest = new AddDocumentRequest();
            addDocumentRequest.ApplicationId = applicationId;
            addDocumentRequest.FileStream = this.Request.Body;

            var result = new UploadDocumentResponse { DocumentId = this.addDocumentCommand.Execute(addDocumentRequest).DocumentId };
            return this.Ok(result);
        }
    }
}
namespace Project.BusinessProcess
{
    public interface IAddDocumentCommand
    {
        AddDocumentResponse Execute(AddDocumentRequest request);
    }

    public class AddDocumentRequest
    {
        public string ApplicationId { get; set; }
        public Stream FileStream { get; set; }
    }

    public class AddDocumentResponse
    {
        public Guid DocumentId { get; set; }
    }

    public class AddDocumentCommand : IAddDocumentCommand
    {
        private readonly IDocumentRepository documentRepository;
        private readonly IMessageBus bus;

        public AddDocumentCommand(IDocumentRepository documentRepository, IMessageBus bus)
        {
            this.documentRepository = documentRepository;
            this.bus = bus;
        }

        public AddDocumentResponse Execute(AddDocumentRequest request)
        {
            // We need the file to be streamed off somewhere else: file share, Azure, Rackspace, etc.
            // We need to save a record to the db that the file has been saved successfully.
            // We need to trigger the background workers to process the uploaded file.
            var fileUri = AzureStorageProvider.Save(request.FileStream);
            var documentId = documentRepository.Add(new Document { FileUri = fileUri });
            bus.AddMessage(new DocumentProcessingRequest { documentId = documentId, fileUri = fileUri });
            return new AddDocumentResponse { DocumentId = documentId };
        }
    }
}
Some notes:
Passing around a byte array or stream doesn't copy the data; the issue is having the data on your server at all. If your web server needs to process the data in its complete form, you aren't going to be able to avoid the memory usage of doing so.
If you don't need to process the data on your web server at all, but just need to put it in blob storage, you should return a URI for the upload which points to blob storage (this might be helpful: Using CORS with Azure).
If you need to process the data, but you're okay doing that a bit at a time, something like this answer is what you need: Using streams in ASP.NET Web API.
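To make the third note concrete, here is a minimal sketch of the chunked-copy idea: the layers pass the Stream itself along, and the storage layer copies it in fixed-size chunks, so memory use is bounded by the buffer size rather than the file size. The destination Stream here is a stand-in for the Azure/Rackspace SDK call, which typically also accepts a Stream.

```csharp
using System.IO;

public static class StreamingUpload
{
    // Copies the upload in fixed-size chunks; memory use is O(bufferSize), not O(fileSize).
    // Returns the total number of bytes written.
    public static long Save(Stream source, Stream destination, int bufferSize = 81920)
    {
        var buffer = new byte[bufferSize];
        long total = 0;
        int read;
        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            destination.Write(buffer, 0, read);
            total += read;
        }
        return total;
    }
}
```

In the command above, AzureStorageProvider.Save(request.FileStream) would do the equivalent of this copy internally, never materializing the whole file as a byte[].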
I need to develop an RSS client using C#, and I wonder how an RSS client stores which items the user has read.
The simple answer is to store all the feed items for each URL and mark whether the user has read each one.
So I need to know how other RSS clients manage the read/unread state of feed items: do they store all the items from all the URLs or not?
Also, I need to know if there is any .NET library for a client using the PubSubHubbub protocol.
For example, if I subscribe to CNN feeds, the application will load the current CNN feed items and I then mark one as read. After a while, when I open the client again, I should find all the items that I read marked as read.
So this means that the client will store, for example in its database, all the links of the CNN feed items and save for each link its status, whether it is read or not.
My question is: is there another way to track whether items have been read, instead of saving all the items of all the sites in the DB, which would lead to a huge database?
Welcome to StackOverflow :D
Representing RSS feeds
You could use the following types to represent feeds and articles:
using System;
using System.Collections.Generic;
using System.Linq;
public abstract class RssItem
{
    public virtual bool IsRead { get; set; }
    public string Name { get; set; }
    public string Url { get; set; }
}

public class RssFeed : RssItem
{
    public List<RssFeedArticle> Articles { get; set; }

    public override bool IsRead
    {
        get { return Articles.All(s => s.IsRead); }
        set { Articles.ForEach(s => s.IsRead = true); }
    }
}

public class RssFeedArticle : RssItem
{
    public string Content { get; set; }
}
This is really a simple representation; feel free to enhance it.
Basically, when you set feed.IsRead = true; all articles are marked as read, and when you query the value it returns true only if all the articles have been read.
Example:
var article1 = new RssFeedArticle { Name = "article1", Content = "content1" };
var article2 = new RssFeedArticle { Name = "article2", Content = "content2" };

var feed = new RssFeed
{
    Name = "cool feed",
    Articles = new List<RssFeedArticle> { article1, article2 }
};

article1.IsRead = true;
feed.IsRead = true;
Storing your data
A common approach is to store your application data in the ApplicationData folder or in My Documents.
The advantage of using My Documents is that the user will generally back up this folder, which is not necessarily the case for ApplicationData, whose existence novice users probably don't even know about.
Example for retrieving your application folder:
using System;
using System.IO;

private void Test()
{
    string applicationFolder = GetApplicationFolder();
}

private static string GetApplicationFolder()
{
    var applicationName = "MyCoolRssReader";
    string folderPath = Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData);
    string applicationFolder = Path.Combine(folderPath, applicationName);
    if (!Directory.Exists(applicationFolder))
    {
        Directory.CreateDirectory(applicationFolder);
    }
    return applicationFolder;
}
If you prefer My Documents instead:
string folderPath = Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments);
Here's some explanation/advice from a Microsoft developer:
http://blogs.msdn.com/b/patricka/archive/2010/03/18/where-should-i-store-my-data-and-configuration-files-if-i-target-multiple-os-versions.aspx
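For the "huge database" concern in the question, one option is to persist only the identifiers of articles the user has already read, rather than every article of every feed; on reload, any article whose URL is missing from the saved set is unread, and entries can be pruned once the feed no longer lists them. A minimal sketch, assuming the article URL is a stable identifier:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Linq;

public static class ReadStateStore
{
    // Writes one read-article URL per line to a state file.
    public static void Save(string path, IEnumerable<string> readUrls)
    {
        File.WriteAllLines(path, readUrls.Distinct());
    }

    // Loads the saved set; an empty set means nothing has been read yet.
    public static HashSet<string> Load(string path)
    {
        return File.Exists(path)
            ? new HashSet<string>(File.ReadAllLines(path))
            : new HashSet<string>();
    }
}
```

On startup you would call Load with a path inside the application folder from GetApplicationFolder() above, then set article.IsRead for every article whose Url is in the set.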
pubsubhubbub
There's a library for C# : https://code.google.com/p/pubsubhubbub-publisherclient-csharp/
(from https://code.google.com/p/pubsubhubbub/wiki/PublisherClients)
If you are satisfied with my answer, don't forget to accept it; if you still have some questions, update your question and I'll try to address them.
I have the following object:
[Serializable]
public class ExampleImage
{
    public int ID { get; set; }
    public string Filename { get; set; }
    public byte[] Content { get; set; }
}
I store this in a List<ExampleImage> which I then pass to the following function to serialize it to a string:
static string SerializeObjectToXmlString(object o)
{
    var serializer = new System.Xml.Serialization.XmlSerializer(o.GetType());
    using (var writer = new System.IO.StringWriter())
    {
        serializer.Serialize(writer, o);
        return writer.ToString();
    }
}
I then pass this serialized string to a stored procedure in SQL 2000 as an NTEXT, which then handles inserting it into the database:
SELECT * INTO #TempImages
FROM OpenXML(#iDoc, '/ArrayOfExampleImage/ExampleImage')
WITH ([Filename] VARCHAR(255) './Filename', [Content] IMAGE './Content')
The problem I am having is that the image is getting trashed: the byte[] is not getting saved properly to the DB, while the other fields are just fine. This is the first time I have attempted to send binary data via XML to SQL, so I am most likely doing something wrong. Is my SerializeObjectToXmlString function the problem (not handling the serialization of a byte[] properly), or maybe the OpenXML SQL function, or even the fact that I am sending the XML in as an NTEXT param? I would expect the serialize function to encode the binary properly, but I could be wrong.
Any idea what is the issue or maybe a better approach to saving a bunch of images at once?
Edit: I think what is happening is that the serializer is turning the byte[] into a base64 string, which is then getting passed along to the stored proc as base64. I am then saving this base64 string into an Image field in SQL and reading it out as a byte[]. So I think I need to somehow get it from base64 to a byte[] before inserting it into my table?
Edit: I am starting to think my only option is to change the stored proc to handle one image at a time, not use XML, just pass in the byte[] as an Image type, and wrap all the calls in a transaction.
As Gaidin suggested, base64 is the best option. It's the usual way of writing binary data to XML. You can use the following code:
public class ExampleImage
{
    public int ID { get; set; }
    public string Filename { get; set; }

    [XmlIgnore]
    public byte[] Content { get; set; }

    [XmlElement("Content")]
    public string ContentBase64
    {
        get { return Convert.ToBase64String(Content); }
        set { Content = Convert.FromBase64String(value); }
    }
}
(by the way, the Serializable attribute has no meaning for XML serialization)
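As a quick check of the pattern, the class above round-trips through XmlSerializer: the byte[] is written out as a base64 <Content> element and restored on deserialization. A minimal self-contained sketch:

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

public class ExampleImage
{
    public int ID { get; set; }
    public string Filename { get; set; }

    [XmlIgnore]
    public byte[] Content { get; set; }

    [XmlElement("Content")]
    public string ContentBase64
    {
        get { return Convert.ToBase64String(Content); }
        set { Content = Convert.FromBase64String(value); }
    }
}

public static class RoundTrip
{
    // Serializes to XML and back, proving the byte[] survives the trip.
    public static ExampleImage Clone(ExampleImage image)
    {
        var serializer = new XmlSerializer(typeof(ExampleImage));
        using (var writer = new StringWriter())
        {
            serializer.Serialize(writer, image);
            using (var reader = new StringReader(writer.ToString()))
            {
                return (ExampleImage)serializer.Deserialize(reader);
            }
        }
    }
}
```

On the SQL side the element value is still base64 text, so the stored proc would decode it (or compare the bytes after decoding) rather than storing the text directly in the IMAGE column.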
Instead of serializing it to XML, I would serialize it to a byte[] and store that in a varbinary(MAX) field in your DB.
You could try converting it to base64, then saving it to a TEXT field or something.