I have created two projects using .NET, and I send some data from one project to the other. I currently handle this with XML conversion, and now I am thinking of converting it to JSON. Which one is better?
In most cases, JSON is better: it has less redundant text and is more efficient to parse. What's more, it's much easier to parse JSON in JavaScript, so building a web app is much more convenient.
On the other hand, XML is more readable than JSON. That might be the only advantage of XML in this case.
XML is more verbose but, on the other hand, pretty easy to read.
JSON is simpler but not as easy to read.
If humans will not be looking at the data, JSON is probably the better choice, since it is smaller and will put less of a burden on your network (if that is a concern).
The other way you could evaluate them is by serialization speed, and that will depend a lot on the library you are using.
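For a quick comparison, here is a minimal sketch of serializing the same object both ways in C#; the Order type and values are hypothetical, and it assumes a modern .NET runtime where System.Text.Json is available (Json.NET would work just as well).

```csharp
using System;
using System.IO;
using System.Text.Json;
using System.Xml.Serialization;

public class Order   // hypothetical payload type for the example
{
    public int Id { get; set; }
    public string Customer { get; set; }
}

public static class Program
{
    public static void Main()
    {
        var order = new Order { Id = 42, Customer = "Contoso" };

        // JSON: compact, cheap to parse, native to JavaScript clients
        string json = JsonSerializer.Serialize(order);
        Console.WriteLine(json);   // {"Id":42,"Customer":"Contoso"}

        // XML: more verbose, but easy to read and widely supported
        var xmlSerializer = new XmlSerializer(typeof(Order));
        using var writer = new StringWriter();
        xmlSerializer.Serialize(writer, order);
        Console.WriteLine(writer.ToString());
    }
}
```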
Object serialization method? I am creating my own ArrayList serialization application.
Serialization is a huge topic. Without discussing lots of details about what your requirements are, only vague guidance is possible, in which case: by default, probably keep it simple and use generic formats like JSON or XML; Json.NET has over 100 samples which are very reasonable for general-purpose scenarios (a minimal sketch follows the list below). If you have specific requirements about size/performance/compatibility/capabilities/etc., then other tools or formats may be more appropriate. The only main rules are:
do not blindly use BinaryFormatter (or NetDataContractSerializer)
don't roll your own home-baked serialization format/code unless you know a lot about serialization and can explain why none of the myriad of pre-existing libraries will work for your specific scenario
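As an illustration of the "keep it simple" route, here is a minimal Json.NET sketch that serializes a list instead of a hand-rolled format; the item type and values are made up for the example.

```csharp
using System;
using System.Collections.Generic;
using Newtonsoft.Json;

// Serialize and deserialize a list with Json.NET rather than a custom format.
var items = new List<string> { "alpha", "beta", "gamma" };

string json = JsonConvert.SerializeObject(items);                    // ["alpha","beta","gamma"]
var roundTrip = JsonConvert.DeserializeObject<List<string>>(json);

Console.WriteLine(roundTrip.Count);   // 3
```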
I am building a web application in ASP.NET and I am relatively new to programming. I am very confused about how to read objects from a JSON file (using C#) and use them in the web application. Any help would be much appreciated.
There are several ways to read in the JSON, but the standard way would be to use a JSON serialiser to deserialise the document into a C# class structure.
This works well when you know the structure of the JSON file. See Newtonsoft for a very well known JSON library; there are others out there if you want something different.
If you need to figure out what the structure of your document is, you could also use a tool like QuickType, which will create the C# class structure for you.
If you didn't know the structure of the file up front, you could still use the Newtonsoft library, but you would need a few extra steps and things tend to get tricky; I am not sure that is within the scope of your question.
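A minimal sketch of that standard approach with Json.NET, assuming a hypothetical Product class that matches the file's shape and a made-up file path:

```csharp
using System;
using System.IO;
using Newtonsoft.Json;

// Hypothetical class matching the JSON file's shape, e.g. {"Name":"Widget","Price":9.99}
public class Product
{
    public string Name { get; set; }
    public decimal Price { get; set; }
}

public static class Program
{
    public static void Main()
    {
        // Read the file and deserialise it into the class structure.
        string json = File.ReadAllText("product.json");              // file path is an assumption
        var product = JsonConvert.DeserializeObject<Product>(json);
        Console.WriteLine($"{product.Name}: {product.Price}");
    }
}
```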
The Newtonsoft (https://www.newtonsoft.com/json) library is great for this. I recommend taking a look at it, specifically:
https://www.newtonsoft.com/json/help/html/SerializeObject.htm
https://www.newtonsoft.com/json/help/html/DeserializeObject.htm
for examples of usage.
Recommended library: Newtonsoft.Json. It is very easy to use.
Which of YAML, HTML, XML, JSON, etc. is best suited for large text blocks and formatting (headings, paragraphs, italics, etc.)? Also take into account that it's for use in a WPF app.
Considering the fact that standards such as Office Open XML or XPS use XML to store large documents, I'd suggest XML is likely the best choice.
If you look at some of the document choices for WPF, you will see that XML is a natural choice. Whatever you represent in XAML is XML. A FlowDocument can be serialized to XML as well (see this answer).
You will find further information here.
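To illustrate that XAML/XML is the natural fit, here is a small sketch that builds a FlowDocument in code and serializes it to XAML; it assumes a project that references the WPF assemblies, and the document content is made up.

```csharp
using System;
using System.Windows.Documents;
using System.Windows.Markup;

public static class Program
{
    [STAThread]   // WPF objects must be created on an STA thread
    public static void Main()
    {
        // Build a small FlowDocument (headings, paragraphs, italics all map to elements)
        var doc = new FlowDocument(new Paragraph(new Italic(new Run("Hello, WPF"))));

        // Serialize it to XAML, which is plain XML that can be stored and reloaded
        string xaml = XamlWriter.Save(doc);
        Console.WriteLine(xaml);
    }
}
```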
If any of you are using the new TVPs (table-valued parameters) to pass a collection to SQL Server, what issues have you encountered with that process? I am thinking about implementing it, but wanted to get some feedback on potential pitfalls of going with that solution. I like passing XML for its flexibility and the ability to pass collections, but I don't want to have to shred the XML with SQL Server (it doesn't seem like it will scale well). Anyway, I just wanted to get some feedback from some of the more experienced users; it is always really helpful.
Thanks,
S
I have been using TVPs for half a year so far, and I have not had any issues at all. I do not send more than 1K rows at a time. Before TVPs were available, I was packing numbers in binary format into an image column and casting parts of the image back into numbers on the server; that scaled very well and performed really fast for up to 100K numbers.
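For reference, a minimal sketch of passing a collection as a TVP from C#; the table type name, stored procedure, and connection string are all assumptions for the example.

```csharp
using System;
using System.Data;
using System.Data.SqlClient;   // or Microsoft.Data.SqlClient on newer stacks

// Assumes a table type and procedure along these lines already exist:
//   CREATE TYPE dbo.IntList AS TABLE (Value int);
//   CREATE PROCEDURE dbo.ProcessIds @Ids dbo.IntList READONLY AS ...
var table = new DataTable();
table.Columns.Add("Value", typeof(int));
foreach (var id in new[] { 1, 2, 3 })
    table.Rows.Add(id);

using var connection = new SqlConnection("your-connection-string");   // placeholder
using var command = new SqlCommand("dbo.ProcessIds", connection)
{
    CommandType = CommandType.StoredProcedure
};
var parameter = command.Parameters.AddWithValue("@Ids", table);
parameter.SqlDbType = SqlDbType.Structured;   // marks the parameter as a TVP
parameter.TypeName = "dbo.IntList";

connection.Open();
command.ExecuteNonQuery();
```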
I need to parse some known file formats, one of them being the CUSCAR format. I strongly believe that RegEx will do the job. Any suggestions?
I just looked at the CUSCAR spec, and I think you'll end up with some pretty ugly regex code to parse that. You could get away with it if you are parsing only part of it. You'll have to test for speed, as your main bottleneck will be I/O.
I did something similar with the vendor files that came from QWEST. These beasties were hierarchical text files. Parsing those sucked! I'm currently creating and parsing text files of between 4 and 50 million lines each (every day).
There is a nice framework called the FileHelpers Library. It will help you create an object-oriented representation of the records (text lines). It even has a nice wizard to walk you through creating these record classes, and it handles master-detail, delimited, and fixed-length formats easily.
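A minimal FileHelpers sketch as an alternative to regex; the record layout here is a made-up delimited example, not the real CUSCAR segment structure, and the file path is an assumption.

```csharp
using System;
using FileHelpers;

// Hypothetical '+'-delimited record; real CUSCAR segments would each need
// their own delimited or fixed-length layout.
[DelimitedRecord("+")]
public class SegmentRecord
{
    public string Tag;
    public string Value;
}

public static class Program
{
    public static void Main()
    {
        var engine = new FileHelperEngine<SegmentRecord>();
        SegmentRecord[] records = engine.ReadFile("message.txt");   // path is an assumption
        foreach (var record in records)
            Console.WriteLine($"{record.Tag}: {record.Value}");
    }
}
```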