Insert JSON record into SQL Server database - c#

I have an open SqlConnection (extended by Dapper), a table name in the connected database, and a JSON string that I trust to be of the same schema. What's the simplest way to insert the JSON object's field values without deserializing to a static type?
I realize the standard option is to create a class representing the record to deserialize into. However, there are several reasons this is less than ideal. I'm syncing a number of tables in exactly the same way. The schema already exists in two places (the source and the target), so it seems poor form to repeat the schema in the middleware as well. Also, since I'm just going straight into the database, it seems excessive to require a recompile any time someone adds an additional column or table.
Is there a more dynamic solution?

Probably deserializing into a dynamic is your best bet. If you want to pull the values directly out of the string, you're going to have to (at least partially) parse it anyway, and at that point you might as well just deserialize it.
See this answer for an example using JSON.net: Deserialize json object into dynamic object using Json.net
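The linked answer uses JSON.NET's `JObject`; a rough sketch of the same late-bound idea with the built-in `System.Text.Json` (the JSON payload and property names here are made up for illustration):

```csharp
using System;
using System.Text.Json.Nodes;

class Program
{
    static void Main()
    {
        // Hypothetical record JSON; in the question this would match the table schema.
        string json = "{\"Id\": 7, \"Name\": \"Widget\"}";

        // JsonNode gives dictionary-style access without declaring a static record type.
        JsonNode record = JsonNode.Parse(json);
        int id = (int)record["Id"];
        string name = (string)record["Name"];

        Console.WriteLine($"{id}:{name}");
    }
}
```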

Deserialize into a dictionary, then construct Dapper DynamicParameters.
How to create arguments for a Dapper query dynamically
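A sketch of that combination applied to the original question. The table name `SyncRecords` is a placeholder, and the Dapper call is shown only in a comment since it needs a live connection; Dapper's `DynamicParameters` accepts an `IDictionary<string, object>` directly. Note that table and column names cannot be passed as SQL parameters, so they must come from a trusted source (here, the JSON is trusted per the question):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text.Json;

class Program
{
    // Build a parameterized INSERT whose column list mirrors the JSON keys.
    static string BuildInsert(string table, IEnumerable<string> columns)
    {
        var names = columns.ToList();
        string cols = string.Join(", ", names.Select(c => "[" + c + "]"));
        string pars = string.Join(", ", names.Select(c => "@" + c));
        return $"INSERT INTO [{table}] ({cols}) VALUES ({pars})";
    }

    static void Main()
    {
        string json = "{\"Id\": 7, \"Name\": \"Widget\"}"; // hypothetical record

        // Caveat: System.Text.Json leaves the values as JsonElement; they may
        // need converting to primitives before being passed as parameters.
        var record = JsonSerializer.Deserialize<Dictionary<string, object>>(json);
        string sql = BuildInsert("SyncRecords", record.Keys);

        Console.WriteLine(sql);
        // With Dapper: connection.Execute(sql, new DynamicParameters(record));
    }
}
```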

Related

Entity Framework Store Object in Column

I currently follow a pattern where I store objects which are serialized and deserialized to a particular column.
This pattern was fine before, however, now due to the frequency of transactions the cost of serializing the object to a JSON string and then later retrieving the string and deserializing back to an object is too expensive.
Is it possible to store an object directly to a column to avoid this cost? I am using Entity Framework, and I would like to work with the data stored in this column as type Object.
Please advise.
JSON serialization is not fast. It's faster and less verbose than XML, but a lot slower than binary serialization. I would look at third-party binary serializers such as ZeroFormatter or Wire/Hyperion. For my own stuff I use Wire as a "fast enough" and simple-to-implement option.
As far as table structure goes, I would recommend storing serialized data in a separate 1..0-1 associated table. So if I had an Order table and wanted to serialize some extra order-related structure (coming from a third-party delivery system, for example), I'd create another table called OrderDeliveryInfo with a PK of OrderID joining to the Order table, to house the varbinary(max) column for the serialized data. The reason for this is to avoid the cost of retrieving and transmitting the binary blob every time I query Order records unless I explicitly request the delivery info.
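A minimal sketch of that layout as EF-style POCO entities. All names here are assumed; `Order` is the frequently queried main table, and `OrderDeliveryInfo` shares its primary key, giving the 1..0-1 association:

```csharp
using System;

// Main table: queried frequently, so it carries no blob column.
public class Order
{
    public int OrderId { get; set; }                    // PK
    public DateTime PlacedOn { get; set; }
    public OrderDeliveryInfo DeliveryInfo { get; set; } // optional (0..1)
}

// Side table: its PK doubles as the FK to Order.
public class OrderDeliveryInfo
{
    public int OrderId { get; set; }           // PK and FK to Order
    public byte[] SerializedData { get; set; } // maps to a varbinary(max) column
}

class Program
{
    static void Main()
    {
        var order = new Order { OrderId = 42, PlacedOn = DateTime.UtcNow };
        order.DeliveryInfo = new OrderDeliveryInfo
        {
            OrderId = order.OrderId,
            SerializedData = new byte[] { 1, 2, 3 } // stand-in for Wire/Hyperion output
        };
        Console.WriteLine(order.DeliveryInfo.OrderId);
    }
}
```

Because the blob lives in its own table, queries against `Order` only pay for the blob when they explicitly join or navigate to `DeliveryInfo`.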

Storing a Dictionary<string, string> in the database

At some point in my code, I'm creating a dictionary of type Dictionary<string, string> and I'm wondering what's the best way to store this in a database in terms of converting it to a string and then back to a dictionary.
Thanks.
There are a number of options here.
You can go the normalization route and use a separate table with a key/value pair of columns.
Some databases provide you with a data type that is similar to what you need. PostgreSQL has an hstore type where you can save any key-value pairs, and MS SQL has an XML data type that can be used as well with some simple massaging of your data before insertion.
Without this type of database-specific assistance, you can just use a TEXT or BLOB column and serialize your dictionary using a DB-friendly format such as JSON, XML or language-specific serialization formats.
The tradeoffs are the following:
A separate table with key/value columns makes for expensive querying and is a PITA in general, but you get the most query flexibility, and it is portable across databases.
If you use a database-powered dictionary type, you get support in queries (e.g., "select rows where an attribute stored in the dictionary matches a certain condition"), but you lose database portability unless you code a middle layer that abstracts this away.
If you serialize into a plain TEXT or BLOB column, you are left with selecting everything and filtering in your program, and you lose ease of data manipulation compared to having real columns for this data.
NoSQL databases that are "document oriented" are meant exactly for this type of storage. Depending on what you are doing, you might want to look at some options. MongoDB is a popular choice.
The proper choice depends on the querying patterns for the data and other non-functional issues such as database support, etc. If you expand on the functionality you need to implement, I can expand on my answer.
If you really want to store the full dictionary as a single string, then you could serialize your dictionary to JSON (or XML) and store the result to the database.
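A sketch of that round-trip with the built-in `System.Text.Json` (the answers here predate it and would have used JSON.NET or similar; the keys and column type are made up):

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;

class Program
{
    static void Main()
    {
        var dict = new Dictionary<string, string>
        {
            ["color"] = "red",
            ["size"] = "large"
        };

        // Serialize to a string suitable for a TEXT / nvarchar(max) column.
        string stored = JsonSerializer.Serialize(dict);
        Console.WriteLine(stored);

        // Later, read the column value back and rebuild the dictionary.
        var restored = JsonSerializer.Deserialize<Dictionary<string, string>>(stored);
        Console.WriteLine(restored["color"]);
    }
}
```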
You have a few options here. You could serialize the object into XML, or JSON as #M4N mentioned. You could also create a table with at least two columns: one for key and one for value.
It really depends on what your domain models look like and how you need to manage the data. If the dictionary values or keys change (IE rename, correction, etc), and needs to be reflected across many objects that are dependent on the data, then creating a sort of lookup table for that maps directly to the dictionary might be best. Otherwise, serializing the data would be one of the best performing options.

How can I store a .Net object into SQL Server?

I am trying to store a C# class filled with properties and fields. How can I store that class in SQL Server 2008?
Can someone help?
There are a number of different ways, and it depends on how you want to get that object back out again. Without further information, we can't help much.
You can use Entity Framework to map the object to a database table. Each property then would correspond to a table column.
You can serialize the object and store it in a single column in a database table - as XML, JSON, or a binary blob.
The proper way is to represent the object as a schema: one or more related tables representing the object and its object graph: properties, subtypes, and referenced types. This lets you query the stored object as SQL data and use it for other purposes (e.g., reporting).
Alternatively, you can serialize the object instance. You can do it declaratively via the [Serializable] attribute, or roll your own by implementing the ISerializable interface, and serialize it as binary, JSON, or some other representation of your choice.
You can serialize to/from XML by using XML serialization attributes or by implementing IXmlSerializable.
Or you can ignore the built-in support for this sort of stuff and serialize it your own way.
Once you've serialized it to a Stream, you can store that in a column of the appropriate type (varbinary(max) or varchar(max)) depending on how it's been serialized.
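A sketch of the XML route with `XmlSerializer` writing to a `MemoryStream`, which yields the bytes you would store in a varbinary(max) column (the `Person` class and its members are made up for illustration):

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

public class Person
{
    public string Name { get; set; }
    public int Age { get; set; }
}

class Program
{
    static void Main()
    {
        var serializer = new XmlSerializer(typeof(Person));
        var person = new Person { Name = "Ada", Age = 36 };

        // Serialize to a stream; ToArray() gives the bytes for varbinary(max).
        byte[] bytes;
        using (var stream = new MemoryStream())
        {
            serializer.Serialize(stream, person);
            bytes = stream.ToArray();
        }

        // Reading the column back reverses the process.
        using (var stream = new MemoryStream(bytes))
        {
            var restored = (Person)serializer.Deserialize(stream);
            Console.WriteLine($"{restored.Name},{restored.Age}");
        }
    }
}
```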
A third option would be to create a CLR user-defined type and install the assembly in SQL Server, though I'm not sure I'd suggest that as a "best practice".

Easiest Method to retrieve large sums of data from SQL Server in C#

In my situation, I have a C# DLL I wrote myself that has been registered in a SQL Server database containing sales/customer data. As of now, I'm a bit stuck.
The DLL makes a call to a remote server to obtain a token. The token is then added to the database. Ideally, the next step is to retrieve data from the SQL Server into the DLL, then build and post a JSON file to a remote server using the token the DLL retrieved.
Where I am stuck is that there are 134 elements, with different data types, in the receipt section of my JSON file alone. I will need to be able to handle all of that data in my C# DLL, and in the future I may need to pull a lot more data into this JSON file to be posted. I've done some research, and a user-defined type (UDT) wouldn't quite work and, from what I can tell, is an option I should stay away from. My other two options I know of would be to either export to XML and parse it in my DLL, or to create and read in 134+ variables.
My question is: Is there a simpler way to do this besides XML/hard coding? It would be ideal if there was a way to use an array or an object but neither seem to be supported according to what I've read here
Thank you.
Important note: Because of the database and the JSON library I'm using, I'm working in .NET Framework 2.0.
I would recommend using XML serialization on the C# side: create an object model that matches your database schema.
Since you are on .NET 2.0, you already have a good set of base classes for modeling your database schema in an object-oriented way. Even nullable columns can be mapped to nullable types to save memory and network space.
On the SQL side, use the FOR XML clause, which changes the output of your query from tabular to XML. You just have to write one good stored procedure that produces XML in exactly the same hierarchy as your C# objects.
This XML has to match the names and the casing of the classes and properties of your C# class(es).
Then you can deserialize that XML on the C# side in no more than ten lines of code, no matter how big or complex the hierarchy is, and you will instantly have in-memory objects that you can serialize straight back into JSON.
Let me know if you need some good examples of how to achieve this. And please clarify whether you are running inside the SQL Server CLR execution context, as you might need special permissions to serialize/deserialize data.
I guess it's a very primitive way of achieving what Entity Framework does, but it works.
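A sketch of the deserialization side; nothing here needs anything newer than .NET 2.0. The XML string stands in for what a `SELECT ... FOR XML`-style stored procedure might return, and the `Receipt` class and its members are assumptions for illustration. The element names must match the class and field names, including casing:

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

public class Receipt
{
    // XmlSerializer handles public fields as well as properties.
    public int ReceiptId;
    public decimal Total;
}

class Program
{
    static void Main()
    {
        // Stand-in for the output of a FOR XML query.
        string xml = "<Receipt><ReceiptId>15</ReceiptId><Total>9.99</Total></Receipt>";

        XmlSerializer serializer = new XmlSerializer(typeof(Receipt));
        Receipt receipt;
        using (StringReader reader = new StringReader(xml))
        {
            receipt = (Receipt)serializer.Deserialize(reader);
        }
        Console.WriteLine(receipt.ReceiptId);
    }
}
```

The same `Receipt` object can then be handed to whatever JSON library the DLL already uses, with no hand-written parsing of the 134 elements.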
You should probably stick with XML since your data is semi-structured, especially if you know your schema will be changing over time. SQL Server is not yet an OODBMS.

Making XML in SQL Server faster - convert to tables?

I have a field in the database that is XML because it represents a class that is used in C#/VB.Net. The problem is that after the initial manipulation, most (but not all) of the work is done in SQL Server. This means the XML field is converted on the fly.
As we get more fields and more records, this operation is getting slow. I think the slowdown comes from converting all of those fields to other data types.
So to speed it up I was thinking of a couple of ways:
Have a set of tables that represent the different pieces of the XML data. I would make these tables read-only using a trigger on Insert/Update that rejects any changes. When my 'main' table updates the XML, it would turn off the triggers, update the tables with the new values, then turn the triggers back on.
The only real reason we use the XML is because it's really easy to convert it to the class in C#/VB.Net. But I'm getting to the point where I may end up writing a routine that will take all the bits and pieces and convert them to a class, and also a function to go the other way (class -> tables).
Can anybody give any ideas on a better way to do this? I'm not tied to the idea of using the XML structure. My concern is if we have separate tables to speed up SQL processing and somebody changes the value of a field in that table we have to make sure the XML is updated. Or don't allow the person to update it.
TIA - Jeff.
What is the purpose of the objects you are saving? If it is anything other than persistence of state, you are not doing yourself any favors and you are not properly separating concerns. If it is persistence of state, then at minimum make columns out of the properties and fields (you can include private ones, as long as you leave an internal method to set the values when you reconstitute the object).
Disregarding the wisdom of what you're doing, you might look into creating an XML index. This should help you get started: http://msdn.microsoft.com/en-us/library/ms345121%28v=sql.90%29.aspx
The basic idea is that the right index can 'pre-shred' your XML and automatically build the sort of tables you are thinking of creating 'manually'. A downside is that this can really explode your storage requirements if you are storing lots of XML.
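The linked article boils down to DDL along these lines (the table and column names are assumed; a primary XML index requires the table to have a clustered primary key). The index persists the shredded node table so queries no longer re-parse the XML blob at runtime:

```csharp
using System;

class Program
{
    static void Main()
    {
        // Assumed schema: dbo.Orders with an XML column OrderXml and a clustered PK.
        const string ddl = @"CREATE PRIMARY XML INDEX IX_Orders_OrderXml
    ON dbo.Orders (OrderXml);";

        Console.WriteLine(ddl);
        // Run once via your migration tool or SSMS. Secondary XML indexes
        // (FOR PATH / FOR VALUE / FOR PROPERTY) can be layered on top of the
        // primary index for specific query shapes.
    }
}
```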
