I have a dashboard with many tables that I need to be able to export to Excel. Each table is based on a domain model object collection from my domain model.
I have written some code that will take the collection and create an excel file from the data it contains.
Where does this code go? As a first pass, I just stuck it in my controller method: once I create the Excel file, I return it from the controller. Easy, but clearly that's no good. I can't imagine it belongs in my domain either.
Where should this code go? I can't seem to classify it as controller logic, domain logic, or anything else. I'm basically serializing a domain object into data in an excel file and returning that to the client.
Eventually what I would like to do is have this code grab the data as JSON from a URL and serialize that into an Excel file. Again, I have no idea where that code should go either.
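For context, here is roughly what that first pass looks like, as a minimal sketch; Order, IOrderRepository, and ExcelExporter are hypothetical stand-ins for the actual domain collection and the existing file-building code, not real library APIs:

using System.Collections.Generic;
using System.Web.Mvc;

public class DashboardController : Controller
{
    private readonly IOrderRepository _orders;  // hypothetical repository behind one dashboard table

    public DashboardController(IOrderRepository orders)
    {
        _orders = orders;
    }

    // First pass: the export logic lives directly in the controller action.
    public ActionResult ExportOrders()
    {
        IEnumerable<Order> orders = _orders.GetAll();

        // Stand-in for whatever code turns the collection into .xlsx bytes.
        byte[] file = ExcelExporter.ToExcel(orders);

        return File(file,
            "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
            "orders.xlsx");
    }
}

The open question is which layer a class like ExcelExporter belongs to once it is pulled out of the action.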
To preserve model information between form posts and page requests, I define the model I use as 'static', so I don't lose my model's information.
Let's say I have three steps of action. First step, I let the user select an Excel file. Second step, I create a DataSet and attach it to my model. Third step, I validate the user input and update my DbContext based on the DataSet. All of these steps require a form post. (At the end of the steps I dispose the DataSet to release resources, but there is no way to be sure that the user has cancelled the operation, so in that case the DataSet object will remain in memory.)
So my question is: is keeping my model static this way efficient with regard to memory allocation on the server? Or would you suggest a better option?
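For concreteness, a rough sketch of the setup described above, with hypothetical names; this is just the pattern in question, not a recommendation:

using System.Data;
using System.Web.Mvc;

public class ImportController : Controller
{
    // The 'static' model: one shared instance for the whole application,
    // so it survives between form posts - and is shared by every user.
    private static ImportModel _model = new ImportModel();

    [HttpPost]
    public ActionResult Upload()        // step 2: build the DataSet from the uploaded file
    {
        _model.Data = new DataSet();
        return View("Review", _model);
    }

    [HttpPost]
    public ActionResult Confirm()       // step 3: validate, update the DbContext, then release
    {
        _model.Data.Dispose();
        _model.Data = null;
        return RedirectToAction("Done");
    }
}

public class ImportModel
{
    public DataSet Data { get; set; }   // stays in memory if the user abandons the flow
}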
Sounds like a bad idea. First of all, if multiple users are using this static model at once, the model won't be consistent for each user. There are three places you typically put your state:
Write it to the form as hidden fields which you can post back again - this won't really work for your Excel file, but may be useful for persisting the rest of your model's state.
Write to session state. For your Excel file you might want to write it to the file system and then store the filename in session state (see the sketch after this list).
Write to the DB. You could put your Excel file in the DB, or put it on the file system and store a path to it in the DB. Once the user gets past the third step you could set a flag that the data is to be kept, or move it to another table. You could also periodically run a job that cleans up data where the user never reached the third step.
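For the second option, a minimal sketch, assuming an ASP.NET MVC upload action and in-process session state; the folder and key names are illustrative:

using System.IO;
using System.Web;
using System.Web.Mvc;

public class ImportController : Controller
{
    [HttpPost]
    public ActionResult Upload(HttpPostedFileBase excelFile)
    {
        // Persist the upload to disk instead of keeping it in memory
        // (the App_Data/Uploads folder is assumed to exist).
        string path = Path.Combine(Server.MapPath("~/App_Data/Uploads"),
                                   Path.GetRandomFileName() + ".xlsx");
        excelFile.SaveAs(path);

        // Only the path goes into session; later posts can pick the file back up.
        Session["UploadedExcelPath"] = path;

        return RedirectToAction("Review");
    }

    public ActionResult Review()
    {
        var path = (string)Session["UploadedExcelPath"];
        // ... build the DataSet from the file at 'path', validate, update the DbContext ...
        return View();
    }
}

A cleanup job (or the confirmation step itself) can delete the file afterwards, which covers the abandoned-operation case from the question.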
I would like to have fields in my Excel sheets be bound (in both directions) to a data source, in this case an Access DB.
For example, I would like to have an Excel sheet 'select' a particular record, say a customer, and then load information on that customer into the worksheet. Then, any changes made to that worksheet would be pushed back to the data store, making Excel a nice front end to the data.
Can this be done? From what I can tell, the "Get External Data" options in Excel are one-way routes. My development background is heavy in ASP.NET, C#, and SQL.
Excel is designed to deal with datasets and not so much with single records. For what you are trying to do with a single record, you would be far better off building a form in Access, but as I don't know your environment's or organisation's limitations, I'll make a suggestion.
Since you've obviously got a bit of SQL and coding skill, check out this post for an option that would work for you - Updating Access Database from Excel Worksheet Data
You can get or put as much data as you want, and you can join tables too. It's a good basic get-and-push setup.
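The linked post does this with ADO from VBA inside Excel. Given the ASP.NET/C#/SQL background mentioned above, the same get-and-push idea from managed code might look roughly like this OleDb sketch; the connection string, table, and column names are assumptions:

using System;
using System.Data.OleDb;

class CustomerSync
{
    // ACE OLE DB provider; adjust the provider version and path to match your install.
    const string ConnStr =
        @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\data\customers.accdb;";

    static void Main()
    {
        using (var conn = new OleDbConnection(ConnStr))
        {
            conn.Open();

            // Get: pull one customer record to show in the front end.
            using (var select = new OleDbCommand(
                "SELECT CustomerID, Name, Phone FROM Customers WHERE CustomerID = ?", conn))
            {
                select.Parameters.AddWithValue("?", 42);
                using (var reader = select.ExecuteReader())
                {
                    while (reader.Read())
                        Console.WriteLine("{0}: {1}", reader["CustomerID"], reader["Name"]);
                }
            }

            // Push: write an edited value back to the data store.
            using (var update = new OleDbCommand(
                "UPDATE Customers SET Phone = ? WHERE CustomerID = ?", conn))
            {
                update.Parameters.AddWithValue("?", "555-0100");
                update.Parameters.AddWithValue("?", 42);
                update.ExecuteNonQuery();
            }
        }
    }
}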
I am working on an MVC 4 application using the EF code-first approach. I have finished writing the models of the application and came across data transfer objects (DTOs), which I have never used before. The basic picture I have in my mind for models is that they are taxi drivers that pick up data from the database as passengers and drop it off wherever it is required. So, in what cases should we go for DTOs?
The problem with using the model as the only vehicle is that sometimes it carries too much data. For example, you may need to show all of a user's information except the SSN. Another concern is over-posting: if your model is used in a Web API to deserialize a JSON object, someone can easily stick extra info into it, and your model will carry that info all the way to the DB.
To limit these problems, you can create view models, DTOs, or both, which restrict the set of fields "visible" to the client. Your DTO hydrator will simply skip non-existent fields during hydration.
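A minimal illustration of the idea, assuming a hypothetical User entity; the DTO simply leaves out the fields the client should never see or set:

// Domain/EF entity: everything the database knows about a user.
public class User
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Email { get; set; }
    public string Ssn { get; set; }      // must never leave the server
    public bool IsAdmin { get; set; }    // must never be settable via over-posting
}

// DTO / view model: only what this screen or API call needs.
public class UserDto
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Email { get; set; }
}

// Hand-written hydration; a mapper library can generate this for you.
public static class UserMappings
{
    public static UserDto ToDto(this User user)
    {
        return new UserDto { Id = user.Id, Name = user.Name, Email = user.Email };
    }
}

Extra JSON properties posted by a client (say, IsAdmin) have nowhere to land on UserDto, so they never reach the entity or the DB.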
For the mapping, check out AutoMapper: http://www.dnrtv.com/default.aspx?showNum=155
I have a .NET webforms application that consumes data from a WCF/REST web service. Most of the time, we use shared classes to pass data back and forth, and that works well.
However, occasionally we need to display this data in a DataGrid or similar .NET control, and the most convenient way to do this is via a DataSet. I know we can read XML into a DataSet using the DataSet.ReadXml(myXML, XmlReadMode.InferTypedSchema) method, and that's been working OK.
Occasionally, though, InferTypedSchema infers the wrong data type. For example, it'll decide that a zip code is an integer, not a string.
What are my options? I know I can manually define the DataSet schema, but I'd like to avoid that if possible. The web service automatically generates an .xsd (i.e. the autogenerated response/schema URL) - is there any way to tell DataSet.ReadXml to use that? Or is there a better way?
I know you can feed your DataSet not just with data but also with schema, something like
dataSet.ReadXmlSchema(new System.IO.StreamReader("schema.xsd"));
so if you have the .xsd, you should be fine.
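Putting the two calls together, a short sketch; the file names are placeholders, and the .xsd could equally be one saved from the service's autogenerated schema URL:

using System.Data;

var dataSet = new DataSet();

// Load the schema first so the column types are fixed up front (e.g. ZipCode stays a string)...
dataSet.ReadXmlSchema("response.xsd");

// ...then load the data against that schema instead of letting ReadXml infer types.
dataSet.ReadXml("response.xml", XmlReadMode.IgnoreSchema);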
If you know your XSD ahead of time, you can generate a class file from that and then tweak it to ensure you have the right data types (strings for zip code).
XML Schema Definition Tool (Xsd.exe)
Once you create your class file, you can use XmlSerializer.Deserialize to convert your XML into an instance of that class.
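A rough sketch of that approach; CustomerResponse stands in for whatever class xsd.exe /classes generates from your schema, tweaked by hand so the zip code is declared as a string:

using System.IO;
using System.Xml.Serialization;

// Generated by xsd.exe and then edited (ZipCode forced to string).
public class CustomerResponse
{
    public string Name { get; set; }
    public string ZipCode { get; set; }
}

public static class ResponseReader
{
    public static CustomerResponse Load(string xmlPath)
    {
        var serializer = new XmlSerializer(typeof(CustomerResponse));
        using (var reader = new StreamReader(xmlPath))
        {
            return (CustomerResponse)serializer.Deserialize(reader);
        }
    }
}

The strongly typed object can then be bound to the grid directly, or loaded into a DataTable if the control really needs one.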
Currently, I'm sitting on an ugly business application written in Access that takes a spreadsheet on a bi-daily basis and imports it into an MDB. I am converting a major project that includes this into SQL Server and .NET, specifically C#.
To house this information there are two tables (alias names here) that I will call Master_Prod and Master_Sheet, joined on an identity key, ProdID, that is the parent key on Master_Prod. There are also two more tables to store history, History_Prod and History_Sheet. There are more tables that extend off of Master_Prod, but I'm keeping this to two tables for explanation purposes.
Since this was written in Access, the subroutine that handles this file is littered with manually coded triggers to deal with history, which have been a constant pain to keep up with; that's one reason I'm glad this is moving to a database server rather than a RAD tool. I am writing triggers to handle the history tracking.
My plan is/was to create an object modeling the spreadsheet, parse the data into it, and use LINQ to do some checks client-side before sending the data to the server... Basically I need to compare the data in the sheet to a matching record (unless none exists, in which case it's new). If any of the fields have been altered, I want to send the update.
Originally I was hoping to put this procedure into some sort of CLR assembly that accepts an IEnumerable list, since I'll have the spreadsheet in this form already, but I've recently learned this is going to be paired with a rather important database server that I am very concerned about bogging down.
Is this worth putting a CLR stored procedure in for? There are other points of entry where data comes in, and if I could build a procedure to handle them given the objects passed in, then I could move a lot of the business rules out of the application, at the expense of potential database performance.
Basically I want to take the update checking away from the client and put it on the database, so the data system manages whether or not the table should be updated and the history trigger can fire.
Thoughts on a better way to implement this along the same lines?
Use SSIS. Use an Excel Source to read the spreadsheets, perhaps use a Lookup Transformation to detect new items, and finally use a SQL Server Destination to insert the stream of missing items into SQL Server.
SSIS is a far better fit for these kinds of jobs than writing something from scratch, no matter how much fun LINQ is. SSIS packages are easier to debug, maintain, and refactor than some DLL with forgotten sources. Besides, you will not be able to match the refinements SSIS has in managing its buffers for high-throughput data flows.
"Originally I was hoping to put this procedure into some sort of CLR assembly that accepts an IEnumerable list, since I'll have the spreadsheet in this form already, but I've recently learned this is going to be paired with a rather important database server that I am very concerned about bogging down."
That does not work. Any input into a C#-written CLR procedure still has to follow normal SQL semantics; all that can change is the internal implementation. Any communication with the client has to be done in SQL, which means procedure executions / method calls - there is no way to directly pass in an enumerable of objects.
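To make that concrete, a SQLCLR procedure is still declared and invoked through T-SQL, so its parameters have to be SQL-compatible types; a minimal sketch (names are illustrative):

using Microsoft.SqlServer.Server;
using System.Data.SqlTypes;

public class SheetImport
{
    // Exposed to SQL Server as a stored procedure. Parameters must map to SQL
    // types (int, nvarchar, xml, ...), so there is no way for the client
    // application to hand it an IEnumerable<SpreadsheetRow> directly.
    [SqlProcedure]
    public static void ImportRow(SqlInt32 prodId, SqlString sheetValues)
    {
        SqlContext.Pipe.Send("Received values for ProdID " + prodId.ToString());
    }
}

The client would still have to serialize each row (or batch) into SQL-friendly values and EXEC the procedure, which is exactly the round trip the question was hoping to avoid.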
"My plan is/was to create an object modeling the spreadsheet, parse the data into it, and use LINQ to do some checks client-side before sending the data to the server... Basically I need to compare the data in the sheet to a matching record (unless none exists, in which case it's new). If any of the fields have been altered, I want to send the update."
You probably need to pick a "centricity" for your approach - i.e. data-centric or object-centric.
I would probably model the data appropriately first, because relational databases (or even non-normalized models represented in relational databases) will often outlive client tools, libraries, and applications. I would probably start by trying to model in a normal form, and think about the triggers to maintain audit/history (as you mention) during this time as well.
I would typically then think about the data coming in (not an object model or an entity, really). So I focus on the format and semantics of the inputs and see if there is a misfit with my data model - perhaps there were assumptions in my data model which were incorrect. Note that I'm not thinking of making an object model which validates the spreadsheet, even though spreadsheets are notoriously fickle input sources. Like Remus, I would simply use SSIS to bring it in - perhaps to a staging table, and then do some more validation before applying it to the production tables with some T-SQL.
Then I would think about a client tool which had an object model based on my good solid data model.
Alternatively, the object approach would mean modeling the spreadsheet, but also building an object model which needs to be persisted to the database - and perhaps you then have two object models (spreadsheet and full business domain) plus a database model (storage persistence), if the spreadsheet object model is not as complete as the system's business domain object model.
I can think of an example where I had a throwaway external object model like this. It read a "master file", which was a layout file describing an input file. That object model allowed the program to build SSIS packages (and BCP and SQL scripts) to import, export, and do other operations on these files. Effectively it was a throwaway object model - it was not used as the actual model for the data in the rows or for any kind of navigation between parent and child rows, etc., but simply as an internal representation for internal purposes - it didn't necessarily correspond to a "domain" entity.