Building a model-based RDLC report - C#

Objective
Build a PDF file that represents a single C# class object with multiple fields and properties, created at runtime. It contains plain text, variables, tables, and images.
What do I have to use?
As a requirement, I have to render the PDF report with Microsoft.Reporting.WebForms.LocalReport, and it will probably contain SubReports.
What did I do?
When working with reports until now, the following simple pattern was enough:
// one ReportParameter created and filled in by hand per value
ReportParameter param = new ReportParameter();
param.Name = "Param name";
param.Values.Add(paramValue);
but in that case I'd have to create over 80 parameters, and I think that's just not a nice way, so I'm curious whether there is a better way to do it (see the reflection sketch after the list below). Based on my research on SO and Google, I can use System.Data.DataSet to achieve what I want, but:
It is only slightly better than hardcoding all the parameters.
It is almost the same, but I produce additional files. IMO, it's easier for future code users to use and understand the ReportParameter way than to get buried in tons of extra files that (maybe) could have been avoided.
DataSets are for tabular data; I have one big model
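One idea I'm toying with is generating the parameters by reflection over the model's public properties; a rough sketch (ToParameters, ReportParameterBuilder, and myModel are names I made up):
using System;
using System.Linq;
using Microsoft.Reporting.WebForms;

static class ReportParameterBuilder
{
    // Hypothetical helper: builds one ReportParameter per public property of
    // the model, so 80+ parameters don't have to be written out by hand.
    public static ReportParameter[] ToParameters(object model)
    {
        return model.GetType()
                    .GetProperties()
                    .Select(p => new ReportParameter(
                        p.Name,
                        Convert.ToString(p.GetValue(model, null))))
                    .ToArray();
    }
}

// usage: report.SetParameters(ReportParameterBuilder.ToParameters(myModel));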
Problem
Actually, I think the problem lies in the DataSource of the RDLC report itself. The mechanisms that VS provides are built for a DB connection or DB-related objects; that's why using DataSources and DataSets is suggested everywhere. I can provide any data that is either of type IEnumerable or IReportData (or from a db connection), but mine is neither. I've got my model already built and I'd like to use it if possible.
Most examples I found were for creating reports straight from a database or from custom data sets. I've got no more ideas how to make it work. That's why I'm here.
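One workaround that comes up often (a sketch only; InvoiceModel and "ModelDataSet" are illustrative names, and the dataset name must match the one defined inside the .rdlc): since ReportDataSource binds to an IEnumerable, wrap the single model instance in a one-element list:
using System.Collections.Generic;
using System.IO;
using Microsoft.Reporting.WebForms;

// InvoiceModel is a stand-in for the real model class with its many properties.
var model = new InvoiceModel { CustomerName = "Contoso", Total = 99.50m };

var report = new LocalReport { ReportPath = "Report.rdlc" };

// RDLC data sources bind to an IEnumerable, so wrap the single object in a list.
report.DataSources.Add(new ReportDataSource("ModelDataSet",
    new List<InvoiceModel> { model }));

// Render straight to PDF bytes and write them to disk.
byte[] pdf = report.Render("PDF");
File.WriteAllBytes("report.pdf", pdf);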

When using an external component is an option for you: I've been using the .NET reporting component "combit List & Label". It has PDF export and an object data provider included that takes any .NET object, or even an object structure, and provides the object's properties as variables/parameters in an interactive report designer for tables, charts, etc.

Related

Design dynamic RDLC programmatically

Is it possible to design reports programmatically in WinForms? Like putting textboxes, images, and all sorts of stuff in a report, as the user requires.
My report currently looks like this: [screenshot of the current report layout]
However, I would like to let the user add the things they want using checkboxes that I will provide in the form: for example, an image header of their choice, a textbox with their defined text, etc. I've searched the Internet, but everything I found is just about creating dynamic tables.
It's certainly possible to generate RDLC programmatically. It's just XML. There are some articles out there on programmatically generating a report.
However, there's a lot of complexity that you might have to wade through.
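For a flavor of what that XML generation looks like, a bare skeleton (assumptions: the 2008 report-definition namespace; a real report also needs page sections, sizes, and dataset definitions):
using System.Xml.Linq;

// 2008-era RDLC report-definition namespace; newer designers use newer ones.
XNamespace ns = "http://schemas.microsoft.com/sqlserver/reporting/2008/01/reportdefinition";

var rdlc = new XDocument(
    new XElement(ns + "Report",
        new XElement(ns + "Body",
            new XElement(ns + "ReportItems"
                // Textboxes, images, and tablix elements would be generated
                // here from the user's checkbox selections.
            ))));

rdlc.Save("Generated.rdlc");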
Based on your screenshot, there might be a better way. It appears that you are trying to generate a quiz. Based on that, you could build a UI that allows someone to add questions of various types and then save them to a table. (Assuming a database backend; your screenshot looks like SSRS, so I'm going down that route.) What I've done in the past is to save a representation of each question as JSON. You end up with a couple of metadata columns and then an nvarchar(MAX) column for the JSON. This allows a wider variety of question types without needing a lot of complexity in your table structure.
In the actual report definition (I use Visual Studio) you can write queries that extract the JSON and format it as a table for SSRS to consume. If you have at least SQL Server 2016, there is native JSON functionality in T-SQL (OPENJSON, JSON_VALUE). We combine this with some logic in the report, some sub-reports to handle specific meta types, and some HTML content. For a quiz with many questions, you can do the minimum querying to build the basic structure, and then loop over each record and extract more specific data about each question, be it multiple choice, text, etc.
Update
Since you're only doing multiple choice, I would try a much simpler approach. You just need two tables: questions and choices. At that point, it's fairly trivial to configure a tablix to do what you want.

RDLC parameters filter after records pulled

I'm trying to determine best practices for client-side reporting using the ReportViewer displaying an RDLC. When designing a report, you can specify parameters to pass to the report, which can be used to filter the records. When running the report, however, SQL Profiler shows that the filter is applied after the records are pulled down for the report. Am I seeing this correctly?
If so, why isn't this kind of thing discouraged? Shouldn't the records be filtered at the dataset level, or at some point such that they are filtered before being sent to the client (for performance reasons, of course)? I have looked and looked for a discussion of these issues on the internet, and all I see is a multitude of ways to implement parameters (i.e., how to) but no discussion of when one way is better than another and why. This isn't anything new, so I would expect there to be more about this out there. Can anyone point me to something that discusses this, if I have missed something?
I experimented with client-side RDLC recently and came to the conclusion that the data handling is too inefficient. Like you point out, I encountered the issue of parameters not filtering as you'd expect. I used Crystal Reports (on which I think this is based) nearly 10 years ago, and I'm sure there weren't issues like this.
The best method I found was to pre-prepare the dataset before generating the report. Always passing the data in at runtime, rather than defining data access in the report definition, ensures the report will only ever use the data you provide.
You are right, there's very little documentation/discussion on this; ultimately I ended up removing it from my project and using other methods to render the charts/tables etc.
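In code, that pre-preparation amounts to filtering before binding (a sketch; allOrders, selectedYear, and "OrdersDataSet" are illustrative names):
using System.Linq;
using Microsoft.Reporting.WebForms;

// Filter in the data layer first, so the report never sees unneeded rows.
var filtered = allOrders.Where(o => o.Year == selectedYear).ToList();

var report = new LocalReport { ReportPath = "Orders.rdlc" };
report.DataSources.Add(new ReportDataSource("OrdersDataSet", filtered));
byte[] pdf = report.Render("PDF");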

Connecting to different databases using .NET DataSets

I have many databases with the same structure, and I have designed a DataSet that matches the database design. It is easy to connect to a database using the connection strings asked for at design time and defined in app.config. But the problem arises when trying to change the database at runtime: I cannot find any non-reflection solution to handle it. Is there any other way to change the connection string of a DataSet dynamically at runtime, or at least to create the DataSet with a different connection string?
You are filling the DataSet using a TableAdapter, and you can easily modify the TableAdapter's connection string like this:
myTableAdapter.Connection.ConnectionString = connectionString;
Hope this helps :)
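To make that reusable: the designer-generated adapters are partial classes, so you can expose the connection string as a property (a sketch; MyDataSetTableAdapters and CustomersTableAdapter stand in for whatever names the designer generated):
namespace MyDataSetTableAdapters
{
    // Partial class extending the designer-generated adapter of the same name.
    public partial class CustomersTableAdapter
    {
        public string ConnectionString
        {
            get { return Connection.ConnectionString; }
            set { Connection.ConnectionString = value; }
        }
    }
}

// usage: adapter.ConnectionString = runtimeConnectionString;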
gzaxx's answer will not work, simply because different DBMSs work with different ADO.NET providers, which may or may not be compatible with each other. There's a lot of theory behind it and I won't type all of it in this textbox, but you need to understand that it is the TableAdapters that are the main issue, not the DataTables. Your business and UI layers normally only talk to DataTables, which will mostly have the same structure for almost any DBMS, given that you have correctly used the corresponding data types when creating table columns. So, in theory, if typed DataSets could provide a way to attach multiple adapters per DataTable, you could add one adapter for each DBMS you support while keeping the DataTable structure the same.
I myself had to deal with this issue in a somewhat large project, and the only workable solution for me was to separate my data access into its own project (a class lib) and then create one such DLL for each DBMS I was supporting. Hope that helps you get started with this.

Confusion with 3 layer design

I've been reviewing examples of 3-layer design on the web, and I've noticed that most samples return either DataSets or DataTables. The thing that is confusing me is: what if you would rather return a generic list of a type, so you can utilize properties or methods from within the type your list is based on? As an example, take a Name property that concatenates various fields in a specific way depending on the data; if the list is bound to a control on a form, then the Name property can be used as the data field. To accomplish the same thing with a DataSet or DataTable, you'd have to return the already-combined data from the database (I try not to use DataSets or DataTables, so I'm probably very wrong about this statement :)).
The part that is really confusing me is about reusing code. To me it seems the only way to reuse code is to retrieve the data into either a DataSet or DataTable and then loop through it and add it to a List, so is this generally the best practice for 3-layer, or is there a way to do this without DataSets and DataTables?
The example in the link below demonstrates, in essence, using datasets or tables and then adding the data to an object, but I'm forced to ask: is this the best practice?
http://www.codeproject.com/Articles/36847/Three-Layer-Architecture-in-C-NET
Thanks
Using DataTables is a specific dotnetism. The reason behind it is that they contain metadata about the structure of the data, which lets DataGrid (and other such components) display the data automatically without using reflection or so. My guess is this is amongst other things a heritage of the MS Access approach to RAD, where the intent was enabling "business people" to create apps by generating the user interface directly from a SQL schema, essentially doing the opposite of a tiered design. This heritage then seems to have leaked into the hivemind.
There's nothing wrong about using "plain" data structures, as long as you're willing to give up the RAD features, and the trend lately seems to have been to get rid of this tradeoff too. (For instance with Web Forms' strongly typed data controls, and MVC's model binding features.)
Also, speaking more generally, Code Project articles from before MVC was established are not really a good source of wisdom on general software architecture.
What you should carry your data in depends entirely on your needs.
If you retrieve data from the DB and bind it to a datagrid, datasets might give you the perfect solution. If you want some other method where data tracks its own update status, you should look into Entity Framework. If you retrieve data and send it through a web service for cross-platform or cross-domain processing, you need to load your data onto some other serializable classes of your own.
Take a look at the article below. It is a little old and targeted at EF4, but it summarizes the pros and cons of different strategies very well. (There are three articles in the series; I suggest you read them all.)
http://msdn.microsoft.com/en-us/magazine/ee335715.aspx
I think the samples you're finding use DataTables and DataSets because it's a simple way to show 3-tier design. Nowadays Entity Framework has largely replaced the "data access layer" mentioned in the sample.
Before Entity Framework, when I wrote a data access layer I would return a generic list that I built from the database. To run an update, delete, or insert, I would pass an object in as the parameter to the method, then use the object's properties as the values in the SQL statement. I preferred doing it that way for the reasons you mentioned, but also because it allowed me to change the object definitions or db schema (or even use a different db altogether) independently of each other.
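A minimal sketch of that shape of data access layer (Person, PersonRepository, GetPeople, and the table/column names are all illustrative):
using System.Collections.Generic;
using System.Data.SqlClient;

public class Person
{
    public string FirstName { get; set; }
    public string LastName { get; set; }

    // Computed property that a bound form control can use as its data field.
    public string Name
    {
        get { return LastName + ", " + FirstName; }
    }
}

public static class PersonRepository
{
    public static List<Person> GetPeople(string connectionString)
    {
        var people = new List<Person>();
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("SELECT FirstName, LastName FROM People", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    people.Add(new Person
                    {
                        FirstName = reader.GetString(0),
                        LastName = reader.GetString(1)
                    });
                }
            }
        }
        return people;
    }
}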

Efficient way to analyze large amounts of data?

I need to analyze tens of thousands of lines of data. The data is imported from a text file, and each line has eight variables. Currently, I use a class to define the data structure, and as I read through the text file I store each line object in a generic list, List<T>.
I am wondering if I should switch to using a relational database (SQL), as I will need to analyze the data in each line of text, trying to relate it to definition terms, which I also currently store in generic lists (List<T>).
The goal is to translate a large amount of data using definitions. I want the defined data to be filterable, searchable, etc. Using a database makes more sense the more I think about it, but I would like to confirm with more experienced developers before I make the changes yet again (I was using structs and ArrayLists at first).
The only drawback I can think of is that the data does not need to be retained after it has been translated and viewed by the user. There is no need for permanent storage, so using a database might be a little overkill.
It is not absolutely necessary to go to a database. It depends on the actual size of the data and the processing you need to do. If you are loading the data into a List<T> with a custom class, why not use LINQ to do your querying and filtering? Something like:
// Query the in-memory list; note '==' for comparison, not '='.
var query = from foo in fooList        // fooList is a List<Foo>
            where foo.Prop == criteriaVar
            select foo;
The real question is whether the data is so large that it cannot be loaded into memory comfortably. If that is the case, then yes, a database would be much simpler.
This is not a large amount of data. I don't see any reason to involve a database in your analysis.
There IS a query language built into C# -- LINQ. The original poster currently uses a list of objects, so there is really nothing left to do. It seems to me that a database in this situation would add far more heat than light.
It sounds like what you want is a database. SQLite supports in-memory databases (use ":memory:" as the filename). I suspect others may have an in-memory mode as well.
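With the System.Data.SQLite provider, that looks roughly like this (a sketch; the table and column names are illustrative, and the database vanishes when the connection closes):
using System.Data.SQLite;

// ":memory:" creates a database that lives only as long as this connection.
using (var conn = new SQLiteConnection("Data Source=:memory:"))
{
    conn.Open();
    using (var cmd = new SQLiteCommand(
        "CREATE TABLE Lines (Field1 TEXT, Field2 REAL)", conn))
    {
        cmd.ExecuteNonQuery();
    }
    // ... bulk-insert the parsed lines, then filter/search with plain SQL ...
}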
I faced the same problem at my previous company while looking for a good, concrete way to handle a lot of barcode-generated files. The barcode system produced text files with thousands of records each, and manipulating and presenting the data was difficult at first. What I ended up writing was a class that read the file, loaded the data into a DataTable, and saved it to a database (SQL Server 2005 in my case). From there I could easily manage the saved data and present it however I liked. The main point is to read the data from the file and save it to the database; once you do, you have a lot of options for manipulating and presenting it the way you like.
If you do not mind using Access, here is what you can do:
Attach a blank Access db as a resource.
When needed, write the db out to file.
Run a CREATE TABLE statement that handles the columns of your data.
Import the data into the new table.
Use SQL to run your calculations.
On close, delete that Access db.
You can use a program like Resourcer to load the db into a resx file
Then use the following code to pull the resource out of the project, take the byte array, and save it to the temp location with the temp filename:
using System.Resources;

// "MyProject.blank_db" is the location and name of the resource file;
// "access.blank" is the tag given to the resource when it was saved.
ResourceManager res = new ResourceManager("MyProject.blank_db", this.GetType().Assembly);
byte[] b = (byte[])res.GetObject("access.blank");
If the only thing you need to do is search and replace, you may consider using sed and awk; you can do searches using grep. On a Unix platform, of course.
From your description, I think Linux command-line tools can handle your data very well. Using a database may unnecessarily complicate your work. If you are using Windows, these tools are also available in various ways; I would recommend Cygwin. The following tools may cover your task: sort, grep, cut, awk, sed, join, paste.
These Unix/Linux command-line tools may look scary to a Windows person, but there are reasons people love them. These are mine:
They allow your skill to accumulate: your knowledge of a particular tool can be helpful in different future tasks.
They allow your efforts to accumulate: the command line (or scripts) you used to finish the task can be repeated as many times as needed with different data, without human interaction.
They usually outperform anything you could write yourself. If you don't believe it, try to beat sort with your own version on terabyte files.
