Offline Report Generation Service with Archive in .NET - C#

I am building a web application that handles a large number of transactions per day, with SQL Server in the back end. I want to generate the reports every day as an offline task, keep them ready on the server for a period of 7 days, and then archive the reports that are more than 7 days old. I am looking at possible ways to achieve this in a C#/.NET application. Is there a way I can generate these reports offline, in some kind of service running on a scheduler, and display them in the web application when the user requests a report?
I know that there are reporting tools like Crystal Reports and SSRS, but I am not sure whether I will be able to achieve my requirements with them.
Any insight would be of great help.
Thanks in advance,
Nayan K

Depending on what sort of reporting you are after, this might be more of an SQL question. We do daily roll-ups of certain data using stored procedures, storing the reporting information in summary tables. We also have vendor-supplied reporting for some of our systems that generates and emails reports, some of our transactional data loaded into a data warehouse with various reporting options, a couple of C# programs running from Task Scheduler on one of our servers... the list goes on.
Which model is best for you will depend on various factors, including what you personally prefer.
So the short answer is, yes, you can write a C# console application and run it from the Task Scheduler if that's what works for you.
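As a rough sketch of that approach (the folder paths and the HTML placeholder output are assumptions - substitute whatever rendering you actually use, and note that ZipFile needs .NET 4.5 or later), a console job like the one below, scheduled daily, covers both the generation step and the 7-day archive requirement:

using System;
using System.IO;
using System.IO.Compression;

class ReportJob
{
    // Assumed locations: current reports are served by the web app, old ones are zipped away.
    const string ReportDir  = @"D:\Reports\Current";
    const string ArchiveDir = @"D:\Reports\Archive";

    static void Main()
    {
        Directory.CreateDirectory(ReportDir);
        Directory.CreateDirectory(ArchiveDir);

        GenerateDailyReport();
        ArchiveOldReports(TimeSpan.FromDays(7));
    }

    static void GenerateDailyReport()
    {
        // In the real job this would query SQL Server and render the report
        // (PDF, Excel, HTML - whatever the web app is going to serve).
        string path = Path.Combine(ReportDir, "DailyReport_" + DateTime.Today.ToString("yyyyMMdd") + ".html");
        File.WriteAllText(path, "<html><body>report body goes here</body></html>");
    }

    static void ArchiveOldReports(TimeSpan maxAge)
    {
        foreach (string file in Directory.GetFiles(ReportDir))
        {
            if (DateTime.UtcNow - File.GetCreationTimeUtc(file) > maxAge)
            {
                // Zip each expired report into the archive folder, then remove the original.
                string zipPath = Path.Combine(ArchiveDir, Path.GetFileName(file) + ".zip");
                using (var zip = ZipFile.Open(zipPath, ZipArchiveMode.Create))
                {
                    zip.CreateEntryFromFile(file, Path.GetFileName(file));
                }
                File.Delete(file);
            }
        }
    }
}

The web pages then only ever read files from the current and archive folders, so report requests never hit SQL Server during the day.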

Related

How can I expand my program so it can be used on multiple devices, all with access to the old database, without losing data?

I built software for a farm using C#; the program was meant to keep track of the inventory and the financial transactions related to the farm's work.
The software was built to be installed on the manager's computer, who entered the farm's data and retrieved reports and so on, while the accountant used the same PC, with a different account, for the financial part of the program.
Now the farm's business has grown and more users need to use the system. Can I move the database with the old data to a server, so users can log into the system from different PCs at the same time and continue the old tasks?
If I can - what do I change in my code?
P.S. The database was done in MS Access.
There is not a lot of information to go on here. I can tell you that Access is a file-based database system, so while you could put the database file on a server or a NAS device, with multiple users you should expect to run into the usual problems of Windows file sharing - very slow performance, as a minimum.
It is also possible that the database may have been designed for a single user at a time, and without more information it is impossible to know whether the developers allowed for multi-user access at all, or whether several people opening the file at once could lead to one person overwriting another's data and corrupting it.
The short answer is that if the original developers are no longer around and you cannot put the question to them, then you probably need a new dedicated application to do the work, which would mean either a complete rewrite or an alternative commercial application.
For multi-user work, Microsoft SQL Server, MySQL, or even Firebird or another dedicated database back end would be the way to go. The front end could be anything - WinForms, WPF, even a web application if that is what you want - but it would have to be written.
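To give a feel for what "what do I change in my code" can mean in practice, the data access is usually the main thing: swap the Access OLE DB connection for a connection to the server database. A minimal sketch, assuming a SQL Server back end and an Inventory table (the connection strings and names here are placeholders for your real schema):

using System.Data;
using System.Data.SqlClient;

class InventoryRepository
{
    // Old (Access, a single file on a shared drive):
    // var cn = new System.Data.OleDb.OleDbConnection(
    //     @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=\\server\share\farm.accdb;");

    // New (SQL Server, a proper multi-user back end):
    const string ConnectionString =
        @"Data Source=SERVERNAME\SQLEXPRESS;Initial Catalog=FarmDb;Integrated Security=True;";

    public DataTable LoadInventory()
    {
        using (var cn = new SqlConnection(ConnectionString))
        using (var da = new SqlDataAdapter("SELECT * FROM Inventory", cn))
        {
            var table = new DataTable();
            da.Fill(table);   // Fill opens and closes the connection itself
            return table;
        }
    }
}

How much of the existing code a change like this touches depends entirely on how the original developers structured their data access, which again is impossible to say without seeing it.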
I hope that this is helpful.

How can I know how many queries are fired on my SQL Server from a specific server?

Background:
I work for a company where two departments use the same master database. The application for one department is developed by my team, while the one for the other department is outsourced. These are Windows-based applications, and several users work on the system concurrently.
Now a lot of conflict is going on because of the extreme load on the master database, and we do not have enough statistics to debate with.
I wish to know how many queries are being fired from that other application. Its source code is not available to us.
Is there any simple tool or logging facility which I can set up that will count the number of queries fired from that server?
Thanks.
P.S.: I know a somewhat similar question exists, but it is for MySQL and does not help solve my problem: "How can I know how many queries are fired in my database?"
You can simply use SQL Server Profiler to get that kind of information. It won't be practical to run it for a long time but if you know when the problems happen, you can just monitor it this way. You can save the output and analyse it later.
You can use some commercially available tools to record all the activity in the database. Some of them are very good and allow you to analyse information in great details.
The following query will give you the queries that have been executed on the server, together with the database each one was executed against:

select db_name(sql_text.dbid) as [Database],
       case db_id()
           when sql_text.dbid then object_name(sql_text.objectid)
           else cast(sql_text.objectid as sysname)
       end as [Object],
       sql_text.text as [AllTSQL],
       -- extract just the statement within the batch, using the offsets from the stats row
       substring(sql_text.text,
                 (stats.statement_start_offset / 2) + 1,
                 ((case stats.statement_end_offset
                       when -1 then datalength(sql_text.text)
                       else stats.statement_end_offset
                   end - stats.statement_start_offset) / 2) + 1) as [FragmentTSQL]
from sys.dm_exec_query_stats stats
cross apply sys.dm_exec_sql_text(stats.sql_handle) as sql_text

Please let me know if this works.
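If you need to narrow the activity down to the other department's machine specifically, one rough option is to sample the session/request DMVs from a small monitor program and group by client host. This is only a sketch: it counts what happens to be running at each poll (so Profiler or a server-side trace is still the way to get an exact total), it needs VIEW SERVER STATE permission, and the connection string and sample interval are placeholders.

using System;
using System.Data.SqlClient;
using System.Threading;

class RequestSampler
{
    // Placeholder connection string - point it at the instance you want to watch.
    const string ConnectionString = "Data Source=.;Initial Catalog=master;Integrated Security=True;";

    // Currently executing user requests, grouped by client machine and application.
    const string Sql = @"
        select s.host_name, s.program_name, count(*) as running_requests
        from sys.dm_exec_requests r
        join sys.dm_exec_sessions s on s.session_id = r.session_id
        where s.is_user_process = 1
        group by s.host_name, s.program_name;";

    static void Main()
    {
        while (true)
        {
            using (var cn = new SqlConnection(ConnectionString))
            using (var cmd = new SqlCommand(Sql, cn))
            {
                cn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        Console.WriteLine("{0:HH:mm:ss}  {1} / {2}: {3} running request(s)",
                            DateTime.Now, reader["host_name"], reader["program_name"],
                            reader["running_requests"]);
                    }
                }
            }
            Thread.Sleep(TimeSpan.FromSeconds(10)); // sample interval - adjust to taste
        }
    }
}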

Performance Issues in Real-time data visualization application

Here's the scenario:
I'm working on a decision-support system which is being developed as a Windows Forms client with a MySQL database server. There are sources which update data in the MySQL database in real time, and my application is supposed to read and refresh the latest data every second and present it to the user in graphically rich forms of presentation such as graphs, bars, grids, etc.
Now we are facing problems and have performance bottlenecks. We found that reading directly from the database is the main problem, and we want to speed this part up. I came across memcached, but I'm not sure if I can use it in this scenario. Can you please help me remove this bottleneck? Can memcached be used in this scenario? What other alternatives could help in this situation?
Preferably, I would want to perform everything in memory and take database snapshots to disk every 15 minutes. Is there a way I can do this?
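For context, this is roughly the shape of what I have in mind: one background refresh into an in-memory snapshot that all the forms read, so the UI never queries MySQL directly. Everything here (the loader delegate, the interval, using a DataTable as the snapshot) is just illustrative:

using System;
using System.Data;
using System.Threading;

class SnapshotCache
{
    private readonly Func<DataTable> _load;    // e.g. the single SELECT against MySQL
    private readonly Timer _timer;
    private volatile DataTable _current = new DataTable();

    public SnapshotCache(Func<DataTable> load, TimeSpan interval)
    {
        _load = load;
        _timer = new Timer(_ => Refresh(), null, TimeSpan.Zero, interval);
    }

    // Charts and grids read this instead of hitting the database.
    public DataTable Current
    {
        get { return _current; }
    }

    private void Refresh()
    {
        try
        {
            // Swap in the new snapshot; readers just keep whatever reference they grabbed.
            _current = _load();
        }
        catch
        {
            // Keep serving the last good snapshot if the database read fails.
        }
    }
}

The 15-minute snapshot to disk could then simply serialize whatever is currently in the cache.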

Schedule a Microsoft Dynamics CRM 4.0 Workflow

I am in the process of putting together a custom workflow activity to be used in Microsoft Dynamics CRM 4.0.
What I would ultimately like to achieve is to configure a workflow that runs on a scheduled basis, i.e. runs every 2 hours Monday to Friday, rather than on a particular "CRM event" like create, delete, status change, etc.
Does anyone have any ideas?
Maybe schedule it outside of CRM?
Edit 1:
What we are doing is processing rows in a staging table that gets generated from a front-end site. We are creating contact/account and opportunity records in CRM based on the data captured from the front-end.
The more I think about it, the more I'm thinking that using workflow is possibly not the best solution.
What about using a Windows service?
Workflow was not the best option for this situation, due to the following:
Can't schedule it to run
The process can only be triggered by a CRM create, update or similar message
I went with a combination of the following:
A SQL CLR sproc that gets called from an UPDATE trigger on the staging table. The CLR sproc calls a web service that generates the CRM contacts/accounts. That way the front-end site can create records and set a "ready to process" flag once all data has been entered.
The requirement changed from a scheduled solution to real-time processing (well, not ACTUALLY real time). The process needs to run as records are entered from the front-end site.
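For reference, the Windows service / polling route we looked at first would have been roughly this shape (the table, the columns, and the CreateCrmContact helper are just placeholders for the real staging schema and whatever wrapper you have around the CRM 4.0 web service):

using System;
using System.Data;
using System.Data.SqlClient;
using System.Threading;

class StagingProcessor
{
    // Placeholder connection string for the staging database.
    const string ConnectionString = "Data Source=.;Initial Catalog=Staging;Integrated Security=True;";

    static void Main()
    {
        while (true)
        {
            ProcessPendingRows();
            Thread.Sleep(TimeSpan.FromMinutes(1));   // polling interval - pick what suits
        }
    }

    static void ProcessPendingRows()
    {
        using (var cn = new SqlConnection(ConnectionString))
        {
            cn.Open();

            var pending = new DataTable();
            using (var da = new SqlDataAdapter(
                "SELECT Id, FirstName, LastName, Email FROM StagingContacts " +
                "WHERE ReadyToProcess = 1 AND Processed = 0", cn))
            {
                da.Fill(pending);
            }

            foreach (DataRow row in pending.Rows)
            {
                CreateCrmContact(row);   // hypothetical wrapper around the CRM web service call

                using (var update = new SqlCommand(
                    "UPDATE StagingContacts SET Processed = 1 WHERE Id = @id", cn))
                {
                    update.Parameters.AddWithValue("@id", row["Id"]);
                    update.ExecuteNonQuery();
                }
            }
        }
    }

    static void CreateCrmContact(DataRow row)
    {
        // Placeholder: call the CRM 4.0 web service (or your own service) here.
        Console.WriteLine("Would create CRM contact for {0} {1}", row["FirstName"], row["LastName"]);
    }
}

In the end the trigger + CLR approach won out because it reacts as soon as the flag is set rather than waiting for the next poll.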
Hope all that makes sense!
A Windows workflow using the CRM web services is one option; a better option would be to change your web form to access the CRM web service and enter the data directly.
If you really want to use workflows, you can download a tool from http://www.patrickverbeeten.com/pages/TechnicalNet/MicrosoftCrm/PeriodicWorkflows.aspx?id=23 that you install on your CRM server; it allows you to use Windows scheduled tasks to trigger them.

Architecture Guidance Needed?

We are about to automate a number of processes for our reporting team.
(The reports are daily reports, weekly reports, monthly reports, etc.)
Mostly the process is pulling some data from Oracle and then filling it into particular Excel template files.
Each report, and therefore its template, is different from the others.
Apart from the Excel file manipulation, there is hardly any business logic behind these.
The client wants an integrated tool, with all the automated processes placed as menus/submenus.
Right now there are roughly 30 processes waiting to be automated.
And we are expecting more new reports in the next quarter.
I am nowhere near having any practical experience when it comes to architecture.
I have already been maintaining two or three systems (they are more than 4 years old) for this prestigious client, and it is very likely that the tool mentioned above will be maintained for another 3 years.
From past experience I have been through the pain of implementing change requests against a rigid and undocumented code base, resulting in the breakdown of the system and, eventually, of myself.
So my main and topmost concern is maintainability.
While searching for guidance I came across this link:
Smart Clients Using CAB and SCSF
Is the above link appropriate for my requirement?
Also, should I place each automated process in a separate form under a single project, or place them in separate projects under a single solution?
Please correct me if I have missed any other important information.
Thanks.
We built a similar system many years ago. The system consisted of a main process and a series of Excel report generators that implemented a plugin architecture. Each report had its own report generator and could be generated on its own schedule, quite similar to how quartz and quartz.net handle scheduling (but this was many years before they were created). The plugin architecture was the key to the system, as new report generators could be created and dropped in dynamically.
The main process was in charge of launching and monitoring the generators, listening for events triggered from the generators, and distributing all the reports. Distribution mostly consisted of FTPing the output to a file server where reports could be viewed and saved from a web site, but we also had some reports emailed to a distribution list.
The main database that stored all the report data also housed all the system meta data that informed the main process and the generators of the details of their work to be performed.
In our system all of the reports ran sequentially, mostly due to Excel limitations at the time. Assuming Excel is now capable of behaving itself, there's no reason a new system couldn't be designed to run in parallel.
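To make the plugin idea concrete, the core of it is just a small interface plus runtime discovery; scheduling, events and distribution all hang off that. The interface shape and the plugin folder below are illustrative rather than exactly what we had:

using System;
using System.IO;
using System.Linq;
using System.Reflection;

public interface IReportGenerator
{
    string Name { get; }
    void Generate(DateTime runDate);   // pull the data, fill the Excel template, save/distribute
}

class ReportHost
{
    static void Main()
    {
        // Discover every generator assembly dropped into the plugin folder.
        var generators = Directory.GetFiles("Plugins", "*.dll")
            .Select(Assembly.LoadFrom)
            .SelectMany(a => a.GetTypes())
            .Where(t => typeof(IReportGenerator).IsAssignableFrom(t) && !t.IsAbstract)
            .Select(t => (IReportGenerator)Activator.CreateInstance(t))
            .ToList();

        foreach (var generator in generators)
        {
            Console.WriteLine("Running report: " + generator.Name);
            generator.Generate(DateTime.Today);
        }
    }
}

Adding a new report then means dropping in a new assembly plus its metadata, with no change to the host process.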
Why don't you replace the various Excel templates with a proper reporting solution? Using a decent reporting product has the following benefits:
everyone can have access to the same report
you can use roles to prevent people seeing certain reports
the users can change the parameters of the reports every time they run them, or they can use the defaults you set up for them
the reporting can be highly automated, it can run automatically on certain schedules (like 9am every Monday morning, or every Sunday evening after end of week processing)
reports can be delivered in a variety of formats
reporting tools can use stored procs in the database, those stored procs can encapsulate business logic
Personally I would advocate SQL Server Reporting Services (the 2008 version - try to avoid 2005), although there are other products out there as well. SSRS can talk to an Oracle database, and it can be obtained and used for free.
You also made vague references to "processes". The way I read this, you have some code running somewhere that pulls data out of the database, massages it somewhat, and places it into some staging area - maybe the Excel spreadsheet is the staging area, and further reports are derived from that staged data. If my interpretation is correct, then the above-mentioned reporting product (and possibly many others) removes the need to do this.
