We are about to automate a number of processes for our reporting team.
(The reports are daily reports, weekly reports, monthly reports, etc.)
Most of the processes involve pulling some data from Oracle and then filling it
into particular Excel template files.
Each report, and therefore each template, is different from the others.
Apart from the Excel file manipulation, there is hardly any business logic behind them.
The client wants an integrated tool with all the automated processes exposed as
menus/submenus.
Right now there are roughly 30 processes waiting to be automated,
and we are expecting more new reports in the next quarter.
I am nowhere near having any practical experience when it comes to architecture.
I have already been maintaining two or three systems (more than 4 years old)
for this prestigious client, and it is very likely that the tool mentioned above will be maintained for another 3 years.
From past experience I've been through the pain of implementing change requests against a rigid, undocumented code base, resulting in the breakdown of the system and, eventually, of me.
So my main and topmost concern is maintainability.
While searching around for this, I came across this link:
Smart Clients Using CAB and SCSF
Is the above link appropriate for my requirement?
Also, should I place each automated process in a separate form under a single project,
or place them in separate projects under a single solution?
Please correct me if I have missed any other important information.
Thanks.
We built a similar system many years ago. The system consisted of a main process and a series of Excel report generators that implemented a plugin architecture. Each report had its own report generator and could be generated on its own schedule, quite similar to how quartz and quartz.net handle scheduling (but this was many years before they were created). The plugin architecture was the key to the system, as new report generators could be created and dropped in dynamically.
The main process was in charge of launching/monitoring generators, listening for events that were triggered from generators, and distributing all the reports. Distribution mostly consisted of FTP to a file server where reports could be viewed and saved from a web site. But we did also have some reports emailed to a distribution list.
The main database that stored all the report data also housed all the system meta data that informed the main process and the generators of the details of their work to be performed.
In our system all of the reports ran sequentially, mostly due to Excel limitations at the time. Assuming Excel is now capable of behaving itself, there's no reason that a new system couldn't be designed to run in parallel.
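To give a feel for what the plugin contract looked like, here's a minimal sketch in C#; the interface name, the folder-scanning approach, and the idea of returning a file path are assumptions for illustration, not the original code.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Reflection;

// Hypothetical plugin contract: each report generator implements this interface
// and is compiled into its own assembly dropped into a plugins folder.
public interface IReportGenerator
{
    string ReportName { get; }

    // Pull data, fill the Excel template, and return the path of the generated file.
    string Generate(DateTime runDate);
}

// The main process discovers generators at startup by scanning the folder,
// so new reports can be added without rebuilding the host.
public static class GeneratorLoader
{
    public static IEnumerable<IReportGenerator> LoadAll(string pluginFolder)
    {
        foreach (var dll in Directory.GetFiles(pluginFolder, "*.dll"))
        {
            var assembly = Assembly.LoadFrom(dll);
            foreach (var type in assembly.GetTypes())
            {
                if (typeof(IReportGenerator).IsAssignableFrom(type) && type.IsClass && !type.IsAbstract)
                    yield return (IReportGenerator)Activator.CreateInstance(type);
            }
        }
    }
}
```

The main process then just iterates the loaded generators on each report's schedule and distributes whatever files they return.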
Why don't you replace the various Excel templates with a proper reporting solution? Using a decent reporting product has the following benefits:
everyone can have access to the same report
you can use roles to prevent people seeing certain reports
the users can change the parameters of the reports every time they run them, or they can use the defaults you set up for them
the reporting can be highly automated, it can run automatically on certain schedules (like 9am every Monday morning, or every Sunday evening after end of week processing)
reports can be delivered in a variety of formats
reporting tools can use stored procs in the database, those stored procs can encapsulate business logic
Personally I would advocate the use of SQL Server Reporting Services (the 2008 version; try to avoid 2005), although there are other products out there as well. SSRS can talk to an Oracle database, and it can be obtained and used for free.
You also made vague references to "processes". The way I read this, you have some code running somewhere that pulls data out of the database, massages it somewhat, and places it into some staging area - maybe the Excel spreadsheet is the staging area, and further reports are derived from that staged data. If my interpretation is correct, then the above-mentioned reporting product (and possibly many others) removes the need to do this.
I built software for a farm using C#; the program was meant to keep track of the inventory and the financial transactions related to the farm's work.
The software was installed on the manager's computer, who entered the farm's data and retrieved reports and so on, while the accountant used the same PC for the financial part of the program under a different account.
Now the farm's business has grown and more users need to use the system. Can I move the database with the old data to a server, so that users can log in to the system from different PCs at the same time and continue the old tasks?
If I can - what do I change in my code?
P.S. the database was done in MS Access.
Not a lot of information to go on here. I can tell you that Access is a file-based database system, so whilst you could put the database file on a server or a NAS device without a problem, with multiple users you should expect to run into the usual problems of Windows file sharing - VERY SLOW performance as a minimum.
It is also possible that the database was designed for a single user at a time, and without more information it is impossible to know whether the developers allowed for multi-user access at all, or whether several people opening the file at once could end up overwriting each other's data, leading to corruption.
The short answer is that if the original developers are no longer around and you cannot ask them, then you probably need a new dedicated application to do the work, which would mean either a complete rewrite or an alternative commercial application.
For multi-user access, Microsoft SQL Server, MySQL, Firebird, or another dedicated database back end would be the way to go. The front end could be anything - WinForms, WPF, even a web application if that is what you want - but it would have to be written.
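If the existing front end talks to Access through OleDb, a large part of the code change is swapping the connection and command classes and the connection string. A rough sketch of what the new data access might look like against SQL Server (the server, database, and table names are placeholders, not from your system):

```csharp
using System;
using System.Data.SqlClient;

class InventoryReader
{
    static void Main()
    {
        // Replaces the old OleDb connection string that pointed at the shared .mdb/.accdb file.
        var connectionString = "Server=FARM-SERVER;Database=FarmDb;Integrated Security=true;";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("SELECT Id, Name FROM Inventory", connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine(reader.GetInt32(0) + ": " + reader.GetString(1));
                }
            }
        }
    }
}
```

SQL Server's import tooling can migrate the Access tables and data, but any Access-specific SQL (date functions, wildcards, etc.) in the existing queries would need to be reviewed by hand.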
I hope that this is helpful.
I am trying to build a web application that handles a large number of transactions per day, with SQL Server at the back end. I want to generate the reports every day as an offline task, keep them ready on the server for a period of 7 days, and then archive the reports that are more than 7 days old. I am looking at possible ways to achieve this in a C#.NET application. Is there a way I can generate these reports offline in some kind of service running on a scheduler, and display them in the web application when the user requests a report?
I know that there are reporting tools like Crystal Reports and SSRS, but I am not sure whether I will be able to achieve my requirements with these.
Any insight would be of great help
Thanks in advance
Nayan K
Depending on what sort of reporting you are after, this might be more of an SQL question. We do daily roll-ups of certain data to a table using stored procedures, storing the reporting information in tables. We also have vendor-supplied reporting for some of our stuff that generates and emails reports, as well as some of our transactional data loaded into a data warehouse with various reporting options, a couple of C# programs running from Task Scheduler on one of our servers... the list goes on.
Which model is best for you will depend on various factors, including what you personally prefer.
So the short answer is, yes, you can write a C# console application and run it from the Task Scheduler if that's what works for you.
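As a rough illustration of that last option (not a finished design - the query, file format, and folder paths are placeholders), a scheduled console job could generate the day's report and sweep older files into an archive folder, and the web app would simply serve whatever sits in the current folder:

```csharp
using System;
using System.IO;

class DailyReportJob
{
    // Run from Windows Task Scheduler, e.g. every night at 01:00.
    static void Main()
    {
        string reportFolder = @"C:\Reports\Current";   // placeholder paths
        string archiveFolder = @"C:\Reports\Archive";
        Directory.CreateDirectory(reportFolder);
        Directory.CreateDirectory(archiveFolder);

        // 1. Generate today's report (run your SQL here and render it to a file).
        string todayFile = Path.Combine(reportFolder,
            "daily-" + DateTime.Today.ToString("yyyyMMdd") + ".csv");
        File.WriteAllText(todayFile, BuildReportCsv());

        // 2. Move anything older than 7 days to the archive so the web app
        //    only exposes the recent reports.
        foreach (var file in Directory.GetFiles(reportFolder))
        {
            if (File.GetCreationTime(file) < DateTime.Today.AddDays(-7))
                File.Move(file, Path.Combine(archiveFolder, Path.GetFileName(file)));
        }
    }

    static string BuildReportCsv()
    {
        // Placeholder: query SQL Server and format the rows here.
        return "Date,Total" + Environment.NewLine + DateTime.Today.ToShortDateString() + ",0";
    }
}
```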
There is the following scenario:
I am working on a telecom-based project which generates prices for the site details provided as input.
Input is passed in the form of an Excel sheet and the corresponding output is displayed in a GridView.
The output grid contains two DropDownLists which are populated based on the site details.
That means there are two dropdowns in every row of the GridView, each filled by hitting the database.
At present this tool works fine for 200 sites, but now the client wants to pass 10,000 sites from the Excel sheet as input.
It would be a very tedious job to hit the database for 10,000 sites, and it would slow down the performance of the system.
I am using ASP.NET 3.5 with C#, and the database is SQL Server 2008.
Does anybody have a solution for the best possible way to do this task?
Scalability is your key here.
You should utilize load balancing where possible. If you can make some processes asynchronous then do so - using something like ActiveMQ or RabbitMQ will stop UI hangs.
Also consider having clustered DB Servers.
Your goal should always be to give the user feedback ASAP (async feedback), guarantee work processing (queue/messaging system), and handle lots of users (load balancing).
There's also a lot to be said for code optimization; have a look at your code and see if there are any areas where you can "trim the fat" to speed things up. One sketch of that for your case follows.
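For example, if the two dropdowns in every row are populated from the same lookup tables, one obvious piece of fat to trim is the per-row database hit: load the lookup data once and bind each GridView row from memory. A rough sketch, assuming a hypothetical SiteOptions table (your table and column names will differ):

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;

public static class DropdownCache
{
    // Load the dropdown options once per request (or cache them application-wide),
    // then bind each GridView row from this in-memory dictionary in RowDataBound.
    public static Dictionary<int, List<string>> LoadOptionsBySiteType(string connectionString)
    {
        var options = new Dictionary<int, List<string>>();
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT SiteTypeId, OptionName FROM SiteOptions", connection)) // hypothetical table
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    int siteTypeId = reader.GetInt32(0);
                    if (!options.ContainsKey(siteTypeId))
                        options[siteTypeId] = new List<string>();
                    options[siteTypeId].Add(reader.GetString(1));
                }
            }
        }
        return options;
    }
}
```

That turns 10,000 rows x 2 dropdowns from 20,000 queries into one query, with the per-row work done entirely in memory.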
We're building a survey system and utilising ASP.NET MVC and wondered if anyone can offer suggestions on the architecture.
Here's the problem we're trying to solve. Essentially an agency sends out several surveys every year. They're very structured and not SurveyMonkey-style surveys - they're really applications for feedback. Much like a visa application, there are lots of things the applicants need to do, and sometimes it takes them 2-3 weeks to fill one out.
They can upload files (proofs of purchase etc. - PDF/JPG) and also add multiple "items". For instance, say they've worked for McDonalds: there could be 20 different franchises, and they build a list of the locations they've worked at. Three weeks later there could be another 3 new locations, and 2 may have closed down. So we need to ensure the forms can handle those situations.
The forms themselves (markup and data) change every year - I should mention that this is for a taxation/finance/budget system.
We were thinking of using MVC, with XML to store the data (temporarily), XSD to validate the data, and XSL to transform the data into presentable markup (for them to fill out); once they "Submit" an application, it gets stored in the DB in the relevant areas.
When the user starts the application process, they can save their progress so far (we validate whatever they have entered and ignore what they haven't), and it is saved as an XML blob in the DB. When they're finally ready to submit, we do a full validation, upload the files and store them securely (they contain business proofs and accounting statements), and then run some workflows.
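To be concrete, the submit-time validation step we have in mind is roughly the following (the schema and file names here are just placeholders):

```csharp
using System;
using System.Xml;
using System.Xml.Schema;

class SubmitValidation
{
    static void Main()
    {
        // On "save progress" we only warn; on final "Submit" we run strict schema validation.
        var settings = new XmlReaderSettings { ValidationType = ValidationType.Schema };
        settings.Schemas.Add(null, @"Forms\2010\ApplicationForm.xsd"); // placeholder schema path
        settings.ValidationEventHandler += (sender, e) =>
        {
            // Collect every problem for display instead of stopping at the first one.
            Console.WriteLine(e.Severity + ": " + e.Message);
        };

        using (var reader = XmlReader.Create("pending-application.xml", settings))
        {
            while (reader.Read()) { } // reading the document triggers validation
        }
    }
}
```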
What I'm really concerned about is how to manage changing form versions (a year later). How are form/application systems written these days? We have 2 months to pull this off and about 30 forms to deliver - so 30 x XML, 30 x XSD, 30 x XSL.
This might be a case for integration with Windows Workflow Foundation, since you're talking about maintaining the state of a long-running workflow (completing the application).
If you could compartmentalize the various components of the application process, you could modify the workflow in future years by removing, rerouting, and/or modifying existing portions of the workflow.
That said, it sounds like you might have pretty tight time constraints. It might be worth a couple of hours' investigation into WF, but consider carefully whether introducing something new might jeopardize your deadline.
As for the XML, XSD, XSL route, I think it depends on your team's experience. Personally, I shy away from that and would store the data in one or more "pending applications" tables in a relational database. From there (of course, you could do this from XML, too), build up proper business objects and models to which your MVC views can bind. Field-level validation is performed with Enterprise Library Validation or FluentValidation or the like, and final validation is performed by one or more validator classes that inspect all the constituent parts of the application.
To deal with possible changes, keep a clean separation between each of the 30 forms. You should be able to modify a given form next year without messing up others. Remember, you can always subclass or compose a model type if there are new requirements in future years, and you don't have to remove obsolete parts -- your new views just won't expose certain parts of the model.
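As a rough sketch of what that separation can look like (the form and property names here are invented, and FluentValidation is just one option for the field-level rules):

```csharp
using System.Collections.Generic;
using FluentValidation;

// One model per form; a later year's form extends or composes it
// without touching the earlier form's code or views.
public class FranchiseApplicationForm2010
{
    public string ApplicantName { get; set; }
    public List<string> Locations { get; set; }
}

public class FranchiseApplicationForm2011 : FranchiseApplicationForm2010
{
    // A new requirement introduced in the following year's form.
    public string TaxFileNumber { get; set; }
}

// Field-level validation lives beside the model it validates; a separate
// submit-time validator can inspect the application as a whole.
public class FranchiseApplicationForm2010Validator : AbstractValidator<FranchiseApplicationForm2010>
{
    public FranchiseApplicationForm2010Validator()
    {
        RuleFor(x => x.ApplicantName).NotEmpty();
        RuleFor(x => x.Locations).NotEmpty()
            .WithMessage("At least one location is required.");
    }
}
```

The 2010 views keep binding to the 2010 model; next year's views bind to the 2011 model and simply expose the extra fields.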
I am working on a sometimes-connected CRUD application that will be used primarily by small teams (2-4) of social workers and nurses to track patient information in the form of a plan. The application is a reworking of an ASP.NET app that was created before my time. There are approximately 200 tables across 4 databases. The web app version relied heavily on SPs, but since this version is a WinForms app that will be pointing to a local DB, I see no reason to continue with SPs. Also of note, I had planned to use merge replication to handle the syncing portion, and there seem to be some issues with those two together.
I am trying to understand what approach to use for the DAL. I originally had planned to use LINQ to SQL, but I have read tidbits stating it doesn't work well in a sometimes-connected setting. I have therefore been trying to read about and experiment with numerous solutions: SubSonic, NHibernate, Entity Framework. This is a relatively simple application, and due to a "looming" version 3 redesign this effort can be borderline "throwaway." The emphasis here is on getting a desktop version up and running ASAP.
What I am asking here is for anyone with experience using any of these technologies (or one I didn't list) to lend me your hard-earned wisdom. What, in your opinion, is the best approach for me to pursue? Any other insights on creating this kind of app? I am really struggling with the DAL portion of this program.
Thank you!
If the stored procedures do what you want them to, I would have to say I'm dubious that you will get benefits by throwing them away and reimplementing them. Moreover, it shouldn't matter if you use stored procedures or LINQ to SQL style data access when it comes time to replicate your data back to the master database, so worrying about which DAL you use seems to be a red herring.
The tricky part about sometimes connected applications is coming up with a good conflict resolution system. My suggestions:
Always use RowGuids as your primary keys to tables. Merge replication works best if you always have new records uniquely keyed.
Realize that merge replication can only do so much: it is great for bringing new data in disparate systems together. It can even figure out one sided updates. It can't magically determine that your new record and my new record are actually the same nor can it really deal with changes on both sides without human intervention or priority rules.
Because of this, you will need "matching" rules to resolve records that claim to be new but actually aren't. Note that this is a fuzzy step: rarely can you rely on a unique key being entered exactly the same on both sides without error. This means giving weighted matches where many of your indicators are the same or similar (see the sketch after these suggestions).
The user interface for resolving conflicts and matching up "new" records with the original needs to be easy to operate. I use something that looks similar to the classic three-way merge that many source control systems use: Record A, Record B, Merged Record. Users can default the Merged Record to A or B by clicking a header button, and can select individual fields by clicking them as well. Finally, the Merged Record's fields are open for edit, because sometimes you need to take parts of the address (say) from both A and B.
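A toy sketch of the weighted matching idea; the fields, weights, and threshold are illustrative assumptions, not values from any real system:

```csharp
using System;

public class PatientRecord
{
    public Guid RowGuid { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public DateTime DateOfBirth { get; set; }
    public string Postcode { get; set; }
}

public static class RecordMatcher
{
    // Returns a score between 0 and 1; pairs above a chosen threshold (say ~0.8)
    // are queued for the three-way merge screen rather than merged automatically.
    public static double MatchScore(PatientRecord a, PatientRecord b)
    {
        double score = 0;
        if (string.Equals(a.LastName, b.LastName, StringComparison.OrdinalIgnoreCase)) score += 0.4;
        if (a.DateOfBirth == b.DateOfBirth) score += 0.3;
        if (string.Equals(a.Postcode, b.Postcode, StringComparison.OrdinalIgnoreCase)) score += 0.2;
        if (string.Equals(a.FirstName, b.FirstName, StringComparison.OrdinalIgnoreCase)) score += 0.1;
        return score;
    }
}
```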
None of this should affect your data access layer in the slightest: this is all either lower level (merge replication, provided by the database itself) or higher level (conflict resolution, provided by your business rules for resolution) than your DAL.
If you can install a DB system locally, go for something you feel familiar with. The greatest problem, I think, will be the syncing and merging part. You must think through several possibilities, for example: what if you changed something that someone else deleted on the server? Who decides?
I have never used the Sync Framework myself, just read an article, but it may give you a solid foundation to build on. Whichever way you go with data access, the solution to the business logic will probably have a much wider impact...
There is a sample app called IssueVision that Microsoft put out back in 2004.
http://windowsclient.net/downloads/folders/starterkits/entry1268.aspx
Found link on old thread in joelonsoftware.com. http://discuss.joelonsoftware.com/default.asp?joel.3.25830.10
Other ideas...
What about mobile broadband? A couple of 3G cellular cards would work tomorrow, and your app would need no changes, barring large pages/graphics.
Or an Excel spreadsheet used in the field, with DTS or SSIS to import the data into the application, while a "better" solution is created.
Good luck!
If by SPs you mean stored procedures... I'm not sure I understand your reasoning for trying to move away from them, considering that they're fast, proven, and already written for you (i.e. tested).
Surely, if you're making an app that will mimic the original, there are definite merits to keeping as much of the original (working) codebase as possible - not the least of which is speed.
I'd try installing a local copy of the DB, and then pushing all records affected since the last connected period up to the master DB when it does get connected.
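A very rough sketch of that push, assuming the tables carry a ModifiedDate column and the client records the time of its last successful sync (the table and column names are placeholders, and real code would also need the conflict handling described in the merge replication answer above):

```csharp
using System;
using System.Data.SqlClient;

public static class SyncPusher
{
    // Push rows changed locally since the last successful sync up to the master database.
    public static void PushChanges(string localCs, string masterCs, DateTime lastSync)
    {
        using (var local = new SqlConnection(localCs))
        using (var master = new SqlConnection(masterCs))
        {
            local.Open();
            master.Open();

            using (var select = new SqlCommand(
                "SELECT RowGuid, PlanText, ModifiedDate FROM PatientPlans WHERE ModifiedDate > @since", local))
            {
                select.Parameters.AddWithValue("@since", lastSync);
                using (var reader = select.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        // Simplified upsert: update if the row exists on the master, otherwise insert it.
                        var upsert = new SqlCommand(
                            "UPDATE PatientPlans SET PlanText = @text, ModifiedDate = @mod WHERE RowGuid = @id; " +
                            "IF @@ROWCOUNT = 0 INSERT INTO PatientPlans (RowGuid, PlanText, ModifiedDate) " +
                            "VALUES (@id, @text, @mod);", master);
                        upsert.Parameters.AddWithValue("@id", reader.GetGuid(0));
                        upsert.Parameters.AddWithValue("@text", reader.GetString(1));
                        upsert.Parameters.AddWithValue("@mod", reader.GetDateTime(2));
                        upsert.ExecuteNonQuery();
                    }
                }
            }
        }
    }
}
```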