Silverlight, connecting to multiple databases on different servers - c#

I'm developing a Silverlight 4 application (C#).
I used Silverlight Web application template and Entity framework to get started + VS 2010.
I now have a running application that connects to the database and displays data properly.
The database has one table with 5 attributes.
So here is what I need to do now:
I have a connection string to connect to a database on an external server. The database is similar to mine, but with more tables and more up-to-date information. I need to connect to this database. (EDIT: since this is outside my project I won't have an EF model for it, just the privilege to query a single table (a VIEW, to be specific).)
Since I know the attribute names, table name, etc., I can use a SELECT query to get the data, i.e. execute it and get the result set (e.g. "SELECT R.name, R.marks FROM results R").
The result set from the query will be stored in, for example, a DataTable, and then inserted into my database.
I also created an object class with accessors for the table, so that I can give the 'result set' the structure before I insert.
Theoretically this sounds possible when I chalk it out on my board, but I want to know whether it would actually work.
I mean having two database connections open - is this possible? If I follow the same steps as above, can I achieve what I want?
Please let me know if I'm unclear about anything. This is just a smaller version of the real application, but the logic I need to implement is the same. :)
If there is a better approach I'm happy to consider.
Cheers

You may be overthinking this a little. I would add a new class library project and add a new ADO.NET Entity Data Model to that project using the connection string for the second database. After that, you'll be able to reference the new EF project from your Service or Host (Web) project.
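If you do end up going the plain ADO.NET route described in the question instead (no EF model for the external database), a minimal sketch of the server-side code in your Web (service) project could look like this. The connection strings and destination table name are placeholders, not anything from the original post:

using System.Data;
using System.Data.SqlClient;

// Hypothetical connection strings -- substitute your own.
const string externalConn = "Server=ExternalServer;Database=ExternalDb;Integrated Security=True;";
const string localConn = "Server=.;Database=MyDb;Integrated Security=True;";

// Read from the external VIEW into a DataTable.
var results = new DataTable();
using (var source = new SqlConnection(externalConn))
using (var adapter = new SqlDataAdapter("SELECT R.name, R.marks FROM results R", source))
{
    adapter.Fill(results);
}

// Insert the rows into the local table (assumed to have matching columns).
using (var destination = new SqlConnection(localConn))
{
    destination.Open();
    using (var bulkCopy = new SqlBulkCopy(destination))
    {
        bulkCopy.DestinationTableName = "dbo.Results";
        bulkCopy.WriteToServer(results);
    }
}

Having two connections open at the same time is not a problem either; each SqlConnection is independent, so you could just as well hold both open and copy row by row.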

Related

Have Users Interact with MS Access Queries instead of Tables

I am working on a project where I need to create a database to track the status of units throughout the production process. My current roadblock is getting users to interact with a DataGridView that is populated from a Microsoft Access query instead of a Microsoft Access table.
What I want to do is create a query in Microsoft Access and have it link to the DataGridView so end users can interact with a query instead of the actual tables, while populating all parent tables.
I am not sure if what I am attempting to do is possible or advised. This is the first time I have built a database in the professional world and want to make sure I am doing things properly. I have also never built a C# application for business use and have very limited experience with the language itself.
I have tried creating the query in Access and linking it to the application in the same way you would add a table from a data source. That allowed me to view the data in the query, but it displayed as read-only and did not allow any data to be altered (the query builder in the TableAdapter Query Configuration Wizard indicated it was read-only). I have tried adding all related table adapters to the TableAdapterManager and it still didn't help.
I apologize if this question sounds disjointed as I am trying to overcome one obstacle at a time and do not want to overload one question with multiple issues. I can supply my ERD if it will make things easier and I have it normalized to at least 2NF.
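For what it's worth, a saved Access query can at least be read into a DataTable through OleDb, much like a table; a minimal sketch, assuming a hypothetical query named qryUnitStatus in an .accdb file (the path and query name are made up):

using System.Data;
using System.Data.OleDb;

// Hypothetical database path -- replace with your own.
const string connString =
    @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\Data\Production.accdb;";

var table = new DataTable();
using (var conn = new OleDbConnection(connString))
using (var adapter = new OleDbDataAdapter("SELECT * FROM qryUnitStatus", conn))
{
    adapter.Fill(table);   // the query is read just like a table
}
// dataGridView1.DataSource = table;   // bind the result to the grid

As the TableAdapter wizard already hinted, a multi-table query generally comes back read-only, so edits usually have to be written back to the underlying tables individually.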

C# Is there an easier way to create a database, empty tables, tables with data in them by default, stored procs and views?

Before I posted this question, I did some Googling on how a database can be created through C#. Most of what I found points to either SMO or SQL script files, and dates back to the days of SQL Server 2005 and 2008.
So in this day and age, is there an easier way to create a database with empty tables, tables with data in them by default, stored procedures and views?
I need a suggestion.
I think the answer is probably Entity Framework. You can do 'code first' and use database migrations, allowing you to write your C# code and use that to generate a lot of the database for you.
Ultimately though, 'easier' is subjective. I personally find EF great for the 'normal' stuff, but at the end of the day, if you need a stored procedure to do some custom logic, you need to write that custom logic in some fashion.
Maybe have a look and see if you think it fits your needs.
https://www.asp.net/mvc/overview/getting-started/getting-started-with-ef-using-mvc/creating-an-entity-framework-data-model-for-an-asp-net-mvc-application
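As a rough illustration of the code-first idea (all class, property and initializer names below are made up for the example), the database, its tables and some default data can come straight out of C#:

using System.Data.Entity;   // EF 6, from the EntityFramework NuGet package

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class ShopContext : DbContext
{
    public DbSet<Product> Products { get; set; }
}

// Seeds default data the first time the database is created.
// (With migrations enabled you would use the migration Configuration.Seed method instead.)
public class ShopInitializer : CreateDatabaseIfNotExists<ShopContext>
{
    protected override void Seed(ShopContext context)
    {
        context.Products.Add(new Product { Name = "Default product" });
        context.SaveChanges();
    }
}

// At application startup:
// Database.SetInitializer(new ShopInitializer());

Stored procedures and views still end up as raw SQL in some form, for example in a migration's Sql("...") call or in script files.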
Look at the database projects in Visual Studio 2013. You create the database as a series of scripts using a familiar GUI, and changes are then published - publishing creates a change script targeting the connection you define. For a new database the whole thing gets created; publish against a partial or out-of-date version and the generated script only brings it up to date.
You can even write unit tests against your database using specialist tools, although I do find them lacking a bit.
There is more on MSDN.
It depends. Right out of the gate, for stored procedures and views, your best shot is to create them directly in the database through a workbench/management tool. You can then capture the definitions and store them in a script file to be replayed through C#.
As for tables, there are many ORMs that can generate tables from C#. Look at Entity Framework code-first examples.
I have generated tables using EF and it works fine. I then went into the database and created the views and stored procedures.
The trick is to migrate the new views and stored procedures into your EF model. You can google "Entity Framework code first adding views and SPs".
Worst case, you create the whole database through a database workbench and create a script that can be replayed to recreate everything, then use the EF database-first approach.
In either case you end up with a good set of auto-generated code to manage CRUD and object management, plus an abstracted data model.
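If you take the "capture the definitions and replay them through C#" route, a minimal sketch might look like the following; the file name is made up, and the GO handling is deliberately naive (GO is an SSMS batch separator, not T-SQL, so the script has to be split into batches):

using System;
using System.Data.SqlClient;
using System.IO;

const string connString = "Server=.;Database=MyNewDb;Integrated Security=True;";
string script = File.ReadAllText("CreateEverything.sql");

// Crude split on GO lines so each batch runs separately.
string[] batches = script.Split(new[] { "\r\nGO\r\n", "\nGO\n" },
                                StringSplitOptions.RemoveEmptyEntries);

using (var conn = new SqlConnection(connString))
{
    conn.Open();
    foreach (string batch in batches)
    {
        if (string.IsNullOrWhiteSpace(batch)) continue;
        using (var cmd = new SqlCommand(batch, conn))
        {
            cmd.ExecuteNonQuery();
        }
    }
}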

Data transfer from One database to another

I am looking for an idea or some direction. I have to transfer data from one database to another; both are structurally and schema-wise the same.
It's a complete database with maybe 70 tables and relationships between tables at different levels. Even though I'm going to mess up the identity values when I move across databases, as of now I am OK with that.
The idea I had was to load the required data from all tables into XML, then create a connection to the second database and push the data from that XML. That is repetitive and not the best way at all, so I'm looking for direction.
Can I use Entity Framework for this somehow?
I cannot use SSIS for this; it has to be C#. Sorry.
You can create a linked server as stated in the comments to your question. You seemed to indicate that you know how to do this, but in case not: in SQL Server Management Studio, connect to the source database server, drill down to "Server Objects > Linked Servers", then right-click, "New Linked Server", etc.
Then you would use a statement like this, for example, from your C# code:
insert into DestServer.DBName.dbo.TableName
select * from SourceServer.DBName.dbo.TableName
Assuming you are connected to 'SourceServer' and that 'SourceServer' maintains a linked server object pointing to 'DestServer'. Note: you don't actually need to use the fully-qualified name for the table on 'SourceServer', but I've put it there for clarity, i.e. you could also do this:
insert into DestServer.DBName.dbo.TableName
select * from TableName
Don't forget to set up the permissions properly in your linked server object so that your query can write to the table on the destination server. You can do this any number of ways, and often (because I work in a small environment where it's maintained by just me and a couple of other folks) I just use the "sa" login.
Yes, you can use linked servers from .NET.
You just use the four-part name.
What you can do in T-SQL in SSMS, you can do in a .NET SqlCommand.
My experience is that you get better performance connecting to the server you are writing to.
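Putting those two answers together, a minimal C# sketch; the server, database and table names are the same placeholders used above, and the linked server is assumed to already exist:

using System.Data.SqlClient;

const string sourceConnString =
    "Server=SourceServer;Database=DBName;Integrated Security=True;";

// Four-part name on the destination side, local table on the source side.
const string copySql =
    "insert into DestServer.DBName.dbo.TableName " +
    "select * from dbo.TableName";

using (var conn = new SqlConnection(sourceConnString))
using (var cmd = new SqlCommand(copySql, conn))
{
    cmd.CommandTimeout = 0;   // large copies can take a while
    conn.Open();
    cmd.ExecuteNonQuery();
}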

How to set up multiple Oracle databases (schemas) with the same user?

Our company inherited some software that runs on C# Visual Studio 2010, Windows 7 and Oracle 11g. After some effort we got the software working and got a stable database (schema) set up.
We are now starting the process of migrating some data from an old system to this "new" system. However, I don't want to mess up our working schema as I expect a bit of trial and error work will be needed with our data import.
I wanted to do the following:
Let's say our existing schema is called PROD. I wanted to create a second schema called TEST that we can use for the imported data. Then, in the C# code, I can just switch the name of the data source when switching between our two database schemas. The catch is that the username and password for this connection appear in a multitude of places scattered through the code. To avoid having to change user credentials in multiple places every time we switch between "db environments", I wanted to create a single user with access to both PROD and TEST.
However, "how to grant user privilege on specific schema?" suggests this is not possible. "Correct way to give users access to additional schemas in Oracle" suggests a method for granting access at the object level, but this is insufficient: I basically want one single user to have access to two identical schemas (PROD and TEST). Once I've achieved this, I want to start modifying TEST to begin our data import.
I have also tried creating TEST as a separate Oracle Database installation on a different port, but when trying to create my user on this new instance I still get a conflict that the user already exists (since it was created for PROD in the original database installation).
My user already exists and has access to PROD. How do I give him access to TEST as well? Or how would one solve the more general problem of having a PROD and TEST database defined in an application that uses Oracle?
In MySQL this would be trivial, but I don't have any idea how to do this in Oracle. I am very new to Oracle.
The question of giving permissions has already been answered.
Now to your question, as a whole: Am I reading correctly that you want to update the database schema, but you want to keep it in the same database as another schema and run both in what appears to be a production database? If so, read that again to let it sink in how extremely dangerous that is.
When migrating from one "schema" to another, as a software update, it is safer to create a new database and migrate the data. This gives you plenty of shots, as you can blow away the new database as you tweak scripts.
If you want as little friction as possible in your software, you need to do a couple of things:
Refactor out the code from the moron who decided to hard code connection information in multiple places. You need to get the strings in one place and make sure you extract out the Data access layer (DAL) code into its own class.
Consider creating domain objects that do not rely on the database schema(s). I consider this mandatory, but you could get away without doing this. I would still create domain objects, even if they match the PROD schema tables, as you should not be using data constructs if you are moving from one schema to another.
Create an interface for your data access layer (DAL)
Map the current data schema, through the current DAL, to the domain objects, using the interface.
Map the new schema, through a new DAL, to the domain objects, using the interface.
Create a factory (or use the provider pattern) to determine which DAL object you are going to use (this makes the application configurable to the old or new "schema"); a sketch of this is shown below the list.
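A minimal sketch of that last step, with made-up interface and class names, reading the choice of environment from configuration:

using System.Configuration;

// Domain object, independent of either schema.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// The interface both DALs implement.
public interface ICustomerDal
{
    Customer GetCustomer(int id);
}

public class ProdCustomerDal : ICustomerDal
{
    public Customer GetCustomer(int id) { /* query the PROD schema */ return null; }
}

public class TestCustomerDal : ICustomerDal
{
    public Customer GetCustomer(int id) { /* query the TEST schema */ return null; }
}

// Factory: a single app.config setting decides which schema the whole app talks to.
public static class DalFactory
{
    public static ICustomerDal CreateCustomerDal()
    {
        string environment = ConfigurationManager.AppSettings["DbEnvironment"];
        return environment == "TEST"
            ? (ICustomerDal)new TestCustomerDal()
            : new ProdCustomerDal();
    }
}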
I'm assuming that you have a schema PROD and a DBUSER which has some privileges on objects in this schema.
DBUSER's name and password are hardcoded all over the application.
You've created a new schema TEST which looks the same as PROD (including grants to DBUSER).
You want that wherever the application does something like:
UPDATE some_table set ...
It will update some_table table in TEST and not in PROD.
My suggestion is to use and change SYNONYMS, i.e.:
When you want to update some_table in PROD, do:
CREATE OR REPLACE PUBLIC SYNONYM some_table for prod.some_table;
and when you want to update some_table in TEST, do:
CREATE OR REPLACE PUBLIC SYNONYM some_table for test.some_table;
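If you ever need to flip the synonym from the C# side rather than from a SQL tool, a minimal sketch using the ODP.NET managed driver (the driver choice, connection details and credentials are assumptions; the unmanaged driver uses the Oracle.DataAccess.Client namespace instead):

using Oracle.ManagedDataAccess.Client;

// Switching a PUBLIC synonym affects every session, so this is a deliberate,
// DBA-level action rather than something the application should do casually.
const string connString = "Data Source=MyOracleDb;User Id=system;Password=changeme;";

using (var conn = new OracleConnection(connString))
using (var cmd = new OracleCommand(
    "CREATE OR REPLACE PUBLIC SYNONYM some_table FOR test.some_table", conn))
{
    conn.Open();
    cmd.ExecuteNonQuery();
}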
The connection to Oracle is not handled correctly in the C# code and this is what is causing difficulty.
If the Data Access Layer were defined separately as Gregory suggests or if a more generic naming convention were used in SQL statements as A.B's answer points to, then it would be much simpler to switch between two databases.
Since our mandate currently doesn't involve making any changes/refactoring of code, I am using a backup and recovery approach:
I create a backup of the working database. Then I do the necessary tests and changes on the database. If I need to revert back to the working database, I create a backup of the "testing" database again and restore the original working database, using the appropriate flag to replace existing tables in the case of a restore. This enables me to switch back and forth between the "working" database and the "test" database.
This is not ideal as it does take some time to execute the backups and restores, but works without affecting the C# code and gives the ability to do work on a "testing" database without affecting the working one. Since this is a temporary scenario until the "testing" database becomes working, this is the approach I'll be following.
As the other answers point out - there is a more generic need to fix/refactor/generalize the connection code - I believe that is the best approach and the only reason I'm not doing that immediately is because we are not yet mandated to change the code.

How to change database design in a deployed application?

Situation
I'm creating a C#/WPF 4 application using a SQL Compact Edition database as a backend with the Entity Framework and deploying with ClickOnce.
I'm fairly new to applications using databases, though I don't suspect I'll have much problem designing and building the original database. However, I'm worried that in the future I'll need to add or change some functionality which will require me to change the database design after the database is already deployed and the user has data in the database.
Questions
Is it even possible to push an updated database design out to users via a ClickOnce update in the same way it is for code changes?
If I did, how would the user's data be affected?
How is this sort of thing done in real situations? What are some best-practices?
I figure that in the worst case, I'd need to build some kind of "version" number into the database or program settings and create some routine to migrate the user's current version of the database to the new one.
I appreciate any insight into my problem. Thanks a lot.
There are some 'tricks' that are employed when designing databases to allow for design changes.
Firstly, many database designers create views to code against, rather than coding directly to the tables. This allows tables to be altered (split or merged, etc) while only requiring that the views are updated. You may want to investigate database refactoring techniques for this.
Secondly, you can indeed add versioning information to the database (commonly done as a 'version' table with a single field). Updating the database can be done through code or through scripts. One system I worked on would automatically check the database version and then progressively update the schema through versions in code until it matched the required version for the runtime. This was quite an undertaking.
I think your "worst" case is actually a pretty good route to go in this situation. Maintain a database version in the DB and have your application check and update the DB as necessary. If you build your updater correctly, it should be able to maintain the user's data. Depending on the update this might involve creating temporary tables to hold the existing data and repopulating new versions of the tables from them. You might be able to include a new SDF file with the new schema in place in the update process and simply transfer the data. It might be slightly easier that way -- you could use file naming to differentiate versions and trigger the update code that way.
Unfortunately, version control and change management for databases are desperately, desperately far from what you can do with the rest of your code.
If you have an internal-only environment there are a number of tools which will help you (DBGhost, Red Gate has a newish app, some deployment management apps) but all of them are less than full solutions imho, but they are mostly good enough.
For client-shipped solutions you really don't have anything better than your worst case I'm afraid. Just try and design with flexibility in mind - see Dr.Herbie's answer.
This is not a solved problem basically.
"Smart Client Deployment with ClickOnce" by Brian Noyes has an excellent chapter on this issue. (Chapter 5)
ISBN 978-0-32-119769-6
He suggests something like this:
using System.Deployment.Application;
using System.IO;

if (ApplicationDeployment.CurrentDeployment.IsFirstRun) {
    MigrateData();
}

private void MigrateData() {
    // ClickOnce copies the previous version's data files into a ".pre" subfolder.
    string previousDb = Path.Combine(
        ApplicationDeployment.CurrentDeployment.DataDirectory, @".pre\mydb.sdf");
    if (!File.Exists(previousDb))
        return;

    string oldConnString = @"Data Source=|DataDirectory|\.pre\mydb.sdf";
    string newConnString = @"Data Source=|DataDirectory|\mydb.sdf";

    // If you are using datasets, perform any migration here with the old and new table adapters.
    // Otherwise use a .sql data migration script.
    // Store the version of the database in the database, check it at the beginning of your
    // update script, and GOTO the correct line in the SQL script.
}
A common solution is to include a version number somewhere in the database. If you have a table with miscellaneous system data, throw it in there, or create a table with one record just to hold the DB version number. Then whenever the program starts up, check if the database version is less than the expected version. If so, execute the required SQL CREATE, ALTER, etc, commands to bring it up to speed. Have a script or function for each version change. So if you see the database is currently at version 6 and the code expects version 8, execute the 6 to 7 update and the 7 to 8 update.
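A minimal sketch of that pattern against the SQL Compact database from the question; the SchemaVersion table and the upgrade statements are made up for the example (System.Data.SqlServerCe ships with SQL CE):

using System.Collections.Generic;
using System.Data.SqlServerCe;

const int ExpectedVersion = 8;

// Made-up upgrade steps, keyed by the version they upgrade FROM.
var upgrades = new Dictionary<int, string>
{
    { 6, "ALTER TABLE Foo ADD Bar int" },
    { 7, "CREATE TABLE Baz (Id int PRIMARY KEY)" }
};

using (var conn = new SqlCeConnection(@"Data Source=|DataDirectory|\mydb.sdf"))
{
    conn.Open();

    // SchemaVersion is assumed to be a one-row table holding the current version number.
    int currentVersion;
    using (var cmd = new SqlCeCommand("SELECT Version FROM SchemaVersion", conn))
    {
        currentVersion = (int)cmd.ExecuteScalar();
    }

    // Apply each step in order: 6 -> 7, then 7 -> 8, and so on.
    while (currentVersion < ExpectedVersion)
    {
        using (var cmd = new SqlCeCommand(upgrades[currentVersion], conn))
        {
            cmd.ExecuteNonQuery();
        }
        currentVersion++;
        using (var cmd = new SqlCeCommand(
            "UPDATE SchemaVersion SET Version = " + currentVersion, conn))
        {
            cmd.ExecuteNonQuery();
        }
    }
}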
Another method we used on one project I worked on was to ship a schema-only, no-data database with the code. Every time you installed a new version, the installer would also install the latest copy of this blank database. Then when the program started up it would compare the user's current database schema with the new database schema and determine what database changes were needed on the fly. For example, if in the "reference schema" table Foo had a column named Bar, and there was no column Bar in the user's current database, we would generate an "alter table Foo add Bar ..." and execute it. While writing the first draft of the program to do this was a fair amount of work, once we'd done it there was pretty much zero maintenance to keep the DB schema up to date. The conversion was just done on the fly.
Note that this scheme doesn't handle DB changes that require changing data values, like if you add a new column that must be initially populated by doing some computation on data from other tables or some such. But if you can generate new data from old data, that must mean that the new data is redundant and your database is not normalized. I don't think the situation ever came up for us.
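A heavily simplified sketch of the column-comparison part of that idea, using the INFORMATION_SCHEMA views that SQL CE also exposes. It only handles added columns, ignores lengths, precision and constraints (a real version would also read CHARACTER_MAXIMUM_LENGTH and friends), and all names are illustrative:

using System.Collections.Generic;
using System.Data.SqlServerCe;

static class SchemaUpdater
{
    // Returns "column name -> data type" for one table in the given database.
    static Dictionary<string, string> GetColumns(string connString, string table)
    {
        var columns = new Dictionary<string, string>();
        using (var conn = new SqlCeConnection(connString))
        using (var cmd = new SqlCeCommand(
            "SELECT COLUMN_NAME, DATA_TYPE FROM INFORMATION_SCHEMA.COLUMNS " +
            "WHERE TABLE_NAME = @table", conn))
        {
            cmd.Parameters.AddWithValue("@table", table);
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    columns[reader.GetString(0)] = reader.GetString(1);
            }
        }
        return columns;
    }

    // For every column the reference schema has and the user's database lacks,
    // generate and run an ALTER TABLE ... ADD statement on the fly.
    public static void AddMissingColumns(string referenceConn, string userConn, string table)
    {
        Dictionary<string, string> reference = GetColumns(referenceConn, table);
        Dictionary<string, string> current = GetColumns(userConn, table);

        using (var conn = new SqlCeConnection(userConn))
        {
            conn.Open();
            foreach (var column in reference)
            {
                if (current.ContainsKey(column.Key)) continue;
                string sql = "ALTER TABLE [" + table + "] ADD [" + column.Key + "] " + column.Value;
                using (var cmd = new SqlCeCommand(sql, conn))
                    cmd.ExecuteNonQuery();
            }
        }
    }
}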
I had the same issue with an Android app using an SQLite database when I added a table. I changed the name of the database to include a version extension, like theDataBaseV1, deleted the previous one, and the app works fine.
I just changed the name of the database and the name in this line of code
private static final String DATABASE_NAME = "busesBogotaV2.db";
in the DBManager when it's going to open.
Does anybody know if this trivial solution has any unintended consequences?
