I have to retrieve data from a table on one SQL Server instance based on values in a table on another SQL Server instance:
select * from servernameA.db_name.dbo.table_name
where column in (select column from servernameb.db_name.dbo.table_name)
So what's the problem? You can do it exactly like that, provided the two servers are linked.
One small performance improvement you can try is adding DISTINCT to the subquery:
select * from servernameA.db_name.dbo.table_name
where column in (select DISTINCT column from servernameb.db_name.dbo.table_name)
For your reference:
Query across multiple databases on same server
Selecting data from two different servers in SQL Server
------------Edit----------------
How about this? I haven't tried it myself, but give it a try:
// connectionString is the connection string for ServerA..DB1
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();

    // commandText is the SQL query that uses the full four-part name of ServerB..DB2
    using (var command = new SqlCommand(commandText, conn))
    {
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                // read the result
            }
        }
    }
}
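For example, commandText could simply be the cross-server query from above; since the connection already points at ServerA..DB1, only the second table needs the four-part name (these are the same placeholder names used earlier, not real objects):

// hypothetical commandText using the placeholder names from the query above
string commandText =
    "SELECT * FROM db_name.dbo.table_name " +
    "WHERE column IN (SELECT DISTINCT column FROM servernameb.db_name.dbo.table_name)";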
I would like to use Entity Framework (EF) to query a SQL Server instance and return a list of database names on that instance.
I can do this using the following code, but wondered if there was a way with EF?
public static string[] GetDatabaseNames(SqlConnection masterConn)
{
    List<string> databases = new List<string>();

    // retrieve the names of all the databases from the sysdatabases view
    using (SqlCommand cmd = new SqlCommand("SELECT [name] FROM sysdatabases", masterConn))
    {
        using (SqlDataReader rdr = cmd.ExecuteReader())
        {
            while (rdr.Read())
            {
                databases.Add((string)rdr["name"]);
            }
        }
    }

    return databases.ToArray();
}
I should mention that I am new to EF and its capabilities / limitations.
You could simply send a raw query to your SQL Server through Entity Framework:
using (var context = new MyContext())
{
    var dbNames = context.Database.SqlQuery<string>(
        "SELECT name FROM sys.databases").ToList();
}
Sources: https://msdn.microsoft.com/en-us/library/jj592907(v=vs.113).aspx and https://stackoverflow.com/a/147662/2699126
Alternatively, you can create a view in the SQL database:
CREATE VIEW [dbo].[SysDatabasesView] AS SELECT * FROM sys.databases
then add that view object to your EDMX model.
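Once the view is mapped in the EDMX, querying it is an ordinary LINQ query. A minimal sketch, assuming the designer generated an entity set named SysDatabasesViews on your context (the actual name depends on how the view was imported):

using (var context = new MyContext())
{
    // SysDatabasesViews is the entity set generated from the view above;
    // adjust the name to whatever your EDMX model actually produced
    var dbNames = context.SysDatabasesViews
                         .Select(v => v.name)
                         .ToList();
}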
There is a Windows Forms application. I am using an MS Access database for some data manipulation. I want to copy data from one database to another. The table name, schema, and data types are the same in both tables.
I am using the query below to bulk insert data into the destination database by selecting data from the source database.
INSERT INTO [Table1] IN 'C:\Data\Users.mdf' SELECT * FROM [Table1]
After the data is inserted, I query the target table to fetch the inserted data. I am using OleDbConnection for the database operations.
The issue I am facing is that when I execute the SELECT statement right after the INSERT query, I do not get the data. However, when I step through in debugging mode, the data is there.
I noticed that if I wait for some time after the INSERT statement is executed, the data comes back correctly. So I assume the bulk insert operation needs some time (a delay?) to complete.
I tried adding Task.Delay(20000) after the INSERT query execution, but no luck. Could someone help me resolve this issue? Any help is highly appreciated.
I didn't find a good way to handle this, but I did a workaround. After the data is inserted into the table, I fire another query to check whether there is any data in the inserted table. This happens in a do..while loop, as follows. The table is dropped every time the operation is completed.
var insertQuery = @"INSERT INTO [Table1] IN 'C:\Data\Users.mdf' SELECT * FROM [Table1]";
ExecuteQuery(insertQuery, connProd);

var count = 10;
do
{
    var selectQuery = "SELECT TOP 1 * FROM " + tableProdCopy;
    var dtTopRowData = GetQueryData(selectQuery, connOther);
    if (dtTopRowData != null && dtTopRowData.Rows.Count > 0)
    {
        count = 0;
        break;
    }

    System.Threading.Thread.Sleep(2000);
    count = count - 1;
} while (count > 0);
private DataTable GetQueryData(string query, OleDbConnection conn)
{
    using (OleDbCommand cmdOutput = new OleDbCommand(query, conn))
    {
        using (OleDbDataAdapter adapterOutput = new OleDbDataAdapter(cmdOutput))
        {
            var dtOutput = new DataTable();
            adapterOutput.Fill(dtOutput);
            return dtOutput;
        }
    }
}

private void ExecuteQuery(string query, OleDbConnection conn)
{
    using (OleDbCommand cmdInput = new OleDbCommand(query, conn))
    {
        cmdInput.ExecuteNonQuery();
    }
}
I have a stored procedure that looks like that:
InsertItem(IN itemId INT, IN name TEXT) -- the body is a single INSERT INTO
Is there a way I could execute it in bulk?
That is, instead of executing something like this:
using (MySqlConnection connection = new MySqlConnection(_connectionString))
{
    connection.Open();

    foreach (Item item in GetItems())
    {
        using (MySqlCommand command = new MySqlCommand("InsertItem", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@itemId", item.ItemId);
            command.Parameters.AddWithValue("@name", item.Name);
            command.ExecuteNonQuery();
        }
    }
}
I'm trying to achieve code looking like the following, without success:
using (MySqlConnection connection = new MySqlConnection(_connectionString))
{
    connection.Open();

    // MySqlCommandBulk and MySqlCommandBulkItem are hypothetical types that show
    // the kind of API I am looking for; they do not exist in the connector
    using (MySqlCommandBulk command = new MySqlCommandBulk("InsertItem", connection))
    {
        command.CommandType = CommandType.StoredProcedure;

        foreach (Item item in GetItems())
        {
            MySqlCommandBulkItem bulkItem = new MySqlCommandBulkItem();
            bulkItem["itemId"] = item.ItemId;
            bulkItem["name"] = item.Name;
            command.BulkItems.Add(bulkItem);
        }

        command.Execute();
    }
}
My point is that the command should send all of the data at once instead of sending each query separately.
Any ideas?
The Oracle connector for the .NET Framework allows arrays to be used in place of scalars for parameters, but the MySQL connector doesn't.
There are two ways to accelerate bulk loads in MySQL.
One of them applies to InnoDB tables but doesn't help with MyISAM tables. Start a transaction. Then, after every few hundred rows, COMMIT it and start another one. That will commit your table inserts in bunches, which is faster than autocommitting them individually.
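A minimal sketch of that batching approach in C#, reusing the InsertItem stored procedure from the question (the batch size of 500 is an arbitrary choice):

using (MySqlConnection connection = new MySqlConnection(_connectionString))
{
    connection.Open();

    MySqlTransaction transaction = connection.BeginTransaction();
    int rowsInBatch = 0;

    foreach (Item item in GetItems())
    {
        using (MySqlCommand command = new MySqlCommand("InsertItem", connection, transaction))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@itemId", item.ItemId);
            command.Parameters.AddWithValue("@name", item.Name);
            command.ExecuteNonQuery();
        }

        // commit every few hundred rows and start a new transaction
        if (++rowsInBatch == 500)
        {
            transaction.Commit();
            transaction = connection.BeginTransaction();
            rowsInBatch = 0;
        }
    }

    // commit whatever is left over in the final batch
    transaction.Commit();
}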
The other is to use MySQL's LOAD DATA INFILE command to slurp up a data file and bulk-insert it into the database. This is very fast, but you have to be diligent about formatting your file correctly.
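For the LOAD DATA INFILE route, a rough sketch that dumps the items to a temporary CSV file and loads it in one statement. The table name items, its column list, and the connection string option are assumptions here; LOCAL loading has to be enabled on both the client (AllowLoadLocalInfile=true) and the server (local_infile):

// write the items as a two-column CSV; File.WriteAllLines uses \r\n line endings on Windows
string csvPath = Path.Combine(Path.GetTempPath(), "items.csv");
File.WriteAllLines(csvPath, GetItems().Select(i => $"{i.ItemId},{i.Name}"));

using (MySqlConnection connection = new MySqlConnection(_connectionString))
{
    connection.Open();

    // 'items' and its column names are assumed; forward slashes avoid path-escaping issues
    string loadSql =
        "LOAD DATA LOCAL INFILE '" + csvPath.Replace('\\', '/') + "' " +
        "INTO TABLE items " +
        "FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\r\\n' " +
        "(itemId, name)";

    using (MySqlCommand command = new MySqlCommand(loadSql, connection))
    {
        command.ExecuteNonQuery();
    }
}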
I am writing a simple reporting tool that will need to move data from a table in one Access database to a table in another Access database (the table structure is identical). However, I am new to C# and am finding it hard to come up with a reliable solution.
Any pointers would be greatly appreciated.
Access SQL supports using an IN clause to specify that a table resides in a different database. The following C# code SELECTs rows from a table named [YourTable] in Database1.accdb and INSERTs them into an existing table named [YourTable] (with the identical structure) in Database2.accdb:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Data.OleDb;

namespace oleDbTest
{
    class Program
    {
        static void Main(string[] args)
        {
            string myConnectionString;
            myConnectionString =
                @"Provider=Microsoft.ACE.OLEDB.12.0;" +
                @"Data Source=C:\Users\Public\Database1.accdb;";

            using (var con = new OleDbConnection())
            {
                con.ConnectionString = myConnectionString;
                con.Open();

                using (var cmd = new OleDbCommand())
                {
                    cmd.Connection = con;
                    cmd.CommandType = System.Data.CommandType.Text;
                    cmd.CommandText =
                        @"INSERT INTO YourTable IN 'C:\Users\Public\Database2.accdb' " +
                        @"SELECT * FROM YourTable WHERE ID < 103";
                    cmd.ExecuteNonQuery();
                }

                con.Close();
            }

            Console.WriteLine("Done.");
        }
    }
}
Many ways.
0) If it's only once, copy and paste the table.
1) If you want to do this inside Access, the easiest way is to create a linked table in the new database, and then a make table query in the new database.
2) You can reference the second table directly.
SELECT *
FROM TableInDbX IN 'C:\SomeFolder\DB X';
3) In a macro, you can use the TransferDatabase method of the DoCmd object to link relevant tables and then run suitable append and update queries to synchronize.
4) VBA
http://www.techonthenet.com/access/questions/new_mdb.php
Given column names Col1, Col2, and Col3:
private static void Migrate(string dbConn1, string dbConn2) {
    // DataTable to store your info into
    var table = new DataTable();

    // Modify your SELECT command as needed
    string sqlSelect = "SELECT Col1, Col2, Col3 FROM aTableInOneAccessDatabase ";

    // Notice this uses the connection string to DB1
    using (var cmd = new OleDbCommand(sqlSelect, new OleDbConnection(dbConn1))) {
        cmd.Connection.Open();
        table.Load(cmd.ExecuteReader());
        cmd.Connection.Close();
    }

    // Modify your INSERT command as needed
    string sqlInsert = "INSERT INTO aTableInAnotherAccessDatabase " +
                       "(Col1, Col2, Col3) VALUES (#Col1, #Col2, #Col3) ";

    // Notice this uses the connection string to DB2
    using (var cmd = new OleDbCommand(sqlInsert, new OleDbConnection(dbConn2))) {
        // Modify these parameter types and sizes to match the columns in the new table
        cmd.Parameters.Add("#Col1", OleDbType.Integer);
        cmd.Parameters.Add("#Col2", OleDbType.VarWChar, 50);
        cmd.Parameters.Add("#Col3", OleDbType.Date);

        cmd.Connection.Open();
        foreach (DataRow row in table.Rows) {
            // Fill in each parameter with data from your table's row
            cmd.Parameters["#Col1"].Value = row["Col1"];
            cmd.Parameters["#Col2"].Value = row["Col2"];
            cmd.Parameters["#Col3"].Value = row["Col3"];

            // Insert that data
            cmd.ExecuteNonQuery();
        }
        cmd.Connection.Close();
    }
}
Now, I do not work with Access databases very often, so you may need to tweak something up there.
That should get you well on your way, though.
Worth noting:
If I remember correctly, Access does NOT pay attention to your OleDbParameter names! You could call them whatever you want, and in fact most people just use a question mark ? for the parameter fields.
So, you have to add and update these parameters in the same order that your statement calls them.
So, why did I name the parameters #Col1, #Col2, #Col3? Here, it is just to help you and me understand where each parameter is intended to map to. It is also good practice to get into. If you ever migrate to a better database, hopefully it will pay attention to what the parameters are named.
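For illustration, the same INSERT written with the more common positional ? placeholders looks like this; the names passed to Add are ignored by Access, so only the order matters (the values here are made up):

string sqlInsert = "INSERT INTO aTableInAnotherAccessDatabase " +
                   "(Col1, Col2, Col3) VALUES (?, ?, ?) ";

using (var cmd = new OleDbCommand(sqlInsert, new OleDbConnection(dbConn2))) {
    // parameters must be added in the same order as the ? placeholders appear
    cmd.Parameters.Add("Col1", OleDbType.Integer).Value = 101;
    cmd.Parameters.Add("Col2", OleDbType.VarWChar, 50).Value = "Some text";
    cmd.Parameters.Add("Col3", OleDbType.Date).Value = DateTime.Now;

    cmd.Connection.Open();
    cmd.ExecuteNonQuery();
    cmd.Connection.Close();
}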
I am quite busy turning an old classic ASP website into a .NET site. I am also now using SQL Server.
Now I have some old code:
strsql = "select * FROM tabel WHERE ID = " & strID & " AND userid = " & struserid
rs1.open strsql, strCon, 2, 3
if rs1.eof THEN
rs1.addnew
end if
if straantal <> 0 THEN
rs1("userid") = struserid
rs1("verlangid") = strID
rs1("aantal") = straantal
end if
rs1.update
rs1.close
I want to do the same thing in SQL Server, in particular the update part. How can I do this?
How can I check whether the data reader is at EOF/EOL?
How can I insert a row if it is at EOF/EOL?
How can I update or delete a row with one function?
If you want to use raw SQL commands, you can try something like this:
// list to hold the values read from the first column
var myList = new List<int>();

using (SqlConnection cnn = new SqlConnection(_connectionString))
using (SqlCommand cmd = new SqlCommand())
{
    cnn.Open();
    cmd.Connection = cnn;

    // Example of reading with SqlDataReader
    cmd.CommandText = "select sql query here";
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            myList.Add((int)reader[0]);
        }
    }

    // Example of updating a row
    cmd.CommandText = "update sql query here";
    cmd.ExecuteNonQuery();
}
It depends on the method you use... Are you going to use Entity Framework and LINQ? Are you going to use a straight SQL connection? I would highly recommend going down the EF route, but a simple straight SQL snippet would look something like:
using (var connection = new SqlConnection("Your connection string here"))
{
    connection.Open();

    using (var command = new SqlCommand("SELECT * FROM xyz ETC", connection))
    {
        // Process results
        using (SqlDataReader reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                int userId = (int)reader["UserID"];
                string somethingElse = (string)reader["AnotherField"];
                // Etc, etc...
            }
        }
    }

    // To execute a query (INSERT, UPDATE, DELETE etc)
    using (var commandExec = new SqlCommand("DELETE FROM xyz ETC", connection))
    {
        commandExec.ExecuteNonQuery();
    }
}
You will note that the various elements are wrapped in using blocks; that is because you need to release the connection and related resources when you have finished. This should answer your question quickly, but as others (including me) have suggested, I would investigate Entity Framework, as it is much more powerful but has a learning curve attached to it!
You can use a SQL stored procedure for the update and call that stored procedure from C#.
CREATE PROCEDURE [dbo].[xyz_Update]
(
    -- adjust the parameter names and data types to match your table
    @para1 INT,
    @para2 NVARCHAR(50)
)
AS
BEGIN
    UPDATE tablename
    SET Fieldname1 = @para1,
        Fieldname2 = @para2
END
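Calling it from C# is then a plain SqlCommand with CommandType.StoredProcedure. A minimal sketch, assuming the parameter names and types from the procedure above:

using (var connection = new SqlConnection(_connectionString))
using (var command = new SqlCommand("dbo.xyz_Update", connection))
{
    command.CommandType = CommandType.StoredProcedure;

    // parameter names and types must match the stored procedure definition
    command.Parameters.Add("@para1", SqlDbType.Int).Value = 123;
    command.Parameters.Add("@para2", SqlDbType.NVarChar, 50).Value = "new value";

    connection.Open();
    command.ExecuteNonQuery();
}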