We're developing a reporting application that uses ASP.NET MVC (.NET 4). We connect through DDTEK.Sybase middleware to a Sybase ASE 12.5 database.
We're having a problem pulling data into a datareader (from a stored procedure). The stored procedure computes values (approximately 50 columns) by doing sums, counts, and calling other stored procedures.
The problem we're experiencing is that certain columns (maybe 5% of them) come back with NULL or 0. If we debug and copy the SQL statement being used for the datareader and run it inside another SQL tool, we get valid values for all columns.
conn = new SybaseConnection
{
    ConnectionString = ConfigurationManager.ConnectionStrings[ConnectStringName].ToString()
};
conn.Open();
cmd = new SybaseCommand
{
    CommandTimeout = cmdTimeout,
    Connection = conn,
    CommandText = mainSql
};
reader = cmd.ExecuteReader();
// AT THIS POINT IMMEDIATELY AFTER THE EXECUTEREADER COMMAND
// THE READER CONTAINS THE BAD (NULL OR 0) DATA FOR THESE COLUMNS.
DataTable schemaTable = reader.GetSchemaTable();
// AT THIS POINT WE CAN VIEW THE DATATABLE FOR THE SCHEMA AND IT APPEARS CORRECT
// THE COLUMNS THAT DON'T WORK HAVE SPECIFICATIONS IDENTICAL TO THE COLUMNS THAT DO WORK
Has anyone had problems like this using Sybase and ADO?
Thanks,
John K.
Problem solved! The problem turned out to be a difference in the way nulls were handled in the SQL. We had several instances in the stored procedure that used non-ANSI null tests (x = null rather than x is null). The SQL tools I had used to test this problem were defaulting SET ANSINULL to OFF, while our ADO code was not, so the SET ANSINULL value was ON. Because of this setting, SQL code that tested for null with = would never test TRUE, allowing the null value to be returned.
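For anyone hitting the same symptom: the durable fix is to change the non-ANSI tests (x = null) to x IS NULL in the procedure, but as a stopgap you can make the ADO session match the SQL tools. A minimal sketch, assuming the DDTEK provider passes a plain SET statement straight through to ASE (it's ordinary T-SQL, so it should):

// Make the session behave like the SQL tools (ANSINULL defaulting to OFF)
// before executing the stored procedure.
using (SybaseCommand setCmd = new SybaseCommand("SET ANSINULL OFF", conn))
{
    setCmd.ExecuteNonQuery();
}

cmd = new SybaseCommand
{
    CommandTimeout = cmdTimeout,
    Connection = conn,
    CommandText = mainSql
};
reader = cmd.ExecuteReader(); // the proc's "x = null" tests now evaluate as they did in the tools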
Related
I have two databases, one is Oracle 11g and the other is in Sql Server 2012. There is a table in the Oracle database that contains a column called BASE_ID. This is comprised of work order numbers.
In the Sql Server database, there is also a table that contains a column called WorkOrders. This also contains work order numbers. What I need to do is compare what is in the Oracle column to what is in the SQL Server column, and display what is in Oracle but not in Sql Server. I am a little stumped here, even for a starting point. I am kind of just taking pot shots. This mess is what I have so far:
private void compare()
{
    try
    {
        if (con.State != ConnectionState.Open)
            con.Open();

        DataTable t = new DataTable();
        OracleCommand oraCmd = new OracleCommand(
            "SELECT BASE_ID FROM WORK_ORDER x WHERE NOT EXISTS(SELECT NULL FROM [SQLServer].[BACRTest].[WorkOrders] y WHERE y.WorkOrder = x.BASE_ID)",
            oraconn);
        OracleDataAdapter dt = new OracleDataAdapter(oraCmd);
        dt.Fill(t);
        dataGridView3.DataSource = t;
    }
    catch (Exception e)
    {
        MessageBox.Show(e.Message);
    }
}
This is just throwing an ORA-00903: invalid table name error, which is fair enough, because I wasn't really expecting it to work.
Any suggestions would be very much appreciated.
As noted above, if you have the correct products installed (e.g. Oracle's Database Gateway for SQL Server, or Database Gateway for ODBC) you can make a direct connection between Oracle and SQL Server. If you don't have such products you can try exporting the SQL Server table to a comma-separated values (.csv) file, then set up an external table in Oracle to read the .csv. Not perhaps the most seamless method in the world, but it does represent one way to handle this.
Best of luck.
Because the entries are in different DBMSs, you cannot make a single query for this directly, so most likely you will need to store the entries in memory and compare them one by one. Maybe you could partition the tables into small sets and compare those in chunks.
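A minimal sketch of the in-memory approach, assuming the table and column names from the question, string-typed work order numbers, and placeholder connection strings (ODP.NET's Oracle.DataAccess.Client on the Oracle side, System.Data.SqlClient on the SQL Server side):

// using Oracle.DataAccess.Client; using System.Data.SqlClient; using System.Collections.Generic;

// Pull both ID lists into memory and diff them with HashSets.
var oracleIds = new HashSet<string>();
using (var oraConn = new OracleConnection(oracleConnectionString))
using (var oraCmd = new OracleCommand("SELECT BASE_ID FROM WORK_ORDER", oraConn))
{
    oraConn.Open();
    using (var reader = oraCmd.ExecuteReader())
        while (reader.Read())
            oracleIds.Add(reader.GetString(0)); // assumes BASE_ID is a string column

    var sqlIds = new HashSet<string>();
    using (var sqlConn = new SqlConnection(sqlServerConnectionString))
    using (var sqlCmd = new SqlCommand("SELECT WorkOrder FROM BACRTest.dbo.WorkOrders", sqlConn))
    {
        sqlConn.Open();
        using (var sqlReader = sqlCmd.ExecuteReader())
            while (sqlReader.Read())
                sqlIds.Add(sqlReader.GetString(0));
    }

    // Everything in Oracle but not in SQL Server:
    oracleIds.ExceptWith(sqlIds);
    // oracleIds now holds the missing work order numbers; bind them to the grid as needed.
}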
I have a file (which has 10 million records) like below:
line1
line2
line3
line4
.......
......
10 million lines
So basically I want to insert 10 million records into the database.
So I read the file and upload it to SQL Server.
C# code
string line;
System.IO.StreamReader file = new System.IO.StreamReader(@"c:\test.txt");
while ((line = file.ReadLine()) != null)
{
    // insertion code goes here
    //DAL.ExecuteSql("insert into table1 values(" + line + ")");
}
file.Close();
But insertion will take a long time.
How can I insert 10 million records in the shortest time possible using C#?
Update 1:
BULK INSERT:
BULK INSERT DBNAME.dbo.DATAs
FROM 'F:\dt10000000\dt10000000.txt'
WITH
(
    ROWTERMINATOR = ' \n'
);
My Table is like below:
DATAs
(
DatasField VARCHAR(MAX)
)
but I am getting the following error:
Msg 4866, Level 16, State 1, Line 1
The bulk load failed. The column is too long in the data file for row 1, column 1. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
The code below worked:
BULK INSERT DBNAME.dbo.DATAs
FROM 'F:\dt10000000\dt10000000.txt'
WITH
(
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR = '\n'
);
Please do not create a DataTable to load via BulkCopy. That is an ok solution for smaller sets of data, but there is absolutely no reason to load all 10 million rows into memory before calling the database.
Your best bet (outside of BCP / BULK INSERT / OPENROWSET(BULK...)) is to stream the contents from the file into the database via a Table-Valued Parameter (TVP). By using a TVP you can open the file, read a row & send a row until done, and then close the file. This method has a memory footprint of just a single row. I wrote an article, Streaming Data Into SQL Server 2008 From an Application, which has an example of this very scenario.
A simplistic overview of the structure is as follows. I am assuming the same import table and field name as shown in the question above.
Required database objects:
-- First: You need a User-Defined Table Type
CREATE TYPE ImportStructure AS TABLE (Field VARCHAR(MAX));
GO

-- Second: Use the UDTT as an input param to an import proc.
--         Hence "Table-Valued Parameter" (TVP)
CREATE PROCEDURE dbo.ImportData (
    @ImportTable dbo.ImportStructure READONLY
)
AS
SET NOCOUNT ON;

-- maybe clear out the table first?
TRUNCATE TABLE dbo.DATAs;

INSERT INTO dbo.DATAs (DatasField)
    SELECT Field
    FROM @ImportTable;
GO
C# app code to make use of the above SQL objects is below. Notice how, rather than filling up an object (e.g. a DataTable) and then executing the Stored Procedure, in this method it is the executing of the Stored Procedure that initiates the reading of the file contents. The input parameter of the Stored Proc isn't a variable; it is the return value of a method, GetFileContents. That method is called when the SqlCommand calls ExecuteNonQuery, which opens the file, reads a row and sends the row to SQL Server via the IEnumerable<SqlDataRecord> and yield return constructs, and then closes the file. The Stored Procedure just sees a Table Variable, @ImportTable, that can be accessed as soon as the data starts coming over (note: the data does persist for a short time, even if not the full contents, in tempdb).
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.IO;
using Microsoft.SqlServer.Server;

private static IEnumerable<SqlDataRecord> GetFileContents()
{
    SqlMetaData[] _TvpSchema = new SqlMetaData[] {
        new SqlMetaData("Field", SqlDbType.VarChar, SqlMetaData.Max)
    };
    SqlDataRecord _DataRecord = new SqlDataRecord(_TvpSchema);
    StreamReader _FileReader = null;

    try
    {
        _FileReader = new StreamReader("{filePath}");

        // read a row, send a row
        while (!_FileReader.EndOfStream)
        {
            // You shouldn't need to call "_DataRecord = new SqlDataRecord" as
            // SQL Server already received the row when "yield return" was called.
            // Unlike BCP and BULK INSERT, you have the option here to create a string,
            // call ReadLine() into the string, do manipulation(s) / validation(s) on
            // the string, then pass that string into SetString() -- or discard it if invalid.
            _DataRecord.SetString(0, _FileReader.ReadLine());
            yield return _DataRecord;
        }
    }
    finally
    {
        _FileReader.Close();
    }
}
The GetFileContents method above is used as the input parameter value for the Stored Procedure as shown below:
public static void test()
{
    SqlConnection _Connection = new SqlConnection("{connection string}");
    SqlCommand _Command = new SqlCommand("ImportData", _Connection);
    _Command.CommandType = CommandType.StoredProcedure;

    SqlParameter _TVParam = new SqlParameter();
    _TVParam.ParameterName = "@ImportTable";
    _TVParam.TypeName = "dbo.ImportStructure";
    _TVParam.SqlDbType = SqlDbType.Structured;
    _TVParam.Value = GetFileContents(); // return value of the method is streamed data
    _Command.Parameters.Add(_TVParam);

    try
    {
        _Connection.Open();
        _Command.ExecuteNonQuery();
    }
    finally
    {
        _Connection.Close();
    }

    return;
}
Additional notes:
With some modification, the above C# code can be adapted to batch the data in (a rough sketch follows this list).
With minor modification, the above C# code can be adapted to send in multiple fields (the example shown in the "Streaming Data..." article linked above passes in 2 fields).
You can also manipulate the value of each record in the SELECT statement in the proc.
You can also filter out rows by using a WHERE condition in the proc.
You can access the TVP Table Variable multiple times; it is READONLY but not "forward only".
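For example, here is a rough sketch of the batching idea from the first note, using a hypothetical GetFileContentsBatch helper. Note the TRUNCATE would have to move out of dbo.ImportData for this, or each chunk would wipe the previous one:

// Streams at most batchSize rows per call; the caller re-executes the proc
// until the file is exhausted.
private static IEnumerable<SqlDataRecord> GetFileContentsBatch(
    StreamReader fileReader, int batchSize)
{
    SqlMetaData[] tvpSchema = new SqlMetaData[] {
        new SqlMetaData("Field", SqlDbType.VarChar, SqlMetaData.Max)
    };
    SqlDataRecord dataRecord = new SqlDataRecord(tvpSchema);

    int rowsSent = 0;
    while (rowsSent < batchSize && !fileReader.EndOfStream)
    {
        dataRecord.SetString(0, fileReader.ReadLine());
        rowsSent++;
        yield return dataRecord;
    }
}

// Usage: one ExecuteNonQuery (i.e. one proc call) per 10,000-row chunk.
// while (!fileReader.EndOfStream)
// {
//     _TVParam.Value = GetFileContentsBatch(fileReader, 10000);
//     _Command.ExecuteNonQuery();
// }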
Advantages over SqlBulkCopy:
SqlBulkCopy is INSERT-only whereas using a TVP allows the data to be used in any fashion: you can call MERGE; you can DELETE based on some condition; you can split the data into multiple tables; and so on.
Due to a TVP not being INSERT-only, you don't need a separate staging table to dump the data into.
You can get data back from the database by calling ExecuteReader instead of ExecuteNonQuery. For example, if there was an IDENTITY field on the DATAs import table, you could add an OUTPUT clause to the INSERT to pass back INSERTED.[ID] (assuming ID is the name of the IDENTITY field). Or you can pass back the results of a completely different query, or both since multiple results sets can be sent and accessed via Reader.NextResult(). Getting info back from the database is not possible when using SqlBulkCopy yet there are several questions here on S.O. of people wanting to do exactly that (at least with regards to the newly created IDENTITY values).
For more info on why it is sometimes faster for the overall process, even if slightly slower on getting the data from disk into SQL Server, please see this whitepaper from the SQL Server Customer Advisory Team: Maximizing Throughput with TVP
In C#, the best solution is to let SqlBulkCopy read the file. To do this you need to pass an IDataReader directly to the SqlBulkCopy.WriteToServer method. Here is an example: http://www.codeproject.com/Articles/228332/IDataReader-implementation-plus-SqlBulkCopy
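A short usage sketch of that method; FileDataReader is a hypothetical name standing in for an IDataReader implementation over the text file, along the lines of the one in the linked article:

// Stream the file into the table without materializing all rows in memory.
using (IDataReader fileReader = new FileDataReader(@"c:\test.txt")) // hypothetical reader class
using (SqlConnection conn = new SqlConnection("{connection string}"))
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(conn))
{
    conn.Open();
    bulkCopy.DestinationTableName = "dbo.DATAs";
    bulkCopy.BatchSize = 10000;
    bulkCopy.WriteToServer(fileReader);
}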
The best way is a mix between your 1st solution and your 2nd: create a DataTable, add rows to it in the loop, and then use BulkCopy to upload it to the DB in one connection; use this for help with bulk copy.
One other thing to pay attention to: bulk copy is a very sensitive operation, and almost any mistake will void the copy. For example, if you declare the column name in the DataTable as "text" and in the DB it's "Text", it will throw an exception. Good luck.
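A minimal sketch of that approach, assuming the single-column dbo.DATAs table from the question; flushing every 10,000 rows keeps the memory footprint bounded:

DataTable table = new DataTable();
table.Columns.Add("DatasField", typeof(string)); // name must match the DB column exactly

using (StreamReader file = new StreamReader(@"c:\test.txt"))
using (SqlConnection conn = new SqlConnection("{connection string}"))
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(conn))
{
    conn.Open();
    bulkCopy.DestinationTableName = "dbo.DATAs";

    string line;
    while ((line = file.ReadLine()) != null)
    {
        table.Rows.Add(line);
        if (table.Rows.Count == 10000)
        {
            bulkCopy.WriteToServer(table); // flush a 10,000-row chunk
            table.Clear();
        }
    }
    if (table.Rows.Count > 0)
        bulkCopy.WriteToServer(table); // flush the remainder
}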
If you want to insert 10 million records in the shortest time using a direct SQL query for testing purposes, you should use this query (note that GO <count>, which repeats the preceding batch, is a feature of SSMS and sqlcmd rather than of T-SQL itself):
CREATE TABLE TestData(ID INT IDENTITY (1,1), CreatedDate DATETIME)
GO
INSERT INTO TestData(CreatedDate) SELECT GetDate()
GO 10000000
In C# I want to execute a query that uses 2 different databases (one is a local Access database, the other is remote and is MySQL).
I'm able to do it in VBA Access, but how can I do the same thing in C#?
This is how I made it in Access:
Link my 2 different tables/databases as linked tables
In VBA:
sSQL = "INSERT INTO DB1tblClient SELECT * FROM DB2tblClient"
CurrentDb.Execute sSQL
How can I execute this SQL in C#? (What objects should I use, etc.? Example code if you can.)
Thanks!
There are two ways to do this. One is to set up linked tables in Access and run a single query. The other is to run both queries from C# and join them with LINQ.
The first way is better. If you really have to do it with LINQ, here is some sample code:
dWConnection.Open();

dWDataAdaptor.SelectCommand = dWCommand1;
dWDataAdaptor.Fill(queryResults1);
dWDataAdaptor.SelectCommand = dWCommand2;
dWDataAdaptor.Fill(queryResults2);

dWConnection.Close();

IEnumerable<DataRow> results1 = (from events in queryResults1.AsEnumerable()
                                 where events.Field<string>("event_code").ToString() == "A01"
                                    || events.Field<string>("event_code").ToString() == "ST"
                                 select events) as IEnumerable<DataRow>;

var results2 = from events1 in queryResults1.AsEnumerable()
               join events2 in queryResults2.AsEnumerable()
                   on (string)events1["event_code"] equals (string)events2["event_code"]
               select new
               {
                   f1 = (string)events1["event_code"],
                   f2 = (string)events2["event_name"]
               };

DataTable newDataTable = results1.CopyToDataTable<DataRow>();
See why I said linked tables is better?
You should be able to run the same SQL command from any app, really. This is assuming:
You're connecting to Access from your C# app
DB1tblClient is a local Access table
DB2tblClient is a link table in Access
Given these, you might try the following:
using (OleDbConnection conn = new OleDbConnection(
    @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Stuff\MyAccessdb.mdb"))
{
    conn.Open();
    using (OleDbCommand cmd = conn.CreateCommand())
    {
        cmd.CommandText = "INSERT INTO DB1tblClient SELECT * FROM DB2tblClient";
        cmd.ExecuteNonQuery();
    }
}
You might want to check connectionstrings.com if you can't get the connection string right, and you may need to install some components (MDAC or ACE) for connections that use those providers.
Well, it is not possible to run such a complex query with a single statement.
Basically, each query-execution object is initialized with a particular database's information, so you need two different objects, one for each database.
Each object is initialized with its own connection object: fetch the data with the first object, then insert it into the other database using the second connection object.
You need to keep the following points in mind before trying this type of query:
Both the databases are accessible from your code.
There is inter-connectivity between both the database.
Both the databases are available for the user that you are using to execute this query.
You need to specify the tables in the following format:
DATABASE_NAME.SCHEMA_NAME.TABLE_NAME instead of just TABLE_NAME
EDIT
If you don't have inter-connectivity between databases you can follow following steps
Connect to Source database using one connection.
Read the data from source database into a dataset or datatable using SELECT query.
Connect to target database using a second connection.
Insert all the records one by one, in a loop, into the TARGET database using a standard INSERT query (a sketch follows this list).
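A hedged sketch of those steps for the Access-to-MySQL case in the question. The table and column names here are made up for illustration, and the MySQL side assumes the MySql.Data connector:

// Steps 1-2: read the source rows from Access.
DataTable clients = new DataTable();
using (OleDbConnection accessConn = new OleDbConnection(
    @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Stuff\MyAccessdb.mdb"))
using (OleDbDataAdapter adapter = new OleDbDataAdapter(
    "SELECT ClientId, ClientName FROM tblClient", accessConn)) // hypothetical columns
{
    adapter.Fill(clients);
}

// Steps 3-4: insert them into MySQL one by one.
using (MySqlConnection mysqlConn = new MySqlConnection("{mysql connection string}"))
using (MySqlCommand insert = new MySqlCommand(
    "INSERT INTO tblClient (ClientId, ClientName) VALUES (@id, @name)", mysqlConn))
{
    mysqlConn.Open();
    insert.Parameters.Add("@id", MySqlDbType.Int32);
    insert.Parameters.Add("@name", MySqlDbType.VarChar);

    foreach (DataRow row in clients.Rows)
    {
        insert.Parameters["@id"].Value = row["ClientId"];
        insert.Parameters["@name"].Value = row["ClientName"];
        insert.ExecuteNonQuery();
    }
}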
I'm trying to send a DataTable to a stored procedure using c#, .net 2.0 and SQLServer 2012 Express.
This is roughly what I'm doing:
//define the DataTable
var accountIdTable = new DataTable("[dbo].[TypeAccountIdTable]");

//define the column
var dataColumn = new DataColumn { ColumnName = "[ID]", DataType = typeof(Guid) };

//add column to dataTable
accountIdTable.Columns.Add(dataColumn);

//feed it with the unique contact ids
foreach (var uniqueId in uniqueIds)
{
    accountIdTable.Rows.Add(uniqueId);
}

using (var sqlCmd = new SqlCommand())
{
    //define command details
    sqlCmd.CommandType = CommandType.StoredProcedure;
    sqlCmd.CommandText = "[dbo].[msp_Get_Many_Profiles]";
    sqlCmd.Connection = dbConn; //an open database connection

    //define parameter
    var sqlParam = new SqlParameter();
    sqlParam.ParameterName = "@tvp_account_id_list";
    sqlParam.SqlDbType = SqlDbType.Structured;
    sqlParam.Value = accountIdTable;

    //add parameter to command
    sqlCmd.Parameters.Add(sqlParam);

    //execute procedure
    rResult = sqlCmd.ExecuteReader();

    //print results
    while (rResult.Read())
    {
        PrintRowData(rResult);
    }
}
But then I get the following error:
ArgumentOutOfRangeException: No mapping exists from SqlDbType Structured to a known DbType.
Parameter name: SqlDbType
Upon investigating further (on MSDN, SO, and other places) it appears as if .NET 2.0 does not support sending a DataTable to the database (it is missing things such as SqlParameter.TypeName). But I'm still not sure, since I haven't seen anyone explicitly claim that this feature is not available in .NET 2.0.
Is this true?
If so, is there another way to send a collection of data to the database?
Thanks in advance!
Out of the box, ADO.NET does not support this, with good reason. A DataTable could take just about any number of columns, which may or may not map up to a real table in your database.
If I'm understanding what you want to do - upload the contents of a DataTable quickly to a pre-defined, real table with the same structure - I'd suggest you investigate SqlBulkCopy.
From the documentation:
Microsoft SQL Server includes a popular command-prompt utility named
bcp for moving data from one table to another, whether on a single
server or between servers. The SqlBulkCopy class lets you write
managed code solutions that provide similar functionality. There are
other ways to load data into a SQL Server table (INSERT statements,
for example), but SqlBulkCopy offers a significant performance
advantage over them.
The SqlBulkCopy class can be used to write data only to SQL Server
tables. However, the data source is not limited to SQL Server; any
data source can be used, as long as the data can be loaded to a
DataTable instance or read with a IDataReader instance.
SqlBulkCopy will fail when bulk loading a DataTable column of type
SqlDateTime into a SQL Server column whose type is one of the
date/time types added in SQL Server 2008.
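For the scenario in the question, a minimal sketch of SqlBulkCopy - assuming a real destination table, say dbo.AccountIds, with a uniqueidentifier ID column matching the DataTable (the table name here is a placeholder):

// The column name in the DataTable should match the destination column.
var accountIdTable = new DataTable();
accountIdTable.Columns.Add("ID", typeof(Guid));
foreach (var uniqueId in uniqueIds)
{
    accountIdTable.Rows.Add(uniqueId);
}

using (var bulkCopy = new SqlBulkCopy(dbConn)) // dbConn: the open connection from the question
{
    bulkCopy.DestinationTableName = "dbo.AccountIds"; // hypothetical destination table
    bulkCopy.WriteToServer(accountIdTable);
}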
However, in later versions you can define Table-Valued Parameters in SQL Server and use them to send a table (DataTable) to the database the way you're asking. There's an example at http://sqlwithmanoj.wordpress.com/2012/09/10/passing-multipledynamic-values-to-stored-procedures-functions-part4-by-using-tvp/
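On .NET 3.5 or later (where SqlParameter.TypeName exists), the question's parameter setup only needs the type name added, roughly as follows (assuming the user-defined table type from the question exists on the server):

var sqlParam = new SqlParameter();
sqlParam.ParameterName = "@tvp_account_id_list";
sqlParam.SqlDbType = SqlDbType.Structured;
sqlParam.TypeName = "[dbo].[TypeAccountIdTable]"; // the UDTT name from the question
sqlParam.Value = accountIdTable;
sqlCmd.Parameters.Add(sqlParam);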
Per my experience, if you're able to compile the code in C#, that means ADO.NET supports that type; but if it fails when you execute the code, then the target database might not support it. In your case you mention SQL Server 2012 Express, so it might not support it. Table types have been supported since SQL Server 2005 per my understanding, but you had to keep the database compatibility level above 99 or so. I am 100% positive it works in 2008, because I have used it, and I am using it extensively to do bulk updates through stored procedures using User-Defined Table Types (a.k.a. UDTTs) as the in-parameter for the stored procedure. Again, you must keep the database compatibility level above 99 to use the MERGE command for bulk updates.
And of course you can use SqlBulkCopy, but I'm not sure how reliable it is; that depends on your scenario.
I am working on moving a database from MS Access to SQL Server. To move the data into the new tables, I have decided to write a sync routine, since the schema has changed quite significantly; it lets me run testing on programs that run off it and resync whenever I need new test data. Then eventually I will do one last sync and go live on the new SQL Server version.
Unfortunately I have hit a snag. My method for copying from Access to SQL Server is below:
public static void BulkCopyAccessToSQLServer(
    string sql, CommandType commandType, DBConnection sqlServerConnection,
    string destinationTable, DBConnection accessConnection, int timeout)
{
    using (DataTable dt = new DataTable())
    using (OleDbConnection conn = new OleDbConnection(GetConnection(accessConnection)))
    using (OleDbCommand cmd = new OleDbCommand(sql, conn))
    using (OleDbDataAdapter adapter = new OleDbDataAdapter(cmd))
    {
        cmd.CommandType = commandType;
        cmd.Connection.Open();
        adapter.SelectCommand.CommandTimeout = timeout;
        adapter.Fill(dt);

        using (SqlConnection conn2 = new SqlConnection(GetConnection(sqlServerConnection)))
        using (SqlBulkCopy copy = new SqlBulkCopy(conn2))
        {
            conn2.Open();
            copy.DestinationTableName = destinationTable;
            copy.BatchSize = 1000;
            copy.BulkCopyTimeout = timeout;
            copy.WriteToServer(dt);
            copy.NotifyAfter = 1000;
        }
    }
}
Basically this queries Access for the data using the input SQL string, which has all the correct field names, so I don't need to set ColumnMappings.
This was working until I reached a table with a calculated field. SqlBulkCopy doesn't seem to know to skip the field and tries to write to the column, which fails with the error "The column 'columnName' cannot be modified because it is either a computed column or is the result of a union operator."
Is there an easy way to make it skip the calculated field?
I am hoping not to have to specify a full column mapping.
There are two ways to dodge this:
use the ColumnMappings to formally define the column relationship (you note you don't want this)
push the data into a staging table - a basic table, not part of your core transactional tables, whose entire purpose is to look exactly like this data import; then use a TSQL command to transfer the data from the staging table to the real table
I always favor the second option (sketched after this list), for various reasons:
I never have to mess with mappings - this is actually important to me ;p
the insert to the real table will be fully logged (SqlBulkCopy is not necessarily logged)
I have the fastest possible insert - no constraint checking, no indexing, etc
I don't tie up a transactional table during the import, and there is no risk of non-repeatable queries running against a partially imported table
I have a safe abort option if the import fails half way through, without having to use transactions (nothing has touched the transactional system at this point)
it allows some level of data-processing when pushing it into the real tables, without the need to either buffer everything in a DataTable at the app tier, or implement a custom IDataReader
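A hedged sketch of the staging-table variant, slotting into your existing method; dbo.Staging_MyTable, dbo.MyTable, and the column list are placeholders, with the staging table mirroring the real table minus the computed column:

// 1) Bulk copy into a staging table that has no computed columns.
using (SqlConnection conn2 = new SqlConnection(GetConnection(sqlServerConnection)))
{
    conn2.Open();

    using (SqlBulkCopy copy = new SqlBulkCopy(conn2))
    {
        copy.DestinationTableName = "dbo.Staging_MyTable"; // placeholder staging table
        copy.WriteToServer(dt);
    }

    // 2) Move the rows into the real table with plain TSQL; the computed
    //    column is simply omitted from the column list.
    using (SqlCommand cmd = new SqlCommand(
        @"INSERT INTO dbo.MyTable (Col1, Col2)          -- placeholder columns
          SELECT Col1, Col2 FROM dbo.Staging_MyTable;
          TRUNCATE TABLE dbo.Staging_MyTable;", conn2))
    {
        cmd.ExecuteNonQuery();
    }
}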