Migrate BLOBs from Oracle to SQL Server database in C#

I am trying to migrate all BLOB records from an Oracle database to a SQL Server database, but the SELECT query is very slow. Even with a batch size of 10 it was still slow.
My code:
using (var orclConn = new OracleConnection(""))
{
    orclConn.Open();
    using (var orclCmd = new OracleCommand("select * from OracleTable", orclConn))
    {
        using (var reader = orclCmd.ExecuteReader())
        {
            using (var conn = new SqlConnection(""))
            {
                conn.Open();
                using (SqlBulkCopy cpy = new SqlBulkCopy(conn, SqlBulkCopyOptions.TableLock | SqlBulkCopyOptions.UseInternalTransaction, null))
                {
                    cpy.DestinationTableName = "dbo.SqlTable";
                    cpy.BatchSize = 10;
                    cpy.EnableStreaming = true;
                    cpy.BulkCopyTimeout = 0;
                    cpy.WriteToServer(reader);
                }
            }
        }
    }
}
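Two things worth checking are how the Oracle provider fetches the LOBs (by default ODP.NET does not fetch LOB data inline with the row and uses a fairly small row buffer) and the SqlBulkCopy batch size, since a batch size of 10 commits every 10 rows. Below is a minimal sketch, assuming the managed ODP.NET provider (Oracle.ManagedDataAccess) and hypothetical column names (ID, BLOB_DATA); adjust to your schema and tune the sizes against your actual row sizes.

using Oracle.ManagedDataAccess.Client;
using System.Data.SqlClient;

static void CopyBlobs(string oracleConnStr, string sqlConnStr)
{
    using (var orclConn = new OracleConnection(oracleConnStr))
    {
        orclConn.Open();
        // Select only the columns you need instead of "select *".
        using (var orclCmd = new OracleCommand("select ID, BLOB_DATA from OracleTable", orclConn))
        {
            // Fetch LOB data inline with the row instead of a round trip per LOB (-1 = entire LOB).
            orclCmd.InitialLOBFetchSize = -1;
            // Enlarge the row fetch buffer (the default is quite small) to reduce round trips.
            orclCmd.FetchSize = 8 * 1024 * 1024;
            using (var reader = orclCmd.ExecuteReader())
            using (var conn = new SqlConnection(sqlConnStr))
            {
                conn.Open();
                using (var cpy = new SqlBulkCopy(conn, SqlBulkCopyOptions.TableLock, null))
                {
                    cpy.DestinationTableName = "dbo.SqlTable";
                    // A larger batch (or 0 = one batch) usually performs much better than 10.
                    cpy.BatchSize = 1000;
                    cpy.EnableStreaming = true;
                    cpy.BulkCopyTimeout = 0;
                    cpy.WriteToServer(reader);
                }
            }
        }
    }
}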

Related

Fastest way to get multiple rows using npgsql

I have the following code with a list of GUIDs, and I need to get all corresponding items from the database:
var guids = new List<string>()
{
    "1HFAoQchL1NB9Tc5wJLYHm",
    "1OLd2nLP92c87kmkHaMAtO",
    "2FwmMnAX98sg537nfo$S39",
    "38pd0XqUvEvA8nrnF$3vvZ",
    "0iqYqWKBvC_x62wk0jcKW3",
    "0njEfI2BzDEeQpiOo5S18q",
    "0chMA4Bgf9nxnq2i1gG9Fb",
    "3TrnV9L79FfBz0ni6_bJ9N"
};
foreach (var gd in guids)
{
    MemoryStream stream = null;
    using (var conn = new NpgsqlConnection(connectionString))
    {
        string sQL = $"SELECT geometricaldata FROM entities WHERE guid= '{gd}'";
        using (var command = new NpgsqlCommand(sQL, conn))
        {
            conn.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    stream = new MemoryStream((byte[])reader[0]);
                }
                reader.Close();
            }
        }
    }
}
Do I really need to do this in a loop one by one, or would it be better to batch the SELECT statements? Or maybe there is another way to get multiple rows with one query that I don't know of?
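You don't need a round trip per GUID: PostgreSQL can match against an array parameter with = ANY, and Npgsql maps a string[] to a text[] parameter. A minimal sketch, assuming the same entities table and that guid is a text column:

using Npgsql;
using System.Collections.Generic;
using System.IO;

static Dictionary<string, MemoryStream> LoadGeometry(string connectionString, List<string> guids)
{
    var result = new Dictionary<string, MemoryStream>();
    using (var conn = new NpgsqlConnection(connectionString))
    {
        conn.Open();
        // One parameterized query instead of one query per guid.
        using (var command = new NpgsqlCommand(
            "SELECT guid, geometricaldata FROM entities WHERE guid = ANY(@guids)", conn))
        {
            command.Parameters.AddWithValue("guids", guids.ToArray());
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    result[reader.GetString(0)] = new MemoryStream((byte[])reader[1]);
                }
            }
        }
    }
    return result;
}

Passing the values as a parameter also avoids the SQL injection risk of interpolating each GUID into the query text.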

Paging data from Oracle into a DataTable using Oracle Exadata and C#

I am using Oracle Exadata as the database in an ASP.NET C# web application.
I am trying to query the database and fetch the data into a DataTable. The data is very large because I am using SELECT *; the DataTable errors out with memory issues, and loading everything at once would not be the right approach anyway.
I am trying to check whether a paging model can be applied.
I have used the code model below with MS SQL, and it works. I am not sure how this can be applied to an Oracle query. Below is the code for MS SQL.
public List<DataTable> GetDataSet()
{
    var dataTables = new List<DataTable>();
    var totalRecords = 0;
    var tableIndex = 1;
    using (var cn = new SqlConnection { ConnectionString = ConnectionString })
    {
        using (var cmd = new SqlCommand { Connection = cn })
        {
            var selectStatement =
                @"SELECT Cust.CustomerIdentifier,
                         Cust.CompanyName,
                         Cust.ContactName,
                         C.[Name] AS Country
                  FROM dbo.Customers AS Cust
                  INNER JOIN dbo.Countries AS C
                      ON Cust.CountryIdentifier = C.CountryIdentifier
                  ORDER BY Cust.CustomerIdentifier
                  OFFSET @Offset ROWS
                  FETCH NEXT 25 ROWS ONLY;";
            var countStatement = "SELECT COUNT(Cust.CustomerIdentifier) FROM dbo.Customers AS Cust";
            cmd.CommandText = countStatement;
            cn.Open();
            totalRecords = Convert.ToInt32(cmd.ExecuteScalar());
            cmd.CommandText = selectStatement;
            cmd.Parameters.Add("@Offset", SqlDbType.Int);
            for (var index = 0; index < totalRecords; index++)
            {
                if (index % 25 == 0)
                {
                    cmd.Parameters["@Offset"].Value = index;
                    var dt = new DataTable() { TableName = $"Table{tableIndex}" };
                    dt.Load(cmd.ExecuteReader());
                    dataTables.Add(dt);
                    tableIndex += 1;
                }
            }
        }
    }
    return dataTables;
}
I am trying to achieve the same functionality with Oracle. How can I query Oracle to get the data in the same way? Thanks.
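Oracle 12c and later support the same OFFSET ... FETCH syntax, so the paging loop carries over almost unchanged; the main differences are the :Offset bind variable and BindByName. A minimal sketch, assuming a recent Oracle version on the Exadata machine, the managed ODP.NET provider, and hypothetical Customers/Countries tables mirroring the SQL Server example:

using System;
using System.Collections.Generic;
using System.Data;
using Oracle.ManagedDataAccess.Client;

public List<DataTable> GetDataSet(string connectionString)
{
    var dataTables = new List<DataTable>();
    var tableIndex = 1;
    using (var cn = new OracleConnection(connectionString))
    using (var cmd = new OracleCommand { Connection = cn, BindByName = true })
    {
        cn.Open();

        cmd.CommandText = "SELECT COUNT(*) FROM Customers";
        var totalRecords = Convert.ToInt32(cmd.ExecuteScalar());

        cmd.CommandText =
            @"SELECT Cust.CustomerIdentifier, Cust.CompanyName, Cust.ContactName, C.Name AS Country
              FROM Customers Cust
              INNER JOIN Countries C ON Cust.CountryIdentifier = C.CountryIdentifier
              ORDER BY Cust.CustomerIdentifier
              OFFSET :Offset ROWS FETCH NEXT 25 ROWS ONLY";
        cmd.Parameters.Add("Offset", OracleDbType.Int32);

        for (var offset = 0; offset < totalRecords; offset += 25)
        {
            cmd.Parameters["Offset"].Value = offset;
            var dt = new DataTable { TableName = $"Table{tableIndex}" };
            dt.Load(cmd.ExecuteReader());
            dataTables.Add(dt);
            tableIndex += 1;
        }
    }
    return dataTables;
}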

Properly read the column data type in an MS Access database

How do I properly get the data types of the columns of an MS Access table?
Below is my code:
var con = $"Provider=Microsoft.Jet.OLEDB.4.0;" +
$"Data Source={database};" +
"Persist Security Info=True;";
using (OleDbConnection myConnection = new OleDbConnection())
{
myConnection.ConnectionString = con;
myConnection.Open();
DataTable dt = myConnection.GetSchema("Tables");
var l = myConnection.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, new object[] { null, null, null, "TABLE" });
var isTableExist = false;
foreach (DataRow item in l.Rows)
{
var tbleName = item[2].ToString();
if (tbleName == tableName)
{
isTableExist = true;
break;
}
}
if (isTableExist)
{
string sql = string.Format($"SELECT * FROM {tableName}");
using (OleDbCommand cmd = new OleDbCommand(sql, myConnection))
{
using (OleDbDataReader reader = cmd.ExecuteReader())
{
if (reader != null)
{
for (int i = 0; i < reader.FieldCount; i++)
{
Console.WriteLine(reader.GetName(i));
Console.WriteLine(reader.GetDataTypeName(i));
}
}
}
}
}
}
Here is what I'm receiving for the structure of the .mdb file I'm reading:
id
DBTYPE_I4
main_id
DBTYPE_I4
Document_ID
DBTYPE_WVARCHAR
End_page
DBTYPE_WVARCHAR
No_pages
DBTYPE_I4
Why does the AutoNumber data type show the same as the Number data type?
What I'm doing here is merging tables, and I want to skip the INSERT statement for columns whose data type is AutoNumber, but I can't make it work because the result is the same as Number. Thank you.
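DBTYPE_I4 is the same for AutoNumber and Number (Long Integer) because an AutoNumber column is just a long integer with an auto-increment property, so the data type alone cannot distinguish them. One approach that should work is to read the reader's schema table and check its IsAutoIncrement flag. A minimal sketch, reusing the tableName and myConnection variables from the question:

using System;
using System.Data;
using System.Data.OleDb;

string sql = $"SELECT * FROM {tableName}";
using (var cmd = new OleDbCommand(sql, myConnection))
using (var reader = cmd.ExecuteReader(CommandBehavior.SchemaOnly | CommandBehavior.KeyInfo))
{
    DataTable schema = reader.GetSchemaTable();
    foreach (DataRow col in schema.Rows)
    {
        string name = col["ColumnName"].ToString();
        // AutoNumber columns report DBTYPE_I4 like Number (Long Integer),
        // but the schema table flags them as auto-increment.
        bool isAutoNumber = col["IsAutoIncrement"] is bool flag && flag;
        Console.WriteLine($"{name}: {col["DataType"]} (AutoNumber: {isAutoNumber})");
        // Skip columns where isAutoNumber is true when building the INSERT for the merge.
    }
}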

Error in bulk upload to SQL Server from Excel 2007 using C#

I am trying to insert into SQL Server using bulk upload. It works well when I run it locally, and it also works on the test URL, but when I run it on the live site it breaks after uploading 10,000 rows.
Below is my code:
public bool ExportExcelToSql(DataTable DT)
{
    bool result = false;
    tableColumns = "*";
    try
    {
        using (var connection = new OleDbConnection(_excelConnectionString))
        {
            connection.Open();
            DataTable dt = null;
            dt = connection.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, null);
            _excelSheetName = dt.Rows[0]["TABLE_NAME"].ToString();
            var command = new OleDbCommand("Select " + tableColumns + " FROM [" + _excelSheetName + "]", connection);
            command.CommandTimeout = 6000;
            using (DbDataReader dr = command.ExecuteReader())
            {
                string conString = _sqlConnectionString;
                var sqlConn = new SqlConnection(conString);
                sqlConn.Open();
                using (var bulkCopy = new SqlBulkCopy(sqlConn))
                {
                    bulkCopy.BulkCopyTimeout = 6000;
                    bulkCopy.DestinationTableName = tableName.ToString();
                    for (int i = 0; i < DT.Rows.Count; i++)
                    {
                        bulkCopy.ColumnMappings.Add(new SqlBulkCopyColumnMapping(DT.Rows[i][0].ToString(), DT.Rows[i][1].ToString()));
                    }
                    bulkCopy.WriteToServer(dr);
                }
                result = true;
            }
        }
    }
    catch (Exception ex)
    {
        throw ex;
    }
    return result;
}
I would split the file into chunks and import them one at a time. No, there is no such limit after 100,000 rows. I would suspect that the problem is that the file does not match the format file, or that you're using something like SQL Express 2008 R2, which is limited to a 4 GB database size.
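If you go the chunking route, SqlBulkCopy can do it for you: a smaller BatchSize commits in chunks, and NotifyAfter with the SqlRowsCopied event shows how far the copy gets before the live server drops it. A minimal sketch, reusing the dr reader, the DT mapping table, and tableName from the question:

using (var bulkCopy = new SqlBulkCopy(sqlConn))
{
    bulkCopy.DestinationTableName = tableName.ToString();
    bulkCopy.BulkCopyTimeout = 0;   // no client-side timeout
    bulkCopy.BatchSize = 5000;      // commit every 5,000 rows instead of one huge batch
    bulkCopy.NotifyAfter = 5000;    // raise SqlRowsCopied every 5,000 rows
    bulkCopy.SqlRowsCopied += (sender, e) =>
        Console.WriteLine($"{e.RowsCopied} rows copied so far");
    for (int i = 0; i < DT.Rows.Count; i++)
    {
        bulkCopy.ColumnMappings.Add(DT.Rows[i][0].ToString(), DT.Rows[i][1].ToString());
    }
    bulkCopy.WriteToServer(dr);
}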

Get database list in Npgsql

How can I get a list of all available PostgreSQL databases on a user's computer using Npgsql in C#?
In PostgreSQL there are two ways to do it:
\l
SELECT datname FROM pg_database;
Using Npgsql, you can use a code snippet like this:
IEnumerable<string> GetDatabaseNames()
{
    var connStrBuilder = new NpgsqlConnectionStringBuilder();
    connStrBuilder.Host = "localhost";
    connStrBuilder.Username = "theduck";
    connStrBuilder.Password = "password123";
    connStrBuilder.Database = "postgres"; // this database is always present
    using (var conn = new NpgsqlConnection(connStrBuilder.ConnectionString))
    {
        conn.Open();
        using (var command = new NpgsqlCommand("SELECT datname FROM pg_database WHERE datistemplate = false ORDER BY datname;", conn))
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                yield return reader.GetString(0);
            }
        }
    }
}
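For example, the iterator can then be consumed like this (note that the connection is only opened when the sequence is actually enumerated):

foreach (var databaseName in GetDatabaseNames())
{
    Console.WriteLine(databaseName);
}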
