All,
I am encountering "Conversion overflow" exceptions in one of my SqlDataAdapter.Fill() calls, on a decimal field. The error occurs for values of 10 billion and above, but not for values up to 1 billion. Here is the code:
DataSet ds = new DataSet();
SqlDataAdapter adapter = new SqlDataAdapter();
adapter.SelectCommand = <my SQL Command instance>;
adapter.Fill(ds);
I have read about using a SqlDataReader as an alternative, but then we need to set the data type and precision explicitly. I am fetching at least 70 columns, and I don't want to configure all of them just because one decimal field is in error.
Can anyone suggest alternative approaches?
Thank you.
Although a DataSet is allowed as the target when "filling" from a data adapter, I've typically done it with a DataTable instead, since when querying I'm only expecting one result set. Having said that, I would pre-query the table just to get its structure... something like
select whatever from yourTable(s) where 1=2
This will get the expected result columns when you do a
DataTable myTable = new DataTable();
YourAdapter.Fill( myTable );
Now you have a local table whose schema is populated but which contains no rows, so nothing could have failed on content size. You can explicitly go to the one column in question and set its data type / size information as you need...
myTable.Columns["NameOfProblemColumn"].DataType = typeof(WhateverTypeYouNeed);
Now your local schema is legit, and the problem column has the type you chose. Swap in your proper query, with its real WHERE clause instead of the 1=2, to actually return data... Since the first pass returned no rows, you don't even need to call myTable.Clear(); just re-run the query and dataAdapter.Fill().
I haven't actually tried this, as I don't have your data to simulate the same problem, but in theory this process should get you by without having to explicitly go through all the columns... just the few that may pose a problem.
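The two-pass process above can be sketched as follows. The table name "YourTable" and column name "NameOfProblemColumn" are placeholders, and, as noted, this is an untried, theoretical sketch:

```csharp
// Two-pass fill: first grab an empty schema with a 1=2 query, then
// adjust the problem column before fetching real rows.
// "YourTable" and "NameOfProblemColumn" are placeholder names.
using System;
using System.Data;
using System.Data.SqlClient;

class SchemaFirstFill
{
    static DataTable FillWithFixedColumn(SqlConnection conn)
    {
        var myTable = new DataTable();
        var adapter = new SqlDataAdapter("SELECT * FROM YourTable WHERE 1 = 2", conn);
        adapter.Fill(myTable);            // zero rows; columns only

        // DataColumn.DataType may only be changed while the table holds no
        // rows, which is exactly what the 1=2 pass guarantees. Reading the
        // wide decimal as a string sidesteps the System.Decimal conversion.
        myTable.Columns["NameOfProblemColumn"].DataType = typeof(string);

        adapter.SelectCommand.CommandText = "SELECT * FROM YourTable"; // real query
        adapter.Fill(myTable);
        return myTable;
    }
}
```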
I had the same problem, and the reason is that my stored procedure returned a DECIMAL(38,20) field. I changed it to DECIMAL(20,10) and everything works fine. It seems to be a limitation of ADO.NET.
CREATE PROCEDURE FOOPROCEDURE AS
BEGIN
DECLARE @A DECIMAL(38,20) = 999999999999999999.99999999999999999999;
SELECT @A;
END
GO
string connectionString ="";
SqlConnection conn = new SqlConnection(connectionString);
conn.Open();
SqlCommand cmd = new SqlCommand("EXEC FOOPROCEDURE", conn);
SqlDataAdapter adt = new SqlDataAdapter(cmd);
DataSet ds = new DataSet();
adt.Fill(ds); //exception thrown here
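The limitation can be shown without a database: System.Decimal holds at most 28-29 significant digits, while SQL Server's DECIMAL(38,20) holds 38, so the SqlDecimal-to-Decimal conversion inside Fill() can overflow. A minimal sketch:

```csharp
// DB-free illustration of the limitation: SqlDecimal can carry the full
// 38 digits, but converting it to System.Decimal may overflow, which is
// the same conversion SqlDataAdapter.Fill performs for decimal columns.
using System;
using System.Data.SqlTypes;

class DecimalOverflowDemo
{
    static void Main()
    {
        SqlDecimal wide = SqlDecimal.Parse("999999999999999999.99999999999999999999");
        Console.WriteLine($"Precision={wide.Precision}, Scale={wide.Scale}");

        try
        {
            decimal native = wide.Value;   // SqlDecimal -> System.Decimal
            Console.WriteLine(native);
        }
        catch (OverflowException ex)
        {
            Console.WriteLine("Conversion overflow: " + ex.Message);
        }
    }
}
```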
In my project, I want to get some data from an Oracle database. The database has tables and views, kept separately. I connected to the database and tried gathering data from the views.
I wrote the code below to get the data, but I'm getting an error on the dt.Load(dr) line; the exception says "Specified cast is not valid".
Can anyone explain what this error means and how to avoid it? This is the first time I'm working with an Oracle DB.
OracleConnection con = new OracleConnection("Data Source=TEST;Persist Security Info=True;User Id=app;Password=test;");
con.Open();
OracleCommand cmd = con.CreateCommand();
cmd.CommandText = "SELECT * FROM P.INVENTORY_PART_IN_STOCK_UIV WHERE PART_NO = '90202-KPL-900D' and upper(P.Sales_Part_API.Get_Catalog_Group(CONTRACT, PART_NO) ) = upper('SPMB')";
cmd.CommandType = CommandType.Text;
OracleDataReader dr = cmd.ExecuteReader();
DataTable dt = new DataTable();
dt.Load(dr);
dataGridView1.DataSource = dt.DefaultView;
Oracle's NUMBER column type can represent more precision than the native .NET Decimal type. I assume you are using an IFS database and have faced the same issue. I ended up with a solution using Entity Framework, mapping any NUMBER fields to a Double and converting locally to Decimal. So far I haven't had any conversion issues in several years.
As pointed out in the comments, you can restrict the field list in your SELECT statement, but you will struggle whenever you need to read a NUMBER column. A quick and dirty solution is to wrap any NUMBER columns in TO_CHAR and convert them in your local code.
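A sketch of that TO_CHAR approach; the column name and helper are hypothetical, and the parsing helper itself has no Oracle dependency:

```csharp
// SQL side (hypothetical column):
//   SELECT TO_CHAR(QTY_IN_STOCK) AS QTY_IN_STOCK FROM INVENTORY_PART_IN_STOCK_UIV ...
// Local side: parse the string, falling back to double when the value
// has more magnitude than System.Decimal can hold.
using System;
using System.Globalization;

class OracleNumberWorkaround
{
    public static object ParseOracleNumber(string raw)
    {
        if (decimal.TryParse(raw, NumberStyles.Float, CultureInfo.InvariantCulture, out decimal exact))
            return exact;                 // fits (possibly rounded) in System.Decimal
        // Magnitude beyond decimal's range: keep it as a lossy double instead.
        return double.Parse(raw, NumberStyles.Float, CultureInfo.InvariantCulture);
    }
}
```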
In my application I'm getting some data out of a local MS Access database file. I'm puzzled by a sporadic issue where my query for all records of a specific table sometimes returns all the records, and sometimes returns all but the last record. I'm using the following code
string resourceConStr = @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:/FileName.mdb";
OleDbConnection resourceCon = new OleDbConnection(resourceConStr);
OleDbDataAdapter personnelAdapter = new OleDbDataAdapter("Select * From Personnel", resourceCon);
DataTable personnel = new DataTable();
personnelAdapter.Fill(personnel);
When I look at the personnel DataTable, sometimes I have the correct number of records, and sometimes I'm missing the last record from the Access table. I haven't been able to find any pattern as to when it works and when it does not. Any idea what could be the reason for this, or suggestions for a way to validate that all records were copied into the DataTable successfully? Thanks
Any ... suggestions or a way to validate that all records were copied into the DataTable successfully?
One way to do it would be to execute a SELECT COUNT(*) AS n FROM Personnel, and compare that number (assuming that you get one back) with the number of rows in the DataTable after it gets filled.
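A sketch of that check, reusing the Personnel table and connection from the question (the comparison helper itself needs no database):

```csharp
// Validate a Fill() by comparing SELECT COUNT(*) against the row count
// of the filled DataTable. Assumes an already-open OleDbConnection.
using System;
using System.Data;
using System.Data.OleDb;

class FillValidation
{
    public static bool CountsMatch(int expectedCount, DataTable table)
        => expectedCount == table.Rows.Count;

    static bool AllRowsCopied(OleDbConnection resourceCon, DataTable personnel)
    {
        using (var countCmd = new OleDbCommand("SELECT COUNT(*) FROM Personnel", resourceCon))
        {
            int expected = Convert.ToInt32(countCmd.ExecuteScalar());
            return CountsMatch(expected, personnel);
        }
    }
}
```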
Hi all, I have written my query in SQL as follows, and it gives me a result. This is my query:
SELECT Technology, TechDescription, Technology.TechID, COUNT(Question) AS 'Totalposts'
FROM Technology LEFT JOIN Question ON Question.TechID = Technology.TechID
GROUP BY Technology.TechID, Technology, TechDescription
I then wrote it into a stored procedure as follows:
USE [newForumDB]
GO
/****** Object: StoredProcedure [dbo].[selectTechQuestions] Script Date: 01/24/2013 15:06:06 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
create PROCEDURE [dbo].[selectTechQuestions]
As
SET FMTONLY OFF
Begin
SELECT Technology,TechDescription, Technology.TechID, COUNT(Question) AS
'Totalposts'
FROM Technology
LEFT JOIN Questions
ON Questions.TechID = Technologies.TechID
GROUP BY Technologies.TechID, Technology,TechDescription
End
and added it to my model, then tried to add a function import for the procedure, but I am getting the message "The selected stored procedure returns no columns". Can someone help me?
As Habib requested, I tried it both ways, i.e.
1) writing the query in a string and filling the DataSet using SqlCommand and SqlDataAdapter, which works fine
2) SqlCommand cmd = new SqlCommand();
cmd.CommandText = "selectTechQuestions";
cmd.CommandType = CommandType.StoredProcedure;
cmd.Connection = con;
SqlDataAdapter da = new SqlDataAdapter();
da.SelectCommand = cmd;
DataSet ds = new DataSet();
da.Fill(ds);
The second one gives me the error {"Invalid object name 'Questions'."}
Put your query as:
SELECT Technology AS 'Technology', TechDescription AS 'TechDescription', Technology.TechID AS 'ID', COUNT(Question) AS 'Totalposts'
FROM Technology
LEFT JOIN Question
ON Question.TechID = Technology.TechID
GROUP BY Technology.TechID, Technology, TechDescription
I have hit this type of problem before, and this is what I did at the time. It's a silly solution, but it got me past my problem; maybe it can be helpful to you too.
Detailed description:
As we all know, a SELECT statement returns its result as a single data set, even when it reads from multiple tables, e.g. a query with an inner join. When data comes from multiple tables, there is a possibility that two different columns from different tables have the same name. That does not cause any problem in the simple data-fetching style, i.e. plain DataAdapter and SqlCommand, but Entity Framework cannot handle it, and therefore the designer rejects such queries with inner joins or multiple tables. To resolve the problem, just assign a distinct alias to each column, as I did here, and then it will not cause any problem at all.
I am trying to copy a large DataTable (more than 1000 rows), created dynamically in the application, to a MySQL table using C# and WPF.
I have searched for various ways to do this but have been unable to implement them. I think MySqlDataAdapter is the class I should use, but I can't make it work. This is what I tried to do...
MySqlConnection con = new MySqlConnection(MyConString);
MySqlCommand comm = new MySqlCommand("Select * From kinectdata", con);
MySqlDataAdapter test1 = new MySqlDataAdapter(comm);
test1.Update(skelData);
The speed of this transfer is also important, so I would prefer not to call an INSERT or UPDATE statement 1000 times.
Many thanks for your feedback!
M
You can build a single INSERT statement which inserts all 1000 rows.
INSERT INTO table VALUES (1,2,3), (4,5,6), (7,8,9);
1000 rows is not that much; in database terms it's nothing, and using a single INSERT should be very fast, no more than a couple of seconds.
In your example, you do still have to set the command type and the command text for your query.
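A sketch of building that multi-row INSERT from the DataTable, with one parameter per cell so the values stay parameterized; the column names come from whatever DataTable you pass in, and the builder itself has no MySQL dependency:

```csharp
// Build one "INSERT INTO t (c1, c2) VALUES (@p0_0, @p0_1), (@p1_0, @p1_1), ..."
// statement covering every row of the DataTable.
using System;
using System.Data;
using System.Text;

class BulkInsertBuilder
{
    public static string BuildInsert(string table, DataTable data)
    {
        var sb = new StringBuilder($"INSERT INTO {table} (");
        for (int c = 0; c < data.Columns.Count; c++)
            sb.Append(c == 0 ? "" : ", ").Append(data.Columns[c].ColumnName);
        sb.Append(") VALUES ");
        for (int r = 0; r < data.Rows.Count; r++)
        {
            sb.Append(r == 0 ? "(" : ", (");
            for (int c = 0; c < data.Columns.Count; c++)
                sb.Append(c == 0 ? "" : ", ").Append($"@p{r}_{c}");   // one parameter per cell
            sb.Append(')');
        }
        return sb.ToString();
    }
}
```

You would then create a MySqlCommand with the returned text, add each value with cmd.Parameters.AddWithValue($"@p{r}_{c}", data.Rows[r][c]), and call ExecuteNonQuery() once.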
Which would be better for executing an INSERT statement against an MS SQL database: a SqlDataAdapter or a SqlCommand object?
And which of them would be better when inserting only one row, versus inserting multiple rows?
A simple example of code usage:
SQL Command
string query = "insert into Table1(col1,col2,col3) values (@value1,@value2,@value3)";
int i;
SqlCommand cmd = new SqlCommand(query, connection);
// add parameters...
cmd.Parameters.Add("@value1", SqlDbType.VarChar).Value = txtBox1.Text;
cmd.Parameters.Add("@value2", SqlDbType.VarChar).Value = txtBox2.Text;
cmd.Parameters.Add("@value3", SqlDbType.VarChar).Value = txtBox3.Text;
connection.Open();
i = cmd.ExecuteNonQuery();
connection.Close();
SQL Data Adapter
DataSet dsTab = new DataSet("Table1");
SqlDataAdapter adp = new SqlDataAdapter("Select * from Table1", connection);
adp.Fill(dsTab, "Table1");
DataRow dr = dsTab.Tables["Table1"].NewRow();
dr["col1"] = txtBox1.Text;
dr["col2"] = txtBox5.Text;
dr["col3"] = "text";
dsTab.Tables["Table1"].Rows.Add(dr);
SqlCommandBuilder projectBuilder = new SqlCommandBuilder(adp);
DataSet newSet = dsTab.GetChanges(DataRowState.Added);
adp.Update(newSet, "Table1");
Updating a data source is much easier using DataAdapters. It's easier to make changes since you just have to modify the DataSet and call Update.
There is probably no (or very little) difference in the performance between using DataAdapters vs Commands. DataAdapters internally use Connection and Command objects and execute the Commands to perform the actions (such as Fill and Update) that you tell them to do, so it's pretty much the same as using only Command objects.
I would use LINQ to SQL with a DataSet for single inserts and most database CRUD requests. It is type-safe and relatively fast for uncomplicated queries such as the one above.
If you have many rows to insert (1000+) and you are using SQL Server 2008, I would use SqlBulkCopy. You can use your DataSet as input to a stored procedure and merge into your destination table.
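A minimal SqlBulkCopy sketch for that 1000+-row case; the destination table name and connection string are placeholders:

```csharp
// Push an entire DataTable to the server in one bulk operation
// instead of issuing 1000 individual INSERTs.
using System.Data;
using System.Data.SqlClient;

class BulkCopyExample
{
    static void CopyRows(string connectionString, DataTable rows)
    {
        using (var bulk = new SqlBulkCopy(connectionString))
        {
            bulk.DestinationTableName = "dbo.Table1";   // placeholder name
            bulk.BatchSize = 1000;                      // rows per round trip
            bulk.WriteToServer(rows);
        }
    }
}
```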
For complicated queries I recommend using dapper in conjunction with stored procedures.
I suggest you keep some kind of control over your communication with the database. That means abstracting some code, and for that the CommandBuilder automatically generates CUD statements for you.
What would be even better is to use that technique together with a typed DataSet; then you have IntelliSense and compile-time checking on all your columns.