I have a table with a column of data type "timestamp" in SQL Server.
Currently I am trying to move the data from this table into a SQLite database, which needs the value as a string. So far I have not been able to find the correct way to convert it to a string.
So, for example, my SQL value is 0x0000000000012DE0
When I get the record using Entity Framework, I get a byte array.
I tried to convert it to a string using the following code:
value = BitConverter.ToInt64(Version, 0);
However, for the same record I get 0xE02D010000000000. That is the first difference.
The second: since I am working on an Azure Mobile App, this data also goes to Android via a Web API controller. The result I get from Fiddler is in this format:
AAAAAAABM8s=
I also want to convert the byte array value into the format above.
Any suggestions?
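For what it's worth, here is a minimal sketch of both conversions, assuming the byte array comes back in the same order SQL Server displays it (the property name Version is taken from the snippet above; verify the ordering against your own data):
using System;

class RowVersionDemo
{
    static void Main()
    {
        // Example rowversion bytes in the order SQL Server displays them (0x0000000000012DE0).
        byte[] version = { 0x00, 0x00, 0x00, 0x00, 0x00, 0x01, 0x2D, 0xE0 };

        // BitConverter is little-endian on x86/x64, which is why ToInt64 on the raw
        // array appears byte-reversed (0xE02D010000000000). Reverse a copy first to
        // get the numeric value SQL Server shows.
        byte[] copy = (byte[])version.Clone();
        if (BitConverter.IsLittleEndian)
            Array.Reverse(copy);
        long numeric = BitConverter.ToInt64(copy, 0);

        // Hex string in the familiar 0x... form, suitable for storing as text in SQLite.
        string hex = "0x" + BitConverter.ToString(version).Replace("-", "");

        // Base64 string, which is how Web API serializes a byte[] (the AAAAAAABM8s= style value seen in Fiddler).
        string base64 = Convert.ToBase64String(version);

        Console.WriteLine(numeric);
        Console.WriteLine(hex);
        Console.WriteLine(base64);
    }
}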
I had a similar issue with my ASP.NET Core, EF Core 2.0, Angular 2 app. This was database-first development and changing the database definition was not within my remit.
In my case, EF Core automatically performed the concurrency check for me where the Timestamp column was present in the table. This was not an issue for updates, because I could pass the DTO in the body, but with deletes I could not.
To pass the timestamp I used a query string, the value being the Base64 string representation of the timestamp, e.g. AAAAAAACIvw=
In my repository layer I convert the string back to a byte array, e.g.
byte[] ts = Convert.FromBase64String(timestampAsBase64String);
Then delete using the create-and-attach pattern (thereby eliminating any chance of lost updates):
ModelClass modelObj = new ModelClass { Id = id, Timestamp = ts };
_dbcontext.Entry(modelObj).State = EntityState.Deleted;
Thanks to this thread and the CodeProject article "Converting Hexadecimal String to/from Byte Array in C#" for helping get this resolved.
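For reference, a rough sketch of the hexadecimal conversion that article covers; the helper names here are just illustrative:
using System;

static class HexHelpers
{
    // byte[] -> "0x..." style string
    public static string ToHexString(byte[] bytes)
    {
        return "0x" + BitConverter.ToString(bytes).Replace("-", "");
    }

    // "0x..." style string -> byte[]
    public static byte[] FromHexString(string hex)
    {
        if (hex.StartsWith("0x", StringComparison.OrdinalIgnoreCase))
            hex = hex.Substring(2);

        byte[] bytes = new byte[hex.Length / 2];
        for (int i = 0; i < bytes.Length; i++)
            bytes[i] = Convert.ToByte(hex.Substring(i * 2, 2), 16);
        return bytes;
    }
}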
I am coming across an issue where JObject.Parse(); truncates my JSON read from an Excel file. It does not work on my machine, but it will work for the exact same data on another person's machine using the same method and implementation.
Basically, we are calling a method, part of an internal framework, that reads an Excel doc (data provider) based on the test method that is calling it. Once the row is selected, it pulls the data stored in the column's cell. The data is in JSON format. I have used 3 different JSON validators to ensure the JSON is valid.
The JSON below just has filler data, as I cannot share the actual JSON:
{
"columns": [
"column1",
"column2",
"column3",
"column4",
"column5",
"column6",
"column7",
"column8"
],
"data": []
}
When attempting to return the JSON as a JObject, the following is done:
var data = JObject.Parse(MyObject.value.AColumn[0]);
This returns the cell data as a JObject of the data in the cell specified above.
When debugging this, I have typed the JSON into the Excel cell and have gotten data to return, but at some point the data starts getting truncated, as if there is a specific character limit. But again, this works perfectly fine on someone else's machine.
I get this error because of the JSON being truncated:
Unterminated string. Expected delimiter: ". Path 'columns[10]' line 13, position 9.
We are using Newtonsoft to handle the JSON and Dapper for the connection.Query to execute a simple query against the xlsx spreadsheet.
What I am finding is that when executing the query over the OLEDB connection, the returned string maxes out at 255 characters. So this looks more like a Dapper / OleDbConnection issue where I need to set the maximum length higher.
Here is the code for that:
// executing the query and retrieving the test data from the specific column
var query = string.Format("select * from [DataSet$] where TestName = '{0}'", testName);
var value = connection.Query(query).Select(x =>
{
var result = new MyObject{ TestName = x.testName };
foreach (var element in x)
{
if (element.Key.Contains(column))
{
result.CustomColumns.Add(element.Value.ToString());
}
}
return result;
}).FirstOrDefault();
Where x is a dynamic data type.
Has anyone come across this before? Is there some hidden character that is preventing this?
Answered in the comments within the main question.
The issue was that the OLEDB connection was reading the first 8 rows and using them to determine the data type of subsequent rows.
The cell data being pulled was a JSON string. The OLEDB connection was reading the string; however, when trying to parse the string to a JObject, parsing threw an exception.
Further debugging revealed that, within the OLEDB connection, the string was getting truncated at 255 characters when reading the row of data. Formatting the columns did not fix the issue, nor did adding OLEDB settings when creating the connection.
What resolved this was updating the system registry key that determines how many rows to read before deciding on the data type/length. This solution can be found in the comment section of the original question.
Or here: Data truncated after 255 bytes while using Microsoft.Ace.Oledb.12.0 provider
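As a rough sketch of the setup involved (the file path is a placeholder, and the registry location varies by Office/ACE version and bitness, so treat those details as assumptions to verify against the linked answer):
using System.Data.OleDb;

// Opening the workbook with the ACE OLEDB provider. IMEX=1 tells the provider to
// treat mixed-type columns as text, but by itself it does not lift the 255-character limit.
var connectionString =
    @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\path\to\DataSet.xlsx;" +
    @"Extended Properties=""Excel 12.0 Xml;HDR=YES;IMEX=1""";

using (var connection = new OleDbConnection(connectionString))
{
    // The truncation comes from the provider sampling only the first few rows
    // (TypeGuessRows) to guess each column's type and length. Setting the
    // TypeGuessRows registry value to 0 under the ACE "Engines\Excel" key
    // (the exact path depends on the Office version and 32/64-bit) makes it
    // scan all rows, which is the fix described above.
    connection.Open();
    // ... connection.Query(...) via Dapper as in the snippet above ...
}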
I have a timestamp column in my database which is used as a rowversion. When pulling data out of the database we also get the rowversion, which I convert into a byte[]. Up to this stage everything works as expected.
When updating data, I'd like to check (in a stored procedure) whether the rowversion is the same or not, that is, compare the one passed from code with the one stored in the database. If they differ, I abort the update; otherwise the data is updated.
Now my problem is: how do I pass a byte[] to a stored procedure? The parameter type in the stored procedure is timestamp.
Note: I perform all the DB operations in C# using Enterprise Library. I can't change the stored procedure or the data types; that is restricted.
See the code below:
DateTime now = DateTime.Now;
long bNow = now.ToBinary();
byte[] arrayNow = BitConverter.GetBytes(bNow);
long getLong = BitConverter.ToInt64(arrayNow, 0);
DateTime getNow = DateTime.FromBinary(getLong);
Console.WriteLine(getNow.ToLongTimeString());
Console.ReadLine();
The code below works as expected:
public static void TestMethod(...., byte[] rowVersion)
{
.........
dbConnection_.AddInParameter(dbcommand, "#row_version", DbType.Binary,rowVersion);
...........
}
The code above uses Enterprise Library. I think this will help someone.
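For context, a minimal sketch of what the full call can look like with the Enterprise Library Data Access block; the stored procedure and parameter names below are placeholders, not the real ones:
using System.Data;
using System.Data.Common;
using Microsoft.Practices.EnterpriseLibrary.Data;

public static void UpdateWithRowVersion(int id, byte[] rowVersion)
{
    // Resolve the default database configured for the application.
    Database db = DatabaseFactory.CreateDatabase();

    // "usp_UpdateRecord" is a placeholder stored procedure name.
    using (DbCommand command = db.GetStoredProcCommand("usp_UpdateRecord"))
    {
        db.AddInParameter(command, "@id", DbType.Int32, id);

        // The rowversion/timestamp travels as binary; the stored procedure can
        // compare it against the current column value and abort on a mismatch.
        db.AddInParameter(command, "@row_version", DbType.Binary, rowVersion);

        db.ExecuteNonQuery(command);
    }
}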
Thank you to all of you who tried to help here. Cheers!
I am working on a project using ASP.NET MVC 3 and planning to use an Oracle 11g database as the back-end.
I was able to access the Oracle server with no problem and the data loaded successfully into an HTML table.
The problem comes when I try to add, edit or delete a record. I believe it is a very simple issue, but so far I couldn't figure it out. The following simple model is used:
class Country
{
public int CountryId { get; set; }
public string CountryName { get; set; }
}
The CountryId field was created in Oracle using NUMBER(10), as I thought this would work like SQL Server's integer type. But an exception was raised, indicating that it couldn't take the value as Edm.Decimal!
I tried to make it NUMBER(19) and changed CountryId to long, and I still get the same exception.
I spent long hours searching for an open-source project that uses ASP.NET MVC with Oracle, but I couldn't find any!
Any idea why Oracle does not support integer and long? How can I make it work as expected with my MVC project?
From Oracle Data Type Mappings:
This data type is an alias for the NUMBER(38) data type, and is designed so that the OracleDataReader returns a System.Decimal or OracleNumber instead of an integer value. Using the .NET Framework data type can cause an overflow.
So you need to use the decimal data type in the .NET Framework to support the NUMBER data type in Oracle.
It would be better if you mapped your data types in the following way (see the sketch after this list):
[.NET: Int32] = [Oracle:NUMBER(2)..NUMBER(9)*]
[.NET: Int64] = [Oracle:NUMBER(10)..NUMBER(18)*]
[.NET: Double] = [Oracle:NUMBER(x, 0)..NUMBER(x, 15)*]
[.NET: Double] = [Oracle: FLOAT]
[.NET: Decimal] = [Oracle:NUMBER]
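For example, a minimal model following those mappings (the Oracle column types in the comments are assumptions for illustration):
class Country
{
    public long CountryId { get; set; }      // Oracle NUMBER(10)..NUMBER(18) maps to Int64
    public string CountryName { get; set; }  // e.g. VARCHAR2
}

// A plain NUMBER (no precision) would instead need System.Decimal:
// public decimal SomeUnconstrainedNumber { get; set; }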
Have you tried using a decimal or an Int32? I am getting my suggestions from some documentation that I found online: http://docs.oracle.com/cd/E11882_01/win.112/e18754/featLINQ.htm . Let me know how this might work for you.
I am decorating one of the DateTime properties in my class with the Representation = BsonType.Int64 attribute so that it gets stored in the database as the Int64 representation of a date.
When I used to store that property as a normal C# DateTime and did not set the value to anything, it would store DateTime.Min in the database. That was perfect, because I was reading from the database and doing a Query.LT operation on it like the following:
Query.LT("MyField", DateTime.Now));
And it used to return all the values fine.
Now that I have started storing it as BsonType.Int64, and the equivalent of DateTime.Min in BsonType.Int64 is "0", my Query.LT("MyField", DateTime.Now) fails for all the dates that are stored as DateTime.Min.
Any idea on how to solve this?
The problem is that, during the query, the MongoDB driver doesn't know that you chose an alternate representation.
Hence, you need to query with an Int64 explicitly:
Query.LT("MyField", DateTime.Now.Ticks)
This works as expected (tested with MongoDB 2.1.1, C# driver 1.4.2.4500).
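Putting the two halves together, a rough sketch (class and field names are placeholders; this assumes the legacy Query builder from the 1.x driver mentioned above):
using System;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;
using MongoDB.Driver.Builders;

public class MyDocument
{
    // Persisted as a 64-bit integer (ticks) rather than a BSON DateTime.
    [BsonDateTimeOptions(Representation = BsonType.Int64)]
    public DateTime MyField { get; set; }
}

class Program
{
    static void Main()
    {
        // Because MyField is stored as Int64, the comparison value must be the
        // Int64 (Ticks) form as well; a DateTime here would never match.
        var query = Query.LT("MyField", DateTime.Now.Ticks);
        Console.WriteLine(query);
    }
}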
In the program I'm currently working on, my table has a Create_Timestamp column, which I defined as timestamp.
When I'm working with my data context and my form values in my controller on the HttpPost, I'm trying the following:
NewsArticle article = new NewsArticle();
article.Create_Timestamp = System.DateTime.Now;
The error I get is Cannot implicitly convert from 'System.DateTime' to 'System.Data.Linq.Binary'
I've tried to force the conversion, but I'm unsure exactly what I'm doing at this point.
Is it possible in C# to do this conversion and still have Linq be happy with me?
Thanks
I am guessing you are using the SQL timestamp type in your table and expecting it to behave like a DateTime. timestamp isn't really meant for holding date/time information. From MSDN (http://msdn.microsoft.com/en-us/library/aa260631(SQL.80).aspx):
timestamp is a data type that exposes automatically generated binary numbers, which are guaranteed to be unique within a database. timestamp is typically used as a mechanism for version-stamping table rows. The storage size is 8 bytes.
Change your column "Create_Timestamp" to DateTime and you should be fine.
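A minimal sketch of how that looks once the column is a datetime (the DataContext and table names below are placeholders):
// With Create_Timestamp defined as datetime, LINQ to SQL maps it to System.DateTime
// and the original assignment compiles.
NewsArticle article = new NewsArticle();
article.Create_Timestamp = DateTime.Now;

// Hypothetical DataContext name; persist as usual.
using (var db = new NewsDataContext())
{
    db.NewsArticles.InsertOnSubmit(article);
    db.SubmitChanges();
}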