Differences between Convert in SQL and Convert in C#

I have a C# project connected to a SQL Server database, and I'm doing format recognition for file uploads.
More specifically, I upload files as varbinary and store them in a column in the DB. When I download them, I want to recognize their format. I found some keywords (or metadata) on the web through which I could recognize whether they are docx, xlsx, or something else. My problem is that if I run this in SQL:
select * from table where convert(varchar(8000), objectFile) like '%docprop%'
it works, and the DB returns only Word files.
But how can I do the same in C# after retrieving the varbinary?
I tried this, but it doesn't work:
var item = context.tObject_M.SingleOrDefault(x => x.objectId == objectId);
var files = item.objectFile;
string filess = Convert.ToString(files);
byte[] itemi = item.objectFile;
string ciao = System.Text.Encoding.UTF8.GetString(itemi);
if (ciao.Contains("DOCPROPS"))
{
    filess = "ciao";
}

http://www.garykessler.net/library/file_sigs.html
Check this link; you should read up on magic numbers. They are the only reliable way to know the file type. In your case you can use the content type, as I told you in the comments.
docx and xlsx files are ZIP containers, so search for the ZIP signature 50 4B 03 04 ("PK\x03\x04") at the start of the file:
byte[] fileSign = { 0x50, 0x4B, 0x03, 0x04 }; // "PK\x03\x04": ZIP local file header
bool isZipPackage = true;
for (int i = 0; i < fileSign.Length; i++)
{
    if (fileByteArray[i] != fileSign[i])
    {
        isZipPackage = false; // signature mismatch: not a ZIP-based file
        break;
    }
}
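The signature only tells you the file is some ZIP-based package; to tell docx from xlsx you have to look inside the archive. A minimal sketch using System.IO.Compression (my choice of ZIP reader, not something from the question; the "word/" and "xl/" entry prefixes are the standard OOXML package layout):
using System.IO;
using System.IO.Compression;
using System.Linq;

static string GuessOoxmlType(byte[] fileBytes)
{
    using (var ms = new MemoryStream(fileBytes))
    using (var zip = new ZipArchive(ms, ZipArchiveMode.Read))
    {
        // docx packages contain word/document.xml, xlsx packages contain xl/workbook.xml
        if (zip.Entries.Any(e => e.FullName.StartsWith("word/"))) return "docx";
        if (zip.Entries.Any(e => e.FullName.StartsWith("xl/"))) return "xlsx";
        return "unknown";
    }
}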

Related

CSV does not load to Google BigQuery

So I made a program that gets the schema from a table in a MySQL server, creates a table based on that schema, and inserts the data rows, which are saved in a CSV file in Google Cloud Storage.
Code for getting the schema from MySQL:
foreach (string table in listOfTableNames)
{
    MySqlCommand query = new MySqlCommand($"desc {table}", openedMySQLConnection);
    MySqlDataReader reader = query.ExecuteReader();
    DataTable dt = new DataTable(table);
    dt.Load(reader);
    reader.Close();
    object[][] result = dt.AsEnumerable().Select(x => x.ItemArray).ToArray();
    TableSchemas.Add(table, result);
}
Creating the Google BigQuery table from the schema, looped per table:
var schemaBuilder = new TableSchemaBuilder();
foreach (var column in dictionaryOfSchemas[myTablename])
{
    string columnType = column[1].ToString().ToLower();
    schemaBuilder.Add(
        column[0].ToString(),
        columnType.Contains("varchar") ? BigQueryDbType.String :
        (columnType.Contains("int") ? BigQueryDbType.Int64 :
        (columnType.Contains("decimal") ? BigQueryDbType.Float64 :
        (columnType.Contains("timestamp") ? BigQueryDbType.DateTime :
        (columnType.Contains("datetime") ? BigQueryDbType.DateTime :
        (columnType.Contains("date") ? BigQueryDbType.Date :
        BigQueryDbType.String)))))
    );
}
TableSchema schema = schemaBuilder.Build();
BigQueryTable newTable = bigquery.GetDataset(myDataset).CreateTable(myTablename, schema);
Loading the CSV into the created BigQuery table, looped per table:
bigquery.CreateLoadJob(
    $"gs://{myProjectIdString}/{myBucketNameString}/{myCSVFilename}",
    newTable.Reference,
    schema,
    new CreateLoadJobOptions()
    {
        SourceFormat = FileFormat.Csv,
        SkipLeadingRows = 1
    }
).PollUntilCompleted();
No error shows up after running the CreateLoadJob method.
The schema of the new table seems good and matches the CSV and the MySQL table.
Here's a sneak peek into the CSV file in Cloud Storage, with some data redacted.
But there's still no data in the table.
Am I doing something wrong? I'm still learning Google services so any help and insight would be appreciated. :)
There are a few things wrong here, but it's mostly to do with the data rather than with the code itself.
In terms of code, when a job has completed, it may still have completed with errors. You can call ThrowOnAnyError() to observe that. You can get detailed errors via job.Resource.Status.Errors. (I believe the first of those detailed errors is the one in job.Resource.Status.ErrorResult.)
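A minimal sketch of that check (job is the BigQueryJob returned by PollUntilCompleted(); loadJob stands in for the result of CreateLoadJob):
BigQueryJob job = loadJob.PollUntilCompleted();
job.ThrowOnAnyError(); // throws if the job completed with errors
// Or inspect the details without throwing:
if (job.Resource.Status.Errors != null)
{
    foreach (var error in job.Resource.Status.Errors)
    {
        Console.WriteLine(error.Message);
    }
}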
Once the correct storage URL is provided (which would be observed that way as well) you'll see errors like this, with the CSV file you provided:
Error while reading data, error message: CSV processing encountered too many errors, giving up. Rows: 1; errors: 1; max bad: 0; error percent: 0
Could not parse '06/12/2014' as DATE for field delivery_date (position 0) starting at location 95 with message 'Unable to parse'
At that point, the problem is in your CSV file. There are two issues here:
The date format is expected to be ISO-8601, e.g. "2014-06-12" rather than "06/12/2014"
The date/time format is expected to include seconds as well, e.g. "2014-05-12 12:37:00" rather than "05/12/2014 12:37"
Hopefully you're able to run a preprocessing job to fix the data in your CSV file.
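For instance, a per-value fix-up could look like this (the input format string "MM/dd/yyyy" is my assumption about the data; verify it against the actual CSV):
using System;
using System.Globalization;

string raw = "06/12/2014";
string iso = DateTime.ParseExact(raw, "MM/dd/yyyy", CultureInfo.InvariantCulture)
                     .ToString("yyyy-MM-dd"); // "2014-06-12"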
This is assuming that the schema you've created is correct, of course - we can't tell that from your post, but here's the schema that worked for me:
var schema = new TableSchemaBuilder
{
    { "delivery_date", BigQueryDbType.Date },
    { "delivery_hour", BigQueryDbType.Int64 },
    { "participant_id", BigQueryDbType.String },
    { "resource_id", BigQueryDbType.String },
    { "type_id", BigQueryDbType.String },
    { "price", BigQueryDbType.Numeric },
    { "date_posted", BigQueryDbType.DateTime },
    { "date_created", BigQueryDbType.DateTime }
}.Build();
After fiddling with my variables and links, I figured out that the link to your bucket object should be gs://<bucket_name>/<csv_filename> instead of gs://<project_id>/<bucket_name>/<csv_filename>.
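Concretely, the source URI in the CreateLoadJob call from the question becomes:
$"gs://{myBucketNameString}/{myCSVFilename}" // no project id segment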
My code now runs OK and successfully transfers the data.

Convert result of File.ReadAllBytes as File object without actually writing it

One Module reads some files using
File.ReadAllBytes("path")
and stores the result in a database table.
Later I take the result from the table and normally use
File.WriteAllBytes("outputPath", binary.Data)
to write the file back.
Now I have to change the content of the file. Of course I can write the file to a temp folder, read it back in as a File object, change it, write it back to the destination folder.
But is there a smarter way to do that? Create the File object directly out of the binary data?
I found a solution. It's the Encoding class:
var binary = GetBinaryFileFromDatabase();
// Decode the stored bytes to text and split into lines
var inputLines = Encoding.Default.GetString(binary).Split('\n');
var outputLines = new List<string>();
foreach (var line in inputLines)
{
    var columns = line.Split(';');
    // Swap out the file name in column 28, keeping the rest of the value
    if (columns.Length > 28 && !string.IsNullOrEmpty(columns[28]))
    {
        columns[28] = columns[28].Replace(Path.GetFileName(columns[28]), "SomeValue");
    }
    outputLines.Add(string.Join(";", columns));
}
File.WriteAllLines(outputPath, outputLines);
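If the modified content should go back into the database rather than to disk, the same encoding turns it back into bytes (a sketch; SaveBinaryFileToDatabase is a hypothetical counterpart to GetBinaryFileFromDatabase):
byte[] modified = Encoding.Default.GetBytes(string.Join("\n", outputLines));
SaveBinaryFileToDatabase(modified); // hypothetical: write back to the varbinary column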
Thanks everybody!

Export varbinary to JSON File

A database table has a document stored as varbinary,
so I can get it as a byte[] in C# code.
Now, how can I export this byte[] to a JSON file field?
if (item.IS_VIDEO == 0)
{
    var content = ctx.DOCUMENT_TABLE.First(a => a.document_id == item.document_id).DOCUMENT_CONTENT;
    if (content != null)
    {
        publicationClass.document_content = System.Text.Encoding.Default.GetString(content); // for export to JSON field
    }
}
Is this a way to export a byte[] file to JSON?
Have you considered letting the JSON serializer deal with the problem?
byte[] file = File.ReadAllBytes("FilePath"); // replace with how you get your array of bytes
string str = JsonConvert.SerializeObject(file);
This can be then deserialized on the receiving end like this:
var xyz = JsonConvert.DeserializeObject<byte[]>(str);
This appears to work without any issues; however, there might be some size limitations that are worth investigating before committing to this method.
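For reference, Json.NET serializes a byte[] as a Base64 string, so the field embeds cleanly into a larger JSON object. A sketch, assuming a DTO shaped like the question's publicationClass:
using System;
using Newtonsoft.Json;

class Publication
{
    public string document_content { get; set; } // Base64-encoded file bytes
}

var pub = new Publication
{
    document_content = Convert.ToBase64String(content) // content is the byte[] from the question
};
string json = JsonConvert.SerializeObject(pub);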

Search Android C# Xamarin Sqlite Database

I need some help figuring out how to search my SQLite database.
I am trying to search by the cake and stars columns; this is my connection:
// Create our connection
string folder = System.Environment.GetFolderPath(System.Environment.SpecialFolder.Personal);
var db = new SQLiteConnection(System.IO.Path.Combine(folder, "OnTopFiles.db"));
db.CreateTable<User>();
// Insert note into the database
var note = new User { cake = "frosting marble", stars = "5" };
db.Insert(note);
// Show the automatically set ID and message.
Console.WriteLine("{0}: {1}", note.cake, note.stars);
You can use a WHERE query (note: db and the lowercase cake property match the question's code):
var query = db.Table<User>().Where(u => u.cake.StartsWith("frosting"));
foreach (var user in query)
{
    // Use user.cake or other properties
}
Alternatively, you can use SQL directly. The library you are using hides most of the SQLite syntax from you. The library below is pure .NET Standard, runs on all Xamarin platforms, and gives you the full power of SQL, so searching works the same as on other platforms:
https://github.com/MelbourneDeveloper/SQLite.Net.Standard
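A raw query in that style might look like this (a sketch, assuming the Query<T> method that sqlite-net-style libraries expose; verify against the library you actually use):
var results = db.Query<User>("SELECT * FROM User WHERE cake LIKE ?", "frosting%");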

LINQ doesn't provide a byte[] data type for storing files

I'm trying to store a file in a SQL Server database and am using ASP.NET MVC 3 with LINQ to SQL in C#. When I try to set the data type of a column to byte[] it says that this is an invalid data type. Using the varbinary(MAX) data type instead seems like the logical thing to do.
The goal is to store PDF files in a database that the user has elected to upload using an HTTP file uploader. The error occurs during assignment of the LINQ member variable that refers to the column where the file is to be stored.
Error 3 Cannot implicitly convert type 'System.Web.HttpPostedFileBase' to 'System.Data.Linq.Binary' C:\Users\Aaron Patten\Documents\Visual Studio 2010\Projects\MyApp\MyApp\Controllers\MyController.cs 66 31 MyApp
I haven't found much in the way of other people encountering this problem, so I think there must be something I'm missing that is too elementary for anyone to post.
My controller:
public ActionResult UploadPDF()
{
    List<ViewDataUploadFilesResult> r = new List<ViewDataUploadFilesResult>();
    HttpPostedFileBase hpf; // = Request.Files[file] as HttpPostedFileBase;
    foreach (string file in Request.Files)
    {
        hpf = Request.Files[file] as HttpPostedFileBase;
        if (hpf.ContentLength == 0)
            continue;
        string savedFileName = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "images\\" + Path.GetFileName(hpf.FileName));
        hpf.SaveAs(savedFileName);
        r.Add(new ViewDataUploadFilesResult()
        {
            Name = savedFileName,
            Length = hpf.ContentLength
        });
    }
    MyApp.Models.MyAppDataContext db = new MyApp.Models.MyAppDataContext();
    MyApp.Models.PDFFile myPDFFile = new MyApp.Models.PDFFile();
    myPDFFile.Content = hpf;
    db.questions.InsertOnSubmit(myPDFFile);
    db.SubmitChanges();
    return View("UploadPDF", r);
}
What's the proper way of going about all this?
P.S.: How do I display the PDF as an embedded object without saving it to the server?
The error message tells you what's wrong: you are setting Content = hpf. hpf is an HttpPostedFileBase, so it first needs to be converted to a byte[]. Do that by reading the bytes out of the hpf's stream.
myPDFFile.Content = new BinaryReader(hpf.InputStream).ReadBytes(hpf.ContentLength); // ContentLength is an int, avoiding a cast from the stream's long Length
In your LINQ to SQL entity designer, change the type of the field that is giving you trouble from System.Data.Linq.Binary to System.Byte[]. The conversion will be implicit upon submission.
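For reference, an equivalent way to get the bytes (a sketch; Stream.CopyTo is available from .NET 4 on):
byte[] bytes;
using (var ms = new MemoryStream())
{
    hpf.InputStream.CopyTo(ms);
    bytes = ms.ToArray();
}
myPDFFile.Content = bytes; // byte[] converts implicitly to System.Data.Linq.Binary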
