Read Oracle BLOB data in chunks - C#

I have the following questions about fetching BLOB data from Oracle (I am trying to use OracleDataReader in .NET to read the BLOB value):
Is it possible to read BLOB data from an Oracle database in chunks, without loading the entire BLOB into the server's memory? I believe OracleDataReader.GetBytes() will load the entire BLOB into server memory.
Passing a null buffer to GetBytes() fetches the size of the BLOB, but would that require the BLOB to be loaded into the server's memory?
What would be the optimal way to fetch the BLOB size and the BLOB data in chunks without loading the entire BLOB in memory?

Look at DBMS_LOB.READ
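If you stay on the .NET side, the usual pattern is to execute the reader with CommandBehavior.SequentialAccess and pull the BLOB through GetBytes() in a loop with a small buffer, so only one chunk is held in memory at a time. A rough sketch, assuming ODP.NET (Oracle.ManagedDataAccess) and a placeholder table "documents" with a NUMBER key "id" and a BLOB column "payload":

using System.Data;
using System.IO;
using Oracle.ManagedDataAccess.Client;

static void CopyBlobInChunks(string connectionString, int id, Stream destination)
{
    using (var conn = new OracleConnection(connectionString))
    using (var cmd = new OracleCommand("SELECT payload FROM documents WHERE id = :id", conn))
    {
        cmd.Parameters.Add("id", OracleDbType.Int32).Value = id;
        conn.Open();

        // SequentialAccess streams the selected column instead of buffering the whole row.
        using (var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
        {
            if (!reader.Read())
                return;

            var buffer = new byte[64 * 1024];   // 64 KB per call
            long offset = 0;
            long read;
            // GetBytes copies at most buffer.Length bytes per call; 0 means end of the BLOB.
            while ((read = reader.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
            {
                destination.Write(buffer, 0, (int)read);
                offset += read;
            }
        }
    }
}

To get the size without pulling the data, you can select DBMS_LOB.GETLENGTH(payload) in a separate query instead of calling GetBytes() with a null buffer.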

Related

Streaming Data from Azure Blobs Back to Client

Using ASP.Net Web API, I am developing a service which (amongst other things) retrieves data from Azure, and returns it to the client.
One way of doing this would be to read the entire blob into a buffer, and then write that buffer to the response. However, I'd rather stream the contents, for better performance.
This is simple with the Azure API:
CloudBlobContainer container = BlobClient.GetContainerReference(containerName);
CloudBlockBlob blob = container.GetBlockBlobReference(blobName);
using (var buffer = new MemoryStream())
{
    await blob.DownloadToStreamAsync(buffer);
}
And elsewhere in the code, this is returned to the client:
HttpResponseMessage response = Request.CreateResponse(HttpStatusCode.OK);
response.Content = new StreamContent(buffer);
But can I be certain that the MemoryStream won't be closed/disposed before the client finishes reading?
As long as you don't wrap your memory stream in a "using" statement, you will be fine. If you do use "using", you end up with a weird race condition where it works sometimes and fails at other times.
I have code like yours in production and it works fine.
The only thing to be mindful of is that the whole blob is copied into memory before anything is sent to the client. This may cause memory pressure on your server and initial lag, depending on the size of the file.
If that is a concern, you have a couple of options.
One is to create a "lease" on the blob and give the user a URL to read it directly from blob storage for a limited time. That only works for low-security scenarios, though.
Alternatively, you can use chunked transfer encoding. Basically, you read the file from blob storage in chunks and send it to the client in those chunks. That saves memory - but I have not been able to make it work async, so you are trading memory for threads. Which solution is right for you will depend on your specific circumstances.
(I have not got the code to hand, post a comment if you want it and I'll try to dig it out, even if it's a bit old).
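For reference, one streaming variant (this is not the answerer's chunked-transfer code, just a sketch under the same Web API and Azure Storage client assumptions as the question) is to return a PushStreamContent so the blob is copied to the response stream as it downloads:

HttpResponseMessage response = Request.CreateResponse(HttpStatusCode.OK);
response.Content = new PushStreamContent(async (outputStream, httpContent, transportContext) =>
{
    try
    {
        CloudBlobContainer container = BlobClient.GetContainerReference(containerName);
        CloudBlockBlob blob = container.GetBlockBlobReference(blobName);
        // The blob is written to the response as it arrives, so the whole file
        // is never held in server memory at once.
        await blob.DownloadToStreamAsync(outputStream);
    }
    finally
    {
        // Closing the stream tells Web API the response body is complete.
        outputStream.Close();
    }
});
return response;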

Creating Blob in Azure Blob Storage with the same name simultaneously

I have a task to load some images into blob storage simultaneously. The name of each blob is defined as the MD5 of its content. It can happen that different threads try to load the same files from different locations.
Now I need to know how to block other threads from uploading the same file if the first thread is already trying to upload such a blob.
You can do it without leasing by using optimistic concurrency. Basically, set an access condition that says the upload should only succeed if no blob (and therefore no ETag) already exists under this name. If there is indeed a blob with some ETag, the second upload will fail.
var access = AccessCondition.GenerateIfNoneMatchCondition("*");
await blobRef.UploadFromStreamAsync(stream, access, null, null);
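For completeness, here is a sketch of how the losing thread might detect that failure (the 409/412 status codes are an assumption about how the conditional upload is rejected, so verify them against your SDK version):

var access = AccessCondition.GenerateIfNoneMatchCondition("*");
try
{
    await blobRef.UploadFromStreamAsync(stream, access, null, null);
}
catch (StorageException ex)
    when (ex.RequestInformation.HttpStatusCode == 409 ||
          ex.RequestInformation.HttpStatusCode == 412)
{
    // Another thread (or machine) already created a blob with this MD5-based name,
    // so this upload can be treated as a no-op.
}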

Converting a blob (.bacpac) to a .bacpac file to import a database to SQL Server Azure?

While working in MVC/C# with Azure, I need to restore a database from a .bacpac file which is stored in blob storage. I'm using the DAC Framework API to access the .bacpac from blob storage.
Issue:
DacServices.ImportBacpac requires a .bacpac file. I am able to reference the blob (which is a .bacpac), but it comes back as a blob and not as a .bacpac file. I'm not sure how to convert a blob to a .bacpac. Can you please point me to a way or an API to do that conversion?
Later I will use this file to import the .bacpac into SQL Server Azure.
Thanks for your time and help.
The best way is probably to read the blob as a stream (CloudBlob.DownloadToStream()) and create the bacpac from said stream (BacPackage.Load()).
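A rough sketch of that approach (untested; the container name, blob name, connection string and target database name are placeholders), using the Azure Storage client together with Microsoft.SqlServer.Dac:

CloudBlobContainer container = blobClient.GetContainerReference("backups");
CloudBlockBlob blob = container.GetBlockBlobReference("mydb.bacpac");

using (var stream = new MemoryStream())
{
    // Pull the .bacpac contents out of blob storage...
    blob.DownloadToStream(stream);
    stream.Position = 0;   // ...and rewind before handing the stream to the DAC API.

    using (BacPackage package = BacPackage.Load(stream))
    {
        var services = new DacServices(sqlAzureConnectionString);
        services.ImportBacpac(package, "MyRestoredDatabase");
    }
}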

ASP.NET data corruption via WebClient.OpenWriteCompleted

I'm trying to upload a byte[] of data to ASP.NET via WebClient and OpenWriteCompleted in Silverlight 5. If the data size is small, all the data gets written correctly. If I upload, say, a 500 KB file, I get data corruption.
How do I upload a 500 KB file to ASP.NET?
NOTE: I'm trying to upload a zip file into an MSSQL varbinary(MAX) column in my database.
Never mind, it was a zip compression bug on my part.

Upload/Download data directly to Sql Server

I'm working on a server/client application in which the client has the ability to upload/download a file, which will be stored in SQL Server (2012) as varbinary(max). The problem is that I want the file to be uploaded directly to the database without saving it on the server's hard drive, so I can't use the ReadAllBytes method, which only accepts a path parameter.
Here is a fragment of the code used in the server side:
HttpPostedFile file1 = context.Request.Files[0];
byte[] buffer = new byte[file1.ContentLength];
file1.InputStream.Read(buffer, 0, file1.ContentLength);
Here is the code used to write data from the database into a file:
foreach (var file in list)
{
    System.IO.File.WriteAllBytes(path + file.FileName, file.FileContents);
}
The HttpPostedFile comes with an input stream you can read from. Read it into a byte array, then store it in the database. Or, if your database interface directly accepts a stream (in the case of EF this is unlikely), just put it in as-is.
Do note that ASP.NET (or was it IIS?) may store a file upload temporarily on disk anyway, if it's very large.
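A minimal sketch of that suggestion with plain ADO.NET (the Files table and its columns are placeholders, not from the question); Stream.CopyTo is used instead of a single Read() call, because Read() is not guaranteed to fill the whole buffer:

HttpPostedFile file1 = context.Request.Files[0];

byte[] contents;
using (var ms = new MemoryStream())
{
    file1.InputStream.CopyTo(ms);   // read the upload fully, without touching the disk
    contents = ms.ToArray();
}

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    "INSERT INTO Files (FileName, FileContents) VALUES (@name, @contents)", conn))
{
    cmd.Parameters.AddWithValue("@name", Path.GetFileName(file1.FileName));
    cmd.Parameters.Add("@contents", SqlDbType.VarBinary, -1).Value = contents;   // -1 = varbinary(max)
    conn.Open();
    cmd.ExecuteNonQuery();
}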
