Pack files into one, to later programmatically unpack them [closed] - c#

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Questions asking us to recommend or find a tool, library or favorite off-site resource are off-topic for Stack Overflow as they tend to attract opinionated answers and spam. Instead, describe the problem and what has been done so far to solve it.
Closed 9 years ago.
Is it possible to take all files and folders in a directory and pack them into a single package file, so that I can transfer this package over the network and then unpack all files and folders from the package?
I tried looking into ZIP files with C#, because I'm aiming for the same idea, but the built-in methods for it only come with .NET 3.5 (I believe). I also want the program to be very lightweight, meaning I don't want external modules lying around that have to be shipped along if I wish to unzip/unpack a single file.
How can I accomplish this?

Just use a BinaryWriter/Reader and your own format. Something like this:
using (var fs = File.Create(...))
using (var bw = new BinaryWriter(fs))
{
    foreach (var file in Directory.GetFiles(...))
    {
        bw.Write(true); // means that a file will follow
        bw.Write(Path.GetFileName(file));
        var data = File.ReadAllBytes(file);
        bw.Write(data.Length);
        bw.Write(data);
    }
    bw.Write(false); // means that no more files follow
}
So basically you write a bool that says whether there is a next file, then the name and contents of each file, one after the other. Reading is the exact opposite. BinaryWriter/Reader take care of everything (they know how long each string and byte array is, so you read back exactly what you wrote).
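For completeness, the reading side might look like this (a sketch that mirrors the writer above; targetFolder is just an assumed destination directory):
using (var fs = File.OpenRead(...))
using (var br = new BinaryReader(fs))
{
    while (br.ReadBoolean()) // true means another file follows
    {
        string name = br.ReadString();
        int length = br.ReadInt32();
        byte[] data = br.ReadBytes(length);
        File.WriteAllBytes(Path.Combine(targetFolder, name), data); // targetFolder is up to you
    }
}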
What this solution lacks:
not an industry standard (but quite simple)
doesn't store any additional metadata (you can add creation time, etc.)
doesn't use a checksum (you can add an SHA1 hash after the contents)
doesn't use compression (you said you don't need it)
doesn't handle big files well (the problematic part is that it reads an entire file into a byte array and writes that; it should work pretty well under 100 MB)
doesn't handle multi-level directory hierarchies (can be added of course)
EDIT: The BinaryR/W know about string lengths, but not about byte array lengths. I added a length field before the byte array so that it can be read back exactly as it was written.
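If you later want the checksum mentioned above, one possible addition is to hash each file's bytes and write the hash right after the contents, inside the loop (a sketch; the reader would read it back and compare it the same way):
// inside the foreach loop, right after bw.Write(data):
using (var sha1 = System.Security.Cryptography.SHA1.Create())
{
    byte[] hash = sha1.ComputeHash(data);
    bw.Write(hash.Length);
    bw.Write(hash);
}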

Take a look at ziplib, it's free, open source and can be used in all .NET versions: http://www.icsharpcode.net/opensource/sharpziplib/

What I suggest is to consider the advantages of using an external library, so you can save yourself a lot of trouble. A full zip implementation is a big undertaking. Take a look at this: http://dotnetzip.codeplex.com/ - it's simple, stable and lightweight.
By the way, if you really don't want external libraries and data compression isn't mandatory for your project, you can manage it with something like this (please consider it a sample written in less than an hour ;-) ):
usage:
//to pack
Packer.SimplePack sp = new Packer.SimplePack(@"c:\filename.pack");
sp.PackFolderContent(#"c:\yourfolder");
sp.Save();
//to unpack
Packer.SimplePack sp = new Packer.SimplePack(@"c:\filename.pack");
sp.Open();
Here is SimplePack:
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace Packer
{
public class SimplePack
{
public class Header
{
public Int32 TotalEntries { get; set; }
public Int64[] EntriesSize
{
get
{
return EntriesSizeList.ToArray();
}
}
private List<Int64> EntriesSizeList { get; set; }
public Header()
{
TotalEntries = 0;
EntriesSizeList = new List<Int64>();
}
public void AddEntrySize(Int64 newSize)
{
EntriesSizeList.Add(newSize);
}
}
public class Item
{
public Byte[] RawData { get; set; }
public String Name { get; set; }
public String RelativeUri { get; set; }
public Int64 ItemSize
{
get
{
Int64 retVal = 4; //Name.Length
retVal += Name.Length;
retVal += 4; //RelativeUri.Length
retVal += RelativeUri.Length;
retVal += RawData.Length;
return retVal;
}
}
public Byte[] SerializedData
{
get
{
List<Byte> retVal = new List<Byte>();
retVal.AddRange(BitConverter.GetBytes(Name.Length));
retVal.AddRange(Encoding.Default.GetBytes(Name));
retVal.AddRange(BitConverter.GetBytes(RelativeUri.Length));
retVal.AddRange(Encoding.Default.GetBytes(RelativeUri));
retVal.AddRange(RawData);
return retVal.ToArray();
}
}
public Item()
{
RawData = new Byte[0];
Name = String.Empty;
RelativeUri = String.Empty;
}
public Item(Byte[] serializedItem)
{
Int32 cursor = 0;
Int32 nl = BitConverter.ToInt32(serializedItem, cursor);
cursor += 4;
Name = Encoding.Default.GetString(serializedItem, cursor, nl);
cursor += nl;
Int32 rl = BitConverter.ToInt32(serializedItem, cursor);
cursor += 4;
RelativeUri = Encoding.Default.GetString(serializedItem, cursor, rl);
cursor += rl;
RawData = new Byte[serializedItem.Length - cursor];
for (int i = cursor; i < serializedItem.Length; i++)
{
RawData[i - cursor] = serializedItem[i];
}
}
}
public FileInfo PackedFile { get; private set; }
public List<Item> Data { get; private set; }
public Header FileHeaderDefinition { get; private set; }
public SimplePack(String fileName)
{
PackedFile = new FileInfo(fileName);
FileHeaderDefinition = new Header();
Data = new List<Item>();
}
public Boolean PackFolderContent(String folderFullName)
{
Boolean retVal = false;
DirectoryInfo di = new DirectoryInfo(folderFullName);
//Think about setting up strong checks and errors trapping
if (di.Exists)
{
FileInfo[] files = di.GetFiles("*", SearchOption.AllDirectories);
foreach (FileInfo fi in files)
{
Item it = setItem(fi, di.FullName);
if (it != null)
{
Data.Add(it);
FileHeaderDefinition.TotalEntries++;
FileHeaderDefinition.AddEntrySize(it.ItemSize);
}
}
}
//although nothing is actually verified here
retVal = true;
return retVal;
}
private Item setItem(FileInfo sourceFile, String packedRoot)
{
if (sourceFile.Exists)
{
Item retVal = new Item();
retVal.Name = sourceFile.Name;
retVal.RelativeUri = sourceFile.FullName.Substring(packedRoot.Length).Replace("\\", "/");
retVal.RawData = File.ReadAllBytes(sourceFile.FullName);
return retVal;
}
else
{
return null;
}
}
public void Save()
{
if (PackedFile.Exists)
{
PackedFile.Delete();
System.Threading.Thread.Sleep(100);
}
using (FileStream fs = new FileStream(PackedFile.FullName, FileMode.CreateNew, FileAccess.Write))
{
//Writing Header
//4 bytes
fs.Write(BitConverter.GetBytes(FileHeaderDefinition.TotalEntries), 0, 4);
//8 bytes foreach size
foreach (Int64 size in FileHeaderDefinition.EntriesSize)
{
fs.Write(BitConverter.GetBytes(size), 0, 8);
}
foreach (Item it in Data)
{
fs.Write(it.SerializedData, 0, it.SerializedData.Length);
}
fs.Close();
}
}
public void Open()
{
if (PackedFile.Exists)
{
using (FileStream fs = new FileStream(PackedFile.FullName, FileMode.Open, FileAccess.Read))
{
Byte[] readBuffer = new Byte[4];
fs.Read(readBuffer, 0, readBuffer.Length);
FileHeaderDefinition.TotalEntries = BitConverter.ToInt32(readBuffer, 0);
for (Int32 i = 0; i < FileHeaderDefinition.TotalEntries; i++)
{
readBuffer = new Byte[8];
fs.Read(readBuffer, 0, readBuffer.Length);
FileHeaderDefinition.AddEntrySize(BitConverter.ToInt64(readBuffer, 0));
}
foreach (Int64 size in FileHeaderDefinition.EntriesSize)
{
readBuffer = new Byte[size];
fs.Read(readBuffer, 0, readBuffer.Length);
Data.Add(new Item(readBuffer));
}
fs.Close();
}
}
}
}
}
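Note that Open() only loads everything into Data; to actually restore the files to disk you still have to write them out yourself, for example like this (a sketch; targetFolder is an assumed destination):
//to unpack to disk
Packer.SimplePack sp = new Packer.SimplePack(@"c:\filename.pack");
sp.Open();
foreach (Packer.SimplePack.Item it in sp.Data)
{
    //RelativeUri uses "/" separators, so map it back to a path under targetFolder
    string target = Path.Combine(targetFolder, it.RelativeUri.TrimStart('/').Replace("/", "\\"));
    Directory.CreateDirectory(Path.GetDirectoryName(target));
    File.WriteAllBytes(target, it.RawData);
}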

Related

Advice on appropriate data structure

Background
I have two pieces of data:
machineNumber which is just an id for a machine.
eventString which is an entry in a log.
The same log entry can occur multiple times on one machine and can occur on multiple machines. For example:
machineNumber   eventString
1               LogExample1
2               LogExample1
1               LogExample1
4               LogExample3
3               LogExample2
What I want to do is store this data temporarily in some sort of data structure so I can format it into the following: eventString, NumberOfMachinesEffected, TotalNumberOfInstances, before storing it as a CSV file.
With the above example it would be formatted like LogExample1, 2, 3.
Problem
I'm wondering if someone can recommend an efficient way to store the data before formatting it. I need to be able to iterate over it to count the total number of occurrences and the total number of machines affected, for each eventString.
Requested Code
I was asked to include the code. I don't think it pertains to the problem as it is purely a design question.
namespace ConfigLogAnalyser
{
/// <summary>
/// Interaction logic for MainWindow.xaml
/// </summary>
public partial class MainWindow : Window
{
public String fileName;
public MainWindow()
{
InitializeComponent();
}
private void MenuItem_Click(object sender, RoutedEventArgs e)
{
Microsoft.Win32.OpenFileDialog openFileDialog = new Microsoft.Win32.OpenFileDialog();
openFileDialog.Filter = "Text files(*.txt) | *.txt";
openFileDialog.InitialDirectory = "D:\\LogFiles"; //Testing only. Remove
if (openFileDialog.ShowDialog() == true)
{
//ensure it is a text file
fileName = openFileDialog.FileName;
if(!ProcessLogFile(fileName))
{
MessageBox.Show("Issue reading file: " + fileName);
}
}
}
//to be removed
private bool ProcessLogFile(string fileName)
{
if (!ReadLogFile(fileName))
{
return false;
}
return true;
}
//Why does this need to be a bool
private bool ReadLogFile(string fileName)
{
const Int32 BufferSize = 1024; //Changing buffer size will affect performance.
using (var fileStream = File.OpenRead(fileName))
using (var streamReader = new StreamReader(fileStream, Encoding.UTF8, true, BufferSize))
{
String line;
while ((line = streamReader.ReadLine()) != null)
{
ProcessLine(line);
}
}
return true;
}
private void ProcessLine(string line)
{
/*Process Line -
*
* Possibly use a multimap to store each logEntry of interest and a pair <machineId, NoOfOccurrences>
* Problem: if the same machine logs the same entry twice, how do I make sure two isn't added to the number of machines?
*
*/
throw new NotImplementedException();
}
}
}
I recommend creating your own class to store the event information:
class EventInfo
{
public int MachineID { get; set; }
public string LogMessage { get; set; }
public DateTime EventTime { get; set; }
}
And then just create a list of EventInfo:
List<EventInfo> events = new List<EventInfo>();
C# List has quite good performance and, in addition, using LINQ you can easily manipulate the data.
For example:
events.Where(item => item.MachineID == 1).Select(item => item.LogMessage);
This code selects all the event messages related to the machine with ID = 1.
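To get to the eventString, NumberOfMachinesEffected, TotalNumberOfInstances shape, a GroupBy over that list would look roughly like this (a sketch):
var summary = events
    .GroupBy(e => e.LogMessage)
    .Select(g => new
    {
        EventString = g.Key,
        NumberOfMachinesEffected = g.Select(e => e.MachineID).Distinct().Count(),
        TotalNumberOfInstances = g.Count()
    });
foreach (var row in summary)
    Console.WriteLine("{0}, {1}, {2}", row.EventString, row.NumberOfMachinesEffected, row.TotalNumberOfInstances);
The Distinct() on MachineID is what stops a machine that logs the same entry twice from being counted as two machines.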

How can I add a filesystem to my HTTP-listener/ add frameworks in C# [duplicate]

I'm making a simple webserver to serve html, css, js & images (done in c#). I am using HttpListener and I can get the html, javascript and css files to work properly. I am just having trouble with the images. This is what I'm using currently:
if (request.RawUrl.ToLower().Contains(".png") || request.RawUrl.Contains(".ico") || request.RawUrl.ToLower().Contains(".jpg") || request.RawUrl.ToLower().Contains(".jpeg"))
{
string dir = System.IO.Path.GetDirectoryName(System.Reflection.Assembly.GetExecutingAssembly().Location);
string[] img = request.RawUrl.Split('/');
string path = dir + @"\public\imgs\" + img[img.Length - 1];
FileInfo fileInfo = new FileInfo(path);
long numBytes = fileInfo.Length;
FileStream fileStream = new FileStream(path, FileMode.Open, FileAccess.Read);
BinaryReader binaryReader = new BinaryReader(fileStream);
byte[] output = binaryReader.ReadBytes((int)numBytes);
binaryReader.Close();
fileStream.Close();
var temp = System.Text.Encoding.UTF8.GetString(output);
return temp;
}
I am converting the image into a string to return them (it's the way my boss suggested). This is the method where I am handling these requests.
private static string SendResponse(HttpListenerRequest request)
This is my WebServer class's Run() method. The call to SetContentType just goes through the request.RawUrl and determines the content type.
public void Run()
{
ThreadPool.QueueUserWorkItem((o) =>
{
Console.WriteLine("StackLight Web Server is running...");
try
{
while (_listener.IsListening)
{
ThreadPool.QueueUserWorkItem((c) =>
{
var ctx = c as HttpListenerContext;
try
{
// store html content in a byte array
string responderString = _responderMethod(ctx.Request);
// set the content type
ctx.Response.Headers[HttpResponseHeader.ContentType] = SetContentType(ctx.Request.RawUrl);
byte[] buffer = Encoding.UTF8.GetBytes(responderString);
// this writes the html out from the byte array
ctx.Response.ContentLength64 = buffer.Length;
using(Stream stream = ctx.Response.OutputStream)
{
stream.Write(buffer, 0, buffer.Length);
}
}
catch (Exception ex)
{
ConfigLogger.Instance.LogCritical(LogCategory, ex);
}
}, _listener.GetContext());
}
}
catch (Exception ex)
{
ConfigLogger.Instance.LogCritical(LogCategory, ex);
}
});
}
My html page needs to display an image to the screen, it displays a broken image so far. I know the images directory is correct, I tested that.
This is where I got my code for the webserver: here
I was thinking that maybe I have to change the SendResponse method to not return a string
I figured it out. I created a class to hold the data, content type and the request.RawUrl. Then, where I was passing a string, I changed it to pass the object I created.
So, for my WebServer class, my Run method looks like this:
public void Run()
{
ThreadPool.QueueUserWorkItem((o) =>
{
Console.WriteLine("StackLight Web Server is running...");
try
{
while (_listener.IsListening)
{
ThreadPool.QueueUserWorkItem((c) =>
{
var ctx = c as HttpListenerContext;
try
{
// set the content type
ctx.Response.Headers[HttpResponseHeader.ContentType] = SetContentType(ctx.Request.RawUrl);
WebServerRequestData data = new WebServerRequestData();
// store html content in a byte array
data = _responderMethod(ctx.Request);
string res = "";
if(data.ContentType.Contains("text"))
{
char[] chars = new char[data.Content.Length/sizeof(char)];
System.Buffer.BlockCopy(data.Content, 0, chars, 0, data.Content.Length);
res = new string(chars);
data.Content = Encoding.UTF8.GetBytes(res);
}
// this writes the html out from the byte array
ctx.Response.ContentLength64 = data.Content.Length;
ctx.Response.OutputStream.Write(data.Content, 0, data.Content.Length);
}
catch (Exception ex)
{
ConfigLogger.Instance.LogCritical(LogCategory, ex);
}
finally
{
ctx.Response.OutputStream.Close();
}
}, _listener.GetContext());
}
}
catch (Exception ex)
{
ConfigLogger.Instance.LogCritical(LogCategory, ex);
}
});
}
And my SendResponse method looks like this:
private static WebServerRequestData SendResponse(HttpListenerRequest request)
{
string dir = System.IO.Path.GetDirectoryName(System.Reflection.Assembly.GetExecutingAssembly().Location);
string[] fileUrl = request.RawUrl.Split('/');
// routes
if (request.RawUrl.Contains("/"))
{
// this is the main page ('/'), all other routes can be accessed from here (including css, js, & images)
if (request.RawUrl.ToLower().Contains(".png") || request.RawUrl.ToLower().Contains(".ico") || request.RawUrl.ToLower().Contains(".jpg") || request.RawUrl.ToLower().Contains(".jpeg"))
{
try
{
string path = dir + Properties.Settings.Default.ImagesPath + fileUrl[fileUrl.Length - 1];
FileInfo fileInfo = new FileInfo(path);
path = dir + @"\public\imgs\" + fileInfo.Name;
byte[] output = File.ReadAllBytes(path);
_data = new WebServerRequestData() {Content = output, ContentType = "image/png", RawUrl = request.RawUrl};
//var temp = System.Text.Encoding.UTF8.GetString(output);
//return Convert.ToBase64String(output);
return _data;
}
catch(Exception ex)
{
ConfigLogger.Instance.LogError(LogCategory, "File could not be read.");
ConfigLogger.Instance.LogCritical(LogCategory, ex);
_errorString = string.Format("<html><head><title>Test</title></head><body>There was an error processing your request:<br />{0}</body></html>", ex.Message);
_byteData = new byte[_errorString.Length * sizeof(char)];
System.Buffer.BlockCopy(_errorString.ToCharArray(), 0, _byteData, 0, _byteData.Length);
_data = new WebServerRequestData() { Content = _byteData, ContentType = "text/html", RawUrl = request.RawUrl };
return _data;
}
}
I'm still cleaning up the code a bit but it now serves the images!
Oh... And here is the object I'm using:
public class WebServerRequestData
{
public string RawUrl { get; set; }
public string ContentType { get; set; }
public byte[] Content { get; set; }
public string RawData { get; set; }
}
Some really bad stuff here:
Empty catch. You'll never find out about many bugs.
Stuffing binary data into a string. Why? There's no encoding that is able to roundtrip binary data.
You're not disposing of ctx. I don't see why you need a manual finally block. Use using.
Untrusted callers can inject arbitrary paths into path. I could request your web.config file by navigating to /img/..\..\web.config (something like that).
Consider factoring out some common expressions into variables. You've got a copy & paste error with ToLower (the .ico check is missing it). Don't do dirty stuff and you'll have fewer bugs.
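For the path injection point, one possible fix (a sketch, reusing the dir and fileUrl variables from the handler above) is to resolve the requested name against the images folder and reject anything that ends up outside it:
string imgRoot = Path.GetFullPath(Path.Combine(dir, "public", "imgs"));
string requested = Path.GetFullPath(Path.Combine(imgRoot, Path.GetFileName(fileUrl[fileUrl.Length - 1])));
if (!requested.StartsWith(imgRoot, StringComparison.OrdinalIgnoreCase))
{
    // refuse anything that escapes the images folder
    return new WebServerRequestData { Content = Encoding.UTF8.GetBytes("Not found"), ContentType = "text/plain", RawUrl = request.RawUrl };
}
byte[] output = File.ReadAllBytes(requested);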

How to read back JPG with multiple data created in Unity? ( C# )

I'm really sorry if this has been asked before, or if it is obvious, but I'm not entirely sure how to progress here. Please excuse the length of the question.
For a research project, my intention is to create a JPG image that contains, after the JPG data, a small plain text and another image as well. This image would be used later as an AR marker that 'contains' all the data to be displayed. I read that this could also be achieved with steganography, but encrypting is not necessarily the goal, I think.
So far, using various other methods found here, the script can create a readable image that adds the byte[] data of the other content right after the first JPG, but I'm not entirely sure how to read it back.
using UnityEngine;
using Crosstales.FB;
using System.Collections; // needed for IEnumerator
using System.Collections.Generic; // needed for List<int>
using System.IO;
using System.Linq;
using System.Runtime.Serialization.Formatters.Binary;
using System;
using System.Text;
public class JPGMan : MonoBehaviour {
public UnityEngine.UI.Image markerThumb;
public UnityEngine.UI.Image mediaThumb;
public UnityEngine.UI.InputField descF;
Encoding u8 = Encoding.UTF8;
public void initLoadMarker()
{
string markerExt = "";
string markerPath = FileBrowser.OpenSingleFile("Image for Marker", "", markerExt);
StartCoroutine(loadMarker(markerPath));
Debug.Log(markerPath);
}
public void initLoadMedia()
{
string mediaExt = "";
string mediaPath = FileBrowser.OpenSingleFile("Image for Media", "", mediaExt);
StartCoroutine(loadMedia(mediaPath));
Debug.Log(mediaPath);
}
IEnumerator loadMarker(string mPath)
{
WWW markerWWW;
using (markerWWW = new WWW("File:///" + mPath))
{
yield return markerWWW;
Debug.Log(markerWWW.isDone);
Rect protoRect = new Rect(Vector2.zero, new Vector2(markerWWW.texture.width, markerWWW.texture.height));
Sprite protoSprite = Sprite.Create(markerWWW.texture, protoRect, Vector2.zero);
markerThumb.sprite = protoSprite;
markerThumb.preserveAspect = true;
}
}
IEnumerator loadMedia(string mPath)
{
WWW mediaWWW;
using (mediaWWW = new WWW("File:///" + mPath))
{
yield return mediaWWW;
Debug.Log(mediaWWW.isDone);
Rect protoRect = new Rect(Vector2.zero, new Vector2(mediaWWW.texture.width, mediaWWW.texture.height));
Sprite protoSprite = Sprite.Create(mediaWWW.texture, protoRect, Vector2.zero);
mediaThumb.sprite = protoSprite;
mediaThumb.preserveAspect = true;
}
}
public void encodeJPG()
{
string finalExt = "jpg";
string finalPath = FileBrowser.SaveFile("Save Marker", "", "aMARKER", finalExt);
Texture2D protoJPG = new Texture2D((int)mediaThumb.sprite.textureRect.width, (int)mediaThumb.sprite.textureRect.height);
Color[] pixels = mediaThumb.sprite.texture.GetPixels((int)mediaThumb.sprite.textureRect.x, (int)mediaThumb.sprite.textureRect.y, (int)mediaThumb.sprite.textureRect.width, (int)mediaThumb.sprite.textureRect.height);
protoJPG.SetPixels(pixels);
protoJPG.Apply();
Texture2D protoMarker = new Texture2D((int)markerThumb.sprite.textureRect.width, (int)markerThumb.sprite.textureRect.height);
Color[] markerPixels = markerThumb.sprite.texture.GetPixels((int)markerThumb.sprite.textureRect.x, (int)markerThumb.sprite.textureRect.y, (int)markerThumb.sprite.textureRect.width, (int)markerThumb.sprite.textureRect.height);
protoMarker.SetPixels(markerPixels);
protoMarker.Apply();
byte[] markerBytes = protoMarker.EncodeToJPG(100);
byte[] imgBytes = protoJPG.EncodeToJPG(100);
byte[] textBytes = System.Text.Encoding.Unicode.GetBytes(descF.text);
byte[] finalBytes = new byte[markerBytes.Length + textBytes.Length + imgBytes.Length];
System.Buffer.BlockCopy(markerBytes, 0, finalBytes, 0, markerBytes.Length);
System.Buffer.BlockCopy(textBytes, 0, finalBytes, markerBytes.Length, textBytes.Length);
System.Buffer.BlockCopy(imgBytes, 0, finalBytes, markerBytes.Length + textBytes.Length, imgBytes.Length);
File.WriteAllBytes(finalPath, finalBytes);
}
public void decodeJPG()
{
string inExt = "jpg";
string inPath = FileBrowser.OpenSingleFile("Open Marker", "", inExt);
StartCoroutine(loadMarkerImage(inPath));
}
IEnumerator loadMarkerImage(string inPath_)
{
byte[] bytesFile;
char[] lookFiles = new char[] {'J','F','I','F'};
using (FileStream fs = File.OpenRead(inPath_))
{
yield return fs;
bytesFile = new byte[20];
fs.Read(bytesFile, 0, 20);
fs.Close();
}
byte[] searchBytes = u8.GetBytes(lookFiles);
List<int> rList = IndexOfSequence(bytesFile, searchBytes, 0);
Debug.Log(rList[0].ToString());
}
}
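(IndexOfSequence isn't shown above; a naive version of such a helper, assumed to return every start index of the pattern at or after startIndex, could look like this:)
List<int> IndexOfSequence(byte[] buffer, byte[] pattern, int startIndex)
{
    List<int> positions = new List<int>();
    for (int i = startIndex; i <= buffer.Length - pattern.Length; i++)
    {
        bool match = true;
        for (int j = 0; j < pattern.Length; j++)
        {
            if (buffer[i + j] != pattern[j]) { match = false; break; }
        }
        if (match) positions.Add(i);
    }
    return positions;
}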
I'm thinking that I could maybe find a way to write to the JPG spec's APP0 marker, but I haven't found much about doing this specifically in Unity. Another option could be to loop through the whole byte[] to find the headers and thus separate the data, although I read that reading all the bytes in one operation is not particularly fast.
Finally, I was thinking of appending the serialization of a class that contains the extra data, making it possible to deserialize it and just refer to it as properties of the class. But that's probably a bad idea.
I am aware this is very unusual, and I don't mind as well just using a regular database and link the images to that data.
Is there anyone that could help with this matter? Any help is appreciated.
Thank you all for your time.

EntityTooSmall in CompleteMultipartUploadResponse

using .NET SDK v.1.5.21.0
I'm trying to upload a large file (63Mb) and I'm following the example at:
http://docs.aws.amazon.com/AmazonS3/latest/dev/LLuploadFileDotNet.html
But using a helper instead of the whole code, and using jQuery File Upload:
https://github.com/blueimp/jQuery-File-Upload/blob/master/basic-plus.html
what I have is:
string bucket = "mybucket";
long totalSize = long.Parse(context.Request.Headers["X-File-Size"]),
maxChunkSize = long.Parse(context.Request.Headers["X-File-MaxChunkSize"]),
uploadedBytes = long.Parse(context.Request.Headers["X-File-UloadedBytes"]),
partNumber = uploadedBytes / maxChunkSize + 1,
fileSize = partNumber * inputStream.Length;
bool lastPart = inputStream.Length < maxChunkSize;
// http://docs.aws.amazon.com/AmazonS3/latest/dev/LLuploadFileDotNet.html
if (partNumber == 1) // initialize upload
{
iView.Utilities.Amazon_S3.S3MultipartUpload.InitializePartToCloud(fileName, bucket);
}
try
{
// upload part
iView.Utilities.Amazon_S3.S3MultipartUpload.UploadPartToCloud(fs, fileName, bucket, (int)partNumber, uploadedBytes, maxChunkSize);
if (lastPart)
// wrap it up and go home
iView.Utilities.Amazon_S3.S3MultipartUpload.CompletePartToCloud(fileName, bucket);
}
catch (System.Exception ex)
{
// Houston, we have a problem!
//Console.WriteLine("Exception occurred: {0}", exception.Message);
iView.Utilities.Amazon_S3.S3MultipartUpload.AbortPartToCloud(fileName, bucket);
}
and
public static class S3MultipartUpload
{
private static string accessKey = System.Configuration.ConfigurationManager.AppSettings["AWSAccessKey"];
private static string secretAccessKey = System.Configuration.ConfigurationManager.AppSettings["AWSSecretKey"];
private static AmazonS3 client = Amazon.AWSClientFactory.CreateAmazonS3Client(accessKey, secretAccessKey);
public static InitiateMultipartUploadResponse initResponse;
public static List<UploadPartResponse> uploadResponses;
public static void InitializePartToCloud(string destinationFilename, string destinationBucket)
{
// 1. Initialize.
uploadResponses = new List<UploadPartResponse>();
InitiateMultipartUploadRequest initRequest =
new InitiateMultipartUploadRequest()
.WithBucketName(destinationBucket)
.WithKey(destinationFilename.TrimStart('/'));
initResponse = client.InitiateMultipartUpload(initRequest);
}
public static void UploadPartToCloud(Stream fileStream, string destinationFilename, string destinationBucket, int partNumber, long uploadedBytes, long maxChunkedBytes)
{
// 2. Upload Parts.
UploadPartRequest request = new UploadPartRequest()
.WithBucketName(destinationBucket)
.WithKey(destinationFilename.TrimStart('/'))
.WithUploadId(initResponse.UploadId)
.WithPartNumber(partNumber)
.WithPartSize(maxChunkedBytes)
.WithFilePosition(uploadedBytes)
.WithInputStream(fileStream) as UploadPartRequest;
uploadResponses.Add(client.UploadPart(request));
}
public static void CompletePartToCloud(string destinationFilename, string destinationBucket)
{
// Step 3: complete.
CompleteMultipartUploadRequest compRequest =
new CompleteMultipartUploadRequest()
.WithBucketName(destinationBucket)
.WithKey(destinationFilename.TrimStart('/'))
.WithUploadId(initResponse.UploadId)
.WithPartETags(uploadResponses);
CompleteMultipartUploadResponse completeUploadResponse =
client.CompleteMultipartUpload(compRequest);
}
public static void AbortPartToCloud(string destinationFilename, string destinationBucket)
{
// abort.
client.AbortMultipartUpload(new AbortMultipartUploadRequest()
.WithBucketName(destinationBucket)
.WithKey(destinationFilename.TrimStart('/'))
.WithUploadId(initResponse.UploadId));
}
}
My maxChunkSize is 6 MB (6 * 1024 * 1024), as I have read that the minimum is 5 MB...
why am I getting "Your proposed upload is smaller than the minimum allowed size" exception? What am I doing wrong?
The error is:
<Error>
<Code>EntityTooSmall</Code>
<Message>Your proposed upload is smaller than the minimum allowed size</Message>
<ETag>d41d8cd98f00b204e9800998ecf8427e</ETag>
<MinSizeAllowed>5242880</MinSizeAllowed>
<ProposedSize>0</ProposedSize>
<RequestId>C70E7A23C87CE5FC</RequestId>
<HostId>pmhuMXdRBSaCDxsQTHzucV5eUNcDORvKY0L4ZLMRBz7Ch1DeMh7BtQ6mmfBCLPM2</HostId>
<PartNumber>1</PartNumber>
</Error>
How can I get ProposedSize if I'm passing the stream and stream length?
Here is a working solution for the latest Amazon SDK (as of today: v.1.5.37.0).
Amazon S3 Multipart Upload works like:
Initialize the request using client.InitiateMultipartUpload(initRequest)
Send chunks of the file (loop until the end) using client.UploadPart(request)
Complete the request using client.CompleteMultipartUpload(compRequest)
If anything goes wrong, remember to dispose of the client and request, as well as fire the abort command using client.AbortMultipartUpload(abortMultipartUploadRequest)
I keep the client in Session as we need it for each chunk upload, and also keep hold of the ETags that are later used to complete the process.
You can see an example and a simple way of doing this in the Amazon docs themselves; I ended up having a class to do everything, plus I have integrated it with the lovely jQuery File Upload plugin (handler code below as well).
The S3MultipartUpload class is as follows:
public class S3MultipartUpload : IDisposable
{
string accessKey = System.Configuration.ConfigurationManager.AppSettings.Get("AWSAccessKey");
string secretAccessKey = System.Configuration.ConfigurationManager.AppSettings.Get("AWSSecretKey");
AmazonS3 client;
public string OriginalFilename { get; set; }
public string DestinationFilename { get; set; }
public string DestinationBucket { get; set; }
public InitiateMultipartUploadResponse initResponse;
public List<PartETag> uploadPartETags;
public string UploadId { get; private set; }
public S3MultipartUpload(string destinationFilename, string destinationBucket)
{
if (client == null)
{
System.Net.WebRequest.DefaultWebProxy = null; // disable proxy to make upload quicker
client = Amazon.AWSClientFactory.CreateAmazonS3Client(accessKey, secretAccessKey, new AmazonS3Config()
{
RegionEndpoint = Amazon.RegionEndpoint.EUWest1,
CommunicationProtocol = Protocol.HTTP
});
this.OriginalFilename = destinationFilename.TrimStart('/');
this.DestinationFilename = string.Format("{0:yyyy}{0:MM}{0:dd}{0:HH}{0:mm}{0:ss}{0:fffff}_{1}", DateTime.UtcNow, this.OriginalFilename);
this.DestinationBucket = destinationBucket;
this.InitializePartToCloud();
}
}
private void InitializePartToCloud()
{
// 1. Initialize.
uploadPartETags = new List<PartETag>();
InitiateMultipartUploadRequest initRequest = new InitiateMultipartUploadRequest();
initRequest.BucketName = this.DestinationBucket;
initRequest.Key = this.DestinationFilename;
// make it public
initRequest.AddHeader("x-amz-acl", "public-read");
initResponse = client.InitiateMultipartUpload(initRequest);
}
public void UploadPartToCloud(Stream fileStream, long uploadedBytes, long maxChunkedBytes)
{
int partNumber = uploadPartETags.Count() + 1; // current part
// 2. Upload Parts.
UploadPartRequest request = new UploadPartRequest();
request.BucketName = this.DestinationBucket;
request.Key = this.DestinationFilename;
request.UploadId = initResponse.UploadId;
request.PartNumber = partNumber;
request.PartSize = fileStream.Length;
//request.FilePosition = uploadedBytes // remove this line?
request.InputStream = fileStream; // as UploadPartRequest;
var up = client.UploadPart(request);
uploadPartETags.Add(new PartETag() { ETag = up.ETag, PartNumber = partNumber });
}
public string CompletePartToCloud()
{
// Step 3: complete.
CompleteMultipartUploadRequest compRequest = new CompleteMultipartUploadRequest();
compRequest.BucketName = this.DestinationBucket;
compRequest.Key = this.DestinationFilename;
compRequest.UploadId = initResponse.UploadId;
compRequest.PartETags = uploadPartETags;
string r = "Something went badly wrong";
using (CompleteMultipartUploadResponse completeUploadResponse = client.CompleteMultipartUpload(compRequest))
r = completeUploadResponse.ResponseXml;
return r;
}
public void AbortPartToCloud()
{
// abort.
client.AbortMultipartUpload(new AbortMultipartUploadRequest()
{
BucketName = this.DestinationBucket,
Key = this.DestinationFilename,
UploadId = initResponse.UploadId
});
}
public void Dispose()
{
if (client != null) client.Dispose();
if (initResponse != null) initResponse.Dispose();
}
}
I use DestinationFilename as the destination file name so I can avoid name clashes, but I keep the OriginalFilename as I need it later.
Using jQuery File Upload Plugin, all works inside a Generic Handler, and the process is something like this:
// Upload partial file
private void UploadPartialFile(string fileName, HttpContext context, List<FilesStatus> statuses)
{
if (context.Request.Files.Count != 1)
throw new HttpRequestValidationException("Attempt to upload chunked file containing more than one fragment per request");
var inputStream = context.Request.Files[0].InputStream;
string contentRange = context.Request.Headers["Content-Range"]; // "bytes 0-6291455/14130271"
int fileSize = int.Parse(contentRange.Split('/')[1]),
maxChunkSize = int.Parse(context.Request.Headers["X-Max-Chunk-Size"]),
uploadedBytes = int.Parse(contentRange.Replace("bytes ", "").Split('-')[0]);
iView.Utilities.AWS.S3MultipartUpload s3Upload = null;
try
{
// ######################################################################################
// 1. Initialize Amazon S3 Client
if (uploadedBytes == 0)
{
HttpContext.Current.Session["s3-upload"] = new iView.Utilities.AWS.S3MultipartUpload(fileName, awsBucket);
s3Upload = (iView.Utilities.AWS.S3MultipartUpload)HttpContext.Current.Session["s3-upload"];
string msg = System.String.Format("Upload started: {0} ({1:N0}MB)", s3Upload.DestinationFilename, (fileSize / 1024 / 1024));
this.Log(msg);
}
// cast current session object
if (s3Upload == null)
s3Upload = (iView.Utilities.AWS.S3MultipartUpload)HttpContext.Current.Session["s3-upload"];
// ######################################################################################
// 2. Send Chunks
s3Upload.UploadPartToCloud(inputStream, uploadedBytes, maxChunkSize);
// ######################################################################################
// 3. Complete Upload
if (uploadedBytes + maxChunkSize > fileSize)
{
string completeRequest = s3Upload.CompletePartToCloud();
this.Log(completeRequest); // log S3 response
s3Upload.Dispose(); // dispose all objects
HttpContext.Current.Session["s3-upload"] = null; // we don't need this anymore
}
}
catch (System.Exception ex)
{
if (ex.InnerException != null)
while (ex.InnerException != null)
ex = ex.InnerException;
this.Log(string.Format("{0}\n\n{1}", ex.Message, ex.StackTrace)); // log error
s3Upload.AbortPartToCloud(); // abort current upload
s3Upload.Dispose(); // dispose all objects
statuses.Add(new FilesStatus(ex.Message));
return;
}
statuses.Add(new FilesStatus(s3Upload.DestinationFilename, fileSize, ""));
}
Keep in mind that to have a Session object inside a Generic Handler, you need to implement IRequiresSessionState so your handler will look like:
public class UploadHandlerSimple : IHttpHandler, IRequiresSessionState
Inside fileupload.js (under _initXHRData) I have added an extra header called X-Max-Chunk-Size so I can pass this to Amazon and calculate if it's the last part of the uploaded file.
Feel free to comment and make smart edits for everyone to use.
I guess you didn't set the content-length of the part inside the UploadPartToCloud() function.
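In other words, with the old 1.5.x request API from the question, size each part from the stream that is actually being sent rather than the configured chunk size, roughly like this (a sketch of the changed call):
UploadPartRequest request = new UploadPartRequest()
    .WithBucketName(destinationBucket)
    .WithKey(destinationFilename.TrimStart('/'))
    .WithUploadId(initResponse.UploadId)
    .WithPartNumber(partNumber)
    .WithPartSize(fileStream.Length)  // the size of this part's data, not maxChunkedBytes
    .WithInputStream(fileStream) as UploadPartRequest;
uploadResponses.Add(client.UploadPart(request));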

Detecting if internet connection is busy

We are developing an application that will be installed on PCs and will perform some background uploads and downloads to/from our server. One of the requirements is to detect whether the internet connection is currently busy (say above 50% utilization), and if it is, it needs to back off and try another time. The main reason is to ensure the app does not interfere with the user experience if they are in the middle of gaming, watching an online movie or aggressively downloading files.
After much thinking and research on Google and of course SO, I still haven't found a good way to implement this, so I decided to throw this out here. The application is implemented in C#, .NET 4.0 and I am looking for all forms of responses - implementations in C# or other languages, pseudo-logic, or approaches for measuring internet traffic utilization on the local PC with good enough accuracy.
To avoid duplication of effort, so far I have tried these (and why they aren't suitable)
Use WMI to get network statistics. Most SO posts and solutions out there seem to refer to this as the approach, but it doesn't meet our requirement: measuring bytes sent/received against the network interface capacity (e.g. a 1 Gb Ethernet card) gives a good utilisation measure for LAN traffic, but not for internet traffic (where the actual internet bandwidth might only be, say, 8 Mbps).
Use of .NET Network Information statistics or performance counters (see the sketch just after this list) - these yield similar readings to the above and hence have the same shortcomings.
Use ICMP (ping) and measure RTT. It was suggested that a 400 ms RTT is considered slow and a good indication of a busy network; however, I was told that users on a modem (yes, we have to support that), behind a reverse proxy or on a microwave link often get pings above that, so it's not a good measure.
Start downloading a known file and measure the speed - this itself generates traffic, which we are trying to avoid; also, if this check is done often enough, our application will end up creating a lot of internet traffic itself, which again is not ideal.
MOD: Using BITS - this service can be disabled on the user's PC, requires group policy changes and assumes the server is IIS (with custom configuration), and in our case our server is not IIS.
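For reference, the interface-statistics approach from point 2 looks roughly like this (a sketch); as noted, it measures against the NIC's nominal speed, not the real internet bandwidth:
// take the first NIC that is up and sample its byte counters one second apart
System.Net.NetworkInformation.NetworkInterface nic = null;
foreach (var n in System.Net.NetworkInformation.NetworkInterface.GetAllNetworkInterfaces())
{
    if (n.OperationalStatus == System.Net.NetworkInformation.OperationalStatus.Up) { nic = n; break; }
}
var s1 = nic.GetIPv4Statistics();
long before = s1.BytesReceived + s1.BytesSent;
System.Threading.Thread.Sleep(1000);
var s2 = nic.GetIPv4Statistics();
long after = s2.BytesReceived + s2.BytesSent;
double utilization = ((after - before) * 8.0) / nic.Speed; // fraction of the NIC speed, not of the internet link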
So here it is, I'm all confused and looking for some advice. I highlighted the question text so that you guys don't get lost reading this and wondering what the question is. Thanks.
You could use UPnP to query the router and retrieve the number of bytes sent and received over the network. You could keep checking this value on the router to determine what the activity is. Unfortunately this functionality doesn't seem to be well documented, but it is possible to implement UPnP communication within a C# application. You will need to use UDP to query for the router (UPnP discovery), and once you have found the device, query its functionality, and then query the number of packets sent and received for the Internet Gateway Device using a WebClient (TCP).
Code for a UPnP library:
using System;
using System.Collections.Generic;
using System.Text;
using System.Net.Sockets;
using System.Net;
using System.Xml;
using System.IO;
namespace UPNPLib
{
public class RouterElement
{
public RouterElement()
{
}
public override string ToString()
{
return Name;
}
public List<RouterElement> children = new List<RouterElement>();
public RouterElement parent;
public string Name;
public string Value;
public RouterElement this[string name] {
get
{
foreach (RouterElement et in children)
{
if (et.Name.ToLower().Contains(name.ToLower()))
{
return et;
}
}
foreach (RouterElement et in children)
{
Console.WriteLine(et.Name);
}
throw new KeyNotFoundException("Unable to find the specified entry");
}
}
public RouterElement(XmlNode node, RouterElement _parent)
{
Name = node.Name;
if (node.ChildNodes.Count < 2) //leaf node: keep its text (condition reconstructed; the original line was truncated)
{
Value = node.InnerText;
}
else
{
foreach (XmlNode child in node.ChildNodes)
{
children.Add(new RouterElement(child, this));
}
}
}
}
public class UPNP
{
/// <summary>
/// Gets the root URL of the device
/// </summary>
/// <returns></returns>
public static string GetRootUrl()
{
StringBuilder mbuilder = new StringBuilder();
mbuilder.Append("M-SEARCH * HTTP/1.1\r\n");
mbuilder.Append("HOST: 239.255.255.250:1900\r\n");
mbuilder.Append("ST:upnp:rootdevice\r\n");
mbuilder.Append("MAN:\"ssdp:discover\"\r\n");
mbuilder.Append("MX:3\r\n\r\n");
UdpClient mclient = new UdpClient();
byte[] dgram = Encoding.ASCII.GetBytes(mbuilder.ToString());
mclient.Send(dgram,dgram.Length,new IPEndPoint(IPAddress.Broadcast,1900));
IPEndPoint mpoint = new IPEndPoint(IPAddress.Any, 0);
rootsearch:
dgram = mclient.Receive(ref mpoint);
string mret = Encoding.ASCII.GetString(dgram);
string orig = mret;
mret = mret.ToLower();
string url = orig.Substring(mret.IndexOf("location:") + "location:".Length, mret.IndexOf("\r", mret.IndexOf("location:")) - (mret.IndexOf("location:") + "location:".Length));
WebClient wclient = new WebClient();
try
{
Console.WriteLine("POLL:" + url);
string reply = wclient.DownloadString(url);
if (!reply.ToLower().Contains("router"))
{
goto rootsearch;
}
}
catch (Exception)
{
goto rootsearch;
}
return url;
}
public static RouterElement enumRouterFunctions(string url)
{
XmlReader mreader = XmlReader.Create(url);
XmlDocument md = new XmlDocument();
md.Load(mreader);
XmlNodeList rootnodes = md.GetElementsByTagName("serviceList");
RouterElement elem = new RouterElement();
foreach (XmlNode et in rootnodes)
{
RouterElement el = new RouterElement(et, null);
elem.children.Add(el);
}
return elem;
}
public static RouterElement getRouterInformation(string url)
{
XmlReader mreader = XmlReader.Create(url);
XmlDocument md = new XmlDocument();
md.Load(mreader);
XmlNodeList rootnodes = md.GetElementsByTagName("device");
return new RouterElement(rootnodes[0], null);
}
}
public class RouterMethod
{
string url;
public string MethodName;
string parentname;
string MakeRequest(string URL, byte[] data, string[] headers)
{
Uri mri = new Uri(URL);
TcpClient mclient = new TcpClient();
mclient.Connect(mri.Host, mri.Port);
Stream mstream = mclient.GetStream();
StreamWriter textwriter = new StreamWriter(mstream);
textwriter.Write("POST "+mri.PathAndQuery+" HTTP/1.1\r\n");
textwriter.Write("Connection: Close\r\n");
textwriter.Write("Content-Type: text/xml; charset=\"utf-8\"\r\n");
foreach (string et in headers)
{
textwriter.Write(et + "\r\n");
}
textwriter.Write("Content-Length: " + (data.Length).ToString()+"\r\n");
textwriter.Write("Host: " + mri.Host+":"+mri.Port+"\r\n");
textwriter.Write("\r\n");
textwriter.Flush();
Stream reqstream = mstream;
reqstream.Write(data, 0, data.Length);
reqstream.Flush();
StreamReader reader = new StreamReader(mstream);
while (reader.ReadLine().Length > 2)
{
}
return reader.ReadToEnd();
}
public RouterElement Invoke(string[] args)
{
MemoryStream mstream = new MemoryStream();
StreamWriter mwriter = new StreamWriter(mstream);
//TODO: Implement argument list
string arglist = "";
mwriter.Write("" + "" + "");
mwriter.Write("");//" + arglist + "");
mwriter.Write("");
mwriter.Flush();
List<string> headers = new List<string>();
headers.Add("SOAPAction: \"" + parentschema + "#" + MethodName + "\"");
mstream.Position = 0;
byte[] dgram = new byte[mstream.Length];
mstream.Read(dgram, 0, dgram.Length);
XmlDocument mdoc = new XmlDocument();
string txt = MakeRequest(url, dgram, headers.ToArray());
mdoc.LoadXml(txt);
try
{
RouterElement elem = new RouterElement(mdoc.ChildNodes[0], null);
return elem["Body"].children[0];
}
catch (Exception er)
{
RouterElement elem = new RouterElement(mdoc.ChildNodes[1], null);
return elem["Body"].children[0];
}
}
public List<string> parameters = new List<string>();
string baseurl;
string parentschema;
public RouterMethod(string svcurl, RouterElement element,string pname, string baseURL, string svcpdsc)
{
parentschema = svcpdsc;
baseurl = baseURL;
parentname = pname;
url = svcurl;
MethodName = element["name"].Value;
try
{
foreach (RouterElement et in element["argumentList"].children)
{
parameters.Add(et.children[0].Value);
}
}
catch (KeyNotFoundException)
{
}
}
}
public class RouterService
{
string url;
public string ServiceName;
public List<RouterMethod> methods = new List<RouterMethod>();
public RouterMethod GetMethodByNonCaseSensitiveName(string name)
{
foreach (RouterMethod et in methods)
{
if (et.MethodName.ToLower() == name.ToLower())
{
return et;
}
}
throw new KeyNotFoundException();
}
public RouterService(RouterElement element, string baseurl)
{
ServiceName = element["serviceId"].Value;
url = element["controlURL"].Value;
WebClient mclient = new WebClient();
string turtle = element["SCPDURL"].Value;
if (!turtle.ToLower().Contains("http"))
{
turtle = baseurl + turtle;
}
Console.WriteLine("service URL " + turtle);
string axml = mclient.DownloadString(turtle);
XmlDocument mdoc = new XmlDocument();
if (!url.ToLower().Contains("http"))
{
url = baseurl + url;
}
mdoc.LoadXml(axml);
XmlNode mainnode = mdoc.GetElementsByTagName("actionList")[0];
RouterElement actions = new RouterElement(mainnode, null);
foreach (RouterElement et in actions.children)
{
RouterMethod method = new RouterMethod(url, et,ServiceName,baseurl,element["serviceType"].Value);
methods.Add(method);
}
}
}
}
Code for a bandwidth meter:
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Text;
using System.Windows.Forms;
using UPNPLib;
using System.IO;
namespace bandwidthmeter
{
public partial class Form1 : Form
{
public Form1()
{
InitializeComponent();
BinaryReader mreader = new BinaryReader(File.Open("bandwidthlog.txt", FileMode.OpenOrCreate));
if (mreader.BaseStream.Length > 0)
{
prevsent = mreader.ReadInt64();
prevrecv = mreader.ReadInt64();
}
mreader.Close();
List<RouterService> services = new List<RouterService>();
string fullurl = UPNP.GetRootUrl();
RouterElement router = UPNP.enumRouterFunctions(fullurl);
Console.WriteLine("Router feature enumeration complete");
foreach (RouterElement et in router.children)
{
services.Add(new RouterService(et.children[0], fullurl.Substring(0, fullurl.IndexOf("/", "http://".Length+1))));
}
getReceiveDelegate = services[1].GetMethodByNonCaseSensitiveName("GetTotalBytesReceived");
getSentDelegate = services[1].GetMethodByNonCaseSensitiveName("GetTotalBytesSent");
Console.WriteLine("Invoking " + getReceiveDelegate.MethodName);
//Console.WriteLine(services[1].GetMethodByNonCaseSensitiveName("GetTotalPacketsSent").Invoke(null));
Timer mymer = new Timer();
mymer.Tick += new EventHandler(mymer_Tick);
mymer.Interval = 1000;
mymer.Start();
FormClosed += new FormClosedEventHandler(Form1_FormClosed);
}
long prevsent = 0;
long prevrecv = 0;
void Form1_FormClosed(object sender, FormClosedEventArgs e)
{
BinaryWriter mwriter = new BinaryWriter(File.Open("bandwidthlog.txt", FileMode.OpenOrCreate));
mwriter.Write(getsent());
mwriter.Write(getreceived());
mwriter.Flush();
mwriter.Close();
}
long getsent()
{
long retval = Convert.ToInt64(getSentDelegate.Invoke(null).children[0].Value);
if (prevsent > retval)
{
retval = prevsent + retval;
}
return retval;
}
long getreceived()
{
long retval = Convert.ToInt64(getReceiveDelegate.Invoke(null).children[0].Value);
if (prevrecv > retval)
{
retval = prevrecv + retval;
}
return retval;
}
void mymer_Tick(object sender, EventArgs e)
{
label1.Text = "Sent: "+(getsent()/1024/1024).ToString()+"MB\nReceived: "+(getreceived()/1024/1024).ToString()+"MB";
}
RouterMethod getSentDelegate;
RouterMethod getReceiveDelegate;
}
}
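To turn those router counters into the utilization figure the question asks about, you could sample them twice and divide the byte delta by the known capacity of the internet link. A rough sketch, assuming you are somewhere the getSentDelegate/getReceiveDelegate methods above are in scope, and that linkCapacityBps (the real downstream capacity in bits per second) is something you configure yourself:
long linkCapacityBps = 8L * 1000 * 1000; // assumed: an 8 Mbps link; use your real capacity here
long r1 = Convert.ToInt64(getReceiveDelegate.Invoke(null).children[0].Value);
long s1 = Convert.ToInt64(getSentDelegate.Invoke(null).children[0].Value);
System.Threading.Thread.Sleep(1000);
long r2 = Convert.ToInt64(getReceiveDelegate.Invoke(null).children[0].Value);
long s2 = Convert.ToInt64(getSentDelegate.Invoke(null).children[0].Value);
double bitsPerSecond = ((r2 - r1) + (s2 - s1)) * 8.0;
double utilization = bitsPerSecond / linkCapacityBps; // back off while this stays above ~0.5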
Have you considered using the Background Intelligent Transfer Service (BITS)? It's designed to do this job already:
Background Intelligent Transfer Service (BITS) transfers files (downloads or uploads) between a client and server and provides progress information related to the transfers. You can also download files from a peer.
and,
Preserve the responsiveness of other network applications.
I'm not sure if there's a managed interface to it (I can see references to PowerShell cmdlets), so you might have to use COM interop to use it.
Making the assumption that you are targeting Windows PCs (as you said you were developing in C#), have you looked at BITS, the Background Intelligent Transfer Service?
There's examples of how to hook into it using C# on MSDN and elsewhere, e.g. http://msdn.microsoft.com/en-us/magazine/cc188766.aspx
