I made a typo in my password back in the day when 7-Zip didn't have a 'confirm password' field, so now I have a password-protected 7z file I can't open. I've written some software to generate the most likely typo variations of my password (55 million of them) and stored them in files of 25,000 each. Now I'm trying them out one by one. I can test about 25,000 passwords per hour using the unar command-line tool on a MacBook.
It works, but it will still take a good 100 days (running 24/7) to get through all 55 million passwords. So I'd like to know: is there a library (C#, Mono/.NET) that supports decoding a password-protected 7z file?
Any other suggestions on fixing my problem are also welcome.
To speed up the brute force, look into using CUDA or OpenCL.
These will let you utilise the GPU of the host machine to perform your processing, and will produce results much faster.
25K passwords per hour is quite low: when cracking hashes, for example, a good GPU-based tool can reach around 9,500 million passwords a minute on a mid-to-high-end GPU.
Whilst hitting that figure is unlikely when trying to break 7z, you could definitely see a speed increase.
Also, the better the machine, the better the result. In many cases a Linux box is your best bet, and if you can use a cluster of computers, all the better.
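Even without a GPU, spreading the candidates across CPU cores can claw back some of that time. What follows is a minimal sketch, not the asker's actual setup: it assumes the 7z command-line tool (p7zip on a Mac) is on the PATH and that the candidate passwords sit in plain-text files, one per line, then runs "7z t -p<password>" for each candidate in parallel and stops as soon as one succeeds.

// Hypothetical sketch: test candidate passwords against a 7z archive by
// invoking the 7z CLI ("7z t" exits with code 0 when the password is correct).
using System;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

class SevenZipPasswordSearch
{
    static string found;

    static bool TryPassword(string archive, string password)
    {
        var psi = new ProcessStartInfo
        {
            FileName = "7z",
            // "t" tests the archive; a wrong password makes it exit non-zero.
            // Passwords containing spaces or quotes would need extra escaping.
            Arguments = $"t -p\"{password}\" \"{archive}\"",
            RedirectStandardOutput = true,
            RedirectStandardError = true,
            UseShellExecute = false
        };
        using (var p = Process.Start(psi))
        {
            p.StandardOutput.ReadToEnd();   // drain so the process can't block on a full pipe
            p.StandardError.ReadToEnd();
            p.WaitForExit();
            return p.ExitCode == 0;
        }
    }

    static void Main(string[] args)
    {
        string archive = args[0];        // path to the .7z file
        string candidateDir = args[1];   // directory holding the 25k-password files
        var candidates = Directory.EnumerateFiles(candidateDir)
                                  .SelectMany(File.ReadLines);

        var cts = new CancellationTokenSource();
        try
        {
            Parallel.ForEach(candidates,
                new ParallelOptions { CancellationToken = cts.Token },
                pwd =>
                {
                    if (TryPassword(archive, pwd))
                    {
                        Interlocked.CompareExchange(ref found, pwd, null);
                        cts.Cancel();    // stop the remaining workers
                    }
                });
        }
        catch (OperationCanceledException)
        {
            // Expected once a match has been found.
        }

        Console.WriteLine(found != null ? $"Password: {found}" : "No match.");
    }
}

Spawning a process per attempt carries real overhead, so a dedicated 7z binding or a GPU cracker will still beat this, but on a multi-core machine it should comfortably multiply the 25k-per-hour figure.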
I have a 500+ GB text file. It has to be searched for duplicates, the duplicates removed, and the result sorted and saved to a final file. Of course, for such a big file LINQ and the like are no good and won't work, so external sorting has to be used. There is an app called "Send-Safe List Manager" whose speed is super fast: for a 200 MB txt file it gives the result in less than 10 seconds. After examining the exe with the "Greatis WinDowse" app I found that it was written in Delphi. There are some external sorting classes written in C#; I have tested a 200 MB file with them and all took over a minute. So my question is: for this kind of work, is Delphi faster than C#? If I have to write my own, should I use Delphi, and can I reach that speed with C# at all?
Properly written sorting code for a large file must be disk-bound; at that point there is essentially no difference which language you use.
Delphi generates native code and also allows for inline assembly, so in theory, maximum speed for a specific algorithm could be easier to reach in Delphi.
However, the performance of what you describe will be tied to I/O performance, and the difference between the possible algorithms will be several orders of magnitude larger than any Delphi vs. .NET difference.
The language is probably the last thing you should look at if trying to speed that up.
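To make that concrete, here is a rough sketch of the classic external sort with de-duplication in C#: sort fixed-size chunks in memory, spill them to temporary files, then k-way merge while skipping duplicates. The chunk size is a placeholder and PriorityQueue requires .NET 6 or later; almost all of the time goes into sequential reads and writes, regardless of language.

// Rough external sort + dedup sketch: sort fixed-size chunks in memory,
// spill them to temp files, then merge the sorted runs while dropping duplicates.
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

class ExternalSortDedup
{
    const int ChunkLines = 1_000_000;   // placeholder; tune to available RAM

    static List<string> SplitIntoSortedRuns(string inputPath)
    {
        var runs = new List<string>();
        var buffer = new List<string>(ChunkLines);
        foreach (var line in File.ReadLines(inputPath))
        {
            buffer.Add(line);
            if (buffer.Count == ChunkLines)
            {
                runs.Add(WriteRun(buffer));
                buffer.Clear();
            }
        }
        if (buffer.Count > 0) runs.Add(WriteRun(buffer));
        return runs;
    }

    static string WriteRun(List<string> lines)
    {
        lines.Sort(StringComparer.Ordinal);
        var path = Path.GetTempFileName();
        File.WriteAllLines(path, lines);
        return path;
    }

    static void MergeRuns(List<string> runs, string outputPath)
    {
        var readers = runs.Select(r => new StreamReader(r)).ToArray();
        // .NET 6+ priority queue: element = run index, priority = that run's current line.
        var queue = new PriorityQueue<int, string>(StringComparer.Ordinal);
        for (int i = 0; i < readers.Length; i++)
        {
            var first = readers[i].ReadLine();
            if (first != null) queue.Enqueue(i, first);
        }

        using var writer = new StreamWriter(outputPath);
        string previous = null;
        while (queue.TryDequeue(out int idx, out string line))
        {
            if (line != previous)            // output is sorted, so duplicates are adjacent
            {
                writer.WriteLine(line);
                previous = line;
            }
            var next = readers[idx].ReadLine();
            if (next != null) queue.Enqueue(idx, next);
        }

        foreach (var r in readers) r.Dispose();
        foreach (var r in runs) File.Delete(r);
    }

    static void Main(string[] args)
    {
        var runs = SplitIntoSortedRuns(args[0]);   // input file
        MergeRuns(runs, args[1]);                  // deduplicated, sorted output
    }
}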
File compression is a very important matter nowadays. I am a C# programmer. I have heard about DotNetZip, SharpZipLib, GZipStream, etc., and I'm confused about which one is the best. What I need is:
1: Faster compression speed
2: Better encryption
3: Both zip, unzip facility
4: Password protection
5: Free and enough reference/tutorial
6: Easier for beginner
7: Ability to add files of various formats (.mp3, .txt, etc.)
8: Can work with many files (say 500 at a time, about 0.5 GB total)
Thanks in advance
Update: Time is more expensive than space :)
I have used DotNetZip in some projects and it supports all the things you want. As for speed, you have to choose between speed and compression ratio; that's just how zip works. The library provides an option for you to choose what compression level you want.
The library is well documented and really easy for beginners. Getting started is just a matter of installing one NuGet package and calling zipFile.AddFile and zipFile.Save.
http://nuget.org/packages/DotNetZip
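For illustration, here is a small sketch of what that might look like; the file names, password, and the AES-256 option are placeholders chosen for this example, not anything mandated by the library.

// Minimal DotNetZip sketch: create a password-protected zip and extract it again.
using Ionic.Zip;

class ZipExample
{
    static void Main()
    {
        // Compress: add a couple of files with AES encryption and a password.
        using (var zip = new ZipFile())
        {
            zip.Password = "my-secret";                       // placeholder password
            zip.Encryption = EncryptionAlgorithm.WinZipAes256;
            zip.CompressionLevel = Ionic.Zlib.CompressionLevel.BestSpeed; // favour time over space
            zip.AddFile("song.mp3");                          // placeholder file names
            zip.AddFile("notes.txt");
            zip.Save("archive.zip");
        }

        // Decompress: supply the same password before extracting.
        using (var zip = ZipFile.Read("archive.zip"))
        {
            zip.Password = "my-secret";
            zip.ExtractAll("extracted", ExtractExistingFileAction.OverwriteSilently);
        }
    }
}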
They have different uses. The GZipStream class is part of the framework, and has the capability of compressing a single file.
If you want to compress several files into a single archive (that can be uncompressed by someone else), you need a library that adds that capability. Such libraries can also have settings to control the balance between performance and compression rate, to achieve better performance or a better compression rate than the built-in class.
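By way of contrast, here is a short sketch of the built-in GZipStream compressing a single file (the paths are placeholders); note that it produces a .gz stream, not a multi-file, password-protected archive.

// Sketch: compress one file with the framework's GZipStream (no archive, no password).
using System.IO;
using System.IO.Compression;

class GzipExample
{
    static void Main()
    {
        using (var input = File.OpenRead("notes.txt"))           // placeholder path
        using (var output = File.Create("notes.txt.gz"))
        using (var gzip = new GZipStream(output, CompressionMode.Compress))
        {
            input.CopyTo(gzip);   // stream the file through the compressor
        }
    }
}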
A few months ago, we started developing an app to control in-house developed test equipment and record a set of measurements. It should have a simple UI, and would likely require threads due to the continuous recording that must take place. This application will be used for a few years, and will be maintained by a number of computer science students during this period.
Our boss graduated some 30 years ago (not to be taken as an offense; I have more than half that time on my back too) and has mandated that we develop this application in ANSI C. The rationale is that he is the only one that will be around the entire time, and therefore he must be able to understand what we are doing. He also ruled that we should use no abstract data types; he even gave us a list with the name of the global variables (sigh) he wants us to use.
I actually tried that approach for a while, but it was really slowing me down to make sure that all pointer operations were safe and that all strings had the correct size. Additionally, the number of lines of code that actually related to the problem at hand was only a small fraction of our code base. After a few days, I scrapped the entire thing and started anew in C#. Our boss has already seen the program running and he likes the way it works, but he doesn't know that it's written in another language.
Next week the two of us will meet to go over the source code, so that he "will know how to maintain it". I am sort of scared, and I would like to hear from you guys what arguments I could use to support my decision.
Honestly, it could blow up in your face, but if your boss isn't a complete twit he will see how much more productive you've been and be able to move forward. C# does use a different paradigm from ANSI C, what with the OOP and all, but the syntax is similar enough that he should be able to figure it out. If not, leave lots of good comments in the code, and maybe be nice enough to produce some actual technical documentation he can read through.
Let's say that we ignore the target and source hardware for a moment. So, which is the better endianness to go with: big or little?
I'm just trying to go with consensus / convention on this one. The best guidance I've received so far is "it depends" so always specify. That's fine. I'll do that.
However, in this situation there is no need to be one way or the other. There's no legacy, so I thought, "what would be the cleanest choice for current and emerging hardware?"
Use whatever is predominant in your hardware. Or use "network byte order" (big endian) because the internet does. Or pick one at random. It's unimportant.
Don't choose. Just use whatever your compiler/platform uses. That gives no hassle and just works.
If you are doing raw network stuff, you may want to convert things to/from network endianness though, which is big endian. But don't mess up your whole code because of that. Just do the conversion when you get to the network writing part.
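A tiny sketch of what that boundary conversion can look like in C# (the value is arbitrary):

// Sketch: convert an int to network byte order (big endian) only at the I/O boundary.
using System;
using System.Net;

class EndianExample
{
    static void Main()
    {
        int hostValue = 0x12345678;                       // arbitrary example value

        // Host -> network order just before writing to the wire...
        int networkValue = IPAddress.HostToNetworkOrder(hostValue);

        // ...and network -> host order right after reading.
        int roundTripped = IPAddress.NetworkToHostOrder(networkValue);

        Console.WriteLine(BitConverter.IsLittleEndian
            ? $"Little-endian host: bytes were swapped ({roundTripped:X8})"
            : $"Big-endian host: conversion was a no-op ({roundTripped:X8})");
    }
}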
Actually, the answer is: it depends.
If you just want something to base the choice on: since in big endian the high-order byte comes first, you can always tell whether a value is positive or negative from its first byte.
It doesn't matter. Just pick one.
This is a topic of endless debate. One does not hold a particular advantage over the other.
Should I use .NET Remoting or something else?
I want a simple approach; I think something that avoids sockets would be easier to deploy...
Anyway, please help. Thanks.
You will want to begin with named pipes.
Since you are dealing with C#, have a look at:
http://msdn.microsoft.com/en-us/library/aa365590(v=vs.85).aspx
Essentially, this got me going instantly when I was looking into it:
http://msdn.microsoft.com/en-us/library/bb546085.aspx#Y1920
Good luck.
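To give a feel for the API, here is a bare-bones named-pipe sketch using System.IO.Pipes; the pipe name and message are placeholders, and in practice the server and client would live in separate processes.

// Bare-bones named pipe sketch: one server, one client, a single text message.
using System;
using System.IO;
using System.IO.Pipes;
using System.Threading.Tasks;

class NamedPipeExample
{
    static void RunServer()
    {
        using var server = new NamedPipeServerStream("demo-pipe", PipeDirection.In);  // placeholder name
        server.WaitForConnection();
        using var reader = new StreamReader(server);
        Console.WriteLine("Server received: " + reader.ReadLine());
    }

    static void RunClient()
    {
        using var client = new NamedPipeClientStream(".", "demo-pipe", PipeDirection.Out);
        client.Connect();
        using var writer = new StreamWriter(client) { AutoFlush = true };
        writer.WriteLine("hello from the client");                                    // placeholder message
    }

    static void Main()
    {
        // Run both ends in one process purely for demonstration.
        var serverTask = Task.Run(RunServer);
        RunClient();
        serverTask.Wait();
    }
}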
It depends on what you want to do. If you just want to send notifications between two processes running on the same computer, named events work just fine. If you want to send long messages, then there are named pipes, sockets, and WCF (which replaces .NET Remoting). You might also want to share memory with memory mapped files. There are several other possibilities.
The method you use depends in large part on how much data you want to communicate, how fast you need it to be, and how much time you want to spend futzing with it.