Intelligent compilers can't prevent infinite loops and SQL Cartesian joins [closed] - c#

Closed 9 years ago. This question needs details or clarity and is not currently accepting answers.
I am frustrated that my programs crash due to the following two problems:
Infinite loops (e.g. in C# or JavaScript)
SQL joins where I forgot to add a join clause
These seem like preventable problems if compilers were competent enough. How can these problems be prevented programmatically?

Modern compilers can and do unroll loops for optimization purposes, but without knowing some of the data ahead of time they can't even form a reliable heuristic for whether your loops will terminate (see: dataflow programming). In fact, deciding whether an arbitrary program will terminate is known as the Halting Problem, and it is provably undecidable in general.
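As a small illustration (the "quit" sentinel is made up for this sketch), whether the following C# loop terminates depends entirely on data that only exists at run time, which is exactly why the compiler cannot decide it for you:

using System;

class TerminationDemo
{
    static void Main()
    {
        // Termination depends on what the user types, which the compiler
        // cannot know; static analysis can only decide special cases.
        string line;
        while ((line = Console.ReadLine()) != null && line != "quit")
        {
            Console.WriteLine("You typed: " + line);
        }
    }
}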
In other cases, you actually want an infinite loop. For example, a graphics engine's main loop usually looks something like this:
while (true)
{
    Render();   // draw the next frame; running forever is the intent
}
As for your SQL joins... it should usually be pretty obvious when you miss one, because the Cartesian product blows up the row count. Note also that when you write JOIN without a qualifier, INNER JOIN is implied, so in that narrow sense the query compiler is already filling in details for you; what it cannot know is which join condition you intended.
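Interestingly, LINQ's query syntax in C# does enforce the join condition: the on ... equals ... clause is mandatory, so the "forgotten join clause" mistake becomes a compile-time error rather than a silent Cartesian product. A minimal sketch with made-up data:

using System;
using System.Linq;

class LinqJoinDemo
{
    static void Main()
    {
        var customers = new[] { new { Id = 1, Name = "Ada" }, new { Id = 2, Name = "Bob" } };
        var orders = new[] { new { CustomerId = 1, Total = 42m }, new { CustomerId = 1, Total = 7m } };

        // Omitting "on c.Id equals o.CustomerId" below is a compile-time error,
        // unlike forgetting a JOIN condition in hand-written SQL.
        var query = from c in customers
                    join o in orders on c.Id equals o.CustomerId
                    select new { c.Name, o.Total };

        foreach (var row in query)
            Console.WriteLine(row.Name + ": " + row.Total);
    }
}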

Related

How would I go about deobfuscating ConfuserEx with the maximum amount of obfuscation? [closed]

Closed 3 years ago. This question needs to be more focused and is not currently accepting answers.
If I had a ConfuserEx-protected application with the maximum amount of obfuscation applied (the max preset), how would I deobfuscate it?
It is orders of magnitude easier to obfuscate something than to de-obfuscate it. This question has a high likelihood of having nefarious intent, and any practical help would just result in future obfuscation being far more aggressive - a scenario that is worse for the performance, debugging, and reliability of the obfuscated software.
About the only theoretical approach I can think of would be something like the Visual Studio / .NET JIT level of code optimisation. Those optimisers are capable of swapping constructs out for totally different, quicker ones - an optimisation for speed; this would be an optimisation towards readability instead. But even that would not be able to recover obfuscated names beyond replacing them with some placeholder name. I do not want to think this through any further.

Can I use unmanaged C++ code to reduce calculation costs in a C# project? [closed]

Closed 6 years ago. This question is opinion-based and is not currently accepting answers.
I want to use a C++ project to do some calculations for a C# project and return the results.
I was wondering whether I would benefit from more efficient calculation speed in C++ if I did so.
Would it still be efficient if I wrapped the native code in C++/CLI?
Are there any examples out there?
As a simple example, say you have two double values A and B in C#. How would you have the C++ project receive A and B plus a string value "plus" or "times", then calculate and return A + B or A * B?
Use Process.Start() to spawn your optimized program. You will be able to pass parameters and even read the output. Start here: https://msdn.microsoft.com/en-us/library/53ezey2s(v=vs.110).aspx
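A minimal sketch of that approach (the executable name, arguments, and output format are placeholders, not part of the original question):

using System;
using System.Diagnostics;

class SpawnDemo
{
    static void Main()
    {
        // Hypothetical native program that prints its result to stdout.
        var psi = new ProcessStartInfo("calc_engine.exe", "3.5 2.0 plus")
        {
            RedirectStandardOutput = true,
            UseShellExecute = false,   // required when redirecting output
            CreateNoWindow = true
        };

        using (var process = Process.Start(psi))
        {
            string result = process.StandardOutput.ReadToEnd();
            process.WaitForExit();
            Console.WriteLine("C++ program returned: " + result);
        }
    }
}

Note that spawning a process per calculation has real overhead; it only pays off when each run does a substantial amount of work.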
You've got two separate issues in your question: "How do I", and "Should I".
If you're having problems with the "How do I", please post a question with the specific code you have, and what problems you're having.
"Should I" is somewhat of a nebulous question: It depends a lot on the type of calculations you're trying to do. These questions often have no one right answer. (Also note that this type of question is often offtopic for Stack Overflow for that very reason, so this question may be closed.)
For some types of calculations, the C++ compiler might produce more efficient code than the .Net Jitter. For some types, it won't make a difference. C++ would also let you do things like using the GPU to perform the calculations.
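If the calculation does end up in native code, another common route besides spawning a process is P/Invoke against a native DLL. A rough sketch, assuming a hypothetical MathNative.dll that exports extern "C" Add and Multiply functions with the Cdecl calling convention:

using System;
using System.Runtime.InteropServices;

static class NativeMath
{
    // Hypothetical native exports; the DLL name and signatures are assumptions.
    [DllImport("MathNative.dll", CallingConvention = CallingConvention.Cdecl)]
    public static extern double Add(double a, double b);

    [DllImport("MathNative.dll", CallingConvention = CallingConvention.Cdecl)]
    public static extern double Multiply(double a, double b);
}

class Program
{
    static double Calculate(double a, double b, string op)
    {
        // Keep the "plus"/"times" dispatch on the C# side so only the numeric
        // work crosses the interop boundary.
        return op == "plus" ? NativeMath.Add(a, b) : NativeMath.Multiply(a, b);
    }

    static void Main()
    {
        Console.WriteLine(Calculate(3.5, 2.0, "plus"));   // 5.5
        Console.WriteLine(Calculate(3.5, 2.0, "times"));  // 7
    }
}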
Also, consider how long it will take you to write this optimized code, and how often you're going to run it. If this needs to run overnight once a month, maybe a couple hours to run is fine.

How to split a big method into several smaller methods in C# [closed]

Closed 7 years ago. This question needs to be more focused and is not currently accepting answers.
How can I optimally split one big, complicated method into several smaller methods in C#? Is there any guidance or built-in functionality for this?
Assuming you are using Visual Studio to edit your C# code, there is built-in functionality for what you are trying to do: Extract Method Refactoring. Note that this is still a manual process - there is no automatic tool which knows how to take your entire method apart.
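For a tiny before/after illustration of the kind of result Extract Method produces (the example and names are invented, not taken from the question):

using System;
using System.Collections.Generic;

class ReportDemo
{
    // Before: one method that validates, computes, and prints.
    static void PrintAverageBefore(List<double> samples)
    {
        if (samples == null || samples.Count == 0)
            throw new ArgumentException("No samples supplied");

        double sum = 0;
        foreach (double s in samples) sum += s;

        Console.WriteLine("Average: " + (sum / samples.Count));
    }

    // After: three small methods, each doing one thing, reading top-down.
    static void PrintAverage(List<double> samples)
    {
        Validate(samples);
        double average = CalculateAverage(samples);
        Report(average);
    }

    static void Validate(List<double> samples)
    {
        if (samples == null || samples.Count == 0)
            throw new ArgumentException("No samples supplied");
    }

    static double CalculateAverage(List<double> samples)
    {
        double sum = 0;
        foreach (double s in samples) sum += s;
        return sum / samples.Count;
    }

    static void Report(double average)
    {
        Console.WriteLine("Average: " + average);
    }
}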
What to keep in mind while refactoring? Taken from Robert C. Martin's Clean Code:
The first rule of functions is that they should be small. The second rule of functions is that they should be smaller than that.
Functions should do one thing. They should do it well. They should do it only.
We want the code to read like a top-down narrative.
Use descriptive names.
The ideal number of arguments for a function is zero (niladic). Next comes one (monadic), followed closely by two (dyadic). Three arguments (triadic) should be avoided where possible. More than three (polyadic) requires very special justification.

When to ditch LINQ? [closed]

Closed 8 years ago. This question needs to be more focused and is not currently accepting answers.
LINQ just seems to generate horrendously optimised SQL (in general). I can write way more efficient T-SQL myself.
Do huge websites with many thousands of daily visitors use LINQ? Or should LINQ be ditched at some point? And if so, with what, and when?
Do huge websites with many thousands of daily visitors use LINQ?
Yes. Why not? Most SQL is trivial, and you can always fall back to a stored procedure where it matters.
And that is the point: let LINQ handle the easy 80% and focus your effort on the more complex remainder.
The rest you handle by not talking to the database at all - caching is not that hard to plan for.
What I have done in my apps, when L2S or EF creates inefficient T-SQL, is to create a view that does exactly what I want, in an efficient manner, and then have L2S or EF query the view.
In fact, this website (Stack Overflow) uses Linq2Sql extensively, although I believe they are using a micro-ORM (sqlfu?) for some of the heavier tasks.
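A sketch of the view-mapping pattern above with Entity Framework Code First (the view name and columns here are hypothetical; the view must expose a column EF can use as a key):

using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;
using System.Data.Entity;
using System.Linq;

// Map an entity to a database view instead of a table; EF queries it like a
// read-only table, while the hand-tuned SQL lives inside the view definition.
[Table("vw_OrderSummary")]
public class OrderSummary
{
    [Key]
    public int OrderId { get; set; }
    public string CustomerName { get; set; }
    public decimal Total { get; set; }
}

public class ShopContext : DbContext
{
    public DbSet<OrderSummary> OrderSummaries { get; set; }
}

public static class ReportQueries
{
    public static IQueryable<OrderSummary> BigOrders(ShopContext db)
    {
        // LINQ composes over the view exactly as it would over a table.
        return db.OrderSummaries.Where(o => o.Total > 1000m);
    }
}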

Multithreading advice on approach needed [closed]

Closed 8 years ago. This question needs details or clarity and is not currently accepting answers.
I have been told to build a process that inserts data for clients using multithreading.
We need to update a client database in a short period of time. There is an application that does the job, but it is single-threaded, and we need to make it multithreaded.
The idea is to insert the data in batches using the existing application, e.g.:
Process 50,000 records
Assign 5,000 records to each thread
The idea is to fire 10-20 threads, and even multiple instances of the same application, to do the job.
Any ideas, suggestions, or examples of how to approach this?
It's .NET 2.0, unfortunately.
Are there any good examples you have come across of how to do it, e.g. with ThreadPool?
I am reading up on multithreading in the meantime.
I'll bet dollars to donuts the problem is that the existing code just uses an absurdly inefficient algorithm. Making it multi-threaded won't help unless you fix the algorithm too. And if you fix the algorithm, it likely will not need to be multi-threaded. This doesn't sound like the type of problem that typically benefits from multi-threading itself.
The only possible scenario I could see where this matters is if latency to the database is an issue. But if it's on the same LAN or in the same datacenter, that won't be an issue.
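If the batching approach is pursued anyway, here is a rough .NET 2.0-compatible sketch using ThreadPool; the batch size and the InsertBatch body are placeholders for the existing application's insert logic:

using System;
using System.Collections.Generic;
using System.Threading;

class BatchInserter
{
    const int BatchSize = 5000;

    static void Main()
    {
        // Placeholder data; in reality this would be the 50,000 client records.
        List<int> records = new List<int>();
        for (int i = 0; i < 50000; i++) records.Add(i);

        int pendingBatches = (records.Count + BatchSize - 1) / BatchSize;
        ManualResetEvent allDone = new ManualResetEvent(false);

        for (int start = 0; start < records.Count; start += BatchSize)
        {
            List<int> batch = records.GetRange(start, Math.Min(BatchSize, records.Count - start));

            ThreadPool.QueueUserWorkItem(delegate(object state)
            {
                InsertBatch((List<int>)state);

                // Signal the main thread once the last batch completes.
                if (Interlocked.Decrement(ref pendingBatches) == 0)
                    allDone.Set();
            }, batch);
        }

        allDone.WaitOne();
        Console.WriteLine("All batches inserted.");
    }

    static void InsertBatch(List<int> batch)
    {
        // Placeholder: call the existing single-threaded insert logic here.
        Console.WriteLine("Inserted " + batch.Count + " records on thread " +
                          Thread.CurrentThread.ManagedThreadId);
    }
}

Keep in mind that if the bottleneck is the database itself, adding threads on the client will not speed things up, which is exactly the caveat above.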
