Passing array into query string - c#

I need to send an array as a query parameter. I do it like this:
StringBuilder Ids = new StringBuilder();
for (int i = 0; i < items.Count; i++)
{
    Ids.Append(String.Format("&id[{0}]={1}", i, items[i].ID));
}
ifrDocuments.Attributes.Add("src", "Download.aspx?arrayCount=" + items.Count + Ids);
After this I have the string:
Download.aspx?arrayCount=8&id[0]=106066&id[1]=106065&id[2]=106007&id[3]=105284&id[4]=105283&id[5]=105235&id[6]=105070&id[7]=103671
It can contain 100 elements, and in that case I'm getting an error (screenshot of the error message omitted).
Maybe I can do it in another way, not by sending it inside the query string?

There is a limit on URL length at multiple levels (browsers, proxy servers, etc.). You can raise maxQueryString (*1), but I would not recommend it if you expect real users to use your system.
It looks like Download.aspx is your page. Put all those ids in temporary storage (cache or database) and pass only the key to that new entity in the request, as sketched below.
*1: https://blog.elmah.io/fix-max-url-and-query-string-length-with-web-config-and-iis/
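For illustration, a minimal sketch of that approach using the ASP.NET cache (the key format, the ten-minute lifetime, and the way Download.aspx reads the ids back are assumptions):
// Requires System.Linq and System.Web.Caching.
// Store the ids server-side and pass only a short key in the URL.
string key = Guid.NewGuid().ToString("N");
Cache.Insert(key, items.Select(it => it.ID).ToList(), null,
    DateTime.UtcNow.AddMinutes(10), System.Web.Caching.Cache.NoSlidingExpiration);
ifrDocuments.Attributes.Add("src", "Download.aspx?key=" + key);

// In Download.aspx, read the ids back out of the cache:
var ids = (List<int>)Cache[Request.QueryString["key"]];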

The query string is not the way to pass an array, because of those limits.
If you control the endpoint, you should consider sending your array in a POST body instead; a sketch follows.
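A hedged sketch of what the client side could look like with HttpClient and JSON (the endpoint URL is an assumption, and JsonConvert comes from Json.NET):
// Requires System.Net.Http and Newtonsoft.Json, inside an async method.
var ids = items.Select(it => it.ID).ToArray();
using (var client = new HttpClient())
{
    var body = new StringContent(
        JsonConvert.SerializeObject(ids), Encoding.UTF8, "application/json");
    // The POST body carries the ids, so no URL length limit applies.
    HttpResponseMessage response = await client.PostAsync(
        "https://example.com/api/download", body);
    response.EnsureSuccessStatusCode();
}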

Related

How to create a smallest hash with less character in a url

I have the following Hash.EncodeMD5(p) that takes a value p and encodes it, and eventually it gets passed into a URL like this:
www.mysite/test/test.aspx?perm=?|1098951-c2bcc0d267304a3d7d663007dbf801bc|1011796-3af44ad8442000232390799c367a06ed|
My problem is that my URL can get very long. How can I reduce its length? I believe the issue is Hash.EncodeMD5(p):
string[] x = (Request.Form["ID"]).Split(',');
foreach (string pp in x)
{
    perm += pp + "-" + Hash.EncodeMD5(pp);
}
Try one of these without shortening the hash:
Store the very long hash value in a cookie that you read back later server-side (see the sketch after this list).
Hide the values in some hidden Asp.Net/HTML controls.
Hide the values (client-side) via window.localStorage. This solution might not work everywhere, but it's worth a try.
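A minimal sketch of the cookie option (the cookie name perm is an assumption, and note that a cookie itself is limited to roughly 4 KB):
// Writing: keep the long value in a cookie instead of the URL.
var cookie = new HttpCookie("perm", HttpUtility.UrlEncode(perm));
cookie.Expires = DateTime.UtcNow.AddHours(1);
Response.Cookies.Add(cookie);

// Reading it back server-side on a later request:
HttpCookie c = Request.Cookies["perm"];
string permValue = (c != null) ? HttpUtility.UrlDecode(c.Value) : null;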

ASP.Net MVC - generate new HttpContext.Session.SessionID after request

I have created an application using ASP.NET MVC. On user login I save the value of the session id using
Session["SessionId"] = HttpContext.Session.SessionID;
During the session, when the user creates a new request on a page called "NewQuotee", I would like to generate a new SessionId and assign the new value to Session["SessionId"]. How can I do that?
So after further reading, there is no direct way to ask the client browser to regenerate a new session id; even Session.Clear(), Session.RemoveAll() or Session.Abandon() will not serve my needs (you can read about the differences in this article: Link; it was a good read). At the end of the day I was only using the session id as a generic string to identify database records for an update statement at the end of a page request, so I created a function that generates a random 20-character string on each call and assigns the value to a global variable, which I can then use in my insert/update statements. The method I created to generate a random string is as follows:
public static string GenerateSessionID()
{
    char[] characters = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz".ToCharArray();
    Random r = new Random();
    System.Threading.Thread.Sleep(10);
    string randomstring = "";
    for (int i = 0; i < 20; i++)
    {
        // characters.Length is 62, so every character in the array can be picked
        randomstring += characters[r.Next(0, characters.Length)].ToString();
    }
    return randomstring;
}
To help understand the code:
char[] characters = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz".ToCharArray();
this line of code creates an array of chars containing all digits and all letters in lower and upper case; you can put in any characters you would like
Random r = new Random();
here you create an instance "r" of the Random class
string randomstring = "";
just creating a string to hold the value of the generated key
for (int i = 0; i < 20; i++)
you can define the length of the random key by increasing or decreasing the max value of "i"
r.Next(0, characters.Length)
the method r.Next(min, max) returns a random number that is at least min and strictly less than max; you can also call r.Next() with no arguments to get any non-negative random number
I defined the min and max values because I use the generated number as an index into the characters array (characters.Length is 62, so the valid indexes are 0 through 61)
randomstring += characters[r.Next(0, characters.Length)].ToString();
I guess it is now self-explanatory: every time through the loop, the char at that random index in the characters array is concatenated to randomstring
System.Threading.Thread.Sleep(10);
now the reason I used Thread.Sleep for 10 milliseconds is that Random is seeded from the system clock, so creating and using instances in quick succession without any break can produce the same value. I kept decreasing the millisecond value until I reached 2, and it was hit and miss; sometimes it generated a new value, sometimes not, so just to be on the safe side I set the value to 10
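As an aside (not part of the original answer), a single shared Random instance sidesteps the re-seeding problem entirely and makes the Sleep unnecessary:
// A process-wide instance is seeded once, so rapid calls keep advancing the same sequence.
private static readonly Random SharedRandom = new Random();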
now I call the function when I need it, place the value in a string, and use it
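For example, a minimal usage sketch (the variable names are illustrative):
// Generate a fresh identifier for this request and reuse it in later statements.
string requestKey = GenerateSessionID();
Session["SessionId"] = requestKey; // tags database records for this request's updates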

Why isn't this method returning anything?

This is for my internship so I can't give much more context, but this method isn't returning the desired int and is causing an Index Out of Bounds exception instead.
The string[] taken into the method is composed of information from a handheld scanner used by my company's shipping department. The resulting dataAsByteArray is really a byte[][], so the .Length in the nested if statement gets the number of bytes of a bundle entry and adds it to fullBundlePacketSize as long as the resulting sum is less than 1000.
Why less than 1000? The bug I've been tasked with fixing is that some scanners (with older versions of Bluetooth) will only transmit about 1000 bytes of data to the receiver at a time. This method is meant to find how many bytes can be transmitted without cutting into a bundle entry (originally I had it hard-coded to just transmit 1000 bytes at a time, and that caused the receiver to get invalid bundle data).
The scanners are running a really old version of Windows CE and trying to debug in VS (2008) just opens an emulator for the device which doesn't help.
I'm sure it's something really simple, but I feel like a new set of eyes looking at it would help, so any help or solutions are greatly appreciated!
private int MaxPacketSizeForFullBundle(string[] data)
{
    int fullBundlePacketSize = 0;
    var dataAsByteArray = data.Select(s => Encoding.ASCII.GetBytes(s)).ToArray();
    for (int i = 0; i < dataAsByteArray.Length; i++)
    {
        if ((fullBundlePacketSize + dataAsByteArray[i + 1].Length < 1000))
        {
            fullBundlePacketSize += dataAsByteArray[i].Length;
        }
    }
    return fullBundlePacketSize;
}
Take a look at your loop:
for (int i = 0; i < dataAsByteArray.Length; i++)
{
    if ((fullBundlePacketSize + dataAsByteArray[i + 1].Length < 1000))
                                                ^^^^^
I suspect you are throwing an exception because you are indexing an array beyond its length.
Do you mean this?
for (int i = 0; i < (dataAsByteArray.Length - 1); i++)
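For illustration, a hedged sketch of one corrected version, assuming the intent is to sum whole entries until adding the next one would cross the ~1000-byte limit (this checks entry i itself rather than i + 1):
private int MaxPacketSizeForFullBundle(string[] data)
{
    int fullBundlePacketSize = 0;
    var dataAsByteArray = data.Select(s => Encoding.ASCII.GetBytes(s)).ToArray();
    for (int i = 0; i < dataAsByteArray.Length; i++)
    {
        // Stop before adding an entry that would push the packet past the limit.
        if (fullBundlePacketSize + dataAsByteArray[i].Length >= 1000)
            break;
        fullBundlePacketSize += dataAsByteArray[i].Length;
    }
    return fullBundlePacketSize;
}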
This is in conjunction with n8wrl's answer: your problem was an index-out-of-range exception. I believe you were attempting to access the actual byte values inside of the values in dataAsByteArray.
string[] data = {"444", "abc445x"};
int x = MaxPacketSizeForFullBundle(data);
dataAsByteArray = {byte[2][]}
dataAsByteArray[0] contains {byte[3]}, which holds the actual value of each character in the string, so for "444" it would contain 52, 52, 52. You are probably attempting to access these individual values, which means you need to access the deeper nested bytes with byte[i][j], where 0 <= j < data[i].Length.

Programmatically reference ascending variable names (var1, var2, ...)

I'm currently coding a project that can take up to 200 entries of a specific product, as determined by user input. Basically, my GUI loads, and I use jQuery to dynamically build the entries whenever there is a change to the amount field. When using jQuery, I simply give each of them an id in the form variable1, variable2, ..., variableX (where X is the number of entries indicated). A small snippet of code to clarify:
for (var i = 1; i <= amount_selected; i++) {
    $('table_name tr:last').after('<tr><td><input type="text" id="variable' + i + '"></td></tr>');
}
Now, moving to the back end, I'm trying to reference these variable names by putting them in a list. I went ahead and put them in a List of HtmlInputText, to call the variable names from the list itself. (This would save having to handle all (up to 200) controls manually, which is really not an option.)
So what I did (in C#) was:
List<HtmlInputText> listvar = new List<HtmlInputText>();
for (int i = 1; i <= amount_selected; i++) {
    string j = "variable" + Convert.ToString(i);
    HtmlInputText x = j;
    listvar.Add((x));
    samplemethod(listvar[i]);
}
But it's not working at all. Does anyone have any ideas as to how this would be done, without doing so manually? I know my logic might be completely off, but hopefully this illustrates at least what I'm attempting to do.
I'm assuming these inputs are in a form? If you're submitting then you can access the text boxes from the Request object:
List<string> results = new List<string>();
for (int i = 1; i <= amount_selected; i++)
{
    string s = String.Format("{0}", Request.Form["variable" + Convert.ToString(i)]);
    results.Add(s);
}
You could do $("#variable" + Convert.ToString(i)).val() on the client side.

SQL huge selection of IDs - How to make it faster?

I have an array with a huge number of IDs that I would like to select from the DB.
The usual approach would be to do select blabla from xxx where yyy IN (ids) OPTION (RECOMPILE).
(The OPTION (RECOMPILE) is needed, because SQL Server is not intelligent enough to see that keeping this query's plan in its query cache is a huge waste of memory.)
However, SQL Server is horrible at this type of query when the number of IDs is high; the parser that it uses is simply too slow.
Let me give an example:
SELECT * FROM table WHERE id IN (288525, 288528, 288529,<about 5000 ids>, 403043, 403044) OPTION (RECOMPILE)
Time to execute: ~1100 msec (This returns appx 200 rows in my example)
Versus:
SELECT * FROM table WHERE id BETWEEN 288525 AND 403044 OPTION (RECOMPILE)
Time to execute: ~80 msec (This returns appx 50000 rows in my example)
So even though I get 250 times more data back, it executes 14 times faster...
So I built this function to take my list of ids and build something that is a reasonable compromise between the two (something that doesn't return 250 times as much data, yet still gives the benefit of parsing the query faster):
private const int MAX_NUMBER_OF_EXTRA_OBJECTS_TO_FETCH = 5;

public static string MassIdSelectionStringBuilder(
    List<int> keys, ref int startindex, string colname)
{
    const int maxlength = 63000;
    if (keys.Count - startindex == 1)
    {
        string idstring = String.Format("{0} = {1}", colname, keys[startindex]);
        startindex++;
        return idstring;
    }
    StringBuilder sb = new StringBuilder(maxlength + 1000);
    List<int> individualkeys = new List<int>(256);
    int min = keys[startindex++];
    int max = min;
    sb.Append("(");
    const string betweenAnd = "{0} BETWEEN {1} AND {2}\n";
    for (; startindex < keys.Count && sb.Length + individualkeys.Count * 8 < maxlength; startindex++)
    {
        int key = keys[startindex];
        if (key > max + MAX_NUMBER_OF_EXTRA_OBJECTS_TO_FETCH)
        {
            // Gap too large: flush the current run as a single value or a range.
            if (min == max)
                individualkeys.Add(min);
            else
            {
                if (sb.Length > 2)
                    sb.Append(" OR ");
                sb.AppendFormat(betweenAnd, colname, min, max);
            }
            min = max = key;
        }
        else
        {
            max = key;
        }
    }
    // Flush the final run.
    if (min == max)
        individualkeys.Add(min);
    else
    {
        if (sb.Length > 2)
            sb.Append(" OR ");
        sb.AppendFormat(betweenAnd, colname, min, max);
    }
    if (individualkeys.Count > 0)
    {
        if (sb.Length > 2)
            sb.Append(" OR ");
        string[] individualkeysstr = new string[individualkeys.Count];
        for (int i = 0; i < individualkeys.Count; i++)
            individualkeysstr[i] = individualkeys[i].ToString();
        sb.AppendFormat("{0} IN ({1})", colname, String.Join(",", individualkeysstr));
    }
    sb.Append(")");
    return sb.ToString();
}
It is then used like this:
List<int> keys; // sorted and made unique
...
for (int i = 0; i < keys.Count;)
{
    string idstring = MassIdSelectionStringBuilder(keys, ref i, "id");
    string sqlstring = string.Format("SELECT * FROM table WHERE {0} OPTION (RECOMPILE)", idstring);
    // ...execute sqlstring; the call above advances i via ref...
}
However, my question is...
Does anyone know of a better/faster/smarter way to do this?
In my experience the fastest way was to pack the numbers in binary format into an image parameter. I was sending up to 100K IDs, which worked just fine:
Mimicking a table variable parameter with an image
Yet that was a while ago. The following articles by Erland Sommarskog are up to date:
Arrays and Lists in SQL Server
If the list of ids were in another table that was indexed, this would execute a whole lot faster using a simple INNER JOIN.
If that isn't possible, then try creating a TABLE variable like so:
DECLARE @tTable TABLE
(
    Id int
)
Store the ids in the table variable first, then INNER JOIN to your table xxx. I have had limited success with this method, but it's worth a try; a C# sketch of the idea follows.
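For illustration, a hedged C# sketch of the same idea using a temp table instead of a table variable (a table variable can't be filled directly from the client); the connection string, the table name xxx, and the id column are assumptions:
// Requires System.Data and System.Data.SqlClient.
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (var create = new SqlCommand("CREATE TABLE #Ids (Id int PRIMARY KEY)", conn))
        create.ExecuteNonQuery();

    // Bulk-load the ids in one round trip instead of one INSERT per value.
    var table = new DataTable();
    table.Columns.Add("Id", typeof(int));
    foreach (int id in keys)
        table.Rows.Add(id);
    using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = "#Ids" })
        bulk.WriteToServer(table);

    using (var select = new SqlCommand(
        "SELECT t.* FROM xxx t INNER JOIN #Ids i ON i.Id = t.id", conn))
    using (var reader = select.ExecuteReader())
    {
        // ...consume rows...
    }
}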
You're using (key > max + MAX_NUMBER_OF_EXTRA_OBJECTS_TO_FETCH) as the check to determine whether to do a range fetch instead of an individual fetch. It appears that's not the best way to decide.
Let's consider the four ID sequences {2, 7}, {2, 8}, {1, 2, 7}, and {1, 2, 8}.
They translate into
ID BETWEEN 2 AND 7
ID IN (2, 8)
ID BETWEEN 1 AND 7
ID BETWEEN 1 AND 2 OR ID IN (8)
The decision to fetch and filter the IDs 3-6 now depends only on the difference between 2 and 7/8. However, it does not take into account whether 2 is already part of a range or an individual ID.
I think the proper criterion is how many individual IDs you save. Converting two individual IDs into a range has a net benefit of 2 * Cost(individual) - Cost(range), whereas extending a range has a net benefit of Cost(individual) - Cost(range extension).
Adding RECOMPILE is not a good idea. Precompiling means SQL Server does not save your query results, but it does save the execution plan, thereby trying to make the query faster. If you add RECOMPILE, it will always have the overhead of compiling the query. Try creating a stored procedure that holds the query and calling it from your code, as stored procedures are always precompiled.
Another dirty idea, similar to Neils':
Have an indexed view which holds the ids alone, based on your business condition.
Then you can join the view with your actual table and get the desired result.
The efficient way to do this is to:
Create a temporary table to hold the IDs
Call a SQL stored procedure with a string parameter holding all the comma-separated IDs
The SQL stored procedure uses a loop with CHARINDEX() to find each comma, then SUBSTRING to extract the string between two commas, CONVERT to make it an int, and INSERT INTO #Temporary VALUES ... to insert it into the temporary table
INNER JOIN the temporary table or use it in an IN (SELECT ID from #Temporary) subquery
Every one of these steps is extremely fast, because a single string is passed, no compilation is done during the loop, and no substrings are created except the actual id values.
No recompilation is done at all when this is executed, as long as the large string is passed as a parameter.
Note that in the loop you must track the prior and the current comma in two separate variables. A sketch of the calling side follows.
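On the C# side, calling such a procedure might look like this; the procedure name dbo.GetByIdList and its @IdList parameter are assumptions:
// Requires System.Data and System.Data.SqlClient.
string idList = string.Join(",", keys); // e.g. "288525,288528,288529"
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.GetByIdList", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    // varchar(max) parameter holding the whole comma-separated list.
    cmd.Parameters.Add("@IdList", SqlDbType.VarChar, -1).Value = idList;
    conn.Open();
    using (var reader = cmd.ExecuteReader())
    {
        // ...consume rows...
    }
}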
Off the cuff here: does incorporating a derived table help performance at all? I am not set up to test this fully; I just wonder if this would optimize to use BETWEEN and then filter the unneeded rows out:
SELECT * FROM
(   SELECT *
    FROM dbo.table
    WHERE ID BETWEEN <lowerbound> AND <upperbound>) AS range
WHERE ID IN (
    1206, 1207, 1208, 1209, 1210, 1211, 1212, 1213,
    1214, 1215, 1216, 1217, 1218, 1219, 1220, 1221,
    1222, 1223, 1224, 1225, 1226, 1227, 1228, <...>,
    1230, 1231
)
