I'm attempting to use the DataView.ToTable method, and while it is pulling the correct columns, I am still not getting DISTINCT values. I have a DataView that backs a GridView of all of the information on the website. When a client goes to download this data in a specific format, only the unique values should be written to the file.
At present, I store all of the columns needed for the report in a string array so the code stays dynamic. Once that string array has been created, I create my DataTable with
tblData = dvData.ToTable(true, arrColumns);
Despite this, the data is still coming back with all rows, duplicates included. Am I missing something? According to Microsoft's documentation, I SHOULD be getting back distinct values.
Turns out I forgot that a step on the backend cleans up NULLs into valid data points for the XML. So ToTable was working properly; the NULL fields were what was actually causing the "copies".
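For anyone hitting the same thing, the fix is to apply the NULL cleanup before taking the distinct set. A minimal sketch, assuming string columns and an empty string as the cleaned-up value (both placeholders for whatever the backend cleanup actually does):

// Normalize NULLs first so rows differing only by NULL-vs-cleaned values
// collapse into a single distinct row.
foreach (DataRow row in dvData.Table.Rows)
{
    foreach (string col in arrColumns)
    {
        if (row.IsNull(col))
            row[col] = string.Empty;
    }
}

DataTable tblData = dvData.ToTable(true, arrColumns);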
Related
I have a situation where a List object is built from values pulled from an MSSQL database. However, this particular table is mysteriously picking up an errant record or two. Deleting the records causes trouble even though they have no referential links to any other tables, and they get recreated without any known user action. This puts unwanted values on display, which adds confusion. Specifically, this is a platform that lets users search for quotes, and the filtering allows sales rep selection. The select/dropdown field is showing these errant values, and they need to be removed.
Given that deleting the offending table rows does not produce a desirable result, I was thinking the best course of action might be to modify the code where the List object is created and either filter the values out or remove them after the object is populated. I'd like to do this in a clean, scalable fashion by keeping some kind of appendable data object to which I could just add a new string value if something else cropped up, as opposed to something clunky that adds new code to find and remove each value.
My thought was to create a string array and loop through it to remove bad List values, but I wasn't certain that was the best approach, and I could not for the life of me think of a clean way to do it. Ideally I would add a filter within the Find arguments, but I don't know how to pass in an array or list that way. Otherwise I figured I'd loop through the values either before or after sorting the List and remove any matches, but I wasn't sure that was the best choice.
I have attached the current code, and would appreciate any suggestions.
int licenseeID = Helper.GetLicenseeIdByLicenseeShortName(Membership.ApplicationName);
List<User> listUsers;

if (Roles.IsUserInRole("Admin"))
{
    // get all users for the licensee
    listUsers = User.Find(x => x.LicenseeID == licenseeID).ToList();
}
else
{
    // get only the current user
    listUsers = User.Find(x => x.LicenseeID == licenseeID && x.EmailAddress == Membership.GetUser().Email).ToList();
}

// sort alphabetically by first name for the dropdown
listUsers.Sort((x, y) => string.Compare(x.FirstName, y.FirstName));
-- EDIT --
I neglected to mention that I did not develop this; I merely inherited its maintenance after the original developer(s) disappeared and the coworker who was assigned to it left the company. I'm not really skilled at handling ASP.NET sites. Many object sources are hidden and unavailable for editing, I assume because they are defined in a DLL somewhere. So, for any of these objects that are sourced from database tables, altering the tables will not help, since I would not be able to get at the new data anyway.
However, I did try the following to filter out the undesirable data:
List<String> exclude = new List<String>(new String[] { "value1", "value2" });
listUsers = User.Find(x => x.LicenseeID == licenseeID && !exclude.Contains(x.FirstName)).ToList();
Unfortunately, it only resulted in an error being displayed on the page.
-- EDIT #2 --
I got the server set up to accept a new Event Viewer source so I could write info to the Application log and see what was happening. It looks like this installation of ASP.NET does not accept Contains on a List object inside the Find expression; an error gets thrown stating that the method is not available.
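Since Contains only fails when it is embedded in the Find expression, one workaround is to run Find without the exclusion clause and then filter the list in memory, where Contains and RemoveAll execute as plain C# with no query translation involved. A minimal sketch, with hypothetical exclusion values:

List<string> exclude = new List<string> { "value1", "value2" };

listUsers = User.Find(x => x.LicenseeID == licenseeID).ToList();
// RemoveAll runs in memory, so the query provider never sees Contains
listUsers.RemoveAll(u => exclude.Contains(u.FirstName));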
I will probably add a bit column to the table to flag the errant rows and then skip them when I query the table, something like

&& !ErrantData
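In context, that would look roughly like the following sketch, assuming the new bit column is surfaced as a bool property named ErrantData on the User object (a hypothetical name):

// ErrantData is a hypothetical bool property backed by the new bit column
listUsers = User.Find(x => x.LicenseeID == licenseeID && !x.ErrantData).ToList();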
The other way, which requires a bit more upkeep but no database change, would be to keep a text file that gets periodically updated; read it and remove users from the list based on its contents.
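A minimal sketch of that approach, assuming one excluded first name per line and a hypothetical file path (File lives in System.IO):

// Read the exclusion list from disk and drop matching users in memory
string[] excludedNames = File.ReadAllLines(@"C:\config\excluded-users.txt");
listUsers.RemoveAll(u => Array.IndexOf(excludedNames, u.FirstName) >= 0);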
The bigger issue is the unknown rows creeping into your database. Changing user credentials and adding creation timestamps may help you narrow down where they are coming from.
I am using SqlBulkCopy to upload lots of data to an SQL table. It works very well apart from one thing (always the way).
So in my C# app I have a function. It receives a variable myObj of type object (from MATLAB). The object is actually an array.
I create a DataTable where I specify the column types and read in the data from myObj. One of the columns in the table (let's call it Salary) is of type double.
The problem
If one of the rows has a NaN value for Salary, the upload fails with the message below.
OLE DB provider 'STREAM' for linked server '(null)' returned invalid data for column
What I need is for the row to be uploaded, but with the Salary column given a value of NULL in the database. Is there any way of doing this?
The only crude way I have come up with is testing for NaN (or null) in my C# app, assigning the value -999, and then updating any -999 values to NULL after the upload. However, this seems like a poor workaround.
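One cleaner approach, sketched below under the assumption that the column and table names are placeholders and that the values have already been unpacked from myObj into a double array: write DBNull.Value into the DataTable wherever the incoming value is NaN, so SqlBulkCopy sends a proper SQL NULL (DataTable columns accept DBNull by default):

using System;
using System.Data;
using System.Data.SqlClient;

DataTable table = new DataTable();
table.Columns.Add("Salary", typeof(double));

foreach (double value in salaries) // salaries: the values unpacked from myObj
{
    DataRow row = table.NewRow();
    // Map NaN to a database NULL instead of letting it reach SqlBulkCopy
    row["Salary"] = double.IsNaN(value) ? (object)DBNull.Value : value;
    table.Rows.Add(row);
}

using (SqlBulkCopy bulk = new SqlBulkCopy(connectionString))
{
    bulk.DestinationTableName = "dbo.Salaries"; // hypothetical destination
    bulk.WriteToServer(table);
}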
I'm currently working on a legacy project that has some SQL that needs to run to get some data.
They use DataTables, DataSets, etc to work with the data.
The query in question that gets executed only returns one row, however it contains well over 700 columns.
Unfortunately, when the code executes to fill the DataSet, nothing gets returned if the query returns more than 655 columns.
Is there a way to get around this limitation so that data comes back when a query returns 656+ columns, or is there some other workaround?
Thanks!
EDIT:
Chasing a red herring. The data is there; I just can't view it in the debugger as a table when there are 656+ columns. The viewer can't handle more than 655.
Not sure if it resolves the issue, but try using the overloaded Fill method
DbDataAdapter.Fill(Int32, Int32, DataTable[]).
As per the MSDN documentation:
Adds or refreshes rows in a DataTable to match those in the data source starting at the specified record and retrieving up to the specified maximum number of records.
See here for more details: http://msdn.microsoft.com/en-us/library/0z5wy74x%28v=vs.110%29.aspx
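A minimal sketch of that overload (connection string and query text are placeholders):

using System.Data;
using System.Data.SqlClient;

DataTable table = new DataTable();

using (var connection = new SqlConnection(connectionString))
using (var adapter = new SqlDataAdapter("SELECT * FROM dbo.WideTable", connection))
{
    // startRecord = 0, maxRecords = 1: fetch just the single wide row
    adapter.Fill(0, 1, table);
}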
I'm using the LumenWorks CsvReader to parse a file with the following structure.
heading1,heading2,heading3
data1,data2,data3,data4,data5
The code detects that I've got three fields per row because I've got three headings. It happily allows me to parse the first row and retrieve data1, data2, and data3, but I can't access data4 or data5. At the very least, I'd like to be able to detect the additional fields in the data rows. Does anyone know if this is possible?
Thanks!
It does this because it uses the first row to determine how many columns your file has. If you change the first row to "heading1,heading2,heading3,," it will work as expected.
I expect that you won't be able to read the data in those fields. Rather, I expect that an error is being raised and you're not seeing it. This would at least allow you to detect that those other fields exist in the data.
Try setting the DefaultParseErrorAction property so that you are sure to see any errors being raised. I would expect the scenario you describe to trigger a MalformedCsvException (if you have set DefaultParseErrorAction = ParseErrorAction.ThrowException). You could also set DefaultParseErrorAction = ParseErrorAction.RaiseEvent and attach an event handler to the ParseError event. Finally, you should be able to just check whether ParseErrorFlag is true after each record.
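A sketch of the RaiseEvent variant, based on the LumenWorks API described above (the file path is a placeholder):

using System;
using System.IO;
using LumenWorks.Framework.IO.Csv;

using (var csv = new CsvReader(new StreamReader("data.csv"), true))
{
    csv.DefaultParseErrorAction = ParseErrorAction.RaiseEvent;
    csv.ParseError += (sender, e) =>
    {
        // e.Error describes the malformed record, including the extra fields
        Console.WriteLine("Parse error: " + e.Error.Message);
        e.Action = ParseErrorAction.AdvanceToNextLine; // skip and keep reading
    };

    while (csv.ReadNextRecord())
    {
        // only the columns declared in the header row are addressable here
        string first = csv[0];
    }
}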
HTH
I'm trying to mimic the functionality of the Query Analyzer piece of the SQL Server Management Studio using .NET. The user will input a SQL script, and the program will run it. If it returns a result set, the program will load that up into a datagrid and show the user.
My question is: is there a way to return more than one result set from a single script? I know Query Analyzer runs this and loads up multiple datagrids when several result sets are returned, but as far as I know, when you try to do this with SqlDataAdapter.Fill(...), it only returns the last result set in the script.
This will show you how to return multiple result sets: http://vb.net-informations.com/ado.net-dataproviders/ado.net-multiple-result-sets.htm
You loop through the different result sets using the sqlReader.NextResult() method. You can then use sqlReader.Read() to get each individual record in that result set.
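A minimal sketch of that pattern (connection string and script text are placeholders):

using System.Data.SqlClient;

string script = "SELECT * FROM dbo.TableA; SELECT * FROM dbo.TableB;";

using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand(script, connection))
{
    connection.Open();
    using (SqlDataReader reader = command.ExecuteReader())
    {
        do
        {
            while (reader.Read())
            {
                // process each record of the current result set
                object firstColumn = reader[0];
            }
        } while (reader.NextResult()); // advance to the next result set
    }
}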
You can call SqlDataAdapter.Fill passing in a DataSet. Each query result will populate a table in the DataSet:
When the query specified returns multiple results, the result set for each row returning query is placed in a separate table. Additional result sets are named by appending integral values to the specified table name (for example, "Table", "Table1", "Table2", and so on).
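In code, that looks roughly like this (the script variable stands in for the user's SQL):

using System.Data;
using System.Data.SqlClient;

DataSet results = new DataSet();

using (var connection = new SqlConnection(connectionString))
using (var adapter = new SqlDataAdapter(script, connection))
{
    // one DataTable per result set: "Table", "Table1", "Table2", ...
    adapter.Fill(results);
}

foreach (DataTable table in results.Tables)
{
    // bind each table to its own grid
}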
I think I might have figured this out. Sometimes writing the problem out helps you understand it better, and then fresh ideas start to pop up :)
I had been using the SqlDataAdapter.Fill(...) method to fill a DataTable. As it turns out, if you fill an empty DataSet instead, it will automatically create a table for each result set returned. So I can keep a few hidden datagrids on hand, and when the program detects that the filled DataSet has multiple tables, I'll just load each datagrid with the data from each DataTable in the DataSet.