I want to show numbers in "From - To" format - C#

I have multiple random numbers in a table column ID, like:
8v12027
8v12025
8v12024
8v12029
8v12023
8v12030
8v12020
Expected output: 8v12020, From 8v12023 To 8v12025, 8v12027, From 8v12029 To 8v12030

I assume you're looking for a SQL solution, so:
You have to use the LEAD or LAG window function and concatenate the result:
SELECT CONCAT('From ', p.Id, ' To ', LEAD(p.Id) OVER (ORDER BY p.Id)) FROM YourTable p
There is a really good explanation of those keywords on the SQLAuthority website:
https://blog.sqlauthority.com/2013/09/22/sql-server-how-to-access-the-previous-row-and-next-row-value-in-select-statement/
But if you were expecting a pure C# solution, you can retrieve the data set into an array, order it by Id, and then, with a for loop, concatenate the current value with the previous (or next) one.
Or, with LINQ, use Aggregate:
yourArray.Aggregate((a, b) => string.Concat("From ", a, " To ", b, ";")).Split(';')
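For what it's worth, here is a minimal C# sketch of the range-grouping approach described above. It assumes every ID shares the "8v" prefix and that the numeric suffix decides which IDs are consecutive; NumericPart is a helper introduced just for this example.
// Sample data from the question.
var ids = new[] { "8v12027", "8v12025", "8v12024", "8v12029", "8v12023", "8v12030", "8v12020" };
// Hypothetical helper: parses the numeric suffix of an ID such as "8v12027".
int NumericPart(string id) => int.Parse(id.Substring(2));

var sorted = ids.OrderBy(NumericPart).ToList();
var parts = new List<string>();
int start = 0;
for (int i = 1; i <= sorted.Count; i++)
{
    // Close the current run when the numeric sequence breaks or the list ends.
    if (i == sorted.Count || NumericPart(sorted[i]) != NumericPart(sorted[i - 1]) + 1)
    {
        parts.Add(i - start == 1 ? sorted[start] : $"From {sorted[start]} To {sorted[i - 1]}");
        start = i;
    }
}
// Prints: 8v12020, From 8v12023 To 8v12025, 8v12027, From 8v12029 To 8v12030
Console.WriteLine(string.Join(", ", parts));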

Related

Query ODataV4 connected service with LINQ - Get last record from table

I'm trying to query my OData web service from a C# application.
When I do the following:
var SecurityDefs = from SD in nav.ICESecurityDefinition.Take(1)
                   orderby SD.Entry_No descending
                   select SD;
I get an exception because .top() and .orderby are not supposed to be used together.
I need to get the last record in the dataset and only the last.
The purpose is to get the last used entry number in a ledger and then continue creating new entries, incrementing the found entry number.
I can't seem to find anything online that explains how to do this.
It's very important that the service only returns the last record from the feed, since speed is paramount in this solution.
I get an exception because .top() and .orderby are not supposed to be used together.
Where did you read that? In general, .top() or .Take() should ONLY be used in conjunction with .orderby(); otherwise the record being retrieved is not guaranteed to be repeatable or predictable.
Probably the compounding issue here is mixing query and fluent expression syntax, which is valid, but you have to understand the order of precedence.
Your syntax is taking 1 record, then applying a sort order... you might find it easier to start with a query like this:
// Build your query
var SecurityDefsQuery = from SD in nav.ICESecurityDefinition
                        orderby SD.Entry_No descending
                        select SD;
// Take the first item from the list; if it exists, this will be a single record.
var SecurityDefs = SecurityDefsQuery.FirstOrDefault();
// Take a deferred query for only the first record, if it exists.
var SecurityDefsDeferred = SecurityDefsQuery.Take(1);
This can be executed on a single line using brackets, but you can see how the query is the same in both cases. SecurityDefs in this case is a single ICESecurityDefinition typed record, whereas SecurityDefsDeferred is an IQueryable<ICESecurityDefinition> that only yields a single record.
If you only need the record itself, you can use this one-liner:
var SecurityDefs = (from SD in nav.ICESecurityDefinition
                    orderby SD.Entry_No descending
                    select SD).FirstOrDefault();
You can execute the same query using fluent notation as well:
var SecurityDefs = nav.ICESecurityDefinition.OrderByDescending(sd => sd.Entry_No)
.FirstOrDefault();
In both cases, .Take(1) or .top() is effectively implemented through .FirstOrDefault(). You have indicated that speed is important, so use .First() or .FirstOrDefault() instead of .Single() or .SingleOrDefault(): the Single variants actually request .Take(2) and throw an exception if more than one result comes back (and .Single() also throws when there are no results).
The OrDefault variants on both of these queries will not impact the performance of the query itself and should have a negligible effect on your code. Use whichever is appropriate for the logic that consumes the returned record and for whether you need to handle the case where no record exists.
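To make the Take(2) point concrete, a small sketch (someEntryNo is a placeholder value; the comments describe typical LINQ provider behaviour rather than anything specific to this service):
// FirstOrDefault translates to a single-row fetch ($top=1 / TOP(1)).
var lastDef = nav.ICESecurityDefinition
                 .OrderByDescending(sd => sd.Entry_No)
                 .FirstOrDefault();

// SingleOrDefault asks for up to two rows so it can verify uniqueness,
// and throws if more than one row comes back.
var onlyDef = nav.ICESecurityDefinition
                 .Where(sd => sd.Entry_No == someEntryNo)
                 .SingleOrDefault();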
If the record being returned has many columns, and you are only interested in the Entry_No column value, then perhaps you should simply query for that specific value itself:
Query expression:
var lastEntryNo = (from SD in nav.ICESecurityDefinition
                   orderby SD.Entry_No descending
                   select SD.Entry_No).FirstOrDefault();
Fluent expression:
var lastEntryNo = nav.ICESecurityDefinition.OrderByDescending(sd => sd.Entry_No)
                                           .Select(sd => sd.Entry_No)
                                           .FirstOrDefault();
If speed is paramount, then look at providing a specific custom endpoint on the service to serve the record, or do not process the Entry_No in the client at all; make that the job of the code that receives data from the client and compute it at the time the entries are inserted.
Making the query perform faster is not the silver bullet you might be looking for, though. Even if this is highly optimised, your current pattern means that any number of clients could all call the service to get the current value of Entry_No, meaning all of them would start incrementing from the same value.
If you MUST increment the Entry_No from the client, then you should look at putting a custom endpoint on the service that simply returns the next Entry_No to use. This should be optimistic, meaning that you don't care whether the Entry_No actually gets used in the end, but you can implement the endpoint such that every call increments the field in the database and returns the next value.
It's getting a bit beyond the scope of your initial post, but SQL Server now has support for Sequences, which formalise this type of logic from a database and schema point of view. Using a Sequence simplifies how these kinds of increments are managed from the client, because we no longer rely on data updates being committed to the table before the client can compute the next record (which is what your TOP / ORDER BY DESC solution is trying to do).

Which is faster between LINQ to SQL and a SQL query?

I have a List of objects like this:
List<Product> _products;
Then I get a productId as input and search in this list like this:
var target = _products.Where(o => o.productid == input).FirstOrDefault();
My question is:
If this list has 100 products (productId from 1 to 100) and I get an input of productId = 100, does that mean this method must loop 100 times (if I ORDER BY productId ASC in the query)?
How does using this method compare with querying the database with a where clause like WHERE productId = @param?
Thank you.
No. If there is an index with key productId, it finds the correct row with O(log n) operations.
Just implement both methods and measure the time (hint: use the Stopwatch class).
Edit
To get the full performance you should not create an intermediate (unsorted) List<T>, but put all your logic in a LINQ query which operates against SQL Server.
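A minimal timing sketch along those lines, assuming the _products list and input from the question plus a hypothetical db.Products IQueryable backed by the database:
using System.Diagnostics;

var sw = Stopwatch.StartNew();
var fromList = _products.Where(o => o.productid == input).FirstOrDefault();
sw.Stop();
Console.WriteLine($"In-memory list: {sw.Elapsed}");

sw.Restart();
// Hypothetical LINQ-to-SQL/EF source; the Where is translated to SQL on the server.
var fromDb = db.Products.Where(o => o.productid == input).FirstOrDefault();
sw.Stop();
Console.WriteLine($"Database query: {sw.Elapsed}");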
This might be helpful to get your answer:
https://www.linqpad.net/WhyLINQBeatsSQL.aspx
If you execute that Where on a List<Product>, then:
you have already pulled all 100 rows from the database,
and you then loop through the products in memory until you find the one that matches, or until you have gone through the entire list and found nothing.
If, on the other hand, you used an IQueryable<Product> that was connected to the database table, then:
you wouldn't have read anything from the database yet;
when you apply the Where, you still wouldn't read anything;
when you apply the FirstOrDefault, a SQL query is constructed to fetch just the one row you need. Given correct indexes on the table, this will be quite fast.
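A rough sketch of that difference, assuming an Entity Framework context with a hypothetical db.Products set:
// In-memory: the whole table has already been materialised into a List<Product>,
// so the Where scans the list object by object.
List<Product> _products = db.Products.ToList();
var fromMemory = _products.Where(o => o.productid == input).FirstOrDefault();

// Deferred: nothing is read until FirstOrDefault() runs, and then the provider
// sends roughly "SELECT TOP(1) ... FROM Products WHERE productid = @input".
var fromDatabase = db.Products.Where(o => o.productid == input).FirstOrDefault();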

Limit Number of Results being returned in a List from Linq

I'm using LINQ/EF 4.1 to pull some results from a database and would like to limit the results to the X most recent, where X is a number set by the user.
Is there a way to do this?
I'm currently passing them back as a List, if that helps with limiting the result set. While I could limit this by looping until I hit X, I'd just as soon not pass the extra data around.
Just in case it is relevant...
C# MVC3 project running from a SQL Server database.
Use the Take function:
int numberOfRecords = 10; // read from user
var recentItems = listOfItems.OrderByDescending(x => x.CreatedDate).Take(numberOfRecords);
Assuming listOfItems is a List of your entity objects and CreatedDate is a field holding the date-created value (used here to order descending so the most recent items come first).
The Take() function returns a specified number of contiguous elements from the start of a sequence.
http://msdn.microsoft.com/en-us/library/bb503062.aspx
results = results.OrderByDescending(x=>x.Date).Take(10);
The OrderByDescending(...) will sort items by your date/time property (or w/e logic you want to use to get most recent) and Take(...) will limit to first x items (first being most recent, thanks to the ordering).
Edit: To return some rows not starting at the first row, use Skip():
results = results.OrderByDescending(x=>x.Date).Skip(50).Take(10);
Use Take() before converting to a List. This way EF can optimize the query it creates and only return the data you need.
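For instance, a sketch under the assumption of an EF context with a hypothetical Orders DbSet:
// Take() is applied to the IQueryable, so EF emits a TOP(x)-style query
// and only numberOfRecords rows come back from the database.
var recent = context.Orders
                    .OrderByDescending(o => o.CreatedDate)
                    .Take(numberOfRecords)
                    .ToList();

// Calling ToList() first would instead pull the whole table into memory
// before trimming it, which defeats the purpose.
var wasteful = context.Orders.ToList()
                      .OrderByDescending(o => o.CreatedDate)
                      .Take(numberOfRecords)
                      .ToList();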

SQL user defined aggregate order of values preserved?

I'm using the code from this MSDN page to create a user-defined aggregate to concatenate strings with GROUP BY in SQL Server. One of my requirements is that the order of the concatenated values is the same as in the query. For example:
Value Group
1 1
2 1
3 2
4 2
Using query
SELECT
    dbo.Concat(tbl.Value) AS Concat,
    tbl.Group
FROM
    (SELECT TOP 1000
         tblTest.*
     FROM
         tblTest
     ORDER BY
         tblTest.Value) AS tbl
GROUP BY
    tbl.Group
Would result in:
Concat Group
"1,2" 1
"3,4" 2
The result always seems to come out correct and as expected, but then I came across this page, which states that the order is not guaranteed and that the attribute SqlUserDefinedAggregateAttribute.IsInvariantToOrder is only reserved for future use.
So my question is: Is it correct to assume that the concatenated values in the string can end up in any order? If that is the case then why does the example code on the MSDN page use the IsInvariantToOrder attribute?
I suspect a big problem here is your statement "the same as in the query": your query never defines (and cannot define) an order for the things being aggregated (you can, of course, order the groups by having an ORDER BY after the GROUP BY). Beyond that, I can only say that the aggregate operates purely on a set (rather than an ordered sequence), and that technically the order is indeed undefined.
While the accepted answer is correct, I wanted to share a workaround that others may find useful. Warning: it involves not using a user-defined aggregate at all :)
The link below describes an elegant way to build a concatenated, delimited list using only a SELECT statement and a varchar variable. The upside (for this thread) is that you can specify the order in which the rows are processed. The downside is that you can't easily concatenate across many different subsets of rows without painful iteration.
Not perfect, but for my use case it was a good workaround.
http://blog.sqlauthority.com/2008/06/04/sql-server-create-a-comma-delimited-list-using-select-clause-from-table-column/

Converting user-entered search query to where clause for use in SQL Server full-text search

What's the best way to convert search terms entered by a user into a query that can be used in a WHERE clause for full-text searching, to query a table and get back relevant results? For example, the following query entered by the user:
+"e-mail" +attachment -"word document" -"e-learning"
Should translate into something like:
SELECT * FROM MyTable WHERE (CONTAINS(*, '"e-mail"')) AND (CONTAINS(*, '"attachment"')) AND (NOT CONTAINS(*, '"word document"')) AND (NOT CONTAINS(*, '"e-learning"'))
I'm using a query parser class at the moment, which parses the query entered by users into tokens using a regular expression, and then constructs the where clause from the tokens.
However, given that this is probably a common requirement by a lot of systems using full-text search, I'm curious as to how other developers have approached this problem, and whether there's a better way of doing things.
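For reference, a stripped-down sketch of that kind of tokenizer (the regular expression and the BuildContainsClause name are illustrative placeholders, not the exact class I use):
using System.Linq;
using System.Text.RegularExpressions;

static string BuildContainsClause(string userQuery)
{
    // Tokens are quoted phrases or single words, optionally prefixed with + (must match) or - (must not match).
    var tokens = Regex.Matches(userQuery, @"(?<sign>[+-])?(""(?<term>[^""]+)""|(?<term>\S+))")
                      .Cast<Match>()
                      .Select(m => new { Sign = m.Groups["sign"].Value,
                                         Term = m.Groups["term"].Value.Replace("'", "''") });

    var clauses = tokens.Select(t => t.Sign == "-"
        ? $"(NOT CONTAINS(*, '\"{t.Term}\"'))"
        : $"(CONTAINS(*, '\"{t.Term}\"'))");

    return string.Join(" AND ", clauses);
}

// BuildContainsClause("+\"e-mail\" +attachment -\"word document\" -\"e-learning\"") produces the
// CONTAINS / NOT CONTAINS conditions shown above, joined with AND.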
How to implement the accepted answer using .Net / C# / Entity Framework...
Install Irony using NuGet.
Add the sample class from:
http://irony.codeplex.com/SourceControl/latest#Irony.Samples/FullTextSearchQueryConverter/SearchGrammar.cs
Write code like this to convert the user-entered string to a query.
var grammar = new Irony.Samples.FullTextSearch.SearchGrammar();
var parser = new Irony.Parsing.Parser(grammar);
var parseTree = parser.Parse(userEnteredSearchString);
string query = Irony.Samples.FullTextSearch.SearchGrammar.ConvertQuery(parseTree.Root);
Perhaps write a stored procedure like this:
create procedure [dbo].[SearchLivingFish]
    @Query nvarchar(2000)
as
select *
from Fish
inner join containstable(Fish, *, @Query, 100) as ft
    on ft.[Key] = FishId
where IsLiving = 1
order by rank desc
Run the query.
var fishes = db.SearchLivingFish(query);
This may not be exactly what you are looking for but it may offer you some further ideas.
http://www.sqlservercentral.com/articles/Full-Text+Search+(2008)/64248/
In addition to @franzo's answer above, you probably also want to change the default stop word behaviour in SQL Server. Otherwise queries containing single-digit numbers (or other stop words) will not return any results.
Either disable stop words, create your own stop word list and/or set noise words to be transformed as explained in SQL 2008: Turn off Stop Words for Full Text Search Query
To view the system list of (English) sql stop words, run:
select * from sys.fulltext_system_stopwords where language_id = 1033
I realize it's a bit of a side-step from your original question, but have you considered moving away from SQL fulltext indexes and using something like Lucene/Solr instead?
The easiest way to do this is to use dynamic SQL (I know, insert security issues here) and break the phrase into a correctly formatted string.
You can use a function to break the phrase into a table variable that you can use to create the new string.
A combination of GoldParser and Calitha should sort you out here.
This article: http://www.15seconds.com/issue/070719.htm has a googleToSql class as well, which does some of the translation for you.
