StepArgumentTransformation is not getting hit - c#

I have a scenario outline whose scenarios make GET requests to an OData web API. The scenario validates that the data returned from the API matches the filters and is in the right order. The order by clause is built from the table provided in the scenario:
Scenario Outline: Validate that data from API call for a given user is according to filters provided
Given for the user id of 101
Given default filters for GET request
Given the result multicolumn order direction is <firstColumn> <firstOrderby> then <secondColumn> <secondOrderby>
And following is unordered list of securities
| securityID | attribute1 | attribute2 | attribute3 | attribute4 |
| 16654 | active | 0 | pending | 33 |
| 16655 | active | 0 | pending | 33 |
| 16656 | active | 0 | pending | 33 |
| 16657 | active | 0 | pending | 33 |
| 16658 | inactive | 4 | pending | 33 |
| 16659 | active | 0 | pending | 33 |
| 16660 | active | 0 | pending | 33 |
| 16661 | active | 0 | pending | 33 |
| 16662 | active | 0 | pending | 33 |
| 16663 | inactive | 0 | pending | 33 |
| 16664 | inactive | 2 | pending | 33 |
When I invoke the API GET
Then the SecAPI should return HTTP <statusCode>
And the response securities should be in expected order in each <sampleName> with matching fields and record count of 11
Examples:
| firstColumn | firstOrderby | secondColumn | secondOrderby | statusCode | sampleName |
| securityID | Asc | attribute2 | Desc | 200 | Asc-Desc |
| securityID | Asc | attribute2 | Asc | 200 | Asc-Asc |
| securityID | Desc | attribute2 | Asc | 200 | Desc-Asc |
| securityID | Asc | attribute2 | Desc | 200 | Asc-Desc |
| securityID | Asc | attribute2 | | 200 | Asc-Desc |
| securityID | | attribute2 | | 200 | Asc-Desc |
For the above scenario outline, everything works except the following step:
Given the result multicolumn order direction is <firstColumn> <firstOrderby> then <secondColumn> <secondOrderby>
For that step, I have the following binding in my steps.cs file:
[Given(@"the result multicolumn (order direction is (.*) (.*) then (.*) (.*))")]
public void GivenTheResultOrderDirectionIs(StringWrapper orderBy)
{
    //step code here
}
and the following step argument transformation to turn the 4 arguments in the Given statement into a proper OData orderBy clause:
[Binding]
public class CustomTransforms
{
    [StepArgumentTransformation(@"order direction is <(\w+)> <(\w+)> then <(\w+)> <(\w+)>")]
    public StringWrapper OrderByTransform(string column1, string column1Direction, string column2, string column2Direction)
    {
        string orderByClause;
        //build clause here
        return new StringWrapper(orderByClause);
    }
}
The problem is that OrderByTransform is never called. I am getting the exception below:
Exception thrown: 'TechTalk.SpecFlow.BindingException' in TechTalk.SpecFlow.dll
An exception of type 'TechTalk.SpecFlow.BindingException' occurred in TechTalk.SpecFlow.dll but was not handled in user code
Parameter count mismatch! The binding method '.......GivenTheResultMulticolumnOrderDirectionIs(StringWrapper)' should have 5 parameters

Step argument transformations only receive a single argument; that's just how SpecFlow works. Once you have the full matched string, use a Regex to extract the desired pieces from it. By declaring a constant for the regex pattern, you can reuse it in a Regex object as well as in the [StepArgumentTransformation] attribute:
using System.Text.RegularExpressions;

[Binding]
public class CustomTransforms
{
    private const string OrderByPattern = @"order direction is (\w+) (\w+) then (\w+) (\w+)";
    private static readonly Regex OrderByRegex = new Regex(OrderByPattern);

    [StepArgumentTransformation(OrderByPattern)]
    public StringWrapper OrderByTransform(string text)
    {
        var match = OrderByRegex.Match(text);
        var column1 = match.Groups[1].Value;
        var direction1 = match.Groups[2].Value;
        var column2 = match.Groups[3].Value;
        var direction2 = match.Groups[4].Value;

        // Build your order by clause however you need to do it. For example, SQL:
        var orderByClause = $"ORDER BY {column1} {direction1}, {column2} {direction2}";

        return new StringWrapper(orderByClause);
    }
}
Important: The < and > characters in your step argument transformation pattern are also breaking the match. In your scenario, the <firstColumn> token is completely replaced by the current value from the examples table.
When the current example row is:
| firstColumn | firstOrderby | secondColumn | secondOrderby | statusCode | sampleName |
| securityID | Asc | attribute2 | Desc | 200 | Asc-Desc |
The step:
Given the result multicolumn order direction is <firstColumn> <firstOrderby> then <secondColumn> <secondOrderby>
is converted to this automatically:
Given the result multicolumn order direction is securityID Asc then attribute2 Desc
Notice that the < and > characters do not exist in the converted version of the step. The angle brackets are used to denote parameterized portions of a step that are replaced at run time by data from the examples table.
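For the transformation above to kick in, the step definition's own regex then needs a single capture group spanning the whole order clause, so that SpecFlow passes exactly one argument to the one StringWrapper parameter. A minimal sketch, assuming StringWrapper is the wrapper type from the question:
[Given(@"the result multicolumn (order direction is \w+ \w+ then \w+ \w+)")]
public void GivenTheResultMulticolumnOrderDirectionIs(StringWrapper orderBy)
{
    // orderBy wraps the clause produced by OrderByTransform,
    // e.g. "ORDER BY securityID Asc, attribute2 Desc"
}
SpecFlow captures "order direction is securityID Asc then attribute2 Desc", sees the parameter type is StringWrapper, and applies the matching [StepArgumentTransformation] before invoking the step.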

Related

Single Table Design - DynamoDB - Merge two tables into one?

I've watched this video and I was thinking about single table design. Is it a good idea to merge these two tables into one, just like the example in the video? They are related to each other (user trades and user transfers, i.e. deposits and withdrawals), and the access patterns are all keyed by an exchange name and an account name.
The access patterns I'm interested in right now:
Get the last user transfer for a given exchange and an account name.
Get all user trades by date range.
Get all user transfers by date range.
Is it a good idea to merge both tables into one with the same PK and SK? There will be items with SK TradeDate#12/17/2022 and others with TransferDate#12/17/2022.
I still have to figure out how to do the date range queries; I expect I would have to use GSIs.
User Trades
+-------------------------------+----------------------------------------+---------+----------+-----+-------------+------------+-------+-------------+----------+-------------------------------+-----------+-----------------------------------+
| PK | SK | Account | Exchange | Fee | FeeCurrency | Instrument | Price | ProductType | Quantity | TradeDate | TradeId | UpdatedAt |
+-------------------------------+----------------------------------------+---------+----------+-----+-------------+------------+-------+-------------+----------+-------------------------------+-----------+-----------------------------------+
| Exchange#Binance#Account#main | TradeDate#12/17/2022 4:59:12 PM +02:00 | main | Binance | 0 | BNB | BTCBUSD | 0 | Spot | 0.00086 | 2023-01-26T23:17:56.436+00:00 | 814696766 | 2023-01-26T23:17:57.7786154+00:00 |
| Exchange#Binance#Account#main | TradeDate#12/17/2022 5:38:23 PM +02:00 | main | Binance | 0 | BNB | BTCBUSD | 0 | Spot | 0.00086 | 2023-01-26T23:57:06.876+00:00 | 814746356 | 2023-01-26T23:57:08.3696852+00:00 |
| Exchange#FTX#Account#main | TradeDate#12/17/2021 5:38:23 PM +02:00 | main | Binance | 0 | BNB | BTCBUSD | 0 | Spot | 0.00086 | 2023-01-26T23:57:21.226+00:00 | 814746461 | 2023-01-26T23:57:21.8543695+00:00 |
| Exchange#FTX#Account#main | TradeDate#12/19/2022 4:59:12 PM +02:00 | main | Binance | 0 | BNB | BTCBUSD | 0 | Spot | 0.00086 | 2023-01-26T23:57:21.901+00:00 | 814746513 | 2023-01-26T23:57:22.528155+00:00 |
| Exchange#Binance#Account#main | TradeDate#12/17/2022 4:59:12 PM +02:00 | main | Binance | 0 | BNB | BTCBUSD | 0 | Spot | 0.00086 | 2023-01-26T23:57:22.348+00:00 | 814746517 | 2023-01-26T23:57:22.9753506+00:00 |
| Exchange#Binance#Account#main | TradeDate#12/17/2022 5:38:23 PM +02:00 | main | Binance | 0 | BNB | BTCBUSD | 0 | Spot | 0.00086 | 2023-01-26T23:57:22.802+00:00 | 814746518 | 2023-01-26T23:57:23.429097+00:00 |
| Exchange#FTX#Account#main | TradeDate#12/17/2021 5:38:23 PM +02:00 | main | Binance | 0 | BNB | BTCBUSD | 0 | Spot | 0.00086 | 2023-01-26T23:57:23.252+00:00 | 814746521 | 2023-01-26T23:57:23.8756532+00:00 |
| Exchange#FTX#Account#main | TradeDate#12/19/2022 4:59:12 PM +02:00 | main | Binance | 0 | BNB | BTCBUSD | 0 | Spot | 0.00086 | 2023-01-26T23:57:23.759+00:00 | 814746524 | 2023-01-26T23:57:24.3824745+00:00 |
+-------------------------------+----------------------------------------+---------+----------+-----+-------------+------------+-------+-------------+----------+-------------------------------+-----------+-----------------------------------+
User Transfers (Deposits + Withdrawals)
+-------------------------------+-------------------------------------------+---------+--------------------------------------------+------------+----------+------------+---------+-----------+----------------+--------------------------------------------------------------------+---------------------------+----------------------------------+--------------+------------------------------+
| PK | SK | Account | Address | AddressTag | Exchange | Instrument | Network | Quantity | TransactionFee | TransactionId | TransferDate | TransferId | TransferType | UpdatedAt |
+-------------------------------+-------------------------------------------+---------+--------------------------------------------+------------+----------+------------+---------+-----------+----------------+--------------------------------------------------------------------+---------------------------+----------------------------------+--------------+------------------------------+
| Exchange#Binance#Account#main | TransferDate#12/17/2022 4:59:12 PM +02:00 | main | 0xF76d3f20bF155681b0b983bFC3ea5fe43A2A6E3c | null | Binance | USDT | ETH | 97.500139 | 3.2 | 0x46d28f7d0e1e5b1d074a65dcfbb9d90b3bcdc7e6fca6b1f1f7abb5ab219feb24 | 2022-12-17T16:59:12+02:00 | 1b56485f6a3446c3b883f4f485039260 | 0 | 2023-01-28T20:19:59.9181573Z |
| Exchange#Binance#Account#main | TransferDate#12/17/2022 5:38:23 PM +02:00 | main | 0xF76d3f20bF155681b0b983bFC3ea5fe43A2A6E3c | null | Binance | USDT | ETH | 3107.4889 | 3.2 | 0xbb2b92030b988a0184ba02e2e754b7a7f0f963c496c4e3473509c6fe6b54a41d | 2022-12-17T17:38:23+02:00 | 4747f6ecc74f4dd8a4b565e0f15bcf79 | 0 | 2023-01-28T20:20:00.4536839Z |
| Exchange#FTX#Account#main | TransferDate#12/17/2021 5:38:23 PM +02:00 | main | 0x476d3f20bF155681b0b983bFC3ea5fe43A2A6E3c | null | FTX | USDT | ETH | 20 | 3.2 | 0xaa2b92030b988a0184ba02e2e754b7a7f0f963c496c4e3473509c6fe6b54a41d | 2021-12-17T17:38:23+02:00 | 4747f6ecc74f4dd8a4b565e0f15bcf79 | 0 | 2023-01-28T20:20:00.5723855Z |
| Exchange#FTX#Account#main | TransferDate#12/19/2022 4:59:12 PM +02:00 | main | 0xc46d3f20bF155681b0b983bFC3ea5fe43A2A6E3c | null | FTX | USDT | ETH | 15 | 3.2 | 0xddd28f7d0e1e5b1d074a65dcfbb9d90b3bcdc7e6fca6b1f1f7abb5ab219feb24 | 2022-12-19T16:59:12+02:00 | 1b56485f6a3446c3b883f4f485039260 | 0 | 2023-01-28T20:20:00.5207119Z |
+-------------------------------+-------------------------------------------+---------+--------------------------------------------+------------+----------+------------+---------+-----------+----------------+--------------------------------------------------------------------+---------------------------+----------------------------------+--------------+------------------------------+
public async Task<UserTransferDto?> GetLastAsync(string exchange, string account)
{
    var queryRequest = new QueryRequest
    {
        TableName = TableName,
        KeyConditionExpression = "#pk = :pk",
        ExpressionAttributeNames = new Dictionary<string, string>
        {
            { "#pk", "PK" }
        },
        ExpressionAttributeValues = new Dictionary<string, AttributeValue>
        {
            { ":pk", new AttributeValue { S = $"Exchange#{exchange}#Account#{account}" } }
        },
        ScanIndexForward = false,
        Limit = 1
    };

    var response = await _dynamoDb.QueryAsync(queryRequest);
    if (response.Items.Count == 0)
    {
        return null;
    }

    var itemAsDocument = Document.FromAttributeMap(response.Items[0]);
    return JsonSerializer.Deserialize<UserTransferDto>(itemAsDocument.ToJson());
}
A little edit:
I realized I needed the transfer type too, so I changed the SK to TransferType#Withdraw#TransferDate#2022-12-17 14:59:12, and now the code looks like this:
public async Task<UserTransferDto?> GetLastAsync(string exchange, string account, TransferType transferType)
{
    var queryRequest = new QueryRequest
    {
        TableName = TableName,
        KeyConditionExpression = "#pk = :pk and begins_with(#sk, :sk)",
        ExpressionAttributeNames = new Dictionary<string, string> { { "#pk", "PK" }, { "#sk", "SK" } },
        ExpressionAttributeValues = new Dictionary<string, AttributeValue>
        {
            { ":pk", new AttributeValue { S = $"Exchange#{exchange}#Account#{account}" } },
            { ":sk", new AttributeValue { S = $"TransferType#{transferType}" } }
        },
        ScanIndexForward = false,
        Limit = 1
    };

    var response = await _dynamoDb.QueryAsync(queryRequest);
    if (response.Items.Count == 0)
    {
        return null;
    }

    var itemAsDocument = Document.FromAttributeMap(response.Items[0]);
    return JsonSerializer.Deserialize<UserTransferDto>(itemAsDocument.ToJson());
}
My personal attitude: don't change anything if:
It works right now
I don't see any future risks in my system
There are no benefits to be had in performance/durability/maintenance/ease of support
Coming to your question, I don't see how merging the two tables helps you.
I'd also point out that your current table structure won't cover these queries:
Get the last user transfer for a given exchange and an account name.
Get all user trades by date range.
Get all user transfers by date range.
If I understand correctly, you don't need to query anything by your current PK, so why did you choose this PK? Of course you can solve it with GSIs, but you won't get them for free, and if you can restructure your table to make your 3 queries easier, I'd do that.
Hope I answered your question. Good luck, have fun)
A query for the newest/latest item for a given partition key (pk) is possible if the sort key (sk) has a lexicographically sortable date/time representation, e.g. ISO 8601. This works because items with the same pk value are stored in sorted order by sk.
To do this, simply issue a query for the pk + sk prefix, and supply Limit=1 and ScanIndexForward=False. For example:
KeyConditionExpression = "pk = :pk and begins_with(sk, :trade_prefix)"
Limit = 1
ScanIndexForward = False
You can query for items with a given partition key (pk) and a sort key (sk) within a given range of dates using a key condition expression of the form a BETWEEN b AND c. For example:
KeyConditionExpression = "pk = :pk and sk between :t1 and :t2"
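A rough sketch of that date-range query in the same C# style as the question's code (TableName and _dynamoDb are carried over from the question; the ISO 8601 SK format is an assumption, and in fact a requirement, since the M/d/yyyy h:mm:ss tt format in the sample data does not sort chronologically):
public async Task<List<Document>> GetTradesByDateRangeAsync(string exchange, string account, DateTime from, DateTime to)
{
    var queryRequest = new QueryRequest
    {
        TableName = TableName,
        // Pair the partition key with a BETWEEN on the sort key.
        KeyConditionExpression = "#pk = :pk and #sk between :from and :to",
        ExpressionAttributeNames = new Dictionary<string, string> { { "#pk", "PK" }, { "#sk", "SK" } },
        ExpressionAttributeValues = new Dictionary<string, AttributeValue>
        {
            { ":pk", new AttributeValue { S = $"Exchange#{exchange}#Account#{account}" } },
            // Assumes SKs like "TradeDate#2022-12-17T16:59:12", which sort chronologically.
            { ":from", new AttributeValue { S = $"TradeDate#{from:yyyy-MM-ddTHH:mm:ss}" } },
            { ":to", new AttributeValue { S = $"TradeDate#{to:yyyy-MM-ddTHH:mm:ss}" } }
        }
    };

    var response = await _dynamoDb.QueryAsync(queryRequest);
    return response.Items.Select(Document.FromAttributeMap).ToList();
}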
On the question of using DynamoDB as a simple key/value store where the value is a JSON blob: it depends on your access requirements. Some reasons not to use a single JSON blob are that:
any update to the item means you have to rewrite the whole item, which costs more WCUs
you can't create GSIs on most of the item's attributes, because they're not top-level
it can make concurrent puts more challenging

Check if IDs in a comma-separated string match any IDs in another string array

I have two tables in my DB, Master Setup Approval and Order Details, and I want to check which Master Approval Setup a purchase order falls under, based on its CostCenter.
Table Master Setup Approval:
|----|--------|------------|------------|----------------|--------------------|
| ID | Name   | CRG_COM_ID | CRG_BRN_ID | ApprovalTypeId | CostCenter(string) |
|----|--------|------------|------------|----------------|--------------------|
| 1  | Setup1 | 1          | 1          | 1              | "1,2,5,7"          |
| 2  | Setup2 | 1          | 1          | 1              | "1,3,6"            |
|----|--------|------------|------------|----------------|--------------------|
Table OrderDetails:
|----|-------|------------|------------|--------------------|
| ID | Name  | CRG_COM_ID | CRG_BRN_ID | CostCenterID(long) |
|----|-------|------------|------------|--------------------|
| 1  | Item1 | 1          | 1          | 1                  |
| 2  | Item2 | 1          | 1          | 7                  |
|----|-------|------------|------------|--------------------|
This is my code:
var orderDetails = db.OrderDetails.Where(c => c.OrderId == orderId);
var costc = orderDetails
    .Select(c => c.CostCenterId.Value)
    .ToList()
    .ConvertAll<string>(delegate (long i) { return i.ToString(); });

var ApprovalProcess_Count12 = db.MasterSetupApproval.Where(x =>
    x.CRG_COM_ID == order.CompanyId &&
    (x.CRG_BRN_ID == null || x.CRG_BRN_ID == order.BranchId) &&
    x.ApprovalTypeId == (int)ApprovalTypes.PO &&
    x.CostCenter.Split(',').Select(aee => aee).Any(val => costc.Contains(val))
).ToList();
I am getting the following error:
LINQ to Entities does not recognize the method 'System.String[] Split(Char[])' method, and this method cannot be translated into a store expression.
Output should be:
|----|--------|------------|------------|----------------|--------------------|
| ID | Name   | CRG_COM_ID | CRG_BRN_ID | ApprovalTypeId | CostCenter(string) |
|----|--------|------------|------------|----------------|--------------------|
| 1  | Setup1 | 1          | 1          | 1              | "1,2,5,7"          |
|----|--------|------------|------------|----------------|--------------------|
Assuming you are working with a badly designed DB (as Crowcoder pointed out, comma-separated values should not be stored in a database), the way through is to stop asking LINQ to Entities to translate String.Split and instead do that part of the filtering in memory.
HTH!
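A minimal sketch of that workaround, keeping the names from the question: filter everything the provider can translate in SQL, then switch to LINQ to Objects with AsEnumerable so Split runs client-side.
var costc = db.OrderDetails
    .Where(c => c.OrderId == orderId)
    .Select(c => c.CostCenterId.Value)
    .ToList()
    .ConvertAll(i => i.ToString());

var approvalSetups = db.MasterSetupApproval
    .Where(x =>
        x.CRG_COM_ID == order.CompanyId &&
        (x.CRG_BRN_ID == null || x.CRG_BRN_ID == order.BranchId) &&
        x.ApprovalTypeId == (int)ApprovalTypes.PO)
    .AsEnumerable() // everything below runs in memory, so Split is allowed
    .Where(x => x.CostCenter.Split(',').Any(val => costc.Contains(val)))
    .ToList();
The trade-off is that every setup row matching the company/branch/type filter is pulled into memory before the CostCenter check; for a small setup table that is usually acceptable.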

Read IDataReader on a GROUP BY MySQL query

I have a table "nesting_bar_detail" that I read like this in my C# code:
public List<RepereNest> SelectListRepereNestInMeb(string query)
{
    List<RepereNest> list = new List<RepereNest>();
    if (this.OpenConnection() == true)
    {
        IDataReader dataReader = ExecuteReader(query);
        while (dataReader.Read())
        {
            RepereNest det = new RepereNest();
            det.ID = (long)dataReader["ID"];
            det.IdDetail = (long)dataReader["ID_DETAIL"];
            det.IdNesting = (long)dataReader["ID_NESTING"];
            det.Name = (string)dataReader["NAME"];
            det.Quantity = (int)dataReader["QUANTITY"];
            list.Add(det);
        }
        this.CloseConnection();
    }
    return list;
}
If I make a simple query like the one here, everything works fine:
SELECT * FROM nesting_bar_detail WHERE NAME='TITI'
But when I want to group the results, I run the following query:
SELECT ID, ID_DETAIL, ID_NESTING, NAME, SUM(QUANTITY) AS QUANTITY FROM nesting_bar_detail GROUP BY ID_DETAIL, ID_NESTING ORDER BY ID_NESTING
But then I get an error on the lines where I read the "calculated" field (in this case on the line det.Quantity = (int)dataReader["QUANTITY"];, which is a SUM):
Error "The specified cast is invalid"
I don't understand whether my SQL query is incorrect, or why the returned value type is not recognized.
Edit :
Here is the data I have inside database :
+-------+-----------+------------+------+----------+
| ID | ID_DETAIL | ID_NESTING | NAME | QUANTITY |
+-------+-----------+------------+------+----------+
| 10754 | 10 | 58 | TITI | 2 |
+-------+-----------+------------+------+----------+
| 10755 | 11 | 59 | TITI | 3 |
+-------+-----------+------------+------+----------+
| 10756 | 11 | 59 | TITI | 4 |
+-------+-----------+------------+------+----------+
And here is Expected result :
+-------+-----------+------------+------+----------+
| ID | ID_DETAIL | ID_NESTING | NAME | QUANTITY |
+-------+-----------+------------+------+----------+
| 10754 | 10 | 58 | TITI | 2 |
+-------+-----------+------------+------+----------+
| 10755 | 11 | 59 | TITI | 7 |
+-------+-----------+------------+------+----------+
Aggregate (GROUP BY) Function Descriptions
"The SUM() and AVG() functions return a DECIMAL value for exact-value arguments (integer or DECIMAL), and a DOUBLE value for approximate-value arguments (FLOAT or DOUBLE)."
In your C# code you're trying to cast the resulting DECIMAL value of SUM(QUANTITY) to an int. The data reader hands it back as a boxed System.Decimal, and unboxing a boxed Decimal directly to int throws an invalid cast exception.
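A small sketch of the fix on the C# side; alternatively you could keep the cast and instead wrap the sum as CAST(SUM(QUANTITY) AS SIGNED) in the SQL:
// Convert.ToInt32 converts the boxed System.Decimal that MySQL's SUM()
// returns, whereas (int)dataReader["QUANTITY"] only succeeds when the
// boxed value is already an int.
det.Quantity = Convert.ToInt32(dataReader["QUANTITY"]);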

merge single table column with different types

Table columns:
or_cv_no - stores both the 'cv_no' and 'or_no' values
type - the different types are 'CV' and 'OR'
amount - stores both the 'cv_amt' and 'or_amt' values
date - the data being fetched has the same date
My query is this:
SELECT IIF(type = 'CV', or_cv_no) AS cv_no,
       IIF(type = 'CV', amount)   AS cv_amt,
       IIF(type = 'OR', or_cv_no) AS or_no,
       IIF(type = 'OR', amount)   AS or_amt
FROM transacAmt
The output (original screenshot not available) has each CV and OR value on its own row, with the opposite columns left empty. How can I make it look like this sample:
| cv_no | cv_amt | or_no | or_amt |
|-------+--------+-------+--------|
| 1234  | 1,500  | 4321  | 1,200  |
| 7777  | 1,700  | 2222  | 1,760  |
| 1111  | 1,900  | 3333  | 1,530  |
|       |        | 5355  | 1,420  |
|       |        | 1355  | 4,566  |
EDIT: The output must be like the sample above; the cells marked 'x' in my screenshot (not shown) must be removed or filled with data.

Query database for multiple columns from a key value pair table

I am working on a SQL query for use with SQL Server Compact Edition 3.5 on a Windows Mobile handset. I need to get a result set back from three tables.
I don't remember all the column names exactly, as I'm asking this question at home, but here is a good example of the tables I'm dealing with.
Table 1: Customers
Table 2: PresoldOrders
Table 3: CustomerDetails
________________________________________
| |
|--------------- Customers --------------|
|________________________________________|
| |
| PK int CustomerNumber |
| varchar(125) FirstName |
| varchar(125) LastName |
| varchar(125) Email |
| varchar(200) Address1 |
| varchar(200) Address2 |
| varchar(200) City |
| varchar(2) State |
| varchar(5) Zip |
|________________________________________|
________________________________________
| |
|------------ CustomerDetails -----------|
|________________________________________|
| |
| PK int CustomerDetailsId |
| FK int CustomerNumber |
| varchar(255) FieldName |
| varchar(255) FieldValue |
|________________________________________|
________________________________________
| |
|------------ PresoldOrders -------------|
|________________________________________|
| |
| PK int PresoldOrderId |
| FK int CustomerNumber |
| int OrderNumber |
| int RouteStopNumber |
| datetime DeliveryDate |
| varchar(100) Other1 |
| varchar(100) Other2 |
|________________________________________|
Now, the query should return all records that exist in Customers even if they don't exist in the PresoldOrders table. This part is pretty easy; I plan to just use a left outer join. The second part of the query is a bit more complex.
Here is the query I've constructed so far.
SELECT c.CustomerNumber
c.FirstName
c.LastName
c.Email
c.Address1
c.Address2
c.City
c.State
c.Zip
po.OrderNumber
po.DeliveryDate
po.Other1
po.Other2
FROM Customer c
LEFT OUTER JOIN PresoldOrders po on c.CustomerNumber = po.CustomerNumber
ORDER BY po.RouteStopNumber;
Tricky part is the CustomerDetails table. Here is an example of some data
_________________________________________________________
| | | | |
| PK | CustomerNumber | FieldName | FieldValue |
|-------|-----------------|--------------|----------------|
| 1 | 1 | A | 125 |
|-------|-----------------|--------------|----------------|
| 2 | 1 | B | 126 |
|-------|-----------------|--------------|----------------|
| 3 | 1 | C | 127 |
|-------|-----------------|--------------|----------------|
| 4 | 2 | A | 138 |
|-------|-----------------|--------------|----------------|
| 5 | 2 | B | 140 |
|-------|-----------------|--------------|----------------|
| 6 | 2 | C | 143 |
|-------|-----------------|--------------|----------------|
|_________________________________________________________|
For the information that I will be displaying in the ComponentOne FlexGrid, the FieldNames listed in the CustomerDetails table will be fixed.
Here is what I want to achieve:
_____________________________________________________________________________________________________________________
| | | | | | | |
| CustomerNumber | FirstName | LastName | ... | FieldName A's value | FieldName B's Value | FieldName C's Value |
|-----------------|-----------|----------|-----|---------------------|---------------------|--------------------------|
| 1 | John | Someone | ... | 125 | 126 | 127 |
|-----------------|-----------|----------|-----|---------------------|---------------------|--------------------------|
| 2 | Dan | Other | ... | 138 | 140 | 143 |
|-----------------|-----------|----------|-----|---------------------|---------------------|--------------------------|
|_____________________________________________________________________________________________________________________|
Normally, I'd have column names for A, B, and C defined in the CustomerDetails table; however, this table can't be changed, so I must work with what I've been given. The spec for my task requires 15-plus columns to be displayed in a grid on a mobile device; not something I'd go for, but those are the requirements.
OK, finally, the question:
Can one use SQL to query a key-value pair table and have those keys' values displayed in columns like the above? This is the requirement I have, and otherwise I'm thinking I'll need to run one query with my join on the PresoldOrders table, then fetch all the details for each customer into a list, iterate through it, and combine everything into a data table in code on the handheld.
If you know all the key values in advance, you can pivot the resultset. I'm not sure if SQL Server Compact supports PIVOT, but you can do:
SELECT CustomerNumber,
       MAX(CASE WHEN FieldName = 'A' THEN FieldValue END) AS a_value,
       -- the same for B, C, and all other keys
FROM CustomerDetails
GROUP BY CustomerNumber
For simplicity I don't join the other tables, but I hope it gives you an idea of how to turn rows into columns.
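Tying it back to the join from the question, the pivoted details can go in as a derived table. A rough sketch of how the combined query might be built in C# on the handheld (this assumes SQL Server Compact 3.5 accepts a derived table in the FROM clause; if it doesn't, fall back to running the pivot query separately and merging in code, as you suggested):
// Conditional aggregation flattens the key/value rows into columns;
// the derived table then joins to Customers like any other table.
string query = @"
    SELECT c.CustomerNumber, c.FirstName, c.LastName,
           d.a_value, d.b_value, d.c_value
    FROM Customers c
    LEFT OUTER JOIN (
        SELECT CustomerNumber,
               MAX(CASE WHEN FieldName = 'A' THEN FieldValue END) AS a_value,
               MAX(CASE WHEN FieldName = 'B' THEN FieldValue END) AS b_value,
               MAX(CASE WHEN FieldName = 'C' THEN FieldValue END) AS c_value
        FROM CustomerDetails
        GROUP BY CustomerNumber
    ) d ON c.CustomerNumber = d.CustomerNumber";
Each additional fixed FieldName becomes one more MAX(CASE ...) line, so 15 columns means 15 of those lines.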
