What kind of format is this? Cisco WIM Chat Transcript - c#

O{bTyp | S{WCS-STD}bUsr | S{Paul R Dunaway}bUid | S{1350}sCmd | S{}sId | A{}sNme | S{}sUrl | S{}sLbl | S{}sCok | S{}mMsg | S{All Other Software}fAct | S{}fTyp | S{}fKey | S{}fVal | S{}bUserType | S{CUST}transType | S{}mTsp | S{2012-01-26 15:03:04}}|O{bTyp | S{WCS-STD}bUsr | S{system}bUid | I{-1}sCmd | S{}sId | A{}sNme | S{}sUrl | S{}sLbl | S{}sCok | S{}mMsg | S{[An agent will be with you shortly.]}fAct | S{}fTyp | S{}fKey | S{}fVal | S{}bUserType | S{SYSTEM}transType | S{}mTsp | S{2012-01-26 15:03:04}}
This format is used by our webchat system (Cisco UCCE/eGain) to store transcripts. I am looking to access them via C#/SQL, but I'm finding the encoding a bit weird. The above is what I get after stripping out the URL encoding; the raw form looks like the example below:
O%7BbTyp%20%7C%20S%7BWCS-STD%7DbUsr%20%7C%20S%7BPaul%20R%20Dunaway%7DbUid%20%7C%20S%7B1350%7DsCmd%20%7C%20S%7B%7DsId%20%7C%20A%7B%7DsNme%20%7C%20S%7B%7DsUrl%20%7C%20S%7B%7DsL

It seems to make a little more sense reindented:
O{
bTyp | S{WCS-STD}
bUsr | S{Paul R Dunaway}
bUid | S{1350}
sCmd | S{}
sId | A{}
sNme | S{}
sUrl | S{}
sLbl | S{}
sCok | S{}
mMsg | S{All Other Software}
fAct | S{}
fTyp | S{}
fKey | S{}
fVal | S{}
bUserType | S{CUST}
transType | S{}
mTsp | S{2012-01-26 15:03:04}
}
|
O{
bTyp | S{WCS-STD}
bUsr | S{system}
bUid | I{-1}
sCmd | S{}
sId | A{}
sNme | S{}
sUrl | S{}
sLbl | S{}
sCok | S{}
mMsg | S{[An agent will be with you shortly.]}
fAct | S{}
fTyp | S{}
fKey | S{}
fVal | S{}
bUserType | S{SYSTEM}
transType | S{}
mTsp | S{2012-01-26 15:03:04}
}
(Please excuse me if I'm just restating the obvious.) This doesn't look like any form of serialization I've seen before, and it may well be proprietary. Values, contained within braces, can be of one of four data types, indicated by the character immediately preceding the opening brace: S for string, I for integer, O for object (a collection of values as named properties), or A, presumably for array. Object properties are named by the string preceding the value and separated from it with a pipe character.
Some questions still remain:
What does an array value look like?
Why does the data type for bUid change from one object to the next?
Why is the whitespace so inconsistent? (I would assume whitespace is ignored when not part of the value.)
Why is a pipe character used to separate the two top-level objects? Is this how the array notation works?
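With those caveats, here is a rough C# sketch of how the notation could be parsed, based purely on the structure observed above. The class and method names are my own, and the array syntax and any escaping rules are guesses, so treat it as a starting point rather than a reference implementation:

using System;
using System.Collections.Generic;

class TranscriptValue
{
    public char Type;                                    // 'S', 'I', 'O' or 'A'
    public string Text;                                  // raw text for S/I (and the empty A values)
    public Dictionary<string, TranscriptValue> Members;  // populated for 'O'
}

static class TranscriptParser
{
    // The full transcript appears to be top-level O{...} objects separated by '|'.
    public static List<TranscriptValue> ParseTranscript(string s)
    {
        var entries = new List<TranscriptValue>();
        int pos = 0;
        while (pos < s.Length)
        {
            entries.Add(ParseValue(s, ref pos));
            SkipWhitespace(s, ref pos);
            if (pos < s.Length && s[pos] == '|') pos++;   // separator between objects
            SkipWhitespace(s, ref pos);
        }
        return entries;
    }

    // Parses one typed value starting at 'pos', e.g. "S{WCS-STD}" or "O{...}".
    public static TranscriptValue ParseValue(string s, ref int pos)
    {
        SkipWhitespace(s, ref pos);
        char type = s[pos++];                             // S, I, O or A
        if (s[pos++] != '{') throw new FormatException("expected '{'");

        var value = new TranscriptValue { Type = type };
        if (type == 'O')
        {
            value.Members = new Dictionary<string, TranscriptValue>();
            while (true)
            {
                SkipWhitespace(s, ref pos);
                if (s[pos] == '}') { pos++; break; }      // end of object
                int bar = s.IndexOf('|', pos);            // property name runs up to '|'
                string name = s.Substring(pos, bar - pos).Trim();
                pos = bar + 1;
                value.Members[name] = ParseValue(s, ref pos);
            }
        }
        else
        {
            // S, I and the (here always empty) A values: take everything up to the
            // closing brace. This assumes values never contain an unescaped '}'.
            int close = s.IndexOf('}', pos);
            value.Text = s.Substring(pos, close - pos);
            pos = close + 1;
        }
        return value;
    }

    static void SkipWhitespace(string s, ref int pos)
    {
        while (pos < s.Length && char.IsWhiteSpace(s[pos])) pos++;
    }
}

Calling TranscriptParser.ParseTranscript on the decoded transcript above would return two objects whose Members dictionaries hold bTyp, bUsr, mMsg, mTsp and so on; converting the I values to int and handling non-empty arrays is left open until the real grammar is known.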

Single Table Design - DynamoDB - Merge two tables into one?

I've watched this video and have been thinking about Single Table Design. Is it a good idea to merge these two tables into one, just like the example in the video? They are related to each other (user trades and user transfers, i.e. deposits and withdrawals), and the access patterns are all keyed by a given exchange name and account name.
The access patterns I'm interested in right now:
Get the last user transfer for a given exchange and an account name.
Get all user trades by date range.
Get all user transfers by date range.
Is it a good idea to merge both tables into one with the same PK and SK? There will be items with SK TradeDate#12/17/2022 and others with TransferDate#12/17/2022.
I still need to figure out how to do the date range queries; I assume I would have to use GSIs.
User Trades
+-------------------------------+----------------------------------------+---------+----------+-----+-------------+------------+-------+-------------+----------+-------------------------------+-----------+-----------------------------------+
| PK | SK | Account | Exchange | Fee | FeeCurrency | Instrument | Price | ProductType | Quantity | TradeDate | TradeId | UpdatedAt |
+-------------------------------+----------------------------------------+---------+----------+-----+-------------+------------+-------+-------------+----------+-------------------------------+-----------+-----------------------------------+
| Exchange#Binance#Account#main | TradeDate#12/17/2022 4:59:12 PM +02:00 | main | Binance | 0 | BNB | BTCBUSD | 0 | Spot | 0.00086 | 2023-01-26T23:17:56.436+00:00 | 814696766 | 2023-01-26T23:17:57.7786154+00:00 |
| Exchange#Binance#Account#main | TradeDate#12/17/2022 5:38:23 PM +02:00 | main | Binance | 0 | BNB | BTCBUSD | 0 | Spot | 0.00086 | 2023-01-26T23:57:06.876+00:00 | 814746356 | 2023-01-26T23:57:08.3696852+00:00 |
| Exchange#FTX#Account#main | TradeDate#12/17/2021 5:38:23 PM +02:00 | main | Binance | 0 | BNB | BTCBUSD | 0 | Spot | 0.00086 | 2023-01-26T23:57:21.226+00:00 | 814746461 | 2023-01-26T23:57:21.8543695+00:00 |
| Exchange#FTX#Account#main | TradeDate#12/19/2022 4:59:12 PM +02:00 | main | Binance | 0 | BNB | BTCBUSD | 0 | Spot | 0.00086 | 2023-01-26T23:57:21.901+00:00 | 814746513 | 2023-01-26T23:57:22.528155+00:00 |
| Exchange#Binance#Account#main | TradeDate#12/17/2022 4:59:12 PM +02:00 | main | Binance | 0 | BNB | BTCBUSD | 0 | Spot | 0.00086 | 2023-01-26T23:57:22.348+00:00 | 814746517 | 2023-01-26T23:57:22.9753506+00:00 |
| Exchange#Binance#Account#main | TradeDate#12/17/2022 5:38:23 PM +02:00 | main | Binance | 0 | BNB | BTCBUSD | 0 | Spot | 0.00086 | 2023-01-26T23:57:22.802+00:00 | 814746518 | 2023-01-26T23:57:23.429097+00:00 |
| Exchange#FTX#Account#main | TradeDate#12/17/2021 5:38:23 PM +02:00 | main | Binance | 0 | BNB | BTCBUSD | 0 | Spot | 0.00086 | 2023-01-26T23:57:23.252+00:00 | 814746521 | 2023-01-26T23:57:23.8756532+00:00 |
| Exchange#FTX#Account#main | TradeDate#12/19/2022 4:59:12 PM +02:00 | main | Binance | 0 | BNB | BTCBUSD | 0 | Spot | 0.00086 | 2023-01-26T23:57:23.759+00:00 | 814746524 | 2023-01-26T23:57:24.3824745+00:00 |
+-------------------------------+----------------------------------------+---------+----------+-----+-------------+------------+-------+-------------+----------+-------------------------------+-----------+-----------------------------------+
User Transfers (Deposits + Withdrawals)
+-------------------------------+-------------------------------------------+---------+--------------------------------------------+------------+----------+------------+---------+-----------+----------------+--------------------------------------------------------------------+---------------------------+----------------------------------+--------------+------------------------------+
| PK | SK | Account | Address | AddressTag | Exchange | Instrument | Network | Quantity | TransactionFee | TransactionId | TransferDate | TransferId | TransferType | UpdatedAt |
+-------------------------------+-------------------------------------------+---------+--------------------------------------------+------------+----------+------------+---------+-----------+----------------+--------------------------------------------------------------------+---------------------------+----------------------------------+--------------+------------------------------+
| Exchange#Binance#Account#main | TransferDate#12/17/2022 4:59:12 PM +02:00 | main | 0xF76d3f20bF155681b0b983bFC3ea5fe43A2A6E3c | null | Binance | USDT | ETH | 97.500139 | 3.2 | 0x46d28f7d0e1e5b1d074a65dcfbb9d90b3bcdc7e6fca6b1f1f7abb5ab219feb24 | 2022-12-17T16:59:12+02:00 | 1b56485f6a3446c3b883f4f485039260 | 0 | 2023-01-28T20:19:59.9181573Z |
| Exchange#Binance#Account#main | TransferDate#12/17/2022 5:38:23 PM +02:00 | main | 0xF76d3f20bF155681b0b983bFC3ea5fe43A2A6E3c | null | Binance | USDT | ETH | 3107.4889 | 3.2 | 0xbb2b92030b988a0184ba02e2e754b7a7f0f963c496c4e3473509c6fe6b54a41d | 2022-12-17T17:38:23+02:00 | 4747f6ecc74f4dd8a4b565e0f15bcf79 | 0 | 2023-01-28T20:20:00.4536839Z |
| Exchange#FTX#Account#main | TransferDate#12/17/2021 5:38:23 PM +02:00 | main | 0x476d3f20bF155681b0b983bFC3ea5fe43A2A6E3c | null | FTX | USDT | ETH | 20 | 3.2 | 0xaa2b92030b988a0184ba02e2e754b7a7f0f963c496c4e3473509c6fe6b54a41d | 2021-12-17T17:38:23+02:00 | 4747f6ecc74f4dd8a4b565e0f15bcf79 | 0 | 2023-01-28T20:20:00.5723855Z |
| Exchange#FTX#Account#main | TransferDate#12/19/2022 4:59:12 PM +02:00 | main | 0xc46d3f20bF155681b0b983bFC3ea5fe43A2A6E3c | null | FTX | USDT | ETH | 15 | 3.2 | 0xddd28f7d0e1e5b1d074a65dcfbb9d90b3bcdc7e6fca6b1f1f7abb5ab219feb24 | 2022-12-19T16:59:12+02:00 | 1b56485f6a3446c3b883f4f485039260 | 0 | 2023-01-28T20:20:00.5207119Z |
+-------------------------------+-------------------------------------------+---------+--------------------------------------------+------------+----------+------------+---------+-----------+----------------+--------------------------------------------------------------------+---------------------------+----------------------------------+--------------+------------------------------+
public async Task<UserTransferDto?> GetLastAsync(string exchange, string account)
{
    var queryRequest = new QueryRequest
    {
        TableName = TableName,
        KeyConditionExpression = "#pk = :pk",
        ExpressionAttributeNames = new Dictionary<string, string>
        {
            { "#pk", "PK" }
        },
        ExpressionAttributeValues = new Dictionary<string, AttributeValue>
        {
            { ":pk", new AttributeValue { S = $"Exchange#{exchange}#Account#{account}" } }
        },
        ScanIndexForward = false,
        Limit = 1
    };

    var response = await _dynamoDb.QueryAsync(queryRequest);
    if (response.Items.Count == 0)
    {
        return null;
    }

    var itemAsDocument = Document.FromAttributeMap(response.Items[0]);
    return JsonSerializer.Deserialize<UserTransferDto>(itemAsDocument.ToJson());
}
Edit: I realized I needed the transfer type too, so I changed the SK to TransferType#Withdraw#TransferDate#2022-12-17 14:59:12
and now the code looks like this:
public async Task<UserTransferDto?> GetLastAsync(string exchange, string account, TransferType transferType)
{
    var queryRequest = new QueryRequest
    {
        TableName = TableName,
        KeyConditionExpression = "#pk = :pk and begins_with(#sk, :sk)",
        ExpressionAttributeNames = new Dictionary<string, string> { { "#pk", "PK" }, { "#sk", "SK" } },
        ExpressionAttributeValues = new Dictionary<string, AttributeValue>
        {
            { ":pk", new AttributeValue { S = $"Exchange#{exchange}#Account#{account}" } },
            { ":sk", new AttributeValue { S = $"TransferType#{transferType}" } }
        },
        ScanIndexForward = false,
        Limit = 1
    };

    var response = await _dynamoDb.QueryAsync(queryRequest);
    if (response.Items.Count == 0)
    {
        return null;
    }

    var itemAsDocument = Document.FromAttributeMap(response.Items[0]);
    return JsonSerializer.Deserialize<UserTransferDto>(itemAsDocument.ToJson());
}
My personal attitude is not to change anything if:
It works right now
I don't see any future risks to my system
There are no benefits in performance/durability/maintenance/ease of support
Coming to your question, I don't see how merging the two tables can help you.
I'd argue that your current table structure won't help you cover these queries:
Get the last user transfer for a given exchange and an account name.
Get all user trades by date range.
Get all user transfers by date range.
If I understand correctly, you don't need to query anything by your current PK, so why did you choose it? Of course you can solve this with GSIs, but you won't get them for free, and if you can restructure your table to make your three queries easier, I'd do that.
Hope I answered your question. Good luck, have fun :)
A query for the newest/latest item for a given partition key (pk) is possible if the sort key (sk) has a lexicographically sortable date/time representation, e.g. ISO 8601. This works because items with the same pk value are stored in sorted order by sk.
To do this, simply issue a query for the pk + sk prefix, and supply Limit=1 and ScanIndexForward=False. For example:
KeyConditionExpression = "pk=:pk and begins_with(sk, :trade_prefix)"
Limit=1
ScanIndexForward=False
You can query for items with a given partition key (pk) and a sort key (sk) within a given range of dates using a key condition expression of the form a BETWEEN b AND c.
For example:
KeyConditionExpression = "pk=:pk and sk between :t1 and :t2"
On the question of using DynamoDB as a simple key/value store where the value is a JSON blob, it depends on your access requirements. Some reasons not to have a single JSON blob are that:
any update to the item means you have to rewrite the whole item, which costs more WCUs
you can't create GSIs on most of the item's attributes because they're not top-level
it can make concurrent puts more challenging

WPF Datagrid or maybe something else that can have indents in the first column?

I have a table with a bunch of data; this is a small sample:
| Header - Object name                        | Waarde |
|---------------------------------------------|--------|
| ExportRoot.a80_02_00_01_Status.Settings.Off | 0      |
| ExportRoot.a80_02_00_01_Status.Standby      | 0      |
| ExportRoot.a80_02_00_01_Status.Cooling      | 1      |
| ExportRoot.a80_02_00_01_Status.Drying       | 0      |
| ExportRoot.a80_02_00_01_Status.Heating      | 0      |
| ExportRoot.a80_02_00_01_Status.PrepDrain    | 0      |
I wish to style this table in a way that makes it more readable, by breaking up the object name and indenting the table like below.
| Header - Object name | Waarde |
|--------------------------|---------|
| ExportRoot | |
| | a80_02_00_01_Status | |
| | | Settings | |
| | | | Off | 0 |
| | | | Standby | 0 |
| | | | Cooling | 1 |
| | | | Drying | 0 |
| | | | Heating | 0 |
| | | | PrepDrain | 0 |
Is there a way of doing this, and if so, what should I be looking for? All I've found so far are tables with sub-tables, and that's not exactly what I'm looking for. If anyone has an idea, let me know; I'm super lost.
So basically you need a TreeListView:
There is one project on CodeProject that you can check:
WPF TreeListView Control
There are also commercial products, like DevExpress:
DevExpress TreeList View
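Whichever control you end up using, the flat dotted paths first have to be turned into a hierarchy the control can bind to. Below is a minimal sketch of that step; the class and property names are mine, and it assumes a simple name/value/children model rather than any particular control's API:

using System.Collections.Generic;
using System.Collections.ObjectModel;

// One row in the tree: a path segment plus a value for leaf rows.
public class ObjectNode
{
    public string Name { get; set; }
    public string Waarde { get; set; }   // only set on leaf nodes
    public ObservableCollection<ObjectNode> Children { get; } = new ObservableCollection<ObjectNode>();
}

public static class ObjectTreeBuilder
{
    // Builds a tree from rows like ("ExportRoot.a80_02_00_01_Status.Settings.Off", "0").
    public static ObservableCollection<ObjectNode> Build(IEnumerable<(string Path, string Waarde)> rows)
    {
        var roots = new ObservableCollection<ObjectNode>();
        foreach (var (path, waarde) in rows)
        {
            var level = roots;
            ObjectNode node = null;
            foreach (var segment in path.Split('.'))
            {
                node = FindOrAdd(level, segment);   // reuse an existing branch or create it
                level = node.Children;
            }
            if (node != null) node.Waarde = waarde; // the last segment carries the value
        }
        return roots;
    }

    private static ObjectNode FindOrAdd(ObservableCollection<ObjectNode> level, string name)
    {
        foreach (var existing in level)
            if (existing.Name == name) return existing;

        var created = new ObjectNode { Name = name };
        level.Add(created);
        return created;
    }
}

The resulting root collection could then be bound to the hierarchical items source of whichever tree/grid control you choose, with Name in the first column and Waarde in the second.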

Notify in .NET application if SQL data has changed

I have two tables. Table A contains the main data, and Table B holds the detail data for every change of state.
Table A
| ID | Name |
| 1 | House |
| 2 | Tree |
| 3 | Car |
Table B
| ID | FK | DateTime | Color | Type |
| 1 | 1 | 2017-26-01T13:00:00 | Red | Bungalow |
| 2 | 2 | 2017-26-01T13:05:00 | Brown | Oak |
| 3 | 1 | 2017-26-01T13:10:00 | Green | Bungalow |
| 4 | 3 | 2017-26-01T13:15:00 | Yellow | Smart |
| 5 | 1 | 2017-26-01T13:20:00 | White | Bungalow |
| 6 | 3 | 2017-26-01T13:25:00 | Black | Smart |
Result to watch
| ID | Name | DateTime | Color | Type |
| 1 | House | 2017-26-01T13:20:00 | White | Bungalow |
| 2 | Tree | 2017-26-01T13:05:00 | Brown | Oak |
| 3 | Car | 2017-26-01T13:25:00 | Black | Smart |
The current state of an entity in Table A is described by the record with the most recent timestamp in Table B.
Now, I want to be notified if an entity gets a new state (a new record in Table B) or a new entity is created (new records in Tables A and B), i.e. the result could then look like this:
New result to watch
| ID | Name | DateTime | Color | Type |
| 1 | House | 2017-26-01T13:20:00 | White | Bungalow |
| 2 | Tree | 2017-26-01T13:05:00 | Brown | Oak |
| 3 | Car | 2017-26-01T19:25:00 | Silver | Smart |
| 4 | Dog | 2017-26-01T20:25:00 | White / Black | Poodle |
With SqlDependency it is not possible to be notified for a statement which contains GROUP BY with MAX aggregate, window functions or TOP clause. So I have no idea how I can get the last detail data for an entity.
Is there any way to write a statement for this requirement, or are there other ways to be notified when this result changes?
You can create and use SQL triggers using the CLR.
Here - SQL Trigger CLR
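Whatever ends up firing the notification (a CLR trigger, polling, etc.), the "result to watch" itself is an ordinary latest-row-per-entity query that can be re-run from C# whenever a change is signalled. A rough sketch is below; the table and column names follow the question, while the connection handling is an assumption:

using System;
using System.Data.SqlClient;   // or Microsoft.Data.SqlClient

static void PrintCurrentStates(string connectionString)
{
    // Latest Table B row per Table A entity, i.e. the "result to watch".
    const string sql = @"
        SELECT a.ID, a.Name, b.[DateTime], b.Color, b.[Type]
        FROM TableA AS a
        JOIN TableB AS b ON b.FK = a.ID
        JOIN (SELECT FK, MAX([DateTime]) AS MaxDateTime
              FROM TableB
              GROUP BY FK) AS latest
            ON latest.FK = b.FK AND latest.MaxDateTime = b.[DateTime];";

    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(sql, connection))
    {
        connection.Open();
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                // e.g. "1 | House | ... | White | Bungalow"
                Console.WriteLine($"{reader["ID"]} | {reader["Name"]} | " +
                                  $"{reader["DateTime"]} | {reader["Color"]} | {reader["Type"]}");
            }
        }
    }
}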

merge single table column with different types

Table columns:
or_cv_no - this is where 'cv_no' and 'or_no' are stored
type - the different types are 'CV' and 'OR'
amount - this is where 'cv_amt' and 'or_amt' are stored
date - the data being fetched all has the same date
My query is this:
SELECT IIF(type = 'CV', or_cv_no, NULL) AS cv_no,
       IIF(type = 'CV', amount, NULL) AS cv_amt,
       IIF(type = 'OR', or_cv_no, NULL) AS or_no,
       IIF(type = 'OR', amount, NULL) AS or_amt
FROM transacAmt
and the output is
How can I output it to look like this sample:
| cv_no | cv_amt | or_no | or_amt |
|---------------------------------|
| 1234 | 1,500 | 4321 | 1,200 |
|-------+--------+-------+--------|
| 7777 | 1,700 | 2222 | 1,760 |
|-------+--------+-------+--------|
| 1111 | 1,900 | 3333 | 1,530 |
|-------+--------+-------+--------|
| | | 5355 | 1,420 |
|-------+--------+-------+--------|
| | | 1355 | 4,566 |
EDIT: The output must be like this: the cells marked 'x' must be removed or filled with data.

Creating Database Relationship programmatically

I have two separate databases that hold data that coincides with one another; however, they are not relational.
I'm trying to programmatically create a relationship between them to build some statistics in C#. I'm looking for the number of actions in a case and the assets associated with it.
From one database I can see what asset belongs to what case:
| 7AU45448 | cases/unchanged/
| 7AI61361 | cases/unchanged/
| 8C52A5A1 | cases/unchanged/
| 8643Y053 | cases/unchanged/
| 8643Y052 | cases/unchanged/
| 8643Y051 | cases/unchanged/
| 8643Y050 | cases/unchanged/
| B4F043RB | cases/ups01/
| B4F043R7 | cases/ups01/
| B4F043R5 | cases/ups01/
| B4F043QZ | cases/ups01/
| B4F043QY | cases/ups01/
| B4F043RA | cases/ups01/
| B4F043R1 | cases/ups01/
| B4F043R8 | cases/ups01/
| B4F043R9 | cases/ups01/
| B4F043QX | cases/ups01/
| B4F043R3 | cases/ups01/
| B4F043QW | cases/ups01/
| B4F043R4 | cases/ups01/
| B4F043RC | cases/ups01/
| B4F043R2 | cases/ups01/
| B4F043R0 | cases/ups01/
| B4F043RD | cases/ups01/
| B4F043R6 | cases/ups01/
The other database is for logs and holds no information on the case itself; only the asset and the detail are inside.
The information in this database looks like:
7AU45448 | Processed file
7AU45448 | Download file
7AU45448 | View file
I can easily do an action count per asset on that database, but not per case. This is why I need the relationship.
If anyone has any ideas or suggestions, please let me know!
Thanks in advance!
Since your definition of "not relational" was merely meant to be "without constraints", you should be able to compare data in two different databases as long as the field you're joining on is the same data type. Just make sure your left table is the table with the values you care about if you use a LEFT OUTER JOIN. In this case, [db1].[dbo].[table1] is the left table.
Example:
SELECT [db1].[dbo].[table1].*, [db2].[dbo].[table2].*
FROM [db1].[dbo].[table1]
LEFT OUTER JOIN [db2].[dbo].[table2] ON [db1].[dbo].[table1].[field_in_db1_table1] = [db2].[dbo].[table2].[field_in_db2_table2]
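From C#, that same cross-database join can be executed and aggregated to get the action count per case. A rough sketch along those lines is below; the [db1]/[db2] object names follow the example above, while the asset/case/action column names are placeholders, since the question doesn't give the real schema:

using System;
using System.Collections.Generic;
using System.Data.SqlClient;   // or Microsoft.Data.SqlClient

// Counts log actions per case by joining the asset-to-case mapping in db1
// against the asset log in db2 on the shared asset id.
static Dictionary<string, int> CountActionsPerCase(string connectionString)
{
    const string sql = @"
        SELECT t1.[case_path] AS CasePath, COUNT(t2.[action]) AS Actions
        FROM [db1].[dbo].[table1] AS t1
        LEFT OUTER JOIN [db2].[dbo].[table2] AS t2
            ON t1.[asset_id] = t2.[asset_id]
        GROUP BY t1.[case_path];";

    var counts = new Dictionary<string, int>();
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(sql, connection))
    {
        connection.Open();
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
                counts[(string)reader["CasePath"]] = (int)reader["Actions"];
        }
    }
    return counts;
}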
