LINQ convert SQL Server DateTime to string using a DTO - C#

I've been tasked with adding a page to an API that we didn't build but now maintain. The API uses C# MVC5. I don't really know MVC5, but I'm attempting to keep the same design pattern that the rest of the API uses. I have to add functionality that lets a user on the front end upload a file to a server, where it will be processed and inserted into a SQL Server DB. The functionality will also return a list of all the file names and statuses of the imported files from a table in the DB.
The issue I'm having is converting a DateTime to a string in the LINQ query that pulls the list of files.
I've tried the approach from this answer, LINQ convert DateTime to string, but with no luck.
This is what I have so far; I've marked the line that is causing the issue:
[Route("ImportLogs", Name ="GetImportLogs")]
[HttpGet]
[ResponseType(typeof(List<IHttpActionResult>))]
public IHttpActionResult GetImportLogs()
{
var query =
from dbitem in db.ImportLogs
orderby dbitem.import_log_id
select new ImportLogDto()
{
Id = dbitem.import_log_id,
FileName = dbitem.import_file_name,
ImportTimeStamp = dbitem.import_timeStamp,
ImportStatus = dbitem.import_status,
ImportMessage = dbitem.import_message
};
query.ToList()
.Select(o => new ImportLogDto
{
Id = o.Id,
FileName = o.FileName,
ImportMessage = o.ImportMessage,
ImportStatus = o.ImportStatus,
ImportTimeStamp = o.ImportTimeStamp.ToString() //PROBLEM LINE
});
return Ok(query);
}
The error that I'm getting is
Cannot implicitly convert type 'string' to 'System.DateTime'
What am I doing wrong? Any help would be appreciated. TIA.
EDIT:
Here is the DTO:
public class ImportLogDto
{
    public int Id { get; set; }
    public string FileName { get; set; }
    public DateTime ImportTimeStamp { get; set; }
    public string ImportStatus { get; set; }
    public string ImportMessage { get; set; }
}

You are trying to assign a string to a DateTime property, which is why you get that error. If you want it to be a string, change your model as follows:
public class ImportLogDto
{
    public int Id { get; set; }
    public string FileName { get; set; }
    public string ImportTimeStamp { get; set; } // Changed type
    public string ImportStatus { get; set; }
    public string ImportMessage { get; set; }
}
And then your query:
var query = (from dbitem in db.ImportLogs
             orderby dbitem.import_log_id
             select new
             {
                 dbitem.import_log_id,
                 dbitem.import_file_name,
                 dbitem.import_timeStamp, // Still DateTime
                 dbitem.import_status,
                 dbitem.import_message
             })
            .AsEnumerable() // Retrieved from the database
            .Select(o => new ImportLogDto
            {
                Id = o.import_log_id,
                FileName = o.import_file_name,
                ImportMessage = o.import_message,
                ImportStatus = o.import_status,
                ImportTimeStamp = o.import_timeStamp.ToString() // Converted to string
            });
If you want to change the format of the DateTime for the API, then use the overload of ToString that takes a format string:
ImportTimeStamp = o.ImportTimeStamp.ToString("dd/MM/yyyy HH:mm:ss")
For more on the overload read: Custom Date and Time Format Strings
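For example (an illustrative snippet, not from the original code), you could use either a custom pattern or the round-trip "o" specifier, and pass a culture to avoid surprises with date separators:

// Requires: using System.Globalization;
ImportTimeStamp = o.import_timeStamp.ToString("dd/MM/yyyy HH:mm:ss", CultureInfo.InvariantCulture)
// or the ISO 8601 / round-trip format, which is often a better fit for APIs:
ImportTimeStamp = o.import_timeStamp.ToString("o")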

Your types are already DateTime, and since these types are tied to the backing data* you probably shouldn't change them. But where you return the values on the API you can really return anything you want. So an anonymous type could be a quick solution:
.Select(o => new // <--- notice no type definition here
{
    Id = o.Id,
    FileName = o.FileName,
    ImportMessage = o.ImportMessage,
    ImportStatus = o.ImportStatus,
    ImportTimeStamp = o.ImportTimeStamp.ToString()
})
The compiler will infer from the type returned by .ToString() that you want ImportTimeStamp to be a string. You can then pass a format string to .ToString() to customize the output however you like.
*If your DTO isn't actually tied to the database then you can change the type there from DateTime to string, of course. It's not really clear from the context of the code shown whether these are data DTOs or application DTOs.
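Putting the pieces together, a minimal sketch of the whole action using this anonymous projection could look like the following (my illustration, reusing the question's db context and column names; the ToList() forces the query to run first, because Entity Framework cannot translate ToString() into SQL):

[Route("ImportLogs", Name = "GetImportLogs")]
[HttpGet]
public IHttpActionResult GetImportLogs()
{
    var logs = db.ImportLogs
        .OrderBy(l => l.import_log_id)
        .ToList() // materialize first; ToString() cannot be translated to SQL
        .Select(l => new
        {
            Id = l.import_log_id,
            FileName = l.import_file_name,
            ImportMessage = l.import_message,
            ImportStatus = l.import_status,
            ImportTimeStamp = l.import_timeStamp.ToString("o")
        });

    return Ok(logs);
}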


Filter data from a virtual foreign key on an SQL query

I am relatively new to lambda/LINQ; I want to retrieve all events from a specific calendar where the events are still in the future...
If I use:
EventCalendar eventCalendar;
eventCalendar = db.Events_Calendars.Find(id);
I can get all events and filter by the current date in the view, but I don't think that is the best method.
The Model is as follows:
[Table("Events_Calendars")]
public class EventCalendar
{
public int Id { get; set; }
public string Calendar { get; set; }
public virtual List<Event> Events { get; set; }
}
Event Model is:
public class Event
{
    public int Id { get; set; }
    public string Title { get; set; }
    public string Description { get; set; }
    public DateTime Start { get; set; }
    public DateTime End { get; set; }
    public int? Capacity { get; set; }
    .
    .
    .
}
One of my failed attempts at this issue is:
eventCalendar = db.Events_Calendars.Where(x => x.Events.Any(y => y.End >= DateTime.Today));
But it is giving me "Cannot implicitly convert type 'System.Linq.IQueryable<...Models.EventCalendar>' to '...Models.EventCalendar'".
EDIT: Added the declaration line... EventCalendar eventCalendar;
Your Events_Calendars.Where returns an IQueryable<Models.EventCalendar>, that is, a "query source" of Models.EventCalendar items, but Events_Calendars.Find returns a single Models.EventCalendar, so you cannot convert a set of Models.EventCalendar to a Models.EventCalendar. You can declare a new variable eventCalendars and store the filtered items in it:
var eventCalendars = db.Events_Calendars.Where(x => x.Events.Any(y => y.End >= DateTime.Today)).ToList();
You can also read about the difference between IQueryable and IEnumerable.
After your Where clause, add .ToList(), and declare the result as either var list = ... or List<EventCalendar> list = ...
The LINQ Where method returns a sequence of matching entities. You are trying to assign it to a variable defined as a single entity. If you define eventCalendar as type IQueryable<Models.EventCalendar> instead of Models.EventCalendar, the LINQ statement will work.
Or, alternatively, you can define eventCalendar as the more robust List<Models.EventCalendar> and use the LINQ method ToList() at the end of your query, i.e. eventCalendar = db.Events_Calendars.Where(x => x.Events.Any(y => y.End >= DateTime.Today)).ToList();
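If the goal is really a single calendar's upcoming events, a small sketch along these lines may be closer to what you want (my illustration, reusing the question's db context and models and assuming the calendar is looked up by id):

// Load the one calendar by its key, then keep only events that have not ended yet.
EventCalendar eventCalendar = db.Events_Calendars.Find(id);
List<Event> upcomingEvents = eventCalendar.Events
    .Where(e => e.End >= DateTime.Today)
    .OrderBy(e => e.Start)
    .ToList();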

AutoMapper: mapping many properties into one

The scenario is the following: I receive a message containing a lot of variables, several hundred. I need to write this to Azure Table storage, where the partition key is the name of the individual variable and its value gets mapped to, e.g., Value.
Let’s say the payload looks like the following:
public class Payload
{
    public long DeviceId { get; set; }
    public string Name { get; set; }
    public double Foo { get; set; }
    public double Rpm { get; set; }
    public double Temp { get; set; }
    public string Status { get; set; }
    public DateTime Timestamp { get; set; }
}
And my TableEntity looks like this:
public class Table : TableEntity
{
    public Table(string partitionKey, string rowKey)
    {
        this.PartitionKey = partitionKey;
        this.RowKey = rowKey;
    }

    public Table() { }

    public long DeviceId { get; set; }
    public string Name { get; set; }
    public double Value { get; set; }
    public string Signal { get; set; }
    public string Status { get; set; }
}
In order to write that to Table storage, I need to
var table = new Table(primaryKey, payload.Timestamp.ToString(TimestampFormat))
{
    DeviceId = payload.DeviceId,
    Name = payload.Name,
    Status = payload.Status,
    Value = value (payload.Foo or payload.Rpm or payload.Temp),
    Signal = primarykey/Name of variable ("foo" or "rmp" or "temp"),
    Timestamp = payload.Timestamp
};
var insertOperation = TableOperation.Insert(table);
await this.cloudTable.ExecuteAsync(insertOperation);
I don't want to copy this 900 times (or however many variables there happen to be in the payload message; this is a fixed number).
I could make a method to create the table, but I will still have to call this 900 times.
I thought maybe AutoMapper could help out.
Are they always the same variables? A different approach could be to use DynamicTableEntity, in which you basically have a TableEntity where you can fill out all additional fields after the RowKey/PartitionKey duo:
var tableEntity = new DynamicTableEntity();
tableEntity.PartitionKey = "partitionkey";
tableEntity.RowKey = "rowkey";

// Each JSON property becomes a column on the dynamic entity.
var json = JObject.Parse("{bunch:'of',stuff:'here'}");
foreach (var item in json.Properties())
{
    tableEntity.Properties.Add(item.Name, EntityProperty.CreateEntityPropertyFromObject(item.Value.ToObject<object>()));
}
// Save etc
The problem is mapping these properties, right?
Value = value (payload.Foo or payload.Rpm or payload.Temp),
Signal = primarykey/Name of variable ("foo" or "rmp" or "temp"),
This conditional mapping can be done via Reflection:
object payload = new A { Id = 1 };
object value = TryGetPropertyValue(payload, "Id", "Name"); //returns 1
payload = new B { Name = "foo" };
value = TryGetPropertyValue(payload, "Id", "Name"); //returns "foo"
public object TryGetPropertyValue(object obj, params string[] propertyNames)
{
    foreach (var name in propertyNames)
    {
        PropertyInfo propertyInfo = obj.GetType().GetProperty(name);
        if (propertyInfo != null) return propertyInfo.GetValue(obj);
    }
    throw new ArgumentException();
}
You may map the rest of the properties (those with equal names in source and destination) with an AutoMapper.Mapper.DynamicMap call instead of AutoMapper.Mapper.Map to avoid creating hundreds of configuration maps. Or just cast your payload to dynamic and map it manually.
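To address the "copy this 900 times" concern, here is a rough sketch of my own (not part of the original answer) that combines TryGetPropertyValue with a list of signal names; it assumes the Payload/Table classes, TimestampFormat and cloudTable from the question, and that every signal property is a double:

// Signal names are assumed to match the Payload property names.
string[] signals = { "Foo", "Rpm", "Temp" /* , ... */ };

foreach (var signal in signals)
{
    var row = new Table(signal, payload.Timestamp.ToString(TimestampFormat))
    {
        DeviceId = payload.DeviceId,
        Name = payload.Name,
        Status = payload.Status,
        Signal = signal,
        Value = (double)TryGetPropertyValue(payload, signal) // reflection lookup per signal
    };

    await this.cloudTable.ExecuteAsync(TableOperation.Insert(row));
}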
You can create a DynamicTableEntity from your Payload objects with one or two lines of code using the TableEntity.Flatten method in the SDK, or use the ObjectFlattenerRecomposer NuGet package if you are also worried about ICollection type properties. Assign it a PartitionKey/RowKey and write the flattened Payload object into the table as a DynamicTableEntity. When you read it back, read it as a DynamicTableEntity and use the TableEntity.ConvertBack method to recreate the original object. You don't even need that intermediate Table class.
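A small sketch of that round trip (my illustration; it assumes a storage SDK version that exposes TableEntity.Flatten/ConvertBack, and the partition/row key choice here is arbitrary):

// Write: flatten the Payload into a property dictionary and store it as a DynamicTableEntity.
IDictionary<string, EntityProperty> flattened = TableEntity.Flatten(payload, new OperationContext());
var entity = new DynamicTableEntity(payload.Name, payload.Timestamp.ToString("o"))
{
    Properties = flattened
};
await cloudTable.ExecuteAsync(TableOperation.InsertOrReplace(entity));

// Read: fetch it back as a DynamicTableEntity and rebuild the original Payload.
var result = await cloudTable.ExecuteAsync(
    TableOperation.Retrieve<DynamicTableEntity>(payload.Name, payload.Timestamp.ToString("o")));
var stored = (DynamicTableEntity)result.Result;
Payload roundTripped = TableEntity.ConvertBack<Payload>(stored.Properties, new OperationContext());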

Automapper object to object

UPDATE 2:
It seems to be a problem with the value contained in Format.
The debugger shows me "{{MinValue: 6, MaxValue: 44}}" for the Format property.
When I change this to .Format = new MyFormat(6, 44) it works fine, so maybe it's a problem with the Web API 2 I'm using... but anyway, it should just push the source object to the destination object ;)
UPDATE:
I removed the original post and added some real code here. The other classes were only dummies to explain my problem.
By the way: the problem only occurs when I add the "Format" property of type object to the class; otherwise it works fine.
I tried to create a simple example to keep things easy ;) but here are parts of my real code:
private void InitAutoMapper()
{
    if (_mappingInitialized) // static init
        return;

    // Prevents Mapper from trying to map constructor parameters; with copy constructors
    // this would otherwise cause infinite loops and StackOverflowExceptions.
    Mapper.Configuration.DisableConstructorMapping();

    // From client to server
    ...
    Mapper.CreateMap<SDK.Model.Form.Field, Field>();

    // From server to client
    ...
    Mapper.CreateMap<Field, SDK.Model.Form.Field>();
    ...

    // Checks that the configuration is correct.
    Mapper.AssertConfigurationIsValid();
    _mappingInitialized = true;
}
and this is the call
public string CreateEntityField(SDK.Model.Form.Field field)
{
    var mappedField = Mapper.Map<Field>(field);
    ...
}
My Field class looks like this (the source and destination classes look exactly the same; they are just separated into two different namespaces so that different properties can be added in the future):
public class Field : IEntityRelatedEntity, IModificationTrackObject
{
    public string Id { get; set; }
    public string Name { get; set; }
    public string DisplayName { get; set; }
    public string Description { get; set; }
    public DateTime CreatedOn { get; set; }
    public DateTime ModifiedOn { get; set; }
    public int Status { get; set; }
    public int SubStatus { get; set; }
    // ...many more fields
    public FieldType Type { get; set; }
    public object Format { get; set; }
    public FieldRequirement Required { get; set; }

    public Field()
    {
        Name = null;
        Description = string.Empty;
        DisplayName = null;
        Type = FieldType.Text;
        Required = FieldRequirement.Optional;
        CreatedOn = new DateTime();
        ModifiedOn = new DateTime();
    }

    public Field(Field copy)
    {
        Id = copy.Id;
        Format = copy.Format;
        ...
    }
}
And this is the exception I get (sorry, parts of this exception are in German, but it means that the source and destination types of Format are not the same):
exception =
{
    Mapping types:
    JObject -> JObject
    Newtonsoft.Json.Linq.JObject -> Newtonsoft.Json.Linq.JObject

    Destination path:
    Field.Format.Format

    Source value:
    {
        "MinValue": "2",
        "MaxValue": "100"
    }
}
The mapper should just copy the source to the destination for the "Format" property and shouldn't care about what's in there...
One more thing: when I Ignore the Format property, everything works fine, too:
Mapper.CreateMap<SDK.Model.Form.Field, Field>().ForMember(m => m.Format, opt => opt.Ignore());

MongoDB data loss when running a web service in C#

I have an entity class:
public class City
{
    [BsonId]
    public string id { get; set; }
    public int value { get; set; }
}
I created a web service, and there is a web method:
[WebMethod]
public List<City> linQuery()
{
    MongoConn dao = new MongoConn();
    MongoServer mongo = dao.getConnection();
    List<City> list = new List<City>();
    string dbName = dao.dbName();
    var db = mongo.GetDatabase(dbName);
    Console.WriteLine(db);

    using (mongo.RequestStart(db))
    {
        var collection = db.GetCollection<City>("cityMap");
        IQueryable<City> cities = collection.AsQueryable().Where(c => c.value > 1200);
        foreach (City city in cities)
            list.Add(city);
        return list;
    }
}
When I run the service, I get this error:
System.IO.FileFormatException: An error occurred while deserializing the value property of class WebService1.City: Truncation resulted in data loss. ---> MongoDB.Bson.TruncationException: Truncation resulted in data loss.
How can I solve this problem?
Thanks for your help.
You typically don't want your model to know or care how it's being stored, so peppering it with MongoDB.Bson attributes is not ideal. You can configure how MongoDB.Driver should read and store decimals with this config instead.
BsonSerializer.RegisterSerializer(
    typeof(decimal),
    new DecimalSerializer(BsonType.Double,
        new RepresentationConverter(
            true, // allow overflow, return decimal.MinValue or decimal.MaxValue instead
            true  // allow truncation
        ))
);
source
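The registration only needs to run once, before the driver is used; for example (an illustrative placement, not from the original answer), in Application_Start for a classic ASP.NET service:

// Global.asax.cs (illustrative)
protected void Application_Start()
{
    // Register once, before any serialization happens.
    BsonSerializer.RegisterSerializer(
        typeof(decimal),
        new DecimalSerializer(BsonType.Double,
            new RepresentationConverter(true, true))); // allow overflow, allow truncation
}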
Did you try setting AllowTruncation = true for your properties in the City class?
public class City
{
    [BsonId]
    [BsonRepresentation(BsonType.Int32, AllowTruncation = true)]
    public string id { get; set; }

    [BsonRepresentation(BsonType.Int32, AllowTruncation = true)]
    public int value { get; set; }
}
See also: How do I set the serialization options for the geo values using the official 10gen C# driver?

Reuse index transformer expressions fails on Id

I'm looking to be able to reuse some of the transform expressions from indexes so I can perform identical transformations in my service layer when the document is already available.
For example, whether it's by a query or by transforming an existing document at the service layer, I want to produce a ViewModel object with this shape:
public class ClientBrief
{
    public int Id { get; set; }
    public string FullName { get; set; }
    public string Email { get; set; }
    // elided
}
From this document model:
public class Client
{
    public int Id { get; private set; }
    public CompleteName Name { get; private set; }
    public Dictionary<EmailAddressKey, EmailAddress> Emails { get; private set; }
    // elided
}

public class CompleteName
{
    public string Title { get; set; }
    public string GivenName { get; set; }
    public string MiddleName { get; set; }
    public string Initials { get; set; }
    public string Surname { get; set; }
    public string Suffix { get; set; }
    public string FullName { get; set; }
}

public enum EmailAddressKey
{
    EmailAddress1,
    EmailAddress2,
    EmailAddress3
}

public class EmailAddress
{
    public string Address { get; set; }
    public string Name { get; set; }
    public string RoutingType { get; set; }
}
I have an expression to transform a full Client document to a ClientBrief view model:
static Expression<Func<IClientSideDatabase, Client, ClientBrief>> ClientBrief = (db, client) =>
    new ClientBrief
    {
        Id = client.Id,
        FullName = client.Name.FullName,
        Email = client.Emails.Select(x => x.Value.Address).FirstOrDefault()
        // elided
    };
This expression is then manipulated using an expression visitor so it can be used as the TransformResults property of an index (Client_Search) which, once it has been generated at application startup, has the following definition in Raven Studio:
Map:
docs.Clients.Select(client => new {
    Query = new object[] {
        client.Name.FullName,
        client.Emails.SelectMany(x => x.Value.Address.Split(new char[] {
            '#'
        })) // elided
    }
})
(The Query field is analysed.)
Transform:
results.Select(result => new {
    result = result,
    client = Database.Load(result.Id.ToString())
}).Select(this0 => new {
    Id = this0.client.__document_id,
    FullName = this0.client.Name.FullName,
    Email = DynamicEnumerable.FirstOrDefault(this0.client.Emails.Select(x => x.Value.Address))
})
However, the transformation expression used to create the index can then also be used in the service layer locally when I already have a Client document:
var brief = ClientBrief.Compile().Invoke(null, client);
It allows me to have only one piece of code that understands the mapping from Client to ClientBrief, whether that code is running in the database or in the client app. It all seems to work OK, except the query results all have an Id of 0.
How can I get the Id property (integer) properly populated in the query?
I've read a number of similar questions here but none of the suggested answers seem to work. (Changing the Ids to strings from integers is not an option.)
I have a hard time following your sample fully; really, the best way to dig into this would be with a failing, self-contained unit test.
Nonetheless, let's see if I can pull out the important bits.
In the transform, you have two areas where you are working with the id:
...
client = Database.Load(result.Id.ToString())
...
Id = this0.client.__document_id,
...
The result.Id in the first line and the Id = in the second line are expected to be integers.
Database.Load() expects a string document key, and that is also what you see in __document_id.
The confusion comes from the fact that Raven's documentation, code, and examples all use the terms id and key interchangeably, but this only holds when you use string identifiers. When you use non-string identifiers, such as ints or guids, the id may be 123, but the document key is still clients/123.
So try changing your transform so it translates:
...
client = Database.Load("clients/" + result.Id)
...
Id = int.Parse(this0.client.__document_id.Split('/')[1]),
...
... or whatever the C# equivalent LINQ form would be.
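In plain C#, the key/id translation the answer describes would be roughly the following (illustrative helpers of my own, not Raven API; they assume the default "clients/" collection prefix):

// Convert between the integer id (123) and the Raven document key ("clients/123").
static string ToDocumentKey(int id)
{
    return "clients/" + id;
}

static int ToId(string documentKey)
{
    return int.Parse(documentKey.Split('/')[1]);
}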
