I am performing an insert operation with Entity Framework 5.
My code inserts a new row built from user input and from values parsed in another method. Debugging shows that all object attributes have a value before the insert is called, the code executes without any exception, and the row is inserted into the LocalDB, but the content column value is missing and never appears to be saved.
Here is my entity framework code:
public async Task<IHttpActionResult> PostFormData()
{
    // Await instead of blocking on .Result inside an async action,
    // which can deadlock in ASP.NET.
    var profile = await db.ProfileRepository.dbSet
        .Where(m => m.profileId == 1)
        .FirstOrDefaultAsync();

    if (!Request.Content.IsMimeMultipartContent())
    {
        throw new HttpResponseException(HttpStatusCode.UnsupportedMediaType);
    }

    var research = new Research();
    research.profile = profile;

    string root = HttpContext.Current.Server.MapPath("~/documentSafe");
    var provider = new MultipartFormDataStreamProvider(root);
    try
    {
        // Read the form data.
        await Request.Content.ReadAsMultipartAsync(provider);
        research.title = provider.FormData["title"];
        research.researchAbstract = provider.FormData["researchAbstract"];
        research.publisher = provider.FormData["publisher"];

        var areaList = new List<ResearchArea>();
        string areas = provider.FormData["researchArea"];
        foreach (var r in areas.Split(','))
        {
            var area = new ResearchArea()
            {
                name = r,
                departmentId = profile.departmentId
            };
            areaList.Add(area);
        }
        research.researchArea = areaList;

        research.researchType = new ResearchType()
        {
            name = provider.FormData["type"]
        };

        string content = WebUtility.HtmlEncode(parser.convert(provider.FileData[0]));
        research.content = content;

        using (var context = new AcademicContext())
        {
            context.Research.Add(research);
            await context.SaveChangesAsync();
        }
        return Ok();
    }
    catch (System.Exception e)
    {
        return InternalServerError(e);
    }
}
The missing column value is a parsed HTML string encoded using WebUtility, with an average size of 15,000 characters. I have checked the database column type too, and it is set to nvarchar(MAX).
If I insert sample plain text into the column, the value gets saved, but if I pass the encoded HTML string it does not.
Any suggestions why my column is not null but still does not appear to contain any value?
Solution: The SQL Server Object Explorer in Visual Studio seems to display an empty column when the character count is greater than 4000. I ran multiple tests to verify this. I could get the content of the "empty" column by running a raw SQL SELECT query.
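To confirm that the value really is stored and that only the display is truncated, you can read the column length with raw ADO.NET. This is a minimal sketch, not the original code: the table name Research, the key column researchId, and the connection string are assumptions based on the snippets above.

```csharp
using System;
using System.Data.SqlClient;

class VerifyContentColumn
{
    static void Main()
    {
        // Hypothetical LocalDB connection string - adjust to your instance.
        const string connectionString =
            @"Data Source=(LocalDb)\v11.0;Initial Catalog=Academic;Integrated Security=True";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "SELECT LEN(content), DATALENGTH(content) FROM Research WHERE researchId = @id",
            conn))
        {
            cmd.Parameters.AddWithValue("@id", 1); // hypothetical key value
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                if (reader.Read())
                {
                    // A LEN greater than 4000 together with an "empty" cell in
                    // SQL Server Object Explorer confirms it is only a display
                    // truncation, not a missing value.
                    Console.WriteLine("LEN = {0}, DATALENGTH = {1} bytes",
                        reader[0], reader[1]);
                }
            }
        }
    }
}
```

The same check should also work from the EF context via `context.Database.SqlQuery<long>(...)` if you prefer not to open a separate connection.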
Method One
_AcDb.Line oLine = new _AcDb.Line(ptStart, ptEnd);
AddToModelSpace("PLOT", oLine);
Where AddToModelSpace is:
public static void AddToModelSpace(string strLayer, _AcDb.Entity oEntity)
{
    _AcAp.Document acDoc = _AcAp.Application.DocumentManager.MdiActiveDocument;
    _AcDb.Database acCurDb = acDoc.Database;
    _AcEd.Editor ed = acDoc.Editor;
    using (_AcDb.BlockTable bt = acCurDb.BlockTableId.GetObject(_AcDb.OpenMode.ForRead) as _AcDb.BlockTable)
    using (_AcDb.BlockTableRecord ms = bt[_AcDb.BlockTableRecord.ModelSpace].GetObject(_AcDb.OpenMode.ForWrite) as _AcDb.BlockTableRecord)
        ms.AppendEntity(oEntity);
    oEntity.Layer = strLayer;
    oEntity.Dispose();
}
Method Two
// Get the current document and database
_AcAp.Document docActive = _AcAp.Application.DocumentManager.MdiActiveDocument;
_AcDb.Database docDB = docActive.Database;

// Start a transaction
using (_AcDb.Transaction acTrans = docDB.TransactionManager.StartTransaction())
{
    // Open the Block table for read
    _AcDb.BlockTable acBlkTbl;
    acBlkTbl = acTrans.GetObject(docDB.BlockTableId,
        _AcDb.OpenMode.ForRead) as _AcDb.BlockTable;

    // Open the Block table record Model space for write
    _AcDb.BlockTableRecord acBlkTblRec;
    acBlkTblRec = acTrans.GetObject(acBlkTbl[_AcDb.BlockTableRecord.ModelSpace],
        _AcDb.OpenMode.ForWrite) as _AcDb.BlockTableRecord;

    // Create the line
    using (_AcDb.Line acLine = new _AcDb.Line(ptStart, ptEnd))
    {
        // Add the new object to the block table record and the transaction
        acBlkTblRec.AppendEntity(acLine);
        acTrans.AddNewlyCreatedDBObject(acLine, true);
    }

    // Save the new object to the database
    acTrans.Commit();
}
I have used AddToModelSpace in my project so I hope it is fine!
Method Two is the way Autodesk recommends in the developer's documentation (you can read this section).
In Method One, you use the ObjectId.GetObject() method to open the BlockTable and the model space BlockTableRecord. This method uses the top transaction to open the objects, which means there must be an active transaction, and you should use it to add the newly created entity. You can get it with Database.TransactionManager.TopTransaction. If you don't want to use a transaction at all, you have to use the "for advanced use only" ObjectId.Open() method.
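As a sketch of what that would look like inside Method One's AddToModelSpace (hedged: this is a fragment, it assumes the same _AcDb/_AcAp aliases used above, and it only works while a transaction really is active):

```csharp
// Sketch: reuse the active top transaction instead of disposing the entity.
_AcDb.Database acCurDb =
    _AcAp.Application.DocumentManager.MdiActiveDocument.Database;
_AcDb.Transaction tr = acCurDb.TransactionManager.TopTransaction;
if (tr == null)
    throw new InvalidOperationException(
        "ObjectId.GetObject() requires an active (top) transaction.");

// ...after ms.AppendEntity(oEntity):
tr.AddNewlyCreatedDBObject(oEntity, true);
oEntity.Layer = strLayer;
// No oEntity.Dispose() here - the transaction now owns the entity.
```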
A Method Three could use some extension methods meant to be called from within a transaction. Here's a simplified (non-error-checking) extract of the ones I use.
static class ExtensionMethods
{
    public static T GetObject<T>(
        this ObjectId id,
        OpenMode mode = OpenMode.ForRead,
        bool openErased = false,
        bool forceOpenOnLockedLayer = false)
        where T : DBObject
    {
        return (T)id.GetObject(mode, openErased, forceOpenOnLockedLayer);
    }

    public static BlockTableRecord GetModelSpace(this Database db, OpenMode mode = OpenMode.ForRead)
    {
        return SymbolUtilityServices.GetBlockModelSpaceId(db).GetObject<BlockTableRecord>(mode);
    }

    public static ObjectId Add(this BlockTableRecord owner, Entity entity)
    {
        var tr = owner.Database.TransactionManager.TopTransaction;
        var id = owner.AppendEntity(entity);
        tr.AddNewlyCreatedDBObject(entity, true);
        return id;
    }
}
Usage example:
using (var tr = db.TransactionManager.StartTransaction())
{
    var line = new Line(startPt, endPt) { Layer = layerName };
    db.GetModelSpace(OpenMode.ForWrite).Add(line);
    tr.Commit();
}
Here's a reproduction
Using build 2.5.2910
So we store a member in the database the normal way:
await session.StoreAsync(member);
I can then sign in as that member with:
await session.LoadByUniqueConstraintAsync<Member>(m => m.Email, email);
I then do a batch update of email addresses (the batch only contains that one email address):
for (var batch = 0; (records = allRecords.Skip(batch * BatchSize).Take(BatchSize).ToList()).Any(); batch++)
{
    using (var querySession = this.documentStore.OpenSession())
    {
        var existingMembers = querySession.Query<Member, Member_ByEmail>()
            .Where(m => m.Email.In(records.Select(r => r.OldEmailAddress)))
            .ToDictionary(m => m.Email, m => m);
        using (var bulkInsertOperation = this.documentStore.BulkInsert(this.systemConfiguration.DatabaseName, new BulkInsertOptions { CheckForUpdates = true }))
        {
            foreach (var member in records)
            {
                var existingMemberKey = member.OldEmailAddress;
                var existingMemberRecord = existingMembers[existingMemberKey];
                existingMemberRecord.Email = member.EmailAddress;
                bulkInsertOperation.Store(existingMemberRecord);
            }
        }
    }
}
When I try to log in again with the new email address, this line:
await session.LoadByUniqueConstraintAsync<Member>(m => m.Email, email);
returns null.
I've checked that the new email being used is the same one in the database. The database shows the new one. I've used the database interface and queried the index for the new email, and that works.
I've set the database to wait for non-stale results and also:
store.Conventions.DefaultQueryingConsistency = ConsistencyOptions.AlwaysWaitForNonStaleResultsAsOfLastWrite;
None of these options has worked.
I'm wondering if there is something special I have to do with the bulk insert operation in order to get the .NET client to read the indexes for this new email.
I've drilled into the session variable at runtime and found a known-missing-ids field with this value:
"UniqueConstraints/members/email/Ym1hcmxleTFAbmV3b3JiaXQuY28udWs="
You need to download the Raven plugin Raven.Bundles.UniqueConstraints.dll and add it to the \Plugins\ directory in your Raven installation folder (in my case, D:\RavenDB\Plugins).
You will need to re-insert the records after doing this (I deleted the database and re-seeded).
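For completeness, the client side also needs the unique-constraints listener registered on the document store. A hedged sketch, assuming the Raven.Client.UniqueConstraints package matching your server build; the URL and database name are placeholders:

```csharp
using Raven.Client.Document;
using Raven.Client.UniqueConstraints;

// Sketch: without this listener, LoadByUniqueConstraintAsync cannot keep
// the UniqueConstraints/... documents in sync with your writes.
var store = new DocumentStore
{
    Url = "http://localhost:8080",      // hypothetical server URL
    DefaultDatabase = "YourDatabase"    // hypothetical database name
};
store.RegisterListener(new UniqueConstraintsStoreListener());
store.Initialize();
```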
I created a database with SQLite-net like so:
SQLiteAsyncConnection conn = new SQLiteAsyncConnection(Path.Combine(ApplicationData.Current.LocalFolder.Path, "Database.db"), true);
await conn.CreateTableAsync<Musei>();

Musei musPref;
if (muss.NumeroTel != null && muss.Descrizione != null && muss.IndirizzoEmail != null && muss.Immagine != null)
{
    musPref = new Musei
    {
        DidascaliaLista = muss.DidascaliaLista,
        NomeMuseo = muss.NomeMuseo,
        Luogopreciso = muss.Luogopreciso,
        Descrizione = muss.Descrizione,
        NumeroTel = muss.NumeroTel,
        IndirizzoEmail = muss.IndirizzoEmail,
        Immagine = muss.Immagine,
    };
}
await conn.InsertAsync(musPref);
In another project I need to open the database created above and show the objects inside a ListView, but I do not know how to proceed.
try
{
    StorageFile data = await ApplicationData.Current.LocalFolder.GetFileAsync("Database.db");
}
catch (Exception)
{
}
And now? I would like to retrieve the database created above and use it, reading the "Musei" objects stored in it and displaying them in a ListView.
If you want to read from the database you created earlier, you can do the following:
// Get a connection to the database that is in the local folder.
var dbPath = Path.Combine(ApplicationData.Current.LocalFolder.Path, "Database.db");
var con = new SQLiteAsyncConnection(dbPath, true);
// Get all "Musei" in the database stored in the "Musei" table.
var results = await con.QueryAsync<Musei>("SELECT * FROM Musei");
If you only want the Musei that match a certain field value (for example, only those whose location is "Rome"), you can do that like this:
var searchLocation = "Rome"; // for example, entered by the user in your UI.
// Get only the "Musei" in `searchLocation`. Note the unquoted ?
// placeholder: sqlite-net binds the parameter itself.
var results = await con.QueryAsync<Musei>("SELECT * FROM Musei WHERE Luogopreciso = ?", searchLocation);
An alternative, if you are only querying a single table, is to do it like this, using LINQ:
var query = con.Table<Musei>();
// or, if looking for `searchLocation`:
var query = con.Table<Musei>().Where(m => m.Luogopreciso == "Rome");
You can then get this as a list using:
var result = await query.ToListAsync();
To find out which tables are actually present in your opened database file, you can do this:
var nTables = 0;
System.Diagnostics.Debug.WriteLine("Tables in the database");
foreach (var mapping in con.TableMappings)
{
System.Diagnostics.Debug.WriteLine(mapping.TableName);
nTables++;
}
System.Diagnostics.Debug.WriteLine("{0} tables in total", nTables);
and look at the debug output.
I have 3 tables:
master_upload (has an auto-increment primary key named master_upload_id)
master_upload_files (consists of 2 columns and references master_upload_id from the table above)
master_upload_tags (same as the second)
In the 2nd and 3rd tables there can be multiple rows for each row in the 1st.
Now, to insert into the 2nd and 3rd tables I need a master_upload_id, which I only get after inserting. Hence I had to call db.SubmitChanges at least 3 times. If there are multiple values for the 2nd and 3rd tables, I had to call db.SubmitChanges for each row in those two tables. But sometimes the insertion into the 2nd or 3rd table can fail due to some rule violation.
Hence I need to roll back in these cases. How can I do that?
I used to do these things via a SQL Server stored procedure, but now I need to do it in LINQ.
// Here is my code sample
using (dbDataContext db = new dbDataContext())
{
    db.master_uploads.InsertOnSubmit(mu); // to get mu.upload_id
    try
    {
        db.SubmitChanges();
        master_upload_file mf = new master_upload_file();
        mf.master_upload_id = mu.upload_id;
        mf.upload_file_id = uploadedfile.file_id;
        db.master_upload_files.InsertOnSubmit(mf);
        for (int i = 0; i < tags.Length; i++)
        {
            master_upload_tag mt = new master_upload_tag();
            mt.master_upload_id = mu.upload_id;
            mt.tag = tags[i];
            db.master_upload_tags.InsertOnSubmit(mt);
        }
        db.SubmitChanges();
        gtu.writetext("0", context);
    }
    catch (Exception)
    {
        gtu.writetext("1:File Upload Add Error", context);
    }
}
I am using SQL Server 2008.
Thanks
You're making this way too complicated! Use the force, man!! :-)
Try this code:
// define your context
using (UploadContextDataContext ctx = new UploadContextDataContext())
{
    try
    {
        // create your new upload
        upload up = new upload();
        up.UploadName = "Some test name";

        // define two new upload files
        UploadFile file1 = new UploadFile { FileName = "TestFile1.zip" };
        UploadFile file2 = new UploadFile { FileName = "TestFile2.zip" };

        // *ADD* those two new upload files to the "Upload"
        up.UploadFiles.Add(file1);
        up.UploadFiles.Add(file2);

        // define three new upload tags
        UploadTag tag = new UploadTag { TagName = "Tag #1" };
        UploadTag tag2 = new UploadTag { TagName = "Tag #2" };
        UploadTag tag3 = new UploadTag { TagName = "Tag #3" };

        // *ADD* those three new upload tags to the "Upload"
        up.UploadTags.Add(tag);
        up.UploadTags.Add(tag2);
        up.UploadTags.Add(tag3);

        // add the "Upload" to the context - this *INCLUDES* the files and tags!
        ctx.uploads.InsertOnSubmit(up);

        // call SubmitChanges *just once* to store everything!
        ctx.SubmitChanges();
    }
    catch (Exception exc)
    {
        string msg = exc.GetType().Name + ": " + exc.Message;
    }
}
You basically set up an object graph: your basic Upload object, to which you add your files and tags. You only need to add that Upload object to the data context; the other (added) sub-objects tag along automatically.
And in this case, you only need to call SubmitChanges() a single time, and it will insert all the new objects and set up all the necessary foreign key / primary key relationships for you. No fiddling around with primary keys, no multiple calls to the database - just use the magic of LINQ to SQL!
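If you ever do need the generated key between two separate SubmitChanges() calls (as in the original code), a TransactionScope makes the whole sequence atomic. This is a hedged sketch reusing the question's names (mu, uploadedfile, dbDataContext); it assumes a reference to System.Transactions:

```csharp
using System.Transactions;

// Sketch: both SubmitChanges() calls commit or roll back together.
using (var scope = new TransactionScope())
using (var db = new dbDataContext())
{
    db.master_uploads.InsertOnSubmit(mu);
    db.SubmitChanges();                    // mu.upload_id is populated here

    var mf = new master_upload_file
    {
        master_upload_id = mu.upload_id,
        upload_file_id = uploadedfile.file_id
    };
    db.master_upload_files.InsertOnSubmit(mf);
    db.SubmitChanges();

    scope.Complete(); // leaving the block without this rolls everything back
}
```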
I am trying to delete a document by id, which is of type ObjectId. I have converted the string to an ObjectId and passed it as a parameter to remove the document from the collection, but I am not able to delete the record.
I don't know the actual reason behind this and am looking for a solution. Below is my code sample:
public void DeleteRecords(string objectID)
{
    try
    {
        // Create server settings to pass connection string, timeout, etc.
        MongoServerSettings settings = new MongoServerSettings();
        settings.Server = new MongoServerAddress("localhost", 27017);

        // Create server object to communicate with our server
        MongoServer server = new MongoServer(settings);
        MongoDatabase myDB = server.GetDatabase("DemoMongoDB");
        MongoCollection<BsonDocument> records = myDB.GetCollection<BsonDocument>("Records");

        //var query = Query<Records>.EQ(fd => fd._id, ObjectId.Parse(name));
        var query = Query<Records>.EQ(e => e._id, new BsonObjectId(objectID));
        records.Remove(query);
    }
    catch (Exception ex)
    {
    }
}
Try the code below and see whether it works:
var query = Query.EQ("_id", new BsonObjectId(objectID));
Or:
var query = Query.EQ("_id", name);
records.Remove(query);
Finally, this worked for me: without converting the string to an ObjectId, I passed the parameter as a string itself.
var query = Query.EQ("_id", objectID);
records.Remove(query);
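For reference, if the _id values ever are stored as real ObjectIds rather than strings, the comparison value must be an ObjectId too, or the query silently matches nothing. A hedged sketch using the same legacy driver API as above:

```csharp
using MongoDB.Bson;
using MongoDB.Driver.Builders;

// Sketch: parse the incoming string before building the query, so the
// BSON type of the query value matches a stored ObjectId _id.
var query = Query.EQ("_id", ObjectId.Parse(objectID));
records.Remove(query);
```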