I'm writing a plugin for AutoCAD and want to import all the blocks it will use at the beginning, to make sure they are available when needed. To do that, I use this method:
public static void ImportBlocks(string[] filesToTryToImport, string filter = "")
{
    foreach (string blockToImport in filesToTryToImport)
    {
        if (blockToImport.Contains(filter))
        {
            Database sourceDb = new Database(false, true); // Temporary database to hold data for the block we want to import
            try
            {
                sourceDb.ReadDwgFile(blockToImport, System.IO.FileShare.Read, true, ""); // Read the DWG into a side database
                ObjectIdCollection blockIds = new ObjectIdCollection(); // Create a variable to store the list of block identifiers
                Autodesk.AutoCAD.DatabaseServices.TransactionManager tm = sourceDb.TransactionManager;
                using (Transaction myT = tm.StartTransaction())
                {
                    // Open the block table
                    BlockTable bt = (BlockTable)tm.GetObject(sourceDb.BlockTableId, OpenMode.ForRead, false);
                    // Check each block in the block table
                    foreach (ObjectId btrId in bt)
                    {
                        BlockTableRecord btr = (BlockTableRecord)tm.GetObject(btrId, OpenMode.ForRead, false);
                        // Only add named & non-layout blocks to the copy list
                        if (!btr.IsAnonymous && !btr.IsLayout)
                        {
                            blockIds.Add(btrId);
                        }
                        btr.Dispose();
                    }
                }
                // Copy blocks from source to destination database
                IdMapping mapping = new IdMapping();
                sourceDb.WblockCloneObjects(blockIds, _database.BlockTableId, mapping, DuplicateRecordCloning.Replace, false);
                _editor.WriteMessage("\nCopied " + blockIds.Count.ToString() + " block definitions from " + blockToImport + " to the current drawing.");
            }
            catch (Autodesk.AutoCAD.Runtime.Exception ex)
            {
                _editor.WriteMessage("\nError during copy: " + ex.Message);
            }
            finally
            {
                sourceDb.Dispose();
            }
        }
    }
}
That method appears to work, in that it executes successfully. However, when I go to insert a block in the drawing via AutoCAD's interface, it doesn't show up as an option, and when I try to insert it programmatically, it throws a FileNotFound exception, meaning it didn't work. What's wrong with this method? Thanks in advance!
EDIT: Here is a less complicated method, along with a test method:
public static void ImportSingleBlock(string fileToTryToImport)
{
    using (Transaction tr = _database.TransactionManager.StartTransaction())
    {
        Database sourceDb = new Database(false, true); // Temporary database to hold data for the block we want to import
        try
        {
            sourceDb.ReadDwgFile(fileToTryToImport, System.IO.FileShare.Read, true, ""); // Read the DWG into a side database
            _database.Insert(fileToTryToImport, sourceDb, false);
            _editor.WriteMessage("\nSUCCESS: " + fileToTryToImport);
        }
        catch (Autodesk.AutoCAD.Runtime.Exception ex)
        {
            _editor.WriteMessage("\nERROR: " + fileToTryToImport);
        }
        finally
        {
            sourceDb.Dispose();
        }
        tr.Commit();
    }
}

[CommandMethod("TESTSINGLEBLOCKIMPORTING")]
public void TestSingleBlockImporting()
{
    OpenFileDialog ofd = new OpenFileDialog();
    DialogResult result = ofd.ShowDialog();
    if (result == DialogResult.Cancel) // End the method on cancel
    {
        return;
    }
    string fileToTryToImport = ofd.FileName;
    using (Transaction tr = _database.TransactionManager.StartTransaction())
    {
        EntityMethods.ImportSingleBlock(fileToTryToImport);
        tr.Commit();
    }
}
This file is the block I'm trying to import. Hope this inspires someone, because I am desperately lost right now.
Your code is correct and should work. In fact, I tried it and it works fine. You're probably failing to Commit() an outer transaction (the one from which you call this ImportBlocks() method). Check:
using (Transaction trans = _database.TransactionManager.StartTransaction())
{
    ImportBlocks(... parameters here ...);
    trans.Commit(); // remember to call this Commit(); if omitted, Abort() is assumed
}
I had the same issue with very similar code. The issue was that
_database.Insert(fileToTryToImport, sourceDb, false);
should be
_database.Insert(blockName, sourceDb, false);
The first parameter has to be the block name, not the file path.
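As a minimal sketch, assuming the block definition should simply take the DWG's file name (the blockName variable here is illustrative, not from the original code):
// Derive a block name from the file path before inserting.
string blockName = System.IO.Path.GetFileNameWithoutExtension(fileToTryToImport);
sourceDb.ReadDwgFile(fileToTryToImport, System.IO.FileShare.Read, true, "");
_database.Insert(blockName, sourceDb, false); // block name, not the file path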
I am trying to get the changes that have occurred in a CouchDB database since the last time the code was executed. To do so, I have taken the following code from the MyCouch library documentation. However, nothing is being printed and I am not sure why. I have made changes to the database between executions, but still nothing is printed, and the program exits with code 0. The problem is not with the database, since when I run a curl command for the changes from cmd, I get all the changes. So the error must be in the code.
var cnInfo1 = new DbConnectionInfo("url", "mycompany")
{
    Timeout = TimeSpan.FromMilliseconds(System.Threading.Timeout.Infinite)
};
using (var client = new MyCouchClient(cnInfo1))
{
    var getChangesRequest = new GetChangesRequest
    {
        Feed = ChangesFeed.Continuous,
        Since = "1",
        Heartbeat = 3000
    };
    var cancellation = new CancellationTokenSource();
    try
    {
        await client.Changes.GetAsync(
            getChangesRequest,
            data =>
            {
                Console.WriteLine("Within the data callback.");
                if (data != null)
                {
                    Console.WriteLine("Data is not null.");
                    foreach (var change in data)
                    {
                        Console.WriteLine(change);
                    }
                }
                else
                {
                    Console.WriteLine("Data is null.");
                }
            },
            cancellation.Token);
    }
    catch (Exception ex)
    {
        Console.WriteLine("Exception caught: " + ex.Message);
    }
}
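One way to isolate whether the continuous feed is the problem would be to try a one-shot request first; a sketch, assuming MyCouch's non-streaming GetAsync overload, where the changes response exposes LastSeq and a Results collection:
// Fetch the changes once (Normal feed) instead of streaming, to verify connectivity.
var oneShotRequest = new GetChangesRequest
{
    Feed = ChangesFeed.Normal,
    Since = "1"
};
var response = await client.Changes.GetAsync(oneShotRequest);
Console.WriteLine("Last seq: " + response.LastSeq);
foreach (var row in response.Results)
{
    Console.WriteLine(row.Id);
}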
I'm trying to remove all Lookup Tables that start with a specific prefix inside all families.
The method "sizeTableManager.RemoveSizeTable(tableToRemove)" returns true as if it succeeded but when I go edit the families in the project and bring up the Lookup Tables list they are still there.
The transaction seems to be committing with no errors too, which is even more puzzling...
Any ideas as to what I'm doing wrong?
This is my code so far:
string existingPrefix = "ExistingPrefix_";
Document doc = this.ActiveUIDocument.Document;

FilteredElementCollector collector = new FilteredElementCollector(doc);
ICollection<Element> elements = collector.OfClass(typeof(Family)).ToElements();

foreach (var element in elements)
{
    using (Transaction t = new Transaction(doc, "flush old lookup tables"))
    {
        t.Start();
        FamilySizeTableManager sizeTableManager = FamilySizeTableManager.GetFamilySizeTableManager(doc, element.Id);
        if (sizeTableManager != null)
        {
            foreach (var tableToRemove in sizeTableManager.GetAllSizeTableNames())
            {
                if (tableToRemove.StartsWith(existingPrefix))
                {
                    bool result = sizeTableManager.RemoveSizeTable(tableToRemove);
                    if (result)
                    {
                        // TaskDialog.Show("Success", "Removed " + tableToRemove + " from " + element.Name);
                        var test = "test";
                    }
                    else
                    {
                        TaskDialog.Show("Warning", "Unable to remove " + tableToRemove + " from " + element.Name);
                    }
                }
            }
        }
        var commitResult = t.Commit();
    }
}
Thanks in advance!
Well, it turns out that the RemoveSizeTable() method was indeed working, but I had to regenerate the document for the changes to take effect:
using (Transaction t = new Transaction(doc, "regenerate document"))
{
    t.Start();
    doc.Regenerate();
    t.Commit();
}
Discovered this after noticing that saving the document caused the changes to take effect.
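Along the same lines, the removal loop could regenerate before committing; a sketch, assuming doc.Regenerate() is valid inside the still-open transaction:
using (Transaction t = new Transaction(doc, "flush old lookup tables"))
{
    t.Start();
    FamilySizeTableManager sizeTableManager = FamilySizeTableManager.GetFamilySizeTableManager(doc, element.Id);
    if (sizeTableManager != null)
    {
        foreach (var tableToRemove in sizeTableManager.GetAllSizeTableNames())
        {
            if (tableToRemove.StartsWith(existingPrefix))
            {
                sizeTableManager.RemoveSizeTable(tableToRemove);
            }
        }
    }
    doc.Regenerate(); // flush the removals before committing
    t.Commit();
}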
The following code gives me the error below (I get it from the MessageBox.Show() in the catch block):
"Exception in PopulateBla() : There is a file sharing violation. A different process might be using the file [,,,,,,]"
CODE
using (SqlCeCommand cmd = new SqlCeCommand(SQL_GET_VENDOR_ITEMS, new SqlCeConnection(SQLCE_CONN_STR)))
{
    cmd.Parameters.Add("@VendorID", SqlDbType.NVarChar, 10).Value = vendorId;
    cmd.Parameters.Add("@VendorItemID", SqlDbType.NVarChar, 19).Value = vendorItemId;
    try
    {
        cmd.Connection.Open();
        using (SqlCeDataReader SQLCEReader = cmd.ExecuteReader(CommandBehavior.SingleRow))
        {
            if (SQLCEReader.Read())
            {
                itemID = SQLCEReader.GetString(ITEMID_INDEX);
                packSize = SQLCEReader.GetString(PACKSIZE_INDEX);
                recordFound = true;
            }
        }
    }
    catch (SqlCeException err)
    {
        MessageBox.Show(string.Format("Exception in PopulateControlsIfVendorItemsFound: {0}\r\n", err.Message)); // TODO: Remove
    }
    finally
    {
        if (cmd.Connection.State == ConnectionState.Open)
        {
            cmd.Connection.Close();
        }
    }
}
SQL_GET_VENDOR_ITEMS is my query string.
What file sharing problem could be happening here?
UPDATE
This is the kind of code that makes the sort of refactoring recommended by ctacke below difficult:
public void setINVQueryItemGroup(string ID)
{
    try
    {
        dynSQL += " INNER JOIN td_item_group ON t_inv.id = td_item_group.id AND t_inv.pack_size = td_item_group.pack_size WHERE td_item_group.item_group_id = '" + ID + "'";
    }
    catch (Exception ex)
    {
        CCR.ExceptionHandler(ex, "InvFile.setINVQueryDept");
    }
}
A SQL statement is being appended to by means of a separate method, altering a global variable (dynSQL) while possibly allowing for SQL injection (depending on where/how ID is assigned). If that's not enough, any exception thrown could mislead the weary bughunter by indicating it occurred in a different method (doubtless the victim of a careless copy-and-paste operation).
This is "Coding Horror"-worthy. How many best practices can you ignore in a scant few lines of code?
Here's another example:
string dynSQL = "SELECT * FROM purgatory WHERE vendor_item = '" + VendorItem + "' ";
if (vendor_id != "")
{
dynSQL += "AND vendor_id = '" + vendor_id + "' ";
}
It could be done by replacing the args with "?"s, but the code to then determine which/how many params to assign would be 42X uglier than Joe Garagiola's mean cleats.
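For reference, a parameterized take on that two-branch example might look like the sketch below (SqlCeCommand uses named @ parameters rather than ?s; the names are illustrative):
// The same dynamic query built with parameters instead of string concatenation.
var sql = new StringBuilder("SELECT * FROM purgatory WHERE vendor_item = @VendorItem ");
var cmd = new SqlCeCommand();
cmd.Parameters.Add("@VendorItem", SqlDbType.NVarChar).Value = VendorItem;
if (vendor_id != "")
{
    sql.Append("AND vendor_id = @VendorId ");
    cmd.Parameters.Add("@VendorId", SqlDbType.NVarChar).Value = vendor_id;
}
cmd.CommandText = sql.ToString();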
I really like Chris' idea of using a single connection to your database. You could declare that global to your class like so:
public class ClayShannonDatabaseClass : IDisposable
{
    private SqlCeConnection m_openConnection;

    public ClayShannonDatabaseClass()
    {
        m_openConnection = new SqlCeConnection(SQLCE_CONN_STR);
        m_openConnection.Open();
    }

    public void Dispose()
    {
        m_openConnection.Close();
        m_openConnection.Dispose();
        m_openConnection = null;
    }
}
I'm guessing your code is crashing whenever you attempt to actually open the database.
To verify this, you could stick an integer value in the code to help you debug.
Example:
int debugStep = 0;
try
{
    //cmd.Connection.Open(); (don't call this if you use m_openConnection)
    debugStep = 1;
    using (SqlCeDataReader SQLCEReader = cmd.ExecuteReader(CommandBehavior.SingleRow))
    {
        debugStep = 2;
        if (SQLCEReader.Read())
        {
            debugStep = 3;
            itemID = SQLCEReader.GetString(ITEMID_INDEX);
            debugStep = 4;
            packSize = SQLCEReader.GetString(PACKSIZE_INDEX);
            debugStep = 5;
            recordFound = true;
        }
    }
}
catch (SqlCeException err)
{
    string msg = string.Format("Exception in PopulateControlsIfVendorItemsFound: {0}\r\n", err.Message);
    string ttl = string.Format("Debug Step: {0}", debugStep);
    MessageBox.Show(msg, ttl); //TODO: Remove
}
// finally (don't call this if you use m_openConnection)
// {
//     if (cmd.Connection.State == ConnectionState.Open)
//     {
//         cmd.Connection.Close();
//     }
// }
I'm guessing your error is at Step 1.
Provided the file isn't marked read-only (you checked that, right?), you have another process with a non-sharing lock on the file.
The isql.exe database browser that comes with SQL CE is a common culprit if it's running in the background.
Depending on your version of SQLCE, it's quite possible that another process has an open connection (can't recall what version started allowing multiple process connections), so if you have any other app in the background that has it open, that may be a problem too.
You're also using a boatload of connections to that database, and they don't always get cleaned up and released immediately upon Dispose. I'd highly recommend building a simple connection manager class that keeps a single connection (or maybe two) to the database and just reuses it for all operations.
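As a minimal sketch of what such a manager might look like (the class and its names are hypothetical, and it assumes single-threaded access):
// One shared, lazily opened connection reused for all commands.
public static class ConnectionManager
{
    private static SqlCeConnection s_connection;

    public static SqlCeConnection Get(string connectionString)
    {
        if (s_connection == null)
        {
            s_connection = new SqlCeConnection(connectionString);
            s_connection.Open();
        }
        return s_connection;
    }

    public static void Shutdown()
    {
        if (s_connection != null)
        {
            s_connection.Close();
            s_connection.Dispose();
            s_connection = null;
        }
    }
}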
I'm designing an API. Currently I'm trying to safely handle the condition where we run out of disk space. Basically, we have a series of files holding some data. When the disk is full and we go to write another data file, it will of course throw an error. At that point, we delete a single file (looping through the file list from oldest to newest, retrying the write after we successfully delete a file), and repeat that process until the file is written without error.
Now the fun part: all of this happens concurrently. At some point there are 8 threads doing this at once, which makes things extra interesting, and has led to an odd error.
Here is the code:
public void Save(string text, string id)
{
    using (var store = IsolatedStorageFile.GetUserStoreForApplication())
    {
        var existing = store.GetFileNames(string.Format(Prefix + "/*-{0}.dat", id));
        if (existing.Any()) return; // it is already saved
        string name = string.Format(Prefix + "/{0}-{1}.dat", DateTime.UtcNow.ToString("yyyyMMddHHmmssfffffff"), id);
    tryagain:
        bool doover = false;
        try
        {
            AttemptFileWrite(store, name, text);
        }
        catch (IOException)
        {
            doover = true;
        }
        catch (IsolatedStorageException) // THIS LINE
        {
            doover = true;
        }
        if (doover)
        {
            Attempt(() => store.DeleteFile(name)); // because apparently this can also fail
            var files = store.GetFileNames(Path.Combine(Prefix, "*.dat"));
            foreach (var file in files.OrderBy(x => x))
            {
                try
                {
                    store.DeleteFile(Path.Combine(Prefix, file));
                }
                catch
                {
                    continue;
                }
                break;
            }
            goto tryagain; // prepare the velociraptor shield!
        }
    }
}
void AttemptFileWrite(IsolatedStorageFile store, string name, string text)
{
    using (var file = store.OpenFile(
        name,
        FileMode.Create,
        FileAccess.ReadWrite,
        FileShare.None | FileShare.Delete))
    {
        using (var writer = new StreamWriter(file))
        {
            writer.Write(text);
            writer.Flush();
            writer.Close();
        }
        file.Close();
    }
}
static void Attempt(Action func)
{
    try
    {
        func();
    }
    catch
    {
    }
}

static T Attempt<T>(Func<T> func)
{
    try
    {
        return func();
    }
    catch
    {
    }
    return default(T);
}
public string GetSaved()
{
    string content = null;
    using (var store = IsolatedStorageFile.GetUserStoreForApplication())
    {
        var files = store.GetFileNames(Path.Combine(Prefix, "*.dat")).OrderBy(x => x);
        if (!files.Any()) return null; // nothing saved yet
        foreach (var filename in files)
        {
            IsolatedStorageFileStream file = null;
            try
            {
                file = Attempt(() =>
                    store.OpenFile(Path.Combine(Prefix, filename), FileMode.Open, FileAccess.ReadWrite, FileShare.None | FileShare.Delete));
                if (file == null)
                {
                    continue; // couldn't open; assume locked or some such
                }
                file.Seek(0L, SeekOrigin.Begin);
                using (var reader = new StreamReader(file))
                {
                    content = reader.ReadToEnd();
                }
                // Take note here: we delete the file while we still have it open!
                // This is done because having the file open prevents other readers, but if we closed it first,
                // there would be a race condition: right after closing the stream, another reader could pick it up and
                // open it exclusively. It looks weird, but it's right. Trust me.
                store.DeleteFile(Path.Combine(Prefix, filename));
                if (!string.IsNullOrEmpty(content))
                {
                    break;
                }
            }
            finally
            {
                if (file != null) file.Close();
            }
        }
    }
    return content;
}
The line marked THIS LINE is what I'm talking about. When doing AttemptFileWrite, I can look at store.AvailableFreeSpace and see that there is enough room to fit the data, but upon trying to open the file, it throws this IsolatedStorageException with the description "Operation Not Permitted". In all other cases it's just an IOException with a message about the disk being full.
I'm trying to figure out whether I have some odd race condition, or whether this is an error I just have to deal with, or what.
Why does this error occur?
I'm creating a program to generate my database schema using the SMO libraries distributed with SQL Server 2008.
I've gotten the scripter outputting code which is virtually the same as what SQL Server Management Studio outputs when it's configured to output everything, but with one curious exception: it doesn't output comment headers for the foreign key constraints it generates at the bottom, whereas SSMS does. Can anyone figure out why this is? Here's my code:
private void btnExportScript_Click(object sender, EventArgs ea) {
    Server srv = setupConnection();

    // Reference the database
    if (!srv.Databases.Contains(cbChooseDb.SelectedItem.ToString())) {
        _utils.ShowError("Couldn't find DB '" + cbChooseDb.SelectedItem.ToString() + "'.");
        return;
    }
    Database db = srv.Databases[cbChooseDb.SelectedItem.ToString()];

    StringBuilder builder = new StringBuilder();
    try {
        Scripter scrp = new Scripter(srv);
        scrp.Options.AppendToFile = false;
        scrp.Options.ToFileOnly = false;
        scrp.Options.ScriptDrops = false;                      // Don't script DROPs
        scrp.Options.Indexes = true;                           // Include indexes
        scrp.Options.DriAllConstraints = true;                 // Include referential constraints in the script
        scrp.Options.Triggers = true;                          // Include triggers
        scrp.Options.FullTextIndexes = true;                   // Include full-text indexes
        scrp.Options.NonClusteredIndexes = true;               // Include non-clustered indexes
        scrp.Options.NoCollation = false;                      // Include collation
        scrp.Options.Bindings = true;                          // Include bindings
        scrp.Options.SchemaQualify = true;                     // Include schema qualification, e.g. [dbo]
        scrp.Options.IncludeDatabaseContext = false;
        scrp.Options.AnsiPadding = true;
        scrp.Options.FullTextStopLists = true;
        scrp.Options.IncludeIfNotExists = false;
        scrp.Options.ScriptBatchTerminator = true;
        scrp.Options.ExtendedProperties = true;
        scrp.Options.ClusteredIndexes = true;
        scrp.Options.FullTextCatalogs = true;
        scrp.Options.SchemaQualifyForeignKeysReferences = true;
        scrp.Options.XmlIndexes = true;
        scrp.Options.IncludeHeaders = true;

        // Prefetching may speed things up
        scrp.PrefetchObjects = true;

        var urns = new List<Urn>();

        // Iterate through the tables in the database and add each one.
        foreach (Table tb in db.Tables) {
            if (tb.IsSystemObject == false) {
                // Table is not a system object, so add it.
                urns.Add(tb.Urn);
            }
        }

        // Iterate through the views in the database and add each one.
        foreach (Microsoft.SqlServer.Management.Smo.View view in db.Views) {
            if (view.IsSystemObject == false) {
                // View is not a system object, so add it.
                urns.Add(view.Urn);
            }
        }

        // Iterate through the stored procedures in the database and add each one.
        foreach (StoredProcedure sp in db.StoredProcedures) {
            if (sp.IsSystemObject == false) {
                // Procedure is not a system object, so add it.
                urns.Add(sp.Urn);
            }
        }

        // Start by manually adding the DB context
        builder.AppendLine("USE [" + db.Name + "]");
        builder.AppendLine("GO");

        System.Collections.Specialized.StringCollection sc = scrp.Script(urns.ToArray());
        foreach (string st in sc) {
            // It seems each string is a sensible batch, and putting GO after it makes it work in tools like SSMS.
            // Wrapping each string in an 'exec' statement would work better if using SqlCommand to run the script.
            builder.Append(st.Trim(new char[] { '\r', '\n' }) + "\r\nGO\r\n");
        }
    }
    catch (Exception ex) {
        showExceptionError("Couldn't generate script.", ex);
        return;
    }

    try {
        File.WriteAllText(txtExportToFile.Text, builder.ToString());
        _utils.ShowInfo("DB exported to script at: " + txtExportToFile.Text);
    }
    catch (Exception ex) {
        showExceptionError("Couldn't save script file.", ex);
        return;
    }
}
Note that foreign keys fall under the category of DRI constraints, and are scripted because of scrp.Options.DriAllConstraints = true;.
I have a solution here: Can't get EnumScript() to generate constraints
For some reason, Scripter won't emit DRI constraints (foreign keys, etc.) when it's given a list of Urns, but it will if it's given one Urn at a time. The trick there is to give the Urns in the order of their ancestry: tables have to be defined before they're referenced by constraints. To do that, I've used the DependencyWalker.
Here's a synopsis:
var urns = new List<Urn>();
Scripter schemaScripter = new Scripter(srv) { Options = schemaOptions };
Scripter insertScripter = new Scripter(srv) { Options = insertOptions };
var dw = new DependencyWalker(srv);
foreach (Table t in db.Tables)
    if (t.IsSystemObject == false)
        urns.Add(t.Urn);
DependencyTree dTree = dw.DiscoverDependencies(urns.ToArray(), true);
DependencyCollection dColl = dw.WalkDependencies(dTree);
foreach (var d in dColl)
{
    foreach (var s in schemaScripter.Script(new Urn[] { d.Urn }))
        strings.Add(s);
    strings.Add("GO");
    if (scriptData)
    {
        int n = 0;
        foreach (var i in insertScripter.EnumScript(new Urn[] { d.Urn }))
        {
            strings.Add(i);
            if ((++n) % 100 == 0)
                strings.Add("GO");
        }
    }
}
Note: adding a "GO" every so often keeps the batch size small so SSMS doesn't run out of memory.