Quick question: I used Entity Framework's database-first approach, so all the classes are generated automatically.
I am delivering the application on Monday, and I was thinking it would be better to generate the database on the first launch of the app.
Is this doable? I know it is possible with the code-first approach,
so I assumed the reverse would work too.
If it helps: I am using SQL Server 2014, Visual Studio 2015, C#, Entity Framework, and WinForms.
// Build the plain SQL connection string first, then wrap it in an
// EF metadata-aware entity connection string that points at the model resources.
var scsbCentral = new SqlConnectionStringBuilder(connectionString);
scsbCentral.MultipleActiveResultSets = true;

var ecbCentral = new EntityConnectionStringBuilder();
ecbCentral.Metadata = "res://*/KBSCentral.csdl|res://*/KBSCentral.ssdl|res://*/KBSCentral.msl";
ecbCentral.Provider = "System.Data.SqlClient";
ecbCentral.ProviderConnectionString = scsbCentral.ConnectionString;

// Create the database on first launch if it does not exist yet.
var centralContext = new KBS_CentralEntities(ecbCentral.ConnectionString);
if (!centralContext.Database.Exists())
{
    centralContext.Database.Create();
}
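One natural place to put this first-launch check is at the very start of the WinForms entry point, before any form touches the context. A minimal sketch, assuming the entity connection string is built exactly as above (BuildEntityConnectionString and MainForm are hypothetical names):

```csharp
using System;
using System.Windows.Forms;

static class Program
{
    [STAThread]
    static void Main()
    {
        Application.EnableVisualStyles();
        Application.SetCompatibleTextRenderingDefault(false);

        // Hypothetical helper that builds the entity connection string
        // the same way as the snippet above.
        var entityConnectionString = BuildEntityConnectionString();

        using (var context = new KBS_CentralEntities(entityConnectionString))
        {
            // Database-first contexts still expose Exists()/Create(),
            // so first-launch creation works here as well.
            if (!context.Database.Exists())
                context.Database.Create();
        }

        Application.Run(new MainForm());
    }
}
```

This keeps the (potentially slow) existence check out of the form constructors, so the UI only appears once the database is known to exist.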
I have a WinForms database-driven application that I want to work in online/offline mode using the Dotmim sync framework; I found an article by its author here.
The documentation for the library is here.
This is my code to sync the two SQL Server databases; one is a LocalDB and the other is on a SQL Server instance (accessed through SQL Server Management Studio) for testing purposes:
string connect = @"Data Source=(LocalDb)\MSSQLLocalDB;Initial Catalog=bright_square_db;Integrated Security=SSPI;AttachDBFilename=D:\Folder\project_file\bright_square_db.mdf";
string constring = ConfigurationManager.ConnectionStrings["conString"].ConnectionString;
SqlSyncProvider serverProvider = new SqlSyncProvider(constring);
SqlSyncProvider clientProvider = new SqlSyncProvider(connect);
SyncAgent agent = new SyncAgent(clientProvider, serverProvider, new string[] { "I have listed all the tables here" });
var progress = new SynchronousProgress<ProgressArgs>(s => MessageBox.Show($"{s.Context.SyncStage}:\t{s.Message}"));
var syncContext = await agent.SynchronizeAsync(progress);
MessageBox.Show(syncContext.ToString());
But when I try to run the code, I get this error.
The columns indicated in the error belong to a table named "scope_info" that is created by the sync process inside the SQL Server database.
I solved the problem by swapping the client and server connection strings in the third and fourth lines of the code above. I don't know what exactly caused the problem, but this change made the code work for me.
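For reference, the swap described above amounts to exchanging which connection string feeds which provider; a sketch of the working version, using the same variables as the code above:

```csharp
// After the swap: 'connect' (the LocalDB file) now feeds the server
// provider and 'constring' feeds the client provider.
SqlSyncProvider serverProvider = new SqlSyncProvider(connect);
SqlSyncProvider clientProvider = new SqlSyncProvider(constring);

// SyncAgent takes the client provider first, then the server provider.
SyncAgent agent = new SyncAgent(clientProvider, serverProvider,
    new string[] { /* same table list as before */ });
```

A plausible explanation is that the scope tables (such as scope_info) had already been provisioned for one role on each database, so reversing the roles made the providers match the existing scope metadata.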
I have tried to add Oracle procedures that are placed inside a package using EF5 with the ADO.NET Entity Data Model, and nothing gets imported; I cannot see any procedures.
I am using ODAC 12c Release 3 -> Oracle Data Provider.
I can import procedures created outside a package.
Our development standard is to define procedures inside Oracle packages. I am now moving from VS2008 to VS2013 with Entity Framework.
Please advise on possible solutions to overcome this problem. Many thanks.
Here's how I did something similar in a recent project. Note that I use DevArt instead of ODAC, but perhaps you'll find some inspiration here to solve your issue.
I first created a class that held all parameters required for the function:
public class MyFunctionParameter
{
public DateTime? MyDateTime { get; set; }
public string Source { get; set; }
}
This class also contained a ToOracleParameters() method. Note that OracleParameter is from DevArt:
public List<OracleParameter> ToOracleParameters()
{
return new List<OracleParameter>
{
new OracleParameter("My_Date_Time", OracleDbType.Date, MyDateTime, ParameterDirection.Input),
new OracleParameter("Source", OracleDbType.VarChar, Source, ParameterDirection.Input),
new OracleParameter("ID", OracleDbType.Number, ParameterDirection.Output)
};
}
In my method I then used this class like this:
var parameters = parameter.ToOracleParameters();
var inParameterList = string.Join(", ",
parameters.Where(x => x.Direction == ParameterDirection.Input)
.Select(x => ":" + x.ParameterName));
var outParameter = parameters.Single(x => x.Direction == ParameterDirection.Output);
var sql = string.Format("BEGIN :{0} := MY_ORACLE_FUNCTION({1}); end;",
outParameter.ParameterName, inParameterList);
DbContext.Database.SqlQuery<object>(sql, parameters.Cast<object>().ToArray()).SingleOrDefault();
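To make the generated statement concrete, the string assembly above can be exercised on its own with a small stand-in for OracleParameter (the Param record below is hypothetical; only the name and direction matter for building the PL/SQL block):

```csharp
using System;
using System.Data;
using System.Linq;

// Hypothetical stand-in for DevArt's OracleParameter - just enough
// to demonstrate how the anonymous PL/SQL block is assembled.
record Param(string ParameterName, ParameterDirection Direction);

class Demo
{
    static string BuildSql(Param[] parameters, string functionName)
    {
        // Input parameters become the function's argument list.
        var inParameterList = string.Join(", ",
            parameters.Where(p => p.Direction == ParameterDirection.Input)
                      .Select(p => ":" + p.ParameterName));

        // The single output parameter receives the function's return value.
        var outParameter = parameters.Single(p => p.Direction == ParameterDirection.Output);

        return string.Format("BEGIN :{0} := {1}({2}); end;",
            outParameter.ParameterName, functionName, inParameterList);
    }

    static void Main()
    {
        var ps = new[]
        {
            new Param("My_Date_Time", ParameterDirection.Input),
            new Param("Source", ParameterDirection.Input),
            new Param("ID", ParameterDirection.Output),
        };
        Console.WriteLine(BuildSql(ps, "MY_ORACLE_FUNCTION"));
        // Prints: BEGIN :ID := MY_ORACLE_FUNCTION(:My_Date_Time, :Source); end;
    }
}
```

Seeing the assembled anonymous block makes it easier to verify the parameter order matches the packaged function's signature before wiring in the real DbContext call.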
Try the following:
In Server Explorer --> Data Connections you will see the connection you made to generate your model, and you can modify it from there. There is a tab called Filter where you can choose which objects are displayed; for example, you can include public synonyms and the displayed collections (packages, views, tables, etc.).
I am not an Oracle expert, but can you create a public synonym for a procedure which is inside a package? That would probably work.
If you cannot find a solution, you can go here:
https://community.oracle.com/community/database/developer-tools/windows_and_.net
Some of the people who developed the Entity Framework provider for Oracle will probably be able to help you.
Hope it helps.
Regards
I use a Postgres database for my backend for multiple web applications. These applications are hosted on a third party server that I have limited access to. I currently use npgsql and nhibernate for all my data connection needs, and this works quite well.
Well, now I need to write some Crystal Reports. For performance's sake (and because the results data will not fit into any entities) I can't use my nhibernate entities as a data source for the reports; likewise I can't create an ODBC connection for the Crystal Reports because there is no driver or DSN on the destination server. So, I thought, maybe I can build datasets based off of queries I write, and fill them using the Npgsql data provider, and feed these to the Crystal Reports. I have performed a proof of concept, and it works pretty well.
The problem is, I have a lot of reports, and a lot of datasets to build, and it's very time consuming to manually build the schema in each of these. The dataset automation interfaces won't let me choose my npgsql data provider in the connection selector, which is rather annoying.
I was hoping there might be a fairly straightforward way of throwing together a little code to get a dataset schema via the npgsql data provider at runtime, and then serialize the schemas into files I could then import into my reporting project in design time.
Can this practically be done? Is there an easier way? There's a bunch of reports and a lot of columns, and hand-coding the schema for them will be extremely time-consuming.
The dataset automation interfaces won't let me choose my npgsql data provider
Any .NET data provider must be registered with DDEX to be supported in Visual Studio designers (although this is not necessary for using a particular provider to build and run applications).
There was a similar question about npgsql, but it is almost two years old.
Maybe something has changed since 2011, because the official documentation says:
2.2 Installing binary package
...
Note that placing Npgsql in the GAC is required for Npgsql design
time support in Visual Studio .Net.
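If GAC registration turns out to be the missing piece, the usual way to do it is with gacutil from a Visual Studio developer command prompt (the path to Npgsql.dll below is illustrative):

```shell
# Register Npgsql in the GAC so Visual Studio design-time tooling can load it.
gacutil /i "C:\libs\Npgsql\Npgsql.dll"

# Verify the registration took effect.
gacutil /l Npgsql
```

Older Npgsql 2.x releases also shipped a Mono.Security dependency that may need the same treatment.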
Here's what I ended up doing.
I made a quick little WinForms app with three multiline text boxes and a button. The button's click event had the following code attached to it:
var query = txtQuery.Text;
var connectionString = ConfigurationManager.ConnectionStrings["connection"].ConnectionString;
try
{
string data;
string schema;
GetSchema(connectionString, query, out data, out schema);
txtXML.Text = data;
txtXSD.Text = schema;
}
catch (Exception ex)
{
MessageBox.Show(ex.ToString(), "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
}
The GetSchema method looked like this:
private void GetSchema(string connectionString, string query, out string data, out string schema)
{
using (var conn = new Npgsql.NpgsqlConnection(connectionString))
using (var da = new Npgsql.NpgsqlDataAdapter(query, conn))
using (var ds = new DataSet())
using (var dataStream = new MemoryStream())
using (var schemaStream = new MemoryStream())
{
conn.Open();
da.Fill(ds);
ds.WriteXml(dataStream);
ds.WriteXmlSchema(schemaStream);
dataStream.Position = 0;
schemaStream.Position = 0;
using (var dataReader = new StreamReader(dataStream))
using (var schemaReader = new StreamReader(schemaStream))
{
data = dataReader.ReadToEnd();
schema = schemaReader.ReadToEnd();
}
}
}
When I ran it, I got my data XML and my schema XML. With this I was able to build my result sets.
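Since the goal is design-time import, the last step is just writing the schema string to an .xsd file. The round trip can be sketched without a live Npgsql connection by filling a DataSet in memory (table and column names below are made up):

```csharp
using System;
using System.Data;
using System.IO;

class SchemaDump
{
    static void Main()
    {
        // Stand-in for the DataSet that NpgsqlDataAdapter.Fill would produce.
        var ds = new DataSet("ReportData");
        var orders = ds.Tables.Add("Orders");
        orders.Columns.Add("OrderId", typeof(int));
        orders.Columns.Add("Total", typeof(decimal));

        // Serialize the schema exactly as GetSchema does, then save it
        // as an .xsd the reporting project can import at design time.
        using (var writer = new StringWriter())
        {
            ds.WriteXmlSchema(writer);
            File.WriteAllText("Orders.xsd", writer.ToString());
        }

        // Quick sanity check that the table made it into the schema.
        Console.WriteLine(File.ReadAllText("Orders.xsd").Contains("Orders"));
    }
}
```

The resulting .xsd files can then be added to the reporting project and pointed at the Crystal Reports as typed dataset schemas.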
How do we select data from an SDF (WebMatrix) database in Visual Studio with LINQ, just like we can with Northwind, like this?
// Northwnd inherits from System.Data.Linq.DataContext.
Northwnd nw = new Northwnd(@"northwnd.mdf");
// or, if you are not using SQL Server Express
// Northwnd nw = new Northwnd("Database=Northwind;Server=server_name;Integrated Security=SSPI");
var companyNameQuery =
from cust in nw.Customers
where cust.City == "London"
select cust.CompanyName;
foreach (var customer in companyNameQuery)
{
Console.WriteLine(customer);
}
Ref: http://msdn.microsoft.com/en-us/library/bb399398.aspx
Thank you for your help.
I don't believe that LINQ to SQL is officially supported with SQL Server CE 4.0, although it appears you can get it working. Microsoft's recommended approach is to use the Entity Framework.
I've written a couple of articles on using EF with SQL Server CE in WebMatrix. One covers the Code First approach, and the other looks at a database first approach.
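A minimal Code First sketch against an .sdf file looks something like this (it assumes the EntityFramework and EntityFramework.SqlServerCompact NuGet packages are installed; the entity, context, and connection-string names are illustrative):

```csharp
using System;
using System.Data.Entity;
using System.Linq;

public class Customer
{
    public int Id { get; set; }
    public string City { get; set; }
    public string CompanyName { get; set; }
}

public class ShopContext : DbContext
{
    // "ShopDb" resolves to a connection string in app.config
    // that points at the .sdf file.
    public ShopContext() : base("ShopDb") { }
    public DbSet<Customer> Customers { get; set; }
}

class Program
{
    static void Main()
    {
        using (var db = new ShopContext())
        {
            // Same query shape as the Northwind LINQ to SQL example.
            var names = from c in db.Customers
                        where c.City == "London"
                        select c.CompanyName;

            foreach (var name in names)
                Console.WriteLine(name);
        }
    }
}
```

The query syntax carries over almost unchanged from the LINQ to SQL example; only the context type differs.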
I'm following http://www.mssqltips.com/tip.asp?tip=1910 and the following code actually does work, but takes about 90 seconds to run. The database schema has less than 10 (fairly straightforward) tables. I'm not sure why it takes so long. Any suggestions on how to debug this?
var host = "192.168...";
var user = "username";
var pass = "password";
var srcDbName = "srcDbName";
var dstDbName = "dstDbName";
var server = new Server(new ServerConnection(host, user, pass));
var srcDb = server.Databases[srcDbName];
var dstDb = new Database(server, dstDbName);
dstDb.Create();
var transfer = new Transfer(srcDb);
transfer.CopyAllTables = true;
transfer.Options.DriAll = true;
transfer.Options.ContinueScriptingOnError = false;
transfer.DestinationDatabase = dstDbName;
transfer.DestinationServer = server.Name;
transfer.DestinationLoginSecure = false;
transfer.DestinationLogin = user;
transfer.DestinationPassword = pass;
transfer.TransferData();
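To see where the 90 seconds go, one option is to time the two phases separately and inspect the script SMO generates; Transfer.ScriptTransfer() returns the DDL the transfer would run, which often reveals unexpected objects being scripted. A hedged debugging sketch, reusing the transfer object from above:

```csharp
// Time the scripting phase and dump the generated DDL for inspection.
var sw = System.Diagnostics.Stopwatch.StartNew();
foreach (string line in transfer.ScriptTransfer())
    Console.WriteLine(line);
Console.WriteLine($"Scripting took {sw.Elapsed}");

// Then time the actual data movement separately.
sw.Restart();
transfer.TransferData();
Console.WriteLine($"Data transfer took {sw.Elapsed}");
```

If the scripting phase dominates, the time is going into SMO enumerating objects over the network rather than into moving data.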
I think you should leave this code alone; you cannot improve it much. As I understand from the question, you want to transfer tables/schema from one database to another.
Below are the options I suggest:
Linked Servers
Since SQL Server 2000 you have been able to connect directly to another database as a linked server. On the pros side, this kind of direct access can be easy to work with if you don't have other technical skills such as DTS or SSIS, but it can be complex to get the initial set-up right and there may be security concerns.
DTS
DTS is packaged with SQL Server 2000 and is made for this kind of task. If written correctly, your DTS package can have good error handling and be rerunnable/reusable.
SSIS
SSIS is packaged with SQL Server 2005 and above, but you can connect it to other databases. It's essentially a better version of DTS.
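For the linked-server route, the set-up plus a cross-server copy is plain T-SQL; the sketch below just assembles and prints the statements from C# so the names are easy to swap (the server, address, and table names are placeholders):

```csharp
using System;

class LinkedServerSketch
{
    static void Main()
    {
        var linkedServer = "REMOTE_SRV";  // placeholder linked-server name
        var table = "dbo.Customers";      // placeholder table

        // One-time set-up on the local server.
        var setup = $"EXEC sp_addlinkedserver @server = N'{linkedServer}', " +
                    "@srvproduct = N'', @provider = N'SQLNCLI', " +
                    "@datasrc = N'192.168.0.10';";

        // Cross-server copy once the link exists; four-part naming
        // reaches through the linked server.
        var copy = $"INSERT INTO {table} " +
                   $"SELECT * FROM [{linkedServer}].[srcDbName].{table};";

        Console.WriteLine(setup);
        Console.WriteLine(copy);
    }
}
```

Running the printed statements in SSMS (with appropriate credentials) performs the same table copy as the SMO code, at the cost of maintaining the link.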